By Raúl Carrillo
In the fall of 2013, on the 50th anniversary of the March on Washington for Jobs and Freedom, Ohio State University Law Professor Michelle Alexander penned a brilliant essay in The Nation, entitled “Breaking My Silence”. In the piece, Alexander, author of the groundbreaking book, The New Jim Crow, urged social justice advocates to get out of our “lanes” and “do what Dr. King demanded we should: connect the dots between poverty, racism, militarism and materialism.”
In this spirit, I am writing to encourage readers to take up yet another task, one I’ve unfortunately only recently shouldered myself: to understand how digital surveillance reinforces socioeconomic hierarchies.
As writers on this site argue every day, the U.S. government, despite its strikingly clear ability to do so, does not fund the public infrastructure necessary for sustaining basic human dignity. Thus, most people striving for economic security in this country can only look to private institutional creditors. Today, this usually also means interacting with Big Data. Yet for certain groups of people—the usual suspects—surveillance not only jeopardizes their civil rights but also exposes them to financial predation and exclusion beyond the norm. In the age of the internet, injustice is increasingly intersectional.
In 2015, even the mainstream media harps on invasions of privacy. At this point most Americans are aware of the existence of data-mining, if not its full extent. This is partially because the practice touches nearly everyone. Even some of the most privileged people’s information is siphoned so companies can barrage us with advertisements, assaulting our anonymity and autonomy, while supposedly compensating us with consumer convenience.
Yet this harvesting of personal data is not merely irritating or unsettling. It outright punishes many people.
Last spring, in a piece entitled “Redlining for the 21st Century”, Bill Davidow of The Atlantic sketched some ways in which algorithms deny people financial services, or charge them much higher rates, based on their inferred race. More specifically, watchdogs have noted that mortgage providers can now infer an applicant’s ZIP code from their internet connection, and then price loans based on the neighborhood where the applicant resides. If human employees did this, they would be acting in clear violation of the Fair Housing Act of 1968. Yet, because algorithms are the actors, the practice often goes unexamined, and people who might otherwise receive credit at a reasonable rate get hurt.
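To make the mechanism concrete, here is a deliberately simplified Python sketch of how such pricing can work. Every name, IP address, ZIP code, and rate below is invented for illustration; no real lender’s data or method is represented.

```python
# Hypothetical sketch: how an IP-derived ZIP code can quietly
# reproduce redlining. All lookup tables and figures are invented.

# Stand-ins for commercial IP-geolocation and neighborhood "risk" products.
IP_TO_ZIP = {"203.0.113.7": "60653", "198.51.100.4": "60614"}
ZIP_RISK_PREMIUM = {"60653": 0.035, "60614": 0.000}  # historically redlined vs. not

BASE_RATE = 0.045  # baseline mortgage rate (invented)

def quoted_rate(ip_address: str) -> float:
    """Quote a rate using only the applicant's IP address.

    No field here mentions race. Yet because ZIP code correlates with
    the racial composition of neighborhoods, the quote can mirror the
    redlining maps of the twentieth century.
    """
    zip_code = IP_TO_ZIP[ip_address]
    return round(BASE_RATE + ZIP_RISK_PREMIUM[zip_code], 4)

# Two applicants with identical finances get different prices purely
# because of where their connection places them.
print(quoted_rate("203.0.113.7"))   # 0.08
print(quoted_rate("198.51.100.4"))  # 0.045
```

The point of the sketch is that the discriminatory variable never appears in the code; it rides in through a proxy.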
Humans write the code. Humans set up the servers. Humans make sure everything hums. Yet despite these obvious facts, there is a lengthy legal history of presuming machines to be independent and untainted. For example, University of Maryland Law Professors Danielle Keats Citron and Frank Pasquale have recently detailed how credit scoring systems, despite their supposedly objective simplicity, are “inevitably subjective and value-laden.” Although the systems were initially built to eliminate discriminatory practices, many are now accomplishing the opposite function. Systems can only be as free from bias as their software, and thus only as righteous as the values of developers and programmers. Even intent hardly matters, though. As mathematician Cathy O’Neil has written, seemingly neutral choices can have a disparate impact.
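O’Neil’s point about seemingly neutral choices can also be made concrete with a toy example. The sketch below applies a facially neutral screen to invented applicant data and checks the result against the “four-fifths” rule that U.S. regulators use as a rough disparate-impact test; every applicant and number here is hypothetical.

```python
# Hypothetical sketch: a facially neutral rule producing disparate
# impact, flagged by the EEOC's "four-fifths" rule of thumb.
# The applicant data below is invented for illustration.

applicants = [
    # (group, years at current address)
    ("A", 6), ("A", 8), ("A", 3), ("A", 7), ("A", 9),
    ("B", 1), ("B", 2), ("B", 6), ("B", 1), ("B", 2),
]

def approve(years_at_address: int) -> bool:
    # A "neutral" residential-stability screen: no protected trait appears.
    return years_at_address >= 5

def selection_rate(group: str) -> float:
    years = [y for g, y in applicants if g == group]
    return sum(approve(y) for y in years) / len(years)

rate_a = selection_rate("A")  # 0.8
rate_b = selection_rate("B")  # 0.2
# Four-fifths rule: flag disparate impact if the disadvantaged group's
# selection rate falls below 80% of the advantaged group's rate.
print(rate_b / rate_a < 0.8)  # True
```

If group B’s members move more often for reasons tied to historical housing discrimination, the “neutral” stability screen inherits that history, with no discriminatory intent anywhere in the code.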
Digital discrimination isn’t limited to the private sector, either. In an essay entitled “Big Data and Human Rights”, Virginia Eubanks, an Associate Professor of Women’s, Gender, and Sexuality Studies at SUNY Albany, notes that although New York State social service providers initially used algorithms to expose the discrimination of their own employees, the tables have turned. Faced with the fiscal burden of supplying benefits in the wake of recession, the state commissioned technologies that replaced the decisions of social workers, allowing bureaucracies to deny benefits behind the illusion of angelic machines. Now, computers, rather than human employees directly, can make choices about social spending based on built-in prejudices regarding welfare recipients.
The law generally ignores all of this. In his profound new book, The Black Box Society, Pasquale writes that although the majority of people hurt by the structure of a particular form of surveillance may be Black, Latino, women, LGBTQ, or people with disabilities, the tools used are almost entirely immune from scrutiny. Title VII of the Civil Rights Act of 1964 has been deemed “largely ill equipped” to address this sort of discrimination. There are few applicable laws to enforce–or at least laws that protect our privacy rather than corporate ways and means. For example, in the case of credit scoring, many credit bureaus are not required to reveal how they convert data into scores. In exquisitely Orwellian fashion, those processes are deemed “trade secrets”, shielded from the eyeballs of debtors.
In my last post for New Economic Perspectives, I argued that if one views the financial system through the lenses of Legal Realism and Modern Money, financial firms serve many of the functions typically associated with public utilities. One arrives at the conclusion that, in a sense, when chartered lending institutions create credit for certain people and not others, they are privatizing pieces of a commons. This fact is not bad in and of itself, but it should make us think critically about the financial-legal-digital matrix now increasingly mediating our lives.
For example, because there is no “equivalent of Medicare for housing” in this country, as Dr. King called for, many people rely on government-supported private lenders to finance their housing needs. In fact, politicians have increasingly voted to subsidize the private mortgage industry rather than create a true public option for housing. If, because of data-mining, some people have reasonable access to mortgages and others do not–say, merely because they are Black, or pregnant, or gender nonconforming–this process becomes tantamount to government-sanctioned denial of access to housing. Many people will rent space from a landlord if they can’t get a mortgage, of course, but the point is that broad swathes of the population are pushed toward a private market, and then subsequently shut out of it.
And so it goes. If social justice advocates don’t understand digital surveillance, we will always be one step behind at best. Government spending and private lending are inherently discriminatory. To a certain extent, they always will be. Yet technology is creating more warped forms of discrimination–in addition to reviving old ones. As Modern Money friends Sandy Darity and Darrick Hamilton have often noted, a large part of the racial wealth gap can be explained by redlining, as well as general housing and lending discrimination. These practices were supposed to have been eradicated by the Civil Rights Movement and subsequent legislation, leaving people to be discriminated against, at most, merely because they were poor, and not because of other factors. In the absence of massive social spending, we’re all supposed to get an equal shot at credit. That’s part of the American Dream. But minorities’ advancement may now be barred simply because they operate through the internet.
Data-gathering technology is rendering assaults on civil and economic rights increasingly difficult to untangle. Therefore, privacy advocates should provide auxiliary support to other social justice movements, especially #BlackLivesMatter. Equality advocates, likewise, should take up the call to check government and corporate surveillance.
There is plenty of anger in this country at invasions of privacy. There is justified outrage at inequity and inequality. Now there must be synthesis and alliance. As Michelle Alexander argued, we have to get out of our lanes. Indeed, we may no longer have a choice. When I was younger, it was trite to say the internet was an information superhighway. Today, the digital economy is an information labyrinth. We have to navigate it together: the idea of isolated lanes doesn’t even make sense anymore.