Big Data as a Civil Rights Issue...
In classes on Information Systems, we talk about the rising use of "Big Data" - enormous collections of data sets that are difficult to process using traditional database management tools or data processing applications, and which are increasingly used to find correlations that, for instance, spot business trends, personalize advertisements for individual Web users, combat crime, or determine real-time roadway traffic conditions.
But is "personalization" just a guise for discrimination?
That's the argument put forth in Alistair Croll's 2012 instant-classic post titled, "Big data is our generation's civil rights issue, and we don't know it". He goes on to argue that, although corporations market the practice of digital personalization as "better service", in practice this personalization allows for discrimination based on race, religion, gender, sexual orientation, and more.
The way this works: by mining Big Data, marketers can derive lists of "trigger words," purchases, searches, and activities that reliably indicate a person's race, gender, religion, sexual orientation, and so on. A marketing company then "personalizes" its efforts toward someone based on those inferred characteristics. And that is what makes it a civil rights issue.
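To make the mechanism concrete, here is a minimal sketch of trigger-word profiling. Everything in it is invented for illustration: the trigger words, the attribute labels, and the sample purchase history are hypothetical, not drawn from any real marketing system.

```python
# Hypothetical sketch: how "trigger word" profiling might work.
# The mapping below is invented; real systems mine such associations at scale.
TRIGGER_WORDS = {
    "halal market": "religion",
    "maternity wear": "gender",
    "pride festival": "sexual orientation",
}

def infer_attributes(purchase_history):
    """Return the protected attributes a marketer could infer from keywords."""
    inferred = set()
    for item in purchase_history:
        for trigger, attribute in TRIGGER_WORDS.items():
            if trigger in item.lower():
                inferred.add(attribute)
    return inferred

history = ["Maternity Wear - Spring Sale", "groceries at Halal Market"]
print(infer_attributes(history))
```

The point of the sketch is that no one ever asks the customer their religion or gender; simple keyword matching over transaction data infers it, and "personalization" then proceeds on that inference.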
For example, American Express uses customer purchase histories to adjust credit limits based on where a customer shops - and as a result, individuals have reportedly had their credit limits lowered because they live and shop in less-affluent neighborhoods, despite having excellent credit histories.
In another example, Chicago uses Big Data to create its "heat map". According to TechPresident, the heat map is "a list of more than 400 Chicago residents identified, through computer analysis, as being most likely to be involved in a shooting. The algorithm used by the police department, in an initiative funded by the National Institute of Justice, takes criminal offenses into account, as well as known acquaintances and their arrest histories. A 17-year-old girl made the list, as well as Robert McDaniel, a 22-year-old with only one misdemeanor conviction on his record."
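The actual NIJ-funded model is not public, so the following is only a toy reconstruction of the kind of ranking TechPresident describes: the score, weights, and sample data are all invented. What it does show is how a person with a minor record, like Robert McDaniel, can land on the list purely because of his acquaintances' arrest histories.

```python
# Hypothetical "heat list" sketch; features and weights are invented.
def risk_score(num_offenses, acquaintance_arrests):
    """Toy score combining a person's own record with their network's record."""
    return 2.0 * num_offenses + 1.0 * sum(acquaintance_arrests)

def heat_list(people, top_n=400):
    """Rank residents by score and keep the top N, as the city reportedly did."""
    ranked = sorted(
        people,
        key=lambda p: risk_score(p["offenses"], p["acquaintance_arrests"]),
        reverse=True,
    )
    return ranked[:top_n]

people = [
    # One misdemeanor, but a heavily arrested network dominates the score.
    {"name": "A", "offenses": 1, "acquaintance_arrests": [5, 4]},
    {"name": "B", "offenses": 4, "acquaintance_arrests": []},
]
print([p["name"] for p in heat_list(people, top_n=1)])
```

Person A outranks Person B despite having a quarter of the offenses, because the model counts who you know as much as what you did.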
In yet another example, a Wall Street Journal investigation in 2012 revealed that Staples displays different product prices to online consumers based on their location. Consumers living near another major office supply store like OfficeMax or Office Depot would usually see a lower price than those not near a direct competitor...
One consequence of this practice is that areas that saw the discounted price generally had a higher average income than in the areas that saw the higher prices...
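The logic the Journal uncovered can be sketched in a few lines. This is a plausible reconstruction, not Staples' actual code: the distance threshold and both prices are invented, since the real algorithm was never published.

```python
# Plausible sketch of location-based pricing as described in the WSJ report.
# The 20-mile radius and both price points are invented assumptions.
COMPETITOR_RADIUS_MILES = 20
LIST_PRICE = 15.79       # shown when no rival store is nearby
DISCOUNT_PRICE = 14.29   # shown to shoppers near a competitor

def quote_price(miles_to_nearest_competitor):
    """Show the lower price only to shoppers near a rival store."""
    if miles_to_nearest_competitor <= COMPETITOR_RADIUS_MILES:
        return DISCOUNT_PRICE
    return LIST_PRICE

print(quote_price(5))    # near an OfficeMax or Office Depot
print(quote_price(60))   # no competitor nearby
```

Note that nothing in the rule mentions income or race; it keys only on geography. The disparate impact arises because competitors cluster in affluent areas, so the "neutral" distance test quietly sorts customers by neighborhood wealth.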
Price discrimination (what economists call differential pricing) is illegal only when it is based on race, sex, national origin, or religion. Price discrimination based on ownership (for example, Orbitz showing more expensive hotel options to Mac users) or on place of residence, as in the Staples example, is technically okay in the eyes of the law...
However, when you consider that black Americans with incomes of more than $75,000 usually live in poorer areas than white Americans with incomes of only $40,000 a year, it is hard not to find Staples' price discrimination, well, discriminatory.
And in an especially frightening read earlier this month, The Atlantic published an article outlining how companies are using Big Data not only to exploit consumers, but also to exclude and alienate especially "undesirable" consumers.
The idea behind civil rights is that we should all be considered on an individual basis. People should not be treated differently solely due to their race, religion, gender, or sexual orientation. The Civil Rights Act of 1964 explicitly banned such differential treatment in the private sector. That is why there are no longer separate drinking fountains on the basis of race.
So as Big Data permeates society, and as algorithms and modelling techniques mine patterns to predict individual behavior, if those algorithms are indeed "personalizing" content on the basis of race, religion, gender, or sexual orientation, then how is it NOT discriminatory?
Just because it's the result of an algorithm doesn't make it OK. Algorithms are programmed by people, after all.