[–] Psycoth 0 points 3 points (+3|-0) ago 

I really don't like the article's use of the term "algorithmic bias". It makes it seem that algorithms that process data are intentionally discriminatory. It seems to me that these programs are simply surfacing the larger social issues that we have developed as a society.

E.g., with the Princeton Review issue:

The PR wants to do one thing: maximize profits. They're a business; that's what they do. To do this, they create an algorithm that adjusts price based on how much people are willing to pay for their product. I don't have access to how they arrived at the base price, but I'm guessing they used A/B testing to arrive at the conclusion that they can charge more or less based on zip code. Zip code is a mandatory field that people fill out when they order the product, and location-based demographic information is a common metric in data analysis.
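To make the mechanism concrete, here's a minimal sketch of what such a pricing algorithm could look like. Everything here is an assumption for illustration: the base price, the multipliers, and the idea of a per-zip lookup table are invented, since PR's actual logic isn't public.

```python
# Hypothetical sketch of zip-code-based price adjustment.
# All numbers and zip codes below are made up for illustration;
# the real Princeton Review pricing logic is not public.

BASE_PRICE = 600.00

# Suppose A/B testing produced a willingness-to-pay multiplier
# per zip code, defaulting to 1.0 for zips that weren't tested.
ZIP_MULTIPLIERS = {
    "10001": 1.15,  # tested segment that tolerated higher prices
    "73301": 0.90,  # tested segment that was price-sensitive
}

def quoted_price(zip_code: str) -> float:
    """Return the price shown to a customer in the given zip code."""
    return round(BASE_PRICE * ZIP_MULTIPLIERS.get(zip_code, 1.0), 2)
```

Note that nothing in this sketch looks at race directly; the racial disparity only emerges downstream, from who happens to live in which zip code.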

Once the data has been gathered and the algorithm put in place, what do we find? If you scrutinize the data with respect to race, you can come to the conclusion that Asians are being charged more. However, if you scrutinize the data to determine exactly what is going on, you might find that:

  • People tend to move to or live in areas populated by members of their own race, meaning that some zip codes will have higher concentrations of a specific race.
  • Asian people are willing to pay higher prices for SAT tutoring materials.

In this case, there's no racial bias imposed by the algorithm. It has just pointed out that some of our stereotypes about Asians may not be that far from the truth. Asking to remove this perceived bias is more like shooting the messenger than correcting for discrimination. This type of market analytics suggests that the US (and maybe other countries) suffers from cultural fragmentation and social inequality. These are the kinds of issues that need to be addressed on a different level. Forcing people to change these algorithms because we don't like what they may be exposing about our culture is a poor reaction.

This, of course, assumes that someone at PR didn't just walk in one day and say, "Hey, I bet we can charge Asians more money to use our product."