[–] ProgHog231 2 points (+2|-0) ago
As someone who has worked in this area for a couple of decades, I'm not sure that this is really news. What is probably different is the increased awareness by the press and the public about the development of such technologies and the wider use of analytics. As the article suggests, these problems are not technical, but have more to do with design, awareness of the law and best practices, and sufficient business oversight and review.
For example, a long-standing problem in mortgage lending was a practice called redlining: the illegal use of racial factors in lending decisions. By far the worst practices involved biased (and bigoted) human decisions, and in the 90s there was a pretty big crackdown. One response from lenders was a greater reliance on statistical models that would identify potential customers for 'invitation to apply' programs, and I worked with a number of financial institutions on developing these. Input variables typically included a range of household demographic attributes, and it was fairly straightforward to eliminate obvious sources of bias, such as race and ethnicity. As /u/Psycoth points out, though, using other ostensibly neutral factors (household income, education levels, etc.) could still cause problems, since neighborhoods with higher minority populations could index lower on many of these factors. More analysis on the back end of model development was required to make sure that the resulting selections did not wind up redlining those neighborhoods all over again.
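To make that back-end check concrete, here's a toy sketch in Python. The data, column names, and 0.5 minority-share cutoff are all hypothetical, and the 0.8 threshold is the EEOC "four-fifths" rule of thumb, which I'm borrowing for illustration, not describing as what any lender actually used:

```python
def selection_rate(rows):
    """Fraction of households the model selected for an invitation."""
    return sum(r["selected"] for r in rows) / len(rows)

def disparate_impact_ratio(rows, minority_share_cutoff=0.5):
    """Ratio of selection rates: high-minority vs. other neighborhoods.

    A ratio well below ~0.8 (the four-fifths rule of thumb) suggests the
    ostensibly neutral inputs may be acting as a proxy for race.
    """
    high = [r for r in rows if r["minority_share"] >= minority_share_cutoff]
    other = [r for r in rows if r["minority_share"] < minority_share_cutoff]
    return selection_rate(high) / selection_rate(other)

# Toy example: the model selects 2/4 households in low-minority tracts
# but only 1/4 in high-minority tracts, giving a ratio of 0.5.
rows = [
    {"minority_share": 0.7, "selected": 1},
    {"minority_share": 0.8, "selected": 0},
    {"minority_share": 0.6, "selected": 0},
    {"minority_share": 0.9, "selected": 0},
    {"minority_share": 0.1, "selected": 1},
    {"minority_share": 0.2, "selected": 1},
    {"minority_share": 0.3, "selected": 0},
    {"minority_share": 0.2, "selected": 0},
]
print(disparate_impact_ratio(rows))  # 0.25 / 0.5 = 0.5, a red flag
```

In practice this kind of audit was run over the actual selection lists against neighborhood-level demographics, not just per-applicant inputs, since that's where the proxy effects show up.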
TL;DR: not a new phenomenon, but I agree that companies need to be aware of unintended consequences and the law.