CC – Week 10: Data – Journal
Machine learning is much-hyped but nonetheless a truly unprecedented technology. Given sufficient training data, it can build models that answer questions humans or traditional algorithms never could. The industry’s marketing certainly promises a bright future powered by artificial intelligence, and every major tech company is racing toward that future as quickly as possible.
Machine learning, like everything else, is far from perfect. As much as its creators & prominent users like to promise otherwise, it carries many human problems over into its methods, bias chief among them.
We saw this play out prominently in the news this week with the Apple Card. David Heinemeier Hansson, the creator of Ruby on Rails & a white male, tweeted that even though his wife had a better credit score than he did, Apple Card gave her a hilariously lower credit limit. Many other users chimed in with similar experiences, concluding that Goldman Sachs’s algorithms behind the Card were sexist.
A Goldman spokesman responded: “In all cases, we have not and will not make decisions based on factors like gender.” But this is exactly the point: machine learning is excellent at exploiting “latent variables”, factors that are never explicitly identified but that can be inferred from a large dataset & used for correlations. So even if Apple/Goldman never input gender as a variable, the system can still discriminate by it. New York State’s Department of Financial Services has since opened an investigation.
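To make this concrete, here’s a minimal sketch (in Python, using numpy & scikit-learn) of how a latent variable leaks a protected attribute. Everything here is synthetic & hypothetical; the “proxy” feature stands in for anything that happens to correlate with gender in real data, like shopping patterns. This is not Goldman’s model, just an illustration of the mechanism:

```python
# Minimal sketch of "latent variable" discrimination: the model never
# sees gender, yet its outputs still split along gender lines because
# another feature correlates with it. All data is synthetic & hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (never given to the model): 0 = man, 1 = woman.
gender = rng.integers(0, 2, n)

# Credit scores are drawn identically for everyone -- no real difference.
credit_score = rng.normal(700, 50, n)

# A hypothetical proxy feature (e.g. spend share at certain retailers)
# that strongly correlates with gender in the training data.
proxy = gender + rng.normal(0, 0.3, n)

# Simulate biased historical decisions: holding score fixed, women were
# approved less often. The model will learn this pattern from the labels.
logit = 0.02 * (credit_score - 700) - 2.0 * gender
approved = rng.random(n) < 1 / (1 + np.exp(-logit))

# Train WITHOUT gender -- only score & the innocuous-looking proxy.
X = np.column_stack([credit_score, proxy])
model = LogisticRegression().fit(X, approved)

# Approval probabilities still diverge by gender, because the proxy
# lets the model effectively reconstruct the protected attribute.
pred = model.predict_proba(X)[:, 1]
print(f"mean approval prob, men:   {pred[gender == 0].mean():.2f}")
print(f"mean approval prob, women: {pred[gender == 1].mean():.2f}")
```

Run it and the model, which never saw the gender column, still produces systematically lower approval probabilities for women: the proxy feature lets it reconstruct gender from the rest of the data. Dropping the protected attribute isn’t enough; you have to audit the outputs by group.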
Machine learning is trendy, and it has countless incredible applications. It truly can help us build parts of a better future, or simply make our phones smarter. But we must watch its use carefully and rigorously examine its results and applications. Otherwise, we’re propagating the very same biases we were attempting to avoid with our “smarter”, “intelligent” systems.