How Are We Solving Inequality with AI?

ODSC - Open Data Science
6 min read · Sep 27, 2019

When people talk about artificial intelligence, they often praise it as an unemotional, efficient task manager, and, to an extent, that's what it is. But when we think beyond Siri or Alexa, we can see real ways people are solving inequality with AI. In this article, we'll look at a few ways AI is already addressing real sociological problems of inequality, as well as consider some ways it could be applied in the future.

3 Ways We’re Already Solving Inequality with AI

Automating Court and Hiring Processes

One AI application that's gotten a lot of media coverage is the automation of court and hiring processes. The concept is that companies or governments can use machine learning to predict which candidates will make good hires, or how likely a defendant is to re-offend when deciding on a sentence, which, in turn, would reduce individual biases. Unfortunately, most of the press has been negative due to data bias: models trained on historically biased records learn to rate men as better candidates, or white defendants as lower-risk and therefore deserving of shorter sentences.

Despite the rocky start to this application, companies are now retraining their models to account for that bias, and the approach still holds real potential.

In industries that have always been dominated by one gender (for a variety of reasons, usually unrelated to any higher aptitude for the job), hiring managers may be biased against the other gender and turn down qualified candidates. By incorporating AI into this process, we can reduce that gender bias, creating a company where candidates are truly hired on merit rather than assumptions.

In government, this technology could be applied in many ways. These include basics like saving money by automating menial tasks, but the greatest impact may come in the prison and court system. There have been many cases where judges appear to hand down lighter sentences to defendants of one race than another. This isn't necessarily deliberate racism; it can be rooted in stereotypes and biases that are hard for us to detect in ourselves. One company is exploring the idea of applying machine learning to the
