Predictive Analytics in Child Welfare – Helping Hand, or Racial Bias?

[Image: a family cradled in a pair of hands]
Does the use of predictive analytics in child welfare end up racially profiling families?

In 2014, then-U.S. Attorney General Eric Holder expressed open concern to the U.S. Sentencing Commission about the use of predictive analytics in the criminal justice system. For those of you who haven’t encountered this term before, predictive analytics refers to an advanced form of analytics used to make predictions about unknown future events. It draws on a wide variety of techniques, including data mining, statistics, modeling, machine learning, and artificial intelligence, to analyze current and historical data and then make predictions about the future.
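As a purely illustrative sketch of what that workflow typically looks like (not any specific agency's system), a model is fitted to historical records and then used to score new cases. The feature names and numbers below are invented for illustration only.

```python
# Hypothetical sketch of a predictive-analytics workflow: fit a model on
# historical outcomes, then score a new, unseen case. All data is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical cases: each row is [prior_referrals, age_of_parent, household_income_k]
X_history = np.array([
    [0, 34, 62],
    [2, 22, 18],
    [1, 41, 45],
    [3, 27, 15],
])
y_history = np.array([0, 1, 0, 1])  # 1 = an adverse outcome was later recorded

model = LogisticRegression().fit(X_history, y_history)

# "Predict the future" for a new case: the model returns a probability,
# which a caseworker or court may see only as a single risk score.
new_case = np.array([[1, 30, 20]])
risk_score = model.predict_proba(new_case)[0, 1]
print(f"Predicted risk score: {risk_score:.2f}")
```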

Although predictive analytics was originally presented as a beneficial intervention in the criminal justice system, one that would allow agencies to reduce incarceration and direct rehabilitation and support services to defendants in need, the truth was something else altogether. What actually took place was, in effect, computerized racial profiling.

Holder’s concern was that the use of predictive analytics would undermine the justice system’s efforts to provide equal justice to all. By profiling people without regard for their circumstances or their individuality, it could in fact exacerbate unjust disparities that are already far too common, both in society and in the criminal justice system.

What makes this so dangerous is that the risk assessment generated by predictive analytics whenever someone is arrested takes the color of their skin into account, whether directly or through closely correlated factors. That factor should have no bearing on the threat a person poses to society, yet it demonstrably affects the result. For example, a black person convicted of a crime is often ranked as a higher risk of reoffending than a white person convicted of exactly the same crime.
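To make that mechanism concrete, here is a deliberately simplified, hypothetical illustration: when a model's inputs include a feature that encodes race, or one strongly correlated with it, two otherwise identical cases can receive different scores. The weights and feature names below are invented; no real tool is being reproduced.

```python
# Hypothetical illustration only: a toy linear risk score in which one input
# is correlated with race. Otherwise identical cases receive different scores.
weights = {
    "prior_convictions": 0.8,
    "age_at_arrest": -0.05,
    "neighborhood_arrest_rate": 0.6,  # a proxy that can track race via policing patterns
}

def risk_score(case):
    """Weighted sum of features; higher means 'higher risk' to the tool."""
    return sum(weights[name] * value for name, value in case.items())

# Two defendants with the same record and age, differing only in the proxy feature.
case_a = {"prior_convictions": 1, "age_at_arrest": 25, "neighborhood_arrest_rate": 0.9}
case_b = {"prior_convictions": 1, "age_at_arrest": 25, "neighborhood_arrest_rate": 0.2}

print(risk_score(case_a))  # higher score
print(risk_score(case_b))  # lower score, same person in every other respect
```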

Terrifying as the implications are for the criminal justice system, it is equally alarming to realize that this same approach is now being used in child welfare to flag parents and families as being at higher risk of future abuse. While it might sound sensible to let a dispassionate computer, unaffected by personal bias and racial tensions, make those kinds of judgments, it turns out that software harbors the biases of the people who program it and of the data it is trained on. This is not just unfair, but also unconstitutional.

Professor Sonja Starr of the University of Michigan Law School has written about the disturbing trend of using predictive analytics to flag people as “high risk” or “low risk,” and has pointed out that the factors driving that label are usually closely correlated with poverty. In other words, according to the risk assessment software, poor people are automatically at higher risk both for criminal behavior and, in the child welfare context, for abuse and neglect.
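A hypothetical sketch of Starr's point: even when race and income are excluded as inputs, the administrative records that feed these tools (benefit receipt, prior agency contact, evictions) are generated mainly when poor families interact with public systems, so a score built on them rises with poverty almost by construction. The feature names and weights below are invented.

```python
# Hypothetical illustration: none of these features is "income" or "race",
# yet each one is produced mostly by contact with public systems that
# wealthier families rarely touch, so the score ends up tracking poverty.
POVERTY_LINKED_FEATURES = {
    "public_benefit_receipt": 0.7,
    "prior_cps_referrals": 0.9,
    "evictions_on_record": 0.5,
    "public_housing_resident": 0.4,
}

def family_risk_score(family_record):
    """Toy additive score over administrative-data features."""
    return sum(
        weight * family_record.get(feature, 0)
        for feature, weight in POVERTY_LINKED_FEATURES.items()
    )

# A struggling family visible to public agencies versus a wealthier family
# whose similar struggles (private therapy, family loans) leave no records.
visible_family = {"public_benefit_receipt": 1, "prior_cps_referrals": 2, "evictions_on_record": 1}
invisible_family = {}  # same underlying problems, handled privately

print(family_risk_score(visible_family))    # high score
print(family_risk_score(invisible_family))  # zero, not because there is no risk
```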

Join us next time, when we will continue this difficult discussion and look at how predictive analytics is being used to label children and families as “at risk,” and how this amounts to nothing more than a form of computer-generated racial profiling.

