California Department of Social Services (CDSS) Deputy Director Greg Rose, who oversees the state’s foster care system, says that the state’s new predictive risk modeling project is designed to give social workers better information about past child welfare cases when they first field a call about child abuse or neglect.
“There’s data that we ought to be better utilizing,” Rose said.
The proof-of-concept project comes at a time when the lure of using complex algorithms and data models to anticipate which children are at greatest risk of being abused is strong. Leaders associated with the initiative are hoping that an emphasis on transparency will assuage the fears of critics who say that predictive analytics is a scary proposition straight out of a dystopian science-fiction tale. The use of predictive risk modeling, they say, could lead to heightened scrutiny of poor families of color, with more children removed from at-risk families based on a mysterious mathematical formula.
“Just because you’re poor, you have multiple children and you’re on public benefits, all of a sudden you’re at risk; we already know that,” said Kathy Icenhower, CEO of SHIELDS for Families, an organization that works with families in South Los Angeles. “That doesn’t mean that everybody that meets those criteria should be suspected of child abuse or neglect.”
Despite these well-known concerns, California is forging ahead with the development of a tool of its own.
With a $300,000 grant from the California Department of Social Services and the Laura and John Arnold Foundation, a team of researchers led by Emily Putnam-Hornstein, co-director of the Children’s Data Network at the University of Southern California, is building and testing a data analytics tool to help child abuse investigators gauge the risk of maltreatment when a report of child abuse or neglect is made.
“We are trying to figure out when we are first screening a report of abuse or neglect, whether this is a child or a family where we’re going to see them reported again in six months, again in 12 months, or whether this is a family and child where we’re going to see the child placed in foster care, but maybe a few years down the road,” Putnam-Hornstein said.
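To make that screening idea concrete, here is a minimal, purely illustrative sketch of the kind of model being described: a classifier that, at the moment a report is screened, estimates the probability of a re-report within six months. The features (prior_referrals, child_age_years, prior_substantiation), the synthetic data and the choice of logistic regression are all assumptions made for illustration; none of this reflects the Children’s Data Network’s actual tool or data.

```python
# Hypothetical sketch only: estimate the probability that a screened
# report is followed by another report within six months. The features,
# data and model choice are invented stand-ins, not the actual project.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in for linked administrative records. A real project
# would draw on prior-referral histories, birth records and benefit
# data; these three columns are placeholders.
n = 5000
X = np.column_stack([
    rng.poisson(1.0, n),      # prior_referrals: count of earlier reports
    rng.integers(0, 18, n),   # child_age_years
    rng.integers(0, 2, n),    # prior_substantiation: yes/no
])

# Synthetic outcome: re-report within six months, loosely tied to priors.
logits = 0.8 * X[:, 0] + 1.2 * X[:, 2] - 2.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Score held-out reports: the output is a probability a screener could
# read alongside the allegation itself, not a removal decision.
risk = model.predict_proba(X_test)[:, 1]
print(f"AUC on held-out reports: {roc_auc_score(y_test, risk):.2f}")
```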
Within the next 18 months, Putnam-Hornstein and her team aim to have a powerful predictive tool that they can share with the state and its 58 counties.
“If there is information relevant to the screening of that allegation and the future safety and potential harm to that child that may exist in other data systems, that can be really, really important to making good decisions,” she said.
Interest in the use of predictive analytics in child welfare has been growing in recent years. In March, the Commission to Eliminate Child Abuse and Neglect Fatalities, a group of national child welfare leaders convened by Congress and the White House to study child fatalities nationwide, released its report. The report hailed the use of predictive analytics to identify families at highest risk for fatalities in Hillsborough County, Florida. It also recommended greater use of predictive tools to identify children at greatest risk of being killed as a result of child abuse.
In Pennsylvania, Allegheny County has been a leading proponent of predictive analytics. There, social workers use vast data sets to assess the risk to a child reported to its child-abuse hotline, a model that Putnam-Hornstein hopes to build on.
Rose, from the state’s Department of Social Services, said that the state is taking a “cautious and measured” approach to the project.
“As a department, we won’t move forward with anything related to predictive analytics or predictive risk modeling that’s done in a non-transparent way,” Rose said. “Unlike some of the commercial products out there, we feel really strongly that if we’re going to use any predictive analytics or predictive risk modeling, that the community writ large has to really understand what it is we’re doing, how we’re going about it, and see its value so it’s not misrepresented in some kind of way.”
At the end of December, CDSS invited a host of California child-welfare leaders and advocates to participate in an advisory group for the predictive modeling project. At the meeting, Putnam-Hornstein promised that the group would have a chance to review initial results at a forthcoming meeting in May and provide feedback on the tool. She also said that the project will hire an ethicist to grapple with the thorny issues of bias, profiling and risk presented by the use of a predictive analytics tool.
But Icenhower of SHIELDS for Families says she remains skeptical of the project and its promise for keeping families out of the child-welfare system.
“We have deep, deep concerns about how they’re deciding to use the data that they have,” Icenhower said. “Using the data to essentially target who should be in the system — that to me is a backwards way of using the data.”
As the predictive analytics project moves ahead, many will be watching to see how the state responds to those worries.
By Jeremy Loudenback
California Bets on Big Data to Predict Child Abuse was originally published at The Chronicle of Social Change and has been syndicated with permission.
Comments
Transparency is in the procedural pudding, not the promises made beforehand. Cal shall see. I am certain there will be disputes about what they all see.
Someone tell Kathy Icenhower that even Little Data is used “to essentially target who should be in the system — that to me is a backwards way of using the data.” How else can evidence be used? Unless you have a system where one requires evidence only to decide who should not be in the system. That sounds more like draconian mission creep than “targeting” the most troublesome cases. Napoleonic law: prove innocence against “we were called to investigate, you must be guilty…”
CPS needs some TRIAGE. This might just take the workers off the forever hotseat of “did we do the right thing?”
What I have not seen employed is a number of observers making independent evaluations of the data in the case at hand. How many eyes look at and grade the severity of a case? Sociologists investigating crime and neighborhoods (those poor ones that might have a few welfare concerns) train and cross-index the grades of various observers. How many or how few caseworkers have looked before a case moves to consideration of action? My CASA work quickly demonstrated biases, cultural biases even without race involved. Income and religion both had impacts.