The mother’s age. The parents’ history with drug or alcohol abuse. The young person’s experiences with child protective services.
These are among the hundreds of factors that could be digitally analyzed to predict whether a child is at risk of being abused or neglected — and child welfare agencies in places like Pennsylvania, Florida and California have been exploring whether this “big data” can be harnessed to protect the most vulnerable kids.
“We have checklists that help our child welfare workers gather information,” said Emily Putnam-Hornstein, an associate professor at USC’s School of Social Work. “But we often fail to assemble and make good use of historical data.”
Analyzing historical data to forecast an event is called predictive analytics, and it’s already being used in some industries; credit card companies, for instance, use predictive analytics tools to detect fraud. In the case of child and family services, these tools could be used to process historical data culled from multiple sources — such as birth and arrest records — to help flag cases where a parent or caregiver is most likely to harm or neglect a child again.
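To make the mechanics concrete, here is a minimal sketch, in Python, of how such a tool is typically built: historical records are assembled into a feature table, and a classifier is trained to estimate the probability of a future adverse event. Everything here, from the feature names to the synthetic data and the model choice, is hypothetical and illustrative, not any agency’s actual system.

```python
# Minimal sketch of a predictive-analytics workflow (hypothetical data).
# Not any agency's actual model; all feature names are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000

# Synthetic "historical" features, loosely mirroring the kinds of
# factors named above: caregiver age, prior referrals, and so on.
X = np.column_stack([
    rng.normal(28, 6, n),    # caregiver_age (hypothetical)
    rng.poisson(0.5, n),     # prior_referrals (hypothetical)
    rng.integers(0, 2, n),   # prior_substance_abuse_flag (hypothetical)
])

# Synthetic outcome: 1 if the case was later re-reported.
logits = -3.0 + 0.8 * X[:, 1] + 1.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# The model outputs a probability per case, which a screener could
# consult alongside (not instead of) their own judgment.
risk = model.predict_proba(X_test)[:, 1]
print("AUC:", round(roc_auc_score(y_test, risk), 3))
```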
The interest in predictive analytics comes as local child welfare agencies continue to face scrutiny. An estimated four to eight children die every day from abuse or neglect in the United States, according to a national report last year by a commission appointed by Congress and then-President Obama.
In one high-profile case last year, an 11-year-old boy was found dead in his mother’s closet in Los Angeles. Yonatan Daniel Aguilar, who weighed 34 pounds at the time, had been reported to the county’s Department of Children and Family Services six times, but his case was never thoroughly investigated, according to the Los Angeles Times.
“Anytime a tragedy happens, the community wonders, ‘How did we not connect the dots? How did we miss that this child was in danger despite multiple referrals and other red flags?’” said Putnam-Hornstein, who co-directs the Children’s Data Network, a team of researchers studying how data can be applied to make better decisions about the health and well-being of kids.
With a grant awarded last year by the Laura and John Arnold Foundation and the California Department of Social Services’ Office of Child Abuse Prevention, the Children’s Data Network is building and testing a predictive analytics program that could eventually be deployed across California’s 58 counties. The tool runs historical child welfare data through a computer algorithm to estimate the likelihood that a child will be re-reported to child welfare agencies.
Such a tool could make a difference in how future cases of abuse or neglect are evaluated. Currently, social workers can have varying responses when they look into a child’s situation, depending on their personal biases, level of experience or the information they rely on. With predictive analytics, the same standards are consistently applied. A computer can also quickly assess reams of data, connecting dots that a social worker might miss.
Critics, however, caution that the technology could reach too far, comparing it to “Minority Report,” the futuristic Tom Cruise movie about a program that predicts crimes before they occur. Skeptics also question whether such a program could be embedded with bias, leading to heightened scrutiny of poor families, particularly poor families of color. Could the technology prompt investigations that aren’t warranted, and therefore needlessly scar children? Could the results be relied on too much, so that they override a social worker’s own judgment, training and intuition?
These concerns are among the reasons Los Angeles County is taking a cautious approach to predictive analytics tools. In 2014, Los Angeles County’s Department of Children and Family Services teamed up with SAS Institute, a technology company based in Cary, North Carolina, to develop a pilot project using predictive analytics.
The results were mixed. Though the experimental program correctly identified a high percentage of children who were at risk of abuse, it also flagged nearly 4,000 cases that were unfounded. Dissatisfied with the results, county officials are setting aside the system until it can be refined. In a recent report, Judge Michael Nash, the executive director of the Office of Child Protection, recommended establishing clear standards, such as determining who will receive the information and how it will be used, before any program is rolled out.
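The tension LA County encountered, catching most truly at-risk children while also generating thousands of unfounded flags, is the familiar precision/recall tradeoff: lowering the alert threshold catches more real cases but multiplies false positives. A small hypothetical illustration, with all numbers made up for the sketch:

```python
# Hypothetical illustration of the precision/recall tradeoff described
# above: a lower alert threshold raises recall (fewer missed children)
# but also produces more unfounded flags (false positives).
import numpy as np
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(1)
y_true = rng.random(20000) < 0.03               # ~3% truly at risk (made up)
scores = np.where(y_true,
                  rng.beta(5, 2, y_true.shape),  # at-risk cases score higher
                  rng.beta(2, 5, y_true.shape))

for threshold in (0.8, 0.6, 0.4):
    flagged = scores >= threshold
    print(f"threshold={threshold}: "
          f"flags={flagged.sum()}, "
          f"recall={recall_score(y_true, flagged):.2f}, "
          f"precision={precision_score(y_true, flagged):.2f}, "
          f"unfounded={int((flagged & ~y_true).sum())}")
```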
Despite these concerns, other jurisdictions are moving forward with predictive analytics. One of the first counties in the nation to adopt it was Pennsylvania’s Allegheny County, which launched its system last summer and continues to use it.
In a report this spring, the county detailed how the new system is being used and the precautions being taken: When a call comes through its hotline, a screener collects information and assesses the case. Next, the child’s identity is run through its predictive analytics tool. The case is given a score from 1 to 20; the higher the score, the greater the likelihood that the child is at risk of abuse or neglect. Finally, the screener, in consultation with a supervisor, decides whether the case should be investigated further.
The county emphasized that the new data analytics tool isn’t meant to replace human decisions — a case with a low score could still trigger an investigation. Rather, the new program should help its staff make better, more informed decisions.
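One plausible way to implement a 1-to-20 score of this kind is to bin a model’s risk probability into 20 equal-width levels and leave the final call to the screener. The binning scheme and the threshold below are assumptions made for illustration; the county’s report does not specify how its score is computed.

```python
# Hedged sketch of a 1-to-20 screening score with a human in the loop.
# The equal-width binning and the score-15 threshold are assumptions,
# not Allegheny County's actual method.

def screening_score(probability: float) -> int:
    """Map a risk probability in [0, 1] to an integer score 1..20."""
    return min(20, int(probability * 20) + 1)

def screen_call(probability: float, screener_override: bool = False) -> str:
    """The score informs, but never replaces, the human decision:
    a screener and supervisor can still investigate a low-scoring case."""
    score = screening_score(probability)
    if score >= 15 or screener_override:
        return f"score {score}: recommend investigation"
    return f"score {score}: screener and supervisor decide"

print(screen_call(0.82))                          # high score -> investigate
print(screen_call(0.10, screener_override=True))  # human judgment prevails
```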
In an ethical review of the new program, researchers concluded that there will always be the possibility that the wrong decision could be made: “The question is, how can we make the fewest errors in our efforts to protect children and families? (Predictive analytics) seems an ethical and potentially important contribution to that effort.”