Can a computer program help predict child abuse? This Pa. county thinks so

By KIM STRONG, York Daily Record

YORK, Pennsylvania (AP) – When a 4-year-old girl drowned in her swimming pool near Pittsburgh on an August day in 2019, the family was already known to the county’s child welfare agency.

Between December 2017 and July 2019, six reports of “lack of supervision” in the home were made to Allegheny County’s Department of Human Services, and five of those reports were determined to be “unfounded” or “invalid,” according to a state document. The sixth call was investigated, and the family had an open child protective services case when Ca-Niyah Mitchell slipped into the pool without a life jacket while her father was at the home.

Charles Mitchell pleaded guilty to child endangerment last year and received a suspended sentence.

This is the kind of child welfare horror story in which the red flags showing a child was in danger were raised but missed, screened out by the person who answered the call and put on the “unfounded” pile.

When there are serious cases of neglect and abuse in Allegheny County, “we scrutinize them with tons of stakeholders, and that’s the only time we see these calls for abuse and neglect that we screened out,” said Erin Dalton, director of the county’s Department of Human Services.

Dalton and her predecessor, Marc Cherna, set out to change that six years ago, working with engineers in New Zealand to develop an algorithm, a predictive analytics tool, to aid decision-making for the person reviewing these critical calls.

It has its critics, but Dalton remains the algorithm’s greatest cheerleader.

Allegheny County uses an algorithm to make a critical decision about a child’s well-being.

The critical call

There is a critical moment in child welfare that happens when a phone screener receives a call from someone reporting child neglect or abuse.

A decision must be made whether to open an investigation or screen out the case. In Allegheny County, screeners looking at the facts of a case often leaned in the wrong direction, even when the decision was informed by the family’s history.

An analysis by the county’s DHS five years ago found that DHS investigated 27% of the child welfare cases with the highest risk and 48% of the cases with the lowest risk.

“Our job is first and foremost to keep the children safe,” said Dalton.

The county decided to put to use the data it had been collecting on its families since 1998.

DHS screeners had used this information to make their decisions at the critical moment when an abuse or neglect call was either sent on for investigation or screened out. It was a lot of information, however, and the screeners had limited time to sift through it all, Dalton said.

DHS decided that the use of that information could be automated. A partnership with Auckland University of Technology resulted in the Allegheny Family Screening Tool, which assigns a number to a child based on the information entered into the system. The data come from publicly available records.

The algorithm assigns the child a score that predicts the likelihood the family will be referred to DHS again; the higher the number, the greater the likelihood.
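As a rough illustration only, and not the county’s actual model: a screening score of this kind is typically produced by feeding administrative-record variables into a statistical model and mapping the predicted probability onto a small integer scale shown to the screener. In the hypothetical Python sketch below, the feature names, weights, and 1-to-20 scale are invented for illustration.

# Illustrative sketch only -- NOT the Allegheny Family Screening Tool.
# Feature names, weights, and the 1-20 scale are hypothetical.
import math

WEIGHTS = {                          # hypothetical coefficients a model might learn
    "prior_referrals": 0.45,         # e.g., past calls to DHS about the family
    "prior_unfounded_referrals": 0.20,
    "public_benefits_enrollment": 0.15,
    "household_size": 0.05,
}
INTERCEPT = -2.0

def referral_probability(features):
    """Logistic-regression-style estimate of the chance of another referral."""
    z = INTERCEPT + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def screening_score(features, buckets=20):
    """Map the probability onto a 1..buckets score shown to the call screener."""
    p = referral_probability(features)
    return max(1, min(buckets, math.ceil(p * buckets)))

# Example: a family with several prior reports gets a higher score.
print(screening_score({"prior_referrals": 6, "prior_unfounded_referrals": 5,
                       "public_benefits_enrollment": 1, "household_size": 4}))

The real tool reportedly draws on many more variables from county administrative records; the sketch only shows the general mechanics of turning a predicted probability into a screening score.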

“It’s just that people aren’t good at it. They have their own biases. So, having a tool like this that can give that kind of information to really talented people really changes everything,” said Dalton.


Fixing what’s broken

“I was hired 25 years ago to fix child welfare. We were known as a national disgrace,” Marc Cherna wrote to his colleagues last year when he announced his retirement as head of Allegheny County’s Department of Human Services.

When he took over the department, Allegheny County children were being removed from their homes and placed in foster care at shockingly high rates, and Cherna tried to change that.

“He didn’t go far enough (to fix the system) and then he thinks we still can’t find all the horror stories, so we’re going to use the algorithm,” said Richard Wexler, executive director of the National Coalition for Child Protection Reform.

Wexler and other critics of the algorithm believe it has biases of its own and is driving more children out of their homes because they are poor, not because they are in danger.

“If this is so great, then why, by and large, are the people who are most excited about it people whose testimony or track records show a strong propensity for taking children away (from their homes)? If this is really a way to preserve families, then why isn’t the family preservation movement leading the charge for it? We’re not. Why isn’t the racial justice movement leading the charge for this, instead of saying, hey, we know what happened in the criminal justice system, why do we think it will be different in child welfare?” he said.

“They destroy families. They traumatize children emotionally, expose children to high rates of foster care abuse, and at the same time make it harder to find the few children who really do need to be saved. … Wrongful removal leads to all of these problems,” said Wexler.

What’s wrong with algorithms?

According to Nicol Turner-Lee, director of the Center for Technology Innovation and a senior fellow in Governance Studies at the Brookings Institution, one of the problems with algorithms in general is the data they are fed.

“First and foremost, computers don’t discriminate; people do. The people behind these models can bring explicit and implicit biases that get built into the model,” she said.

The data used by the algorithms are primarily public data, so a poor family that uses government services for food, housing, drug and alcohol counseling, and mental health treatment has far more data in the system than a wealthier family that uses private insurance for counseling and treatment.

“The computer technology being used takes on the face of these communities, so unfortunately any algorithm can end up looking like a criminal justice algorithm, carrying a historical legacy that is embedded in unfair systems,” said Turner-Lee.

“I have a keen interest not only in addressing the output side of the problem, which is the ultimate prediction that can have a disproportionate impact on vulnerable populations, but also in the design and evaluation of these products. Who is sitting at the table? How are they built? What questions are they trying to solve? And whether or not they are approached from a different perspective,” she said.

Political scientist and author Virginia Eubanks has written a book on predictive algorithms, “Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor,” which documents the problems these predictive tools have run into in three places, including Allegheny County and its Family Screening Tool.

“The belief that big data, algorithmic decision-making, and predictive analytics can solve our thorniest social problems – poverty, homelessness, and violence – resonates deeply with our beliefs as a culture. But that faith is misplaced,” Eubanks wrote in an article in Wired magazine. “These high hopes rest on the premise that digital decision-making is inherently more transparent, accountable, and fair than human decision-making. But, as data scientist Cathy O’Neil has written, ‘models are opinions embedded in mathematics.’”

She continues: “Allegheny County has an incredible amount of information about the use of public programs. However, the county does not have access to data about people who do not use public services. Parents who receive private drug treatment, mental health counseling, or financial support do not appear in the DHS data. Because variables describing their behavior were neither defined nor included in the regression, crucial pieces of the child abuse puzzle are left out of the AFST.”

Reducing racial disparities

The University of Pittsburgh hosts a task force dealing with algorithms used by government agencies, including the Allegheny Family Screening Tool, as the use of algorithms becomes more common.

“When I think about this system and others, I’ve had this kind of framing in my head: What is it replacing? What was the legacy, the human decision-making process? Does this thing have any advantages over that? And for the screening tool, the county has shown some data that it has reduced racial disparities. That’s what we want in a system like this,” said Chris Deluzio, who serves on the task force and is policy director at Pitt’s Institute for Cyber Law, Policy, and Security.

The task force is working on a report on the child welfare algorithm to be published later this year.

In an independent ethical analysis of the tool, two professors concluded that “the tool is ethically appropriate, particularly because its accuracy exceeded the alternatives at the time and there would be ethical problems not to use the most accurate measure,” according to Allegheny County’s DHS.

The state also supports the tool: “The Department of Human Services supports Allegheny County’s efforts to protect children and strengthen families. DHS has taken some initial steps to explore predictive risk modeling, but there are no immediate plans to develop a statewide model,” said spokeswoman Erin James.

