Investigations of crimes involving children are difficult for everyone involved to endure. Interviewing a child in order to establish evidence for a trial, for one, is a task that is often challenging for law enforcement as well as victims. For a variety of reasons, children may also be reluctant to talk about what's happened to them. New research shows how robots could help police interview kids involved in child abuse cases, which could mitigate some of those challenges but potentially introduce some new ones, too.
In 2014, 702,000 victims of child abuse were reported to child protective services, according to the CDC. No one can be sure how many more cases go unreported, but for those who do become involved in an investigation, the process can be devastating no matter the outcome. Even the most seasoned police detectives can find it difficult to interview a child — not just because it's an emotionally sensitive situation, but because getting reliable information during interviews to be submitted as evidence in court can be a daunting task. Children can be very susceptible to suggestion and often struggle to understand that they aren't in trouble themselves, according to resources from the Child Welfare Information Gateway. Because of this, they may be reluctant to talk to an adult they don't know and share intimate details about any abuse they've experienced — which, oftentimes, they've been explicitly instructed not to discuss.
For investigators, getting information that is usable in court is essential to taking a child abuse case to trial. Getting misinformation, or coerced responses, isn't just dangerous for the child, but could also lead to a false conviction of an innocent party, according to New Scientist. Keeping these challenges in mind, researchers at Mississippi State University's Social, Therapeutic, and Robotic Systems Lab wondered if they could use artificial intelligence in the form of friendly robots to help interview children.
The researchers thought that using robots would address the issue of neutrality (because robots lack any emotional involvement in the case) and might also be less threatening to children. One of the robots in use for the study is a 2-foot-tall Nao robot, which the team equipped with forensic interviewing software, cameras, and voice recording, all of which can be remotely operated by a human investigator. Cindy Bethel, a forensic interviewer and associate professor at the university, discussed the research with science website Seeker.
Unfortunately, the challenge is far from solved: while robots may fix some of the issues related to human fallibility and bias, bringing AI into the equation also presents challenges of its own. The most obvious is that, since the robots appear toy-like, kids may regard them as toys and not engage with them in a meaningful way for the purpose of information collection, as pointed out by Seeker.
Kids could also view the robot in such a playful way that they are encouraged to make things up to talk about so that they can continue to engage with it. There's also the risk (as there would be with human interviewers) that the robot's detached and direct style of questioning could deceive a child into revealing something they did not willingly want to discuss — and any information acquired through deception wouldn't be admissible in court, according to the American Psychological Association.
The findings were presented at the Conference on Human Robot Interaction, where some researchers pointed out another possible pitfall: given the highly sensitive nature of child abuse investigations, it's sometimes the emotional support and connection with trusted adults that make all the difference for a vulnerable child. Robots, which lack empathy by design, would not be able to provide that.
While it's unlikely that robots will replace humans in these types of investigations entirely, they certainly may be an asset in some cases. And one thing many can agree on is that any research designed to find better ways to protect children, and bring them justice, is worthy of further consideration.