A father photographed his son naked for a medical examination in the United States; Google saw a crime

Mark noticed something was wrong with his son. The two-year-old’s penis looked swollen and sore. Mark, who lives in San Francisco and takes care of the boy, pulled out his Android smartphone and took photos to document the problem so he could track its progress.

It was a Friday night in February 2021. Mark’s wife called the doctor to set up an emergency appointment, via video link – they were in the middle of the pandemic. A nurse asked them to send the pictures so the doctor could see them before the appointment.

Mark’s wife took her husband’s cell phone and sent close-ups of their son’s groin area to her iPhone so she could upload them to the office’s messaging system.

That episode cost Mark more than a decade of contacts, emails, and photos, and made him the target of an investigation. The man – who asked not to be identified except by his first name – was caught in an algorithmic net designed to snare people who share child sexual abuse material.

Because they handle such a large volume of data, tech companies have been pressured to scrutinize the material passing through their servers to detect and prevent criminal behavior. Advocates for children and teens say this collaboration is essential to combating the spread of sexual abuse images online.

But that scrutiny can mean looking into private files, and in at least two episodes highlighted by The New York Times, it has led to innocent behavior being viewed in a sinister light.

Jon Callas, a technologist at the Electronic Frontier Foundation, a digital civil liberties organization, says the cases are warning signs for this kind of scanning.

Mark, who is in his forties, created a Gmail account in the mid-2000s and has used Google heavily ever since. Two days after he took the photos, Mark’s phone sounded a notification: his account had been disabled for “harmful content” that constituted a “serious violation of Google’s policies and may have been illegal.”

A link led to a list of possible causes, including “child sexual abuse and exploitation.” Mark was confused at first, but then remembered his son’s injury. He thought, “Oh my God, Google must have thought this was child pornography.”

He filled out a Google review request form and explained the boy’s injury. At the same time, he discovered the cascading consequences of Google’s rejection. Not only did he lose emails, contact details of friends and former colleagues, and documents from his son’s early years, but his Google Fi account was closed – he had to get a new number from another carrier.

Without access to his phone and email, he was unable to obtain the passwords he needed to access his other online accounts. He was locked out of much of his digital life.

“Child sexual abuse material is abhorrent,” Google said in a statement. “We are committed to preventing it from spreading on our platform.”

A few days after the appeal was filed, Google responded that it would not reinstate Mark’s account, offering no further explanation.

Meanwhile, the same scenario was playing out in Texas. A two-year-old there also had an infection in his “intimate parts,” his father wrote in an online post I came across while reporting Mark’s story. At the pediatrician’s request, Cassio – who also asked to be identified only by his first name – used his Android phone to take photos, which were automatically backed up to Google Photos. He then sent them to his wife via Google Chat.

Cassio was in the process of buying a house when his Gmail account was deactivated. “It was a headache,” he says.

The first tool the industry used to disrupt the widespread exchange of child pornography on the Internet was PhotoDNA, a database of known abuse images converted into unique digital codes, or hashes. It could be used to quickly scan large numbers of images for a match, even if the photos had been slightly altered. After Microsoft released the system in 2009, Facebook and other companies started using it.
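PhotoDNA’s exact algorithm is proprietary, but the general idea of perceptual hashing can be sketched. The toy Python below – with entirely hypothetical names – reduces an image to a short bit fingerprint and compares fingerprints by Hamming distance, which is why lightly altered copies still match:

```python
# Illustrative sketch only: this is NOT PhotoDNA, just a toy "average
# hash" showing the general idea of perceptual hashing. An image is
# reduced to a short fingerprint, and fingerprints are compared by
# Hamming distance against a database of known hashes.

def average_hash(pixels: list[list[int]]) -> int:
    """Reduce a grayscale image (rows of 0-255 values) to a bit string:
    each bit records whether a pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known_database(image_hash: int, known_hashes: set[int],
                           threshold: int = 5) -> bool:
    """A hash 'matches' if it is within a few bits of any known hash,
    so lightly edited copies (crops, re-encodes) are still caught."""
    return any(hamming_distance(image_hash, k) <= threshold
               for k in known_hashes)

# Usage: a slightly altered image still matches the stored fingerprint.
original = [[10, 200], [190, 20]]   # toy 2x2 "image"
altered  = [[12, 198], [191, 20]]   # minor pixel-level changes
database = {average_hash(original)}
print(matches_known_database(average_hash(altered), database))  # True
```

The tolerance threshold is the design point: exact cryptographic hashes would miss any re-encoded copy, while a small Hamming-distance allowance catches near-duplicates at the cost of occasional false matches.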

Significant progress came in 2018, when Google developed an artificial intelligence tool that could recognize never-before-seen exploitative images of children. That means finding not only known photos of abused children, but also images of unknown victims whom the authorities might be able to rescue. Google made the technology available to other companies, including Facebook.

When the photos taken by Mark and Cassio were automatically uploaded from their phones to Google’s servers, this technology flagged them. A Google spokesperson said the company scans only when a user takes an “affirmative action” – and that includes a phone backing up photos to the company’s cloud.

A human content moderator reportedly reviewed the photos after they were flagged by the AI, to confirm that they met the federal definition of child sexual abuse material. When Google makes such a finding, it blocks the user’s account, searches it for other exploitative content, and then, as required by federal law, reports the case to the CyberTipline run by the National Center for Missing and Exploited Children.
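The sequence described here – automated flag, human review, account suspension, a wider search, a legally mandated report – can be summarized in a short sketch. The Python below is purely illustrative; every function and name is hypothetical and none of it reflects Google’s actual systems:

```python
# Purely illustrative sketch of the moderation flow described above.
# All names are hypothetical; this is not Google's actual pipeline.
from dataclasses import dataclass

@dataclass
class Flag:
    user_id: str
    image_id: str

def disable_account(user_id: str) -> None:
    # In Mark's case this also took down Gmail, Google Photos, and Google Fi.
    print(f"account {user_id} disabled")

def search_account_for_more(user_id: str) -> None:
    # The wider sweep that later surfaced Mark's six-month-old video.
    print(f"searching {user_id}'s account for other exploitative content")

def file_cybertipline_report(flag: Flag) -> None:
    # Reporting confirmed material is required by federal law.
    print(f"CyberTipline report filed for {flag.user_id}/{flag.image_id}")

def handle_flag(flag: Flag, human_confirms: bool) -> str:
    """Automated flag -> human review -> suspension, search, and report."""
    if not human_confirms:
        return "cleared"  # reviewer found no abusive material; no action
    disable_account(flag.user_id)
    search_account_for_more(flag.user_id)
    file_cybertipline_report(flag)
    return "reported"

# Both fathers' uploads followed the 'reported' path; notably, the flow
# has no later step for restoring an account once police clear the user.
print(handle_flag(Flag(user_id="mark", image_id="img-001"), human_confirms=True))
```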

In 2021, the CyberTipline reported that it had alerted authorities to “more than 4,260 potential new child victims.” Mark’s and Cassio’s sons are included in that figure.

In December, Mark received an envelope in the mail from the police department. Inside was a letter informing him that he had been under investigation, along with copies of the search warrants served on Google and his internet service provider. The warrants requested everything in Mark’s Google account: his internet searches, his location history, his messages, and any documents, photos, and videos he had stored with the company.

The investigation, concerning “child exploitation videos,” had taken place in February, within a week of his taking the photos of his son.

Mark called Detective Nicholas Hillard, who said the case had been closed. The detective had tried to reach Mark, but his phone number and email no longer worked. “I determined that the incident was not a crime and that no crime had occurred,” he wrote in his report. Mark appealed to Google again, attaching the police report, but to no avail.

Cassio was also investigated. A Houston police detective called and asked him to come to the station. After he showed the detective his communications with the pediatrician, he was quickly cleared. But he was unable to get back his Google account, which he had held for a decade and for which he was a paying user.

Not all images of nude children are pornographic, exploitative, or illegal. Carissa Byrne Hessick, a professor of law at the University of North Carolina, says that defining what constitutes sexually abusive imagery can be complicated, but that she agrees with the police that medical images do not qualify. “There was no child abuse,” she says. “The photos were taken for non-sexual reasons.”

I had access to the photos Mark took. The decision to flag them is understandable: they are explicit images of a child’s genitals. But context matters: they were taken by a father worried about his sick son.

“We recognize that in the era of telehealth and Covid, it has been essential for parents to take pictures of their children to receive a diagnosis,” says Claire Lilley, Google’s head of child safety operations. She says the company has consulted pediatricians so that its human reviewers understand medical conditions that may appear in such photos.

Cassio heard from a customer support representative that sending the photos to his wife via Google Hangouts had violated the service’s terms of use.

As for Mark, Claire Lilley says reviewers did not detect a rash or redness in the photos he took of his son, and that a subsequent review of his account turned up a video from six months earlier that Google also found problematic: a young child lying in bed with an unclothed woman.

Mark can’t remember the video and no longer has access to it, but he says it sounds like a private moment he would have wanted to capture, never imagining it would be viewed or judged by anyone else. “I can imagine it. We woke up, it was a beautiful day, and I wanted to record that moment,” he says. “If we had slept in our pajamas, all of this could have been avoided.”

A Google spokesperson said the company stands by its decisions, even though the police cleared both men.
