“Google Closed My Account for ‘Sexual Content.’ But They Won’t Tell Me What It Is and I Lost Everything” | Science & Technology

Five years ago, following the death of a friend and bandmate, David Barberá decided to pay for a Google Drive cloud account. He wanted to store music files so that his friend’s children would one day hear how their father played. “So I signed up for the Google Drive service,” he says. “It was the safest option I could think of so that Javi’s music would not be lost, because the children were very young at the time.”

Barberá, a 46-year-old high school teacher from Valencia, in eastern Spain, hadn’t anticipated a key detail: Google’s terms of service conceal a system that deactivates accounts when it detects prohibited content, including sexual material involving children or terrorism. “The only thing I can think of is that I may have downloaded something that I shouldn’t have, like movies I downloaded back in the days of [the peer-to-peer file-sharing program] eMule. Could there be child pornography or terrorism in there? It’s possible,” Barberá explains in a long telephone conversation.

At first, Barberá had no idea why he was banned from his account. He only started connecting the dots after reading posts in online forums and news articles. He describes a desperate experience of helplessness as he struggled to speak to a human being at Google and find out how exactly he had violated the company’s abuse policies.


In July of this year, Barberá needed music files he had on old hard drives. In order to better organize the material, he started uploading everything to his Google Drive account, which he still pays for every month in order to have two terabytes of cloud storage space. Minutes into the process, Google deactivated his account with a message that “harmful content” had been found.

He filed several complaints, responded to emails from apparent Google employees asking for new details (their names were Nahuel, Rocío and Laura), and called every company phone number he could find, without ever managing to talk to a human. Eventually, he asked a relative who works in journalism for help and finally managed to speak with an alleged Google employee, who asked him for “patience.”

Sexual content

From this whole process, Barberá got only one concrete answer, and it was a message sent to his wife’s email address (which he had added as a secondary account). The message read: “We believe your account contained sexual content that may violate Google’s Terms of Service and may also be prohibited by law.” But it then added: “We’ve removed this content” and “if you continue to break our rules, we may terminate your Google Account.” This message was received on August 26, and although it reads like a warning, the account is still suspended.

“I’ve had everything there for 14 years, and for five years I’ve only had it there,” Barberá says, alluding to the fact that he doesn’t keep files on external drives. Losing the Google account doesn’t just mean that his photos and videos are gone. Barberá also lost class materials, a blog he kept and his YouTube account, not to mention other services registered with that email address, from Amazon to Netflix to a German music app.

In August, the New York Times published an article with two similar cases in the United States. Google told the reporter that the problematic images were photos of children’s genitals that two parents took to send to the pediatrician for a skin problem. When EL PAÍS asked about Barberá’s case, Google replied that they could not provide this information due to privacy laws, since the user involved is European. The company said it would only share this information with the affected party. But Barberá has not yet received any details.

Google offered this newspaper access to employees on the condition that their identities not be revealed and that they not be quoted verbatim. According to the company, which insisted it was not talking about this specific case, a “sexual content” email is only sent in cases of child abuse, not adult pornography. Why, then, the warning not to do it again? Google didn’t elaborate, except to say that it all depends on what was in the account. A Google employee asked if this newspaper would name the affected user, but did not say why he was interested in knowing.

EL PAÍS found three other cases similar to Barberá’s: two more involving Google accounts and one involving Microsoft. All the cases are from 2022, and in only one was the account restored. That case was not about alleged sexual images of children, but about a problem with the password. The reason for the reinstatement was never explained either.

Large downloads

Another victim, who asked to remain anonymous because his company might have Google among its customers, turned to “a close friend” who works at the company in Spain. This friend doesn’t work in a department related to content moderation, but he did some internal digging, and the response was less than optimistic: these cases are handled overseas, and he had no idea whether anyone was actually reading the claims.

This user had his account deactivated after uploading 40 gigabytes of photos, videos and WhatsApp conversations he had kept on his hard drive. The upload was so large that his company’s cybersecurity officials called him to ask what was going on. Google does not specify when or how it analyzes its users’ accounts, but in the two Spanish cases, as well as those documented in the New York Times, it happened when file movements were detected.

The third victim is suing Microsoft, desperate because she has lost data from her private life as well as her work: “Her master’s degree, her tax forms, her photos of the birth of her children and her professional databases. She is suffering,” explains her lawyer, Marta Pascual. “The judge could say that her right to privacy was violated, although I couldn’t find any precedents.”

Pascual’s client believes the suspicious files come from WhatsApp groups whose content was backed up automatically. All three victims have children, and while none of them recall photos like the ones sent to a pediatrician, they all had typical images of children in the bathtub, in bed or in the pool.

Microsoft gives even less information than Google. It only sends a few statements about how it fights child pornography in its systems: “First, we fund research to better understand how criminals abuse technology. Second, we develop technology like PhotoDNA to detect cases of child sexual exploitation. Third, our staff promptly investigates reports of abusive content and removes them. And fourth, we work with other technology companies and law enforcement to refer these crimes.”

In conversations with this newspaper, both Google and Microsoft showed remarkable confidence in their detection systems. Yet Google’s software is flagging more and more accounts: between July and December 2021, it suspended 140,868 accounts, almost double the figure for the first half of 2020.

Google scans accounts for sexual material involving children using two technologies. The first targets known images: each one has a numerical code (a hash) that identifies it, and if the systems find images matching these codes, the account is deactivated. This is the PhotoDNA system cited by Microsoft.
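The idea behind hash matching can be sketched in a few lines. The sketch below is illustrative only: it uses a cryptographic SHA-256 hash, which matches only byte-identical files, whereas PhotoDNA uses a perceptual hash designed to survive resizing and re-encoding, and the blocklist entry here is invented.

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Compute a hex digest identifying a file's exact bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist, seeded with the hash of one known file.
# In a real system this set would hold codes distributed by
# organizations that maintain databases of known illegal images.
known_bad = {sha256_hex(b"example-known-bad-image-bytes")}

def is_flagged(image_bytes: bytes) -> bool:
    """True if the upload's hash matches an entry in the blocklist."""
    return sha256_hex(image_bytes) in known_bad
```

Because a cryptographic hash changes completely if even one byte differs, real matching systems rely on perceptual hashes instead, which is what makes them robust to trivial edits.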

The problem is new photos. For these, Google has created a second system that interprets the images and assigns each one a probability of being child pornography. Then, in theory, flagged images go to human reviewers, who decide whether a photo crosses the threshold.
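The score-then-review flow described above can be sketched as a simple triage step. Everything in this sketch is an assumption for illustration: the threshold value, the names and the queue are hypothetical, since the real classifier and its scores are not public.

```python
from dataclasses import dataclass, field
from typing import List

# Assumed review threshold; the value Google actually uses is not public.
REVIEW_THRESHOLD = 0.8

@dataclass
class ReviewQueue:
    """Collects images whose classifier score warrants a human look."""
    items: List[str] = field(default_factory=list)

    def triage(self, image_id: str, score: float) -> str:
        # Scores at or above the threshold are routed to moderators
        # instead of triggering an automatic decision.
        if score >= REVIEW_THRESHOLD:
            self.items.append(image_id)
            return "human_review"
        return "cleared"
```

The design point is that the model only ranks images; the final judgment on borderline cases is meant to rest with a person.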

Google has also consulted pediatricians so that the system can distinguish between images taken for medical purposes and others. But despite the laudable goal, the system can also ensnare many innocent people in a process that may even involve a police investigation.

“I have a friend in the national police, and I called him to tell him about the case, and he said he would ask colleagues who specialize in computer crimes,” explains Barberá. “They told him they weren’t aware of any cases like mine.” In the United States, companies such as Google and Microsoft must report any suspicious findings to the National Center for Missing and Exploited Children (NCMEC), which in turn notifies the police. The NCMEC sent 33,136 reports to Spain in 2021. These are generally cases that are not investigated, and in any event the police do not tell Google or Microsoft that a given person is not a suspect. As a result, the companies make their own decisions, and it is up to the victim to justify why the material was legitimate.
