Facebook Encryption Eyed in Fight Against Online Child Sex Abuse

An explosion in reports of child sexual abuse imagery on the internet is prompting the authorities to step up pressure on technology companies over their use of encryption — and Facebook, which flags by far the largest amount of the material, is drawing outsize attention.

The tension is part of a growing debate over privacy and policing in the digital age. Law enforcement groups have long lamented the use of encryption, a tool that protects personal data from hackers and government surveillance but also lets child predators and other criminals hide their online activities.

There is increasing “international consensus, at least among law enforcement folks, that this is a serious problem,” said Sujit Raman, an associate deputy attorney general in the Justice Department. “And the companies, you know, they’re just not as engaged on the issue as they really need to be.”

The New York Times reported on Saturday that Facebook Messenger, which is not end-to-end encrypted, accounted for nearly two-thirds of reports last year of online child sexual abuse imagery. On Wednesday, the Justice Department said that Facebook as a whole was responsible for 90 percent of the reports.

[Read The Times’s investigation into online child abuse images.]

In March, the company’s chief executive, Mark Zuckerberg, announced that the messaging service would move to encryption in coming years, setting up a direct conflict between its business interests and the demands of law enforcement.

Jay Sullivan, Facebook’s director for messaging privacy, said the company was moving to encryption because private messaging was one of the fastest-growing areas of online communication, according to his prepared remarks at a conference last month at Stanford University. The company’s other messaging service, WhatsApp, is already end-to-end encrypted.

Justice Department officials, including Attorney General William P. Barr, are expected to raise concerns about the change to Messenger, and about encryption overall, at an event on Friday with the Federal Bureau of Investigation, the national clearinghouse for child sexual abuse imagery, and officials from Australia and Britain.

“Online child exploitation has increased dramatically in the past few years, and offenders continue to adopt more sophisticated means to entice victims and evade justice,” the department said in invitations to the event.

Law enforcement agencies say encryption is a major obstacle in child sex abuse, terrorism and other investigations. The Justice Department has repeatedly sought ways to break encryption, but doing so would create opportunities for hackers and allow for broader surveillance. Those efforts have been strongly opposed by groups focused on internet privacy protections.

“A secure messenger should provide the same amount of privacy as you have in your living room,” said Erica Portnoy, a technologist at the Electronic Frontier Foundation, a digital rights group. “And the D.O.J. is saying it would be worth putting a camera in every living room to catch a few child predators.”

But Mr. Raman, the associate deputy attorney general, said platforms like Facebook provided opportunities for child predators that were not readily available in the real world, creating a need for more scrutiny.

“It’s a unique platform,” he said. “You have a combination of open profiles, which almost creates a menu of options if you’re a child predator, and also will provide — if they implement their plans — the sort of mechanisms for drilling down and contacting children” with encryption.

Criminals have become adept at using encryption and the so-called dark web to cloak themselves, requiring greater effort by law enforcement to identify them. Even so, mainstream technology companies, like Apple, have come to embrace encryption technologies, particularly after disclosures about large-scale data collection by the National Security Agency.

“There are really good reasons to have end-to-end encryption, but we have to acknowledge it comes with trade-offs,” said Hany Farid, a professor at the University of California, Berkeley, who helped develop technology in 2009 for detecting online child abuse imagery.

Gavin Portnoy, a spokesman for the national clearinghouse for reports of abuse material, the National Center for Missing and Exploited Children, said officials there opposed the widespread adoption of encryption. “The rape and sexual exploitation of children that we see in millions of reports to the cyber tip line will continue to circulate online, but will be invisible to tech companies,” he said.

The Times reported on Saturday that tech companies flagged 45 million photos and videos as child sex abuse material last year. Those images were contained in 18.4 million reports, nearly 12 million of which came from Facebook Messenger. It was unclear whether photos and videos of abuse were actually more prevalent on Facebook or were just being detected at a high rate.

“We’re not picking on Facebook,” Mr. Raman said. He compared the millions of reports from Messenger with those sent by Apple, which encrypts its iMessage system. Apple sent 43 reports last year and about 150 so far in 2019.

An Apple spokesman said the company worked closely with law enforcement and complied with all legal obligations.

The decision to encrypt Messenger has raised alarms in other countries that rely on reports submitted by tech companies to the national clearinghouse. The center distributes the reports to the authorities both domestically and internationally.

Fernando Ruiz Pérez, head of operations for cybercrimes at Europol, said Facebook was responsible for a “very high percentage” of reports to the European Union. He said that if Facebook moved to encrypt messaging, the “possibility to flag child sexual abuse content will disappear.”

Technologists at Facebook and elsewhere have been discussing ways to limit the spread of illicit content on a system that uses encryption, but it is proving a thorny problem.

Guy Rosen, Facebook’s vice president for integrity, said at the Stanford conference that the company would rely on “signals” that could indicate the spread of abuse imagery even if Facebook was unable to see it — when users tried to distribute messages to large groups, for example.
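Facebook has not described how those signals would work in detail. As a rough, hypothetical sketch of the idea, a service could flag behavior using only the metadata it can still see under end-to-end encryption, such as an account forwarding the same encrypted attachment to many large groups in a short window. The thresholds, names and group-size cutoff below are illustrative assumptions, not Facebook’s actual system.

```python
# Illustrative sketch only: a metadata "signal" of the kind Facebook describes,
# flagging an account that forwards the same (opaque, encrypted) attachment to
# many large groups within a short window. All thresholds are hypothetical.

from collections import defaultdict
from time import time

FORWARD_LIMIT = 5        # hypothetical: distinct large-group forwards per window
WINDOW_SECONDS = 3600    # hypothetical: one-hour window
LARGE_GROUP_SIZE = 50    # hypothetical: what counts as a "large" group

# attachment_id -> sender -> list of (timestamp, group_size) forwarding events
forwards = defaultdict(lambda: defaultdict(list))

def record_forward(attachment_id: str, sender: str, group_size: int) -> bool:
    """Record a forwarding event and return True if the pattern should be
    flagged for review. Only metadata is inspected; the content stays encrypted."""
    now = time()
    events = forwards[attachment_id][sender]
    events.append((now, group_size))
    recent_large = [
        t for t, size in events
        if now - t <= WINDOW_SECONDS and size >= LARGE_GROUP_SIZE
    ]
    return len(recent_large) >= FORWARD_LIMIT
```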

But Dr. Farid, the professor who helped build the detection technology, said that would be inadequate. He suggested scanning for abuse content by making a fingerprint of an image before the message was encrypted, and then comparing the fingerprint with a database of known illegal material.
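The approach Dr. Farid describes amounts to matching a fingerprint of each outgoing image against a database of known material before the message is encrypted. The sketch below shows only that control flow; the SHA-256 hash, the placeholder fingerprint set and the function names are illustrative assumptions, and a real deployment would use a perceptual hash such as PhotoDNA, which tolerates resizing and re-encoding, rather than an exact cryptographic hash.

```python
# Illustrative sketch only: client-side matching of an outgoing image against a
# database of known fingerprints before the message is encrypted. A cryptographic
# hash catches only byte-identical copies; it is used here purely to show the flow.

import hashlib

# Placeholder fingerprint database (hypothetical values).
KNOWN_FINGERPRINTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint of the raw image bytes (exact-match only)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block(image_bytes: bytes) -> bool:
    """Check the image's fingerprint against the known database before encryption."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS

def send_image(image_bytes: bytes, encrypt_and_send) -> bool:
    """Scan the image, then hand it to the end-to-end encryption layer."""
    if should_block(image_bytes):
        # A deployed system would trigger a report rather than silently dropping it.
        return False
    encrypt_and_send(image_bytes)
    return True
```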

“I don’t think there’s a technical barrier here,” Dr. Farid said. “They’re doing this because they want to avoid liability.”

Multiple cryptography experts said, however, that the practice would significantly weaken the privacy benefits of end-to-end encryption for images.

“They’re saying it’s O.K. to open it up to hackers or Chinese censorship,” said Ms. Portnoy, the digital rights technologist. “That system right there is very similar to how Chinese browsers implement censorship.”

Alex Stamos, a Stanford professor who arranged the conference and Facebook’s former chief information security officer, acknowledged that all the proposals involved trade-offs. But he said the discussion was necessary because child sexual abuse material was “by far the worst thing that happens online.”

“There are options here,” Mr. Stamos said. “But you can’t quite have your cake and eat it too.”
