Around the world, countries and corporations are rethinking their relationship with encryption. In the wake of terrorist attacks, legislation in India has sought to give law enforcement access to encrypted communications, in a move that could threaten the security of encryption around the world. In the United States, Apple has staked its reputation on protecting encrypted communications even when they belong to terrorists — while Facebook pledged this year to shift the company to private messaging.
The moves have exposed obvious tensions between free speech and safety. In an effort to move the discussion forward, the Stanford Internet Observatory
today held a conference in which tech platforms, government agencies, nongovernmental organizations, civil rights activists, and academics met to hash it out. I was among a handful of journalists who attended the event, and I came away mostly encouraged that all sides are determined to find a workable balance — even though it seemed clear that each group would strike that balance somewhat differently.
The government agencies want to maintain what they call “lawful access” to communications when needed for investigations, even if it means hacking into devices. Civil rights groups (represented today by the Electronic Frontier Foundation) warned that law enforcement is building a powerful surveillance operation and increasingly arguing in court that it shouldn’t need a warrant to snoop on our communications. Tech platforms want to promote democratic free speech of the variety that produced Black Lives Matter and the #MeToo movement while also helping to catch terrorists and child predators. And nongovernmental organizations, such as those that work to protect exploited children, worry that efforts to protect speech with encryption will make catching those predators much harder.
An example from the National Center for Missing and Exploited Children drove the point home. The organization has long operated a tipline through which people can report coming across child pornography and other incidents of abuse. In the late 1990s, the tipline received 200 to 300 reports per week, said Michelle DeLaune, NCMEC’s chief operating officer. But as internet use grew, and platforms began collaborating with the organization, reports to the tipline exploded. In 2018, NCMEC received more than 18 million reports of exploitative imagery.
Strikingly, 99 percent of those reports come directly from the tech platforms. Through the use of artificial intelligence, hashed images, and partnerships between companies, we’re now arguably much better informed about the scope and spread of these images — and better equipped to catch abusers. A Facebook executive said that the company bans a whopping 250,000 accounts a month for sharing child exploitation imagery. And a representative of GCHQ, the United Kingdom’s intelligence agency, said that 2,500 people were arrested in the UK last year as a result of NCMEC reports.
“That’s what we lose if we get this wrong,” said Crispin Robinson, GCHQ’s technical director for cryptanalysis.
Meanwhile, the flip side of this discussion — the potential for government abuse of these tools — is on full display in Hong Kong. Maciej Ceglowski, the brilliantly acerbic writer-thinker-entrepreneur, recently returned from a month in the city reporting on the protests. He described how young pro-democracy protesters organize on Telegram, with the largely leaderless movement coordinating via in-app polls. Curiously, he said, the app has become popular even though its messages are not end-to-end encrypted
by default. But it allows users to find nearby protesters, to speak to thousands of them at once, and to send disappearing messages that make prosecuting them harder if they are arrested — and that has been enough to make it an anchor of the pro-democracy movement.
Ceglowski’s talk underscored a point made throughout the day’s talks: that something can be secure even if it’s not encrypted, and something can be unsafe even if it is. As with so much in our conversations about technology, security, and democracy, encryption debates can be emotional in a way that undercuts nuance.
Alex Stamos, who came to the Internet Observatory from Facebook and who organized the event, reminded the audience that encryption solves another, growing problem for platforms. As countries demand that platforms remove more speech from their servers, it becomes desirable for a company to take its products out of speech debates entirely. A company can’t moderate what it can’t see — and so it may increasingly have an incentive not to see it. There are lots of public-minded reasons a company like Facebook wants to promote encryption — but there are nakedly self-interested ones, too.
No one in the room on Thursday morning offered a simple solution for squaring all these circles. But it was heartening that they came into the room and were willing to have at least some of these discussions in public.