Facebook sets the Days Without A Scandal counter back to 0

The Interface
June 7 · Issue #153
Facebook’s business works because people are generally willing to give up a healthy amount of their privacy in exchange for the ability to easily stay in touch with friends and family. When criticisms about its privacy policies crop up, Facebook is quick to say that it gives users control over their privacy settings. During his April testimony before Congress, faced with one such line of attack, Mark Zuckerberg said that privacy controls are of supreme importance to the company (emphasis mine):
You know, every day, about 100 billion times a day, people come to one of our products, whether it’s Facebook or Messenger or Instagram or WhatsApp, to put in a piece of content, whether it’s a — a photo that they want to share or a message they want to send someone.
And, every time, there’s a control right there about who you want to share it with. Do you want to share it publicly, to broadcast it out to everyone? Do you want to share it with your friends, a specific group of people? Do you want to message it to just one — one person or a couple of people? That’s the most important thing that we do. And I think that, in the product, that’s quite clear.
Of course, you can’t really call it a privacy “control” if you don’t control it. And last month, due to a bug, Facebook lost control of its controls. Here’s Kurt Wagner:
Up to 14 million Facebook users who thought they were posting items that only their friends or smaller groups could see may have been posting that content publicly, the company said Thursday.
According to Facebook, a software bug, which was live for 10 days in May, updated the audience for some users’ posts to “public” without any warning. Facebook typically lets users set the audience that gets to see their posts; that setting is “sticky,” which means it remains the default setting until manually updated.
Facebook was unclear about how many of the 14 million people may have posted to friends without realizing they were sharing that information publicly. The company said it will begin to alert people who were impacted on Thursday.
How did this happen? Here’s how Facebook described it:
The error occurred while we were building a new way to share featured items on your profile, like a photo. Since these featured items are public we inadvertently made the suggested audience for all new posts – not just these items — public.
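For the curious, here is a rough illustration of how a “sticky default” bug like this one can happen. The sketch below is hypothetical (none of the names come from Facebook’s actual code); it just shows how a code path meant only for always-public featured items can leak into the default audience for every new post.

```python
# Hypothetical sketch of the "sticky audience" bug Facebook described.
# All names here are invented for illustration; nothing below reflects
# Facebook's real codebase.

PUBLIC = "public"
FRIENDS = "friends"

class Composer:
    """Post composer that remembers the user's last-used audience."""

    def __init__(self, sticky_audience=FRIENDS):
        # The "sticky" setting: whatever audience the user chose last
        # time stays the default until they change it manually.
        self.sticky_audience = sticky_audience

    def suggested_audience(self, is_featured_item):
        if is_featured_item:
            # Featured profile items are public by design.
            return PUBLIC
        # The bug, per Facebook's explanation: imagine this branch also
        # returned PUBLIC, because the new featured-items code set the
        # suggested audience for *all* new posts instead of only
        # featured ones.
        return self.sticky_audience

composer = Composer()
print(composer.suggested_audience(is_featured_item=True))   # public
print(composer.suggested_audience(is_featured_item=False))  # friends
```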
On one hand, while the setting was changed against users’ will, it was still visible. If you went to post something after years of only publishing to an audience of friends, there’s a good chance you would have noticed the accidental switch to public sharing. There’s also a good chance — maybe even a better chance — that you would not have noticed. Which means the thing you never intended your boss / ex / rival / enemy to see might have made it into their News Feed anyway.
The news comes as Facebook had appeared to right itself following the Cambridge Analytica scandal, with its stock price returning to near all-time highs. But even as the business surges ahead, trust in the company is still declining. The most visible consequence of this is the degree to which the company has lost the benefit of the doubt.
I see this regularly now, in all sorts of ways. Facebook launches a well-intentioned effort to fight revenge porn, and is dismissed as a creep begging for nudes. It builds a first-party app for Chinese phones that stores all user data on the devices where it belongs, and now might face a Congressional investigation over it. On the latter issue, when I tweeted that I didn’t see what the story was supposed to be, the replies all hit the same note: “why would anyone believe what Facebook says at this point?”
At the Code Conference last week, Kara Swisher’s first question to Sheryl Sandberg was blunt: why was no one fired over the Cambridge Analytica scandal? Sandberg’s answer was that Mark Zuckerberg held himself responsible for the whole thing and — she didn’t say this part — he wasn’t about to fire himself over it.
I predict that, if anyone is fired over exposing 14 million people to potential shame and ridicule, we’ll hear nothing about it. I think that’s the wrong move. If the perception holds that no one at Facebook ever faces any consequences for their mistakes, trust in the company will further decline. And doing good in the world will get that much harder.

Democracy
Congress roasted Facebook on TV, but won’t hear any bills to regulate it
Google will pause election ads in Washington state in unprecedented response to new law
Facebook's tie-up with Chinese smartphone giant Huawei makes a bad situation worse
After Scrutinizing Facebook, Congress Turns to Google Deal With Huawei
Internal Documents Show How Facebook Decides When a Poop Emoji Is Hate Speech
Elsewhere
Pivot to traditional: Direct-to-consumer brands sour on Facebook ads
Former revenge porn mogul Craig Brittain sues Twitter for banning him
MIT fed an AI data from Reddit, and now it only thinks about murder
Launches
Instagram now lets you instantly repost stories you’re mentioned in to your own story
Fb.gg is Facebook’s game streaming hub for stealing Fortnite streamers away from Twitch
Panda raises funding for its video chat app
Takes
Facebook’s newest drama is a reminder of one of the company’s big failures: It never owned the phone
And finally ...
Facebook cuts down annoying “now connected on Messenger” alerts
Talk to me
Questions? Comments? Posts that were only intended for your friends and family? casey@theverge.com