
The Diamond and Silk disaster in Congress

April 27 · Issue #125
The Interface
There were plenty of reasons to criticize lawmakers after this month’s Facebook hearings: too many of them came in unprepared, misunderstood key issues, and failed to extract meaningful concessions from Mark Zuckerberg. 
But those lawmakers look like Winston Churchill compared to the gang who welcomed Diamond and Silk to the Capitol this week. Last Friday, we argued Facebook, Twitter, and Google shouldn’t attend the hearings, which were organized to promote the transparently false notion that social platforms are rigged to prevent conservative ideas from spreading.
Indeed, the platforms declined the invitation to appear before Congress. And so we were left with the ugly spectacle of white Republican lawmakers pretending to care about elevating the voices of people of color in order to advance a lie about how social media works.
My colleague Adi Robertson captured the madness:
For Republicans, the day was mostly an excuse to lecture Silicon Valley and heap praise on Hardaway and Richardson, albeit not always accurately — Rep. Steve Chabot (R-OH) addressed a question at one point to “Diamond and Spice.” It’s really not clear what transpired between the vloggers and Facebook’s policy team, and a ThinkProgress analysis found that their traffic dips were probably tied to broad algorithm changes. But the company’s letter dubbing them “unsafe to the community,” even if it was completely retracted, sounds plausibly sinister.
Diamond and Silk escalated this complaint into a sweeping conspiracy where Mark Zuckerberg interfered in the 2018 elections (we’re not sure which ones) by manipulating conservative users’ advertising preferences so they’d be targeted with liberal ads. They proceeded to get into shouting matches with various committee members, as representatives pressed the pair about receiving money from the Trump campaign — something they claimed under oath had never happened, before admitting that they’d been reimbursed for plane tickets. “We can see that you do look at fake news,” Richardson sneered at Rep. Hakeem Jeffries (D-NY), when he read out a line from an official Trump campaign finance filing, stating that Diamond and Silk had been paid for consulting. “Are you calling this FEC document fake?” Jeffries asked.
For their part, Diamond and Silk seem to have perjured themselves. I’m glad Congress is beginning to understand the power of the social platforms, but this week Republicans gave us the most disingenuous possible reading of it.
Meanwhile, Facebook did attend a hearing this week with a UK parliamentary committee looking at the Cambridge Analytica scandal. The company sent its CTO, Mike Schroepfer, to take the beating.
[Member of Parliament Damian] Collins asked why Facebook didn’t spot Russia’s use of the social network to target voters sooner. “We were slow to spot that,” Schroepfer said, adding, “I’m way more disappointed in this than you are.” The claim prompted laughs from around the interview room and a subsequent apology from the CTO. “It’s a high bar,” Collins replied.
The committee heard a version of what we’ve heard in the United States:
Facebook said it will make sure political ads on its platform will be vetted and transparent in time for England and Northern Ireland’s 2019 local elections, that only verified accounts will be allowed to pay for political ads, and users will be able to view all promotions paid for by a campaign — not just those targeted to them based on their demographic or “likes.”
If Congress was a clown show, Parliament was at least asking the right questions. Schroepfer promised to follow up with the committee on almost 40 items. And Americans hoping for sustained scrutiny on their social media platforms find themselves once again relying on Europe to take the lead.

Everyone should read a trio of stories from Amanda Taub and Max Fisher based on their trip to Sri Lanka. The reports are rich in detail and illustrate how a failure to anticipate the consequences of social networking can prove deadly.
1. Where Countries Are Tinderboxes and Facebook Is a Match. It’s been a week and I still can’t get this story out of my mind. Rumors spread on Facebook led to ethnic violence in Sri Lanka. And Facebook’s response doesn’t go much beyond “we remove such content as soon as we’re made aware of it.”
Facebook did not create Sri Lanka’s history of ethnic distrust any more than it created anti-Rohingya sentiment in Myanmar.
But the platform, by supercharging content that taps into tribal identity, can upset fragile communal balances. In India, Facebook-based misinformation has been linked repeatedly to religious violence, including riots in 2012 that left several dead, foretelling what has since become a wider trend.
“We don’t completely blame Facebook,” said Harindra Dissanayake, a presidential adviser in Sri Lanka. “The germs are ours, but Facebook is the wind, you know?”
2. In Search of Facebook’s Heroes, Finding Only Victims. The writers find that a monk who stopped an anti-Muslim mob that had organized on Facebook was himself an advocate of anti-Muslim views that first spread … on Facebook.
Yes, he’d stopped the mob. But he also recited to us, as unvarnished truth, some of the very anti-Muslim rumors that had spread in viral Facebook memes before the attack. Muslim shop owners put chemicals in bras to sterilize Buddhist women. Muslim doctors sterilized Buddhist patients without their knowledge. Muslims in government were secret extremists.
How did he know this? “The whole country heard about this” on Facebook, he answered. Had he discussed this with his constituents in the run-up to the violence? Of course.
3. Does Facebook Just Harbor Extremists? Or Does It Create Them? In this piece, they examine how Facebook can push ordinary users to extremes. “The problem arises when negative, tribal emotions begin to permeate social media,” they write, “which increasingly dominates users’ lives and therefore shapes their perceptions of the world offline.”
Everyday users might not intend to participate in online outrage, much less lead it. But the incentive structures and social cues of algorithm-driven social media sites like Facebook can train them over time — perhaps without their awareness — to pump up the anger and fear. Eventually, feeding into one another, users arrive at hate speech on their own. Extremism, in other words, can emerge organically.
We saw this firsthand in the small town of Digana, Sri Lanka, a week after anti-Muslim mobs had torn through. 
Facebook Set Lobbying Record Ahead of Cambridge Analytica Furor
Facebook shakes up D.C. operation amid controversies
Facebook’s Battle Against Fake News Notches an Uneven Scorecard
Facebook made an ad about how bad Facebook has become
Is Facebook’s Campbell Brown a Force to Be Reckoned With? Or Is She Fake News?
Facebook’s News Feed changes appear to be hurting — not helping — local news
Facebook’s Privacy Scandal Appears to Have Little Effect on Its Bottom Line
Dating apps are refuges for Egypt’s LGBTQ community, but they can also be traps
New Snap Spectacles hands-on: Worth it? (Video starring me!)
Snap’s second-generation Spectacles are more grown up — and more expensive - The Verge
The Future of Snapchat Looks a Lot Like Magic Leap
Introducing New Tools for Facebook Fundraisers
Facebook’s Messenger Kids is getting a sleep mode
Are you really Facebook’s product? The history of a dangerous idea.
And finally ...
Talk to me
Questions? Comments? Diamond and/or Silk burns?