Why you can't fix culture with technology

The Interface

December 11 · Issue #44
An evening newsletter about Facebook, social networks, and democracy.

The Interface is, in part, a journal of unintended consequences. Tech platforms build tools that are misused, in ways that can destabilize entire countries. We put the onus on them to fix it. 
But what if we have that part wrong? The researcher danah boyd, in an interview with Wired, says we're too focused on finding technological solutions to what are ultimately cultural problems:
I think that we’re still not taking a true public accounting of all of the different cultural factors that are at play. What’s really striking about what’s at stake is that we have an understanding of our American society and of there being a rational, bureaucratic process around democracy. But now there are such notable societal divisions, and rather than trying to bridge them, trying to remedy them, trying to figure out why people’s emotions are speaking past one another, it’s about looking for a blame, looking for somebody that we can hold responsible without holding ourselves individually and collectively responsible. Unfortunately, that’s going to do squat. And, for the most part, we’re looking for something new to blame, which is why so much of the attention is focused on technology companies instead of politics, news media, or our economic incentives. We need to hold ourselves individually and collectively responsible, but that’s not where people are at.
I suspect this view is popular at tech companies! But tech companies have work to do too, she says:
I think that one of the mistakes that people in the tech sector have made is that they realized the importance of connecting people across distance—but they thought that it would happen naturally if they just made it possible. And they were wrong. They were wrong to say that people would actively connect to those who were different than them because they could through technology. You actually have to make it intentional. I think there’s a lot that the tech sector can and should do around this. No one has a better model of the networks of America than those tech companies. No one understands better where the disconnects are. What would it mean to actually understand and seek to remedy the divisions? 
Arguably, Facebook's two biggest initiatives this year have been moves in this direction: a push toward groups, and a push toward private messaging. But are we connecting with people who are different from us, or just rebuilding our bubbles in more private spaces?
It’s one of the most interesting questions for social media in 2018.

Democracy
When news breaks, Google still can’t separate rumor from fact
Former Facebook exec says social media is ripping apart society
Shadowy Facebook Ads That Pushed Trump Are Back in Alabama
Elsewhere
YouTubers Made Hundreds Of Thousands Off Of Bizarre And Disturbing Child Content
Nextdoor Raised $75 Million in New Round
VR pioneer Jaron Lanier on dystopia, empathy, and the future of the internet
YouTuber Jake Paul Builds a Multimillion-Dollar Empire—With the Help of Thousands of Tween Fans
Time’s up for the Ticker? Facebook appears to axe feed for tracking your friends’ activity
This Instagram story ad with a fake hair in it is sort of disturbing
Launches
Facebook Messenger to start making plans for you
Facebook is trying to make the Poke happen again
Takes
How to manage your image, from a teen with four Instagram accounts
And finally ...
Trivia game HQ finally has a competitor called The Q
Talk to me
Questions? Comments? casey@theverge.com 
Carefully curated by Casey Newton with Revue.