Programming note: It’s time for our summer vacation! The stretch between January and today is the longest uninterrupted period we have ever written The Interface, and so we’re taking a break to recharge. We return August 17th.
On any other day, the memo that Kevin Mayer published Wednesday might have been the talk of the tech world. TikTok’s new CEO, who was thrust into a crisis over the future of the Chinese-owned app from the moment he took the job, has quickly emerged as the company’s top diplomat.
In his blog post, he came bearing economic gifts for the country that is currently blocking his app from federal and military devices, and threatening to ban it completely — as India recently did. Mayer promised to expand a fund for creators from $200 million to $1 billion. He said the company would hire 10,000 Americans — no small thing during a global recession.
And most intriguingly, Mayer said TikTok would do something that lawmakers have often asked of other social networks, but so far none have even suggested they would consider: allowing regulators to inspect the company’s algorithms, which select which videos and accounts to promote throughout the app. Mayer wrote:
We believe our entire industry should be held to an exceptionally high standard. That’s why we believe all companies should disclose their algorithms, moderation policies, and data flows to regulators. We will not wait for regulation to come, but instead TikTok has taken the first step by launching a Transparency and Accountability Center for moderation and data practices. Experts can observe our moderation policies in real-time, as well as examine the actual code that drives our algorithms. This puts us a step ahead of the industry, and we encourage others to follow suit.
The idea that algorithms ought to be open for inspection comes largely from Republicans. Senators including Josh Hawley (R-MO) have called for external audits of social network algorithms, nominally to examine them for signs of “censorship.” (The idea that conservative voices are being censored at a time when they enjoy the broadest audience, and some of the widest support, in human history has become an article of faith for the Republican Party.)
We would all benefit, I think, from having greater insight into how social networks choose what to show us in our feeds. It has long unsettled me that not a single engineer at Facebook, or Twitter, or YouTube, can tell me for sure why any particular post shows up in the feed where it does — they can only describe for me a series of statistical probabilities, with many attendant blind spots that we learn about only years later.
Not that they would be able to explain things perfectly after inspecting TikTok’s algorithm, either. The company told me that while it would make its source code open for review, it would not share information about individual users with reviewers for privacy reasons. But it will give reviewers insight into the signals the app takes into account when choosing which videos to promote — a move that, the company hopes, will dispel fears that it will be used to push propaganda or influence campaigns at the direction of the Chinese Communist Party.
“You’ll be able to see what informs our content ranking system, and how that information is used to make recommendations,” said Michael Beckerman, a TikTok vice president and head of US public policy. “Clearly that’s something that our stakeholders, and certainly policy makers, have questions about broadly for the industry.”
The algorithms that inform that system are updated regularly, and so TikTok intends to keep an open door to a select set of policy makers and experts, Beckerman told me. (It also posted new tips for creators on Thursday; among them: you really don’t have to add #fyp to every single post.) The original idea was for TikTok to invite people to a physical office to inspect its source code and review its content moderation policies; the pandemic has forced the company to reimagine the Transparency and Accountability Center as a virtual experience. In time, though, the company hopes to host visitors in person again.
There are risks to opening up the source code, such as giving creators too fine a sense of what will go viral, allowing them to game the system. That’s part of the reason a virtual field trip to the accountability center won’t be available to everyone. But the company believes that risk is outweighed by the potential trust it could earn by showing that it has nothing to hide, at least when it comes to content recommendation. “Seeing is believing,” Beckerman said.
It is likely only because TikTok’s position in the United States is so fraught that it would even consider opening up its source code to regulators. And yet even just by introducing the idea, the company has meaningfully shifted the window of what we might consider possible. For that, I’m grateful.
Then again, trust is in short supply everywhere these days, and even if regulators were to accept this compromise as a condition for letting TikTok continue to operate, I’m skeptical it would totally alleviate their concerns. Knowing how a video gets promoted doesn’t guarantee that some shadowy force isn’t working behind the scenes to put a thumb on the scale. It’s hard to prove a negative.
“We’ve had the conversation internally [about whether] someone could come in and say, ‘that’s fake code,’” Beckerman said. But “legitimate researchers and the people coming in will know that it’s real.”
And what if that still doesn’t convince people? Then TikTok will likely be forced to sell. The reported valuation of the app — $50 billion — is stratospheric, at 50 times its projected earnings. It’s hard to imagine anyone but a giant affording that purchase — and hard to imagine regulators, in a time of antitrust scrutiny, approving it.
Which might make the middle ground more attractive to everyone, lawmakers included. If TikTok is to survive in America, it has to find a way to earn officials’ trust. Opening up its algorithm is the boldest move the company has made to date, and it could be one of its best.