In July, amid increasing scrutiny from the Trump administration, TikTok announced a novel effort to build trust with regulators: a physical office known as the Transparency and Accountability Center. The center would allow visitors to learn about the company’s data storage and content moderation practices, and even to inspect the algorithms that power its core recommendation engine.
“We believe all companies should disclose their algorithms, moderation policies, and data flows to regulators,” then-TikTok CEO Kevin Mayer said at the time. “We will not wait for regulation to come.”
With so much turmoil, you might expect the company to set aside its efforts to show visitors its algorithms, at least temporarily. But the TikTok Transparency and Accountability Center is now open for (virtual) business — and on Wednesday I was part of a small group of reporters who got to take a tour over Zoom.
Much of the tour functioned as an introduction to TikTok: what it is, where it’s located, and who runs it. (It’s an American app, located in America, run by Americans, was the message delivered.) We also got an overview of the app’s community guidelines, its approach to child safety, and how it keeps data secure. All of it is basically in keeping with how American social platforms manage these concerns, though it’s worth noting that 2-year-old TikTok built this infrastructure much faster than its predecessors did.
More interesting was the section where Richard Huang, who oversees the algorithm responsible for TikTok’s addictive For You page, explained to us how it works. For You is the first thing you see when you open TikTok, and it reliably serves up a feed of personalized videos that leaves you saying “I’ll just look at one more of these” for 20 minutes longer than you intended. Huang told us that when a new user opens TikTok, the algorithm fetches eight popular but diverse videos to show them. Sara Fischer at Axios has a nice recap of what happens from there:
The algorithm identifies similar videos to those that have engaged a user based on video information, which could include details like captions, hashtags or sounds. Recommendations also take into account user device and account settings, which include data like language preference, country setting, and device type.
Once TikTok collects enough data about the user, the app is able to map a user’s preferences in relation to similar users and group them into “clusters.” Simultaneously, it also groups videos into “clusters” based on similar themes, like “basketball” or “bunnies.”
As you continue to use the app, TikTok shows you videos in clusters that are similar to ones you have already expressed interest in. And the next thing you know, 80 minutes have passed.
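To make the cluster idea concrete, here is a deliberately tiny sketch of that recommendation loop in Python. This is not TikTok’s actual system — the video catalog, cluster labels, and signal weights below are all invented for illustration — but it captures the shape of the recap above: group videos by theme, score the themes a user engages with, and surface unseen videos from the highest-scoring themes first.

```python
from collections import Counter

# Toy catalog: each video belongs to one theme cluster ("basketball",
# "bunnies", ...). All names and data are made up for illustration.
VIDEOS = {
    "v1": "basketball", "v2": "basketball",
    "v3": "bunnies",    "v4": "bunnies",
    "v5": "cooking",    "v6": "cooking",
}

def recommend(watched, liked, k=3):
    """Score each theme cluster by the user's engagement, then return
    up to k unseen videos from the highest-scoring clusters."""
    scores = Counter()
    for vid in watched:
        scores[VIDEOS[vid]] += 1   # a view is a weak signal (invented weight)
    for vid in liked:
        scores[VIDEOS[vid]] += 3   # a like is a stronger signal (invented weight)
    # Rank videos the user hasn't seen by their cluster's score.
    unseen = [v for v in VIDEOS if v not in watched]
    unseen.sort(key=lambda v: scores[VIDEOS[v]], reverse=True)
    return unseen[:k]

# A user who watched a basketball video and a bunny video, and liked the
# basketball one, gets the remaining basketball video ranked first.
print(recommend(watched=["v1", "v3"], liked=["v1"]))
```

A real system would replace the hand-labeled clusters with learned embeddings and fold in the device and account signals Fischer mentions, but the feedback loop — engagement raises a cluster’s score, which surfaces more of that cluster — is the same one that turns 20 minutes into 80.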
Eventually the transparency center will be a physical location that invited guests can visit, likely both in Los Angeles and in Washington, DC. The tour will include some novel hands-on activities, such as using the company’s moderation software, called Task Crowdsourcing System, to evaluate dummy posts. Some visitors will also be able to examine the app’s source code directly, TikTok says.
I think this is great. Trust in technology companies has been in decline, and allowing more people to examine these systems up close feels like a necessary step toward rebuilding it. If you work at a tech company and ever feel frustrated by the way some people discuss algorithms as if they’re magic spells rather than math equations — well, this is how you start to demystify them. (Facebook has a similar effort to describe what you’ll find in the News Feed here; I found it vague and overly probabilistic compared to what TikTok is offering. YouTube has a more general guide to how the service works, with fairly sparse commentary on how recommendations function.)
Three other takeaways from my day with TikTok:
TikTok is worried about filter bubbles.
Facebook has long denied that it creates filter bubbles, saying that people find a variety of diverse viewpoints on the service. That’s why I was interested to hear from TikTok executives that they are quite concerned about the issue, and are regularly refining their recommendation algorithm to ensure you see a mix of things. “Within a filter bubble, there’s an informational barrier that limits opposing viewpoints and the introduction of diverse types of content,” Huang said. “So, our focus today is to ensure that misinformation and disinformation does not become concentrated in users’ For You page.”
The problems are somewhat different on the two networks — Facebook is primarily talking about ideological diversity, where TikTok is more concerned with promoting different types of content — but I still found the distinction striking. Do social networks pull us into self-reinforcing echo chambers, or don’t they?
TikTok is building an incident command center in Washington, DC. The idea is to be able to identify critical threats in real time and respond quickly, the company said, which feels particularly important during an election year. I don’t know how big a deal this is, exactly — for the time being, it sounds like it could just be some trust and safety folks working in a shared Slack channel? But the effort does have an undeniably impressive and redundant official name: a “monitoring, response and investigative fusion response center.” OK!
You can’t prove a negative.
TikTok felt compelled to design these guided tours amid fears that the app would be used to share data with Chinese authorities or promote Communist Party propaganda to Americans. (Ben Thompson has a great, subscribers-only interview with the New York Times’ Paul Mozur that touches on these subjects today.) The problem with the tour, though, is that you can’t show someone not doing something. And I wonder if that won’t make the transparency center less successful than the company hoped.
I asked Michael Beckerman, a TikTok vice president and head of US public policy, about that challenge.
“That’s why we’re trying to be even more transparent — we’re meeting and talking to everybody that we can,” Beckerman told me. “What a lot of people are saying — people that are really well read into global threats — is that TikTok doesn’t rank. So if you’re spending too much time worrying about TikTok, what are you missing?”
Anyway, TikTok’s transparency center is great — a truly forward-leaning effort from a young company. Assuming TikTok survives beyond November, I’d love to visit it in person sometime.