In July, amid increasing scrutiny from the Trump administration, TikTok announced a novel effort to build trust with regulators: a physical office known as the Transparency and Accountability Center. The center would allow visitors to learn about the company's data storage and content moderation practices, and even to inspect the algorithms that power its core recommendation engine.
"We believe all companies should disclose their algorithms, moderation policies, and data flows to regulators," then-TikTok CEO Kevin Mayer said at the time. "We will not wait for regulation to come."
With so much turmoil, you might expect the company to set aside its efforts to show visitors its algorithms, at least temporarily. But the TikTok Transparency and Accountability Center is now open for (virtual) business, and on Wednesday I was part of a small group of reporters who got to take a tour over Zoom.
Much of the tour functioned as an introduction to TikTok: what it is, where it's located, and who runs it. (The message delivered: it's an American app, located in America, run by Americans.) We also got an overview of the app's community guidelines, its approach to child safety, and how it keeps data secure. All of it is basically in keeping with how American social platforms manage these concerns, though it's worth noting that 2-year-old TikTok built this infrastructure much faster than its predecessors did.
More interesting was the section where Richard Huang, who oversees the algorithm responsible for TikTok's addictive For You page, explained to us how it works. For You is the first thing you see when you open TikTok, and it reliably serves up a feed of personalized videos that leaves you saying "I'll just look at one more of these" for 20 minutes longer than you intended. Huang told us that when a new user opens TikTok, the algorithm fetches eight popular but diverse videos to show them. Sara Fischer at Axios has a nice recap of what happens from there:
The algorithm identifies similar videos to those that have engaged a user based on video information, which could include details like captions, hashtags or sounds. Recommendations also take into account user device and account settings, which include data like language preference, country setting, and device type.

Once TikTok collects enough data about the user, the app is able to map a user's preferences in relation to similar users and group them into "clusters." Simultaneously, it also groups videos into "clusters" based on similar themes, like "basketball" or "bunnies."
As you continue to use the app, TikTok shows you videos in clusters that are similar to ones you have already expressed interest in. And the next thing you know, 80 minutes have passed.
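To make that flow concrete, here is a minimal Python sketch of a cold-start feed and a cluster-weighted personalized feed in the spirit of what Huang and the Axios recap describe. Everything in it (the hand-labeled clusters, the scoring rule, the function names) is my own simplification for illustration; TikTok has not published its actual code.

```python
# Illustrative sketch only: the data layout, cluster labels, and scoring rule
# below are assumptions for demonstration, not TikTok's real implementation.
from collections import Counter

# Each video belongs to a theme "cluster" (e.g. "basketball", "bunnies").
VIDEOS = {
    "v1": {"cluster": "basketball", "popularity": 0.9},
    "v2": {"cluster": "bunnies",    "popularity": 0.8},
    "v3": {"cluster": "cooking",    "popularity": 0.7},
    "v4": {"cluster": "basketball", "popularity": 0.6},
    "v5": {"cluster": "bunnies",    "popularity": 0.5},
}

def cold_start_feed(videos, n=8):
    """For a brand-new user: popular videos drawn from distinct clusters."""
    seen_clusters, feed = set(), []
    for vid, meta in sorted(videos.items(), key=lambda kv: -kv[1]["popularity"]):
        if meta["cluster"] not in seen_clusters:
            feed.append(vid)
            seen_clusters.add(meta["cluster"])
        if len(feed) == n:
            break
    return feed

def personalized_feed(videos, watched, n=8):
    """Once there is engagement history: weight each cluster by how often the
    user engaged with it, then rank unwatched videos by cluster weight plus
    overall popularity."""
    cluster_weights = Counter(videos[v]["cluster"] for v in watched)
    def score(vid):
        meta = videos[vid]
        return cluster_weights[meta["cluster"]] + meta["popularity"]
    return sorted((v for v in videos if v not in watched), key=score, reverse=True)[:n]

print(cold_start_feed(VIDEOS))                    # ['v1', 'v2', 'v3']: one per cluster
print(personalized_feed(VIDEOS, watched={"v1"}))  # basketball videos rise to the top
```

The real system presumably learns clusters from engagement signals at enormous scale rather than relying on hand-labeled themes, but the basic loop is the one described above: start broad, then lean into whatever clusters you keep engaging with.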
Eventually the transparency center will be a physical location that invited guests can visit, likely both in Los Angeles and in Washington, DC. The tour will include some novel hands-on activities, such as using the company's moderation software, called Task Crowdsourcing System, to evaluate dummy posts. Some visitors will also be able to examine the app's source code directly, TikTok says.
I think this is great. Trust in technology companies has been in decline, and allowing more people to examine these systems up close feels like a necessary step toward rebuilding it. If you work at a tech company and ever feel frustrated by the way some people discuss algorithms as if they're magic spells rather than math equations, this is how you start to demystify them. (Facebook has a similar effort to describe what you'll find in the News Feed here; I found it vague and overly probabilistic compared to what TikTok is offering. YouTube has a more general guide to how the service works, with fairly sparse commentary on how recommendations function.)
Three other takeaways from my day with TikTok:
TikTok is worried about filter bubbles. Facebook has long denied that it creates filter bubbles, saying that people find a variety of diverse viewpoints on the service. That's why I was interested to hear from TikTok executives that they are quite concerned about the issue, and are regularly refining their recommendation algorithm to ensure you see a mix of things. "Within a filter bubble, there's an informational barrier that limits opposing viewpoints and the introduction of diverse types of content," Huang said. "So, our focus today is to ensure that misinformation and disinformation does not become concentrated in users' For You page."
The problems are somewhat different on the two networks (Facebook is primarily talking about ideological diversity, while TikTok is more concerned with promoting different types of content), but I still found the distinction striking. Do social networks pull us into self-reinforcing echo chambers, or don't they?
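One common way recommendation systems push back against filter bubbles is to deliberately interleave a small share of content from clusters the user has not engaged with into an otherwise personalized ranking. TikTok did not describe its exact mechanism, so treat the sketch below, including the explore_rate knob and every name in it, as a generic illustration rather than the company's method.

```python
# Generic anti-filter-bubble re-ranking: interleave videos from unfamiliar
# clusters into a personalized feed. The ratio and names are assumptions.
import random

def diversify(ranked_ids, video_clusters, engaged_clusters, explore_rate=0.25, seed=0):
    """Insert one video from an unfamiliar cluster after roughly every
    1/explore_rate videos from clusters the user already engages with."""
    rng = random.Random(seed)
    familiar = [v for v in ranked_ids if video_clusters[v] in engaged_clusters]
    unfamiliar = [v for v in ranked_ids if video_clusters[v] not in engaged_clusters]
    rng.shuffle(unfamiliar)

    feed, step = [], max(1, round(1 / explore_rate))
    for i, vid in enumerate(familiar, start=1):
        feed.append(vid)
        if unfamiliar and i % step == 0:
            feed.append(unfamiliar.pop())  # slip in something outside the bubble
    return feed + unfamiliar  # any leftovers go at the end

clusters = {"a": "basketball", "b": "basketball", "c": "basketball",
            "d": "basketball", "e": "bunnies", "f": "politics"}
# A user who only ever watches basketball still gets bunnies and politics mixed in.
print(diversify(["a", "b", "c", "d", "e", "f"], clusters, {"basketball"}, explore_rate=0.5))
```

The interesting design question, which the Huang quote hints at, is what counts as diversity worth injecting: different topics, different viewpoints, or both.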
TikTok is building an incident command center in Washington, DC. The idea is to be able to identify critical threats in real time and respond quickly, the company said, which feels particularly important during an election year. I don't know how big a deal this is, exactly; for the time being, it sounds like it could just be some trust and safety folks working in a shared Slack channel? But the effort does have an undeniably impressive and redundant official name: a "monitoring, response and investigative fusion response center." OK!
You can't prove a negative. TikTok felt compelled to design these guided tours amid fears that the app would be used to share data with Chinese authorities or promote Communist Party propaganda to Americans. (Ben Thompson has a great, subscribers-only interview with the New York Times' Paul Mozur that touches on these subjects today.) The problem with the tour, though, is that you can't show TikTok not doing something. And I wonder if that won't make the transparency center less successful than the company hoped.
I asked Michael Beckerman, a TikTok vice president and head of US public policy, about that challenge.
"That's why we're trying to be even more transparent: we're meeting and talking to everybody that we can," Beckerman told me. "What a lot of people are saying, people that are really well read into global threats, is that TikTok doesn't rank. So if you're spending too much time worrying about TikTok, what are you missing?"
Anyway, TikTok's transparency center is great: a truly forward-leaning effort from a young company. Assuming TikTok survives beyond November, I'd love to visit it in person sometime.