TikTok is, once again, facing an uncertain future. The company has spent the last two years quietly negotiating with US government officials in order to avoid an outright ban. But that process has now stalled, and calls for a ban have only intensified.
Next month, TikTok CEO Shou Zi Chew will testify at a House Energy and Commerce Committee hearing, his first appearance before Congress. Many lawmakers have called for a more sweeping ban and will likely quiz Chew about TikTok’s alleged national security risks and its parent company’s Chinese ownership.
TikTok has long denied that it’s a threat and downplayed its ties to China. But now the company is also trying a new tactic to prove it has nothing to hide: its Transparency and Accountability Center. TikTok first floated the idea in 2020, but COVID-related delays kept the actual facility from opening until recently. Last week, the company took a handful of reporters on a tour of the center, part of a new charm offensive meant to fend off regulators and the looming prospect of more bans in the United States.
The first thing you notice when you walk in is that, despite being dedicated to “transparency,” the space has no windows. Housed in an office park near TikTok’s Culver City US headquarters, it greets visitors with neon-lit signs and big, interactive displays that explain various aspects of the app.
The company hopes visitors will walk away with a better understanding of how the app operates and, perhaps, less suspicion. “We really do understand the critique that big media, big tech, plays as it relates to how algorithms work, how moderation policies work and the data flows of the systems,” says TikTok COO Vanessa Pappas. “A lot of these are unprecedented levels of transparency that we’re providing.”
What you’ll actually learn by touring the center, though, largely depends on how much you already know about TikTok when you walk in the door. The space is primarily dedicated to explaining the app’s content moderation policies and how it handles recommendations, both of which have been heavily scrutinized.
There are two interactive exhibits: a “moderation station,” where visitors can play the role of a TikTok content moderator, and another room that’s meant to “demystify” the app’s vaunted recommendation algorithm.
In the moderation room, you can watch sample videos, presented in an interface similar to the one TikTok’s actual content moderators use, and try your hand at judging which ones violate the app’s rules. The room next door, meanwhile, is dedicated to “the algorithm.” It’s more of an illustrated FAQ, offering fairly broad answers to high-level questions about how the app recommends content. The material is more detailed than TikTok’s extremely vague in-app explanations, but that’s not saying much. For example, under the heading “What information does TikTok use to create personalized experiences?” it explains that users’ interactions with content are tracked to inform the underlying recommendation model. That might be useful if you know nothing about how recommendation algorithms work, but it doesn’t tell you very much about TikTok.
Each explanation is also accompanied by a visualization and a snippet of “simulated code” (the company tightly controls who can view the app’s actual source code) to illustrate what’s happening at various stages of the recommendation process. But again, this felt designed more for people who know nothing about TikTok than for those trying to understand the nuances of its algorithm. There is a space at the transparency center, a server room behind a neon “LATC” sign, where auditors can, after clearing heavy security, dig into TikTok’s actual source code. But the vast majority of visitors to the center will never make it into that room.
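To give a rough sense of what that kind of “simulated code” conveys, here is a hypothetical sketch. It is not TikTok’s code, and the signal names and weights are invented; it only illustrates the general idea the exhibit describes, with interaction signals like watch time, likes and shares feeding a weighted score that ranks candidate videos.

```python
# Hypothetical illustration only -- not TikTok's actual source or its "simulated" code.
# It sketches the general idea the exhibit describes: user interactions
# (watch time, likes, shares) feed a weighted score used to rank candidate videos.

# Assumed signal weights, chosen arbitrarily for illustration.
WEIGHTS = {"watch_time": 0.5, "like": 0.3, "share": 0.2}

def score_video(signals: dict) -> float:
    """Combine a user's interaction signals into a single relevance score."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

def rank_candidates(candidates: list[dict]) -> list[dict]:
    """Order candidate videos from highest to lowest predicted interest."""
    return sorted(candidates, key=lambda video: score_video(video["signals"]), reverse=True)

# Example: two candidate videos with normalized interaction signals.
feed = rank_candidates([
    {"id": "a", "signals": {"watch_time": 0.9, "like": 1.0, "share": 0.0}},
    {"id": "b", "signals": {"watch_time": 0.4, "like": 0.0, "share": 1.0}},
])
print([video["id"] for video in feed])  # ['a', 'b']
```

That is roughly the level of detail on offer: enough to show that engagement signals are weighted and combined, and nothing about how TikTok actually tunes those weights.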
Overall, I can see how the tour might be a worthwhile exercise for lawmakers, who too often show they know shockingly little about how the internet works. But it also feels a bit performative, and I can’t help but remember Facebook’s infamous “war room” tour, when it invited reporters to visit a conference room dedicated to safeguarding elections only to shut it down a month later.
To be clear, TikTok does intend for the transparency center to be a permanent fixture. And the company plans to open more of them in other locations around the world. But while these facilities may help Boomer lawmakers and regulators understand what TikTok is, I’m not sure they will be able to dispel the perception that there’s something else, something more secretive, going on within the company. It’s one thing to illustrate how TikTok’s algorithm works at a high level, but it’s another to prove that something isn’t happening.
It’s notable, then, that TikTok’s Transparency Center doesn’t address some of the biggest concerns that have been raised about TikTok: its relationship with parent company ByteDance, and whether the Chinese government could somehow exploit that relationship to advance its interests. “If you fundamentally distrust the autocratic Chinese government, and how it uses its relationship with large Chinese-based corporations to extend its influence around the world, then all the promises TikTok can pile up are not going to completely allay your anxiety about TikTok,” Paul Barrett, the deputy director of NYU’s Stern Center for Business and Human Rights, told Engadget.
TikTok does, however, have a plan to address government concerns that it could be a national security threat. The company has been locked in negotiations with the Committee on Foreign Investment in the United States (CFIUS) for more than two years over its future in the US. As part of that effort, known as “Project Texas,” it struck a deal with Oracle last year to safeguard US user data and reassure US officials.
Until now, TikTok has been fairly tight-lipped about Project Texas and its dealings with CFIUS. But with those talks stalled (despite TikTok claiming it has addressed every concern raised by regulators), the company has been cautiously sharing more details about its arrangements with Oracle.
Reporters who attended the tour were given an overview of the plan, but were asked not to directly quote the executives who described it.
Central to the plan is a new US subsidiary called TikTok US Data Security (USDS), which will have an independent board of CFIUS-approved directors with national security and cybersecurity backgrounds. Two TikTok executives will run the subsidiary and report to that board.
Meanwhile, all US user data will be housed within Oracle’s cloud infrastructure, with strict controls to prevent unauthorized access and to keep most of it from leaving. (Some data about US users’ activity will inevitably have to leave so that, for example, people can interact with content and users from other countries.) Oracle will also review TikTok’s entire source code, as will a separate outside auditor. Future app updates will be inspected by Oracle, which will take over responsibility for submitting updates to the app stores. Oracle will also monitor TikTok’s recommendation algorithm and content moderation systems. And the US government, via CFIUS, will have ongoing visibility into and oversight of what USDS is doing.
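For illustration only, here is a minimal, hypothetical sketch of what a data-egress gate of the kind described above could look like in principle. None of the field names, rules or functions come from TikTok, Oracle or CFIUS; they are assumptions meant to make the idea of “strict controls” on data leaving the US environment concrete.

```python
# Hypothetical sketch of a data-egress gate, loosely modeled on the Project Texas
# idea of keeping most US user data inside a controlled environment.
# All field names and rules here are invented for illustration.

ALLOWED_EGRESS = {"public_video_metadata", "public_comments"}   # assumed allow-list
RESTRICTED = {"email", "phone_number", "device_id", "private_messages"}

def may_leave_us_enclave(field: str, approved_by_usds: bool) -> bool:
    """Return True only if a data field is cleared to leave the US data environment."""
    if field in RESTRICTED:
        return False          # protected personal data never leaves
    if field in ALLOWED_EGRESS:
        return True           # public content can flow to other regions
    return approved_by_usds   # anything else needs an explicit sign-off

# Example: public comments can cross regions; a device ID cannot.
print(may_leave_us_enclave("public_comments", approved_by_usds=False))  # True
print(may_leave_us_enclave("device_id", approved_by_usds=True))         # False
```

The real controls would live in Oracle’s infrastructure and be far more elaborate, but the basic bet is the same: route every request for US user data through a policy check that defaults to keeping it inside the enclave.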
TikTok says it’s confident these steps address every issue that’s been raised about what the company could potentially be doing. Executives also point out that TikTok has already dedicated an astonishing amount of resources to Project Texas, including $1.5 billion in spending. If all that’s good enough for CFIUS, they say, it should be good enough for Congress.
Whether lawmakers will be satisfied with any scenario that allows TikTok to operate in the United States without being fully divested from ByteDance, though, remains to be seen. “They [TikTok] can make all of these arrangements, and put in place all these safeguards, almost to infinity,” Barrett says. “And it’s not clear to me that that would satisfy China hawks in the United States.”
That’s partly because TikTok is a convenient punching bag for lawmakers who want to appear tough on China. But there are also legitimate reasons to be concerned about TikTok. ByteDance recently fired four employees who accessed the personal data of an American journalist who had reported on the company. TikTok also has a history of taking, at best, a heavy-handed approach to content moderation that some have equated with censorship favorable to the Chinese government.
According to TikTok, Project Texas will ensure neither scenario can happen again. But the fact that both already have will undoubtedly lead to further questions about just how deep the company’s commitment to transparency and accountability really is.