

A Closer Look at the TikTok Algorithm

There’s one thing we’re all getting wrong about TikTok: it’s not really a social media app.

As TikTok Australia’s general manager told the Senate Select Committee on Foreign Interference through Social Media in September last year, the app is ‘less about social connection and more about broadcasting creativity and expression’.

Put another way, think of TikTok more as the modern incarnation of a media publisher—like a newspaper or a TV network—than as a social forum like Facebook or Twitter.

That’s because TikTok is much more assertively curatorial than its competitors.

It’s not a forum, it’s an editor. Its algorithm decides what each user sees, and it’s the opacity of that algorithm that presents the most worrying national security risk.

It may sound like an insignificant distinction, but TikTok’s emphasis on an ‘interest graph’ instead of a ‘social graph’ took the app’s competitors completely by surprise. What’s more, it has largely gone over the heads of most lawmakers.

TikTok is owned by Chinese technology company ByteDance, and the app hit 2.3 billion downloads in August 2020. So it’s high time policymakers understood exactly what makes TikTok tick.

An essay by Eugene Wei should be at the top of their reading list.

Why is TikTok so popular?

A San Francisco–based start-up investor and former Amazon and Facebook employee, Wei dissects TikTok’s strategy and shows how its recommendation engine keeps users glued to their screens.

It does so not by connecting them with friends or family, but by closely analysing their behaviour on the app and then serving them more of what they’re interested in.

Wei’s opus, which approaches 20,000 words and is only the first in a three-part series, explains how TikTok is not the same as the major social media platforms we’re more familiar with.


Put simply, on Facebook and Twitter, who you follow largely decides what content you see. On TikTok, however, the user doesn’t have to follow anyone.

Instead, the algorithm very quickly learns from how users interact with the content they’re served in the app’s ‘For You’ feed to decide what it should deliver to them next.

The approach is similar to that of Spotify and Netflix. Their recommendation algorithms observe which songs and movies you listen to or watch in full and which you skip to decide what new content to suggest.
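TikTok hasn’t published the details of its recommendation system, but the behaviour Wei describes can be illustrated with a toy model. The Python sketch below is purely hypothetical: it assumes a per-tag interest score that is nudged up or down by how much of each video a user watches, then ranks candidate videos by those scores. Every name and number in it is illustrative, not TikTok’s actual design.

```python
# A minimal, hypothetical sketch of interest-graph recommendation.
# Assumption: the only signal is watch_fraction (how much of a video
# the user watched). TikTok's real system is far more complex.
from collections import defaultdict

def update_interest_profile(profile, video_tags, watch_fraction, lr=0.1):
    """Nudge per-tag interest scores toward observed viewing behaviour."""
    for tag in video_tags:
        # Watching most of a video raises the tag's score; skipping lowers it.
        profile[tag] += lr * (watch_fraction - 0.5)
    return profile

def rank_candidates(profile, candidates):
    """Order candidate videos by the user's summed interest in their tags."""
    return sorted(
        candidates,
        key=lambda video: sum(profile[tag] for tag in video["tags"]),
        reverse=True,
    )

# The profile adapts from behaviour alone; no friends or follows needed.
profile = defaultdict(float)
update_interest_profile(profile, ["cooking", "comedy"], watch_fraction=0.95)
update_interest_profile(profile, ["politics"], watch_fraction=0.05)

feed = rank_candidates(profile, [
    {"id": 1, "tags": ["cooking"]},
    {"id": 2, "tags": ["politics"]},
])
print([video["id"] for video in feed])  # -> [1, 2]: the cooking video ranks first
```

Even this crude loop shows why no social graph is required: a handful of watch-or-skip signals is enough to start personalising the feed.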

As Wei puts it, ‘TikTok’s algorithm is so effective that it doesn’t feel like work for viewers. Just by watching stuff and reacting, the app learns your tastes quickly. It feels like passive personalization.’

It’s a strategy, Wei argues, that allowed a team of Chinese engineers—who didn’t necessarily have a good understanding of the cultures in the places where the app is available—to take the world by storm.

TikTok didn’t just break out in America. It became unbelievably popular in India and in the Middle East, markets whose cultures and languages were foreign to ByteDance’s Chinese product teams.

Imagine an algorithm so clever it enables its builders to treat another market and culture as a complete black box. What do people in that country like?

No, even better: what does each individual person in each of those foreign countries like? You don’t have to figure it out. The algorithm will handle that. The TikTok algorithm knows.

But that’s not the only thing the TikTok algorithm knows.

The TikTok algorithm and censorship

In a recent Protocol China exposé, a former censor at ByteDance said the company’s ‘powerful algorithms not only can make precise predictions and recommend content to users, one of the things it’s best known for in the rest of the world. It can also assist content moderators with swift censorship’.

A former employee described working at ByteDance as being ‘a tiny cog in a vast, evil machine’.

He went on to say that even live-streamed shows on the company’s apps are ‘automatically transcribed into text. This allows algorithms to compare the notes with a long and constantly-updated list of sensitive words, dates and names. It even has Natural Language Processing models. Algorithms would then analyze whether the content was risky enough to require individual monitoring.’
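The moderation pipeline the former employee describes (transcribe, match against a sensitive-term list, then score the risk) can be sketched in a few lines. The following Python is a hypothetical illustration only: the term list, the threshold and the simple hit count standing in for the NLP risk model are all invented for the example.

```python
# Hypothetical sketch of keyword-based pre-screening of live-stream
# transcripts. The real system reportedly pairs such lists with NLP
# models; a naive hit count stands in for that risk score here.
SENSITIVE_TERMS = {"example-banned-phrase", "example-sensitive-date"}

def flag_transcript(transcript: str, block_threshold: int = 2) -> str:
    """Return a moderation decision for one transcript."""
    text = transcript.lower()
    hits = [term for term in SENSITIVE_TERMS if term in text]
    if not hits:
        return "allow"                      # nothing on the list matched
    if len(hits) >= block_threshold:
        return "block"                      # enough matches to cut the stream
    return "escalate_to_human_moderator"    # borderline: send for review
```

The speed the censor describes comes from exactly this kind of automation: matching a transcript against a list is effectively instantaneous, so human moderators only see the borderline cases.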

There’s no doubt that TikTok can monitor and censor its users. The question is whether it will continue to do so.

Certainly, the blunt censorship that typified TikTok’s earlier approach to content moderation is unlikely to continue, especially after the public scrutiny of TikTok’s censoring of content related to the Tiananmen Square massacre, the Black Lives Matter protests and Beijing’s persecution of Uyghurs and other ethnic minorities.

But there’s ample room for ByteDance to covertly tweak users’ feeds, subtly nudging them towards content favoured by governments and ruling parties, including the Chinese Communist Party.

China’s state media strategy

After all, it’s an approach that would be in line with the strategy that China’s Ministry of Foreign Affairs and state media are already deploying.

Beijing is exploiting pre-existing grievance narratives and amplifying pro-CCP Western influencers, knowing that Western voices are more likely to reach target online networks than official CCP spokespeople.

The strategy, referred to as ‘borrowing mouths to speak’ (借嘴说话), is reminiscent of the Kremlin’s approach and is perfectly suited to covert deployment on Chinese-owned and operated social media apps.

Experiments show that TikTok’s algorithm can hurtle users from a politically neutral feed into a far-right firehose of content. China could just as easily use TikTok to send users down any other extreme rabbit hole.

By design, the app groups people into ‘clusters’ (otherwise known as filter bubbles) based on their preferences. TikTok’s executives stress that they have measures in place to ensure people don’t become trapped in those filter bubbles.

TikTok’s recommendation system ‘works to intersperse diverse types of content along with those you already know you love’, the company claims. The goal, they say, is to ensure that users are exposed to ‘new perspectives and ideas’. But who decides which new perspectives and ideas?
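TikTok hasn’t disclosed how its ‘interspersing’ works, but the mechanism, and the question of who controls it, can be made concrete with a toy sketch. In the hypothetical Python below, every Nth feed slot is filled from a pool of out-of-cluster content; whoever curates that pool decides which ‘new perspectives and ideas’ users actually see.

```python
# Hypothetical sketch of interspersing out-of-cluster content in a feed.
# The every_nth cadence and the pools are invented for illustration.
def build_feed(cluster_pool, diverse_pool, length=8, every_nth=4):
    """Fill a feed from the user's cluster, with periodic 'diverse' slots."""
    feed = []
    for position in range(1, length + 1):
        if position % every_nth == 0 and diverse_pool:
            feed.append(diverse_pool.pop(0))  # editorially chosen 'diversity'
        elif cluster_pool:
            feed.append(cluster_pool.pop(0))  # content matching the cluster
    return feed

print(build_feed(["cluster"] * 8, ["diverse"] * 2))
# -> ['cluster', 'cluster', 'cluster', 'diverse',
#     'cluster', 'cluster', 'cluster', 'diverse']
```

The design choice hidden in that one parameter is the article’s point: the interspersing itself is neutral, but the selection of the diverse pool is an editorial act.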

A human rights tragedy unfolding

What’s to stop Beijing from pressuring TikTok to encourage communities of Xinjiang denialists to flourish on the platform? As our report revealed, there’s already evidence that this is happening.

Our analysis of the hashtag #Xinjiang showed a depiction of the region that glosses over the human rights tragedy unfolding there, presenting instead a version more politically convenient for the CCP, replete with smiling and dancing Uyghurs.

It’s time for policymakers to take action

We have underestimated the power of social media apps before. Facebook started as a ‘hot or not’ website in a Harvard dorm room in 2003. Who expected it would go on to play a role in inciting violence 13,000 kilometres away?

So how should policymakers deal with a Chinese-owned social media app, especially one that isn’t really a social media app at all, but a modern-day interactive TV station whose algorithm decides what each user sees?

It’s past time governments realised the unique problem TikTok presents and tailored solutions that deal with it properly.


The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of Vision of Humanity.

AUTHOR

Fergus Ryan

Australian Strategic Policy Institute

The Strategist

The Strategist is the commentary and analysis site of the Australian Strategic Policy Institute, an independent, non-partisan think tank based in Canberra, Australia.