How TikTok is turning into an AI dump. A study of a Russian-language pirate TikTok farm

As artificial intelligence capabilities develop, social networks increasingly resemble an "AI dump" - a place where lazy content creators test the power of the algorithms and try to go viral with other people's, synthetically reprocessed content. We uncovered one such TikTok farm of 300 channels that, after automated processing, passed off YouTube content as its own. And, along the way, "accidentally" spread Russian propaganda.


Almost every month, a new and original way to get rich without effort appears on the Internet. And it seems that crypto and blockchain are starting to face competition from instructions on how to become a super-successful blogger using AI.

Many have already adopted AI tools to improve their work. Regional newsrooms are launching their own AI anchors, and the Ministry of Foreign Affairs is creating a virtual spokesperson (by the way, we have already written about why this is not OK). Translation and grammar-checking services, meanwhile, have become such a part of our lives that they are hardly associated with AI anymore.

However, in addition to polishing finished material, there is another niche: generating fully artificial content. It can be created "effortlessly" in just a couple of minutes. An entire ecosystem of AI applications is used to generate video (or reprocess stolen video), generate text and audio tracks, edit it all together, and add captions with descriptions and hashtags for publication on the platform. All in a matter of seconds.

And while social platforms gradually turn into an AI dump, such tools give users hope that one of their generated videos will go viral and bring in money. Or they can simply help to scale Russian propaganda messages.

We'll explain how it works in more detail using the example of a bot farm of 300 channels that used AI to process and publish videos from Russian and Belarusian YouTube.

How to detect a bot farm?

Meet Anastasia Kozhevnikova, a native of Mariupol who is now a popular vlogger on Russian YouTube, with 190,000 subscribers on YouTube alone. She is a well-known disseminator of the Russian narrative about "Russia the rescuer" and the incredible rebuilding of Mariupol.

Screenshot of Anastasia Kozhevnikova's YouTube channel. "Now I'm filming how my Mariupol is rising from the ashes," she writes in her description.

However, the first thing you notice when you open her TikTok profile is the complete lack of information about the author. Who is she, and what does she make videos about? Where are the links to other social networks? A link to the super-popular YouTube channel? Or even card numbers for generous donors? Nothing. Instead, there is just a neutral phrase calling on you to subscribe. Any channel could have one. And, as it turned out, many channels do.

Using an advanced Google search, we identified another 298 TikTok channels with the same profile description. And, as it turned out, they had more in common than that. This is how we started researching a Russian-language artificial TikTok farm.
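For illustration, a search of this kind can be reproduced with Google's site and exact-match operators. The query below is a hypothetical reconstruction, since we deliberately do not quote the farm's actual boilerplate phrase here:

site:tiktok.com "<the profile description, copied word for word>"

Google then returns every indexed TikTok profile page containing exactly that phrase, which is how identical, template-generated descriptions give a network away.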

We found 298 TikTok channels with identical descriptions. They feature everything from recipes to typical Russian propaganda.

How do we know for sure that it is a "farm"?

That these 299 channels belong to the same network (or, more simply, farm) is clear for several reasons.

In addition to the shared description (which is not just similar but identical, right down to the period at the end of the sentence), all of these channels also have identically designed videos. These are always YouTube videos whose original title has been moved into the TikTok video description. The upper black box duplicates it once more, and the lower one indicates the part number (when the original video was cut into smaller pieces). The video itself is accompanied by automatically generated subtitles (generated, judging by the large number of errors and incorrect line breaks).
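As a minimal sketch of how such an "identical down to the period" check scales to hundreds of channels, the following Python snippet groups channels by the exact text of their descriptions. It assumes the channel handles and descriptions have already been collected into a channels.csv file; the file name and column names are our own illustrative choices, not the actual tooling behind this investigation:

import csv
from collections import defaultdict

# Group TikTok channels by the exact text of their profile description.
# "channels.csv" with "channel" and "description" columns is a
# hypothetical input, collected beforehand by the researcher.
groups = defaultdict(list)
with open("channels.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # No normalisation on purpose: the farm's descriptions match
        # character for character, right down to the final period.
        groups[row["description"]].append(row["channel"])

# Any description shared by more than one channel is a strong signal
# of coordinated, template-based channel creation.
for description, channels in sorted(groups.items(), key=lambda kv: -len(kv[1])):
    if len(channels) > 1:
        print(f"{len(channels)} channels share: {description!r}")

The deliberate absence of any text normalisation is the point: genuinely independent authors rarely match each other character for character.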

The only thing that distinguishes these channels is their subject matter. Among them you can find not only Russian propaganda but also popular imperialist conspiracy theories. There are also recipes, car repairs, tips for mechanics, DIY videos, children's cartoons, and even a blogger who gives tours of Russian cemeteries and a treasure hunter digging through Russian scrap metal.

All of the farm's channels are designed in the same way, with generated subtitles, yellow video titles at the top and the part number at the bottom.

The one thing that unites the original authors on YouTube is the absence of TikTok channels of their own - the farm's simulacra are the only "presence" they have there. Incidentally, in Russia, from March 2022 until recently, uploading videos to TikTok was forbidden, even though the network was one of the most popular there.

The absence of any information about the authors on the channels of the discovered farm supports the assumption that we are dealing with piracy. As of the beginning of May, in the six months of its existence, the farm had already brought its owners more than 10 million likes and even more views.

Why so much effort?

The question of monetisation - that is, of how this farm plans to make money - remains open.

In Russia (as in Ukraine, by the way), there is no advertising on TikTok, and with it no support programmes for content creators. And while TikTok has launched a beta version of financial support for creators in the United States (which is already actively moving towards blocking the Chinese platform), Germany and Japan, these opportunities remain unavailable in our region and are unlikely to open up any time soon.

Instead, there is another option: on a channel promoted with the help of someone else's content, you can always publish something of your own - for money from a customer.

On the one hand, TikTok's algorithms help content go viral even from a newly created profile, and Russia actively exploits this to push its narratives into the Ukrainian or European information space through hundreds of newly created bots. The most recent significant exposure of a Russian bot farm involved more than 12,000 accounts that spread fakes in the EU about corruption in the Ministry of Defence.

But disinformation campaigns are gradually changing. With the improvement of algorithms for detecting "coordinated inauthentic behaviour" (i.e. bots), it has become much easier for large social platforms to counter bot farms. As a result, since 2022 disinformation campaigns have been taking on a new form: no longer just launching bots that amplify pro-Russian narratives or whatever information the customer needs, but creating entire fake websites and trying to recruit local bloggers, politicians or opinion leaders to spread their messages. In other words, botnets need distributors that social platforms cannot block at a moment's notice. Or, at the very least, such "bloggers" can be created and promoted on stolen YouTube content.

And even if we imagine that this particular farm did not set out to spread Russian propaganda, it still proved useful for that purpose. After all, albeit in violation of copyright, it grew the audience not only for Napoleon cake recipe videos but also for videos about the happy life of refugees from Mariupol in Russia, about Russia's difficult struggle against NATO and the globalists, or about "WHY DOES THE RUSSIAN LANGUAGE HURT AMERICANS?!!".

The further we go, the more inauthentic content there will be, and Russia will increasingly use it in its information warfare. Since a great deal of Russian disinformation and propaganda is already online, some of it will inevitably be reproduced by bot farms that simply replicate whatever gets a lot of views.

Artificial intelligence tools will become an ever greater challenge for social platforms. They are once again breathing life into the marginal "dead internet" theory, which claims that there are no real people left on the internet today. This time with fresh evidence: according to recent research, almost half of all internet traffic last year came from bots.

We are starting a series of articles about the role of TikTok in Russian information campaigns in Ukraine and the occupied territories. Don't miss it - subscribe to our newsletter.

This article was originally written in Ukrainian. It has been translated into English using AI tools such as DeepL, ChatGPT, and Grammarly. If you encounter an error that requires immediate attention, please inform us via Facebook, Twitter, or Instagram. Your understanding and support are appreciated.
