
Coordinated inauthentic behavior in the Moldovan elections

Authors: Mădălina Voinea, Sorin Ioniță

Expert Forum (EFOR) has identified a coordinated network of 17 TikTok accounts that generated over 1 million views in the last seven days to promote parliamentary candidate Vasile Costiuc. The network shows clear signs of inauthentic behavior, with 993 “soldier” accounts systematically following the same political profiles. The case illustrates how vulnerable electoral processes are to manipulation through political advertising disguised as ordinary social media content. In response, systemic action is needed to increase transparency and to obtain real countermeasures from the large platforms.

The broader and unresolved issue of social media platforms

Election periods remain a sensitive time and an unresolved problem for social media companies, which claim to keep citizens informed but in practice become a favored environment for spreading disinformation.

The obsessively repeated argument of limited responsibility, i.e., the insistence that “we are just information intermediaries,” or the self-description as an entertainment platform, as TikTok has repeatedly done, has brought us to the point where people get their information mainly from social media, yet democratic states have their hands tied when it comes to imposing transparency requirements, monitoring foreign information manipulation and interference (FIMI), and policing disguised political advertising. A live experiment is about to take place in the EU, all the more interesting given that both Meta and Google are abandoning political advertising because of the new European TTPA regulation, which comes into full force in October. (TTPA – Transparency and Targeting of Political Advertising – is the new European regulation that requires platforms to be more transparent about political advertising.)

We have reached this point because there is still a lot of inertia in Europe and, as such, very little cooperation with social media platforms. Most European countries have not even mapped their information space in terms of the main threats, patterns of inauthentic infiltration, or artificial traffic.

The case of Romania sent shock waves, but prevention mechanisms remain weak. This is because we still allow social media companies to claim limited responsibility for what happens in the online information space during election periods, and not only then.

The Moldovan elections, fertile ground for manipulation

In the Republic of Moldova, all these limitations in creating a transparent and secure information ecosystem for users, at intense moments such as the parliamentary elections in September, create fertile ground for illustrating how vulnerable we are. On the one hand, platforms are reluctant to provide access to public data for live monitoring of what is happening. On the other hand, there are regulatory gaps in European legislation, which is otherwise meticulous about details that concerned us five years ago but matter less now, and this feeds the platforms’ unwillingness to be held accountable. Added to this is Russia’s constant, generously funded interest in interfering in elections in European countries, using means that range from direct intervention through voter bribery to cruder, easily organized online influence operations.

The case of Vasile Costiuc: a suspicious rise online

Vasile Costiuc is the president of the “Democracy at Home” political party, a self-proclaimed radical unionist and a partner of Romania’s AUR party and its leader, George Simion, with whom he has carried out joint political activities. Costiuc campaigned for AUR in the Romanian presidential elections, including media actions about alleged vote rigging (presented without evidence) that were taken up and amplified by AUR. In the past, Costiuc has been active in public actions exposing corruption, sometimes real, sometimes alleged. At the same time, he was part of a larger group of politicians loyal to oligarch Vladimir Plahotniuc and was used against the opposition of the day. Information has also emerged that Vasile Costiuc took part in events organized in Russia by FSB general Alexandr Kondyakov.

The anonymous accounts promoting Vasile Costiuc have seen an unusual surge in popularity on TikTok in recent weeks. A 300% increase in the last seven days compared to the previous week raises legitimate suspicions about the authenticity of this popularity.

The figures are impressive, though not atypical for TikTok, where virality is unpredictable: over 1 million views in the last week alone. This explosion prompted us to investigate whether there were patterns of inauthentic behavior behind the growth and to see who was creating content about the politician, but also who was consuming it, that is, who his followers were.

Characteristics of the identified network

Basic structure: We identified 17 accounts that created 417 videos in the last seven days, mostly promoting politician Vasile Costiuc and his party.

The visibility paradox: Despite the videos’ high view counts, we are dealing with relatively small accounts of up to 1,000 followers. Theoretically irrelevant accounts slip under the radar very easily, yet, according to TikTok criteria that remain unknown to us, they can be boosted algorithmically to millions of views by a formula that, from the outside, looks entirely arbitrary.

Source: TikTok

Source: Context.ro software – F.A.C.T

Source: Context.ro software – F.A.C.T

Methodology

To analyze the followers of each of the 17 main accounts, we classified every follower account into a risk category. A follower was counted as suspected inauthentic only if it fell into the medium or high risk category; low-risk accounts were excluded.

We used a heuristic model (Python script) that combines several parameters to calculate the probability that an account is inauthentic. The model analyzes suspicious behaviors such as: zero videos but many followers; likes without own content; unnatural ratio of followers/following; empty profiles without bio/avatar; repetitive patterns in usernames and click-bait keywords. The sum of these behaviors generates a risk score: high, medium, or low.
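For illustration, here is a minimal sketch of how such a rule-based score can be assembled. The thresholds, weights, and field names below are assumptions made for the example; they are not the exact parameters of the script used in this analysis.

```python
# Illustrative rule-based CIB risk score for a TikTok profile.
# Thresholds, weights, and field names are assumptions for this example,
# not the exact parameters of the script used in the analysis.
import re

CLICKBAIT_TOKENS = ("politica", "vocea", "adevarul")  # example keywords only

def risk_category(profile: dict) -> str:
    score = 0
    videos = profile.get("video_count", 0)
    followers = profile.get("follower_count", 0)
    following = profile.get("following_count", 0)

    if videos == 0 and followers > 100:                        # zero videos but many followers
        score += 2
    if profile.get("likes_count", 0) > 500 and videos == 0:    # likes without own content
        score += 1
    if following > 0 and followers / following < 0.05:         # unnatural follower/following ratio
        score += 2
    if not profile.get("bio") and not profile.get("has_avatar", False):  # empty profile
        score += 1
    username = profile.get("username", "").lower()
    if re.search(r"\d{6,}$", username) or any(t in username for t in CLICKBAIT_TOKENS):
        score += 1                                             # repetitive username or click-bait keyword

    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

# Example: an empty profile with a generated-looking username
print(risk_category({"username": "user36156701035617", "video_count": 0,
                     "follower_count": 150, "following_count": 40, "bio": ""}))  # -> "high"
```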

Results

Of the total 13,627 followers classified as HIGH or MEDIUM risk, we analyzed the distribution across 16 of the 17 main accounts (we were unable to scan one account). The largest account, “politica fara idioti” (politics without idiots), has 7,535 followers classified as medium/high CIB risk out of a total of 21,200 followers.

The rest of the accounts that constantly redistribute “politica fara idioti” have an average of 851 followers each, but the most important indicator is that 993 accounts with medium/high CIB risk (accounts with no videos, or with most videos deleted) simultaneously follow multiple profiles in the network. In practice, the same groups of accounts already flagged as potential CIB move together to follow the same profiles.

Coordinated pattern identified: The same 993 followers systematically follow multiple accounts in the network, producing a co-follower network density of 95.8%: almost every pair of main accounts shares common followers.
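For illustration, the two figures in this paragraph (followers recurring across several main accounts, and the share of account pairs with common followers) can be computed roughly as follows. The data below are placeholders, not our dataset.

```python
# Illustrative: how many flagged followers recur across several main accounts,
# and what share of account pairs share at least one such follower (density).
# suspicious_followers maps each main account to its medium/high-risk follower set (placeholder data).
from collections import Counter
from itertools import combinations

suspicious_followers = {
    "account_a": {"u1", "u2", "u3"},
    "account_b": {"u2", "u3", "u4"},
    "account_c": {"u3", "u5"},
}

# Followers that appear in the follower lists of two or more main accounts
counts = Counter(f for fset in suspicious_followers.values() for f in fset)
recurring = [f for f, n in counts.items() if n >= 2]

# Density: share of account pairs that have at least one suspicious follower in common
pairs = list(combinations(suspicious_followers, 2))
connected = sum(1 for a, b in pairs if suspicious_followers[a] & suspicious_followers[b])
density = 100 * connected / len(pairs)

print(len(recurring), f"{density:.1f}%")  # 2 100.0% with the placeholder data
```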

Source: Expert Forum data

What do you see in the matrix above? Each cell shows what percentage of account A’s followers also follow account B (a minimal sketch of how such a matrix can be computed follows the examples below). Examples of interpretation:

  • moldovasatula → farailuzii: 29.1% means that 29.1% of moldovasatula’s followers also follow farailuzii
  • voceadinglod → moldovasatula: 43.7% means that 43.7% of voceadinglod’s followers also follow moldovasatula
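A minimal sketch of how each cell of such a matrix can be computed (account names and follower sets below are placeholders, not real data):

```python
# Illustrative overlap matrix: cell (A, B) = share of A's suspicious followers who also follow B.
from itertools import permutations

followers = {  # placeholder follower sets, already filtered to medium/high risk
    "moldovasatula": {"u1", "u2", "u3", "u4"},
    "farailuzii": {"u2", "u3", "u9"},
    "voceadinglod": {"u2", "u4"},
}

overlap = {a: {} for a in followers}
for a, b in permutations(followers, 2):
    shared = followers[a] & followers[b]
    overlap[a][b] = 100 * len(shared) / len(followers[a]) if followers[a] else 0.0

# Note the asymmetry: the same shared set is divided by a different denominator in each direction.
print(f"{overlap['voceadinglod']['moldovasatula']:.1f}%")  # 100.0% of voceadinglod's placeholder followers
print(f"{overlap['moldovasatula']['voceadinglod']:.1f}%")  # 50.0% in the opposite direction
```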

CLUSTERS WITH HIGH OVERLAP:

  1. voceadinglod → moldovasatula: 43.7% overlap (VERY HIGH!)
  2. cetateanfarafrica → moldovasatula: 36.6% overlap
  3. prime1_md → moldovasatula: 30.7% overlap
  4. farailuzii → moldovasatula: 29.1% overlap

What does all this mean? The cluster around the accounts moldovasatula, voceadinglod, farailuzii, cetateanfarafrica, and prime1_md shows a 25-44% overlap of followers who are themselves classified as high risk for CIB (see here for more on the theory of artificial content invading social media platforms).

What are the possible interpretations?

  • Coordinated activity by bots: the same fake accounts follow multiple targets
  • Artificial amplification network: these accounts are likely promoted together
  • Unusual behavior by real TikTok users who keep empty accounts, regularly delete their content, and follow such click-bait accounts

We also found overlaps that are less suggestive of coordination. For example, accounts such as politica fara idioti and jfjf.hxjx have much smaller overlaps (0.2-2.6%), suggesting more independent follower bases.

What coordination pattern do we observe, however, in the accounts with large overlaps of potentially inauthentic followers? (Accounts supporting Vasile Costiuc followed by over 20% suspicious, medium/high-risk followers.)

Source: Expert Forum data

Identified operational structure

Layer 1 – Main accounts (16 accounts): These accounts function as the core of the operation, characterized by massive reposting among themselves, exclusively political messaging, and recognizable common patterns across the accounts – all positioned against the Moldovan political class currently in power. Their coordination is evident in the use of the same targeting hashtags and in synchronized messaging.

Layer 2 – Bot follower infrastructure (993 accounts): Over 900 accounts function as loyal followers of the same 16 main accounts, popularizing content and acting exclusively as artificial amplifiers. These accounts do not produce their own content, but exist primarily to create the illusion of organic popularity.

Examples of relationships between the identified accounts:

farailuzii (616 followers) ↔ moldovasatula (669 followers): 179 mutual followers with medium/high CIB potential

  • 29.1% of farailuzii’s followers also follow moldovasatula
  • 26.8% of moldovasatula’s followers also follow farailuzii
  • almost 1 in 3 CIB followers overlap!

voceadinglod (371 followers) ↔ moldovasatula (669 followers): 162 mutual followers with medium/high CIB potential

  • 43.7% of voceadinglod’s followers also follow moldovasatula
  • 24.2% of moldovasatula’s followers also follow voceadinglod
  • Highest overlap rate: almost half of voceadinglod’s audience!

farailuzii (616 followers) ↔ voceadinglod (371 followers): 145 mutual followers with medium/high CIB potential

  • 23.5% of farailuzii’s followers also follow voceadinglod
  • 39.1% of voceadinglod’s followers also follow farailuzii
  • Massive bidirectional coordination

cetateanfarafrica (393 followers) ↔ moldovasatula (669 followers): 144 mutual followers with medium/high CIB potential

Data source: Expert Forum

Each bar shows the exact number of followers that two accounts have in common.
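The bidirectional percentages quoted above follow directly from the mutual counts and the follower totals listed for each account. A quick check using the figures reported above:

```python
# Recomputing the reported overlap percentages from the mutual-follower counts above.
pairs = [
    ("farailuzii", 616, "moldovasatula", 669, 179),
    ("voceadinglod", 371, "moldovasatula", 669, 162),
    ("farailuzii", 616, "voceadinglod", 371, 145),
]
for a, a_total, b, b_total, mutual in pairs:
    print(f"{a} -> {b}: {100 * mutual / a_total:.1f}%   {b} -> {a}: {100 * mutual / b_total:.1f}%")
# farailuzii -> moldovasatula: 29.1%   moldovasatula -> farailuzii: 26.8%
# voceadinglod -> moldovasatula: 43.7%   moldovasatula -> voceadinglod: 24.2%
# farailuzii -> voceadinglod: 23.5%   voceadinglod -> farailuzii: 39.1%
```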

Please note: We do not analyze the overlap between organic followers, where people may naturally be interested in the same accounts. We analyze the overlap of suspicious followers (those already scanned and classified as medium/high CIB risk). This lets us measure how often the same suspicious groups appear across the accounts promoting Vasile Costiuc.

The figure below shows some examples of dubious bot accounts. 

The model of disguised political advertising

What we are seeing in Moldova is an evolution of digital electoral manipulation techniques: a model of disguised political advertising similar to that identified in the Romanian presidential elections. The operation functions as an invisible popularity factory, where political content is presented as spontaneous expressions of public opinion, but is in fact the result of coordinated orchestration.

The mechanism starts with “mother” accounts that produce seemingly authentic political content, messages that appear to come from ordinary citizens expressing their support for Vasile Costiuc. These accounts create the illusion of a grassroots movement, of organic enthusiasm for the candidate. Then the infrastructure kicks in: hundreds of accounts with CIB potential artificially amplify this content through likes, shares, and comments, creating an artificial volume of interactions.

What do we think is happening? This artificial amplification exploits the vulnerability of TikTok’s algorithms, which interpret high levels of engagement as an indicator of content relevance and quality. The algorithm begins to promote this content to increasingly wider audiences, creating an avalanche effect that can influence public perception. The end result is that anonymous content promoting a candidate with limited real support can create the impression of significant online popularity, fooling both the media and the electorate.

Why are these operations difficult to detect early on? The model systematically exploits weaknesses in the current digital ecosystem. The accounts used remain deliberately small, with few followers, creating the appearance of irrelevance that allows them to operate under the radar of authorities and platforms. The opacity of TikTok’s algorithms makes it virtually impossible to distinguish between organic and artificial popularity, and limited monitoring capabilities, combined with poor regulation of political advertising on platforms, allow these operations to function virtually undetected.

The danger of influencing voting in a digital space dominated by inauthentic accounts

This report does not advocate censorship of political content, but rather calls for guaranteeing a minimum level of fairness and transparency in an election campaign which, although largely conducted on global social media platforms, remains a national democratic process. That process is deeply influenced by the information circulating on these platforms, a significant part of which is generated by accounts used exclusively for political support, whose identity cannot be verified despite the political agenda they promote. If we ignore these phenomena, we risk that the next elections will no longer be about ideas or voters, but about who builds the most effective invisible network of support, regardless of who the candidates are.

Discussions about regulating political content online often seem too technical or too abstract. In reality, however, we can sum it up by saying that elections are now marked by a new and persistent form of disguised electoral advertising, carried out on social media through accounts used exclusively for this purpose, whose authenticity is, at best, questionable.

Since the beginning of 2024, Expert Forum has been monitoring electoral content distributed on TikTok. After the 2024 Romanian presidential elections, we noticed a significant increase in accounts producing exclusively political content while claiming to be individual support accounts. In previous reports, we called this a new industry of disguised advertising and a normalization of fraudulent online promotion techniques that elevated candidate Călin Georgescu.

Why is all this important? We are talking about accounts that end up at the top of the views for candidates’ hashtags and exhibit patterns of inauthentic behavior that are most likely domestic in origin. In our opinion, these are either accounts managed by marketing or consulting companies, or accounts created by extremely active supporters throughout the campaign.

The fundamental question remains: in an era of global platforms and opaque algorithms, how do we ensure that national democratic processes are not distorted by artificial manipulation techniques? The answer requires stronger regulation, greater transparency from platforms, and improved monitoring capabilities from democratic states.

Appendix

Main accounts monitored

https://www.tiktok.com/@politicafaraidioti
https://www.tiktok.com/@jfjf.hxjx
https://www.tiktok.com/@ministerulminciunii3
https://www.tiktok.com/@rezistmmpreun?_t=ZN-8yqrNbbizZO&_r=1
https://www.tiktok.com/@moldova__libera?_t=ZN-8yqrOaEVFzi&_r=1
https://www.tiktok.com/@moldovasatula?_t=ZN-8yqrPTPDCEw&_r=1
https://www.tiktok.com/@raportdezastru6?_t=ZN-8yqrQOC7siE&_r=1
https://www.tiktok.com/@prime1.md?_t=ZN-8yqrREbxEOM&_r=1
https://www.tiktok.com/@voceadinglod?_t=ZN-8yqrSLX5gR8&_r=1
https://www.tiktok.com/@ceteanfrfric?_t=ZN-8yqrTG646CM&_r=1
https://www.tiktok.com/@guvernullaperete0?_t=ZN-8yqrU3lb1F8&_r=1
https://www.tiktok.com/@farailuzii?_t=ZN-8yqrUvgoOiU&_r=1
https://www.tiktok.com/@poporulvorbeste?_t=ZN-8yqrW5R2OkI&_r=1
https://www.tiktok.com/@user36156701035617?_t=ZN-8yqraZGfbVe&_r=1
https://www.tiktok.com/@devezicefacei?_t=ZN-8yqrdixN1ks&_r=1
https://www.tiktok.com/@globalnew25?_t=ZN-8yqrfQmMWiU&_r=1
https://www.tiktok.com/@voceastrzii?_t=ZN-8yqriubStDa&_r=1