Author: Madalina Voinea
Contributions: Septimius Pârvu
After two years of election campaigns, we end the year with the Bucharest mayoral elections, an extremely close race. However close it may be in reality, we could not help noticing on TikTok a campaign with a high probability of coordinated inauthentic behavior (CIB), one that pushed candidate Daniel Băluță into the feeds of Bucharest residents in an insistent, constant, and organized manner. It is a fairly simple campaign, using organizational methods already tested in the presidential elections in Romania and the Republic of Moldova.
Specifically, we collected 601 videos supporting Daniel Băluță published in November, distributed from 46 potentially inauthentic accounts. Eight of these accounts are the network’s superstars, with 22 million views between them, even though each has on average only 100-200 followers.


It should be noted that this analysis does not aim to, nor could it, assess the electoral impact of this type of covert online campaign. The effectiveness of such promotion depends on many factors, ranging from the candidates’ credibility and the state of real-life debates (including their absence) to the situation on other networks and voters’ own choices. In short, EFOR does not support the idea that a social network dictates voters’ electoral preferences; voting is an individual act influenced by many different factors.
1. What exactly can we see? – The main anomaly identified
The promotion of Daniel Băluță

Source: TikTok
Although we identified 46 active accounts (and several dozen others that deleted their activity) promoting Daniel Băluță using the same formula, eight of them dominate the promotion of the candidate on TikTok. Specifically, using the main hashtag promoting the candidate on TikTok, #danielbaluta, with 60.9 million views in November, we collected 601 videos, of which 354 have over 10,000 views. Of these, 249 of the most popular videos about Daniel Băluță were created by the inauthentic network we discovered. In other words, 70.3% of the viral content about Băluță came from these accounts in November.
By comparison, only 64 videos from the official account made it into the virality top (at least 10,000 views), or about 18%. Most of these viral November videos use the same 10 hashtags, including: #sector4 #romania #danielbaluta #bucurestiromania #sector1 #sector2 #sector3 #sector5 #sector6

What does it mean when the main promotional content for a candidate is created by anonymous accounts?
Beyond the bitterest lesson of online elections, the lack of transparency, we are witnessing a major failure of the platforms to impose even minimal safety mechanisms against this kind of information manipulation. This is a lazy campaign, of low complexity and probably low cost, run, if we may speculate, either by volunteers or by a marketing company working to a daily posting schedule across a defined set of accounts.
When an account with 239 followers, bucuresti.now, constantly produces viral videos reaching up to 1.2 million views apiece, we seriously question the integrity of the services that social media platforms offer, especially when it comes to the viralization of political content. In such cases, the platform bears a great responsibility not to let itself become a tool of manipulation, contributing with its own algorithm to a phenomenon that is frankly absurd, yet cheap and easy to use for covert electoral promotion.

We do not suspect this campaign of any hidden complexity; rather, it exploits systemic vulnerabilities through which social media platforms are easily infiltrated and manipulated into giving a disproportionately louder voice to the politicians who benefit from such accounts, a problem that cannot be masked by the pretext of users’ organic interest in these candidates. By comparison, Anca Alexandrescu, a traditional presence on TikTok, has apparently been dethroned by these accounts promoting Daniel Băluță.
These posts are not marked as political/electoral advertising material, according to Romanian law and EU Regulation 2024/900 on political advertising. According to these, political actors are required to mark such material, including with the name of the sponsor and the entity that produced the campaign. An actor is understood to be a political party, party members, and even individuals who carry out a campaign that is usually paid for. According to Romanian law, competitors cannot be promoted by third parties, and any electoral material must be paid for from the campaign account declared by the competitor to the AEP. Although citizens enjoy the right to free speech and can post about candidates without being required to label their comments, we believe that the situation described in this report may be more than just posts by unrelated citizens—as noted, it appears to be coordinated behavior. This raises the question of whether some of these posts may have been made by entities that did not properly label and declare an election campaign.
In this context, where platforms have banned transparent political advertising declared through their official advertising mechanism, what can honest political actors do? They have only two options:
- Continue to play by the rules and be defeated by inauthentic networks, or
- Create such campaigns themselves so that they can fight on equal terms
Scenario for interpreting the coordinated inauthentic network – Follower patterns
To analyze the followers of each of the eight main accounts, which generated 22 million views in one month exclusively with political content promoting candidate Daniel Băluță, we classified each follower account into a risk category. An account suspected of being inauthentic was included in the analysis only if it fell into the medium- or high-risk category; accounts in the low-risk category were excluded.
We used an analysis model (a Python script implementing a heuristic model) that combines several parameters to estimate the probability that an account is inauthentic. The model looks for suspicious behaviors such as: zero videos but many followers; likes given without any content of one’s own; an unusual ratio between followers and followed accounts; empty profiles with no bio or avatar; and repetitive patterns in usernames and click-bait keywords. The sum of these signals yields a risk score: high, medium, or low.
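A minimal sketch of what such a heuristic scorer could look like. The thresholds, weights, and keywords below are hypothetical assumptions for illustration; the report does not publish the model’s actual parameters.

```python
# Illustrative sketch of a heuristic risk scorer for TikTok profiles.
# All thresholds, weights, and keywords here are hypothetical; they are
# NOT the parameters of the model described in the report.
from dataclasses import dataclass

@dataclass
class Profile:
    username: str
    videos: int
    followers: int
    following: int
    likes_given: int
    has_bio: bool
    has_avatar: bool

SPAM_KEYWORDS = ("news", "now", "viral", "2025")  # hypothetical click-bait markers

def risk_score(p: Profile) -> str:
    score = 0
    if p.videos == 0 and p.followers > 100:              # many followers, no content
        score += 2
    if p.likes_given > 500 and p.videos == 0:            # likes without own content
        score += 1
    if p.following > 0 and p.followers / p.following < 0.05:  # skewed follower ratio
        score += 1
    if not p.has_bio and not p.has_avatar:               # empty profile
        score += 2
    if any(k in p.username.lower() for k in SPAM_KEYWORDS):   # repetitive naming
        score += 1
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

An account with no videos, no bio or avatar, and a click-bait username would score "high" under this sketch, while an ordinary profile with its own content would score "low".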

Source: TikTok
Please note: we do not analyze the overlap between organic followers, who may naturally be interested in the same accounts. We analyze the overlap of suspicious accounts (empty accounts scanned and classified as medium/high CIB risk). This lets us see how often the same suspicious groups reappear among the followers of other accounts promoting the same content.
Results
Of the total of 1,286 followers collected, 713 (55.4%) were classified as having a HIGH or MEDIUM risk of being fake accounts. We analyzed the distribution of these followers among the eight main accounts. First, we see pairs where the overlap is high:
- ajunul.de.maine – doza.romniei-202: 82.8% of the same followers (empty accounts)
- victoria.emma.voi – bucuresti365: 77.6%
- ajunul.de.maine – linkalazar: 77.6%
- victoria.emma.voi – bucurestinow: 75.9%
The cluster formed around the accounts bucurestinow, ilinkalazar, doza.romniei-202, victoria.emma.voi, ajunul.de.maine, and pelimbatuturor shows an overlap of 50-83% of followers who are themselves classified as high risk for CIB (Coordinated Inauthentic Behavior).
What does this overlap of suspicious followers mean? When we say that ajunul.de.maine and doza.romniei-202 have an overlap of 82.8%, it means that of the 58 suspicious followers of the ajunul.de.maine account (out of its 133 followers in total), 48 suspicious followers (82.8%) follow BOTH accounts.
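The overlap figure described above can be reproduced with a small helper. The follower sets below are hypothetical stand-ins for the scraped data, chosen to mirror the 48-of-58 example:

```python
# Overlap metric as described above: of account A's suspicious followers,
# what share also follow account B. The measure is asymmetric by design.

def follower_overlap(suspicious_a: set, suspicious_b: set) -> float:
    """Return the share of A's suspicious followers who also follow B."""
    if not suspicious_a:
        return 0.0
    return len(suspicious_a & suspicious_b) / len(suspicious_a)

# Hypothetical example mirroring the numbers in the text: 58 suspicious
# followers of A, 48 of whom also follow B -> 48/58, i.e. 82.8%.
a_followers = {f"acct_{i}" for i in range(58)}
b_followers = {f"acct_{i}" for i in range(48)} | {f"other_{i}" for i in range(20)}
share = follower_overlap(a_followers, b_followers)
```

Note that the measure is asymmetric: follower_overlap(A, B) and follower_overlap(B, A) differ whenever the two suspicious-follower sets have different sizes.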

2. Other viral patterns identified
The situation of candidate Anca Alexandrescu
The ecosystem of accounts promoting Anca Alexandrescu comes from the self-proclaimed sovereignist movement, with the realitatea.plus account as the main supporting account.

Another account that caught our attention is the alleged marketing agency Swapera, which we identified in November 2024 promoting the POT party and Ana Maria Gavrilă before the parliamentary elections. Once again, it is unclear to us what regulations apply to accounts claiming to be influencer-marketing companies, such as the one we identified, which grows its page with entertainment content and then posts political content for a fee. On the one hand, we need to discuss how the platform regulates these accounts and whether they must declare their identity so as to be traceable in real life. On the other hand, we also need to discuss the obligations of online marketing companies to declare their identity and to report and flag political advertising.

Not least, potential signals of inauthentic accounts began to increase in early November, such as the account ancaalexandrescufp, with 127,000 followers, a supposed fan account that nonetheless posted only two videos between October and December. Similarly, the account alexandrescuanca2025 frequently posts about general topics related to realitatea.plus and AUR but does not directly and consistently promote Anca Alexandrescu, despite an audience of 66,000 followers. Finally, we monitored the recently created account ancaalexandrescuprimar, which grew fairly rapidly to 128,000 followers, with less than 1% of followers from Nigeria, a potential sign of inauthentic amplification.
However, the account appears to be inactive at this time, and we cannot estimate more data about the extent or potential extent of this promotion.

The anti-campaign against Cătălin Drulă and Ciprian Ciucu

We decided to analyze these candidates’ presence together because both recorded an approximately equal share of negative and positive content in November, with the negative content created by anonymous accounts that produce exclusively anti-campaign material against Ciprian Ciucu, Cătălin Drulă, and Anca Alexandrescu. For the USR and PNL candidates, these accounts had a significant share of the total visibility recorded on the main hashtags associated with them. On the positive side, we identified several accounts without a clear identity, such as ciprianciucupmb, which appears to be a second official account, or littleone.raul and drula.pentru.bucu, though it is uncertain whether these are campaign accounts or real supporters. At this time, we have not identified any additional signs of inauthentic coordination beyond the lack of a traceable, clear identity.


Here, the fine line is between the right to free speech and, implicitly, the right to post anonymously, and the use of this right by political parties to engage in covert political advertising.
Conclusions
Before invoking any hybrid war with Russia, we suggest taking a look at our own traditional political parties, where we are seeing a real phenomenon: the same promotion techniques are beginning to be used, either out of desperation to catch up with the online propaganda already mastered by extremist parties, or because the parties have realized for themselves what great opportunities exist for manipulating the public.
What we want to emphasize at the end of 2025, however, is that we already have a new undeclared promotion technique that we now encounter in every election: the construction of inauthentic networks, some simpler, some more complex, which mix real people behaving in a coordinated manner, whether paid or not, with anonymous accounts that create exclusively political content and end up far more popular at promoting candidates than the official accounts.
What criteria do we use to define this behavior? We suggest looking at the legal basis we have in the European Union, specifically the Code of Practice against Disinformation integrated into the “Digital Constitution” of Europeans, the Digital Services Act (DSA). From our analysis, in Romania, the phenomenon that most affected the integrity of the 2024 and 2025 online elections is Coordinated Inauthentic Behavior, whose characteristics we find in this code as unacceptable behavior on social networks, defined as follows:
“- Creating and using fake accounts, taking over accounts, and amplification through bots,
- Hacking and disclosure of information,
- Identity theft,
- Malicious forgeries,
- Purchasing fake interactions,
- Non-transparent paid messages or promotion by influencers,
- Creation and use of accounts that engage in coordinated inauthentic behavior,
- User conduct aimed at artificially amplifying the perceived spread or public support for misinformation.” (Commitment 14 – Service Integrity)
Under the terms and conditions of operation of large platforms, these practices are also sanctioned because they affect the integrity of the services offered. What does this mean? That it is an exaggeration and an absurd simplification to say that Daniel Băluță could win the election solely because of an online campaign promoting him on TikTok.
The problem, however, lies in the practice of being able to so easily control what content dominates social media during a sensitive period, when people spend time and get their information on social media. And all this without marking the material as political advertising. When real discussions in society, real accounts, and legally conducted campaigns have a much smaller impact compared to such anonymous accounts, we believe that the integrity of the electoral process is affected, giving a much louder voice and an unequal platform to political actors who benefit from inauthentic promotion.
At the same time, we are lying to ourselves that an election campaign costs 810,000 lei, the maximum allowed for a mayoral or county council president candidacy. In reality, there are also these unmarked campaigns, whose cost and funders we do not know, unsupervised by the relevant authorities, which lack the capacity to monitor them proactively. It is therefore pointless to scrutinize invoices for flyers, posters, and legal campaigns if we miss all these undeclared funds, whether or not a candidate is aware of the campaigns carried out by and/or through third parties.
Whose problem is this?
- First and foremost, EFOR believes it is the responsibility of the platforms not to allow these inauthentic practices of manipulating the flow of information. Most likely, no one at the platforms is really looking at these trends, when a think tank with limited resources and poor access to public data, like us, can see the trends that brought Daniel Băluță disproportionately more views than the other candidates. Clearly, either the platforms’ self-monitoring and so-called mechanisms for detecting inauthentic behavior and possible influence operations have been put on hold, or they were never very effective to begin with. We have sufficient analysis from Victor Ilie and Luiza Vasiliu, Public Record, and Funky to show that there is also a systemic problem on Facebook.
- Secondly, political parties have an ethical and legal responsibility towards citizens. Here, we believe that a code of conduct adopted by political parties acting in good faith, who wish to operate in a democratic space, is becoming increasingly necessary. Not because we are naive and believe that these practices will not continue, but rather because it is a start in obtaining a public commitment on the basis of which citizens can understand the credibility and techniques that parties use to influence their vote.
- Romanian institutions. We understand that throughout the EU and the world there is a state of numbness regarding the impact of online campaigns, especially since they have become the main means of promotion for many political parties, particularly extremist parties (until now). However, we reiterate EFOR’s recommendations to start building real capacity within a civil institution to monitor the online environment during election periods. In this regard, we must abandon the institutional reflex seen in the 2025 presidential elections, manifested in the task force for regulating political advertising, an exercise that ended up censoring the legitimate opinions of some citizens and demonstrating the ineffectiveness of an attempt to police the internet. What needs to be done is to track inauthentic viral networks in advance and hold our politicians accountable for what they do online.
Methodological Notes
Data Collection and Selection
The analysis carried out for this project covers the entire month of November and is based on a comparative set of data collected previously, in October. The aim was to track the evolution of the online presence of the main candidates for Bucharest City Hall on TikTok and to identify both the organic dynamics of interactions and possible forms of inauthentic amplification.
The data collection process was based on a mixed infrastructure of technical tools and manual checks. We used public scrapers and specialized applications such as Exolyt and Apify to download relevant content and build a coherent picture of the activity on the platform. In total, we analyzed 601 videos associated with Daniel Băluță, 301 videos associated with Ciprian Ciucu, the same number for Anca Alexandrescu, and 201 for Cătălin Drulă. To ensure a minimum level of visibility and relevance, we only included materials that exceeded the threshold of 10,000 views. For each video, we collected the data provided by TikTok (number of views, likes, and shares), and all videos were automatically transcribed to allow further thematic analysis of the verbal content.
An essential part of the methodology focused on how we differentiated organic activity from potentially inauthentic signals. For anonymous accounts, we excluded users who post personal content or who are easily identifiable as politicians, influencers, or public figures. We were not interested in analyzing authentic citizens who express their political opinions, even when they do so under a pseudonym, but strictly those accounts that exhibit manipulative behavior or a high risk of inauthenticity. In this category, we considered anonymous accounts that predominantly or exclusively post political content, have no traces of personal activity, or systematically reproduce the same material, sometimes accompanied by disinformation or conspiracy theories. The identification of a potentially inauthentic network was not achieved solely through automated signals, but through a continuous process of human validation, precisely in order to limit the errors inherent in a field where data is incompletely accessible and constantly changing.
For the thematic analysis of the downloaded content, we used BERTopic modeling, a tool that builds semantic representations of texts and groups them into coherent clusters. This process was followed by a manual interpretation of the themes, in order to avoid excessive reliance on automatic classifications and to maintain analytical control over the nuances of discourse.
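BERTopic itself combines transformer embeddings with density-based clustering; as a much-simplified, standard-library-only stand-in (explicitly not the actual pipeline), the grouping step can be illustrated by bucketing transcripts under their dominant keyword:

```python
# Much-simplified stand-in for the topic-grouping step. This is NOT
# BERTopic (which uses transformer embeddings plus HDBSCAN clustering);
# it merely illustrates the idea of sorting transcripts into buckets.
from collections import Counter, defaultdict

STOPWORDS = {"the", "a", "of", "in", "for", "and", "to", "is", "on"}  # illustrative

def dominant_keyword(text: str) -> str:
    """Most frequent non-stopword in a transcript, or a sentinel if none."""
    words = [w for w in text.lower().split() if w not in STOPWORDS]
    if not words:
        return "<empty>"
    return Counter(words).most_common(1)[0][0]

def bucket_transcripts(transcripts: list) -> dict:
    """Group transcripts by their dominant keyword."""
    groups = defaultdict(list)
    for t in transcripts:
        groups[dominant_keyword(t)].append(t)
    return dict(groups)
```

A real semantic model groups texts by meaning rather than by surface keywords, which is why the manual interpretation step described above remains essential.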
Like any digital monitoring approach, this one also has certain limitations. The quality of the results depends directly on the reliability of the data collected and the technical parameters of the tools used; clustering models may react differently to minor variations in data; and thematic interpretations, even when informed and rigorous, remain human interpretations. To reduce the effects of these limitations, the methodology combined automated processes with manual checks.
This approach always has an inherent limitation: the possible invisibility of potentially viral accounts that do not use hashtags, possibly because they do not want to be detected. We cannot rule out the existence of such campaigns for this electoral process that our analysis did not take into account due to technical limitations and limited access to public data from the platforms.
Final remarks
The network promoting Daniel Băluță was also analyzed due to its visibility in the sample of data we collected using the hashtags of all candidates. This analysis does not exclude the existence of campaigns with similar practices for other candidates using other hashtags or completely different techniques, tactics, and procedures that may not have been visible to us due to the limited access to public data that researchers have from social media platforms.
At the same time, it is important to clarify how we interpret views on TikTok. The metric provided by the platform, view_count, indicates how many times a video is displayed in the feed, not necessarily how many times it is viewed in its entirety or attentively by a user. This feature explains, at least in part, the high and often spectacular view counts on TikTok compared to platforms such as Meta. The phenomenon also contributes to the platform’s appeal among users, who perceive these high numbers as indicators of high visibility.
In the present analysis, this distinction becomes essential: especially in the context of a network that can also generate artificial traffic, the real impact of these campaigns remains difficult to estimate. Without rigorous impact studies, including sociological research conducted in Romania, we cannot deduce from the raw viewing data a direct effect on political behavior or voting intention. The results must be interpreted with caution and contextualized within a broader framework, in which the figures represent only one component, not the entire phenomenon.

