WARSAW — The Polish government has formally requested that the European Commission investigate TikTok following the emergence of AI-generated content urging Poland to exit the European Union. Officials in Warsaw have identified the campaign as Russian disinformation, citing the use of Russian syntax within the synthetic recordings.
In a letter to the Commission, Deputy Digitalization Minister Dariusz Standerski argued that the platform is failing to meet its obligations as a “Very Large Online Platform” (VLOP) under the EU’s Digital Services Act (DSA). Standerski warned that the content—which featured AI-generated videos of women in Polish national colors—threatens public order and the integrity of democratic processes across the EU.
While the specific profile behind the videos has since disappeared, TikTok stated it remains in contact with Polish authorities and has removed content that violates its internal rules. The European Commission confirmed receipt of Poland’s request. Under the DSA, platforms like TikTok must assess and mitigate risks stemming from AI and face potential fines of up to 6% of their global annual turnover for non-compliance.
Analysis: The Convergence of AI and Geopolitical Disinformation
The incident in Poland underscores an emerging era of “hybrid warfare,” in which generative AI is leveraged to destabilize societies from within. Unlike traditional propaganda, synthetic audiovisual material, often referred to as deepfakes, allows bad actors to manufacture the appearance of grassroots domestic dissent that the average user struggles to distinguish from authentic content.
The use of Russian syntax in these Polish-language videos illustrates a persistent challenge for foreign intelligence operations: the “uncanny valley” of linguistics. Even as AI visual quality improves, subtle grammatical errors often remain the “digital fingerprint” that allows intelligence agencies to trace the origins of a campaign back to state-sponsored actors.
This request for a probe also highlights the growing teeth of the Digital Services Act. By designating platforms like TikTok as VLOPs, the EU has shifted the compliance burden onto the tech giants. It is no longer sufficient for platforms to simply react to user reports; they are now legally required to proactively assess and mitigate the systemic risks their algorithms create. This marks a significant pivot from the “Wild West” era of social media to a regulated environment in which digital negligence carries massive financial consequences.