Engineer · Adversarial Researcher · Bootstrapper

I build tools
state actors and platforms would
rather didn't exist.

I'm Amaury Lesplingart, a developer who ended up doing adversarial research on state actors and platforms.

About

Who I am
& how I got here.

I started as a developer. Before any of this, I was building SaaS products, running small agencies, doing IT for a radio station. A bootstrapper by conviction, more interested in shipping things than pitching them.

The shift came by accident. During the first COVID lockdown, I ended up in a Discord server where journalists were debunking false claims in real time. I stayed, started contributing, and together we built Journalistes Solidaires: a live open newsroom where anyone could follow along. It wasn't a plan. It was a reflex, and eventually the seed of CheckFirst, a Finnish adversarial research company I co-founded.

What followed was five years of building things in the open, for the community. The common thread: if something is happening online and no one can see it clearly, build something that shows it. Recommendation audits run from real households across eleven countries. Propaganda networks tracked in real time, hourly. Investigations that ended up in European Commission proceedings. Policy frameworks now used in government briefings from Brussels to Berlin.

The approach has always been the same: build it in the open, publish the methodology, and make the findings impossible to deny.

Engineering
Python · Node.js · JavaScript · PHP · Bash · HTML5 / CSS · Web Scraping · API Design · UX / Design · AI / LLM
Research & Intelligence
OSINT · Algorithm Auditing · Info Manipulation Research · FIMI · DSA Compliance · Platform Analysis · Influence Operations
Entrepreneurship
Bootstrapping · Product Strategy · Team Building · NGO Collaboration · Public Speaking
Selected Work · 2020 – 2026

Things I
actually built.

From platform monitoring tools and propaganda trackers to OSINT training platforms and policy frameworks: a cross-section of years of adversarial research engineering.

2026

2025

Influence by Design
In collaboration with Reset.tech & AI Forensics

Unveiled the extent to which the Kremlin-linked entity SDA exploited Meta platforms to disseminate propaganda, revealing that Meta accepted payments for Russian propaganda despite sanctions.

High Impact · Russia / Sanctions · Big Tech
Portal Kombat
In collaboration with DFRLab

Real-time tracker of the Russian Pravda propaganda network: 3.7M articles monitored, updated hourly, from 28+ countries. Reverse-engineered the Pravda web API and released the full dataset publicly. Shows how Kremlin content is injected into Wikipedia, AI chatbots, and X.

High Impact · Russia / Ukraine · DFRLab
Tutki (prev. Oppi)
OSINT Training Platform · Creator

Interactive OSINT training platform for analysts and researchers. Trainers create custom information manipulation scenarios or use professional templates. Participants work through realistic multi-platform simulations. No tracking, no passwords. Used by democratic institutions and civil society.

Training Platform · Live Product
RADAR
DSA Compliance Tool · Author

A common framework for researchers, civil society, and regulators to categorise and report Digital Services Act infringements, giving the European ecosystem a shared language and an interoperable reporting format.

EU Standard · Policy Tool · DSA
Community Notes Dashboard
Platform Monitoring

An interactive dashboard analysing Community Notes data from X. Tracks global distribution, fact-checking source usage, note visibility and responsiveness, author patterns, and AI-detected content trends across languages. Updated twice daily, with filters by language and timeframe.

Live Product · Tool · Big Tech
Impact-Risk Index & Calculator
In collaboration with EU DisinfoLab

Updated EU DisinfoLab’s Impact-Risk Index to reflect the latest advances in AI and coordinated inauthentic behaviour. I also built the accompanying automated Impact Calculator, freely available to the community, standardising how researchers assess the reach and severity of individual hoaxes.

Tool · EU DisinfoLab · Open Access

2024

Operation Overload
In collaboration with Reset.tech

Exposed a pro-Russian campaign designed to exhaust fact-checkers: 800+ organisations targeted, 2,400+ tweets, 200+ emails. The campaign timed its attacks to major events, including the Paris Olympics and elections, and generated 250+ fact-check articles amplifying the fake assets it had created.

High Impact · Influence Ops · Reset.tech
Meta Ad Transparency Audit
In collaboration with AI Forensics

Systematic audit of political ads on Meta that evaded moderation via character hiding and word obfuscation, reaching 3M+ accounts across Italy, France, Germany and Poland. Directly cited in EC DSA proceedings against Meta. Distinct from Facebook Hustles: this audit covered political information manipulation rather than scam advertising.

Influence Ops · AI Forensics
Mozilla Ad Tools Study
Commissioned by Mozilla

Audited the ad transparency frameworks of 11 major tech platforms (Google, Apple, Microsoft, Meta, TikTok, X, Pinterest, Snap…) ahead of the 2024 elections. Named X as the worst offender; findings featured on CNBC.

High Impact · Mozilla · CNBC Featured
Amazon Algorithm Audit
In collaboration with AI Forensics

A deep-dive into Amazon’s recommendation engine, uncovering content amplification mechanisms and user entrapment in dubious narratives. Presented at Infox sur Seine 2024, Paris.

Big Tech · AI Forensics
Meta’s Role in Romania’s 2024 Election
Research Report

Uncovered a coordinated network of Facebook pages linked to the far-right AUR party running 3,640 political ads, reaching an audience of 148 million, in systematic violation of Meta’s ad policies and Romanian electoral law. Cross-platform coordination also identified on TikTok and Google Ads. Cited by TechPolicy.Press, DSA Observatory, and Candid Technology.

Elections · DSA · Big Tech

2023

Facebook Hustles
Scam Ads Investigation

Month-long investigation exposing a large-scale scam operation on Facebook, with 1,500+ ads across a network of deceitful media sites, reaching 3M+ users in 12 European countries in just 3 weeks. Triggered European Commission formal proceedings against Meta under the DSA.

High Impact · EC / DSA · Big Tech
ObSINT Guidelines
EFCSN / EU DisinfoLab

Co-authored the European standard for public-interest OSINT investigations under the EFCSN project, now the reference methodology for fact-checkers, researchers, and civil society across Europe.

EU Standard · EFCSN

2022

22vlalapub
French Presidential Election

A monitoring tool for political advertising around the 2022 French presidential elections on Meta platforms. Tracked candidates, parties, and campaign themes: who pays for the ads, who broadcasts them, which audiences and territories are targeted, and how much money is involved.

Elections · Ad Transparency · Big Tech

2021

Crossover
Platform Monitoring · 2021–2025

Mini-computers deployed to real households across 11 European & African countries, tracking what Google, YouTube, Twitter, Reddit and Google News push algorithmically. Fully open methodology, so platforms can’t deny the data. Started in Belgium, now active across two continents.

Flagship · 11 Countries · Live Product
Key Collaborations

Working at the
heart of Europe's
FIMI response.

The nature of this work means you end up in unusual rooms. Over five years I've co-authored reports with national agencies, supplied technical data to regulatory bodies, built tools that ended up in government briefings, and sat at the table with institutions shaping responses to foreign information manipulation.

None of that was planned. It followed from building things in the open and making them impossible to ignore.

Jan 2026 · Joint Report

Building a Common Operational Picture of FIMI

Together with EU DisinfoLab, VIGINUM, CASSINI, the German Federal Foreign Office (Auswärtiges Amt), the EEAS and DFRLab, we applied VIGINUM's Information Manipulation Set (IMS) framework to five major Russian operations: Doppelgänger, Media Brands/RRN, Undercut, Storm-1516, and Overload.

EEAS · VIGINUM · EU DisinfoLab · CASSINI · Auswärtiges Amt · DFRLab · CheckFirst
Publications & Academic Citations

Authored work
& cited research.

Press & Recognition

They've written
about the work.

CNBC
"Ad transparency tools are a major disappointment ahead of election"

Featured interview on the Mozilla ad transparency audit of 11 major platforms. Named X as the worst offender. Published ahead of the 2024 global election cycle.

Read article →
Bloomberg
Russian trolls use vampire expert to spread information manipulation

Bloomberg cited findings on coordinated Russian influence campaigns targeting European audiences, published in January 2025.

Read article →
Politico EU
Pro-Russian ad networks spread on Facebook despite EU probe, say researchers

Politico EU covered the findings on coordinated pro-Russian advertising campaigns persisting on Meta platforms despite ongoing European Commission DSA proceedings.

Read article →
Frankfurter Allgemeine Zeitung
Why information warriors from Russia target European fact-checkers

One of Germany's most prestigious newspapers covered the Operation Overload research, exposing why Russian actors systematically target European fact-checkers to overwhelm and discredit them.

Read article →
El País
Russia used the von der Leyen no-confidence vote to stir up polarisation in the EU

El País cited the research on Russian information manipulation operations targeting EU political processes, published in July 2025.

Read article →
Nieman Lab · Harvard
A new Mozilla report exposes major flaws in social media ad libraries

Harvard's Nieman Journalism Lab covered the Mozilla ad transparency audit, highlighting findings on the structural inadequacy of platform ad repositories for researchers.

Read article →
Neue Zürcher Zeitung
How researchers unmasked Russia's spy network by analysing medals

Video interview for NZZ on how analysing medals awarded to members of Russian intelligence services reveals the hidden structures of the FSB, offering a rare look behind the facade.

Watch video →
Le Figaro
France, the number one target of pro-Russian propaganda

Le Figaro covered the research on how polarised debates, fake polls, and chatbots are being used by pro-Russian operations to target France as their primary European audience.

Read article →
Speaking

On stage,
across Europe.

2025
Sofia Information Integrity Forum
SIIF 2025 · Sofia, Bulgaria · Information manipulation and platform accountability
Conference
2025
Centre for the Study of Democracy
Building Europe's Media Democracy Shield: Countering Foreign Information Manipulation and Interference
Conference
2025
SANS Institute: SEC587 Guest
Advanced OSINT Gathering & Analysis · Information manipulation detection module
Training
2024
EFCSN Annual Conference · Brussels
"Foreign and/or domestic: who runs information manipulation campaigns?" · European Fact-Checking Standards Network
Conference
2024
Infox sur Seine · Paris
Presented "The Amazing Library", an Amazon algorithm audit conducted with AI Forensics
Conference
2024
EU DisinfoLab Annual Conference
"Transparency without accountability is not enough" · Platform governance and big tech
Conference
2024
LeHack · Paris
Workshop on the ObSINT Guidelines for public interest OSINT investigations
Workshop
2023
EU DisinfoLab Webinar: ObSINT Guidelines
Co-presented the European OSINT Guidelines for public interest investigations with Alexandre Alaphilippe
Webinar
2023
Digital Methods Initiative · Winter School
Project lead: YouTube System Recommendation research project
Workshop
2022
Atlantic Council · Digital Sherlocks
DFRLab Digital Sherlocks programme gathering · OSINT and information manipulation research
Event
2021
CIRCOM Regional: Fighting Fake News
Learning from a Global Pandemic: OSINT tools and verification techniques for journalists
Conference
Contact

Get in
touch.