Essay
The substrate nobody voted for
After eight maps on elections, campaign finance, social media, misinformation, and institutional trust, a pattern emerges: we've been protecting democracy's procedures while its substrate erodes beneath them.
On November 24, 2024, Romanian voters went to the polls in a presidential election. The result was a shock: Călin Georgescu, an ultranationalist candidate polling in single digits weeks earlier, won the first round with 23 percent of the vote. He had run almost no traditional campaign. What he had done was post obsessively on TikTok, where his videos accumulated 52 million views in four days. Romanian intelligence subsequently declassified documents showing more than 25,000 coordinated accounts, paid influencer networks, and over one million euros in undeclared campaign funds. TikTok's algorithm had recommended his content between four and fourteen times more often than that of any rival, a skew that Global Witness documented. On December 6, Romania's Constitutional Court unanimously annulled the entire election, making Romania the first European country to cancel a presidential contest on grounds of algorithmic and cyber interference.
The election had been free in the formal sense. Citizens voted without coercion. Ballots were counted accurately. The result was annulled not because the procedure failed but because the substrate beneath it had been compromised — the information environment through which voters formed their views, the financial architecture through which campaigns competed, the epistemic conditions that made a meaningful democratic choice possible.
Romania is a concentrated version of what Ripple's democracy and governance maps have been circling for two years. Eight disputes that look separate — about voting methods, social media platforms, AI in elections, campaign money, misinformation, journalism, and institutional trust — share a structure that no single map can reveal. They are all arguments about the same thing: not how democracy's procedures work, but whether the substrate beneath those procedures still holds.
The hidden structure
Democratic theory has long distinguished between the procedures of democracy and the conditions that make those procedures meaningful. Elections can be formally free and fair — accurate counts, equal access to ballots, peaceful transitions — while failing to be substantively democratic in any richer sense. The substantive conditions are harder to name and harder to measure, which is why democratic debates tend to fixate on procedures. Procedures are legible. Substrate is not.
Across the eight maps, three substrate conditions appear repeatedly, usually unnamed, always contested:
The first is epistemic commons — a shared informational environment in which citizens, though they disagree about values, are working from a roughly common factual reality. Jürgen Habermas called this the precondition for democratic deliberation: an open, rational public sphere in which claims are tested against evidence and subjected to public scrutiny. Cass Sunstein, in Republic.com (2001) and its update #Republic (2017), warned that algorithmically personalized media would produce "information cocoons" — not merely filter bubbles that narrow what people see, but "cybercascades" that amplify extreme positions through like-minded networks until groups drift toward conclusions none of their members would have reached alone. Lisa Herzog, in Citizen Knowledge (2023), named what this erodes: the "epistemic infrastructure" of democracy, the schools, independent media, and civil society that produce citizens capable of democratic self-governance. When that infrastructure is degraded — whether by market logic, platform architecture, or deliberate manipulation — elections continue but their epistemic preconditions weaken.
The second is equal political agency — the condition that citizens have roughly comparable influence over political outcomes. This is never perfectly achieved; the wealthy have always had more political access than the poor. But there is a threshold below which the gap is so large that "one person, one vote" becomes a formal description of an unequal reality. Martin Gilens's analysis of 1,779 policy cases found that average Americans have near-zero statistical influence on whether a proposed policy is adopted — while the preferences of economic elites and organized interest groups are strongly predictive. The Gilens finding has methodological critics (Bashir's 2015 replication challenge argues the model underestimates citizen influence), and the debate is genuinely live. But the pattern — that in domain after domain the adopted policy is closer to high-income preferences — does not depend on a single study.
The third is institutional legitimacy — citizens' basic belief that the system's rules are fair enough to accept, even when they lose. This is the substrate condition that, once lost, is hardest to rebuild. The 2025 Edelman Trust Barometer, surveying 33,000 people across 28 countries, found that 69 percent of respondents globally worried that government officials were intentionally lying to them — up from 58 percent in 2021. Pew Research found that in April 2024, only 22 percent of Americans trusted the federal government to do the right thing most of the time; in 1964, the figure was 77 percent. In spring 2025, Democrats' trust hit a recorded low of 9 percent. These are not just opinion measurements. They are measurements of whether the substrate holds.
The epistemic gap: what the information debates are really about
The misinformation and epistemic crisis map, the social media and democracy map, the AI and democracy map, and the journalism and media trust map all look, on the surface, like disputes about technology and speech. They are actually disputes about the epistemic commons — about what the shared informational substrate of democracy requires and who is responsible for maintaining it.
The platform accountability position holds that social media companies have made editorial choices — about what to amplify, what to suppress, and how to design recommendation systems — that have systematically degraded the epistemic commons. Frances Haugen's internal documents showed that Facebook's own researchers identified harms the company chose not to mitigate. Stanford Internet Observatory's closure in 2024, under pressure from lawmakers who viewed its disinformation research as censorship advocacy, is treated by this position as evidence of regulatory capture: the entities being studied successfully delegitimized the researchers studying them.
The free speech and anti-moderation position holds that the greater threat to the epistemic commons is not misinformation but state-adjacent efforts to define it — that when researchers, platforms, and governments collaborate to label content as misinformation, they produce a different epistemic distortion, one that suppresses heterodox but sometimes accurate information and concentrates epistemic authority in institutions that are themselves politically positioned. The Twitter Files, from this view, documented exactly this: an informal moderation regime that exceeded what any democratic mandate authorized.
Both positions are protecting something real. The platform accountability position is protecting the epistemic commons from the commercial logic of attention maximization, which is indifferent to truth and systematically rewards outrage. The anti-moderation position is protecting the epistemic commons from the monopolization of truth-determination by any single institution — including institutions that dress their preferences in epistemic authority. Neither position is wrong about what it's afraid of. They are afraid of different failure modes in the same substrate.
The AI and democracy dispute adds a further dimension. In 2024, AI was used in electoral contexts in more than 80 percent of countries that held elections. Of documented uses, 90 percent involved content creation — deepfakes, AI-generated endorsements, synthetic avatars of politicians — rather than direct manipulation of vote counts. The Alan Turing Institute's Centre for Emerging Technology and Security (CETaS), having studied more than a hundred elections since 2023, found no conclusive evidence that AI content had changed vote totals; it did find that AI material had "influenced election discourse, amplified harmful narratives, and entrenched political polarization." Romania is the clearest case: the manipulation was epistemic, not procedural. The ballots were counted accurately. The environment in which voters formed their choices was not.
A 2024 Pew survey found that 64 percent of Americans said social media has been "more of a bad thing" for democracy — the most negative assessment among 27 nations polled. Trust in national news organizations fell by 20 percentage points between 2016 and 2025, to 56 percent. These numbers don't settle which regulatory approach is correct. But they measure the scale of the epistemic gap — the distance between the informational conditions democracy needs and the informational conditions it actually has.
The agency gap: what the political money debates are really about
The campaign finance and political money map and the electoral reform map are both, at bottom, disputes about equal political agency — about whether the formal equality of the ballot corresponds to any real equality of influence over outcomes.
Dark money spending in the 2024 U.S. federal elections reached a record $1.9 billion, according to the Brennan Center — roughly double the 2020 total, and nearly four hundred times the roughly $5 million recorded in 2006. The Brennan Center's data sits alongside NewsGuard's 2024 finding that the number of partisan-backed websites designed to look like impartial local news now exceeds the number of actual local daily newspapers in the United States — at least 1,265 such sites, nearly half targeted to swing states. The same dark money that funds political campaigns increasingly funds the infrastructure of synthetic local news.
The free speech position, grounded in Buckley v. Valeo (1976) and the Citizens United majority (2010), holds that limiting campaign spending limits the reach of political speech — that a cap on expenditure is a cap on voice. This is not a cynical position. It reflects a genuine constitutional commitment to political speech as the most protected category of expression, and a real concern that spending limits have historically protected incumbents from challenger campaigns with less established fundraising infrastructure.
But the Gilens data, taken seriously, suggests that the speech framework obscures the agency question. The issue is not whether wealthy donors can speak — they can, and loudly. The issue is what happens to the political system when the people who fund campaigns are not representative of the people who vote. Lawrence Lessig's "dependence corruption" argument is not about explicit bribes; it is about the selection filter that campaign finance creates at the entry point to political careers. People who can raise money from major donors are more likely to receive party infrastructure. People who receive party infrastructure are more likely to win. The result is not a market of bribed officials but a market that produces officials who were pre-screened for palatability to major funders before they ever held office.
The electoral reform debates carry the same agency logic into the voting system itself. Ranked choice voting, proportional representation, and multi-member districts are all, at bottom, proposals for making the translation of citizen preferences into political outcomes more faithful. The plurality voting system produces winners with minority support, incentivizes strategic rather than sincere voting, and systematically squeezes out third parties and independent voices. The defense of plurality voting — that it produces stable majorities and clear mandates — is a defense of a system that concentrates the benefits of political agency in coalition-builders while distributing its costs to everyone outside the two major coalition blocs. These are genuine tradeoffs. But they are tradeoffs in a dispute about the same underlying condition: whether the formal mechanism through which citizens translate preferences into power actually delivers equal agency.
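The mechanical difference at stake can be made concrete. Below is a minimal sketch, with a hypothetical three-candidate electorate invented for illustration (the names, numbers, and functions are not from the essay): under plurality, a candidate whom a majority of voters ranks last can still win on first choices alone, while instant-runoff tallying (one common form of ranked choice voting) transfers the eliminated candidate's ballots and surfaces a majority-backed winner.

```python
# Sketch: plurality vs. instant-runoff on the same ranked ballots.
# Electorate and candidate names are hypothetical, chosen to illustrate
# the "minority winner" failure mode described in the text.
from collections import Counter

def plurality(ballots):
    """Winner by first-choice votes only."""
    return Counter(b[0] for b in ballots).most_common(1)[0][0]

def instant_runoff(ballots):
    """Repeatedly eliminate the candidate with the fewest first-choice
    votes, transferring those ballots to each voter's next surviving
    preference, until someone holds a majority."""
    remaining = {c for b in ballots for c in b}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter(next(c for c in b if c in remaining) for b in ballots)
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > len(ballots):  # strict majority of ballots
            return leader
        remaining.discard(min(tally, key=tally.get))

# 40% rank A first but the other 60% rank A last.
ballots = (
    [["A", "B", "C"]] * 40 +
    [["B", "C", "A"]] * 35 +
    [["C", "B", "A"]] * 25
)
print(plurality(ballots))       # A: wins with 40% despite majority opposition
print(instant_runoff(ballots))  # B: wins once C is eliminated and transfers
```

The sketch also shows why the tradeoff is real rather than rhetorical: the plurality result is stable and legible (count once, declare a winner), while the runoff result depends on elimination order, which is where critics of ranked choice locate its complexity costs.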
The legitimacy gap: what the trust debates are really about
The social trust and institutional legitimacy map and the journalism and media trust map are the closest to naming the substrate directly. They are disputes about the third condition: whether citizens believe the system's rules are fair enough to accept.
The V-Dem Democracy Report 2025 — tracking liberal democracy across 180 countries with the most comprehensive longitudinal data available — found that the global population-weighted average of liberal democracy had returned to 1985 levels. Autocracies, at 91, now outnumber democracies (88) for the first time in more than twenty years. Seventy-two percent of the world's population lives under autocratic rule. The report described the United States as undergoing "the fastest evolving episode of autocratization" it had seen in the country's modern history — language calibrated carefully by researchers who had spent years warning against both complacency and overstatement.
Levitsky and Ziblatt's How Democracies Die (2018) named the mechanism: democratic erosion rarely looks like a coup. It looks like elected officials incrementally weakening the institutions that constrain them — independent courts, professional civil services, free press — while maintaining the procedures that legitimate their power. The procedures continue; the substrate erodes. Guriev and Treisman, in Spin Dictators (2022), documented how modern authoritarians have largely abandoned open violence in favor of information manipulation and manufactured consent. They win not by shooting opponents but by controlling the epistemic environment that makes opposition coherent.
The social trust debates are internally contested about cause and remedy — whether declining trust reflects rational updating to evidence of institutional failure, deliberate erosion by actors who benefit from distrust, or a structural transformation in how people form political identities when the information environment is atomized. These are not the same diagnosis and they point to different remedies. The reform institutionalist says trust is rebuilt by making institutions worthy of it. The social cohesion position says trust is rebuilt by reweaving civic and associational life. The structural critic says neither works without addressing the political economy that corrodes both — the concentration of power that makes institutions serve narrow interests and atomizes civic life in the process.
What the journalism map adds to this is a concrete mechanism: when local news disappears — and 2,200 American newspapers have closed since 2005, leaving nearly 200 counties without any local news outlet — the epistemic commons and institutional legitimacy conditions erode together. Local journalism was, among other things, the mechanism through which local governments were held accountable. When it disappears, corruption becomes less detectable, municipal decisions become less legible to citizens, and the gap between the formal procedures of democracy and the substantive conditions for meaningful democratic participation widens.
Two tensions that run through everything
Across all eight maps, two structural tensions appear that no individual dispute can surface on its own.
The first is the procedure/substrate confusion. Democratic reform debates almost universally focus on procedures: which voting method to use, what spending limits to impose, how to moderate content on platforms, whether to require paper ballots or electronic records. Procedures are the right focus when they are the binding constraint — when the problem is that ballots aren't counted accurately, or that some citizens are denied the right to vote. But when the binding constraint is the substrate — the epistemic commons, the agency distribution, the legitimacy conditions — procedural reforms miss the target. Ranked choice voting produces better preference aggregation and still operates in an information environment where voters are algorithmically sorted into incompatible factual realities. Campaign spending disclosure makes funders more visible and still operates within a structure where the 0.01 percent of Americans who supply 28 percent of federal campaign contributions are not representative of the electorate. The procedures can be improved while the substrate continues to erode, and the improvements don't slow the erosion because they're not touching it.
The second is the speed asymmetry. The forces degrading the democratic substrate move at the speed of capital allocation decisions, software deployment cycles, and viral content propagation. The mechanisms available for democratic response move at the speed of legislative deliberation, constitutional interpretation, and international treaty negotiation. Romania's election was influenced by an algorithmic recommendation system and 25,000 coordinated accounts that were built and deployed in weeks. The Constitutional Court's response — annulling the election — took ten days from the first round. The regulatory framework that might have prevented the manipulation doesn't exist yet in any European jurisdiction with adequate enforcement mechanisms. The speed asymmetry is not incidental; it is structural to the problem. The actors with incentives to degrade the democratic substrate — whether foreign intelligence services, domestic political operatives, or commercial platforms optimizing for engagement — face no regulatory latency. The actors with authority to restore it do.
The 2025 Expert Survey on the Global Information Environment, polling 438 researchers across 76 countries on the reliability of information supply chains, found that 72 percent expected the information environment to worsen — up from 54 percent in 2023. Three-quarters cited absent platform accountability as the gravest global threat. These are researchers who study the problem full-time. The gap between their assessment and the pace of institutional response is a measurement of the speed asymmetry in action.
The deepest finding
The welfare cluster synthesis found that the question "can we afford collective provision?" is almost always wrong because it implies the alternative is no cost, when in reality the alternative is differently-distributed cost. The democracy cluster reveals an analogous displacement: the question "are elections free and fair?" focuses on the procedures while the substrate transfers its failures to categories that don't show up in the procedural audit.
An election can be free and fair in the procedural sense — accurate counts, universal suffrage, peaceful transitions — while simultaneously failing in the substrate sense. Voters can be formally equal at the moment they mark a ballot while one candidate has spent $200 million in dark money to define the informational environment in which that ballot is marked. Citizens can formally have access to the same voting system while the news environment in their county has collapsed to zero local outlets, leaving them without the information infrastructure that makes a considered vote possible. The formal procedure is intact. The substantive substrate is not.
What the cluster reveals is that the genuinely contested question in democratic politics is not "do elections happen?" but what are elections for? The free speech position answers: they are a mechanism for aggregating individual preferences as they exist, and the role of democratic rules is to do that aggregation accurately. The structural position answers: they are a mechanism for collective self-governance, and that requires conditions under which preferences are formed in ways that are not systematically distorted by unequal resources or manipulated information environments. These are not different readings of the same democratic theory. They are different democratic theories, and the disagreement between them is genuinely fundamental.
The Edelman data and the V-Dem data together describe the consequence of that disagreement remaining unresolved: declining trust in institutions, rising grievance, and the slow erosion of citizens' basic sense that the system's rules are fair enough to accept. The 2025 Edelman survey found that among high-grievance respondents (61 percent of the global sample reported moderate or high grievance), four in ten said they would approve of "hostile activism," including spreading disinformation or committing violence, to achieve political ends. Among 18-to-34-year-olds, the figure was 53 percent.
That is what happens when the substrate erodes long enough. Not that elections stop happening. That citizens stop believing the results are worth accepting.
What this means for the debates
None of this resolves the genuine disputes in the democracy cluster. The platform accountability and free speech positions are protecting genuinely different things: one protects the epistemic commons from commercial degradation; the other protects it from institutional monopolization. Both risks are real. The disclosure and public financing advocates are right that dark money changes who politicians are accountable to; the free speech absolutists are right that spending limits have historically protected incumbents and that the press exception creates an arbitrary constitutional distinction. The reform institutionalists who believe trust is rebuilt through institutional performance are right that institutions do need to be worthy of trust; the structural critics who argue that the political economy itself generates distrust are also right about the mechanism.
What the cluster clarifies is the question these positions are all answering differently: what are the conditions under which democratic self-governance is possible, and who is responsible for maintaining them?
Procedures are maintained through law — election commissions, courts, legislatures. Substrate is maintained through a more diffuse and contested set of mechanisms: independent journalism, public education, campaign finance rules, platform regulation, the associational life of civil society, the norms of political competition. The substrate mechanisms are harder to defend because they are harder to name, harder to measure, and — crucially — because the actors with the most to gain from their erosion are also the actors with the most political power to resist their restoration.
Making the substrate visible is not a policy argument. It does not settle whether TikTok should be regulated under the EU's Digital Services Act, whether ranked choice voting should replace plurality elections, or whether the Citizens United framework should be overturned by constitutional amendment or superseded by public financing programs. But it changes the terms of the argument — from "are the procedures intact?" to "are the conditions for meaningful democratic participation being maintained?" — and that reframing is, at minimum, a more honest way to assess what is actually at stake.
The democracy cluster — maps in this series
- Electoral Reform and Ranked Choice Voting — defenders of plurality voting, ranked choice advocates, proportional representation advocates, and electoral reform skeptics; what each position is protecting about how preferences translate into power
- Campaign Finance and Political Money — First Amendment/free speech absolutism, disclosure/transparency advocates, public financing advocates, and the structural/dependence corruption critique; the Gilens data, dark money's record $1.9 billion in 2024, and the pending NRSC v. FEC case
- Social Media and Democracy — platform libertarians, algorithmic accountability advocates, epistemic commons advocates, and structural critics; what each position is protecting about the information environment democracy requires
- AI and Democracy — epistemic defenders, free expression advocates, electoral process defenders, and structural critics; Romania 2024, AI-generated content in 80%+ of 2024 elections, and the debate over whether AI's electoral effects are being over- or understated
- Misinformation and the Epistemic Crisis — platform accountability advocates, free speech and anti-moderation advocates, structural media ecosystem critics, and epistemic security and information warfare advocates; the Stanford Internet Observatory closure, GEC shutdown, and DSA enforcement
- Journalism and Media Trust — the structural argument for public interest journalism, the market/innovation position, the political bias critique, and the local news collapse; what the loss of 2,200 newspapers since 2005 does to democratic accountability
- Social Trust and Institutional Legitimacy — reform institutionalists, social cohesion advocates, structural critics, and democratic resilience researchers; the Edelman Trust Barometer data, Pew's trust-in-government time series, and the V-Dem backsliding findings
- Platform Moderation and Free Expression — the structural architecture of how platforms make editorial choices, and what different accountability frameworks would require of them; the relationship between content moderation and the epistemic commons
References and further reading
- Jürgen Habermas, The Structural Transformation of the Public Sphere (1962, trans. 1989) — the foundational argument that democracy requires a public sphere in which citizens deliberate on shared terms; the diagnosis against which social media's architecture is most clearly legible.
- Cass Sunstein, #Republic: Divided Democracy in the Age of Social Media (2017) — the most influential account of how algorithmically personalized media produces "information cocoons," "cybercascades," and group polarization that undermine the shared informational substrate democracy requires.
- Lisa Herzog, Citizen Knowledge: Markets, Experts, and the Infrastructure of Democracy (2023) — coins "epistemic infrastructure" and argues that market logic has systematically defunded the institutions that produce democratically capable citizens.
- Martin Gilens, Affluence and Influence: Economic Inequality and Political Power in America (2012) — the 1,779-policy-case analysis finding that average citizens have near-zero statistical influence on policy outcomes; the most cited and most contested empirical study in contemporary campaign finance debates.
- Lawrence Lessig, Republic, Lost: The Corruption of Equality and the Steps to End It (revised ed. 2015) — the dependence corruption argument: the problem is not explicit quid pro quo bribery but the selection filter that campaign finance creates at the entry point to political careers.
- Steven Levitsky and Daniel Ziblatt, How Democracies Die (2018) — the argument that democratic erosion now proceeds incrementally through elected insiders who weaken institutions while maintaining procedures; the global comparative evidence for this pattern.
- Sergei Guriev and Daniel Treisman, Spin Dictators: The Changing Face of Tyranny in the 21st Century (2022) — documents how modern authoritarians have replaced open violence with information manipulation and manufactured consent; directly relevant to the speed asymmetry and epistemic substrate arguments.
- V-Dem Institute, Democracy Report 2025: 25 Years of Autocratization – Democracy Trumped? (2025) — the most comprehensive longitudinal tracking of democratic erosion; the source of the 1985 equivalence claim, the autocracy/democracy count reversal, and the characterization of U.S. democratic breakdown risk.
- Edelman, 2025 Edelman Trust Barometer Global Report — the 28-market survey source for the grievance, hostile-activism, and institutional-trust findings that frame the essay's legitimacy-crisis argument.
- Brennan Center for Justice, Dark Money Hit a Record High of $1.9 Billion in 2024 Federal Races (2025) — the definitive quantification of dark money's scale and trajectory; the source for the 2006 baseline and the 2024 record.
- Global Witness, What Happened on TikTok Around the Annulled Romanian Presidential Election? An Investigation and Poll (December 17, 2024) — the investigation documenting skewed TikTok recommendation patterns around Călin Georgescu and the associated coordinated-account concerns; the primary source for the Romania case.
- Centre for Emerging Technology and Security (CETaS), AI-Enabled Influence Operations: Safeguarding Future Elections (November 2024) — a synthesis of the 2024 global election cycle arguing that generative AI intensified discourse manipulation and polarization risks even where direct vote-outcome effects remain hard to prove.
- NewsGuard, “Sad Milestone: Fake Local News Sites Now Outnumber Real Local Newspaper Sites in U.S.” (June 13, 2024) — the finding that partisan-backed "pink slime" outlets now outnumber daily local newspapers, with 1,265 such sites identified nationwide.
- International Panel on the Information Environment, Trends in the Information Environment: 2025 Expert Survey Results (October 2025) — the survey of 438 researchers in 76 countries finding 72% expect the information environment to worsen, up from 54% in 2023.
- Pew Research Center, “Americans' Trust in Federal Government and Attitudes Toward It” (June 24, 2024) — the long-run trust time series showing the collapse from 77% trust in 1964 to 22% in 2024; the methodologically clean longitudinal baseline for institutional legitimacy erosion.