Submitted by Ben Bache

2024: A Campaign of Disinformation

An Historical Sampler of Fake News

Fake news has existed at least since Roman times. A year after Julius Caesar’s assassination in 44 BC, Caesar’s adopted son Octavian, the general Mark Antony, and the religious official (Pontifex Maximus) Marcus Aemilius Lepidus formed the political alliance known as the Second Triumvirate, ostensibly to restore justice and order in Rome. The arrangement granted dictatorial powers to the three men, initially allies. The alliance soon fractured, however, as Mark Antony embarked on the conquest of Parthia (in present-day Iran), leaving Octavian to consolidate power in Rome. In a speech to the Senate noted by Cicero, Octavian appeared to tout his shared family and political lineage with Caesar, while in fact laying the groundwork for his own political ambitions. Although Mark Antony was at least initially the more popular figure, his foreign exploits left Octavian free to build his base of support in Rome. Octavian arranged land grants for army veterans, adding them to that base, while his marriage to Livia Drusilla and his successful effort to add men from wealthy families to the Senate forged an alliance with patrician Rome.

It was on his way to Parthia that Mark Antony met Cleopatra, and he made no attempt to hide the liaison, staying with her in Alexandria, Egypt, and even adopting “eastern” clothing. Politically this damaged his reputation in Rome, both because of the pain and uncertainty it brought his family there and because it gave Octavian opportunities to present himself as the “true” patriotic Roman. The conflict between the two men escalated as Antony relocated to Egypt and then asked that the Senate not reappoint him when the Second Triumvirate expired in 33 BC. Octavian declared Mark Antony an enemy of Rome and warned that Antony intended to establish a Roman monarchy with its capital in Alexandria.

In a speech to his troops ahead of the naval Battle of Actium, Octavian is reported to have denounced Mark Antony, declaring that he “abandoned all his ancestors’ habits of life, has emulated all alien and barbarian customs … and finally has taken for himself the title of Osiris or Dionysus …”. “Let no one count him a Roman,” he continued, “but rather an Egyptian, nor call him Antony, but rather Serapeion”. (Serapeion refers to Serapis, a Greco-Egyptian god with devotees in Alexandria, who combined features of the green-skinned, part-mummy god Osiris and the sacred bull god Apis.)

Mark Antony’s forces were outmaneuvered by Octavian’s more nimble vessels, and despite aid from Cleopatra’s fleet, Octavian was victorious at Actium on September 2, 31 BC; Alexandria fell the following year, on August 1, 30 BC. Antony and Cleopatra committed suicide. Octavian became Caesar Augustus, the first Roman emperor, ruling for life. In the Res Gestae Divi Augusti [Deeds of the Divine Augustus] he had the final word, writing that “the whole of Italy voluntarily took oath of allegiance to me and demanded me as its leader in the war in which I was victorious at Actium.”

From the earliest times technology has played a role in the dissemination of information, both true and false. In what some observers have likened to a precursor of social media, Octavian arranged for slogans to be stamped on coins over the course of his conflict with Mark Antony. In the 1700s, as Britain’s King George II sought to project the image of a strong leader in order to face down the Jacobite rebellion, seditious printers distributed fake news that the king was ill in an attempt to destabilize the situation. The rebellion did not succeed, but the incident was another example of fake news used as a political weapon.

In 1835 the New York Sun published a series of articles intended as satire, which described the well-known astronomer Sir John Herschel as having discovered unicorns and flying bat-men on the moon. As happens in the present day on social media, the intended satire was read and spread as reality, and the Sun’s sales surged.

In September 1913 around 10,000 coal miners associated with the United Mine Workers of America went on strike in Colorado. Over the next several months they were evicted from company towns by John D. Rockefeller’s Colorado Fuel and Iron Company. Miners built tent colonies, the largest being at Ludlow, CO. The colonies were wracked by violence between the strikers and company detectives, and eventually the Colorado National Guard was brought in – nominally to reduce violence, but in fact it supported the operators, protecting strikebreakers and ignoring the detectives’ violent acts. In April 1914 cost concerns led to a reduction in the number of National Guard troops, leading to an increase in violence. On April 20, 1914 the National Guard opened fire on the Ludlow camp with a machine gun. Three strike leaders were killed. That night the National Guard set fire to tents in the camp, killing 25 people including 3 guardsmen. Miners counterattacked, and the final death toll reached 50.

Rockefeller was widely blamed for the massacre, and in response he hired Ivy Lee, who would become known as the father of public relations. Lee persuaded Rockefeller to visit miners’ families and listen to their concerns, documented with photographs for eventual press releases. “What is a fact?” Lee famously asked. “The effort to state an absolute fact is simply an attempt to give you my interpretation of the facts.”

When Allied misfortunes in World War I led President Woodrow Wilson to break his campaign promise to keep the US out of the war, he created the Committee on Public Information (CPI) to develop the propaganda campaign justifying the decision. A key participant in the CPI was Sigmund Freud’s nephew Edward Bernays, who documented the approach in his 1928 book Propaganda.

During World War II, Germany, Britain, and the US all deployed propaganda. German propaganda portrayed Allied forces as cowards or butchers; British radio broadcasts sought to undermine the morale of German forces while delivering subversive messages to German civilians. In the US, President Roosevelt established the Office of War Information (OWI), which documented wartime phenomena such as women in the workforce and addressed potentially fraught issues such as the presence of ethnic Japanese Americans in the military.

Propaganda was a key aspect of the Cold War, with the Soviet Union promoting an image of itself as benign and peaceful in contrast to the capitalist “warmongers,” while much domestic US propaganda seemed aimed at maintaining a level of anxiety among the general public that would justify the actions of political and military leaders.

The George W. Bush administration notoriously capitalized on national anxiety and anti-Muslim sentiment in the aftermath of the 9/11/2001 attack on the World Trade Center, channeling that animosity into the Iraq War despite there being no credible connection between Saddam Hussein’s regime and the attack or al-Qaeda.

Taxonomy of Disinformation

Disinformation researcher and journalist Brooke Binkowski divides disinformation into three main categories:

  • Misinformation. Generally a mistake. Can be used to mislead people, but is not created specifically to mislead.
  • Disinformation. Misleading narratives intentionally created with a specific objective.
  • Propaganda. Frequently jingoistic or nationalistic. Overlaps with advertising and marketing. (See reference to Bernays, above, who initially referred to his work as propaganda, later changing it to “public relations.”)

In general usage, misinformation and disinformation are often used interchangeably, so care must be taken when reading about “misinformation” to determine whether its deployment is focused on achieving a particular objective or outcome (in which case it would more properly be referred to as disinformation).

A common variety of disinformation is “firehosing.” Applying the term to disinformation is generally credited to Christopher Paul and Miriam Matthews of the RAND Corporation, who used it in their 2016 article describing Russian disinformation efforts during the 2014 annexation of the Crimean peninsula. The distinguishing features of firehosing are “high numbers of messages and channels, and a shameless willingness to disseminate partial truths or outright fictions.” Paul and Matthews point out that the success of firehosing as a technique runs counter to conventional wisdom regarding effective communications from government or defense sources, “which traditionally emphasize the importance of truth, credibility, and the avoidance of contradiction.” Key factors in the successful spread of disinformation include:

  • Variety of sources.
  • Number and volume of sources.
  • The views of others, especially of those who are similar to the message recipient.

“False statements are more likely to be accepted when backed by evidence, even when that evidence is false.”

Another disinformation technique is signaling and the related “dogwhistling.” Signaling in this context refers to communicating the signaler’s alliance with a political group, typically via gestures, images (memes), etc. This is sometimes referred to as “covert identity signaling.” A 2022 PNAS report found that “[m]uch of online conversation today consists of signaling one’s political identity.”

Dogwhistling is a related technique in which specific phrases (e.g. “welfare queens,” “urban thugs,” “inner cities”) are code for racial or socioeconomic groups. The phrases sound more-or-less innocuous to the uninitiated, but they telegraph the signaler’s membership to fellow members of the in-group.

A third type of disinformation is “gaslighting,” which gets its name from the 1938 play Gas Light and its film adaptations. Binkowski defines gaslighting as an attempt to detach the target from reality. Political use of the technique has parallels in interpersonal contexts. In gaslighting the target is “deliberately and systematically fed false information that leads them to question what they know to be true….” On the TruthOrFiction website (as of this writing, apparently dormant since 2023), Binkowski cites Hannah Arendt’s essay “Truth and Politics”:

… [T]o the extent to which unwelcome factual truths are tolerated in free countries they are often, consciously or unconsciously, transformed into opinions – as though the fact of Germany’s support of Hitler or of France’s collapse before the German armies in 1940 or of Vatican policies during the Second World War were not a matter of historical record but a matter of opinion. Since such factual truths concern issues of immediate political relevance, there is more at stake here than the perhaps inevitable tension between two ways of life within the framework of a common and commonly recognized reality. What is at stake here is this common and factual reality itself….

The Faith-Based Presidency and the Reality-Based Community

At the recent funeral for 39th US President Jimmy Carter, former president George W. Bush notably declined to shake hands with Donald Trump. Yet arguably the “Dubya” Bush administration laid the groundwork for multiple aspects of the communications framework deployed by the political right wing in subsequent decades. In an October 2004 article in the New York Times Magazine, journalist Ron Suskind reported comments from Bruce Bartlett, who held government positions in the Reagan and “Poppy” Bush administrations. Of Dubya, Bartlett said “... [H]e dispenses with people who confront him with inconvenient facts…. He truly believes he's on a mission from God. Absolute faith like that overwhelms a need for analysis. The whole thing about faith is to believe things for which there is no empirical evidence."

Suskind observed that, matters of personal faith aside, Dubya “demanded unquestioning faith from his followers, his staff, his senior aides and his kindred in the Republican Party.” “Once he makes a decision,” he wrote, “-- often swiftly, based on a creed or moral position -- he expects complete faith in its rightness.” The founders’ insistence on a clear separation of church and state seemed a relic of the past as Dubya created what Suskind christened the “faith-based presidency.”

Suskind also documented an exchange with an unnamed “senior adviser” to Dubya. This person told Suskind that people like him were in “‘what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality’.” He continued:

That's not the way the world really works anymore. We're an empire now, and when we act, we create our own reality. And while you're studying that reality -- judiciously, as you will -- we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do.

Trumpian and Right-Wing Disinformation

In 2005 the character Pepe the Frog appeared with other characters in the comic “Boy’s Club,” leading the life of “bros,” being gross, and so on, without particular political content. By 2008 the image was being modified and posted online, and in September 2016 the Anti-Defamation League (ADL) designated the image a hate symbol. Pepe morphed into a classic example of signaling: although versions with overtly racist modifications appeared (adding a Hitler mustache, replacing his signature “Feels good, man” with “Kills Jews, man”), even the original image became vested with racist connotations. Use of the unmodified image could provide the in-group plausible deniability, especially among the uninitiated, but in-group members would receive the message that the sender was one of them. In 2016 Donald Trump, Jr. posted a photoshopped image of Pepe with Trump Sr. and various associates arrayed like a poster for The Expendables, but with the title “The Deplorables” – a reference to Hillary Clinton’s comment about some of Trump’s supporters.

Another example of signaling is Trump associate Stephen Miller’s “OK” hand gesture, with thumb and forefinger touching while the other fingers are outstretched. This gesture, which in the English-speaking world dates at least to the 17th century, has the general meaning of approval, understanding, and the like. In 2017 members of the 4chan bulletin-board-like website started a hoax claiming that the gesture really signified “WP” for “white power.” The gesture was then picked up by right-wingers, who posted images of themselves making the “OK” sign, at first with the intent of mocking the hoax. Over time the ironic context was lost, and the gesture came to actually symbolize what the hoax originally intended to mock. The ADL cites, for example, Australian white supremacist Brenton Tarrant, who displayed the gesture in court after his arrest for the 2019 mosque shootings in Christchurch, New Zealand, in which 51 people were killed.

In 2020 the New York Times reported that, unlike 2016 when Russian intelligence agents created disinformation-laden pro-Trump political ads, this time “Trump was feeding many of the disinformation campaigns” himself. In 2016 three Russian agents traveled US MAGA country in search of divisive issues; in 2020 they were able to stay in Russia and simply grab “screenshots of … Trump’s Twitter posts, or quot[e] his misleading statements and then [amplify] those messages.” Trump’s false messages in 2020 included assertions that China was a “FAR greater threat than Russia, Russia, Russia,” and that voting by mail was vulnerable to counterfeit ballots. The FBI and the Department of Homeland Security (DHS) warned that foreign agents were expected to propagate “disinformation that includes reports of voter suppression, cyberattacks targeting election infrastructure, voter or ballot fraud and other problems intended to convince the public of the elections’ illegitimacy” – subjects most of which Trump had himself raised without factual basis.

A 2020 whistleblower complaint alleged that DHS leadership in the first Trump administration “blocked the release of a threat assessment that contained warnings of Russian interference because of how it ‘would reflect upon President Trump.’” Instead, analysts were ordered to emphasize threats from China and Iran, to be consistent with Trump’s pronouncements.

Fast forward to 2024, and Trump was described as “not operating a political campaign as much as mounting a disinformation campaign.” The most egregious example, which morphed into textbook firehosing, was the false narrative that the US was being overrun with criminal immigrants, a key component being the claim that 20,000 Haitian immigrants had invaded the town of Springfield, OH (population 58,000), “destroying their entire way of life,” and eating their pets.

Mother Jones’ David Corn summarized Trump’s bogus narrative in the 2024 election as

… [T]rying to convince tens of millions of a reality that does not exist: They’re living in a dangerous hellhole in which they’re imperiled by barbarians, who happen to be people of color…. The overarching goal of Trump’s disinformation efforts is to persuade voters that they should live in fear—and that only he can save them.

Corn enumerates Trumpian falsehoods: Kamala Harris has a phone app that she uses to instruct cartel heads where to drop off immigrants. Immigrants will “walk into your kitchen and cut your throat.” Armed with AK-47s, immigrants will take over Aurora and the rest of Colorado, “Unless I become president.” Migrants were “attacking villages and cities throughout the Midwest.” Despite crime rates actually being down nationwide, including for murder, Trump proclaimed “You can’t walk across the street to get a loaf of bread. You get shot, you get mugged, you get raped, you get whatever it may be and you’ve seen it and I’ve seen it.”

Other examples of disinformation in the 2024 presidential campaign include the fake news item that disaster relief was going to undocumented immigrants, a photo of “Kamala Harris in a swimsuit hugging convicted sex offender Jeffrey Epstein,” and a deepfake video report that vice presidential candidate Tim Walz had abused a young man in 2004. The Walz video and other disinformation was believed to be the work of the “Russian-aligned” propaganda network Storm-1516.

Writing for Salon.com, journalist Paul Rosenberg identifies six “big lies” that he suggests “won Trump the election.”

  • Presenting the climate crisis as a “hoax” effectively kept it largely off the “political stage,” in Rosenberg’s view. He notes its connection to the immigration crisis (one of Binkowski’s persistent themes): climate change has been a key factor in Central Americans replacing Mexicans as the largest demographic group seeking entry to the US since the first Trump administration. Hence denial of climate change is critical to the “immigrant crime” narrative.
  • The “great replacement” which Rosenberg suggests “equates immigration with genocide.”  In this conspiracy theory, immigration and declining white birthrates bring together racism, misogyny, and anti-semitism as urban elites (i.e. Jews) are deemed responsible for the phenomenon.
  • Voter fraud, which is “virtually nonexistent,” is nonetheless promoted endlessly. Democrats are presented as encouraging immigrants to vote illegally, justifying voter-suppression initiatives and intimidation. Spreading this falsehood helps Trump burnish his image as “defender of democracy.”
  • Roe v. Wade and
  • Abortion. Trump asserts falsely that “everyone wanted” an end to Roe v. Wade, with the abortion issue “back in the states where it belongs.” Then he creepily adds “I will be your protector.”
  • Economy. As Rosenberg notes, generations of evidence demonstrate that the US economy fares better under Democrats than Republicans. Despite economic indicators showing healthy growth, Trump declared “Our country is a failing nation. This is a failing nation… We’re failing at everything we’re doing.”

In his conclusion Rosenberg acknowledges what he calls the “global trend of incumbent losses,” which we covered in WriteToLeft’s "Election 2024: What and Why." “But gaslighting is a central factor in the operation of fascism,” he writes, “and the failure of media in liberal democracies even to recognize its existence, much less to fight it, puts the very survival of liberal democracy at risk.”

David Corn notes that a key aspect of Trump’s disinformation campaign is to present Democrats as “perverse extremists … and baby killers,” claiming at rallies that in Democratic states infanticide is legal (not true), and that Vice President Harris would legalize fentanyl as president (also not true). Trump also invoked a litany of epithets denouncing Harris: “mentally disabled,” “communist,” “fascist” (make up your mind), “radical,” the person who destroyed San Francisco and destroyed California. (All false, obviously.) “[T]ruth is always under assault,” Stanford’s Larry Diamond told Corn. “But to have a presidential campaign doing it on this scale—we’ve never seen anything like it. But this is not new for Trump. It’s his persona and mode of operation. In this campaign, it’s … more chronic and extreme.” New York University’s Ruth Ben-Ghiat, who specializes in authoritarianism, agreed with Corn: “Trump is running a disinformation campaign. … It’s unprecedented, even among most autocrats on the rise. People like Rodrigo Duterte, the former president of the Philippines, would tell lies about some things or target some subjects, but Trump lies about everything, on the model of the Kremlin (big surprise).”

Research

Perhaps not surprisingly, as disinformation has increased, so has its study by social and political scientists.

Prevalence and Durability of Misperceptions
Dartmouth’s Brendan Nyhan authored a study published by PNAS in 2021 that asked “Why are misperceptions about contentious issues in politics and science seemingly so persistent and difficult to correct?” The study noted earlier findings that efforts to “set the record straight” in the face of disinformation often fail if the accurate information “contradicts [the target’s] predispositions or existing beliefs,” leading targets to “reason toward a preferred conclusion.” This is especially true if the issue is related to the target’s individual or group identity. These earlier findings showed that in some cases efforts to correct disinformation “actually increased belief in the targeted misperception among groups that were predisposed to believe the claim,” which is known as the “backfire effect.”

Subsequent research showed that efforts to correct false information did generate “modest but significant improvements in belief accuracy,” however. In the 2021 study Nyhan asserts that the durability of misperceptions is instead likely related to “failure to reach people with corrective information that durably changes their mind.” “[R]apid decay effects after exposure” to the disinformation and problems identifying the targets of “corrective information” were key factors.

Nyhan found that when people holding objectively false opinions were confronted with corrective information they “often update their beliefs but interpret the information that they receive in an attitude-consistent manner—for instance, by assigning blame or responsibility for the facts in question in a manner that is consistent with their political views or by expressing distrust in the credibility of the information that they have learned.” The ultimate solution, Nyhan suggests, may be “discouraging elites from promoting false claims or linking them to salient political and group identities,” which does not seem particularly likely at this juncture.

(See also “‘Belonging is Stronger than Facts’: The Age of Misinformation,” New York Times, May 7, 2021).

Gaslighting
A 2022 report by researchers in the UK, Belgium and the Netherlands found that “exposing and contesting gaslighting is a tool of resilience and resistance….” Citing work from 2014, the authors note that the goal of gaslighting is “to destroy even the possibility of disagreement,” “to put out of circulation a particular way of understanding the world.” Research in 2020 extended the definition of gaslighting from “interpersonal and political” to include entire communities. In this manifestation the gaslighting targets a group, trying to engender collective doubt in the community’s “sense and beliefs.” The researchers note two different uses of the term gaslighting: first as a way to “disempower already marginalized standpoints with loss of epistemic trust,” but also as a “contemporary metaphor for post-truth politics.” They cite the 2016 article in Teen Vogue titled “Donald Trump is Gaslighting America,” which focuses not on Trump gaslighting an individual or “trying to deceive the American public into believing a specific false narrative,” but rather “manipulating the American public into a confused, self-doubting state that cannot confidently distinguish truthhoods from falsehoods.” This collective disorientation has been referred to as the “post-truth era,” which has clear resonance with the “reality-based community” metaphor from the Dubya era.

Viewing gaslighting in this way, the authors found two patterns of response. One was for individuals to opt out of “certain sources of information,” e.g. “choosing to limit your news consumption.” Another response was seeking out individuals who supported and amplified the “voices of the gaslit.” The latter has a risk of isolating “... social media users in hermetically sealed chambers where they are not confronted with contradictory views, leading to polarization and even radicalization.” However, the authors refer to other research that found the negative effects of such “echo chambers” to be overstated, and in fact identify a phenomenon labeled “survival echoing” that can help individuals maintain their “warranted self-trust and stability of beliefs … in the face of gaslighting.”

In their findings the authors note that one solution to gaslighting is “simply to ignore it.” A complexity in the constellation of gaslighting, however, is that it can be “self-sealing” and “self-fulfilling.”

The gaslight metaphor is self-sealing because it allows accusers to insulate themselves from correction and dispute. In addition, the gaslight metaphor is also self-fulfilling because it can sow the very doubt in one’s perceptions of reality that it is intended to expose. Any claim of gaslighting is vulnerable to this risk, as it calls for a radical doubt about not only what is true, but how one is forming their judgments about what is true. These qualities help to stimulate the suspicion and isolation that is critical for conspiracy theories, suggesting that echoing may be an important part of how conspiracy theories emerge and spread.

Social Media
An article submitted to the journal Nature in 2021, but apparently not published until June 2024, systematically examined the propagation of false information via social media. (Note: the article uses the term “misinformation,” but it’s clear from context that the authors are often referring to what we’ve labeled “disinformation” here.)

The authors take issue with frequently heard claims from public figures:

  • Average exposure to problematic content is high.
  • Algorithms are largely responsible for this exposure.
  • Social media is a primary cause of broader social problems such as polarization.

In fact, the researchers found, exposure to “false and inflammatory content” via social media is “concentrated among a narrow fringe with strong motivations to seek out such information.” Rather than vaguely defined “algorithms” being responsible for the distribution of the content in question the researchers found that it was largely spread by “individual users deliberately seeking such content.”

Academic research after the 2016 election found that Russian efforts to use social media platforms to affect voter choice were in fact “a tiny part of people’s information diets and … not associated with changes in voter attitudes or behavior.” Exposure to posts from Russian social media accounts was significantly higher among Republicans, and concentrated in a small set of users: “[J]ust 1% were responsible for 70% of exposures, and 10% were responsible for 98% of exposures.” A similar pattern was observed regarding assertions about social media amplifying disinformation. Research has pointed instead to “consumer demand for false and extremist content, the role of media and political elites” in promoting falsehoods, and “how platform affordances enable the distribution of such content to subscribers and followers.” The study also found that attributions of power to social media in spreading disinformation, along with “political incivility and even political violence,” were a case of the classic logical fallacy that confuses correlation with causation. In other words, when people see or experience behavior in the political realm at approximately the same time as it is described on social media, they assume that the media caused the behavior. The authors note that “much of the research explicitly designed to identify causal effects" does not find them.
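To make the concentration figures above concrete, here is a minimal sketch of how a statistic like “the top 1% of users account for 70% of exposures” can be computed from per-user exposure counts. The data here is synthetic (a heavy-tailed random distribution, generated purely for illustration); the study itself worked from observed exposure data, not this code.

```python
# Minimal sketch: computing exposure concentration from per-user counts.
# The exposure counts below are synthetic, NOT the study's data.
import numpy as np

rng = np.random.default_rng(0)
# Simulate a heavy-tailed distribution of exposures across 100,000 users
exposures = rng.pareto(a=1.1, size=100_000) + 1

def share_of_top(counts, top_frac):
    """Fraction of all exposures attributable to the top `top_frac` of users."""
    ordered = np.sort(counts)[::-1]           # heaviest consumers first
    k = max(1, int(len(ordered) * top_frac))  # size of the top slice
    return ordered[:k].sum() / counts.sum()

print(f"top  1% of users account for {share_of_top(exposures, 0.01):.0%} of exposures")
print(f"top 10% of users account for {share_of_top(exposures, 0.10):.0%} of exposures")
```

With a sufficiently heavy-tailed distribution, figures of the “1% → 70%” magnitude fall out naturally, which is the shape of consumption the researchers describe.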

In a related study published in December 2024, specifically of the diffusion of information on Facebook, researchers found that the spread of disinformation relies primarily on what the authors termed “viral spread, powered by a tiny minority of users who tend to be older and more conservative.” The diffusion pattern was also deeper than it was broad – in other words, rather than a large number of “friends” of an initial poster re-sharing the misinformation post, a small number of users initially spread the information to a small group of “friends,” who did the same, and so on “down” the diffusion tree; a minimal sketch of this depth-vs.-breadth distinction follows.
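The sketch below illustrates the distinction with two toy re-share trees. The tree shapes and user names are invented for illustration; the study measured real Facebook re-share cascades, not structures like these.

```python
# Hypothetical sketch of "deep vs. broad" diffusion trees.
# Each key maps a user to the friends who re-shared the post from them.
from collections import deque

broad_tree = {"poster": ["f1", "f2", "f3", "f4", "f5", "f6"]}  # one hop, many friends

deep_tree = {  # few re-shares per user, but a long chain of hops
    "poster": ["a"],
    "a": ["b", "c"],
    "b": ["d"],
    "d": ["e"],
}

def depth_and_max_breadth(tree, root="poster"):
    """Return (hops on the longest re-share path, widest single level)."""
    depth, max_breadth = 0, 1
    level = deque([root])
    while level:                      # breadth-first walk, level by level
        nxt = deque()
        for node in level:
            nxt.extend(tree.get(node, []))
        if nxt:
            depth += 1
            max_breadth = max(max_breadth, len(nxt))
        level = nxt
    return depth, max_breadth

print(depth_and_max_breadth(broad_tree))  # (1, 6): shallow but broad
print(depth_and_max_breadth(deep_tree))   # (4, 2): deep but narrow
```

The December 2024 finding is that misinformation cascades looked more like `deep_tree` than `broad_tree`: long, narrow chains of re-shares rather than a single wide fan-out.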

The Nature article concludes with a set of recommendations “for improving public discourse about social media through evidence-based research.”

  • Measure exposure and mobilization among extremist fringes. Data gathering so far has apparently focused on what one suspects is the relatively easier-to-measure population of “typical” news consumers or social media users.
  • Reduce demand for false and extremist content and amplification of it by media and political elites. Needless to say, this seems to be a core aspect, and one that has no clear remedy in the current political and social context. Witness, e.g., Facebook’s recent announcement that it was ending fact-checking.
  • Increase transparency and conduct experiments to identify causal relationships and mitigate harms. The researchers propose academic-industry collaboration on field experiments to observe the effects of social media. This also seems unlikely in the current context (see previous item).
  • Fund and encourage research around the world. The Nature study authors note that there is little social media data from areas of the world outside the West for researchers to use. In such regions, they suggest, “[c]ontent moderation may be more limited and extremist content on social media correspondingly more frequent.”

Now What?

Johns Hopkins political scientist Henry Farrell, author of the Programmable Mutter blog, argues that “We’re getting the social media crisis wrong.” Farrell’s thesis is that focusing on disinformation is addressing the wrong question. The real problem, Farrell suggests, is not that individuals are being convinced that false things are true, but that what he calls “publics with malformed collective understandings” are being created. Farrell asserts that “people can actually think much better collectively than individually.”

[M]y pig-headed advocacy for my particular flawed perspective allows me to see the flaws in your pig-headed arguments and point them out with gusto, and vice versa, for the general improvement of our thought.

In a democracy, Farrell writes, political decisions are not supposed to be made by “kings or dictators,” but by the public, or its representatives. But we have no way to measure directly what the public as a whole wants. Voting in a two-party system is an approximation, as are opinion polls. And, as Farrell notes, “these systems are not just passive measures of public opinion but active forces that rework it…”

Farrell concludes:

Can democracy work, if a couple of highly atypical men exercise effective control over large swathes of the public space? How can that control be limited or counteracted, even in principle? What practical steps for reform are available in a democracy shaped by the people who you want to reform out of power?

… If you want to work towards a better system of democracy, which is both more stable and more actually responsive to what people want and need, how do you do this? It is easy (I think personally, but I am biased too) to see what is wrong with the public at X/Twitter. It is harder to think clearly about what a healthy public would look like, let alone how to build one.

I don’t have good answers to these questions; just questions. Still, I think they are the questions we need to ask to better understand the situation that is developing around us right now.

🙏
