Brandolini’s Law
Brandolini’s Law (also known as the Bullshit Asymmetry Principle) is an internet aphorism coined by Italian programmer Alberto Brandolini in 2013. Its essence is that the amount of effort required to refute disinformation is an order of magnitude greater than the amount of effort required to create it.
The original formulation is: “The amount of energy needed to refute bullshit is an order of magnitude greater than to produce it.”
Origin and author
Alberto Brandolini is an Italian programmer and consultant from Faenza (Emilia-Romagna), founder of Avanscoperta, a company specializing in training software developers. He is widely known in the IT community as the creator of the EventStorming methodology, a collaborative business process modeling technique.
The aphorism was first publicly formulated in January 2013 in a social media post. According to Brandolini himself, it was prompted by a confluence of two events: shortly before, he had read Daniel Kahneman’s book Thinking, Fast and Slow , and then watched an Italian political talk show in which journalist Marco Travaglio and former Prime Minister Silvio Berlusconi traded accusations. Observing how each new assertion required a lengthy and documented rebuttal, Brandolini formulated a principle that had long existed in practice but lacked a clear name.
The law quickly gained currency in professional and academic communities. It began to be cited by researchers in the fields of media literacy, cognitive psychology, political communications, and education.
Formal structure of the principle
What is an "order of magnitude"?
The expression "an order of magnitude greater" in the original formulation denotes a difference of approximately tenfold. In this context, it is not a strict mathematical statement, but a qualitative characteristic: refuting one statement requires incomparably more time, attention, and resources than creating it.
A false claim can be created in a matter of seconds — it’s enough to write a short phrase without any evidence. A refutation, however, requires searching for sources, analyzing the context, understanding the audience, and a clear explanation of why the original claim is false. Thus, asymmetry arises not from malicious intent, but from the very nature of the cognitive process.
Distinguishing between disinformation, misinformation, and "nonsense"
In the context of Brandolini’s Law, the English word “bullshit” is used, which philosopher Harry Frankfurt distinguished from lying back in 1986 (in his essay “On Bullshit,” republished as a book in 2005). A liar knows the truth and deliberately conceals it. Someone who spreads bullshit is indifferent to the truth of their words: their goal is not to deceive, but to impress or achieve a desired effect.
This distinction is crucial: a lie can be refuted by presenting contrary evidence. Refuting a truth-neutral assertion is more difficult, because the author of such an assertion is not bound by any factual obligations in advance and can freely shift positions with each new objection.
A lie is the opposite of the truth; bullshit is indifferent to it.
The distinction between misinformation (false information spread without intent to deceive) and disinformation (false information spread deliberately) is also important for understanding the law. Brandolini’s principle applies to both categories, but when knowingly false information is spread deliberately, the asymmetry of effort becomes instrumental: it is exploited as a strategy.
Cognitive foundations
System 1 and System 2 according to Kahneman
Brandolini drew inspiration from Kahneman’s book, and for good reason. The concept of two thinking systems — fast, intuitive (System 1) and slow, analytical (System 2) — explains why creating and consuming simplistic statements is cheap, while verifying them is expensive.
System 1 operates automatically: it picks up familiar patterns and instantly forms a judgment. A false statement, framed in a simple and emotionally charged phrase, passes through System 1 with virtually no resistance. Refutation, however, requires the engagement of System 2 — a slow, energy-consuming, and conscious analysis.
The familiarity effect and the illusion of truth
Repeating a false statement makes it subjectively more credible — this phenomenon is called the illusory truth effect. Experiments show that even if a person knows a statement is false, when it’s repeated many times, their brain begins to perceive it as familiar, and familiarity becomes associated with truth.
This means that a simple refutation is often insufficient. By repeating the original assertion, even for the purpose of criticizing it, the refuter inadvertently increases its "weight" in the audience’s mind.
Confirmation bias and motivated reasoning
When a false statement aligns with a person’s preexisting beliefs, resistance to refutation increases dramatically. Motivated reasoning leads people to critically evaluate the refutation rather than the original statement: they look for flaws in the opponent’s arguments, not in their own position.
Research published in the journal Frontiers in Psychology in 2024 showed that when a belief is closely tied to a person’s personal identity or social class, receiving corrective information often triggers defense mechanisms — instead of revising their views, they reinforce their original ones.
Related concepts
The Gish Gallop
The tactic known as the "Gish Gallop" is a direct practical application of Brandolini’s Law in debate. A debater using this technique unleashes a stream of assertions — half-truths, distorted facts, and outright fabrications — on their opponent at such a speed that the opponent physically lacks time to refute each one.
The name took hold in 1994, when Eugenie Scott, executive director of the US National Center for Science Education, used it to describe debates between creationist biochemist Duane Gish and evolutionary biologists. Each of Gish’s claims took seconds to make; rebuttals took minutes. Given the time-limited debate format, this gave him a clear advantage.
The mechanism of the Gish gallop relies entirely on the asymmetry described by Brandolini: if the opponent does not refute all one hundred assertions, the galloper declares victory on the remaining ones.
The Backfire Effect and Its Reassessment
In 2010, political scientists Brendan Nyhan and Jason Reifler published a study documenting the so-called “backfire effect”: refuting false beliefs sometimes led people to entrench them further. These findings were widely cited and gave rise to the idea that correcting misinformation was utterly pointless.
However, subsequent research has cast doubt on this conclusion. A meta-analysis of 52 political belief modification experiments by Porter and Wood found no evidence for a systematic backfire effect. A 2022 study published in a peer-reviewed journal found that where the effect was observed, it was largely attributable to measurement error rather than a robust psychological mechanism.
These findings do not, however, refute the difficulty of refutation, the central idea of Brandolini’s law. Fact-checking does reduce belief in false claims, but the process remains significantly more labor-intensive than their creation.
The problem of burden of proof
Brandolini’s principle is closely related to the uneven distribution of the burden of proof. In public discourse, the creator of a claim often risks nothing: they can speak without supporting their words with sources. However, their opponent is obligated to construct a chain of evidence, verify sources, consider counterarguments, and do all of this convincingly to the audience.
Carl Sagan’s aphorism, “Extraordinary claims require extraordinary evidence,” and Hitchens’s razor, “What is asserted without evidence can be dismissed without evidence,” both place the burden of proof on the claimant, in direct opposition to the implicit norm that actually governs many public exchanges, where the burden falls on the refuter.
Empirical evidence on asymmetry
The speed of dissemination of false information
A 2018 study by Vosoughi, Roy, and Aral, published in Science, analyzed approximately 126,000 news cascades on Twitter from 2006 to 2017. False news spread faster, farther, and more broadly than true news, and this advantage held across all topic categories. False political news spread particularly rapidly.
The study’s authors linked this effect to novelty: fake news often contains unexpected or emotionally charged information, which evokes surprise and encourages sharing. True news is, on average, "boring" — it fits into an already familiar worldview.
The effectiveness of fact-checking
A large-scale study conducted simultaneously in four countries — Argentina, Nigeria, South Africa, and the United Kingdom — and published in PNAS in 2021 found that fact-checking reduced belief in falsehoods by an average of 0.59 points on a five-point scale. Meanwhile, exposure to disinformation alone, without any correction, increased belief in it by only 0.07 points.
In other words, a single dose of disinformation has little effect on beliefs, but the cumulative effect of repeated exposure, in the absence of systematic correction, is significant. This is where the real asymmetry manifests itself: false claims can be produced continuously and en masse, while their verification remains a rare and expensive process.
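The arithmetic above can be sketched as a toy model. The following minimal Python illustration uses the per-exposure and per-correction effect sizes from the PNAS figures cited; treating the effects as additive is my own simplifying assumption, not a claim made by the study.

```python
# Toy additive model of the asymmetry described above. Effect sizes come
# from the PNAS figures cited in the text; additivity is an illustrative
# assumption, not part of the study.

EXPOSURE_EFFECT = 0.07     # belief shift per uncorrected exposure (5-point scale)
CORRECTION_EFFECT = -0.59  # belief shift from a single fact-check

def net_belief_shift(exposures: int, corrections: int) -> float:
    """Net belief shift on the five-point scale under the naive additive model."""
    return round(exposures * EXPOSURE_EFFECT + corrections * CORRECTION_EFFECT, 2)

# Ten cheap repetitions raise belief by 0.7 points; one costly fact-check
# claws most of that back, but not all of it.
print(net_belief_shift(exposures=10, corrections=0))  # 0.7
print(net_belief_shift(exposures=10, corrections=1))  # 0.11
```

Under these numbers, one fact-check offsets roughly eight repetitions (0.59 / 0.07 ≈ 8.4), which is why mass-produced falsehoods outpace verification even when each individual correction is effective.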
The price of professional fact-checking
A study published in the Harvard Kennedy School Misinformation Review found that professional fact-checkers (including Snopes, PolitiFact, and other reputable organizations) spend significantly more time and resources verifying a single claim than it takes to create it. Automated fact-checking systems reduce this disparity but do not eliminate it: for complex, contextualized claims, machine analysis still lags behind human review.
Scope of application
Political communications
In politics, Brandolini’s law is particularly clear. A political assertion — an accusation, a suspicion, a simplified interpretation of a complex issue — can be made in seconds and immediately spread across the media. Its verification, even if it results in a refutation, takes days or weeks and requires consulting experts, documents, and statistics.
Election debates are a prime example. In a debate format with strict time limits, every false claim made by a candidate requires a concise but convincing response from the opponent — and this is physically impossible given the high frequency of such claims.
Scientific discourse and pseudoscience
In science, Brandolini’s principle explains the persistence of pseudoscientific concepts. Anti-vaccination narratives, flat-Earth theories, climate change denial — each of these positions relies on a set of simple premises whose refutation requires detailed explanations with references to peer-reviewed research.
The "publish or perish" pressure under which the academic community operates plays a special role here: researchers are often interested in striking results rather than the monotonous work of refuting others’ errors. The replication crisis in psychology and the social sciences is partly due to the fact that sensational results are published more quickly and readily than their careful corrections.
Medical misinformation
The medical field is particularly vulnerable to the asymmetry described by Brandolini. The claim that "vaccines cause autism" was made in a single publication in The Lancet in 1998 — a publication subsequently retracted and deemed fraudulent. Disproving this claim required dozens of large epidemiological studies involving millions of children and took over two decades.
The cost of this asymmetry is declining vaccination coverage and measles outbreaks in regions where anti-vaccine narratives are particularly entrenched.
Corporate environment and management
In an organizational context, Brandolini’s principle describes a phenomenon faced by many managers and specialists: a rumor, a misconception, or an inaccurate interpretation of statistics can paralyze a department’s work or compromise an entire project. Rebuttal requires time, formal explanations, and often the convening of special meetings.
Teams working under high information load are particularly prone to accepting the "first to arrive" statement as a working hypothesis: there are simply no resources to systematically test each one.
Countermeasure strategies
Prebunking and Inoculation Theory
University of Cambridge researcher Sander van der Linden and his colleagues developed the concept of “prebunking”: preemptive inoculation against disinformation, analogous to medical vaccination. The idea is that exposing people to weakened forms of the manipulative techniques used in disinformation builds their cognitive resistance to those techniques.
A study of the Bad News game, conducted in four countries (Sweden, Germany, Poland, and Greece), confirmed that participants who played a game scenario in which they themselves “produced” fakes were subsequently significantly better at recognizing manipulative techniques and were less likely to trust disinformation.
Prebunking, unlike traditional fact-checking, works before a false claim has taken hold, and therefore requires less effort per person protected.
Shifting the burden of proof
One way to neutralize this asymmetry is to refuse to shoulder the burden of refutation where it shouldn’t be. If a claim isn’t supported by evidence, it formally doesn’t require refutation: according to Hitchens’ razor, it can be rejected with the same justification it was put forward.
In practice, this means that instead of analyzing each specific disinformation thesis, it is more effective to ask the question: “What is this assertion based on?” and shift the burden of proof to the party that made it.
Media literacy as a systemic response
The "Calling Bullshit" course, developed by professors Carl Bergstrom and Jevin West at the University of Washington, draws directly on Brandolini’s principle. The course teaches not so much fact-checking as recognizing the structural signs of manipulation — in statistics, data visualization, rhetorical techniques, and source selection.
The goal is not to combat disinformation in each individual case, but to reduce overall susceptibility to it by reducing the demand for unsubstantiated claims.
Automated fact-checking
Because manual fact-checking fundamentally fails to cope with the volume of disinformation produced, the research community and tech companies are investing in automated verification systems. These systems are capable of comparing claims against databases of verified facts in real time, identifying previously debunked narratives, and flagging materials with a high probability of being false.
The limitation of automation is that it works well with factually determined statements and poorly with contextual manipulations, half-truths, and rhetorical distortions that do not formally contain false facts.
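The claim-matching step described above can be sketched in a few lines. Production systems rely on semantic similarity models; here, plain string similarity from the standard library stands in for that, and both the sample database and the threshold are illustrative assumptions.

```python
# Minimal sketch of one automated-fact-checking component: matching an
# incoming claim against a database of previously debunked narratives.
# difflib's string similarity stands in for the semantic models real
# systems use; the database and threshold below are invented examples.
from difflib import SequenceMatcher

DEBUNKED_NARRATIVES = [
    "vaccines cause autism",
    "the earth is flat",
]

def flag_claim(claim: str, threshold: float = 0.6) -> bool:
    """Return True if the claim closely matches a known debunked narrative."""
    normalized = claim.lower()
    return any(
        SequenceMatcher(None, normalized, known).ratio() >= threshold
        for known in DEBUNKED_NARRATIVES
    )

print(flag_claim("Vaccines cause autism in children"))  # True
print(flag_claim("Rain is expected tomorrow"))          # False
```

This also exposes the limitation noted above: surface matching catches repeats of known falsehoods but says nothing about half-truths or contextual manipulations that share no wording with previously debunked claims.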
Criticisms and limitations
Lack of strict verification
Brandolini’s law is an aphorism, not a formal scientific hypothesis. It contains no operational definitions of "energy" or "effort" and does not imply a specific numerical proportion. The phrase "an order of magnitude greater" is a qualitative assessment, not an empirically measured quantity.
A number of researchers point out that the specific asymmetry varies greatly depending on the nature of the claim: refuting a mathematical error can be as quick as it was committed, whereas refuting a historical falsification will require extensive archival research.
The risk of excessive skepticism
A study conducted in three countries (the United States, Poland, and Hong Kong) and published in the Harvard Kennedy School Misinformation Review in 2024 found an unexpected side effect of active fact-checking: people exposed to intensive corrective interventions began to treat not only false information but also reliable information with increased skepticism.
This means that aggressively combating disinformation can indirectly undermine trust in verified knowledge — an effect that in itself becomes a vector for the further spread of doubt.
Contextual and cultural differences
The degree of asymmetry described by the law varies depending on the media environment, the level of public trust in institutions, and the cultural norms of public debate. In societies with high levels of media literacy and strong fact-checking institutions, the cost of refutation is lower — not because it’s easier, but because the audience is better prepared to accept it.
On the contrary, in conditions of information fragmentation, when different groups of the population consume fundamentally different media streams, the cost of refutation increases: reaching an audience that already believes a false statement is significantly more difficult than reaching a neutral one.
Relationship with other concepts
Brandolini’s principle fits organically into a number of related concepts studying information asymmetry and cognitive biases.
Potter’s principle — the informal statement that "the root of a lie grows faster than its refutation" — was formulated independently and describes the same mechanism when applied to bureaucratic and administrative contexts.
The laws of rumor spreading, studied by social psychologists as early as the mid-20th century (Allport and Postman, 1947), document the tendency of rumors to become simplified and sharpened during transmission: each retelling intensifies vivid details and discards nuances, making the rumor less and less verifiable.
Agnotology, a discipline that studies the production of ignorance, examines the systemic mechanisms by which economically or politically motivated actors deliberately create informational uncertainty. The historian of science Robert Proctor coined the term agnotology for this phenomenon; a classic example is tobacco companies’ campaigns to sow doubt about the link between smoking and lung cancer.
Place in academic discourse
Brandolini’s law has become a widely cited informal concept in academic texts. It is mentioned in studies on cognitive psychology, media literacy, political communication, and the ethics of science.
In their book "Calling Bullshit: The Art of Skepticism in a Data-Driven World" (2020), biologists Carl Bergstrom and Jevin West systematized various forms of information manipulation, drawing in part on the principle of asymmetry. This work transformed an informal aphorism into an operational tool for critical thinking pedagogy.
The British Medical Journal (BMJ) blog published an essay, “The Unbearable Asymmetry of Bullshit,” applying the principle to medical science: the author showed that dishonest or simply careless studies fill the media space, while their methodical refutation remains the thankless work of a handful of reviewers.
The Limits of the "Order of Magnitude" Metaphor
Brandolini’s principle is often cited in a weakened form: “much harder” rather than “an order of magnitude harder.” This softening reflects doubts among some academics that the asymmetry truly reaches a tenfold scale in every specific case. Some researchers prefer to speak of a substantial, but not necessarily tenfold, asymmetry.
At the same time, the general idea — that producing unsubstantiated claims is structurally cheaper than verifying them — is supported by a fairly wide range of data from various fields, from measuring the speed of news dissemination to the economics of fact-checking.