Cultural parasitism
The ideas and beliefs that are most widely held in society are those that are most likely to be passed on to others, not those that are most likely to be true. Fake news is an example of an idea that spreads very quickly despite being false.
In 1976, the British biologist Richard Dawkins proposed in his book "The Selfish Gene" that cultural evolution is driven by special units of information: memes. Like genes, memes are copied, passed from person to person, and subject to selection. However, unlike biological selection, which favors genes beneficial to the organism, cultural selection does not favor ideas that are true or beneficial to society; it favors those that spread easily.
It is precisely this gap between "spreadability" and "truth" that forms the essence of the phenomenon that can be described as cultural parasitism. A parasitic idea does not necessarily intentionally cause harm. It simply occupies cognitive space, displaces competing concepts, and reproduces itself through the host’s psychological mechanisms — just as a biological parasite exploits its host’s resources for its own "interests."
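This selection logic can be made concrete with a toy simulation. The sketch below is purely illustrative (it comes from no cited study, and every parameter is an invented assumption): two ideas compete for hosts, and the only property selection "sees" is how easily each idea passes from person to person.

```python
# Toy replicator model: prevalence tracks transmissibility, not truth.
# All parameters below are illustrative assumptions.

IDEAS = {
    # name: (is_true, per-contact transmission probability)
    "nuanced_truth": (True, 0.05),     # accurate but hard to pass on
    "catchy_falsehood": (False, 0.20), # false but easy to pass on
}

POPULATION = 10_000
CONTACTS_PER_STEP = 5
STEPS = 30

carriers = {name: 1.0 for name in IDEAS}  # seed one carrier per idea

for _ in range(STEPS):
    for name, (_, p) in IDEAS.items():
        susceptible = POPULATION - carriers[name]
        # New carriers ~ carriers * contacts * per-contact probability,
        # capped by the remaining susceptible population.
        carriers[name] += min(susceptible, carriers[name] * CONTACTS_PER_STEP * p)

for name, (is_true, _) in IDEAS.items():
    print(f"{name}: ~{carriers[name]:,.0f} carriers (true={is_true})")
```

With these assumed numbers, the false but catchy idea saturates the whole population within thirty steps, while the accurate one is still in the hundreds of carriers: prevalence tracks transmissibility, not truth.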
The term and its origin
Dawkins himself did not coin the term "cultural parasitism." In his essay "Viruses of the Mind" (1991), he viewed religious beliefs as memetic viruses that spread through contagion: an idea enters the mind, becomes integrated into the host’s belief system, and is then transmitted through any suitable contact. American philosopher Daniel Dennett further developed this approach, emphasizing that each person is responsible for the memes they transmit to others — since a meme, like a virus, cannot be completely eradicated; it can only be studied and resistance developed.
In her book "The Meme Machine" (1999), Susan Blackmore expanded the framework by introducing "memeplexes": stable complexes of interconnected memes that support one another and jointly enhance their survival in a cultural environment. From a memetic perspective, it is memeplexes (religions, ideologies, political movements) that exhibit the most pronounced parasitic properties: they create immunity to competing ideas directly within their host.
Biological analogy
The analogy with parasitism in biology is no coincidence. A classic parasite is an organism that lives off its host without killing it outright, but also without benefiting it. A parasitic idea operates in a similar way: it exploits a person’s cognitive resources — attention, memory, emotional engagement — and spreads through their social connections, without necessarily providing anything in return.
An important difference from biological parasitism is that a cultural parasite doesn’t have a physical existence. It lives in belief structures, narratives, and repetitive speech patterns. Its "body" is neural patterns, and its "medium" is conversations, texts, images, and videos. Dawkins explicitly pointed out that the host of a meme is not only the brain of a specific person but also any external information carrier: a book, a building, a melody.
Mechanisms of dissemination
Cognitive distortions as a breeding ground
Cultural parasites thrive where human thinking is most predictably vulnerable. One of the main mechanisms is confirmation bias: people tend to accept and disseminate information that aligns with their pre-existing beliefs and reject that which contradicts them. This isn’t a weakness of individual people — it’s a universal feature of human cognition, documented in hundreds of experimental studies.
A 2024 study published in the journal Frontiers in Public Health found that participants who were aware of confirmation bias were less likely to fall for misinformation; the effect was most pronounced among those who initially held a strongly negative stance toward COVID-19 vaccination. In other words, awareness of a cognitive bias partially neutralizes it, although it does not eliminate it completely.
In addition to confirmation bias, researchers identify several other mechanisms that fuel cultural parasitism (a toy sketch of how they might combine follows the list):
- Motivated reasoning: the tendency to look for arguments supporting a decision already made rather than to evaluate the arguments neutrally.
- The simplicity effect: simple, straightforward narratives are easier to remember and reproduce than nuanced explanations.
- Emotional resonance: fear, anger, and indignation dramatically increase the likelihood that a message is shared, even if it is factually untrue.
- Social proof: the belief that "everyone thinks so" makes an idea subjectively more believable.
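A hedged illustration of how these mechanisms might compound: the toy function below is entirely an assumption (the multiplicative form and all weights are invented for illustration, not taken from any study), but it shows how several modest biases can stack into a large difference in share rates.

```python
# Toy share-probability model. The multiplicative form and all weights
# are invented for illustration; they are not measured values.

def share_probability(message):
    """Crude estimate of P(user shares message), combining the biases above."""
    base = 0.01  # baseline share rate
    confirmation = 1.0 + 2.0 * message["fits_prior_beliefs"]   # confirmation bias
    simplicity = 1.0 + 1.0 * message["simplicity"]             # simplicity effect
    emotion = 1.0 + 3.0 * message["emotional_charge"]          # emotional resonance
    proof = 1.0 + 1.5 * message["perceived_consensus"]         # social proof
    return min(1.0, base * confirmation * simplicity * emotion * proof)

nuanced_report = {"fits_prior_beliefs": 0.3, "simplicity": 0.2,
                  "emotional_charge": 0.1, "perceived_consensus": 0.2}
outrage_fake = {"fits_prior_beliefs": 0.9, "simplicity": 0.9,
                "emotional_charge": 0.9, "perceived_consensus": 0.7}

print(f"nuanced report: {share_probability(nuanced_report):.3f}")  # ~0.032
print(f"outrage fake:   {share_probability(outrage_fake):.3f}")    # ~0.403
```

Under these assumed weights, the outrage-driven fake is shared roughly twelve times more often than the nuanced report, even though no single factor differs by more than a few multiples.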
Social structures and echo chambers
Social media creates a unique environment in which parasitic ideas thrive. Recommendation algorithms don't optimize for content veracity; they optimize for engagement. As a result, emotionally charged, controversial, and often inaccurate content receives disproportionate reach.
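A stylized sketch of this incentive problem, assuming a made-up scoring function (real ranking systems are vastly more complex): the feed ranks purely by predicted engagement, never sees accuracy, and the inflammatory false post rises to the top.

```python
# Stylized feed ranker with an assumed scoring function.
# The ranker optimizes predicted engagement and never sees `accurate`.

posts = [
    {"title": "Careful analysis with caveats", "emotional_charge": 0.2,
     "controversy": 0.1, "accurate": True},
    {"title": "SHOCKING claim about hidden enemies", "emotional_charge": 0.9,
     "controversy": 0.8, "accurate": False},
    {"title": "Routine correction of a rumor", "emotional_charge": 0.1,
     "controversy": 0.1, "accurate": True},
]

def predicted_engagement(post):
    # Engagement proxies: emotion and controversy drive clicks and replies.
    return 0.6 * post["emotional_charge"] + 0.4 * post["controversy"]

for post in sorted(posts, key=predicted_engagement, reverse=True):
    print(f'{predicted_engagement(post):.2f}  accurate={post["accurate"]}  {post["title"]}')
```

The false, inflammatory post ranks first simply because veracity never enters the objective function.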
The echo chamber phenomenon — a situation in which people receive information primarily from like-minded individuals — creates conditions for the self-perpetuation of parasitic narratives. Within such a chamber, critical thinking is suppressed: any doubt is perceived as a betrayal of group identity, and any confirmation of already shared views is accepted without verification. A study conducted in the context of politically oriented communities showed that bots spread disinformation at roughly the same rate as real users — meaning the problem lies not in automation, but in the very structure of people’s beliefs.
Cultural norms and collectivism
The propensity for cultural parasitism varies with cultural context. Research using Hofstede's cultural dimensions shows that in highly collectivist societies, information that aligns with group narratives spreads more quickly: considerations of group identity outweigh individual fact-checking. This doesn't mean that collectivist societies are more "trusting" in principle, but they exhibit a different pattern: trust in the group is higher there than trust in external sources, even when the latter are more credible.
Individualistic cultures have a different vulnerability: a tendency toward independent judgment, coupled with a distrust of institutions, creates fertile ground for conspiracy narratives that appeal to personal critical thinking — but are themselves well-packaged parasitic ideas.
Fake news as a model case
Speed and range of propagation
The phenomenon of fake news has become perhaps the most studied example of cultural parasitism in modern times. In 2018, MIT researchers Soroush Vosoughi, Deb Roy, and Sinan Aral published a large-scale study in the journal Science: they analyzed approximately 126,000 news stories, shared by roughly 3 million users more than 4.5 million times between 2006 and 2017.
The results were clear: fake news spread faster, deeper, and more widely than true news across all categories of information. False messages reached 1,500 people approximately six times faster than true ones, and the likelihood of a false tweet being retweeted was 70% higher than for a true one. This effect was particularly pronounced for political fakes: they reached 20,000 people approximately three times faster than other types of false information reached 10,000.
The most important finding of this study is that bots did not play a decisive role in this process. The speed and reach of fake news were explained by the behavior of people, not automated accounts. This directly confirms the memetic hypothesis: a parasitic idea does not require external automation; it uses its carriers themselves as agents of its own dissemination.
Novelty as a factor of virality
MIT researchers discovered another important fact: fake news was more likely to be perceived as new and unexpected. This is consistent with cognitive psychology: the brain pays increased attention to unexpected information — an adaptive response that evolved long before the advent of writing. The parasitic idea exploits this response: the more unexpected and sensational a message appears, the more attention it attracts and the more readily it is shared.
A 2021 study on the spread of COVID-19 misinformation, indexed in PubMed Central, found an additional factor: false messages actively appealed to cognitive biases through mentions of threats, gossip, and celebrities, precisely the elements that most strongly activate social attention mechanisms.
Prestige and authority do not protect
It’s widely believed that disinformation is spread primarily through influential individuals with large audiences. The data doesn’t support this. In an MIT study, users who spread false news had significantly fewer followers, were less likely to be verified, and demonstrated lower overall activity than those who spread truthful information.
A study conducted in the context of disinformation on social media likewise found that social prestige was not a determining factor in the spread of maladaptive cultural traits. This means that parasitic ideas do not require opinion leaders: they can spread horizontally, through ordinary carriers, exploiting the network's structure of connections.
Theoretical framework
Memetics and its criticism
Memetics as a discipline emerged by the late 1990s. Its central question: can Darwin’s principles of heredity, variation, and selection be applied to units of cultural information? The answer given by its key figures — Dawkins, Dennett, and Blackmore — was cautiously affirmative, but with caveats.
In "The Extended Phenotype" (1982), Dawkins himself pointed out the significant differences between memes and genes: memes are not arranged into chromosomes, their copying accuracy is incomparably lower, and "mutations" can be not only random but also intentional. Psychologist Jeremy Burman, writing in the journal Perspectives on Science, further noted that Dawkins initially used the meme as a rhetorical device — a metaphor for redefining the unit of selection in biology — rather than as a rigorous scientific concept.
Criticism of memetics by anthropologists and sociologists focuses on several points. First, the analogy with the gene is fundamentally incomplete: cultural transmission is mediated by consciousness, language, and interpretation, whereas genetic transmission is mediated by chemical copying. Second, "selection" in culture does not operate blindly: it works through deliberate human choices, tastes, and values. Third, the very concept of "meme" is so vague that it is difficult to operationalize for empirical research.
Cultural evolution as an alternative framework
The discipline of "cultural evolution" is considered a more rigorous theoretical tool — it’s less metaphorical than memetics and relies on quantitative methods. Its proponents — Peter Richerson, Robert Boyd, and Joseph Henrich — view cultural diffusion through the lens of learning theory: people don’t simply copy ideas; they selectively adopt them from those they perceive as successful, similar, or authoritative.
Within the framework of cultural evolution, the phenomenon of cultural parasitism is described as a "maladaptive cultural trait" — a cultural trait that reduces the fitness of its host or group, but is highly transmissible. This distinction is important: not every widespread idea is parasitic, but a parasitic idea almost always spreads well — precisely by appealing to psychological mechanisms, not by its credibility.
The concept of "global trust" and social proof
A separate mechanism of cultural parasitism is social contagion through a sense of "normality." Experiments at the University of California, Berkeley, showed that when people were told that a particular false belief was shared by the majority, they began to find it more plausible — without any additional factual justification. "We found that on virtually every item, people shifted their beliefs based solely on social data," the researchers noted.
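A minimal sketch of this social-data effect, assuming a simple weighted-average update rule in the spirit of classic opinion-dynamics models (the rule and the conformity weight are illustrative assumptions, not the Berkeley experiment's method):

```python
# Minimal sketch of belief drift under social proof. The weighted-average
# update rule and the conformity weight are illustrative assumptions.

def update_belief(own_belief, perceived_majority, conformity=0.3):
    """0 = ignore the group entirely; 1 = adopt the group's view outright."""
    return (1 - conformity) * own_belief + conformity * perceived_majority

credence = 0.2  # initial credence in a false claim (0..1)
for day in range(1, 6):
    # The person repeatedly hears that "most people" believe the claim (~0.9),
    # with no new factual evidence at any point.
    credence = update_belief(credence, perceived_majority=0.9)
    print(f"day {day}: credence = {credence:.2f}")
```

Credence drifts from 0.20 to roughly 0.78 in five rounds on social data alone, which mirrors why a refutation arriving after the group norm has formed gains so little purchase.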
This phenomenon explains why fake news remains viable even after official refutations: by the time a refutation appears, the idea has already become integrated into the group norm. Convincing someone that their group is wrong is a task of a fundamentally different order of difficulty than offering them a new fact.
Historical examples
Before the digital age
Cultural parasitism is not a new phenomenon. Rumors and moral panics spread through medieval cities with astonishing speed: news of an epidemic, heresy, or the approach of an enemy could travel dozens of kilometers in a matter of days, while refutations and clarifications lagged weeks behind. The mechanisms were the same: emotional charge, narrative simplicity, and an appeal to threat.
Conspiracy theories about Jews in medieval Europe, rumors about witches in the early modern period, and the yellow press of the late 19th century — all these are examples of ideas that spread not because they were true, but because they were psychologically effective. They explained fear, named an enemy, and offered a simple picture of a complex world.
German propaganda of the 1930s and Soviet propaganda of the same period have been studied as systematic state attempts to engineer cultural parasitism from the top down: deliberately creating parasitic ideas, injecting them into the media, and displacing competing narratives. Viktor Klemperer, in The Language of the Third Reich (LTI), documented how parasitic constructions penetrated everyday speech and gradually altered speakers' very perception of reality.
The Digital Age: Acceleration
Digital platforms haven’t changed the nature of cultural parasitism, but they have radically altered its scale and speed. While a rumor in a medieval city might reach a few thousand people in a few days, a viral fake news story in the digital environment reaches millions of people in just a few hours.
The COVID-19 pandemic exposed this mechanism with particular clarity. The WHO officially adopted the term "infodemic" to describe the parallel crisis of disinformation. False claims about the nature of the virus, vaccines, and treatment methods spread as fast as the disease itself, and in some cases led to real consequences: refusal of vaccination, the use of dangerous "folk remedies," and social conflict.
Psychology of the carrier
Who becomes a carrier?
Research hasn’t identified a single psychological profile for people prone to spreading parasitic ideas. It’s not a specific personality type — it’s a situational vulnerability that everyone is susceptible to. Key risk factors include high anxiety, belonging to a cohesive group with a strong identity, low media literacy, and information overload.
Daniel Kahneman's dual-process theory (System 1/System 2) provides a useful conceptual framework: in the fast-paced, emotionally charged environment of endless content, information on social media is processed by default by fast, intuitive thinking (System 1), which is particularly susceptible to parasitic ideas that appeal to emotion rather than reflection.
Emotions as a vector of transmission
Fear and anger are two of the most effective carriers of parasitic ideas. Research shows that messages that evoke these emotions are more likely to be assessed as credible and are more likely to be passed on without verification. This is explained by evolutionary logic: for survival, it is more important to immediately respond to a threat than to double-check its veracity.
Parasitic ideas are typically constructed around threats: an enemy within, a conspiracy at the top, a danger from without. This narrative structure activates primitive defense mechanisms and complicates critical assessment. Notably, rebuttals formulated in neutral academic language are less effective than rebuttals that appeal to the same emotional registers — because the battle against the parasite is waged within the same cognitive domain.
Backfire effect
One of the paradoxes of combating parasitic ideas is the "backfire effect": in some cases, the presentation of refuting facts does not weaken, but rather strengthens the original belief. When a person encounters information that threatens their identity or worldview, the emotional regions of the brain are more activated than the areas responsible for rational analysis.
Neuroimaging studies have documented this pattern: presenting politically controversial information to people with strong political beliefs resulted in activation of areas associated with emotional defense, not analytical thinking. This suggests that fact-checking alone is a necessary, but clearly insufficient, tool for countering cultural parasitism.
Parasitism and cultural diversity
Competition of narratives
Cultural parasites don’t exist in a vacuum — they compete with each other and with "healthy" ideas for limited cognitive resources: attention, memory, and emotional engagement. A parasitic idea displaces its competitors not by its content, but by its better adaptation to the host’s psychology.
This competition has an important consequence: when parasitic narratives dominate public discourse, they literally shrink the cognitive space for more complex, nuanced, and accurate descriptions of reality. Research shows that regular exposure to simplified narratives reduces the ability to perceive complex explanations — not because people become "dumber," but because mental habits are shaped by practice.
Memeplexes and systemic stability
Memeplexes — stable clusters of interconnected ideas — are particularly resilient. A conspiracy theory is an example: it typically contains a built-in immunity to refutation. Any counterargument is explained as part of the conspiracy ("that’s exactly what they want to tell us"), and any contradiction is interpreted as confirmation ("that’s how they cover their tracks").
This structure makes the parasitic memeplex virtually invulnerable to a frontal attack with facts. Dennett called such constructions "defensible beliefs" — they require not the refutation of specific theses, but the destruction of the entire interpretive system, which is much more difficult.
Social consequences
Polarization and the destruction of trust
The widespread dissemination of parasitic ideas has measurable social consequences. The best documented is political polarization: when different population groups live in fundamentally different information realities, political dialogue becomes difficult and compromise fundamentally unattainable.
Trust in institutions — the media, science, and government — is declining through a dual mechanism. On the one hand, parasitic narratives deliberately attack institutional authority ("it’s all a lie," "scientists are bought"). On the other hand, the institutions themselves, reacting to parasitic narratives, often adopt a defensive stance, which is perceived by audiences as a sign of insecurity.
The COVID-19 pandemic became a large-scale natural experiment that exposed these connections. In countries where the infodemic was more severe, vaccination proceeded more slowly, compliance with restrictions was lower, and social tension was higher.
Economic costs
Cultural parasitism also has direct economic implications. The WHO and the World Bank have estimated the costs of public health misinformation at billions of dollars, through excessive strain on medical systems, delays in response, and distrust of preventative measures. Pre-election disinformation is associated with increased political instability, which impacts the investment climate and economic predictability.
A separate category of costs is reputational damage to organizations and individuals targeted by parasitic narratives. Unlike physical damage, reputational damage is repaired slowly: research shows that once a false accusation has been disseminated, its traces in search engines and collective memory persist significantly longer than any refutation.
Countermeasures
Media literacy
The most widely discussed response to cultural parasitism is improving media literacy. The concept suggests that people who are able to critically evaluate sources, verify facts, and recognize manipulative tactics are less vulnerable to parasitic ideas.
Research data cautiously supports this hypothesis, but with caveats. Media literacy programs reduce susceptibility to disinformation, but the effect is often limited to the types of content on which participants were trained; transfer of skills to new formats of parasitic narratives is only partial. Furthermore, people with strong analytical skills are sometimes better at rationalizing beliefs they have already accepted, a pattern documented in research on motivated reasoning.
Prebunking and the inoculation approach
A promising approach is "prebunking": preemptive warnings about manipulation techniques before a person encounters a specific parasitic idea. The analogy with vaccination is explicit: a small dose of a weakened "virus," that is, familiarization with the structure of the manipulation, builds cognitive immunity.
Experiments with browser games in which users "created" disinformation themselves revealed a consistent effect: understanding how manipulation works from the inside significantly increases its detection from the outside. This approach works regardless of the participants’ political views — which is especially important in polarized environments.
Algorithmic and institutional measures
Efforts are being made at the platform level to slow the spread of parasitic narratives: fact-checking, warning labels, and throttling the circulation of unverified content. Research shows that even simple warnings ("this material is disputed") reduce the likelihood of sharing, although the effect is modest and partially offset by the implied-truth effect: content without a warning is perceived as implicitly verified.
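A back-of-envelope sketch of why the gain is partial; every number below is invented for illustration, not taken from any study. Warnings cut sharing of labeled fakes, but fact-checkers cover only a fraction of items, and unlabeled fakes gain an implied-truth boost.

```python
# Back-of-envelope arithmetic; every number here is invented for illustration.

FAKES = 100                 # false items in circulation
LABEL_COVERAGE = 0.4        # fact-checkers reach only 40% of them
BASE_SHARES = 50            # shares per item before any labeling
WARNING_EFFECT = 0.6        # labeled items are shared 60% less
IMPLIED_TRUTH_BOOST = 1.15  # unlabeled fakes look "implicitly verified"

labeled = FAKES * LABEL_COVERAGE
unlabeled = FAKES - labeled

shares_before = FAKES * BASE_SHARES
shares_after = (labeled * BASE_SHARES * (1 - WARNING_EFFECT)
                + unlabeled * BASE_SHARES * IMPLIED_TRUTH_BOOST)

print(f"total shares before labeling: {shares_before:.0f}")  # 5000
print(f"total shares after labeling:  {shares_after:.0f}")   # 4250
```

Under these assumptions, a 60% reduction on labeled items yields only about a 15% reduction overall: partial coverage plus the implied-truth effect eats most of the gain.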
The systemic challenge is that the business models of most digital platforms are still built on maximizing engagement — and engagement is most effectively generated by parasitic narratives. Without changing this structure, technical measures remain palliative.
Institutional transparency
A separate area is increasing the transparency of the institutions themselves, which become targets of parasitic narratives. Secretive and inconsistent communication from scientific, medical, and government agencies fuels mistrust, which parasitic narratives then exploit. Open data, public methodology, and the recognition of uncertainty — all of this reduces the vulnerability of institutional authority to attack by parasitic ideas.
The irony is that combating parasitic ideas requires patience, nuance, and a willingness to embrace complexity: precisely the qualities that lose to parasitic ideas in the competition for audience attention.
This article is based on published scientific research and academic sources in the fields of memetics, cultural evolution, and the psychology of disinformation.