# The Invisible War for Your Mind

> Published on ADIN (https://adin.chat/world/theyre-not-trying-to-change-your-mind-theyre-trying-to-break-it)
> Author: Priyanka
> Date: 2026-03-02

Trevor Paglen's photograph *[They Watch the Moon](https://www.sfmoma.org/artwork/2011.264/)* appears, at first glance, to be a landscape of pure stillness -- silver ridges dissolving into velvet darkness, as serene as a Romantic painting. Then your eye adjusts. The "ridges" are radomes, giant golf balls housing NSA antennae that capture radio signals bouncing off the lunar surface. What looked like nature is surveillance infrastructure. Paglen's point arrives like a chill: the most consequential power structures are the ones you don't notice until someone forces you to look.

That quiet revelation is a near-perfect metaphor for cognitive warfare -- a term you may not have encountered, but whose effects you have almost certainly felt.

## What Cognitive Warfare Actually Is

[NATO's July 2024 Cognitive Warfare Concept](https://www.act.nato.int/article/cogwar-concept/) states the matter plainly: *"The brain is both the target and the weapon in the fight for cognitive superiority."* This is not hyperbole. It is doctrine.

Cognitive warfare is not propaganda, though it uses propaganda. It is not disinformation, though it weaponizes disinformation. It is not psychological operations, though it encompasses them. Cognitive warfare is the *system* -- the integrated, persistent effort to shape how adversary populations perceive reality, process information, and make decisions.

Where traditional warfare seeks to destroy capacity, cognitive warfare seeks to corrupt cognition. The target is not a bridge or a server farm. The target is the architecture of your reasoning.

A key Russian concept, **reflexive control**, captures the logic. The goal is not to implant a belief but to *"transfer the bases for decision-making"* from one actor to another.
Feed someone carefully curated premises; let them reason their way to the conclusion you desire. They will defend that conclusion fiercely, because they believe they arrived at it themselves. Reflexive control doesn't hijack the mind -- it hijacks the *process* of thinking.

The deepest aim is not conversion. It is **exhaustion**. Flood the information environment with so many competing narratives, half-truths, and contradictions that certainty itself feels naïve. When no one knows what to believe, paralysis follows -- and paralysis favors the aggressor.

## The Mechanics: How It Actually Works

Understanding cognitive warfare requires understanding *how* it operates -- the specific tactics that turn abstract doctrine into measurable effect.

### 1. The Firehose of Falsehood

In 2016, [RAND Corporation analysts](https://rand.org/pubs/perspectives/PE198.html) coined the term *firehose of falsehood* to describe Russian propaganda under Vladimir Putin. Two characteristics stand out: **high volume across many channels** and **a shameless disregard for consistency or truth**.

This is counterintuitive. Conventional wisdom says effective persuasion requires credibility, and credibility requires consistency. The firehose model rejects this. It floods the zone -- social media, state TV, cloned news sites, bot networks -- with overlapping, sometimes contradictory messages. The goal is not to convince but to overwhelm. When audiences encounter dozens of conflicting accounts of the same event, many simply disengage. That disengagement is the victory.

### 2. Exploiting Cognitive Biases

Humans are not rational processors of information; we are pattern-matchers running on heuristics. Cognitive warfare exploits these shortcuts:

- **Confirmation bias**: We seek information that validates existing beliefs. Micro-targeted content ensures we find it.
- **Availability heuristic**: We judge likelihood by how easily examples come to mind. Flood feeds with examples of crime or chaos, and perceived danger rises -- regardless of statistics.
- **Negativity bias**: Threats capture attention faster than opportunities. Emotionally charged content (fear, outrage, disgust) spreads farther, faster.

These are not bugs in human cognition; they are features that served us well for millennia. Cognitive warfare turns them into attack vectors.

### 3. Micro-Targeting and Psychological Profiling

The Cambridge Analytica scandal revealed how psychographic profiling -- personality assessment at scale using social media data -- could enable precision influence. According to [peer-reviewed analysis](https://www.frontiersin.org/journals/communication/articles/10.3389/fcomm.2020.00067/pdf), the firm combined "Big Five" personality models with programmatic advertising to deliver tailored political messages based on psychological vulnerabilities.

Imagine a voter who scores high on neuroticism. Serve them content emphasizing threats and uncertainty. Another scores high on openness. Serve them content about stagnation and the excitement of change. Same election, different framings -- each optimized for psychological resonance.

Whether Cambridge Analytica's specific methods were as effective as claimed remains debated among researchers. But the underlying capability -- precision psychological targeting at scale -- is now widespread.

### 4. Narrative Seeding

Rather than push conclusions directly, sophisticated operators seed *premises*. They introduce framing, vocabulary, and assumptions into public discourse, then let organic actors -- journalists, influencers, citizens -- build narratives on that foundation. The conclusion appears to emerge "naturally," obscuring its engineered origins.

Russian information operations rarely argue explicitly that NATO is an aggressive threat. They seed stories about NATO exercises near Russian borders, amplify local opposition voices, and surface historical grievances.
Over time, a narrative forms: NATO encirclement. The audience feels it discovered this conclusion; in reality, the premises were planted.

### 5. Trust Erosion

Perhaps the most corrosive tactic is not promoting a specific lie but degrading trust in *all* sources of truth -- mainstream media, scientific institutions, electoral systems, expertise itself. When trust collapses, power flows to whoever can assert their version of reality most forcefully.

This is the endgame: not a world where everyone believes the attacker's narrative, but a world where no one believes anything confidently. In that vacuum, the loudest voice wins.

## Historical Precedents

Cognitive warfare did not emerge from a vacuum.

**MK-ULTRA (1953-1973)**: The CIA's illegal program researching mind control -- LSD experiments, sensory deprivation, hypnosis -- sought to directly manipulate human cognition. Exposed by the Church Committee in 1975, it revealed the depths to which intelligence agencies would go to weaponize the mind.

**Operation Mockingbird**: A Cold War-era effort (the scope remains debated among historians) to influence American media by cultivating relationships with journalists. The premise: control the information environment, shape what people believe is possible.

**Soviet "active measures"**: Dezinformatsiya, forgeries, front organizations, agents of influence -- the USSR built an entire bureaucracy dedicated to shaping Western perceptions. The playbook never disappeared; it migrated online.

These operations were crude by today's standards -- slow, expensive, limited in reach. What has changed is scale, speed, and precision.
## Modern Doctrine: Russia and China

### Russia: "Cognitive Warfare Is Russia's Way of War"

The Institute for the Study of War's June 2025 report, [*A Primer on Russian Cognitive Warfare*](https://understandingwar.org/research/cognitive-warfare/a-primer-on-russian-cognitive-warfare/), states bluntly: *"Understanding cognitive warfare is a national security requirement for the United States."*

Russia does not view cognitive warfare as a supplement to conventional operations. It *is* the operation. The Kremlin's goal is to shape the information environment so thoroughly that adversary populations question their own institutions, fracture along existing fault lines, and lose the will to resist.

**The Doppelganger Campaign** exemplifies this approach. Beginning in 2022, Russian operators created pixel-perfect clones of Western news outlets -- *The Washington Post*, *Fox News*, *Der Spiegel*, *Bild* -- and seeded them with fabricated articles. According to [DFRLab's September 2024 analysis](https://dfrlab.org/2024/09/18/doppelganger-us-election/), these fake sites were promoted through social media ads, bot amplification, and coordinated sharing. The goal was not mass deception but *plausibility*: introduce enough fake content that audiences begin to doubt even authentic sources.

[U.S. Cyber Command](https://www.cybercom.mil/Media/News/Article/3895345/russian-disinformation-campaign-doppelgnger-unmasked-a-web-of-deception/) later exposed the operation's infrastructure, and the EU has since sanctioned individuals involved. But the template is now public, and variants continue to emerge.

### China: The Three Warfares and Beyond

China's approach differs in emphasis but shares the same objective: cognitive dominance. The PLA's **"Three Warfares"** doctrine -- psychological warfare, public opinion warfare, legal warfare -- has been operational since 2003. But China is now moving beyond information operations into the *neurological* domain.
The **China Brain Project**, a national initiative running from 2016 to 2030 with an estimated $4.7 billion in funding, officially focuses on neuroscience research and brain-disease treatment. But as the [Jamestown Foundation](https://jamestown.org/program/tiktok-an-expanding-front-in-cognitive-warfare/) and other analysts have documented, the project includes PLA-linked laboratories researching brain-computer interfaces and cognitive enhancement technologies with potential military applications.

**Taiwan's 2024 presidential election** provided a live demonstration. According to [Defense One](https://www.defenseone.com/technology/2024/04/how-china-used-tiktok-ai-and-big-data-target-taiwans-elections/395569/), Chinese actors deployed AI-generated content at scale -- deepfake videos, synthetic news anchors, fake social media accounts -- combined with algorithmic manipulation to flood Taiwanese voters with divisive narratives.

Taiwan's response offers a counterpoint: robust civic fact-checking infrastructure, rapid government communication, and high media literacy blunted some effects. The operation demonstrated both what is now possible and that democracies can build resilience.

## Case Studies: The Invisible War in Action

### Doppelganger: Cloning Reality

In September 2024, investigators exposed the full scope of the Doppelganger operation. Russian actors had cloned over 60 Western media outlets, creating near-identical websites that published fabricated stories. One fabricated article, mimicking *The Washington Post*, claimed U.S. intelligence agencies were planning to abandon Ukraine. Another, mimicking a German outlet, alleged NATO was secretly preparing offensive operations against Russia. These stories were designed to be shared and screenshotted -- seeding doubt even among audiences who never visited the fake sites directly.

### Taiwan 2024: AI-Powered Cognitive Flooding

Taiwan's election was a proving ground for AI-enabled influence.
Researchers at [Doublethink Lab](https://medium.com/doublethinklab/artificial-multiverse-foreign-information-manipulation-and-interference-in-taiwans-2024-national-f3e22ac95fe7) documented a coordinated campaign involving:

- **AI-generated video**: Deepfakes of candidates making inflammatory statements, released hours before voting.
- **Synthetic personas**: Thousands of fake social media accounts with AI-generated profile photos and consistent posting histories.
- **Algorithmic manipulation**: Evidence suggesting TikTok's recommendation algorithm disproportionately surfaced content favorable to Beijing's preferred narratives.

Taiwan's fact-checking networks and experienced electorate limited the damage. But the operation demonstrated what is now achievable at industrial scale.

### Ukraine: The Surrender That Never Happened

In March 2022, days after Russia's invasion, a deepfake video of Ukrainian President Volodymyr Zelensky appeared online, showing him urging soldiers to surrender. The video was crude -- Zelensky's head poorly composited, his voice slightly off -- and was debunked within hours.

But the episode revealed the new terrain. Even in kinetic war, cognitive warfare runs in parallel: synthetic media designed to demoralize troops, erode civilian resolve, and fracture alliances. Ukraine has since invested heavily in counter-disinformation infrastructure, treating the information war as seriously as the artillery war.

## The AI Accelerant

Generative AI has transformed cognitive warfare from an artisanal craft into an industrial process.

**Scale**: Large language models can produce thousands of unique, contextually appropriate messages per hour. What once required teams of propagandists now requires a prompt and API access.

**Personalization**: AI enables micro-targeted content at a granularity previously impossible -- unique messages for individual recipients, each optimized for a psychological profile.

**Speed**: Real-time narrative adaptation. As events unfold, AI systems can generate, test, and refine messaging within minutes.

**Synthetic personas**: AI-generated faces, voices, and posting histories create "people" who do not exist but can build followings and shape discourse.

NATO's 2026 Chief Scientist Report warned that adversaries are developing capabilities for *"modeling population cognition, identifying psychological pressure points, and generating influence content at scale."* The report called for "operational readiness" against cognitive threats -- language typically reserved for kinetic warfare.

These capabilities are emerging, not yet fully deployed. But the trajectory is clear, and the window for building defenses is narrowing.

## The Vulnerability of Open Societies

Democracies face a structural asymmetry in cognitive warfare. Open societies depend on free speech, pluralistic media, and rapid information flow. These are strengths -- but they are also attack surfaces. Authoritarian states can insulate their populations behind censorship and narrative control. Democracies, by design, cannot.

Social media amplifies the problem. Algorithms optimize for engagement, and nothing engages like outrage. Divisive, emotionally charged content spreads faster than nuanced analysis. The incentives of the information ecosystem align -- accidentally but effectively -- with the goals of cognitive attackers.

Trust deficits compound the vulnerability. When citizens already distrust institutions, they are primed to accept alternative narratives. Cognitive warfare does not create polarization; it *exploits* polarization that already exists.

**A note of caution**: Attribution in this domain is difficult. Not every divisive narrative is a foreign operation; most are organic expressions of genuine disagreement. The two are increasingly entangled, and distinguishing them requires careful analysis.
Researchers also debate how effective these operations actually are -- changing minds is harder than flooding feeds, and resilience varies by society and context. What is clear is that the *attempt* is persistent, well-resourced, and evolving.

## What This Means for You

Cognitive warfare is not an abstraction. It is happening now, and you are inside it. The content you encounter on social media, the news stories that surface in your feed, the outrage that flares and fades -- all of this unfolds on contested terrain. Not every post is an operation; most are genuine. But the environment itself is shaped by actors who understand how attention works and how trust erodes.

The goal of cognitive warfare is not to make you believe a particular lie. It is to make you **tired** -- tired of conflicting claims, tired of questioning sources, tired of the cognitive effort required to discern truth. The desired end state is not belief but exhaustion.

Trevor Paglen once said his art is about *"making visible the infrastructures that define our era."* Today's defining infrastructure is not a radome or a satellite dish. It is something far more intimate: the architecture of your attention, the pathways of your reasoning, the stories you use to make sense of the world.

The invisible war is not coming. It is here. The only question is whether you see it.

## Sources

- [NATO Allied Command Transformation: Cognitive Warfare Concept (July 2024)](https://www.act.nato.int/article/cogwar-concept/)
- [Institute for the Study of War: A Primer on Russian Cognitive Warfare (June 2025)](https://understandingwar.org/research/cognitive-warfare/a-primer-on-russian-cognitive-warfare/)
- [RAND Corporation: The Russian "Firehose of Falsehood" Propaganda Model (2016)](https://rand.org/pubs/perspectives/PE198.html)
- [DFRLab: Doppelganger -- How Russia Mimicked Real News Sites (September 2024)](https://dfrlab.org/2024/09/18/doppelganger-us-election/)
- [U.S. Cyber Command: Russian Disinformation Campaign "DoppelGänger" Unmasked](https://www.cybercom.mil/Media/News/Article/3895345/russian-disinformation-campaign-doppelgnger-unmasked-a-web-of-deception/)
- [Defense One: How China Used TikTok, AI, and Big Data to Target Taiwan's Elections (April 2024)](https://www.defenseone.com/technology/2024/04/how-china-used-tiktok-ai-and-big-data-target-taiwans-elections/395569/)
- [Jamestown Foundation: TikTok -- An Expanding Front in Cognitive Warfare (February 2024)](https://jamestown.org/program/tiktok-an-expanding-front-in-cognitive-warfare/)
- [Doublethink Lab: Foreign Information Manipulation in Taiwan's 2024 Elections](https://medium.com/doublethinklab/artificial-multiverse-foreign-information-manipulation-and-interference-in-taiwans-2024-national-f3e22ac95fe7)
- [Frontiers in Communication: Cambridge Analytica's Psychographic Profiling (2020)](https://www.frontiersin.org/journals/communication/articles/10.3389/fcomm.2020.00067/pdf)
- [SFMOMA: Trevor Paglen, *They Watch the Moon* (2010)](https://www.sfmoma.org/artwork/2011.264/)