Cognitive warfare and grey-zone tactics now sit at the centre of modern competition. Instead of relying only on kinetic force, adversaries mix information manipulation, psychological pressure and deniable proxy actions into long campaigns that stop short of open war. In practice, they use cognitive warfare and grey-zone tactics to erode will, cohesion and decision-making long before a traditional crisis appears. As a result, the “battlefield” increasingly includes minds, narratives and social trust, not only terrain and platforms.
Key Facts
- Core concept: Adversaries fuse cognitive warfare and grey-zone tactics to attack perception, morale and decision-making below the threshold of war.
- Main tools: Disinformation, deepfakes, social media manipulation, cyber attacks on critical infrastructure, economic coercion and proxy activity.
- Leading doctrines: Russian “reflexive control” and Chinese “Information Confrontation” / “Three Warfares” (psychological, media and legal warfare).
- Strategic effect: A “war that does not declare itself” but still undermines institutions, paralyses infrastructure and softens targets from within.
- Future trend: AI-generated content and ubiquitous connectivity will amplify influence operations and cognitive campaigns by 2030.
- NATO response: Allies now treat information and the human domain as contested battlefields and invest in resilience and cognitive security.
From Kinetic Clashes to Cognitive Campaigns
Modern adversaries no longer wait for formal hostilities. Instead, they wage continuous contests in the information and human domains. Cognitive warfare targets how leaders and societies perceive reality, form judgments and choose actions. Grey-zone tactics occupy the space between peace and declared war and include cyber sabotage, economic pressure, lawfare, maritime harassment and covert proxy operations.
Together, cognitive warfare and grey-zone tactics create a persistent pressure campaign. Disinformation and conspiracy narratives polarise societies, weaken trust in institutions and blur the line between fact and fiction. At the same time, cyber intrusions hit infrastructure, paramilitary formations probe borders or sea lanes, and economic instruments build coercive leverage. Consequently, adversaries can shift the balance of power through slow, cumulative gains rather than dramatic breakthroughs.
Analysts from organisations such as the NATO Foundation highlight how troll farms, fake news and deepfake content now seek to “sow chaos and erode public trust” in democratic systems. In many cases, these campaigns coincide with elections, referendums or high-stakes security debates. The aim is simple: confuse voters, fragment elites and constrain coherent policy responses.
Russian Reflexive Control: Shaping the Adversary’s Mind
Russia offers the most explicit doctrinal example. Its concept of reflexive control focuses on shaping an opponent’s perception and decision process so that the opponent chooses options favourable to Moscow. Russian strategists use deception, selective leaks, information operations and staged incidents to distort an adversary’s understanding of reality.
In practice, Russia combines reflexive control with broader hybrid tactics:
- It floods the information space with narratives that justify or obscure military moves.
- It launches coordinated cyber attacks against government and infrastructure networks.
- It deploys irregular forces and “little green men” to create faits accomplis without clear attribution.
- It exploits energy, trade and diplomatic levers to fracture allied responses.
The campaigns around Crimea in 2014 and the early phases of the war in Ukraine illustrate this pattern. First, information operations shape expectations and narratives. Then, irregular forces and covert units generate new facts on the ground. Finally, cyber and economic tools raise the cost of resistance.
Therefore, Western warning indicators cannot focus only on troop movements and order-of-battle changes. Defence planners must also track disinformation surges, coordinated cyber activity and sudden shifts in hostile narratives aimed at Allied publics.
China’s Information Confrontation and the “Three Warfares”
China follows a different path but pursues similar objectives. Its doctrine of Information Confrontation and the associated “Three Warfares” – psychological, media and legal warfare – treats conflict as a continuous, whole-of-state competition. Beijing seeks information dominance that supports political aims without triggering major war.
The “Three Warfares” work together:
- Psychological warfare targets the morale of foreign militaries, political elites and populations. It highlights Chinese strength and inevitability while amplifying doubt and risk on the other side.
- Media warfare uses state media, controlled platforms and social networks to shape global narratives and normalise Chinese positions.
- Legal warfare (lawfare) reinterprets legal frameworks to legitimise Chinese actions and delegitimise rival claims, especially at sea.
The South China Sea shows this doctrine in action. Chinese maritime militia and coast guard vessels operate in large numbers to assert presence and intimidate other claimants. At the same time, Beijing advances legal arguments, historical narratives and media campaigns that seek to portray these activities as lawful and restrained. Social media influence operations reinforce these messages and discredit critics.
Once again, cognitive warfare and grey-zone tactics reinforce each other. Chinese forces change the operational status quo in disputed waters while information operations seek to frame these actions as normal, inevitable and non-escalatory. As a result, rivals face a constant dilemma: respond and risk escalation, or accept incremental loss of position.
AI, Deepfakes and the 2030 Threat Environment
By 2030, cognitive warfare and grey-zone tactics will benefit from rapid advances in AI and connectivity. Adversaries already experiment with:
- AI-generated text, audio and video that closely mimic trusted figures and institutions.
- Hyper-targeted influence operations aimed at specific demographics such as veterans, military families or key swing constituencies.
- Adaptive botnets and troll farms that respond in near real time to fact-checking efforts and counter-narratives.
Moreover, always-connected societies create a dense attack surface. Every device, app and platform can carry influence messages. In crises, decision-makers may need to act while under intense cognitive pressure from false reports, manipulated images and engineered online outrage. If commanders or political leaders doubt their own information feeds, adversaries gain another advantage.
Therefore, modern deterrence cannot rely only on physical hardening and kinetic superiority. It must also protect information integrity and human judgement.
Implications for NATO and Allied Defence Planning
NATO and partner nations now treat information and the human domain as contested battlefields. This shift has several practical implications.
1. Build Societal Resilience
First, governments must strengthen societal resilience. Media literacy campaigns, critical-thinking education and transparent public communication all reduce the impact of disinformation. Trusted public broadcasters and verified digital channels also help citizens navigate crises. When societies recognise manipulation early, adversaries find it harder to fracture domestic consensus.
2. Create Dedicated Cognitive and Information Warfare Units
Second, Allies increasingly stand up specialist units for information operations, psychological defence and strategic communication. These teams work alongside kinetic, cyber and space forces rather than as afterthoughts. They plan campaigns, coordinate messaging and support commanders who operate in complex information environments.
3. Improve Attribution and Exposure
Third, effective deterrence requires credible attribution. Intelligence services, cyber defence teams and open-source analysts must cooperate to trace disinformation campaigns, cyber attacks and proxy activity back to their sponsors. Once governments can attribute actions with sufficient confidence, they can expose malign behaviour, impose costs and rally partners.
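One simple analytic building block behind such attribution work is indicator overlap: comparing the infrastructure and narrative markers of a new campaign against those of previously attributed actors. The sketch below illustrates the idea with Jaccard similarity over hypothetical indicator sets (the actor names, domains and thresholds are invented for illustration, not drawn from any real dataset); real tradecraft layers many more signals before any public attribution.

```python
def jaccard(a: set, b: set) -> float:
    """Overlap between two indicator sets (domains, IPs, narrative tags)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def rank_sponsors(campaign: set, known: dict, threshold: float = 0.3):
    """Rank known actors by indicator overlap with a new campaign.
    Scores above `threshold` merit analyst review -- not attribution."""
    scores = {name: jaccard(campaign, indicators)
              for name, indicators in known.items()}
    return sorted(((s, n) for n, s in scores.items() if s >= threshold),
                  reverse=True)

# Hypothetical indicator sets, for illustration only.
known = {
    "actor_A": {"domain1.example", "198.51.100.7", "narrative:rigged-vote"},
    "actor_B": {"domain9.example", "203.0.113.4"},
}
new_campaign = {"domain1.example", "198.51.100.7", "narrative:coup-fear"}
print(rank_sponsors(new_campaign, known))  # [(0.5, 'actor_A')]
```

A score like this only narrows the field; confident attribution still requires corroborating intelligence before governments expose or sanction a sponsor.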
4. Integrate Emerging Technologies into Cognitive Defence
Finally, Allies need to use technology to defend the cognitive domain. AI-enabled tools can detect synthetic media, identify coordinated online networks and flag anomalies in information flows. At the same time, defence organisations must design AI support systems that maintain human trust under stress. If commanders rely on opaque “black box” recommendations, adversaries can target that trust with tailored narratives or data poisoning.
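One of the simplest signals such tools exploit is copy-paste coordination: many nominally independent accounts posting near-identical messages in a short window. The minimal sketch below flags account pairs that share multiple normalised messages (the account names and posts are invented for illustration); production systems add timing, network and media-forensics features on top of this kind of heuristic.

```python
from collections import defaultdict
from itertools import combinations

def normalize(text: str) -> str:
    # Collapse case and whitespace so trivially edited copies hash alike.
    return " ".join(text.lower().split())

def coordinated_pairs(posts, min_shared=2):
    """Flag account pairs that published `min_shared` or more
    identical (normalised) messages -- a crude coordination signal."""
    by_message = defaultdict(set)            # normalised text -> accounts
    for account, text in posts:
        by_message[normalize(text)].add(account)
    pair_counts = defaultdict(int)           # (acct_a, acct_b) -> shared msgs
    for accounts in by_message.values():
        for a, b in combinations(sorted(accounts), 2):
            pair_counts[(a, b)] += 1
    return {pair for pair, n in pair_counts.items() if n >= min_shared}

# Hypothetical posts, for illustration only.
posts = [
    ("acct_1", "Breaking: the vote was RIGGED!"),
    ("acct_2", "breaking:  the vote was rigged!"),
    ("acct_1", "Share before they delete this."),
    ("acct_2", "Share before they delete this."),
    ("acct_3", "Lovely weather in Brussels today."),
]
print(coordinated_pairs(posts))  # {('acct_1', 'acct_2')}
```

Heuristics like this are easy for adversaries to evade with paraphrasing, which is exactly why the AI-enabled detection the text describes must keep adapting alongside the campaigns it tracks.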
Conclusion: Securing the Cognitive Frontiers
Cognitive warfare and grey-zone tactics allow adversaries to change facts on the ground and inside societies without crossing the legal threshold of war. They can undermine institutions, paralyse decision-making and soften targets from within. Russia’s reflexive control doctrine and China’s “Three Warfares” illustrate how major powers now integrate these methods into formal strategy.
For NATO and its partners, the response must go beyond better messaging. Allies need resilient societies, professional information warfare capabilities, robust attribution tools and AI-enabled defences that protect human judgement. In short, nations must secure their cognitive frontiers as carefully as they defend airspace, sea lanes and cyberspace. The outcome of 21st-century competition may depend on who adapts fastest to this new, shadowy battlespace.
Further Reading
- NATO Foundation – analyses on cognitive warfare, hybrid threats and grey-zone strategies: https://www.natofoundation.org
- NDU Press – studies on information operations, reflexive control and information confrontation: https://ndupress.ndu.edu
- Trends Research & Advisory – research on hybrid warfare and grey-zone competition: https://trendsresearch.org
- Defence Agenda – related articles on battlefield AI trust and dual-use defence technologies:
  - Battlefield AI and trust in combat decision-making: https://defenceagenda.com/battlefield-ai-trust/
  - Dual-use defence technologies in LEO and quantum sensing: https://defenceagenda.com/dual-use-defence-technologies-leo-quantum-sensing/