Tag Archives: Chinese Military Use of Cognitive Confrontation within the Combat Domain

[Chinese National Defense] Establishing Correct Awareness to Counter the CCP’s Cognitive Warfare Operations

[中國國防]建立正確認知,反制中共認知作戰行動

Modern English:

As the world continued to actively combat the COVID-19 pandemic, the British newspaper The Guardian reported in late May 2021 that Fazze, a public relations and marketing agency with close ties to Russian officials, was accused of providing funding to influential YouTubers, bloggers, and other opinion leaders in France, Germany, and other European countries to spread false information claiming that vaccines like Pfizer (BNT) and AstraZeneca (AZ) had caused hundreds of deaths. The false information also criticized the EU vaccine procurement system for harming public health in European countries, with the goal of sowing public distrust of Western vaccines and shifting public acceptance toward Russia’s Sputnik V vaccine. This is the most significant example of “perception warfare” in recent international history.

In fact, human society has long regarded “subduing the enemy without fighting” as the ideal guiding principle for military operations. Traditional warfare still takes place primarily in physical space, where victory requires capturing cities and territory and destroying the enemy’s forces. As humanity’s understanding of the nature of war has deepened, however, using information technology to achieve the effects of traditional combat without physical engagement has become a new trend in warfare. Given the growing attention paid to “information warfare” and “hybrid warfare,” this article discusses the closely related concept of “cognitive warfare,” exploring the emerging threats facing our country and the all-of-society national defense strategies for responding to them.

Whether it is what the US calls “hybrid warfare” or what Russia calls “information warfare,” the two concepts are quite similar: both center on the cognitive realm and use information to influence and manipulate targets, encompassing both peacetime public opinion and wartime decision-making. Nazi Germany, which rose after World War I, was arguably the first modern regime to master the use of information to shape perceptions at home and even abroad; its propaganda and lies, delivered through the communication technologies of the day, proved highly effective. Principles such as “repetition is power” and “negative information is more easily accepted and remembered than positive information” would later profoundly influence authoritarian governments, including present-day Russia.

 Using information capabilities to subvert national regimes

At the beginning of the 21st century, Russia began to take note of how international discourse power was dominated by Western countries. It successively put forward theories such as the “information warfare theory” and the “sixth-generation warfare theory,” arguing that sixth-generation warfare is non-contact warfare that dominates the battlefield with precision weapons and information operations. The purpose of war is no longer a devastating global conflict but, by using information capabilities that exploit the enemy’s weaknesses, to achieve effects that traditional warfare cannot, including changing a society’s cultural orientation and values and thereby subverting a nation’s regime.

In 2005, Russia established the international news channel “Russia Today.” Initially focused on soft-power propaganda, it shifted after the 2008 Georgian War to attacking the negative aspects of Western society and fostering conspiracy theories. The 2014 Ukraine crisis became a training ground for Russian information warfare forces. Using electronic jamming and cyber theft, they intercepted Ukrainian communications, inferred subsequent Ukrainian moves, and released information damaging to the Ukrainian government at critical moments. They also targeted sensitive issues in eastern Ukraine, including the status of ethnic Russians and the stagnant economy, distributing large volumes of carefully selected, targeted information to resonate with the public, influencing local perceptions and behavior and seizing the initiative in media opinion. In terms of “cognitive warfare,” Russia’s approach has been successful and has become a model for the Chinese Communist Party.

 Manipulating “brain control” to control the public

 In 2014, the Chinese Communist Party (CCP) proposed the cognitive operational concept of “brain control,” building on its past “three warfares” of psychological warfare, legal warfare, and public opinion warfare, as well as Russia’s theoretical framework of “information warfare.” It states that a nation’s cognitive space is composed of the superposition of countless individuals, and that “brain control” uses national languages, propaganda media, and cultural products as weapons to comprehensively infiltrate and control the cognition, emotions, and consciousness of the general public and national elites, ultimately distorting, disintegrating, and reshaping their national spirit, values, ideology, history, and culture, thereby achieving the strategic goal of winning without fighting.

The CCP’s “cognitive operations” therefore fall under the broad category of psychological warfare. In the era of information globalization, they integrate information warfare, psychological warfare, and public opinion warfare and have become the core of the CCP’s overall strategy. Since the 2016 military reform, they have been directed by the newly formed Strategic Support Force and carried out at every political and military level. On the one hand, the PLA has drawn on American operational thinking in the field of “cognitive operations,” using units such as the 311 Base, the National University of Defense Technology, and the Academy of Military Sciences to develop tactics such as “psychological operations,” “ideological operations,” “consciousness manipulation,” and “strategic communication,” strengthening the “cognitive operations” capabilities built jointly through military-civil fusion and the joint combat system. On the other hand, it uses professional personnel to operate media platforms, shape the public opinion environment, and move “cognitive operations” into the stage of practical combat application.

The CCP’s recent “cognitive warfare” offensive against Taiwan reveals its methods and tactics. First, the CCP primarily uses the internet to collect personal data on Taiwanese citizens, sorting the information into target groups by political leaning, age, occupation, and other factors through big-data databases. Second, building on this intelligence gathering, it launches precision cognitive attacks on specific social media platforms to influence the psychology of the targeted groups, in particular by releasing disinformation to weaken and distract Taiwanese society. Third, it uses online virtual organizations to set up fake social media accounts, infiltrate online communities, and pose as leakers or whistleblowers, deliberately spreading fabricated information to create confusion. Cyber troops then massively repost and discuss this information, manipulating audience perceptions and creating a cycle of disrupting information retention, manipulating cognitive psychology, and altering thinking patterns.

 Identify fake news and fight back together

 At this stage, the CCP’s campaign for “brain control” over Taiwan aims to influence Taiwanese society’s cognition, distorting public opinion, devaluing democratic values, intensifying opposition, disrupting political conditions, and undermining public trust in the government. The following preventive measures can be taken within the national defense system:

 1. Strengthening educational functions

Through national defense education in schools, government institutions, and society at large, raise public awareness of the threat posed by the CCP’s “cognitive warfare,” improve the ability to identify false information, and cultivate habits of rational, verified, and calm response.

2. Abiding by norms and constraints

Although there are currently no internationally accepted legal rules that clearly define at what point cognitive warfare constitutes an act of war, which makes accountability even more difficult, media platforms can still strengthen the review of their own reporting under existing regulations, and the public can refrain from spreading suspicious information or joining online pile-ons, helping to establish information verification measures and mechanisms.

 3. Combining Military and Civilian Strength

Incorporate information- and communication-related institutions and industries into the national defense mobilization mechanism; in peacetime, coordinate the review, analysis, and handling of fake news, strengthen talent training and research cooperation, and enhance the capabilities of professional government and military units; in wartime, support overall national actions and carry out countermeasures.

Currently, Taiwan already has the National Security Bureau’s National Security Operations Center, which is responsible for responding to controversial information from hostile foreign forces, as well as the non-profit Taiwan FactCheck Center. Facing the challenge of cognitive warfare, we must continue to integrate efforts across sectors, pursue international intelligence exchange and experience sharing, optimize the media environment, collaborate across multiple channels, and identify the authenticity and source of information in real time, jointly building the capacity to respond to cognitive warfare offensives.

 Conclusion

In reality, every country in the world faces threats from cognitive warfare and information-based psychological warfare. A free and democratic society, however, is by no means an easy breeding ground for cognitive warfare attacks; it must instead be protected through diverse strategies and methods. We hope to establish a more complete and substantive framework, build a strong counterforce, and raise the quality and discernment of our citizens, thereby gaining immunity in the face of the CCP’s cognitive warfare campaign to seize control of our minds.

(The author is a PhD candidate at the Institute of Strategic Studies, Tamkang University)

Modern Mandarin:

在全球持續積極對抗新冠疫情之際,英國《衛報》2021年5月下旬報道,與俄羅斯官員關係密切的公關和營銷機構Fazze被指控向法國、德國和其他歐洲國家頗具影響力的YouTube用戶、博主和其他意見領袖提供資金,用於傳播虛假信息,聲稱輝瑞(BNT)和阿斯特捷利康(AZ)疫苗已導致數百人死亡。這些假訊息也批評歐盟疫苗採購體系損害了歐洲國家的公共衛生,目的是挑起大眾對西方疫苗的不信任,並促使大眾接受俄羅斯的Sputnik V疫苗。這是近代國際史上最顯著的「感知戰」案例。

事實上,人類社會自古以來,均以「不戰而屈人之兵」作為最佳軍事行動指導原則,儘管傳統戰爭主要仍在物理空間進行,需透過實際攻城掠地、消滅敵有生力量,才能獲得勝利。然隨人類對戰爭本質認知深化,利用資訊科技,於不需實體短兵相接的情況下,卻能達到傳統戰爭效果,已成為新型態戰爭趨勢。鑑於「資訊戰」、「混合戰」日益受重視,謹就與其密切相關的「認知作戰」概念進行論述,並探討我國所面臨的新型威脅及全民國防因應策略。

無論是美國所稱的「混合戰」,或俄國所說的「資訊戰」,其實指涉意涵很相似,即以認知領域為核心,利用訊息影響、操控對象目標涵蓋承平時期輿論及戰時決策的認知功能。一戰後,逐漸興起的納粹德國,可謂當代首個擅長運用資訊形塑本國,甚至外國民眾認知的政權,其透過各種傳播技術的政治宣傳與謊言包裝,相當成功;而所謂「重複是一種力量」、「負面訊息總是比正面訊息,更容易讓人接受和印象深刻」等實踐原則,日後更深刻影響專制極權政府與現在的俄羅斯。

藉資訊能力 顛覆國家政權

俄國於進入21世紀初,開始注意國際話語權遭西方國家完全掌控的情形,陸續提出「資訊戰理論」、「第6代戰爭理論」等論述,主張第6代戰爭是以精確武器及資訊戰,縱橫戰場的非接觸式戰爭,戰爭目的不再是毀滅性的全球大戰,而是藉利用敵方弱點的資訊能力,達成傳統戰爭無法實現的效果,包括改變社會文化取向、價值觀,進而顛覆國家政權等。

2005年,俄國成立國際新聞頻道「Russia Today」,起初主要是軟實力宣傳,2008年「喬治亞戰爭」後,轉為攻擊西方社會負面問題與製造陰謀論;2014年「烏克蘭危機」,成為俄軍資訊戰部隊的練兵場,透過電子干擾、網路竊密等手段,截收烏國對外通聯訊息,依此推判烏方後續舉動,並選擇在關鍵時機,釋放對烏國政府不利消息;另選定烏東地區敏感議題,包括俄裔民族地位、經濟不振等,投放大量經篩選的特定資訊,引發民眾共鳴,從而影響烏東人民認知與行為,取得媒體輿論主動權。就「認知作戰」言,俄國作法是成功的,更成為中共的效法對象。

操弄「制腦權」 控制社會大眾

中共2014年於過去心理戰、法律戰、輿論戰等「三戰」基礎,以及俄國「資訊戰」理論架構上,提出「制腦權」認知操作概念,指國家認知空間係由無數個體疊加而成,「制腦」是以民族語言、宣傳媒體、文化產品為武器,全面滲透、控制社會大眾與國家精英之認知、情感與意識,最終扭曲、瓦解、重塑其民族精神、價值觀念、意識形態、歷史文化等,達致不戰而勝的戰略目標。

是以,中共「認知作戰」屬於廣義心理戰範疇,是資訊全球化時代,融合資訊戰、心理戰及輿論戰的戰法,成為中共整體戰略主軸,並自2016年「軍改」後,由新組建的「戰略支援部隊」操盤,在各政略、軍事層次開展執行。一方面,共軍擷取美國在「認知作戰」領域的操作思維,以311基地、國防科技大學、軍事科學院等單位研提「心理作戰」、「思想作戰」、「意識操縱」、「戰略傳播」等戰法,以加強軍民融合及聯戰體系共同建構的「認知作戰」能力;另一方面,則以專業人員操作媒體平臺,形塑輿論環境,將「認知作戰」導入實戰運用階段。

從近年中共對臺進行的「認知作戰」攻勢,可拆解其途徑與手段。首先,中共主要係以網路蒐集國人個資,透過大數據資料庫,劃分政治傾向、年齡、職業等不同目標族群資訊;其次,配合情報偵蒐,針對個別社群媒體展開認知精準打擊,影響目標群眾心理,尤其釋放假訊息,以削弱、分散臺灣社會注意力;再次,則運用網路虛擬組織設置社群媒體假帳號,打入網路族群,偽裝成揭密者、吹哨者,刻意傳散變造資訊,製造混亂,再由網軍大量轉傳、討論,操弄受眾認知,進入阻斷資訊記憶、操縱認知心理、改變思考模式的運作循環。

識別假訊息 全民齊反制

基於現階段,中共對臺「制腦權」作戰,影響臺灣社會認知的目的,在於扭曲輿論、貶低民主價值、激化對立、擾亂政情、減損民眾對政府信任等,於全民國防體系可採取的防制辦法包括:

一、強化教育功能

分別透過全民國防之學校教育、機關教育、社會教育途徑,提高公眾對中共「認知作戰」威脅的認識,與對假訊息識別能力,養成理性、查證、冷靜習慣。

二、遵循約束規範

儘管目前尚無國際通用的法律規則,可明確定義何種程度的認知作戰已構成戰爭行為,更難以究責;然各媒體平臺仍可按既有規範,對自身報導內容加強審查,民眾也可做到不傳播可疑訊息、不跟風網壇混戰,俾利訊息查證措施與機制建立。

三、結合軍民力量

將資訊與傳播相關機構、產業,納入全民防衛動員機制,平時協調因應假訊息審查、分析、處置,加強人才培訓、研究合作,提升政府、國軍專業單位能力;戰時則配合國家整體作為,執行反制任務。

目前我國已有國安局「國家安全作業中心」執行對境外敵對勢力爭議訊息應處有關工作,民間亦有非營利組織成立的「臺灣事實查核中心」。面對「認知作戰」挑戰,仍應持續整合各界力量,爭取國際情報交流與經驗共享,優化媒體環境,多管道合作,即時辨識訊息真偽與來源,共同建設應處「認知作戰」攻勢能量。

結語

事實上,世界各國都同樣面臨「認知作戰」、「資訊心理戰」等相關威脅,然民主自由的社會環境,絕非易受「認知作戰」攻擊的溫床,更需仰賴多元策略與方式守護。期以更完善周全的實質架構,建構強而有力的反制力量,並提升我國公民素質及識別能力,於中共奪取「制腦權」的認知作戰中,獲得免疫。

(作者為淡江大學戰略研究所博士)

Original Chinese military resource: https://www.ydn.com.tw/news/newsInsidePage?chapterID=1431550

China’s Weaponized Communication in International Public Opinion Warfare: Scenarios and Risk Responses

中國在國際公眾輿論戰爭中的武器化傳播:場景和風險回應

Modern English:

【Abstract】 In the international public opinion war, weaponized communication has penetrated the military, economic, diplomatic, and other fields, bringing the idea that “everything can be weaponized” into both imagination and practice. Weaponized communication manipulates public perception through technology, platforms, and policies, reflecting the complex interaction of power distribution and cultural contests. Driven by globalization and digitalization, cognitive manipulation, social fragmentation, emotional polarization, digital surveillance, and information colonization have become new means of influencing national stability. This not only intensifies competition between information-strong and information-weak countries, but also gives information-weak countries the opportunity to turn the tables through flexible strategies and technological innovation. Under the global asymmetric communication landscape, how to find points of convergence and balance between technological innovation and ethical responsibility, and between strategic goals and social balance, will be key factors shaping the future international public opinion landscape.

【Keywords】 Public opinion warfare; weaponized communication; information manipulation; asymmetric communication; information security

If “propaganda is a rational recognition of the modern world” [1], then weaponized communication is a rational application of modern technological means. In the “public opinion war,” each participating actor pursues strategic goals through different communication methods, making its actions appear superficially reasonable while remaining concealed. Unlike traditional military conflicts, modern warfare involves not only physical confrontation but also competition across several fields, including information, economics, psychology, and technology. With the advancement of technology and globalization, the shape of war has changed profoundly, and traditional physical confrontations have gradually given way to multi-dimensional, multi-domain integrated warfare. In this process, weaponized communication, as a modern form of warfare, becomes an invisible means of violence that affects the psychology, emotions, and behavior of the adversary or target audience by controlling, guiding, and manipulating public opinion, thereby achieving political, military, or strategic ends. Clausewitz’s On War holds that war is an act of violence intended to render the enemy unable to resist and to bend the enemy to our will. [2] In modern warfare, achieving this goal relies not only on the confrontation of military forces but also on support from non-traditional fields such as information, networks, and psychological warfare. Sixth-generation warfare heralds a further shift in the shape of warfare, emphasizing the application of emerging technologies such as artificial intelligence, big data, and unmanned systems, as well as comprehensive contests in the fields of information, networks, psychology, and cognition. The “front line” of modern warfare has expanded to include social media, economic sanctions, and cyberattacks, requiring participants to have stronger capabilities in information control and public opinion guidance.

At present, weaponized communication has penetrated the military, economic, diplomatic, and other fields, bringing with it the apprehension that “everything can be weaponized.” In the sociology of war, communication is seen as an extended tool of power, with information warfare penetrating deeply and accompanying traditional warfare. Weaponized communication operates precisely within this framework of information control, shaping public perceptions and emotions to consolidate or weaken the power of states, regimes, or non-state actors. This process occurs not only in wartime but also shapes power relations within and between states outside of war. In international political communication, information manipulation has become a key tool in great power competition, as countries try to influence global public opinion and international decision-making by spreading disinformation and launching cyberattacks. Public opinion warfare is not only a means of information dissemination; it also involves the adjustment of power contests and diplomatic relations between countries, directly affecting the governance structure and power pattern of the international community. On this basis, this paper examines the conceptual evolution of weaponized communication, analyzes the social mentality behind it, elaborates on its specific technical means and the risks they entail, and proposes multidimensional national-level response strategies.

1. From the weaponization of communication to weaponized communication: conceptual evolution and metaphor

Weapons have been symbols and tools of war throughout human history, and war is the most extreme and violent form of conflict in human society. Thus, “weaponized” refers to the use of certain tools for confrontation, manipulation, or destruction in warfare, emphasizing the way in which these tools are used. “Weaponization” (weaponize) can be glossed as “making it possible to use something to attack an individual or group of people.” In 1957, “weaponization” was proposed as a military term, when Wernher von Braun, leader of the V-2 ballistic missile team, stated that his main work was “weaponizing the military’s ballistic missile technology” [3].

“Weaponization” first appeared in the space field during the arms race between the United States and the Soviet Union, as the two powers competed for dominance in outer space. The “weaponization of space” refers to the process of using space for the development, deployment, or use of military weapons systems, including satellites, anti-satellite weapons, and missile defense systems, for strategic, tactical, or defensive operations. From 1959 to 1962, the United States and the Soviet Union proposed a series of initiatives to ban the use of outer space for military purposes, especially the deployment of weapons of mass destruction in outer space orbit. In 2018, then-U.S. President Trump signed Space Policy Directive-3 and launched the creation of the “Space Force,” treating space as a combat domain on the same level as land, air, and sea. In 2019, the “Joint Statement of the People’s Republic of China and the Russian Federation on Strengthening Contemporary Global Strategic Stability” proposed “prohibiting the placement of any type of weapons in outer space” [4].

In addition to weaponization in the space sector, there is also a trend toward weaponization in the military, economic, and diplomatic fields. “Military weaponization” is the use of resources (such as drones or nuclear weapons) for military purposes, the deployment of weapons systems, or the development of military capabilities. During the Russo-Ukrainian War in 2022, a report from the Royal United Services Institute indicated that Ukraine was losing approximately 10,000 drones every month to Russian jamming stations. [5] “Weaponization” also appears frequently in expressions such as “financial war” and “diplomatic battlefield.” In the economic sphere, weaponization usually refers to the use by countries or organizations of shared resources or mechanisms in the global financial system; diplomatic weaponization is manifested in countries pursuing their own interests and exerting pressure on other countries through economic sanctions, diplomatic isolation, and the manipulation of public opinion. Over time, the concept of “weaponization” has gradually expanded into the political, social, cultural, and other fields, especially the information field; since the 2016 United States presidential election, the manipulation of public opinion has become a universal tool in political struggles. David Petraeus, a former director of the CIA, once said at a National Institute for Strategic Studies conference that the time of “the weaponization of everything” has come. [6]

As a metaphor, “weaponization” refers not only to the use of actual physical tools but also symbolizes the turn toward adversarial and aggressive behavior, emphasizing how the concept of “weapons” permeates daily life, cultural production, and political strategy, and showing how social actors use various tools to achieve strategic goals. Nowadays, many areas that should remain neutral, such as the media, the law, and government agencies, are often described as “weaponized” in order to criticize their excessive politicization and improper use, highlighting their illegitimacy and negative influence on society. Through this metaphor, one unconsciously contrasts the current political environment with an idealized and seemingly more moderate past, suggesting that the political climate of the past was more rational and civilized while the present appears too extreme and oppositional. [7] The essence of “weaponization” is therefore a process of political mediation, in which political forces use various means and channels to influence or control areas that should remain neutral, turning them into objectives and tools of political struggle.

In the information field, the weaponization of communication is a long-standing strategic practice. During the First and Second World Wars, propaganda and public opinion warfare were widely used by various countries, and means of communication served as psychological tactics. Weaponized communication is the embodiment of the weaponization of communication in the modern information society. It uses algorithms and big-data analysis to precisely control the speed and scope of information dissemination and thereby steer public opinion and emotions. It reflects the combination of technology, platforms, and strategy, enabling political forces to control the public perception and public opinion environment more accurately and efficiently. As the substance of public opinion, information is “weaponized” and used to influence social cognition and group behavior, and the concept of “war” has changed accordingly: no longer just traditional military confrontation, it now includes psychological warfare and cognitive warfare conducted through information dissemination and public opinion manipulation. This shift has produced a range of new terms such as unrestricted warfare, new-generation warfare, asymmetric warfare, and irregular warfare. Almost all of these terms borrow the word “warfare” to emphasize diverse conflicts in the information field, and information becomes the core content of “weaponization.”

Although some hold that the term “war” does not apply to situations where hostilities have not been formally declared [8], weaponized communication extends the concept of “war” by weakening war’s traditional political attributes and treating overt or covert confrontations of force in various fields, in general terms, as acts of communication. It is worth noting that English usage offers two formulations: “weaponized + noun,” meaning that something has been given a weapon-like function or purpose, and “weaponization of + noun,” referring to the process of turning something into a weapon or giving it the character of a weapon. In academic writing the Chinese translations differ, although “weaponized communication” and “the weaponization of communication” are not yet strictly distinguished. “Weaponized communication” focuses more on the means of communication, or the message itself, being weaponized to achieve a strategic goal, while “the weaponization of communication” emphasizes the process by which communication itself is transformed into a weapon. When discussing specific technical means, most academic papers use “weaponized” or “weaponizing” as a prefix modifying the specific means of dissemination.

This article focuses on specific communication strategies in the international public opinion war and on describing weaponization phenomena that have already occurred, so it uniformly uses the term “weaponized communication”: a strategic form of communication that employs communication means, technical tools, and information platforms to precisely control information flows, public cognition, and emotional responses in order to achieve specific military, political, or social ends. Weaponized communication is not confined to a state of war or wartime; it is a continuous communication phenomenon. It reflects the interaction and contest among various actors and constitutes a flowing space of shared information and meaning.

2. Application scenarios and implementation strategies of weaponized communication

If, at the end of the 1990s, weaponization in the information field was still a marginal topic and countries were mainly pursuing upgrade races in physical weapons such as missiles and drones, then in the 21st century cyber warfare has truly entered the public eye and become deeply embedded in daily life; through social media and smart devices, the public is inevitably drawn into the war of public opinion and unconsciously becomes participants or communication nodes. With the spread of technology, weaponized means have gradually expanded from state-led instruments of war into socialized and politicized domains, and control over individuals and society has shifted from the explicit state apparatus to more covert conceptual manipulation. The exposure of the PRISM program raised strong global concerns about privacy breaches, highlighting states’ potential to use advanced technology for surveillance and control, itself seen as a new type of weaponization. Since Trump was elected President of the United States in 2016, the large-scale use of information weapons such as social bots has become a common phenomenon in global political contests. Information warfare, including electronic warfare, computer network warfare, psychological warfare, and military deception, is widely used to manipulate the flow of information and influence the public opinion landscape. These methods work not only in military wars and political elections but also gradually permeate cultural conflicts, social movements, and transnational contests, extending the traditional logic of information warfare. Today, weaponized communication, as a socio-political tool, profoundly affects the public opinion ecology, international relations, and the daily lives of individuals.

(1) Information manipulation warfare in the military field

Information flow can directly influence the direction of military conflicts, shaping public and military perceptions and decisions, which in turn affects morale, strategic judgment, and social stability. In modern warfare, information is no longer a mere aid, and the field of information has become a central battleground. By manipulating the flow of information, the enemy’s situation assessment may be misled, the will to fight is weakened, and the trust and support of the people are shaken, which in turn affects the decision-making process and continuity of the war.

The Gulf War is regarded as the beginning of modern information warfare. In that war, the United States carried out systematic strikes against Iraq through high-tech means, including electronic warfare, air strikes, and information operations. The U.S. military used satellites and AWACS early-warning aircraft to monitor the battlefield in real time, and induced the Iraqi army to surrender at the psychological level by airdropping leaflets and broadcasting by radio to convey to Iraqi soldiers the superiority of the U.S. military and its preferential treatment policy for those who surrendered. The war marked the central place of information control in military conflicts and demonstrated the potential of information warfare in modern war. In the 21st century, cyber warfare has become an important part of information warfare, involving not only the dissemination and manipulation of information but also control over an enemy’s social functions through attacks on critical infrastructure. In 2007, Estonia suffered a large-scale DDoS (distributed denial-of-service) attack, demonstrating the trend toward fusing information manipulation with cyberattacks. In the 2017 WannaCry ransomware incident, attackers used a Windows vulnerability (EternalBlue) to encrypt the files of approximately 200,000 computers in 150 countries and demand ransom, seriously affecting the British National Health Service (NHS), interrupting emergency services, and paralyzing hospital systems, further revealing the threat cyber warfare poses to critical infrastructure. In addition, in long-term conflicts, control of infrastructure is widely used to undermine adversaries’ strategic capabilities in the competition for public information space, because it directly determines the speed, scope, and direction of information dissemination. Israel has effectively weakened Palestinian communications capabilities by restricting the use of radio spectrum, controlling Internet bandwidth, and disrupting communications facilities. At the same time, Israel restricts the development of the Palestinian telecommunications market through economic sanctions and legal frameworks, suppressing Palestinian competitiveness in the flow of information and consolidating its own strategic advantage in the conflict [9] in order to maintain an unequal flow of information.

Social media provides an immediate and extensive channel for information manipulation, allowing it to cross borders and influence global public sentiment and political situations, and shifting the focus of war from mere physical destruction to the manipulation of public opinion. During the Russo-Ukrainian War, deepfake technology was used as a visual weapon that significantly interfered with public perception and opinion about the war. On March 15, 2022, a fake video of Ukrainian President Volodymyr Zelenskyy circulated on Twitter in which he appeared to call on Ukrainian soldiers to lay down their weapons, briefly triggering public confusion. Similarly, fake videos of Russian President Vladimir Putin have been used to confuse the public. Although the platform promptly applied “Stay informed” labels to the videos, they still caused obvious interference with public emotions and perceptions for a short period. These events highlight the critical role of social media in modern information warfare, in which state and non-state actors can interfere in military conflicts through disinformation, emotional manipulation, and other means.

The complexity of information manipulation warfare is also reflected in its dual nature as both a tool of attack and a means of defense. In the military sphere, states ensure national security, protect critical infrastructure, maintain military secrets, and in some cases influence an adversary’s combat effectiveness and decision-making by defending against and countering cyberattacks. In 2015 and 2017, Russian hackers launched large-scale cyberattacks against Ukraine (such as BlackEnergy and NotPetya). Ukraine successfully resisted some attacks and took countermeasures by rapidly upgrading its cyber defense systems, avoiding larger-scale infrastructure paralysis. In addition, units such as the NATO Strategic Communications Centre of Excellence and the British Army’s 77th Brigade focus on shaping public opinion in peacetime [10], using strategic communications, psychological warfare, and social media monitoring to expand strategic control of the information field and strengthen defensive and opinion-shaping capabilities, further raising the strategic importance of information warfare.

Today, information manipulation warfare is a key link in modern military conflict. Through the deep integration of information technology and psychological manipulation, it not only changes the rules of traditional warfare but also profoundly affects public perception and the global security landscape. By taking control of critical infrastructure and social media platforms, and by restricting the flow of information and manipulating communication paths, countries, multinational corporations, and other actors can gain strategic advantages in the global information ecosystem.

(2) Public opinion intervention in political elections

Political elections are the most direct arena of competition for power in democratic politics, and the dissemination of information has an important influence on voter decision-making during them. Through computational propaganda and other means, external forces or political groups can manipulate voter sentiment and mislead the public, thereby influencing election results, destabilizing politics, or weakening the democratic process; elections are thus the most effective application scenario for weaponized communication.

In recent years, global political elections have shown a trend toward polarization, with large ideological differences between groups of different political affiliations. Polarization leads the public to selectively accept information consistent with its own views while excluding other information, and this “echo chamber effect” intensifies one-sided perceptions, leaving greater scope for public opinion intervention. The rise of information dissemination technology, especially computational propaganda, has enabled external forces to manipulate public opinion and influence voter decisions more precisely. Computational propaganda refers to the use of computing technology, algorithms, and automated systems to control the flow of information in order to disseminate political messages, interfere with election results, and influence public opinion. Its core characteristics are algorithm-driven precision and the scale of automated communication; by breaking through the limitations of traditional manual communication, it has significantly enhanced the effect of public opinion manipulation. In the 2016 U.S. presidential election, the Trump team analyzed Facebook user data through Cambridge Analytica and pushed customized political advertisements to voters, precisely influencing their voting intentions [11]. This incident was seen as a classic case of computational propaganda interfering in elections and provided an operational template for other politicians, driving the widespread use of computational propaganda worldwide. In the 2017 French presidential election, candidate Emmanuel Macron’s team was hacked, and internal emails were stolen and made public alongside claims that Macron held secret overseas accounts and was involved in tax evasion, in an attempt to discredit him. During the 2018 Brazilian presidential election, the team of candidate Jair Bolsonaro used WhatsApp groups to spread inflammatory political content, pushing large volumes of targeted images, videos, and inflammatory messages to influence voter sentiment. According to statistics, the number of countries using computational propaganda worldwide increased from 28 in 2017 to 70 in 2019, and in 2020 the number rose to 81. This suggests that computational propaganda is redefining the rules of public opinion in global elections through technical means and communication strategies.

Computational propaganda is also an important tool for state actors in wars of public opinion intervention. In 2011, U.S. Central Command launched Operation Earnest Voice in the Middle East, distorting conversations on Arabic-language social media by establishing and managing multiple false identities (sockpuppets). Russia also frequently uses computational propaganda to intervene, operating about 200,000 social media accounts in Canada and using far-right and far-left movements to spread pro-Russian rhetoric, create false social hot spots, and try to undermine Canada’s support for Ukraine [12]. As an important component of computational propaganda, social bots manufacture public opinion buzz through automation and scale, increase the exposure of information on social platforms through specific hashtags, and control the priority of issues. During the 2016 U.S. election, Russia used social bots to post content supporting Putin and attacking the opposition, drowning out opposing voices through information overload and reinforcing a pro-Putin opinion climate. [13] During the 2017 Gulf crisis, Saudi Arabia and Egypt used Twitter bots to push the anti-Qatar hashtag #AlJazeeraInsultsKingSalman, making it a trending topic and fabricating a peak of anti-Qatar sentiment, which in turn affected global public opinion toward Qatar. [14] Deepfake technology further improves the precision and concealment of computational propaganda. In 2024, a fake video of U.S. President Joe Biden went viral on X (formerly Twitter), showing him using offensive language in the Oval Office, sparking public controversy and influencing voter sentiment. According to a survey by the cybersecurity firm McAfee, 63% of respondents had watched a political deepfake video within the previous two months, and nearly half said the content influenced their voting decisions. [15]

Globally, computational propaganda has infiltrated public opinion wars in many countries, affecting social stability and national security. The Israel Defense Forces have waged a public opinion war against Palestine with digital weapons, Turkey has cultivated a “patriotic troll army” to manipulate public opinion at home and abroad, and the Mexican government has used botnets to influence public opinion. As an important means of modern public opinion intervention warfare, computational propaganda is changing the landscape of global political communication. With the development of technologies such as artificial intelligence and quantum computing, computational propaganda may interfere with electoral processes through even more covert and efficient means, or even directly threaten the core operating logic of democratic institutions.

(3) Symbolic identity war in the cultural field

Weaponized communication attempts to influence the public’s thoughts, emotions, and behaviors by manipulating information, symbols, and values, which in turn shapes or changes society’s collective cognition and cultural identity. This mode of communication consists not only in the transmission of information, but also in promoting the transmission and identification of a specific ideological or political idea through a specific narrative framework, cultural symbols and emotional resonance. Through the manipulation of cultural symbols, social emotions and collective memory, weaponized communication interferes with social structure and cultural identity in the cultural field, becoming a core means of symbolic identity warfare.

Memes, as cultural symbols combining visual elements with concise text, stimulate the audience’s emotional response in humorous, satirical, or provocative ways, affecting political attitudes and behavior. Pepe the Frog began as a harmless comic character that was repurposed and weaponized by far-right groups to spread hate speech, gradually evolving into a racist and anti-immigrant symbol. Memes transform complex political sentiments into easy-to-spread visual symbols that quickly stir up public distrust and anger over policy, a process described as “iconoclastic weaponization.” By manipulating cultural symbols for the purposes of political or social struggle [16], this process aggravates social and political division among the public. For example, during Brexit, memes bearing the words “Take Back Control” spread rapidly, reinforcing nationalist sentiments.

Beyond the manufacture of cultural symbols, the screening and blocking of symbols can equally shape or deepen a particular cultural identity or political stance. Censorship has been an important means for power to control information since ancient times; as early as the ancient Greek and Roman periods, governments censored public speeches and literary works to maintain social order and the stability of power. In the digital age, the rise of the Internet and social media has driven the modernization of censorship, and platform censorship has gradually replaced traditional methods as a core tool of contemporary information control and opinion guidance. Algorithmic review uses artificial intelligence to detect sensitive topics, keywords, and user behavior data and automatically deletes or blocks content deemed to be in “violation,” while social media review teams manually screen user-generated content to ensure compliance with platform policies and with laws and regulations. The role of platform censorship is not only to limit the dissemination of certain content but also to guide public opinion and shape the public’s cognitive frame through promotion, deletion, and blocking. Although mainstream social platforms control the spread of information through strict content moderation mechanisms, some fringe platforms such as Gab, Gettr, and BitChute have become hotbeds of extreme speech and malicious information owing to the lack of effective moderation. These platforms place insufficient restrictions on content publishing, allowing extreme views and disinformation to spread unchecked. Gab, for example, has been repeatedly criticized for its extremist content and accused of promoting violence and hatred. In the “echo chamber,” users encounter only information consistent with their own views; this information environment further strengthens extreme ideas and increases antagonism among social groups. [17]

Language, as a carrier and tool of information dissemination, can profoundly influence group behavior and cultural identity through emotional manipulation, symbolic politics, and social mobilization. The weaponization of language concerns how linguistic forms and cultural contexts affect the way information is received, emphasizing how language can be used to manipulate, guide, or change people’s cognition and behavior. This involves not only specific lexical and rhetorical devices but also the construction of particular social meanings and cultural frameworks through linguistic representation. As another important tool of symbolic identity warfare, language shapes narrative frameworks of antagonism between “us” and “the enemy.” The Great Translation Movement spread the nationalist rhetoric of Chinese netizens to international social media platforms through selective translation, triggering negative perceptions of China. This language manipulation amplifies controversial content through emotional expression and deepens the cultural bias of the international community.

The deep logic of the weaponization of language lies in emotional and inflammatory forms of language. Western countries often justify acts of intervention with labels of justice such as “human rights” and “democracy,” legitimizing political or military action. White supremacists reshape their ideology using vague labels such as “alt-right,” transforming the strongly negative traditional term “white supremacist” into a more neutral-sounding concept, reducing the vocabulary’s social resistance and broadening the base of supporters under a broad “umbrella” identity. Through the infiltration of everyday discourse, hate politics and extreme speech are legitimized, gradually creating a political normality; language is truly weaponized once the public has routinized this politics. [18] In Nigeria, hate-mongering content spreads through racial, religious, and regional topics, profoundly degrading social relations. [19] Linguistic ambiguity and plausible-deniability strategies have also become powerful tools for communicators to evade responsibility and to spread simplified narratives of complex social and political issues. Through negative labeling and emotional discourse, Trump’s “America First” policy deliberately advanced views opposed to mainstream opinion, opposing globalization, questioning climate change science, and criticizing traditional allies, stimulating public distrust of globalization and reshaping a cultural identity of national interests first. [20]

3. Risks and challenges of weaponized communication: legitimacy and destructiveness

Although weaponized communication poses great risks to the international public opinion landscape, in specific situations certain countries or groups may confer a degree of legitimacy on it through legal, political, or moral frameworks. For example, after the 9/11 attacks, the United States passed the Patriot Act to expand the surveillance authority of intelligence agencies and implemented extensive information control in the name of “counter-terrorism.” This “legitimacy” is often criticized for undermining civil liberties and eroding the core values of democratic society.

In international political contests, weaponized communication is more often seen as a “gray zone” tactic. Confrontations between countries are no longer limited to economic sanctions or diplomatic pressure but are waged through non-traditional means such as information manipulation and social media intervention. Some states use “the protection of national interests” as a pretext to disseminate false information, arguing that their actions are compliant; although such conduct may be controversial under international law, it is often justified as a necessary means of “countering external threats.” In some countries where the regulation of information lacks a strict legal framework, interference in elections is often tolerated or even seen as a “justified” political exercise. At the cultural level, certain countries attempt to shape their global cultural influence by disseminating specific cultural symbols and ideologies. Western countries often promote the spread of their values in the name of “cultural sharing” and “dialogue among civilizations,” but in practice they weaken other cultures’ sense of identity by manipulating cultural symbols and narrative frameworks, leading to an imbalance in the global cultural ecology. The legal framework also lends a degree of support to the justification of weaponized communication. In the name of “counter-terrorism” and “combating extremism,” some countries restrict the dissemination of so-called “harmful information” through information censorship, content filtering, and other means. This kind of justification, however, often pushes at moral boundaries, leading to information blockades and the suppression of speech. Information governance on the grounds of “national security,” although accorded a degree of domestic recognition, provides space for the proliferation of weaponized communication.

Compared with its claims to legitimacy, the destructiveness of weaponized communication is especially pronounced. Weaponized communication has become an important tool for power structures to manipulate public opinion: it not only distorts the content of information but also profoundly affects public perception, social emotions, and international relations through privacy violations, emotional mobilization, and cultural penetration.

(1) Information distortion and cognitive manipulation

Information distortion means that information is deliberately or unintentionally distorted during dissemination, so that what the public receives differs significantly from the original information. On social media, disinformation and misleading content spread rampantly, and content generated by artificial intelligence models (such as GPT) may carry the biases of its training data: gender, racial, or social bias can surface in automatically generated text, amplifying the risk of information distortion. The fast-spreading nature of social media also makes it difficult for traditional fact-checking mechanisms to keep pace with disinformation. Disinformation often dominates public opinion within a short period, and cross-platform dissemination and anonymity complicate clarification and correction. These asymmetries in communication undermine the authority of traditional news organizations, and the public’s preference for instantly updated social platform information over in-depth coverage by traditional news organizations further diminishes the role of news organizations in resisting disinformation.

Beyond distorting information itself, weaponized communication makes deep use of the psychological mechanism of cognitive dissonance. Cognitive dissonance refers to the psychological discomfort that arises when an individual is exposed to information that conflicts with pre-existing beliefs or attitudes. By creating cognitive dissonance, communicators shake the established attitudes of their target audience and may even induce them to accept new ideologies. In political elections, the targeted dissemination of negative information often forces voters to re-examine their political positions or even change their voting intentions. Weaponized communication further intensifies the formation of “information cocoons” through selective exposure, leading audiences to seek information consistent with their own beliefs while ignoring or rejecting opposing views. This not only reinforces individuals’ cognitive biases but also allows disinformation to spread rapidly within the group, where it is difficult for external facts and rational voices to break through, ultimately forming a highly homogeneous public opinion ecology.

(2) Privacy leakage and digital surveillance

In recent years, the abuse of deepfakes has exacerbated privacy violations. In 2019, the face-swapping app “ZAO” was taken down because its user agreement claimed consent to portrait rights by default, exposing the risk of over-collection of biometric data. Photos uploaded by users and processed through deep learning can generate convincing face-swapped videos, and can also become a source of privacy leaks. Worse, deepfake techniques are abused for gender-based violence: the faces of numerous European and American actresses have been illegally grafted onto fake sexual videos that circulate widely, and although platforms remove this content in some cases, the popularity of open-source programs makes it easy for malicious users to copy and share forged content. In addition, when using social media, users tend to grant platforms default access to their devices’ photos, cameras, microphones, and other app permissions. Through these permissions, platforms not only collect large amounts of personal data but also use algorithms to analyze users’ behavioral characteristics, interests, and social relationships, allowing them to deliver targeted ads, recommend content, and even carry out information manipulation. This large-scale data collection has driven global discussion of privacy protection. In Europe, the General Data Protection Regulation attempts to strengthen individuals’ right to privacy through strict rules on data collection and use. However, through “implied consent” or complex user agreements, platforms often circumvent regulation, leaving data processing opaque and making it difficult for ordinary users to understand what their data is actually used for. Section 230 of the U.S. Communications Decency Act provides that online platforms are not legally responsible for user-generated content, a provision that has fueled the development of platform content moderation but has also left platforms with little incentive to respond to privacy infringements. Motivated by commercial interests, platforms often lag in dealing with disinformation and privacy issues, leading to the continual shelving of review responsibilities.

In terms of digital surveillance, social platforms work with governments to make user data a core resource of “surveillance capitalism.” The National Security Agency (NSA) conducts mass surveillance through phone records, Internet communications, and social media data, and works with large enterprises such as Google and Facebook to obtain users’ online behavioral data for intelligence gathering and behavioral analysis worldwide. The abuse of transnational surveillance technologies pushes privacy violations to the international level. Pegasus spyware, developed by the Israeli cyber company NSO Group, compromises target devices through “zero-click attacks” and can steal private information and communication records in real time. In the 2018 case of the murder of the Saudi journalist Jamal Khashoggi, the Saudi government monitored his communications through Pegasus, revealing the profound threat this technology poses to individual privacy and international politics.

(3) Emotional polarization and social division

Emotions play a key role in shaping individual cognition and decision-making. Weaponized communication undermines rational judgment by inciting fear, anger, sympathy, and other feelings, pushing the public toward emotion-driven, irrational reactions. War, violence, and nationalism often become the main content of emotional mobilization: through carefully designed topics, communicators embed elements such as patriotism and religious belief in information dissemination, quickly arousing emotional resonance among the public. The widespread adoption of digital technologies, particularly the combination of artificial intelligence and social media platforms, further amplifies the risk of emotional polarization. The rapid spread of disinformation and extreme speech on platforms comes not only from the sharing behavior of ordinary users but is also driven by algorithms. Platforms tend to prioritize emotional, highly interactive content, which often contains inflammatory language and extreme views, thereby exacerbating the spread of hate speech and extremism.

Social media hashtags and algorithmic recommendations play a key role in emotional polarization. After the Charlie Hebdo attack, the #StopIslam hashtag became a vehicle for hate speech, which users employed to post hateful and violence-tinged messages. During the 2020 U.S. presidential election, extreme political rhetoric and misinformation on social platforms were likewise amplified in a bitter partisan struggle. Through precise emotional manipulation, weaponized communication not only tears apart public dialogue but also greatly affects society’s democratic processes. Another notable extremist mobilization tactic is “weaponized autism,” in which far-right groups exploit the technical expertise of autistic individuals to carry out emotional manipulation. These groups recruit technically capable but socially isolated individuals and, by giving them a false sense of belonging, turn them into enforcers of information warfare. Guided by extremist groups, such individuals are used to spread hate speech, carry out cyberattacks, and promote extremism. This phenomenon reveals not only the deep mechanisms of emotional manipulation but also how technology can be exploited by extremist groups to serve larger political and social agendas. [21]

(4) Information colonization and cultural penetration

The theory of "Weaponized Interdependence" reveals how states use key nodes in political, economic, and information networks to exert pressure on other states.[22] In the information domain in particular, developed countries implement "information colonization" by controlling information flows, further consolidating their cultural and political advantages. Digital platforms are the vehicles of this colonization: the countries of the Global South are highly dependent on Western-dominated technology platforms and social networks for information dissemination, and in sub-Saharan Africa Facebook has become synonymous with "the Internet". This dependence not only generates huge advertising revenues for Western businesses but also, through algorithmic recommendation, profoundly affects indigenous African cultures and values, especially regarding gender, family, and religious belief, making cultural penetration the norm.

Digital inequality is another manifestation of information colonization. The dominance of developed countries in digital technology and information resources has increasingly marginalized countries of the Global South in the economic, educational, and cultural fields. Because of inadequate infrastructure and technological blockade, Palestine has been unable to integrate effectively into the global digital economy, which both limits local economic development and further weakens its voice in global information dissemination. Through technological blockades and economic sanctions, the world's major economies and information powers restrict other countries' access to key technological and innovation resources; this not only hinders scientific and technological development in the target countries but also deepens the fragmentation of the global technology and innovation ecosystem. Since the United States withdrew from the Iran nuclear deal in 2018, its economic sanctions have blocked Iran's development in the semiconductor and 5G sectors, and this asymmetry in technology and innovation has widened the gap in the global technology ecosystem, leaving many countries at a disadvantage in information competition.

IV. Reflection and discussion: the battle for the right to speak in the asymmetric communication landscape

In the competitive landscape of "Asymmetric Communication", strong parties often dominate public opinion through channels such as mainstream media and international news organizations, while weaker parties must use innovative communication technologies and means to offset their disadvantages and compete for the right to speak. At the heart of this communication landscape lies Information Geopolitics: the contest of power between states depends not only on geographical position, military strength, or economic resources, but also on control over information, data, and technology. The game between great powers is no longer limited to control of physical space; it extends to competition over the space of public opinion. These "information landscapes" involve the right to speak, information circulation, and media influence within the global communication ecosystem. In this process, states continuously manufacture such landscapes to influence international public opinion and shape global cognitive frames, thereby achieving their strategic goals. The strategy of asymmetric communication concerns not only the transmission of content but, more importantly, how to bridge gaps in resources and capabilities with the help of various communication technologies, platforms, and means. The core of information dissemination is no longer limited to the content itself; it unfolds around the contest for the right to speak. With the rise of information warfare and cognitive warfare, whoever controls information gains a head start in global competition.

(1) Technological catch-up under latecomer advantage

Traditional great powers and other strong communicators dominate global public opinion; by contrast, weaker countries often lack the communication channels to compete with them. The theory of latecomer advantage holds that latecomer countries can rise rapidly by leapfrogging traditional technological paths, importing existing advanced technologies and knowledge, and thereby skipping the inefficient, outdated stages of early technological innovation. In the context of weaponized communication, this theory offers information-weak countries a path to break through the communication barriers of large countries by means of emerging technologies, helping them catch up at the technical level. Traditional media are often constrained by resources, influence, and censorship mechanisms: information spreads slowly, coverage is limited, and they are vulnerable to manipulation by specific countries or groups. The rise of digital media has fundamentally changed the landscape of information dissemination, enabling disadvantaged countries to address international audiences directly through globalized Internet platforms, without relying on traditional news organizations and mainstream media. Through emerging technologies, disadvantaged countries can not only transmit information more precisely but also rapidly expand their influence in international public opinion through targeted communication and emotional guidance. Later-developing countries can use advanced technologies (such as big data, artificial intelligence, and 5G networks) to achieve precise information dissemination and build efficient communication channels. Taking "big data analysis" as an example, latecomer countries can gain an in-depth understanding of audience needs and public opinion trends, quickly identify the pulse of global opinion, implement targeted communication, and rapidly expand their international influence. AI technology can not only predict the direction of public opinion but also optimize communication strategies in real time. The spread of 5G networks has greatly improved the speed and coverage of information dissemination, allowing latecomer countries to break through the limitations of traditional communication models in a low-cost, efficient manner and form distinctive communication advantages.
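As a rough illustration of the "big data analysis" step mentioned above, the minimal sketch below (hypothetical data and keyword list, not a real monitoring system) counts topic mentions per day as a crude proxy for public-opinion trends.

```python
from collections import Counter
from datetime import date

# Hypothetical stream of (day, text) pairs collected from public posts.
posts = [
    (date(2024, 5, 1), "vaccine rollout praised by local media"),
    (date(2024, 5, 1), "anger over new tariffs grows"),
    (date(2024, 5, 2), "tariffs spark protests and more anger"),
]

TOPIC_KEYWORDS = {"vaccine", "tariffs", "protests"}  # placeholder topic list

def daily_topic_counts(stream):
    """Count topic-keyword mentions per day - a crude proxy for attention trends."""
    counts = {}
    for day, text in stream:
        tokens = {t.strip(".,").lower() for t in text.split()}
        for topic in TOPIC_KEYWORDS & tokens:
            counts.setdefault(day, Counter())[topic] += 1
    return counts

for day, topics in sorted(daily_topic_counts(posts).items()):
    print(day, topics.most_common())
```

A production system would of course use large-scale collection, language models, and sentiment scoring rather than keyword counts; the sketch only shows where trend signals come from in principle.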

Through transnational cooperation, late-developing countries can pool more communication resources and expand the breadth and depth of their reach. For example, Argentina and other Latin American countries have established a "Latin American News Network" that shares news content, pushing the region to speak with one voice in international public opinion and counter the single narrative of Western media. In Africa, South Africa has partnered with Huawei on the "Smart South Africa" project to build modern information infrastructure and promote digital transformation and more efficient public services. Governments of late-developing countries should invest more in technological research, development, and innovation, and encourage the growth of local enterprises and talent. At the same time, they should pay attention to cultural export and the construction of a media industry, enhancing the country's voice in the international information space through globalized cooperation and decentralized communication models. Governments can fund digital cultural creation, support the growth of local social media platforms, and integrate more communication resources through international cooperation frameworks.

(2) Construction of barriers in information countermeasures

Unlike military action, which may trigger full-scale conflict, or economic sanctions, which carry their own risks, weaponized communication can achieve strategic objectives without provoking all-out war, making it extremely attractive on cost and strategic grounds. Because weaponized communication is low-cost and high-return, an increasing number of state and non-state actors have chosen to manipulate information to reach strategic objectives. As this means of dissemination spreads, countries face ever more complex and changeable threats from information attacks originating both outside and inside their borders. As information warfare intensifies, traditional military defense alone can no longer meet the needs of modern warfare; instead, building a robust information defense system becomes a key strategy for maintaining political stability, safeguarding social identity, and enhancing international competitiveness. How to deal effectively with external information interference and public opinion manipulation, and how to mount information counter-measures, has therefore become an urgent issue for every country. A sound cybersecurity infrastructure is key to national security, preventing sensitive information from being manipulated or tampered with from outside. The European Union, for example, uses its "Digital Single Market" strategy to push member states to strengthen cybersecurity and requires Internet companies to deal more actively with disinformation and external interference. The EU's cybersecurity directives also require member states to establish emergency response mechanisms to protect critical information infrastructure from cyberattacks. In addition, the EU has established cooperation with social platform companies such as Facebook, Twitter, and Google to combat the spread of fake news by providing anti-disinformation tools and data analysis technologies. Artificial intelligence, big data, and automation technologies are becoming important tools of information defense, used to monitor information propagation paths in real time, identify potential disinformation, and resist public opinion manipulation. In the cybersecurity field, big data analysis helps decision-makers identify and warn of malicious attacks and optimize countermeasures. Applying these technologies not only enhances information defense capabilities domestically but also strengthens a country's initiative and competitiveness in the international information space.
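To illustrate the kind of automated monitoring described above, the sketch below (assumed data and thresholds; not the EU's or any agency's actual tooling) flags messages reposted verbatim by many distinct accounts within a short window, a simple heuristic for coordinated amplification.

```python
from collections import defaultdict

# Hypothetical share events: (timestamp_seconds, account_id, message_text).
events = [
    (0,   "acct_1", "Breaking: vaccine causes hundreds of deaths"),
    (5,   "acct_2", "Breaking: vaccine causes hundreds of deaths"),
    (9,   "acct_3", "Breaking: vaccine causes hundreds of deaths"),
    (400, "acct_4", "Local council approves new park"),
]

def flag_coordinated_bursts(events, window_s=60, min_accounts=3):
    """Flag texts reposted verbatim by many distinct accounts inside a short window."""
    by_message = defaultdict(list)
    for ts, acct, text in events:
        by_message[text].append((ts, acct))
    flagged = []
    for text, hits in by_message.items():
        hits.sort()
        accounts = {acct for _, acct in hits}
        span = hits[-1][0] - hits[0][0]
        if len(accounts) >= min_accounts and span <= window_s:
            flagged.append((text, len(accounts), span))
    return flagged

for text, n, span in flag_coordinated_bursts(events):
    print(f"possible coordinated burst: {n} accounts within {span}s -> {text!r}")
```

Real detection pipelines combine many such signals (near-duplicate text, account age, posting cadence, network structure); this single heuristic only shows the basic idea of propagation-path monitoring.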

Counter-mechanisms are another important component of the information defense system. Especially under pressure from international public opinion, real-time monitoring of external information flows and timely correction of disinformation become key to safeguarding the initiative in public opinion. Since the 2014 Crimean crisis, Ukraine has built a sizeable cyber defense system through cooperation with NATO and the United States. Ukraine's National Cyber Security Service has set up "information countermeasures teams" to counter cyber threats, using social media and news-release platforms to refute false Russian reports in real time, a tactic that has significantly boosted Ukraine's reputation and trustworthiness in international public opinion.

(3) Agenda setting in public opinion guidance

In the global competitive landscape of informatization and digitalization, public opinion guidance involves not only the content of information dissemination but, more importantly, how to set the agenda and focus attention on hot topics of global concern. Agenda-setting theory suggests that whoever controls the topics in circulation can steer the direction of public opinion. Agenda setting influences public attention to and evaluation of events by controlling the scope and focus of discussion, and the rise of social media gives information-disadvantaged countries a breakthrough point for contesting dominance in information dissemination through multi-platform linkage. Ukraine, for example, has used social media during the Russo-Ukrainian War to broadcast the situation on the ground, publishing not only battlefield updates but also the emotional appeals of its people, using tragic narratives of civilian suffering and urban destruction to elicit sympathy and attention from the international community. While resisting external information interference, a state also needs to proactively disseminate positive narratives and tell cultural stories that resonate with the international community. Such stories should match the emotional needs of international public opinion while showcasing the country's uniqueness and strengthening its links with the international community. Take China's Belt and Road co-construction as an example: in partner countries, China has invested in and built a large number of infrastructure projects. These projects have not only helped improve local economic conditions but also demonstrated China's sense of responsibility in the globalization process; they have further provided a window for cultural cooperation and exchange, showing the world the rich history and culture of the Chinese nation and presenting the inclusiveness and sense of responsibility of Chinese culture to the international community.

However, because countries of the Global South often face constraints in resources, technology, and international communication platforms, and find it difficult to compete directly with developed countries, they rely on more flexible and innovative means of communication to participate in setting the global agenda. Brazil, for example, faces negative public opinion pressure from Western media on environmental protection and climate change issues, especially deforestation of the Amazon. In response, the Brazilian government has used social media to publish up-to-date data and success stories about Amazon protection, actively shaping the country's image in environmental protection. At the same time, Brazil has strengthened its voice on climate issues by working with other developing countries in global climate change negotiations and promoting South-South cooperation. Large international events, humanitarian activities, and the production of cultural products are also effective ways of telling national stories. International sporting events such as the World Cup and the Olympic Games are not only platforms for athletic competition but also showcases of national image and cultural soft power. By hosting or actively participating in such global events, a country can display its strength, values, and cultural appeal to the world and promote a positive public opinion agenda.

"War is nothing more than the continuation of politics by other means."[23] This classic Clausewitzian assertion takes on a modern interpretation in the context of weaponized communication. Weaponized communication breaks through the physical boundaries of traditional warfare and has become a modern strategic instrument that integrates information warfare, cognitive warfare, and psychological warfare. It manipulates the flow of information and public perception in non-violent form, allowing state and non-state actors to achieve political goals without relying on direct military action, and reflecting a highly strategic and targeted character. By manipulating information, emotions, and values, weaponized communication can achieve strategic goals while avoiding all-out war; in global competition and conflict, it has become an important means by which powerful countries politically suppress weaker ones.

The core of weaponized communication lies in weakening an adversary's decision-making and operational capabilities through information manipulation, but its complexity makes communication effects difficult to predict fully. Although information-powerful countries suppress information-weak countries through technological advantages and communication channels, the effects of such communication remain uncertain. Especially amid the globalization of social media and digital platforms, the boundaries and effects of information flows are increasingly difficult to control. This complexity gives weaker countries an opportunity to break through discourse hegemony and turn information dissemination into a reverse game. Weak countries can use these platforms to mount confrontations, challenge the information manipulation of powerful countries, and take their place in global public opinion. This asymmetric game reflects a dynamic balance in international public opinion: communication is no longer one-way control but increasingly complex interaction and dialogue, giving the weak a real possibility of influencing opinion. The current international public opinion landscape is still dominated by one-way suppression of information-weak countries by information-powerful countries, but that situation is not unbreakable. Information warfare is highly asymmetric, and information-weak countries can counter it step by step through technological innovation, flexible strategies, and transnational cooperation. By exploiting "asymmetric advantages", weak countries can not only influence global public opinion but also enhance their voice through joint action and information sharing. Transnational cooperation and regional alliances give weak countries a powerful tool against the strong, enabling them to form a joint force in international public opinion and challenge the dominance of the information powers. Within the "war framework", countries can flexibly adjust their strategies and proactively shape the pattern of information dissemination rather than passively accepting information manipulation by powerful countries.

The sociology of war emphasizes the role of social structure, cultural identity, and group behavior in warfare. Weaponized communication is not only a continuation of military or political behavior; it also profoundly affects social psychology, group emotions, and cultural identity. Powerful countries use information dissemination to shape other countries' perceptions and attitudes in pursuit of their own strategic goals. From a sociological perspective, however, weaponized communication is not one-way suppression but the product of complex social interactions and cultural responses. In this process, information-weak countries are not entirely powerless; on the contrary, through cultural communication, social mobilization, and dynamic contestation of global public opinion, they can counter external manipulation with "soft power", shaping new collective identities and demonstrating the legitimacy of the "weapons of the weak".

(Funding: This article is a research result of the National Social Science Fund major special project for studying and interpreting the spirit of the Third Plenary Session of the 20th Central Committee of the Communist Party of China, "Research on Promoting the Integrated Management of News Publicity and Online Public Opinion" (Project No. 24ZDA084).)

現代國語:

作者:郭小安 康如诗

发布时间:2025-05-06

【摘要】在國際輿論戰中,武器化傳播已滲透軍事、經濟、外交等領域,帶來“一切皆可武器化”的想像與實踐。武器化傳播通過技術、平台和政策操控公眾認知,體現了權力分配與文化博弈的複雜互動。在全球化和數字化的推動下,認知操控、社會分裂、情感極化、數字監控、信息殖民已成為影響國家穩定的新型手段,這不僅加劇了信息強國與弱國間的競爭,也為信息弱國提供了通過靈活策略和技術創新實現逆轉的機會。在全球非對稱傳播格局下,如何在技術創新與倫理責任、戰略目標與社會平衡間找到契合點和平衡點,將是影響未來國際輿論格局的關鍵要素。

【關鍵詞】輿論戰;武器化傳播;信息操縱;非對稱傳播;信息安全

如果說“宣傳是對現代世界的理性認可”[1],那麼武器化傳播則是對現代技術手段的理性應用。在輿論戰中,各參與主體通過不同傳播手段實現戰略目標,做到表面合理且隱蔽。與傳統軍事衝突不同,現代戰爭不僅涉及物理對抗,還涵蓋信息、經濟、心理及技術等多個領域的競爭。隨著技術進步和全球化的推動,戰爭形態發生深刻變化,傳統的物理對抗逐漸轉向多維度、多領域的綜合作戰。在這一過程中,武器化傳播作為一種現代戰爭形式,成為通過控制、引導和操縱輿論,影響敵對方或目標受眾的心理、情感與行為,進而實現政治、軍事或戰略目的的隱形暴力手段。 《戰爭論》認為,戰爭是讓敵人無力抵抗,且屈從於我們意志的一種暴力行為。 [2]在現代戰爭中,這一目標的實現不僅依賴於軍事力量的對抗,更需要信息、網絡與心理戰等非傳統領域的支持。第六代戰爭(Sixth Generation Warfare)預示戰爭形態的進一步轉變,強調人工智能、大數據、無人系統等新興技術的應用,以及信息、網絡、心理和認知領域的全面博弈。現代戰爭的“前線”已擴展到社交媒體、經濟制裁和網絡攻擊等層面,要求參與者俱備更強的信息控制與輿論引導能力。

當前,武器化傳播已滲透到軍事、經濟、外交等領域,帶來“一切皆可武器化”的憂慮。在戰爭社會學中,傳播被視為權力的延伸工具,信息戰爭深刻滲透並伴隨傳統戰爭。武器化傳播正是在信息控制的框架下,通過塑造公眾認知與情感,鞏固或削弱國家、政權或非國家行為者的權力。這一過程不僅發生在戰時,也在非戰斗狀態下影響著國家內外的權力關係。在國際政治傳播中,信息操控已成為大國博弈的關鍵工具,各國通過傳播虛假信息、發動網絡攻擊等手段,試圖影響全球輿論和國際決策。輿論戰不僅是信息傳播的手段,更涉及國家間權力博弈與外交關係的調整,直接影響國際社會的治理結構與權力格局。基於此,本文將深入探討武器化傳播的概念流變,分析其背後的社會心態,闡述具體的技術手段及所帶來的風險,並從國家層面提出多維應對策略。

一、從傳播武器化到武器化傳播:概念流變及隱喻

武器在人類歷史上一直是戰爭的象徵和工具,戰爭則是人類社會中最極端、暴力的衝突形式。因此,“被武器化”是指將某些工具用於戰爭中的對抗、操控或破壞,強調這些工具的使用方式。 “武器化”(weaponize)譯為“使得使用某些東西攻擊個人或團體成為可能”。 1957年,“武器化”一詞作為軍事術語被提出,V-2彈道導彈團隊的領導者沃納·馮·布勞恩表示,他的主要工作是“將軍方的彈道導彈技術‘武器化’”[3]。

“武器化”最早出現在太空領域,時值美蘇軍備競賽時期,兩個大國力圖爭奪外太空主導權。 “太空武器化”是指將太空用於發展、部署或使用軍事武器系統的過程,包括衛星、反衛星武器和導彈防禦系統等,目的是進行戰略、戰術或防禦性行動。 1959年至1962年,美蘇提出了一系列倡議,禁止將外太空用於軍事目的,尤其是禁止在外層空間軌道部署大規模毀滅性武器。 2018年,當時的美國總統特朗普簽署了《空間政策指令-3》,啟動“太空軍”建設,將太空視為與陸地、空中、海洋同等的重要作戰領域。 2019年,《中華人民共和國和俄羅斯聯邦關於加強當代全球戰略穩定的聯合聲明》中倡議“禁止在外空放置任何類型武器”[4]。

除太空領域的武器化外,軍事、經濟、外交等領域也顯現武器化趨勢。 “軍事武器化”是將資源(如無人機、核武器等)用於軍事目的、部署武器系統或發展軍事能力。 2022年俄烏戰爭期間,英國皇家聯合軍種研究所的報告顯示,烏克蘭每月因俄羅斯干擾站的影響,損失約10000架無人機。 [5]“武器化”也常出現在“金融戰爭”“外交戰場”等表述中。在經濟領域,武器化通常指國家或組織對全球金融系統中的共享資源或機制的利用;外交武器化則表現為國家通過經濟制裁、外交孤立、輿論操控等手段,追求自身利益並對他國施加壓力。隨著時間的推移,“武器化”概念逐漸擴展到政治、社會、文化等領域,尤其在信息領域,自2016年美國總統大選以來,輿論操縱已成為政治鬥爭的普遍工具。美國前中央情報局局長戴維·彼得雷烏斯曾在國家戰略研究所會議上表示,“萬物武器化”(the weaponization of everything)的時代已經來臨。 [6]

作為一種隱喻,“武器化”不僅指實際物理工具的使用,還像徵著對抗性和攻擊性行為的轉化,強調“武器”這一概念如何滲透至日常生活、文化生產和政治策略中,展現社會行動者如何利用各種工具達成戰略目的。時下,許多本應保持中立的領域,如媒體、法律和政府機構,常被描述為“武器化”,用以批判它們的過度政治化和被不正當利用,突出其非法性及對社會的負面影響。通過這一隱喻,人們無意識地將當前的政治環境與理想化的、看似更溫和的過去進行對比,使人們認為過去的政治氛圍更加理性和文明,而現今則顯得過於極端和對立。 [7]因此,“武器化”的實質是政治中介化的過程,是政治力量通過各種手段和渠道,影響或控製本應保持中立的領域,使其成為政治目的和政治鬥爭的工具。

在信息領域,傳播武器化是長期存在的一種戰略手段。第一、二次世界大戰期間,各國就廣泛使用了宣傳和輿論戰,傳播手段被作為一種心理戰術使用。武器化傳播是傳播武器化在現代信息社會中的體現,其利用算法和大數據分析精準地控制信息的傳播速度和範圍,進而操控輿論和情感,反映了技術、平台和策略的結合,使得政治力量可以更加精準和高效地操控公眾認知與輿論環境。信息作為輿論的本體,被“武器化”並用於影響社會認知和群體行為,“戰爭”的概念也隨之變化,不再只是傳統的軍事對抗,還包括通過信息傳播和輿論操控實現的心理戰和認知戰。這種轉變促生了一系列新術語,例如無限制戰爭(unrestricted warfare)、新一代戰爭(new generation warfare)、非對稱戰爭(asymmetric warfare)和非常規戰爭(irregular warfare)等。這些術語幾乎都藉用“戰爭”(warfare)強調信息領域中的多樣化衝突,信息成為被“武器化”的核心內容。

儘管有部分觀點認為“戰爭”一詞不適用於未正式宣布敵對行動的情況[8],但武器化傳播通過弱化戰爭的傳統政治屬性,將各領域的公開或隱蔽的力量和形式籠統地視作傳播行為,從而擴展了“戰爭”這一概念的外延。值得注意的是,在英文術語中“武器化”有兩種表述方式:一種是“weaponized noun(名詞)”,即表示某物已經“被武器化”,具備武器功能或用途;另一種是“weaponization of noun”,指將某物轉化為武器或具有武器性質的過程。在學術領域,儘管weaponized communication和weaponization of communication尚未嚴格區分,但中文翻譯有所區別。 “武器化傳播”更側重於傳播手段或信息本身“被武器化”,以實現某種戰略目標;“傳播武器化”則強調傳播過程本身作為武器的轉化過程。在討論具體技術手段時,多數學術論文采用weaponed或weaponizing作為前綴,以修飾具體的傳播手段。

本文重點討論的是國際輿論戰中的具體傳播策略,著重描述已經發生的武器化現象,故統一使用“武器化傳播”,其是一種利用傳播手段、技術工具和信息平台,通過精確操控信息流動、公眾認知與情感反應,達到特定軍事、政治或社會目的的策略性傳播方式。武器化傳播也並非單純的戰爭或戰時狀態,而是一種持續的傳播現象,它反映了各主體間的互動與博弈,是信息共享和意義空間的流動。

二、武器化傳播的應用場景及實施策略

如果說20世紀90年代末,信息領域的武器化仍是一個“死話題”,各國主要追逐導彈、無人機等實體武器的升級競賽,那麼步入21世紀,網絡戰爭則真正衝進了公眾視野,並深刻嵌入人們的日常生活,經由社交媒體和智能設備,公眾不可避免地捲入輿論戰爭,不自覺地成為參與者或傳播節點。隨著技術的普及,武器化手段逐漸從國家主導的戰爭工具擴展到社會化和政治化領域,對個人和社會的控制從顯性的國家機器轉向更隱蔽的觀念操控。棱鏡計劃(PRISM)的曝光引發了全球對隱私洩露的強烈擔憂,凸顯了國家利用先進技術進行監視和控制的潛力,這被視為一種新型的武器化。自2016年特朗普當選美國總統以來,社交機器人等信息武器的大規模應用,成為全球政治博弈中的常見現象。信息作戰——包括電子戰、計算機網絡作戰、心理戰和軍事欺騙——被廣泛用於操控信息流動,影響輿論格局。這些手段不僅在軍事戰爭和政治選舉中發揮作用,還逐漸滲透到文化衝突、社會運動及跨國博弈之中,傳統的信息作戰邏輯得以延續。如今,武器化傳播作為一種社會政治工具,深刻影響著輿論生態、國際關係以及個人的日常生活。

(一)軍事領域的信息操縱戰

信息流能夠直接影響軍事衝突的走向,塑造公眾和軍隊的認知與決策,進而影響士氣、戰略判斷和社會穩定。在現代戰爭中,信息不再是單純的輔助工具,信息領域已成為核心戰場。通過操控信息流向,敵方的形勢評估可能被誤導,戰鬥意志被削弱,民眾的信任與支持被動搖,進而影響戰爭的決策過程與持續性。

海灣戰爭(Gulf War)被視為現代信息戰的開端。在這場戰爭中,美國通過高科技手段——包括電子戰、空中打擊和信息操作——實施了對伊拉克的系統性打擊。美軍利用衛星和AWACS預警機實時監控戰場態勢,通過空投傳單和廣播電台向伊拉克士兵傳遞美軍優勢及投降後的優待政策,從心理層面誘使伊軍投降。這場戰爭標誌著信息控制在軍事衝突中的關鍵地位,展示了信息戰在現代戰爭中的潛力。進入21世紀,網絡戰成為信息戰的重要組成部分。網絡戰不僅涉及信息的傳播和操控,還包括通過攻擊關鍵基礎設施實現對敵方社會功能的控制。 2007年愛沙尼亞遭遇大規模DDoS(Distributed Denial of Service Attack)攻擊,展示了信息操縱與網絡攻擊融合的趨勢。 2017年在WannaCry勒索軟件事件中,攻擊者利用Windows系統漏洞(EternalBlue)加密全球150個國家約20萬台計算機文件,要求支付贖金,嚴重影響英國國家健康服務體系(NHS),導致急診服務中斷和醫院系統癱瘓,進一步揭示了網絡戰對關鍵基礎設施的威脅。此外,在長期衝突中,基礎設施控制因能夠直接決定信息傳播的速度、範圍和方向,被廣泛用於削弱對手的戰略能力,爭奪公共信息空間。以色列通過限制無線電頻譜使用、控制互聯網帶寬和破壞通信設施,有效削弱了巴勒斯坦的通信能力。同時,以色列還通過經濟制裁和法律框架限制巴勒斯坦電信市場的發展,壓制巴勒斯坦在信息流動中的競爭力,鞏固自身在衝突中的戰略優勢[9],以維持信息的不平等流動。

社交媒體為信息操縱提供了即時、廣泛的信息傳播渠道,使其能夠跨越國界,影響全球公眾情緒和政治局勢,也使戰爭焦點從單純的物理破壞轉向輿論操控。俄烏戰爭期間,深度偽造技術作為視覺武器,對公眾認知和戰爭輿論產生了顯著干擾。 2022年3月15日,烏克蘭總統澤連斯基的偽造視頻在Twitter上傳播,視頻中他“呼籲”烏克蘭士兵放下武器,引發了短時間內的輿論混亂。同樣,俄羅斯總統普京的偽造視頻也被用以混淆視聽。儘管這些視頻被平台迅速標註“Stay informed”(等待了解情況)的說明,但其在短時間內仍然對公眾情緒和認知造成明顯干擾。這些事件凸顯了社交媒體在現代信息戰中的關鍵作用,國家和非國家行為體可以通過虛假信息、情感操控等手段對軍事衝突施加干擾。

信息操縱戰的複雜性還體現在其雙重特性上——既是攻擊工具,也是防禦的手段。在軍事領域,各國通過防禦和反擊網絡攻擊來確保國家安全、保護關鍵基礎設施、維護軍事機密,並在某些情況下影響對手的戰鬥力與決策。 2015年和2017年,俄羅斯黑客發起了針對烏克蘭的大規模網絡攻擊(如BlackEnergy和NotPetya),烏克蘭通過迅速升級網絡防禦系統,成功抵禦部分攻擊並採取反制措施,避免了更大規模的基礎設施癱瘓。此外,北約戰略傳播卓越中心和英國第77旅等單位專注研究和平時期的輿論塑造[10],利用戰略傳播、心理戰和社交媒體監控等手段,擴大信息領域的戰略控制,並強化了防禦與輿論塑造能力,進一步提高了信息戰的戰略高度。

如今,信息操縱戰已經成為現代軍事衝突中的關鍵環節。通過信息技術與心理操控的高度結合,它不僅改變了傳統戰爭的規則,也深刻影響著公眾認知和全球安全格局。國家、跨國公司或其他行為體通過掌控關鍵基礎設施和社交媒體平台,限制信息流動、操控傳播路徑,從而在全球信息生態中獲得戰略優勢。

(二)政治選舉的輿論干預戰

政治選舉是民主政治中最直接的權力競爭場域,信息傳播在此過程中對選民決策具有重要影響。通過計算宣傳等手段,外部勢力或政治團體能夠操縱選民情緒、誤導公眾認知,從而左右選舉結果、破壞政治穩定或削弱民主進程,選舉因此成為武器化傳播最具效果的應用場景。

近年來,全球政治選舉呈現極化趨勢,持不同政治立場的群體之間存在巨大的意識形態差異。極化導致公眾選擇性接受與自身觀點一致的信息,同時排斥其他信息,這種“回音室效應”加劇了公眾對立場的片面認知,為輿論干預提供了更大的空間。而信息傳播技術,尤其是計算宣傳的興起,使外部勢力能夠更加精準地操控輿論和影響選民決策。計算宣傳(Computational Propaganda)指利用計算技術、算法和自動化系統操控信息流動,以傳播政治信息、干預選舉結果和影響輿論,其核心特徵在於算法驅動的精準性和自動化傳播的規模化,通過突破傳統人工傳播的限制,顯著增強了輿論操控的效果。 2016年美國總統選舉中,特朗普團隊通過劍橋分析公司分析Facebook用戶數據,為選民定向推送定制化的政治廣告,精準影響了選民的投票意向[11]。這一事件被視為計算宣傳干預選舉的典型案例,也為其他政客提供了操作模板,推動了計算宣傳在全球範圍內的廣泛應用。 2017年法國總統選舉中,候選人埃馬紐埃爾·馬克龍(Emmanuel Macron)團隊遭遇黑客攻擊,內部郵件被竊取並公開,內容稱馬克龍在海外擁有秘密賬戶並涉及逃稅,企圖抹黑其形象。 2018年巴西總統選舉期間,候選人雅伊爾·博索納羅(Jair Bolsonaro)團隊利用WhatsApp群組傳播煽動性政治內容,定向推送大量圖像、視頻和煽動性消息以影響選民情緒。據統計,自2017年至2019年,全球採用計算宣傳的國家由28個增加至70個,2020年這一數量上升至81個。這表明,計算宣傳正通過技術手段和傳播策略,重新定義全球選舉中的輿論規則。

計算宣傳也是國家行為者在輿論干預戰中的重要工具。 2011年,美國國防高級研究計劃局(DARPA)在中東地區開展“歐內斯特之聲”行動,通過建立和管理多個虛假身份(sockpuppets),扭曲阿拉伯語社交媒體的對話。俄羅斯也頻繁利用計算宣傳實施干預,在加拿大操作約20萬個社交媒體賬戶,借助極右翼和極左翼運動散佈親俄言論,製造虛假的社會熱點,試圖破壞加拿大對烏克蘭的支持[12]。作為計算宣傳的重要組成部分,社交機器人通過自動化和規模化手段製造輿論熱度,藉由特定標籤在社交平台上增加信息的曝光率,操控議題的優先級。 2016年美國大選期間,俄羅斯利用社交機器人發布支持普京和攻擊反對派的內容,通過信息過載(information overload)掩蓋反對派聲音,強化親普京的輿論氛圍。 [13]2017年海灣危機期間,沙特阿拉伯和埃及通過Twitter機器人製造反卡塔爾標籤#AlJazeeraInsultsKingSalman的熱度,使其成為熱門話題,虛構了反卡塔爾情緒的高峰,進而影響了全球範圍內對卡塔爾的輿論態度。 [14]深度偽造技術則進一步提升了計算宣傳的精準性與隱蔽性。 2024年,美國總統喬·拜登的偽造視頻在X(原Twitter)上迅速傳播,視頻顯示其在橢圓形辦公室使用攻擊性語言,引發輿論爭議並影響選民情緒。據網絡安全公司McAfee調查,63%的受訪者在兩個月內觀看過政治深度偽造視頻,近半數表示這些內容影響了他們的投票決定。 [15]

在全球範圍內,計算宣傳已滲透各國輿論戰中,影響著社會穩定與國家安全。以色列國防軍通過數字武器對巴勒斯坦展開輿論戰,土耳其培養了“愛國巨魔軍隊”操控國內外輿論,墨西哥政府利用殭屍網絡影響輿論。作為現代輿論干預戰的重要手段,計算宣傳正在改變全球政治傳播的格局。隨著人工智能、量子計算等技術的發展,計算宣傳還可能通過更隱蔽和高效的方式乾預選舉流程,甚至直接威脅民主制度的核心運行邏輯。

(三)文化領域的符號認同戰

武器化傳播通過操控信息、符號和價值觀,試圖影響公眾的思想、情感和行為,進而塑造或改變社會的集體認知與文化認同。這種傳播方式不僅在於信息的傳遞,更通過特定的敘事框架、文化符號和情感共鳴,推動某種特定的意識形態或政治理念的傳播與認同。通過操縱文化符號、社會情感和集體記憶,武器化傳播在文化領域干擾社會結構與文化認同,成為符號認同戰的核心手段。

模因(Meme)作為一種集視覺元素和簡潔文字於一體的文化符號,以幽默、諷刺或挑釁的方式激發觀眾的情感反應,影響他們的政治態度和行為。佩佩模因(Pepe the Frog)起初是一個無害的漫畫角色,被極右翼群體重新利用並武器化,用以傳播仇恨言論,逐漸演變為種族主義和反移民的象徵。模因將復雜的政治情緒轉化為便於傳播的視覺符號,迅速激起公眾對政策的不信任和憤怒,被視為“武器化的偶像破壞主義”(Iconoclastic Weaponization)。這一過程通過操控文化符號,以達到政治或社會鬥爭的目的[16],加劇了公眾對社會和政治的分裂。例如,在英國脫歐期間,帶有“Take Back Control”(奪回控制權)字樣的模因迅速傳播,強化了民族主義情緒。

除了文化符號的製造外,符號的篩选和屏蔽同樣能夠塑造或加深某種文化認同或政治立場。審查制度自古以來就是權力控制信息的重要手段,早在古希臘和古羅馬時期,政府就對公共演講和文學作品進行審查,以維持社會秩序和權力穩定。進入數字時代,互聯網和社交媒體的興起推動了審查制度的現代化,平台審查逐漸取代傳統的審查方式,成為當代信息控制和輿論引導的核心工具。算法審查通過人工智能檢測敏感話題、關鍵詞和用戶行為數據,自動刪除或屏蔽被視為“違規”的內容,社交媒體的審核團隊會對用戶生成的內容進行人工篩選,確保其符合平台政策和法律法規。平台審查的作用不僅是限制某些內容的傳播,更是通過推送、刪除和屏蔽等方式引導輿論,塑造公眾認知框架。儘管主流社交平台通過嚴格的內容審核機制控制信息傳播,但一些邊緣平台,如Gab、Gettr、Bitchute等因缺乏有效審查,成為極端言論和惡意信息的溫床。這些平台未對內容髮布做出足夠限制,極端觀點和虛假信息得以肆意擴散,例如,Gab因極端主義內容屢遭批評,被指助長暴力和仇恨。在迴聲室中,用戶只能接觸與自身觀點一致的信息,這種信息環境更強化了極端思想,導致社會群體間的對立加劇。 [17]

語言作為信息傳播的載體和工具,能夠通過情感操控、符號政治和社會動員等方式,深刻影響群體行為和文化認同。語言武器化聚焦於語言形式和文化語境如何影響信息的接收方式,強調語言如何被用來操控、引導或改變人們的認知與行為。這不僅涉及特定詞彙和修辭手法的使用,更包括通過語言表述建構特定的社會意義和文化框架。作為符號認同戰的另一重要工具,語言塑造了“敵我對立”的敘事框架。大翻譯運動(Great Translation Movement)通過選擇性翻譯中國網民的民族主義言論,將其傳播到國際社交媒體平台,引發了對中國的負面認知。這種語言操控通過情緒化表達放大了爭議性內容,加深了國際社會的文化偏見。

語言武器化的深層邏輯在於情緒化和煽動性的語言形式。西方國家常以“人權”與“民主”等正義化標籤為乾預行為辯護,合法化政治或軍事行動。白人至上主義者使用“另類右翼”等模糊標籤重塑意識形態,將傳統的帶有強烈負面含義的“白人至上主義”轉化為一個較為中立的概念,降低了該詞彙的社會抵抗力,用寬泛的“傘式”身份擴大其支持者的基礎。通過對世俗話語的滲透,仇恨政治和極端言論被正當化,逐漸形成一種政治常態。當公眾將這種政治日常化後,語言實現了真正的武器化。 [18]在尼日利亞,煽動仇恨的內容通過種族、宗教和地區話題擴散,深刻惡化了社會關係。 [19]語言的模糊性和合理否認策略也成為傳播者規避責任的有力工具,在被簡化的敘事中傳播複雜的社會和政治議題。特朗普的美國優先(America First)政策通過否定性標籤和情緒化話語,以反對全球化、質疑氣候變化科學、抨擊傳統盟友等方式,故意提出與主流意見相對立的觀點,激發公眾對全球化的不信任,重塑國家利益優先的文化認同。 [20]

三、武器化傳播的風險與挑戰:正當性與破壞性

儘管武器化傳播給國際輿論格局帶來了巨大風險,但特定情形下,其可能會被某些國家或團體通過法律、政治或道德框架賦予一定的正當性。如“9·11”事件後,美國通過《愛國法案》擴大了情報部門的監控權限,以“反恐”為名實施廣泛的信息控制,這種“正當性”常被批評為破壞公民自由,侵蝕了民主社會的核心價值。

在國際政治博弈中,武器化傳播更常被視為“灰色區域”(Gray Zone)的手段。國家間的對抗不再局限於經濟制裁或外交壓力,而是通過信息操控、社交媒體干預等非傳統方式展開。部分國家以“保護國家利益”為藉口傳播虛假信息,辯稱其行為是合規的,儘管這些行為可能在國際法上存在爭議,但往往被合理化為“反制外部威脅”的必要手段。在一些信息監管缺乏嚴格法律框架的國家,選舉的干預行為往往被容忍,甚至被視為一種“正當”的政治活動。在文化層面,某些國家通過傳播特定的文化符號和意識形態,試圖在全球範圍內塑造自身的文化影響力。西方國家常以“文化共享”和“文明傳播”為名,推動其價值觀的傳播,而在實際操作中,卻通過操控文化符號和敘事框架,削弱其他文化的認同感,導致全球文化生態的不平衡。法律框架也在一定程度上為武器化傳播的正當性提供了支持。一些國家以“反恐”和“反對極端主義”為名,通過信息審查、內容過濾等手段限制所謂“有害信息”的傳播。然而,這種正當性往往突破了道德邊界,導致信息封鎖和言論壓制。以“國家安全”為理由的信息治理,雖然在一定程度上獲得了內部認可,卻為武器化傳播的氾濫提供了空間。

相較於正當性,武器化傳播的破壞性尤為顯著。目前,武器化傳播已成為權力結構操控輿論的重要工具,其不僅扭曲了信息內容,還通過隱私侵犯、情感動員和文化滲透等方式,深刻影響了公眾認知、社會情緒以及國際關係。

(一)信息失真與認知操控

信息失真指信息在傳播過程中被故意或無意扭曲,導致公眾接收到的內容與原始信息存在顯著差異。在社交媒體上,虛假信息和誤導性內容的傳播日益猖獗,人工智能模型(如GPT)的生成內容,可能因訓練數據的偏見而加劇這一問題。性別、種族或社會偏見可能被反映在自動生成的文本中,放大信息失真的風險。社交媒體的快速傳播特性也使傳統的事實核查機制難以跟上虛假信息的擴散速度。虛假信息在短時間內往往佔據輿論主導地位,跨平台傳播和匿名性使得澄清與糾正變得更加複雜。傳播的不對稱性削弱了傳統新聞機構的權威性,公眾更傾向於相信即時更新的社交平台信息,而非傳統新聞機構的深入報導,這進一步削弱了新聞機構在抵制虛假信息中的作用。

除了信息本身的失真,武器化傳播還深刻利用了認知失調的心理機制。認知失調指個體接觸到與其已有信念或態度相衝突的信息時產生的心理不適感。傳播者通過製造認知失調,動搖目標受眾的既有態度,甚至誘導其接受新的意識形態。在政治選舉中,定向傳播負面信息常迫使選民重新審視政治立場,甚至改變投票傾向。武器化傳播通過選擇性暴露進一步加劇了“信息繭房”的形成,讓受眾傾向於接觸與自身信念一致的信息,忽視或排斥相反觀點。這不僅強化了個體的認知偏見,也讓虛假信息在群體內部快速擴散,難以被外界的事實和理性聲音打破,最終形成高度同質化的輿論生態。

(二)隱私洩露與數字監控

近年來,深度偽造技術的濫用加劇了隱私侵權問題。 2019年,“ZAO”換臉軟件因默認用戶同意肖像權而被下架,揭示了生物特徵數據的過度採集風險。用戶上傳的照片經深度學習處理後,既可能生成精確的換臉視頻,也可能成為隱私洩露的源頭。更嚴重的是,深度偽造等技術被濫用於性別暴力,多名歐美女演員的面孔被非法植入虛假性視頻並廣泛傳播,儘管平台在部分情況下會刪除這些內容,但開源程序的普及讓惡意用戶能夠輕鬆複製和分享偽造內容。此外,用戶在使用社交媒體時,往往默認授權平台訪問其設備的照片、相機、麥克風等應用權限。通過這些權限,平台不僅收集了大量個人數據,還能夠通過算法分析用戶的行為特徵、興趣偏好和社交關係,進而精準投放廣告、內容推薦甚至實施信息操控。這種大規模數據採集推動了對隱私保護的全球討論。在歐洲,《通用數據保護條例》(General Data Protection Regulation)試圖通過嚴格的數據收集和使用規定,加強個人隱私權保障。然而,由於“隱性同意”或複雜的用戶協議,平台常常繞過相關規定,使數據處理過程缺乏透明度,導致普通用戶難以了解數據的實際用途。美國《通信規範法》第230條規定,網絡平台無需為用戶生成的內容承擔法律責任,這一規定推動了平台內容審核的發展,但也使其在應對隱私侵權時缺乏動力。平台出於商業利益的考慮,往往滯後處理虛假信息和隱私問題,導致審核責任被持續擱置。

在數字監控方面,社交平台與政府的合作使用戶數據成為“監控資本主義”的核心資源。美國國家安全局(NSA)通過電話記錄、互聯網通信和社交媒體數據,實施大規模監控,並與Google、Facebook等大型企業合作,獲取用戶的在線行為數據,用於全球範圍內的情報收集和行為分析。跨國監控技術的濫用更是將隱私侵犯推向國際層面。以色列網絡安全公司NSO開發的Pegasus間諜軟件,通過“零點擊攻擊”入侵目標設備,可實時竊取私人信息和通信記錄。 2018年,沙特記者賈馬爾·卡舒吉(Jamal Khashoggi)被謀殺一案中,沙特政府通過Pegasus監聽其通信,揭示了這種技術對個體隱私和國際政治的深遠威脅。

(三)情感極化與社會分裂

情感在影響個體認知與決策中起著關鍵作用。武器化傳播通過煽動恐懼、憤怒、同情等情緒,影響理性判斷,推動公眾在情緒驅動下做出非理性反應。戰爭、暴力和民族主義常成為情感動員的主要內容,傳播者通過精心設計的議題,將愛國主義、宗教信仰等元素植入信息傳播,迅速引發公眾情感共鳴。數字技術的廣泛應用,特別是人工智能和社交媒體平台的結合,進一步放大了情感極化的風險。虛假信息與極端言論在平台上的快速傳播,不僅來自普通用戶的分享行為,更受到算法的驅動。平台傾向優先推送情緒化和互動性高的內容,這些內容常包含煽動性語言和極端觀點,從而加劇了仇恨言論和偏激觀點的傳播。

社交媒體標籤和算法推薦在情感極化中扮演著關鍵角色。在查理周刊事件後,#StopIslam標籤成為仇恨言論的傳播工具,用戶借助該標籤發布仇視和暴力傾向的信息。在美國2020年總統選舉期間,社交平台上的極端政治言論和錯誤信息也在激烈的黨派鬥爭中被放大。通過精確的情感操控,武器化傳播不僅撕裂了公共對話,還極大影響了社會的民主進程。另一種特殊的極端主義動員策略是“武器化自閉症”(Weaponized Autism),即極右翼團體利用自閉症個體的技術專長,實施情感操控。這些團體招募技術能力較強但有社交障礙的個體,通過賦予虛假的歸屬感,將其轉化為信息戰的執行者。這些個體在極端組織的指引下,被用於傳播仇恨言論、執行網絡攻擊和推動極端主義。這種現像不僅揭示了情感操控的深層機制,也表明技術如何被極端團體利用來服務於更大的政治和社會議程。 [21]

(四)信息殖民與文化滲透

“武器化相互依賴”理論(Weaponized Interdependence Theory)揭示了國家如何利用政治、經濟和信息網絡中的關鍵節點,對其他國家施加壓力。 [22]特別是在信息領域,發達國家通過控制信息流實施“信息殖民”,進一步鞏固其文化和政治優勢。數字平台成為這一殖民過程的載體,全球南方國家在信息傳播中高度依賴西方主導的技術平台和社交網絡,在撒哈拉以南非洲地區,Facebook已成為“互聯網”的代名詞。這種依賴不僅為西方企業帶來了巨大的廣告收入,還通過算法推薦對非洲本土文化和價值觀,尤其是在性別、家庭和宗教信仰等方面,產生了深遠影響,使文化滲透成為常態。

數字不平等是信息殖民的另一表現。發達國家在數字技術和信息資源上的主導地位,使南方國家在經濟、教育和文化領域日益邊緣化。巴勒斯坦因基礎設施不足和技術封鎖,難以有效融入全球數字經濟,既限制了本地經濟發展,又進一步削弱了其在全球信息傳播中的話語權。全球主要經濟體和信息強國通過技術封鎖和經濟制裁,限制他國獲取關鍵技術與創新資源,這不僅阻礙了目標國的科技發展,也加劇了全球技術與創新生態的斷裂。自2018年退出《伊朗核協議》以來,美國對伊朗的經濟制裁導致其在半導體和5G領域發展受阻,技術與創新的不對稱拉大了全球技術生態的差距,使許多國家在信息競爭中處於劣勢。

四、反思與討論:非對稱傳播格局中的話語權爭奪

在國際非對稱傳播(Asymmetric Communication)競爭格局下,強勢方常常通過主流媒體和國際新聞機構等渠道佔據輿論的主導地位,而弱勢方則需要藉助創新傳播技術和手段來彌補劣勢,爭奪話語權。這一傳播格局的核心在於信息地緣政治(Information Geopolitics),即國家之間的權力較量不僅僅取決於地理位置、軍事力量或經濟資源,更取決於對信息、數據和技術的控制。大國間的博弈已不再僅限於物理空間的控制,而擴展至輿論空間的爭奪。這些“信息景觀”涉及全球傳播生態中的話語權、信息流通和媒體影響力等,在這一過程中,國家通過不斷製造景觀,以影響國際輿論、塑造全球認知框架,進而實現其戰略目標。非對稱傳播的策略不僅關乎信息內容的傳遞,更重要的是如何借助各種傳播技術、平台和手段彌補資源與能力上的差距,信息傳播的核心不再局限於內容本身,而圍繞著話語權的爭奪展開。隨著信息戰和認知戰的興起,誰掌握了信息,誰就能在全球競爭中占得先機。

(一)後發優勢下的技術赶超

傳統的大國或強勢傳播者掌控著全球輿論的主導權,相比之下,弱勢國家往往缺乏與這些大國抗衡的傳播渠道。後發優勢理論主張後發國家能夠通過跳躍式發展,繞過傳統的技術路徑,引進現有的先進技術和知識,從而迅速崛起並規避早期技術創新中的低效和過時環節。在武器化傳播的背景下,這一理論為信息弱國提供了通過新興科技突破大國傳播壁壘的路徑,有助於其在技術層面上實現赶超。傳統媒體往往受到資源、影響力和審查機制的限制,信息傳播速度慢、覆蓋面有限,且容易受到特定國家或集團的操控。數字媒體的崛起使信息傳播的格局發生了根本性變化,弱勢國家能夠借助全球化的互聯網平台,直接面向國際受眾,而不必依賴傳統的新聞機構和主流媒體。通過新興技術,弱勢國家不僅能更精準地傳遞信息,還能通過定向傳播和情感引導,迅速擴大其在國際輿論中的影響力。後發國家可以利用先進技術(如大數據、人工智能、5G網絡等)實現精準的信息傳播,打造高效的傳播渠道。以大數據分析為例,後發國家可以深入了解受眾需求和輿情趨勢,快速識別全球輿論脈搏,實施定向傳播,快速擴大國際影響力。人工智能技術不僅能夠預測輿論發展方向,還能實時優化傳播策略。 5G網絡的普及大大提升了信息傳播的速度與覆蓋範圍,使後發國家能夠以低成本、高效率的方式突破傳統傳播模式的局限,形成獨特的傳播優勢。

通過跨國合作,後發國家可以整合更多的傳播資源,擴大傳播的廣度與深度。例如,阿根廷與拉美其他國家共同建立了“拉美新聞網絡”,通過新聞內容共享,推動拉美國家在國際輿論中發出統一的聲音,反擊西方媒體的單一敘事。在非洲,南非與華為合作推動“智慧南非”項目,建設現代化信息基礎設施,促進數字化轉型和公共服務效率的提升。後發國家政府應加大對技術研發和創新的投入,鼓勵本土企業和人才的發展。同時,還應注重文化輸出和媒體產業建設,通過全球化合作和去中心化傳播模式提升國家在國際信息空間中的話語權。政府可以資助數字文化創作,支持本地社交媒體平台的成長,並通過國際合作框架整合更多傳播資源。

(二)信息反制中的壁壘構建

與軍事行動可能引發的全面衝突,或經濟制裁可能帶來的風險不同,武器化傳播能夠在不觸發全面戰爭的情況下實現戰略目標,基於成本和戰略考量,其具有極大的吸引力。由於武器化傳播具備低成本、高回報的特點,越來越多的國家和非國家行為體選擇通過操控信息來達到戰略目標。這種傳播手段的普及,使得國家在面對來自外部和內部的信息攻擊時,面臨更加複雜和多變的威脅。隨著信息戰爭的日益激烈,單純的傳統軍事防禦已經無法滿足現代戰爭的需求。相反,構建強有力的信息防禦體系,成為國家保持政治穩定、維護社會認同和提升國際競爭力的關鍵策略。因此,如何有效應對外部信息干擾和輿論操控,並進行信息反制,已成為各國迫切需要解決的問題。完善的網絡安全基礎設施是維護國家安全的關鍵,用以防范敏感信息不被外部操控或篡改。以歐盟為例,歐盟通過“數字單一市場”戰略推動成員國加強網絡安全建設,要求互聯網公司更積極地應對虛假信息和外部干預。歐盟的網絡安全指令還規定各成員國建立應急響應機制,保護重要信息基礎設施免受網絡攻擊。此外,歐盟還與社交平台公司,如Facebook、Twitter和Google等建立合作,通過提供反虛假信息工具和數據分析技術來打擊假新聞傳播。人工智能、大數據和自動化技術正在成為信息防禦的重要工具,被用以實時監控信息傳播路徑,識別潛在的虛假信息和抵禦輿論操控。在網絡安全領域,大數據分析幫助決策者識別和預警惡意攻擊,並優化反制策略。這些技術的應用不僅能夠在國內層面增強信息防禦能力,還能提高國家在國際信息空間中的主動性和競爭力。

反制機制是信息防禦體系的另一重要組成部分,尤其是在國際輿論壓力下,實時監控外部信息傳播並及時糾正虛假信息成為維護輿論主動權的關鍵。烏克蘭自2014年克里米亞危機以來,通過與北約和美國合作,建立了頗具規模的網絡防禦體系。烏克蘭的國家網絡安全局為應對網絡威脅設立了“信息反制小組”,利用社交媒體和新聞發布平台實時駁斥俄羅斯的虛假報導,這一策略顯著提升了烏克蘭在國際輿論中的聲譽和信任度。

(三)輿論引導中的議程設置

在信息化和數字化的全球競爭格局中,輿論引導不僅涉及信息傳播內容,更關鍵的是如何設置議程並聚焦全球關注的熱點話題。議程設置理論表明,誰能掌控信息流通的議題,誰就能引導輿論的方向。議程設置通過控制話題的討論範圍和焦點,影響公眾對事件的關注與評價,社交媒體的興起為信息弱勢國提供了突破口,使其可以通過多平台聯動來爭奪信息傳播的主導權。以烏克蘭為例,其在俄烏戰爭中通過社交媒體傳播戰爭實況,不僅發布戰鬥實況,還融入民眾的情感訴求,借助平民遭遇和城市破壞的悲情敘事,激發國際社會的同情與關注。在抵禦外部信息干擾的同時,國家還需要主動傳播正面敘事,講述能夠引發國際社會共鳴的文化故事。故事應該符合國際輿論的情感需求,同時展現國家的獨特性,強化與國際社會的聯繫。以我國的“一帶一路”共建為例,在“一帶一路”共建國家,我國投資建設了大量基礎設施項目,這些項目不僅幫助改善了當地的經濟基礎條件,也展示了中國在全球化進程中的責任擔當,更為文化合作和交流活動提供了窗口,向世界展示了中華民族豐富的歷史文化,為國際社會展現了中華文化的包容性和責任感。

但由於全球南方國家往往面臨資源、技術與國際傳播平台的限制,難以直接與發達國家競爭,因此它們依賴更加靈活、創新的傳播手段來參與全球議程的設置。例如,巴西在應對環保和氣候變化議題上,尤其是亞馬遜森林的砍伐問題,面臨來自西方媒體的負面輿論壓力。為此,巴西政府利用社交媒體發布關於亞馬遜保護的最新數據和成功案例,積極塑造國家在環境保護領域的形象。同時,巴西通過與其他發展中國家合作,參與全球氣候變化談判,推動南南合作,增強了在氣候問題上的話語權。大型國際事件、人道主義活動和製作文化產品等,也是講述國家故事的有效方式。國際體育賽事如世界杯、奧運會等,不僅是體育競技的展示平台,更是國家形象和文化軟實力的展現場所,通過承辦或積極參與這些全球性事件,國家能夠向世界展示其實力、價值和文化魅力,推動積極的輿論議程。

“戰爭無非是政治通過另一種手段的延續”[23]。這一克勞塞維茨的經典論斷在武器化傳播的語境下得到了現代化的詮釋。武器化傳播突破了傳統戰爭的物理邊界,成為一種融合信息戰、認知戰和心理戰的現代戰略手段。它以非暴力的形式操控信息流向和公眾認知,使國家和非國家行為者無須依賴直接軍事行動即可實現政治目標,體現出極強的戰略性和目標性。通過操控信息、情緒和價值觀,武器化傳播能夠在避免全面戰爭的同時達成戰略目的,在全球競爭和衝突中,已成為強國對弱國進行政治壓制的重要手段。

武器化傳播的核心在於通過信息操控削弱敵方的決策力與行動能力,但其複雜性使得傳播效果難以完全預測。儘管信息強國通過技術優勢和傳播渠道壓制信息弱國,傳播效果卻充滿不確定性。尤其是在社交媒體和數字平台全球化的背景下,信息流動的邊界和效果愈加難以控制。這種複雜性為弱國提供了突破話語霸權的機會,推動信息傳播的反向博弈。弱國可以利用這些平台發起對抗,挑戰強國的信息操控,在全球輿論中佔據一席之地。非對稱性博弈反映了國際輿論的動態平衡,傳播不再是單向的控制,而是更為複雜的交互和對話,賦予弱者影響輿論的可能性。當前國際輿論格局仍以信息強國對信息弱國的單向壓制為主,但這一局面並非不可打破。信息戰爭具有高度的不對稱性,信息弱國可以憑藉技術創新、靈活策略和跨國合作逐步反制。通過發揮“非對稱優勢”,弱國不僅能夠影響全球輿論,還能藉助聯合行動和信息共享提升話語權。跨國合作與地區聯盟的建立,為弱國提供了反制強國的有力工具,使其能夠在國際輿論上形成合力,挑戰信息強國的主導地位。在戰爭框架下,各國可以靈活調整策略,主動塑造信息傳播格局,而非被動接受強國的信息操控。

戰爭社會學強調社會結構、文化認同和群體行為在戰爭中的作用。武器化傳播不僅是軍事或政治行為的延續,更深刻影響社會心理、群體情感和文化認同。強國利用信息傳播塑造他國的認知與態度,以實現自己的戰略目標。然而,從社會學視角來看,武器化傳播並非單向的壓制,而是複雜的社會互動和文化反應的產物。在這一過程中,信息弱國並非完全處於弱勢,相反,它們可以藉助文化傳播、社會動員和全球輿論的動態對抗,以“軟實力”反擊外部操控,塑造新的集體認同,展示“弱者武器”的正當性。

(基金項目:研究闡釋黨的二十屆三中全會精神國家社科基金重大專項“推進新聞宣傳和網絡輿論一體化管理研究”(項目編號:24ZDA084)的研究成果)

References:

[1] Lasswell H D. Propaganda techniques in the world wars[M]. Beijing: Renmin University Press, 2003.

[2] Clausewitz C V. On War: Volume 1[M]. Translated by the Academy of Military Sciences of the People's Liberation Army of China. Beijing: The Commercial Press, 1978.

[3]Herrman J. If everything can be ‘weaponized,’ what should we fear? [EB/OL]. (2017-03-14)[2024-12-20].https://www.nytimes.com/2017/03/14/magazine/if-everything-can-be-weaponized-what-should-we-fear.html.

[4] Ministry of Foreign Affairs of the People's Republic of China. Joint statement by the People's Republic of China and the Russian Federation on strengthening contemporary global strategic stability (full text)[EB/OL]. https://www.mfa.gov.cn/web/ziliao_674904/1179_674909/201906/t20190606_7947892.shtml.

[5]Mazarr M J, Casey A, Demus A, et al. Hostile social manipulation: present realities and emerging trends[M]. Santa Monica, CA USA: Rand Corporation, 2019.

[6]Bob Y J. Ex-CIA director Petraeus: Everything can be hijacked, weaponized[EB/OL].(2018-01-30)[2024-12-20].https://www.jpost.com/israel-news/ex-cia-director-petraeus-everything-can-be-hijacked-weaponized-540235.

[7]Mattson G. Weaponization: Metaphorical Ubiquity and the Contemporary Rejection of Politics[EB/OL].OSF(2019-01-08)[2024-12-20].osf.io/5efrw.

[8] Robinson L, Helmus T C, Cohen R S, et al. Modern Political Warfare: Current Practices and Possible Responses[M]. Santa Monica, CA: RAND Corporation, 2018.

[9]Kreitem H M. Weaponization of Access, Communication Inequalities as a Form of Control: Case of Israel/Palestine[J]. Digital Inequalities in the Global South, 2020: 137-157.

[10]Laity M. The birth and coming of age of NATO StratCom: a personal history[J]. Defence Strategic Communications, 2021, 10(10): 21-70.

[11]Confessore N. Cambridge Analytica and Facebook: The scandal and the fallout so far[J]. The New York Times, 2018(4).

[12] McQuinn B, Kolga M, Buntain C, et al. Russia's Weaponization of Canada's Far Right and Far Left to Undermine Support for Ukraine[J]. International Journal (Toronto, Ont.), 2024, 79(2): 297-311.

[13]Stukal D, Sanovich S, Bonneau R, et al. Why botter: how pro-government bots fight opposition in Russia[J]. American political science review, 2022, 116(3): 843-857.

[14] Jones M O. The Gulf Information War: Propaganda, Fake News, and Fake Trends: The Weaponization of Twitter Bots in the Gulf Crisis[J]. International Journal of Communication, 2019, 13: 27.

[15]Genovese D. Nearly 50% of voters said deepfakes had some influence on election decision. [EB/OL].(2024-10-30)[2024-12-20].https://www.foxbusiness.com/politics/nearly-50-voters-said-deepfakes-had-some-influence-election-decision.

[16]Peters C, Allan S. Weaponizing memes: The journalistic mediation of visual politicization[J]. Digital Journalism, 2022, 10(02):217-229.

[17] Gorissen S. Weathering and weaponizing the #TwitterPurge: digital content moderation and the dimensions of deplatforming[J]. Communication and Democracy, 2024, 58(01): 1-26.

[18]Pascale C M. The weaponization of language: Discourses of rising right-wing authoritarianism[J]. Current Sociology, 2019, 67(06): 898-917.

[19] Ridwanullah A O, Sule S Y, Usman B, et al. Politicization of Hate and Weaponization of Twitter/X in a Polarized Digital Space in Nigeria[J]. Journal of Asian and African Studies, 2024.

[20]Mercieca J R. Dangerous demagogues and weaponized communication[J]. Rhetoric Society Quarterly, 2019, 49(03): 264-279.

[21]Welch C, Senman L, Loftin R, et al. Understanding the use of the term “Weaponized autism” in an alt-right social media platform[J]. Journal of Autism and Developmental Disorders, 2023, 53(10): 4035-4046.

[22]Farrell H, Newman A L. Weaponized interdependence: How global economic networks shape state coercion[J]. International security,2019,44(01):42-79.

[23] Clausewitz C V. On War: Volume 1[M]. Translated by the Academy of Military Sciences of the People's Liberation Army of China. Beijing: The Commercial Press, 1978.

作者簡介:郭小安,重慶大學新聞學院教授、博士生導師,重慶市哲學社會科學智能傳播與城市國際推廣重點實驗室執行主任(重慶 400044);康如詩,重慶大學新聞學院碩士生(重慶 400044)。

中國原創軍事資源:https://www.cjwk.cn/journal/guidelinesDetails/192031322246497484888

Chinese Military to Utilize Artificial Intelligence Empowering Cognitive Confrontation Success on the Modern Battlefield

中國軍隊將利用人工智慧增強現代戰場認知對抗的成功

現代英語:

With the advent of the "smart+" era, artificial intelligence is being widely applied in the military field, and conventional warfare in physical space is accelerating its integration with cognitive confrontation in virtual space. Deeply tapping the potential of artificial intelligence to empower cognitive confrontation is of great significance for improving the efficiency of cross-domain resource matching and for seizing the initiative in future operations.

Data mining expands the boundaries of experience and cognition

Data-driven, knowing the enemy and knowing oneself. With advances in big-data technologies, data has become the ammunition of cognitive offense and defense, and information advantage has become increasingly important on the battlefield. Applying artificial intelligence to traditional information-processing workflows can strengthen the analysis of correlated information, accelerate cross-domain information integration through cross-domain data collection and false-information screening, and enhance dynamic perception. Artificial intelligence can also help relieve battlefield data overload, organically integrate information about the enemy, friendly forces, and the battlefield environment, and build a holographic intelligent database that provides solid support for cognitive confrontation.
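The fusion and screening described here is stated only at the level of principle. A minimal, hypothetical sketch of the idea might merge multi-source reports on the same target while discarding inputs inconsistent with the consensus; the field names, outlier rule, and confidence weights below are illustrative assumptions, not a description of any real system.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class Report:
    source: str          # e.g. "radar", "osint", "uav_video" (hypothetical labels)
    target_id: str
    position_km: float   # simplified one-dimensional position
    confidence: float    # 0..1 reliability estimate for the source

def fuse(reports, outlier_km=5.0):
    """Per target: drop reports far from the median (likely false or spoofed),
    then average the rest weighted by source confidence."""
    by_target = {}
    for r in reports:
        by_target.setdefault(r.target_id, []).append(r)
    fused = {}
    for tid, rs in by_target.items():
        med = median(r.position_km for r in rs)
        kept = [r for r in rs if abs(r.position_km - med) <= outlier_km]
        total = sum(r.confidence for r in kept)
        fused[tid] = sum(r.confidence * r.position_km for r in kept) / total
    return fused

reports = [
    Report("radar", "T1", 10.2, 0.9),
    Report("osint", "T1", 10.8, 0.5),
    Report("spoofed_feed", "T1", 42.0, 0.4),  # inconsistent input, screened out
]
print(fuse(reports))  # roughly {'T1': 10.4}
```

Screening out the mutually inconsistent report before weighting is the toy analogue of the "false information screening" step in the paragraph above.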

Everything intelligently connected, humans and machines in collaboration. Modern warfare increasingly blends military and civilian spheres, and the boundary between peace and war is blurred. Technology has redefined how people interact with one another and with equipment, and how equipment interacts with equipment, and battlefield data flows continuously. Through big-data mining and cross-domain comparative analysis, unstructured data such as images, audio, and video can be sifted to discard the dross and retain the true, expanding the boundaries of experiential cognition and improving human-machine collaboration. The in-depth application of Internet of Things and big-data technologies keeps raising the level of intelligence in data acquisition, screening, circulation, and processing, laying a solid foundation for precision attacks in the cognitive domain.

Break through barriers and achieve deep integration. Relying on battlefield big data can effectively break through the barriers to all-domain integration, help connect isolated information islands, promote cross-domain information coupling and aggregation, accelerate barrier-free information flow, and drive the shift from data fusion and information fusion toward perception fusion and cognitive fusion. The comprehensive penetration of intelligent equipment into the command system can accelerate the deep integration of situation awareness, situation prediction, and situation shaping, optimize multi-dimensional information screening and the layout of cognitive confrontation, and promote continuous iteration and upgrading of cognitive domain combat styles.

Intelligent algorithms enhance decision-making efficiency

Accelerate decision-making and sow confusion in the enemy. The outcome of cognitive confrontation depends to some extent on the contest of commanders' wisdom and strategy. Through full-dimensional, cross-domain information confrontation and decision-making games, and with the help of intelligent technology, one can analyze and intervene in the opponent's cognition and behavior and ultimately gain the initiative on the battlefield. Artificial intelligence has already become a force-multiplying catalyst for combat effectiveness. In peacetime it can play the role of an intelligent "blue army" that simulates and war-games combat plans; in wartime, through intelligent decision support, it can improve the quality and efficiency of the "detection, control, attack, evaluation, and protection" cycle, create chaos for the enemy, and paralyze its system.

Autonomous planning and intelligent formation. On the intelligent battlefield of the future, "face-to-face" fighting will increasingly give way to "key-to-key" offense and defense. In cognitive domain operations, using intelligent algorithms to accurately identify identity information, anticipate the opponent's intentions, and seize key points in advance can quickly convert information advantage into decision advantage and action advantage. Using intelligent algorithms to support cognitive domain operations can also help pinpoint the weaknesses of the enemy's offense-defense system, autonomously plan combat tasks tailored to the enemy, intelligently design combat formations, and provide real-time feedback on combat effects. By relying on data links and combat clouds to strengthen intelligent back-end support, combat advantages can be consolidated through dynamic networking and virtual-real interaction.

Decide before the enemy and strike with precision. Intelligent algorithms can help commanders anticipate risks, dynamically optimize combat plans according to the opponent's situation, and carry out precise cognitive attacks. In future intelligent command and control, a "cloud brain" can provide algorithmic support and, combined with intelligent push, predict the situation one step ahead of the enemy, make decisions one beat faster, and thoroughly disrupt the opponent's thinking and actions. Emphasis should be placed on using intelligent technology to collect, organize, and deeply analyze the opponent's decision-making and behavioral preferences, and then tailor plans that actively induce decisions favorable to us, aiming at key vulnerabilities to deliver an unexpected, fatal blow.

Powerful computing power raises the level of all-domain planning

Plan the situation and build momentum, suppressing with computing power. "He who wins before the battle has made many calculations; he who loses has made few." The situation in cognitive confrontation is complex and changeable, and it is difficult to cope with by relying solely on commanders' experience and on-the-spot judgment. Intelligent tools can be used before the battle to strengthen penetration of enemy thinking, actively divide and disintegrate the cognitive capacity of the enemy team, and improve our ability to control the battlefield and our operational initiative. At the same time, powerful intelligent computing should be used to improve flexible command and overall planning, riding the situation and building momentum to actively occupy the main position in cognitive confrontation.

Clever soft attacks, computing-power raids. The rapid development of artificial intelligence is pushing warfare further from "hard destruction" toward "soft kill", which may thoroughly overturn the traditional paradigm of war. For example, the latest technical concepts can be used to gain deep insight into how the enemy's system operates, proactively learn the opponent, and maneuver the opponent. The psychological anchoring effect and the network superposition-amplification effect can also be used to interfere with the opponent's cognitive loop, disrupt its command decision-making, and slow its reaction speed.

Cross-domain coordination and computing power support. To win the proactive battle of cognitive confrontation, we must coordinate across domains, gather forces in multiple dimensions, use intelligent tools to autonomously control the flow of information, realize the integrated linkage of physical domain, information domain and cognitive domain, lead forward-looking deployment and distributed coordination, launch a comprehensive parallel offensive, and form cognitive control over the enemy. Effectively carry out joint actions of virtual and real interaction in the entire domain, intervene in the enemy’s cognition, emotions and will, and use powerful computing power to take the initiative and fight proactive battles.

China Military Network Ministry of National Defense Network

Thursday, April 20, 2023

Chen Jialin, Xu Jun, Li Shan

現代國語:

伴隨「智慧+」時代的到來,人工智慧廣泛應用於軍事領域,物理空間的常規戰爭與虛擬空間的認知對抗加速融合。深度挖掘人工智慧潛力為認知對抗賦權,對提升跨域資源匹配效率,掌控未來作戰主動權具有重要意義。

資料挖潛拓展經驗認知邊界

數據驅動,知彼知己。隨著大數據相關技術的進步,數據資訊已成為認知攻防彈藥,資訊優勢在戰場上變得越來越重要。運用人工智慧技術賦能傳統資訊加工流程,可強化關聯資訊分析能力,透過跨領域資料擷取、虛假資訊甄別,加速資訊全局融合,強化動態感知能力。人工智慧還可協助緩解戰場數據過載,有機整合敵情、我情、戰場環境訊息,建立全像智慧資料庫,為認知對抗提供良好支撐。

萬物智聯,人機協同。現代戰爭日漸軍民一體、平戰界線模糊,技術重新定義了人與人、人與裝備、裝備與裝備的互動方式,戰場資料源源不絕。透過大數據探勘與跨域比較分析,可對影像、音訊、視訊等非結構化資料去粗取精、去偽存真,拓展經驗認知邊界,提升人機協同水準。物聯網、大數據技術的深度運用,推動資料取得、篩選、流轉、加工流程的智慧化程度不斷提升,為實施認知域精準攻擊夯實基礎。

打通壁壘,深度融合。依靠戰場大數據可有效突破全域融合的壁壘,有助於聯通條塊分割的資訊孤島,促進跨域資訊耦合聚合,加速資訊無障礙流通,推動資料融合與資訊融合向感知融合與認知融合轉化。智慧裝備全面滲透進入指揮體系,能夠加速態勢感知、態勢預測與態勢塑造的深度融合,優化多維資訊篩選與認知對抗佈局,推動認知域作戰樣式不斷迭代升級。

智慧演算法強化輔助決策效能

加速決策,致敵混亂。認知對抗的勝負,某種程度上取決於指揮家智慧謀略的博弈。可透過全維度跨域資訊對抗與決策博弈,借助智慧技術分析並介入對手認知與行為,最終贏得戰場主動。目前,人工智慧已成為戰鬥力倍增的催化劑,平時可扮演智慧「藍軍」模擬推演作戰方案;戰時透過智慧輔助決策,提升「偵、控、打、評、保」循環品質效率,給敵方製造混亂,促使其體系癱瘓。

自主規劃,智能編組。未來智慧化戰場上,「面對面」的拼殺將越來越多地讓位給「鍵對鍵」的攻防。在認知域作戰中,利用智慧演算法精準甄別身分資訊、預先研判對手企圖、事先扼控關鍵要點,能夠將資訊優勢快速轉化為決策優勢與行動優勢。利用智慧演算法支撐認知域作戰,還可協助摸清敵方攻防體系弱點,因「敵」制宜自主規劃作戰任務,智慧設計作戰編組,即時回饋作戰效果,依托資料鏈、作戰雲強化智慧後台支撐,在動態組網、虛實互動中強化作戰勝勢。

先敵決策,精準攻擊。智慧演算法可輔助指揮者預判風險,根據對手狀況動態優化作戰方案,實施精準認知攻擊。在未來智慧化指揮控制中,可利用「雲端大腦」提供演算法支撐,結合智慧推送先敵一步預判態勢,快敵一招制定決策,徹底打亂對手思路和行動。應著重運用智慧科技收集整理、深度分析對手決策和行為偏好,進而專項客製化計劃,積極誘導其作出有利於我的決策,瞄準要害出其不意地對其進行致命一擊。

強大算力提升全域運籌水平

謀勢造勢,算力壓制。 「夫未戰而廟算勝者,得算多也;未戰而廟算不勝者,得算少也。」認知對抗態勢複雜多變,僅靠指揮經驗和臨時判斷難以應對,可利用智能工具在戰前即對敵思維認知加強滲透,積極分化瓦解敵方團隊認知力,提升我戰場控局能力和作戰性。同時,應藉助強大智能算力,提升靈活指揮與全局運籌能力,順勢謀勢、借勢造勢,積極佔領認知對抗主陣地。

巧打軟攻,算力突襲。人工智慧的快速發展,推動戰爭進一步從「硬摧毀」轉向「軟殺傷」,可望徹底顛覆傳統戰爭範式。如可運用最新技術理念,深入洞察敵方體系運作機理,積極熟悉對手、調動對手。還可利用心理沉錨效應和網路疊加放大效應,幹擾對手認知循環鏈路,打亂對手指揮決策,遲滯對手反應速度。

跨域統籌,算力支撐。打贏認知對抗主動仗須全域跨域統籌、多維同向聚力,利用智慧工具自主控制資訊的流量流向,實現物理域、資訊域與認知域的一體聯動,引領前瞻性布勢與分散式協同,全面展開並行攻勢,形成對敵認知控制。有效進行全域虛實相生的聯合行動,對敵認知、情緒和意志實施幹預,借助強大算力下好先手棋、打好主動仗。

中國軍網 國防部網 // 2023年4月20日 星期四

陳佳琳 徐 珺 李 山

中國原創軍事資源:http://www.81.cn/jfjbmap/content/2023-04/20/content_338002888.htm

Chinese Intelligent Warfare is Accelerating and Advancing

中國智能化戰爭正在加速推進

中國軍網 國防部網. 2022年3月17日 星期四

現代英語:

With the widespread application of artificial intelligence in the military field, intelligent warfare has gradually become a hot topic. History has repeatedly shown that the evolution of the form of war leads to profound changes in the mechanism of victory. In today's era, when information warfare is developing in depth and intelligent warfare is beginning to emerge, the world's major militaries have worked hard to advance military intelligentization, and many of these trends deserve attention.

Strengthen top-level design

Outlining a “roadmap” for intelligent warfare

Driven by a new round of scientific and technological revolution and industrial revolution, intelligent military transformation is developing in depth. The United States, Russia, Japan and other countries have regarded artificial intelligence as a disruptive technology that “changes the rules of the war game” and have made early arrangements, strengthened top-level design and planning guidance, and explored the direction of military application of artificial intelligence.

The U.S. military has detailed the current status and development plans for artificial intelligence in documents such as "Preparing for the Future of Artificial Intelligence", "National Artificial Intelligence Research and Development Strategic Plan", "Artificial Intelligence and National Security", "Integrated Roadmap for Unmanned Systems, Fiscal Year 2017-2042", and "American Artificial Intelligence Initiative: First Annual Report", and has elevated the development of artificial intelligence to the level of national strategy. In 2021, the U.S. military pointed out in "U.S. Department of Defense Artificial Intelligence Posture: Assessment and Improvement Recommendations" that it should consider three guiding questions in developing artificial intelligence: what is the current state of artificial intelligence relevant to the U.S. military; what is the U.S. military's current posture in artificial intelligence; and what internal actions and potential legislative or regulatory actions might enhance the U.S. military's advantage in artificial intelligence.

Russia has invested a lot of resources to maintain a balance with the United States in the competition for the application of artificial intelligence in the military field. In 2021, Russian President Vladimir Putin stated at the first Defense Ministry meeting of the year that artificial intelligence will greatly promote changes in the military field, and the Russian Federation Armed Forces must accelerate the research and development of artificial intelligence application technologies such as robots, intelligent individual systems, and intelligent weapon modules, so as to form core technical capabilities and battlefield competitive advantages as soon as possible. Documents such as “Special Outline for the Research and Development of Future Military Robot Technology and Equipment before 2025”, “Future Russian Military Robot Application Concept”, and “The Development Status and Application Prospects of Artificial Intelligence in the Military Field” have established a series of mechanisms at the national level for the Russian military to promote the military application of artificial intelligence.

The Japanese government has also issued an “Artificial Intelligence Strategy” to lead the research and development of artificial intelligence technology and industrial development. In the “Robotics and Artificial Intelligence” strategic plan formulated by the United Kingdom, the application of artificial intelligence in battlefield construction is emphasized. In January 2021, the Australian Department of Defense released “Fighting the Artificial Intelligence War: Operational Concepts for Future Intelligent Warfare”, which explores how to apply artificial intelligence to land, sea and air combat.

Innovative combat concepts

Promoting the “Thinking First” Approach to Intelligent Warfare

The innovation of operational concepts has an ideological driving effect on the development of military science and technology and the evolution of war forms. In the past, people’s understanding and grasp of war mainly came from the summary of practical experience, and operational concepts were empirical concepts. In the future era of intelligent warfare, operational concepts are not only empirical concepts, but also the conception, design and foresight of operations.

The U.S. Army has proposed the concept of "multi-domain warfare", which requires deep integration and close coordination of combat capabilities across domains such as land, sea, air, space, the electromagnetic spectrum, and cyberspace. To this end, the U.S. Army has successively issued white papers such as "Multi-Domain Warfare: The Evolution of Combined Arms in the 21st Century (2025-2040)", "U.S. Army Multi-Domain Warfare (2028)", and "Using Robotics and Autonomous Technologies to Support Multi-Domain Warfare". In March 2021, the U.S. Department of the Army issued the document "Army Multi-Domain Transformation: Preparing to Win in Competition and Conflict", indicating that "multi-domain warfare" has become a "flag" leading the transformation and development of the U.S. Army. The Defense Advanced Research Projects Agency proposed the concept of "mosaic warfare", which aims to create a highly decentralized and highly adaptable "kill web" composed of different combat functional units, based on advanced computer and network technology. The U.S. Department of Defense strongly supports the concept of "joint all-domain operations". In March 2020, the U.S. Air Force took the lead in writing "joint all-domain operations" into doctrine, exploring how the Air Force can play a role in such operations.

The Russian military proposed the concept of “charge disintegration”. “Disintegration” is one of the most important operational concepts in Russia at present. The Russian electronic warfare forces set the goal of making the enemy’s information, charge, electronic warfare and robot systems ineffective, and believe that this goal will “determine the fate of all military operations”. Disrupting the command and control of enemy forces and weapon systems and reducing the efficiency of enemy reconnaissance and use of weapons are the primary tasks of electronic warfare. At present, the Russian military is considering forming 12 types of electronic warfare forces. The Russian military also proposed the concept of “non-nuclear containment system”, the core of which is to use non-nuclear offensive strategic weapons to contain opponents. The non-nuclear offensive strategic weapons it defines include all ballistic missiles equipped with non-nuclear warheads, as well as strategic bombers and long-range air-based and sea-based cruise missiles. In addition, the Russian military also proposed the concept of “hybrid warfare”, hoping to use artificial intelligence systems to seek battlefield information advantages.

The British Ministry of Defense has proposed the concept of “multi-domain integration” and will develop a new command and control system with intelligent capabilities to achieve comprehensive, persistent, accurate and rapid battlefield perception and force coordination.

Focus on technology research and development

Shaping the Intelligent Warfare Operational Model

The key to the effectiveness of artificial intelligence is the combination with other technologies, which is also described as the “AI stack”. Various technologies interact to produce a combined effect, thereby enhancing the capabilities and effects of each technology. In the intelligent warfare supported by artificial intelligence technology, the collaborative combat mode of “man-machine integration, cloud brain control”, the cluster combat mode of “mixed formation, group intelligence”, and the cognitive combat mode of “intelligence-led, attacking with intelligence first” will constantly update people’s understanding of war.

Focus on the research and development of innovative projects. The US military is vigorously promoting the application of artificial intelligence chips in existing weapons and equipment systems, adding “intelligent brains” to weapons to enable them to have human-like thinking and autonomous interaction capabilities. In October 2021, the US Navy launched the “Beyond Plan”, which is regarded as the “current highest priority”. It aims to accelerate the delivery of artificial intelligence and machine learning tools by building a military Internet of Things for maritime operations, integrating manned and unmanned joint formations, supporting a new intelligent naval architecture, enhancing large-scale firepower killing, and realizing intelligent distributed operations of the navy. In addition, the Defense Advanced Research Projects Agency has also carried out cognitive electronic warfare projects such as “Adaptive Electronic Warfare Behavior Learning”, “Adaptive Radar Countermeasures”, and “Communications under Extreme Radio Frequency Spectrum Conditions”, and developed a prototype of a cognitive radar electronic warfare system. The Russian Ministry of Defense’s Intelligent Technology and Equipment Research and Experimental Center cooperated with the Institute of Control Problems of the Russian Academy of Sciences to develop and test autonomous intelligent algorithms including drone swarm command and control, and also jointly developed an object automatic recognition software system based on neural network principles with the National Aviation System Research Institute.

Establish innovative R&D institutions. The continuous emergence of new technologies is an inexhaustible driving force for the vigorous development of military intelligence. High-level military intelligence construction cannot be separated from the technical research and development of professional institutions. Some countries and militaries have established R&D centers, focusing on innovative development from a technical level. The U.S. Department of Defense has established a joint artificial intelligence center, which is planned to be built into a national key laboratory to lead the promotion of hundreds of artificial intelligence-related projects and ensure the efficient use of artificial intelligence-related data and information to maintain the United States’ technological advantage in this field. Russia has established an artificial intelligence and big data alliance, a national artificial intelligence center, and a robotics technology research and experimental center under the Ministry of Defense, mainly conducting theoretical and applied research in the fields of artificial intelligence and information technology. France has established an innovative defense laboratory, the United Kingdom has set up an artificial intelligence laboratory, and India has established an artificial intelligence task force to explore related technologies.

Strengthen equipment research and development and deployment. In recent years, many countries have attached great importance to the research and development of intelligent weapons and equipment, and unmanned aerial vehicles, unmanned combat vehicles, unmanned ships, unmanned submarines, etc. have continued to emerge. At present, the US Air Force has begun to practice the combat concept of “man-machine collaboration, man in the loop” on the F-35 fighter. The US XQ-58A “Valkyrie” stealth drone previously mainly carried out man-machine collaborative operations with F-35 and F-22 fighters. In April 2021, the stealth drone successfully launched the ALTIUS-600 small drone system, further enhancing its manned and unmanned collaborative combat capabilities. Russia is focusing on reconnaissance and surveillance, command and decision-making, firepower strikes, combat support and other fields, and is developing and deploying intelligent equipment. It plans to increase the proportion of unmanned combat systems in weapons and equipment to more than 30% by 2025. Russia’s ground unmanned combat weapons, represented by the “Uranus” series and “Platform-M” and “Argo” models, are developing rapidly. Among them, the Nerekhta unmanned combat vehicle can be equipped with remote-controlled machine guns and rocket launchers. In addition to the combat capabilities of ordinary armored vehicles, it also has transportation and reconnaissance functions. In addition, the Japanese Self-Defense Forces plan to officially deploy an unmanned aerial formation with strong combat capabilities in 2035.

(Author’s unit: National University of Defense Technology)

國語中文:

■賈珍珍 丁 寧 陳方舟

隨著人工智慧在軍事領域的廣泛應用,智慧化戰爭逐漸成為備受矚目的焦點話題。歷史多次證明,戰爭形態的演進將引發致勝機理的深刻改變。在資訊化戰爭向縱深發展、智慧化戰爭初露端倪的當今時代,世界主要國家軍隊紛紛下大力推動軍事智慧化,其中的諸多動向值得關注。

加強頂層設計

勾勒智能化戰爭“路線圖”

在新一輪科技革命與產業革命推動下,智慧化軍事變革正向縱深發展。美國、俄羅斯、日本等國紛紛把人工智慧視為「改變戰爭遊戲規則」的顛覆性技術,並事先佈局,加強頂層設計和規劃引領,探索人工智慧的軍事應用方向。

美軍在《為人工智慧的未來做好準備》《國家人工智慧研究與發展戰略計畫》《人工智慧與國家安全》《2017至2042財年無人係統綜合路線圖》《美國人工智慧計畫》:在首個年度報告》等文件中,詳述了人工智慧的發展現狀和發展規劃,並將人工智慧發展提升至國家戰略層面。 2021年,美軍在發布的《美國防部人工智慧態勢:評估與改進建議》中指出,美軍發展人工智慧應考慮三個指導性問題:與美軍相關的人工智慧現處於何種狀態;美軍目前在人工智慧方面的態勢如何;哪些內部行動以及潛在的立法或監管行動可能會增強美軍的人工智慧優勢。

俄羅斯投入大量資源,以維持與美國在人工智慧軍事領域應用競爭的平衡。 2021年,俄總統普丁在年度首場國防部會議上表示,人工智慧將大幅推動軍事領域變革,俄國聯邦武裝力量要加速機器人、智慧單兵系統和武器智慧化模組等人工智慧應用技術的研發工作,早日形成核心技術能力和戰場競爭優勢。 《2025年前未來軍用機器人技術裝備研發專題綱要》《未來俄軍用機器人應用構想》《人工智慧在軍事領域的發展現狀以及應用前景》等文件,從國家層面為俄軍推動人工智慧軍事應用確立了一系列機制。

日本政府也推出了《人工智慧戰略》,旨在引領人工智慧技術研發和產業發展。在英國制定的《機器人與人工智慧》戰略規劃中,強調了人工智慧在戰場建設中的應用。 2021年1月,澳洲國防部發布《打好人工智慧戰爭:未來智慧化戰爭之作戰構想》,這份文件探討如何將人工智慧應用到陸、海、空作戰領域。

創新作戰概念

推動智慧化戰爭“思想先行”

作戰概念創新對軍事科技發展、戰爭形態演變具有思想牽引作用。過去人們對戰爭的認識與掌握,主要源自於對實踐經驗的歸納總結,作戰概念即經驗概念。未來智慧化戰爭時代,作戰概念不僅是經驗概念,更是對作戰的構想、設計與前瞻。

美陸軍提出「多域戰」概念,要求陸、海、空、天、電磁、網路等各域作戰能力深度整合與密切協同。為此,美陸軍先後發布《多域戰:21世紀合成兵種的發展(2025至2040)》《美國陸軍多域戰(2028)》《運用機器人與自主技術支援多域戰》等白皮書。 2021年3月,美陸軍部發布文件《陸軍多域轉型:準備在競爭和衝突中取勝》,顯示「多域戰」已成為引領美陸軍轉型發展的一面「旗幟」。美國防高級研究計畫局提出「馬賽克戰」概念,旨在打造一種由不同作戰功能單元構成的、以先進電腦技術與網路技術為基礎的、高度分散、具有高度適應性的「殺傷網」。美國防部大力支持「聯合全域作戰」概念。 2020年3月,美空軍率先將「聯合全域作戰」寫入條令,探討空軍如何在「聯合全域作戰」中發揮作用。

俄軍提出「指控瓦解」概念。 「瓦解」是當前俄羅斯最重要的作戰概念之一,俄軍電子戰部隊把使敵人的訊息、指控、電子戰和機器人系統失效作為目標,認為這一目標將「決定所有軍事行動的命運」。擾亂敵方部隊和武器系統的指揮和控制,降低敵方偵察和使用武器的效率,是進行電子戰的首要任務。目前,俄軍正在考慮組建12種類型的電子戰部隊。俄軍也提出「非核武遏制體系」概念,核心是使用非核武進攻性戰略武器來遏制對手。其所定義的非核武攻擊性戰略武器既包括所有裝備非核彈頭的彈道飛彈,也包括戰略轟炸機和遠程空基、海基巡航飛彈。此外,俄軍也提出「混合戰爭」概念,希望利用人工智慧系統謀求戰場資訊優勢。

英國防部提出「多域融合」概念,將發展具備智慧化能力的新型指控系統,以實現全面、持久、準確、快速的戰場感知與力量協同。

注重技術研發

塑造智慧化戰爭作戰模式

人工智慧發揮效用的關鍵是與其他多種技術的組合,這種組合也被描述為「人工智慧堆疊」。各種技術透過互動的方式產生組合效應,進而提升每項技術所產生的能力與效果。在人工智慧技術支援的智慧化戰爭中,「人機一體、雲腦控制」的協同作戰模式,「混搭編組、群體智慧」的集群作戰模式,「智慧主導、攻智為上」的認知作戰模式等,將不斷更新人們對戰爭的認知。

聚焦創新專案研發。美軍正在大力推廣人工智慧晶片在現有武器裝備系統中的應用,為武器加上“智慧大腦”,使之具備類人思考和自主互動能力。 2021年10月,美海軍推出被視為“當前最高優先事項”的“超越計劃”,旨在通過構建海上作戰軍事物聯網,整合有人無人聯合編隊,加速交付人工智能和機器學習工具,支撐全新的智慧化海軍架構,提升大規模火力殺傷、實現海軍智慧化分散式作戰。此外,美國防高級研究計畫局也進行了「自適應電子戰行為學習」「自適應雷達對抗」「極端射頻頻譜條件下的通訊」等認知電子戰項目,研發出認知雷達電子戰系統原型機。俄國防部智慧技術裝備科研試驗中心與俄聯邦科學院控制問題研究所合作,開發測試了包括無人機群指揮控制在內的自主智慧演算法,也與國家航空系統科研所共同開發基於神經網路原理的物體自動辨識軟體系統。

組成創新研發機構。新技術的不斷湧現是軍事智慧化蓬勃發展的不竭動力,高水準的軍事智慧化建設離不開專職機構的技術研發。一些國家和軍隊組成研發中心,注重從技術層面創新發展。美國國防部建立了聯合人工智慧中心,計劃將該中心打造成國家級重點實驗室,用於領導數百個與人工智慧相關的項目,確保對人工智慧相關數據資訊的高效利用,以保持美國在該領域的技術優勢。俄羅斯組成了人工智慧和大數據聯盟、國家人工智慧中心和隸屬國防部的機器人技術科研試驗中心,主要進行人工智慧和資訊科技領域的理論和應用研究。法國成立了創新國防實驗室,英國設立了人工智慧實驗室,印度組成了人工智慧特別工作小組,進行相關技術探索。

加強裝備研發列裝。近年來,多國重視研發智慧武器裝備,無人飛行器、無人戰車、無人艦艇、無人潛航器等不斷湧現。目前,美空軍已開始在F-35戰機上實踐「人機協同,人在迴路」的作戰理念。美XQ-58A「女武神」隱身無人機先前主要與F-35和F-22戰機進行人機協同作戰,2021年4月該隱身無人機成功投放ALTIUS-600小型無人機系統,進一步提升了其有人無人協同作戰能力。俄羅斯正聚焦偵察監視、指揮決策、火力打擊、作戰支援等多個領域,展開智慧裝備研發和列裝工作,計畫到2025年將無人作戰系統在武器裝備中的比例提高到30%以上。以“天王星”系列和“平台-M”“阿爾戈”等型號為代表的俄地面無人作戰武器發展迅速。其中,Nerekhta無人戰車可搭載遙控機槍和火箭發射器,除擁有一般裝甲車的戰鬥力外,還兼具運輸和偵察功能。此外,日本自衛隊計劃在2035年正式部署具有較強作戰能力的無人空中編隊。

(作者單位:國防科技大學)

中國軍事資料來源:https://www.81.cn/jfjbmap/content/2022-03/17/content_311555.htm

How Chinese Military Will Achieve Precise Strikes in Cognitive Domain Operations

中國軍隊如何在認知域作戰中實現精準打擊

現代英語:

How to achieve precise strikes in cognitive domain operations

■Bu Jiang, Jiang Rilie

Introduction

Currently, driven by intelligent technology, cognitive domain operations are showing new characteristics such as precise perception, precise prediction and precise calculation. Studying and grasping the connotation mechanism of precision strikes in cognitive domain operations to ensure clear operational targets, personalized information generation, and precise information delivery will be more conducive to seizing the commanding heights and initiative in future cognitive domain operations.

Accurately establish combat goals

The establishment of operational goals is often the primary concern in cognitive domain operations. With the continuous application of artificial intelligence, big data and other technologies, the side with a technological advantage can quickly and efficiently collect cognitive data of different dimensions, levels and modalities, thereby discovering the weak points, sensitive points and flash points of the opponent’s cognitive system.

Massive “data sources” refine target clarity. Today, as the Internet becomes ever more widespread, cognitive data is growing exponentially. With the support of big data, psychometrics and other technologies, target portraits are rapidly evolving toward precise, intelligent portraits. According to foreign statistics, as of July 2022 the global Internet penetration rate had reached 69%, and the Internet has become an essential platform for users’ daily lives. With the help of the Internet, both sides in a conflict can widely and quickly collect cognitive data on target objects and build cognitive situation awareness, providing support for analyzing a target object’s political beliefs, values, national sentiments, public opinion positions and so on. It is reported that in foreign elections in recent years, foreign data analysis companies scraped social media user data, established character analysis models, precisely profiled voters’ personalities and cognitive characteristics, and on this basis pushed suggestive campaign advertising to swing voters, thereby influencing their electoral decisions.

Dynamic “tag pool” improves the target recognition rate. Labeling usually refers to the abstract classification and generalization of certain characteristics of a specific group or object. In cognitive domain operations, labeling is an important process for classifying and visualizing cognitive data. In the face of massive user data, establishing a mature and reliable label system is a prerequisite for sorting out, analyzing and making good use of cognitive data; using the label system to filter out useless data and mine potentially valuable information can provide a direct frame of reference for presetting cognitive-domain combat scenarios. The development of the labeling system should start from the logic of cognitive domain operations and ultimately serve their application. For the target object, shifts of interest, changes in personality and changes in emotion are real-time and dynamic; establishing a “tag pool” makes it possible to sense the target object’s cognitive dynamics in real time and precisely improve the target recognition rate.
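To make the “tag pool” idea concrete, here is a minimal illustrative sketch (not from the source article) of how a dynamic label system could index profile records by tag and filter out those matching a preset scenario frame; the field names and tags are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """A hypothetical cognitive-data record for one target object."""
    uid: str
    tags: set = field(default_factory=set)   # e.g. {"swing_voter", "economic_anxiety"}

class TagPool:
    """Dynamic tag pool: index profiles by tag and answer scenario queries."""
    def __init__(self):
        self.index = {}       # tag -> set of uids
        self.profiles = {}

    def update(self, profile: Profile):
        # Re-index the profile so interest shifts and emotional changes stay current.
        self.profiles[profile.uid] = profile
        for tag in profile.tags:
            self.index.setdefault(tag, set()).add(profile.uid)

    def match(self, required_tags: set) -> list:
        # Return uids whose tag sets contain all tags of a preset scenario frame.
        candidates = None
        for tag in required_tags:
            uids = self.index.get(tag, set())
            candidates = uids if candidates is None else candidates & uids
        return sorted(candidates or [])

# Usage: filter out useless records and keep only those matching a scenario.
pool = TagPool()
pool.update(Profile("u1", {"swing_voter", "economic_anxiety"}))
pool.update(Profile("u2", {"partisan"}))
print(pool.match({"swing_voter", "economic_anxiety"}))   # ['u1']
```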

Intelligent “algorithm library” reveals target relevance. If data is the “fuel” of cognitive domain operations, algorithms are the “engine” and an important source of power for cognitive precision strikes. In a certain sense, cognitive domain operations are a “confrontation of data and algorithms.” Through intelligent algorithms, the multi-dimensional data correlated with a target object’s behavior can be deeply mined to build an accurate target portrait; machine learning algorithms can then be used to build a prediction model that automatically matches and associates cognitive information with the target object, delivering it at the right time, in the right place and in an appropriate manner so as to change the target object’s cognition. As some foreign research institutions have found, with 10 likes an algorithm can know you better than your colleagues; with 150 likes, better than your parents; and with 300 likes, better than your closest partner.
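As a purely illustrative sketch of the matching step described above (not drawn from the article), the following toy code ranks candidate message framings against a target’s estimated trait vector using cosine similarity; all vectors and framing names are made-up assumptions.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two trait/appeal vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical estimated trait profile of one target, ordered as
# [openness, conscientiousness, extroversion, agreeableness, neuroticism].
target_profile = np.array([0.8, 0.3, 0.2, 0.5, 0.7])

# Candidate messages annotated with the trait appeal they are designed around.
messages = {
    "novelty_framing":   np.array([0.9, 0.1, 0.3, 0.4, 0.2]),
    "security_framing":  np.array([0.1, 0.4, 0.1, 0.3, 0.9]),
    "community_framing": np.array([0.3, 0.3, 0.6, 0.9, 0.2]),
}

# Rank messages by how closely their appeal matches the predicted profile.
ranked = sorted(messages, key=lambda m: cosine(target_profile, messages[m]), reverse=True)
print(ranked)  # best-matching framing first
```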

Accurately generate information “ammunition”

Designing information “ammunition” that conforms to the target’s thinking habits and perception style is the key to improving the cognitive domain killing effect. The development and application of intelligent science and technology provides a convenient means to achieve “private customization” of cognitive information themes, content and forms, making it possible to instantly and forcibly change the decisions and actions of target objects.

Information theme planning based on the target’s value orientation. The cognitive information theme is the central idea the information conveys and the core of its content. From legal persuasion, military deterrence, sowing discord and emotional appeals, to moral guidance, war mobilization, behavioral instruction and motivational incentives, different information themes exert different kinds of influence. Practice shows that the theme of cognitive information must be planned closely around the target object: according to the different value orientations shown at different combat stages and by different target objects, the information theme must be optimized in a timely manner so that the information “ammunition” satisfies the target object’s needs to the greatest extent. According to the analysis of foreign research institutions, foreign election campaign advertisements in recent years have often relied on the support of big data; precisely designing different advertising themes for voters with different values can resonate with those values.

Information content design based on the target’s mindset. In the Internet era, the target object’s life trajectory, geographical location, hobbies, social relationships and so on are all recorded online, making it possible to precisely create an “information cocoon” that caters to the target object’s way of thinking. Driven by big data technology, the target object’s interaction trajectories in the virtual world can also be easily captured, perceived and calculated. With the assistance of intelligent multimedia content generation systems, information similar to the target’s thinking habits can be generated in batches, trapping the target in an “information cocoon”: the scope of information it accepts narrows, its perception of the outside world gradually declines, and it then falls into a cognitive trap. In recent years, many “color revolutions” around the world have been inseparable from the support of cognitive control. Some Western countries use “deepfake” technology to instill in target objects false information that conforms to their way of thinking, creating an anti-intellectual, stupefying information environment, forming cognitive biases and inducing them to reject their own national and cultural values, thereby generating anti-government sentiment.

Information form selection based on the target’s perceptual characteristics. Psychology holds that the formation and change of a cognitive subject’s attitude must pass through the three processes of “attention-understanding-acceptance”; for disseminated information to affect a target object, attracting attention is the first step. Information form is an important carrier for attracting the target audience’s attention, and its design is crucial to improving the acceptance, dissemination and infectiousness of the information “ammunition”. Big data technology can be used to mine the target object’s national sentiments, customs and habits, religious beliefs, personal preferences and other characteristics, and to judge scientifically such perceptual characteristics as information-receiving habits. On this basis, text, speech, video, images and other information carriers can be used in combination, with elements such as color and layout woven in, to strongly stimulate the target object’s senses. Since 2011, some Syrian anti-war activists have produced a number of anti-war propaganda short films from the perspectives of children and women and spread them on the international Internet, arousing strong responses from international public opinion. This internationally accepted information carrier meets the aesthetic needs of the public, avoids divergent interpretations by the audience, and can often achieve unexpected results.

Accurately deliver information

Cognitive information delivery follows the laws of information dissemination. In order to achieve the effect of precise cognitive attack, it is necessary to deal with issues such as delivery objects, delivery channels, and delivery timing.

Extract cognitive features and filter information delivery targets. Profiling technology supported by big data makes it possible to extract the cognitive characteristics of target objects. Through a cognitive characteristics library, objects with similar characteristics can be screened out of groups of different races, parties, occupations and so on, upgrading traditional coarse screening methods so that the information “ammunition” is more closely matched to the target object, thereby improving the pertinence and accuracy of cognitive attacks. In recent years, Cambridge Analytica used machine learning methods to classify Facebook users according to five personality traits: openness to experience, conscientiousness, extroversion, agreeableness and emotional instability, and built a linear regression model of the five traits to set up a “target” for the precise delivery of campaign advertisements. The implications of this are many. Future cognitive domain operations, based on the broad collection of users’ cognitive characteristics, will place more emphasis on precisely dividing groups and on carrying out targeted information delivery and behavior prediction according to the differences in values and behavioral habits of different groups.
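The kind of linear-regression profiling attributed here to Cambridge Analytica can be illustrated with a hedged, synthetic-data sketch: a multi-output linear regression from binary “like” features to the five personality scores, built with scikit-learn on random stand-in data (no real data, features or coefficients are implied).

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows are users, columns are binary "like" indicators.
n_users, n_likes = 200, 50
X = rng.integers(0, 2, size=(n_users, n_likes)).astype(float)

# Synthetic Big Five scores (openness, conscientiousness, extroversion,
# agreeableness, neuroticism) loosely generated from the like matrix.
true_w = rng.normal(size=(n_likes, 5))
Y = X @ true_w + rng.normal(scale=0.5, size=(n_users, 5))

# One multi-output linear regression: likes -> five trait scores.
model = LinearRegression().fit(X, Y)

# The predicted trait vector for a new user becomes the "target" used to
# decide which kind of advertisement to deliver.
new_user = rng.integers(0, 2, size=(1, n_likes)).astype(float)
print(model.predict(new_user).round(2))
```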

Follow social habits and match information delivery channels. The deep penetration of the Internet has brought tremendous changes in the way information is disseminated, and the ways in which people receive information are becoming increasingly diverse. According to data from foreign research institutions, there are currently more than 4.62 billion social media users worldwide, and social media platforms have become the main battlefield of the cognitive domain. In the many “color revolutions” of recent years, social media such as Facebook, Twitter and YouTube, manipulated by Western countries, played an important role in spreading public opinion, organizing protests and mobilizing the public. It is reported that in such operations Facebook is often used to set the schedule, Twitter to coordinate actions, and YouTube to spread the word widely. Future cognitive domain operations will place great emphasis on the target’s social habits and characteristics, fully grasping the target’s social circle and life circle and selecting information delivery methods from multiple channels, online and offline, military and civilian, so as to ensure the delivery rate of cognitive information.

Track cognitive dynamics and deliver information at precisely the right time. Changes in cognition do not happen overnight; blindly pursuing a high tempo and instant results will have the opposite effect. Cognitive domain operations must therefore grasp the rhythm and intensity of “time immersion”, select the right delivery time based on the target object’s cognitive dynamics, and gradually seek to expand the advantage in effect. Before the target object has formed a preliminary understanding of an event, it is necessary to actively seize priority in information release and carry out information “bombing” at the first opportunity, striving to strike first and shape first impressions. In addition, during the stage when public opinion on an incident is still fermenting, the subject’s cognition has not yet completely solidified; at this point, repeatedly disseminating a specific piece of information can also subtly reconstruct the subject’s cognition.

(Author’s unit: National University of Defense Technology)

國語中文:

如何實現認知域作戰精準打擊

■卜江 蔣日烈

引言

目前,在智慧化技術的推動下,認知域作戰正呈現出精確感知、精確預測和精確計算等全新特徵。研究掌握好認知域作戰精準打擊的內涵機理,從而確保作戰目標清晰化、資訊生成個性化、資訊投射精準化,將更有利於奪取未來認知域作戰制高點和主動權。

精準確立戰目標

作戰目標的確立往往是認知域作戰關注的首要問題。隨著人工智慧、大數據等技術的不斷應用,佔據技術優勢的一方往往能夠快速、有效率地採集不同維度、不同層級、不同模態的認知數據,進而發現對手認知體系的薄弱點、敏感點和爆燃點。

海量「資料來源」細化目標清晰度。在網路深入普及的今天,認知數據正呈指數級增長,目標畫像在大數據、心理測量等技術的支撐下,正逐漸朝著精準畫像、智慧畫像的方向快速演進。根據國外統計數據顯示,截至2022年7月,全球互聯網滲透率達69%,互聯網已成為用戶日常生活的必備平台。借助互聯網,作戰雙方能夠廣泛快速地實現目標對象認知數據收集和認知態勢感知,為分析目標對象的政治信念、價值觀念、民族情感、輿論立場等提供支撐。據悉,在近年的國外大選中,國外數據分析公司就曾透過抓取社群媒體用戶數據,建立人物分析模型,精準刻畫選民性格、認知特徵,在此基礎上對搖擺選民推送暗示性競選廣告,從而影響其選舉決策。

動態「標簽池」提升目標辨識率。貼標簽通常是指對某一類特定群體或物件的某項特徵進行的抽象分類和概括。在認知域作戰中,貼標簽是實現認知資料分類與可視化的重要過程。面對海量的用戶數據,建立一套成熟可靠的標簽體係是梳理分析、用活用好認知數據的前提,利用標簽體系過濾無用數據,挖掘潛在價值信息,能夠為認知域作戰場景預設提供直接參考框架。標簽體系的開發要基於認知域作戰這個邏輯起點,最終歸結於認知域作戰應用。對於目標對象來講,興趣的遷移、性格的改變、情感的變化是即時動態的,建立「標簽池」能夠即時感知目標對象的認知動態,精準提升目標辨識率。

智慧“演算法庫”顯現目標關聯性。如果將數據比作認知域作戰的“燃料”,演算法則是“引擎”,是認知精準打擊的重要動力源。從一定意義上講,認知域作戰是「數據或演算法的對抗」。透過智慧演算法,可以深度挖掘目標對象行為的多維關聯數據,構建精準目標畫像,再結合機器學習演算法構建預測模型,將認知資訊與目標對象進行自動匹配關聯,在合適的時間、合適的地點,以合適的方式投送認知訊息,從而改變目標對象認知。正如國外一些研究機構分析發現,透過10個點贊,演算法可以比同事更了解你;150個點贊,演算法將比你父母更了解你;300個點贊,演算法將比最親密的伴侶更了解你。

精準生成資訊“彈藥”

設計符合目標對象思維習慣和感知風格的訊息“彈藥”,是提升認知域殺傷效果的關鍵所在。智慧科學技術的發展運用,為實現認知資訊主題、內容和形式的「私人客製化」提供了便捷手段,即時、強制地改變目標對象決策和行動成為可能。

基於目標價值取向的資訊主題策劃。認知資訊主題是資訊所表現的中心思想,是資訊內容的核心。從法理勸告、軍事威懾、矛盾離間、情感召喚,到義理引導、戰爭動員、行為指示、動機激勵,不同的訊息主題發揮不同的影響作用。實踐表明,認知訊息的主題必須緊緊圍繞目標對像做策劃,針對不同作戰階段、不同目標對象所表現出來的不同價值取向,及時優化信息主題,才能使信息“彈藥”最大限度地滿足目標對象的需求。根據國外研究機構分析,近年來的國外大選競選廣告背後往往離不開大數據的支撐,針對不同價值觀的選民精準設計不同廣告主題,可以引起選民價值共鳴。

基於目標思維方式的資訊內容設計。在互聯網時代,目標對象的生活軌跡、地理位置、興趣愛好、社交關係等都被網絡所記錄,精準打造迎合目標對象思維方式的「資訊繭房」成為可能。在大數據技術驅動下,目標對像在虛擬世界中的互動軌跡也很容易被捕捉、被感知和被計算。在多媒體內容智慧生成系統等輔助下,可大量產生與目標對象思維習慣類似的訊息,致使其陷於「訊息繭房」之中,訊息接受範圍變窄,對外界的感知度逐漸降低,進而陷入認知陷阱。近年來,全球發生的多起「顏色革命」背後都離不開認知控制的支撐,一些西方國家利用「深度偽造」技術,向目標對象灌輸符合其思維方式的虛假信息,製造反智化、愚民化資訊環境,形成認知偏差,誘導其否定自身民族文化價值理念,進而產生反政府情緒。

基於目標感知特性的資訊形式選擇。心理學認為,認知主體的態度的形成與改變需經過「注意力-了解-接受」三個過程,目標對象能否受到訊息傳播的影響,吸引註意是第一步。資訊形式作為引起目標對象注意的重要載體,其形式設計對提高訊息「彈藥」的接受度、傳播力、感染性至關重要。透過大數據技術可以挖掘目標對象民族情感、風俗習慣、宗教信仰、個人喜好等特徵,科學判斷訊息接受習慣等感知特性,在此基礎上綜合運用文本、語言、視頻、圖像等資訊載體,加以融入色彩、佈局等元素,可以給目標感官造成強烈刺激。自2011年以來,一些敘利亞反戰人士以兒童、婦女等視角,製作出多部反戰宣傳短片在國際互聯網上傳播,引起國際社會輿論強烈反響。這種國際通用的資訊載體,符合大眾審美需求,避免了受眾差異性解讀,往往能達到意想不到的效果。

精準實現資訊投送

認知訊息投送遵循資訊傳播規律,要達到認知精準打擊效果,需要處理好投送對象、投送管道、投送時機等問題。

提取認知特徵,篩選訊息傳遞對象。大數據支撐的畫像技術使提取目標對象認知特徵成為可能,透過認知特徵庫,可以從不同種族、不同黨派、不同職業等群體中篩選出具有相似特徵的對象,從而升級傳統的粗放篩選方式,讓資訊「彈藥」與目標對象更加匹配,從而提高認知攻擊的針對性和精準性。近年來,劍橋分析公司曾使用機器學習方法,依照經驗開放型、盡責型、外向型、親和型、情緒不穩定型五類人格對臉書用戶進行分類,建立了五種人格特質的線性回歸模型,為精準投送競選廣告立起「標靶」。此舉對世人的啟示是多方面的,未來認知域作戰,在廣泛蒐集用戶認知特徵的基礎上,將更加強調精準劃分群體,依據不同群體的價值觀念和行動習慣的差異,進行有目的地信息投送和行為預測。

遵循社交習慣,匹配資訊投送管道。互聯網的深度普及使資訊的傳播方式正發生巨大變革,人們接受資訊的方式更加多樣化、多元化。根據國外調研機構數據顯示,目前全球社群媒體用戶超過46.2億,社群媒體平台成為認知域作戰主戰場。在近年來發生的多起「顏色革命」中,臉書、推特、優兔等社群媒體在西方國家操縱下,在輿論傳播、組織抗議、動員民眾等方面發揮了重要作用。據悉,在類似行動中臉書往往用來確定日程,推特用來協調行動,優兔用來廣泛傳播。未來的認知域作戰,十分強調著眼目標對象社交習慣和特點,充分掌握目標對象的社交圈、生活圈,從線上線下、軍用民用等多渠道選擇信息投送方式,從而確保認知信息的送達率。

追蹤認知動態,把準資訊投送時機。認知的改變,並非一蹴而就,一味地追求高節奏、瞬間達成目的反而會起到反面效果。因此,認知域作戰要掌握好「時間沉浸」的節奏與力度,根據目標對象認知動態選準投送時間,逐步漸進地謀求擴大效果優勢。在目標對像對某一事件還未形成初步認知前,要積極搶佔信息的發布優先權,第一時間進行信息“轟炸”,力求“先發製人、先入為主”。此外,在事件的輿論發酵階段,主體的認知還未徹底固化,此時透過不斷重復傳播某個特定訊息,也可以達到潛移默化地重構主體認知的目的。

(作者單位:國防科技大學)

中國軍事資料來源:https://www.81.cn/yw_208727/16209631.html

How the Chinese Military Identifies Key Targets for Cognitive Domain Operations

中國軍隊如何辨識認知域作戰的關鍵目標

現代英語:

Cognitive domain combat targets refer to the specific points on which cognitive domain operations act. In cognitive domain operations, in contrast to the combat object as a whole, the combat target solves the problem of precise aiming: it lets commanders understand and grasp the precise coordinates of what to hit, where to hit it, and to what extent. Only by deeply understanding the connotation and characteristics of cognitive domain combat targets can we see past appearances to accurately find the key targets and thus seize the initiative in future operations.

Cognitive focus that influences behavioral choices

The cognitive focus is the “convergence point” of the cognitive subject’s multi-dimensional thinking in war activities; as a dynamic factor, it affects the cognitive process and behavioral outcomes. Generally speaking, the cognitive factors that affect individual behavioral choices in war activities mainly include cognition of political attributes, of related interests, of group belonging, of risk and loss, of emotional orientation, and of the morality of the war. For war activities, and for the groups or individuals who pay attention to them, the cognitive focus that shapes attitudes, tendencies and behaviors is not the same. Judging from the world’s local wars and regional conflicts of recent years, there are obvious differences in the cognitive focus of different groups or individuals. Politicians pay more attention to political attributes and related interests; those who might intervene in the war pay more attention to risk and loss and to related interests; ordinary people pay more attention to related interests and emotional orientation; and people in countries outside the region, whose own interests will not suffer directly, generally pay more attention to the morality of the war and to group belonging. In combat practice, foreign militaries are adept at targeting the cognitive focus of different objects, precisely planning topics and pushing related information to induce specific behavioral choices. For example, before the Gulf War, the public relations firm Hill & Knowlton fabricated the non-existent “incubator incident”, using Nayirah, the daughter of the Kuwaiti ambassador to the United States, as a “witness” to display the “inhumanity” of the Iraqi army, playing on the American public’s ethical and moral cognition and thereby building support for the US government’s decision to send troops to the Gulf War.

Style preferences that constrain command decisions

Cognitive style directly affects decision-making preferences. Cognitive style refers to an individual’s typical way of perceiving, remembering, thinking and solving problems. By command decision-making style preference, commanders can be divided into those with a calm cognitive style and those with an impulsive cognitive style. Commanders with a calm cognitive style emphasize accuracy over speed in the decision-making process; the decisions they make are of high quality, but they are prone to getting bogged down in comparing and analyzing various intelligence sources and to over-emphasizing the accuracy and objectivity of information analysis. In battlefield cognitive offense and defense, such commanders are easily disturbed by varied and voluminous information stimuli; their mental energy is easily disrupted and dissipated, which may cause them to miss opportunities. Commanders with an impulsive cognitive style emphasize speed over accuracy; they react quickly but their decisions are of lower quality, and they are easily agitated and prone to conflict with team members. Commanders with an impulsive cognitive style are also prone to over-interpreting an ambiguous external security environment and to constantly looking for “evidence” that reinforces and validates their own erroneous thinking, narrowing their attention and leading to deviations in command decisions. In combat practice, foreign militaries pay close attention to analyzing the decision-making style of the opposing commander and then select specific information to influence him psychologically. For example, during the U.S. invasion of Panama, when besieging the hiding place of Panamanian leader Noriega, the U.S. military repeatedly played rock and heavy metal music and used language intended to provoke and humiliate Noriega in order to carry out cognitive and psychological attacks on him, causing him gradually to break down physically and mentally.

Backdoor channel to control thinking and cognition

Once a computer is infected with a “Trojan” virus, it will send a connection request to the hacker’s control terminal at a specific time; once the connection succeeds, a backdoor channel is formed, allowing the hacker to control the computer at will. Similarly, the human brain also has cognitive “backdoors” and can likewise be controlled by others. Cognitive psychologists have found that by sending information through the target object’s audio-visual perception channels, carefully pushing content the target recognizes and accepts, catering to the target’s existing memories and experience, conforming to the target’s habits of thought, and stimulating the target’s emotional pain points, it is possible to control and interfere with the target’s cognition and prompt instinctive emotional and behavioral reactions. Supported by cutting-edge cognitive science and technology, and exploiting the brain’s two modes of information processing, automatic activation and controlled processing, a target can easily be driven into a “cognitive cocoon”. In cognitive domain operations, individuals are immersed in vast amounts of artificially constructed information and continuously supplied with “evidence” that their judgments and cognition are “correct.” Over time, the individual’s cognitive horizon becomes smaller and smaller and the ability to perceive the external environment gradually declines, until eventually they cannot see the truth of the matter and are trapped in the “cognitive cocoon”, unable to extricate themselves. When foreign militaries conduct cognitive domain operations, they often target an opponent’s cognitive bias on a particular issue and continuously push, through multiple channels, situational and intelligence information that supports the opponent’s supposedly “correct cognition”, causing errors and deviations in the opponent’s command decisions.

Sensory stimuli that induce attention

Effective perceptual stimulation is the first prerequisite for attracting the target object’s attention. The human brain perceives and reacts to stimuli within its perceptual range. Experimental research in cognitive psychology has found that information involving movement, danger, personal relevance, survival and safety, and sharp before-and-after contrast is more likely to attract the brain’s attention. In the intelligent era, the target object’s psychological-cognitive process often follows the pattern of “attracting attention, cultivating interest, actively searching, strengthening memory, actively sharing, and influencing others”. In combat, foreign militaries often use exclusive revelations, intelligence leaks, authoritative disclosures, live link-ups and similar methods, and cleverly employ exaggeration, comparison, association, metaphor, suspense and contrast, to push information that subverts common sense, provokes cognitive conflict or presents stark contrasts, so as to attract the target object’s attention. For example, the “rescue of Private Jessica Lynch” staged by the US military in the Iraq War and the “Gaddafi golden toilet” of the Libyan War mostly chose stories familiar to the audience as their template, hiding the purpose and embedding the viewpoint in the plot, and thereby attracted broad public attention. In addition, the human brain also processes stimuli outside the perceptual range. In recent years, the militaries of Western countries have attached great importance to research on subthreshold information stimulation, developing subthreshold visual information implantation, subthreshold auditory information implantation, subthreshold information activation, and subconscious sound manipulation of the nervous system, continuously expanding the application of neurocognitive science and technology in the military field.

Meta-value concepts that give rise to cognitive resonance

In cognitive theory, cognitive resonance refers to information that can cross the cognitive gap between the two parties and trigger the ideological and psychological resonance and cognitive empathy of both parties, thereby achieving the deconstruction and reconstruction of the other party’s cognitive system. In cognitive domain warfare, this cognitive energy-gathering effect is not a simple concentration of power, but an internal accumulation of system synergy. Under the diffusion and dissemination of modern information media, this cognitive resonance effect can spread rapidly to all parts of the world in a short period of time, and produce secondary indirect psychological effects or more levels of derivative psychological effects, presenting a state of cumulative iteration. Once it exceeds the psychological critical point, it will present a state of psychological energy explosion, thereby changing the direction or outcome of the event. The targets that can induce this cognitive resonance are mainly value beliefs, moral ethics, common interests, etc. In war, if one party touches or violates human meta-values, common emotional orientation, etc., it is very easy to induce collective condemnation, bear the accusation of violating human morality, and fall into a moral trough. For example, a photo during the Vietnam War shows a group of Vietnamese children, especially a 9-year-old girl, running naked on the road because of burns after being attacked by US napalm bombs. In 1972, this photo caused a huge sensation after it was published, setting off an anti-war wave in the United States and even the world, and accelerating the end of the Vietnam War.

Cognitive gaps in a split cognitive system

In daily life, seemingly hard steel can easily fracture from material brittleness under the influence of factors such as low temperatures, material defects and stress concentration. The same is true of the cognitive system. Cognitive gaps refer to the cracks, pain points, weaknesses and sensitive points in the target object’s cognition and thinking, mainly manifested as the individual’s worry that he is unable to cope with or adapt to the environment; under the influence of anxiety, this forms cognitive vulnerability. The experience of security threats, loose group structure, confusion about beliefs and ideals, and the silence of authoritative media can all produce cognitive conflict and tearing in the target object. In cognitive domain operations, a seemingly powerful opponent may conceal a large number of cracks in thinking and psychological weaknesses; often a single news event can shake the opponent’s cognitive framework and puncture its cognitive bubble. In addition, this kind of cognitive-psychological conflict can also cause moral injury and psychological trauma to individuals. In recent years, the troops of the U.S. and other Western countries carrying out overseas missions have faced “enemies disguised as civilians” who can appear anytime and anywhere; their uncertainty about the battlefield environment has kept increasing, they generally lack a sense of the meaning of their operations, and they are filled with guilt and a sense of sin. Large numbers of soldiers have developed post-traumatic stress disorder, self-harm on the battlefield and post-war suicides and crimes have risen sharply, and the number of suicides among veterans of these wars has even exceeded the number of battlefield deaths.

(Author’s unit: Political Science Academy of National Defense University)

國語中文:

引言

認知域作戰標靶是指認知域作戰的具體作用指向。在認知域作戰中,相較於作戰對象,作戰標靶解決的問題是精確瞄準,也就是讓指揮官了解掌握具體打什麼、往哪裡打、打到什麼程度的精準座標問題。只有深刻理解認知域作戰標靶的內涵特點,才能透過表象準確找到關鍵標靶,以便在未來作戰中掌握先機。

影響行為選擇的認知重心

認知重心是戰爭活動中認知主體多元思維認知的“匯聚點”,作為一種能動因素影響認知進程和行為結果。一般而言,影響戰爭活動中個人行為選擇的認知因素,主要包含政治屬性認知、利益關聯認知、群體歸屬認知、風險損失認知、情緒定向認知、戰爭道德認知等。對於戰爭活動以及關注戰爭活動的群體或個體而言,影響其態度、傾向和行為的認知重心並不相同。從近年來的世界局部戰爭和地區衝突來看,不同群體或個體關注的認知重心有著明顯差異,政治人物更加關注政治屬性認知和利益關聯認知,戰爭可能介入者更關注風險損耗認知和利益關聯認知,一般民眾更關注利益關聯認知和情感定向認知,而域外他國民眾由於自身利益不會受到直接損失,普遍更關注戰爭道德認知和群體歸屬認知。外軍在作戰實踐中,善於針對不同對象的認知重心,精準策劃主題,推送關聯訊息,誘發特定的行為選擇。如同在海灣戰爭前,希爾·諾頓公關公司炮製了根本不存在的“育嬰箱事件”,就是利用科威特駐美大使的女兒娜伊拉“做證”,展現伊拉克軍隊的“慘無人道”,誘發美國民眾的倫理道德認知,進而支持美國政府派兵參加海灣戰爭。

制約指揮決策的風格偏好

認知風格直接影響決策行為偏好。認知風格是指個體認知、記憶、思考、解決問題的典型方式。根據指揮決策風格偏好,指揮家可以分為冷靜型認知風格和衝動型認知風格。冷靜型認知風格的指揮者在決策過程中重視準確但不重視速度,作出的決策品質較高,但容易陷入對各類情報資訊來源的比對分析,過度強調資訊分析的準確客觀。冷靜型認知風格的指揮在戰場認知攻防行動中,常常容易受到紛繁多元的信息刺激幹擾,心智精力容易被擾亂和耗散,進而可能貽誤戰機。衝動型認知風格的指揮者重視速度但不重視準確度,作出的決策反應速度較快,但品質不高,且容易情緒激動,易與團隊成員發生衝突。衝動型認知風格的指揮者也容易將模稜兩可的外在安全環境進行過度曲解,並不斷尋找「證據」強化和驗證個體錯誤思維,使個體注意力變窄,導致出現指揮決策偏差。外軍在作戰實務中,比較著重分析作戰對手指揮官決策風格,進而選擇特定資訊對其進行心理影響。如美軍入侵巴拿馬戰爭中,在圍攻巴拿馬總統諾列加躲藏處時,美軍反複播放搖滾和重金屬音樂,運用刺激和羞辱諾列加的語言對其進行認知打擊和心理進攻,使諾列加身心逐漸崩潰。

控制思維認知的後門通道

電腦一旦中了「木馬」病毒,會在特定時間向駭客控制端發送連線請求,一旦連線成功就會形成後門通道,使得駭客可以隨心所欲地控制電腦。與之相似,人類大腦也存在認知“後門”,也可能被他人控制。認知心理學家研究發現,透過給目標對象視聽感知通道發送訊息,精心推送目標對象認可的、接受的信息內容,迎合目標對像已有的經驗記憶,順應目標對象思維習慣,刺激目標對象的情感痛點,就可以控制干擾目標物認知,促進其產生本能情緒行為反應。在尖端認知科學技術的支撐下,運用大腦資訊加工的自動啟動和控制加工兩種模式,目標物很容易陷入「認知繭房」之中。認知域作戰中,透過讓個體沉浸在人為構設的海量資訊之中,並源源不斷地為其提供「證據」用來佐證其判斷和認知是「正確」的。長此以往,個體的認知視野就變得越來越小,對外在環境的感知能力逐漸降低,最終會看不到事情的真相,沉湎於「認知繭房」中無法自拔。外軍在認知域作戰中,常常針對作戰對手對某一問題的認知偏差,持續透過多種管道推送佐證作戰對手自以為「正確認知」的態勢訊息和情報訊息,使作戰對手指揮決策出現失誤和偏差。

誘發關注的感知覺刺激

有效的知覺刺激是引發目標對象關注的首要前提。人類大腦對感知覺範圍內的刺激會有所察覺,並做出各種反應。認知心理學實驗研究發現,動態、危險、利害關係人、生存安全、前後反差等類別資訊更容易引起人類大腦的注意。在智慧化時代,目標對象的心理認知過程往往遵循「引起注意、培養興趣、主動搜尋、強化記憶、主動分享、影響他人」的規律。外軍在作戰中,常運用獨家爆料、情報外洩、權威揭露、現場連線等方式,巧用誇張、對比、聯想、比喻、懸念、襯託等手法,推播顛覆常識、認知衝突、對比強烈等訊息,來引發目標對象注意。例如伊拉克戰爭中美軍塑造的“營救女兵林奇事件”,利比亞戰爭中的“卡扎菲黃金馬桶”,大多選擇受眾對象熟知的故事為藍本,藏目的、寓觀點於故事情節,吸引了廣大民眾的注意力。此外,人類大腦也會對感知覺範圍外的刺激進行加工。近年來,西方國家軍隊非常重視知覺閾下資訊刺激技術的研究,開發發展了閾下視覺訊息植入技術、閾下聽覺訊息植入技術、閾下訊息啟動技術、神經系統潛意識聲音操控技術等,不斷擴大神經認知科學技術在軍事領域的應用範圍。

催生認知共振的後設價值概念

認知理論中,認知共振是指跨越雙方認知鴻溝,能夠引發雙方思想心理與認知共鳴共感的訊息,進而實現對對方認知體系的解構與重建。在認知域作戰中,這種認知聚能效應不是簡單意義上的力量集中,而是體系合力的內在累積。在現代資訊傳媒的擴散傳播作用下,這種認知共振效應能在短時間內迅速擴散到全球各地,並產生二次間接心理效應或更多層次的衍生心理效應,呈現出一種累積迭代的狀態,一旦超過心理臨界點,即呈現出心理能量爆發狀態,從而改變事件走向或結果。能夠誘發這種認知共振的靶標,主要有價值信念、道德倫理、共通利益等。戰爭中,若某一方觸及或違反人類元價值觀、共同情感指向等,則極易誘發集體聲討,承擔違背人類道德的指責,陷於道義低谷。如越戰期間的一張照片,畫面呈現的是遭遇美軍凝固汽油彈襲擊後,一群越南孩子特別是一名9歲女孩在公路上因為燒傷而裸體奔跑。 1972年,這張照片刊登後引發巨大轟動,掀起美國乃至全球的反戰浪潮,加速了越戰的結束。

分裂認知體系的認知縫隙

日常生活中,看似堅硬的鋼鐵,受低溫環境、材質缺陷、應力集中等因素影響,非常容易因材料脆性而斷裂,認知體係也是如此。認知縫隙是指目標對象認知思考中的裂縫、痛點、弱點與敏感點,主要表現為個體擔心自己沒有能力應對或無法適應環境的想法,並在焦慮情緒的作用下,構成認知脆弱性。安全威脅的經驗、團體結構的鬆散、信念理想的迷惘、權威媒介的失聲等,都會使得目標物出現認知上的衝突與撕裂。認知域作戰中,有時看似強大的作戰對手,背後卻潛藏著大量的思維裂隙與心理弱點,往往一個新聞事件就能動搖作戰對手的認知框架,刺破認知泡沫。此外,這種認知心理衝突也會使個體產生道德損傷和心理創傷。近年來,執行海外任務的美西方國家軍隊面對隨時隨地出現的“偽裝成平民的敵人”,對戰場環境的不確定感不斷提升,普遍缺乏作戰意義感知,進而內心充滿內疚與罪惡。大量士兵產生戰爭創傷後壓力障礙,戰場自殘自傷、戰後自殺與犯罪人數激增,參戰老兵自殺人數甚至超過戰場死亡人數。

(作者單位:國防大學政治學院)唐國東

中國軍網 國防部網 // 2023年3月23日 星期四

中國原創軍事資源:https://www.81.cn/jfjbmap/content/2023-03/23/content_336888.htm

Chinese Military Training and the Metaverse: Challenges & Opportunities Coexist

中國軍事訓練與虛擬世界:挑戰與機會並存

現代英語:

 In the field of military training, the basic technologies of the metaverse have long been used by militaries, to varying degrees, as a virtual resource. It must be acknowledged that the value and potential of the metaverse in military training are immeasurable and are a focus of current and future military competition. However, because metaverse-related technologies and their application in military training are still immature, the bright prospects are accompanied by potential risks.

1. The past and present of the military training metaverse
       
 The metaverse relies on a technology cluster with virtual reality technology at its core; its early form in the military field is also called virtual simulation or the simulation internet. It can be said that virtual simulation training is already very close to today’s concept of the metaverse and is the primary form of the military training metaverse. From ancient times to the present, the advances that have had the greatest impact in science and technology have generally been made to win wars or maintain combat effectiveness. As a leading technology of the third scientific and technological revolution, the metaverse, in various basic forms, is being used for military training by militaries around the world.
      The US military began to deploy a “military metaverse” plan very early. In 1978, Jack Thorpe, a captain in the US Air Force, proposed the idea of a military simulator network in a paper, hoping to establish a distributed or networked military modeling system to facilitate training. In 1983, the Defense Advanced Research Projects Agency (DARPA) of the US Department of Defense developed the virtual battlefield network simulator (SIMNET), which uses computers to generate virtual battlefields, simulate engagements between two sides, and draw lessons from errors and failures. Replacing field exercises in this way saves costs to a certain extent and improves the effectiveness of training. Although SIMNET, as the earliest version, remained a fairly low-fidelity battlefield simulation, it pioneered distributed, networked modeling and simulation. By the end of the 1980s the project had reached its peak, eventually yielding more than 200 interconnected tank and aircraft simulators across the United States and Europe, linked over local and wide area networks and used for large-scale training and exercises. The distributed interactive simulation (DIS) protocol developed at that time is still in use today, and through more advanced high-level architectures, different military simulations can be linked to provide a richer collective training or mission preparation experience. It can be said that the SIMNET project directly or indirectly promoted the development of many key technologies of today’s metaverse. Today, the US military is keenly interested in the rapidly proliferating metaverse. The newly established United States Space Force (USSF) wants to create a military-specific metaverse for collaborative operations, training, and mission execution. Its chief technology officer, Lisa Costa, declared: “Soldiers cannot go to space in person. The only way they can experience their own combat territory is through visual data display. The virtual reality environment will provide them with situational awareness and an understanding of their options, so that they can make decisions.”
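The networked-simulation idea behind SIMNET and DIS can be illustrated with a toy sketch (an illustrative assumption, not the real IEEE 1278 DIS PDU format): each simulator node periodically broadcasts its entity’s state over UDP so that peer simulators can render a shared scene.

```python
import json
import socket
import time

# Toy illustration of the networked-simulation idea: each simulator node
# periodically broadcasts its own entity state (id, position, velocity) so
# peer simulators can render a shared virtual battlefield. The real DIS
# standard defines binary PDUs; JSON over UDP is used here for readability.
BROADCAST_ADDR = ("255.255.255.255", 30001)   # hypothetical port

def broadcast_entity_state(entity_id: str, position, velocity, ticks: int = 3):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    for _ in range(ticks):
        state = {
            "entity": entity_id,
            "time": time.time(),
            "position": position,   # e.g. [x, y, z] in a shared coordinate frame
            "velocity": velocity,
        }
        sock.sendto(json.dumps(state).encode(), BROADCAST_ADDR)
        # Dead reckoning between updates lets peers extrapolate movement.
        position = [p + v for p, v in zip(position, velocity)]
        time.sleep(0.1)
    sock.close()

if __name__ == "__main__":
    broadcast_entity_state("tank-01", [100.0, 250.0, 0.0], [1.5, 0.0, 0.0])
```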
       In recent years, the virtual reality and augmented reality technologies of the metaverse have been incorporated into the US military’s regular training. In 2014, the BlueShark project, developed by the Office of Naval Research and the Institute for Creative Technologies at the University of Southern California, allowed service members to collaborate in a virtual environment for driving-skills training; in 2018, the US Army and Microsoft cooperated to develop the Integrated Visual Augmentation System (IVAS) for soldiers’ regular training; in 2020, the US Navy launched the Avenger project, which uses virtual reality, artificial intelligence and biometric technology for flight-course training; in 2021, Boeing created a military aircraft training system that lets maintenance personnel use AR technology for simulated maintenance drills; and on May 10, 2022, two US fighter pilots flying a jet completed a high-altitude prototype metaverse experiment over the California desert, performing refueling operations with a virtual tanker through a specially designed augmented reality display connected to a computer system that showed a glowing image of the virtual refueling aircraft.

 (I) The US military uses virtual reality technology for military training on a large scale
       
 At the same time, Russia is also a leader in the development of virtual training systems. Almost all of its advanced weapons and equipment are fielded with corresponding virtual training systems, which are developing toward generalization and embedded integration. For example, the “Sound M” universal virtual training system is a general-purpose virtual training suite for the crews of surface-to-air missile systems. The Tor-M1 surface-to-air missile system is likewise equipped with a dedicated virtual training vehicle, which can carry out battlefield simulation training while searching for targets and operating weapons.

 (II) The Tor M1 surface-to-air missile system is also equipped with a dedicated virtual training vehicle.
        
 In addition, other countries have also begun to explore combining metaverse technology with military training. The British Army has been studying the use of extended reality technology, which can place more than 30 soldiers in the same virtual training scene; the British Ministry of Defence’s “Single Synthetic Environment” has already used this technology in soldier training. In South Korea, a military training simulator developer and supplier known as “Optimus Prime” completed development of the DEIMOS military training system, based on metaverse technology, in 2019 and applied it to the training of the armed forces. The system can create a variety of environments for professional military training, including precision shooting training, tactical behavior training and observation training.

       2. The inherent advantages of the metaverse in military training
       
Military training is a perennial concern for any armed force, referring specifically to the education in military theory and related professional knowledge, the coaching of combat skills, and the rehearsal of military operations conducted by the armed forces and other trainees. The continuous innovation of technologies such as artificial intelligence and virtual reality has accelerated the trend toward intelligent warfare. Single live exercises in the traditional form will struggle to meet combat requirements under the new situation. As a huge cluster of new technologies, the metaverse plays an increasingly important role in military training. If training is an important support of combat effectiveness, then the primary use of the metaverse in military training is as an important “enabler” of simulation training.
      Immersive experience can improve the effectiveness of battlefield environment simulation. As a practical science, military training centers on experience, and the key to training is immersion. The virtual space created by the metaverse gives people a “shared sense of embodied presence”, allowing trainees to immerse themselves fully in the virtual space and experience a war close to reality. Battlefield environment simulation uses virtual reality technology to process battlefield element data such as terrain, personnel, weapons and equipment through computer systems, finally creating a realistic three-dimensional battlefield environment. Soldiers are immersed in digital environments such as deserts, mountains or plateaus; each environment has its own tactics, techniques and procedures, and soldiers can practice their tasks repeatedly. Even though the soldiers are not in an actual battlefield environment, the technology is enough to restore the authenticity of the environment. More importantly, battlefield simulation training not only familiarizes soldiers with the battlefield environment and lets them obtain the maximum amount of information, it also improves their ability to observe things from multiple angles and handle emergencies. The US military has developed a virtual reality system known as a laser sand table, which can recognize and convert photos and video sent back by satellites into realistic three-dimensional maps, presenting a battlefield environment thousands of miles away to commanders. Before the wars in Afghanistan and Iraq, the US military used virtual reality technology to recreate real war scenes, including battlefield conditions and the appearance of personnel, in order to let soldiers adapt to the environment in advance and improve their combat capabilities.

 (II) On the eve of the wars in Afghanistan and Iraq, the U.S. military used virtual reality technology to create real war scenes
        
Open interconnection better supports synthetic training. The various parts of the metaverse ecosystem can be interconnected and interoperate, and information can be transmitted without hindrance across platforms and across worlds (between virtual worlds, or between the virtual world and the real world). Synthetic training uses this open-interconnection advantage of metaverse technology to supplement live training. According to statistics, since 2015 the number of non-combat deaths in the U.S. military has exceeded the number of deaths in actual military operations every year, and many of the deaths outside combat operations were caused by routine military training. The U.S. Army has therefore begun to use the metaverse for synthetic training, attempting to establish a virtual Synthetic Training Environment (STE) to reduce casualties in training. From urban warfare to mountain warfare, the Synthetic Training Environment integrates “live, virtual and constructive training environments into a single synthetic training environment, and provides training functions to ground, transport and air platforms and command posts where needed.” Practice has proved that a synthetic training environment built with the metaverse, with the help of multi-sensory simulation and reconstruction, can help soldiers break through the limitations of theoretical learning and cognition and improve their skills in team coordination, casualty handling and safe evacuation. On the eve of the Iraq War, US forces stationed in Kuwait conducted synthetic training on Iraq’s urban conditions, enhancing the soldiers’ urban combat capabilities while minimizing casualties in actual combat.
      The imaginative space stimulates innovation in military training thinking to the greatest extent. War games have been valued by military strategists since ancient times. During the Warring States Period, the deduction games of Mozi and Gongshu Ban, “untying belts to form a city” and “using wooden pieces as weapons”, rehearsed the real situation of the battlefield and thereby avoided a battle between the two armies. In the deep-scene era opened up by the future metaverse, the military system will become highly intelligent, and the two sides of a war may be able to conduct war gaming in a battlefield metaverse, even competing in the virtual world. Based on the information obtained in the virtual world, the two sides capture and predict changes on the battlefield through thinking processes such as association, reasoning and logical judgment, which is not only conducive to learning more of the laws of war but also exercises soldiers’ powers of logical deduction. In the Gulf War of 1991, before implementing Operation Desert Storm, the U.S. military conducted war games on the training level of its troops, the possible course of the war, and the time actual combat would require. Practice proved that the U.S. military used the problems found in these war games to turn its combat concept into an actual action plan and ultimately won. This also fully demonstrates that the real battlefield is full of uncertainties, so full preparation through continuous war gaming is necessary. It is, of course, almost impossible for opposing sides to conduct coordinated deductions in the real world; but if the deployments of both sides can be made public to a certain extent by their respective satellites and air and ground reconnaissance equipment, then at a certain point in time, between two or more parties on the verge of a military crisis, it may be possible to carry out the contest of force deployment first in the metaverse and thus defuse an actual military conflict.
      The application of metaverse technology in military training can not only avoid accidental casualties during training, but also allow one or many trainees to complete training tasks in different virtual environments without leaving home and without physical contact. This kind of non-contact training has played an even more obvious role under the normalized conditions of the COVID-19 pandemic.
       3. Potential risks of the metaverse in military training
      
Although the metaverse provides technical support for military training to a large extent, it should never be simply understood as a training program or considered as a means of conducting training. Even if the metaverse technology brings convenience and innovation to military training, the technology itself and its accompanying challenges and uncertainties cannot be ignored.
      The development of metaverse technology may create security problems. The metaverse is a huge technology cluster; its system architecture, core algorithms and immersive technologies are still in continuous development, and its supporting industries, value consensus and management standards have not yet taken shape. In general, the metaverse is still a new thing, and its application in military training even more so. Although the use of virtual training systems can reduce casualties to a certain extent, whether such training can truly carry over to the combat site remains an open question. Whether the technology is safe enough in operation is also an urgent problem to be solved. Virtual city training expert James Crowley has pointed out that computing power may be the most decisive factor: unless latency can be reduced to a level that does not make people uncomfortable and feels real, and unless the movement and communication data of different people can be held in different simulators, it will not be possible to provide a practical training tool. At the same time, another challenge is the security of contact between the armies of different countries in the open virtual world of the metaverse.
      Virtual training environments are prone to producing cognitive illusions. Military training in the metaverse world is the result of a contest between human intelligence and technology. War simulations and military training conducted in a virtual environment can, like “nuclear weapons”, exert a powerful deterrent effect on future wars. But although the virtual environment compensates for the physiological limits of human senses, it also brings psychological and cognitive illusions to trainees. Taking unmanned-combat training as an example, long-term combat training under a virtual system can give operators a gaming mentality: because their audiovisual senses are detached from the real battlefield situation, they become alienated from real people and society and grow numb toward the act of taking another person’s life. As metaverse technology matures, the interaction between the virtual world and the real world will become ever closer, forming a mixed world in which the real and the virtual are hard to distinguish. By then this will pose not only a dilemma of distinction for soldiers’ cognitive psychology, but also a major challenge for future military training.
      The “decentralization” of the metaverse deviates from the traditional military training structure. In the world of the metaverse, all parties involved are virtual entities of equal status after computer processing and digitization; they can act autonomously in the metaverse, and so the metaverse pursues “decentralization”. The traditional military training organization, however, is highly centralized, with top-down hierarchical management, which runs contrary to the value needs of the metaverse. The US military has made great efforts to pursue “decentralized” operations, such as the “network-centric warfare” proposed in the 1990s and today’s distributed lethality and mosaic warfare. Nevertheless, the traditional military training structure and inertia of thinking remain obstacles to “decentralization”, a situation common to the armies of all countries.
Yuval Noah Harari argues in Sapiens: A Brief History of Humankind that humans conquered the world through their capacity for fiction and imagination. The metaverse gives us that capacity, while its uncertainties in the field of military training also add an element of fear. We must therefore attend to innovative scientific and technological theory, develop cutting-edge metaverse technologies, and continuously unlock the potential of the military training metaverse, while improving the relevant laws, regulations, and ethical norms so as to prepare in advance for winning future intelligent wars.

國語中文:

在軍事訓練領域,元宇宙的基本技術其實早就作為一種虛擬資源,在不同程度上為軍方使用。必須承認,元宇宙在軍事訓練中的價值潛力不可估量,是當下和未來軍事領域爭鋒的焦點。但由於元宇宙相關技術的發展及其在軍事訓練中的應用尚不成熟,美好前景背後也伴隨著潛在風險。
一、軍事訓練元宇宙的前世今生
元宇宙依賴的是以虛擬實境技術為核心的技術群,在軍事領域的早期形態又稱為虛擬模擬或模擬互聯網。可以說,虛擬模擬訓練已經非常接近今天的元宇宙概念,是軍事訓練元宇宙的初級形態。從古至今,科技領域帶來巨大影響力的進步普遍都是為了贏得戰爭或保持戰鬥力。作為第三次科技革命的領導技術,元宇宙在全球軍事領域,以不同形式的基本元宇宙被用於軍事訓練。
美軍很早就開始部署「軍事元宇宙」計畫。 1978年,美空軍上尉傑克·索普在自己的論文中提出了軍事模擬器網路的構想,希望建立一個分散式或網路化的軍事建模系統方便訓練。 1983年美國防部高級研究計畫局(DARPA),開發了虛擬戰場網路模擬器(SIMNET模擬器),以電腦生成虛擬戰場,模擬雙方交戰的情形進行推演,總結錯誤和失敗。用這樣的方式取代實地演習,一定程度上節省了成本,也提高了訓練的效果。雖然SIMNET模擬器作為最早的版本仍處於較低階的戰場仿真,但卻開闢了分散式或網路化建模仿真的先河。到了20世紀80年代末,該計畫達到頂峰,最終落地形成200多個遍布美國、橫跨歐洲,基於區域網路和廣域網路的模擬互聯坦克和飛機模擬器,並用於大規模訓練與演習。而當時開發的分散式互動式模擬(DIS)協議,至今仍在使用,並且透過更先進的高階體系結構,可以連結不同的軍事模擬,以提供更豐富的集體訓練或任務準備體驗。可以說SIMNET模擬器專案直接或間接推動了當前元宇宙的許多關鍵技術的發展。時至今日,美軍對如雨後春筍般崛起的元宇宙興趣正濃,新成立的軍種——美國太空部隊(USSF)欲打造軍事專用元宇宙,用於協同作戰、訓練、執行任務。其技術主管利薩·科斯塔宣稱:「軍人們並不能親自上太空,他們體驗自身作戰疆域的唯一途徑就是視覺數據顯示,虛擬現實環境會為他們提供態勢感知,並了解自己的選項,以便做出決策。
近年來,元宇宙的虛擬實境和擴增實境技術已納入美軍的常規軍事訓練。 2014年,南加州大學海軍研究辦公室和創意技術研究所開發的BlueShark項目,讓士兵在虛擬環境中協作配合,進行駕駛技術訓練;2018年,美陸軍與微軟合作開發了一款集成視覺增強系統IVAS ,供士兵進行常規訓練;2020年,美海軍又推出了復仇者計劃,透過虛擬現實、人工智慧以及生物識別技術,開展飛行課程培訓;2021年,波音公司打造了一個軍用飛機培訓系統,使維修人員利用AR技術進行相關模擬維修演練;2022年5月10日,兩名美軍戰鬥機飛行員乘坐噴射機,在加州沙漠上空完成了一次高空原型元宇宙實驗。透過特製的擴增實境顯示器,連接到一個虛擬加油飛機發光影像的電腦系統,使用虛擬加油機進行了加油操作。
(一)美軍大量採用虛擬實境技術進行軍事訓練
同時,俄羅斯在虛擬訓練系統的開發上也是領先者,其先進武器裝備幾乎都配有相應的虛擬訓練系統,並且正在朝著通用化和嵌入式的方向發展。如音色M通用虛擬訓練系統就是用於地對空飛彈武器系統作戰人員的通用虛擬訓練裝備。道爾M1型地對空飛彈系統也配備有專用虛擬訓練車,可在目標搜尋和武器作戰的同時完成戰地模擬訓練。

(二)道爾M1型地空飛彈系統也配備有專用虛擬訓練車
此外,其他國家也紛紛開始探索元宇宙技術與軍事訓練的結合。英陸軍一直致力於研究擴展實境技術的使用,可以讓30多名士兵處於相同的虛擬訓練場景。英國防部的「單一合成環境」已經在士兵訓練中使用了這項技術。在韓國,一家名為「擎天柱」的軍事訓練模擬器的開發商和供應商,在2019年完成了基於元宇宙技術的DEIMOS軍事訓練系統研發並應用於武裝部隊的訓練。該系統能夠為專業軍事訓練創造各種環境,包括精準射擊訓練、戰術行為訓練和觀察訓練。
二、軍事訓練元宇宙的內在優勢
軍事訓練乃是兵家常事,具體指武裝力量及其他受訓對象所進行的軍事理論及相關專業知識教育、作戰技能教練和軍事行動演練的活動。人工智慧、虛擬實境等技術的不斷革新,加速了未來戰爭的智慧化趨勢。傳統形式下的單一實戰演練將難以滿足新情勢下的作戰要求。而元宇宙作為一個龐大的新技術群,在軍事訓練中扮演越來越重要的角色。如果說訓練是戰鬥效能的重要支撐,那麼元宇宙在軍事訓練中的首要用途便是作為模擬訓練重要的「賦能器」。
沉浸式體驗能夠提升戰場環境模擬效能。軍事訓練作為實踐科學,訓練的核心在體驗,訓練的關鍵在沉浸。元宇宙所創造的虛擬空間,使人感受到一種“共同的具身在場感”,讓受訓者完全沉浸虛擬空間,體驗一場接近真實的戰爭。戰場環境模擬正是利用虛擬實境技術,透過電腦系統對取得的戰場要素資料如戰場地形、戰場人員、武器裝備等進行處理,最終創設出逼真的立體戰場環境。士兵們沉浸在沙漠、山區或高原的數位環境中,每個環境都有不同的戰術、技術和程序,士兵們可以不斷地演練任務。即便士兵不是在實際的戰場環境中,但這項技術足以還原環境的真實度,更重要的是透過戰場模擬訓練,不僅能夠讓士兵熟悉戰場環境,最大程度獲取信息,而且還能提升其多角度觀察事物、解決突發狀況的能力。美軍目前已研發出一款被稱為雷射沙盤的虛擬實境系統,能夠辨識和轉換衛星發回的照片和錄像,並將其轉變成逼真的立體地圖,將遠在千里外的戰場環境呈現給指揮員。在阿富汗戰爭和伊拉克戰爭前夕,美軍都採用了虛擬實境技術來打造真實的戰爭場景,包括戰地狀況、人員樣貌等。旨在讓士兵提前適應環境,提升作戰能力。

(一)元宇宙所創造的虛擬空間,使人感受到一種“共同的具身在場感”

(二)在阿富汗戰爭和伊拉克戰爭前夕,美軍都採用了虛擬實境技術打造真實的戰爭場景
開放式互聯較能支撐合成訓練開展。元宇宙生態系統各部分之間可以實現相互連接和操作,資訊可以暢通無阻地實現跨平台和跨世界傳輸(在虛擬世界之間或虛擬世界與現實世界之間)。合成訓練正是利用元宇宙技術的這一開放式互聯優勢,來實現實戰訓練的補充。根據統計,從2015年開始,美軍每年的非戰鬥死亡人數超出了在實際軍事行動中犧牲的人數,而在非戰鬥行動中喪生的人員很多是由常規軍事訓練造成的。因此,美陸軍已經開始採用元宇宙進行合成訓練,試圖建立虛擬合成訓練環境(STE)來減少訓練中的傷亡。從城市作戰到山地作戰,「合成訓練環境」將「即時、虛擬和建設性的訓練環境整合到一個單一的合成訓練環境中,並在有需要的地方向地面、運載和空中平台以及指揮所提供訓練功能」。實務證明,透過元宇宙建構的合成訓練環境,藉助多感官模擬還原,能夠幫助戰士突破理論學習和認知局限,提升團隊作戰協同、傷情處置和安全撤離等素質和能力。伊拉克戰爭前夕,駐紮在科威特的美軍就對伊拉克的城市狀況進行了合成訓練,增強了士兵城市作戰能力的同時,把實戰中的傷亡降到了最低。
想像性空間最大程度激發軍事訓練思維創新。戰爭演習自古就受到兵家重視,戰國時期墨子和公輸班「解帶為城」「木片為械」的推演遊戲將戰場上的真實情況演練出來,從而避免了兩軍交戰。在未來元宇宙開啟的深度場景時代,軍事體系將走向高度智慧化,作戰雙方或許能在戰場元宇宙進行戰爭推演,甚至在虛擬世界一決高下。推演雙方根據虛擬世界獲取的信息,透過聯想、推理和邏輯判斷等思維過程,對戰場風雲變化進行捕捉和預判,不僅有利於習得更多戰爭規律,還能夠鍛鍊士兵的邏輯推演能力。 1991年的海灣戰爭中,美軍就在實施「沙漠風暴」行動前,根據部隊的訓練水準和可能的戰爭進程,以及實際作戰所需時間進行了兵棋推演。實踐證明,美軍借助這次兵棋推演發現的問題,將作戰設想轉化為實際行動方案,最終取得勝利。這也充分說明了真實的戰場充滿了種種不確定性,因此需要透過不斷進行戰爭演習來做好充分準備。毋庸置疑,敵我雙方在現實世界中進行協同推演幾乎是不可能的,但若敵我雙方的兵力部署可以被各自的衛星、空中和地面偵查設備進行一定程度的公開,那麼在某個時間節點,在即將爆發軍事危機的雙方或多方之間,先在元宇宙中進行排兵布陣,可以化解現實的軍事衝突則有望實現。
元宇宙技術在軍事訓練中的應用不僅可以避免訓練中的人員意外傷亡,還可以讓單一或眾多參訓者在足不出戶,無需實際接觸便可在不同虛擬環境下完成訓練任務。而這種非接觸式訓練在新冠疫情的常規化形態下,所扮演的角色更加明顯。
三、軍事訓練元宇宙的潛在風險
元宇宙在很大程度上雖然為軍事訓練提供技術支撐,但絕不能僅僅將其簡單地理解為一種訓練項目,或者被認為是一種開展訓練的手段。即使元宇宙技術為軍事訓練帶來便利和創新,也不能忽視科技本身及其伴生的挑戰和不確定性。
元宇宙技術發展或引發安全問題。元宇宙是一個龐大的技術群,其體系架構、核心演算法和沈浸技術等尚處於不斷開發的階段,配套產業、價值共識、管理標準等還沒有達成,總的來說,元宇宙還是一個新事物,在軍事訓練中的應用更是如此。儘管使用虛擬訓練系統能夠在一定程度上減少傷亡,但值得思考的是這樣的訓練是否能夠真正用於作戰現場,目前仍是一個未知數。而技術在運作中是否夠安全也是一個亟待解決的問題。虛擬城市訓練專家公司的詹姆斯·克勞利指出,電腦能力可能是其中最有影響力的部分,除非可以將延遲降低到不會讓人不適並且感覺真實的程度,除非可以在不同的模擬器中儲存不同人之間的行動和通訊數據,否則將無法提供實用的訓練工具。同時,另一個擺在眼前的挑戰則是在開放的元宇宙虛擬世界中,不同國家軍隊之間相互接觸的安全問題。
虛擬訓練環境易造成認知錯覺。元宇宙世界裡的軍事訓練是人類智力和技術較量的結果,在虛擬環境下進行的戰爭推演、軍事訓練等對未來戰爭的作用,如同「核武」一般,不用動用實槍實彈也能起到強大的威懾效果。儘管在生理層面彌補了人的有限感官,同時也帶來了受訓者在心理上的認知錯覺。以無人作戰的軍事訓練為例,長期在虛擬系統下進行作戰訓練,將會造成操縱者的遊戲心態。由於視聽感官與真實戰場情況脫節,而疏離了現實的人與社會,對於剝奪他人生命的行為產生麻木心態。隨著元宇宙技術的不斷成熟,虛擬世界、現實世界的互動將會越來越緊密,形成虛實難分的混合世界。到那時,不僅對士兵的認知心理造成區分困境,對於未來的軍事訓練也是重大挑戰。
元宇宙「去中心化」與傳統軍事訓練結構相背離。在元宇宙的世界中,參與各方都是經過電腦處理、資料化後產生的地位平等的虛擬主體,可以在元宇宙中自主活動,因而其追求「去中心化」。但傳統的軍事訓練組織結構則是高度集中、自上而下的分層化管理,這一方面與元宇宙的價值需求是背離的。美軍在追求“去中心化”作戰上做出了很多努力,如20世紀90年代提出的“網絡中心戰”,以及當前的分散式殺傷以及馬賽克戰等。但傳統的軍事訓練結構和思維慣性仍然是「去中心化」的阻力,而這種情況普遍存在於各國軍隊。
尤瓦爾·赫拉利在《人類簡史》中談道,人類靠著的是虛構和想像的能力征服世界。元宇宙給了我們虛構和想像的能力,同時元宇宙在軍事訓練領域的不確定性也增加了恐懼的成分。因此,我們必須關注創新科技理論、發展元宇宙前沿科技,不斷激發軍事訓練元宇宙的潛力,同時完善相關法律法規和道德倫理規約,為打贏未來智慧化戰爭做好超前準備。
版權聲明:本文刊於2023年1期《軍事文摘》雜誌,作者:張愷悅、李傑春,如需轉載請務必註明「轉自《軍事文摘》」。

中國軍事原文來源:https://www.81it.com/2023/0321/14167.html

Chinese Military Use of Cognitive Confrontation within the Combat Domain

中國軍事在作戰領域使用認知對抗

現代英語作為外語音譯:

Modern warfare, according to the characteristics of material form, usually divides the combat space into the physical domain, the information domain, and the cognitive domain. The three domains interact with one another to form the field and soil of military confrontation. Although cognitive domain operations take place in the cognitive domain, the support for those operations often spans all domains. The practice of war shows that as the effectiveness of hard strikes in the physical domain increases, cognitive formation can often be accelerated and cognitive realization can better meet combat needs.

Cognitive offense and defense cannot be separated from physical support

Today's world is one in which everything is interconnected. The aggregation of interconnected objects greatly extends what independent individuals can accomplish acting alone. Cognitive domain operations are never isolated exchanges between cognitive carriers. Only by embedding cognitive offense and defense in an integrated joint operations chain, tying them closely to military strikes in the physical domain and to the combat system as a whole, can their full combat effectiveness be brought to bear.

The starting point of cognition. Existence determines consciousness. Thinking and cognition do not descend from the heavens; they are an accurate, or at times distorted, reflection of the real world. Without the foundation of the material world, thinking and cognition lose their source of information, their basis for analysis and judgment, and the accuracy of decision and action, and people find it hard to trust, accept, or rely on them. Even the most fantastical science-fiction wars still refer to real combat targets, specific combat objectives, and corresponding combat paths. Intelligence reconnaissance and analysis has therefore become an indispensable link when commanders organize forces and plan operations. "No investigation, no right to speak" is regarded as a golden rule of decision-making, and battlefield simulation and war-gaming have become an important step toward successful combat operations. Historically, the combat commands of accomplished generals and the classic engagements that have withstood the test of history and practice have almost all rested on thorough investigation and research and on scientific intelligence analysis. Without the hard support of the real world, "man thinks, and God laughs."

The basis of cognitive effects. A golden rule of cognitive domain operations is that soft power at the cognitive level must be backed by hard strikes at the physical level if its effects are to be secured and strengthened. Strong military pressure is a necessary precondition for cognitive means to work, and continuous victory on the battlefield is the core support for winning cognitive wars. Without the overwhelming pressure of its formidable comprehensive national power and superior technology, the United States' "Star Wars" program might never have truly worked. If cognitive domain operations lack the support of concrete military operations in the physical domain, they will never produce the desired effects of sowing doubt, confusing, deterring, and defeating the enemy. To seize the initiative in thinking and cognition and take the lead in cognitive domain operations, we must not only strengthen the cognitive subject itself, improving our ability to use strategy and technical means directly to protect ourselves and to intervene in and influence the opponent's thinking and cognition, but also actively leverage the conductive effect of military operations in the physical domain to enhance thinking and cognition.

The starting point of cognitive realization. Marxism holds that once theory grips the masses, it becomes a material force. From the perspective of combat in the cognitive domain, spiritual creation at the superstructural level of cognition does not automatically turn into material power; only when it is attached to a material carrier and grasped in practice can the critical leap from spirit to matter, from consciousness to existence, be realized. In World War II, had the German army not bypassed the Maginot Line, broken through the Ardennes forest, and launched a surprise thrust into the French hinterland, the foresight of the "Manstein Plan" as a cognitive achievement could never have been demonstrated; likewise, without the Allied forces' successful landing in Normandy and their advance from east and west, the ingenuity of Operation Overlord's strategy of "openly repairing the plank road while secretly crossing at Chencang" could not have been shown. Thinking and cognition are transmitted through people into concrete military operations in the physical domain, and those operations in turn achieve the material transformation of cognitive results, forming the basis of the two-way interaction between cognitive offense and defense and military strikes in the physical domain.

The basic ways physical strikes support cognitive offense and defense

The ways and means by which military strikes in the physical domain support cognitive offense and defense follow the general law that matter determines consciousness and existence determines thinking. The basic methods can be divided into enhanced support, confirmation support, and realization support.

Enhanced support. Military strikes in the physical domain reinforce the formation and development of thinking and cognition. Although thinking and cognition depend on the quality of the cognitive carrier itself, they are difficult to achieve without the support of military operations in the physical domain, whose most basic role in the cognitive domain is to provide solid support for the formation and development of thinking and cognition. Thinking and cognition can be stable and far-reaching only when grounded in real physical action. In the early days of the Korean War, for example, when the Korean People's Army seemed unstoppable, our army's operations staff officer Lei Yingfu and his colleagues accurately predicted the time and place of the US military's landing on the basis of the war situation, the geographic and weather characteristics of the Korean Peninsula, and especially the movements of US and South Korean forces at the time. Likewise, Matthew Ridgway of the "United Nations forces" judged the volunteers' "week-long offensive" from their logistics support, weapons and equipment, and tactics, and used "magnetic tactics" against our forces. These are all cases in which combat in the physical domain reinforced the formation and development of thinking and cognition.

Confirmation support. Military strikes in the physical domain confirm preset thinking, foresight, and prejudgment. Cognitive offense and defense do not occur only at the cognitive level; they are an interaction between cognition and practice. War is "the province of life and death, the road to survival or ruin." If cognitive decisions cannot be verified from multiple directions at the practical level, acting rashly is the greatest irresponsibility toward war. During the years of revolutionary war, our military's decision-makers, while keeping the overall strategy firmly in hand, gave front-line commanders the authority to act at their discretion in line with the general strategic direction; this is confirmation of strategic thinking in the positive sense. During the Second World War, the Allies used "false facts" to mislead, continually shaping and reinforcing the German army's misreading of where the Allies would land on the European continent, and finally landed successfully in Normandy at minimal cost; this is confirmation in the reverse sense.

Realization support. This means providing direct physical support for realizing thinking, cognition, judgment, and decision. Thinking and cognition must be transformed into actual results that change the world. Cognition that acts on the opponent is not the end but a new starting point: it must next be carried into the physical world by "skilled hands" and a "brave heart." In other words, physical action directly supports the realization of the value of thinking and cognition. However brilliant Zhuge Liang's stratagems, without their execution by the "Five Tiger Generals" and the other soldiers of Shu Han, they could only remain at the cognitive level of paper talk. However efficiently the first three stages of the OODA loop run, if the execution link, "Act," is missing, the loop is a dead loop. Likewise, the results of our military's command decisions depend on resolute, thorough, and creative execution by officers and soldiers; the quality and efficiency of execution directly determine how effectively command decisions are implemented. In this respect, physical action at the execution level is of extremely important practical significance.
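Since the article leans on the OODA analogy here, the following minimal sketch, offered as a hypothetical illustration rather than any doctrinal model, shows an Observe-Orient-Decide-Act cycle in which skipping the final Act step leaves the simulated situation unchanged; all function names and data are invented.

```python
# Hypothetical OODA-loop sketch: without the final Act step, the first three
# stages can run indefinitely without ever changing the situation.

def observe(world: dict) -> dict:
    return dict(world)                      # gather raw battlefield information

def orient(observation: dict) -> dict:
    return {"threat": observation.get("enemy_contacts", 0) > 0}

def decide(assessment: dict) -> str:
    return "engage" if assessment["threat"] else "hold"

def act(world: dict, decision: str) -> None:
    if decision == "engage":
        world["enemy_contacts"] = 0         # only this step changes the world

def ooda_cycle(world: dict, execute: bool) -> None:
    decision = decide(orient(observe(world)))
    if execute:
        act(world, decision)                # the "A" that closes the loop

world = {"enemy_contacts": 3}
ooda_cycle(world, execute=False)            # observe-orient-decide only: nothing changes
print(world)                                # {'enemy_contacts': 3}
ooda_cycle(world, execute=True)             # with Act, the decision becomes material effect
print(world)                                # {'enemy_contacts': 0}
```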

Effectively strengthen the interaction between cognitive offense and defense and physical strikes

That thinking and cognition must rely on the support of physical action is an objective law independent of human will. Strengthening the communication and interaction between thinking and cognition on the one hand and physical strikes on the other, so that our thinking and decision-making become more targeted, objective, and operable, and cognitive advantage is better converted into advantage in action and in victory, is an extremely important task.

Be more proactive in consolidating the cognitive foundation. Whether thinking and cognition are correct depends fundamentally on whether they fit objective reality and apply to the combat opponent. Only thinking and cognition grounded in thorough investigation and research, in seeking truth from facts, and in comparative advantage can withstand the test of practice and actual combat. To treat thinking as absolute, sacred, or ethereal, or to rely solely on a general's genius, wisdom, and inspiration, is idealistic, one-sided, and harmful. This requires us to base our thinking and cognition on extensive investigation, research, and intelligence analysis; to truly understand the enemy's situation, our own situation, and the situations of third parties; to know the enemy and ourselves, know everything we ought to know, and act as circumstances dictate. At the same time, we must combine reading the wordless book of experience with reading books of words, unify indirect theory with living practice that is constantly developing and changing, and view past lessons and other people's lessons dialectically so that they become an aid rather than a shackle, an assistant rather than a master.

Be more proactive in strengthening cognitive rationality. Correct understanding that withstands the long test of practice and actual combat comes from practice and is reinforced through feedback from practice. Practical cognitive experience is only the raw material for correct cognition; to form scientific cognition, we must further separate the false from the true through the repeated collision and verification of consciousness and matter, of thinking and existence, so as to improve cognitive rationality. It is mistaken, even fatal, to believe that true knowledge can be obtained once and for all from local situations, fragmentary information, and isolated periods of time. At the Battle of Red Cliffs in ancient China, Cao Cao's side concluded that its warships should be chained together merely from the common-sense notion that iron cables would steady the hulls, without verifying it against the actual combat effects and consequences of linking the ships. Lacking such re-verification, chaining the ships with iron ropes amounted to binding himself, and the result was the crushing defeat of the burning of Red Cliffs. Times have changed, and the enemy situation on the modern battlefield shifts constantly; there has never been an unchanging cognitive practice, nor a once-and-for-all cognitive achievement. Only by stripping away impurities and extracting the essence through the cycle from matter to cognition and back to material confirmation, and by cognizing anew, can we return to rationality.

Be more proactive in materializing cognitive results. Cognitive achievements are only the products of thinking and consciousness incubated within cognitive carriers; without timely and effective material transformation, they are like wearing brocade robes at night or hiding treasure in the mountains, and their value is hard to demonstrate. Thinking and cognition are grounded in physical action and ultimately rely on concrete action in the physical domain before they can be materialized into actual results that change the subjective and objective worlds. This requires us not only to consolidate the cognitive foundation and strengthen cognitive rationality, but also to make cognitive decisions and plans as operable as possible, opening the door to smoother materialization and transformation. At the same time, we must improve the execution capability of those who carry out decisions and deployments, so that they correctly grasp the intent behind a decision, creatively adopt appropriate methods in light of specific realities, and carry cognitive results and operational decision plans through to the end, serving as the "ferryman" and the "bridge across the river" that connect cognitive results to combat effectiveness.

(Author’s unit: Military Political Work Research Institute, Academy of Military Sciences)

原始繁體中文:

現代戰爭根據物質形態的特點,通常將作戰域分為物理域、資訊域和認知域。 這三個領域相互作用,形成軍事對抗的場域和土壤。 認知域操作雖然發生在認知領域,但其操作支援往往跨越各領域。 戰爭實踐表明,隨著物理領域硬打擊效能的增強,往往可以加速認知形成,認知實現更能滿足作戰需求。

認知攻防都離不開物質支撐

當今世界是一個萬物互聯的世界。 相互連結的不同物體的集合極大地增強了獨立個體單獨行動的功能。 認知域操作從來都不是認知載體之間孤立的操作。 將認知攻防融入一體化聯合作戰鏈,與物理域軍事打擊行動緊密結合,與整個作戰體系緊密結合,才能充分發揮戰鬥力。

認知的起點。 存在決定意識。 思維和認知不是天上來的仙女,而是現實世界的真實或曲折的反映。 離開了物質世界的基礎,思考和認知就會失去資訊的來源、分析判斷的基礎、決策和行動的準確性,使人難以信任、認知、依賴。 即使是最迷幻的科幻戰爭,仍然會參考真實的作戰目標、具體的作戰目標以及相應的作戰路徑。 因此,情報偵察分析成為指揮組織部隊、規劃不可或缺的重要環節。 「沒有調查就沒有話語權」被視為決策必須遵循的金科玉律。 戰場模擬模擬已成為作戰行動成功的重要一步。 歷史上,大部分功將的作戰指揮和經得起歷史和實踐檢驗的經典作戰案例,都是建立在充分調查研究和科學情報分析的基礎上的。 沒有現實世界的硬派支撐,「人類一思考,上帝就笑」。

認知效應的基礎。 認知領域作戰的一條金科玉律是,認知層面的軟實力必須有實體層面的硬實力支撐,才能確保並強化其效果。 強大的軍事壓力是認知手段發揮作用的必要前提,戰場上的持續勝利是贏得認知戰爭的核心支撐。 如果美國沒有超強的綜合國力和優越的技術的高壓壓力,其「星際大戰計畫」可能無法真正發揮作用。 認知域作戰如果缺乏物理域具體軍事行動的支撐,永遠不會產生疑、迷、震懾、克敵的良好效果。 要掌握思維認知的主動權,掌握認知域作戰的主動權,不僅要加強認知本體建設,提高直接運用策略和技術手段加強自我保護、幹預和影響對手的能力。思維和認知,還積極努力在物理領域利用軍事行動在物理領域的傳導效應,增強思維和認知。

認知實現的起點。 馬克思主義認為,理論一旦掌握了群眾,也就成為物質力量。 從認知領域的戰鬥來看,認知上層建築層面的精神創造並不會自動轉化為物質力量,只有依附於一定的物質載體並在實踐中加以把握,才有可能實現精神到物質、意識到存在的關鍵飛躍。 正如二戰時,如果德軍沒有繞過馬其諾防線、突破阿登森林、向法國腹地發起奇襲,就不可能展現「曼斯坦計畫」這一認知成就的先見之明;同樣,如果沒有盟軍東西兩進、在諾曼第成功登陸,也無法凸顯「霸王計畫」「明修棧道、暗度陳倉」戰略的巧妙之處。 思維認知透過人傳遞到物理域的具體軍事行動,再由物理域的具體軍事行動實現認知結果的物質轉化,形成認知攻防和物理領域軍事打擊之間雙向互動的基礎。

物理攻擊支撐認知攻防的基本方法

物理領域軍事打擊支持認知攻防所採用的手段和方式,遵循物質決定意識、存在決定思維的一般法則。 基本方式可分為增強支援、確認支援和變現支援。

增強支援。 物理領域的軍事打擊加強了思維和認知的形成和發展。 思維認知雖然依賴認知載體本身的品質,但如果沒有物理領域軍事行動的支持,就很難實現。 物理領域軍事行動在認知領域最基本的作用就是為思考認知的形成與發展提供堅實的支持。 思考和認知只有建立在真實的身體行動的基礎上,才能穩定、深遠。 例如,朝鮮戰爭初期,朝鮮人民軍勢不可擋時,我軍作戰參謀雷英夫等人根據朝鮮半島戰局、地理、天氣特點,準確預測了美軍登陸,尤其是當時美軍和韓國軍隊的各種行動等等時間地點。 同樣,「聯合國軍」的李奇偉也根據志願軍的後勤保障、武器裝備、戰術運用等,對「拜拜攻勢」做出了判斷,用「磁性戰術」與我作戰。 這些都是物理領域的戰鬥對思維認知的形成與發展的增強。

確認類型支援。 物理領域的軍事打擊證實了預設的思維、預知和預判。 認知攻防不僅發生在認知層面,而是認知與實踐的互動。 戰爭是「生死之地,生存之道」。 如果一個人的認知決策無法在實踐層面得到多方位的驗證,那麼輕舉妄動就是對戰爭最大的不負責任。 革命戰爭年代,我軍決策層始終處於整體戰略的掌控之中,賦予第一線指揮官依照整體戰略方向原則酌情行動的權力。 這是對戰略思維的正面肯定。 二戰期間,盟軍利用「虛假事實」進行誤導,不斷塑造並強化德軍對歐洲大陸盟軍登陸地點的誤解,最終以最小的成本成功登陸諾曼第。 這是一次反擊。 確認。

實施支援。 為思維、認知、判斷和決策的實現提供直接的物質支持。 思維和認知必須轉化為改變世界的實際結果。 作用於對手的思維和認知不是終點而是新的起點。 接下來,必須透過「巧手」和「勇敢的心」在物質世界中付諸行動。 換句話說,簡而言之,就是為思考認知的價值實現提供直接的身體行動支撐。 這就像是諸葛亮的巧妙計劃,無論多麼輝煌,如果沒有「五虎將」和其他蜀漢將士的實施,也只能停留在紙上談兵的認知層面。 無論“OODA”循環的前三部分運行得多麼高效,如果缺少執行環節“A”,那麼這將是一個“死循環”。 同樣,我軍指揮決策的結果也取決於官兵的堅決、徹底、創造性執行。 執行的品質和效率直接決定指揮決策的執行效果。 就此而言,執行層面的身體動作具有極為重要的現實意義。

有效加強認知攻防與身體打擊的互動

思考和認知必須依靠身體動作的支持,這是不依賴人的意志的客觀規律。 加強思考認知與身體打擊的溝通互動,使我們的思維和決策更加具有針對性、客觀性和可操作性,從而更好地將認知優勢轉化為行動優勢和製勝優勢,是一項極其重要的任務。 。

更加積極主動,鞏固認知基礎。 思維認識是否正確,從根本上取決於它是否符合客觀現實,是否適用於作戰對手。 唯有建立在充分調查研究、實事求是和比較優勢基礎上的思維認識,才經得起實踐和實戰的檢驗。 把思維認識絕對化、神聖化、虛無化,或只靠將領的天才、智慧、靈感,是唯心主義的、片面的、有害的。 這就要求我們必須努力把思維認識建立在廣泛調查研究和情報分析的基礎上,真正了解敵情、我情、他情,真正知己知彼、知所應知、相機而動。 同時,要把讀無字之書與讀有字之書結合起來,把間接理論與不斷發展變化的生活實踐相統一,辯證地認識過去的經驗教訓和別人的經驗教訓,使之成為我們自己的知識,成為幫助而不是束縛、協助而不是支配。

更積極主動,強化認知理性。 經得起實踐和實戰長期檢驗的正確認識來自於實踐,並透過實踐的回饋得到強化。 認知實務經驗只是獲得正確認知的基礎材料。 形成科學認知,需要在意識與物質、思考與存在的反覆碰撞與驗證中進一步去偽存真,以提高認知理性。 認為只有從局部情況、碎片資訊和個別時期才能一勞永逸地獲得真正的知識是錯誤的,甚至是致命的。 在中國古代的赤壁之戰中,曹操一方只是從常識中得出了連體戰船的認識,即鐵纜可以平衡船體的晃動,但並沒有從實戰效果或連體後果中證實這一點。軍艦。 如果不懂得再造,很容易就會用鐵繩把船綁起來,把自己綁起來,最後落得「火燒赤壁」的慘敗。 時代變遷,現代戰場敵情瞬息萬變。 從來沒有一成不變的認知實踐,也沒有一勞永逸的認知成就。 它只能從物質中剔除雜質,提取精華,去認知,去物質確認,重新認知。 ,我們能否回歸理性。

更主動地客觀化認知結果。 認知成就只是認知載體中孕育思考和意識的結果。 如果沒有及時有效的物質改造,就會像穿著錦衣走夜路或藏寶藏山一樣,很難展現出自身的價值。 思維和認知是以物理行為為基礎的,最終要依靠物理領域的具體行為才能具體化,轉化為改變主觀世界和客觀世界的實際結果。 這就要求我們不僅要夯實認知基礎、強化認知理性,還要盡可能提高認知決策和規劃的可操作性,為更順利的物化和轉化打開大門。 同時,要努力提高決策部署執行者的執行能力,使他們能夠正確理解決策意圖,根據具體實際創造性地採取適當的方法,最大限度地落實認知結果和經營決策計劃進行到底。 當好認知結果與戰鬥力銜接轉化的「擺渡人」、「過河橋樑」。

(作者單位:軍事科學學院軍事政治工作研究所)

中國軍事資料來源:https://www.81.cn/jfjbmap/content/2022-12/06/content_38888.htm