
[Chinese National Defense] Establishing Correct Awareness to Counter the CCP's Cognitive Warfare Operations

[中國國防]建立正確認知,反制中共認知作戰行動

Modern English:

As the world continued to actively combat the COVID-19 pandemic, the British newspaper The Guardian reported in late May 2021 that Fazze, a public relations and marketing agency with close ties to Russian officials, was accused of providing funding to influential YouTubers, bloggers, and other opinion leaders in France, Germany, and other European countries to spread false information claiming that vaccines like Pfizer (BNT) and AstraZeneca (AZ) had caused hundreds of deaths. The false information also criticized the EU vaccine procurement system for harming public health in European countries, with the goal of sowing public distrust of Western vaccines and shifting public acceptance toward Russia’s Sputnik V vaccine. This is the most significant example of “perception warfare” in recent international history.

In fact, human society has long held "subduing the enemy without fighting" as the guiding principle of optimal military operations. Traditional warfare still takes place primarily in physical space, where victory requires capturing cities and territory and destroying enemy forces. As humanity's understanding of the nature of war has deepened, however, using information technology to achieve the effects of traditional warfare without physical engagement has become the new trend. Given the growing attention paid to "information warfare" and "hybrid warfare," this article discusses the closely related concept of "cognitive warfare" and explores the emerging threats facing our country and the all-out national defense strategies for responding to them.

Whether in what the United States calls "hybrid warfare" or what Russia calls "information warfare," the meaning is much the same: centered on the cognitive domain, information is used to influence and manipulate targets, covering both peacetime public opinion and wartime decision-making. Nazi Germany, which rose after World War I, was arguably the first modern regime to master the use of information to shape perceptions at home and even abroad; its propaganda and packaged lies, delivered through various communication technologies, proved highly effective. Principles such as "repetition is power" and "negative information is more readily accepted and remembered than positive information" would later profoundly influence authoritarian governments, including today's Russia.

 Using information capabilities to subvert national regimes

At the beginning of the 21st century, Russia began to take note of how international discourse power was completely controlled by Western countries. It successively put forward theories such as the "information warfare theory" and the "sixth-generation warfare theory," arguing that sixth-generation warfare is a non-contact war that dominates the battlefield with precision weapons and information warfare. The purpose of war is no longer a devastating global conflict, but to use information capabilities that exploit the enemy's weaknesses to achieve effects unattainable through traditional warfare, including changing a society's cultural orientation and values and thereby subverting the national regime.

 In 2005, Russia established the international news channel “Russia Today.” Initially focused on soft power propaganda, it shifted its focus after the 2008 Georgian War to attacking negative aspects of Western society and fostering conspiracy theories. The 2014 Ukraine crisis became a training ground for Russian information warfare forces. Using electronic jamming and cyber theft, they intercepted Ukrainian communications, inferring subsequent Ukrainian actions and releasing damaging information at critical moments. They also targeted sensitive issues in eastern Ukraine, including the status of ethnic Russians and economic downturn, distributing a large amount of carefully selected, targeted information to resonate with the public, influencing their perceptions and behavior and gaining control of media opinion. In terms of “cognitive warfare,” Russia’s approach has been successful, and has become a model for the Chinese Communist Party.

 Manipulating “brain control” to control the public

 In 2014, the Chinese Communist Party (CCP) proposed the cognitive operational concept of “brain control,” building on its past “three warfares” of psychological warfare, legal warfare, and public opinion warfare, as well as Russia’s theoretical framework of “information warfare.” It states that a nation’s cognitive space is composed of the superposition of countless individuals, and that “brain control” uses national languages, propaganda media, and cultural products as weapons to comprehensively infiltrate and control the cognition, emotions, and consciousness of the general public and national elites, ultimately distorting, disintegrating, and reshaping their national spirit, values, ideology, history, and culture, thereby achieving the strategic goal of winning without fighting.

 Therefore, the CCP’s “cognitive operations” fall under the broad category of psychological warfare. In the era of information globalization, it integrates information warfare, psychological warfare, and public opinion warfare, becoming the core of the CCP’s overall strategy. Since the 2016 military reform, it has been led by the newly formed “Strategic Support Force” and implemented at all political and military levels. On the one hand, the PLA has adopted American operational thinking in the field of “cognitive operations,” using units such as the 311 Base, the National University of Defense Technology, and the Academy of Military Sciences to develop tactics such as “psychological operations,” “ideological operations,” “consciousness manipulation,” and “strategic communication” to strengthen the “cognitive operations” capabilities jointly constructed by military-civilian integration and joint combat systems. On the other hand, it uses professional personnel to operate media platforms, shape the public opinion environment, and introduce “cognitive operations” into the actual combat application stage.

The CCP's recent "cognitive warfare" offensives against Taiwan reveal its methods and tactics. First, the CCP primarily uses the internet to collect personal data on Taiwanese citizens, using big-data databases to sort targets into groups by political leaning, age, occupation, and other factors. Second, drawing on intelligence gathering, it launches precision cognitive attacks on specific social media platforms to influence the psychology of the targeted groups, in particular releasing disinformation to weaken and distract Taiwanese society. Third, it uses online virtual organizations to set up fake social media accounts, infiltrate online communities, and pose as leakers and whistleblowers, deliberately spreading fabricated information to create confusion. Cyber troops then repost and discuss this information en masse, manipulating audience perceptions and driving an operational cycle of blocking information retention, manipulating cognitive psychology, and altering thinking patterns.
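The big-data segmentation step described above can be illustrated with a minimal, purely hypothetical sketch: profiles (all field names and data invented for illustration) are bucketed by attributes such as political leaning and age band, so that messaging could later be tailored per bucket.

```python
from collections import defaultdict

def segment_profiles(profiles):
    """Group profiles into target buckets keyed by (leaning, age_band)."""
    buckets = defaultdict(list)
    for p in profiles:
        age_band = "under_40" if p["age"] < 40 else "40_plus"
        buckets[(p["leaning"], age_band)].append(p["id"])
    return dict(buckets)

# Toy, invented data standing in for scraped personal attributes.
profiles = [
    {"id": "u1", "age": 25, "leaning": "A"},
    {"id": "u2", "age": 52, "leaning": "A"},
    {"id": "u3", "age": 33, "leaning": "B"},
]
print(segment_profiles(profiles))
# {('A', 'under_40'): ['u1'], ('A', '40_plus'): ['u2'], ('B', 'under_40'): ['u3']}
```

The point of the sketch is only that even trivial grouping logic, applied to harvested data at scale, yields the per-group targeting the article describes.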

 Identify fake news and fight back together

 At this stage, the CCP’s campaign for “brain control” over Taiwan aims to influence Taiwanese society’s cognition, distorting public opinion, devaluing democratic values, intensifying opposition, disrupting political conditions, and undermining public trust in the government. The following preventive measures can be taken within the national defense system:

1. Strengthening Educational Functions

 Through national defense education in schools, institutions, and society, we will raise the public’s awareness of the threat posed by the CCP’s “cognitive warfare” and their ability to identify false information, and cultivate the habit of rationality, verification, and calmness.

2. Adhering to Norms and Constraints

Although there are currently no internationally accepted legal rules that clearly define at what point cognitive warfare constitutes an act of war, which makes accountability even harder, media platforms can still strengthen review of their own reporting under existing regulations, and the public can refrain from spreading suspicious information or piling into online flame wars, facilitating the establishment of information-verification measures and mechanisms.

 3. Combining Military and Civilian Strength

Incorporate information- and communication-related institutions and industries into the all-out defense mobilization mechanism. In peacetime, coordinate the review, analysis, and handling of fake news, strengthen talent training and research cooperation, and enhance the capabilities of professional government and military units; in wartime, support overall national operations and carry out countermeasures.

Currently, Taiwan already has the National Security Bureau's National Security Operations Center, responsible for handling controversial information from hostile foreign forces, as well as the non-profit Taiwan FactCheck Center. Facing the challenge of cognitive warfare, we must continue to integrate efforts across sectors, pursue international intelligence exchange and experience sharing, optimize the media environment, collaborate through multiple channels, and identify the authenticity and source of information in real time, jointly building the capacity to respond to cognitive warfare offensives.

 Conclusion

In reality, countries around the world all face threats from cognitive warfare and information-psychological warfare. A free and democratic social environment, however, is by no means an easy breeding ground for cognitive warfare attacks, though it does depend on diverse strategies and methods for its protection. We hope to establish a more complete and substantive framework, build a powerful countering force, and raise the quality and discernment of our citizens, thereby gaining immunity in the CCP's cognitive warfare campaign to seize "control of the mind."

(The author is a PhD candidate at the Institute of Strategic Studies, Tamkang University)

Modern Chinese:

在全球持續積極對抗新冠疫情之際,英國《衛報》2021年5月下旬報道,與俄羅斯官員關係密切的公關和營銷機構Fazze被指控向法國、德國和其他歐洲國家頗具影響力的YouTube用戶、博主和其他意見領袖提供資金,用於傳播虛假信息,聲稱輝瑞(BNT)和阿斯利康(AZ)疫苗已導致數百人死亡。這些假訊息也批評歐盟疫苗採購體系損害了歐洲國家的公共衛生,目的是挑起大眾對西方疫苗的不信任,並促使大眾接受俄羅斯的Sputnik V疫苗。這是近代國際史上最顯著的「感知戰」案例。

事實上,人類社會自古以來,均以「不戰而屈人之兵」作為最佳軍事行動指導原則,儘管傳統戰爭主要仍在物理空間進行,需透過實際攻城掠地、消滅敵有生力量,才能獲得勝利。然隨人類對戰爭本質認知深化,利用資訊科技,於不需實體短兵相接的情況下,卻能達到傳統戰爭效果,已成為新型態戰爭趨勢。鑑於「資訊戰」、「混合戰」日益受重視,謹就與其密切相關的「認知作戰」概念進行論述,並探討我國所面臨的新型威脅及全民國防因應策略。

無論是美國所稱的「混合戰」,或俄國所說的「資訊戰」,其實指涉意涵很相似,即以認知領域為核心,利用訊息影響、操控對象目標涵蓋承平時期輿論及戰時決策的認知功能。一戰後,逐漸興起的納粹德國,可謂當代首個擅長運用資訊形塑本國,甚至外國民眾認知的政權,其透過各種傳播技術的政治宣傳與謊言包裝,相當成功;而所謂「重複是一種力量」、「負面訊息總是比正面訊息,更容易讓人接受和印象深刻」等實踐原則,日後更深刻影響專制極權政府與現在的俄羅斯。

藉資訊能力 顛覆國家政權

俄國於進入21世紀初,開始注意國際話語權遭西方國家完全掌控的情形,陸續提出「資訊戰理論」、「第6代戰爭理論」等論述,主張第6代戰爭是以精確武器及資訊戰,縱橫戰場的非接觸式戰爭,戰爭目的不再是毀滅性的全球大戰,而是藉利用敵方弱點的資訊能力,達成傳統戰爭無法實現的效果,包括改變社會文化取向、價值觀,進而顛覆國家政權等。

2005年,俄國成立國際新聞頻道「Russia Today」,起初主要是軟實力宣傳,2008年「喬治亞戰爭」後,轉為攻擊西方社會負面問題與製造陰謀論;2014年「烏克蘭危機」,成為俄軍資訊戰部隊的練兵場,透過電子干擾、網路竊密等手段,截收烏國對外通聯訊息,依此推判烏方後續舉動,並選擇在關鍵時機,釋放對烏國政府不利消息;另選定烏東地區敏感議題,包括俄裔民族地位、經濟不振等,投放大量經篩選的特定資訊,引發民眾共鳴,從而影響烏東人民認知與行為,取得媒體輿論主動權。就「認知作戰」言,俄國作法是成功的,更成為中共的效法對象。

操弄「制腦權」 控制社會大眾

中共2014年於過去心理戰、法律戰、輿論戰等「三戰」基礎,以及俄國「資訊戰」理論架構上,提出「制腦權」認知操作概念,指國家認知空間係由無數個體疊加而成,「制腦」是以民族語言、宣傳媒體、文化產品為武器,全面滲透、控制社會大眾與國家精英之認知、情感與意識,最終扭曲、瓦解、重塑其民族精神、價值觀念、意識形態、歷史文化等,達致不戰而勝的戰略目標。

是以,中共「認知作戰」屬於廣義心理戰範疇,是資訊全球化時代,融合資訊戰、心理戰及輿論戰的戰法,成為中共整體戰略主軸,並自2016年「軍改」後,由新組建的「戰略支援部隊」操盤,在各政略、軍事層次開展執行。一方面,共軍擷取美國在「認知作戰」領域的操作思維,以311基地、國防科技大學、軍事科學院等單位研提「心理作戰」、「思想作戰」、「意識操縱」、「戰略傳播」等戰法,以加強軍民融合及聯戰體系共同建構的「認知作戰」能力;另一方面,則以專業人員操作媒體平臺,形塑輿論環境,將「認知作戰」導入實戰運用階段。

從近年中共對臺進行的「認知作戰」攻勢,可拆解其途徑與手段。首先,中共主要係以網路蒐集國人個資,透過大數據資料庫,劃分政治傾向、年齡、職業等不同目標族群資訊;其次,配合情報偵蒐,針對個別社群媒體展開認知精準打擊,影響目標群眾心理,尤其釋放假訊息,以削弱、分散臺灣社會注意力;再次,則運用網路虛擬組織設置社群媒體假帳號,打入網路族群,偽裝成揭密者、吹哨者,刻意傳散變造資訊,製造混亂,再由網軍大量轉傳、討論,操弄受眾認知,進入阻斷資訊記憶、操縱認知心理、改變思考模式的運作循環。

識別假訊息 全民齊反制

基於現階段,中共對臺「制腦權」作戰,影響臺灣社會認知的目的,在於扭曲輿論、貶低民主價值、激化對立、擾亂政情、減損民眾對政府信任等,於全民國防體系可採取的防制辦法包括:

一、強化教育功能

分別透過全民國防之學校教育、機關教育、社會教育途徑,提高公眾對中共「認知作戰」威脅的認識,與對假訊息識別能力,養成理性、查證、冷靜習慣。

二、遵循約束規範

儘管目前尚無國際通用的法律規則,可明確定義何種程度的認知作戰已構成戰爭行為,更難以究責;然各媒體平臺仍可按既有規範,對自身報導內容加強審查,民眾也可做到不傳播可疑訊息、不跟風網壇混戰,俾利訊息查證措施與機制建立。

三、結合軍民力量

將資訊與傳播相關機構、產業,納入全民防衛動員機制,平時協調因應假訊息審查、分析、處置,加強人才培訓、研究合作,提升政府、國軍專業單位能力;戰時則配合國家整體作為,執行反制任務。

目前我國已有國安局「國家安全作業中心」執行對境外敵對勢力爭議訊息應處有關工作,民間亦有非營利組織成立的「臺灣事實查核中心」。面對「認知作戰」挑戰,仍應持續整合各界力量,爭取國際情報交流與經驗共享,優化媒體環境,多管道合作,即時辨識訊息真偽與來源,共同建設應處「認知作戰」攻勢能量。

結語

事實上,世界各國都同樣面臨「認知作戰」、「資訊心理戰」等相關威脅,然民主自由的社會環境,絕非易受「認知作戰」攻擊的溫床,更需仰賴多元策略與方式守護。期以更完善周全的實質架構,建構強而有力的反制力量,並提升我國公民素質及識別能力,於中共奪取「制腦權」的認知作戰中,獲得免疫。

(作者為淡江大學戰略研究所博士)

Original Chinese-language military source: https://www.ydn.com.tw/news/newsInsidePage?chapterID=1431550

China’s Weaponized Communication in International Public Opinion Warfare: Scenarios and Risk Responses

中國在國際輿論戰中的武器化傳播:場景與風險應對

Modern English:

【Abstract】 In the international public opinion war, weaponized communication has penetrated the military, economic, diplomatic, and other fields, bringing with it the imagination and practice that "everything can be weaponized." Weaponized communication manipulates public perception through technology, platforms, and policy, reflecting the complex interaction of power distribution and cultural contestation. Driven by globalization and digitalization, cognitive manipulation, social fragmentation, emotional polarization, digital surveillance, and information colonization have become new means of influencing national stability. This not only intensifies competition between information-strong and information-weak countries, but also gives information-weak countries the opportunity to turn the tables through flexible strategies and technological innovation. Under a globally asymmetric communication landscape, finding the point of convergence and balance between technological innovation and ethical responsibility, and between strategic goals and social equilibrium, will be a key factor shaping the future international public opinion landscape.

【Keywords】 Public opinion warfare; weaponized communication; information manipulation; asymmetric communication; information security

If "propaganda is a rational recognition of the modern world" [1], then weaponized communication is a rational application of modern technological means. In the "public opinion war," each participant pursues strategic goals through different communication methods, making them appear reasonable on the surface while remaining concealed. Unlike traditional military conflict, modern warfare involves not only physical confrontation but also competition across several fields, including information, economics, psychology, and technology. With advances in technology and globalization, the shape of war has changed profoundly, and traditional physical confrontation has gradually given way to multi-dimensional, multi-domain integrated warfare. In this process, weaponized communication, as a modern form of warfare, becomes an invisible instrument of violence: by controlling, guiding, and manipulating public opinion, it affects the psychology, emotions, and behavior of an adversary or target audience, thereby achieving political, military, or strategic ends. Clausewitz's "On War" holds that war is an act of violence intended to render the enemy unable to resist and compel him to submit to our will. [2] In modern warfare, achieving this goal relies not only on the confrontation of military forces but also on support from non-traditional domains such as information, networks, and psychological warfare. Sixth-generation warfare heralds a further shift in the shape of war, emphasizing the application of emerging technologies such as artificial intelligence, big data, and unmanned systems, as well as comprehensive contests in the information, network, psychological, and cognitive domains. The "front line" of modern warfare has expanded to include social media, economic sanctions, and cyberattacks, requiring participants to possess stronger capabilities for information control and public opinion guidance.

At present, weaponized communication has penetrated the military, economic, diplomatic, and other fields, bringing with it the apprehension that "everything can be weaponized." In the sociology of war, communication is seen as an extended instrument of power, with information warfare penetrating deeply alongside traditional warfare. Weaponized communication operates precisely within this framework of information control: by shaping public perceptions and emotions, it consolidates or weakens the power of states, regimes, or non-state actors. This process occurs not only in wartime but also shapes power relations within and between states in peacetime. In international political communication, information manipulation has become a key tool of great power competition, as countries seek to influence global public opinion and international decision-making by spreading disinformation and launching cyberattacks. Public opinion warfare is thus not merely a means of information dissemination; it also involves adjustments in power games and diplomatic relations between countries, directly affecting the governance structure and power pattern of the international community. On this basis, this paper examines the conceptual evolution of weaponized communication, analyzes the social mentality behind it, elaborates the specific technical means and the risks they entail, and proposes multidimensional response strategies at the national level.

1. From the weaponization of communication to weaponized communication: conceptual evolution and metaphor

Weapons have been symbols and tools of war throughout human history, and war is the most extreme and violent form of conflict in human society. "Weaponized" thus refers to the use of certain tools for confrontation, manipulation, or destruction in warfare, emphasizing the way those tools are employed. "Weaponize" can be glossed as "to make it possible to use something to attack an individual or group of people." The term "weaponization" entered use as a military term in 1957, when Wernher von Braun, leader of the V-2 ballistic missile team, stated that his main work was "weaponizing the military's ballistic missile technology" [3].

"Weaponization" first appeared in the space field during the arms race, as the United States and the Soviet Union competed for dominance in outer space. The "weaponization of space" refers to the development, deployment, or use of military weapons systems in space, including satellites, anti-satellite weapons, and missile defense systems, for strategic, tactical, or defensive operations. From 1959 to 1962, the United States and the Soviet Union proposed a series of initiatives to ban the military use of outer space, especially the deployment of weapons of mass destruction in orbit. In 2018, then-U.S. President Trump signed Space Policy Directive-3, launching the construction of the "Space Force" and treating space as a combat domain on the same level as land, air, and sea. In 2019, the "Joint Statement of the People's Republic of China and the Russian Federation on Strengthening Contemporary Global Strategic Stability" proposed "prohibiting the placement of any type of weapons in outer space" [4].

Beyond the space sector, there is also a trend toward weaponization in the military, economic, and diplomatic fields. "Military weaponization" is the use of resources (such as drones and nuclear weapons) for military purposes, the deployment of weapons systems, or the development of military capabilities. During the Russo-Ukrainian war in 2022, a report from the Royal United Services Institute indicated that Ukraine was losing approximately 10,000 drones every month to Russian jamming stations. [5] "Weaponization" also appears frequently in expressions such as "financial war" and "diplomatic battlefield." In the economic sphere, weaponization usually refers to states or organizations exploiting shared resources or mechanisms in the global financial system; diplomatic weaponization is manifested in countries pursuing their own interests and pressuring other countries through economic sanctions, diplomatic isolation, and the manipulation of public opinion. Over time, the concept of "weaponization" has gradually expanded into the political, social, cultural, and other fields, especially the information field; since the 2016 U.S. presidential election, manipulation of public opinion has become a universal tool of political struggle. David Petraeus, a former director of the CIA, once remarked at a National Institute for Strategic Studies conference that the time of "the weaponization of everything" had arrived. [6]

As a metaphor, "weaponization" refers not only to the use of actual physical tools but also symbolizes a turn toward adversarial and aggressive behavior, emphasizing how the concept of "weapons" permeates daily life, cultural production, and political strategy, and showing how social actors use various tools to achieve strategic goals. Today, many areas that should remain neutral, such as the media, the law, and government agencies, are often described as "weaponized" to criticize their excessive politicization and improper use, highlighting their illegitimacy and negative influence on society. Through this metaphor, one unconsciously contrasts the current political environment with an idealized, seemingly more moderate past, suggesting that the political climate of the past was more rational and civilized while the present appears overly extreme and oppositional. [7] The essence of "weaponization" is therefore a process of politicization: political forces use various means and channels to influence or control areas that should remain neutral, turning them into instruments of political purpose and political struggle.

In the information field, the weaponization of communication is a long-standing strategic practice. During the two world wars, propaganda and public opinion warfare were widely used by the belligerents, and means of communication served as a psychological tactic. Weaponized communication is the embodiment of this weaponization in the modern information society: it uses algorithms and big-data analysis to precisely control the speed and scope of information dissemination and thereby steer public opinion and emotion. It reflects the combination of technology, platforms, and strategy, enabling political forces to control the public perception and public opinion environment more accurately and efficiently. As the substance of public opinion, information is "weaponized" to influence social cognition and group behavior, and the concept of "war" has changed accordingly: no longer just traditional military confrontation, it now includes psychological and cognitive warfare conducted through information dissemination and public opinion manipulation. This shift has produced a range of new terms, such as unrestricted warfare, new-generation warfare, asymmetric warfare, and irregular warfare. Nearly all of these borrow the word "warfare" to emphasize diverse conflicts in the information field, with information as the core content of "weaponization."

Although some hold that the term "war" does not apply where hostilities have not been formally declared [8], weaponized communication extends the concept of "war" by weakening its traditional political attributes and treating overt or covert forces and forms across various fields, in general terms, as acts of communication. Note that English has two formulations: "weaponized + noun," meaning that something has been given a weapon-like function or purpose, and "weaponization of + noun," referring to the process of converting something into a weapon or endowing it with the nature of a weapon. In academic usage the Chinese translations differ, although "weaponized communication" and "the weaponization of communication" are not yet strictly distinguished: the former focuses on the means of communication or the message itself being weaponized to achieve a strategic goal, while the latter emphasizes communication itself as a process of transformation into a weapon. When discussing specific technical means, most academic papers use "weaponized" or "weaponizing" as a prefix modifying a particular means of dissemination.

This article focuses on specific communication strategies in the international public opinion war and on describing weaponization phenomena that have already occurred, so it uniformly uses "weaponized communication": a strategic mode of communication that employs communication means, technical tools, and information platforms to precisely control information flows, public cognition, and emotional responses in order to achieve specific military, political, or social ends. Weaponized communication is not confined to a state of war or wartime; it is a continuous communication phenomenon, reflecting the interaction and contest among various actors within a shared flow of information and space of meaning.

2. Application scenarios and implementation strategies of weaponized communication

If at the end of the 1990s weaponization in the information field was still a marginal topic, with countries mainly pursuing upgrade races in physical weapons such as missiles and drones, then in the 21st century cyber wars have truly entered the public eye and become deeply embedded in daily life: through social media and smart devices, the public is inevitably drawn into the public opinion war, unconsciously becoming participants or communication nodes. With the spread of technology, weaponized means have gradually expanded from state-led instruments of war into socialized and politicized arenas, and control over individuals and society has shifted from the explicit state apparatus to more covert conceptual manipulation. The exposure of the PRISM surveillance program raised strong global concerns about privacy breaches, highlighting states' capacity to use advanced technology for surveillance and control, itself seen as a new type of weaponization. Since Trump's election as U.S. President in 2016, the large-scale deployment of information weapons such as social bots has become a common feature of global political competition. Information warfare, including electronic warfare, computer network warfare, psychological warfare, and military deception, is widely used to manipulate the flow of information and shape the public opinion landscape. These methods operate not only in military wars and political elections but also increasingly permeate cultural conflicts, social movements, and transnational contests, perpetuating the traditional logic of information warfare. Today, weaponized communication, as a socio-political tool, profoundly affects the public opinion ecology, international relations, and individuals' daily lives.

(1) Information manipulation warfare in the military field

Information flows can directly influence the course of military conflicts, shaping public and military perceptions and decisions, which in turn affect morale, strategic judgment, and social stability. In modern warfare information is no longer a mere auxiliary; the information domain has become a central battleground. By manipulating information flows, an adversary's situational assessment can be misled, its will to fight weakened, and popular trust and support shaken, affecting the decision-making process and the sustainability of the war.

The Gulf War is regarded as the beginning of modern information warfare. In that war the United States carried out systematic strikes against Iraq through high-tech means, including electronic warfare, air strikes, and information operations. The U.S. military used satellites and AWACS early-warning aircraft to monitor the battlefield in real time, and worked on Iraqi soldiers psychologically, airdropping leaflets and broadcasting radio messages conveying the U.S. military's advantages and its preferential treatment policy for those who surrendered. The war marked the central place of information control in military conflict and demonstrated the potential of information warfare. In the 21st century, cyberwarfare has become an important part of information warfare, involving not only the dissemination and manipulation of information but also control over an enemy's social functions through attacks on critical infrastructure. In 2007 Estonia suffered a large-scale DDoS (distributed denial-of-service) attack, demonstrating the trend toward fusing information manipulation with cyberattacks. In the 2017 WannaCry ransomware incident, attackers exploited a Windows vulnerability (EternalBlue) to encrypt the files of approximately 200,000 computers in 150 countries and demand ransom, seriously affecting the British National Health Service (NHS), interrupting emergency services, and paralyzing hospital systems, further revealing the threat cyber warfare poses to critical infrastructure. In addition, in long-term conflicts control of infrastructure is widely used to undermine adversaries' strategic capabilities in the contest for public information space, because it directly determines the speed, scope, and direction of information dissemination.
Israel has effectively weakened Palestinian communications capabilities by restricting the use of radio spectrum, controlling Internet bandwidth, and disrupting communications facilities. At the same time, Israel restricts the development of the Palestinian telecommunications market through economic sanctions and legal frameworks, suppressing Palestinian competitiveness in the flow of information and consolidating its own strategic advantage in the conflict, thereby maintaining an unequal flow of information [9].

Social media provides an immediate and extensive channel for information manipulation, allowing it to cross borders and influence global public sentiment and political situations, and shifting the focus of war from mere physical destruction to the manipulation of public opinion. During the Russo-Ukrainian war, deepfake technology was used as a visual weapon that significantly interfered with public perception of the war. On March 15, 2022, a fake video of Ukrainian President Volodymyr Zelenskyy circulated on Twitter, in which he "called on" Ukrainian soldiers to lay down their weapons, triggering public confusion for a short period. Similarly, fake videos of Russian President Vladimir Putin have been used to confuse the public. Although the platform promptly labeled the videos with "Stay informed" notices urging users to verify the situation, they still caused obvious interference with public emotions and perceptions in the short term. These events highlight the critical role of social media in modern information warfare, in which state and non-state actors can interfere in military conflicts through disinformation, emotional manipulation, and other means.

The complexity of information manipulation warfare is also reflected in its dual nature: it is both a tool of attack and a means of defense. In the military sphere, states defend against and counter cyberattacks to ensure national security, protect critical infrastructure, and maintain military secrets, and in some cases to influence an adversary's combat effectiveness and decision-making. In 2015 and 2017, Russian hackers launched large-scale cyberattacks against Ukraine (such as BlackEnergy and NotPetya); Ukraine successfully resisted some of them and responded by rapidly upgrading its cyber defense systems, avoiding larger-scale infrastructure paralysis. In addition, units such as the NATO Strategic Communications Centre of Excellence and the British 77th Brigade focus on peacetime public opinion shaping [10], using strategic communications, psychological warfare, and social media monitoring to expand strategic control of the information field and strengthen defensive and opinion-shaping capabilities, further raising the strategic stakes of information warfare.

Today, information manipulation warfare is a key element of modern military conflict. Through the deep integration of information technology and psychological manipulation, it not only changes the rules of traditional warfare but also profoundly affects public perception and the global security landscape. By controlling critical infrastructure and social media platforms, states, multinational corporations, and other actors can gain strategic advantage in the global information ecosystem by restricting the flow of information and manipulating communication paths.

(2) Public opinion intervention in political elections

Political elections are the most direct arena of competition for power in democratic politics, and the dissemination of information has an important influence on voter decision-making during the process. Through computational propaganda and other means, external forces or political groups can manipulate voter sentiment and mislead the public, thereby influencing election results, destabilizing politics, or weakening the democratic process; elections are thus the most effective application scenario for weaponized communication.

In recent years, global political elections have shown a trend toward polarization, with large ideological differences between groups of different political affiliations. Polarization leads the public to selectively accept information consistent with their own views while excluding other information, and this “echo chamber effect” intensifies one-sided perceptions of political positions, giving public opinion intervention greater scope. Meanwhile, the rise of information dissemination technology, especially computational propaganda, has enabled external forces to manipulate public opinion and influence voter decision-making more precisely. Computational propaganda refers to the use of computing technology, algorithms, and automated systems to control the flow of information in order to disseminate political messages, interfere with election results, and influence public opinion. Its core characteristics are algorithm-driven precision and the scale of automated dissemination; by breaking through the limitations of traditional manual communication, it significantly enhances the effect of public opinion manipulation. In the 2016 U.S. presidential election, the Trump team analyzed Facebook user data through Cambridge Analytica and pushed customized political advertisements to voters, precisely targeting their voting intentions [11]. This incident was seen as a classic case of computational propaganda interfering in elections; it also provided an operational template for other politicians, driving the widespread use of computational propaganda worldwide. In the 2017 French presidential election, candidate Emmanuel Macron’s team was hacked, and internal emails were stolen and made public, claiming that Macron held secret accounts overseas and was involved in tax evasion, in an attempt to discredit him.
During the 2018 Brazilian presidential election, the team of candidate Jair Bolsonaro used WhatsApp groups to spread inflammatory political content, pushing large volumes of targeted images, videos, and incendiary messages to influence voter sentiment. According to statistics, from 2017 to 2019 the number of countries worldwide using computational propaganda increased from 28 to 70, and in 2020 the number rose to 81. This suggests that computational propaganda is redefining the rules of public opinion in global elections through technical means and communication strategies.

Computational propaganda is also an important tool for state actors in public opinion intervention warfare. In 2011, the U.S. Defense Advanced Research Projects Agency (DARPA) launched Operation “Earnest Voice” in the Middle East to distort conversations on Arabic-language social media by establishing and managing multiple false identities (sockpuppets). Russia also frequently intervenes with computational propaganda, operating about 200,000 social media accounts in Canada and using far-right and far-left movements to spread pro-Russian rhetoric, create false social hot spots, and try to undermine Canada’s support for Ukraine [12]. As an important component of computational propaganda, social bots create the appearance of public opinion momentum through automation and scale, increase the exposure of information on social platforms through specific hashtags, and control the priority of issues. During the 2016 U.S. election, Russia used social bots to post content supporting Putin and attacking the opposition, drowning out opposing voices through information overload and reinforcing a pro-Putin public opinion atmosphere [13]. During the 2017 Gulf crisis, Saudi Arabia and Egypt used Twitter bots to create the anti-Qatar hashtag #AlJazeeraInsultsKingSalman, making it a trending topic and fabricating a peak of anti-Qatar sentiment, which in turn affected global public attitudes toward Qatar [14]. Deepfake technology further improves the precision and concealment of computational propaganda. In 2024, a fake video of U.S. President Joe Biden went viral on X (formerly Twitter), showing him using offensive language in the Oval Office, sparking public controversy and influencing voter sentiment. According to a survey by cybersecurity firm McAfee, 63% of respondents had watched a political deepfake video within the previous two months, and nearly half said the content influenced their voting decisions [15].
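The bot-driven amplification described above leaves measurable traces: near-duplicate text and traffic concentrated in a handful of accounts. The Python sketch below is a toy illustration of two such signals, not any platform’s actual detection pipeline; the sample data and the choice of signals are invented for demonstration.

```python
from collections import Counter

def amplification_score(posts):
    """Score a hashtag's traffic for signs of coordinated amplification.

    posts: list of (account_id, text) tuples.
    Returns (near-duplicate fraction of posts,
             share of posts contributed by the 3 most active accounts).
    """
    texts = [text.strip().lower() for _, text in posts]
    dup_fraction = 1 - len(set(texts)) / len(texts)

    by_account = Counter(acct for acct, _ in posts)
    top_share = sum(n for _, n in by_account.most_common(3)) / len(posts)
    return dup_fraction, top_share

# Organic-looking traffic: varied text, many distinct accounts
organic = [(f"user{i}", f"my take on the summit #{i}") for i in range(10)]
# Bot-like traffic: identical text pushed by two accounts
botnet = [("bot1", "SHARE NOW #outrage")] * 5 + [("bot2", "SHARE NOW #outrage")] * 5

print(amplification_score(organic))  # low duplication, dispersed accounts
print(amplification_score(botnet))   # high duplication, concentrated accounts
```

Real detection systems combine many more features (account age, posting cadence, follower-network structure), but even these two ratios cleanly separate the organic and bot-like samples above.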

Globally, computational propaganda has infiltrated public opinion wars in many countries, affecting social stability and national security. The Israel Defense Forces waged a public opinion war over Palestine through digital means, Turkey cultivated a “patriotic troll army” to manipulate public opinion at home and abroad, and the Mexican government used botnets to influence public opinion. As an important means of modern public opinion intervention warfare, computational propaganda is changing the landscape of global political communication. With the development of technologies such as artificial intelligence and quantum computing, computational propaganda may come to interfere with electoral processes through even more covert and efficient means, or even directly threaten the core operating logic of democratic institutions.

(3) Symbolic identity war in the cultural field

Weaponized communication attempts to influence the public’s thoughts, emotions, and behaviors by manipulating information, symbols, and values, which in turn shapes or changes society’s collective cognition and cultural identity. This mode of communication consists not only in transmitting information but also in promoting the spread of, and identification with, a specific ideology or political idea through particular narrative frameworks, cultural symbols, and emotional resonance. Through the manipulation of cultural symbols, social emotions, and collective memory, weaponized communication interferes with social structure and cultural identity, becoming a core means of symbolic identity warfare in the cultural field.

Memes, as cultural symbols combining visual elements with concise text, stimulate audiences’ emotional responses in humorous, satirical, or provocative ways, affecting their political attitudes and behaviors. Pepe the Frog began as a harmless comic character that was repurposed and weaponized by far-right groups to spread hate speech, gradually evolving into a racist and anti-immigrant symbol. Memes transform complex political sentiments into easily spread visual symbols that quickly stir up public distrust of and anger over policy, a process described as “iconoclastic weaponization”: manipulating cultural symbols to serve political or social struggle [16], thereby deepening social and political division. For example, during Brexit, memes bearing the slogan “Take Back Control” spread rapidly, reinforcing nationalist sentiment.

In addition to the manufacture of cultural symbols, the filtering and blocking of symbols can equally shape or deepen a cultural identity or political stance. Censorship has been an important means for power to control information since ancient times; as early as the Greek and Roman periods, governments censored public speeches and literary works to maintain social order and the stability of power. In the digital age, the rise of the Internet and social media has driven the modernization of censorship, and platform moderation has gradually replaced traditional methods as a core tool of contemporary information control and public opinion guidance. Algorithmic review uses artificial intelligence to detect sensitive topics, keywords, and user behavior data, automatically deleting or blocking content deemed to be in “violation,” while social media review teams manually screen user-generated content to ensure compliance with platform policies and applicable laws. The role of platform moderation is not only to limit the dissemination of certain content but also to guide public opinion and shape the public’s perceptual framework through promotion, deletion, and blocking. Although mainstream social platforms control the spread of information through strict content moderation mechanisms, fringe platforms such as Gab, Gettr, and Bitchute have become hotbeds of extreme speech and malicious information due to the lack of effective moderation. These platforms place few restrictions on content publishing, allowing extreme views and disinformation to spread unchecked. For example, Gab has been repeatedly criticized for its extremist content and accused of promoting violence and hatred. In the “echo chamber,” users access only information consistent with their own views, an information environment that further reinforces extreme ideas and increases antagonism among social groups [17].
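As a rough illustration of the algorithmic first pass described above, the following Python sketch applies tiered keyword rules before escalating to human review. The pattern lists and tier names are invented for illustration; real platforms layer machine-learning classifiers, user reports, and human moderators on top of any such rules.

```python
import re

# Hypothetical rule set for demonstration only.
BLOCK_PATTERNS = [r"\bkill\s+all\b", r"\bethnic\s+cleansing\b"]
FLAG_PATTERNS = [r"\bfake\s+vaccine\b", r"\brigged\s+election\b"]

def review(text):
    """Toy first-pass algorithmic review.

    Returns 'remove' for clear rule violations, 'flag_for_human'
    for borderline content, and 'allow' otherwise.
    """
    lowered = text.lower()
    if any(re.search(p, lowered) for p in BLOCK_PATTERNS):
        return "remove"
    if any(re.search(p, lowered) for p in FLAG_PATTERNS):
        return "flag_for_human"
    return "allow"

print(review("Breaking: the rigged election results are out!"))  # flag_for_human
print(review("Lovely weather today"))                            # allow
```

The tiering reflects the division of labor the text describes: automation handles unambiguous cases at scale, while ambiguous content is routed to human review teams.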

Language, as a carrier and tool of information dissemination, can profoundly influence group behavior and cultural identity through emotional manipulation, symbolic politics, and social mobilization. The weaponization of language concerns how linguistic forms and cultural contexts affect the way information is received, emphasizing how language can be used to manipulate, guide, or change people’s cognition and behavior. This involves not only specific lexical and rhetorical devices but also the construction of particular social meanings and cultural frameworks through linguistic representation. As another important tool of symbolic identity warfare, language shapes the narrative framework of “us versus them” antagonism. The Great Translation Movement spread the nationalist rhetoric of Chinese netizens to international social media platforms through selective translation, triggering negative perceptions of China. This language manipulation amplifies controversial content through emotional expression and deepens the cultural biases of the international community.

The deep logic of the weaponization of language lies in emotional and inflammatory language forms. Western countries often justify acts of intervention with labels of justice such as “human rights” and “democracy,” legitimizing political or military action. White supremacists reshape ideologies using vague labels such as “alt-right,” transforming the strongly negative traditional term “white supremacist” into a more neutral concept, reducing the vocabulary’s social resistance and broadening the base of supporters under a broad “umbrella” identity. Through infiltration of everyday discourse, hate politics and extreme speech are legitimized, gradually becoming a political normality; language is truly weaponized once the public routinizes this politics [18]. In Nigeria, hate-mongering content spreads through racial, religious, and regional topics, profoundly deteriorating social relations [19]. Linguistic ambiguity and plausible-deniability strategies have also become powerful tools for communicators to evade responsibility and package complex social and political issues in simplified narratives. Through negative labeling and emotional discourse, Trump’s “America First” policy deliberately advanced positions opposed to mainstream opinion, opposing globalization, questioning climate science, and criticizing traditional allies, stimulating public distrust of globalization and reshaping a cultural identity of national interests first [20].

III Risks and challenges of weaponized communication: legitimacy and destructiveness

Although weaponized communication poses great risks to the international public opinion landscape, certain countries or groups may grant it a degree of legitimacy through legal, political, or moral frameworks in specific situations. For example, after the September 11 attacks, the United States passed the Patriot Act, expanding the surveillance authority of intelligence agencies and implementing extensive information control in the name of “counter-terrorism.” This “legitimacy” is often criticized as undermining civil liberties and eroding the core values of democratic society.

In the game of international politics, weaponized communication is more often seen as a “gray zone” tactic. Confrontations between countries are no longer limited to economic sanctions or diplomatic pressure but are waged through non-traditional means such as information manipulation and social media intervention. Some states use the “protection of national interests” as a pretext to disseminate false information, arguing that their actions are compliant; although such actions may be controversial under international law, they are often justified as necessary means of “countering external threats.” In some countries where information regulation lacks a strict legal framework, interference in elections is often tolerated or even seen as a “justified” political exercise. At the cultural level, certain countries attempt to build global cultural influence by disseminating specific cultural symbols and ideologies. Western countries often promote the spread of their values in the name of “cultural sharing” and the “dialogue of civilizations,” but in practice they weaken the identity of other cultures by manipulating cultural symbols and narrative frameworks, leading to an imbalance in the global cultural ecology. Legal frameworks also provide a degree of support for justifying weaponized communication. In the name of “counter-terrorism” and “anti-extremism,” some countries restrict the dissemination of so-called “harmful information” through information censorship, content filtering, and other means. However, this justification often pushes moral boundaries, leading to information blockades and suppression of speech. Information governance on “national security” grounds, although internally accepted to a certain extent, provides space for the proliferation of weaponized communication.

In contrast to its contested legitimacy, the destructiveness of weaponized communication is unambiguous. At present, weaponized communication has become an important tool for power structures to manipulate public opinion: it not only distorts the content of information but also profoundly affects public perception, social emotions, and international relations through privacy violations, emotional mobilization, and cultural penetration.

(1) Information distortion and cognitive manipulation

Distortion of information means that information is deliberately or unintentionally altered during dissemination, producing significant differences between what the public receives and the original message. On social media, disinformation and misleading content spread rampantly, and content generated by artificial intelligence models (such as GPT) may worsen the problem: biases of gender, race, or social class in training data can be reflected in automatically generated text, amplifying the risk of distortion. The fast-spreading nature of social media also makes it difficult for traditional fact-checking mechanisms to keep up; false stories often dominate public opinion in a short period, and cross-platform dissemination and anonymity complicate clarification and correction. These asymmetries in communication undermine the authority of traditional news organizations, and the public’s preference for instantly updated social platform content over in-depth coverage further diminishes the role of news organizations in resisting disinformation.

In addition to distorting information itself, weaponized communication makes profound use of the psychological mechanism of cognitive dissonance. Cognitive dissonance refers to the psychological discomfort that occurs when an individual encounters information that conflicts with pre-existing beliefs or attitudes. By creating cognitive dissonance, communicators shake the established attitudes of their target audience and can even induce them to accept new ideologies. In political elections, targeted dissemination of negative information often forces voters to re-examine their political positions or even change their voting intentions. Through selective exposure, weaponized communication further intensifies the formation of “information cocoons,” in which audiences tend to access information consistent with their own beliefs while ignoring or rejecting opposing views. This not only reinforces individuals’ cognitive biases but also allows disinformation to spread rapidly within the group, resistant to external facts and rational voices, ultimately forming a highly homogeneous public opinion ecology.

(2) Privacy leakage and digital monitoring

In recent years, the abuse of deepfakes has exacerbated privacy violations. In 2019, the “ZAO” face-swapping app was taken down because its terms defaulted users into consenting away their portrait rights, exposing the risk of over-collection of biometric data. Photos uploaded by users and processed through deep learning can either generate a convincing face-swapped video or become a source of privacy leaks. Worse, deepfake techniques are abused for gender-based violence: the faces of multiple European and American actresses have been illegally grafted into fake sexual videos and widely distributed, and although platforms remove such content in some cases, the availability of open-source tools makes it easy for malicious users to copy and re-share forged material. In addition, when using social media, users tend to grant platforms default access to their devices’ photos, cameras, microphones, and other permissions. Through these permissions, the platform not only collects large amounts of personal data but also uses algorithms to analyze users’ behavioral characteristics, interests, and social relationships, allowing it to deliver targeted advertising, recommend content, and even carry out information manipulation. This large-scale data acquisition has driven global discussion of privacy protection. In Europe, the General Data Protection Regulation attempts to strengthen individuals’ right to privacy through strict rules on data collection and use. However, through “implied consent” or complex user agreements, platforms often circumvent these rules, making data processing opaque and leaving ordinary users unable to understand what their data is actually used for. Section 230 of the U.S. Communications Decency Act provides that online platforms are not legally responsible for user-generated content, a provision that has fueled the development of platform content moderation but has also left platforms with little incentive to respond to privacy infringements. Motivated by commercial interests, platforms often lag in addressing disinformation and privacy issues, leading to the ongoing shelving of moderation responsibilities.

In terms of digital surveillance, social platforms work with governments to make user data a core resource of “surveillance capitalism.” The U.S. National Security Agency (NSA) conducts mass surveillance through phone records, Internet communications, and social media data, working with large enterprises such as Google and Facebook to obtain users’ online behavioral data for intelligence gathering and behavioral analysis worldwide. The abuse of transnational surveillance technologies pushes privacy violations to the international level. Pegasus, spyware developed by the Israeli cybersecurity company NSO Group, compromises target devices through “zero-click” attacks and can steal private information and communication records in real time. In the 2018 murder of Saudi journalist Jamal Khashoggi, the Saudi government had monitored his communications through Pegasus, revealing the profound threat this technology poses to individual privacy and international politics.

(3) Emotional polarization and social division

Emotions play a key role in individual cognition and decision-making. Weaponized communication undermines rational judgment by inciting fear, anger, sympathy, and similar feelings, pushing the public toward irrational, emotion-driven reactions. War, violence, and nationalism often become the main content of emotional mobilization: through carefully designed topics, communicators embed elements such as patriotism and religious belief in information dissemination, quickly arousing emotional resonance. The widespread adoption of digital technologies, particularly the combination of artificial intelligence and social media platforms, further amplifies the risk of emotional polarization. The rapid spread of disinformation and extreme speech on platforms comes not only from the sharing behavior of ordinary users but is also driven by algorithms: platforms tend to prioritize emotional, highly interactive content, which often contains inflammatory language and extreme views, exacerbating the spread of hate speech.
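The algorithmic preference for emotional, high-interaction content can be illustrated with a toy ranking function. The weights and field names below are hypothetical, not any platform’s real scoring formula; the point is only that any rule rewarding strong reactions will surface inflammatory posts above calmer ones.

```python
def engagement_score(post):
    """Hypothetical feed-ranking heuristic: weight reactions that signal
    strong emotion (anger, shares) more heavily than quiet approval."""
    return 3.0 * post["angry"] + 2.0 * post["shares"] + 1.0 * post["likes"]

def rank_feed(posts):
    """Order posts by descending engagement score, as a feed might."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "calm_explainer", "likes": 120, "shares": 5,  "angry": 2},
    {"id": "outrage_bait",   "likes": 40,  "shares": 60, "angry": 90},
]
for p in rank_feed(posts):
    print(p["id"], engagement_score(p))
# outrage_bait outranks calm_explainer despite far fewer likes
```

Even though the calm post has three times as many likes, the weighting pushes the inflammatory post to the top, which is the dynamic the paragraph describes.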

Social media hashtags and algorithmic recommendations play a key role in emotional polarization. After the Charlie Hebdo attack, the #StopIslam hashtag became a vehicle for hate speech, with users posting hateful and violence-tinged messages under it. During the 2020 U.S. presidential election, extreme political rhetoric and misinformation on social platforms were likewise amplified in a bitter partisan struggle. Through precise emotional manipulation, weaponized communication not only tears apart public dialogue but also deeply affects society’s democratic processes. Another notable extremist mobilization tactic is “weaponized autism,” in which far-right groups exploit the technical expertise of autistic individuals to carry out emotional manipulation. These groups recruit technically capable but socially isolated individuals and, by offering a false sense of belonging, turn them into enforcers of information warfare. Guided by extremist groups, such individuals are used to spread hate speech, carry out cyberattacks, and promote extremism. This phenomenon reveals not only the deep mechanisms of emotional manipulation but also how technology can be exploited by extremist groups to serve larger political and social agendas [21].

(4) Information colonization and cultural penetration

The theory of “weaponized interdependence” reveals how states use key nodes in political, economic, and information networks to exert pressure on other states [22]. In the information field especially, developed countries further consolidate their cultural and political advantages by controlling information flows, implementing a form of “information colonization.” Digital platforms have become the vehicles of this colonial process: countries of the Global South are highly dependent on Western-dominated technology platforms and social networks for information dissemination, and in sub-Saharan Africa, Facebook has become synonymous with “the Internet.” This dependence not only generates huge advertising revenues for Western businesses but also, through algorithmic recommendation, exerts a profound influence on indigenous African cultures and values, especially regarding gender, family, and religious belief, making cultural penetration the norm.

Digital inequality is another manifestation of information colonization. The dominance of developed countries in digital technology and information resources has increasingly marginalized countries of the Global South in the economic, educational, and cultural fields. Palestine’s inability to integrate effectively into the global digital economy, due to inadequate infrastructure and technological blockade, both limits local economic development and further weakens its voice in global information dissemination. Through technological blockades and economic sanctions, the major economic and information powers restrict other countries’ access to key technological and innovation resources, which not only hinders scientific and technological development in target countries but also deepens the rupture of the global technology and innovation ecosystem. Since the United States withdrew from the Iran nuclear deal in 2018, its economic sanctions on Iran have blocked Iran’s development in the semiconductor and 5G sectors, and this asymmetry in technology and innovation has widened the gap in the global technology ecosystem, putting many countries at a disadvantage in information competition.

IV Reflection and discussion: the battle for the right to speak in an asymmetric communication landscape

In the competitive landscape of “asymmetric communication,” strong parties often dominate public opinion through channels such as mainstream media and international news organizations, while weak parties must use innovative communication technologies and methods to compensate for their disadvantages and compete for the right to speak. At the heart of this landscape lies information geopolitics: the idea that the contest of power between states depends not only on geographical position, military power, or economic resources but also on control over information, data, and technology. The game between great powers is no longer limited to control of physical space but extends to competition for public opinion space. These “information landscapes” involve the right to speak, the circulation of information, and media influence within the global communication ecosystem. In this process, states continuously construct such landscapes to influence international public opinion and shape the global cognitive framework, thereby achieving their strategic goals. The strategy of asymmetric communication concerns not only the transmission of content but, more importantly, how to bridge gaps in resources and capabilities with the help of various communication technologies, platforms, and methods. The core of information communication is no longer the content itself; the competition revolves around the right to speak. With the rise of information warfare and cognitive warfare, whoever controls information gains a head start in global competition.

(1) Technology catching up under the advantage of latecomers

Traditional great powers and strong communicators control the dominance of global public opinion; by contrast, weak countries often lack communication channels that can compete with them. The theory of latecomer advantage holds that latecomer countries can rise rapidly by leapfrogging: bypassing traditional technological paths, introducing existing advanced technologies and knowledge, and thereby avoiding the inefficient and outdated stages of early technological innovation. In the context of weaponized communication, this theory offers information-weak countries a path to break through the communication barriers of great powers with emerging technologies, helping them catch up at the technical level. Traditional media are often constrained by resources, influence, and censorship mechanisms, with slow dissemination, limited coverage, and vulnerability to manipulation by specific countries or groups. The rise of digital media has fundamentally changed the landscape of information dissemination, enabling disadvantaged countries to reach international audiences directly through globalized Internet platforms without relying on traditional news organizations and mainstream media. Through emerging technologies, disadvantaged countries can not only transmit information more precisely but also rapidly expand their influence in international public opinion through targeted communication and emotional guidance. Late-developing countries can use advanced technologies (such as big data, artificial intelligence, and 5G networks) to achieve precise information dissemination and build efficient communication channels. Taking big data analysis as an example, latecomer countries can gain an in-depth understanding of audience needs and public opinion trends, quickly identify the pulse of global opinion, implement targeted communication, and rapidly expand international influence. AI technology can both predict the direction of public opinion and optimize communication strategies in real time, and the spread of 5G networks has greatly improved the speed and coverage of information dissemination, allowing latecomer countries to break through the limitations of traditional communication models at low cost and form unique communication advantages.

Through transnational cooperation, late-developing countries can integrate more communication resources and expand the breadth and depth of their communication. For example, Argentina established a “Latin American News Network” with other Latin American countries, pushing the region to speak with a single voice in international public opinion and counter the single narrative of Western media through shared news content. In Africa, South Africa has partnered with Huawei on the “Smart South Africa” project to build modern information infrastructure and promote digital transformation and efficiency gains in public services. Governments of late-developing countries should invest more in technological research, development, and innovation, and encourage the growth of local enterprises and talent. At the same time, attention should be paid to cultural export and the construction of the media industry, enhancing the country’s voice in the international information space through globalized cooperation and decentralized communication models. Governments can fund digital cultural creation, support the growth of local social media platforms, and integrate more communication resources through international cooperation frameworks.

(2) Construction of barriers in information countermeasures

Unlike a full-scale conflict that may be triggered by military action, or the risks that economic sanctions may pose, weaponized dissemination is able to achieve strategic objectives without triggering full-scale war, and it is extremely attractive based on cost and strategic considerations. Because weaponized communication is characterized by low cost and high returns, an increasing number of State and non-State actors have chosen to manipulate information in order to reach strategic objectives. The spread of this means of dissemination makes countries face even more complex and variable threats in the face of attacks involving information from outside and inside. With the increasing intensity of information warfare, mere traditional military defense can no longer meet the needs of modern warfare. Instead, building a robust information defense system becomes a key strategy for the country to maintain political stability, safeguard social identity, and enhance international competitiveness. Therefore, how to effectively deal with external interference in information and manipulation of public opinion, as well as counter-information, has become an urgent issue for all countries to address. A complete cybersecurity infrastructure is key to maintaining national security against the manipulation or tampering of sensitive information from outside. Take, for example, the European Union’s push to strengthen cybersecurity in member states through its “Digital Single Market” strategy, which requires internet companies to be more aggressive in dealing with disinformation and external interference. The EU’s cybersecurity directives also provide for member states to establish emergency response mechanisms to protect critical information infrastructure from cyberattacks. 
In addition, the EU has established cooperation with social platform companies such as Facebook, Twitter, and Google to combat the spread of fake news by providing anti-disinformation tools and data analysis technologies. Artificial intelligence, big data, and automation are becoming important tools for information defense, used to monitor information propagation paths in real time, identify potential disinformation, and resist public opinion manipulation. In cybersecurity, big data analysis helps decision makers identify and warn of malicious attacks and optimize countermeasures. These technologies not only strengthen information defense at the domestic level but also enhance a country’s initiative and competitiveness in the international information space.
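The kind of real-time propagation monitoring mentioned above can be sketched in miniature. The toy detector below is a hypothetical illustration, not a description of any deployed EU or platform system: it flags a topic whose hourly posting volume suddenly spikes far above its recent moving average, a common heuristic signal of coordinated amplification.

```python
from collections import deque

def detect_bursts(hourly_counts, window=24, threshold=3.0):
    """Flag hours where posting volume exceeds `threshold` times the
    moving average of the previous `window` hours -- a crude signal of
    coordinated amplification of a topic."""
    history = deque(maxlen=window)
    bursts = []
    for hour, count in enumerate(hourly_counts):
        if len(history) == window:
            baseline = sum(history) / window
            if baseline > 0 and count > threshold * baseline:
                bursts.append(hour)
        history.append(count)
    return bursts

# A topic posted ~10 times per hour that suddenly jumps to 80 in one hour.
counts = [10] * 24 + [80] + [12] * 5
print(detect_bursts(counts))  # → [24]
```

Real monitoring pipelines add many more signals (account age, posting intervals, content similarity), but the moving-average baseline captures the basic idea of watching propagation dynamics rather than content alone.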

Counter-mechanisms are another important component of the information defense system. Under pressure from international public opinion in particular, real-time monitoring of external information flows and timely correction of disinformation are key to retaining the initiative in public opinion. Since the 2014 Crimean crisis, Ukraine has built a large-scale cyber defense system through cooperation with NATO and the United States. Ukraine’s national cybersecurity authorities have set up “information countermeasures teams” to respond to cyber threats, using social media and news release platforms to refute false Russian reports in real time, a tactic that has significantly boosted Ukraine’s reputation and credibility in international public opinion.
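One narrow ingredient of such rapid rebuttal is matching incoming posts against a database of claims that have already been debunked. The sketch below is purely illustrative: the mini-database is invented, and difflib string similarity stands in for the multilingual embeddings and human review a production fact-checking system would use.

```python
import difflib

# Hypothetical mini-database of already-debunked claims (illustrative only).
DEBUNKED = [
    "vaccines caused hundreds of deaths in europe",
    "enemy forces have captured the capital",
]

def match_debunked(post, db=DEBUNKED, cutoff=0.6):
    """Return the closest known-false claim a post resembles, or None.
    Real systems use semantic matching; difflib is a simple stand-in."""
    hits = difflib.get_close_matches(post.lower(), db, n=1, cutoff=cutoff)
    return hits[0] if hits else None

print(match_debunked("Vaccines caused hundreds of deaths in Europe!"))
print(match_debunked("sunny weather expected"))  # → None
```

A hit would route the post to a rebuttal team with the prepared correction attached, which is how real-time refutation scales beyond manual monitoring.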

(3) Agenda setting in public opinion guidance

In the global competition of informatization and digitalization, guiding public opinion involves not only the content of information dissemination but, more importantly, how to set the agenda and focus attention on topics of global concern. Agenda-setting theory holds that whoever controls the topics in circulation can steer the direction of public opinion. Agenda setting influences public attention to and evaluation of events by controlling the scope and focus of discussion, and the rise of social media gives information-disadvantaged countries a breakthrough: through multi-platform coordination they can compete for dominance in information dissemination. During the Russo-Ukrainian War, for example, Ukraine disseminated the realities of the war through social media, publishing not only the actual combat situation but also the emotional appeals of its people, using the tragic narrative of civilian suffering and urban destruction to inspire sympathy and attention from the international community. While resisting external information interference, a state also needs to proactively disseminate positive narratives and tell cultural stories that resonate with the international community. Such stories should match the emotional needs of international public opinion while showing the country’s uniqueness and strengthening its link with the international community. Take China’s “Belt and Road” cooperation as an example: in partner countries, China has invested in and built a large number of infrastructure projects. These projects have not only helped improve local economic conditions but also demonstrated China’s sense of responsibility in the globalization process, provided a window for cultural cooperation and exchange, showcased the rich history and culture of the Chinese nation to the world, and displayed the inclusiveness and responsibility of Chinese culture to the international community.

However, because countries of the Global South often face constraints in resources, technology, and access to international communication platforms, and find it difficult to compete directly with developed countries, they rely on more flexible and innovative means of communication to participate in setting the global agenda. Brazil, for example, has faced negative public opinion pressure from Western media over environmental protection and climate change, especially deforestation of the Amazon. In response, the Brazilian government has used social media to publish recent data and success stories about Amazon protection, actively shaping the country’s image in environmental protection. At the same time, Brazil has strengthened its voice on climate issues by engaging with other developing countries in global climate negotiations and promoting South-South cooperation. Major international events, humanitarian activities, and the production of cultural products are also effective ways of telling national stories. International sporting events such as the World Cup and the Olympic Games are not only platforms for athletic competition but also showcases of national image and cultural soft power. By hosting or actively participating in these global events, a country can display its strength, values, and cultural charm to the world and promote a positive public opinion agenda.

“War is nothing more than the continuation of politics by other means”[23]. This classic Clausewitzian assertion takes on a modern interpretation in the context of weaponized communication. Weaponized communication breaks through the physical boundaries of traditional warfare and has become a modern strategic instrument that fuses information warfare, cognitive warfare, and psychological warfare. It manipulates information flows and public perception in a non-violent form, so that state and non-state actors can achieve political goals without relying on direct military action, reflecting a highly strategic and targeted character. By manipulating information, emotions, and values, weaponized communication can achieve strategic goals while avoiding all-out war; in global competition and conflict, it has become an important means by which powerful countries politically suppress weaker ones.

The core of weaponized communication lies in weakening an adversary’s decision-making and operational capabilities through information manipulation, but its complexity makes the effects of communication difficult to predict. Although information-powerful countries suppress information-weak countries through technological advantages and communication channels, the effectiveness of such communication is uncertain, especially as the globalization of social media and digital platforms makes the boundaries and effects of information flow increasingly hard to control. This complexity gives weaker countries an opening to break through discursive hegemony and push the contest of information dissemination in the opposite direction. Weak countries can use these platforms to mount counter-offensives, challenge the information manipulation of powerful countries, and claim a place in global public opinion. This asymmetric game reflects a dynamic balance in international public opinion: communication is no longer one-way control but a more complex process of interaction and dialogue, one that gives the weak a real possibility of influencing opinion. The current international public opinion landscape is still dominated by the one-way suppression of information-weak countries by information-powerful ones, but this situation is not unbreakable. Information warfare is highly asymmetric, and information-weak countries can counter step by step through technological innovation, flexible strategies, and transnational cooperation. By exploiting “asymmetric advantages,” weak countries can not only influence global public opinion but also enhance their voice through joint action and information sharing.
Transnational cooperation and the establishment of regional alliances give weaker countries a powerful tool against the strong, enabling them to form a joint force in international public opinion and challenge the dominance of the information powers. Within the “war framework,” countries can flexibly adjust their strategies and proactively shape the pattern of information dissemination rather than passively accept manipulation by powerful countries.

The sociology of war emphasizes the role of social structure, cultural identity, and group behavior in warfare. Weaponized communication is not only a continuation of military or political action; it also profoundly affects social psychology, group emotions, and cultural identity. Powerful countries use information dissemination to shape other countries’ perceptions and attitudes in pursuit of their own strategic goals. From a sociological perspective, however, weaponized communication is not one-way suppression but the product of complex social interactions and cultural responses. In this process, information-weak countries are not entirely vulnerable; on the contrary, through cultural communication, social mobilization, and the dynamic contest of global public opinion, they can counter external manipulation with “soft power,” shape a new collective identity, and demonstrate the legitimacy of the “weapons of the weak.”

(Fund project: a research result of the National Social Science Fund major project for studying and interpreting the spirit of the Third Plenary Session of the 20th Central Committee of the Communist Party of China, “Research on Promoting the Integrated Management of News Publicity and Online Public Opinion” (Project No. 24ZDA084))

現代國語:

作者:郭小安 康如詩

發布時間:2025-05-06

【摘要】在國際輿論戰中,武器化傳播已滲透軍事、經濟、外交等領域,帶來“一切皆可武器化”的想像與實踐。武器化傳播通過技術、平台和政策操控公眾認知,體現了權力分配與文化博弈的複雜互動。在全球化和數字化的推動下,認知操控、社會分裂、情感極化、數字監控、信息殖民已成為影響國家穩定的新型手段,這不僅加劇了信息強國與弱國間的競爭,也為信息弱國提供了通過靈活策略和技術創新實現逆轉的機會。在全球非對稱傳播格局下,如何在技術創新與倫理責任、戰略目標與社會平衡間找到契合點和平衡點,將是影響未來國際輿論格局的關鍵要素。

【關鍵詞】輿論戰;武器化傳播;信息操縱;非對稱傳播;信息安全

如果說“宣傳是對現代世界的理性認可”[1],那麼武器化傳播則是對現代技術手段的理性應用。在輿論戰中,各參與主體通過不同傳播手段實現戰略目標,做到表面合理且隱蔽。與傳統軍事衝突不同,現代戰爭不僅涉及物理對抗,還涵蓋信息、經濟、心理及技術等多個領域的競爭。隨著技術進步和全球化的推動,戰爭形態發生深刻變化,傳統的物理對抗逐漸轉向多維度、多領域的綜合作戰。在這一過程中,武器化傳播作為一種現代戰爭形式,成為通過控制、引導和操縱輿論,影響敵對方或目標受眾的心理、情感與行為,進而實現政治、軍事或戰略目的的隱形暴力手段。《戰爭論》認為,戰爭是讓敵人無力抵抗,且屈從於我們意志的一種暴力行為。[2]在現代戰爭中,這一目標的實現不僅依賴於軍事力量的對抗,更需要信息、網絡與心理戰等非傳統領域的支持。第六代戰爭(Sixth Generation Warfare)預示戰爭形態的進一步轉變,強調人工智能、大數據、無人系統等新興技術的應用,以及信息、網絡、心理和認知領域的全面博弈。現代戰爭的“前線”已擴展到社交媒體、經濟制裁和網絡攻擊等層面,要求參與者具備更強的信息控制與輿論引導能力。

當前,武器化傳播已滲透到軍事、經濟、外交等領域,帶來“一切皆可武器化”的憂慮。在戰爭社會學中,傳播被視為權力的延伸工具,信息戰爭深刻滲透並伴隨傳統戰爭。武器化傳播正是在信息控制的框架下,通過塑造公眾認知與情感,鞏固或削弱國家、政權或非國家行為者的權力。這一過程不僅發生在戰時,也在非戰斗狀態下影響著國家內外的權力關係。在國際政治傳播中,信息操控已成為大國博弈的關鍵工具,各國通過傳播虛假信息、發動網絡攻擊等手段,試圖影響全球輿論和國際決策。輿論戰不僅是信息傳播的手段,更涉及國家間權力博弈與外交關係的調整,直接影響國際社會的治理結構與權力格局。基於此,本文將深入探討武器化傳播的概念流變,分析其背後的社會心態,闡述具體的技術手段及所帶來的風險,並從國家層面提出多維應對策略。

一、從傳播武器化到武器化傳播:概念流變及隱喻

武器在人類歷史上一直是戰爭的象徵和工具,戰爭則是人類社會中最極端、暴力的衝突形式。因此,“被武器化”是指將某些工具用於戰爭中的對抗、操控或破壞,強調這些工具的使用方式。 “武器化”(weaponize)譯為“使得使用某些東西攻擊個人或團體成為可能”。 1957年,“武器化”一詞作為軍事術語被提出,V-2彈道導彈團隊的領導者沃納·馮·布勞恩表示,他的主要工作是“將軍方的彈道導彈技術‘武器化’”[3]。

“武器化”最早出現在太空領域,時值美蘇軍備競賽時期,兩個大國力圖爭奪外太空主導權。 “太空武器化”是指將太空用於發展、部署或使用軍事武器系統的過程,包括衛星、反衛星武器和導彈防禦系統等,目的是進行戰略、戰術或防禦性行動。 1959年至1962年,美蘇提出了一系列倡議,禁止將外太空用於軍事目的,尤其是禁止在外層空間軌道部署大規模毀滅性武器。 2018年,當時的美國總統特朗普簽署了《空間政策指令-3》,啟動“太空軍”建設,將太空視為與陸地、空中、海洋同等的重要作戰領域。 2019年,《中華人民共和國和俄羅斯聯邦關於加強當代全球戰略穩定的聯合聲明》中倡議“禁止在外空放置任何類型武器”[4]。

除太空領域的武器化外,軍事、經濟、外交等領域也顯現武器化趨勢。 “軍事武器化”是將資源(如無人機、核武器等)用於軍事目的、部署武器系統或發展軍事能力。 2022年俄烏戰爭期間,英國皇家聯合軍種研究所的報告顯示,烏克蘭每月因俄羅斯干擾站的影響,損失約10000架無人機。 [5]“武器化”也常出現在“金融戰爭”“外交戰場”等表述中。在經濟領域,武器化通常指國家或組織對全球金融系統中的共享資源或機制的利用;外交武器化則表現為國家通過經濟制裁、外交孤立、輿論操控等手段,追求自身利益並對他國施加壓力。隨著時間的推移,“武器化”概念逐漸擴展到政治、社會、文化等領域,尤其在信息領域,自2016年美國總統大選以來,輿論操縱已成為政治鬥爭的普遍工具。美國前中央情報局局長戴維·彼得雷烏斯曾在國家戰略研究所會議上表示,“萬物武器化”(the weaponization of everything)的時代已經來臨。 [6]

作為一種隱喻,“武器化”不僅指實際物理工具的使用,還象徵著對抗性和攻擊性行為的轉化,強調“武器”這一概念如何滲透至日常生活、文化生產和政治策略中,展現社會行動者如何利用各種工具達成戰略目的。時下,許多本應保持中立的領域,如媒體、法律和政府機構,常被描述為“武器化”,用以批判它們的過度政治化和被不正當利用,突出其非法性及對社會的負面影響。通過這一隱喻,人們無意識地將當前的政治環境與理想化的、看似更溫和的過去進行對比,使人們認為過去的政治氛圍更加理性和文明,而現今則顯得過於極端和對立。[7]因此,“武器化”的實質是政治中介化的過程,是政治力量通過各種手段和渠道,影響或控制本應保持中立的領域,使其成為政治目的和政治鬥爭的工具。

在信息領域,傳播武器化是長期存在的一種戰略手段。第一、二次世界大戰期間,各國就廣泛使用了宣傳和輿論戰,傳播手段被作為一種心理戰術使用。武器化傳播是傳播武器化在現代信息社會中的體現,其利用算法和大數據分析精準地控制信息的傳播速度和範圍,進而操控輿論和情感,反映了技術、平台和策略的結合,使得政治力量可以更加精準和高效地操控公眾認知與輿論環境。信息作為輿論的本體,被“武器化”並用於影響社會認知和群體行為,“戰爭”的概念也隨之變化,不再只是傳統的軍事對抗,還包括通過信息傳播和輿論操控實現的心理戰和認知戰。這種轉變促生了一系列新術語,例如無限制戰爭(unrestricted warfare)、新一代戰爭(new generation warfare)、非對稱戰爭(asymmetric warfare)和非常規戰爭(irregular warfare)等。這些術語幾乎都藉用“戰爭”(warfare)強調信息領域中的多樣化衝突,信息成為被“武器化”的核心內容。

儘管有部分觀點認為“戰爭”一詞不適用於未正式宣布敵對行動的情況[8],但武器化傳播通過弱化戰爭的傳統政治屬性,將各領域的公開或隱蔽的力量和形式籠統地視作傳播行為,從而擴展了“戰爭”這一概念的外延。值得注意的是,在英文術語中“武器化”有兩種表述方式:一種是“weaponized noun(名詞)”,即表示某物已經“被武器化”,具備武器功能或用途;另一種是“weaponization of noun”,指將某物轉化為武器或具有武器性質的過程。在學術領域,儘管weaponized communication和weaponization of communication尚未嚴格區分,但中文翻譯有所區別。 “武器化傳播”更側重於傳播手段或信息本身“被武器化”,以實現某種戰略目標;“傳播武器化”則強調傳播過程本身作為武器的轉化過程。在討論具體技術手段時,多數學術論文采用weaponed或weaponizing作為前綴,以修飾具體的傳播手段。

本文重點討論的是國際輿論戰中的具體傳播策略,著重描述已經發生的武器化現象,故統一使用“武器化傳播”,其是一種利用傳播手段、技術工具和信息平台,通過精確操控信息流動、公眾認知與情感反應,達到特定軍事、政治或社會目的的策略性傳播方式。武器化傳播也並非單純的戰爭或戰時狀態,而是一種持續的傳播現象,它反映了各主體間的互動與博弈,是信息共享和意義空間的流動。

二、武器化傳播的應用場景及實施策略

如果說20世紀90年代末,信息領域的武器化仍是一個“死話題”,各國主要追逐導彈、無人機等實體武器的升級競賽,那麼步入21世紀,網絡戰爭則真正衝進了公眾視野,並深刻嵌入人們的日常生活,經由社交媒體和智能設備,公眾不可避免地捲入輿論戰爭,不自覺地成為參與者或傳播節點。隨著技術的普及,武器化手段逐漸從國家主導的戰爭工具擴展到社會化和政治化領域,對個人和社會的控制從顯性的國家機器轉向更隱蔽的觀念操控。棱鏡計劃(PRISM)的曝光引發了全球對隱私洩露的強烈擔憂,凸顯了國家利用先進技術進行監視和控制的潛力,這被視為一種新型的武器化。自2016年特朗普當選美國總統以來,社交機器人等信息武器的大規模應用,成為全球政治博弈中的常見現象。信息作戰——包括電子戰、計算機網絡作戰、心理戰和軍事欺騙——被廣泛用於操控信息流動,影響輿論格局。這些手段不僅在軍事戰爭和政治選舉中發揮作用,還逐漸滲透到文化衝突、社會運動及跨國博弈之中,傳統的信息作戰邏輯得以延續。如今,武器化傳播作為一種社會政治工具,深刻影響著輿論生態、國際關係以及個人的日常生活。

(一)軍事領域的信息操縱戰

信息流能夠直接影響軍事衝突的走向,塑造公眾和軍隊的認知與決策,進而影響士氣、戰略判斷和社會穩定。在現代戰爭中,信息不再是單純的輔助工具,信息領域已成為核心戰場。通過操控信息流向,敵方的形勢評估可能被誤導,戰鬥意志被削弱,民眾的信任與支持被動搖,進而影響戰爭的決策過程與持續性。

海灣戰爭(Gulf War)被視為現代信息戰的開端。在這場戰爭中,美國通過高科技手段——包括電子戰、空中打擊和信息操作——實施了對伊拉克的系統性打擊。美軍利用衛星和AWACS預警機實時監控戰場態勢,通過空投傳單和廣播電台向伊拉克士兵傳遞美軍優勢及投降後的優待政策,從心理層面誘使伊軍投降。這場戰爭標誌著信息控制在軍事衝突中的關鍵地位,展示了信息戰在現代戰爭中的潛力。進入21世紀,網絡戰成為信息戰的重要組成部分。網絡戰不僅涉及信息的傳播和操控,還包括通過攻擊關鍵基礎設施實現對敵方社會功能的控制。 2007年愛沙尼亞遭遇大規模DDoS(Distributed Denial of Service Attack)攻擊,展示了信息操縱與網絡攻擊融合的趨勢。 2017年在WannaCry勒索軟件事件中,攻擊者利用Windows系統漏洞(EternalBlue)加密全球150個國家約20萬台計算機文件,要求支付贖金,嚴重影響英國國家健康服務體系(NHS),導致急診服務中斷和醫院系統癱瘓,進一步揭示了網絡戰對關鍵基礎設施的威脅。此外,在長期衝突中,基礎設施控制因能夠直接決定信息傳播的速度、範圍和方向,被廣泛用於削弱對手的戰略能力,爭奪公共信息空間。以色列通過限制無線電頻譜使用、控制互聯網帶寬和破壞通信設施,有效削弱了巴勒斯坦的通信能力。同時,以色列還通過經濟制裁和法律框架限制巴勒斯坦電信市場的發展,壓制巴勒斯坦在信息流動中的競爭力,鞏固自身在衝突中的戰略優勢[9],以維持信息的不平等流動。

社交媒體為信息操縱提供了即時、廣泛的信息傳播渠道,使其能夠跨越國界,影響全球公眾情緒和政治局勢,也使戰爭焦點從單純的物理破壞轉向輿論操控。俄烏戰爭期間,深度偽造技術作為視覺武器,對公眾認知和戰爭輿論產生了顯著干擾。 2022年3月15日,烏克蘭總統澤連斯基的偽造視頻在Twitter上傳播,視頻中他“呼籲”烏克蘭士兵放下武器,引發了短時間內的輿論混亂。同樣,俄羅斯總統普京的偽造視頻也被用以混淆視聽。儘管這些視頻被平台迅速標註“Stay informed”(等待了解情況)的說明,但其在短時間內仍然對公眾情緒和認知造成明顯干擾。這些事件凸顯了社交媒體在現代信息戰中的關鍵作用,國家和非國家行為體可以通過虛假信息、情感操控等手段對軍事衝突施加干擾。

信息操縱戰的複雜性還體現在其雙重特性上——既是攻擊工具,也是防禦的手段。在軍事領域,各國通過防禦和反擊網絡攻擊來確保國家安全、保護關鍵基礎設施、維護軍事機密,並在某些情況下影響對手的戰鬥力與決策。 2015年和2017年,俄羅斯黑客發起了針對烏克蘭的大規模網絡攻擊(如BlackEnergy和NotPetya),烏克蘭通過迅速升級網絡防禦系統,成功抵禦部分攻擊並採取反制措施,避免了更大規模的基礎設施癱瘓。此外,北約戰略傳播卓越中心和英國第77旅等單位專注研究和平時期的輿論塑造[10],利用戰略傳播、心理戰和社交媒體監控等手段,擴大信息領域的戰略控制,並強化了防禦與輿論塑造能力,進一步提高了信息戰的戰略高度。

如今,信息操縱戰已經成為現代軍事衝突中的關鍵環節。通過信息技術與心理操控的高度結合,它不僅改變了傳統戰爭的規則,也深刻影響著公眾認知和全球安全格局。國家、跨國公司或其他行為體通過掌控關鍵基礎設施和社交媒體平台,限制信息流動、操控傳播路徑,從而在全球信息生態中獲得戰略優勢。

(二)政治選舉的輿論干預戰

政治選舉是民主政治中最直接的權力競爭場域,信息傳播在此過程中對選民決策具有重要影響。通過計算宣傳等手段,外部勢力或政治團體能夠操縱選民情緒、誤導公眾認知,從而左右選舉結果、破壞政治穩定或削弱民主進程,選舉因此成為武器化傳播最具效果的應用場景。

近年來,全球政治選舉呈現極化趨勢,持不同政治立場的群體之間存在巨大的意識形態差異。極化導致公眾選擇性接受與自身觀點一致的信息,同時排斥其他信息,這種“回音室效應”加劇了公眾對立場的片面認知,為輿論干預提供了更大的空間。而信息傳播技術,尤其是計算宣傳的興起,使外部勢力能夠更加精準地操控輿論和影響選民決策。計算宣傳(Computational Propaganda)指利用計算技術、算法和自動化系統操控信息流動,以傳播政治信息、干預選舉結果和影響輿論,其核心特徵在於算法驅動的精準性和自動化傳播的規模化,通過突破傳統人工傳播的限制,顯著增強了輿論操控的效果。 2016年美國總統選舉中,特朗普團隊通過劍橋分析公司分析Facebook用戶數據,為選民定向推送定制化的政治廣告,精準影響了選民的投票意向[11]。這一事件被視為計算宣傳干預選舉的典型案例,也為其他政客提供了操作模板,推動了計算宣傳在全球範圍內的廣泛應用。 2017年法國總統選舉中,候選人埃馬紐埃爾·馬克龍(Emmanuel Macron)團隊遭遇黑客攻擊,內部郵件被竊取並公開,內容稱馬克龍在海外擁有秘密賬戶並涉及逃稅,企圖抹黑其形象。 2018年巴西總統選舉期間,候選人雅伊爾·博索納羅(Jair Bolsonaro)團隊利用WhatsApp群組傳播煽動性政治內容,定向推送大量圖像、視頻和煽動性消息以影響選民情緒。據統計,自2017年至2019年,全球採用計算宣傳的國家由28個增加至70個,2020年這一數量上升至81個。這表明,計算宣傳正通過技術手段和傳播策略,重新定義全球選舉中的輿論規則。

計算宣傳也是國家行為者在輿論干預戰中的重要工具。 2011年,美國國防高級研究計劃局(DARPA)在中東地區開展“歐內斯特之聲”行動,通過建立和管理多個虛假身份(sockpuppets),扭曲阿拉伯語社交媒體的對話。俄羅斯也頻繁利用計算宣傳實施干預,在加拿大操作約20萬個社交媒體賬戶,借助極右翼和極左翼運動散佈親俄言論,製造虛假的社會熱點,試圖破壞加拿大對烏克蘭的支持[12]。作為計算宣傳的重要組成部分,社交機器人通過自動化和規模化手段製造輿論熱度,藉由特定標籤在社交平台上增加信息的曝光率,操控議題的優先級。 2016年美國大選期間,俄羅斯利用社交機器人發布支持普京和攻擊反對派的內容,通過信息過載(information overload)掩蓋反對派聲音,強化親普京的輿論氛圍。 [13]2017年海灣危機期間,沙特阿拉伯和埃及通過Twitter機器人製造反卡塔爾標籤#AlJazeeraInsultsKingSalman的熱度,使其成為熱門話題,虛構了反卡塔爾情緒的高峰,進而影響了全球範圍內對卡塔爾的輿論態度。 [14]深度偽造技術則進一步提升了計算宣傳的精準性與隱蔽性。 2024年,美國總統喬·拜登的偽造視頻在X(原Twitter)上迅速傳播,視頻顯示其在橢圓形辦公室使用攻擊性語言,引發輿論爭議並影響選民情緒。據網絡安全公司McAfee調查,63%的受訪者在兩個月內觀看過政治深度偽造視頻,近半數表示這些內容影響了他們的投票決定。 [15]

在全球範圍內,計算宣傳已滲透各國輿論戰中,影響著社會穩定與國家安全。以色列國防軍通過數字武器對巴勒斯坦展開輿論戰,土耳其培養了“愛國巨魔軍隊”操控國內外輿論,墨西哥政府利用殭屍網絡影響輿論。作為現代輿論干預戰的重要手段,計算宣傳正在改變全球政治傳播的格局。隨著人工智能、量子計算等技術的發展,計算宣傳還可能通過更隱蔽和高效的方式乾預選舉流程,甚至直接威脅民主制度的核心運行邏輯。

(三)文化領域的符號認同戰

武器化傳播通過操控信息、符號和價值觀,試圖影響公眾的思想、情感和行為,進而塑造或改變社會的集體認知與文化認同。這種傳播方式不僅在於信息的傳遞,更通過特定的敘事框架、文化符號和情感共鳴,推動某種特定的意識形態或政治理念的傳播與認同。通過操縱文化符號、社會情感和集體記憶,武器化傳播在文化領域干擾社會結構與文化認同,成為符號認同戰的核心手段。

模因(Meme)作為一種集視覺元素和簡潔文字於一體的文化符號,以幽默、諷刺或挑釁的方式激發觀眾的情感反應,影響他們的政治態度和行為。佩佩模因(Pepe the Frog)起初是一個無害的漫畫角色,被極右翼群體重新利用並武器化,用以傳播仇恨言論,逐漸演變為種族主義和反移民的象徵。模因將復雜的政治情緒轉化為便於傳播的視覺符號,迅速激起公眾對政策的不信任和憤怒,被視為“武器化的偶像破壞主義”(Iconoclastic Weaponization)。這一過程通過操控文化符號,以達到政治或社會鬥爭的目的[16],加劇了公眾對社會和政治的分裂。例如,在英國脫歐期間,帶有“Take Back Control”(奪回控制權)字樣的模因迅速傳播,強化了民族主義情緒。

除了文化符號的製造外,符號的篩選和屏蔽同樣能夠塑造或加深某種文化認同或政治立場。審查制度自古以來就是權力控制信息的重要手段,早在古希臘和古羅馬時期,政府就對公共演講和文學作品進行審查,以維持社會秩序和權力穩定。進入數字時代,互聯網和社交媒體的興起推動了審查制度的現代化,平台審查逐漸取代傳統的審查方式,成為當代信息控制和輿論引導的核心工具。算法審查通過人工智能檢測敏感話題、關鍵詞和用戶行為數據,自動刪除或屏蔽被視為“違規”的內容,社交媒體的審核團隊會對用戶生成的內容進行人工篩選,確保其符合平台政策和法律法規。平台審查的作用不僅是限制某些內容的傳播,更是通過推送、刪除和屏蔽等方式引導輿論,塑造公眾認知框架。儘管主流社交平台通過嚴格的內容審核機制控制信息傳播,但一些邊緣平台,如Gab、Gettr、Bitchute等因缺乏有效審查,成為極端言論和惡意信息的溫床。這些平台未對內容發布做出足夠限制,極端觀點和虛假信息得以肆意擴散,例如,Gab因極端主義內容屢遭批評,被指助長暴力和仇恨。在迴聲室中,用戶只能接觸與自身觀點一致的信息,這種信息環境更強化了極端思想,導致社會群體間的對立加劇。[17]

語言作為信息傳播的載體和工具,能夠通過情感操控、符號政治和社會動員等方式,深刻影響群體行為和文化認同。語言武器化聚焦於語言形式和文化語境如何影響信息的接收方式,強調語言如何被用來操控、引導或改變人們的認知與行為。這不僅涉及特定詞彙和修辭手法的使用,更包括通過語言表述建構特定的社會意義和文化框架。作為符號認同戰的另一重要工具,語言塑造了“敵我對立”的敘事框架。大翻譯運動(Great Translation Movement)通過選擇性翻譯中國網民的民族主義言論,將其傳播到國際社交媒體平台,引發了對中國的負面認知。這種語言操控通過情緒化表達放大了爭議性內容,加深了國際社會的文化偏見。

語言武器化的深層邏輯在於情緒化和煽動性的語言形式。西方國家常以“人權”與“民主”等正義化標籤為乾預行為辯護,合法化政治或軍事行動。白人至上主義者使用“另類右翼”等模糊標籤重塑意識形態,將傳統的帶有強烈負面含義的“白人至上主義”轉化為一個較為中立的概念,降低了該詞彙的社會抵抗力,用寬泛的“傘式”身份擴大其支持者的基礎。通過對世俗話語的滲透,仇恨政治和極端言論被正當化,逐漸形成一種政治常態。當公眾將這種政治日常化後,語言實現了真正的武器化。 [18]在尼日利亞,煽動仇恨的內容通過種族、宗教和地區話題擴散,深刻惡化了社會關係。 [19]語言的模糊性和合理否認策略也成為傳播者規避責任的有力工具,在被簡化的敘事中傳播複雜的社會和政治議題。特朗普的美國優先(America First)政策通過否定性標籤和情緒化話語,以反對全球化、質疑氣候變化科學、抨擊傳統盟友等方式,故意提出與主流意見相對立的觀點,激發公眾對全球化的不信任,重塑國家利益優先的文化認同。 [20]

三、武器化傳播的風險與挑戰:正當性與破壞性

儘管武器化傳播給國際輿論格局帶來了巨大風險,但特定情形下,其可能會被某些國家或團體通過法律、政治或道德框架賦予一定的正當性。如“9·11”事件後,美國通過《愛國法案》擴大了情報部門的監控權限,以“反恐”為名實施廣泛的信息控制,這種“正當性”常被批評為破壞公民自由,侵蝕了民主社會的核心價值。

在國際政治博弈中,武器化傳播更常被視為“灰色區域”(Gray Zone)的手段。國家間的對抗不再局限於經濟制裁或外交壓力,而是通過信息操控、社交媒體干預等非傳統方式展開。部分國家以“保護國家利益”為藉口傳播虛假信息,辯稱其行為是合規的,儘管這些行為可能在國際法上存在爭議,但往往被合理化為“反制外部威脅”的必要手段。在一些信息監管缺乏嚴格法律框架的國家,選舉的干預行為往往被容忍,甚至被視為一種“正當”的政治活動。在文化層面,某些國家通過傳播特定的文化符號和意識形態,試圖在全球範圍內塑造自身的文化影響力。西方國家常以“文化共享”和“文明傳播”為名,推動其價值觀的傳播,而在實際操作中,卻通過操控文化符號和敘事框架,削弱其他文化的認同感,導致全球文化生態的不平衡。法律框架也在一定程度上為武器化傳播的正當性提供了支持。一些國家以“反恐”和“反對極端主義”為名,通過信息審查、內容過濾等手段限制所謂“有害信息”的傳播。然而,這種正當性往往突破了道德邊界,導致信息封鎖和言論壓制。以“國家安全”為理由的信息治理,雖然在一定程度上獲得了內部認可,卻為武器化傳播的氾濫提供了空間。

相較於正當性,武器化傳播的破壞性尤為顯著。目前,武器化傳播已成為權力結構操控輿論的重要工具,其不僅扭曲了信息內容,還通過隱私侵犯、情感動員和文化滲透等方式,深刻影響了公眾認知、社會情緒以及國際關係。

(一)信息失真與認知操控

信息失真指信息在傳播過程中被故意或無意扭曲,導致公眾接收到的內容與原始信息存在顯著差異。在社交媒體上,虛假信息和誤導性內容的傳播日益猖獗,人工智能模型(如GPT)的生成內容,可能因訓練數據的偏見而加劇這一問題。性別、種族或社會偏見可能被反映在自動生成的文本中,放大信息失真的風險。社交媒體的快速傳播特性也使傳統的事實核查機制難以跟上虛假信息的擴散速度。虛假信息在短時間內往往佔據輿論主導地位,跨平台傳播和匿名性使得澄清與糾正變得更加複雜。傳播的不對稱性削弱了傳統新聞機構的權威性,公眾更傾向於相信即時更新的社交平台信息,而非傳統新聞機構的深入報導,這進一步削弱了新聞機構在抵制虛假信息中的作用。

除了信息本身的失真,武器化傳播還深刻利用了認知失調的心理機制。認知失調指個體接觸到與其已有信念或態度相衝突的信息時產生的心理不適感。傳播者通過製造認知失調,動搖目標受眾的既有態度,甚至誘導其接受新的意識形態。在政治選舉中,定向傳播負面信息常迫使選民重新審視政治立場,甚至改變投票傾向。武器化傳播通過選擇性暴露進一步加劇了“信息繭房”的形成,讓受眾傾向於接觸與自身信念一致的信息,忽視或排斥相反觀點。這不僅強化了個體的認知偏見,也讓虛假信息在群體內部快速擴散,難以被外界的事實和理性聲音打破,最終形成高度同質化的輿論生態。

(二)隱私洩露與數字監控

近年來,深度偽造技術的濫用加劇了隱私侵權問題。 2019年,“ZAO”換臉軟件因默認用戶同意肖像權而被下架,揭示了生物特徵數據的過度採集風險。用戶上傳的照片經深度學習處理後,既可能生成精確的換臉視頻,也可能成為隱私洩露的源頭。更嚴重的是,深度偽造等技術被濫用於性別暴力,多名歐美女演員的面孔被非法植入虛假性視頻並廣泛傳播,儘管平台在部分情況下會刪除這些內容,但開源程序的普及讓惡意用戶能夠輕鬆複製和分享偽造內容。此外,用戶在使用社交媒體時,往往默認授權平台訪問其設備的照片、相機、麥克風等應用權限。通過這些權限,平台不僅收集了大量個人數據,還能夠通過算法分析用戶的行為特徵、興趣偏好和社交關係,進而精準投放廣告、內容推薦甚至實施信息操控。這種大規模數據採集推動了對隱私保護的全球討論。在歐洲,《通用數據保護條例》(General Data Protection Regulation)試圖通過嚴格的數據收集和使用規定,加強個人隱私權保障。然而,由於“隱性同意”或複雜的用戶協議,平台常常繞過相關規定,使數據處理過程缺乏透明度,導致普通用戶難以了解數據的實際用途。美國《通信規範法》第230條規定,網絡平台無需為用戶生成的內容承擔法律責任,這一規定推動了平台內容審核的發展,但也使其在應對隱私侵權時缺乏動力。平台出於商業利益的考慮,往往滯後處理虛假信息和隱私問題,導致審核責任被持續擱置。

在數字監控方面,社交平台與政府的合作使用戶數據成為“監控資本主義”的核心資源。美國國家安全局(NSA)通過電話記錄、互聯網通信和社交媒體數據,實施大規模監控,並與Google、Facebook等大型企業合作,獲取用戶的在線行為數據,用於全球範圍內的情報收集和行為分析。跨國監控技術的濫用更是將隱私侵犯推向國際層面。以色列網絡安全公司NSO開發的Pegasus間諜軟件,通過“零點擊攻擊”入侵目標設備,可實時竊取私人信息和通信記錄。 2018年,沙特記者賈馬爾·卡舒吉(Jamal Khashoggi)被謀殺一案中,沙特政府通過Pegasus監聽其通信,揭示了這種技術對個體隱私和國際政治的深遠威脅。

(三)情感極化與社會分裂

情感在影響個體認知與決策中起著關鍵作用。武器化傳播通過煽動恐懼、憤怒、同情等情緒,影響理性判斷,推動公眾在情緒驅動下做出非理性反應。戰爭、暴力和民族主義常成為情感動員的主要內容,傳播者通過精心設計的議題,將愛國主義、宗教信仰等元素植入信息傳播,迅速引發公眾情感共鳴。數字技術的廣泛應用,特別是人工智能和社交媒體平台的結合,進一步放大了情感極化的風險。虛假信息與極端言論在平台上的快速傳播,不僅來自普通用戶的分享行為,更受到算法的驅動。平台傾向優先推送情緒化和互動性高的內容,這些內容常包含煽動性語言和極端觀點,從而加劇了仇恨言論和偏激觀點的傳播。

社交媒體標籤和算法推薦在情感極化中扮演著關鍵角色。在查理周刊事件後,#StopIslam標籤成為仇恨言論的傳播工具,用戶借助該標籤發布仇視和暴力傾向的信息。在美國2020年總統選舉期間,社交平台上的極端政治言論和錯誤信息也在激烈的黨派鬥爭中被放大。通過精確的情感操控,武器化傳播不僅撕裂了公共對話,還極大影響了社會的民主進程。另一種特殊的極端主義動員策略是“武器化自閉症”(Weaponized Autism),即極右翼團體利用自閉症個體的技術專長,實施情感操控。這些團體招募技術能力較強但有社交障礙的個體,通過賦予虛假的歸屬感,將其轉化為信息戰的執行者。這些個體在極端組織的指引下,被用於傳播仇恨言論、執行網絡攻擊和推動極端主義。這種現象不僅揭示了情感操控的深層機制,也表明技術如何被極端團體利用來服務於更大的政治和社會議程。[21]

(四)信息殖民與文化滲透

“武器化相互依賴”理論(Weaponized Interdependence Theory)揭示了國家如何利用政治、經濟和信息網絡中的關鍵節點,對其他國家施加壓力。 [22]特別是在信息領域,發達國家通過控制信息流實施“信息殖民”,進一步鞏固其文化和政治優勢。數字平台成為這一殖民過程的載體,全球南方國家在信息傳播中高度依賴西方主導的技術平台和社交網絡,在撒哈拉以南非洲地區,Facebook已成為“互聯網”的代名詞。這種依賴不僅為西方企業帶來了巨大的廣告收入,還通過算法推薦對非洲本土文化和價值觀,尤其是在性別、家庭和宗教信仰等方面,產生了深遠影響,使文化滲透成為常態。

數字不平等是信息殖民的另一表現。發達國家在數字技術和信息資源上的主導地位,使南方國家在經濟、教育和文化領域日益邊緣化。巴勒斯坦因基礎設施不足和技術封鎖,難以有效融入全球數字經濟,既限制了本地經濟發展,又進一步削弱了其在全球信息傳播中的話語權。全球主要經濟體和信息強國通過技術封鎖和經濟制裁,限制他國獲取關鍵技術與創新資源,這不僅阻礙了目標國的科技發展,也加劇了全球技術與創新生態的斷裂。自2018年退出《伊朗核協議》以來,美國對伊朗的經濟制裁導致其在半導體和5G領域發展受阻,技術與創新的不對稱拉大了全球技術生態的差距,使許多國家在信息競爭中處於劣勢。

四、反思與討論:非對稱傳播格局中的話語權爭奪

在國際非對稱傳播(Asymmetric Communication)競爭格局下,強勢方常常通過主流媒體和國際新聞機構等渠道佔據輿論的主導地位,而弱勢方則需要藉助創新傳播技術和手段來彌補劣勢,爭奪話語權。這一傳播格局的核心在於信息地緣政治(Information Geopolitics),即國家之間的權力較量不僅僅取決於地理位置、軍事力量或經濟資源,更取決於對信息、數據和技術的控制。大國間的博弈已不再僅限於物理空間的控制,而擴展至輿論空間的爭奪。這些“信息景觀”涉及全球傳播生態中的話語權、信息流通和媒體影響力等,在這一過程中,國家通過不斷製造景觀,以影響國際輿論、塑造全球認知框架,進而實現其戰略目標。非對稱傳播的策略不僅關乎信息內容的傳遞,更重要的是如何借助各種傳播技術、平台和手段彌補資源與能力上的差距,信息傳播的核心不再局限於內容本身,而圍繞著話語權的爭奪展開。隨著信息戰和認知戰的興起,誰掌握了信息,誰就能在全球競爭中占得先機。

(一)後發優勢下的技術趕超

傳統的大國或強勢傳播者掌控著全球輿論的主導權,相比之下,弱勢國家往往缺乏與這些大國抗衡的傳播渠道。後發優勢理論主張後發國家能夠通過跳躍式發展,繞過傳統的技術路徑,引進現有的先進技術和知識,從而迅速崛起並規避早期技術創新中的低效和過時環節。在武器化傳播的背景下,這一理論為信息弱國提供了通過新興科技突破大國傳播壁壘的路徑,有助於其在技術層面上實現趕超。傳統媒體往往受到資源、影響力和審查機制的限制,信息傳播速度慢、覆蓋面有限,且容易受到特定國家或集團的操控。數字媒體的崛起使信息傳播的格局發生了根本性變化,弱勢國家能夠借助全球化的互聯網平台,直接面向國際受眾,而不必依賴傳統的新聞機構和主流媒體。通過新興技術,弱勢國家不僅能更精準地傳遞信息,還能通過定向傳播和情感引導,迅速擴大其在國際輿論中的影響力。後發國家可以利用先進技術(如大數據、人工智能、5G網絡等)實現精準的信息傳播,打造高效的傳播渠道。以大數據分析為例,後發國家可以深入了解受眾需求和輿情趨勢,快速識別全球輿論脈搏,實施定向傳播,快速擴大國際影響力。人工智能技術不僅能夠預測輿論發展方向,還能實時優化傳播策略。5G網絡的普及大大提升了信息傳播的速度與覆蓋範圍,使後發國家能夠以低成本、高效率的方式突破傳統傳播模式的局限,形成獨特的傳播優勢。

通過跨國合作,後發國家可以整合更多的傳播資源,擴大傳播的廣度與深度。例如,阿根廷與拉美其他國家共同建立了“拉美新聞網絡”,通過新聞內容共享,推動拉美國家在國際輿論中發出統一的聲音,反擊西方媒體的單一敘事。在非洲,南非與華為合作推動“智慧南非”項目,建設現代化信息基礎設施,促進數字化轉型和公共服務效率的提升。後發國家政府應加大對技術研發和創新的投入,鼓勵本土企業和人才的發展。同時,還應注重文化輸出和媒體產業建設,通過全球化合作和去中心化傳播模式提升國家在國際信息空間中的話語權。政府可以資助數字文化創作,支持本地社交媒體平台的成長,並通過國際合作框架整合更多傳播資源。

(二)信息反制中的壁壘構建

與軍事行動可能引發的全面衝突,或經濟制裁可能帶來的風險不同,武器化傳播能夠在不觸發全面戰爭的情況下實現戰略目標,基於成本和戰略考量,其具有極大的吸引力。由於武器化傳播具備低成本、高回報的特點,越來越多的國家和非國家行為體選擇通過操控信息來達到戰略目標。這種傳播手段的普及,使得國家在面對來自外部和內部的信息攻擊時,面臨更加複雜和多變的威脅。隨著信息戰爭的日益激烈,單純的傳統軍事防禦已經無法滿足現代戰爭的需求。相反,構建強有力的信息防禦體系,成為國家保持政治穩定、維護社會認同和提升國際競爭力的關鍵策略。因此,如何有效應對外部信息干擾和輿論操控,並進行信息反制,已成為各國迫切需要解決的問題。完善的網絡安全基礎設施是維護國家安全的關鍵,用以防范敏感信息不被外部操控或篡改。以歐盟為例,歐盟通過“數字單一市場”戰略推動成員國加強網絡安全建設,要求互聯網公司更積極地應對虛假信息和外部干預。歐盟的網絡安全指令還規定各成員國建立應急響應機制,保護重要信息基礎設施免受網絡攻擊。此外,歐盟還與社交平台公司,如Facebook、Twitter和Google等建立合作,通過提供反虛假信息工具和數據分析技術來打擊假新聞傳播。人工智能、大數據和自動化技術正在成為信息防禦的重要工具,被用以實時監控信息傳播路徑,識別潛在的虛假信息和抵禦輿論操控。在網絡安全領域,大數據分析幫助決策者識別和預警惡意攻擊,並優化反制策略。這些技術的應用不僅能夠在國內層面增強信息防禦能力,還能提高國家在國際信息空間中的主動性和競爭力。

反制機制是信息防禦體系的另一重要組成部分,尤其是在國際輿論壓力下,實時監控外部信息傳播並及時糾正虛假信息成為維護輿論主動權的關鍵。烏克蘭自2014年克里米亞危機以來,通過與北約和美國合作,建立了頗具規模的網絡防禦體系。烏克蘭的國家網絡安全局為應對網絡威脅設立了“信息反制小組”,利用社交媒體和新聞發布平台實時駁斥俄羅斯的虛假報導,這一策略顯著提升了烏克蘭在國際輿論中的聲譽和信任度。

(三)輿論引導中的議程設置

在信息化和數字化的全球競爭格局中,輿論引導不僅涉及信息傳播內容,更關鍵的是如何設置議程並聚焦全球關注的熱點話題。議程設置理論表明,誰能掌控信息流通的議題,誰就能引導輿論的方向。議程設置通過控制話題的討論範圍和焦點,影響公眾對事件的關注與評價,社交媒體的興起為信息弱勢國提供了突破口,使其可以通過多平台聯動來爭奪信息傳播的主導權。以烏克蘭為例,其在俄烏戰爭中通過社交媒體傳播戰爭實況,不僅發布戰鬥實況,還融入民眾的情感訴求,借助平民遭遇和城市破壞的悲情敘事,激發國際社會的同情與關注。在抵禦外部信息干擾的同時,國家還需要主動傳播正面敘事,講述能夠引發國際社會共鳴的文化故事。故事應該符合國際輿論的情感需求,同時展現國家的獨特性,強化與國際社會的聯繫。以我國的“一帶一路”共建為例,在“一帶一路”共建國家,我國投資建設了大量基礎設施項目,這些項目不僅幫助改善了當地的經濟基礎條件,也展示了中國在全球化進程中的責任擔當,更為文化合作和交流活動提供了窗口,向世界展示了中華民族豐富的歷史文化,為國際社會展現了中華文化的包容性和責任感。

但由於全球南方國家往往面臨資源、技術與國際傳播平台的限制,難以直接與發達國家競爭,因此它們依賴更加靈活、創新的傳播手段來參與全球議程的設置。例如,巴西在應對環保和氣候變化議題上,尤其是亞馬遜森林的砍伐問題,面臨來自西方媒體的負面輿論壓力。為此,巴西政府利用社交媒體發布關於亞馬遜保護的最新數據和成功案例,積極塑造國家在環境保護領域的形象。同時,巴西通過與其他發展中國家合作,參與全球氣候變化談判,推動南南合作,增強了在氣候問題上的話語權。大型國際事件、人道主義活動和製作文化產品等,也是講述國家故事的有效方式。國際體育賽事如世界杯、奧運會等,不僅是體育競技的展示平台,更是國家形象和文化軟實力的展現場所,通過承辦或積極參與這些全球性事件,國家能夠向世界展示其實力、價值和文化魅力,推動積極的輿論議程。

“戰爭無非是政治通過另一種手段的延續”[23]。這一克勞塞維茨的經典論斷在武器化傳播的語境下得到了現代化的詮釋。武器化傳播突破了傳統戰爭的物理邊界,成為一種融合信息戰、認知戰和心理戰的現代戰略手段。它以非暴力的形式操控信息流向和公眾認知,使國家和非國家行為者無須依賴直接軍事行動即可實現政治目標,體現出極強的戰略性和目標性。通過操控信息、情緒和價值觀,武器化傳播能夠在避免全面戰爭的同時達成戰略目的,在全球競爭和衝突中,已成為強國對弱國進行政治壓制的重要手段。

武器化傳播的核心在於通過信息操控削弱敵方的決策力與行動能力,但其複雜性使得傳播效果難以完全預測。儘管信息強國通過技術優勢和傳播渠道壓制信息弱國,傳播效果卻充滿不確定性。尤其是在社交媒體和數字平台全球化的背景下,信息流動的邊界和效果愈加難以控制。這種複雜性為弱國提供了突破話語霸權的機會,推動信息傳播的反向博弈。弱國可以利用這些平台發起對抗,挑戰強國的信息操控,在全球輿論中佔據一席之地。非對稱性博弈反映了國際輿論的動態平衡,傳播不再是單向的控制,而是更為複雜的交互和對話,賦予弱者影響輿論的可能性。當前國際輿論格局仍以信息強國對信息弱國的單向壓制為主,但這一局面並非不可打破。信息戰爭具有高度的不對稱性,信息弱國可以憑藉技術創新、靈活策略和跨國合作逐步反制。通過發揮“非對稱優勢”,弱國不僅能夠影響全球輿論,還能藉助聯合行動和信息共享提升話語權。跨國合作與地區聯盟的建立,為弱國提供了反制強國的有力工具,使其能夠在國際輿論上形成合力,挑戰信息強國的主導地位。在戰爭框架下,各國可以靈活調整策略,主動塑造信息傳播格局,而非被動接受強國的信息操控。

戰爭社會學強調社會結構、文化認同和群體行為在戰爭中的作用。武器化傳播不僅是軍事或政治行為的延續,更深刻影響社會心理、群體情感和文化認同。強國利用信息傳播塑造他國的認知與態度,以實現自己的戰略目標。然而,從社會學視角來看,武器化傳播並非單向的壓制,而是複雜的社會互動和文化反應的產物。在這一過程中,信息弱國並非完全處於弱勢,相反,它們可以藉助文化傳播、社會動員和全球輿論的動態對抗,以“軟實力”反擊外部操控,塑造新的集體認同,展示“弱者武器”的正當性。

(基金項目:研究闡釋黨的二十屆三中全會精神國家社科基金重大專項“推進新聞宣傳和網絡輿論一體化管理研究”(項目編號:24ZDA084)的研究成果)

References:

[1]Lasswell H D. Propaganda Technique in the World War[M]. Beijing: Renmin University Press, 2003.

[2]Clausewitz C V. On War: Volume 1[M]. Translated by the Academy of Military Sciences of the People’s Liberation Army of China. Beijing: The Commercial Press, 1978.

[3]Herrman J. If everything can be ‘weaponized,’ what should we fear? [EB/OL]. (2017-03-14)[2024-12-20].https://www.nytimes.com/2017/03/14/magazine/if-everything-can-be-weaponized-what-should-we-fear.html.

[4]Ministry of Foreign Affairs of the People’s Republic of China. Joint statement by the People’s Republic of China and the Russian Federation on strengthening contemporary global strategic stability (full text)[EB/OL]. https://www.mfa.gov.cn/web/ziliao_674904/1179_674909/201906/t20190606_7947892.shtml.

[5]Mazarr M J, Casey A, Demus A, et al. Hostile social manipulation: present realities and emerging trends[M]. Santa Monica, CA: RAND Corporation, 2019.

[6]Bob Y J. Ex-CIA director Petraeus: Everything can be hijacked, weaponized[EB/OL].(2018-01-30)[2024-12-20].https://www.jpost.com/israel-news/ex-cia-director-petraeus-everything-can-be-hijacked-weaponized-540235.

[7]Mattson G. Weaponization: Metaphorical Ubiquity and the Contemporary Rejection of Politics[EB/OL].OSF(2019-01-08)[2024-12-20].osf.io/5efrw.

[8]Robinson L, Helmus T C, Cohen R S, et al. Modern political warfare: current practices and possible responses[M]. Santa Monica, CA: RAND Corporation, 2018.

[9]Kreitem H M. Weaponization of Access, Communication Inequalities as a Form of Control: Case of Israel/Palestine[J]. Digital Inequalities in the Global South, 2020: 137-157.

[10]Laity M. The birth and coming of age of NATO StratCom: a personal history[J]. Defence Strategic Communications, 2021, 10(10): 21-70.

[11]Confessore N. Cambridge Analytica and Facebook: The scandal and the fallout so far[J]. The New York Times, 2018(4).

[12]McQuinn B, Kolga M, Buntain C, et al. Russia’s weaponization of Canada’s far right and far left to undermine support for Ukraine[J]. International Journal (Toronto, Ont.), 2024, 79(2): 297-311.

[13]Stukal D, Sanovich S, Bonneau R, et al. Why botter: how pro-government bots fight opposition in Russia[J]. American political science review, 2022, 116(3): 843-857.

[14]Jones M O. The Gulf information war: propaganda, fake news, and fake trends: the weaponization of Twitter bots in the Gulf crisis[J]. International Journal of Communication, 2019, 13: 27.

[15]Genovese D. Nearly 50% of voters said deepfakes had some influence on election decision[EB/OL]. (2024-10-30)[2024-12-20]. https://www.foxbusiness.com/politics/nearly-50-voters-said-deepfakes-had-some-influence-election-decision.

[16]Peters C, Allan S. Weaponizing memes: The journalistic mediation of visual politicization[J]. Digital Journalism, 2022, 10(02):217-229.

[17]Gorissen S. Weathering and weaponizing the #TwitterPurge: digital content moderation and the dimensions of deplatforming[J]. Communication and Democracy, 2024, 58(01): 1-26.

[18]Pascale C M. The weaponization of language: Discourses of rising right-wing authoritarianism[J]. Current Sociology, 2019, 67(06): 898-917.

[19]Ridwanullah A O, Sule S Y, Usman B, et al. Politicization of hate and weaponization of Twitter/X in a polarized digital space in Nigeria[J]. Journal of Asian and African Studies, 2024.

[20]Mercieca J R. Dangerous demagogues and weaponized communication[J]. Rhetoric Society Quarterly, 2019, 49(03): 264-279.

[21]Welch C, Senman L, Loftin R, et al. Understanding the use of the term “Weaponized autism” in an alt-right social media platform[J]. Journal of Autism and Developmental Disorders, 2023, 53(10): 4035-4046.

[22]Farrell H, Newman A L. Weaponized interdependence: how global economic networks shape state coercion[J]. International Security, 2019, 44(01): 42-79.

[23]Clausewitz C V. On War: Volume 1[M]. Translated by the Academy of Military Sciences of the People’s Liberation Army of China. Beijing: The Commercial Press, 1978.

作者簡介:郭小安,重慶大學新聞學院教授、博士生導師,重慶市哲學社會科學智能傳播與城市國際推廣重點實驗室執行主任(重慶 400044);康如詩,重慶大學新聞學院碩士生(重慶 400044)。

中國原創軍事資源:https://www.cjwk.cn/journal/guidelinesDetails/192031322246497484888

Chinese Military to Utilize Artificial Intelligence Empowering Cognitive Confrontation Success on the Modern Battlefield

中國軍隊將利用人工智慧增強現代戰場認知對抗的成功

現代英語:

With the advent of the “smart+” era, artificial intelligence is being widely applied in the military field, and conventional warfare in physical space is accelerating its integration with cognitive confrontation in virtual space. Deeply tapping the potential of artificial intelligence to empower cognitive confrontation is of great significance for improving the efficiency of cross-domain resource matching and for seizing the initiative in future operations.

Data mining expands the boundaries of experience and cognition

Data-driven, knowing the enemy and knowing yourself. With advances in big data technologies, data has become the ammunition of cognitive offense and defense, and information advantage has grown increasingly important on the battlefield. Empowering traditional information processing with artificial intelligence can enhance the analysis of relevant information, accelerate cross-domain information integration through cross-domain data collection and the screening out of false information, and strengthen dynamic perception. Artificial intelligence can also help relieve battlefield data overload by organically integrating information about the enemy, one’s own forces, and the battlefield environment into a holographic intelligent database that provides solid support for cognitive confrontation.
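As a toy illustration of the multi-source integration described above (the record format, entity names, and confidence scores are invented for the example, not drawn from any real system), the sketch below fuses per-entity reports from several sources, keeping the highest-confidence value for each field:

```python
def fuse_reports(reports):
    """Merge per-entity reports from multiple sources into one record,
    keeping, for each field, the value reported with highest confidence.
    Each report is a dict: {entity, source, confidence, fields}."""
    fused = {}
    for r in reports:
        entity = fused.setdefault(r["entity"], {})
        for field, value in r["fields"].items():
            best = entity.get(field)
            if best is None or r["confidence"] > best[1]:
                entity[field] = (value, r["confidence"])
    # Drop the confidences for the final unified view.
    return {e: {f: v for f, (v, _) in fields.items()}
            for e, fields in fused.items()}

reports = [
    {"entity": "unit-7", "source": "radar", "confidence": 0.9,
     "fields": {"position": "34.1N 118.2E"}},
    {"entity": "unit-7", "source": "osint", "confidence": 0.4,
     "fields": {"position": "33.9N 118.0E", "type": "armor"}},
]
print(fuse_reports(reports))
# → {'unit-7': {'position': '34.1N 118.2E', 'type': 'armor'}}
```

A real fusion database would also reconcile timestamps and contradictory reports, but the per-field, confidence-weighted merge captures the basic mechanism of building one holistic picture from many partial ones.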

Everything intelligently connected, humans and machines in collaboration. Modern warfare is increasingly integrated across military and civilian spheres, and the boundary between peace and war is blurred. Technology has redefined how people interact with one another, how people interact with equipment, and how equipment interacts with equipment, while battlefield data flows continuously. Through big data mining and cross-domain comparative analysis, unstructured data such as images, audio, and video can be distilled so that reliable information is retained, expanding the boundaries of experiential cognition and raising the level of human-machine collaboration. The deep application of Internet of Things and big data technologies has steadily raised the intelligence of data acquisition, screening, circulation, and processing, laying a solid foundation for precision attacks in the cognitive domain.
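One small piece of such screening, filtering out near-identical copy-paste items before deeper analysis, can be sketched with character shingles and Jaccard similarity. The posts and the 0.7 threshold below are illustrative assumptions; real pipelines use perceptual hashes for images and embeddings for text:

```python
def shingles(text, k=3):
    """Character k-grams of a whitespace-normalized, lowercased string."""
    t = " ".join(text.lower().split())
    return {t[i:i + k] for i in range(len(t) - k + 1)}

def jaccard(a, b):
    """Set overlap of two shingle sets, in [0, 1]."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def screen_duplicates(posts, threshold=0.7):
    """Keep one representative of each cluster of near-identical posts --
    a crude filter for copy-paste amplification."""
    kept = []
    for p in posts:
        if all(jaccard(p, q) < threshold for q in kept):
            kept.append(p)
    return kept

posts = [
    "Breaking: city power grid attacked tonight",
    "BREAKING: city power grid attacked tonight!!",
    "Weather will be sunny tomorrow",
]
print(len(screen_duplicates(posts)))  # → 2
```

Collapsing duplicated noise in this way is what lets downstream analysts or models see one event once instead of ten thousand times.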

Break through barriers and achieve deep integration. Battlefield big data can break down the barriers to all-domain integration: it connects isolated information islands, promotes cross-domain coupling and aggregation of information, accelerates unobstructed information flow, and drives the shift from data fusion and information fusion toward perception fusion and cognition fusion. The full penetration of intelligent equipment into the command system accelerates the deep integration of situation awareness, situation prediction, and situation shaping, optimizes multi-dimensional information screening and the layout of cognitive confrontation, and drives continuous iteration and upgrading of cognitive-domain combat styles.

Intelligent algorithms strengthen decision-support effectiveness

Accelerate decision-making and sow confusion in the enemy. The outcome of cognitive confrontation depends in part on the contest between commanders’ wisdom and strategy. Through full-dimensional, cross-domain information confrontation and decision games, intelligent technology can be used to analyze and intervene in an opponent’s cognition and behavior and ultimately seize battlefield initiative. Artificial intelligence has already become a catalyst that multiplies combat effectiveness: in peacetime it can act as an intelligent “blue army” to simulate and war-game combat plans; in wartime, intelligent decision support improves the quality and speed of the “detect, control, strike, evaluate, protect” cycle, creating chaos for the enemy and paralyzing its system.

Autonomous planning and intelligent force grouping. On the intelligent battlefields of the future, “face-to-face” fighting will increasingly give way to “key-to-key” offense and defense. In cognitive-domain operations, intelligent algorithms that accurately identify identities, anticipate an opponent’s intentions, and seize key points in advance can quickly convert an information advantage into decision and action advantages. Such algorithms can also help locate the weaknesses of the enemy’s offense-defense system, autonomously plan combat tasks tailored to the “enemy” at hand, intelligently design force groupings, and feed combat effects back in real time. Relying on data links and combat clouds for intelligent rear support, combat advantage can be consolidated through dynamic networking and virtual-real interaction.

Decide before the enemy and strike with precision. Intelligent algorithms can help commanders anticipate risk, dynamically optimize combat plans against the opponent’s situation, and deliver precise cognitive attacks. In future intelligent command and control, a “cloud brain” can supply algorithmic support; combined with intelligent push, it can read the situation a step ahead of the enemy, decide a beat faster, and thoroughly disrupt the opponent’s thinking and actions. The focus should be on using intelligent technology to collect, organize, and deeply analyze an opponent’s decision-making and behavioral preferences, then tailor plans that actively induce decisions favorable to us, aiming at critical points to deliver an unexpected and fatal blow.

Powerful computing raises the level of all-domain planning

Plan the situation, build momentum, and suppress with computing power. “The general who wins makes many calculations in his temple before the battle is fought; the general who loses makes but few.” Cognitive confrontation is complex and fast-changing, and commanders’ experience and on-the-spot judgment alone cannot cope with it. Intelligent tools can be used to penetrate enemy thinking before battle, actively divide and erode the cognitive cohesion of the enemy’s team, and improve our battlefield control and operational initiative. At the same time, powerful intelligent computing should be harnessed to improve flexible command and overall planning, riding and building momentum to occupy the main position in cognitive confrontation.

Smart soft attack, computing-power raid. The rapid development of artificial intelligence is pushing war further from “hard destruction” toward “soft kill” and may completely overturn the traditional paradigm of war. For example, the latest technical concepts can be used to gain deep insight into how the enemy’s system operates, to study the opponent closely, and to maneuver the opponent at will. The psychological anchoring effect and the network superposition-amplification effect can also be exploited to interfere with the opponent’s cognitive loop, disrupt its command decision-making, and slow its reactions.

Cross-domain coordination with computing-power support. Winning the proactive battle of cognitive confrontation requires coordinating across all domains and massing force in multiple dimensions: using intelligent tools to autonomously steer the volume and direction of information flows, linking the physical, information, and cognitive domains into an integrated whole, leading with forward-looking deployment and distributed coordination, launching a comprehensive parallel offensive, and establishing cognitive control over the enemy. Joint actions that blend the virtual and the real across all domains can intervene in the enemy’s cognition, emotions, and will, using powerful computing to move first and fight proactively.

China Military Network Ministry of National Defense Network

Thursday, April 20, 2023

Chen Jialin, Xu Jun, Li Shan

現代國語:

伴隨「智慧+」時代的到來,人工智慧廣泛應用於軍事領域,物理空間的常規戰爭與虛擬空間的認知對抗加速融合。深度挖掘人工智慧潛力為認知對抗賦權,對提升跨域資源匹配效率,掌控未來作戰主動權具有重要意義。

資料挖潛拓展經驗認知邊界

數據驅動,知彼知己。隨著大數據相關技術的進步,數據資訊已成為認知攻防彈藥,資訊優勢在戰場上變得越來越重要。運用人工智慧技術賦能傳統資訊加工流程,可強化關聯資訊分析能力,透過跨領域資料擷取、虛假資訊甄別,加速資訊全局融合,強化動態感知能力。人工智慧還可協助緩解戰場數據過載,有機整合敵情、我情、戰場環境訊息,建立全像智慧資料庫,為認知對抗提供良好支撐。

萬物智聯,人機協同。現代戰爭日漸軍民一體、平戰界線模糊,技術重新定義了人與人、人與裝備、裝備與裝備的互動方式,戰場資料源源不絕。透過大數據探勘與跨域比較分析,可對影像、音訊、視訊等非結構化資料去粗取精、去偽存真,拓展經驗認知邊界,提升人機協同水準。物聯網、大數據技術的深度運用,推動資料取得、篩選、流轉、加工流程的智慧化程度不斷提升,為實施認知域精準攻擊夯實基礎。

打通壁壘,深度融合。依靠戰場大數據可有效突破全域融合的壁壘,有助於聯通條塊分割的資訊孤島,促進跨域資訊耦合聚合,加速資訊無障礙流通,推動資料融合與資訊融合向感知融合與認知融合轉化。智慧裝備全面滲透進入指揮體系,能夠加速態勢感知、態勢預測與態勢塑造的深度融合,優化多維資訊篩選與認知對抗佈局,推動認知域作戰樣式不斷迭代升級。

智慧演算法強化輔助決策效能

加速決策,致敵混亂。認知對抗的勝負,某種程度上取決於指揮家智慧謀略的博弈。可透過全維度跨域資訊對抗與決策博弈,借助智慧技術分析並介入對手認知與行為,最終贏得戰場主動。目前,人工智慧已成為戰鬥力倍增的催化劑,平時可扮演智慧「藍軍」模擬推演作戰方案;戰時透過智慧輔助決策,提升「偵、控、打、評、保」循環品質效率,給敵方製造混亂,促使其體系癱瘓。

自主規劃,智能編組。未來智慧化戰場上,「面對面」的拼殺將越來越多地讓位給「鍵對鍵」的攻防。在認知域作戰中,利用智慧演算法精準甄別身分資訊、預先研判對手企圖、事先扼控關鍵要點,能夠將資訊優勢快速轉化為決策優勢與行動優勢。利用智慧演算法支撐認知域作戰,還可協助摸清敵方攻防體系弱點,因「敵」制宜自主規劃作戰任務,智慧設計作戰編組,即時回饋作戰效果,依托資料鏈、作戰雲強化智慧後台支撐,在動態組網、虛實互動中強化作戰勝勢。

先敵決策,精準攻擊。智慧演算法可輔助指揮者預判風險,根據對手狀況動態優化作戰方案,實施精準認知攻擊。在未來智慧化指揮控制中,可利用「雲端大腦」提供演算法支撐,結合智慧推送先敵一步預判態勢,快敵一招制定決策,徹底打亂對手思路和行動。應著重運用智慧科技收集整理、深度分析對手決策和行為偏好,進而專項客製化計劃,積極誘導其作出有利於我的決策,瞄準要害出其不意地對其進行致命一擊。

強大算力提升全域運籌水平

謀勢造勢,算力壓制。 「夫未戰而廟算勝者,得算多也;未戰而廟算不勝者,得算少也。」認知對抗態勢複雜多變,僅靠指揮經驗和臨時判斷難以應對,可利用智能工具在戰前即對敵思維認知加強滲透,積極分化瓦解敵方團隊認知力,提升我戰場控局能力和作戰性。同時,應藉助強大智能算力,提升靈活指揮與全局運籌能力,順勢謀勢、借勢造勢,積極佔領認知對抗主陣地。

巧打軟攻,算力突襲。人工智慧的快速發展,推動戰爭進一步從「硬摧毀」轉向「軟殺傷」,可望徹底顛覆傳統戰爭範式。如可運用最新技術理念,深入洞察敵方體系運作機理,積極熟悉對手、調動對手。還可利用心理沉錨效應和網路疊加放大效應,幹擾對手認知循環鏈路,打亂對手指揮決策,遲滯對手反應速度。

跨域統籌,算力支撐。打贏認知對抗主動仗須全域跨域統籌、多維同向聚力,利用智慧工具自主控制資訊的流量流向,實現物理域、資訊域與認知域的一體聯動,引領前瞻性布勢與分散式協同,全面展開並行攻勢,形成對敵認知控制。有效進行全域虛實相生的聯合行動,對敵認知、情緒和意志實施幹預,借助強大算力下好先手棋、打好主動仗。

中國軍網 國防部網 // 2023年4月20日 星期四

陳佳琳 徐 珺 李 山

中國原創軍事資源:http://www.81.cn/jfjbmap/content/2023-04/20/content_338002888.htm

“Studying the Military, Studying War, Studying Fighting” Chinese Military Special Topic: The Key to Winning Cognitive Warfare

「學軍事、學戰爭、學打仗」中國軍事專題:打贏認知戰爭的關鍵

現代英語:

Information Network: The Key to Winning the Cognitive War

■Zhai Chan

Introduction

In today’s era of converging informatization and intelligentization, information networks, with their deep reach, broad audience, and strong interactivity, will play an irreplaceable role in cognitive warfare. With their support, cognitive warfare gains both power and scale. A deep understanding of the mechanisms and laws, basic forms, and methods and means by which information networks act on cognitive warfare helps to seize the initiative in cognitive warfare in good time and lays the foundation for victory.

The Mechanisms and Laws by Which Information Networks Act on Cognitive Warfare

The essence of information networks’ role in cognitive warfare is to supply massive amounts of information through core algorithms, create biased cognitive scenarios, and thereby influence the thinking and cognition of both people and intelligent machines. This process fuses the operating rules of information networks with the internal mechanisms of thought and cognition, is strongly predictable, and forms the underlying structure and key point that must be grasped in information-network cognitive warfare.

The stickiness effect based on path dependence. Today’s highly developed information networks provide a platform that people cannot do without for learning, work, daily life, entertainment, military construction, operations, and preparation for military struggle, creating an interconnected path dependence among all parties. With information at its core and the network as its medium, this platform binds different groups, societies, countries, and militaries together through invisible stickiness, knitting the whole world into a closely connected global village. Objectively, it also provides a bridge and link for conducting cognitive operations, influencing an opponent’s thinking, and winning cognitive wars. In 2009, US Secretary of State Hillary Clinton delivered an “Internet freedom” speech advocating an “Internet freedom” strategy, attempting to use the channel formed by people’s heavy dependence on the Internet to influence the thinking of the target country’s population, especially its younger generation, and to spread American values.

Interactive influence based on information exchange. Educational theory holds that interactive communication can overcome the cognitive barriers formed by one-way transmission: through mutual information exchange, emotional integration, and mutual need, the parties reach consensus, build empathy, and reinforce it. A major difference between information networks and traditional communication media is that they provide a carrier for large-scale, fast-paced, high-efficiency interaction. On this carrier, the information-dominant party can use the interaction mechanism to repeatedly confirm its influence, adjust its methods and strategies, and intervene in the other party’s thinking based on that party’s fluctuations of thought, emotional changes, and attitude feedback. For a long time the United States has maintained an “engagement + containment” strategy toward China. One important consideration is that such engagement overcomes the communication barriers and information gaps created by simple blockade and confrontation, increases interaction between the two governments and peoples, and thereby creates opportunities to open gaps and influence our ideas and ideology. Although this strategy plays out in the traditional field, it is inherently consistent with the interactive-influence mechanism of information networks based on information exchange.

The seductive influence based on preset scenarios. The concealment, virtuality, and permeability of information networks allow their controllers to create extremely deceptive, tempting, and inflammatory information scenes through troll-army flooding, information filtering, “fishing in troubled waters,” and other technical and strategic means, so that the opponent sinks into them unawares and develops toward the preset process and result. This directional manipulation of the information network can subtly and efficiently influence, infect, and shape the opponent’s thinking, leading it along without its noticing, with a combat effect far better than head-on confrontation. On the eve of the Iraq War, US media spread false information through the Internet and other platforms claiming that Iraq possessed weapons of mass destruction, accusing the Saddam regime of colluding with al-Qaeda, of rampant corruption, and of wantonly harming the Iraqi people, while doing everything possible to cover up the truth and filter out their own people’s anti-war voices, striving to create an atmosphere in which the Saddam regime was evil and hateful and all of America stood united against the enemy.

The basic forms of cognitive warfare waged through information networks

The laws of war and its victory mechanisms determine a war’s basic form. The laws and mechanisms by which information networks act on cognitive warfare inherently determine its external forms. The most basic and representative are information confusion warfare, misleading thinking warfare, and will-destroying warfare.

Information confusion warfare. This means flooding the network with a massive volume of complex information that mixes the real and the fake, both true and illusory, so that the enemy’s information network is overloaded, malfunctions, and runs in disorder, or so that specific audiences become “deaf, blind, and numb,” their cognitive capacity congested and their thinking, cognition, and decision-making obstructed. This form of warfare is often used in the early stages of combat and on opaque battlefields: the party with the information advantage can throw the enemy into a state of panic and bewilderment, producing perception failure, disoriented thinking, and self-inflicted disorder. Bloomberg reported that the US Space Force, the recently established sixth branch of the US military, plans to purchase 48 jammer systems by 2027, aiming to disrupt satellite signals “in the event of a conflict with a major power.” Many national militaries generally feel that the information they receive is not too little but too much: the massive flows converging from all directions put tremendous pressure on situation awareness, analysis, and judgment.

Misleading thinking warfare. This means instilling specific information that carries the intent of the party controlling the network, forming a biased information scene that misleads, deceives, and influences the thinking of specific countries, armies, and populations, causing them to leave the correct development track and drift in a direction beneficial to oneself and harmful to the enemy. It is the highest form of cognitive attack and its habitual practice. Such misleading rests on strong external pressure, relies on specious stratagems, and wields watered-down information as its weapon; it targets the opponent’s habits of thought and weak links and carries out pointed deception, so that the opponent loses its way amid tension and panic and falls into the “trap” unawares. In recent years, while pursuing great-power competition strategies, some countries have used online troll armies to fabricate false situations, manufacture false information, and spread genuine-sounding rumors to fan the flames around our country, encouraging states with historical grievances or present-day frictions with us to stir up trouble. The aim is to induce us to divert our attention, weaken the resources and strength invested in the main strategic direction, and stray from the track of national rejuvenation, so that they may reap the fisherman’s profit.

Will-destroying warfare. The futurist Alvin Toffler said that whoever controls the human mind controls the entire world. War is ultimately a confrontation between people: psychological activity largely shapes a person’s mental state, which in turn shapes the will to fight. Unlike traditional warfare, which affects people’s will indirectly through material destruction, will-destroying warfare directly targets the psychological activity, mental state, and decision-making of key figures, thereby affecting morale, fighting will, and combat actions. With scientific and social progress, intervention in the human will has moved from the traditional stratagem-centered stage to a “technology + strategy” stage. More than a decade ago, scientists developed a “sound beam” weapon that uses an electromagnetic network to emit an extremely narrow column of sound from hundreds of meters away, interfering with the enemy’s judgment and even driving strong-willed soldiers to mental breakdown. In recent years, studies have shown that artificial speech synthesis based on brain-wave signals can extract signals from the brain and synthesize speech that humans can directly understand.

The main means by which information networks act on cognitive warfare

“Technology + strategy” constitutes the basic means of modern cognitive warfare. As a product of modern scientific and technological development, the information network likewise acts on cognitive warfare chiefly through “technology + strategy.” This gives us a basic entry point for understanding and mastering the ways and scientific paths by which information networks act on cognitive warfare, and thus for winning it.

Big data construction. As the core component of the information network, data is not only the carrier of information but also the “new oil” that drives the network’s value, and the basic ammunition of cognitive warfare. Constructing complex information scenarios from massive data for our own use, whether to confuse the opponent’s cognition, mislead and deceive its thinking, or destroy its beliefs and will, constitutes the basic logic of information-network cognitive warfare. Within this logic, data is undoubtedly the most basic resource and the most central element. A few years ago, authoritative departments calculated that the world produces about 2.5 exabytes (EB) of data every day, of which only 20% is structured data that can be used directly; the remaining 80% must be analyzed, identified, and screened. These exponentially growing data resources provide an inexhaustible supply of “data ammunition” for constructing information scenarios and conducting cognitive warfare.
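The figures quoted above imply a simple daily split, made concrete in a few lines of arithmetic:

```python
# Daily global data volumes implied by the figures quoted above (in exabytes).
daily_total_eb = 2.5
structured_eb = daily_total_eb * 0.20    # directly usable
unstructured_eb = daily_total_eb * 0.80  # must be analyzed, identified, screened
# That is, roughly 0.5 EB of directly usable data and 2.0 EB needing processing per day.
```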

Intelligent push. In the information-network era, intelligent push has become a convenient channel through which people absorb outside information, win intellectual recognition and emotional resonance, and influence others’ thinking. Using artificial intelligence and other advanced technologies to collect, organize, and analyze data on people’s habits of thought and behavioral preferences, and then to generate personalized, customized pushes, can produce an “echo wall” of converging social cognition and an information cocoon that shackles people. At the same time, it helps one put oneself in the opponent’s place, grasp its trends of thought and likely actions, and take targeted countermeasures. In daily life we have all experienced the flood of similar recommendations that follows shopping online or searching for one type of information. Applied to cognitive operations, this push method readily enables the information-dominant side to use network data to analyze in advance the decisions and actions the target’s command and decision-making level may take, and to induce it toward the desired decisions or to prepare corresponding responses beforehand.
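The “echo wall” dynamic can be sketched with a toy content-based push loop. The data and the word-overlap (Jaccard) scoring below are illustrative assumptions, not any real recommender:

```python
def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two texts."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

def push(history, candidates, k=2):
    """Rank candidates by similarity to what the user has already consumed."""
    profile = " ".join(history)
    return sorted(candidates, key=lambda c: similarity(profile, c), reverse=True)[:k]

history = ["vaccine safety doubts", "vaccine side effects report"]
candidates = [
    "new vaccine side effects claim",
    "local sports results",
    "vaccine safety investigation",
]
pushed = push(history, candidates)  # items resembling past reading win out
```

Because each push is chosen by resemblance to past consumption, the user’s exposure narrows over iterations — the information cocoon described above.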

Emotional infiltration. Freud said that we are not pure intellect or pure soul, but a collection of impulses. In information-network space, the ideas that spread widely and quickly are seldom calm, rational, objective analysis; most are impulsive, irrational emotional mobilization. This follows from the fast pace at which information spreads and news breaks. The cognitive demand to respond quickly to such information in turn triggers the reflexive, impulsive, emotional reactions of “fast thinking,” transforming seemingly isolated social cases into strongly coercive, inflammatory psychological cues and behavioral drives that explosively spawn irrational decisions and actions. In June 2009, a diplomatic cable later disclosed by WikiLeaks described the lavish banquets of the family of Tunisia’s Ben Ali regime and portrayed the regime as a corrupt and tyrannical “mafia.” This deepened the citizens’ resentment and became an important driver of the “Jasmine Revolution” that overthrew the Ben Ali regime.

現代國語:

來源:解放軍報 作者:翟嬋 責任編輯:劉上靖 2021-11-18 06:49:14
資訊網絡:認知戰制勝要津

■翟 嬋

引 言

在資訊化智慧化融合發展的當今時代,資訊網絡以其觸角深、受眾廣、互動性強等優勢,在認知戰中將發揮無可取代的重要作用。有了資訊網絡的加持,認知戰將如虎添翼、如魚得水。深刻掌握資訊網絡作用認知戰的機理規律、基本形態、方法手段等,有助於及時掌控認知戰主動權,為贏得勝利奠定基礎。

資訊網絡作用認知戰的機理規律

資訊網絡作用認知戰的本質在於透過核心演算法,提供大量訊息,營造傾向性認知場景,影響人和智慧機器的思維認知。這個過程融合資訊網絡運行規律和思維認知內在機理,具有很強的可預知性,是資訊網絡認知戰必須把握的底層架構和關鍵之點。

基於路徑依賴的黏性影響。當今社會高度發達的資訊網絡,提供了一個人們學習、工作、生活、娛樂,軍隊建設、作戰和軍事鬥爭準備須臾離不開的平台,在彼此之間形成一個互聯互通的路徑依賴。這一平台以資訊為核、網絡為媒,透過無形的黏性把不同人群、社會、國家包括軍隊連接在一起,既將整個世界打通成一個緊密聯繫的地球村,客觀上也為開展認知作戰、影響對手思維認知、制勝認知戰爭提供了橋樑和紐帶。 2009年美國國務卿希拉裡曾發表「互聯網自由」演說,鼓吹「互聯網自由」戰略,企圖利用人們對互聯網的高度依賴形成的作用通道,影響對手國民眾特別是青年一代的思維認知,傳播美式價值觀。

基於資訊交換的互動影響。教育學認為,互動交流能有效克服訊息單向傳遞所形成的認知屏障,在彼此訊息交換、情感融通、需求相促中達成共識、形成同理心、強化同理。資訊網絡與傳統交流溝通媒介的一個很大不同,在於提供了一個能大範圍、快節奏、高效率互動交流的載體。在這一載體中,資訊強勢一方能透過載體提供的互動機制,依據另一方的思想波動、情緒變化、態度回饋等,反復確認影響,調整方法策略,幹預另一方的思維認知。長期以來,美國對華保持「接觸+遏制」戰略,一個很重要的考慮就在於這種接觸能有效克服單純封鎖對抗形成的溝通壁壘和信息鴻溝,增強兩國政府和民眾之間的互動,從而尋找機會打開缺口,影響我們的思想觀念和意識形態。這項戰略雖然發生在傳統領域,但與資訊網絡基於資訊交換的互動影響機理內在一致。

基於預設場景的誘導影響。資訊網絡的隱蔽性、虛擬性、滲透性,使其掌控者能通過水軍灌水、資訊過濾、「渾水摸魚」等技術和謀略手段,營造極具欺騙性、誘惑性、煽動性的信息場景,使對手深陷其中而不自知,反而朝著預設的過程和結果發展。這種對資訊網絡的指向性操控,能潛移默化地高效影響、感染和塑造對手的思維認知,使之不知不覺被帶節奏,收到遠比對抗硬槓好得多的作戰效果。伊拉克戰爭前夕,美國媒體透過網路等平台大肆散佈伊拉克存在大規模殺傷性武器等虛假訊息,指責薩達姆政權與基地組織相互勾連、腐敗成風,還無端殘害伊拉克人民,同時又想方設法掩蓋事實真相,過濾本國人民的反戰聲音,極力營造薩達姆政權邪惡可恨、全美上下同仇敵愾的氛圍。

資訊網絡作用認知戰的基本形態

戰爭規律和製勝機理決定戰爭的基本形態。資訊網絡作用認知戰的規律機理內在規定著這種戰爭的外在呈現形態。其中最基本、最具代表性的包括資訊迷茫戰、思維誤導戰和意志毀傷戰。

資訊迷茫戰。就是用海量虛實結合、亦真亦幻的復雜信息灌注網絡,使敵對方信息網絡容量過載、功能失常、運轉失序,或導致特定受眾對象“失聰失明失感”、認知能力擁堵、思維認知和決策判斷受阻。這一戰爭形態常用於作戰初期和不透明戰場,擁有資訊優勢的一方能使敵對方陷入茫然不知所措的恐慌狀態,從而感知失靈、思維失據、自亂陣腳。彭博社稱,美國成立不久的第六大軍種——太空軍,計劃2027年前採購48套幹擾系統,旨在“與大國發生沖突情況下”,幹擾迷茫其衛星信號。不少國家軍隊普遍感到,現在獲取的資訊不是太少了而是太多了,來自四面八方的巨量資訊大量聚集,給態勢感知和分析判斷造成巨大壓力。

思維誤導戰。就是透過灌輸包含資訊網絡掌控方意圖指向的特定訊息,形成傾向性訊息場景,誤導欺騙和影響特定國家、軍隊和人群思維理念,使之偏離正確發展軌道,朝著於己有利、於敵有損的方向偏移,是認知攻擊的最高境界和慣常做法。這種誤導以強大的外部壓力為前提,以似是而非的策略為基礎,以摻雜水分的信息為武器,針對對手思維特點和薄弱環節,實施導向鮮明的誘騙,使對手在緊張慌亂中迷失方向,不知不覺落入「圈套」。這些年來,一些國家在實施大國競爭戰略的同時,通過網絡水軍虛構假情況、製造假信息、散佈真謠言,在我國週邊煽風點火,鼓動一些在歷史上與我國有積怨、現實中有摩擦的國家尋釁滋事,目的就是要誘導我們轉移注意力,削弱在主要戰略方向上的資源力量投入,偏離強國復興的軌道,謀取漁翁之利。

意志毀傷戰。未來學家托夫勒說,誰控制了人的心理,誰就控制了整個世界。戰爭說到底是人與人的對抗,人的心理活動很大程度影響人的精神狀態,進而影響作戰意志。意志毀傷戰與傳統作戰透過物質摧毀間接影響人的意志不同,它透過直接影響關鍵人物的心理活動、精神狀態和思維決策,影響軍心士氣、戰鬥意志和作戰行動。隨著科技發展和社會進步,對人的意志的干預,已經由傳統以謀略為主演進到“技術+謀略”階段。早在十多年前就有科學家研製出“聲波集束”武器,通過電磁網絡從數百米外發射極為狹窄的聲波“音柱”,幹擾敵人判斷甚至使意志堅強的軍人精神錯亂。近年來有研究表明,基於腦電波信號的人工語音合成技術可提取大腦中的信號,合成人類能夠直接理解的語音。

資訊網絡作用認知戰的主要手段

「技術+謀略」構成現代認知戰的基本手段。資訊網絡作為現代科技發展的產物,其對認知戰的作用手段也主要體現在「技術+謀略」上。這為我們認識並掌握資訊網絡作用認知戰的方式、科學路徑,從而製勝戰爭提供了基本切入點。

大數據構塑。數據作為資訊網絡的核心構件,不僅是資訊的載體,而且是資訊網絡價值驅動的“新石油”,更是作用認知戰的基本彈藥。透過大量資料構塑為我所用的複雜資訊場景,或對對手進行思維認知迷茫,或給予思維誤導欺騙,或進行信念意志摧毀,構成資訊網絡作用認知戰的基本邏輯。在這個邏輯架構中,數據無疑是最基礎的資源、最核心的元素。遠在幾年前,權威部門就統計,全球每天生產約2.5艾字節(EB)的數據,其中僅20%是可以直接利用的結構化數據,其餘80%則需要進行分析、甄別、篩選。這些幾何級數成長湧現的數據資源,為構塑數據資訊場景、實施認知戰提供了取之不盡用之不竭的「數據彈藥」。

智能化推送。資訊網絡時代,智慧化推播成為人們攝取外在訊息,獲得思考認同、情感共鳴、影響他人思考認知的便利管道。運用人工智慧等先進技術收集、整理、分析人的思維慣性、行為偏好數據,形成擬人化客製化感知推送,能夠產生社會認知趨同的「回音壁」和桎梏人的信息繭房,同時也有利於推己及人、了解對手的思維趨向和可能行動,有針對性地採取應對措施。生活中,我們都有一次或幾次網上購物、搜索某類信息後,大量類同信息推送進來的經歷,這種智能化推送手段應用到認知作戰中,很容易使信息主導方通過信息網絡數據,對作戰對象指揮決策層可能做出的決策、採取的行動等予以前瞻分析研判,誘導其作出希望看到的決策行動或預先作出相應的應對措施。

情緒化浸染。佛洛伊德說,我們不是純粹的智慧、純粹的靈魂,而是一個沖動的集合。在資訊網絡空間,能夠得到廣泛且快速傳播的觀念認知,往往不是冷靜理性客觀的思維分析,多是沖動非理性的情緒動員。這是由資訊傳播、新聞發布「先發製人」的快節奏決定的。對這些資訊作出快速反應的認知需求,反過來又導致「快思維」條件反射性、沖動性、情緒化反應,將看似孤立的社會個案轉化為具有強烈壓迫性、煽動性的心理暗示和行為驅動,暴發性催生非理性決策行動。2009年6月維基解密披露的一份外交電文中,描繪了突尼斯本·阿里政權家族宴會的奢靡場景,並煞有介事地將該政權形容為腐敗暴政的“黑手黨”,這加深了該國國民怨恨情緒,從而成為引燃推翻本·阿里政權的「茉莉花革命」重要推手。

中國原創軍事資源:http://www.mod.gov.cn/gfbw/jmsd/4899062.html?big=fan

Comprehensive Review of Chinese Military Intelligent Warfare: Intelligent Combat Command

中國軍事智慧戰爭全面回顧:智慧作戰指揮

現代英語:

Liu Kui, Qin Fangfei

Tips

● Modern artificial intelligence is essentially like a “brain in a vat.” If it is allowed to conduct combat command, it will always face the problem of lost subjectivity, that is, loss of “self.” This gives artificial intelligence an innate and fundamental defect: combat command must rest on human subjectivity, with effectiveness and quality improved through human-machine hybridization.

● In intelligent combat command, the commander is chiefly responsible for deciding what to do and along what lines, while the intelligent model plans how, specifically, to do it.

“Brain in a vat” is a famous thought experiment: if a person’s brain were removed and placed in a vat of nutrient solution, its nerve endings wired to a computer that simulates every sensory signal, could that brain realize “I am a brain in a vat”? The answer is no. As a closed system lacking real interactive experience with the outside world, it cannot step outside itself, observe itself from without, and form self-awareness. Modern artificial intelligence is essentially such a brain in a vat. If it were allowed to conduct combat command, it would always face the problem of a missing subject, a missing “self.” This gives artificial intelligence an innate and fundamental defect: combat command must rest on human subjectivity, with effectiveness and quality improved through human-machine hybridization.

Based on “free choice”, build a command model in which humans devise and machines plan

On the battlefield, a commander can choose which target to attack and whether to strike from the front, the flank, the rear, or the air; he can isolate without attacking, surround without attacking, or negotiate without attacking… This is human autonomy: the freedom to choose what to do and how to do it. Machines cannot do this. The combat plans they produce can only be the plans implicit in the intelligent model, and each specific plan is merely the most probable one in the sense of probability and statistics. Plans generated by artificial intelligence therefore tend toward the “templated”: the model acts like a “replica machine,” giving similar answers to the same questions and similar combat plans for the same combat scenarios.

By contrast, different commanders design completely different plans for the same combat scenario, and the same commander designs different plans when facing similar scenarios at different times. “Attack where the enemy is unprepared; appear where you are not expected” — the most effective plan may look like the most dangerous and impossible one. Facing a combat scenario, a commander holds infinite possibilities in an instant; artificial intelligence holds only the best-looking certainty, lacking creativity and stratagem and easy for an opponent to predict. In intelligent combat command, therefore, human autonomy should be the foundation: the commander devises strategy and calculation, innovates tactics, and designs the basic scheme, while the machine converts that scheme into an executable, operational combat plan, forming a command model in which humans devise and machines plan. More importantly, autonomy is the unique mark of human existence as human. This power of free decision cannot and must not be transferred to machines, turning people into their vassals.
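The contrast drawn above between a model’s “most probable plan” and a human’s free choice can be sketched in a few lines; the candidate plans and their scores are invented for illustration:

```python
import random

plan_scores = {            # assumed scores a model assigns to candidate plans
    "frontal assault": 0.50,
    "flank attack":    0.30,
    "encirclement":    0.15,
    "feint and delay": 0.05,
}

def greedy_plan(scores):
    """Argmax selection: the same scenario always yields the same plan."""
    return max(scores, key=scores.get)

def sampled_plan(scores, rng):
    """Weighted sampling: retains some of the unpredictability of free choice."""
    plans, weights = zip(*scores.items())
    return rng.choices(plans, weights=weights, k=1)[0]
```

A greedy decoder is the “replica machine” of the text: rerun it on the same scores and it never varies, which is exactly what makes it predictable to an opponent.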

Based on “self-criticism”, build a command model in which humans oppose and machines correct

Human growth and progress usually proceed from the real self, aim at the ideal self, and criticize the historical self in the manner of the negation of the negation. Artificial intelligence has no “self” and therefore lacks the capacity for self-criticism, which means it can only solve problems within its original cognitive framework. The model’s combat ideas, combat principles, and tactics are fixed when training completes; updating and improving its knowledge and ideas requires continued training of the model from outside. Mapped onto a concrete combat scenario, the intelligent model can only offer the commander a pre-given solution; it cannot dynamically adjust and update the solution continuously during a battle.

A person with a self-critical spirit can step outside the decision-making process to review, evaluate, and criticize the decision, continually adjusting the combat plan through sustained self-criticism and even overturning the original plan to form a new one. Within the command group, other commanders may voice differing opinions on the plan; the commander adjusts and improves the original plan after fully absorbing them, realizing the plan’s dynamic evolution. Combat command is thus essentially a dynamic process of continuous forward exploration, not the static execution of a plan given in advance. When the machine generates a combat plan, the commander must not accept it blindly and without thought; he should act as “opponent” and “fault-finder,” reflecting on and criticizing the plan and raising objections. On the basis of these objections, the machine helps the commander continuously adjust and optimize the plan, forming a command model in which humans oppose and machines correct.
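The critique loop just described can be sketched as follows. `propose` is a stub standing in for a plan-generating model, and the objection strings are invented:

```python
def propose(constraints):
    """Stand-in for a model that regenerates a plan under accumulated constraints."""
    return {"plan": "baseline assault", "avoids": sorted(constraints)}

def command_loop(objections):
    """Machine proposes; the commander objects; the machine revises."""
    constraints = set()
    plan = propose(constraints)          # initial machine proposal
    for objection in objections:         # commander reviews and criticizes
        constraints.add(objection)
        plan = propose(constraints)      # machine corrects against the objection
    return plan

final = command_loop(["river crossing at night", "exposed left flank"])
```

The loop terminates when the commander stops objecting; in the text's terms, the human supplies the negation, and the machine supplies the revision.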

Based on “conscious initiative”, build a command model in which humans lead and machines follow

Comrade Mao Zedong once said that what we call “conscious initiative” is the characteristic that distinguishes humans from things. Any complex practical activity to transform the world begins as a rough, abstract idea. Turning abstract concepts into concrete actions requires overcoming all manner of risks and challenges, giving full play to conscious initiative, and taking the initiative to set goals, make proposals, and find ways forward. Artificial intelligence has no conscious initiative: asked a question, it gives only the answer implicit in the model, without caring whether that answer is usable, targeted, or practical. In other words, given an abstract and empty question, it returns an abstract and empty answer. This is also why the prevailing mode of operating today’s large models is “people ask and machines answer,” not “machines ask.”

Relying on conscious initiative, even the most abstract and empty problem can be transformed step by step into concrete action plans and concrete practice. In intelligent combat command, therefore, the commander is chiefly responsible for deciding what to do and along what lines, while the intelligent model plans how to do it specifically. If the combat mission is too abstract and general, the commander first breaks the problem down; the intelligent model then solves the detailed sub-problems. Under the commander’s guidance, the problem is solved stage by stage and field by field until the combat goal is achieved, forming a command model in which humans lead and machines follow. It is like writing a paper: first the outline, then the drafting. The human makes the outline and the machine does the concrete writing; if the first-level outline is not specific enough, the human refines it into a second- or even third-level outline.
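The outline analogy can be sketched directly. `machine_fill` is a stub for a model call, and the outline content is invented:

```python
def machine_fill(item: str) -> str:
    """Stand-in for a model call that elaborates one concrete sub-task."""
    return f"[draft for: {item}]"

def execute(outline):
    """Walk a human-made outline; the machine handles only the leaf items."""
    return {heading: [machine_fill(s) for s in subitems]
            for heading, subitems in outline.items()}

outline = {  # first-level headings and leaves chosen by the commander
    "suppress enemy sensors": ["jam radar band X", "spoof navigation"],
    "shape public narrative": ["draft press line"],
}
filled = execute(outline)
```

If a leaf is still too abstract, the human splits it further before handing it down — the second- and third-level outlines of the text.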

Based on “autonomous responsibility”, building a command model of “human decision-making and machine calculation”

Modern advanced shipborne air-defense and anti-missile systems usually have four operational modes: manual, semi-automatic, standard automatic, and special automatic. Once the special automatic mode is activated, the system no longer requires human authorization to launch missiles. Yet this mode is rarely activated, whether in combat or in training. The reason is that humans, as responsible subjects, must answer for all of their actions, whereas behind a machine's behavior there is no responsible subject at all: when major mistakes must be answered for, a machine cannot be held accountable. Matters of life and death therefore must never be decided by a machine that bears no autonomous responsibility. Moreover, modern artificial intelligence is a “black box”: the intelligent behavior it exhibits is not explainable, and the reasons it is right or wrong cannot be known, so people cannot lightly hand over major decision-making power to machines.

Because AI lacks “autonomous responsibility”, every problem in its eyes is a “domesticated problem”: the consequences of such a problem have nothing to do with the respondent, and whether the solution succeeds or fails is a matter of indifference to it. The counterpart is the “wild problem”, whose consequences are intimately bound up with the respondent, who must be personally invested in it. To an AI without a self there are no “wild problems”, only “domesticated problems”, and it stands outside every problem it touches. In intelligent combat command, therefore, machines cannot replace commanders in making judgments and decisions. A machine can supply commanders with key knowledge, identify battlefield targets, organize battlefield intelligence, analyze battlefield conditions, and predict the battlefield situation; it can even generate operational schemes, formulate combat plans, and draft combat orders. But the schemes, plans, and orders it produces can serve only as drafts and references; whether to adopt them, and to what extent, remains the commander's call. In short, the two sides decide together: artificial intelligence is responsible for prediction, and the human is responsible for judgment, forming a command mode of “human decision-making and machine calculation”.
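The "machine predicts, human judges" split can likewise be sketched in a few lines. This is a hedged illustration only: the scoring rule, the callback interface, and every name below are assumptions, standing in for a real planning model and a real commander.

```python
# A minimal, hypothetical sketch of the "human decision-making and machine
# calculation" pattern: the machine scores candidate plans (prediction),
# while a human callback makes the final, accountable choice (judgment).
# The scoring rule and all names here are illustrative assumptions.

from typing import Callable

def machine_predict(plans: list[str]) -> list[tuple[str, float]]:
    """Stand-in for model calculation: attach a score to each draft plan."""
    return [(plan, round(1.0 / (rank + 1), 2)) for rank, plan in enumerate(plans)]

def command_loop(plans: list[str],
                 human_decide: Callable[[list[tuple[str, float]]], str]) -> str:
    scored = machine_predict(plans)   # machine: calculate and rank
    return human_decide(scored)       # human: the accountable final decision

# Usage: the machine's top-scored plan is only a reference; the human
# may overrule it, here by deliberately picking the lowest-scored plan.
chosen = command_loop(["plan A", "plan B"],
                      human_decide=lambda scored: scored[-1][0])
print(chosen)  # plan B
```

The design choice mirrors the text: the machine's output never flows directly to execution; it passes through a human callback that holds the decision authority, so accountability stays with a person.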

現代國語:

從「缸中之腦」看智慧化作戰指揮

■劉 奎 秦芳菲

要點提示

●現代人工智慧,本質上就如同“缸中之腦”,如果讓它實施作戰指揮,始終會面臨主體缺失即“自我”缺失的問題。這使得人工智慧存在天然的、根本的缺陷,必須基於人的主體性,透過人機混合來提升作戰指揮效能和水平

●智能化作戰指揮中,指揮員主要負責規劃做什麼、依什麼思路做,智能模型則負責規劃具體怎麼做

「缸中之腦」是一項著名科學假設。意思是,假如人的大腦被取出放在營養液中,神經末梢接上計算機,由計算機模擬出各種感知信號。這時候,「缸中之腦」能不能意識到「我是缸中之腦」?答案是不能,因為人作為一個封閉的系統,當與外界缺乏真實的互動體驗時,人是無法跳出自身、從自身之外觀察自身並形成自我意識的。而現代人工智慧,本質上就如同“缸中之腦”,如果讓它實施作戰指揮,始終會面臨主體缺失即“自我”缺失的問題。這使得人工智慧存在天然的、根本的缺陷,必須基於人的主體性,透過人機混合來提升作戰指揮效能和水準。

基於“自由選擇”,建構“人謀機劃”的指揮模式

戰場上,指揮員可以選擇打哪一個目標,可以選擇從正面打、從翼側打、從背後打、從空中打;可以隔而不打、圍而不打、談而不打……這就是人的自主性,可以自由選擇做什麼、怎麼做。但機器不行,它給出的作戰方案,只能是智慧模型中蘊含的方案。就每次給出的特定方案而言,也是機率統計意義上可能性最大的方案。這使得人工智慧生成的方案呈現“模板化”傾向,相當於一個“復刻機”,同樣的問題,它給出的是相似的回答,同樣的作戰場景,它給出的就是相似的作戰方案。

與人工智慧相比,同樣的作戰場景,不同的指揮員設計的作戰方案完全不同;同一指揮員在不同的時間面對相似的作戰場景,設計的作戰方案也不相同。 “攻其無備,出其不意”,最有效的方案很可能看上去是最危險、最不可能的方案。對於指揮員,面對作戰場景,一瞬間有無限可能,而對於人工智慧,一瞬間卻只有看上去最好的確定,缺乏創意、缺少謀略,很容易為對方所預料。所以,在智慧化作戰指揮中,要基於人的自主性,由指揮員負責籌謀算計、創新戰法打法、設計基本策略,由機器負責將基本策略轉化為可執行可操作的作戰方案,形成「人謀機劃」的指揮模式。更重要的是,自主性是人作為人而存在的獨特標志,這種自由作決定的權力不可能也不允許讓渡給機器,使人淪為機器的附庸。

基於“自我批判”,建構“人反機正”的指揮模式

人類的成長進步,通常是立足現實自我,著眼理想自我,對歷史自我進行否定之否定式的批判。人工智慧沒有“自我”,同時也喪失了自我批判能力。這使得它只能停留在原有認知框架內解決問題,模型擁有的作戰思想、作戰原則、戰法打法,是在訓練完成時所給予的。如果想獲得知識和想法的更新提升,就必須從外部對模型進行持續訓練。映射到特定作戰場景,智慧模型給指揮員提供的只能是事先給定的問題解決方案,要想在一次作戰中不斷地動態調整更新是做不到的。

具有自我批判精神的人類,可以跳脫指揮決策思考過程,對指揮決策進行審視、評價、批判。在持續地自我批判中不斷對作戰方案進行調整,甚至推翻原有方案,形成新的方案。在指揮機構群體中,其他指揮人員也可能對作戰方案提出不同意見,指揮員在充分吸納這些意見的基礎上,調整改進原有方案,實現作戰方案的動態進化。所以,作戰指揮本質上是一個不斷向前探索的動態過程,不是作戰方案事先給定的靜態過程。當機器生成作戰方案時,指揮員不能不加思考地盲目接受,而應充當“反對者”“找茬人”,對作戰方案展開反思批判,提出反對意見,機器根據人的反對意見,輔助指揮員不斷調整、優化作戰方案,形成「人反機正」的指揮模式。

基於“自覺能動”,建立“人引機隨”的指揮模式

毛澤東同志說過,我們名之曰“自覺的能動性”,是人之所以區別於物的特點。任何一項改造世界的複雜實踐活動,都是從粗糙的、抽象的想法開始的,要將抽象觀念轉化為具體行動,需要克服各種風險和挑戰,充分發揮自覺能動性,主動定目標、出主意、想辦法。沒有自覺能動性的人工智慧,人們向它提出問題,它給出的只是模型中蘊含的答案,而不會管這個答案能不能用、有沒有針對性、可不可以實際操作,即提出抽象、空洞的問題,它給出的就是抽象、空洞的回答。這也是為什麼時下流行的大模型統一的運行模式是“人問機答”,而不是“機器提出問題”。

依賴自覺能動性,再抽象、空洞的問題都能由人一步一步轉化為具體的行動方案、具體的行動實踐。因此,在智慧化作戰指揮中,指揮員主要負責規劃做什麼、依什麼思路做,智慧模型則負責規劃具體怎麼做。若作戰任務太過抽象籠統,應先由指揮員對問題進行分解細化,再由智慧模型對細化後的問題進行解算。在指揮引導下,分階段、分領域逐步解決問題,最終達成作戰目標,形成「人引機隨」的指揮模式。這就像寫一篇論文,先列出提綱,再進行寫作,列提綱由人負責,具體寫作由機器完成,如果感覺一級綱目不夠具體,可由人細化為二級乃至三級綱目。

基於“自主負責”,建立“人斷機算”的指揮模式

現代先進的艦載防空反導系統,通常有手動、半自動、標準自動、特殊自動四種作戰模式,一旦啟用特殊自動模式,系統發射導彈將不再需要人的授權幹預。但該模式無論在實戰還是在訓練中都很少啟用。究其原因,人作為責任主體要對自己的所有行為負責,而機器行為背後卻是責任主體的缺失,當要為重大失誤追責時,機器是無法負責的。所以,生死攸關的大事決不能讓一個沒有自主責任的機器決定。況且,現代人工智慧是一個“黑箱”,它所展現的智能行為具有不可解釋性,對與錯的原因無從知曉,讓人無法輕易將重大決定權完全交給機器。

由於人工智慧缺乏“自主責任”,會使它眼中的問題全是“馴化問題”,也就是該類問題產生的後果與回答者沒有關系,問題解決的成功也罷、失敗也罷,對回答者來說無所謂。與之相應的是“野生問題”,也就是該類問題產生的後果與回答者息息相關,回答者必須置身其中。所以,在缺失自我的人工智慧眼中沒有“野生問題”,都是“馴化問題”,它對任何問題都置身事外。因此,在智慧化作戰指揮中,機器不能取代指揮員做出判斷和決策。它可以為指揮員提供關鍵知識、識別戰場目標、整編戰場情報、分析戰場情況、預測戰場態勢,甚至可以形成作戰方案、制定作戰計劃、擬製作戰命令,但它給出的方案計劃命令,只能作為草稿和參考,至於採不採用、在多大程度上採用,還得指揮員說了算。簡單來說,就是雙方共同做出決策,人工智慧負責預測,人負責判斷,形成「人斷機算」的指揮模式。

Original Chinese military source: http://www.81.cn/yw_208727/16361814.html

Chinese Military in-depth Analysis of the Latest “Cognitive Warfare” Case in the Russia-Ukraine Conflict

中國軍方深入解析俄烏衝突最新「認知戰」案例

2023-10-07 09:00 Source: 述策

現代英語:

Reportedly, on September 22 the Ukrainian Air Force used “Storm Shadow” cruise missiles to strike the Black Sea Fleet headquarters in Sevastopol. Since then, the fate of Admiral Viktor Sokolov, commander of the Black Sea Fleet, has been a subject of outside attention. After several days of swirling rumors, on September 25 the Ukrainian Special Operations Forces command (SOF) announced that they had successfully “killed” Admiral Sokolov along with dozens of officers under him, and that even Colonel General Romanchuk, commander of the Russian Zaporizhzhia grouping, had been wounded by the Ukrainian strike.

Unexpectedly, Ukraine’s news was “slapped in the face” less than a day after its release. On September 26, the Russian Ministry of Defense held a board meeting, attended in person or by video link by senior Ministry of Defense leaders, military district commanders, and the commanders of the armed services. At the meeting there appeared none other than Admiral Sokolov, the Black Sea Fleet commander Ukraine had officially declared “killed”. Once the news broke, the Ukrainian side, deeply embarrassed, could only announce that it needed to gather more information. Some then claimed that the Admiral Sokolov attending the meeting was “just a photo”, not the man himself.

Even so, the duel between the Russian and Ukrainian militaries over whether General Sokolov was alive or dead can be regarded as a classic case of cognitive warfare in the Russia-Ukraine military conflict. It is worth dissecting this case in its own right, and worth using it as a point of departure to discuss the broader contest between the two militaries in the cognitive domain.

  1. The “beheading” of General Sokolov: not an isolated case?

This is not the first time the Ukrainian propaganda apparatus has fallen into the trap of “beheading” Russian generals. In mid-June of this year, for example, the Ukrainian side spread the rumor that Delimkhanov, vice president of Chechnya, lieutenant general of the Chechen National Guard, and Kadyrov Jr.'s right-hand man, had been “killed” by a Ukrainian HIMARS rocket strike.

Viewed as a complete cognitive-warfare operation, the process was much the same whether in the “beheading” of Delimkhanov in June or the “beheading” of General Sokolov this time:

Step one: the Ukrainian propaganda apparatus deliberately “creates a topic”. The manufactured topic serves as a “primer” in cognitive warfare, whose function is to ignite heated public discussion. Such a “primer” is often not released officially; it may come from semi-official channels or from channels with close official ties. The claim that Lieutenant General Delimkhanov had been “killed” by Ukrainian HIMARS fire, for example, reportedly first surfaced in a Telegram group of the Ukrainian armed forces, and in deliberately ambiguous terms. The earliest source of the news that Admiral Sokolov had been “beheaded” was traced to a Russian Telegram group. The news in such groups is a mixed bag: some of it has little credibility, but some is surprisingly accurate. In a sense, releasing the cognitive-warfare “primer” through these groups is even more likely to arouse public attention and discussion than official news would be.

The second step can be regarded as “reinforcement” by public opinion. The first-step “primer information” usually lacks the details of a complete news story, but that does not matter. Given how efficiently information spreads in today’s society, a lead that lacks details yet easily arouses interest, and that sits behind an information barrier for one reason or another, will naturally have its “details filled in” during dissemination. This held for Admiral Sokolov’s “killing” just as for Delimkhanov’s “beheading”. The information lacks detail, yet for the media it is precisely the kind of breaking news that matters; at the same time, military secrecy makes it impossible to cross the information barrier and obtain more in time. The inevitable result is “reinforcement” by public opinion: everyone pitches in, and in the retelling a single lead is progressively fleshed out and made to feel “real”. As Delimkhanov’s “beheading” spread, for instance, the “time” and “place” of the strike were supplied one after another (and, because the channels differed, these times and places differed too), locating it sometimes in Kreminna and sometimes in Gorlovka. The same happened with Admiral Sokolov’s “killing”: as the story spread, people conjured up an entire set of details, such as “the Black Sea Fleet held its regular Friday meeting and the Ukrainian army seized the opportunity to strike” and “two missiles, one hitting the headquarters office and the other delivering a follow-up strike as rescuers went in”.
It is precisely through this “decentralized” word of mouth that the “fact” of these two senior generals’ “beheading” was continuously reinforced, and the very absence of a traceable source made it all the easier to shape the cognition of ordinary information consumers.

After several days of fermentation, the cognitive war reaches step three: “the final word”. In Delimkhanov’s case it never fell to Ukrainian officials to pronounce it. Kadyrov Jr. could not sit still: he quickly released a video of himself and Delimkhanov sharing a picnic meal, even pulling out his phone on camera to show the recording time, delivering the “slap in the face” himself. In Admiral Sokolov’s case, the “final word” came from major organs, the Ukrainian Special Operations Forces command and the Ukrainian Ministry of Defense. In the normal course of cognitive warfare, this official “final word” is supposed to complete the process of cognitive shaping, closing the logical loop that runs from releasing the “primer”, through outside “speculation and reinforcement”, to official “stamping and confirmation”. This time, however, the Russian side struck back and staged an on-the-spot “face slap”, which not only rendered all of Ukraine’s preceding information operations useless, but also made clowns of two key Ukrainian organs, the Special Operations Forces command and even the Ministry of Defense.

  2. Cognitive warfare in the Russia-Ukraine conflict: is the Ukrainian army on the offensive?

Many people may be puzzled that Ukraine’s cognitive-warfare propaganda apparatus would personally take the field to spread rumors that Chechen Vice President Delimkhanov or General Sokolov had been “killed”: if the Russian side responds quickly and trots these senior generals out to show their faces and confirm “I am still alive”, won’t the Ukrainian rumors collapse on their own?

That sounds reasonable, yet the Russian army does not issue a clarification every time a rumor is floated. Why not? Because things are not that simple.

First, tactically, if the Russian army arranged for a general to come out and refute every rumor the Ukrainian side created about a senior commander, the most direct consequence would be needless interference with those commanders’ daily conduct of operations. In combat, the time of general officers is precious; their daily workload and mental burden are enormous, and they simply cannot spend endless hours appearing on camera to deny rumors. If every Ukrainian rumor drew a Russian rebuttal, these senior generals would have nothing to do all day but film denial videos.

Second, for information that has entered the cycle of cognitive shaping, “refutation” is in some cases not merely useless but generates still more rumors. Take General Sokolov’s “beheading”: although the Russian army arranged a rebuttal, some Ukrainian groups kept nitpicking, arguing that Sokolov did not move in front of the camera and that the time and place of his appearance could not be verified, so it was merely a Russian “information countermeasure”. Even some Russian generals whom Ukraine “officially announced” as “beheaded” last year and who later appeared in public, such as Major General Gerasimov, chief of staff of the 41st Army, are still claimed dead by Ukrainian supporters, for the sole reason that, having shown up once, the man failed to show up again!

Therefore, in the field of cognitive warfare, many things are not as simple as we think.

That said, at the strategic level there is indeed a wide gap between the Russian and Ukrainian militaries in the cognitive-warfare domain of this conflict. The Ukrainian army is very good at creating topics, is agile and flexible, and emphasizes participation.

In creating topics, Ukraine has taken the initiative in almost every round of cognitive warfare: from the Mariupol theater incident at the start, to the Bucha incident, to the chemical explosion at Rubizhne, and later the Zaporizhzhia Nuclear Power Plant incident and the destruction of the Nova Kakhovka dam. Almost every time, Ukraine strikes the first blow while Russia is left to respond passively; Ukraine then keeps manufacturing new topics and pressing the attack, leaving Russia at a disadvantage.

In agility and flexibility, the Ukrainian army knows the characteristics of modern media intimately. In the cognitive battle over Admiral Sokolov’s “beheading”, for example, it exploited the speed and decentralized mode of modern media dissemination: it released the “primer” in semi-official form, let netizens “reinforce” it (in effect, the self-growth of information), and finally had more authoritative official media step in to deliver “the final word”.

As for “emphasizing participation”, the Ukrainian army excels at giving ordinary netizens a sense of involvement in specific topics. After the Bucha incident and the Battle of Mariupol, for example, Ukraine immediately launched a set of websites called “The Executioner’s Book”, where anyone could log in, via the sites or browser plug-ins, and register alleged “Russian atrocities” or Russian troop movements. The United States responded at once, opening the State Department’s “Observation Post” project alongside “The Executioner’s Book” to record Russian “atrocities” in the conflict. These public topics are set up quite cleverly: outsiders feel a high degree of participation, while the specific content is a mixture of truth and falsehood, which is quite different from the outright disinformation indoctrination of traditional information warfare. In Ukraine’s cognitive warfare, these high-participation projects at one point tipped the sentiment of the entire internet to one side.

Compared with Ukraine’s propaganda and operations, the Russian army lags far behind in the cognitive domain. Against Ukraine’s mode of operations that stresses mass participation and mass experience, Russia still relies on the old method of announcing ambiguous battlefield situations in daily briefings, with only Major General Konashenkov “playing an emotionless reading machine” in front of the camera; as a result, most battlefield observers do not take his briefings seriously. Likewise, against Ukraine’s tactic of constantly setting topics and carrying out cognitive shaping step by step, Russia can only parry. Each time Ukraine creates a topic, Russia responds passively, and Ukraine then builds new topics on Russia’s response, leaving Russia perpetually on the back foot. Seen in this light, the Russian army’s counter to the Ukrainian narrative of Admiral Sokolov’s “beheading”, the sudden rumor-busting “face slap”, was merely a tactical flash of inspiration, and not an especially effective one. It has not fundamentally changed the balance of power between the two militaries in the cognitive domain, and Ukraine’s “cognitive-warfare advantage” over Russia will persist.

  3. How should we respond to cognitive warfare as a new form of combat?

By way of definition, “cognitive warfare” can be traced back to the concept of “network-centric warfare” proposed in the U.S. military reforms at the beginning of this century. After years of development, related discussion began appearing frequently in the remarks of senior NATO generals by 2017. On August 14, 2017, Stewart, director of the U.S. Defense Intelligence Agency, explicitly asserted at the 2017 Department of Defense Intelligence Information System conference that “fifth-generation warfare is cognitive warfare”. On September 17, 2017, Goldfein, then Chief of Staff of the U.S. Air Force, likewise explicitly raised the concept of “cognitive warfare” at the Air Force Association’s annual meeting. NATO soon developed this novel operational concept comprehensively. In March 2020, NATO released the concept document “Operations 2040”, which stated clearly that “information and cognitive warfare” would play an important role in future operations. In June, NATO assigned François du Cluzel, a former French colonel and head of the NATO Innovation Hub, to study cognitive warfare; he produced a detailed report, “Cognitive Warfare”, in January 2021. In June 2021, NATO held a scientific conference on cognitive warfare and released the special report “Cognitive Warfare: Cognition Dominates the Future”, thereby forming a relatively systematic and complete theory of cognitive warfare.

Compared with traditional information warfare and propaganda warfare, the biggest feature of cognitive warfare is that it works from the operating principles of the human brain, that is, the process of acquiring, perceiving, understanding, processing, inferring, evaluating, judging, calculating, and deciding upon external information. In short, cognitive warfare goes a step further in its operational character: rather than one-way indoctrination, it skillfully exploits people’s cognitive processes to comprehensively “reshape” each person’s worldview, values, and ideology, and even their processes of cognition and understanding, thereby rebuilding the individual’s interpretation of and response to information and interfering with their ideology and value orientation. The end result is not merely to disrupt the opponent with assorted disinformation but, more importantly, to reshape the opponent’s social psychology and thereby influence its strategic decision-making, “subduing the enemy without fighting”. This makes cognitive warfare a new generation of propaganda and information warfare: whereas traditional information warfare focuses on the tactical level, cognitive warfare can be elevated to the strategic level, where it may affect the direction or outcome of a war.

So, facing a new form of combat such as cognitive warfare, and with reference to Russia’s sustained disadvantage against Ukraine in this domain during the conflict, how should we respond to the cognitive warfare of the future?

Strategically, we should recognize that in the cognitive domain, purely passive response is the least reliable and least efficient form of combat: Russia’s passive reaction to Ukraine’s agenda-setting has amounted to passively taking a beating. Admittedly, compared with firepower or maneuver warfare, cognitive warfare is a thoroughly “open stratagem”, with little room for feint or deception; it rests mainly on the West’s ideological dominance and media craft, and on the legal standing of Western media as the “fourth estate”. Since we are for now at a disadvantage relative to Western countries in the media domain, mounting cognitive confrontation against them across the whole system and at every level would be relatively difficult, and even if we adopted a strategically offensive posture in cognitive warfare, our execution might not match theirs. One approach may be to hold firmly to our core base at the strategic level, forming a pattern of “you fight your way, I fight mine”.

At the tactical level, we should draw fully on the lessons both sides have learned from cognitive warfare in the Russia-Ukraine conflict. Judging from Ukraine’s strategy, its topic-shaping exploits above all the loopholes of information asymmetry. A possible countermeasure for us, then, is to disclose some information in a timely manner and abandon the old notion that “military operations must be kept secret and never disclosed”. Disclosing information is itself a process of cognitive shaping, and both sides can wage extensive cognitive contests over the timing, content, and sequencing of disclosure. In agenda-setting, for example, we can act preemptively, first seeking the ability to set topics and carry out cognitive shaping among Belt and Road partner countries, BRICS countries, or Shanghai Cooperation Organization countries, at a minimum ensuring that our own people and those of friendly countries can offset the influence of Western cognitive warfare. Likewise, against the other side’s multi-dimensional, multi-level “cognitive warfare” aimed at different groups, or its step-by-step manufactured rumors, we can make broad use of the influence of KOLs (Key Opinion Leaders) and platforms to form a combat pattern of “cognition against cognition”.

In short, cognitive warfare, a new style of operations that NATO has studied for years and that has appeared extensively, to real effect, in the Russia-Ukraine military conflict, deserves further study if we are to remain undefeated in future military operations.

This article on cognitive warfare is only an appetizer; the main course is still to come. Our studio spent about a year completing “Research on Cognitive Warfare in the United States and Other Western Countries”, whose main text runs to more than 40,000 words (not counting more than 7,000 words of notes) and is far richer and more in-depth. The report will be formally released and introduced the day after tomorrow, so stay tuned.

現代國語:

據稱,9月22日烏克蘭空軍使用「風暴陰影」巡航飛彈襲擊塞瓦斯托波爾黑海艦隊司令部。此後,關於黑海艦隊司令維克托.索科洛夫上將的生死一直是外界關注的議題。經過幾天暗流湧動後,9月25日,烏克蘭特戰司令部(SOF)對外宣布,他們成功「擊斃」黑海艦隊司令索科洛夫上將及以下數十名軍官,甚至俄軍札波羅熱集群司令羅曼丘克上將也被烏軍擊傷。

沒想到,烏克蘭的消息剛放出來不到一天就慘遭「打臉」——9月26日,俄羅斯國防部召開國防委員會會議,國防部高級領導人、各軍區司令、各軍兵種司令等以現場出席或視訊連線的方式參會,會上赫然出現了被烏軍官宣「擊斃」的黑海艦隊司令索科洛夫上將。消息放出後,烏方極為尷尬,只好宣布他們要收集更多資訊。但隨後又有人宣稱說參會的索科洛夫上將“只是照片”,不是真人。

儘管如此,從這次俄烏兩軍圍繞索科洛夫上將的生死問題展開的鬥法,可以被看做是俄烏軍事衝突中關於認知戰的一個經典案例,既值得我們就這一案例進行專門剖析,更值得我們從這個案例中“說開去”,談一談俄烏軍事衝突中俄烏兩軍在認知戰領域展開的較量。

一、索科洛夫上將“被斬首”,並非孤例?

關於「斬首」俄軍將領,烏克蘭宣傳部門栽進坑裡已經不是第一次了。例如今年6月中,烏克蘭軍隊造謠稱車臣副總統、車臣國民近衛軍中將、小卡德羅夫左膀右臂德里姆哈諾夫被烏軍海馬火箭砲「擊斃」。

從整個認知戰的實施過程看,無論是6月那次德里姆哈諾夫被“斬首”,還是這次索科洛夫上將被“斬首”,整個過程大同小異:

第一步:烏克蘭宣傳部門有意「製造議題」。所謂“製造議題”,在認知戰中可以被視為一個“引子”,作用是引發輿論熱議。這種「引子」往往不一定由官方放出,可能是由一些半官方的管道或與官方關係比較緊密的管道放出。例如德里姆哈諾夫中將被烏軍的「海馬」火箭炮「打死」一事,最早據說是從烏克蘭武裝部隊的一個電報群組裡放出來的,而且說法模稜兩可。這次索科洛夫上將被“斬首”,最早的消息來源經過追溯則是俄羅斯的某個電報群組。這種電報群組的消息魚龍混雜,有些消息則可信度很低,但有些消息卻出奇準確。將認知戰的「引子」透過這些群組放出,某種意義上說甚至比官方消息更容易引發輿論關注和討論。

第二步可以被視為輿論的「補強」。認知戰的第一步即「引子資訊」對一個完整的新聞來說往往缺乏細節,但不要緊。從當前社會訊息傳播的效率和特徵來看,一個缺乏細節、但容易引起大家興趣、卻又因某種因素出現信息壁壘的“引子信息”,在傳播過程中,大家自然會對其進行“細節補充」。無論是索科洛夫上將被“擊斃”還是德里姆哈諾夫被“斬首”,都是如此。資訊缺乏細節,但對傳媒來說偏偏又是非常重要的「Breaking News」(突發新聞),同時基於軍事機密的因素,想穿越資訊壁壘及時獲取更多的資訊也不可能。因此,這必然會使傳媒導向一個結果——輿論“補強”,而且是大家一起上,在口耳相傳的過程中不斷把一個引子信息細節化、“真實”化。例如德里姆哈諾夫被「斬首」在傳播過程中,先後彌補上了他被斬首的「時間」、「地點」(甚至由於傳播管道不同,這些時間和地點要素也各不相同),時而出現在克雷緬納亞,時而出現在戈爾洛夫卡。索科洛夫上將被「擊斃」同樣如此,訊息在傳播過程中,被先後腦補出一整套「黑海艦隊週五開例會,烏軍抓住機會實施打擊」、「兩發飛彈一發擊中了司令部辦公室,另一發飛彈在救援人員進去救人的時候實施了補充打擊」這種細節。正是在這種「去中心化」的口耳相傳,這兩名高級將領被「斬首」的「事實」被不斷強化,消息來源的缺失甚至更有利對普通信息受眾進行認知塑造。

經過數天發酵之後,認知戰來到第三步-「一錘定音」。德里姆哈諾夫被“斬首”一事並沒有輪到烏克蘭官方“一錘定音”,小卡德羅夫先坐不住了,很快放出了自己和德里姆哈諾夫坐在一起吃飯野餐的視頻,甚至當場拿出手機展示拍攝時間,進行「打臉」。索科洛夫上將則是烏軍特戰司令部、烏克蘭國防部這樣的大部門出面完成「一錘定音」。按照認知戰的過程,這種官方媒體“一錘定音”應該起到在認知戰領域完成認知塑造過程的作用,也就是完成一個從放出“引子”,到外界“猜測補強”,最後官方「蓋章確認」的完整邏輯閉環。但俄軍居然後發製人,表演了一出“當場打臉”,不僅讓烏軍前面的所有信息傳播都變成了無用功,還讓烏克蘭的兩個要害單位特戰司令部甚至國防部變成了小丑。

二、俄烏衝突中的認知戰,烏軍來勢洶洶?

對於烏軍認知戰宣傳部門親自上陣、造謠車臣副總統德里姆哈諾夫或索科洛夫上將被烏軍“擊斃”,很多人可能大惑不解:如果俄羅斯方面迅速反應,把這些高級將領拉出來亮個相,自我確認一下“我還活著”,那麼烏方認知戰的謠言不就不攻自破了嗎?

看似有理,但俄軍並沒有被造謠一次就照上面的辦法澄清一次。為什麼?因為事情沒有那麼簡單。

其一,從戰術角度來說,如果烏軍每製造一個關於高級將領的謠言,俄軍就安排將領出面闢謠,最直接的後果就是對高級將領的日常作戰指揮造成不必要的干擾和麻煩。在作戰行動中,將官以上的高級軍官時間非常寶貴,每天工作量極高,對腦力消耗極大,根本不可能有那麼多時間沒完沒了地現身闢謠。如果烏軍每造一個謠俄軍就闢一個謠,那這些高級將領平時啥也別幹了,就天天拍視頻闢謠吧。

其二,對於進入認知塑造循環的訊息來說,某些情況下,「闢謠」不僅沒用,還會進一步製造出更多謠言。例如索科洛夫上將被“斬首”一事,儘管俄軍已經安排了闢謠,但一些烏克蘭群組依然在“挑刺死磕”,認為索科洛夫上將在鏡頭前沒有動,索科洛夫上將在受訪時看不出時間和地點,因此只是俄方安排的「資訊反制」。甚至對一些去年曾經被烏克蘭方面「官方宣布」「斬首」、但隨後又在某些場合露面的俄軍將官,比如第41集團軍參謀長格拉西莫夫少將,目前依然有烏克蘭支持者宣稱“此人已死”,唯一的原因就是這人在露面之後居然沒有再度露面!

所以,在認知戰領域,很多事沒有想的那麼簡單。

話又說回來,在戰略層面上,俄軍在俄烏軍事衝突的認知戰領域相比烏軍確實存在較大差距。烏軍非常善於製造議題,而且機動靈活,注重參與。

製造議題方面,幾乎每次認知戰都是烏克蘭主動製造議題,從一開始的馬裡烏波爾大劇院事件,到後來的布查事件,再到魯別日諾耶的化學物質爆炸事件,還有後來的札波羅熱核電廠事件和新卡霍夫卡水力發電廠大壩爆破事件,幾乎每次都是烏克蘭「先聲奪人」。俄羅斯則處於被動應對的狀態,結果被烏克蘭繼續製造議題連續攻訐,處於不利地位。

機動彈性方面,烏軍對現代傳媒的傳播特徵非常熟稔,例如對索科洛夫上將被「斬首」展開的認知戰,烏軍利用了現代傳媒傳播速度快、傳播模式去中心化的特點,以半官方形式放出“引子”,放任網民對其進行“補強”(事實上就是信息的自生長),最後再由比較權威的官方媒體下場“一錘定音”。

「注重參與」方面,烏軍更善於營造普通網民對特定議題的參與感。例如布查事件和馬裡烏波爾戰役之後,烏克蘭方面立即上線了一套名叫「劊子手之書」的網站,任何人都可以隨意登陸這些網站或者網絡插件,在裡面登記所謂的「俄羅斯軍隊的暴行」或俄軍動向。美國立即回應,針對「劊子手之書」項目,開通了屬於美國國務院的「觀察站」項目,從而用於記錄俄軍在衝突中的「暴行」。這些公共議題設定相當巧妙,令外界群眾感受到的參與度極高,而在具體內容上則真真假假,不同於傳統資訊戰的假訊息灌輸。在烏克蘭軍隊的認知戰中,這些參與度極高的計畫一度讓整個網路風向呈現一面倒的趨勢。

和烏軍在認知戰領域的宣傳和操作相比,俄軍在認知戰領域差太遠。例如針對烏克蘭方面在認知領域塑造極為強調群眾參與、群眾體驗的作戰模式,俄羅斯方面依然在沿用著老辦法——以每日戰情通報的形式對外公佈模棱兩可的戰況,只有一個科納申科少將在鏡頭前“當一個沒有感情的讀稿機器”,結果絕大多數戰況觀察者都不太把他的戰況通報當回事。又如對烏克蘭方面頻繁設置議題、依照步驟進行認知塑造的戰法,俄羅斯方面更是只有招架之功。每次都是烏克蘭製造議題,俄羅斯方面被動應對,然後烏克蘭方面根據俄羅斯的應對情況繼續製造新的議題,使俄羅斯處於疲於奔命的狀態。從這個角度來看,俄軍此次針對索科洛夫上將被“斬首”的烏方認知戰塑造,突然安排闢謠“打臉”,只是戰術上“靈光乍現”而已,而且效果似乎也沒多好,也沒有徹底改變目前俄烏兩軍在認知戰領域的基本力量對比,烏克蘭對俄羅斯的「認知戰優勢」還會持續下去。

三、新型作戰形式認知戰,我們如何因應?

從定義上來說,「認知戰」最早可追溯到本世紀初美軍在軍事改革中提出的「網路中心戰」概念,經過多年的發展,到2017年,相關論述開始頻頻見於北約高級將領的言論集中,譬如2017年8月14日,美國國防情報局局長史都華在國防情報資訊系統2017年會上就明確提出了「第五代戰爭是認知戰」這一論點。到了2017年9月17日,時任美國空軍參謀長戈德費恩在美國空軍協會年會上同樣明確地提出了「認知戰」這個概念。很快,北約就對這個新穎的作戰概念進行了全面發展。 2020年3月,北約發布《作戰2040》概念書,明確提出「資訊與認知戰」將在未來作戰中扮演重要角色。 6月,北約又指派前法軍上校、北約創新中心負責人弗朗索瓦.杜.克魯澤專題研究認知戰,並在2021年1月拿出了《認知戰》的詳細報告書。 2021年6月,北約召開了認知戰科學會議,並發布了《認知戰:認知主導未來》專題報告,從而形成了較有系統、完整的認知戰作戰理論。

相較於傳統的資訊戰和宣傳戰,認知戰的最大特徵是基於人的大腦運作原理,也就是對外在資訊的獲取、感知、理解、加工、推論、評估、判斷、計算、決策的過程。總之,認知戰在作戰特質上可謂更勝一籌,不是進行單方面灌輸,而是要善於利用人們的認知過程,對每個人的世界觀、價值觀、意識形態,甚至認知、理解過程進行全方位“重塑”,從而全面重建個人對訊息的解讀和反應,干涉個人的意識形態和價值取向,最終的結果不僅是要利用各種假訊息擾亂對手,更重要的是重塑對手的社會心理,從而對對手的戰略決策產生影響,「不戰而屈人之兵」。這決定了認知戰是新一代的宣傳戰和資訊戰,相對於傳統的專注於戰術層面上的資訊戰,認知戰的角色可以進一步提高到戰略層面上,可能會影響戰爭的走向或結局。

那麼,面對認知戰這樣一種新型的作戰形式,參考俄烏軍事衝突中俄羅斯在和烏克蘭的認知戰領域長期處於下風的現狀,我們對未來的認知戰究竟該如何應對呢?

從戰略角度來看,我們應當認識到,在認知戰領域,單純的被動應對是最不可靠、效率最低的作戰形式,如俄羅斯在烏克蘭的議題設定面前被動應對等於被動挨打。當然,認知戰相比於火力戰戰或機動戰,是徹底的“陽謀”,並沒有太多佯動和詭詐,依託的主要是西方的意識形態優勢地位和傳媒功力,靠的是西方媒體「第四權」領域的法權。由於我方相比西方國家在傳媒領域暫時處於下風,要在全系統、全層面上與其實施認知對抗作戰其實是一件相對困難的事情。因此,即使我們在認知戰上同樣採取戰略進攻的策略,其實施效果可能也不如對手。辦法之一或許是從策略上牢牢把握住我們的基本盤,形成「你打你的,我打我的」格局。

而在戰術領域,要充分借鏡俄烏衝突中雙方在認知作戰上的經驗教訓。從烏克蘭軍隊實施的認知戰策略來看,在議題塑造等領域,主要鑽了資訊不對稱的空子。那麼,我方可能的因應策略是及時公開部分訊息,要改變原有的「軍事行動必須保密、不要對外界公開」的觀念。事實上,公開資訊本身就是認知塑造的過程,雙方圍繞著資訊公開的時機、內容、時序上,可以展開大量的認知戰對抗。如在設置議題領域,我方可以“先發製人”,先求得在諸如一帶一路沿線國家、金磚國家或上海合作組織國家內設置議題、展開認知塑造的能力,起碼確保本國群眾和一些友好國家能夠對沖西方認知戰領域的影響。再如,針對對方在多維度、多層次上針對不同人群實施的「認知戰」或步步為營塑造出的謠言,我方可廣泛利用KOL(Key Opinion Leader,即意見領袖)及平台的影響力,形成以「認知對認知」的作戰模式。

總之,認知戰作為一種被北約研究多年、在俄烏軍事衝突中大量出現且起到一定作用的新型作戰樣式,值得進一步進行研究,以確保在未來的軍事行動中立於不敗之地。

這篇關於認知戰的文章只是“開胃菜”,隨後還有“正餐硬菜”——本工作室歷時約一年完成了《美國等西方國家的認知作戰研究》,正文4萬多字(不含註7千多字),要豐富和深入得多。該報告將於後天正式推出並進行介紹,敬請關注。

Original Chinese military source: https://www.163.com/dy/article/IGEFT5CB0515NAKC888.html

Chinese Intelligent Warfare is Accelerating and Advancing

中國智能化戰爭正在加速推進

中國軍網 國防部網 (China Military Online / Ministry of National Defense website). Thursday, March 17, 2022

現代英語:

With the widespread application of artificial intelligence in the military field, intelligent warfare has gradually become a hot topic. History has repeatedly proved that the evolution of forms of war brings profound changes in the mechanisms of victory. In today’s era, when information warfare is deepening and intelligent warfare is beginning to emerge, the militaries of the world’s major countries are pushing hard to promote military intelligentization, and many of these trends deserve attention.

Strengthen top-level design

Outlining a “roadmap” for intelligent warfare

Driven by a new round of scientific and technological revolution and industrial revolution, intelligent military transformation is developing in depth. The United States, Russia, Japan and other countries have regarded artificial intelligence as a disruptive technology that “changes the rules of the war game” and have made early arrangements, strengthened top-level design and planning guidance, and explored the direction of military application of artificial intelligence.

The U.S. military has detailed the current status and development plans for artificial intelligence in documents such as “Preparing for the Future of Artificial Intelligence”, “National Artificial Intelligence Research and Development Strategic Plan”, “Artificial Intelligence and National Security”, “Unmanned Systems Integrated Roadmap, Fiscal Years 2017-2042”, and “American Artificial Intelligence Initiative: First Annual Report”, and has elevated the development of artificial intelligence to the level of national strategy. In 2021, in “U.S. Department of Defense Artificial Intelligence Posture: Assessment and Improvement Recommendations”, it noted three guiding questions for developing artificial intelligence: what is the state of the art in artificial intelligence relevant to the military; where does the U.S. military currently stand in artificial intelligence; and what internal actions, and potential legislative or regulatory actions, might enhance the U.S. military’s advantage in artificial intelligence.

Russia has invested substantial resources to maintain parity with the United States in the competition over military applications of artificial intelligence. In 2021, Russian President Vladimir Putin stated at the year’s first Defense Ministry meeting that artificial intelligence would greatly accelerate change in the military field, and that the Russian Federation Armed Forces must speed up research and development of AI application technologies such as robots, intelligent individual-soldier systems, and intelligent weapon modules, so as to build core technical capabilities and battlefield advantages as soon as possible. Documents such as “Special Outline for the Research and Development of Future Military Robot Technology and Equipment before 2025,” “Future Russian Military Robot Application Concept,” and “The Development Status and Application Prospects of Artificial Intelligence in the Military Field” have established, at the national level, a series of mechanisms for the Russian military to promote military applications of artificial intelligence.

The Japanese government has also issued an “Artificial Intelligence Strategy” to guide AI research and development and industrial growth. The United Kingdom’s “Robotics and Artificial Intelligence” strategic plan emphasizes the application of artificial intelligence in battlefield development. In January 2021, the Australian Department of Defence released “Fighting the Artificial Intelligence War: Operational Concepts for Future Intelligent Warfare,” which explores how to apply artificial intelligence to land, sea, and air combat.

Innovative combat concepts

Promoting the “Thinking First” Approach to Intelligent Warfare

The innovation of operational concepts has an ideological driving effect on the development of military science and technology and the evolution of war forms. In the past, people’s understanding and grasp of war mainly came from the summary of practical experience, and operational concepts were empirical concepts. In the future era of intelligent warfare, operational concepts are not only empirical concepts, but also the conception, design and foresight of operations.

The U.S. Army has proposed the concept of “multi-domain operations,” which requires deep integration and close coordination of combat capabilities across the land, sea, air, space, electromagnetic, and cyber domains. To this end, the U.S. Army has successively issued white papers such as “Multi-Domain Battle: Evolution of Combined Arms for the 21st Century (2025-2040),” “The U.S. Army in Multi-Domain Operations (2028),” and “Using Robotics and Autonomous Technologies to Support Multi-Domain Operations.” In March 2021, the U.S. Department of the Army issued “Army Multi-Domain Transformation: Ready to Win in Competition and Conflict,” indicating that multi-domain operations have become the “flag” leading the Army’s transformation. The Defense Advanced Research Projects Agency has proposed the concept of “mosaic warfare,” which aims to create a highly decentralized, highly adaptable “kill web” composed of different combat functional units, built on advanced computing and networking technology. The U.S. Department of Defense strongly supports the concept of “joint all-domain operations”; in March 2020, the U.S. Air Force took the lead in writing it into doctrine, exploring how the Air Force can contribute to joint all-domain operations.

The Russian military has proposed the concept of “command-and-control disintegration.” “Disintegration” is one of Russia’s most important operational concepts at present: Russian electronic warfare forces set the goal of rendering the enemy’s information, command-and-control, electronic warfare, and robotic systems ineffective, and believe this goal will “determine the fate of all military operations.” Disrupting the command and control of enemy forces and weapon systems, and reducing the efficiency of enemy reconnaissance and weapons employment, are the primary tasks of electronic warfare. The Russian military is currently considering forming 12 types of electronic warfare forces. It has also proposed the concept of a “non-nuclear deterrence system,” whose core is the use of non-nuclear offensive strategic weapons to deter opponents; the non-nuclear offensive strategic weapons it defines include all ballistic missiles equipped with non-nuclear warheads, as well as strategic bombers and long-range air-launched and sea-launched cruise missiles. In addition, the Russian military has proposed the concept of “hybrid warfare,” hoping to use artificial intelligence systems to seek battlefield information advantage.

The British Ministry of Defense has proposed the concept of “multi-domain integration” and will develop a new command and control system with intelligent capabilities to achieve comprehensive, persistent, accurate and rapid battlefield perception and force coordination.

Focus on technology research and development

Shaping the Intelligent Warfare Operational Model

The key to the effectiveness of artificial intelligence is the combination with other technologies, which is also described as the “AI stack”. Various technologies interact to produce a combined effect, thereby enhancing the capabilities and effects of each technology. In the intelligent warfare supported by artificial intelligence technology, the collaborative combat mode of “man-machine integration, cloud brain control”, the cluster combat mode of “mixed formation, group intelligence”, and the cognitive combat mode of “intelligence-led, attacking with intelligence first” will constantly update people’s understanding of war.

Focus on the research and development of innovative projects. The U.S. military is vigorously promoting the application of AI chips in existing weapons and equipment, adding an “intelligent brain” to weapons so that they gain human-like reasoning and autonomous interaction capabilities. In October 2021, the U.S. Navy launched “Project Overmatch,” regarded as its “current highest priority.” It aims to accelerate the delivery of artificial intelligence and machine learning tools by building a military Internet of Things for maritime operations, integrating manned and unmanned joint formations, supporting a new intelligent naval architecture, enhancing massed fires, and realizing intelligent distributed naval operations. In addition, the Defense Advanced Research Projects Agency has carried out cognitive electronic warfare projects such as “Behavioral Learning for Adaptive Electronic Warfare,” “Adaptive Radar Countermeasures,” and “Communications under Extreme RF Spectrum Conditions,” and has developed a prototype cognitive radar electronic warfare system. The Russian Ministry of Defense’s research and testing center for intelligent technology and equipment has cooperated with the Institute of Control Sciences of the Russian Academy of Sciences to develop and test autonomous intelligent algorithms, including drone swarm command and control, and has jointly developed with the State Research Institute of Aviation Systems an automatic object recognition software system based on neural network principles.

Establish innovative R&D institutions. The continuous emergence of new technologies is an inexhaustible driving force for military intelligentization, and high-level development in this area cannot be separated from the technical research of dedicated institutions. Some countries and militaries have established R&D centers focused on innovation at the technical level. The U.S. Department of Defense established the Joint Artificial Intelligence Center, planned to become a national key laboratory leading hundreds of AI-related projects and ensuring efficient use of AI-related data and information, so as to maintain the United States’ technological advantage in this field. Russia has established an artificial intelligence and big data consortium, a national artificial intelligence center, and a robotics research and testing center under the Ministry of Defense, conducting mainly theoretical and applied research in artificial intelligence and information technology. France has established an innovative defense laboratory, the United Kingdom has set up an artificial intelligence laboratory, and India has established an artificial intelligence task force to explore related technologies.

Strengthen equipment development and fielding. In recent years, many countries have attached great importance to developing intelligent weapons and equipment, and unmanned aerial vehicles, unmanned combat vehicles, unmanned surface vessels, and unmanned underwater vehicles have continued to emerge. The U.S. Air Force has begun to practice the operational concept of “man-machine teaming with a human in the loop” on the F-35 fighter. The U.S. XQ-58A “Valkyrie” stealth drone previously flew manned-unmanned teaming missions mainly with F-35 and F-22 fighters; in April 2021 it successfully released the ALTIUS-600 small drone system, further enhancing its manned-unmanned teaming capability. Russia is developing and fielding intelligent equipment in fields including reconnaissance and surveillance, command and decision-making, fires, and combat support, and plans to raise the share of unmanned combat systems in its weapons and equipment to more than 30% by 2025. Russian unmanned ground combat systems, represented by the “Uran” series and the “Platform-M” and “Argo” models, are developing rapidly. Among them, the “Nerekhta” unmanned combat vehicle can mount remote-controlled machine guns and rocket launchers and, in addition to the firepower of an ordinary armored vehicle, also performs transport and reconnaissance functions. The Japan Self-Defense Forces plan to formally field a combat-capable unmanned air formation in 2035.

(Author’s unit: National University of Defense Technology)

國語中文:

■賈珍珍 丁 寧 陳方舟

隨著人工智慧在軍事領域的廣泛應用,智慧化戰爭逐漸成為備受矚目的焦點話題。歷史多次證明,戰爭形態的演進將引發致勝機理的深刻改變。在資訊化戰爭向縱深發展、智慧化戰爭初露端倪的當今時代,世界主要國家軍隊紛紛下大力推動軍事智慧化,其中的諸多動向值得關注。

加強頂層設計

勾勒智能化戰爭“路線圖”

在新一輪科技革命與產業革命推動下,智慧化軍事變革正向縱深發展。美國、俄羅斯、日本等國紛紛把人工智慧視為「改變戰爭遊戲規則」的顛覆性技術,並事先佈局,加強頂層設計和規劃引領,探索人工智慧的軍事應用方向。

美軍在《為人工智慧的未來做好準備》《國家人工智慧研究與發展戰略計畫》《人工智慧與國家安全》《2017至2042財年無人系統綜合路線圖》《美國人工智慧計畫:首個年度報告》等文件中,詳述了人工智慧的發展現狀和發展規劃,並將人工智慧發展提升至國家戰略層面。2021年,美軍在發布的《美國防部人工智慧態勢:評估與改進建議》中指出,美軍發展人工智慧應考慮三個指導性問題:與美軍相關的人工智慧現處於何種狀態;美軍目前在人工智慧方面的態勢如何;哪些內部行動以及潛在的立法或監管行動可能會增強美軍的人工智慧優勢。

俄羅斯投入大量資源,以維持與美國在人工智慧軍事領域應用競爭的平衡。 2021年,俄總統普丁在年度首場國防部會議上表示,人工智慧將大幅推動軍事領域變革,俄國聯邦武裝力量要加速機器人、智慧單兵系統和武器智慧化模組等人工智慧應用技術的研發工作,早日形成核心技術能力和戰場競爭優勢。 《2025年前未來軍用機器人技術裝備研發專題綱要》《未來俄軍用機器人應用構想》《人工智慧在軍事領域的發展現狀以及應用前景》等文件,從國家層面為俄軍推動人工智慧軍事應用確立了一系列機制。

日本政府也推出了《人工智慧戰略》,旨在引領人工智慧技術研發和產業發展。在英國制定的《機器人與人工智慧》戰略規劃中,強調了人工智慧在戰場建設中的應用。 2021年1月,澳洲國防部發布《打好人工智慧戰爭:未來智慧化戰爭之作戰構想》,這份文件探討如何將人工智慧應用到陸、海、空作戰領域。

創新作戰概念

推動智慧化戰爭“思想先行”

作戰概念創新對軍事科技發展、戰爭形態演變具有思想牽引作用。過去人們對戰爭的認識與掌握,主要源自於對實踐經驗的歸納總結,作戰概念即經驗概念。未來智慧化戰爭時代,作戰概念不僅是經驗概念,更是對作戰的構想、設計與前瞻。

美陸軍提出「多域戰」概念,要求陸、海、空、天、電磁、網路等各域作戰能力深度整合與密切協同。為此,美陸軍先後發布《多域戰:21世紀合成兵種的發展(2025至2040)》《美國陸軍多域戰(2028)》《運用機器人與自主技術支援多域戰》等白皮書。 2021年3月,美陸軍部發布文件《陸軍多域轉型:準備在競爭和衝突中取勝》,顯示「多域戰」已成為引領美陸軍轉型發展的一面「旗幟」。美國防高級研究計畫局提出「馬賽克戰」概念,旨在打造一種由不同作戰功能單元構成的、以先進電腦技術與網路技術為基礎的、高度分散、具有高度適應性的「殺傷網」。美國防部大力支持「聯合全域作戰」概念。 2020年3月,美空軍率先將「聯合全域作戰」寫入條令,探討空軍如何在「聯合全域作戰」中發揮作用。

俄軍提出「指控瓦解」概念。 「瓦解」是當前俄羅斯最重要的作戰概念之一,俄軍電子戰部隊把使敵人的訊息、指控、電子戰和機器人系統失效作為目標,認為這一目標將「決定所有軍事行動的命運」。擾亂敵方部隊和武器系統的指揮和控制,降低敵方偵察和使用武器的效率,是進行電子戰的首要任務。目前,俄軍正在考慮組建12種類型的電子戰部隊。俄軍也提出「非核武遏制體系」概念,核心是使用非核武進攻性戰略武器來遏制對手。其所定義的非核武攻擊性戰略武器既包括所有裝備非核彈頭的彈道飛彈,也包括戰略轟炸機和遠程空基、海基巡航飛彈。此外,俄軍也提出「混合戰爭」概念,希望利用人工智慧系統謀求戰場資訊優勢。

英國防部提出「多域融合」概念,將發展具備智慧化能力的新型指控系統,以實現全面、持久、準確、快速的戰場感知與力量協同。

注重技術研發

塑造智慧化戰爭作戰模式

人工智慧發揮效用的關鍵是與其他多種技術的組合,這種組合也被描述為「人工智慧堆疊」。各種技術透過互動的方式產生組合效應,進而提升每項技術所產生的能力與效果。在人工智慧技術支援的智慧化戰爭中,「人機一體、雲腦控制」的協同作戰模式,「混搭編組、群體智慧」的集群作戰模式,「智慧主導、攻智為上」的認知作戰模式等,將不斷更新人們對戰爭的認知。

聚焦創新專案研發。美軍正在大力推廣人工智慧晶片在現有武器裝備系統中的應用,為武器加上“智慧大腦”,使之具備類人思考和自主互動能力。 2021年10月,美海軍推出被視為“當前最高優先事項”的“超越計劃”,旨在通過構建海上作戰軍事物聯網,整合有人無人聯合編隊,加速交付人工智能和機器學習工具,支撐全新的智慧化海軍架構,提升大規模火力殺傷、實現海軍智慧化分散式作戰。此外,美國防高級研究計畫局也進行了「自適應電子戰行為學習」「自適應雷達對抗」「極端射頻頻譜條件下的通訊」等認知電子戰項目,研發出認知雷達電子戰系統原型機。俄國防部智慧技術裝備科研試驗中心與俄聯邦科學院控制問題研究所合作,開發測試了包括無人機群指揮控制在內的自主智慧演算法,也與國家航空系統科研所共同開發基於神經網路原理的物體自動辨識軟體系統。

組成創新研發機構。新技術的不斷湧現是軍事智慧化蓬勃發展的不竭動力,高水準的軍事智慧化建設離不開專職機構的技術研發。一些國家和軍隊組成研發中心,注重從技術層面創新發展。美國國防部建立了聯合人工智慧中心,計劃將該中心打造成國家級重點實驗室,用於領導數百個與人工智慧相關的項目,確保對人工智慧相關數據資訊的高效利用,以保持美國在該領域的技術優勢。俄羅斯組成了人工智慧和大數據聯盟、國家人工智慧中心和隸屬國防部的機器人技術科研試驗中心,主要進行人工智慧和資訊科技領域的理論和應用研究。法國成立了創新國防實驗室,英國設立了人工智慧實驗室,印度組成了人工智慧特別工作小組,進行相關技術探索。

加強裝備研發列裝。近年來,多國重視研發智慧武器裝備,無人飛行器、無人戰車、無人艦艇、無人潛航器等不斷湧現。目前,美空軍已開始在F-35戰機上實踐「人機協同,人在迴路」的作戰理念。美XQ-58A「女武神」隱身無人機先前主要與F-35和F-22戰機進行人機協同作戰,2021年4月該隱身無人機成功投放ALTIUS-600小型無人機系統,進一步提升了其有人無人協同作戰能力。俄羅斯正聚焦偵察監視、指揮決策、火力打擊、作戰支援等多個領域,展開智慧裝備研發和列裝工作,計畫到2025年將無人作戰系統在武器裝備中的比例提高到30%以上。以“天王星”系列和“平台-M”“阿爾戈”等型號為代表的俄地面無人作戰武器發展迅速。其中,Nerekhta無人戰車可搭載遙控機槍和火箭發射器,除擁有一般裝甲車的戰鬥力外,還兼具運輸和偵察功能。此外,日本自衛隊計劃在2035年正式部署具有較強作戰能力的無人空中編隊。

(作者單位:國防科技大學)

中國軍事資料來源:https://www.81.cn/jfjbmap/content/2022-03/17/content_311555.htm

How Chinese Military Will Achieve Precise Strikes in Cognitive Domain Operations

中國軍隊如何在認知域作戰中實現精準打擊

現代英語:

How to achieve precise strikes in cognitive domain operations

■ Bu Jiang, Jiang Rilie

Introduction

Currently, driven by intelligent technology, cognitive domain operations are showing new characteristics such as precise perception, precise prediction, and precise calculation. Studying and grasping the underlying mechanism of precision strikes in cognitive domain operations, so as to ensure clear operational targets, personalized information generation, and precise information delivery, will help seize the commanding heights and the initiative in future cognitive domain operations.

Accurately establish combat goals

The establishment of operational goals is often the primary concern in cognitive domain operations. With the continuing application of artificial intelligence, big data, and other technologies, the side with a technological advantage can quickly and efficiently collect cognitive data of different dimensions, levels, and modalities, and thereby discover the weak points, sensitive points, and flash points of the opponent’s cognitive system.

Massive “data sources” refine target clarity. Today, as the Internet becomes ever more pervasive, cognitive data is growing exponentially. With the support of big data, psychometrics, and other technologies, target profiling is evolving rapidly toward precise, intelligent portraits. According to foreign statistics, as of July 2022 the global Internet penetration rate had reached 69%, and the Internet has become an essential platform of users’ daily lives. With its help, both combatants can widely and rapidly collect a target’s cognitive data and sense the cognitive situation, providing support for analyzing the target’s political beliefs, values, national sentiments, and public opinion positions. It is reported that in foreign elections in recent years, data analytics firms have scraped social media user data, built character analysis models, precisely profiled voters’ personalities and cognitive characteristics, and on that basis pushed suggestive campaign advertising to swing voters, thereby influencing their electoral decisions.

Dynamic “tag pool” improves target recognition rate. Labeling usually refers to the abstract classification and generalization of certain characteristics of a specific group or object. In cognitive domain operations, labeling is an important process for classifying and visualizing cognitive data. Faced with massive user data, establishing a mature and reliable label system is the prerequisite for sorting, analyzing, and making good use of cognitive data; using the label system to filter out useless data and mine potentially valuable information can provide a direct frame of reference for presetting cognitive domain combat scenarios. The development of the label system should start from the logic of cognitive domain operations and ultimately serve their application. For a target, shifts of interest, changes of personality, and swings of emotion are dynamic and occur in real time; establishing a “tag pool” makes it possible to sense the target’s cognitive dynamics in real time and precisely improve the target recognition rate.
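
A minimal sketch of the tag-pool idea described above can make the mechanism concrete: each tag is an abstract classification rule, and the pool filters useless records out of a mass of user data. All tag names, record fields, and thresholds here are illustrative assumptions, not details from the source article.

```python
# Hypothetical "tag pool": tag name -> classification rule over a user record.
TAG_RULES = {
    "politically_active": lambda u: u.get("political_posts", 0) > 5,
    "high_engagement":    lambda u: u.get("daily_minutes", 0) > 120,
    "swing_views":        lambda u: 0.4 <= u.get("stance_score", 0.5) <= 0.6,
}

def tag_user(user: dict) -> set[str]:
    """Return every tag whose rule this user record satisfies."""
    return {tag for tag, rule in TAG_RULES.items() if rule(user)}

def filter_pool(users: list[dict], required: set[str]) -> list[dict]:
    """Keep only users carrying all required tags; the rest is filtered out."""
    return [u for u in users if required <= tag_user(u)]

users = [
    {"id": 1, "political_posts": 9, "daily_minutes": 200, "stance_score": 0.50},
    {"id": 2, "political_posts": 0, "daily_minutes": 30,  "stance_score": 0.90},
]
print([u["id"] for u in filter_pool(users, {"politically_active", "swing_views"})])  # [1]
```

Because the rules are data rather than code paths, the pool can be updated dynamically as a target’s interests and emotions shift, which is the point the paragraph above stresses.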

Intelligent “algorithm library” reveals target relevance. If data is the “fuel” of cognitive domain operations, algorithms are the “engine” and an important source of power for precise cognitive strikes. In a certain sense, cognitive domain operations are a “confrontation of data and algorithms.” Intelligent algorithms can deeply mine multi-dimensional data correlated with a target’s behavior and build a precise target portrait; combined with machine-learning prediction models, they can automatically match cognitive information to the target and deliver it at the right time, in the right place, and in the right way, so as to change the target’s cognition. As some foreign research institutions have found, with 10 likes an algorithm can know you better than your colleagues; with 150 likes, better than your parents; with 300 likes, better than your closest partner.
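
The likes-based profiling quoted above can be illustrated with a toy sketch, under stated assumptions: each liked item carries a small trait-weight vector, a target’s profile is the average over their likes, and a message is matched to the target by a simple dot product. The item names, trait weights, and scoring are all invented for illustration; a real system would learn these weights from data.

```python
# Assumed per-item trait weights (fabricated for illustration).
LIKE_TRAITS = {
    "page_a": {"openness": 0.9, "extraversion": 0.1},
    "page_b": {"openness": 0.7, "extraversion": 0.3},
    "page_c": {"openness": 0.2, "extraversion": 0.8},
}

def profile_from_likes(likes: list[str]) -> dict[str, float]:
    """Average the trait vectors of liked items into one target profile."""
    acc: dict[str, float] = {}
    for like in likes:
        for trait, w in LIKE_TRAITS.get(like, {}).items():
            acc[trait] = acc.get(trait, 0.0) + w
    n = max(len(likes), 1)
    return {t: v / n for t, v in acc.items()}

def match_score(profile: dict, message_traits: dict) -> float:
    """Dot product: how well a piece of information fits the target profile."""
    return sum(profile.get(t, 0.0) * w for t, w in message_traits.items())

p = profile_from_likes(["page_a", "page_b"])
print(round(match_score(p, {"openness": 1.0}), 2))  # 0.8
```

With more likes the averaged profile stabilizes, which is the intuition behind the escalating 10 / 150 / 300 comparison in the paragraph above.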

Accurately generate information “ammunition”

Designing information “ammunition” that conforms to the target’s thinking habits and perception style is the key to improving the cognitive domain killing effect. The development and application of intelligent science and technology provides a convenient means to achieve “private customization” of cognitive information themes, content and forms, making it possible to instantly and forcibly change the decisions and actions of target objects.

Information theme planning based on target value orientation. The theme of cognitive information is the central idea the information expresses and the core of its content. From legal persuasion, military deterrence, sowing discord, and emotional appeals, to moral guidance, war mobilization, behavioral instruction, and motivational incentives, different information themes exert different kinds of influence. Practice shows that the theme of cognitive information must be planned tightly around the target; only by optimizing the theme in good time, according to the different value orientations shown at different combat stages and by different targets, can the information “ammunition” satisfy the target’s needs to the greatest extent. According to analyses by foreign research institutions, campaign advertising in recent foreign elections has typically relied on big data: precisely designing different advertising themes for voters with different values can strike a chord with those voters.

Information content design based on target mindset. In the Internet era, a target’s life trajectory, geographic location, hobbies, and social relationships are all recorded online, making it possible to precisely build an “information cocoon” that caters to the target’s way of thinking. Driven by big data technology, a target’s interaction traces in the virtual world can easily be captured, perceived, and computed. With the assistance of intelligent multimedia content generation systems, information that matches the target’s thinking habits can be generated in batches, trapping the target in an “information cocoon”: the range of information accepted narrows, perception of the outside world gradually diminishes, and the target falls into a cognitive trap. Many of the “color revolutions” that have occurred around the world in recent years have depended on such cognitive control. Some Western countries use “deepfake” technology to instill in targets false information that fits their way of thinking, creating an anti-intellectual, stupefying information environment that forms cognitive biases and induces people to deny their own national and cultural values, thereby generating anti-government sentiment.

Information form selection based on target perceptual characteristics. Psychology holds that the formation and change of a cognitive subject’s attitude passes through three stages: attention, understanding, and acceptance. Whether a target can be influenced by disseminated information depends first on attracting its attention. Information form is an important carrier for attracting the target’s attention, and its design is crucial to improving the acceptance, reach, and infectiousness of information “ammunition.” Big data technology can mine a target’s national sentiments, customs, religious beliefs, personal preferences, and other characteristics, and scientifically judge perceptual traits such as information-receiving habits; on this basis, text, speech, video, images, and other carriers can be combined, with elements such as color and layout woven in, to stimulate the target’s senses strongly. Since 2011, some Syrian anti-war activists have produced a number of anti-war short films from the perspectives of children and women and spread them on the international Internet, arousing a strong response in international public opinion. Such internationally accepted information carriers meet the aesthetic needs of the public, avoid divergent interpretation by audiences, and can often achieve unexpected results.

Accurately deliver information

Cognitive information delivery follows the laws of information dissemination. To achieve a precise cognitive strike, the questions of delivery targets, delivery channels, and delivery timing must all be handled well.

Extract cognitive features and screen information delivery targets. Profiling technology supported by big data makes it possible to extract a target’s cognitive features. Through a cognitive feature library, objects with similar features can be screened out of groups of different races, parties, and occupations, upgrading the traditional extensive screening approach so that information “ammunition” is matched more closely to the target, improving the pertinence and precision of cognitive attacks. In recent years, Cambridge Analytica used machine-learning methods to classify Facebook users by five personality traits (openness to experience, conscientiousness, extraversion, agreeableness, and emotional instability) and built linear regression models of the five traits, setting up a “target” for the precise delivery of campaign advertising. The lessons of this episode are many. Future cognitive domain operations, built on the broad collection of users’ cognitive features, will place still more emphasis on precisely segmenting groups and, according to the differences in values and behavioral habits between groups, conducting targeted information delivery and behavior prediction.
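
The per-trait linear regression attributed to Cambridge Analytica above can be sketched minimally: fit one regression per trait from a behavioral feature to a surveyed trait score, then score new users. The training numbers below are fabricated for illustration; only the closed-form least-squares fit itself is standard.

```python
def fit_univariate(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = a*x + b (one feature, closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# Toy training set: feature = count of liked political pages, label = a
# surveyed "openness" score. In practice there is one model per trait.
likes = [0.0, 2.0, 4.0, 6.0]
openness = [0.1, 0.3, 0.5, 0.7]
a, b = fit_univariate(likes, openness)

def predict(x: float) -> float:
    """Predicted trait score for a new user's feature value."""
    return a * x + b

print(round(predict(5.0), 2))  # 0.6
```

A real pipeline would use many features and regularized multivariate regression, but the structure is the same: one fitted model per trait, and the predicted trait vector becomes the “target” for ad delivery.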

Follow social habits and match information delivery channels. The deep penetration of the Internet has transformed the way information spreads, and the ways people receive information are increasingly varied. According to foreign research data, there are currently more than 4.62 billion social media users worldwide, and social media platforms have become the main battlefield of the cognitive domain. In the many “color revolutions” of recent years, social media such as Facebook, Twitter, and YouTube, under the manipulation of Western countries, played an important role in spreading opinion, organizing protests, and mobilizing the public. Reportedly, in such operations Facebook is often used to set the schedule, Twitter to coordinate action, and YouTube to spread the message widely. Future cognitive domain operations will emphasize focusing on the target’s social habits and characteristics, fully grasping the target’s social and daily circles, and selecting delivery methods across multiple channels, online and offline, military and civilian, to ensure the delivery rate of cognitive information.

Track cognitive dynamics and time information delivery precisely. Changes in cognition do not happen overnight; blindly pursuing a high tempo and instant results will backfire. Cognitive domain operations must therefore control the rhythm and intensity of “time immersion,” choosing the delivery moment according to the target’s cognitive dynamics and gradually seeking to expand the advantage. Before the target has formed an initial understanding of an event, one should actively seize priority of release and carry out an information “bombardment” at the first moment, striving to strike first and shape first impressions. In addition, during the stage when public opinion about an event is still fermenting, the subject’s cognition has not yet solidified; repeatedly disseminating a specific piece of information at this point can subtly reconstruct the subject’s cognition.

(Author’s unit: National University of Defense Technology)

國語中文:

如何實現認知域作戰精準打擊

■卜江 蔣日烈

引言

目前,在智慧化技術的推動下,認知域作戰正呈現出精確感知、精確預測和精確計算等全新特徵。研究掌握好認知域作戰精準打擊的內涵機理,從而確保作戰目標清晰化、資訊生成個性化、資訊投射精準化,將更有利於奪取未來認知域作戰制高點和主動權。

精準確立作戰目標

作戰目標的確立往往是認知域作戰關注的首要問題。隨著人工智慧、大數據等技術的不斷應用,佔據技術優勢的一方往往能夠快速、有效率地採集不同維度、不同層級、不同模態的認知數據,進而發現對手認知體系的薄弱點、敏感點和爆燃點。

海量「資料來源」細化目標清晰度。在網路深入普及的今天,認知數據正呈指數級增長,目標畫像在大數據、心理測量等技術的支撐下,正逐漸朝著精準畫像、智慧畫像的方向快速演進。根據國外統計數據顯示,截至2022年7月,全球互聯網滲透率達69%,互聯網已成為用戶日常生活的必備平台。借助互聯網,作戰雙方能夠廣泛快速地實現目標對象認知數據收集和認知態勢感知,為分析目標對象的政治信念、價值觀念、民族情感、輿論立場等提供支撐。據悉,在近年的國外大選中,國外數據分析公司就曾透過抓取社群媒體用戶數據,建立人物分析模型,精準刻畫選民性格、認知特徵,在此基礎上對搖擺選民推送暗示性競選廣告,從而影響其選舉決策。

動態「標簽池」提升目標辨識率。貼標簽通常是指對某一類特定群體或物件的某項特徵進行的抽象分類和概括。在認知域作戰中,貼標簽是實現認知資料分類與可視化的重要過程。面對海量的用戶數據,建立一套成熟可靠的標簽體係是梳理分析、用活用好認知數據的前提,利用標簽體系過濾無用數據,挖掘潛在價值信息,能夠為認知域作戰場景預設提供直接參考框架。標簽體系的開發要基於認知域作戰這個邏輯起點,最終歸結於認知域作戰應用。對於目標對象來講,興趣的遷移、性格的改變、情感的變化是即時動態的,建立「標簽池」能夠即時感知目標對象的認知動態,精準提升目標辨識率。

智慧“演算法庫”顯現目標關聯性。如果將數據比作認知域作戰的“燃料”,演算法則是“引擎”,是認知精準打擊的重要動力源。從一定意義上講,認知域作戰是「數據或演算法的對抗」。透過智慧演算法,可以深度挖掘目標對象行為的多維關聯數據,構建精準目標畫像,再結合機器學習演算法構建預測模型,將認知資訊與目標對象進行自動匹配關聯,在合適的時間、合適的地點,以合適的方式投送認知訊息,從而改變目標對象認知。正如國外一些研究機構分析發現,透過10個點贊,演算法可以比同事更了解你;150個點贊,演算法將比你父母更了解你;300個點贊,演算法將比最親密的伴侶更了解你。

精準生成資訊“彈藥”

設計符合目標對象思維習慣和感知風格的訊息“彈藥”,是提升認知域殺傷效果的關鍵所在。智慧科學技術的發展運用,為實現認知資訊主題、內容和形式的「私人客製化」提供了便捷手段,即時、強制地改變目標對象決策和行動成為可能。

基於目標價值取向的資訊主題策劃。認知資訊主題是資訊所表現的中心思想,是資訊內容的核心。從法理勸告、軍事威懾、矛盾離間、情感召喚,到義理引導、戰爭動員、行為指示、動機激勵,不同的訊息主題發揮不同的影響作用。實踐表明,認知訊息的主題必須緊緊圍繞目標對像做策劃,針對不同作戰階段、不同目標對象所表現出來的不同價值取向,及時優化信息主題,才能使信息“彈藥”最大限度地滿足目標對象的需求。根據國外研究機構分析,近年來的國外大選競選廣告背後往往離不開大數據的支撐,針對不同價值觀的選民精準設計不同廣告主題,可以引起選民價值共鳴。

基於目標思維方式的資訊內容設計。在互聯網時代,目標對象的生活軌跡、地理位置、興趣愛好、社交關係等都被網絡所記錄,精準打造迎合目標對象思維方式的「資訊繭房」成為可能。在大數據技術驅動下,目標對像在虛擬世界中的互動軌跡也很容易被捕捉、被感知和被計算。在多媒體內容智慧生成系統等輔助下,可大量產生與目標對象思維習慣類似的訊息,致使其陷於「訊息繭房」之中,訊息接受範圍變窄,對外界的感知度逐漸降低,進而陷入認知陷阱。近年來,全球發生的多起「顏色革命」背後都離不開認知控制的支撐,一些西方國家利用「深度偽造」技術,向目標對象灌輸符合其思維方式的虛假信息,製造反智化、愚民化資訊環境,形成認知偏差,誘導其否定自身民族文化價值理念,進而產生反政府情緒。

基於目標感知特性的資訊形式選擇。心理學認為,認知主體的態度的形成與改變需經過「注意力-了解-接受」三個過程,目標對象能否受到訊息傳播的影響,吸引註意是第一步。資訊形式作為引起目標對象注意的重要載體,其形式設計對提高訊息「彈藥」的接受度、傳播力、感染性至關重要。透過大數據技術可以挖掘目標對象民族情感、風俗習慣、宗教信仰、個人喜好等特徵,科學判斷訊息接受習慣等感知特性,在此基礎上綜合運用文本、語言、視頻、圖像等資訊載體,加以融入色彩、佈局等元素,可以給目標感官造成強烈刺激。自2011年以來,一些敘利亞反戰人士以兒童、婦女等視角,製作出多部反戰宣傳短片在國際互聯網上傳播,引起國際社會輿論強烈反響。這種國際通用的資訊載體,符合大眾審美需求,避免了受眾差異性解讀,往往能達到意想不到的效果。

精準實現資訊投送

認知訊息投送遵循資訊傳播規律,要達到認知精準打擊效果,需要處理好投送對象、投送管道、投送時機等問題。

提取認知特徵,篩選訊息傳遞對象。大數據支撐的畫像技術使提取目標對象認知特徵成為可能,透過認知特徵庫,可以從不同種族、不同黨派、不同職業等群體中篩選出具有相似特徵的對象,從而升級傳統的粗放篩選方式,讓資訊「彈藥」與目標對象更加匹配,從而提高認知攻擊的針對性和精準性。近年來,劍橋分析公司曾使用機器學習方法,依照經驗開放型、盡責型、外向型、親和型、情緒不穩定型五類人格對臉書用戶進行分類,建立了五種人格特質的線性回歸模型,為精準投送競選廣告立起「標靶」。此舉對世人的啟示是多方面的,未來認知域作戰,在廣泛蒐集用戶認知特徵的基礎上,將更加強調精準劃分群體,依據不同群體的價值觀念和行動習慣的差異,進行有目的地信息投送和行為預測。

遵循社交習慣,匹配資訊投送管道。互聯網的深度普及使資訊的傳播方式正發生巨大變革,人們接受資訊的方式更加多樣化、多元化。根據國外調研機構數據顯示,目前全球社群媒體用戶超過46.2億,社群媒體平台成為認知域作戰主戰場。在近年來發生的多起「顏色革命」中,臉書、推特、優兔等社群媒體在西方國家操縱下,在輿論傳播、組織抗議、動員民眾等方面發揮了重要作用。據悉,在類似行動中臉書往往用來確定日程,推特用來協調行動,優兔用來廣泛傳播。未來的認知域作戰,十分強調著眼目標對象社交習慣和特點,充分掌握目標對象的社交圈、生活圈,從線上線下、軍用民用等多渠道選擇信息投送方式,從而確保認知信息的送達率。

追蹤認知動態,把準資訊投送時機。認知的改變,並非一蹴而就,一味地追求高節奏、瞬間達成目的反而會起到反面效果。因此,認知域作戰要掌握好「時間沉浸」的節奏與力度,根據目標對象認知動態選準投送時間,逐步漸進地謀求擴大效果優勢。在目標對像對某一事件還未形成初步認知前,要積極搶佔信息的發布優先權,第一時間進行信息“轟炸”,力求“先發製人、先入為主”。此外,在事件的輿論發酵階段,主體的認知還未徹底固化,此時透過不斷重復傳播某個特定訊息,也可以達到潛移默化地重構主體認知的目的。

(作者單位:國防科技大學)

中國軍事資料來源:https://www.81.cn/yw_208727/16209631.html

How the Chinese Military Identify Key Targets for Cognitive Domain Operations

中國軍隊如何辨識認知域作戰的關鍵目標

現代英語:

Cognitive domain combat targets are the specific objects that cognitive domain operations act upon. Compared with operational objectives, combat targets address the problem of precise aiming: letting commanders understand and grasp the exact coordinates of what to hit, where to hit it, and to what degree. Only by deeply understanding the connotation and characteristics of cognitive domain combat targets can one see through appearances to find the key targets and thus seize the initiative in future combat.

Cognitive focus that influences behavioral choices

The cognitive focus is the “convergence point” of a cognitive subject’s multi-dimensional thinking about war activities; as a dynamic factor, it affects both the cognitive process and behavioral outcomes. Generally speaking, the cognitive factors that affect individual behavioral choices in war activities include cognition of political attributes, of related interests, of group belonging, of risk and loss, of emotional orientation, and of the morality of war. For groups or individuals engaged in, or paying attention to, war activities, the cognitive focus that shapes their attitudes, tendencies, and behaviors is not the same. Judging from recent local wars and regional conflicts, these differences are obvious: politicians attend more to political attributes and related interests; those who might intervene in a war attend more to risk and loss and to related interests; ordinary people attend more to related interests and emotional orientation; and people in countries outside the region, whose own interests will not be directly harmed, generally attend more to the morality of war and to group belonging. In combat practice, foreign militaries are adept at targeting the cognitive focus of different objects, precisely framing topics, and pushing related information to induce specific behavioral choices. For example, before the Gulf War, the public relations firm Hill & Knowlton fabricated the non-existent “incubator incident,” presenting Nayirah, the daughter of the Kuwaiti ambassador to the United States, as a “witness” to show the “inhumanity” of the Iraqi army, playing on the American public’s ethical and moral cognition and thereby building support for the U.S. government’s decision to send troops to the Gulf War.

Style preferences that constrain command decisions

Cognitive style directly shapes decision-making preferences. Cognitive style refers to an individual's typical way of perceiving, remembering, thinking, and solving problems. By command decision-making preference, commanders can be divided into a calm cognitive style and an impulsive cognitive style. Commanders of the calm style value accuracy over speed; the decisions they make are of high quality, but they are prone to getting bogged down in comparing and analyzing multiple intelligence sources and to overemphasizing the accuracy and objectivity of information analysis. In battlefield cognitive offense and defense, such commanders are easily disturbed by diverse and voluminous information stimuli; their mental energy is readily scattered and dissipated, which can cause them to miss fleeting opportunities. Commanders of the impulsive style value speed over accuracy; they react quickly but make lower-quality decisions, become emotional easily, and are prone to conflict with team members. They also tend to over-interpret an ambiguous external security environment, continually seeking "evidence" to reinforce and confirm their own erroneous thinking, which narrows their attention and leads to command-decision deviations. In combat practice, foreign militaries pay close attention to analyzing the decision-making style of an opposing commander and then select specific information to influence him psychologically. For example, during the U.S. invasion of Panama, while besieging the hiding place of Panamanian leader Noriega, the U.S. military played rock and heavy-metal music on repeat and used provocative, humiliating language against him in a cognitive and psychological attack, causing Noriega to break down physically and mentally.

Backdoor channel to control thinking and cognition

Once a computer is infected with a "Trojan" virus, it sends a connection request to the hacker's control terminal at a set time; once the connection succeeds, a backdoor channel is formed through which the hacker can control the computer at will. The human brain likewise has cognitive "backdoors" through which it can be controlled by others. Cognitive psychologists have found that by feeding information through a target's audio-visual channels, carefully pushing content the target already recognizes and accepts, catering to the target's existing memories and experience, conforming to the target's habits of thought, and striking the target's emotional pain points, one can steer and interfere with the target's cognition and provoke instinctive emotional and behavioral reactions. Supported by cutting-edge cognitive science and technology, and exploiting the brain's two modes of information processing, automatic activation and controlled processing, a target can easily be trapped in a "cognitive cocoon." In cognitive-domain operations, the individual is immersed in a vast, artificially constructed flow of information and continuously supplied with "evidence" confirming that his judgments and cognition are "correct." Over time the individual's cognitive field of view narrows, his ability to perceive the external environment declines, and eventually he can no longer see the truth of the matter, sunk in the "cognitive cocoon" and unable to extricate himself. When foreign militaries conduct cognitive-domain operations, they often target an opponent's cognitive bias on a given issue and continuously push, through multiple channels, situational and intelligence information that supports the opponent's supposedly "correct cognition," causing errors and deviations in the opponent's command decisions.

Sensory stimuli that induce attention

Effective perceptual stimulation is the first prerequisite for attracting a target's attention. The human brain perceives and reacts to stimuli within its perceptual range. Cognitive-psychology experiments have found that information that is dynamic, dangerous, personally relevant, tied to survival and safety, or marked by strong before-and-after contrast more readily attracts the brain's attention. In the intelligent era, a target's psychological-cognitive process often follows the pattern of "attracting attention, cultivating interest, actively searching, strengthening memory, actively sharing, and influencing others." In combat, foreign militaries often use exclusive revelations, intelligence leaks, authoritative disclosures, and live links, and deftly employ exaggeration, contrast, association, metaphor, and suspense to push information that subverts common sense, provokes cognitive conflict, or presents stark contrasts, so as to seize the target's attention. For example, the "rescue of soldier Jessica Lynch" staged by the U.S. military in the Iraq War and the "Gaddafi golden toilet" of the Libyan War mostly took stories familiar to the audience as their blueprint, hiding the purpose and embedding the viewpoint in the plot, and thereby captured broad public attention. In addition, the human brain also processes stimuli outside the perceptual range. In recent years, Western militaries have attached great importance to research on subthreshold information stimulation, developing subthreshold visual-information implantation, subthreshold auditory-information implantation, subthreshold information activation, and subconscious acoustic manipulation of the nervous system, continuously expanding the application of neurocognitive science and technology in the military field.

Meta-value concepts that give rise to cognitive resonance

In cognitive theory, cognitive resonance refers to information that can cross the cognitive gap between two parties and trigger ideological and psychological resonance and empathy in both, thereby deconstructing and reconstructing the other party's cognitive system. In cognitive-domain warfare, this cognitive energy-gathering effect is not a simple concentration of power but an internal accumulation of systemic synergy. Diffused through modern information media, this resonance effect can spread worldwide in a short time and produce secondary indirect psychological effects, or further layers of derivative effects, in a state of cumulative iteration. Once it passes the psychological critical point, it erupts as a burst of psychological energy that can change the direction or outcome of an event. The targets capable of inducing such resonance are mainly value beliefs, moral ethics, and common interests. In war, a party that touches or violates humanity's meta-values or shared emotional orientations very easily provokes collective condemnation, bears the accusation of violating human morality, and falls into a moral trough. For example, a Vietnam War photograph shows a group of Vietnamese children, most memorably a nine-year-old girl, running naked down a road, burned in a U.S. napalm strike. Published in 1972, the photo caused a huge sensation, set off an anti-war wave in the United States and around the world, and accelerated the end of the Vietnam War.

Cognitive gaps in a split cognitive system

In daily life, seemingly hard steel breaks easily when low temperature, material defects, or stress concentration make the material brittle. The same is true of the cognitive system. Cognitive gaps are the cracks, pain points, weaknesses, and sensitive points in a target's cognition, manifested chiefly as the individual's worry that he cannot cope with or adapt to his environment; under the influence of anxiety, this forms cognitive vulnerability. The experience of security threats, loose group structure, confusion about beliefs and ideals, and the silence of authoritative media can all produce cognitive conflict and tearing in the target. In cognitive-domain operations, a seemingly powerful opponent may conceal a large number of mental fissures and psychological weaknesses; often a single news event can shake the opponent's cognitive framework and puncture its cognitive bubble. Moreover, such cognitive-psychological conflict can inflict moral injury and psychological trauma on individuals. In recent years, U.S. and Western troops on overseas missions, facing "enemies disguised as civilians" who can appear anytime and anywhere, have experienced ever-greater uncertainty about the battlefield environment; they broadly lack a sense of the meaning of their combat and are filled with guilt and a sense of sin. Large numbers of soldiers have developed post-traumatic stress disorder; battlefield self-harm, post-war suicide, and crime have surged, and the number of suicides among veterans of these wars has even exceeded the number of battlefield deaths.

(Author: Tang Guodong; author’s unit: Political Science Academy, National Defense University)

國語中文:

引言

認知域作戰標靶是指認知域作戰的具體作用指向。在認知域作戰中,相較於作戰對象,作戰標靶解決的問題是精確瞄準,也就是讓指揮官了解掌握具體打什麼、往哪裡打、打到什麼程度的精準座標問題。只有深刻理解認知域作戰標靶的內涵特點,才能透過表象準確找到關鍵標靶,以便在未來作戰中掌握先機。

影響行為選擇的認知重心

認知重心是戰爭活動中認知主體多元思維認知的“匯聚點”,作為一種能動因素影響認知進程和行為結果。一般而言,影響戰爭活動中個人行為選擇的認知因素,主要包含政治屬性認知、利益關聯認知、群體歸屬認知、風險損失認知、情緒定向認知、戰爭道德認知等。對於戰爭活動以及關注戰爭活動的群體或個體而言,影響其態度、傾向和行為的認知重心並不相同。從近年來的世界局部戰爭和地區衝突來看,不同群體或個體關注的認知重心有著明顯差異,政治人物更加關注政治屬性認知和利益關聯認知,戰爭可能介入者更關注風險損耗認知和利益關聯認知,一般民眾更關注利益關聯認知和情感定向認知,而域外他國民眾由於自身利益不會受到直接損失,普遍更關注戰爭道德認知和群體歸屬認知。外軍在作戰實踐中,善於針對不同對象的認知重心,精準策劃主題,推送關聯訊息,誘發特定的行為選擇。如同在海灣戰爭前,希爾·諾頓公關公司炮製了根本不存在的“育嬰箱事件”,就是利用科威特駐美大使的女兒娜伊拉“做證”,展現伊拉克軍隊的“慘無人道”,誘發美國民眾的倫理道德認知,進而支持美國政府派兵參加海灣戰爭。

制約指揮決策的風格偏好

認知風格直接影響決策行為偏好。認知風格是指個體認知、記憶、思考、解決問題的典型方式。根據指揮決策風格偏好,指揮家可以分為冷靜型認知風格和衝動型認知風格。冷靜型認知風格的指揮者在決策過程中重視準確但不重視速度,作出的決策品質較高,但容易陷入對各類情報資訊來源的比對分析,過度強調資訊分析的準確客觀。冷靜型認知風格的指揮在戰場認知攻防行動中,常常容易受到紛繁多元的信息刺激幹擾,心智精力容易被擾亂和耗散,進而可能貽誤戰機。衝動型認知風格的指揮者重視速度但不重視準確度,作出的決策反應速度較快,但品質不高,且容易情緒激動,易與團隊成員發生衝突。衝動型認知風格的指揮者也容易將模稜兩可的外在安全環境進行過度曲解,並不斷尋找「證據」強化和驗證個體錯誤思維,使個體注意力變窄,導致出現指揮決策偏差。外軍在作戰實務中,比較著重分析作戰對手指揮官決策風格,進而選擇特定資訊對其進行心理影響。如美軍入侵巴拿馬戰爭中,在圍攻巴拿馬總統諾列加躲藏處時,美軍反複播放搖滾和重金屬音樂,運用刺激和羞辱諾列加的語言對其進行認知打擊和心理進攻,使諾列加身心逐漸崩潰。

控制思維認知的後門通道

電腦一旦中了「木馬」病毒,會在特定時間向駭客控制端發送連線請求,一旦連線成功就會形成後門通道,使得駭客可以隨心所欲地控制電腦。與之相似,人類大腦也存在認知“後門”,也可能被他人控制。認知心理學家研究發現,透過給目標對象視聽感知通道發送訊息,精心推送目標對象認可的、接受的信息內容,迎合目標對像已有的經驗記憶,順應目標對象思維習慣,刺激目標對象的情感痛點,就可以控制干擾目標物認知,促進其產生本能情緒行為反應。在尖端認知科學技術的支撐下,運用大腦資訊加工的自動啟動和控制加工兩種模式,目標物很容易陷入「認知繭房」之中。認知域作戰中,透過讓個體沉浸在人為構設的海量資訊之中,並源源不斷地為其提供「證據」用來佐證其判斷和認知是「正確」的。長此以往,個體的認知視野就變得越來越小,對外在環境的感知能力逐漸降低,最終會看不到事情的真相,沉湎於「認知繭房」中無法自拔。外軍在認知域作戰中,常常針對作戰對手對某一問題的認知偏差,持續透過多種管道推送佐證作戰對手自以為「正確認知」的態勢訊息和情報訊息,使作戰對手指揮決策出現失誤和偏差。

誘發關注的感知覺刺激

有效的知覺刺激是引發目標對象關注的首要前提。人類大腦對感知覺範圍內的刺激會有所察覺,並做出各種反應。認知心理學實驗研究發現,動態、危險、利害關係人、生存安全、前後反差等類別資訊更容易引起人類大腦的注意。在智慧化時代,目標對象的心理認知過程往往遵循「引起注意、培養興趣、主動搜尋、強化記憶、主動分享、影響他人」的規律。外軍在作戰中,常運用獨家爆料、情報外洩、權威揭露、現場連線等方式,巧用誇張、對比、聯想、比喻、懸念、襯託等手法,推播顛覆常識、認知衝突、對比強烈等訊息,來引發目標對象注意。例如伊拉克戰爭中美軍塑造的“營救女兵林奇事件”,利比亞戰爭中的“卡扎菲黃金馬桶”,大多選擇受眾對象熟知的故事為藍本,藏目的、寓觀點於故事情節,吸引了廣大民眾的注意力。此外,人類大腦也會對感知覺範圍外的刺激進行加工。近年來,西方國家軍隊非常重視知覺閾下資訊刺激技術的研究,開發發展了閾下視覺訊息植入技術、閾下聽覺訊息植入技術、閾下訊息啟動技術、神經系統潛意識聲音操控技術等,不斷擴大神經認知科學技術在軍事領域的應用範圍。

催生認知共振的後設價值概念

認知理論中,認知共振是指跨越雙方認知鴻溝,能夠引發雙方思想心理與認知共鳴共感的訊息,進而實現對對方認知體系的解構與重建。在認知域作戰中,這種認知聚能效應不是簡單意義上的力量集中,而是體系合力的內在累積。在現代資訊傳媒的擴散傳播作用下,這種認知共振效應能在短時間內迅速擴散到全球各地,並產生二次間接心理效應或更多層次的衍生心理效應,呈現出一種累積迭代的狀態,一旦超過心理臨界點,即呈現出心理能量爆發狀態,從而改變事件走向或結果。能夠誘發這種認知共振的靶標,主要有價值信念、道德倫理、共通利益等。戰爭中,若某一方觸及或違反人類元價值觀、共同情感指向等,則極易誘發集體聲討,承擔違背人類道德的指責,陷於道義低谷。如越戰期間的一張照片,畫面呈現的是遭遇美軍凝固汽油彈襲擊後,一群越南孩子特別是一名9歲女孩在公路上因為燒傷而裸體奔跑。 1972年,這張照片刊登後引發巨大轟動,掀起美國乃至全球的反戰浪潮,加速了越戰的結束。

分裂認知體系的認知縫隙

日常生活中,看似堅硬的鋼鐵,受低溫環境、材質缺陷、應力集中等因素影響,非常容易因材料脆性而斷裂,認知體係也是如此。認知縫隙是指目標對象認知思考中的裂縫、痛點、弱點與敏感點,主要表現為個體擔心自己沒有能力應對或無法適應環境的想法,並在焦慮情緒的作用下,構成認知脆弱性。安全威脅的經驗、團體結構的鬆散、信念理想的迷惘、權威媒介的失聲等,都會使得目標物出現認知上的衝突與撕裂。認知域作戰中,有時看似強大的作戰對手,背後卻潛藏著大量的思維裂隙與心理弱點,往往一個新聞事件就能動搖作戰對手的認知框架,刺破認知泡沫。此外,這種認知心理衝突也會使個體產生道德損傷和心理創傷。近年來,執行海外任務的美西方國家軍隊面對隨時隨地出現的“偽裝成平民的敵人”,對戰場環境的不確定感不斷提升,普遍缺乏作戰意義感知,進而內心充滿內疚與罪惡。大量士兵產生戰爭創傷後壓力障礙,戰場自殘自傷、戰後自殺與犯罪人數激增,參戰老兵自殺人數甚至超過戰場死亡人數。

(作者單位:國防大學政治學院)唐國東

中國軍網 國防部網 // 2023年3月23日 星期四

中國原創軍事資源:https://www.81.cn/jfjbmap/content/2023-03/23/content_336888.htm

Chinese Military Analysis on the Application of Metaverse in Military Communication

中國軍事分析虛擬宇宙在軍事通訊的應用

現代英語翻譯:

Abstract: The Metaverse, an innovative concept arising from the clustering effect of advanced technologies, will be key to future media content production and to winning cognitive advantage. Looking ahead to its development, this article explains the concept of the Metaverse and analyzes its development prospects, key technologies, and practical applications, aiming to provide a reference for applying the Metaverse in the field of military communication.

Keywords: Metaverse; Military Communication; Development Prospects

The Metaverse has become a hot topic on everyone's lips and was selected as one of the "Top Ten Internet Terms of 2021." Globally renowned Internet companies from Facebook to ByteDance are all laying out plans for it. Public-opinion experts at home and abroad called the 2022 Russia-Ukraine conflict a "public opinion war" and "cognitive war" waged by diverse means; some even exclaimed that cognitive-domain warfare in Metaverse form had begun. As an innovative concept arising from the clustering effect of advanced technology, the Metaverse will be key to future media content production and to gaining cognitive advantage. Exploring its application in the field of military communication has become an important topic of the omnimedia era.

1. The special functions of the metaverse determine its broad development prospects

The term Metaverse was born in the 1992 science-fiction novel Snow Crash, which describes a virtual shared space parallel to the real world. According to the relevant record, as early as 1990 Qian Xuesen envisioned virtual reality and the Metaverse, and gave the idea an evocative name, "Spirit Realm." Four years later he wrote: "Spirit Realm technology is another technological revolution after the computer revolution. It will trigger a series of changes that will shock the world and is bound to be a major event in human history." Qian Xuesen had already foreseen that Metaverse-related technologies would bring profound change to human society.

From its origin in science fiction to its entry into reality, the industry has yet to reach consensus on a definition of the Metaverse. According to relevant experts, its essential characteristics are twofold: virtual-real integration and immersive experience. Virtual-real integration means the boundary between the digital and physical worlds gradually disappears, fully merging the economies, lives, assets, and identities of the two. Immersive experience means that the two-dimensional audio-visual experience of the Internet expands into a three-dimensional, immersive, full-sensory one. These special functions determine the Metaverse's broad development prospects.

The Metaverse is the next generation of the Internet. Looking back at the development of the Internet, from PC Internet to mobile Internet, the sense of immersion when using the Internet has gradually increased, and the distance between virtual and reality has gradually shortened. Under this trend, the Metaverse, where both immersion and participation have reached their peak, may be the “ultimate form” of the Internet. Regarding the future development of the Metaverse, some experts predict that: in terms of hardware terminals, with the portable development of wearable devices such as VR/AR glasses, their popularity will increase significantly, and people will gradually adapt to and accept the larger visual range and more natural interaction methods brought by new devices; in terms of content ecology and application scenarios, explosive Metaverse content will continue to emerge, and application scenarios will gradually expand. In the Metaverse, user experience has achieved an improvement and transformation from “online” to “presence”, thus entering the “scenario era”.

The metaverse is a new type of holographic medium. As media technology has developed, the presentation of media content has evolved from one and two dimensions to many; the emergence of the Metaverse is another revolution in communication media after radio, television, and the Internet. In terms of user experience, the Metaverse not only expands the user's experiential space but delivers the immersive sense that "you are not just watching the content; you are wholly inside it." In terms of media products, the Metaverse will see a large number of "we are on the scene" news products, advancing news content through immersive narrative. Reports on major sudden incidents, large live events, and news documentaries, for example, can turn the complete news scene into a Metaverse digital scene that audiences enter and experience from any perspective. In terms of communication methods, information today spreads mainly through four modes: mass, network, social, and intelligent communication. The arrival of the Metaverse as a new medium will enrich the means of communication in the intelligent era and make "holographic communication" possible.

The metaverse is the future battlefield of cognitive-domain warfare. Communication media are in essence platforms and channels, the material base and principal weapon of cognitive narrative in cognitive-domain warfare. The 2022 Russia-Ukraine conflict was reported to the world through countless "first-person perspectives," with both sides speaking out on online media and social platforms to contest dominance of the international cognitive narrative. As a new holographic medium, the Metaverse transmits cognition in a full-dimensional, full-system, immersive way; it can shape human thinking more comprehensively, deeply, and lastingly, and its value for cognitive warfare is incalculable. Moreover, the Metaverse provides a parallel cognitive space that digitally twins real combat scenarios, in which cognitive warfare can be advanced efficiently and presented panoramically.

2. A Preliminary Study on the Application of Metaverse in the Field of Military Communication

Like other new technologies, the Metaverse was quickly applied to military communication. The PLA News and Communication Center made a bold attempt, launching a "Holographic Military Newspaper" during the national Two Sessions for three consecutive years. Using extended reality, digital construction, and other technologies, it presented a newspaper full of futuristic atmosphere: readers could don VR glasses for an immersive experience or view it on a mobile phone. The "Holographic Military Newspaper" was a first in China's newspaper publishing industry and was selected as an innovative case of the deep, integrated development of the Chinese press. During the 2021 Two Sessions the center also launched "Xiaojun," a military-media intelligent cartoon virtual person that put a 3D cartoon and real people on the same screen in interaction. In 2022 the center and the Art Department of the PLA Culture and Art Center jointly staged the "2022 Metaverse Military Camp Network Spring Festival Gala," using Metaverse technology to build a virtual space and interactive platform. Netizens could enter the three-dimensional virtual space as avatars, tour the performance site, choose their favorite seats to watch the gala, and interact with the audience around them through speech and gesture. One netizen commented: "So shocking! The literary-and-art light cavalry expressed itself in Metaverse form; technology really is advancing!" In addition, the center's network department took the lead in releasing the public-welfare NFT digital collectible "Stars Accompany Me to Guard the Border."

At present, authoritative military media are organizing teams to advance preliminary research and design for a "Metaverse editorial department." Surveying the evolution from the traditional newsroom to the intelligent newsroom, and combining it with the Metaverse's advanced technology and future development, experts have proposed the concept: a Metaverse editorial department lets editors and reporters in multiple locations efficiently complete planning, interviewing, editing, publishing, and other tasks "face to face" within the same virtual space, the same chain of command, and the same work system. This will be the future form of the news editorial department: each editor and reporter has a personal virtual workspace and, when a meeting is needed, can instantly travel to the virtual conference room for "face-to-face" communication.

3. Thoughts on how to win the cognitive war in the metaverse

A defining feature of the 2022 Russia-Ukraine conflict was the deep involvement of social media; the mobile Internet became the main source of information about the conflict. As noted above, the Metaverse's special functions determine its broad development prospects. How to win a cognitive war in the Metaverse urgently demands forward-looking thought.

Strive for autonomy and control over the core technologies of the future Metaverse. As a pioneering frontier field, the Metaverse carries huge initial R&D costs and requires long-term, sustained, heavy investment to align and unify a mass of standards and specifications and to connect ultra-large-scale users in interaction. This gives the Metaverse an inherent monopoly gene. At present, American companies such as Facebook, Google, and Microsoft, with the support of their government and military, have made deep moves into the Metaverse and may well become the technology monopolists and rulers of the future Metaverse, just as with the Internet today. In the Russia-Ukraine conflict these technology giants "one-sidedly" turned on Russia, restricting or even banning Russian use of their technology products and supporting the comprehensive U.S. sanctions. The warning is that, to avoid being technologically constrained in the future, we should concentrate the superior forces of the military and civilian sectors, aim at Metaverse technology, tackle key problems together, and strive to achieve autonomy and control over the core technologies of the future Metaverse.

Develop Metaverse platforms suited to cognitive warfare. Building Metaverse platforms that are autonomous and controllable, wide in coverage, and great in influence is the key to victory on the future cognitive-domain battlefield. Returning to the Russia-Ukraine conflict: to suppress Russia in public opinion, American social platforms such as YouTube, Twitter, and Facebook, at the direction of U.S. officials, directly restricted the exposure of Russian media; at the level of cognitive warfare, they enjoyed every platform advantage. This requires us to think proactively about the future form of military communication platforms, develop Metaverse platforms suited to cognitive warfare, and strive to create hit products, for example a Metaverse version of the military's new-media platform.

We should accelerate the production and accumulation of immersive content suited to the Metaverse era. Beyond the traditional, familiar content types, content creation in the Metaverse era adds a great deal of three-dimensional content: panoramic shooting, digital twins of the real world, artificially constructed virtual spaces, virtual digital humans, and more. Accelerating the production and accumulation of such immersive content is a question that demands attention now. Creating digital history museums, building heroic virtual humans, and recreating classic battles, for example, can truly make history "perceptible" and make cultural relics "speak." In addition, independent development of content-creation tools is also critical.

國語中文:

摘 要:元宇宙作為先進技術群聚效應的創新概念,將成為未來媒體內容生產、贏得認知優勢的關鍵。展望元宇宙發展前景,本文闡釋了元宇宙概念,並圍繞其發展圖景、關鍵技術和實踐應用進行闡釋分析,旨在為元宇宙在軍事傳播領域的應用提供借鑒參考。

關鍵字:元宇宙;軍事傳播;發展前景

元宇宙,目前成為人們競相談論的熱門話題,並入選了「2021年度十大網路用語」。從Facebook到位元組跳動等全球知名網路公司都在版面元宇宙。 2022年的俄烏衝突被國內外輿論戰專家稱作是一場手段多樣的“輿論戰”“認知戰”,有專家甚至驚呼元宇宙形態下的認知域作戰拉開了序幕。元宇宙,作為先進技術群聚效應的創新概念,將成為未來媒體內容生產、贏得認知優勢的關鍵。探尋元宇宙在軍事傳播領域的應用,成為全媒體時代一門重要課題。

一、元宇宙的特殊功能決定了其廣闊的發展前景

元宇宙(Metaverse),誕生於1992年的科幻小說《雪崩》。小說中所描述的元宇宙是一個平行於現實世界的虛擬共享空間。根據相關資料顯示,早在1990年,錢學森就對虛擬實境與元宇宙有過展望,並為其起了個頗有意境的名字—「靈境」。 4年後,錢學森特別提到:「靈境技術是繼電腦科技革命之後的另一場科技革命。它將引發一系列震撼全世界的變革,一定是人類歷史中的大事。」錢學森當時就已預見元宇宙相關技術將對人類社會帶來的深層變革。

從源自科幻到走進現實,業界對於元宇宙的定義還沒能達成共識。根據相關專家的研究,認為元宇宙的本質特徵是兩個:虛實融合和沈浸體驗。虛實融合,就是數位世界和實體世界的邊界逐漸消失,實現兩個世界的經濟、生活、資產和身分認同等全方位的融合。沉浸體驗,就是人們對網路的二維視聽體驗拓展為三維立體、沉浸式的全感官體驗。元宇宙的特殊功能決定了其廣闊的發展前景。

元宇宙是下一代互聯網。回顧網路的發展歷程,從PC互聯網到行動互聯網,使用網路時的沉浸感逐漸提升,虛擬與現實的距離也逐漸縮短。在此趨勢下,沉浸感和參與度都達到高峰的元宇宙或是網路的「終極形態」。對於元宇宙的未來發展,有專家預計:硬體終端方面,隨著VR/AR眼鏡等穿戴設備的便攜化發展,其普及程度將大幅度提高,人們逐漸適應和接受新設備帶來的更大的視覺範圍和更自然的互動方式;內容生態及應用場景方面,爆款元宇宙內容將不斷湧現,應用場景也將逐步拓展。在元宇宙中,使用者體驗實現了從「線上」到「在場」的提升和轉變,從而進入「場景時代」。

元宇宙是新型全息媒介。隨著媒介技術的發展,媒體內容的呈現方式從一維、二維到多維不斷演進。元宇宙的出現,是繼廣播、電視、網路之後傳播媒介的另一次革命。從使用者體驗來看,元宇宙不僅拓展了使用者的體驗空間,也帶來了「你不只是觀看內容,你整個人就身在其中」的沉浸式體驗。從媒體產品來看,元宇宙將出現大量「我們在現場」式的新聞媒體產品。元宇宙的媒體產品將以沉浸式敘事實現新聞內容的進階。例如,重大突發事件報導、大型現場活動、新聞紀錄片等,可以將完整的新聞現場做成元宇宙的一個數位場景,讓觀眾以各種視角進入現場進行體驗。從傳播方式來看,目前,訊息傳播主要有4種傳播模式:大眾傳播、網路傳播、社交傳播、智慧傳播。元宇宙新型媒介的到來將使得智慧傳播時代訊息傳播的手段更為豐富,「全像傳播」成為可能。

元宇宙是認知域作戰的未來戰場。傳播媒介實質就是傳播平台和管道,是認知域作戰中認知敘事的物質基礎和主要武器。 2022年的俄烏衝突以無數「第一視角」的方式向全球報道,俄烏雙方都在網路媒體和社群平台發聲,爭奪國際傳播認知敘事主導權。元宇宙作為新型全像媒介,其傳導認知的方式是全維度、全系統和沈浸式的,能夠更全面、更深入、更持久地塑造人的思維認知,具有不可估量的認知戰應用價值。另外,元宇宙提供了一個將現實作戰場景數位孿生的平行認知空間,在這裡認知戰得以高效率推進和全景式呈現。

二、元宇宙在軍事傳播領域的應用初探

和其他新技術的產生一樣,元宇宙也很快被應用於軍事傳播領域。解放軍新聞傳播中心進行了大膽嘗試,連續3年在全國兩會期間推出的“全息軍報”,運用擴展現實、數字構建等技術,展示了一份充滿未來氣息的報紙:可以佩戴VR眼鏡沉浸式體驗“全息軍報”,也可以透過手機觀看。 「全像軍報」是國內報紙出版業的首創,入選了中國報業深度融合發展創新案例。 2021年全國兩會期間,該中心還推出軍媒智慧卡通虛擬人“小軍”,實現了3D卡通與現實人物的同屏互動。 2022年,該中心和解放軍文化藝術中心文藝部共同推出的“2022年元宇宙軍營網絡春晚”,利用元宇宙技術搭建虛擬空間和互動平台。網友觀眾化身虛擬人即可進入立體虛擬空間,參觀演出現場,並自行選擇喜好的座位觀看春晚,還可以跟著周圍的觀眾進行語言和手勢互動。有網友評價:「太震撼了!文藝輕騎以元宇宙的形式表現,真是科技在進步!」另外,該中心網絡部還率先推出了公益性NFT數字藏品《星星伴我守邊防》。

目前,軍隊相關權威媒體正在組織團隊推進元宇宙編輯部的前期研究和設計。縱觀傳統新聞編輯部到智慧編輯部的發展歷程,結合元宇宙先進技術和未來發展,專家提出了「元宇宙編輯部」的概念,即「元宇宙編輯部」使多點位的編輯記者在同一虛擬空間、同一指揮鏈、同一工作體系裡「面對面」有效率地完成規劃、訪談、編輯、發布等工作。這將是未來新聞編輯部的進化形態,每個編輯記者都擁有各自的虛擬工作空間,當有會議討論需求時,可以瞬間穿越到虛擬會議室進行「面對面」交流。

三、如何在元宇宙中打贏認知戰的思考

2022年俄烏衝突的一個主要特徵,是社群媒體的深度參與。行動互聯網成了這次衝突關聯資訊的主要來源。如前所述,元宇宙的特殊功能決定了其廣闊的發展前景。如何在元宇宙中打贏認知戰,迫切需要我們做前瞻性思考。

努力實現未來元宇宙核心技術的自主可控。元宇宙作為開拓性和創新性的前沿領域,前期研發成本龐大,需要長期且持續的高額投資,以實現大量標準規範的對接統一、超大規模用戶的連結互動。這也導致了元宇宙具有內在壟斷基因。目前,Facebook、Google、微軟等美國公司在其政府和軍方的支持下,深入佈局元宇宙,極大可能成為未來元宇宙的技術壟斷者和統治者,就像現在互聯網的情況一樣。在這次俄烏衝突中,上述這些科技巨頭「一邊倒」地將矛頭對準俄羅斯,限制甚至禁止俄羅斯使用其科技產品,為美國實施全面製裁施壓提供了支撐。這警告我們,為了將來在技術上不被掣肘,應該集中軍地優勢力量,瞄準元宇宙技術,協力攻關,努力實現未來元宇宙核心技術的自主可控。

發展適應認知戰的元宇宙平台。開發自主可控、覆蓋範圍廣、影響力大的元宇宙平台,是未來在認知域戰場上取得勝利的關鍵。回到俄烏衝突中,為了從輿論上打壓俄羅斯,YouTube、Twitter和Facebook等美國的社群平台在美國官方的授意下,直接限制了俄羅斯媒體的曝光率,可以說在認知戰層面佔盡了平台優勢。這就需要我們主動思考未來軍隊傳播平台型態,開發適應認知戰的元宇宙平台,努力打造爆款產品。例如,推出軍隊新媒體平台的元宇宙版本等。

抓緊生產與累積適合元宇宙時代的沉浸式內容。元宇宙時代的內容創作除了傳統可見的內容類型外,還大量增加了三維內容,包括全景拍攝、真實世界的數位孿生、虛擬空間的人工構建、虛擬數位人的展示等等。抓緊生產和累積適合元宇宙時代的沉浸式內容,是當前需要重點考慮的問題。例如,製作數位史館、打造英雄虛擬人、復現經典戰例等等,真正實現讓歷史「可感知」、讓文物「會說話」。另外,內容創作工具的自主研發也很關鍵。

(作者單位:解放軍新聞傳播中心網絡部)

中國軍事資源:https://www.81.cn/rmjz_203219/jsjz/2022nd5q_242715/tbch_242721/10193529.html