Evaluating the role of generative AI and color patterns in the dissemination of war imagery and disinformation on social media

Front Artif Intell. 2025 Jan 6;7:1457247. doi: 10.3389/frai.2024.1457247. eCollection 2024.

Abstract

This study explores the evolving role of social media in the spread of misinformation during the Ukraine-Russia conflict, with a focus on how artificial intelligence (AI) contributes to the creation of deceptive war imagery. Specifically, the research examines the relationship between color patterns, defined via color look-up tables (LUTs), in war-related visuals and their perceived authenticity, highlighting the economic, political, and social ramifications of such manipulative practices. AI technologies have significantly advanced the production of highly convincing yet artificial war imagery, blurring the line between fact and fiction. An experimental project is proposed to train a generative AI model capable of creating war imagery that mimics real-life footage. By analyzing the outcomes of this experiment, the study aims to establish a link between specific color patterns and the likelihood that images are perceived as authentic, shedding light on the mechanics of visual misinformation and manipulation. Additionally, the research investigates the potential of a serverless AI framework to advance both the generation and detection of fake news, marking a pivotal step in the fight against digital misinformation. Ultimately, the study seeks to contribute to ongoing debates on the ethical implications of AI in information manipulation and to propose strategies to combat these challenges in the digital era.
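To make the notion of a color LUT concrete: a 1D LUT is simply a 256-entry table that remaps each 8-bit channel value, which is how a characteristic "graded" look can be applied uniformly to footage. The sketch below is illustrative only and is not taken from the study; the specific lifted-shadow, compressed-highlight curve is a hypothetical example of the kind of muted grading often associated with war footage, assuming NumPy is available.

```python
import numpy as np

def apply_lut(image, lut):
    """Remap every 8-bit channel value through a 256-entry look-up table.

    NumPy fancy indexing applies the table element-wise, so the result
    has the same shape as the input image.
    """
    return lut[image]

# Hypothetical "muted war-footage" grade: lift shadows to 40, flatten contrast.
values = np.arange(256, dtype=np.float64)
lut = np.clip(40 + values * 0.75, 0, 255).astype(np.uint8)

# A tiny 2x2 RGB frame as a stand-in for a video frame.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 0] = [255, 128, 0]  # one bright orange pixel, rest black
graded = apply_lut(frame, lut)
```

Because the LUT is a pure per-value mapping, the same table can be applied to every frame of a clip, which is what makes LUT-based color signatures a plausible, measurable feature for the authenticity analysis described above.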

Keywords: LUTs; color patterns; disinformation; fake news; generative AI; information manipulation; social media; war imagery.

Grants and funding

The author(s) declare that financial support was received for the research, authorship, and/or publication of this article. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 101007638.