Information moves far faster than it can be fact-checked. Experts are already sounding the alarm about how emerging technologies can be exploited for coordinated, algorithmically driven propaganda. A panel of specialists recently convened to discuss the implications of artificial intelligence and social media for information warfare, focusing in particular on misinformation related to the ongoing conflict in Ukraine.
Olivia Shen, Director of the Strategic Technologies Program at the United States Studies Centre at the University of Sydney, highlighted the challenges of regulating digital content. She emphasized that there has been “more laggard progress on turning voluntary safeguards and codes of conduct and guidelines into legally enforceable regulation.” The demand for strong regulation has never been more pronounced, as misinformation is injected ever more widely into the online information ecosystem.
Carl Miller, founder of the Centre for the Analysis of Social Media at Demos in the UK, pointed to the potential dangers posed by AI technologies. He stated that “these models, these chatbots that are increasingly going to be used to kind of interrogate the internet and paint pictures for people to learn about the world… this is going to be basically the kind of crucible of a whole new kind of really powerful and quite unseen influence and power.” This observation raises the question of how AI systems might shape public perception and discourse by design.
Isis Blachez, an analyst at the US-based research agency NewsGuard, expressed a similar apprehension, warning that dangerous disinformation is cycling through nearly every digital space. She explained that “many different sites will repeat the same false claim multiple times on each site and on different sites.” This cycle can lead AI chatbots to further entrench falsehoods, since they are trained on what is already online.
Australians have certainly not been immune to these narratives, as Mike Burgess, Director-General of ASIO, explained in detail. Propagandists on social media create and disseminate radical online discourse that explains and defends the Russian invasion of Ukraine and denigrates the Australian government’s support for Kyiv, while intentionally obscuring their ties to Moscow. Their aim is to hijack and inflame legitimate debate, likely in line with the Kremlin’s playbook. His comments raise the question of how to balance legitimate debate against lies, misinformation, and propaganda intended to force policy change.
The panelists also discussed the subtle ways propaganda spreads. One expert described the process as a “laundering machine for Russian narratives and Russian propaganda,” highlighting how content from sanctioned media outlets such as Sputnik or Russia Today can be recycled to create misleading narratives.
Countering the evolving complexity of misinformation campaigns requires a multi-faceted effort from researchers, government agencies, and private technology firms.

