The head of the Australian Defence Force, General Angus Campbell, has warned of the growing threat that “truth decay” driven by artificial intelligence (AI) and deepfakes poses to democracies around the world. Speaking at the Australian Strategic Policy Institute conference, General Campbell stressed that disinformation campaigns have the potential to divide and fragment entire societies. He expressed concern that in a post-truth world, perceptions and emotions often take precedence over facts, and that as AI systems become more advanced, the risk grows that adversaries could use them to mislead and misinform the public. He emphasized the vulnerability of societies that depend on a well-informed and engaged citizenry.
Advances in technology may further accelerate this “decay of truth” by eroding the quality of public debate and undermining trust in elected officials. General Campbell stressed the importance of countermeasures, but cautioned that the first impression created by disinformation is often the most influential.
Governments around the world share these concerns. Australia’s Albanese government is considering a ban on high-risk uses of AI, such as deepfakes and algorithmically biased systems. The European Union has already taken steps to regulate AI and mitigate its potential risks.
Experts such as Professor Jill Slay AM, SmartSat Professorial Chair in Cybersecurity, warn that, left unregulated, weaponized AI enables the spread of false information, cloned images and fabricated narratives that could support cyber or kinetic warfare. Controlled use of the same techniques, however, can contribute to the development of useful technologies and benefit the economy.
The issue of regulating generative AI and deepfakes remains complex. Professor Mary-Anne Williams of the University of New South Wales acknowledges the challenge of defining and determining what constitutes “unsafe” AI. Responsibility for and regulatory oversight of these technologies remain unclear, and open questions persist across science, technology, education and law.