Australian researchers develop AI tool to combat dangerous deepfake images

Researchers at Monash University in Australia, in collaboration with the Australian Federal Police (AFP), have developed a new artificial intelligence (AI) tool to combat malicious deepfake images.


Deepfakes are images or videos manipulated with AI to create new content that looks authentic and convincing.


The tool has wide-ranging applications, including slowing or stopping criminals from producing AI-generated child abuse material and deepfake images and videos, according to a Monash University statement released on Monday (10/11).


Known as “data poisoning,” the technique involves subtle changes to data to make it much more difficult to produce, manipulate and misuse images or videos using AI programs, according to AI for Law Enforcement and Community Safety (AiLECS), a collaborative project between the AFP and Monash University.




AI and machine learning tools rely on large online datasets. Poisoning this data can result in inaccurate, biased, or corrupted output. That makes images and videos manipulated by criminals easier to identify, and it aids investigators by reducing the volume of fake material they need to examine, researchers say.


The tool, called Silverer, is currently at the prototype stage. The researchers say they aim to continuously improve it into technology that ordinary Australians can easily use to protect their data on social media.


"Before someone uploads an image to social media or the internet, they can modify it using a Silverer. This will alter the pixels to fool the AI ​​model, and the resulting image will be very low quality, full of blurry patterns, or even unrecognizable altogether," said Elizabeth Perry, researcher and project leader of AiLECS and a PhD candidate at Monash University.


The AFP has reported a rise in AI-generated child abuse material, which criminals can easily produce and distribute using open-source technology with very few access restrictions, according to Campbell Wilson, a digital forensics expert and co-director of AiLECS.

