The Australian quotes Neuranext in Cyber Security Special Report.

For this reason, the Sydney-based AI software and education company Neuranext has been building units on deepfake technology into its education content.

Neuranext’s managing director Adrian Tyson says he has been showing people AI-generated faces and teaching them where to find the telltale flaws, which often relate to the positioning of glasses or the depth of teeth.

“But this is a blown-up image that we have time to look at,” Tyson says. “If it is a small image on your phone from social media, you don’t have the opportunity to critically look at the image, and it is going to be much harder to detect.”

Tyson says his immediate goal is to educate people that this can be done in the first place, so they become wary of what is going on.

“What I say to people in my courses is they are going to have to alter their thinking as to everything they are presented visually going forward, because they are not going to be able to trust what they see,” Tyson says. “Critical thinking has to change radically. Hollywood has had its special effects departments, but this technology democratises those abilities.”

Read the full report here: