The spread of altered photos, videos, and audio, known as deepfakes, is casting growing doubt on the veracity of the content we consume online, and not only there. Until recently, only politicians and public figures had to worry about the deepfake phenomenon (and the serious misinformation it spread to the general public); that is no longer the case, because the technology has become far more widespread.

Altered media files are circulating faster and faster, mainly because they are becoming easier and easier to manipulate. We like to believe this is a distant technology, too cutting-edge to concern us personally, and until a few years ago it was. Today, however, anyone can edit a photo with a single tap, without even relying on an external application: owning a smartphone is enough.

The FBI and Europol: deepfakes are no longer just about politics

Il Sole 24 Ore sheds light on this issue by reporting alarming data shared by the FBI. Deepfakes are reaching companies through online job applications and interviews and, more worryingly, are fueling far more serious crimes such as fraud, pornography, and child sexual abuse material. We highly recommend reading the article to learn more about this topic.

An era is thus foreseen in which it will no longer be possible to distinguish true from false. Philosophers and thinkers increasingly agree that we are living in a post-truth world, where everything can be falsified, disputed, or read in reverse. On this point, the article above quotes the experienced AI researcher Alex Champanard:

“Everyone should know how quickly things can be faked with this technology nowadays: the main problem is that humanity may enter an era in which it is no longer possible to determine whether the content of a media file corresponds to the truth.”

Alex Champanard, experienced AI researcher

New technologies vs. even newer technologies

However, it is precisely in technology that we can find the solution. Efforts to counter the deepfake phenomenon usually focus on developing tools that can detect what is real and what is fake.

The result, however, is an unequal struggle between similar technologies, artificial intelligence against artificial intelligence, which manipulated content keeps winning. For these and other reasons, we asked ourselves how else to counter the problem: prevention is known to be better than cure, so why shouldn't that hold for technology as well?

Cybersecurity solutions at hand

We are talking about a solution that is practically already in our hands: the smartphone, combined with an acquisition method that complies with international cybersecurity guidelines. It sounds difficult, but it is not. This method makes the contents of our files authentic and, more importantly, impossible to modify, so that they can be recognized as valid and unchallengeable. The method is called TrueScreen.

How does it work?
It is an app for smartphones and tablets that makes our media files indisputable, guaranteeing their authenticity and immutability. It protects not only the content but also the person, who often has to rely on a photo as proof of facts or events.
It is a sophisticated yet accessible technology, designed for everyone and therefore simple and intuitive. The interface is the one we are already used to when taking a photo or recording a video or audio clip.
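TrueScreen has not published its internals, so purely as an illustration of the general principle behind tamper-evident acquisition, the sketch below fingerprints a media file at capture time with a keyed hash (HMAC-SHA256, using Python's standard library), so that any later modification to the bytes is detectable. The key name and functions are hypothetical, not TrueScreen's API.

```python
import hashlib
import hmac

# Illustrative only: TrueScreen's actual method is not public. This shows
# the general idea of tamper evidence: fingerprint the file when it is
# captured, then detect any later change to its bytes.

SECRET_KEY = b"device-provisioned-secret"  # hypothetical signing key


def seal(media_bytes: bytes) -> str:
    """Produce a keyed fingerprint (HMAC-SHA256) of the captured file."""
    return hmac.new(SECRET_KEY, media_bytes, hashlib.sha256).hexdigest()


def verify(media_bytes: bytes, fingerprint: str) -> bool:
    """Check that the file still matches the fingerprint taken at capture."""
    return hmac.compare_digest(seal(media_bytes), fingerprint)


original = b"...raw photo bytes..."
tag = seal(original)

assert verify(original, tag)              # the untouched file passes
assert not verify(original + b"x", tag)   # any edit is detected
```

Because the fingerprint is keyed, an attacker who alters the file cannot simply recompute a matching tag without the key; a real system would also anchor the key in secure hardware and timestamp the acquisition.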

Discover our solutions

TrueScreen provides companies, professionals and individuals with an ecosystem of services that ensures indisputable data.

Discover our forensic acquisition services for multimedia files, electronic signature solutions, and legally valid email certification.
