Vale VPN

How deepfakes are altering our memories

Some mind-blowing photos have recently been wreaking havoc on the internet: Vladimir Putin kneeling to kiss Xi Jinping's hand, and the pope wearing a puffy jacket. The problem? Neither event actually occurred. Both images are the product of a sophisticated AI technology called deepfakes.


These artificial intelligence-based fakes demonstrate how easily the line between fact and fiction can be blurred, with serious and far-reaching repercussions.


With a VPN free trial, you can protect your online activities and keep your personal information hidden.






The Mandela Effect: what is it?


This phrase bears the name of Nelson Mandela, the South African anti-apartheid activist and politician. Despite ample evidence to the contrary, many people held the misconception that he died in prison in the 1980s. In truth, he was freed in 1990, became president of South Africa, and died in 2013.

Since then, the Mandela Effect has been used to describe a wide range of false memories of facts or events, including the spelling of brand names, song lyrics, the plots of films or television series, and the details of historical events.


One well-known instance of the Mandela Effect involves Sinbad, a popular comedian of the 1990s. Many people remember a film called Shazaam (not to be confused with the 2019 superhero film Shazam!) in which Sinbad played a genie who helps two young children. However, no such film was ever produced. The closest thing to it is Kazaam, in which Shaquille O'Neal starred as a genie.


Despite the lack of any proof that Shazaam exists, many people continue to believe they watched it, recalling specifics of the plot, the characters, and even the movie poster. Even though Sinbad himself has said he never played a genie, the idea of the film is so embedded in people's minds that they are absolutely convinced it is real.

Why does the Mandela Effect occur?

Although its exact cause is unknown, numerous theories have been proposed to explain the Mandela Effect. One holds that it is the product of false memories: people misremember events or facts because of incomplete knowledge, misinterpretation, or the power of suggestion. A more radical idea holds that it is the result of a glitch in the matrix or a parallel universe in which people experienced a different reality.


Psychologists suggest the Mandela Effect may be brought on by the way our brains are built: our memories can be distorted by things like what other people say or our own ingrained beliefs, a phenomenon known as cognitive bias. For instance, if many people on social media claim that something happened in a specific way, it can lead us to assume that it really did, even if it didn't.


The Mandela Effect is not a new phenomenon, and it largely affects pop culture, but the emergence of deepfakes means that false information can travel faster and more easily, and that more people may start remembering things that never happened.


This raises crucial questions: should we believe what we see online, is it acceptable to use AI to alter photos and videos, and to what extent should technology be allowed to shape our memories and beliefs?

The risk of memories created by deepfakes

Deepfakes use artificial intelligence (AI) to produce convincing videos and images of people saying and doing things they never actually said or did. The underlying technique is deep learning, a branch of machine learning in which artificial neural networks are trained on large amounts of data.
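
To make "training a neural network on a lot of data" concrete, here is a minimal sketch in PyTorch of the shared-encoder, per-person-decoder autoencoder often used to explain face-swap deepfakes. Everything in it (the layer sizes, the random tensors standing in for face photos) is an illustrative assumption, not a production pipeline.

import torch
import torch.nn as nn

IMG = 64 * 64 * 3  # flattened 64x64 RGB image (assumed size)

# One shared encoder learns a compact representation of any face;
# each person gets their own decoder to reconstruct their likeness.
encoder = nn.Sequential(nn.Linear(IMG, 512), nn.ReLU(), nn.Linear(512, 128))
decoder_a = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG))
decoder_b = nn.Sequential(nn.Linear(128, 512), nn.ReLU(), nn.Linear(512, IMG))

opt = torch.optim.Adam(
    [*encoder.parameters(), *decoder_a.parameters(), *decoder_b.parameters()],
    lr=1e-3,
)
loss_fn = nn.MSELoss()

# Random stand-ins for datasets of person A's and person B's face photos.
faces_a = torch.rand(32, IMG)
faces_b = torch.rand(32, IMG)

for step in range(100):
    # Each decoder learns to reconstruct its own person's faces
    # from the shared latent representation.
    loss = (loss_fn(decoder_a(encoder(faces_a)), faces_a)
            + loss_fn(decoder_b(encoder(faces_b)), faces_b))
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode one of person A's faces, then decode it with
# person B's decoder, yielding B's appearance with A's expression
# (in a model trained on real data, not these random tensors).
fake_b = decoder_b(encoder(faces_a[:1]))

The key design choice is the shared encoder: because both people pass through the same latent space, swapping decoders at inference time transfers one person's expression onto the other's face.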


Fewer than 15,000 deepfakes had been found online as of 2019. According to the World Economic Forum, that number is now in the millions, and the number of expertly made deepfakes is growing at a rate of 900% per year.


One of the most worrisome aspects of deepfakes is their potential for malicious use, such as fabricating news or propaganda, or impersonating someone for financial gain. The technology is also being used to produce deepfake pornography, which has raised concerns about abuse and the exploitation of individuals.


Because the technology is so realistic, deepfakes can also trick people into believing they have seen something that never actually happened, leading them to accept a false version of events as reality.


Generating false news reports


Deepfakes can be used to construct news stories so convincing that readers mistake them for the real thing, even though they are entirely false. Misleading reports on terrorist attacks or natural disasters, for example, have been fabricated solely to advance particular sociopolitical goals. With deepfakes, these stories can be made to look like legitimate news items, complete with convincing audio and video. For future generations, this can create a distorted picture of history and current events.
