In this era of fake news, it’s almost a given that you’ve come across videos like Jon Snow apologizing for the rather underwhelming finale of Game of Thrones, Mark Zuckerberg boasting of owning people’s stolen data, and Steve Buscemi attending the Golden Globes in what people remember as Tilda Swinton’s gown. These videos are deepfakes, the 21st-century equivalent of photoshopped pictures: fake footage of events that never happened, produced with artificial intelligence. Most deepfakes have become tools to embarrass personalities and ruin their reputations, with some proving effective at fooling unsuspecting viewers and, in millennial-speak, “canceling” their targets.
Given how gullible we have all become to these fake videos, we need to inform ourselves about them unless we are content to be unresisting victims. How they are made, how to tell real footage from a deepfake, what the consequences are, and what solutions exist are all things we need to know. After all, there will come a time when the technology becomes common and accessible to just about everyone. When that time comes, deepfakes will victimize not only celebrities and world-renowned personalities but ordinary citizens like you and me. You definitely wouldn’t want to be left scratching your head when you fall target to one.
What do you need to make a deepfake?
At the moment, a convincing deepfake takes a lot to produce. A standard laptop or desktop PC won’t do: you need a high-end machine with professional-grade graphics cards and plenty of storage. A single deepfake video may need at least 40,000 high-definition pictures of the person you want to put in it, and HD photos are such large files that cloud storage is often preferred. Fast computing power is also a must, or you will spend a month producing a video that’s just two minutes long. You also need tools and apps, which are priced according to the quality of their output.
That said, a basic deepfake is not hard to make, thanks to free apps and programs that put the technique within reach of ordinary people. Cost-free source code and machine-learning algorithms are abundant online, and the only things you need to make yourself a video are time and source material.
Can you tell the difference?
Prior to 2019, when technology experts were asked how to detect a deepfake, they would readily answer that it’s all in the eyes. Eyes in deepfake videos don’t move naturally, much less blink, because there simply aren’t many photos of a celebrity blinking or with their eyes closed for the model to learn from. But deepfake technology adapts fast, and no sooner was the blinking weakness publicized than a fix emerged.
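The blinking heuristic those experts relied on can be illustrated with the eye aspect ratio (EAR), a standard measure from facial-landmark analysis: the ratio of an eye’s vertical openings to its width collapses toward zero when the eye closes, so a video whose EAR never dips is suspicious. Here is a minimal Python sketch; the landmark coordinates below are made up purely for illustration, not taken from any real detector.

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) landmarks.

    `eye` lists six points around the eye contour: indices 0 and 3 are
    the horizontal corners; pairs (1, 5) and (2, 4) span the eye
    vertically. EAR drops sharply toward 0 when the eye closes.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

# Hypothetical landmark sets: an open eye and a nearly shut one.
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]

print(eye_aspect_ratio(open_eye))    # roughly 0.67
print(eye_aspect_ratio(closed_eye))  # roughly 0.07
```

A blink detector of this kind flags a “closed” frame whenever EAR falls below a tuned threshold (around 0.2 is common); a real system would get the landmarks from a face-tracking library rather than hand-written coordinates.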
For now, it comes down to being vigilant. Organizations can help by putting employees through a thorough cybersecurity awareness programme, updated frequently to cover the latest threats and how to react to them.
Will deepfakes destroy the world?
Deepfakes are meant to embarrass, intimidate, and destabilize individuals. The best ones might drive beloved celebrities into retirement or out of the limelight. Will they affect entire countries and societies? Not really, especially since many countries have intelligence agencies equipped with state-of-the-art security imaging systems. But deepfakes can certainly sway financial markets and voters, and stir up religious tension and other areas not directly under government control.
However, the ability to generate realistic simulations using artificial intelligence will, on the whole, be only a positive for humanity.
Is there a solution?
Artificial intelligence, which produces deepfakes, is also what can stop them. AI has already proven effective at detecting deepfakes of celebrities, thanks to the millions of hours of genuine footage available for comparison. Technology firms are now researching detection systems that can spot deepfakes whether or not they feature a known personality. There is also talk of keeping an online ledger on a blockchain to house original copies of videos, audio, and photos, so that any file can be cross-checked for manipulation.
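The ledger idea boils down to content fingerprinting: register a cryptographic hash of the original file at publication time, then later recompute the hash of any copy and compare. A minimal Python sketch, using an ordinary dictionary as a stand-in for the hypothetical blockchain ledger:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """SHA-256 digest of a media file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical ledger: maps a file name to the digest registered
# when the original was published. A real system would store this
# on a tamper-resistant blockchain rather than in memory.
ledger = {}

original = b"...raw bytes of the original interview video..."
ledger["interview.mp4"] = fingerprint(original)

def is_untampered(name: str, data: bytes) -> bool:
    """True only if the file matches its registered fingerprint."""
    return ledger.get(name) == fingerprint(data)

print(is_untampered("interview.mp4", original))  # True
tampered = original + b"one altered frame"
print(is_untampered("interview.mp4", tampered))  # False
```

Even a one-byte change to the file produces a completely different digest, so any edit after registration is detectable; the hard part in practice is getting originals registered before fakes circulate.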
In fact, the detection methods are similar enough to the methods used to make deepfakes that the research on detection inadvertently provides a roadmap for improving the fakes. Worse, the deepfake detection models themselves can be used directly in the deepfake generation process to improve their output.
On the other hand, if you’re certain you’ve fallen victim to your laptop’s struggling hard drive, we’ll help you recover. If you would like to know more about hard drive errors and data recovery, read here: https://www.harddrivefailurerecovery.net/fixing-hard-drive-errors.
Rolling in the Deep(fake) was first published on www.harddrivefailurerecovery.net