Hautetalk: Sharing, Consuming Deepfake Videos Is A Bigger Threat Than Their Existence. Can We Stop Circulating Them?

This is concerning!

The Internet can be both a boon and a bane in our lives. With technology creating as many problems as it solves, every day a new troubling trend surfaces online. The latest on the list is deepfakes. Morphed videos of women making obscene gestures have become the newest trend, with manipulated images of female celebrities being used as a ladder to popularity.

The problem of morphed celebrity pictures is not new; it is a long-lived phenomenon on the internet, where celebrity images are easily found and misused, largely owing to porn sites that bridge the gap between commoners and celebrities through a hyper-fantasised world of pornography. But in the current scenario, the question under the spotlight is what makes these videos go viral. Clips that once sat buried among a million thumbnails on a porn site are now openly available through a hashtag on Twitter and Instagram, accessible to millions of new users who question them but mostly enjoy watching. Try typing 'deepfake' into your search bar, and you will be welcomed with at least ten to twelve pages devoted to circulating them. From compilations and dance numbers to transitions, there is content of every kind, readily served on a platter.

This month, we saw a sudden surge in such videos, with manipulated pictures of Rashmika Mandanna, Katrina Kaif, Kajol and other actresses making it into fake clips. And today, it is actress Alia Bhatt who has fallen victim to the trend. Exploiting artificial intelligence and its potential for misuse, these deepfake videos invade the privacy of female celebrities and reduce their stardom to view counts. This spike in numbers points to a bigger problem brewing on the internet. While we battle the question of what is real and what is not, deepfakes have pushed the government to put strict laws in place.

Moreover, their popularity says something worrying about the content palette of internet dwellers, who knowingly or unknowingly consume such material. The attention-grabbing nature of these fake videos keeps them decorating the trending tabs of most Twitter pages, seeking your click. What bothers us most is the threat to one's privacy. Imagine the turmoil such videos could create if your images were stitched onto fabricated obscene clips made for public consumption. The real danger is the free availability of tools that let even a technologically inexperienced user create a professional-looking fake. That is the frightening reality of deepfakes. Their being readily available and shared across social media points to a crisis knocking on our doors, something Prime Minister Narendra Modi rightly pointed out during one of his Mann Ki Baat addresses.


How To Fight Them?

With MeitY and the government framing rules and regulations to curb deepfakes, stricter laws promise a change. Under the new IT rules, the government is strengthening the law to prevent people from posting misleading and manipulative content on social media for mass consumption. From fines to FIRs, the system is designed to catch hold of the culprits.

Besides the makers of these videos, consumers too stand in a difficult spot and bear equal responsibility for preventing deepfakes from travelling across the internet.


Jasveen Kaur Sawhney

Jasveen Kaur is a fashion writer and pyjama hoarder who loves watching interviews of all kinds and checking her Pinterest mood board every hour!
