Nora Fatehi Disputes Lookalike in Brand Campaign: ‘Shocking, This is Not Me’

In a recent turn of events, Bollywood sensation Nora Fatehi has taken to social media to express her astonishment and disappointment after discovering that a brand used her alleged lookalike in a promotional campaign. The incident has stirred quite a buzz in the industry, prompting discussions about the ethics of celebrity endorsements and the misuse of a star’s personal identity.

Following Rashmika Mandanna, Katrina Kaif, Alia Bhatt, Kajol, and a few other celebrities, Nora Fatehi seems to have become the latest victim of what appears to be a deepfake video or, at the very least, a video featuring a convincing lookalike. Social media is abuzz with a circulating clip of a woman bearing a striking resemblance to Nora, promoting an end-of-season sale.

Nora’s Reaction on Social Media:

Taking to her Instagram Stories, Nora expressed her concern and set the record straight, emphasizing that the woman featured in the video is not her. Sharing the clip, in which the lookalike promotes a fashion brand while imitating her mannerisms, appearance, and voice, Nora wrote, “Shocked!! This is not me!” In bold letters, she labeled the video ‘fake’ for added clarity. As of now, the brand has not responded to her claims. Fans and followers rallied behind Nora, emphasizing the importance of respecting a celebrity’s identity.

Rashmika Deepfake Video Creator in Custody:

The revelation of Nora Fatehi’s video incident closely follows the arrest of Eemani Naveen, a student from Andhra Pradesh, by the Delhi Police on Saturday. According to the police, Naveen created the deepfake video of Rashmika in an attempt to boost followers on his fan page.

The case was officially filed on November 10 last year after the video gained widespread attention. The video was originally shared on a fan page on October 13, and Naveen deleted it from his devices and account upon realizing the controversy it had stirred. However, law enforcement successfully traced the video back to him.

What Is a Deepfake Video?

A deepfake video is a form of manipulated content created using artificial intelligence (AI). Typically, these videos appropriate a person’s face or voice, integrating them into existing footage or attributing them to another individual.

Unless scrutinized carefully, deepfakes can create the illusion that the person whose features were manipulated made statements or participated in actions that never actually took place. The entertainment industry, and female actors in particular, has unfortunately become a frequent target of such deceptive activity.

Celebrities often play a pivotal role in shaping brand perceptions, and their endorsements can significantly impact consumer choices. The Nora Fatehi case serves as a reminder that the responsibility lies not only with celebrities but also with brands to maintain transparency and authenticity in their marketing strategies.

As Nora Fatehi takes a stand against the unauthorized use of her likeness in a brand campaign, the incident sheds light on the complexities of the celebrity endorsement landscape. It prompts a larger conversation about the importance of respecting a celebrity’s identity and the ethical considerations that brands must uphold in their pursuit of star-studded promotions.

Published by HOLR Magazine.