#FactCheck - Virat Kohli's Ganesh Chaturthi Video Falsely Linked to Ram Mandir Inauguration
Executive Summary:
A video of Indian cricketer Virat Kohli attending a Ganesh Chaturthi celebration in September 2023 has been circulated with the false claim that it shows him at the Ram Mandir consecration ceremony in Ayodhya on January 22, 2024. The Hindi newspaper Dainik Bhaskar and the Gujarati newspaper Divya Bhaskar also carried the now-viral video in their respective editions on January 23, 2024, amplifying the false claim. After thorough investigation, it was found that the video was old and showed the cricketer attending a Ganesh Chaturthi celebration.
Claims:
Many social media posts, including those from news outlets such as Dainik Bhaskar and the Gujarati newspaper Divya Bhaskar, claim that the video shows Virat Kohli attending the Ram Mandir consecration ceremony in Ayodhya on January 22. Our investigation found that the video actually shows Virat Kohli attending a Ganesh Chaturthi celebration in September 2023.



The caption of the Dainik Bhaskar e-paper reads, “क्रिकेटर विराट कोहली भी नजर आए” (“Cricketer Virat Kohli was also seen”).
Fact Check:
The CyberPeace Research Team ran a reverse image search on frames from the video and found several earlier posts showing the same black outfit. Among them, a Bollywood entertainment Instagram page named Bollywood Society had shared the same video on 20 September 2023 with the caption, “Virat Kohli snapped for Ganapaati Darshan”.

Taking a cue from this, we ran keyword searches with the information at hand and found an article by the Free Press Journal reporting that Virat Kohli had visited the residence of Shiv Sena leader Rahul Kanal to seek the blessings of Lord Ganpati. The viral video, and the claim made by the news outlets, is therefore false and misleading.
Conclusion:
The viral videos and news reports recycle old footage of Virat Kohli attending a Ganesh Chaturthi celebration in 2023; they do not show the recent auspicious day of the Ram Mandir Pran Pratishtha. We also found no confirmation that Virat Kohli attended the ceremony in Ayodhya on January 22. Hence, we found this claim to be fake.
- Claim: Virat Kohli attending the Ram Mandir consecration ceremony in Ayodhya on January 22
- Claimed on: YouTube, X
- Fact Check: Fake

Executive Summary:
A viral video purportedly showing US President Joe Biden dozing off during a television interview is digitally manipulated. The original video is from a 2011 incident in which actor and singer Harry Belafonte appeared to fall asleep during a live satellite interview with KBAK-KBFX Eyewitness News. Thorough analysis of keyframes from the viral video reveals that Joe Biden's image was superimposed onto Harry Belafonte's video. This confirms that the viral video is manipulated and does not show an actual event involving President Biden.

Claims:
A video shows US President Joe Biden dozing off during a television interview while the anchor tries to wake him up.


Fact Check:
Upon receiving the posts, we watched the video, divided it into keyframes using the InVid tool, and reverse-searched one of the frames.
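The keyframe step that tools like InVid perform can be illustrated with a simple sketch. This is not InVid's actual algorithm; it is a minimal, assumed approach that keeps a frame whenever it differs noticeably from the last kept frame, with each frame modelled as a flat list of grayscale pixel values:

```python
def select_keyframes(frames, threshold=30.0):
    """Pick indices of frames that differ noticeably from the last kept frame.

    `frames` is a list of equal-length sequences of grayscale pixel values
    (0-255). A frame is kept when its mean absolute pixel difference from
    the most recently kept frame exceeds `threshold`.
    """
    if not frames:
        return []
    kept = [0]  # always keep the first frame
    for i in range(1, len(frames)):
        prev, cur = frames[kept[-1]], frames[i]
        diff = sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)
        if diff > threshold:
            kept.append(i)
    return kept

# A scene change between the 2nd and 3rd frame is detected:
frames = [[0, 0, 0], [2, 1, 0], [255, 255, 255], [250, 252, 251]]
print(select_keyframes(frames))  # -> [0, 2]
```

Each selected frame can then be fed to a reverse image search engine to locate earlier appearances of the footage.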
We found another video uploaded on October 18, 2011 by the official channel of KBAK-KBFX Eyewitness News. The title of the video reads, “Official Station Video: Is Harry Belafonte asleep during live TV interview?”

The footage matches the recent viral clip, and the TV anchor can be heard saying the same lines as in the viral video. Taking a cue from this, we also ran keyword searches for credible sources and found a news article by Yahoo Entertainment featuring the same video uploaded by KBAK-KBFX Eyewitness News.

Thorough investigation through reverse image search and keyword search reveals that the recent viral video of US President Joe Biden dozing off during a TV interview was digitally altered to misrepresent the context. The original video dates back to 2011, and the person in the TV interview was American singer and actor Harry Belafonte, not US President Joe Biden.
Hence, the claim made in the viral video is false and misleading.
Conclusion:
In conclusion, the viral video claiming to show US President Joe Biden dozing off during a television interview is digitally manipulated and inauthentic. The video is originally from a 2011 incident involving American singer and actor Harry Belafonte. It has been altered to falsely show US President Joe Biden. It is a reminder to verify the authenticity of online content before accepting or sharing it as truth.
- Claim: A viral video shows US President Joe Biden dozing off during a television interview while the anchor tries to wake him up.
- Claimed on: X (Formerly known as Twitter)
- Fact Check: Fake & Misleading
Introduction
The Digital Personal Data Protection (DPDP) Act, 2023 introduces a framework for the protection of personal data in India. A data fiduciary is the entity that determines the purpose and means of processing personal data, and small-scale industries also fall within the ambit of the term. Startups, small companies, and Micro, Small, and Medium Enterprises (MSMEs) that determine the purpose of processing personal data act in the capacity of a ‘data fiduciary’ and are therefore required to comply with the DPDP Act's provisions. The obligations set for data fiduciaries apply to them equally, though compliance can be challenging due to resource constraints and limited expertise in data protection.
Section 17(3) of the DPDP Act, 2023 empowers the Central Government to exempt startups from certain obligations under the Act, taking into account the volume and nature of personal data processed. The Act is the nation's first standalone law on data protection and privacy, and it sets forth strict rules on how data fiduciaries may collect and process personal data, with a focus on consent-based mechanisms and personal data protection. Small-scale industries have been given more time to comply with the DPDP Act, and detailed provisions are to be notified through further rulemaking, the ‘DPDP Rules’.
Obligations on Data Fiduciary under the DPDP Act, 2023
The DPDP Act focuses on processing digital personal data in a manner that recognizes both the right of individuals to protect their personal data and the need to process such personal data for lawful purposes and for matters connected therewith or incidental thereto. Hence, small-scale industries also need to comply with provisions aimed at protecting digital personal data.
The key requirements to be considered:
- Data Processing Principles: Process data lawfully, fairly, and transparently. Collect and process personal data only for specific, clear, and legitimate purposes, and only the data necessary for the stated purpose. Keep data accurate and up to date, do not retain it longer than necessary, and take appropriate security measures to protect it.
- Consent Management: Clear and informed consent should be obtained from individuals before collecting their personal data. Further, individuals have the option to withdraw their consent easily.
- Rights of Data Principals: Data principals (the individuals whose data is collected) have the right to information, the right to correction and erasure of data, the right to grievance redressal, and the right to nominate. Data fiduciaries need to put mechanisms in place to handle requests from data principals on these counts.
- Data Breach Notifications: Data fiduciaries are required to notify the Data Protection Board and the affected individuals in the event of a data breach.
- Appropriate Technical and Organisational Measures: A data fiduciary shall implement appropriate technical and organisational measures to ensure effective observance of the provisions of the Act and the rules made thereunder.
- Cross-border Data Transfers: Compliance with regulations on the transfer of personal data outside India must be ensured.
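As an illustration of the consent-management obligation above, here is a minimal sketch of a consent record a data fiduciary might keep. All names and fields here are hypothetical and are not prescribed by the Act or the DPDP Rules; the sketch only demonstrates purpose-limited processing and easy withdrawal of consent:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Minimal record of one consent given by a data principal (illustrative)."""
    principal_id: str
    purpose: str                 # the specific purpose consent was given for
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

    def withdraw(self) -> None:
        # The Act requires withdrawal to be as easy as giving consent.
        self.withdrawn_at = datetime.now(timezone.utc)

def may_process(record: ConsentRecord, purpose: str) -> bool:
    # Process only for the specific purpose consented to,
    # and only while consent has not been withdrawn.
    return record.active and record.purpose == purpose
```

In practice such a ledger would also need audit logging, secure storage, and retention limits; the point of the sketch is that purpose and withdrawal status are checked before every processing step.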
Challenges for Small Scale Industries for the DPDP Act Compliance
While small-scale industries have high ambitions for organisational growth, in the digital age they must also rely on online security measures and careful handling of personal data; with the DPDP Act in the picture, this becomes an obligation to consider and comply with. Small-scale industries, including MSMEs, may face challenges in fulfilling these obligations, but sound digital data protection practices will also strengthen their position in the market and grow customer trust in their business. Bringing in reforms aimed at better data governance is significant in today's digital era.
One of the major challenges for small-scale industries could be ensuring a skilled workforce that understands and educates internal stakeholders about the DPDP Act compliances. This could undoubtedly become an additional burden.
Further, limited financial and human resources can make implementing data protection, which is often complex for a layperson, difficult for a small-scale industry.
Cybersecurity, cyber awareness, and protection from cyber threats require a degree of expertise that small enterprises often lack. The decision to outsource such expertise is sometimes taken too late, and harm can occur in the interim.
Investment in the core business often covers only the basic technology needed to run the enterprise, not securing data or meeting compliance requirements. However, in a fast-moving digital world, all industries must make the effort to protect personal data and practise proper data governance.
Recommendations
To ensure proper and effective personal data handling as per the provisions of the Act, small companies and startups need to work on both backend and frontend practices and take adequate measures to comply. While such industries have been given more time to ensure compliance, the following suggestions can help them meet the new law's requirements.
Small companies can ensure compliance with the DPDP Act by implementing robust data protection policies, investing in employee training on data privacy, using age-verification mechanisms, and adopting privacy-by-design principles. They should conduct a gap analysis to identify where current practices fall short of DPDP Act requirements. Regular audits, secure data storage solutions, and transparent communication with users about data practices are also essential, as is the use of cost-effective tools and technologies for data protection and management.
Conclusion
Small-scale industries must take proactive steps to align with the DPDP Act, 2023 provisions. By understanding the requirements, leveraging external expertise, and adopting best practices, small-scale industries can ensure compliance and protect personal data effectively. In the long run, complying with the new law would lead to greater trust and better business for the enterprises, resulting in a larger revenue share for them.
References
- https://pib.gov.in/PressReleaseIframePage.aspx?PRID=1959161
- https://www.financialexpress.com/business/digital-transformation-dpdp-act-managing-data-protection-compliance-in-businesses-3305293/
- https://economictimes.indiatimes.com/tech/technology/big-tech-coalition-seeks-12-18-month-extension-to-comply-with-indias-dpdp-act/articleshow/104726843.cms?from=mdr

In the vast, uncharted territories of the digital world, a sinister phenomenon is proliferating at an alarming rate. It's a world where artificial intelligence (AI) and human vulnerability intertwine in a disturbing combination, creating a shadowy realm of non-consensual pornography. This is the world of deepfake pornography, a burgeoning industry that is as lucrative as it is unsettling.
According to a recent assessment, at least 100,000 deepfake porn videos are readily available on the internet, with hundreds, if not thousands, being uploaded daily. This staggering statistic prompts a chilling question: what is driving the creation of such a vast number of fakes? Is it merely for amusement, or is there a more sinister motive at play?
Recent Trends and Developments
An investigation by India Today’s Open-Source Intelligence (OSINT) team reveals that deepfake pornography is rapidly morphing into a thriving business. AI enthusiasts, creators, and experts are lending their expertise, investors are injecting money, and companies ranging from small financial firms to giants like Google, VISA, Mastercard, and PayPal are being misused in this dark trade. Synthetic porn has existed for years, but advances in AI and the increasing availability of technology have made it easier, and more profitable, to create and distribute non-consensual sexually explicit material. The 2023 State of Deepfake report by Home Security Heroes reveals a staggering 550% increase in the number of deepfakes compared to 2019.
What’s the Matter with Fakes?
But why should we be concerned about these fakes? The answer lies in the real-world harm they cause. India has already seen cases of extortion carried out by exploiting deepfake technology. An elderly man in UP’s Ghaziabad, for instance, was tricked into paying Rs 74,000 after receiving a deep fake video of a police officer. The situation could have been even more serious if the perpetrators had decided to create deepfake porn of the victim.
The danger is particularly severe for women. The 2023 State of Deepfake report estimates that at least 98 percent of all deepfakes are porn and 99 percent of their victims are women. A study by Harvard University refrained from using the term “pornography” for the creation, sharing, or threatened creation/sharing of sexually explicit images and videos of a person without their consent. “It is abuse and should be understood as such,” it states.
Based on interviews with victims of deepfake porn conducted last year, the study found that 63 percent of participants described experiences of “sexual deepfake abuse” and reported that their sexual deepfakes had been monetised online. It also found “sexual deepfake abuse to be particularly harmful because of the fluidity and co-occurrence of online offline experiences of abuse, resulting in endless reverberations of abuse in which every aspect of the victim’s life is permanently disrupted”.
Creating deepfake porn is disturbingly easy. There are largely two types of deepfakes: one featuring faces of humans and another featuring computer-generated hyper-realistic faces of non-existing people. The first category is particularly concerning and is created by superimposing faces of real people on existing pornographic images and videos—a task made simple and easy by AI tools.
During the investigation, platforms were found hosting deepfake porn of stars like Jennifer Lawrence, Emma Stone, Jennifer Aniston, Aishwarya Rai, and Rashmika Mandanna, as well as TV actors and influencers like Aanchal Khurana, Ahsaas Channa, Sonam Bajwa, and Anveshi Jain. It takes a few minutes and as little as Rs 40 for a user to create a high-quality fake porn video of 15 seconds on platforms like FakeApp and FaceSwap.
The Modus Operandi
These platforms brazenly flaunt their business association and hide behind frivolous declarations such as: the content is “meant solely for entertainment” and “not intended to harm or humiliate anyone”. However, the irony of these disclaimers is not lost on anyone, especially when they host thousands of non-consensual deepfake pornography.
As fake porn content and its consumers surge, deepfake porn sites are rushing to forge collaborations with generative AI service providers and have integrated their interfaces for enhanced interoperability. The promise and potential of making quick bucks have given birth to step-by-step guides, video tutorials, and websites that offer tools and programs, recommendations, and ratings.
Nearly 90 per cent of all deepfake porn is hosted by dedicated platforms that charge for long-duration premium fake content and for creating porn of whoever a user wants, taking requests even for celebrities. To encourage creators further, they enable them to monetise their content.
One such website, Civitai, has a system in place that pays “rewards” to creators of AI models that generate “images of real people”, including ordinary people. It also enables users to post AI images, prompts, model data, and LoRA (low-rank adaptation of large language models) files used in generating the images. Model data designed for adult content is gaining great popularity on the platform, and creators are not only targeting celebrities; common people are equally susceptible.
Access to premium fake porn, like any other content, requires payment. But how can a gateway process payment for sexual content that lacks consent? Financial institutions and banks do not appear to be paying much attention to this legal question: during the investigation, many such websites were found accepting payments through services like VISA, Mastercard, and Stripe.
Those who have failed to register/partner with these fintech giants have found a way out. While some direct users to third-party sites, others use personal PayPal accounts to manually collect money in the personal accounts of their employees/stakeholders, which potentially violates the platform's terms of use that ban the sale of “sexually oriented digital goods or content delivered through a digital medium.”
Among others, the MakeNude.ai web app – which lets users “view any girl without clothing” in “just a single click” – has an interesting method of circumventing restrictions around the sale of non-consensual pornography. The platform has partnered with Ukraine-based Monobank and Dublin’s BetaTransfer Kassa which operates in “high-risk markets”.
BetaTransfer Kassa admits to serving “clients who have already contacted payment aggregators and received a refusal to accept payments, or aggregators stopped payments altogether after the resource was approved or completely freeze your funds”. To make payment processing easy, MakeNude.ai seems to be exploiting the donation ‘jar’ facility of Monobank, which is often used by people to donate money to Ukraine to support it in the war against Russia.
The Indian Scenario
India is currently on its way to designing dedicated legislation to address issues arising from deepfakes, though existing general laws requiring platforms to remove offensive content also apply to deepfake porn. However, prosecution and conviction of offenders is extremely difficult for law enforcement agencies, as this is a borderless crime that can involve several countries in the process.
A victim can register a police complaint under Sections 66E and 66D of the IT Act, 2000. The recently enacted Digital Personal Data Protection Act, 2023 aims to protect the digital personal data of users. The Union Government has also issued an advisory to social media intermediaries to identify misinformation and deepfakes. The comprehensive law promised by Union IT minister Ashwini Vaishnav is expected to address these challenges.
Conclusion
In the end, the unsettling dance of AI and human vulnerability continues in the dark web of deepfake pornography. It's a dance that is as disturbing as it is fascinating, a dance that raises questions about the ethical use of technology, the protection of individual rights, and the responsibility of financial institutions. It's a dance that we must all be aware of, for it is a dance that affects us all.
References
- https://www.indiatoday.in/india/story/deepfake-porn-artificial-intelligence-women-fake-photos-2471855-2023-12-04
- https://www.hindustantimes.com/opinion/the-legal-net-to-trap-peddlers-of-deepfakes-101701520933515.html
- https://indianexpress.com/article/opinion/columns/with-deepfakes-getting-better-and-more-alarming-seeing-is-no-longer-believing/