#FactCheck - Deepfake Alert: Virat Kohli's Alleged Betting App Endorsement Exposed
Executive Summary
A viral video allegedly featuring cricketer Virat Kohli endorsing a betting app named ‘Aviator’ is being shared widely across social media platforms. The CyberPeace Research Team’s investigation revealed that the video was made using deepfake technology. We found several anomalies in the viral video that indicate it was created using synthetic media, and no genuine celebrity endorsement for the app exists. We have also previously debunked similar deepfake videos of cricketer Virat Kohli. The spread of such content underscores the need for social media platforms to implement robust measures to combat online scams and misinformation.

Claims:
The claim is that a video circulating on social media depicts Indian cricketer Virat Kohli endorsing a betting app called "Aviator." The video features the Indian news channel India TV, where a journalist appears to endorse the betting app, followed by Virat Kohli describing his experience with it.

Fact Check:
Upon receiving the video, we watched it thoroughly and found several anomalies typical of deepfake videos: the journalist's lip movements are out of sync, and on careful viewing the lips do not match the audio. The same is true when Virat Kohli speaks in the video.

We then divided the video into keyframes and reverse-searched one of the frames from Kohli’s segment. We found a similar video, uploaded on his verified Instagram handle, in which Virat Kohli wears the same brown jacket; it is an ad promotion in collaboration with American Tourister.

After going through the entire video, it is evident that Virat Kohli is not endorsing any betting app; rather, he is speaking about an ad promotion in collaboration with American Tourister.
We then ran keyword searches to see whether India TV had published any news as claimed in the viral video, but found no credible source.
Therefore, given the major anomalies in the video and our further analysis, we conclude that the video was created using synthetic media and is fake and misleading.
Conclusion:
The video of Virat Kohli promoting a betting app is fake and does not actually feature the celebrity endorsing the app. This brings up many concerns regarding how Artificial Intelligence is being used for fraudulent activities. Social media platforms need to take action against the spread of fake videos like these.
Claim: Video surfacing on social media shows Indian cricket star Virat Kohli promoting a betting application known as "Aviator."
Claimed on: Facebook
Fact Check: Fake & Misleading

Introduction
A policy, no matter how artfully conceived, is like a timeless idiom: its truth self-evident, its purpose undeniable, standing in silent witness before those it vows to protect, yet trapped in the stillness of inaction, where every moment of delay erodes the very justice it was meant to serve. This is the case with the Digital Personal Data Protection Act, 2023, which promises to resolve the issues surrounding data protection through a framework at par with the GDPR and global best practices. While debates on its substantive efficacy are inevitable, its execution has emerged as a site of acute contention. The roll-out and the decision-making around it have been making headlines since late July on various fronts. The government is being questioned by industry stakeholders, the media and independent analysts on several grounds, be it “slow policy execution”, “centralisation of power” or “arbitrary amendments”. The Act is now entrenched in a seemingly never-ending dilemma of competing interests.
The amendment to the Right to Information Act (RTI), 2005, effected by Section 44(3) of the DPDP Act, has become a focal point of debate. Some view this amendment as weakening the hard-won transparency architecture of Indian democracy by replacing the “public interest override” in Section 8(1)(j) of the RTI Act with an absolute exemption for personal information.
The Lag Ledger: Tracking the Delays in DPDP Enforcement
As per a news report of July 28, 2025, the Parliamentary Standing Committee on Information and Communications Technology has expressed its concern over the delayed implementation and has urged the Ministry of Electronics and Information Technology (MeitY) to ensure that data privacy is adequately protected in the nation. In the report submitted to the Lok Sabha on July 24, the committee reviewed the government’s response to its previous recommendations and concluded that MeitY had only been able to hold nine consultations and twenty awareness workshops about the Draft DPDP Rules, 2025. In addition, four brainstorming sessions with academic specialists were conducted to examine the needs for research and development; the ministry acknowledges that this is a specialised field that urgently needs industry involvement. Another news report, dated 30 July 2025, described a day-long consultation in which representatives from civil society groups, campaigns, social movements, senior lawyers, retired judges, journalists, and lawmakers discussed the contentious provisions and chilling effects of the Draft Rules notified in January this year. The organisers said in a press statement that the DPDP Act may negatively impact the freedom of the press, people’s right to information, and the activists, journalists, attorneys, political parties, groups and organisations “who collect, analyse, and disseminate critical information as they become ‘data fiduciaries’ under the law.”
The DPDP Act has thus been caught up in an uncomfortable paradox: praised as a significant legislative achievement for India’s digital future, but caught in a transitional phase between enactment and enforcement, where every day not only postpones protection but also feeds worries about the dwindling amount of room for accountability and transparency.
The Muzzling Effect: Diluting Whistleblower Protections
The DPDP framework raises a number of subtle but significant issues, one of which is the possibility that it would weaken safeguards for whistleblowers. Critics argue that, because the Act expands the definition of “personal data” and places strict compliance requirements on “data fiduciaries”, it risks ensnaring journalists, activists, and public interest actors who handle sensitive material while exposing wrongdoing. One of the most important checks on state overreach may be silenced if those who speak truth to power face legal retaliation in the absence of clear exclusions or robust public-interest protections.
Noted lawyer Prashant Bhushan has criticised the law for failing to protect whistleblowers, warning that “If someone exposes corruption and names officials, they could now be prosecuted for violating the DPDP Act.”
Consent Management under the DPDP Act
In June 2025, the National e-Governance Division (NeGD) under MeitY released a Business Requirement Document (BRD) for developing consent management systems under the DPDP Act, 2023. The document supports the idea of “Consent Manager”, which acts as a single point of contact between Data Principals and Data Fiduciaries. This idea is fundamental to the Act, which is now being operationalised with the help of MeitY’s “Code for Consent: The DPDP Innovation Challenge.” The government has established a collaborative ecosystem to construct consent management systems (CMS) that can serve as a single, standardised interface between Data Principals and Data Fiduciaries by choosing six distinct entities, such as Jio Platforms, IDfy, and Zoop. Such a framework could enable people to have meaningful control over their personal data, lessen consent fatigue, and move India’s consent architecture closer to international standards if it is implemented precisely and transparently.
The importance of this development is not in question; however, several concerns associated with it must be considered. A centralised consent management system, although efficient, may become a single point of failure, vulnerable both to political overreach and to cybersecurity flaws. Concerns have been raised over the concentration of power over the framing, seeking, and recording of consent when large corporate entities like Jio are chosen as key innovators. Critics contend that organisations that generate revenue from user data should not be given the responsibility of designing the gatekeeping systems. Furthermore, in the absence of strong safeguards, transparency mechanisms and independent oversight, the CMS could create opaque channels for data access, compromising user autonomy and whistleblower protections.
Conclusion
Despite being hailed as a turning point in India’s digital governance, the DPDP Act is still stuck in a delayed and uneven transition from promise to reality. Its goals are indisputable, but so are the conundrums it poses for accountability, openness, and civil liberties. Every delay increases public mistrust, and every unresolved safeguard deepens it. The true test of a policy intended to safeguard the digital rights of millions lies not in how it was drafted, but in the integrity, pace, and transparency with which it is implemented. In the digital age, the true cost of delay is measured not in time, but in trust. CyberPeace calls for transparent, inclusive, and timely execution that balances innovation with the protection of digital rights.
References
- https://www.storyboard18.com/how-it-works/parliamentary-committee-raises-concern-with-meity-over-dpdp-act-implementation-lag-77105.htm
- https://thewire.in/law/excessive-centralisation-of-power-lawyers-activists-journalists-mps-express-fear-on-dpdp-act
- https://www.medianama.com/2025/08/223-jio-idfy-meity-consent-management-systems-dpdpa/
- https://www.downtoearth.org.in/governance/centre-refuses-to-amend-dpdp-act-to-protect-journalists-whistleblowers-and-rti-activists

Introduction
Web applications are essential in various sectors, including online shopping, social networks, banking, and healthcare systems. However, they are also exposed to numerous security threats, including Cross-Site Scripting (XSS), a client-side code injection vulnerability. XSS attacks exploit the trust relationship between users and websites, allowing attackers to alter web content, steal private information, hijack sessions, and take full control of user accounts without breaking into the server itself. This vulnerability class features in the OWASP Top 10 Web Application Security Risks.
What is Cross-Site Scripting (XSS)?
An XSS attack occurs when an attacker injects client-side scripts into web pages viewed by other users. When users visit the affected pages, their browsers naively execute the inserted scripts. The exploit takes advantage of web applications that allow users to submit content without properly sanitising inputs or encoding outputs. These scripts can cause a wide range of damage, including but not limited to stealing session cookies for session hijacking, redirecting users to malicious sites, logging keystrokes to capture credentials, and altering the DOM to display fake or phishing content.
How Does XSS Work?
- Injection: A malicious user submits code through a website input, like a comment or form.
- Execution: The submitted code runs automatically in the browsers of other users who view the page.
- Exploitation: The attacker can steal session information, capture credentials, redirect users, or modify the page content.
XSS vulnerabilities fundamentally stem from applications:
- Accepting untrusted input from users.
- Embedding user-supplied strings in web pages without any sanitisation or encoding.
- Not enforcing security policies such as a Content Security Policy (CSP).
With such vulnerabilities, attackers can inject malicious payloads like: `<script>alert('XSS');</script>`
This code might seem simple, but its execution shows that an attacker can:
- Copy session tokens through hidden HTTP requests.
- Load attacker scripts from attacker-controlled domains.
- Change the DOM structure to show fake login forms for phishing.
Types of XSS Attacks: XSS attacks occur in three main variations:
- Stored XSS: This attack occurs when an attacker injects a malicious payload into persistent storage, such as a database record or a message-board post. The script then runs whenever a user visits the affected page.
- Reflected XSS: In this attack, the payload travels in a parameter of the URL and is echoed back in the server's response. It typically relies on social engineering: the attacker tricks people into clicking a specially crafted link, for example: `https://example.com/search?q=<script>alert('XSS')</script>`
- DOM-Based XSS: Unlike the other variants, this technique requires no server-side involvement. It targets client-side JavaScript sinks such as `document.write` and `innerHTML`, which modify the page (the DOM, or Document Object Model) without any safety checks. If a DOM source such as the URL fragment is fed a malicious string, it is executed directly in the browser.
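As a minimal sketch of a DOM-based flaw, consider a hypothetical page that writes whatever follows the `#` in the URL into the document without sanitisation:

```html
<!-- Hypothetical vulnerable page: the URL fragment is written into
     the page via innerHTML. A link such as
     page.html#<img src=x onerror=alert('XSS')>
     would trigger the payload. -->
<div id="greeting"></div>
<script>
  // Vulnerable: location.hash is fully attacker-controllable.
  document.getElementById('greeting').innerHTML =
      'Hello, ' + decodeURIComponent(location.hash.slice(1));
</script>
```

Modern browsers do not execute bare `<script>` tags inserted via `innerHTML`, but event-handler payloads like the `onerror` attribute above still fire, which is why `innerHTML` with untrusted data remains dangerous.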
What Makes XSS a Threat?
A Cross-Site Scripting flaw is not just an entry point; it is a primary attack vector that can lead to significant damage, including the following:
- Session Hijacking. Scripts steal session cookies, which are then used to impersonate authorised users.
- Theft of Credentials. Users’ usernames and passwords are captured by injected keystroke loggers.
- Phishing. Users are shown deceptive login forms designed to capture sensitive details.
- Website Vandalism. Modified website content damages the brand’s reputation.
- Monetary and Legal Consequences. Data breaches can incur penalties and fines under regulations such as the GDPR and the DPDP Act.
Incidents in the Real World
In 2021, a stored XSS attack hit the well-known e-commerce platform eBay through its product review system. The malicious JavaScript code triggered every time a customer accessed an infected product page, causing serious problems, including account takeovers, unauthorised purchases, and damage to the company’s reputation. This example underscores the fact that even reputed platforms can be targeted by XSS attacks.
How to Prevent XSS?
Addressing XSS vulnerabilities demands attention to detail and coordinated efforts across functions, as illustrated in the steps below:
Input Validation and Output Encoding:
- Ensure input validation is in place on both the client and the server.
- Perform output encoding appropriate to the context:
  - HTML: encode `<`, `>`, and `&`.
  - JavaScript: escape quotes and slashes.
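As a minimal sketch of HTML-context output encoding, the helper below (an illustrative function, not taken from any particular library) encodes the five characters that can break out of an HTML text node or attribute value:

```javascript
// Minimal HTML-context encoder. '&' must be replaced first so that
// the entities produced by later replacements are not double-encoded.
function escapeHtml(untrusted) {
  return String(untrusted)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#x27;');
}

// The classic test payload is rendered as inert text:
console.log(escapeHtml("<script>alert('XSS');</script>"));
// → &lt;script&gt;alert(&#x27;XSS&#x27;);&lt;/script&gt;
```

Production code would normally use the encoding functions built into the chosen template engine or framework rather than a hand-rolled helper, since real encoders must also handle attribute, JavaScript, URL, and CSS contexts separately.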
Content Security Policy (CSP): CSP allows scripts to be executed only from verified sources, which greatly reduces the odds of harmful scripts running on your website. For example, the response header could look like this: `Content-Security-Policy: script-src 'self';`
Avoid Unsafe APIs: Avoid the use of `document.write()`, `innerHTML`, and `eval()`, and instead use:
- `textContent` for inserting text.
- `createElement()` and other DOM creation methods for structured content.
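As an illustrative sketch (assuming a page with a `#comments` container; the helper name is hypothetical), the same user content can be rendered safely with DOM creation APIs instead of `innerHTML`:

```html
<div id="comments"></div>
<script>
  // Hypothetical helper: renders a user comment without ever parsing
  // it as HTML, so any script payload stays inert text.
  function addComment(untrustedText) {
    var p = document.createElement('p');
    p.textContent = untrustedText;  // treated as plain text, not markup
    document.getElementById('comments').appendChild(p);
  }

  // The payload is displayed literally instead of executing.
  // ("<\/script>" is escaped so it does not close this script element.)
  addComment("<script>alert('XSS');<\/script>");
</script>
```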
Secure Cookies: Apply the `HttpOnly` and `Secure` cookie flags so that session cookies cannot be read by JavaScript and are only transmitted over HTTPS.
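A hardened session cookie set by the server might look like the following (the cookie name and value are placeholders; `SameSite` is an additional, commonly recommended flag that also mitigates cross-site request forgery):

```
Set-Cookie: sessionid=abc123; HttpOnly; Secure; SameSite=Strict; Path=/
```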
Framework Protections: Use the protective features in frameworks such as:
- React, which escapes data embedded in JSX automatically.
- Angular, which uses context-aware sanitisation.
Periodic Security Assessment:
- Use DAST tools to test the security posture of an application.
- Perform thorough penetration testing and security-oriented code reviews.
Best Practices for Developers: Adopt a Secure Development Lifecycle (SDLC) that integrates XSS prevention at every stage.
- Educate developers on OWASP secure coding guidelines.
- Automate scanning for vulnerabilities in CI/CD pipelines.
Conclusion:
To reduce the potential danger of XSS, both developers and companies must be diligent in their security efforts, from deploying Content Security Policies (CSP) to validating user input. Web applications can shield users and the company from the subtle but long-lasting threat of Cross-Site Scripting if security controls are implemented during the development stage and regular vulnerability scans are conducted.
References
- https://owasp.org/www-community/attacks/xss/
- https://www.paloaltonetworks.com/cyberpedia/xss-cross-site-scripting
- https://developer.mozilla.org/en-US/docs/Glossary/Cross-site_scripting
- https://www.cloudflare.com/learning/security/threats/cross-site-scripting/

Introduction
Assisted Reproductive Technology (“ART”) refers to a diverse set of medical procedures designed to aid individuals or couples in achieving pregnancy when conventional methods are unsuccessful. This umbrella term encompasses various fertility treatments, including in vitro fertilization (IVF), intrauterine insemination (IUI), and gamete and embryo manipulation. ART procedures involve the manipulation of both male and female reproductive components to facilitate conception.
The dynamic landscape of data flows within the healthcare sector, notably in the realm of ART, demands a nuanced understanding of the complex interplay between privacy regulations and medical practices. In this context, the Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011, play a pivotal role, designating health information as "sensitive personal data or information" and underscoring the importance of safeguarding individuals' privacy. This sensitivity is particularly pronounced in the ART sector, where an array of personal data, ranging from medical records to genetic information, is collected and processed. The recent Assisted Reproductive Technology (Regulation) Act, 2021, in conjunction with the Digital Personal Data Protection Act, 2023, establishes a framework for the regulation of ART clinics and banks, presenting a layered approach to data protection.
A note on data generated by ART
Data flows in any sector are scarcely uniform and often not easily classified under straitjacket categories. Consequently, mapping and identifying data and its types become pivotal. Most data flows in the healthcare sector are highly sensitive and personal in nature, and may severely compromise the privacy and safety of an individual if breached. The Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011 (“SPDI Rules”) categorise any information pertaining to physical, physiological or mental conditions, or medical records and history, as “sensitive personal data or information”; this definition is broad enough to encompass any data collected by any ART facility or equipment. This includes information collected during the screening of patients, pertaining to ovulation and menstrual cycles, follicle and sperm count, ultrasound results, blood work, etc. It also includes pre-implantation genetic testing on embryos to detect any genetic abnormality.
But data flows extend beyond mere medical procedures and technology. Health data also covers any medical procedures undertaken, the amount of medicine and drugs administered during a procedure, resultant side effects, recovery, etc. Processing of the above-mentioned information may, in turn, generate further personal data points relating to an individual’s political affiliations, race, ethnicity, and genetic data such as biometrics and DNA, since different ethnicities and races are seen to react differently to the same or similar medication and have different propensities to genetic diseases. Further, it is to be noted that data is collected not only by professionals but also by AI-enabled equipment that a facility may employ to render its services. Additionally, dissemination of information under exceptional circumstances (e.g. a medical emergency) also affects how data may be classified. Considerations are further nuanced when the fundamental right to identity of a child conceived and born via ART may conflict with the fundamental right to privacy of a donor who wishes to remain anonymous.
Intersection of Privacy laws and ART laws:
In India, ART technology is regulated by the Assisted Reproductive Technology (Regulation) Act, 2021 (“ART Act”). With this, the Union aims to regulate and supervise assisted reproductive technology clinics and ART banks, prevent misuse and ensure safe and ethical practice of assisted reproductive technology services. When read with the Digital Personal Data Protection Act, 2023 (“DPDP Act”) and other ancillary guidelines, the two legislations provide some framework regulations for the digital privacy of health-based apps.
The ART Act establishes a National Assisted Reproductive Technology and Surrogacy Registry (“National Registry”), which acts as a central database for all clinics and banks and the nature of their services. The Act also establishes a National Assisted Reproductive Technology and Surrogacy Board (“National Board”) under the Surrogacy Act to monitor the implementation of the Act and advise the central government on policy matters. It also supervises the functioning of the National Registry, liaises with State Boards and curates a code of conduct for professionals working in ART clinics and banks. Under the DPDP Act, these bodies (i.e. the National Board, State Boards, ART clinics and banks) would most likely be classified as data fiduciaries (primarily clinics and banks), data processors (which may include the National Board and State Boards) or an amalgamation of both (including any appropriate authority established under the ART Act for the investigation of complaints, or the suspension or cancellation of registration of clinics), depending on the nature of work undertaken by them. If so classified, the duties and liabilities of data fiduciaries and processors would necessarily apply to these bodies. As a result, all of them would have to adopt Privacy Enhancing Technologies (PETs) and other organisational measures to ensure compliance with the privacy laws in place. This may be considered one of the most critical considerations for any ART facility, since any data collected by it would be sensitive personal data pertaining to health, regulated by the Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011 (“SPDI Rules 2011”). These rules provide for how sensitive personal data or information is to be collected, handled and processed.
The ART Act independently also provides for the duties of ART clinics and banks in the country. ART clinics and banks are required to inform the commissioning couple/woman of all procedures undertaken and all costs, risks, advantages, and side effects of their selected procedure. The Act mandates that information collected by such clinics and banks not be disclosed to anyone except the database established by the National Registry, in cases of medical emergency, or on the order of a court. Data collected by clinics and banks (including details on donor oocytes, sperm or embryos used or unused) must be detailed and submitted to the National Registry online. ART banks are also required to collect personal information of donors, including name, Aadhaar number, address and other details. By mandating online submission, the ART Act is harmonised with the DPDP Act, which regulates all digital personal data and emphasises free, informed consent.
Conclusion
With the increase in active opt-ins for ART, data privacy becomes a vital consideration for all healthcare facilities and professionals. Safeguard measures are required not only at a corporate level but also at a governmental level. It is to be noted that in the 262nd Session of the Rajya Sabha, the Ministry of Electronics and Information Technology reported 165 data breach incidents involving citizen data from the Central Identities Data Repository between January 2018 and October 2023, despite having publicly denied any breach. This disclosure calls into question the safety and integrity of data that may be submitted to the National Registry database, especially given the type of data (both personal and sensitive information) it aims to collate. At present, the ART Act is well supported by the DPDP Act. However, further judicial and legislative deliberation is required to effectively regulate and balance the interests of all stakeholders.
References
- The Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011
- Caring for Intimate Data in Fertility Technologies https://dl.acm.org/doi/pdf/10.1145/3411764.3445132
- Digital Personal Data Protection Act, 2023
- https://www.wolterskluwer.com/en/expert-insights/pharmacogenomics-and-race-can-heritage-affect-drug-disposition