#FactCheck: ‘Israel Apologizes to Iran’ Video Is AI-Generated
Executive Summary:
A viral video claiming to show Israelis pleading with Iran to "stop the war" is not authentic. Our research found the footage to be AI-generated, most likely created with tools such as Google’s Veo, and not evidence of a real protest. The video features unnatural visuals and errors typical of AI fabrication. It is part of a broader wave of misinformation surrounding the Israel-Iran conflict, in which AI-generated content is widely used to manipulate public opinion. This incident underscores the growing challenge of distinguishing real events from digital fabrications in global conflicts and highlights the importance of media literacy and fact-checking.
Claim:
A verified X user with the handle "Iran, stop the war, we are sorry" posted a video featuring people holding placards and the Israeli flag. The caption suggests that Israeli citizens are calling for peace and expressing remorse, stating, "Stop the war with Iran! We apologize! The people of Israel want peace." The user further claims that Israel, having allegedly initiated the conflict by attacking Iran, is now seeking reconciliation.

Fact Check:
The bottom-right corner of the video displays a "VEO" watermark, suggesting it was generated using Google's AI tool, VEO 3. The video exhibits several noticeable inconsistencies such as robotic, unnatural speech, a lack of human gestures, and unclear text on the placards. Additionally, in one frame, a person wearing a blue T-shirt is seen holding nothing, while in the next frame, an Israeli flag suddenly appears in their hand, indicating possible AI-generated glitches.

We further analyzed the video using the AI detection tool HIVE Moderation, which revealed a 99% probability that the video was generated using artificial intelligence technology. To validate this finding, we examined a keyframe from the video separately, which likewise showed a 99% probability of being AI-generated. These results strongly indicate that the video is not authentic and was most likely created using advanced AI tools.

Conclusion:
The video is highly likely to be AI-generated, as indicated by the VEO watermark, visual inconsistencies, and a 99% probability from HIVE Moderation. This highlights the importance of verifying content before sharing, as misleading AI-generated media can easily spread false narratives.
- Claim: AI-generated video of Israelis saying "Stop the War, Iran, We Are Sorry".
- Claimed On: Social Media
- Fact Check: AI-Generated, Misleading
Related Blogs

Introduction
The Ministry of Electronics and Information Technology (MeitY) released the Draft Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Second Amendment Rules, 2026 on March 30, 2026, inviting public comments with a response window closing on April 14. This 15-day period is a narrow opening for public input on proposed rules with major constitutional implications. The brevity and timing of the window call into question the drafting agency's commitment to meaningful stakeholder consultation.
While MeitY describes the proposed amendments as being of a "clarificatory and procedural nature," analysis shows they will have substantive effects. Collectively, the amended language significantly changes how online speech will be regulated in India: it concentrates more regulatory authority in the executive, limits the required transparency of content enforcement, mandates greater retention of data without proportionality-based safeguards, and places excessive compliance burdens on intermediaries. Each of these changes has consequences beyond mere process, and together they raise substantial concerns regarding compliance with Articles 14, 19, and 21 of the Constitution of India.
The Constitutional Baseline: Shreya Singhal and the Limits of Intermediary Liability
India’s Supreme Court decision in Shreya Singhal v Union of India (2015) 5 SCC 1 provides the foundation for intermediary liability, wherein the Court read down Section 79(3)(b) of the IT Act, 2000, holding that intermediaries are required to act upon receiving actual knowledge only through a court order or a valid notification by the appropriate government authority. The Supreme Court’s decision intended to provide a constitutional protection to intermediaries from being subjected to informal, unverified executive pressure to take down content by requiring that any such order be subject to some level of legal objective credibility or threshold.
Rule 3(4) of the proposed amendments places that balance under significant strain. By requiring intermediaries to comply with advisories, directions, standard operating procedures, codes of practice, and guidelines issued by the Ministry — and tying non-compliance to the loss of safe harbour — the draft effectively lowers the constitutional threshold that Shreya Singhal was designed to maintain. Compliance obligations now potentially arise from instruments that carry no judicial sanction and no mandatory public disclosure.
Rule 3(4): Delegated Legislation or Executive Overreach
The rule-making power conferred on the Central Government under Section 87 of the IT Act is limited to carrying out the provisions of the Act. It does not authorise the creation of new substantive obligations. This principle has been consistently affirmed in Indian Express Newspapers v. Union of India (1985) 1 SCC 641 and Confederation of Ex-Servicemen Associations v. Union of India (2006) 8 SCC 399, where the Court held that delegated legislation must remain within the four corners of the parent statute.
Rule 3(4) tests those limits. It converts executive advisories into binding compliance instruments without a clear statutory foundation in either Section 79 or Section 87. Although the proposed rule requires that such instruments specify their legal basis, there is no requirement that they be published or made publicly accessible. This creates a framework in which legality risks becoming circular — instruments claimed to be lawful solely by reference to a provision that does not clearly authorise them, shielded from scrutiny by their own opacity. Justice Chandurkar’s judgment in Kunal Kamra v. Union of India identified precisely this defect in the Fact Check Unit amendment. Rule 3(4) replicates the structural problem in a broader form.
Compliance Pressure and the Logic of Over-Censorship
The practical consequence of Rule 3(4) lies not only in its legality but in how it reshapes incentive structures for platforms. An intermediary facing the permanent threat of safe harbour loss will not wait to assess the legal merit of each advisory. The rational calculation is to comply early, broadly, and without friction. Lawful content — particularly satire, political commentary, and journalism — becomes vulnerable not because it is unlawful, but because it presents regulatory risk.
This dynamic was visible on 18 March 2026, when stand-up comedian Pulkit Mani (@hunnywhoisfunny) found his satirical Instagram reel being restricted across India. The video had accumulated over 16.5 million views. Users encountered a notice citing Section 79(3)(b) of the IT Act. No reasons were publicly provided. No prior hearing was offered. The same night, several political parody and satire accounts were withheld on X.
Data Retention, Privacy, and the Proportionality Test
The amendments to Rules 3(1)(g) and 3(1)(h) extend data retention obligations by making them additional to requirements under any other law. The existing 180-day floor for retained user data — covering removed content, registration information, and associated records — becomes a minimum rather than a ceiling. No maximum is specified, and no proportionality requirement accompanies the extension.
This raises direct concerns under Article 21 as interpreted in Justice K.S. Puttaswamy v. Union of India (2017) 10 SCC 1, which held that any state intrusion into privacy must satisfy the triple test of legality, necessity, and proportionality. Undefined retention periods, with no statutory ceiling and no requirement of purpose limitation, risk failing all three. The longer user data is held, including metadata, device information, and records of removed content, the greater the exposure to surveillance, unauthorised access, and use beyond the original justification.
Circumventing Judicial Scrutiny Through Procedural Redesign
The Bombay High Court, in its August 2021 order, stayed provisions of the IT Rules’ oversight mechanism as prima facie violative of Article 19(1)(a). The Madras High Court in T.M. Krishna v. Union of India affirmed that stay, cautioning that government-controlled media oversight risked undermining press independence. Both matters remain pending before the Delhi High Court.
The amendments to Rules 8(1) and 14 restructure the same oversight machinery through a modified procedural design. By extending the Inter-Departmental Committee’s jurisdiction to cover “matters” referred by the Ministry with no requirement of a complainant, no defined subject matter, and no guaranteed prior hearing, the proposed rules effectively reconstitute what courts found constitutionally suspect. Individual users posting news and current affairs content are now brought within reach of blocking mechanisms originally designed for institutional publishers.
Conclusion
As seen above, the Draft IT Rules 2026 are unable to meet the constitutional and judicial requirements to regulate free speech. What the proposed amendments construct is a durable system in which platforms self-censor under liability pressure, data is retained without proportionate justification, and content oversight expands through procedural adjustment rather than parliamentary legislation. Regulation of the digital public sphere is both legitimate and necessary. But it must be anchored in law, not in the quiet authority of executive advisories. The law must ultimately remain anchored in constitutional values, guided by the enduring principles of justice, equity, and good conscience.
The comment period closes on 14 April 2026.
Submissions may be sent to itrules.consultation@meity.gov.in.
References
- https://www.meity.gov.in/static/uploads/2026/03/30591fc6e322dcbcc9dae84a0f02e9e7.pdf
- https://www.meity.gov.in/static/uploads/2026/03/a71a21d35c107f2e528363d3eb17646a.pdf
- https://www.meity.gov.in/static/uploads/2026/02/550681ab908f8afb135b0ad42816a1c9.pdf
- https://neopolitico.com/india/government-blocks-viral-satirical-reel-impersonating-pm-modi-raising-fresh-questions-on-free-speech-and-digital-regulation/
- https://internetfreedom.in/sound-the-alarm-iffs-first-read-on-meitys-draft-it-rules-second-amendment-2026/

Introduction
A new dawn in the realm of cyber security and criminal justice is on the horizon. Maharashtra's Deputy Chief Minister, Devendra Fadnavis, has recently announced the advent of the country's most sophisticated cyber lab—a bastion against the dark arts of cybercrime. This announcement, made with the gravitas befitting a statesman, was not merely a bureaucratic note; it was a clarion call to a future where technology and law converge to create a safer society.
The cyber lab, poised to be the largest and most modern of its kind, is not just a facility—it is a symbol of the state's commitment to harnessing the power of technology in the ceaseless battle against crime. Fadnavis, who also holds the state's home portfolio, underscored the significance of this initiative during a function where he also emphasised the need for the Maharashtra police to brace themselves for the enforcement of three transformative criminal laws set to take effect from the first of July 2024.
In compliance with the New Laws
These laws—the Bharatiya Nyaya Sanhita, the Bharatiya Nagarik Suraksha Sanhita, and the Bharatiya Sakshya Adhiniyam—are not mere statutory texts; they are the architects of a new edifice of criminal justice, designed with the mortar of modern electronic and technical evidence to buttress conviction rates and fortify the legal system.
At the inauguration of the Evidence Management Centre (EMC) and the Evidence Dispatch Van (EDV) in Navi Mumbai, Fadnavis spoke with an air of prescience about the radical shifts these new acts will engender. The EMC, a paragon of innovation with its no-human-intervention ethos, is set to revolutionise the procedure of handling evidence, thereby amplifying the likelihood of securing convictions in an era increasingly marred by cyber frauds and hacking escapades.
Recent Trend
The Deputy Chief Minister's vision extends beyond the present, into a realm where blockchain technology becomes an ally of law enforcement, rendering evidence tampering an obsolete concern. Under the new legislative framework, expert collection of evidence is mandated for crimes with sentences exceeding seven years, a move that underscores the gravity with which digital and electronic evidence is now regarded.
The Cyber Lab
The Navi Mumbai Police Commissionerate stands as the vanguard of this new legal era, being the first in the country to align with the upcoming laws. As digital transactions burgeon, so too does the menace of cybercrime. Fadnavis assures us that the cyber lab, a veritable nexus of modernity, will bring together banks, non-banking financial companies (NBFCs), and social media platforms on a unified platform to detect and thwart crimes with alacrity.
This announcement was made in the presence of Maharashtra's Director General of Police, Rashmi Shukla, and Navi Mumbai's police commissioner, Milind Bharambe, both of whom are key figures in the conception of this project. Their attendance shows the collective resolve of Maharashtra's law enforcement to elevate its capabilities in cybercrime prevention.
Conclusion
The establishment of this cyber lab is a vivid thread in the state's security fabric, woven with the intent to protect the digital integrity of its citizens. It is a testament to the state's foresight and its unwavering commitment to staying abreast of the evolving landscape of crime and technology. As we stand on the cusp of this new era, we are reminded that the fight against crime is perennial, but with such pioneering initiatives, victory is not just a possibility—it is an inevitability.
References
- https://indianexpress.com/article/cities/mumbai/navi-mumbai-cyber-lab-criminal-laws-fadnavis-9206801/
- https://www.the420.in/why-maharashtras-new-cyber-lab-could-be-a-game-changer-for-national-security/
- https://apacnewsnetwork.com/2024/03/navi-mumbai-to-host-indias-most-advanced-crime-busting-lab-boosting-conviction-rates-maharashtra-deputy-cm-fadnavis-announces/

Introduction
In today’s digital world, everything revolves around data: the more data a company holds, the more influence it has in the market, which is why companies are looking for ways to use data to improve their business. At the same time, they have to make sure they are protecting people’s privacy, and striking a balance between the two is tricky. Imagine trying to bake a cake where you need all the ingredients to make it taste great, but you also have to make sure no one can tell what’s in it. That is roughly what companies are dealing with when it comes to data. Here, ‘pseudonymisation’ emerges as a critical technical and legal mechanism that offers a middle ground between data anonymisation and unrestricted data processing.
Legal Framework and Regulatory Landscape
Pseudonymisation, as defined by the General Data Protection Regulation (GDPR) in Article 4(5), refers to “the processing of personal data in such a manner that the personal data can no longer be attributed to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organisational measures to ensure that the personal data are not attributed to an identified or identifiable natural person”. This technique represents a paradigm shift in data protection strategy, enabling organisations to preserve data utility while significantly reducing privacy risks. The growing importance of this balance is evident in the proliferation of data protection laws worldwide, from GDPR in Europe to India’s Digital Personal Data Protection Act (DPDP) of 2023.
Its legal treatment varies across jurisdictions, but a convergent approach is emerging that recognises its value as a data protection safeguard while maintaining that pseudonymised data remains personal data. Article 25(1) of the GDPR recognises it as “an appropriate technical and organisational measure” and emphasises its role in reducing risks to data subjects: it protects personal data by reducing the risk of identifying individuals during processing. The European Data Protection Board’s (EDPB) 2025 Guidelines on Pseudonymisation provide detailed guidance emphasising the importance of defining the “pseudonymisation domain”, which specifies who is prevented from attributing data to specific individuals and ensures that technical and organisational measures are in place to block unauthorised linkage of pseudonymised data to the original data subjects.

In India, while the DPDP Act does not explicitly define pseudonymisation, legal scholars argue that such data would still fall under the definition of personal data, as it remains potentially identifiable. The Act defines personal data in Section 2(t) broadly as “any data about an individual who is identifiable by or in relation to such data,” suggesting that pseudonymised information, being reversible, would continue to require compliance with data protection obligations.
Further, the DPDP Act, 2023 also includes principles of data minimisation and purpose limitation. Section 8(4) says that a “Data Fiduciary shall implement appropriate technical and organisational measures to ensure effective observance of the provisions of this Act and the Rules made under it.” Pseudonymisation fits here because it is a recognised technical safeguard, which means companies can use it as one method in their compliance toolkit under Section 8(4) of the DPDP Act. However, its use should be assessed on a case-by-case basis, since encryption is also considered one of the strongest methods for protecting personal data. The suitability of pseudonymisation depends on the nature of the processing activity, the type of data involved, and the level of risk that needs to be mitigated. In practice, organisations may use pseudonymisation in combination with other safeguards to strengthen overall compliance and security.
The European Court of Justice’s recent jurisprudence has introduced nuanced considerations about when pseudonymised data might not constitute personal data for certain entities. In cases where only the original controller possesses the means to re-identify individuals, third parties processing such data may not be subject to the full scope of data protection obligations, provided they cannot reasonably identify the data subjects. The “means reasonably likely” assessment represents a significant development in understanding the boundaries of data protection law.
Corporate Implementation Strategies
Companies find that pseudonymisation is not just about following rules; it also brings real benefits. By using this technique, businesses can keep their data more secure and reduce the damage in the event of a breach. Customers feel more confident knowing that their information is protected, which builds trust. Additionally, companies can utilise this data for research or other important purposes without compromising user privacy.
Key Benefits of Pseudonymisation:
- Enhanced Privacy Protection: It replaces personal details such as names or IDs with artificial values or codes, reducing the risk of accidental privacy breaches.
- Preserved Data Utility: Unlike fully anonymised data, pseudonymised data retains its usefulness by maintaining important patterns and relationships within datasets.
- Facilitated Data Sharing: Pseudonymised data is easier to share with partners or researchers because it protects privacy while remaining useful.
However, using pseudonymisation is not straightforward. Companies have to deal with tricky technical issues, such as choosing the right methods (for example, encryption or tokenisation) and managing security keys safely. They also have to implement strong policies to stop anyone from re-identifying the individuals behind the data. This can get expensive and complicated, especially when dealing with large amounts of data, and it often requires expert help and regular upkeep.
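To make the method-and-key point concrete, here is a minimal Python sketch of one common pseudonymisation approach: replacing a direct identifier with a deterministic keyed-hash token (HMAC-SHA256). This is illustrative only; the key value, records, and token length are assumptions, not drawn from any framework cited above, and in practice the key would live in a separate key-management system, away from the pseudonymised dataset.

```python
import hmac
import hashlib

# Hypothetical secret key; in a real deployment this would be held in a
# separate key-management system, not alongside the pseudonymised data.
SECRET_KEY = b"keep-me-in-a-separate-kms"

def pseudonymise(value: str, key: bytes = SECRET_KEY) -> str:
    """Replace an identifier with a deterministic keyed-hash token.

    The same input always yields the same token, so patterns and
    relationships in the dataset are preserved, but without the key the
    token cannot feasibly be linked back to the original value.
    """
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

# Example records containing a direct identifier (the email address)
records = [
    {"email": "asha@example.com", "purchase": 1200},
    {"email": "asha@example.com", "purchase": 450},
]

# Pseudonymised view: the identifier is replaced, the analytic value kept
pseudonymised = [
    {"user_token": pseudonymise(r["email"]), "purchase": r["purchase"]}
    for r in records
]

# Both rows map to the same token, so per-user analysis still works
assert pseudonymised[0]["user_token"] == pseudonymised[1]["user_token"]
```

Because the tokens are deterministic, joins and per-user aggregations remain possible; because they are keyed, re-identification requires access to the separately held secret, which is exactly the separation of "additional information" that the GDPR definition contemplates.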
Balancing Privacy Rights and Data Utility
The primary challenge in pseudonymisation is striking the right balance between protecting individuals' privacy and maintaining the utility of the data. To get this right, companies need to consider several factors, such as the purpose of the processing, the capabilities of potential attackers, and the type of data involved.
Conclusion
Pseudonymisation offers a practical middle ground between full anonymisation and restricted data use, enabling organisations to harness the value of data while protecting individual privacy. Legally, it is recognised as a safeguard but still treated as personal data, requiring compliance under frameworks like the GDPR and India’s DPDP Act. For companies, it is not only a matter of regulatory adherence; it also builds trust and enhances data security. However, its effectiveness depends on robust technical methods, governance, and vigilance. Striking the right balance between privacy and data utility is crucial for sustainable, ethical, and innovation-driven data practices.
References:
- https://gdpr-info.eu/art-4-gdpr/
- https://www.meity.gov.in/static/uploads/2024/06/2bf1f0e9f04e6fb4f8fef35e82c42aa5.pdf
- https://gdpr-info.eu/art-25-gdpr/
- https://www.edpb.europa.eu/system/files/2025-01/edpb_guidelines_202501_pseudonymisation_en.pdf
- https://curia.europa.eu/juris/document/document.jsf?text=&docid=303863&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=16466915