# FactCheck - AI-Generated Video Falsely Shows Car Stuck on Delhi–Jaipur Highway Signboard
Executive Summary
A shocking video showing a car hanging from a highway signboard is going viral on social media. The clip allegedly shows a black Mahindra Thar stuck on an overhead direction signboard on the Delhi–Jaipur Highway (NH-48). Social media users are widely sharing the video, claiming it shows a real road accident. However, research by CyberPeace found the viral claim to be false. Our findings reveal that the circulating video is not real but AI-generated.
Claim
Social media users are sharing the clip as footage of an actual road accident. A viral post on X (formerly Twitter) claims that the incident took place on the Delhi–Jaipur Highway, showing a black Mahindra & Mahindra Thar lodged in a highway signboard.
- https://x.com/SenBaijnath/status/2024098520006029504
- https://archive.ph/cmr5e

Fact Check
On closely examining the viral video, several inconsistencies were observed that are commonly associated with AI-generated content. For instance, it appears highly improbable for a heavy vehicle to get stuck precisely at the centre of a signboard at such a height. Despite the scale of the alleged incident, traffic on the highway below continues moving normally without any disruption. Additionally, the text visible on the right side of the signboard appears distorted and oddly rendered. To further verify the authenticity of the video, we analysed it using the AI detection tool Hive Moderation, which indicated a 99.9% probability that the video was AI-generated.

Another AI image detection tool, WasitAI, also found that the visuals in the viral clip were largely AI-generated.

Conclusion
Based on our research and available evidence, it is clear that the viral video showing a Mahindra Thar hanging from a highway signboard is not real but AI-generated.

Introduction
By now, you have likely heard of many techniques of cybercrime, some of which we could never have anticipated. Reports are emerging from different parts of the country where video calls are being used to cheat people: through video calls, cybercriminals are turning individuals into victims of fraud. In these incidents, fraudsters record explicit footage of victims using a screen recorder, then blackmail them by sending these videos and demanding money. Cybercriminals are continually refining their strategies to defraud more people. In this blog post, we will explore the tactics involved in this case, the psychological impact on victims, and ways to combat it. Before we examine the case in detail, let us first look at deepfakes, AI, and sextortion, and how fraudsters use technology to commit these crimes.
Understanding Deepfake
Deepfake technology is the manipulation or fabrication of multimedia content such as videos, photos, or audio recordings using artificial intelligence (AI) algorithms and deep learning models. These models process massive quantities of data to learn and imitate human-like behaviour, enabling the creation of highly realistic synthetic media.
Individuals with malicious intent can use deepfake technology to alter facial expressions, body movements, and even voices in recordings, essentially replacing one person’s appearance with another’s. The resulting video can be practically indistinguishable from authentic footage, making it difficult for viewers to tell the two apart.
Sextortion and technology
Sextortion is a form of online blackmail in which offenders use explicit or compromising content to coerce victims into handing over money, sexual favours, or other concessions. This content is usually obtained through hacking, social engineering, or tricking people into sharing sensitive information.
Deepfake technology combined with sextortion techniques has magnified the impact on victims. Perpetrators can now use deepfakes to create and distribute pornographic or compromising videos or photographs that appear genuine but are entirely fabricated. Because such material looks credible and is difficult to refute, the stakes for victims rise.
How Cybercriminals Deceive
In the present case, cyber thugs first make video calls to people and capture the footage. They then manipulate the footage, merging it with a doctored explicit video, leaving the victim desperate to keep the matter hidden. Following that, “they demand money as a ransom to stop releasing the doctored video on the victim’s contacts and social media platforms.” In this case, a video has also emerged in which a woman supposedly featured in the original clip is depicted taking her own life out of shame over the video’s release. These additional threats are intended purely to inflict psychological pressure on, and coerce, the victims.
Sextortionists have sunk to a new low by profiting from the misfortunes of others, notably by targeting deceased victims. By generating deepfake videos depicting these individuals, the offenders aim to maximise emotional pain and pressure the victim into compliance. They exploit the natural compassion and emotion surrounding tragedy to extract larger ransoms from their victims.
This distressing exploitation not only adds urgency to the extortion demands but also preys on the victim’s sensitivity and emotional instability. Offenders even pressure victims by impersonating them, threatening that the victims may land in jail.
Tactics used
The morphed death videos are deliberately constructed to heighten emotional distress and instil fear in the targeted individual. By editing photographs or videos of the deceased, the offenders create unsettling scenarios that intensify the victim’s emotional response.
The psychological manipulation seeks to instil guilt, regret, and a sense of responsibility in the victim. The notion that they are somehow linked to the tragedy increases their emotional vulnerability, making them more susceptible to the demands of sextortionists. The offenders take advantage of these emotions, coercing victims into cooperation out of fear of being implicated in the apparent tragedy.
The impact on the victim’s mental well-being cannot be overstated. They may experience intense psychological trauma, including anxiety, depression, and post-traumatic stress disorder (PTSD). The guilt and shame associated with the false belief of being linked to someone’s death can have long-lasting effects on their emotional health and overall quality of life, and some victims may also develop lasting trust issues.
Law enforcement agencies' concerns
Law enforcement agencies have expressed concern about the growing menace of these illegal acts. The use of deepfake methods and other AI technologies to create convincing morphed videos demonstrates scammers’ increasing sophistication. These tools can modify digital content in ways that depart radically from the genuine footage, making it difficult for victims to detect that the video is fake.
Defence strategies to fight back: Combating sextortion requires a proactive approach that empowers individuals and makes use of available resources. This section covers crucial anti-sextortion techniques such as reporting incidents, preserving evidence, raising awareness, and implementing digital security measures.
- Report the Incident: Sextortion victims should immediately notify law enforcement. Contact your local police or cybercrime department and supply them with any relevant information, including specifics of the extortion attempt, communication logs, and any other evidence that can assist in the investigation. Reporting the incident is critical for holding criminals accountable and preventing further harm to others.
- Preserve Evidence: Preserving evidence is critical to building a solid case against sextortionists. Save and document all communication connected to the extortion, including text messages, emails, and social media conversations. Take screenshots, record phone calls (if legal), and save any other digital material or documents that might serve as evidence. This evidence can be invaluable in investigations and judicial proceedings.
Digital security: Implementing comprehensive digital security measures can considerably reduce vulnerability to sextortion attacks. Some important measures include:
- Use unique, complicated passwords for all online accounts, and avoid reusing passwords across platforms. Consider utilising password managers to securely store and create strong passwords.
- Enable two-factor authentication (2FA) whenever possible, which adds an extra layer of protection by requiring a second verification step, such as a code delivered to your phone or email, in addition to the password.
- Regular software updates: Keep your operating system, antivirus software, and programmes up to date. Security patches are frequently included in software upgrades to defend against known vulnerabilities.
- Adjust your privacy settings on social networking platforms and other online accounts to limit the availability of personal information and restrict access to your content.
- Be cautious when clicking on links or downloading files from unfamiliar or suspect sources. When exchanging personal information online, only use trusted websites.
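The first measure above, using unique and complicated passwords, can be illustrated with a short Python sketch built on the standard library's `secrets` module, which draws from a cryptographically secure random source (the function name and the length of 20 characters are illustrative choices, not a prescribed standard):

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password mixing letters, digits, and symbols."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets.choice uses a cryptographically secure random source,
    # unlike the `random` module, whose output is predictable.
    return "".join(secrets.choice(alphabet) for _ in range(length))

password = generate_password(20)
print(password)        # different on every run
print(len(password))   # 20
```

In practice, a reputable password manager does this for you and also stores the result securely, which is why the advice above recommends one; the sketch simply shows that strong randomness, length, and a large character set are what make a password hard to guess.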
Conclusion
Combating sextortion demands a collaborative effort that combines proactive tactics and resources to confront this damaging practice. Individuals can actively fight back against sextortion by reporting incidents, preserving evidence, raising awareness, and implementing digital security measures. It is critical to empower victims, support their recovery, and work together to build a safer online environment where sextortionists are held accountable and everyone can navigate the digital world with confidence.

Introduction
February marks the beginning of Valentine’s Week, the time when we transcend from the season of smog to the season of love. This is a time when young people are more active on social media and dating apps in the hope of finding a partner to celebrate the occasion. Dating apps, seeking to capitalise on the occasion, launch special offers and campaigns to attract new users and keep current users engaged with the aspiration of finding their ideal partner. However, with the growing popularity of online dating, the tactics of cybercriminals have also penetrated this sphere. Scammers are becoming increasingly sophisticated at manipulating individuals on digital platforms, often engaging in scams, identity theft, and financial fraud under the guise of romance. As love fills the air, netizens must stay vigilant and cautious while searching for a connection online, and not fall into a scammer’s trap.
Here Are Some CyberPeace Tips To Avoid Romance Scams
- Recognize Red Flags of Romance Scams:- Online dating has made it easier to connect with people, but it has also become a tool for scammers to exploit the emotions of netizens for financial gain. They create fake profiles, build trust quickly, and then manipulate victims into sending money. Understanding their tactics can help you stay safe.
- Warning Signs of a Romance Scam:- If someone expresses strong feelings too soon, it’s a red flag. Scammers often claim to have fallen in love within days or weeks, despite never meeting in person. They use emotional pressure to create a false sense of connection. Their messages might seem off. Scammers often copy-paste scripted responses, making conversations feel unnatural. Poor grammar, inconsistencies in their stories, or vague answers are warning signs. Asking for money is the biggest red flag. They might have an emergency, a visa issue, or an investment opportunity they want you to help with. No legitimate relationship starts with financial requests.
- Manipulative Tactics Used by Scammers:- Scammers use love bombing to gain trust. They flood you with compliments, calling you their soulmate or destiny. This is meant to make you emotionally attached. They often share fake sob stories. It could be anything ranging from losing a loved one, facing a medical emergency, or even being stuck in a foreign country. These are designed to make you feel sorry for them and more willing to help. Some of these scammers might even pretend to be wealthy investors or successful business owners, showing off a fabricated luxury lifestyle in order to appear credible. Eventually, they’ll try to lure you into a fake investment. They create a sense of urgency. Whether it’s sending money, investing, or sharing personal details, scammers will push you to act fast. This prevents you from thinking critically or verifying their claims.
- Financial Frauds Linked to Romance Scams:- Romance scams have often led to financial fraud. Victims may be tricked into sending money directly or get roped into elaborate schemes. One common scam is the disappearing date, where someone insists on dining at an expensive restaurant, only to vanish before the bill arrives. Crypto scams are another major concern. Scammers convince victims to invest in fake cryptocurrency platforms, promising huge returns. Once the money is sent, the scammer disappears, leaving the victim with nothing.
- AI & Deepfake Risks in Online Dating:- Advancements in AI have made scams even more convincing. Scammers use AI-generated photos to create flawless, yet fake, profile pictures. These images often lack natural imperfections, making them hard to spot. Deepfake technology is also being used for video calls. Some scammers use pre-recorded AI-generated videos to fake live interactions. If a person’s expressions don’t match their words or their screen glitches oddly, it could be a deepfake.
- How to Stay Safe:-
- Always verify the identities of those who contact you on these sites. A simple reverse image search can reveal if someone’s profile picture is stolen.
- Avoid clicking suspicious links or downloading unknown apps sent by strangers. These can be used to steal your personal information.
- Trust your instincts. If something feels off, it probably is. Stay alert and protect yourself from online romance scams.
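The advice above about avoiding suspicious links can be partly automated. Below is a minimal Python sketch, using only the standard library's `urllib.parse`, of the kind of quick red-flag check a cautious user (or a safety tool) might run on a link before clicking. The specific heuristics chosen here are illustrative and far from exhaustive; a link passing them is not proof of safety:

```python
from urllib.parse import urlparse

def looks_suspicious(url: str) -> bool:
    """Flag common red flags in a link. A heuristic, not a guarantee."""
    parsed = urlparse(url)
    host = parsed.hostname or ""
    # Red flag 1: not using HTTPS.
    if parsed.scheme != "https":
        return True
    # Red flag 2: a raw IP address instead of a domain name.
    if host.replace(".", "").isdigit():
        return True
    # Red flag 3: punycode host (xn--), often used for lookalike domains.
    if host.startswith("xn--") or ".xn--" in host:
        return True
    # Red flag 4: '@' in the URL, which can disguise the real destination.
    if "@" in url:
        return True
    return False

print(looks_suspicious("http://93.184.216.34/login"))   # True
print(looks_suspicious("https://example.com/profile"))  # False
```

Real phishing detection also relies on reputation databases and lookalike-domain analysis, which is why the safest habit remains the one stated above: do not click links from strangers at all.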
Best Online Safety Practices
- Prioritize Social Media Privacy:- Review and update your privacy settings regularly. Think before you share and be mindful of who can see your posts/stories. Avoid oversharing personal details.
- Report Suspicious Activities:- Even if a scam attempt doesn’t succeed, report it. The Indian Cyber Crime Coordination Centre’s (I4C) 'Report Suspect' feature allows users to flag potential threats, helping prevent cybercrimes.
- Think Before You Click or Download:- Avoid clicking on unknown links or downloading attachments from unverified sources. These can be traps leading to phishing scams or malware attacks.
- Protect Your Personal Information:- Be cautious with whom and how you share your sensitive details online. Cybercriminals exploit even the smallest data points to orchestrate fraud.

Executive Summary
A video is being widely shared on social media linking it to the ongoing tensions between Israel and Iran. The clip shows multiple fighter jets flying across the sky, while massive flames appear to be rising from tall buildings below. The visuals are dramatic and alarming, creating the impression of a large-scale military strike. Users sharing the video claim that after Israel carried out an attack, Iran launched a retaliatory strike on Israel, and that the viral footage captures the aftermath of this counterattack. However, research conducted by CyberPeace found the claim to be misleading. Our research revealed that the viral video is not authentic but AI-generated.
Claim
On the social media platform Facebook, a user shared the viral video with the caption: “Iran has also carried out a retaliatory attack on Israel.”
(Post link and archive link provided above.)

Fact Check
Upon closely examining the video, we noticed several irregularities in the visuals and motion patterns, which raised suspicion that the footage may have been generated using artificial intelligence. To verify this, we analyzed the video using the AI detection tool developed by Hive Moderation. According to the analysis report, there is a 62 percent likelihood that the viral video is AI-generated.

As part of further verification, we also scanned the video using Sightengine. The results indicated an even stronger probability, suggesting that the video is 99 percent AI-generated.

Conclusion
Our research confirms that the viral video does not depict a real military attack. It is AI-generated content being falsely shared in the context of Israel-Iran tensions.