#FactCheck

Executive Summary
A video circulating on social media shows an electric car allegedly being powered by a portable generator attached to it. The clip is being shared with the claim that the generator is directly running the vehicle, suggesting a groundbreaking or unusual technological feat. However, research conducted by CyberPeace found the viral claim to be false. Our research revealed that the video is not authentic but AI-generated.
Claim
On February 22, 2026, a user on X (formerly Twitter) shared the viral video with the caption: “After watching this video, Newton might turn in his grave.” The post implied that the video demonstrates a scientific impossibility.

Fact Check:
To verify the claim, we conducted a keyword search on Google. However, we found no credible reports from any reputable media organization supporting the assertion made in the viral post. A close examination of the video revealed several visual inconsistencies and unnatural elements, raising suspicion that the footage may have been generated using artificial intelligence. We then analyzed the video using the AI detection tool Hive Moderation. The results indicated a 96 percent probability that the video was AI-generated.
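Detection tools such as Hive Moderation return a probability score that a piece of media is AI-generated, which the analyst then compares against a confidence threshold. A minimal sketch of how such a score might be parsed and interpreted; the response shape and the `ai_generated` class name below are hypothetical illustrations, not Hive Moderation's actual schema:

```python
# Illustrative sketch: interpreting an AI-detection score from a
# moderation-style API. The response structure here is hypothetical
# and does NOT reflect any specific vendor's documented schema.

def ai_generated_probability(response: dict) -> float:
    """Return the 'ai_generated' class score (0.0-1.0) from a
    hypothetical classifier response."""
    for cls in response.get("classes", []):
        if cls.get("class") == "ai_generated":
            return float(cls.get("score", 0.0))
    return 0.0

def verdict(prob: float, threshold: float = 0.9) -> str:
    """Map a raw probability to a coarse label for reporting."""
    return "likely AI-generated" if prob >= threshold else "inconclusive"

# Example response mirroring the 96 percent score reported above.
sample = {"classes": [{"class": "ai_generated", "score": 0.96},
                      {"class": "not_ai_generated", "score": 0.04}]}
prob = ai_generated_probability(sample)
print(prob, verdict(prob))  # 0.96 likely AI-generated
```

In practice a single high score is treated as one signal among several, which is why the checks described here combine tool results with manual inspection of visual inconsistencies.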

In the next step of our research, we scanned the video using another AI detection platform, WasItAI, which also concluded that the viral video was AI-generated.

Conclusion
Our research confirms that the viral video is not real. It has been artificially created using AI technology and is being circulated with a misleading claim.

Executive Summary
A photo is going viral on social media showing a young man dressed in traditional Arab attire warmly embracing an elderly woman. The post claims that the man flew in from Saudi Arabia to Kerala just to meet his “Hindu mother,” portraying the image as a heartwarming example of communal harmony. However, research by CyberPeace found that the claim being shared with the image is misleading.
Claim
The viral post narrates an emotional story, alleging that years ago a Hindu woman from Kerala worked in Saudi Arabia caring for children and loved a young boy like her own son. After she returned to India, the boy—now grown up—reportedly searched for her for months, booked a flight, and finally reached Kerala to reunite with her. The post describes an emotional reunion filled with tears, affection, and a bond beyond religion and nationality.

Fact Check
A reverse image search of the viral picture led us to a video uploaded on August 18, 2023, on the YouTube channel of social media influencer Hashim Abbas. In the video, he is seen meeting and hugging the elderly woman while extending Onam greetings.

Further examination of Hashim Abbas’ social media accounts revealed several other videos from his Kerala visit. Our research also found that Abbas played a significant role in the Malayalam film Kondotty Pooram.

Additionally, we found a video posted on August 13, 2023, by actress and theatre artist Sandhya Rajendran, daughter of veteran Malayalam actress Vijayakumari. The video shows Vijayakumari teaching Onam songs to Hashim Abbas.

Conclusion
The evidence clearly establishes that the viral claim is misleading. The man seen in the image is Hashim Abbas, who was meeting senior Malayalam actress Vijayakumari to extend Onam greetings. The emotional story about a son flying from Saudi Arabia to reunite with his Hindu mother is fictional and not connected to the viral image.

Executive Summary
The film ‘Yadav Ji Ki Love Story’, scheduled to release on February 27, has become embroiled in controversy over its title. Several organizations have expressed objections, registering their displeasure regarding the name of the film. Amid the row, a video is being widely circulated on social media. The footage shows a large crowd holding banners and posters while staging a protest. Users sharing the clip claim that it is from South India, where members of the Yadav community have allegedly launched a large-scale agitation against the film. However, research conducted by CyberPeace found the viral claim to be false. Our research revealed that the video is not authentic but AI-generated, and is being shared with a misleading narrative.
Claim
On February 22, 2026, a Facebook user shared the viral video claiming it depicts protests by the Yadav community in South India against the film. The original and archived links to the post are provided below.

Fact Check:
Upon closely examining the viral video, we noticed several anomalies in the visuals, crowd movements, and certain frames. The unnatural patterns and inconsistencies raised suspicions that the footage may have been generated using artificial intelligence. To verify this, we analyzed the video using the AI detection tool Aurigin AI, which indicated that the footage was AI-generated.

We further scanned the clip using another AI detection platform, Hive Moderation. The results showed a 99 percent probability that the video was AI-generated.

Conclusion
Our findings confirm that the viral video is not real. It has been artificially created using AI technology and is being circulated with a false and misleading claim.

Executive Summary
A video circulating on social media shows a lion carrying away a woman who was washing clothes near a pond. Users are sharing the clip claiming it depicts a real incident. However, research by CyberPeace found the viral claim to be false. The research revealed that the video is not real but AI-generated.
Claim
A user on Facebook shared the viral video claiming that a lion attacked and carried away a woman from a pond while she was washing clothes. The link to the post and its archived version are provided below.

Fact Check:
Upon closely examining the viral clip, we noticed several visual inconsistencies that raised suspicion about its authenticity. The video was then analyzed using the AI-detection tool Sightengine. According to the analysis results, the viral video was identified as AI-generated.

Conclusion
The research confirms that the viral video does not depict a real incident. The clip was digitally created using artificial intelligence and is being falsely shared as a genuine event.

Executive Summary
A video circulating on social media shows a woman using abusive language in front of a camera. Users sharing the clip claim that the woman is a professor at Galgotias University and that the video exposes her alleged reality. However, research by CyberPeace found the claim to be misleading. The probe revealed that the woman seen in the viral video has no connection with Galgotias University and is not a professor there. Fact-checking further showed that the video is not recent but around seven years old. The woman featured in the clip was identified as Shubhrastha, who is a political strategist by profession.
Claim:
A user on X (formerly Twitter) shared the viral video on February 18, 2026, claiming: “A ‘class in abuse studies’ at Galgotias University? An obscene video of a professor teaching ethics has gone viral. Another shameful chapter has been added to the list of controversies surrounding Galgotias University.” The post further alleged that after falsely claiming a Chinese robot as its own, the university’s “Culture and Ethics” faculty member was seen publicly using abusive language in the viral clip. The post link and its archived version are provided below:

Fact Check:
To verify the authenticity of the viral claim, we extracted key frames from the video and conducted a reverse image search using Google Lens. During the research, we found the same video uploaded on the Indian Spectator’s YouTube channel on June 9, 2018.
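Reverse image search engines match frames by comparing compact perceptual fingerprints rather than raw pixels, so near-identical copies of a frame are found even after recompression. A toy average-hash sketch in pure Python illustrating the general idea; this is not Google Lens's actual algorithm, and the sample pixel grids are invented for the example:

```python
# Simplified average-hash: each bit records whether a pixel is brighter
# than the image's mean. Near-duplicate images produce near-identical
# hashes, so a small Hamming distance signals a likely match.

def average_hash(pixels):
    """Hash a grayscale image given as a list of rows of 0-255 ints."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Count differing bits; small distance => likely the same image."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

frame = [[10, 200], [220, 30]]       # frame extracted from a clip
candidate = [[12, 198], [225, 28]]   # recompressed copy found online
unrelated = [[200, 10], [30, 220]]   # a different image

assert hamming(average_hash(frame), average_hash(candidate)) == 0
assert hamming(average_hash(frame), average_hash(unrelated)) == 4
```

Production systems hash at higher resolutions and use far more robust features, but the matching principle (fingerprint, then compare distances) is the same.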

The video was also found on another YouTube channel, where it had been uploaded on June 12, 2018.

Conclusion
The research clearly establishes that the woman seen in the viral video has no association with Galgotias University and is not a professor there. The clip is also not recent but approximately seven years old. The woman in the video was identified as Shubhrastha, a political strategist.
Executive Summary
The U.S. Department of Justice recently released nearly three million pages of documents, along with thousands of videos and photographs, related to its investigation into convicted offender Jeffrey Epstein. Meanwhile, a video showing a massive crowd protesting on a street is going viral on social media. The video, which had earlier circulated with false claims linking it to anti-government protests in Iran, is now being shared by several users who claim that the protest took place in the United States after the release of the Epstein files. Research by CyberPeace found the viral claim to be false. The video being linked to protests in the United States following the release of the Epstein files is not real and was generated using artificial intelligence (AI).
Claim:
An Instagram user uploaded the viral video on February 9, 2026, with the caption: “After Epstein files released in America. All eyes on America.”
- https://www.instagram.com/reel/DUjLe-XE5lA
- https://ghostarchive.org/archive/tkP6W

Fact Check:
To verify the claim, we first conducted a reverse search of the viral video using Google Lens. The same video was found posted on January 10, 2026, by an Instagram account named “elnaz555,” where it was shared in the context of recent protests in Iran. The post also mentioned that the video was created using AI.

Based on this lead, we further analyzed a higher-quality version of the viral video using Hive Moderation, a tool used to detect AI-generated images and videos. The analysis indicated a 97.9% probability that the video was generated using artificial intelligence. The research clearly shows that the video is not authentic and has been falsely linked to protests in the United States after the release of the Epstein files.

Conclusion:
The claim circulating on social media is false. The viral video allegedly showing protests in the United States following the release of the Epstein files is AI-generated and not related to any real event.

Executive Summary
A video is going viral on social media showing a woman performing a pre-wedding ritual called “Roka” for a couple at a metro station. Many users are sharing the clip believing it to be a real incident. However, research by CyberPeace found the viral claim to be false: the video is scripted.
Claim:
An Instagram user posted the video on February 7, 2026, with the caption, “A mother performed her son’s Roka with his girlfriend at a metro station.”

Fact Check:
To verify the claim, we conducted a reverse image search using Google Lens on screenshots from the viral video. We found the same video was first uploaded on February 5, 2026, by an Instagram account named “chalte_phirte098.” The profile belongs to digital content creator Aarav Mavi, who regularly posts relationship and breakup-related videos.

Although the viral clip does not include any disclaimer stating that it is scripted, an older video posted by the creator on December 16, 2025, clarifies that his content is based on real-life stories shared by people but is filmed using professional actors. Several similar staged videos are also available on his profile on Instagram.

Conclusion:
Our research clearly shows that the viral video claiming to show a pre-wedding Roka ceremony at a metro station is not real. It was created by a content creator for entertainment purposes. Therefore, the claim circulating on social media is misleading.

Executive Summary
Social media users, particularly Pakistani propaganda accounts, shared an image showing coffins wrapped in the Indian tricolour and claimed that India violated the ceasefire along the Line of Control (LoC). According to the posts, Pakistan retaliated with heavy firing, captured the Indian Army’s Kumar Top post, and several Indian soldiers were killed in the exchange.
One user wrote, “Breaking News: Indian Army once again violated the ceasefire in the Mandal sector, targeting civilians with mortar shelling. Pakistan responded strongly, captured the Indian Army’s Kumar Top post, and several soldiers were reportedly killed. Calm has now been restored after Pakistan’s response.”

Fact Check
Research by CyberPeace found the viral claim to be false. Using reverse image search, we traced the viral photo to the Shutterstock website. The image description states that it was taken on August 6, 2013, and shows Indian Army personnel standing near the coffins of soldiers who were killed by Pakistani infiltrators at a brigade headquarters in Poonch, located about 240 km from Jammu. This confirms that the image is old and unrelated to recent developments along the Line of Control.

Further verification led us to a report published by NBC News on August 8, 2013, which also featured the same visual in connection with the 2013 cross-border attack.

Additionally, posts from the official X (formerly Twitter) handle of the Indian Army 16 Corps (White Knight Corps) stated that based on intelligence inputs and continuous surveillance, suspicious terrorist activity was detected near Nathua Tibba in the Sunderbani sector close to the LoC in the early hours of February 19, 2026. Alert troops responded promptly and successfully foiled the infiltration attempt. The Army also confirmed that operational vigilance remains high across the sector. However, there were no reports of casualties due to Pakistani firing.

Conclusion:
The viral image showing coffins of Indian soldiers is not recent but dates back to 2013. There are no confirmed reports of casualties from Pakistani firing along the Line of Control in the current context. Therefore, the claim circulating on social media is misleading.

Executive Summary
A video circulating widely on social media shows a man interacting with a humanoid robot and using abusive language, after which the robot asks him to maintain politeness. Several users shared the clip claiming that the incident took place during a recent AI summit in New Delhi. The video triggered strong reactions online, with some users demanding legal action against the individual. However, research by CyberPeace found the claim to be misleading.
Claim
Social media users claimed that the viral video showing a man abusing a robot was recorded during an AI summit in New Delhi, India.

Fact Check
To verify the claim, we conducted a reverse image search of the individual seen in the video. The search led us to an Instagram post uploaded by a Pakistani account identifying the individual as Kashif Zameer.

Further keyword searches helped us locate his Instagram profile, where the same video had been uploaded on February 17, 2026. The post included hashtags such as “Dubai,” indicating the actual location of the incident. The profile also lists Lahore, Pakistan, as the user’s location and describes him as a businessman and social media personality.

To confirm the location shown in the video, we conducted additional searches using keywords such as “Dubai” and “humanoid robot.” The research revealed that the robot featured in the clip is “Ameca,” located at the Museum of the Future in Dubai.

Conclusion
The viral claim is false. The video is not related to any AI summit held in New Delhi. The incident occurred in Dubai, and the person seen in the video is not an Indian citizen.

Executive Summary
A purported news clip circulating on social media claims that the Bharatiya Janata Party (BJP) purchased Bhupen Bora, a leader of the Indian National Congress, for ₹50 crore as part of a political deal in Assam. The viral clip further alleges that the transaction took place under the leadership of Assam Chief Minister Himanta Biswa Sarma and included an agreement to induct several Congress leaders into the BJP.
However, research by CyberPeace found the viral claim to be false and revealed that the original news video had been manipulated using AI and shared with misleading claims.
Claim
On February 18, 2026, a user shared the viral video on Facebook, claiming that the Assam BJP had bought a Congress leader who had lost the last three elections for ₹50 crore, and that the alleged deal led by Himanta Biswa Sarma had drawn public criticism.

Fact Check:
To verify the authenticity of the claim, we extracted key frames from the viral video and conducted a reverse image search using Google Lens. During the research, we found the original version of the video published on the website of Aaj Tak on February 16, 2026. In the original report, the anchor is only seen reporting on Bhupen Bora’s resignation from the party. The report does not mention any alleged financial transaction or political deal, contrary to the claims made in the viral clip.

In the next stage of the research, the viral video was analysed using the AI detection tool Aurigin AI, which identified the video as AI-generated.

Conclusion
Our research found that users had manipulated the original news broadcast using AI and shared it with misleading claims. The viral clip does not show any real financial deal between Bhupen Bora and the Assam Chief Minister.

Executive Summary
A shocking video showing a car hanging from a highway signboard is going viral on social media. The clip allegedly shows a black Mahindra Thar stuck on an overhead direction signboard on the Delhi–Jaipur Highway (NH-48). Social media users are widely sharing the video, claiming it shows a real road accident. However, research by CyberPeace found the viral claim to be false. Our findings reveal that the circulating video is not real but AI-generated.
Claim
Social media users are sharing the clip as footage of an actual road accident. A viral post on X (formerly Twitter) claims that the incident took place on the Delhi–Jaipur Highway, showing a black Mahindra & Mahindra Thar lodged in a highway signboard.
- https://x.com/SenBaijnath/status/2024098520006029504
- https://archive.ph/cmr5e

Fact Check
On closely examining the viral video, several inconsistencies were observed that are commonly associated with AI-generated content. For instance, it appears highly improbable for a heavy vehicle to get stuck precisely at the center of a signboard at such a height. Despite the scale of the alleged incident, traffic on the highway below continues moving normally without any disruption. Additionally, the text visible on the right side of the signboard appears distorted and unusually written. To further verify the authenticity of the video, we analysed it using the AI detection tool Hive Moderation, which indicated a 99.9% probability that the video was AI-generated.

Another AI image detection tool, WasItAI, also found that the visuals in the viral clip were largely AI-generated.

Conclusion
Based on our research and available evidence, it is clear that the viral video showing a Mahindra Thar hanging from a highway signboard is not real but AI-generated.

Executive Summary
A video circulating on social media claims that a Pakistani man misbehaved with TV anchor Rubika Liyaquat during a live television debate. Users sharing the clip alleged that the Pakistani participant silenced the anchor on live TV.
However, research by CyberPeace found the viral claim to be false and revealed that the video being shared on social media is edited. In the original video, published on YouTube on November 26, 2025, the alleged Pakistani man was not present in the TV debate.
Claim
On February 13, 2026, a user shared the viral clip on X (formerly Twitter), claiming that the anchor was insulted during the debate and was left speechless. Another user on February 11, 2026, asked News18 India to verify the video and questioned who allowed such behaviour towards the journalist on air.

Fact Check:
To verify the claim, we extracted key frames from the viral video and conducted a reverse image search using Google Lens. During the research, we found the full version of the debate uploaded on the official YouTube channel of News18 India on November 26, 2025. The nearly 40-minute original broadcast featured anchor Rubika Liyaquat along with panelists Zafar Islam, Varun Purohit, Prateek Kumar, Arvind Kumar Vajpayee, Tausif Ahmed Khan, and Aziz Khan. However, the person seen misbehaving with the anchor in the viral clip was not present in the original video.

Upon carefully reviewing the footage, we located the actual segment around the 25-minute 40-second mark. In this portion, the anchor can be heard asking panelist Tausif Ahmed Khan to leave the show, using the same words heard in the viral clip. However, the original broadcast does not feature any Pakistani participant or any individual named “Nadeem Shahzad.”

Conclusion
Our research found that the viral claim is false. The circulating video has been edited, and the alleged Pakistani participant does not appear in the original debate uploaded on November 26, 2025.