A fake video circulated on social media platforms purporting to show a mass knife attack on a London-bound train, just hours after news of such an incident broke. Investigators at France 24's Truth or Fake unit quickly debunked the clip as entirely fabricated.
The viral video appeared to show passengers fleeing in terror from a knife-wielding attacker as the train came to a halt. But a close examination revealed several telling inconsistencies that raised suspicions about its authenticity.
Firstly, the footage showed what appeared to be multiple perpetrators wielding knives, but their faces and clothing were never clearly visible for long enough to identify them with any certainty.
Secondly, the video seemed to show the train braking suddenly, yet the accompanying audio was muffled and indistinct, making it difficult to verify whether the train really did come to a sudden stop as depicted in the clip.
Lastly, experts noted that the video appeared too smooth and polished to be genuine emergency footage captured on a smartphone or other camera, which typically shows camera shake, audio distortion, and heavier compression.
In light of these red flags, France 24's Truth or Fake unit concluded that the video was likely AI-generated, created solely to deceive viewers into believing it showed an actual attack.