Pentagon Blast: AI-Generated Image Goes Viral, Leads To Stock Market Chaos. Some Twitter Accounts Taken Down
The US Department of Defense has officially confirmed the image to be a fabrication.
A social media frenzy ensued on Monday as an AI-generated image depicting an explosion near a building in the Pentagon complex circulated online, intensifying concerns about the spread of AI-generated misinformation. The image, showing a tall plume of dark grey smoke, spread rapidly on Twitter, with verified accounts among those sharing it. Its origin remains unknown.
The US Department of Defense has confirmed that the image is a fabrication. Nevertheless, CNN reports that its virality briefly impacted the stock market.
The fire department of Arlington, Virginia, located near Washington, DC, acknowledged social media reports regarding the alleged explosion but assured the public that there was no actual threat.
@PFPAOfficial and the ACFD are aware of a social media report circulating online about an explosion near the Pentagon. There is NO explosion or incident taking place at or near the Pentagon reservation, and there is no immediate danger or hazards to the public. pic.twitter.com/uznY0s7deL
— Arlington Fire & EMS (@ArlingtonVaFD) May 22, 2023
One of the verified Twitter accounts that propagated the photo was OSINTdefender, an account with over 336,000 followers that shares news related to international military conflicts.
Sorry for the Confusion and possible Misinformation, there is a lot of Reports and Claims going around right now that I as 1 Person am struggling to get a handle on.
— OSINTdefender (@sentdefender) May 22, 2023
The account's owner expressed regret for spreading the false information and described the incident as an example of how easily such images can manipulate the information landscape, underscoring the dangers they could pose in the future.
Furthermore, some of the verified accounts that shared the photo were suspended by Twitter.
Prime example of the dangers in the pay-to-verify system: This account, which tweeted a (very likely AI-generated) photo of a (fake) story about an explosion at the Pentagon, looks at first glance like a legit Bloomberg news feed. pic.twitter.com/SThErCln0p
— Andy Campbell (@AndyBCampbell) May 22, 2023
This AI-generated image is just one of several that have recently gone viral. Other examples include an image of the Pope wearing a trendy long white puffer coat and a black-and-white, photorealistic image that won a prize at the Sony World Photography Awards. The German artist behind the award-winning image admitted he had submitted it as a playful experiment to test whether competitions were prepared for AI-generated entries. He ultimately declined the award.
The incident also draws attention to the ongoing challenges of verification on Twitter. The platform recently introduced its subscription service, Twitter Blue, changing how the blue check badges previously reserved for verified users are obtained: anyone can now pay $8 per month for a checkmark. Since the change, concerns have grown about the proliferation of accounts impersonating public figures, government officials, and news outlets.