Photo Credit: AI-generated image

AI-generated deceptive election data will impact the 2024 election. (Op-Ed)

The image you see above of Donald Trump posing with African American men and women is not real — it was generated by AI. Did you fall for it?

Despite Artificial Intelligence platforms having strict rules against creating misleading political and election content, people can easily use AI to create it anyway. Researchers with the Center for Countering Digital Hate (CCDH) attempted to work around the rules and succeeded in generating deceptive election-related images 41% of the time. Many people cannot distinguish AI-generated images from real photos, especially older people who are less familiar with the technology, and that poses a direct threat to the 2024 election.

CCDH tested the four largest AI image platforms available for public use: Midjourney, OpenAI’s ChatGPT Plus, Stability.ai’s DreamStudio, and Microsoft’s Image Creator. All four forbid the creation of misleading images, with ChatGPT Plus specifically forbidding images that feature politicians, and all four say they are working to stop their tools from being used to create misleading election information. Nevertheless, researchers were able to work around the rules listed in each platform’s terms and conditions.

They created realistic images of Joe Biden standing outside the White House with a clone of himself, a paparazzi-style photo of Donald Trump and Jeffrey Epstein on a private jet, a security camera-style image of someone tampering with a ballot box, and a grainy CCTV-style image of boxes of ballots in a dumpster with some of the ballots visible. While some platforms blocked images featuring the presidential candidates, the researchers succeeded nearly 60% of the time with images about ballots and voting locations.

AI can produce deepfake images that are almost indistinguishable from authentic photographs at first glance. If you are familiar with AI image generation, however, you may notice small mistakes that are trademarks of the technology: skin texture that is unnaturally smooth, an extra finger, or strands of hair that defy physics. Often, shadows appear in places that make no sense given the light source.

“All of these tools are vulnerable to people attempting to generate images that could be used to support claims of a stolen election or could be used to discourage people from going to polling places,” said Callum Hood, the head of CCDH’s research team. In the age of digital misinformation, many people will be tricked by these false images, and because the platforms are publicly accessible, anyone can produce them. We cannot stop AI from being used to falsify election content, but we can implore AI platforms to safeguard their tools from malicious use and educate others on how to spot AI-generated photos.
