New York AG Letitia James warns tech giants against allowing election-related misinformation

August 13, 2024

Letitia James, New York’s attorney general, speaks during a news conference in New York, Feb. 16, 2024. (Jeenah Moon/Bloomberg via Getty Images)

(NEW YORK) — New York Attorney General Letitia James is asking nearly a dozen large tech companies to take meaningful steps to protect voters from election-related misinformation, according to a letter obtained exclusively by ABC News.

“While misinformation has been a concern in past elections, with the rise of gen AI, barriers that prevent bad actors from creating deceptive or misleading content have weakened dramatically,” said the letter, which was sent to 10 social media and AI companies, including Meta, Google and OpenAI.

The letter said the generative AI tools the recipients have built have “become increasingly popular and easy to use and misuse.”

Deceptive and misleading content about the 2024 presidential election has been circulating online, and generative AI has been making it increasingly difficult for users to distinguish fact from fiction.

In an altered campaign video of Vice President Kamala Harris last month, her original audio was swapped out and replaced with an AI voice-clone mimicking her voice to make her say things she never said. The creator posted the video on the social media platform X, along with a disclaimer saying it was a parody — but the video then garnered massive attention after it was reposted by X owner Elon Musk, who did not explicitly say it was satire.

In January, a robocall appearing to impersonate the voice of President Joe Biden encouraged recipients of the call to “save your vote” for the November general election, rather than participate in the New Hampshire primary, according to audio obtained by ABC News.

James isn’t the only politician raising the alarm about election-related AI.

Last month, the secretaries of state from Minnesota, Michigan, New Mexico, Washington and Pennsylvania sent a public letter to Musk calling for X’s AI search assistant, “Grok,” to direct voters seeking election information to the nonpartisan CanIVote.org, as ChatGPT’s administrator, OpenAI, already does.

“As tens of millions of voters in the U.S. seek basic information about voting in this major election year, X has the responsibility to ensure all voters using your platform have access to guidance that reflects true and accurate information about their constitutional right to vote,” read the letter.

A recent study by AI Forensics, a European nonprofit investigating the impact of AI, found that Microsoft Copilot’s answers to simple election-related questions contained factual errors 30% of the time. Following that investigation, as well as a request for information from the European Commission, Microsoft and Google introduced “moderation layers” to their chatbots so that they refuse to answer election-related prompts, AI Forensics told ABC News.

In February 2024, most of the technology companies James addressed in this week’s letter signed a voluntary pact to prevent AI tools from being used to disrupt democratic elections around the world. The companies did not commit to banning or removing deepfakes; instead, they outlined methods to try to detect and label deceptive AI content when it is created or distributed on their platforms.

In her letter, James seeks written responses to questions about the companies’ policies and practices, as well as an in-person meeting with corporate representatives to review the steps they are taking to protect voters from misinformation.

The letter said nothing about the companies’ obligation to comply, though implicit in any request from the state attorney general is the possibility of an enforcement action.

Copyright © 2024, ABC Audio. All rights reserved.

