The company also said that during the election year it collaborated with the National Association of Secretaries of State to direct people to official sites on how and where to vote, in addition to encouraging users to consult additional news sources from some of its partners, such as the Associated Press or Reuters.
However, these collaborations exist only in the United States, which is concerning given that ChatGPT operates globally.
In Mexico, for example, there is no direct collaboration with the National Electoral Institute (INE) or any other entity; instead, the tool relies on the INE website, official statements, and information from “highly credible” media so that users can obtain accurate and up-to-date data, though the company did not provide further details.
A persistent problem
According to data from the machine learning company Clarity, the number of deepfakes has increased by 900% year over year, with many of them aimed at destabilizing electoral processes in the United States.
According to a report published in October by OpenAI, the company had disrupted “more than 20 deceptive operations and networks around the world that attempted to use our models” for everything from AI-generated articles to social media posts from fake accounts.
For officials around the world, generative AI represents a worrying potential for misinformation, because even after refinement, large language models often continue to produce inaccurate and unreliable information.
Despite not agreeing with regulations like California’s, OpenAI has supported the Protect Elections from Deceptive AI Act, a bipartisan bill in the US that would prohibit the distribution of misleading AI-generated audio, images, or videos of federal candidates in political advertising.
“We do not want our technology, or any AI technology, to be used to mislead voters and believe this legislation represents an important step in addressing this challenge in the context of political advertising,” the company says.