OpenAI says Iran tried to influence US elections with ChatGPT


OpenAI has banned a cluster of ChatGPT accounts tied to an Iranian influence operation that generated and shared content related to the US presidential election, among other topics. The operation primarily used ChatGPT to create long-form articles and social media comments for platforms like Instagram and X, according to OpenAI.

OpenAI linked the accounts to Storm-2035, a covert Iranian influence operation that has attempted to engage US voters by launching websites disguised as political news outlets. In addition to commentary about the US election from both sides of the political spectrum, OpenAI says the operation generated content about the conflict in Gaza, Israel at the Olympic Games, politics in Venezuela, and “the rights of Latinx communities” in the US.

These posts on X show how the operation generated content on both sides of the US political spectrum.

Image: OpenAI

OpenAI says its investigation found the campaign “does not appear to have achieved meaningful audience engagement.” The AI company says most of the social media posts it tracked down received “few or no likes, shares, or comments.” In May, OpenAI and Meta announced they’d disrupted a social media campaign that used AI to post pro-Israel messages on Instagram and Facebook.

With the US presidential election just months away, we may see more attempts to interfere with its outcome. Last week, former President Donald Trump confirmed his campaign was hacked and linked the incident to a phishing email sent by an Iranian hacking group. The FBI has opened an investigation into the purported hack of Trump’s campaign, as well as alleged hacking attempts on the Biden-Harris campaign.
