
OpenAI warns users against forming emotional bond with its GPT-4o chatbot



OpenAI has issued a warning to its users after revealing that some had begun developing feelings for its GPT-4o chatbot.

In a “system card” blog post, the AI company highlighted the risks associated with “anthropomorphization and emotional reliance,” which involves attributing human-like behaviors and characteristics to nonhuman entities, such as AI models.

OpenAI stated that the risk may be heightened by GPT-4o’s more advanced audio capabilities, which sound more realistic. According to the company, early testing revealed that some users employed language suggesting they were forming a bond with the model — for example, expressions of shared connection such as “This is our last day together.”

The phenomenon might have broader social implications. “Human-like socialization with an AI model may produce externalities impacting human-to-human interactions,” OpenAI continued. “For instance, users might form social relationships with the AI, reducing their need for human interaction—potentially benefiting lonely individuals but possibly affecting healthy relationships.”

Omni models like GPT-4o can remember key details across a conversation, but this capability may also encourage over-reliance on technological interactions.

OpenAI added that it would study the potential for emotional reliance, and how deeper integration of its models’ and systems’ features with the audio tool may shape behavior and lead people to form bonds with it.

That said, the company claims that the models are “deferential,” allowing users to interrupt and “take the mic” at any time.

Concerningly, OpenAI also noted that GPT-4o can sometimes “unintentionally generate an output emulating the user’s voice.” This means it could potentially be used to impersonate someone — a capability that could be exploited for nefarious purposes by anyone from criminals to malicious ex-partners.

Life imitating art as GPT-4o compared to Her

Some users have taken to X to comment on the strange development, with one calling it “creepy” and similar to the plot of the 2013 movie “Her.” Another likened the impact to the science-fiction series “Black Mirror.”

Coincidentally, OpenAI removed one of the voices used by GPT-4o after it was likened to the actress Scarlett Johansson and the character she played in “Her.”

Featured image: Canva / Ideogram

The post OpenAI warns users against forming emotional bond with its GPT-4o chatbot appeared first on ReadWrite.

