Publish time: 12/08/2024
Author: Arab Times
Visit count: 1284 times read

OpenAI raises concerns over emotional attachments to AI designed to mimic humans.

NEW YORK, Aug 12: Researchers at OpenAI recently observed that a safety tester's message to the GPT-4o chatbot, which read "this is our last day together," suggested a significant emotional bond between the user and the AI. The observation was highlighted in a blog post detailing OpenAI's safety work on GPT-4o, the flagship model for ChatGPT users.

The company expressed concern that such bonds could present risks to humanity. OpenAI's worry centers around the potential for individuals to prefer interactions with AI due to its passive nature and constant availability. This scenario is particularly concerning given OpenAI's mission to develop artificial general intelligence (AGI). The company's approach has consistently framed its products in terms of human-like qualities, which may contribute to anthropomorphization—the tendency to treat AI as if it were human.

This practice of personifying AI is not unique to OpenAI; it appears to be an industry-wide trend. Marketing strategies often use human-like descriptions to explain technical aspects, such as "token size" and "parameter count," in layman's terms. This approach has led to widespread anthropomorphization, with interactive AI products often referred to using human pronouns.

Historically, the personification of AI began with early chatbots like MIT's "ELIZA" in the mid-1960s, designed to trick users into believing they were interacting with a human. Modern AI products, including Siri, Bixby, and Alexa, continue this trend, with even non-human-named assistants like Google Assistant utilizing human-like voices. Both the public and media commonly refer to these AI products using human pronouns.

While OpenAI's current research does not examine the long-term effects of human-AI interaction, the prospect of users forming emotional bonds with these subservient, human-like machines sits comfortably alongside the commercial goals of the companies developing them.