In a recent tweet, a senior employee at OpenAI made quite an interesting comparison. Lilian Weng, the head of safety systems at OpenAI, compared the new version of ChatGPT to a therapist. She shared, “Just had a personal and quite emotional conversation with ChatGPT in voice mode, discussing stress and work-life balance. Felt really ‘heard’ during the conversation.” She even suggested that people should try it as therapy, especially if they have mainly used it for productivity.
This sparked some criticism and concern from other Twitter users. Many pointed out that using ChatGPT as a therapist is problematic and potentially dangerous. One response stated, “This isn’t therapy, and claiming it is can be harmful.” Another user simply said, “Your personal opinion is not appropriate for someone in your position.”
It’s worth noting that Weng is not the first person to use ChatGPT as a therapist, and she certainly won’t be the last. However, given her role in AI safety and her lack of training in actual therapy, her comments are all the more troubling.
OpenAI recently announced a major update to ChatGPT that allows users to have voice conversations with the chatbot and to interact with it using images. With these new features, the company aims to make the bot even more popular.
But let’s remember: as fascinating as advancements in AI may be, using it as a substitute for real therapy is not advisable. When it matters, it’s best to seek help from trained professionals who can provide proper guidance and support.