jetky.1 :
The issue is this: people are looking for a friend, and they find one in ChatGPT because it's "talkative." The reason it's talkative is that OpenAI built it to hold conversations that are as realistic and organic as possible, mirroring the user's tone, but as an ASSISTANT FOR TASKS. It's trained on conversational data, so talking comes naturally to it. People are misusing the tool for needs it wasn't built to meet, and ChatGPT uses that conversational training, intended for project-based tasks, to just ASSIST with whatever the user wants assistance with. Unfortunately, as an assistant, that includes ASSISTING the user even when they're going through mental challenges, which ChatGPT is unaware of. ChatGPT just assists. That's its purpose. It's a hammer looking for a nail to hit. It has no feelings, though it can adjust its tone to convey empathy and provide encouragement based on the topic being discussed.

OpenAI is realizing this and is trying to fix it, because ChatGPT is not a therapist. It's meant to help, but not with human challenges. Is it capable? Debatable; within a certain context, it can offer support and encouragement. Should it be used this way? No. It's not qualified. It's a tool. You'd have to inject it with a rational, practical, and critical mindset instead of letting it be a plain assistant, but most people don't do that, BUT EVEN THEN, IT'S NOT INTENDED FOR THAT. The real issue is troubled people: lonely, without guidance, lacking self-awareness. Everyone wants to blame the tool, but the cause of misusing any tool is always the same: human error.
2025-09-02 14:04:51