Fatima Sultakeeva, Master’s Student, Graduate School of Data Science

AI has become an integral part of daily life. ChatGPT in particular is available everywhere and at any time, and it is increasingly used as a therapeutic tool, especially among younger generations, because it responds instantly to their concerns. One reason for this trend is the difficulty of accessing traditional therapy, whether because of time constraints or the high cost of consultations. According to a survey published in Harvard Business Review last April, the most common application of generative AI is “therapy and companionship.” Anyone with a phone and Internet access can receive this kind of “therapy,” which is a significant advantage. But does it truly help, or could it make things worse? Either way, young people are increasingly turning to AI as a psychotherapist or coach.

But we must recognize AI’s significant limitations. It lacks human feelings, emotion, and empathy, and most importantly, it cannot make a diagnosis. Relying on it for serious mental health issues can be dangerous. There is also the matter of privacy. In March 2023, OpenAI reported a data leak in which some ChatGPT users could see the titles of other users’ conversations and, in some cases, others’ names, email addresses, and partial payment details. Although the issue was quickly fixed, it showed that the personal problems you share with the model could be exposed. In today’s digital world, data can easily be misused. AI may serve as a supportive tool, but it cannot replace a professional psychologist. Users should remain cautious and treat it as helpful but ultimately limited.

By Fatima Sultakeeva, Master’s Student, Graduate School of Data Science
