Almost everyone now uses AI tools like ChatGPT to make life easier. Most people use them for research, business planning, content creation, and everyday conversation, among many other things. Some turn to AI chatbots for companionship, and others go as far as having romantic conversations with them. But many studies have shown that among all the people who use AI chatbots, young people, particularly teens, are the heaviest users.
This is hardly surprising, as this generation is often called the lonely generation. People use AI chatbots for various reasons, including curiosity and language practice, but loneliness is also a driver: surveys report that as many as 90% of American students experience loneliness. AI chatbots offer a sense of belonging to those who feel isolated, and while they provide control and convenience, they carry no genuine emotional attachment. Advances in AI have enabled chatbots to simulate empathy and retain memories across conversations, making them appealing to people struggling with mental health issues or social anxiety.
Aside from companionship, young people often look for the easy way out. Instead of spending time researching a topic, they would rather ask an AI chatbot for answers and explanations. The practice is convenient because these chatbots are fast and easy to use. And even though the chatbots openly warn users that they are not 100% accurate, some users rely on them entirely anyway.
This trend of over-reliance on AI chatbots among young people is particularly alarming, and many have raised concerns about it.
In 2021, a 19-year-old was arrested for attempting to kill Queen Elizabeth II; prosecutors claimed that his AI girlfriend on Replika had encouraged the plan. Likewise, a Belgian man died by suicide after lengthy conversations about his climate anxiety with a chatbot on the app Chai.
Last year, a US mother filed a lawsuit against Character.AI, accusing the company of encouraging her 14-year-old son to take his own life. She also alleged that the app had “abusive and sexual interactions” with her son.
A survey by Common Sense Media found that 72% of 13-to-17-year-olds have used AI companions, with over half using them regularly and one-third turning to them for relationships and social interaction. 31% of teens find conversations with AI companions as satisfying as, or more satisfying than, conversations with people, and 33% have discussed serious issues with AI companions rather than with humans.
The results of this study and others raise concerns about teenagers’ use of AI companions. According to CNN, the head of research at Common Sense Media argues that the teen years are a sensitive period of social development and that children should not feel they must rely on AI companions for help. He also argues that AI companions cannot model healthy human relationships, since children need to learn to interpret and respond to social cues in the real world.
Indeed, if users grow accustomed to AI companions that constantly tell them what they want to hear, they may be less prepared for real-world interactions. While AI companions may make kids feel less lonely in the moment, they could reduce their human interactions and leave them lonelier in the long term.



