New Addiction in Town: From Curiosity to Balance with AI Tools

5 October 2025


In my first year of university, I never used AI tools, and everything went fine. I managed my studies, wrote assignments, and came up with ideas on my own. But around the start of my second year, I stumbled upon ChatGPT, and it changed everything. Suddenly, whenever I did not understand something, I could ask for a detailed explanation, and my spelling and grammar were corrected in an instant. It felt like having a 24/7 tutor who could save me hours of effort.

The more I used it, the more I experimented with it: from brainstorming to making outlines to looking for creative input. But slowly, I noticed a shift. I was no longer independent when it came to ideas. Instead of pushing myself to think, I automatically turned to ChatGPT. Sometimes I told myself, “I could have thought of that too”, but the truth was, I did not.

That realisation made me pause. I decided to use AI more thoughtfully, as an aid rather than a replacement. Now I use NotebookLM when I need to compare articles or find key arguments quickly, which saves me time without replacing my voice. For grammar and quick summaries, AI is still a huge help. But when it comes to generating original ideas or elaborate arguments, I want to challenge myself first. Finding this balance has not always been easy, but it has made me more aware of how I learn. I now see AI not as a shortcut, but as something I can actively control in my process.

Baidoo-Anu and Owusu Ansah (2023) address this tension too. They argue that AI can promote personalised and interactive learning, helping students save time and receive feedback, but they warn that blind application may result in uncontrolled dependence and misinformation. This made me reflect on my own experience and on how I want to use these tools moving forward. In my case, balance is the solution: AI can sharpen, but not replace, my own thinking.

Reference

Baidoo-Anu, D., & Owusu Ansah, L. (2023). Education in the era of generative artificial intelligence (AI): Understanding the potential benefits of ChatGPT in promoting teaching and learning. Journal of AI, 7(1), 52–62. https://doi.org/10.61969/jai.1337500


When Chatbots Listen: Bridging the Gap or Widening It?

18 September 2025


I recently read an article in The Guardian about the rising use of AI chatbots for mental health support and the concerns therapists are voicing (Hall, 2025). It made me think about my own surroundings. Some of my friends actually enjoy using these tools. For them, opening an app feels less intimidating than calling a clinic, and it is often much faster. With long waiting lists and high session costs, AI feels like an accessible alternative.

Artificial intelligence is rapidly becoming a key player in mental health care, promising to make support more accessible and personalised. It can pick up subtle signals and flag early signs of mental illness before symptoms become severe, and it can even create tailored treatment plans and adjust therapy dynamically as patients progress, reducing the frustrating trial-and-error process (Olawade et al., 2024; Thakkar et al., 2024).

These innovations bring clear benefits. They can shorten waiting times, make care more scalable, and offer a low-threshold way for people to seek help without immediately committing to therapy sessions. For many, an AI chatbot can be a first step: a safe, stigma-free place to express feelings before speaking to a professional.

But there are real reasons to be cautious. AI can analyse data and spot patterns with impressive precision, but it cannot feel empathy or fully understand context. A trained therapist can notice subtle emotional shifts and offer the human connection that a chatbot simply cannot provide. Without this, care risks becoming too standardised and detached.

Another concern is privacy. Sharing intimate thoughts with an app means trusting that your data is stored securely and used ethically. Unfortunately, this is not always the case: research has shown that some mental health apps share data with third parties or use it for targeted advertising, often without users realising it (Murdoch, 2021). This makes it even more important to demand transparency and strict data protection if AI is to play a safe role in mental health care.

I believe the future of AI in mental health care is hybrid. AI can function as a bridge, providing rapid support, while therapists do the more in-depth emotional work. Together, they can make mental health treatment more accessible than ever.

Do you think that AI chatbots should play a bigger role in mental health care, or do we risk replacing too much of the human connection?

References

Hall, R. (2025, August 30). ‘Sliding into an abyss’: Experts warn over rising use of AI for mental health support. The Guardian. https://www.theguardian.com/

Murdoch, B. (2021). Privacy and artificial intelligence: Challenges for protecting health information in a new era. BMC Medical Ethics, 22(1), 122. https://doi.org/10.1186/s12910-021-00687-3

Olawade, D. B., Wada, O. Z., Odetayo, A., David-Olawade, A. C., Asaolu, F., & Eberhardt, J. (2024). Enhancing mental health with Artificial Intelligence: Current trends and future prospects. Journal of Medicine, Surgery, and Public Health, 3, 100099. https://doi.org/10.1016/j.glmedi.2024.100099

Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6, 1280235. https://doi.org/10.3389/fdgth.2024.1280235
