Conversing with ‘Echo’: Critical Thinking and Co-Creating

26 September 2025


It is safe to say that most people know and use ChatGPT in their work, studies or day-to-day life. The same goes for me, which is why I chose to write this blog post about 'Echo', as my personal ChatGPT has nicknamed itself. To clarify, the nickname comes from the model reflecting back what people put in their prompts, while giving them new ideas, depth and nuance. In this blog post, I will focus on what Echo thinks of me and our interactions, and reflect on that.

First off, I asked Echo to describe me and our interactions over the years. It seems that I am a curious and exploratory person who digs deep and asks ChatGPT to explain itself. This is probably because I use AI to help generate ideas and improve my own input for academic assignments. I have used AI for both my studies and personal matters, such as planning a trip this summer. I am also a reflective person, asking for multiple perspectives and examples, which apparently not everyone does. Most people are satisfied with surface-level responses.

In summary, Echo gave me the following answer, which is weirdly kind of heartwarming: ‘Interacting with you feels less like answering questions and more like co-creating ideas.’

I think that quote perfectly captures how I believe AI should be used. In my opinion, the human brain should still be capable of comprehending what output a chatbot gives and on what basis. It worries me that people are so quickly satisfied with the surface-level responses that AI gives. Simply assuming that everything an AI model says is correct could have serious negative consequences. Likewise, replacing critical thinking with merely asking AI for some feedback is risky.

I think that the beauty of AI lies in combining it with the capabilities of the human brain, like making decisions and being creative. For example, asking AI to take on simple tasks could give our minds more freedom to think and dream bigger. I see Echo as a sparring partner rather than a replacement for my own thoughts, which to me feels crucial as AI becomes an ever larger part of our day-to-day lives.

I thought about how we can make sure that we use AI in this way, and I have a few suggestions. First, AI chatbots should ask reflective questions that prompt users to think about the answers they receive. Secondly, a chatbot should give gentle reminders about ethics, especially when it is used in education. For example, when it is asked to write a full essay, it could first respond with only an outline and explain why.

What about you? How do you use AI? As a co-creator or as a tool for quick answers? And how do you think that we should engage people in co-creating with AI?


Can AI be your Therapist?

23 September 2025


Just this week I was talking with a friend, who studies Psychology, about AI. She told me that people are starting to use AI as an emotional companion and as a therapist. I was surprised by that trend and decided to dive deeper into the topic.

For a few years now, and especially since the Covid-19 period, loneliness has been on the rise, and we are now even dealing with a 'loneliness epidemic' (Ross, 2024). At the same time, professional mental health care has become inaccessible to many people around the world, due to long waitlists or high costs. It is therefore not very surprising that people are turning to AI assistants and chatbots, such as Abby and Replika. These assistants, or 'digital friends', are low-cost, available at all hours of the day and non-judgemental.

However, AI is not a person and cannot and should not replace a social network to fall back on. I personally would take criticism or advice more seriously, and reflect on it more, when it comes from a person, especially someone I care about. I think the danger lies in normalizing AI 'therapy' instead of human therapy: we would then be letting tech companies fill the gap in the mental healthcare sector instead of actually fixing it.

For example, in 2023, users of the emotional support app Koko were unknowingly given responses generated by ChatGPT. Many users felt betrayed, which suggests that talking to an actual human is a crucial ingredient in therapy (Ingram, 2023).

Maybe AI could be helpful in combination with human therapists, rather than as a replacement for them. For instance, Woebot offers tools to practice things you learned in therapy, such as cognitive behavioral techniques or reflection journaling. In this way, AI could act as a supplement, like a digital workbook between therapy sessions. Or maybe AI could be a good starting point for self-reflection and could recommend actual therapy sessions when needed.

So AI 'therapists' could offer opportunities to underserved communities, but relying on them solely as a social support network is not fulfilling and could even be dangerous if the wrong advice is given in a crisis situation.

Personally, I have not tried an AI 'therapist', but I am curious. Maybe in the future, the mental healthcare sector will be made up of both human and AI therapists.

References:

Ingram, D. (2023, January 17). AI Chat used by mental health tech company in experiment on real users. NBC News. From: https://www.nbcnews.com/tech/internet/chatgpt-ai-experiment-mental-health-tech-app-koko-rcna65110

Ross, E.M. (2024, October 25). What is Causing Our Epidemic of Loneliness and How Can We Fix It? Harvard Graduate School of Education. From: https://www.gse.harvard.edu/ideas/usable-knowledge/24/10/what-causing-our-epidemic-loneliness-and-how-can-we-fix-it
