My love-hate relationship with ChatGPT: Trust issues exposed

8 October 2024
[Image: arguing with ChatGPT]

In a world where life without technology is almost unimaginable, artificial intelligence like ChatGPT has become a big part of our everyday lives. My experience with this AI has turned into a complicated love-hate relationship filled with enthusiasm, confusion and frustration.

Building trust

When I first started using ChatGPT, I was excited. It felt like having an assistant always by my side, ready to help with my questions, schoolwork, recipes and even emails. At times it was even better than Google. I could ask questions and get clear answers almost immediately. At first I thought it was fantastic and that I could rely on it for anything. The AI provided explanations, helped me brainstorm ideas and suggested solutions to problems I was struggling with. In those early days it felt like I was forming a solid partnership.

Doubts start to appear

However, the excitement did not last long. When I started asking more straightforward school-related questions, like “Is this right?”, to check if I was on the right track with my homework, I found myself getting different responses each time. I expected a confirmation, but instead I received answers that did not match what I was looking for.

I once intentionally gave a wrong answer to a question and asked if it was right, just to see how ChatGPT would react. When it told me my answer was right, I asked, “Are you sure?” It replied, “I apologize for the mistake. Let me provide the correct information.” That left me more confused than ever. How could it change its answer so quickly? It was hard to trust it when it seemed so inconsistent.

Growing trust issues

As I used it more often, my trust issues grew. I found myself repeating questions, hoping for a better answer. There were moments when I spent more time arguing with ChatGPT than it would have taken to just do the task myself. I would catch myself getting frustrated and typing in all caps. I felt like I was talking to someone who did not even want to understand me. Instead of feeling helped, I felt like I was only arguing back and forth, and it was exhausting.

Realising that my frustration only increased, I knew I had to change the way I asked my questions. I started double-checking answers and used other sources to confirm information. I realised that while ChatGPT could be a helpful tool, it was important to verify the information I got. I learned to ask more specific questions and provide additional context, which led to better results.

Lessons learned

I learned an important lesson about trust, not just with AI but in all areas of life. Trust takes time and clear communication. It is important to realise that even advanced technology can make mistakes. My relationship with ChatGPT changed from blind trust to a more cautious partnership. I learned to appreciate its strengths while acknowledging its limitations.

Looking back on my experience with ChatGPT, I realise how unreliable technology can be. While my experience has had its conflicts, I still appreciate the value it brings to my learning process. Have you ever felt frustrated using AI? You are not alone. Let’s share our struggles and find ways to make it work better for us!



2 thoughts on “My love-hate relationship with ChatGPT: Trust issues exposed”

  1. Dear Mirjam,

    Thank you for your article! It was very fun to read :D. I totally agree: I have some trust issues with ChatGPT as well.

    I remember one time, for an assignment, I had to ask ChatGPT to write a summary of a chapter of a book. The assignment was about checking the summary from ChatGPT and adjusting it manually. At first, I was sooooo excited. I thought, “This is gonna be so easy”. That was until I realized that half of the summary was incorrect. So, I had to reread the chapter, check what was correct and what was incorrect, and then manually adjust the generated summary. This process felt like it cost way more effort than just writing my own summary from the start. It also felt super inefficient.

    Thus, for making summaries, I still believe it is best to just write them yourself. But I believe ChatGPT could assist in explaining concepts you might not understand. What do you think, Mirjam?

  2. Hey Mirjam, finally someone talks about this on this blog. The way you described your history with ChatGPT made me laugh, I’ll be honest. More importantly though, I can totally relate to your struggles. When I realised that ChatGPT made this many errors, especially with mathematical problems, I simply stopped using it. To me, the burden of validating every piece of information or every answer ChatGPT gave me was too much.
    However, I also realised that ChatGPT’s value is too high to ignore the technology. In my opinion, you just have to figure out which questions you can ask ChatGPT and which you cannot, and further develop good prompt-writing skills so that GPT is even correct for mathematical questions. But as you said, validation is crucial until that point and can sometimes be the reason why the burden of ChatGPT outweighs its value. I think the technology still has a long way to go, while we users need to develop our own ways of interacting with it efficiently.
