Oh my god, my doctor is a robot!

20 September 2022


“Hello, my name is Molly. I am your virtual medical assistant. How can I help you?” This is the new opening line that patients today, and increasingly in the future, will hear first. A patient who wants to know why they have severe headaches has to call their doctor, schedule an appointment, go to the practice, wait for a long time, and meet the doctor, only to have them say the three famous words, “Take a paracetamol.” With the emergence of artificial intelligence (AI) powered chatbots, the patient’s journey will be cut in half from the first step onwards (Sennaar, 2019).

Natural language processing (NLP) is the underlying technology for chatbots. This technology falls under AI and aims to analyse texts through computerized means and to gather knowledge on how humans understand and use language (Joseph et al., 2016). NLP is applied across many industries because of its ability to recognize human speech, understand and process natural language, generate text that can be read and interpreted by humans, and measure the sentiment behind speech and text (Eggers et al., 2018). Within the healthcare industry it has become increasingly important thanks to these abilities to search, analyse, and interpret large amounts of patient data (Foresee Medical, n.d.). Current areas of impact for NLP within healthcare are remote prevention and care, diagnostics support, treatment pathways support, drug discovery and development, operations, marketing and sales, and other support functions (Aboshiha et al., 2021).
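To make one of these capabilities, sentiment measurement, a bit more concrete, here is a minimal sketch in Python. It is purely illustrative and is not the technology behind Molly, Sensely, or any product named in this post; it assumes the open-source Hugging Face `transformers` library and its off-the-shelf sentiment pipeline, and the patient message is made up.

```python
# Minimal illustration of one NLP capability (sentiment measurement) applied
# to a free-text patient message. Assumes `pip install transformers` and an
# internet connection to download a default pre-trained model on first use.
from transformers import pipeline

# Off-the-shelf general-purpose sentiment classifier; not a medical model.
sentiment_classifier = pipeline("sentiment-analysis")

# Hypothetical patient message, invented for this example.
patient_message = "I've had severe headaches for three days and I'm getting worried."

result = sentiment_classifier(patient_message)[0]
print(f"{result['label']} ({result['score']:.2f})")  # e.g. "NEGATIVE (0.99)"
```

A real healthcare chatbot would combine this kind of signal with intent recognition and entity extraction (symptoms, durations, medications) before deciding how to respond.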

Chatbots within healthcare aim to take over simple tasks from medical professionals, which saves them time to focus on their actual job and eliminates unnecessary work on their behalf (Teo, 2022). Chatbots ask simple questions and, based on the patient’s answers, analyse them and provide a response, whether that is giving the patient health-related information, setting up appointments, sending appointment reminders, or providing information on possible health conditions when the patient describes their symptoms (Curtis, 2021). The introduction of chatbots as a supporting technology within the healthcare value chain has brought a list of benefits such as enhanced patient engagement, symptom assessment before in-person appointments, doctor and patient consultation management, reduced waiting times, cost reduction, scalability, and timely medical advice (Mousumi, 2022; Teo, 2022). The video below depicts how this could look, illustrating Sensely’s Virtual Medical Assistant in action (ExpectLabs, 2015).
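For readers curious how the “ask simple questions, analyse the answers, provide a response” loop could be structured, the sketch below shows a deliberately simplified, rule-based triage flow in Python. Everything in it, the symptom phrases, the advice strings, and the `triage` function, is hypothetical and made up for illustration; real assistants such as Sensely’s rely on much richer NLP and clinically validated protocols.

```python
# A hypothetical, rule-based triage sketch: match a patient's free-text
# message against known symptom phrases and return canned advice.
# Not medical advice and not how any real product works.
TRIAGE_RULES = {
    "chest pain": "Please seek urgent care or call emergency services.",
    "severe headache": "We can book you a same-day appointment with your GP.",
    "mild cold": "Rest, fluids, and paracetamol; contact us if symptoms last over a week.",
}


def triage(message: str) -> str:
    """Return advice for the first known symptom phrase found in the message."""
    text = message.lower()
    for symptom, advice in TRIAGE_RULES.items():
        if symptom in text:
            return advice
    return "I couldn't assess that. Would you like to schedule an appointment?"


if __name__ == "__main__":
    print(triage("I've had a severe headache since Monday."))
```

Production chatbots replace the keyword matching with trained intent and entity models, and add escalation paths to human clinicians, but the overall question–analyse–respond loop is the same.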

The future of chatbots is bright, as it is predicted that AI and NLP technology will keep improving and developing to further enhance the journey for both patients and healthcare providers (Eggers et al., 2018). However, with such technology one should be aware of its challenges, such as trust issues among patients due to privacy concerns and doctors leaving it all up to “robots”, cybercrime, and the question, “Who is accountable if something goes wrong?” (Thomas, 2022).

References

Aboshiha, A., Gallagher, R., & Gargan, L. (2021, December 15). Chasing value as AI transforms health care. BCG Global. Retrieved September 19, 2022, from https://www.bcg.com/publications/2019/chasing-value-as-ai-transforms-health-care

Curtis, B. (2021, November 22). Chatbots in Healthcare: 5 best solutions and use cases. YourTechDiet. Retrieved September 19, 2022, from https://yourtechdiet.com/blogs/healthcare-chatbots-2/

Eggers, W. D., Malik, N., & Gracie, M. (2018). Using AI to unleash the power of unstructured government data (Report, pp. 1–20). Deloitte Insights.

ExpectLabs. (2015, June 18). Sense.ly virtual nurse, powered by Mindmeld. YouTube. Retrieved September 19, 2022, from https://www.youtube.com/watch?v=gUfRc_aIntA&t=17s

Foresee Medical. (n.d.). Natural language processing in healthcare medical records. ForeSee Medical. Retrieved September 19, 2022, from https://www.foreseemed.com/natural-language-processing-in-healthcare

Joseph, S. R., Hlomani, H., Letsholo, K., Kaniwa, F., & Sedimo, K. (2016). Natural language processing: A review. International Journal of Research in Engineering and Applied Sciences, 6(3), 207–210. https://www.researchgate.net/profile/Sethunya-Joseph/publication/309210149_Natural_Language_Processing_A_Review/links/5805ea1f08ae03256b75d965/Natural-Language-Processing-A-Review.pdf

Mousumi. (2022, June 8). Top 5 healthcare chatbot uses cases. Kommunicate Blog. Retrieved September 19, 2022, from https://www.kommunicate.io/blog/top-5-use-cases-of-chatbots-in-healthcare/

Sennaar, K. (2019, December 13). Chatbots for healthcare – comparing 5 current applications. Emerj Artificial Intelligence Research. Retrieved September 19, 2022, from https://emerj.com/ai-application-comparisons/chatbots-for-healthcare-comparison/

Teo, P. (2022, February 21). Healthcare Chatbots: Use cases, examples and benefits. KeyReply. Retrieved September 19, 2022, from https://keyreply.com/blog/healthcare-chatbots/

Thomas, L. (2022, May 4). The pros and cons of healthcare chatbots. News Medical. Retrieved September 19, 2022, from https://www.news-medical.net/health/The-Pros-and-Cons-of-Healthcare-Chatbots.aspx#:~:text=Moreover%2C%20as%20patients%20grow%20to,self%2Ddiagnose%20once%20too%20often.


6 thoughts on “Oh my god, my doctor is a robot!”

  1. Great blog with an extensive use of scientific sources! This blog sketches a bright, positive future for chatbots in healthcare. However, I think that the downsides of AI-based robots could be assessed more in-depth to give a complete image of the development. As you said, privacy and responsibility are two challenges, but in the future we also have to think about ethical issues. For example, are we fine with the idea of the elderly being supported by functional or chat robots, which leads to less human interaction?

    1. Hi Frank, thanks a lot for your comment! Appreciate the feedback 🙂 Yes, the challenges, limitations, and future implications of AI are something I definitely would have loved to discuss further. Since it was a short blog I decided to just touch upon them; if the blog assignment hadn’t called for different unique topics, I would definitely have written about the downsides of AI within healthcare. There are a lot of challenges, as you mentioned, like ease of use for the elderly, and additionally accountability and privacy concerns. An interesting question I asked myself when writing this is: what if the AI chatbot misdiagnoses someone, leading to potentially fatal outcomes? Who is accountable? Who is to blame? I also believe the healthcare industry is one of the slowest in terms of the adoption of new technologies, which has been seen in the cases of new innovative medical devices because of all the stakeholders involved. Last but not least, it is human life that is on the line.

  2. First, very interesting insights, and I agree with you that this kind of technology could be the future of low-grade healthcare and could take over part of the GP’s work. However, I also think it will take time before people take medical advice from a robot or AI-controlled system and are satisfied with it. If the robot doesn’t give the answer that people might want to hear, I think they will always want to hear from a medical specialist before agreeing to the outcome. I think it will be a difficult task for the robot to build up enough trust that people will fully rely on this technology. What do you think about this? Do you think this tipping point will eventually come, or will it be an insurmountable problem in this sector for AI?

    1. Hi Tom, thanks for your comment! My blog indeed doesn’t dive further into the limitations, which I would have loved to do in a second blog. But yes, the healthcare industry is probably one of the slowest adopters of new technologies, which I have seen myself in the slow adoption rate of new medical devices. In the end, we are talking about human life on the line, so I believe it will take a long time before AI chatbots completely diagnose patients. Patients also like to hear from their doctor rather than a robot because of factors like trust and human interaction. I also think there needs to be a balance in how much you leave up to the AI and how much to the doctor, because if you leave too much to the AI, patients might lose trust and question why doctors are not putting in the effort to diagnose them. If it’s not managed well and balanced properly, AI can be a big problem for healthcare, but if done well it can prove to be a success factor in treating patients and improving healthcare operations.

  3. Thanks for writing about this topic; I also considered writing about it myself because I think it is very relevant, so great job! We all know that the healthcare sector in our country is understaffed, and with the increasing number of elderly people, the workload is only going to keep increasing. I think technologies such as chatbots are needed in the future to support, for example, general practitioners (GPs) in providing basic medical advice, as you mention. I think this helps GPs focus on the patients that really need their attention. Right now, at my GP, for example, they already make use of an application called “Moet ik naar de dokter?” (“Do I have to go to the doctor?” in English), which is basically a lo-fi chatbot, to, as you suggest, filter out the patients that come in with a minor cold, are prescribed a paracetamol, and leave again.

    However, I do think there are challenges with the introduction of these technologies. Like Tom mentions, trust is, I think, a big challenge. But patients also want to feel like the doctor really is there for them when they need them and takes time to listen. I once talked to a diabetes patient who now had contact with her doctor via a platform on which they could share progress and things like lab results, or via phone consults for general checkups. Even though this was experienced as very efficient, it was also experienced as impersonal. It makes me wonder whether chatbots would feel impersonal to patients. The “Moet ik naar de dokter?” application already makes me feel like I really need a good reason to go to the doctor and that they actually don’t have time for me. But could future chatbots maybe recreate the personal human-to-human contact that is created in patient-to-doctor settings? Or might this be unnecessary in the future?

    Finally, I also see a danger of misinformation. Last weekend, I talked to a family member who is a general practitioner. She said that, nowadays, she has a lot more discussions with patients about why she is prescribing them certain medicines, because patients are able to look up a lot more information themselves on, e.g., thuisarts.nl. On the one hand this is good, as patients are more aware of their own health, but sometimes it also leads to a form of distrust towards doctors. Would this be further stimulated by chatbots giving advice to patients?

    What do you think about the questions I raised? I would love to hear your thoughts!

    1. Hi Isabel, thanks for your comments and insights on the potential limitations of AI! I definitely agree with all your points, and they are something I would have written about if the blog assignment hadn’t asked for different topics. I just touched upon the limitations but would have loved to explain more and cover the points you mentioned. Accountability, privacy concerns, and the trust patients have in chatbots and their doctors are huge issues with AI chatbots, as I mentioned in my replies to Tom and Frank. It’s very interesting to hear about the “Moet ik naar de dokter?” app, which I will definitely explore 🙂
