Mental Health Chatbot - Part 1
Everything is at our fingertips: transportation, food, the stock market, cryptocurrencies, news, social media. We live in an interconnected world where information is just a few clicks or taps away. But how connected are we with ourselves?
My interest in chatbots started with Woebot, a mental health chatbot studied by Stanford University (Fitzpatrick, Darcy, & Vierhile, 2017). The possibility of deploying a bot that could interact with users to monitor their mood on an as-needed basis opened up a whole new avenue of research for me. The complexity of programming a bot and implementing Natural Language Processing (NLP), though, seemed daunting until I came across this article.
SnatchBot is an online platform that lets users create and deploy chatbots on several 'channels', or applications. It is code-free, which means I do not have to worry about coding the bot from scratch, and it comes with NLP capabilities. Best of all, it is free!
![]()
*This is not a paid advertisement*
This is the first iteration of Carrot - the Mental Health Chatbot, created in December 2017.
Carrot
Try Carrot out on the SnatchBot web channel or on Skype.
When I first started on Carrot, I wanted to create something personal that fit my idea of what an automated response machine should be like. I sketched out the bare bones of the bot based on the knowledge I gleaned during my undergraduate years.
I envisioned a bot with four main features: Mood Tracker, Cheer Me Up, Set a Reminder, and Set a Goal. I have to admit that this is very similar to Woebot's feature set, albeit a far more modest version of it: unlike Woebot, Carrot was written without built-in NLP and with limited functionality.
Mood Tracker and Cheer Me Up
The first iteration of Carrot was capable of identifying four main categories of emotion: happy, sad, dangerous, and vague, using simple entity detection. Since there was no NLP model to work with, the Mood Tracker identified an emotion by extracting specific words associated with sadness and happiness, based on a modified Plutchik's Wheel of Emotions.
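To make the keyword approach concrete, here is a minimal Python sketch of the kind of matching Carrot approximates. The word lists, labels, and function name are my own illustration, not SnatchBot's API or the exact entities configured in the bot.

```python
# Hypothetical sketch of keyword-based emotion detection, loosely inspired by a
# modified Plutchik's Wheel of Emotions. The word lists are illustrative only.
EMOTION_KEYWORDS = {
    "happy": {"happy", "joy", "glad", "excited", "grateful", "content"},
    "sad": {"sad", "down", "lonely", "upset", "anxious", "stressed"},
    "dangerous": {"suicide", "suicidal", "self-harm", "hurt myself", "kill myself"},
}

def detect_emotion(message: str) -> str:
    """Return 'happy', 'sad', 'dangerous', or 'vague' based on keyword hits."""
    text = message.lower()
    # Check dangerous emotions first so they always take priority.
    for label in ("dangerous", "sad", "happy"):
        if any(keyword in text for keyword in EMOTION_KEYWORDS[label]):
            return label
    return "vague"  # nothing matched; the bot would ask the user to rephrase

print(detect_emotion("I feel really lonely today"))  # -> sad
```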
When either a sad or happy feeling was identified, the bot would prompt the user to elaborate on the feeling to give it more context. This was implemented partly as input for future NLP use, but mostly to provide some form of catharsis in a relatively safe environment. Afterwards, if prompted, the bot would provide a link to a random heartwarming image, video, or GIF from the popular news aggregation website Reddit; this is the Cheer Me Up function.
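The Cheer Me Up step can be pictured as picking a random link from a hand-curated list. The subreddits below are examples I am assuming for illustration; the sources the actual bot links to may differ.

```python
import random

# Illustrative list of feel-good Reddit communities (placeholders, not the
# actual curated sources used by Carrot).
CHEER_ME_UP_LINKS = [
    "https://www.reddit.com/r/aww/",
    "https://www.reddit.com/r/MadeMeSmile/",
    "https://www.reddit.com/r/wholesomememes/",
]

def cheer_me_up() -> str:
    """Pick a random heartwarming link to send to the user."""
    return random.choice(CHEER_ME_UP_LINKS)
```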
Two other categories of emotion were also included: dangerous emotions and vague emotions. Dangerous emotions cover feelings that may lead to harm to oneself or others, such as suicidal thoughts, murderous thoughts, and self-harm. Unlike happy and sad emotions, I decided it was best to let professionals handle these. As such, when the bot detected words associated with dangerous emotions, it would provide contact information for local mental health services, in this case in Singapore.
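Putting the pieces together, the routing might look like the sketch below, reusing `detect_emotion()` from the earlier snippet. The replies and the helpline placeholder are illustrative; the actual referrals Carrot gives point to specific services in Singapore.

```python
def respond(message: str) -> str:
    """Route a message to a canned reply based on the detected emotion (illustrative)."""
    emotion = detect_emotion(message)
    if emotion == "dangerous":
        # Hand off to professionals rather than handling it in the bot.
        return ("It sounds like you might be going through something serious. "
                "Please contact a local mental health helpline: <contact details>")
    if emotion in ("happy", "sad"):
        # Ask for elaboration; the reply could later feed an NLP model.
        return "Thanks for sharing. Could you tell me more about why you feel that way?"
    return "I'm not sure I understood. Could you describe how you're feeling another way?"
```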
Limitations
Currently the bot identifies emotions and steers conversations through keyword detection, which is both slow and inaccurate. Complex sentences throw it off, as do synonyms that were not included in the keyword lists. This makes the bot feel clunky and far less useful than Woebot.
Furthermore, its inability to track unique users is another issue that I hope will be resolved soon (SnatchBot has confirmed that unique-user functionality will be added in the future). I am looking forward to this feature so that I can provide users with personalised reminders and possibly implement a mood chart.