In a time when we are so interconnected, it is baffling that we can be just as alone. I remember feeling alone, and I'm sure we all have at some point in our lives, whether from feeling incompetent, from pain, from misguided joy, or from feeling like life isn't worth living. It's a bleak world filled with pain and suffering, but we have also found new ways to adapt and grow, from platforms that speak openly about mental health to new outlets for getting help. It's human connection that makes us complete, but we still have a long way to go before we live in a world where we can fairly and effectively treat an individual's mental health issues. The barriers to mental health access are more pronounced today than ever. Teens and adults alike are finding it harder to access care, and people are becoming more isolated. Globally, more than 70% of people with mental illness receive no treatment whatsoever, and a study by the World Health Organization found that between 30 and 80 percent of people with mental health issues don't seek treatment at all. Even with all of the therapists and counsellors out there, something is preventing people from getting not just care, but the right care.
Personalization goes hand in hand with empathy and compassion, and while we don't have enough therapists in the world, we do have technology. But it's crazy to me that despite all of our progress in social networks and emerging tech, often the only service someone can turn to for help is a chatbot. A copypasta response doesn't solve real problems, and when personalized care is inaccessible to most, we need something new that can see beneath the surface. It's hard to find a therapist, a friend, or anyone to talk to who truly understands your struggles. So how can we not only increase access to care, but dramatically improve our approach to helping the people we care about?
That's what I set out to do with Cubits. The goal was to create a pipeline that can use much more relevant data about a patient to better help them. I wanted to tackle personalized mental health support and care using something unconventional: deep learning. No, this isn't a chatbot; it's a set of tools that collects far more detailed, relevant data on a patient, which can not only dramatically improve the potential for personalized care but also make things a bit clearer.
Traditionally, most applications you've probably heard of involve chatbots or some pre-defined response mechanism triggered by keywords like "I'm sad" or "I had a bad day". But that's not how human emotions work. There are nuances in the way someone talks, in what they say, and in their facial expressions, and automating the understanding of all that is a challenge. Bridging the communication and understanding gap is hard on its own, and doing it accurately is still miles away. We need more data and smart people working on this problem, and Cubits is a step in that direction.
Cubits uses multimodal deep learning to extract key data points about a patient's mood, attitude, and emotions in a given moment. This can serve several applications, primarily identifying how certain questions, key topics, and mentions relate to changes in sentiment. If you want to understand what's bothering you, just speak your mind and get a potential answer. The system is designed to analyze the same cues a therapist or trained professional might pick up on, but also to draw better connections between a person's facial features, speech, statements, and more.
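To make that a bit more concrete, here's a minimal sketch of how the per-moment sentiment signals could be represented, assuming two hypothetical upstream models (one for facial expression, one for vocal tone) that each score a short time window between -1 and 1. This is only an illustration of combining the two signals on a shared timeline, not Cubits' actual architecture; a simple average stands in for whatever fusion the real models use.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WindowSentiment:
    start_s: float      # window start time in seconds
    end_s: float        # window end time in seconds
    face_score: float   # facial-expression sentiment, -1 (negative) to 1 (positive)
    audio_score: float  # tone/volume sentiment, -1 to 1

    @property
    def fused(self) -> float:
        # Simple average; a learned fusion layer could replace this.
        return 0.5 * (self.face_score + self.audio_score)

def sentiment_timeline(windows: List[WindowSentiment]) -> List[dict]:
    """Produce a time-indexed sentiment series that a later stage
    (or a clinician) can scan for sudden dips or contradictions."""
    return [
        {"start": w.start_s, "end": w.end_s, "sentiment": w.fused}
        for w in windows
    ]

# Example: a three-window recording where the tone drops while the face stays fairly neutral.
timeline = sentiment_timeline([
    WindowSentiment(0.0, 2.0, face_score=0.1, audio_score=0.2),
    WindowSentiment(2.0, 4.0, face_score=0.0, audio_score=-0.6),
    WindowSentiment(4.0, 6.0, face_score=-0.2, audio_score=-0.5),
])
print(timeline)
```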
The idea is to take a patient and identify their mood and emotion using two primary indicators: facial features and audio tone. This is done as a time series to provide context, after which the sentiment predictions are compared against what the person is saying. For example, a patient records a video of themselves saying, "Today was an okay day. Nothing good happened, but a few bad things did happen. I lost that promotion I was looking to get after 5 years with my company. I guess we move forward." There are a lot of potential contradictions or points of interest someone might want to ask more about. Questions like "Why was the promotion so important to you?" or "Do you feel happy moving forward?" can offer more insight into whether the person really is feeling optimistic or is covering up their emotions. To do this, Cubits analyzes a person's facial expressions and makes sentiment predictions from the video frames, then connects the timestamp of each prediction with a specific word or phrase the patient said. The same goes for the speech sentiment model, which analyzes patterns in volume and tone to predict whether someone is happy or sad, independent of what is actually being said. These two models come together to help home in on key phrases that might have more behind them.
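Here's a hedged sketch of that phrase-capture step: given a word-level transcript with timestamps and a fused sentiment timeline like the one sketched above, surface the words spoken during windows where sentiment dips. The function name, the dictionary layout, and the threshold value are illustrative assumptions, not Cubits' real parameters.

```python
from typing import List, Tuple

def flag_phrases(
    words: List[Tuple[str, float]],  # (word, timestamp in seconds)
    timeline: List[dict],            # [{"start", "end", "sentiment"}, ...]
    threshold: float = -0.3,
) -> List[str]:
    """Group words spoken during low-sentiment windows into candidate
    phrases worth asking a follow-up question about."""
    flagged: List[str] = []
    current: List[str] = []
    for word, ts in words:
        window = next((w for w in timeline if w["start"] <= ts < w["end"]), None)
        if window is not None and window["sentiment"] < threshold:
            current.append(word)
        elif current:
            flagged.append(" ".join(current))
            current = []
    if current:
        flagged.append(" ".join(current))
    return flagged

# Toy timeline: sentiment dips sharply in the 2s-4s window.
timeline = [
    {"start": 0.0, "end": 2.0, "sentiment": 0.15},
    {"start": 2.0, "end": 4.0, "sentiment": -0.55},
    {"start": 4.0, "end": 6.0, "sentiment": -0.20},
]
# Word-level transcript for part of the promotion example above.
words = [("I", 2.1), ("lost", 2.4), ("that", 2.7), ("promotion", 3.1),
         ("I", 4.2), ("guess", 4.5), ("we", 4.8), ("move", 5.1), ("forward", 5.4)]
print(flag_phrases(words, timeline))  # -> ['I lost that promotion']
```

In the promotion example, the phrase surfaced this way is exactly the one a therapist might want to probe further.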
Cubits achieves this using the following DL algorithms:
There's a lot of machine learning behind this, so let's break it down a bit more, from the sentiment models to the actual phrase-capture system.