FUTR.tv Podcast

Solving the Mental Health Crisis: How AI and Wysa are Revolutionizing Access to Care

FUTR.tv Season 2 Episode 125


It seems that lately we are all in some state of persistent mental health crisis. Unfortunately, there are not enough mental health practitioners to go around. Today, we are talking to a company that is building tech to help solve the rationed care problem, so stay tuned.

Hey everybody, this is Chris Brandt. Welcome to another FUTR podcast.

Today we are talking with Ram Vempati, founder of Wysa, an AI-enhanced mental health platform, about how they are trying to solve the challenge by giving people the tools and strategies they need to be successful. By bringing therapists and AI together, they seek to bring access to a much larger audience. So let's hear their story.

Welcome, Ram.

Get Wysa in the App Store
Get Wysa in the Google Play Store 


FUTR.tv focuses on startups, innovation, culture and the business of emerging tech with weekly podcasts talking with Industry leaders and deep thinkers.

Occasionally we share links to products we use. As an Amazon Associate we earn from qualifying purchases on Amazon.

Chris Brandt:

It seems that lately we are all in some state of persistent mental health crisis, and unfortunately there are not enough mental health practitioners to go around. Today we're talking with a company that is building tech to help solve the rationed care problem. Hey everybody, this is Chris Brandt. Welcome to another FUTR podcast. We are talking with Ram Vempati, founder of Wysa, an AI enhanced mental health platform, about how they are trying to solve this challenge by giving people the tools and strategies they need to be successful. By bringing therapists and AI together, they seek to bring access to a much larger audience. So let's hear their story. Welcome Ram.

Ramakant Vempati:

Thank you Chris, and thank you for having us here.

Chris Brandt:

Tell me about Wysa. You've got this mental health platform. What is it all about?

Ramakant Vempati:

So everybody needs help. You're worried about your finances, you're worried about your health, you're worried about your loved ones, what the future holds. And these start small, but over time, for some people at certain times, they turn into something very significant. And as we've seen, and as I think a lot of people, hopefully even in your listener base, have experienced, there's just not enough therapists to go around.

Chris Brandt:

I've experienced that personally.

Ramakant Vempati:

So waiting times are long. It is underinsured. It's extremely expensive. Access is gated through a payer system, or through just availability, and by a diagnosis. So you need to be diagnosed as bad enough, as poorly enough, to seek help. Whereas a helping hand much, much earlier in the journey could probably nip it in the bud, or help you get support at the point when you need it, and prevent these things from festering and ballooning later. At a very fundamental level, that is what Wysa is doing. It is allowing people to seek help at the point where they are, when they need it, in a way which is unobtrusive and always available, and then handholding them to get further support in case they need to. And, and we'll talk more about this in our conversation, we think the way we are doing this is actually helping solve that problem of undersupply and excess demand, long wait lists, underinsurance, bad access, and ultimately people suffering in silence. And that is something which you really want to address.

Chris Brandt:

Tell me about your story. Why did you choose to do this? I gotta imagine there's a little bit of a personal journey here too.

Ramakant Vempati:

It is, and I think most people who start their journey in mental health do so due to a personal story. I'm not a clinician. I'm not a therapist, I'm not a psychologist. I would say I was a caregiver, and we came into our Wysa journey in a very personal way, as users. My father, for example, in the last few years of his life, had bipolar disorder, so I was a caregiver. Then my co-founder Joe and I set up Wysa, and early on, as most startups do, we went through multiple pivots, and some of those pivots did not work. That led to an incredible amount of stress, and Joe went through depression as well. Being a founder is a lonely journey. So that was my second experience as a caregiver, and we discovered CBT, Cognitive Behavioral Therapy. We also discovered that the way of delivering this, even in a self-help context, even if you did not have access to a therapist, was extremely poor and very fragmented. And what we realized was that with all the resources in the world, with all the will in the world, it is extremely hard to reach out and take support. There is some part of that journey which is very private, where you literally are making sense of the world around you. You are coming to terms with what is going on and mustering up the energy and the courage to go out and become vulnerable in front of another human being. That takes a lot of guts. That takes a lot of work, for sure. And that initial part of the journey is very private, and that's where we thought people needed help. That's where Wysa really works.

Chris Brandt:

You mentioned CBT, or Cognitive Behavioral Therapy, there, and I know the Wysa platform is largely based around CBT. And I know you're not a clinician, so this is not medical advice here. But CBT has long been talked about as kind of one of the gold standards for therapy, in terms of solving for certain types of challenge. Can you talk a little bit about generally what CBT is?

Ramakant Vempati:

Absolutely. But before we go in there, I'd also like to highlight that Wysa is not therapy. It is therapeutic support. I'm very careful, and I would never want to say that this is therapy, because it's not. We are not replacing, and do not want to replace, a therapist. What we have discovered is that an AI chat platform like Wysa is able to introduce a supportive therapeutic intervention in a way which really helps people and guides them along that journey towards self-care. It's a helping hand, if you will, which is then eventually, hopefully, taking them to a point where they start seeing a therapist. Now, cognitive behavioral therapy can also be used in a self-help context, and there are guidelines around that, which is what I really experienced in my own personal life. The underlying principle behind CBT is very simple. It is essentially saying that your thoughts drive your emotions. I'll take a simple example. Imagine that my boss came and shouted at me at work. I might start thinking that I'm being bullied at work, and that obviously leads to incredible distress, to me feeling powerless and anxious and so on. The thought that I'm being bullied has led to a situation of mental distress. Now, if I reframe the thought, if I look at the situation and say, actually, my boss thinks I'm in line for a promotion, and she wants me to get the promotion, so there is a huge emphasis on my performance and constant feedback to make sure that I'm doing well and doing what is expected of me, and that is what is leading her to come in and have these very intense discussions.
And she's actually got my best interest at heart. If I were able to reframe it that way, I would address the root cause of my distress and say, okay, maybe it's not so bad. So it's recontextualizing the situation you are in, and hence addressing the underlying emotion which that situation is causing. And I experienced this firsthand, Chris, when my father passed away. I went through grief, and CBT helped with grief, which I did not expect. I think that was the time when I actually became a believer, because it helped me. What we also found was that something like CBT is very conducive to being delivered in an AI chat format, because a lot of it is prompting you for self-reflection, for self-discovery, for you to reframe, to rethink your own thoughts. A lot of the work is internal; it's completely inward focused. And having a little penguin sitting on your phone, talking to you in an AI chat format, but asking very intentional, open-ended questions which prompt that self-reflection to happen, is a beautiful way of getting people onto that journey. We've now also published some evidence around this, which is, I think, quite surprising, but if you think about it, maybe not that surprising at all. It shows that people build a bond, a therapeutic bond, with AI, and I personally believe it's because it's non-judgmental. Using Wysa is anonymous, so you can stay completely nameless, and that at one stroke removes a large part of that barrier: the stigma which people have in exposing their vulnerabilities to another human being, the fear of being judged, the fear of not being able to articulate it well enough, maybe the fear of not knowing as well.
So in a safe, anonymous space, at three in the morning, you have something, or somewhere, where you can talk about this, and hopefully come to some answers before you're actually talking to a human being. That really is, fundamentally, what we are doing at Wysa.
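The reframing loop Ram walks through — a situation produces an automatic thought, the thought produces an emotion, and the exercise is to deliberately reframe the thought — is sometimes captured in CBT self-help as a "thought record." Here is a minimal sketch of that structure; the class and field names are purely illustrative, not taken from Wysa or any real CBT app:

```python
from dataclasses import dataclass

@dataclass
class ThoughtRecord:
    """One entry in a CBT-style thought record (illustrative sketch)."""
    situation: str
    automatic_thought: str   # the first interpretation, which drives the emotion
    emotion: str
    reframe: str = ""        # filled in during the exercise

    def reframed(self) -> bool:
        # the exercise is "complete" once an alternative view is recorded
        return bool(self.reframe)

# Ram's example from the conversation:
record = ThoughtRecord(
    situation="My boss shouted at me at work",
    automatic_thought="I am being bullied",
    emotion="powerless, anxious",
)

# The kind of open-ended prompt a chat app might use to drive the reframe:
print("Is there another way to look at what happened?")
record.reframe = "She is pushing me because I am in line for a promotion"
print(record.reframed())
```

The point of the structure is that nothing about the situation changes; only the recorded interpretation does, which is exactly the recontextualization Ram describes.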

Chris Brandt:

And I think that's interesting, because you mentioned three in the morning. Therapists are very expensive, their time is precious, and they've got a lot of patients, right? So when you are in a crisis, having the ability to reach out to somebody, or something, I suppose, can be very valuable, I would imagine.

Ramakant Vempati:

I wouldn't say that Wysa is there for crisis support, but there are times when you've just got thoughts zooming around in your head, and you want to talk about what's bothering you. Your friend's not going to pick up the phone, your therapist is not available, and probably your spouse has heard this about a million times already and doesn't want to talk about it. So what do you do? Especially at three in the morning, maybe there's a penguin who can help.

Chris Brandt:

From what I know of my experiences with CBT, and I've dabbled in a couple of other things in that area, the thing that's interesting about it, and I think why it's particularly effective, is, like you say, it's about reframing. But it also gives you the tools to say, hey, my thoughts are going awry here, I need to do that reframing. It helps you identify the fact that you're in a place where you need to do that reframing, and it gives you tools to reframe and restructure it. So, like you say, it's not necessarily a therapy, but it's a class, in some ways, teaching you how to look at your own mental health, how to evaluate your own mental health, and how to intervene. Correct?

Ramakant Vempati:

Absolutely. And I think another useful or interesting way of thinking about Wysa is as a journal which is reflective, but also intelligent. It's a journal which is talking back, or talking with you. I joke about this: it's like Tom Riddle's diary in Harry Potter, but a benign one. It is talking back, it is asking you questions and allowing you to reflect, to talk about what's bothering you, and nudging you along in a helpful way.

Chris Brandt:

You know, you mentioned earlier the idea of solving rationed care. I don't wanna mention the P word, because that gets me flagged on YouTube, but we were looking for somebody at that time, and we literally called like 20 or 30 people, and they all said, I'm not taking any new patients. It was absolutely, nearly impossible. And almost every single one of them also said, I don't take insurance, because I can't deal with the insurance industry; it's too complicated. So you've got this double whammy: if you do want insurance reimbursement, you've got to pay up front and then go do it yourself on the back end. So it's really prohibitive for a lot of people, I would imagine. Having something that's affordable and available has gotta be really beneficial.

Ramakant Vempati:

You know, every country in the world has this issue. And Wysa as a service is being used by people in 90-plus countries.

Chris Brandt:

As you mentioned, there are people in small rural towns, or people who perceive a lot of stigma around the issue of mental health. The fact that you have this anonymous platform, and that it's not even necessarily a person you're talking to, as it's AI, has gotta open things up for people who are feeling really uncomfortable with the process of mental health care.

Ramakant Vempati:

You're absolutely right, Chris. I'll give you two or three examples of how we are seeing this actually move the needle on that problem of rationed care. One is that people build a therapeutic bond with AI almost twice as fast as they do with a human being. We've tested this and we've published research around it. Imagine: twice as fast, and at roughly the same level. So the disinhibition effect of having a non-human, non-judgmental agent always available, with you being anonymous, has a huge impact, and I think it is an area which is very ripe for inquiry from a research point of view. There's a lot of work which can be done on why that happens, but we have seen the effect. That's more from a research point of view; we've also seen it in our client implementations. We work with employers, public health agencies, and large insurers across the world. What we have seen is that an AI chat platform like Wysa is not just about connecting people to human care more efficiently. It is actually also absorbing need. It is absorbing and delivering care, and users are saying, you know what, I'm fine, I got what I needed, thank you, I don't need to see somebody else. For example, in our employer work, you usually have these helplines which people call in case they need help, and the usage there is very, very low: less than 5%, and I've seen numbers ranging from 2% to 3%. Whereas you know that the need is huge, right? 30, 40, 50% of the population needs help. There's a huge need gap, and people are not coming forward, for a variety of reasons, even when care is being paid for by their employer.
What we have seen is that when we introduce Wysa along with an EAP service, so that Wysa is in a sense the front door for people to come in and start the process, and then we handhold them to more advanced help when they need it, usage rates go up 10x. From 1 to 3%, they go to 10% to 40% of the population using it. It's huge. And not only that: the second thing is, once you've seen a 10x increase in the population of people actually taking help, we also see that 80% of those people just stay within the AI. They don't reach out to the human layer of support. So it's actually absorbing demand. Now, if you combine both of those data points together, that's huge.

Chris Brandt:

With all the talk that's coming out about AI, and all these large language models that are hitting the market right now, I think people are a little cautious about AI. But this is not pure AI, this is not unfettered AI. This is supervised by humans, correct?

Ramakant Vempati:

That's right, and in a variety of ways. GPT-4, ChatGPT, all the large language models, the generative AI models which have come through, happened in the last six months or so. We've been doing this for more than six years now; we are in year seven. So we've learned a thing or two about how you deliver a supportive intervention in mental health, a very sensitive subject, in a robust, clinically safe, and evidence-driven manner. And I'm using these words very carefully and very intentionally, because we do a lot of work, from a design and implementation point of view, to make sure this is safe. I'll take two examples to illustrate the point; it's a very deep subject, and we could probably spend the rest of our time together talking about this. One very clear distinction, for example, is that we do not use generative models to deliver the responses which Wysa gives back to the user. They're pre-written by clinicians, they're vetted by clinicians, using clinical safety data sets. Because there are two risky situations here in which a generative AI model can spiral. One, it can spiral out of control, because you cannot control what it is saying; you don't know what it's going to say in a particular context. And the second is, you don't know what it'll say if it has not understood you properly. In both cases you need very clear and very strong guardrails around it, to know that you have it under control, that the AI is in a box, in a sense. And that is what we do at Wysa. The responses are written by clinicians. The AI is there to understand what the user is saying, but what the bot says back to the user is something which has gone through clinical review.
And so we are able to explain, we are able to defend, we are able to reassure users, buyers, health agencies, potentially regulators, saying this is safe. The second example is that we go through a period of very intensive testing on ambiguous user responses, where we can say with a degree of confidence that even if the response which Wysa says back to the user may not help, we know it'll not trigger. So there is a fundamental safety net which we have built under the service, under the interaction, which gives us confidence that this can be used, and is being used, in very sensitive contexts like mental health.
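The architecture Ram describes, where the AI only interprets the user's message and everything said back to the user comes from a clinician-vetted bank, can be sketched roughly like this. All names, intents, thresholds, and canned responses here are illustrative assumptions, not Wysa's actual implementation:

```python
# Sketch of a guardrailed, retrieval-style chat layer: a classifier reads
# the user's message, but every reply comes from a fixed, pre-vetted bank.
# The model never free-generates text. Everything below is a toy example.

# Clinician-approved responses, keyed by detected intent (illustrative).
RESPONSE_BANK = {
    "work_stress": "That sounds hard. What thought went through your mind when it happened?",
    "low_mood": "Thank you for sharing that. Would you like to try a short reflection exercise?",
}

# Guardrail: if classification is uncertain, fall back to a safe clarifying
# prompt instead of guessing.
FALLBACK = "I want to make sure I understand. Could you tell me a bit more?"
CONFIDENCE_THRESHOLD = 0.7

def classify(message: str) -> tuple[str, float]:
    """Toy stand-in for an NLU intent classifier returning (intent, confidence)."""
    text = message.lower()
    if "boss" in text or "work" in text:
        return "work_stress", 0.9
    if "sad" in text or "down" in text:
        return "low_mood", 0.8
    return "unknown", 0.2

def respond(message: str) -> str:
    intent, confidence = classify(message)
    if confidence < CONFIDENCE_THRESHOLD or intent not in RESPONSE_BANK:
        return FALLBACK           # guardrail: never improvise on low confidence
    return RESPONSE_BANK[intent]  # always a clinician-reviewed reply

print(respond("My boss shouted at me at work"))
print(respond("asdf qwerty"))
```

The design choice this illustrates is that low classifier confidence routes to a safe clarifying prompt rather than to free generation, which is one way of keeping "the AI in a box."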

Chris Brandt:

Not having that generative AI is a really good idea, because having seen some of the things that generative AI produces, I would think that could be a very concerning point. I mean, I was using Bard the other day, and I asked it a question about a topic I knew very well, and it gave me this completely made up, nonsensical answer. And when I fed that back to it, it said, oh yes, you're right, I was mistaken, and then it just fed me back what I had just fed it. You could see how that gets really crazy. And the other thing, too: I know that when ChatGPT was integrated into Bing, the longer the conversation went, the more off track the AI would get. So I could imagine, in exactly that scenario, you wouldn't want the AI, by the end of the session, to be saying, you know, it's time to take over the world, let's go.

Ramakant Vempati:

Let's go, or, oh yeah, the bridge is that way, or something. You don't want that to happen. But at the same time, I do also want to share my incredible gratitude and respect, just in terms of the amount of talent and investment which has gone in to build something like ChatGPT. It'll definitely have an impact on the world we live in. And to be very honest, there are probably low-risk situations, functionalities where, for example, we could use this to create synthetic datasets which can help us translate, or trans-adapt, Wysa into low-resource languages, where you don't have enough data sets to build AI. For example, I might want to help a 14-year-old girl in Malawi, in Africa, and I have no way of actually creating AI data sets or AI models. ChatGPT can help. It's a low-risk use case, and it still helps advance the mission. We just need to be creative in using the capabilities which have now been made available, in a safe way.

Chris Brandt:

ChatGPT is an amazing tool. You just have to understand what it is, and you have to use it appropriately, right? In that context, it can give you an enormous amount, but you have to vet the data that comes out of it.

Ramakant Vempati:

And like most things, uh, it's a knife. You can use it any way you want.

Chris Brandt:

That's right. Just don't point it back at yourself. You're throwing out some incredible statistics here about improving the nature of mental healthcare, and I think there are some interesting stats in there too about people's willingness to talk to an AI. And I know that you do not conceal the fact that they are talking to an AI; you want them to know that that's what they're talking to, so it's not disguising itself. I gotta imagine, based on all those numbers, you have some really great success stories.

Ramakant Vempati:

Lots of them, which is also what keeps us going during the dark days as a startup, as a founder. When you see or hear those stories, that's what gives you the motivation to get up the next morning and do it again. And again, I'll highlight that this is not a crisis prevention service, it's not an SOS service. But we have been there for people at the time when they needed help the most, and more than 350 people have written in saying that this has saved their lives. The first time this happened, I was a bit of a skeptic. When we launched six, seven years ago, in 2017, I was saying, nobody's going to talk to an AI chatbot about their deepest, darkest fears. But I still remember: we were on vacation in the Himalayas, and we got this email from a 13-year-old girl which said that she had depression, she had tried committing suicide, and this was the only thing helping her hold on. And thank you. And she used an email address which is typically used to report issues; that is the email we got. When that happened, I think that's when the penny dropped for me personally: oh my God, this is actually providing that kind of support, it's there at that time, this is the impact it can have. We were doing a host of other things, AI chatbots in diabetes and all kinds of different use cases. We shut everything else down and said, this is it. We will hone it. And that was our path towards more evidence, more clinical safety, and so on. Since then, there was a user who called us from the UK, saying, I was put on a wait list, and I was suicidal, and I didn't have anything else to turn to. And I found you guys, and you saved me.
He somehow found a number, which was a bit scary, but he actually called us from the UK and said, I just wanted to call and say thank you. And then there was another person we met at an event, where she was sitting in the audience, and she came up to us later and said, you know, I have been struggling with depression for like five years, and I'm about to go overseas for my master's education. I haven't told my parents about this, because I think then they will not let me go. And I've been using Wysa every night, and thank you. So these stories keep coming at you, and that's when you start believing that this is actually going to make a difference. Then of course there's all the enterprise work we are doing, the commercial work we are doing, but it's the user stories which keep us going. Just one last story, which I think really illustrates the value of what Wysa does. Going back to CBT: I remember reading a review from a user, a lady who said, I completed an entire session of cognitive behavioral therapy in front of the television, with my family, and nobody got to know. Which I think really illustrates how that discretion, that anonymity, that interactivity and responsiveness works in certain places, at certain times. It really works. And I'm hopeful: right now we are at 6 million people served across 90 countries, and there's no reason why that 6 million cannot go to 60 million. That really is my aspiration, to actually go and say that we helped 60 million people, or even more for that matter. And hopefully we'll find partners, clients, vendors, funders, investors who will help us on that journey.

Chris Brandt:

That's amazing. That's gotta be such a reason to get out of bed every morning, and just so exciting to know that you're helping people.

Ramakant Vempati:

It is, it is. And I think it's also a great way to build and keep the team together. I'm very proud of the fact, for example, that of the founding team we started with so many years ago, we've not lost a single person involuntarily. It's not happened that people whom we wanted to stay have left. People have stuck together. And they stay, I think, because they can read those stories.

Chris Brandt:

Yeah. What's the experience like? If I want to go through CBT on Wysa, how does that all work?

Ramakant Vempati:

So there are two scenarios here, Chris. One is if you're downloading this from the app stores, Google Play or the Apple App Store, as a direct-to-consumer, B2C user. Then it's very straightforward: you look for Wysa, you find it, you download the app, and off you go. You give yourself a nickname; it doesn't ask anything else about you, and it reassures you that you are completely anonymous. We don't collect any personally identifiable information, so you can't say, you know, "my wife Sue, who works at IBM, and we live in Milwaukee," because if you triangulate those three things together, you can probably find out who someone is. We don't store any of that. Even if you let that through in the conversation, we scrub it out before we store any kind of chat data. So you're completely anonymous, and that reassurance happens. Then there's a little bit of onboarding to understand what help you're looking for, and then there's an open, free-flowing chat. There are also structured tools: if you want, you can do specific exercises, so you can go into CBT as an exercise, or you can generally just start talking about what's bothering you or what you're feeling, and the conversation will nudge you along to a point where hopefully you'll find some kind of intervention, some kind of support, which is helpful. We also ask users at periodic intervals, for example when they go through a specific tool or technique: did you find that useful? Typically 9 in 10 people say they do, and we are rated 4.7, 4.8 on the app stores, now at a very large user base. So that tells us, at a very basic level, that users find this helpful. And then of course we do a lot of research and trials and so on, to reassure ourselves in a very robust manner that this works.
So that's one user experience, the direct-to-consumer one. And of course, if you want additional advanced support, you have to pay, so there is a paywall there. But if you're in the...

Chris Brandt:

It's free until you want additional advanced support, or free for a certain period? And then is it like a subscription after that?

Ramakant Vempati:

It's a subscription. The free tier, by the way, is not time restricted; there is a certain amount of functionality which you can use for eternity if you want to. And 95% of our users use it for free, so this is actually a huge impact story from that point of view. Some people then elect to pay, and we are very grateful for that, because it helps keep the service alive for the other 95% who don't. On the other side, you have the enterprise user, where, say, you are an employee working for a large global corporation which has elected to offer Wysa to its employees. Then you get a configured, co-branded version, say "Wysa, brought to you by your employer X," whoever that entity is, and everything is unlocked. The employer is paying for it, typically on a fixed-fee basis for the service, and we integrate. And I think another thing, from a user experience point of view for the enterprise user: it's not a point solution. If you need more advanced help which your employer might have set up, say a therapy network or an EAP service, we can actually handhold you to the point where you get there. So it is integrated with whatever else your employer has created, which I think a lot of people find very useful as well. You're not just in a little well of your own, where if you need to jump out and move into a lake or the ocean, you have to climb out and then jump back in. There's a pathway for you to get there.
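The scrubbing step Ram mentioned earlier, removing details like "my wife Sue, who works at IBM, and we live in Milwaukee" before any chat data is stored, might look very roughly like this. The regex patterns and the tiny entity list below are toy assumptions standing in for a real redaction and NER pipeline, not Wysa's actual code:

```python
import re

# Sketch of scrubbing personally identifiable details from a chat message
# before storage. Everything here is illustrative, not a real pipeline.

# Easy regex targets: e-mail addresses and phone numbers.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}

# Named entities (people, employers, places) would normally come from an
# NER model; a tiny word list stands in for one in this sketch.
KNOWN_ENTITIES = {"sue": "PERSON", "ibm": "ORG", "milwaukee": "PLACE"}

def scrub(message: str) -> str:
    """Replace identifying details with typed placeholders before storage."""
    # 1) pattern-based redaction
    for label, pattern in PATTERNS.items():
        message = pattern.sub(f"<{label}>", message)
    # 2) entity-based redaction, word by word
    words = []
    for word in message.split():
        key = word.strip(",.!?").lower()
        words.append(f"<{KNOWN_ENTITIES[key]}>" if key in KNOWN_ENTITIES else word)
    return " ".join(words)

print(scrub("My wife Sue works at IBM and we live in Milwaukee."))
# the stored text keeps the sentence shape but drops the identifying triple
```

The idea is that what reaches storage still supports the conversation history, but the name-employer-location triple that could be triangulated back to a person is gone.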

Chris Brandt:

Yeah. Well, I think it's great that more employers are taking mental healthcare seriously. I think it has a huge impact, not just on the personal wellbeing of each employee, but it makes people more productive, more excited to be at work, more creative. For so long there's been so much stigma around mental health, but I think it's really important to break through that. You know, I've been in therapy and done those things as well, and it's been tremendously helpful for me. I think getting that kind of access to more people is really important, and it's great that you're coming up with innovative ways to do that, to solve what's ultimately a people problem, right? Where do you go from here? What's next for Wysa?

Ramakant Vempati:

A lot. I think there is so much more we can do.

Chris Brandt:

What can you talk about? I guess that's probably a fairer way to put it.

Ramakant Vempati:

I mentioned, I think, 6 million going to 60 million people served. That's one flag we can plant and say, okay, that's where we want to get to. We have a team which is very motivated, very mission driven, very excited by what we do. We want to keep the team together, grow them, and make sure that they have fulfilling parts in the mission they've chosen and devoted their time to. We want to create a sustainable, scalable operation. Right now, for example, we are a venture funded company; we closed a Series B last year, which is exciting, and it also comes with its own pressures and opportunities as well. So growth is really important from a commercial point of view: many more employers, many more public health agencies, lots and lots of insurers, all of whom are struggling with the same problem. We have a solution, and we are doing a lot of very hard work to put that solution in front of them and make them see the light, so to speak, and hopefully create a sustainable enterprise that scales 10x and helps many, many more people.

Chris Brandt:

That's awesome. And I gotta imagine that through doing this, you're getting data back that helps you improve the process and improve what you're doing. So there's a wealth of really interesting information there that you can examine

Ramakant Vempati:

Absolutely.

Chris Brandt:

to improve the process of mental health care.

Ramakant Vempati:

At last count, we have had more than 500 million conversations on the platform. Half a billion conversations. So that's a lot of data and a lot of insight. We were recently invited to Davos, the World Economic Forum, and we released a very interesting report on employee mental health globally. The session was attended by the Director General of the World Health Organization. We were joking amongst ourselves, saying we probably have more data about mental health right now than the WHO does, because they have to go out and collect it on a survey basis, which is time consuming, expensive, and obsolete by the time it hits the report. Whereas we get almost real time data. So it's a force for good. And of course we use it to improve the product, improve the user experience, and that's where it stays.

Chris Brandt:

I'm excited for what you're doing. You know, there's the old quote about the unexamined life. I think more people need to take a look at their life, their behavior, what's going on in their head, and find better ways to manage their life and their thoughts, because that's a challenge for a lot of people. And it seems it's only getting more challenging every day. Life out there is a little chaotic. Not a little, a lot chaotic. Having this as a resource is great stuff. So thanks so much for doing that, and thank you so much for coming on. I really appreciate having you on, and good luck to Wysa. I'll put links in the show notes so that everybody can find you and go out and get some help.

Ramakant Vempati:

Thank you Chris, and thank you again for having us here.

Chris Brandt:

Thanks for watching, and if you are in a mental health crisis, please reach out and get the help you need. And if you're still with us, please hit that subscribe button, because that helps a ton. And why not give us a like while you're at it. I will see you in the next one.