
Going digital in suicide prevention

September 20, 2023 | 2 minute read time

Dr. Seth Feuerstein, CEO of Oui Therapeutics, in front of red background

Seth Feuerstein, MD
CEO, Oui Therapeutics


Last fall, our host Dr. Daniel Kraft spoke with Dr. Seth Feuerstein, the CEO of Oui Therapeutics, a psychiatrist and researcher who’s made the study and treatment of suicide his life’s work. In recognition of Suicide Prevention Awareness Month, we’re re-releasing a two-part episode on this ongoing public health issue and examining the challenges in identifying, understanding and treating people with suicidal ideation.

Cara McNulty, President of Behavioral Health and Mental Well-being at CVS Health, re-introduces the episodes with an up-to-the-minute look at suicide in this country. “According to the CDC, suicide rates increased 33% between 1999 and 2019,” she notes. After small declines over the next two years, the CDC reported another spike earlier this year, with rates nearly returning to their 2018 peak. “Even worse, suicide continues to be the leading cause of death among adolescents and young adults, and we are seeing a rise among older adults, as well,” she notes. To help, she believes everyone has to become more comfortable talking about suicide. “We’ve seen how important it is for everyone to be more vocal about their mental health and to be willing to talk about the importance of suicide prevention, especially among adolescents and older adults.”

How digital therapeutics can help

Oui Therapeutics has created a digital therapeutic aimed at helping to reduce suicide rates. Patients interact with an application on their smartphone in a treatment that typically lasts 10 to 12 weeks under the direction of a clinician. "A patient might work with a chatbot function, interact with other patients, or work on practicing exercises to refine the way their brain might react to certain situations," Dr. Feuerstein explains.

“Software is a tool that will allow us to gather tremendous amounts of useful data to understand what is working, what’s not working and what can work better,” he says. This will allow more timely intervention when people are at risk.


The thing that really excites me is, where can we use software to solve problems that were not previously solvable?

Seth Feuerstein, MD

CEO, Oui Therapeutics


Reframing how we talk about suicide

Many people are reluctant to talk about suicide, including physicians and other health care professionals, Dr. Feuerstein says. Reframing the discussion can be helpful. “At the moment when there’s an attempt that might lead to death, there’s an inability of the brain to function properly and to see another option,” he says. “If we think about a suicidal state of the brain as an arrhythmia-like state in the heart, hopefully it’ll make it easier for clinicians to talk about it.”



Welcome to Healthy Conversations. I'm Cara McNulty, President of Behavioral Health and Mental Wellbeing at CVS Health. Today we have a special edition of Healthy Conversations in recognition of Suicide Prevention Awareness Month. Last fall, our host, Dr. Daniel Kraft, interviewed Dr. Seth Feuerstein, a psychiatrist, faculty member at Yale, and CEO of Oui Therapeutics. The two explored the issue of suicide in great detail.


Suicide continues to be an ongoing public health issue that has been exacerbated by the pandemic. According to the CDC, suicide rates increased 33% between 1999 and 2019. While we saw a very small decline in 2019 and 2020, this past February the CDC reported an increase in suicide. Unfortunately, suicide rates nearly returned to their 2018 peak. As doctors Kraft and Feuerstein discuss in the first episode, suicide is a complex health issue and people continue to believe in many common myths about it.


However, from a public health perspective, suicide is generally preventable. By using a systematic approach grounded in early detection, evidence-based interventions, postvention and community education, as we've done at CVS Health, we can work together to save and change lives. For instance, in late 2020, we initiated our strategy to reduce suicide attempts among Aetna members and to raise awareness of suicide prevention among our CVS colleagues and in our communities.


While we have seen a 16% decrease in suicide attempts among our commercially insured adult members compared to a 2019 baseline, we've also seen how important it is for everyone to be more vocal about their mental health and to be willing to talk about the importance of suicide prevention, especially among adolescents and older adults. These two populations are in need of additional support and targeted interventions. It is extremely valuable for businesses, leaders, doctors, hospitals, educators, parents, family members, friends and others to learn about the unique nature of suicide and know how to provide support when needed.


Dr. Kraft and Dr. Feuerstein go into great detail. Before we share their conversation, I want to highlight some other resources available to learn more and engage in suicide prevention. Mental Health First Aid and Talk Saves Lives trainings are really great resources to learn more about mental health and suicide and to learn strategies to help someone who may be in crisis.


The American Foundation for Suicide Prevention is a great place to learn about the facts of suicide. The Jed Foundation is also a great resource for youth suicide prevention, and the Trevor Project is a leader when it comes to LGBTQ+ mental health and suicide prevention. You can also visit, where we have mental health guides for parents, educators, young adults, and coming out support guides for LGBTQ+ youth and young adults, podcasts and more.


Please keep in mind if you or someone you know needs immediate support, you can also dial 988 to speak with a counselor. The National 988 Suicide and Crisis Lifeline actually just celebrated its one-year anniversary and has answered nearly 5 million contacts, which is about 2 million more than the previous 12 months. This shows we can continue to offer hope to those in need while making a real difference. Now onto the conversation with Dr. Feuerstein.


The only cause of death that's more common is accidents, actually.


Welcome to Healthy Conversations. I'm Dr. Daniel Kraft, and today I'm in Healthy Conversations with Dr. Seth Feuerstein, the CEO of Oui Therapeutics. We've got a lot to talk about. So, for the first time on Healthy Conversations, we're going to have the same guest for a two-part podcast. Today we'll be talking about a topic that's been very much in the news and a bit exacerbated by the pandemic: suicide. Suicide kills more than 45,000 people in the US each year, a rate that's remained stubbornly high even with efforts like the new national 988 hotline. So Seth, why do you think we haven't been able to really move the needle on suicide more?


It's one of the reasons I ended up spending most of my professional time on this issue. Suicide is a condition of the brain that's universal in societies around the world. So in some ways it's more like diseases that can cause sudden death, like cardiac arrhythmias. And so I doubt we'll ever completely get rid of it, but we could certainly do better.


I think there are a variety of systemic issues, as well as misconceptions and preconceptions about suicide, that remain stubborn problems in terms of bending the curve and reducing the number of attempts and deaths. I've been really fortunate to have some of the best training one can have and to be on faculty at one of the best medical schools there is. And yet, despite all of this, and despite being trained as a psychiatrist, I had some significant misconceptions and preconceptions about suicide that I've been undoing and relearning over the last seven or eight years, since I decided to focus on this area.


Certainly it's one of the leading causes of death in younger folks. And you mentioned you had your own common misconceptions. What are the ones that you still see commonly across the healthcare profession, and maybe even in psychiatry specifically?


I'm going to start with some of those statistics because they are really, really frightening and I don't think we talk about them a lot because we often feel we can't do a lot about them. So the CDC and other estimates indicate that approximately one and a half million people a year attempt suicide in the US and somewhere between 12 and 15 or 20 million, it's really not entirely known, think about suicide, have what's called suicidal ideation. The only cause of death that's more common is accidents, actually. So suicide is the number one medical cause of death for people in their teens and 20s. Suicide is the fourth leading cause of death for people in their 30s, 40s, and 50s.


The thing that I started to conclude as I reviewed the data was that the suicidal state of the brain is a lot like the arrhythmia state of the heart. It's a relatively spontaneous period where there's an elevated risk of sudden death. And if we think about attempts as potentially deadly events or cancer metastases, we can focus in on preventing those deadly events, which has proven to be a successful model in other diseases like cardiac and oncology diseases. Does that make sense, Daniel?


Yeah, I think that's a good analogy. I think we've all known friends or even colleagues who may have been suicidal, or even died by suicide, and we didn't have a clue; they didn't seem to be on a mental health trajectory where anything seemed to be wrong. And are there ways to find the early equivalent of the atrial fibrillation or the EKG that can pick up folks at risk, even when they don't outwardly seem to be?


There are, and we're getting better at that, and one of the reasons we don't generally actively look is that if you speak to psychiatrists and other clinicians, they'll tell you they're not exactly sure what to do with the patients to reduce their risk once they identify them. If you went to your closest hospital and you said, how many stroke patients did you see in the emergency room this past year? Probably somebody in the hospital can say, we saw X number of stroke patients in the emergency room last month and last year. Same with heart attacks, same with arrhythmias.


If you said to them, how many suicidal patients did you see in the emergency room, they probably would not even know where to look. They're not thinking about this in the same way they think about other leading killers, even though it's actually a major cause of lost life years in the United States. And I think we have a long history in healthcare when we know we can do something, we actively seek out those patients. When we're not sure if we can do something, we don't.


The truth is most people who attempt and die by suicide do not have major depressive disorder. And you also mentioned heart disease. It's a really great analogy. Diseases like depression and bipolar, they're risk factors, but they're not directly causative. And if you look at the risk factors for things like cardiac arrhythmia, they're remarkably similar to the risk factors for suicide. The very best predictor of a cardiac arrhythmia is a previous cardiac arrhythmia, and the same as with suicide attempts. The very best predictor of a suicide attempt is a previous suicide attempt, and for every death there are about 30 attempts in the United States.


Because of the stigma around mental health, an attempt may not follow someone in their records in a way that lets you be proactive. Are there better ways to integrate that into our EMRs and the workflows of all kinds of clinicians, so that you have a heads-up that someone had an attempt, even if it was 10 years ago?


Generally speaking, people were not screening for this, so the Joint Commission took it upon itself to start to push that forward. The Joint Commission, in fact about two years ago, first started encouraging and requiring hospitals to start to screen people for suicidality, and so we're starting somewhere as a system, we're getting better. There are outstanding researchers looking at all kinds of data sources to better predict what might lead to an attempt.


Are there any sort of top elements that you'd recommend that we put on our radar when we're even talking to our friends and family, let alone our patients, to sort of see those early signs or even the acute signs that often are commonly missed?


Suicide, in my experience, may make clinicians more uncomfortable than any other condition, and I think the first thing is related to stigma. A lot of clinicians and a lot of non-clinicians are uncomfortable talking about suicide. If we think about a suicidal state of the brain as an arrhythmia-like state in the heart, hopefully it'll make it easier for clinicians to talk about it. I think people are nervous that if they identify suicidality, they're not sure what to do with the patient, who to send that patient to.


People get very nervous when they know a patient is suicidal because an attempt or a death feels like something we should have been able to predict and it feels like something that we might get sued for if it happens. The reality is asking from a legal perspective is better than not asking, and while it may be human nature to avoid areas where we're uncomfortable, it's an area we really should embrace.


Are there any kind of equivalents to the screening EKG or cholesterol measure that could give us earlier insights into who's at risk?


Absolutely. There are some researchers who've developed something called the Suicide Cognition Scale, which is doing a pretty good job of providing indicative looks at future risk for people who've been seen in psychiatric settings. There's a group at Harvard looking at a variety of digital biomarkers as well as electronic medical record data. There's no doubt that the kind of data that can be captured digitally will revolutionize our ability to be more contemporaneous with the periods of that risk and the ability to intervene with people when they are at risk. No question.


We're now in this era of digital biomarkers. Voice is a biomarker that can pick up mental health or neurologic issues. Give us a little bit of a flavor of where things might get to be more specific.


I've been doing digital psychiatry work for more than a decade, probably about 15 years now. It's interesting. I did a lot of genetics work before that. I believe that what genomics and proteomics were to disease categories like oncology, software will be to the brain. The brain is essentially an information acquisition, processing, and distribution center.


So if we can ping that information processing center, see how the brain and the body react to that ping, whether we're pinging it with software or applying something inside the body like a drug or from outside the body to inside, like transcranial magnetic stimulation, software is a tool that will allow us to gather tremendous amounts of useful data to understand what is working, what's not working, and what can work better.


In terms of precision psychiatry, are there different subtypes of suicidality? If we can measure those, might we subset those patients or folks at risk of being suicidal into different prevention and intervention realms?


It's a very insightful question and it's absolutely true. There are some people who are chronically suicidal ideators, meaning they think about suicide a lot and they have little or no risk of ever having an attempt. There are some people who rarely ever think about suicidality, but when they do think about suicide, the risk of an attempt is very high and there's an infinite number of gradations in between.


What they share in common is that at the moment when there's an attempt that might lead to death, there's an inability of the brain to function properly and to see another option. Broadly, though, they'll almost universally tell you they actually didn't want to die; they just didn't see another option at the time. It really undermines a lot of what people normally say when they hear someone attempted or thought about suicide: why did they choose to do that? The brain, as a sort of logic-driven machine, doesn't actually see anything to get it past what's in front of it. It only sees the one choice. Does that make sense?


Sure, so there are folks who have the underlying, let's say back pain or arrhythmia or mental health issue, others that we seem to be surprised that they had a suicide attempt or unfortunately a successful one.


We need to think about suicidal ideation in the way we think about things like chronic back pain. I can have back pain all the time and have little or no risk of ever having a ruptured disc. Or I can rarely ever have back pain, but if I have it, I may have a ruptured disc almost immediately thereafter. It really depends on the situation.


We know that people who aren't sleeping well are at increased risk. We know that you don't process information as well and you're more impulsive when you're not sleeping. At the same time what you're texting to people, the types of language you use, what you hear in the tone of your voice, all of those data sources and many more are relevant to predicting risk.


You've had a pretty amazing and varied background: undergrad at Cornell, medical and law school from NYU. We won't hold your law degree against you. And you did your internship and residency at Yale, where you're on the faculty of the School of Medicine. But what really drove you to work on the issue of suicide?


Suicide had always befuddled me since I was a resident, because we were asked to make judgments about patients, but there really wasn't a tremendous amount of data behind that. And even after we knew someone was at risk, there really wasn't clarity around what we could do to reduce that risk, and that's what I got focused on. One of the things that I get most excited about is working on tough problems. Reading a lot of the research that came across my desk on the topic pretty quickly made me realize that, even though I had been on faculty at a leading university in the Department of Psychiatry for more than a decade, I misunderstood.


The second was I had had a pretty successful career leveraging software in healthcare and mental healthcare delivery and felt that some of the things that I had been fortunate to be successful in could be applied to suicide risk, and the third was that it was the most inefficient and most problematic area potentially in all of healthcare that one could make a big dent in. And so that was really attractive to me.


I know you've done work with the military. There's definitely a pandemic of suicidality amongst service members, particularly those who served overseas in challenging environments. Maybe touch on the PTSD side and some of these new emerging therapies that go outside of, let's say, traditional antidepressants and counseling.


Generally speaking, medications for psychiatric conditions per se have not been shown to reduce risk of suicide, even though they can improve your depression, for instance. PTSD is a risk factor. It's essentially a stressor on the organ, a very clear stressor. Any organ in your body can give way at any time, whether it's your liver, your kidney. You twist your knee fast enough and strong enough on the football field, your ACL can snap, and I think that's really the model we need to think about when we think about the stressors that people put on their brain. That can be me or you statistically, Daniel. I don't actually know exactly how old you are, but I think we're both in that number four cause of death window. You and I are at a much higher risk of becoming suicidal than getting melanoma, but I don't think it even occurs to us to not put sunscreen on when we go to the beach. We do it because we think we're at risk and it can help us.


The brain is the same. If we have a condition like PTSD, or somewhere else in our life there's a stressor, we need to be realistic that it is the number four cause of death for us regardless of where we sit. We can't envision it entering that suicidal mode, but it can happen. And when it happens, it doesn't mean that we're weaker. It's what happens when the organ misfires. It's not your fault. We just need to provide the resources and the tools to reduce that risk and also to treat your PTSD.


Yeah. I think that's an enlightening way to frame it. I met Zak Williams, the son of Robin Williams, who's quite active in the mental health space, and he used a term that I hadn't heard before, which is sort of practicing mental health hygiene, whether it's mindfulness and meditation or optimizing sleep or social connection. What are those ways that you can inoculate against bad mental health, but all the way to suicidality?


I think we all know that if you take a solid week off from work, if you have that luxury, or stop doing the things that stress your brain, you feel a lot better at the end of the week. Oncology is a great example. If you've ever lived through a cancer diagnosis, it very much feels to the patient, in many cases, like their options are narrowing. It's not that cancer patients want to die, but the outcomes that they face are very, very large. I think it was something like 40% of patients who get diagnosed with cancer end up in bankruptcy as a result of the costs of their cancer treatment.


So you're pulling yourself and your loved ones into bankruptcy with you. Think about the toll that that takes on the brain as an organ. What kind of options would I see? Now couple that with lack of sleep, the medications from the treatment, and all of a sudden you're loading up the brain with all kinds of trauma around which it needs to start to make decisions, and it really freezes up in the same way your laptop might freeze up and stop functioning.


Healthcare issues are the leading cause of bankruptcy in general in the US. Is there a difference for folks in the UK with the NHS, where they have underlying insurance?


Suicide rates are higher in the US across many of these age groups than they are in the United Kingdom. We're seeing a new wave of companies focusing more on serious mental illness, diseases like schizophrenia and bipolar, and working with health insurers around that. When it comes to suicide, we had this phenomenon in the psychiatric system broadly, not just in digital health, where suicide patients are turned away. Nobody wants them. I think there's two reasons they generally are not wanted as new patients. One is the perceived liability risk and the other is the feeling that they don't have good treatment options for them.


It is the only leading killer without a prescription product available. So as a clinician who's a prescriber, I don't have an option. That's one problem. The other is that the interventions that have been proven to work are really like a subspecialty, but the system isn't structured around subspecialties within psychiatry. So for instance, in cardiology, if I have an arrhythmia, I might be sent to an electrophysiologist. Suicide is similar. The interventions that work to reduce suicide attempt risk are quite different from other interventions.


That's an interesting point. There certainly aren't suicidologists. If you were to wave your magic wand and you had more folks trained in this area, what are the most effective tools they could use to take folks at risk who've had attempts and help them fully recover?


In terms of reducing attempts, the first therapy to get wide recognition was called dialectical behavior therapy, or DBT. The challenge with DBT is it's extraordinarily costly and intensive. Patients typically go through at least several weeks, and often several months, of around-the-clock care with access to their clinicians at night via text, and that showed modest reductions in attempts. The second category was a modified version of cognitive behavioral therapy, which is about three months in length. That's the one I got most interested in. It actually separated attempts from ideation. That was one important breakthrough. And the second is it focused on giving the patient control.


So as clinicians, we were often scared to say, "Look, you can manage this yourself." That feels uncomfortable for clinicians, but what these researchers showed is that they can reduce suicide attempts by 60% or more by going through this very specific, very specialized therapy. If I could wave my magic wand, we'd have an unlimited number of clinicians doing this. The problem with that is twofold. I don't have a magic wand. And the second is the work is intensive.


As we're now three years into the COVID pandemic, and we mentioned earlier the stressors that can build up, we've certainly seen some high-profile cases of physicians taking their own lives. Any learnings from that?


I'm going to say something very simple. Figure out, even if you need to ask people you know when and what are the things that cause you to be more irritable, to sleep less well. Pull back from those things. And then number two, turn off your cell phone. Every day, a couple hours before you go to bed, as hard as it sounds, leave your phone in another room when you go to bed. The digital bombardment of the brain, from all the data I've seen, is definitely an ongoing stressor and everything will be fine 12 hours from now if you put your phone in airplane mode. I found for myself that that was an incredibly powerful thing and just detach as often as you can.


So doom scrolling Twitter before I go to bed is not a good idea. I need to act on that myself.


That's a great interpretation of what I said.


Well, thank you so much, Seth Feuerstein, for joining us on this episode of Healthy Conversations. A lot of insights and a lot of things to think about and hopefully take into our day-to-day life and practice, and we're happy to have you on our next segment. We'll be talking more about some of the emerging digital tools and digital therapeutics that can help prevent suicide, so if you are listening, we hope you'll join us then. Thanks for joining us on this episode of Healthy Conversations.


Please join us for episode two with Dr. Feuerstein, where he talks about the innovative prescription digital therapeutics that his company has developed to help manage patients with suicidal ideation. The episode will launch on September 20th.


Welcome to Healthy Conversations. I'm Cara McNulty, President of Behavioral Health and Mental Wellbeing at CVS Health. Since this is Suicide Prevention Awareness Month, we are re-airing our conversation with Dr. Seth Feuerstein, the founder of Oui Therapeutics. If you missed the first episode, he talked with our host, Dr. Daniel Kraft about the issue of suicide in this country. In this episode, the two discuss new interventions to help prevent suicide, digital therapeutics. Dr. Feuerstein's company, Oui Therapeutics, has developed a tool to help patients manage their suicidal ideation. We are really excited to offer this program to our Aetna members along with our other clinical programs. These newer, evidence-based clinical outpatient programs are uniquely focused on suicide prevention and risk reduction and demonstrate that we can help prevent suicide when we help connect those who are in need of care.


That connection to care is absolutely essential, especially for vulnerable and diverse communities such as the LGBTQ+ population. Increasing access and connection to these resources is how we continue to make progress in this fight against suicide. There are lots of ways we can continue to move the needle. For instance, we've implemented universal screening and safety planning into our clinical protocols across our organization, helping to ensure we identify people in need and get them connected to care. We also proactively reach out to members who are at risk for suicide and connect them with clinical support and resources. For example, through our Caring Contacts Program, we send postcards with reassuring messages, comfort items, and more to people in need. Because one-third of patients who die by suicide saw their primary care physician the month prior to their suicide, we also want to ensure that providers feel comfortable supporting their patients who may be in crisis.


To help, we've launched a suicide prevention training pilot for select primary care practices. We also offer a unique suicide prevention qualification for behavioral health providers in our network, including those therapists in select CVS MinuteClinic locations. As you'll hear in this conversation with Dr. Feuerstein, it is really encouraging to hear how Oui Therapeutics and others are using research and technology to develop proven methods to reduce suicide. Together with newer strategies to help those in need, I am hopeful we can continue to make significant progress in the fight against suicide. Stay tuned to hear all the details about this innovative new technology.


We can use software to tackle not the problems where we already have lots of prescription options, but the life-and-death situations where we don't have good answers. The thing that really excites me is, where can we use software to solve problems that were not previously solvable?


Welcome to Healthy Conversations. I'm Dr. Daniel Kraft, and now we're in our second part of a two-part series with Dr. Seth Feuerstein, a psychiatrist, faculty at Yale, and the CEO of Oui Therapeutics. Seth, from our last conversation, we kind of covered the huge challenge of suicide and suicidality. Maybe you can just give us a 30 second recap about what the problem state is in the US, around the world, and why it's so critical that we find new ways to address suicide.


Suicide is the number two cause of death in the US for people in their teens and their twenties, and the number four cause of death for people in their thirties, forties, and fifties. It's the only leading cause of death without any prescription products, and it's a cause of death that's been increasing for a couple of decades. There are approximately one and a half million people a year with a suicide attempt, and approximately 14 or 15 million a year who think about suicide, without really good options that clinicians can prescribe or offer their patients.


It has huge societal costs, and many of us who are healthcare providers often don't feel well-equipped to deal with those who have suicidal ideation or symptomatic suicidal behaviors. You've recently launched a company called Oui Therapeutics, helping clinicians manage and treat these patients. Can you, first off, explain to us what a digital therapeutic is, and how it might be used to replace or supplement more traditional therapies, particularly in the setting of mental health and suicide?


The phrase "digital therapeutic" is getting tossed around a lot these days. I like to be more specific and say over-the-counter digital therapeutics or FDA-approved digital therapeutics. Over-the-counter would be more like vitamins: things that are not regulated by the FDA and that are essentially offered, in combination with other products, as software- and hardware-driven interventions or therapies in one way or another. Medical devices, like pacemakers and hip implants, and drugs, like the statins or the antidepressants or the antibiotics, are the two categories that came before this. FDA-approved digital therapeutics are a new category. It was actually created by legislation in 2017, and then the FDA implemented it. This new set of rules created something called software as a medical device and essentially mandated that the FDA regulate this area and create this third leg of the stool, if you will.


There are lots of consumer digital apps out there, from smoking cessation, to forms of mental health, to exercise, to managing almost any element of care, but to be FDA cleared you have to clear a few hurdles. Can you help us understand what those might be?


Generally speaking, it would go through the same kind of process that a drug might. I think toxicity is a little bit less of an issue, for the most part, with software as a medical device. The thing that really excites me is where we can use software to solve problems that were not previously solvable. Rather than tackling problems where we already have lots of prescription options, we can take on life-and-death situations where we don't have good answers. That's something that's really potentially breakthrough, from my perspective, and something really exciting to work on.


Let's dig into that. Your company, Oui Therapeutics, has created a tool for helping manage patients with suicidal ideation and beyond. How does the platform operate?


I had a lot of experience, by chance, early in my career: I founded a first-generation digital therapy company back in 2009. Just for context, that was a time when, if things didn't work on your computer, it was because our DVD-ROMs were scratched. That's how long ago that is in the software world. We scaled that to tens of millions of covered benefit lives. From that experience we learned that patients do like interacting with software, and in this case we made apps that go on your smartphone. Patients interact with that in a multi-dimensional, interactive experience where you might work with a chatbot function, you might interact with other patients, or you might work on practicing exercises to refine the way your brain might react to certain situations. We combine all of those things into a multi-week experience that typically lasts about 10 to 12 weeks for a patient. They get offered it by a clinician and they interact with it at their own pace over those two and a half months or so.


Maybe dig down into a patient who's recently had a suicide attempt and is now on the Oui platform. What do they actually do, and how does the clinician interact with that information?


In some ways it's like what they do when they're prescribed a drug. In other words, they get prescribed the app, they dose the app according to a schedule set by the clinician, and then they can follow up on their own, which is similar to a medication. I'd say some of the differences are that clinicians can check in, see what's happening, and interact with the patient, and I would say more interaction is better. One of the nice things about it is that an app can remind them to do more and can learn from some of the information they've entered to help nudge them along in certain ways. In terms of what's required of the clinician, really it's just to be a responsible and thoughtful steward of the experience. I do think that in the future there'll be more ways to have more interaction with the clinicians.


We want to make sure we're bringing clinicians along on the journey so they can provide more useful tools. I will say an important part of this is the payers. We've had really good relationships and conversations with some payers, including, for full transparency, the sponsor of this podcast. Payers do seem interested in allowing the right digital tools, with the right data, to be reimbursed and used with their health plan members. One way to think about it is the early days of the patient portal. It was not a good experience: clinicians would send a message, and patients had a hard time logging in. We're trying really hard to avoid that, so we only want to require what's absolutely necessary from all the parties.


For patients, it's all about a user interface that might be tuned to age, culture, and language. What might a session look like on the Oui Therapeutics platform?


We're very mindful of that. There's no question some patients respond differently to different experiences and language, so we include different types of peer- and group-therapy-style content that aligns with a diverse set of individuals. I might log in and be speaking with a chatbot, and that chatbot explains that it's going to guide me through a certain exercise. In that exercise, I might get to watch four or five other patients, pick one or two or all five of them, see what their experience was like doing that exercise, and then apply what they did to my own experience doing that exercise. What's interesting is that some patients like to choose people who have had different experiences and look different from them, which helps them realize that the experience they're having is in fact a more universal experience and not just limited to people like them. That's akin to what might happen in a good group therapy session.


Got it. As a digital therapeutic that you're trying to get through the FDA process, it's often still about comparing it to the standard of care. As I understand it, you have a study design right now comparing your intervention to an in-person therapy that had good results in reducing suicide attempts among military personnel. Can you tell me about that study and how you find the methodology evolving?


We founded our company a lot like a biotechnology firm. We worked together to create a single software intervention that took the best assets from the very best interventions and put them in one place. The study you're referring to was published in 2015 and was actually launched a few years before that. What the researchers showed was that just 12 doses of this therapy, delivered over just a couple of months, were far more impactful than all the other treatment these patients were receiving. Everyone in the control arm got everything: they got medications, they got therapy, they got inpatient care, they got everything the system could throw at them. But just 12 doses of this intervention in that particular study reduced suicide attempts by more than 60%. In other words, they estimated they reduced the number of suicide attempts occurring in those populations by more than six out of 10 over just a period of several months. They did follow-up for almost two years, which is a very long time for follow-up, and found that the intervention seemed to have a bigger effect the longer you looked, after only 12 doses.


And are you digitizing that in-person type of intervention?


Yeah, so it's that, plus layering in additional tools and resources, but that is a core part of what we do. The first data around the intervention was actually published in the Journal of the American Medical Association in 2005 by our team. I think the thing that was shocking to me was that we could have an intervention shown to cut potentially deadly events by more than half, and yet, scanning across hundreds, and, believe it or not, actually hundreds of thousands of clinicians, because I was at a national health insurer at that point after selling my first company, I couldn't find a single psychiatrist or psychologist delivering it. This is an intervention for something that is a top-five killer in this country across different age cohorts. There's something that can reduce the risk of a potentially deadly event by more than half, and nobody's getting the intervention. That is a tragic situation, and it's fundamental to why we do what we do and what we built.


As you're building this, you've been able to develop separate tools for children, adolescents, and adults. How do you see the data and the interface shift?


There's no question, kids are different. The brain is still developing into your twenties, and what you expect when you're 15 is different than when you're 20, no question. Kids are still developing their ability to understand when their brain isn't functioning well and when stressors are really pushing them to the limit. They also very often need to go to adults to access interventions and care. But I think most of us can remember that, when we were teenagers, communicating with the adults in our lives, and especially our parents, wasn't always that easy. We didn't necessarily have the words, and we were afraid of how they might react. So we're actually using software to create an adult support network around at-risk youth.


The preliminary studies around which that software is based showed an 80% reduction in death among at-risk teenagers over 10 years from suicide as well as from accidental drug overdose and all violent deaths. We believe that's because the software helped those adults understand how and when to communicate with those youth and vice versa, when the youth can and should be communicating with the adults. They now had a more fluid way to access an intervention and that intervention doesn't necessarily need to be a doctor and it doesn't necessarily need to be a treatment, it needs to be communication and rescue, essentially.


That's super impactful. Of course, you can't just prescribe the pill or the device; it's the interaction with the social environment, your family, friends, and community, that often plays a big role in outcomes. Have you seen any reluctance from providers or payers to think about this new digitally enabled mental health approach, and how might you address those misconceptions?


Innovation is a process, and what we've seen around suicide and around what we're doing is that payers have been pretty receptive. They recognize that there aren't good options for their health plan members. They also recognize that if what we're doing works, it saves them a lot of money. There's no question that health insurers want to see good results, and from Aetna and CVS we've had incredible support around the mission and the outcomes. The first large Blue Cross Blue Shield plan just announced full coverage of FDA-approved digital therapeutics, and we expect that trend will continue. In some ways, what we're doing is like a new cancer treatment or a treatment for an orphan disease: something where there really aren't good options, and people do want to figure out how to cover it.


And, of course, CMS drives a lot of this in terms of reimbursement codes, and, as I understand it, they introduced a new billing code earlier this year. How does that change the game?


I think there are two ways that CMS will help drive success in this area. One is, as you mentioned, coding to pay for the digital interventions. The other is to provide reimbursement for clinicians to spend time with their patients around those digital interventions. I think we'll see a trend from CMS and elsewhere of encouraging follow-up and connectivity with patients around treatments and around social determinants, which will also help drive adoption.


And are you seeing this ability to create digital health formularies, not just for cardiovascular disease and diabetes, but now across mental health?


Absolutely. The first formularies to receive widespread attention were from CVS Caremark and Cigna Express Scripts, and those were largely over-the-counter digital tools. I think we're going to see them evolve, because this is finally a place where value can really come into the prescription environment. There's been a lot of tension for a long time between the pharmaceutical industry and the payers around the value each prescription brings in the drug world. One of the great things about software is that we can collect a lot of data on outcomes and utilization, and I think payers want to pay for that. They just want to know what they're paying for, and digital provides a platform for them to do that through those formularies.


And I don't want to leave out employers. When you think about the statistics I laid out at the beginning of our discussion today, and consider that large employers are usually at risk for the costs of their employees' healthcare benefits, suicide is a leading killer of their working population, taking between 5 and 15% of all their employee deaths. Think about that: approximately one out of every 10 employee deaths is from suicide. Suicides are extremely costly to those employers, from the loss of good personnel, to the spiritual wellbeing of the team, to disruption in the workplace. They want to pay for solutions to that.


Yeah. I think, number one, of course there are always the reimbursement and regulatory challenges, but there are also the challenges of getting those solutions into the hands of clinicians and their patients, and into the workflow, without having to log into 12 different apps. Where are we on that?


It's tricky. I think some large health systems have already selected certain platforms for certain problems, which I would say generally lean towards lower-intensity, less sick patients. There are some platforms that have a half dozen or a dozen different interventions around mood, sleep, and anxiety. I don't think it's clear yet what the right place of distribution and commercialization will be for these products; my guess is it'll come from a variety of places initially. What I can say from my own experience is that patients are not the issue. Patients like the tools, they use the tools, and they interact with the tools much more than they do with traditional office visits. Patients are not going to be the roadblock here.


And so, just to finish up with what you're doing with Oui Therapeutics, where are you now in this development process?


We have multiple products going through the FDA. We're optimistic that within a year or two the first ones will be available to patients. We are actually working on some things that will not require FDA approval that should be helpful around suicide. More to come on those fronts.


Fantastic. Just to close with putting our futurist hats on, if we were to jump forward 10 years, do you think with these new emerging digital layers, therapeutics, new forms of data, analytics, we can start to inoculate against the issue in the first place?


I don't know if we'll get to inoculation in 10 years, but I do think we may get closer to what we see for cancer, where we've got certain subtypes. Cancer is a whole group of diseases, and I think suicide risk is going to be one of those areas. There are going to be some patients where we really can understand who they are in advance and almost completely eliminate their risk, some cases where we can significantly reduce their risk, and then some where we're still struggling. That's where I think we'll probably be in eight to 10 years.


Yeah. In my world of oncology, we now have the ability to sequence, sometimes multiple times, and we'll look at the cancer stem cell proteome. Hopefully, this era of precision mental health will continue to unlock pieces of the neuroscience of the brain and be more proactive and preventative rather than waiting for the therapeutic component.


I completely agree. I'm extremely excited for the next decade.


Well, number one, thank you, Dr. Seth Feuerstein, for your amazing work in this space in the past, now, and into the future, and thank you so much for sharing with us on Healthy Conversations. For those of you listening who missed our initial episode, Dr. Feuerstein and I had a fascinating discussion diving into the etiology, the current state, and the challenges and opportunities in understanding suicidal ideation and suicide in general, and in hopefully moving the needle more broadly. Thanks for joining us on Healthy Conversations. We'll see you next time.