
In this podcast, Dan Housman and Giuliana Kotikela, Co-Founder and CEO of ChiKeey, maker of a fashionable and functional wearable device, discuss the utility of real-time biometrics and advanced real-world data and analytics in assessing and analyzing emotional wellness. ChiKeey’s solution uses unique indicators to proactively identify potentially significant changes in mental health, cases where individuals may benefit from earlier intervention, whether through support or therapies. From this perspective, they discuss future applications that could support the next generation of clinical research and patient care via remote monitoring.

Transcript

[00:00:01] Hi, this is Dan Housman with the Novel Cohort’s Podcast, and I’m here with Giuliana Kotikela to talk about her new venture and about real world data. So, Jewels, tell me a little bit about your background. How did you get involved in data science, and how did you get into health care?

[00:00:19] Hey, Dan. Hey, guys. So I’m Jewels Kotikela, and my background originally was in mathematics and statistics. Fast forward to my previous life at a Big Four consulting group, where I started what was called the Inside Studio. There I worked with hundreds of different companies to drive innovation, using their data to identify which insights were possible, practical, feasible, and valuable. So I really merged the skill set I built in school with using it in a real world way. When I was working at the Big Four, I traveled constantly, and I had these conversations with my daughters. I have two daughters, and the oldest one, who’s not yet 13, was having some challenges at school, unbeknownst to me, and the teacher called me in for an intervention. I was like, what is this? It turned out she was having a conflict with a girl in another classroom. Where did this come from? I was blindsided; I had no awareness beforehand that the situation was escalating and had taken on a life of its own. And I thought at that moment, if I had the contextual awareness to understand that something was going on where she was, it could really open the aperture for an honest discussion and dialogue with my daughter. That started the wheels moving, and I left the Big Four to start ChiKeey. That’s how I started ChiKeey. It’s really the culmination of all these different things: having that poignant problem in my own life and seeing not only what data science can do, but how untapped the mental health arena really is. There are no Class II devices in this space. And I really wanted to be able to help not just myself, but other people who could be in a similar situation.

[00:02:29] So I think it’s a consumer device you’re working on. Can you tell me more about what it is, what it looks like?

[00:02:37] So it’s a smart wearable ring: an integrated behavioral management system in a smart wearable. The titanium ring contains a miniaturised sensor package capable of detecting biophysiological signals in real time. It’s paired with a mobile application that allows individuals near constant monitoring as well as interventional treatment. This gives us the ability to alert the patient preemptively and, alternately, recommend different courses of action. Ultimately we look to align the individual’s emotional goals with quantifiable therapeutic outcomes. Think about it a different way: we could potentially assist in treatment decisions by alerting clinicians to acute distress, in addition to just-in-time interventions and retrospective analysis.
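
To make that loop concrete, here is a minimal sketch, strictly illustrative and not ChiKeey’s actual architecture or API: a processed sensor reading is scored, and the result is routed either to the wearer’s app as a preemptive nudge or, in the prescribed use case, to the clinician. The score names and thresholds are assumptions.

```python
from enum import Enum, auto

class Route(Enum):
    NONE = auto()       # nothing notable, keep monitoring
    WEARER = auto()     # preemptive nudge in the wearer's app
    CLINICIAN = auto()  # acute distress: alert the care team (prescribed use case)

def route_sample(stress_score: float, clinician_enrolled: bool) -> Route:
    """Toy routing rule for a single processed sample.

    `stress_score` stands in for whatever the sensor-fusion model outputs
    (0 = calm, 1 = acute distress); the thresholds here are arbitrary assumptions."""
    if stress_score >= 0.9 and clinician_enrolled:
        return Route.CLINICIAN
    if stress_score >= 0.6:
        return Route.WEARER
    return Route.NONE

# Usage: an over-the-counter wearer vs. a patient whose clinician receives acute alerts.
print(route_sample(0.7, clinician_enrolled=False))   # Route.WEARER
print(route_sample(0.95, clinician_enrolled=True))   # Route.CLINICIAN
```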

[00:03:34] So it sounds like a smart mood ring from the 70s.

[00:03:39] It is a smart mood ring from the 70s, but it’s turbocharged. We’ve taken the best technology available today and put it into a very small form factor.

[00:03:50] And why a ring? Why not just use my Apple Watch or whatever Samsung makes?

[00:03:56] So the ring form factor was actually suggested by one of my founders, my co-founder, and I thought the idea was really great. A watch is very different; a ring can be very stylish. I felt that nothing in the marketplace was really meeting the need. When you think about what was out there, there are lots of fitness bands and watches and so forth, but that segment is highly saturated. And when you look at ring form factors, there aren’t very many players in this space, and they’re built for very different use cases.

[00:04:35] And, you know, you’re going to be observing a whole lot of things based on what you described, aren’t you worried about privacy and people being concerned about the invasiveness of this kind of product?

[00:04:47] Yeah, completely. We take privacy very seriously. Privacy is definitely one of the biggest concerns people have, and people have the option to share or not to share. We give people the tools and the capabilities to make a choice, an informed decision about whether they want to use the data just for themselves or share it with other people, whether that’s in a clinical setting or within a tribe of their own creation. Another thing: people typically don’t ask for help when they really need it. You see a lot of that happen. Imagine if you could preemptively understand or see that a friend, family member, or cohort wasn’t doing well, and you could actually intervene in that situation ahead of time. Instead of waiting for someone to call you, you make the call.

[00:05:50] Yeah, but does that not get into that privacy problem?

[00:05:54] But that’s the individual’s choice, right? That’s completely their choice; they don’t have to share. They can keep it to themselves. That’s OK.

[00:06:01] And how do you train this thing? I assume it doesn’t just work out of the box. Does it get better over time?

[00:06:09] It gets better over time, right. You have to establish a baseline for each individual, because every person is completely different. You have to go through the baselining process to understand where you fall across the different levels of emotional state. Happiness to you might be what would register as sad on my scale.

[00:06:30] So everybody falls into a different range, and you have to be able to baseline. The more you wear it, the more it understands and recognizes your own feelings, thoughts, and emotions, using your prior history as well as what you’re doing as you go through your day. It just gets better. In addition to that, that’s the whole purpose of the multiphase effort with the University of California San Diego. We’ve already completed one clinical trial in India, but that was in a laboratory-type setting. It’s not real world. When you’re in the real world, you’re moving, there are different noises, there are all these different artifacts that you have to account for. The multiphase approach we’re taking with them is really to account and cover for that, when you think about the validity of not only the algorithms but everything else.
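
As a rough illustration of the kind of per-wearer baselining Jewels describes, here is a minimal sketch, not ChiKeey’s actual algorithm: it builds a personal resting heart-rate distribution from the wearer’s own history and flags readings that deviate far from it. The signal choice, window size, and threshold are all assumptions for illustration.

```python
from dataclasses import dataclass, field
from statistics import mean, stdev

@dataclass
class PersonalBaseline:
    """Toy per-wearer baseline: learns a resting heart-rate distribution,
    then flags samples that deviate strongly from it. Illustrative only."""
    window: list[float] = field(default_factory=list)
    min_samples: int = 30          # need some history before flagging anything
    z_threshold: float = 2.5       # "far from normal for this person"

    def update(self, heart_rate_bpm: float) -> bool:
        """Add a new sample; return True if it looks anomalous for this wearer."""
        anomalous = False
        if len(self.window) >= self.min_samples:
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(heart_rate_bpm - mu) / sigma > self.z_threshold:
                anomalous = True
        self.window.append(heart_rate_bpm)
        # Keep a rolling window so the baseline adapts the more the ring is worn.
        if len(self.window) > 500:
            self.window.pop(0)
        return anomalous

# Usage: feed samples as the ring streams them; the baseline improves with wear.
baseline = PersonalBaseline()
for bpm in [72, 70, 74, 71, 73] * 10 + [118]:
    if baseline.update(bpm):
        print(f"Deviation from personal baseline: {bpm} bpm")
```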

[00:07:20] How will this work in conjunction with health care? If you’re dealing with a clinical team, a patient, and patient data, how is it going to work? What have you thought about?

[00:07:34] They’re pretty different use cases, right? There’s definitely a prescribable option, where we’re working directly with the doctors, the therapists, and the patient; that’s a very separate use case from the over-the-counter version of our product.

[00:07:52] Ultimately, our goal is to be a Class II device, so we’re working with the FDA and taking the steps necessary to go through that process.

[00:08:04] And so validation is important. Are you going to do that? How do you validate something like this? Have you thought about it?

[00:08:12] Yeah. The validation really comes from having that foundational data set. Our clinical trial proposal walks through it: you don’t want to bias the individuals, but you do want to confirm what they felt. So people are pinged throughout the study. It’s a 12-week study with a control group and so forth. As we validate the algorithms, when there’s specific movement in their heart rate we send them an actual alert: can you open the app and tell us, did you feel something? Is your mood changing? Collecting all of that information is really the base and the foundation for taking it out to market.
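
A hedged sketch of how that labeling loop might look, with hypothetical names rather than the actual study software: when the algorithm flags a heart-rate movement, the participant is prompted in the app, and the self-report is stored alongside the sensor reading to build the labeled data set used to validate the algorithms.

```python
import datetime as dt
from dataclasses import dataclass

@dataclass
class LabeledEvent:
    """One validation record: what the sensor saw plus what the wearer reported."""
    timestamp: dt.datetime
    heart_rate_bpm: float
    felt_something: bool
    self_reported_mood: str   # e.g. "calm", "stressed", "sad"

def collect_label(heart_rate_bpm: float, prompt_user) -> LabeledEvent:
    """Ping the participant when the algorithm flags a reading, and store the answer.

    `prompt_user` stands in for the mobile app's survey screen; here it is any
    callable returning (felt_something, mood)."""
    felt_something, mood = prompt_user()
    return LabeledEvent(
        timestamp=dt.datetime.now(dt.timezone.utc),
        heart_rate_bpm=heart_rate_bpm,
        felt_something=felt_something,
        self_reported_mood=mood,
    )

# Usage: in the study the answer would come from the app; here we fake one response.
event = collect_label(118.0, prompt_user=lambda: (True, "stressed"))
print(event)
```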

[00:09:03] And if you could link this to claims data, things like that, do you think you could find patterns that identify people with certain diseases or classify who’s at risk for a negative outcome?

[00:09:18] Completely. There are so many different things you can start looking at, especially with mental health and chronic diseases and conditions; they’re very highly correlated. A lot of it has to do with social determinants, and that’s where I think we play really strong: having the environmental stimuli, the analysis of not only who you’re engaging with, but where you are, what time it is, and what you’re doing. That contextual awareness adds a flavor that I think is really challenging for anybody else to capture. Then think about the social determinants you can pull from claims data for individual patients: where are they living, what are they doing, what are the demographics of the area? Not all of that information may be available. Hopefully it would be, but it can definitely make things a lot richer for determining preemptively, before a condition gets out of hand, who is most likely to need intervention or to relapse into the emergency department.

[00:10:43] And I have seen a lot of these devices getting used in clinical research just as a tool for patients who are already on an oncology trial. How do you think that’s going to work? Do you have to change the device or is it the same device or what’s it going to look like?

[00:11:05] So that’s a good point. When you think about your Class II submission, it’s for a very specific use case. If you start monitoring for additional things, well, what I think you’re asking is this:

[00:11:17] Say you’re running another trial and you want to know how patients are feeling throughout it. The drugs they’re on in the trial might, for example, carry a risk of suicidal ideation. There are things like that where you could additionally incorporate the ChiKeey ring to understand them, because right now, what do people have in a clinical trial? They use the GAD-7 or the PHQ-9, but those are all point-in-time measures, not continuous monitoring. So I think it would actually be a really big help to clinical trials, and much more innovative, to incorporate that, to ensure that patient health and safety are of utmost importance.

[00:12:09] And I’m thinking you’re going to end up with a treasure trove of data here. Let’s say you’re super successful a few years from now and you have 10 million subscribers, or one hundred million subscribers. What are you going to do with all that data?

[00:12:24] That’s a good question, and that might be for podcast number two. Maybe we can open it up for more research opportunities. With UCSF, what we’ve been talking about is something like the N3C opportunities out there: how do we make the world better for everybody, without giving away people’s privacy or individualized data, of course? If we could aggregate that data somehow to help or support mental illness, I think we would definitely be in favor of that.

[00:13:03] You’re an entrepreneur, so you must stay awake late at night dreaming about what the future holds. Especially through the lens of ChiKeey, what are some things you see happening in the future that are really exciting?

[00:13:15] I think it’s the idea of almost twenty-four-hour constant remote monitoring. I really love the idea that people can understand not only where they are, but also determine where they want to go, and I think that opens up a whole new world for most people. Most people can say, oh yeah, I feel stressed, or this or that. But it’s not about how you felt; it’s about how you want to feel. And I think it’s about smashing that stigma. If we can really be the leading brand in emotional wellness and mental health in this area, that to me would indicate definite success.

[00:13:57] Great, Jewels. Well, thanks for taking the time with me today, and I guess we’ll have to find time for that podcast number two!

[00:14:08] Awesome. All right. Thanks, Dan.

[00:14:10] All right. Catch you later.