Mental Health Apps: How to Recommend and Review
View Presentation
Video Transcription
Aloha everyone. Thank you for joining us for Mental Health Apps: How to Recommend and Review. My name is Jacques Ambrose. I'm a senior medical director at Columbia University, and it is my pleasure to introduce the distinguished guest panelists we have today. You are in for a treat, because all of them have very long bios that I have to read from. Dr. Darlene King is an assistant professor in the Department of Psychiatry at UT Southwestern Medical Center, deputy medical information officer at Parkland Health, and chair of the APA Mental Health IT Committee. She graduated from the University of Texas at Austin with a degree in mechanical engineering before attending medical school and residency at UT Southwestern. Dr. John Luo is a clinical professor of psychiatry, director of consult-liaison and emergency psychiatry at UC Irvine Medical Center, and director of psychiatry residency training at the UCI School of Medicine. Dr. Steven Chan, to his left, is a clinical informaticist and addiction psychiatrist and a clinical assistant professor affiliated with the Stanford University School of Medicine. He is also chair of the Committee on Innovation at the American Psychiatric Association and has published in JAMA, Telemedicine and e-Health, JMIR, WIRED, PBS, NPR, and Ideastream. His latest ventures include Mental Power Hacks and Async Health. And last but certainly not least, Dr. Jay Shore is the director of telemedicine at the Helen and Arthur E. Johnson Depression Center and vice chair for innovation in the Department of Psychiatry at the University of Colorado Anschutz Medical Campus. So please join me in giving them a round of applause. Hi everyone. So today I'm going to start off by talking about mental health apps and giving you a framework for how you can incorporate apps into your practice.
So the learning objectives for this talk are to understand at least three risks and three benefits of using apps in care with patients; to guide a patient through informed decision-making by helping them find a smartphone app tailored to their unique needs and engagement style; and to assess current APA resources and be able to utilize them to support yourself and the patient in selecting an app. All right, so we've all heard the phrase "there's an app for that," but when it comes to mental health apps, there are over 10,000 in the app stores, and a lot of different factors go into whether an app is going to be safe, whether it will be effective for a patient, and whether it is evidence-based. Since COVID, searches for depression apps have risen over 156%, and searches for mindfulness apps continue to rise, so there is a real need to know what kinds of apps we can recommend to patients. Even before COVID-19, mobile apps were in demand, but the space is still the Wild West, and searching for an app that is clinically effective and safe for patients can feel like searching for a needle in a haystack. Part of that is because the regulatory process is confusing. To give you an idea, there have been reviews where researchers go to the app store, pull the five-star apps for anxiety, and then check which of them are actually evidence-based and have some clinical evidence behind their use. Multiple studies have shown that, at the time those reviews were published, none of the top-rated anxiety apps had any evidence base behind them.
And some apps can even be harmful. A 2015 study that looked at apps found a bipolar app advising users to take a shot of liquor during a manic episode, and warning that their bipolar disorder might "spread" to other people if they talked about their depressive symptoms. So if you don't dig into certain apps, they can contain genuinely harmful language. The current regulatory system hinges on whether a company decides its app is a medical device, wants to classify it as software as a medical device, or wants to keep it as a wellness app. A wellness app is something like a Fitbit or a tool that helps with breathing exercises: it may help a bit, but it doesn't track things you would want to act on medically. So there's a gray area there. At one point the FDA was piloting what was called the Pre-Certification Pilot Program, through which it wanted some regulatory oversight and would require a certain level of evidence for these devices to be overseen. However, back in September they said they were doing away with the pilot program, and they haven't offered any further guidance to date. So there's no effective regulation for most mental health apps or technology; usually what happens is the FTC comes in from the back end when there are egregious data practices. This slide shows the case of Loris AI: a conversational app that patients talked to when they were suicidal was packaging that data and selling it to a spinoff company, and the FTC intervened and put a stop to it. That's an example of regulation arriving only after some harm has been done. And then what about apps that are FDA approved?
There are some out there, and where it gets a bit cloudy is in the quality of the evidence behind FDA-approved apps: some of the studies may not have the highest quality of data. That's why it's really important, if you want to use apps, to be able to judge and evaluate them yourself, and to teach your patients how to do that too. To that end, Dr. John Torous's lab and the APA partnered to create the APA App Evaluation Model. You start with context and background, and the framework is shaped like a pyramid: you start at the bottom and go through each level, and if the app doesn't meet one of the lower criteria, you don't move up the pyramid. You can just stop there and say, okay, this is likely not an app that I or the patient would want to use. It's also an open-ended framework, and we're going to go through it together. Overall, there are some guiding principles to keep in mind when using this evaluation framework. First, there is no single "best" app. Apps are constantly changing, so we try not to rank them: an app that is good today may not get an update, and tomorrow it may no longer fit your criteria, or its privacy policy may change, so you want to stay up to date. Second, everybody has different needs, and everyone reacts to an app differently, especially around privacy protections. Some people really don't like the idea of their information flowing to a company; others are okay with it and are willing to share information to get the app as a service. Talking about that and considering it with the patient is helpful. And of course, everybody has different clinical needs, and you may want to use an app for multiple reasons. So let's start with level one.
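The bottom-up, stop-at-first-failure logic of the pyramid can be sketched as a short checklist routine. The level names below follow the APA model, but the check functions and the sample app record are hypothetical illustrations, not the APA's actual criteria:

```python
# Hypothetical sketch of pyramid-style app evaluation: check levels in
# order from the bottom, and stop at the first level the app fails.

LEVELS = [
    ("Context & background", lambda app: app["developer_known"]),
    ("Privacy & safety",     lambda app: app["has_privacy_policy"] and not app["sells_data"]),
    ("Clinical evidence",    lambda app: app["peer_reviewed_studies"] > 0),
    ("Usability",            lambda app: app["easy_to_use"]),
]

def evaluate(app):
    """Return (levels passed, level failed or None), stopping at the first failure."""
    passed = []
    for name, check in LEVELS:
        if not check(app):
            return passed, name  # don't move up the pyramid
        passed.append(name)
    return passed, None

# A made-up app that shares data with third parties.
sample_app = {
    "developer_known": True,
    "has_privacy_policy": True,
    "sells_data": True,
    "peer_reviewed_studies": 2,
    "easy_to_use": True,
}

passed, failed_at = evaluate(sample_app)
print(passed)     # levels cleared before the failure
print(failed_at)  # level where evaluation stopped
```

Note that the clinical-evidence and usability checks never run for this sample app; a privacy failure ends the evaluation, exactly as the pyramid intends.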
We're going to go through the App Evaluation Model layer by layer, and then we'll do some examples to help you solidify that knowledge. The first level is context and background: what do you want this app for? Say you have a patient with depression. All right, we're going to look for an app for depression, and then, what else do you want the app to do? At this stage you're just learning about the app at a surface level: what are we looking for? Another part of this level is digital literacy. You want to check in with the patient. Do they know how to use a smartphone? Do they have a smartphone? What's their data plan? Are they willing to purchase an app, or do they want something free? Then we move on to level two, which is privacy. You always want to look at the privacy policy and keep some key phrases in mind. Whenever you see the term "third parties," that usually means data is going to be sent out of the app, and third parties have a lot of flexibility in what they do with that data. So if that phrase is in there, you want to read the language around it very carefully to see how they may be sending that information out. Another way to think about privacy is a concept called contextual integrity. HIPAA really defines information flows: a patient agrees to share sensitive information, but with the doctor, the payers, and whoever else is involved in their health care, so that they can receive a service. This idea of information flowing for a certain purpose, fitting a particular norm, is important to think about. So with apps, think about how the information is flowing and what the privacy policy says, but it can still be confusing.
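As a first pass before reading the fine print, the red-flag phrases mentioned above can be hunted with a simple keyword scan of the policy text. The phrase list here is illustrative and far from exhaustive; a hit just tells you which passages to read carefully:

```python
# Illustrative first-pass scan of a privacy policy for red-flag phrases.
# A match is a prompt to read the surrounding language, not a verdict.

RED_FLAGS = ["third parties", "third-party", "sell", "advertising partners", "affiliates"]

def flag_phrases(policy_text):
    """Return the red-flag phrases that appear in the policy (case-insensitive)."""
    text = policy_text.lower()
    return [phrase for phrase in RED_FLAGS if phrase in text]

policy = "We may share your information with third parties and advertising partners."
print(flag_phrases(policy))  # ['third parties', 'advertising partners']
```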
This slide shows a few top apps that, even though they may be rated five stars in the App Store, have pretty egregious data policies. And there was a study in which a journalist contacted data brokers to see how much health care information she could buy. She contacted 37 data brokers, and 11 of them agreed to sell her mental health data. This slide shows sample charges: $275 for 5,000 aggregated records of Americans' mental health data. The data included ethnicity, age, zip code, number of children in the home, marital status, date of birth, and whether the person was a single parent. It listed what medications people were on and what diagnoses they held, specifically whether they had depression, bipolar disorder, or anxiety disorders. So really intrusive information was captured in these data broker databases and sold. Another thing to think about is the safety of an app. Apps can be dangerous during a crisis: they may list inaccurate suicide crisis lines, lack accurate information on where to find help or what to do in a crisis, or offer no crisis services at all. So if you're thinking about an app to support a patient during a crisis, really thoroughly look at how it handles that. And so, lesson number two: don't assume an app is private. Be especially careful if it captures more sensitive data, like GPS or personal information. The privacy policy and terms and conditions often govern the privacy, and the fine print does matter. All right, level three: this is where we look at what kind of clinical evidence exists.
On this slide, 59 apps claimed to be effective at diagnosing a mental health condition or improving symptoms, but only one app included a citation to published literature. And only a single app for bipolar disorder from a 2020 review was backed by strong evidence. So overall, it's important to look at what types of studies have been done and to thoroughly vet whatever evidence exists for an app, using all the skills we learned in journal club. Approach app claims like those made by nutritional supplements: the phrase "based on CBT" doesn't mean an app is evidence-based, and App Store reviews and download numbers do not correlate with evidence. Then we get to level four, which is usability: how likely is a patient actually going to use the app? If the app isn't user-friendly, use will probably drop off really quickly. The graphs on this slide show that use is high right after installation but quickly drops off; people may use an app a little, but eventually they stop, regardless of which app it was. So a big question with apps is how to keep up engagement and keep people interested as they're using the app. COVID Coach is the example here: patients who downloaded it were using it for stress management, but only 19% used it for more than three days, and just 1.56% used it for more than 15 days. So we can draw some conclusions about the app's engagement. Here's a review of all the levels of the APA App Evaluation Model. Now, you may be thinking: how am I going to take the time out of my day to go through all of these levels and read through these apps? So what Dr. Torous's lab did is take the framework and translate it into a database, an evaluation hub, that's easily accessible.
So you can go online to mindapps.org, filter through, and find apps you want to use. Instead of having to search for apps and run the evaluation on your own, you can filter by different criteria, find apps that meet what you're looking for, and read an evaluation that's already been done. It can be a starting point; you don't have to treat it as the end-all-be-all evaluation. The other thing is that there are multiple evaluations on the site, not just a one-time review: there are several different evaluations of the same app, so you can read multiple opinions, and they're also kept up to date. There are other evaluation hubs out there that you may come across, and when you find one, look at what framework it uses, think about the evidence behind that framework, and check whether the hub keeps its evaluations up to date, because if an evaluation is old or outdated, the app may no longer fit the criteria. The point of all this is that it allows you and your patient to really decide what's important to you. What we're going to do now is go through a few scenarios using the evaluation hub at mindapps.org and see how we can find some relevant apps. In scenario one, we have a client experiencing depressive symptoms who potentially has a mood disorder. They're interested in using an app to log their mood every day. In their spare time, they enjoy writing in a journal, and they would prefer an app where they can connect with someone, such as a therapist.
The digital navigator asks them if they have any privacy concerns, and the client feels strongly about having an app that is HIPAA-compliant. All right, so we're on the website. The patient has depressive symptoms, so we can go to supported conditions and select depressive symptoms or mood disorders. They want to log their mood in a journal, and as you can see, there are different filters along the edge here that you can read through. They also said they want something that meets HIPAA. Has anybody found an app with the filters yet? Journaling. Let's see, what else? Connecting with somebody: coach/therapist connection. Applying all those filters, we were able to find an app. Then you can go and read through it: there's always a description with screenshots of the app that you can explore, and here's the review. You can see there have been eight ratings and reviews of this app. So if you are wondering about an app, or a patient asks you about one, you can come here, and it can be a good starting point before you do a deeper dive. There are a lot of other scenarios we could walk through. In another one, the client speaks Spanish and can't afford to spend money on an expensive mental health app; their doctor has told them they need to exercise more; they can't think of anything else they want in an app; and, as the digital navigator, you ask if they have any preferences in terms of developers, and they only want an app created by the government. Again, you can filter for this: under developer types, there's an option for government. And I would say the VA has a really good privacy policy, so I feel comfortable using a lot of their apps. Then filter for Spanish. All right, this app pops up, and again, you can go look through all of its evaluations.
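The filtering workflow in these scenarios amounts to requiring every chosen criterion at once. A minimal sketch, with a made-up catalog standing in for the mindapps.org database (the app names and fields below are invented for illustration):

```python
# Minimal sketch of the multi-filter search workflow; the catalog
# entries are fabricated examples, not real mindapps.org data.

CATALOG = [
    {"name": "MoodJournal", "conditions": {"depression"},
     "features": {"journaling", "mood log", "therapist connection"},
     "hipaa": True, "cost": "free"},
    {"name": "CalmBreaths", "conditions": {"anxiety"},
     "features": {"breathing"}, "hipaa": False, "cost": "paid"},
    {"name": "StepCoach", "conditions": {"depression"},
     "features": {"exercise"}, "hipaa": True, "cost": "free"},
]

def search(catalog, condition=None, features=(), hipaa=None):
    """Return names of apps matching ALL of the given filters."""
    results = []
    for app in catalog:
        if condition and condition not in app["conditions"]:
            continue
        if not set(features) <= app["features"]:  # every requested feature required
            continue
        if hipaa is not None and app["hipaa"] != hipaa:
            continue
        results.append(app["name"])
    return results

# Scenario 1: depression, journaling, therapist connection, HIPAA-compliant.
print(search(CATALOG, condition="depression",
             features={"journaling", "therapist connection"}, hipaa=True))
# ['MoodJournal']
```

Each added filter only narrows the result set, which mirrors how the site whittles 10,000-plus apps down to a handful worth a deeper look.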
All right, well, thank you so much for navigating apps with me and learning about app evaluation. So now I will pass it off to Dr. Chan. I really loved the interactive demo you showed us of how to use mindapps.org. I'm curious, and would love to ask: for those of you in the audience, could I have a show of hands, how many of you recommend apps, or have ever recommended apps, to your patients? Okay, wow, fantastic. How many of you are involved with your clinical informatics or IT department? Quite a few. How many of you make purchasing decisions for your department, have the power? Okay, not enough. And then how many of you have digital navigators in your institution who help guide patients or clinics? Not enough either, just a few. So I think everything Dr. King presented, especially digital navigators and all the criteria we have to go through, is very important. One of the most common questions we hear is, "Which app should I use to do X or Y?" and it underscores how nuanced the conversation can be when you're assessing the landscape of apps. On conflict-of-interest disclosures: I'm not going to be talking about anything I've consulted on or been involved in creating. We all know that apps hold a lot of potential, because there are more mobile devices than mental health clinicians in the world, and that has great potential to serve underserved areas, folks who don't have access to therapy or any sort of psychiatric care. There's also been a movement for patients to be much more self-empowered and manage their own health; there are even societies dedicated to empowering patients.
We've also recognized that a lot goes on in a patient's life outside the clinic. In this graph from a paper on digital life data in the clinical white space, interactions with the health care system show up as just a few points over the course of three years; imagine all the data you could potentially collect in between and then summarize during sessions. And we all know there's been a lot of funding and a lot of potential in this area: the digital therapeutics and digital health space has grown quite a bit and is expected to continue growing. Despite some economic setbacks, we're recognizing that there's a role for apps to play. But it's also important to recognize that there are always hype cycles. I remember when we first talked about this at the APA; in my training 10-ish years ago (oh gosh, I'm dating myself), apps weren't even really on people's radar. We would see sensational headlines like, "Can apps cure mental health?" And we've seen the same Mad Libs-like template play out over the past decade: "Can VR cure mental health?" And this year it's been, "Can artificial intelligence cure mental health?" So it's important to recognize these patterns when evaluating technologies: what we do to evaluate and critically appraise apps can also apply to things like artificial intelligence, chatbots, or virtual reality. We have technologies at different stages of the hype cycle, but with apps we're pretty much at the plateau of productivity, where apps are commonly implemented in institutions like the Veterans Affairs. We chose Veterans Affairs examples because they're widely available.
I'm not here to represent the VA, but it's a very important resource because you can download these apps and try them out for yourself; they're not going to send data to the mothership. You can go to mobile.va.gov and just try them. Some of those apps do require an account, where you have to be a customer, a veteran in this case, but it makes sense for those apps, things like medication-refill apps or video apps, to require that. The VA Mobile Health Practice Guide, a free download, has a very comprehensive set of information. It's from the National Center for PTSD, but it applies to a lot of the other apps in their mobile app store. There are a lot of other resources out there; we'll just show a few more examples. This one is free to use, pam.stanford.edu; PAM stands for "pause a moment." It's more of a web app, but it works on mobile phones as well. Dr. King also talked about COVID Coach. I'll pass over this one because it's geared more toward telehealth services, but if you think about it, mental health apps don't necessarily need to be just for mental health: Zoom could potentially be considered a mental health app for staying connected to group or individual therapy sessions. Lots of studies have been done on these apps at the VA. Virtual Hope Box is one that helps people regulate their emotions and cope with distress. Stay Quit Coach helps those who wish to cut down or remain abstinent after a quit attempt; they recently refreshed the interface with a really spiffy new logo. And you can combine these with the other therapies in your toolbox: medicines, patches, varenicline, that sort of thing. IntelliCare was one of the earliest app suites to be studied, at Northwestern University, I believe by David Mohr.
They are now trying to disseminate it much more broadly through a more commercial venture, Avega. In their studies, they found that using apps with coaches is more effective than apps alone. There are other commercial apps out there, so if you get a lot of vendor solicitations asking you to try an app, and you've seen all the booths on the exhibit floor, you can ask: how does it look? How does it feel? Is it easy to use? Use the criteria Dr. King presented when you go to the booths downstairs. Even commercial apps that don't claim to treat a mental disorder are things you can consider trialing on your own. This one is Headspace, a very popular meditation app; not a medication app, a meditation app, although they have been branching out into medication management since they acquired Ginger.io. It's really interesting: just like health systems, these app platforms morph and try different things, and one thing to be mindful of is how much they change. That's the good thing and the bad thing. Lots of products exist, and there have been studies showing how apps just disappear from the app stores after a while. More recently that has plateaued, particularly since apps have become more complex to develop and consumers are demanding more complex functions. But this one is probably well known in the mental health app space: Pear Therapeutics. This is a company that unfortunately went bankrupt this year, but for the past decade they spent a lot of time trying to expand the definitions of digital therapeutics and make them something that could be disseminated more broadly.
A couple of days ago, I think, it was announced that this company, once valued at a billion dollars or more, was sold for, I believe, single-digit millions. So this goes to show you that when you choose apps and vendors, you want to make sure they're well funded, that they're going to stick around for a while, and that you have a backup plan in place. Personally, even with video technologies, we always had one video app and a backup video app, just in case we needed it, and then good old plain telephone. We've also heard how lots of apps are becoming conduits for access to medication support, something the New York Times has called "restaurant menu medicine," for things like depression, anxiety, and birth control. Those are things to at least be aware of, in case you get questions from the patients you serve. And there are issues these apps may face: Cerebral, for instance, is a telehealth provider app that has had some investigations; that's something to watch for and stay apprised of. Okay, so we've already learned how to appraise apps using the APA App Evaluation Model. There are some other terms you may hear in this realm, and it's interesting, because when you search for these terms, they each reach different areas of the digital health space. Broadly, there are digital health technologies; focusing a bit more, you have digital medicine; and the industry term for things that are much more vetted and can be prescribed is digital therapeutics.
The analogy I like to use is nutrition: nutritional supplements have a little more regulation, and oral pharmaceuticals have certainly a lot more regulation. This terminology is also evolving in the journal articles we read. mHealth, for instance, used to refer just to mobile health apps. Over the past few years we've also heard variations of DMH, not Department of Mental Health but digital mental health, along with digital mental health treatments (DMHTs) and digital mental health interventions (DMHIs). I'm throwing these terms out so that when you're looking for journal articles about DMHIs, or DMH in general, you'll have something to plug into PubMed. It also shows how diverse the app landscape is. The American Medical Association produces a really helpful guide if you're thinking about which apps or platforms to integrate into your practice or health system: a free-to-download report called "Accelerating and Enhancing Behavioral Health Integration Through Digitally Enabled Care." That's a mouthful, but it has some really nice graphs and diagrams inside. One of them is this particular graph, and what I like about it is that it shows steps along your patient's journey. You only see a patient for a slice of time in your clinic, but there are all these steps before and after where these apps could potentially help. For instance, everything on the left-hand side happens early on: patient intake. You can get a digital intake or screening tool, digital referral tools, and EHRs are an app themselves, a mega-app, right?
How do you manage your panel and population health? Health information exchanges, how you get data from other EMRs, digital social determinants of health tools, machine-learning-enhanced clinical decision support (CDS), digital prescribing platforms, telehealth for video, and digital medication management tools to see whether patients are continuing to take their medicines. All of these are different features you can pick and choose, something to keep in mind when you're designing your digital clinic. We talked about the app evaluation criteria; App Advisor itself, on the APA's website, is a resource you can use to see examples of how to evaluate apps. You'll notice it says the evaluation was completed in May 2020, and one of the drawbacks Dr. King mentioned is that you can't just rely on a one-time evaluation. That could potentially apply to medicines too, depending on the manufacturer and when it was evaluated, but it's even more true for apps. At some point we may see some sort of certification or accreditation body, but we're not quite there yet. MindApps is something Dr. King already mentioned. Another guide is called PsyberGuide, and it has a very similar issue: you need to check the date when the reviews were done. Over the past decade we've seen these app evaluation websites come and go, and at some point we'll find a way to sustain them. We've talked about how to evaluate these apps; one other thing I want to highlight is how in the world you get them compensated for.
There was a very good diagram in IQVIA's digital health trends report from a couple of years back that showed the different ways to get an app paid for, because you want to make sure the app continues and lives on so that it's supported. It shows four models. First is direct-to-consumer, where the patient pays for it. Second, you can bill it like a device; we've heard of some devices, combined with an app, that are billed like durable medical equipment. Third, a lot of companies have been pursuing a pharmacy benefit or pharmacy reimbursement. And fourth, placing it into a vertically integrated, value-based health system. This is where the VA comes in, and Kaiser Permanente, also a vertically integrated system, is another place where some of these apps may come into play. So if you're ever wondering how these apps get paid for, there's no single way; there are many different routes to financial sustainability. I'm going to skip over this because we've already gone through it, but you can check these out on the psychiatry.org website. And if you're interested in other tech events related to psychiatry: you've seen the Innovation Zone downstairs, and the Psychiatry Innovation Lab probably finished up about an hour ago. The American Medical Informatics Association is more data-science-focused; there's the ATA, the American Telemedicine Association; a new one I hadn't heard of until it was just announced downstairs, AMXRA, for extended reality, covering VR and AR headsets; and a whole host of other events, too. It's a really exciting time; we didn't have this list 10 years ago, let's just put it that way. Thanks again for your time, and I'll hand it over to Dr. Luo. All right. So you've learned how to assess apps, and it's daunting, right? Because you're not sure.
Like, you're not a tech wizard. I mean, you barely know how to turn on your phone, but I am going to teach you security issues on your phone. So this is an easy question. Does your smartphone or tablet require protection? Yes, right? Please don't tell me you have no lock, especially if you have no, any PHI or anything on it, or probably your IT from your enterprise has put something on it. You're like, do I really want them to have control? Yes, you do. Of course, they're not paying me to say that. I'm no longer involved in informatics. But anyway, but the bottom line is, you know, you're at risk if you expose your patient's data. So of course, if you can't figure out how to manage it on your own phone, how the heck is the patient going to figure this out, right? Well, no worries. I'm going to make you a security expert in just 15 minutes. So let's kind of review a few things, right? This is from 2012, which is 11 years ago. But you think about it. Things have evolved over time, right? The first iPhone, people were learning, whoa, it exposed tons of information, right? So here, Apple says, oh yeah, we're going to have, it requires explicit permission from the user to have access to contacts. Okay, that's good, right? Make sure the operating system, iOS, is secure. Give us control. So that's one element. But really, lots of data is sitting on your device just like, you know, a postcard, right? I mean, you know, this is very nice. It says, hi, I'm glad you've enrolled in our spotlight program, blah, blah, blah. But the point is, it's there, right? We know that, oops, let's see here, the slide's not, well anyway. So there's other problems, which are too many versions of slides. So the other thing is, there's malware. We all know about this. And if you come to my other workshop, you'll learn about this. But malware, as you know, is these bad agents that use software to take control of your device and steal information. 
And if you're not a black hat or white hat hacker, how the heck are you going to prevent that happening to your device, much less your patient's device, right? And there's reason to be worried. So for example, you may not have heard this, but there was something called the XcodeGhost malware. What happened was that apps in the Apple App Store were compromised, mostly through developers in China who were trying to download Xcode, which is the platform used to create apps. The download took forever, so the hackers created a sort of side mirror that people would be redirected to without knowing it. And then it would infect the apps they built, so the hackers could gain full control and all the data on the device. So you can imagine, right, you recommended an app to your patient, and you go, uh-oh, it's on the list. No, no, fortunately, most of these were not apps that had mental health information, but it's damn scary, isn't it, right? You have no idea what people can do with your device. So that's the problem. We know that there's lots of data on your device. This is from one of John Torous' slide decks. We like to share in the tech field. And so there's social network data, behavioral objective data, self-reported data, active, passive, there's all sorts of stuff that our device has. I mean, heck, Apple even touts, as well as Android, that it can help you with your health. Did you close your ring today, right? Or, oh, I noticed, John, you only slept six hours last night because, you know, you're really playing Angry Birds way too late at night, you should cut that out. But the other part of it is, back in the day, the global positioning system is great, right? We know where we are. Although, I swear, it still has trouble around tall buildings. I barely made it here. I got lost from R&G Lounge, trying to walk to the convention center, and it turned me the wrong way.
And I'm thinking, dang it, I knew this was going to happen. I gotta quit leaving the convention center. But anyway, the GPS, as you know, is more accurate, which is great, right? I mean, come on, does anybody here remember what a Thomas Guide is? Yes, okay, thank you. I was afraid I was gonna be the last one. But, you know, everybody now uses Google Maps or Apple Maps or whatever map system. In fact, these things have taken over, and it used to be that TomTom and other makers of dedicated GPS devices were the way to go, but no, those are passe because it's all on your phone. But the problem is, now that data can be used for good and for bad, right? So there's geotagging, right, which is great. I mean, come on, you can say, come on, John, like, you know, I go traveling to all sorts of places, so I tag my photos with my location, and that's perfectly fine because then it tells me where I took that picture as well as the dates and stuff like that. Okay, fair enough, fair enough, I agree. Sometimes that tree in the background looks like it could be anywhere, so it would be nice to have it identified for me, so I like that. But you need to know how to track and block permissions, because simple apps that you think are just doing simple things, like the flashlight apps that existed before Apple built a flashlight into the phone, were basically sending information to the developer about your location, which they had no business knowing, and so you need to know that these things happen and how to turn that off. So please tell me, well, maybe I won't ask for a raise of hands, but please tell me that you've at least looked in the security settings on your smartphone, yes? Okay, well, maybe then I could just stop and hand it over to Jay Shore, because then you already know everything, or do you think there's more you should learn? All right, I'm gonna assume the answer is yes. But anyway, it's very simple to adjust your smartphone privacy.
All you have to do is look at location services, and you know, yes, maybe Redfin needs to know where you are because you do want that data on that house you saw for sale as you drove by, so yes, you want the app to know that. And you notice here, this is my phone, by the way, notice my photos has that off. Why is that? I know myself too well. I'm very bad at remembering to turn on and off location storing on my phone, so therefore, when I take a picture of my house, so that way people know what my house is when they're coming over, yes, I could conveniently let them know the actual address tagged in the photo, but then if I, you reuse that photo and post it to like Facebook or something like that, and who knows where else it goes from there. Well, this is why I don't tell people that I'm going on vacation until after I'm back because I don't want people to see, oh yes, John's on vacation, here's his house, here's the location, I'm just gonna break in because I already know too much about him. Anyway, that scares me at night, I don't know about you. So one thing you can do is, because now it's privacy and security, so you can see it goes from that, let's see here, so you can see here, we start off here, the next screen shows location services, and then here, there's system services, and the thing called significant locations is important. Why is that? The other day, the phone said, oh, you're heading over to the clinic next, after your half day over on the consult service, and I'm like, how did it know that? That freaked me out, I don't know about you, but if Apple knows where I'm going next, I'm not too happy about that, but maybe I'm kind of old school and it's time for me to retire. 
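To make the geotagging point concrete: a photo's EXIF metadata stores GPS coordinates as degrees/minutes/seconds rationals plus a hemisphere reference, and converting them to a decimal latitude/longitude you can paste into a map takes a couple of lines. A minimal sketch; the sample values are hypothetical, and in practice you'd extract the tags from the image file with a library such as Pillow:

```python
# Sketch: convert EXIF-style GPS data (degrees, minutes, seconds plus an
# N/S/E/W hemisphere reference) into decimal coordinates. The sample values
# below are made up; real code would read a photo's GPSLatitude/GPSLongitude
# EXIF tags, e.g. with Pillow.

def dms_to_decimal(degrees, minutes, seconds, ref):
    """EXIF stores GPS as degrees/minutes/seconds; maps want decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -value if ref in ("S", "W") else value

# Hypothetical GPS tags from a geotagged photo.
lat = dms_to_decimal(37, 46, 48.0, "N")
lon = dms_to_decimal(122, 25, 12.0, "W")
print(f"{lat:.4f}, {lon:.4f}")  # precise enough to identify a single house
```

Deleting those few GPS tags before sharing a photo is all "stripping the location" amounts to, which is why the Photos location toggle matters.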
So significant locations, you can turn off, so that way the phone is not trying to predict where you're gonna go by reading your calendar, reading your location, tracking your everyday motions so it knows what your pattern is, but maybe that's just me, maybe you find it to be an added plus and so you wanna leave it on, okay, not gonna argue with that. But one thing we can do is, believe it or not, finally, finally, Apple, and I have some stuff for you Droid users, but has given us tools to look at the privacy that the apps basically potentially expose. So has anybody actually looked at this before, the app privacy report? Nobody, well now you know. So dig deep into the system settings, looking for the app privacy report. So you can see here, it goes on, it shows you basically data and sensor access, network activity, website activity and most contacted domains. That's a ton of information about what that app is doing on your phone without you knowing about it. So you could turn it on or off, actually it's actually probably already off, so you have to turn it on, because otherwise if it's not collecting stuff then you don't know what your apps are doing and you could see here that right now my music app was showing access to the library, there's some messages, et cetera. We'll go a little bit more to see what this shows us. So you could see here, for example, yes, I play way too much games, I really shouldn't be exposing this. By the way, this is all confidential, right? You're gonna let my wife or my boss at work know how much I play wordscapes? Or you use ESPN. Wow, I am doing no work. Outlook is only third on this list. Oh, this is embarrassing. Anyway, you can see though that these apps are accessing the network. When you look below, you can see which websites are going to. Oh, yes, Blizzard. Good God. Well, yes, one of my cousins said, hey, John Diablo, is it five or whatever, is coming out soon. You gonna get it? I'm like, no, no, no, I got work to do. 
I better not buy it, because I'm gonna be in trouble. Well, apparently, I've been accessing it a little too often, as you can see here. Then you can look at most contacted domains. You can see here the iTunes, that's fine. The next one is Google, which is probably Google. There's Google API. So you can actually see all the different domains that the apps are accessing. So you could see or get a sense. If some of these look kind of funny, like you don't know what that site is, then I'd be concerned that maybe the app is pirating your information and selling it to the black market. Because data is the most important thing. Forget the device. The device is $1,000, but your data is worth a lot more. So I decided to try one of these mental health apps out. So I decided for now to use Wysa, W-Y-S-A. That's one of those AI-guided, sort of self-help kind of apps. I decided to start a new chat. So it asked me, hey, John, you haven't accessed us in a while. Over the last two weeks, how often have you been bothered by any of these following problems? You can see it's clearly screening me for depression. And of course I told them, eh, I'm doing okay. I was lying, but that's okay. So now I'm going in to see what is Wysa accessing. So you can see here, I've said, yes, you can refresh. Yes, you use cellular data. No, don't bother me. I don't want any notifications. And also said no to Siri and search. But then you can dive deeper and see that these are the domains contacted by the apps, right? So you can see here, many of them seem to be, so I know that touchkin.com is actually the website of the app developer, so that's legit. Then the I-O for Wysa seems like input, output, so they're sending stuff back and forth from the server. That's okay. Interestingly, Google is involved in this. Hmm, I'm not sure what that means. But anyway, at least I know. Now you have data that the app is accessing stuff. 
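If the built-in report feels like gobbledygook, iOS also lets you export the App Privacy Report data (Settings, Privacy & Security, App Privacy Report, then the export option) and tally the contacted domains yourself. Here is a rough sketch assuming the export is newline-delimited JSON; the field names (`type`, `domain`, `bundleID`) and the sample records are illustrative, so check an actual export before relying on them:

```python
import json
from collections import Counter

def top_domains(ndjson_text, n=5):
    """Count contacted domains in an exported app-activity log.

    Assumes one JSON record per line, with illustrative fields like
    {"type": "networkActivity", "domain": ..., "bundleID": ...}.
    """
    counts = Counter()
    for line in ndjson_text.splitlines():
        if not line.strip():
            continue  # skip blank lines in the export
        record = json.loads(line)
        if record.get("type") == "networkActivity":
            counts[record["domain"]] += 1
    return counts.most_common(n)

# Hypothetical excerpt from an export (bundle ID and records are made up).
sample = """\
{"type": "networkActivity", "domain": "api.touchkin.com", "bundleID": "bot.wysa"}
{"type": "networkActivity", "domain": "www.googleapis.com", "bundleID": "bot.wysa"}
{"type": "networkActivity", "domain": "api.touchkin.com", "bundleID": "bot.wysa"}
{"type": "access", "category": "photos", "bundleID": "bot.wysa"}
"""
print(top_domains(sample))
```

A domain you don't recognize near the top of a tally like this is exactly the "kind of funny" signal worth investigating.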
Now if this seems like too much gobbledygook and too hard to access, there's actually an app called App Privacy, which I used. And I found it actually kind of helpful. It takes some of that report data and makes it a little bit more readable so you understand it. So again, I said, let me see what the heck Wysa is doing. So it specifically shows the different IOs, the touchkin, et cetera. So these are all the same ones. Oh, it looks like it's also checking the iTunes store. I don't know why it's doing that. And Amazon, whoa. Hmm, now I see a conspiracy between Amazon, Google, and Apple. What are they doing with my data? Oh well. Anyway, what's also nice is the app shows what else has been accessed. Photos, camera, contacts, microphones. You dig deep, because you never know if you've downloaded an app that has a Trojan in it and it's secretly listening in to all your conversations with your patients, right? Possibly, probably not. Hey, this is now one way you can find out. So if you get uber paranoid and think your phone is jacked and therefore in trouble, you could use Safety Check. You could also use Lockdown Mode. I won't get into it, but these are probably measures of last resort. You can render your phone practically as locked down as if you were a member of the CIA. The phone won't do much, by the way, other than make phone calls, but at least all your data is not going everywhere. So these things exist. For Android, they do say that they will scan for malware in apps on the Google Play Store. They say they're also gonna help you manage your privacy. Like, for example, will this app be able to take pictures and record video? So you can see here it says there's a privacy dashboard showing which apps can access your data, which devices have permission to control different settings, how your data is shared, as well as who has access to these things.
Now, I don't have a Droid device, so I couldn't take any screenshots for you, but I'm gonna assume that Google's a benevolent company and they really mean what they say they're gonna do, right? Anyway. And of course, you can also look in the data safety section and figure out the details of what the app is doing. So if you have an Android device, come tell me at the end of this presentation; I would love to have screenshots for the next one. But really, even Wysa said, and this is on the Google Play Store, right, that this app may share data with third parties, that it does collect personal information, health and fitness, et cetera, and it does say that you can request that your data be deleted. So that's actually useful to know. I haven't figured out how to do that yet, but I'm sure if you go onto their support website, you can ask them how to delete that kind of data if you're worried about it. I probably need to, when I think about it, because one day I decided to test it out, and instead of answering that I was fine, I said I was depressed and suicidal, and I swear, I got worried that some welfare check was gonna happen to me, so I probably should delete that data by now. Google also has apps like this one, the Privacy Dashboard, which will show you, again, microphone usage, which apps are accessing your privacy permissions, and so on. So again, it's possibly a useful app for you Android users to make sure that your privacy is well-maintained. And of course, your data's not just there, but in other places too, so there is one app called Jumbo, which is available for Android and iPhone, and of course it does have premium features, but the basic ones are pretty decent, and if you haven't used it, I highly recommend that you do.
So, and again, I don't own any stock in it or anything like that, but what it does is it will comb your social media sites like Twitter and LinkedIn and Instagram, or your Alexa history, and it'll say, John, your old Alexa recordings are stored on the server, do you really need those up there? Alexa, please play Gangnam Style or something like that, embarrassing things you want to remove, so you should probably have it do that. And you can see here, they noticed that this North Dakota contact tracing app actually violated people's privacy, and if you weren't using Jumbo, you wouldn't know that. And you can see here, it also searches the dark web, so it sees whether or not your credit card or social security number is available online. I think I might want to know. Anyway, the last thing I'll say is apps are great, but they can be misused. So this is a slide that John Torous gave me. There was one app that was used to predict your blood alcohol level, and just like anything, right, a sword is good for cutting, but can be used maliciously. College students were using the app to see how high of a BAL they could get, and this obviously led to, I think, the app being pulled from the app store. Anyway, thank you, and we'll have Jay come up, right. Thank you. You're cutting me off at 15 minutes, is that the threat? I can talk for a long time. And I'm all that stands between you all and a nice San Francisco dinner or a happy hour, so thank you for being here at this point of the conference late in the day. So, yeah, I'm gonna talk a little bit more at the macro level, a little bit more about some population issues, but hopefully this will also stimulate your thinking about working with individual patients with apps and technologies. And so I'm just, let's see, okay, yeah.
So I, you know, I wanna talk about frameworks and bias in apps, and how to think, at both the level of a group of patients you may be working with and the individual patient level, when you're prescribing apps, about what you may wanna give some thought to and be aware of. There are a lot of reviews out now about different approaches and frameworks for looking at apps and looking at biases. And I just cited these two reviews, plus some work that we're involved in in the VA in adapting an app to a very specific population. You can see this review by Ramos looking at diversity elements and trying to think through how to match patients with apps. The categories and domains they came up with were access, content, appearance, and cost, which are things to be aware of. And then this other recent review by Katawala looked at three components for cultural sensitivity in app development: representation, adaptation, and accessibility. So I think the three areas when working with a patient or a group of patients are to really think about digital disparities, the cultural fit with the background of the patient you're working with, and then communication and what's really going on, so how it impacts the clinical relationship. So obviously, we live in this era of promising technologies, apps included, and obviously when used for good, as John says, apps can reduce barriers to access to care, they can customize to the individual patient, and they can address things, particularly apps like care coordination and resources. But I think also we've had much more awareness, I think in a positive way, of the impact of social determinants of health and healthcare disparities on populations. And particularly, I think COVID has been a huge lesson, because especially in the beginning of the pandemic, having access to digital medical care was in some cases a difference of life and death for populations.
And so we had both an increase in access to mental health care because of technologies, but also, in a number of populations, a widening of the disparities in comorbidities and mortality. And particularly the concept of the digital divide: I think in a number of populations and communities and individuals, that actually increased, where patients were unable to access care because of a number of issues. I guess the positive thing is, it's sort of like the Leonard Cohen song: there's a crack in everything, that's how the light gets in. So I'm hoping as a field that we can take some of these lessons from COVID, right? There have been some articles and writing on this, and I think there are many different frameworks, but I kind of think of four basic things that someone needs to access technological and digital mental health care, including apps. So, broadband access, right? That's sort of a no-brainer. I work with a lot of rural populations, and when I select and prescribe apps, I'm looking for apps that can run without an ongoing connection or having to download from a server, right? For some of the patient populations I work with, if the apps have to be constantly updated or in touch with a server, they're not gonna be able to use them. So to use digital technology, you need bandwidth, and you actually need some kind of updated technology to run it. And I have patients that use burner phones or even burner smartphones, right? Or they can't afford the latest. My iPhone version is getting clunky, but I dread spending $1,000 to upgrade out of like an iPhone 10, right? But at some point I can go out, and I have the resources to get the latest iPhone, but getting the smartphones needed to run these apps is a barrier. And then, I think we've mentioned tech literacy. And it's not just, I mean, I think all of us, right? Even today, I was learning things, I saw us all on our iPhones and I was like, oh, wow, there's this stuff.
And this crowd is generally probably considered tech literate, but this stuff is really, really complicated when you start getting into it and running the app. And then honestly, you need real tech support. And real tech support does not mean submit a ticket, in my mind. You need something real-time, and maybe AI assistance will actually help us get to that. So those are four things to really consider when prescribing an app to the patient in front of you: what barriers they may have. And usually it's not all or nothing. I think often when you have a good app or a good piece of technology, you can figure out ways, but you may need to get creative to reduce the barriers to accessing that technology. So again, that's the digital divide issue. So on the communication side, right, we all remember, now in the DSM-5-TR, right, is the cultural formulation, which I think is a really nice conceptual way to think about how we interact with patients. You've got a clinical interaction, you've got a provider in the room and what they're bringing to it from their backgrounds, including their culture, socioeconomic level, and the microcultures they belong to. You've got a patient in front of them and their background and the cultures they belong to. And then you have this interaction or filter they're communicating through, which is Western medicine. That's the model or paradigm that most of us are using to interact. The cultural formulation really asks us to pay attention to what the patient's bringing in, what the provider's bringing in, their different explanations, and this framework of Western medicine that we're doing this negotiation across. John, I'm gonna use, he's got a super neat pointer, we're gonna try this out. Wow, so again, right, here's a maybe simplified diagram of the DSM-IV cultural formulation. Well, now, if you think about it, especially in the pandemic, it's not only Western medicine that we are interacting in a filter with.
We are interacting through some kind of technology medium, right? And it could be as simple as voice, audio, a telephone, or video. It could be the app, which gets much more complicated, texting, a patient portal. And we need to remember that the technology that we develop has been developed in a Western framework. So we're always working cross-culturally, right? You're never perfectly aligned with anyone, but if you're working internationally or in other cultures, you're certainly dealing with different components of technology. And, you know, engineers generally make and develop technology, right? And then users use it. One of my favorite examples, a very simple one: we did a community health program, and we used a step counter, a non-iPhone electronic step counter, in a diabetes-obesity intervention. And so we offered prizes at the end of the week, and there was this sort of group, this mafia of about three older grandmothers who were coming in, and they were putting up 30,000, 40,000 steps a day, right? I mean, they were just skyrocketing. They were cleaning up week after week. And so they were finally confronted, right? And it turned out they would take the step counter and put it on the dog's collar and send the dog out to crank up the numbers, right? So it's a simple example, but users never use technology like we think. And that's true for apps when you start thinking about, okay, I'm prescribing this app, and you can say things like, hey, this app isn't real-time, so if you get suicidal, you really need to use other mechanisms. You can say that and you can document that so you're medicolegally covered, right? But you have a relationship with this patient. If this app has this mechanism to report that, what are you really gonna do when you get user adaptation? So here's my cute take on this, right? You have technology and culture, so you gotta think of the texture, right? Get it, get it, dad joke?
Thank you, good dad joke. My first of the day, because I've been away. But really, it's sort of thinking now that we are interacting with patients through these mediums, it does change the communication. And it is significant, and that gets to the third point: every technology we use has different communication dynamics. You know, there are books now that collect these auto-correct errors, so you see this one here, LOL. And I've used that, you know. And there was a study recently that came out on emojis, and the study indicated that people are cross-communicating with emojis; we don't have shared meaning in emojis, right? And then you can see the grinning faces across our different platforms. So what does that really mean, right? And so, again, when you start using an app, what are the assumptions you're making about the communication processes? When I'm using a technology, I always think of three areas to consider before I actually start using it with a patient. There's the administrative aspect. That's the legal, regulatory stuff. And we've been through a lot of these and how apps have these components, privacy, security, tracking. There's the operational. That's, how do you embed it in your practice workflow, right? So, you know, you really need to have an informed consent process. You know, Dr. Liu, I'm prescribing you this app for depression, this is how we're gonna use it. Here's the risk, because you may not be able to get ahold of me on the weekend when you're telling the app you're depressed. And, you know, that's just the way it is. So I'm gonna have to go through operationalizing it, and then finally thinking through, how does this really impact the doctor-patient relationship, both positively in terms of a connection, right? I also think there's a symbolic thing if you prescribe an app to a patient.
I've had patients tell me this. They sort of kind of felt my presence through the app. Like there was someone there, right? There's a transference, right? They may like, oh, Dr. Liu prescribed this app and so there's a connection with you, which could be really positive, right? There could be negative regard too. So trying to just think through it. Again, those are sort of distilling down the three issues I think through when I prescribe. How are there digital divide issues for this patient in terms of access, bandwidth, affording the app, using it? What is really going on? Are there some cultural issues I haven't really thought through about how we regard and use the app? Then how is that affecting our communication? So I think Steve already talked, but I think he wants to talk again, right? I don't see your names on the slide. No. I think we're out of order, that's fine. It's a blip in the matrix. So I think now we're going to move to Q and A. Did you want me to stay up here or was he, I think he's, are we doing, how are we doing the Q and A again? You can have a seat. I'm being told to sit. Thank you. Are there any live questions from folks? And then we'll take a couple online. Hi, Andrew Chocko. Thank you so much. That was really helpful. Great to see you, Steve. So quick question. I tried to catch it. You said that there was a site that could tell if your data was on the dark web and I, was that the Jumbo? That was Jumbo. That was Jumbo, okay. Was there something else we should be looking at to figure out where our data's going? Yeah. The problem is there's too many tools. I think there's actually a credit card. I think Discover says that they'll give you a free, as a member, you get a subscription to see if your data's on the dark web. Jumbo will also, although I'm not sure if that's a premium or a standard feature. I think you may have to pay the, I don't know, what was it, $5 per month or something like that. 
The problem, like anything, is there's just too many of them out there to know what to use. But I think it behooves us to use something that we're comfortable with. You know, just for example, I'll be doing a talk later on physician reputations. It may be worth it, if somebody posted bad reviews for you, to pay $5,000 to Yelp to kind of move that down, or maybe not. Like in my case, I said, meh, that's fine. Fewer patients for me. I'm too busy and in high demand anyway, so that's fine. So it depends on how you look at it. I understand Discover gives you 3% back every time they sell your social security number on the dark web, so I think that's a good way to get mileage and points at least. Thanks again, that was fantastic, by the way. Really appreciate all the information. There is one site I just wanna mention too, called Have I Been Pwned, P-W-N-E-D, lots of nods in the audience. Y'all have used it before, and it just shows you where your emails or other identifiers have been exposed. I don't think I wanna know. Like John, I don't wanna know if I'm on the dark web. Actually, no, iPhone does have in there a list of your passwords that have shown up in security leaks, and I've just been too busy to sit down and change them all. I probably need to, but you know, it makes sense, right? There's just too many passwords to remember, so get a password manager, you know, like 1Password or LastPass or whatever. Use that, although my dad, I still see the little notebook, and he writes each password in there, and I'm like, oh, dad, please, but anyway, he claims it's stored someplace safe. I have no idea. Hi, Al, thank you again, that was wonderful. I was taking a lot of notes during that.
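For the curious, Have I Been Pwned also runs a Pwned Passwords API that works on a k-anonymity model: you SHA-1 hash the password locally, send only the first five hex characters of the hash, and match the returned suffixes on your own machine, so the password itself never leaves your device. Here is a rough sketch of that flow (error handling and rate-limit courtesy omitted):

```python
import hashlib
from urllib.request import urlopen

def split_hash(password):
    """SHA-1 the password locally; only the 5-char prefix is ever sent."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def count_in_response(suffix, response_text):
    """Match our hash suffix against the 'SUFFIX:COUNT' lines the API returns."""
    for line in response_text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0  # suffix absent: password not in the breach corpus

def times_pwned(password):
    prefix, suffix = split_hash(password)
    # The server only ever sees the 5-character prefix.
    with urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        return count_in_response(suffix, resp.read().decode())

# e.g. times_pwned("password123") would query the live API and return
# a breach count; a nonzero result means change that password.
```

This is the same kind of check the iPhone's compromised-password list performs for you automatically.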
So I'm Amit Gupta, I'm a child psychiatrist and a military psychiatrist, and I'm interested in this topic, but I just wanted to make a point that's sort of on the fringe: I think people can also be involved, potentially, in actually developing applications themselves based on their needs. A case example for me: we developed an app with DHA, the Defense Health Agency, which, if people don't know, is sort of the new medical command for every service in the military. So when you're talking about VA apps, for example, DHA is also developing apps and other things that are federally funded. They're free, no subscriptions, right, so they have all the HIPAA compliance and all that as well. So I worked with them a few years ago in San Antonio to develop an app for local usage to increase medication adherence, basically. We were getting all these data points saying, hey, people are not staying on their medications more than 30% of the time, like, they're not coming back. How do we improve this?
So anyway, we looked at the information out there, did the cost market analysis, looked at what was available. We didn't like a lot of the privacy stuff, and we didn't like some of the other limitations that the apps had, so we had the DHA team develop an app for us. It's actually still in the app store; even though it was used locally, they never took it down, so DHA Medication Adherence is the app. So we used that locally to look at how to improve things, and our workflow was: hey, we're gonna push this out to our primary care guys, and we're actually going to sit down with the patients, because they don't know how to use this. We're gonna sit down with them, and one of our techs will actually show them how to input the data, where to put in messages for the doc, things like that, right, and set up the reminders so they remember to take the medications. So we were able to do that, and we were lucky enough that we had that availability, so that was very helpful. And now they have a new application that they're developing called Doseform, which does most of what we were doing already, so we've kind of stepped away from that app. But anyway, I'm just pointing out that if you have the availability, or you can make a case, depending on which advocacy group you're with, you might be able to subcontract, or have your IT resources potentially develop something for you, instead of trying to hunt down everything you need. Especially if you can say, hey, I went through this, and there's really nothing that meets my need, can my institution help me actually develop a product that they can either monetize, or that will save them a lot of costs because it's serving a need and reducing healthcare costs overall. So just a side point, which I know isn't necessarily your topic, but it's something to think about.
Hello, I'm Bryn. I'm a clinical researcher at the University of Vermont, more on the computational and data science side, and something I've been thinking a lot about is this: in our research labs we're developing a lot of research-tool apps, and we're trying to build that feedback loop with the clinicians, with the different need groups in the area, and with institutions that might want a custom app. But it's beginning to reach a kind of saturation. Tools like MindApps really help by giving the lay of the land of what apps are out there, but one thing we've been thinking about a lot is that the more custom apps we build, and the more niche each app gets, the more we oversaturate that landscape. So I'm curious how you all see us bridging this feedback loop, or bringing all these siloed apps together. I'm on a grant just to learn what the need groups and need areas are, and I feel like there's almost a barrier between the developers and the clinicians who need the tools. So I'm curious, what are your thoughts on how we can build that better, so we're not building a million apps? Thank you.

Well, I can take that. Thank you so much for the question. I think what can be helpful is holding focus groups with clinicians: talk to them and ask them what their needs are, and then iterate based on what's already out there. Of all the things that exist, what are we missing? You have your focus groups, you talk to the clinicians, you fill in those gaps, and you just keep iterating on that. That's one thing I would say.
I would add that the clinician's input is helpful, but so is the user's, and I think one of the things that's often missing is attention to the user interface. That's really key, because something has to drive the stickiness. I mean, come on: Pokemon Go is such an awesome game that people have fallen off cliffs trying to catch Pokemon, so you know they're really invested. The interviews and the focus groups are going to be really helpful here too. One of the biggest pain points I've heard, and one of our panelists may have said this, is "I had to log on to yet another app" or "check yet another dashboard." It can be a lot of burden on the clinician to juggle multiple apps and websites, so you want to minimize that. And if you need to prototype, there's no shame in paper prototypes, and no shame in Figma or Adobe XD prototypes. They can get you a lot of the feedback you want much more quickly than actually coding an app and spending months engineering it.

Something at another session made me think of two comments. One was that the human brain is wired so that we're much better at adding things on than at actually reducing things. People say, let's do another app, let's add another function, rather than taking lessons learned and being reductionist; but in our busy healthcare system, reduction is what's needed. The other question that came up was: how do you get clinicians to change? One of the panelists said that they're so busy that unless it's a pain point, they won't. For researchers, for everyone: the market is so saturated that clinicians actually have to be feeling pain, I think, before they're willing to change and take on another app.
So if you add another app they have to go to, unless it offsets some acute, hand-on-the-stove pain, I think it gets really hard to achieve widespread adoption. Those are two meta-level things we need to be considering when we do this.

Before we go on to the next question, I'll add something concrete. I consult for industry on user engagement, and exactly as you describe, there was a study done by one of the larger PE firms showing that once you cross a threshold, I think it was around 120 seconds, adoption rates just plummet. The utility of whatever clinical assistant tool or app you're hoping to use needs to be three or four times higher; the real sweet spot is for whatever you're hoping to deliver to happen within the first 60 seconds.

On that note, one of the online questions is: are there apps that can be used to do psychological testing?

It depends on what part of psychological testing. We know there are simple screening tools, yes, but full, ADOS-type psychological testing, I don't think so, unless it's really just a platform to deliver the test with a remote examiner. I'm not aware of any.

The only one I can think of is from a colleague of mine. I have no financial motive here, but he runs telepsychiatrystartup.com, and his clinic, EnvisionADHD.com, uses some sort of app for ADHD psychometric testing, but I haven't really looked into it beyond that. There's certainly web-based psychological testing available, so while I haven't looked into it in detail, I can't imagine that some of it doesn't have a mobile-facing version.

Hi, I'm Seth Powsner. Do any panelists have experience with patients making the mistake of using an employer-supplied phone to run their app, and thus running afoul of the fact that it's the employer's phone, and the employer can monitor whatever they want?
So the question is about cases where patients are using their employer-supplied phone; this was a risk with desktops as well. I'm sure there are. This is where IT comes in, right? First of all, the fact that you can install anything on it is amazing, because mine is pretty locked down; at least my work computer is, which is why I basically bring my own laptop in, it's just easier. But certainly, yes: your employer's IT will be monitoring access logs, so it's not good to use your work phone to go to unsavory websites and things of that nature, because they know. And I'm dating myself here, but I grew up with TVs that had knobs on them. Seriously, I moisturize my face. The knobs would just change the channel, and that's all you could do, but with a lot of these phones, who knows what's going on inside? So a lot of it is education: educating patients that they may not want to use that device because their data might be transmitted to the mothership. Thank you. All right, please join me in giving the panelists a round of applause.
Video Summary
The video features a panel discussion on mental health apps, focusing on their evaluation and integration into clinical practice. Dr. Jacques Ambrose introduces the panelists, all experts in psychiatry and digital health, including Dr. Darlene King, Dr. John Liu, Dr. Steve Chan, and Dr. Jay Shore. The discussion begins with an overview of the vast number of mental health apps available and highlights the challenges in assessing their safety and efficacy due to inconsistencies in regulations and evidence. Dr. King presents the APA app evaluation framework, a structured method for clinicians to assess and recommend suitable apps. The framework includes assessing privacy, clinical evidence, and usability, with a focus on the patient's needs. Dr. Chan discusses the potential and current state of the digital health market, noting the various types of digital health technologies and emphasizing the importance of a thorough evaluation of commercial apps. Dr. Liu covers smartphone security, offering tips on managing privacy settings to safeguard personal information. Dr. Shore addresses the interaction of technology with patient care, emphasizing the need for cultural sensitivity and understanding the communication nuances technology introduces in clinical interactions. The session concludes with a Q&A, discussing issues such as app over-saturation and patient education on technology use. Overall, the discussion underscores the promise of digital health tools in psychiatry, while highlighting the careful consideration required in their implementation and patient engagement.
Keywords
mental health apps
clinical practice
psychiatry
digital health
app evaluation
safety and efficacy
APA framework
privacy
usability
smartphone security
cultural sensitivity
patient engagement