Applying Quality Improvement Methods to Implement Principles of Collaborative Care
Video Transcription
Hi, good morning. Thank you for coming to join us early this Tuesday morning. It's nice to see everybody. So our workshop today is around applying quality improvement methods to implement principles of collaborative care. My name is Amy Bauer. I'm a psychiatrist and health services researcher at University of Washington. With me are my colleagues, Dr. Jennifer Erickson, Dr. Anna Ratzliff, Dr. Denise Chang. So we're all from University of Washington and we're part of the integrated care training program there. We've been teaching quality improvement methods to our fellows as well as to practicing psychiatrists for the last five years. So what we're going to bring to you this morning is a very quick overview of some of the things that we teach during a year-long course that we give. We hope that it will give you a really strong foundation in quality improvement methods as well as some ideas around how to apply it. We're going to use examples from the integrated care and collaborative care setting, but we think that these tools are applicable to really any healthcare practice setting. So let's see, getting started, the way that we're going to organize today is that I'll be providing a quick introduction and overview around quality improvement. We'll be focusing on the model for improvement, which is one framework for quality improvement that applies really readily and I think is easy to wrap our heads around. So that's what we're going to be walking through today so that you learn those key concepts. We have a couple of activities that we'll be doing around measurement and developing aims and we'll have time for questions and comments. I hope this will be interactive throughout the session. So we're really interested also in hearing from all of you. All right. So to kick us off, I'm going to be talking a little bit about the overview of quality improvement. All of our slides are uploaded. Some of them I'm going to be going through very quickly in the interest of time, but they'll be available to you as a resource to reflect back on. I wanted to present a definition of quality improvement from the Agency for Healthcare Research and Quality that I particularly like because it reminds us that QI is a framework that we can use in a systematic way to improve the ways that we deliver care for patients. It also reminds us that quality improvement really focuses on processes and that those have characteristics that can be measured, analyzed, improved, and controlled. Quality improvement is an ongoing effort. So the idea of continuous quality improvement or kind of making iterative changes over time in order to get to your desired result. And that sometimes variations can be undesirable. And so reducing variation in practice can help improve outcomes. And the other thing that's included in here is that sustained quality improvement really requires buy-in from not just the, you know, maybe the clinician or the medical director or program lead, but from other people in the organization. So you really need to think about all the people who might be affected by a change in a workflow. And that's often not just you as the clinical provider. This is a little schematic that we use in thinking about how we use metrics for behavioral health in a collaborative care setting. And the reason it's organized this way is often if you're launching a new clinical program, you will want to up front think about what are your metrics for success. 
And start with an idea of what are those measurable characteristics that will help tell you if your program is on track in meeting its goals. So selecting metrics that reflect your program's goals, or in this case we'll be talking a little bit about collaborative care principles, is very key. Reviewing those, then figuring out what are the areas where you're not hitting your goals and identifying those areas for improvement. And then finally applying some of the QI methods to drive those tests of change. So today we'll also be organizing what we're talking about according to this framework. And after the quick introduction, we will be doing some work on measures, some work on identifying areas for improvement. And we'll send you home with some handouts with some additional tools that you can then apply in your practice settings. I'm going to give you an analogy to what we're going to be teaching for the model for improvement. Which is that quality improvement is not unlike planning a trip. You need to know where you're going to go. You need to have some idea about if you're making progress along the way there. And then you need some strategies to get there. So we all made it here this morning. We knew we had to get to Moscone Center. But not just Moscone Center, we had to get to this room at the right time in order to arrive here today. And so all of you probably used some tools like a watch or a phone or Google Maps to figure out where you were going. Maybe you used the meeting app to know what room you wanted to be in. And unless you happen to live in San Francisco and walk down here, you probably used several of these strategies to get from your home all the way to here today. So if you can wrap your head around that, then you can wrap your head around the model for improvement. Because these steps map directly onto the questions that the model for improvement asks. Number one, what are we trying to accomplish? Number two, how will we know if a change is an improvement? Number three, what change can we make that will result in an improvement? So where are we going? How do we know if we're on the right track? And what are we going to try to get there? These map directly onto the things we'll be talking about today, which is developing an AIM statement, measurement, and selecting interventions to test out. So conceptually, that is the model for improvement. If you can figure out how you got here this morning, you can figure out quality improvement. But we'll get into some more details because where the rubber hits the road is really applying this in real-life clinical settings. And that's where all of the fun and challenges come up. So AIM statements. This is answering the question, what are we trying to accomplish? This identifies the destination. Where are we headed? The more specific you can be, the better. So we talk about SMART aims. Many of you may have heard of this in other contexts. SMART goals are often used in education settings. And the acronym stands for Specific, Measurable, Achievable, Relevant, Timely. So like a destination, if you just said, I want to go to the beach, that would be pretty open-ended. I want to improve my patient outcomes. That's pretty open-ended. The more specific you can get, the more likely you are to get to the place you have in mind. So some examples that we've put up here show that these can be short and simple, but specific. 
Over the next three months, we will increase the proportion of follow-up visits for depression where a PHQ-9 is completed from 10% to 50%. This could be a very important aim for implementing measurement-based care. Can't do measurement-based care if you're not collecting measures. It also says what timeframe we're focusing on. And the target from 10% to 50%, that may be achievable. That may not be the whole way to where you want to get to eventually two years down the road, but that may be something you can do in a three-month timeframe. And so setting targets that are achievable and time-bound is really key. Moving on to measures. Measures help you to know where you are going and if you are moving in the right direction. So they help answer the question, how will we know if change is an improvement? It's important to think really broadly, but it is also really important to be very precise. So just like you want to have a very specifically crafted aim, you want to have precise measures so that if I'm measuring something and my colleague in the next clinic over is measuring the same thing, we're actually using the same way to measure that so we can compare what we're doing. Having too many measures can be a real challenge because it dilutes focus. So one of the recommendations is not to have more than six measures. There are different types of measures that one can look at. I'm just really briefly going to introduce you to some of the nomenclature. Again you can come back to our slides as a resource. Outcomes measures, those are going to be the things that the patients experience, patient improvement for example. Process measures really talks about what are we doing in our clinic. So what is the kind of care that we're delivering? Maybe it's how often are people getting an outreach call or a follow-up visit, or how often are they receiving psychiatric consultation. So those are process measures. Those are the things that the clinic might care about. The patient may not care as much about that as they really do about their own outcome. And then balancing measures are other things that may be important to take into account. Are there other things happening that could affect your outcomes? What's the total number of patients you're taking care of, for example? Could be an important balancing measure to help you understand what's going on. Maybe your quality improved but your patient volume went way down so you have a lot more time to spend on each patient. That might be important to know. In the next slide here I'm going to show you some metrics that could be useful in a collaborative care-type program. And the reason that I'm showing you this is to show that you really want to choose your metrics to reflect the kinds of principles that you think are important in your program. So we don't have time to go through all of these. Again, I encourage you to take a look. You can always send us questions if you have them. But just to make the point that if you're interested in measurement-based care in your practice, then you may want to focus on measures that have something to do with how often measures are being administered to your patients. So things like the percent of patients who have had a PHQ-9 score in the last two weeks or last month, something like that. And so really thinking about how do the metrics you choose tie back to that north star of what your program is aiming to accomplish. 
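To make the idea of a precisely defined process measure concrete, here is a minimal sketch, not part of the original workshop materials, of how the example measure above (the proportion of depression follow-up visits with a completed PHQ-9) might be computed from a simple list of visit records; the record fields and values are hypothetical.

```python
from datetime import date

# Hypothetical visit records; the field names are illustrative, not from any real EHR.
visits = [
    {"date": date(2024, 5, 1), "type": "follow-up", "dx": "depression", "phq9": 12},
    {"date": date(2024, 5, 2), "type": "follow-up", "dx": "depression", "phq9": None},
    {"date": date(2024, 5, 3), "type": "intake",    "dx": "depression", "phq9": 18},
]

def phq9_completion_rate(visits):
    """Percent of depression follow-up visits with a documented PHQ-9 score.

    Operational definition: denominator = follow-up visits with a depression
    diagnosis; numerator = those visits where a PHQ-9 score was recorded.
    """
    denominator = [v for v in visits
                   if v["type"] == "follow-up" and v["dx"] == "depression"]
    if not denominator:
        return None
    numerator = [v for v in denominator if v["phq9"] is not None]
    return 100 * len(numerator) / len(denominator)

print(f"PHQ-9 completion: {phq9_completion_rate(visits):.0f}%")  # -> 50% for this toy data
```

Writing the numerator and denominator down this explicitly is what lets a colleague in the next clinic over measure the same thing and compare results.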
Once you know what your measures are, there's a few steps that you'll need to take which I won't spend a lot of time on today. But you'll really want to be thinking about how are you going to collect that information in a systematic way. One tip that I strongly encourage is that however you're going to collect it, think as granularly as you can. Because you can always lump things together later. There may be something that you're really interested in and you have three different providers in your clinic. If you measure it for the whole clinic, you won't know if Clinician A, Clinician B, and Clinician C are all doing the same, or if somebody's really figured it out and doing an excellent job and somebody else could benefit from learning what Clinician A over there is doing. So think about collecting things at a very granular level. All right, interventions. So this is the last component that we'll talk about. So we covered aims, measures. This is interventions. And they address the question of what change can we make that will result in an improvement. Today we're going to spend less time on selecting interventions because these are going to be very nuanced to what the issue is that you're working on in your setting. However, a couple of very high-level comments that I would like to make. One is that you really want to focus your interventions on those most important causes. And so whether it's using kind of causal theory around what do I think is the most important cause, or if you have some data, you may be able to identify those most important causes. There's a principle that actually doesn't come from quality improvement called the Pareto principle that 80% of the results come from 20% of the causes. And so you really want to focus your interventions. In the example here, the reasons for missed appointments. You want to focus on those first two causes in the graph that are the most common causes rather than the causes on the right-hand side that might really only affect two or three or five or ten people. So focusing on the high-yield interventions and really important causes is key. I've listed here three tools that are used in QI that can help you identify those high-yield interventions. And I have a couple of visuals on the next slides. We don't have time today to go into detail about these tools, but I wanted to make sure you know what they are and know that these are tools that are used at this point when you're trying to select your interventions. So those tools are process maps, which are shown with a colorful image depicting a process map for a suicide safety evaluation. They really help you detail what a process is in your clinic and identify areas for improvement. A fishbone diagram, which is a diagram that helps brainstorm among your staff what different causes are according to different categories. Is this a people issue? Is this an equipment issue? What is this patient-related factor? And a driver diagram, which shows what are kind of primary and secondary drivers and how might those relate to change ideas. This is a lot. I don't expect people to be digesting it all. But again, our slides are uploaded and so you are welcome to refer back as you are thinking about how you might use these tools in your practice. Once you have your measures, your aim, your intervention, it's time to test your changes on a small scale. And that means a really small scale. 
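As a rough illustration of the Pareto idea just described, here is a small sketch, again not from the presenters' materials, that tallies hypothetical reasons for missed appointments and flags the "vital few" causes whose bars sit below the 80% cumulative line; the reasons and counts are invented.

```python
from collections import Counter
from itertools import accumulate

# Hypothetical reasons for missed appointments (invented counts for illustration).
missed = (["forgot appointment"] * 42 + ["no transportation"] * 31 +
          ["couldn't leave work"] * 9 + ["felt better"] * 5 + ["childcare conflict"] * 3)

counts = Counter(missed).most_common()                 # sorted descending, Pareto-style
total = sum(n for _, n in counts)
cumulative = list(accumulate(n for _, n in counts))

print(f"{'Reason':<25}{'n':>5}{'cum %':>8}")
for (reason, n), cum in zip(counts, cumulative):
    vital_few = (cum - n) < 0.8 * total                # this bar starts below the 80% line
    marker = "  <- focus interventions here" if vital_few else ""
    print(f"{reason:<25}{n:>5}{100 * cum / total:>7.0f}%{marker}")
```

The same grouping approach applies to the granularity point above: tally the measure per clinician rather than for the whole clinic, and the differences between Clinician A, B, and C become visible.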
This is very different from a research setting where you might try to implement something and wait six months and collect data and see, you know, what your outcomes are to try to test whether the thing that you did caused an improvement. In quality improvement, we're talking short and fast. And so this is really kind of thinking that I think that we're not so used to in healthcare but is really common in design thinking, where you essentially want to fail fast and fail forward. So think about, what can I do quickly? What can I do by next Tuesday? What can I try on a really, really tiny scale that's going to tell me, is there a fatal flaw in my plan? Because if there is, I'd like to know that in days, not weeks or months. So PDSA cycles, you may have heard of the acronym for Plan, Do, Study, Act. And these are the steps. Oftentimes, as health care providers, we want to jump straight to do. And I encourage that to be really successful, it's important to slow down and really be thoughtful about planning, making sure you know your predictions, making sure you have a good way of collecting your data before you jump into doing the do step. Thinking about a really, really small step you can take. And then being systematic. Collect your data. Also collect your observations. So those two components are both equally important. When you need to interpret your results, you really need to incorporate in, what have I learned qualitatively, along with what is my data telling me? And then the act phase is where you make a decision about, am I really on to something great? Then maybe I'm going to scale it up. Am I sort of on to something, but I need to tweak it a little bit? Maybe I'm going to modify it. Am I totally on the wrong track? I found that fatal flaw. Time to shift gears. This is an example of a PDSA cycle that you might be able to do in less than a week that relates to improving the proportion of patients that are achieving depression remission. I will acknowledge that the goal, as written here, is not in a Smart Aim format. But for the sake of brevity, I wanted to keep it short. But this might be an example of, I'm going to attach paperwork for a PHQ-9 to the registration paperwork for three days in my clinic. And I'm going to look at what percentage of patients had the PHQ-9 actually completed, and what percent scored less than 5, or had a score over 10 and never got their treatment change. So I'm just doing this for three days, but I'm going to get some fast data. I'm going to find out what it's actually like, and are patients really completing this, and will the front desk hand this out, and all these other important things. When you're doing QI, you'll want to be able to track your progress over time. We won't go in detail in graphs, but you can refer back to this slide around some of the qualities that you might want to consider in graphing your results over time. And I will just say that graphs can be very important, not just for you in understanding what you're doing, but really when you're communicating with others in your organization. Additional considerations. The most important one I want to focus on is the top bullet point here, which is around the human side, really paying attention to who's affected by changes in processes that we're introducing. What is their buy-in? Is this going to be something that my team members can get on board with? Is this a priority from their perspective? 
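As a brief aside on the graphing point just mentioned, here is a minimal run-chart sketch with invented numbers, not data from the workshop, plotting a weekly PHQ-9 completion rate against its median and the 50% goal, with an annotation marking when a hypothetical PDSA change was introduced.

```python
import statistics
import matplotlib.pyplot as plt

# Invented weekly data: percent of follow-up visits with a completed PHQ-9.
weeks = list(range(1, 13))
completion = [12, 15, 14, 22, 28, 27, 35, 41, 38, 46, 49, 52]

plt.plot(weeks, completion, marker="o", label="PHQ-9 completion (%)")
plt.axhline(statistics.median(completion), linestyle="--", label="median")
plt.axhline(50, linestyle=":", label="goal (50%)")
plt.annotate("PDSA #1: PHQ-9 attached to\nregistration paperwork", xy=(4, 22),
             xytext=(5, 12), arrowprops={"arrowstyle": "->"})
plt.xlabel("Week")
plt.ylabel("Visits with PHQ-9 (%)")
plt.title("Run chart (illustrative data)")
plt.legend()
plt.show()
```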
And so really making sure that you have an approach that brings in all of the wisdom of all of the members of your health care team, and all of their creativity in problem solving. For any of you that are interested in publishing, there's a couple of other points on here that could be useful for you to look at in the future. So with that, we just covered in 20 minutes what we spend a year covering in our program. And I'm going to turn it over to Dr. Jennifer Erickson, who is going to be talking through measurement. We'll also be doing a short activity during this part. So I want to prepare you to be involved in interacting when we move on to the next part. I know. I'm going to ask you guys questions, and I'm going to eventually make you move into groups, probably of two, looking at the size. So you've been warned. Good morning. I'm Jennifer Erickson, as Amy Bauer hinted. As she was saying, if you feel overwhelmed by the overview of QI you just had, that's OK. We're going to talk about AIM statements and writing your own measures, which is a good way to get your foot in the door and start thinking about the problem. Obviously, there's much more to it, but as a place to start. All right. We have learning objectives. You saw this slide. We often reference back to this. You can imagine going through the hierarchy. The second bullet point is really, truly, how are you going to measure, and how are you going to know where you're going? Often, many of us see large systems provide all sorts of data sheets. They're kind of overwhelming. Those can be measurements you use, but often they're not the most responsive to what's going on in the system. If you think about the, what can I do next Tuesday, having data that says, oh, no, you don't have this this month, three months later, may not be as responsive. And that's why it's important to potentially have an idea how to write your own measures and what you're doing with your measures. All right. I mentioned participation. I have prompt questions. I'll repeat your responses, so don't feel like you have to go to a mic. And please, just shout out. Why do you think it's important to have a shared definition of a measurement between the people who are getting it, between team members? Don't be shy. Yes? I think it's important because it prevents splitting. Yeah. It prevents splitting. You're all looking at the same problem, right? Anyone else? It's agreed on ahead of time. Yeah, yeah, yeah, and agreed on ahead of time. Amy mentioned finding that fatal flaw. That is a fatal flaw. If you're thinking you're talking about x and what everyone else is talking about is an apple, that's a big problem, right? Anything else? For buy-in, for results. Right, right. You get everyone excited, right? I mean, they're willing to accept the results. Yeah, that too. Yeah, yeah, yeah. Again, it builds faith in the process. Yeah? Exploring the issues that you're facing. Perfect. So exploring the issues that you're faced with. And I apologize, for the people recording: what he said was buy-in for results, that was the other comment. We'll move on to the next one. How do you find out if the measure that you defined is clear to everyone? Again. Yeah? Determining benchmarks. Benchmarks. Benchmarks are perfect. Anything else? Yes? I think it's important to ask people questions about what they think about what they're doing, like you're asking them to do. Yeah, exactly. Oh, no, we talk to them, right? We ask people to look at the measure and say, you're in this problem. You're in the trenches.
Is this what this looks like to you? And I'll admit, that's one of my big frustrations when I look at big data, where I'm like, what is this? And how did we get there? All right. Some things to think about when you're, so again, introducing measurements. You want to define your measure, and we'll actually practice defining measures in a second. Much like what we hinted with the team, you want to be really clear why you're choosing your definition, partly because in your QI process, when you're on that journey, you want to make sure that you're actually representing what you hope to capture. And then any difficulties you have defining your measure. We all know that things are often multidimensional, and we are choosing one factor. And sometimes, that one factor may not be what we're trying to capture. For example, again, not obviously in psychiatry, if you're trying to increase the number of, say, purple flowers in your garden, you may be counting purple flowers, but there may be weeds in there, too. And so if you're measuring that, you may miss some of the other things that are going on. Not a great example, I apologize, but definitely one that comes to mind for me. Thinking about operational definitions. So when we're creating a measure, you want to have an operationalized procedure that allows you to create that calculated variable. This is one of Amy's favorite cartoons, so I don't pull it down ever. That's why it's there. But you can imagine taking out particular instruments and actually measuring Dumbo, or measuring something. It's not just an eyeballed, oh, we know it's like approximately that, kind of approach. You want to give people actual steps that are reproducible. All right, I promised you a measurement activity. Some of my colleagues are going to be moving around, breaking you up into groups, and you can self-assign. I think we can actually do groups of two. For this activity, we're going to give you an opportunity to demonstrate the clarity of defining measures. So write your own measure, and then experience going through that process. So please sort yourselves into groups of two if you can. We're going to give you a bag. It has candy in it. Don't eat the candy, please, or you will have nothing to measure. You can later, and then I'll give you further instructions as they hand it out. I will repeat the instructions, too. But go ahead and start handing things out. You'll get a sheet and a bag of candy. Please arrange yourselves in groups of two. Two or three. Yeah, two to three. It was, there's actually an even number. I was counting them through. And that's fine. Have friends. All right. As they're passing things out, I'll give everyone a second to chat and meet and say hello. Hello. We could easily do two rounds on it. It's small enough, but I'm thinking we can just do it. Yeah. Yeah. All right. So I'm going to give you a little bit more instructions now that we're broken up. So you actually have a baggie that has something to measure and some instruments in it, as well as a worksheet. Your job in the next eight minutes as a group is to come up with an operationalized definition to describe that piece of candy and create potentially a severity scale. Now, you could think about it as there are many properties to that. So don't obviously just go, there's one. But think about the different properties that you'd like to measure.
And write down the steps you have to decide is that something that you would use to measure that with a plan to potentially hand that to someone else to have them look at it and re-measure it. We'll give everyone eight minutes. We're up front. Let me know if there are questions. I may wander around. But you can start working on it. And write down your instructions. That's why we gave you a piece of paper. Go ahead, Amy. For the purposes of the recording, this doesn't affect people in the room, but anyone watching. I encourage you to look at the IHI measurement activity online now. And then when we do our debrief, you can join back in with us. All right, last comments. I'm going to have everyone reconstruct their baggie real quick. Wrap up this part. Yep. Wrap up the part. We're going to do one more. And if this was oddly hard when you started working on it, that's OK. Finding a good measurement can be hard. That's part of the experience. We could. You want to do it that? We have 30? All right. So what we're going to do actually is slightly different than what we do. We're going to hand you another bag. And I want you to take the tools in there that you have and measure it based upon your description that you wrote. So there's another bag coming at you. Let's make sure we give you a different one. Yes. Some of their definitions. So give us a minute. When you get a new bag, measure your new bag with the tool you wrote. I'm actually going to make sure. I won't. I'm going to get you half a bag. Does everyone have two bags? Does anyone need a second bag? Does everyone have tools? OK. All right, perfect. So just to repeat, you measured your first piece of candy and wrote an operational definition for it. I want you to take that second bag with your operationalized definition and then quantify that candy with that definition. Practice using your measurement. We'll give you eight minutes and we'll give you a one minute warning. All right, let's go ahead and pivot to a debrief. I know that that was quick. We're on a very small time crunch. And again, it's to give you an experience with this. It's not to be perfect. I know that doctors were like, we're going to get this the first time. It's going to be fully accurate. And we've got this. This is an iterative process. This is you doing QI in real time. You just didn't know it. All right, let's do some discussion questions. So usually, we have team switch. And we thought, for the sake of this, and because we had extra candy. Oh, by the way, feel free to eat your measurement things, the things you're measuring now. Totally fine. They're yours to do so as you wish. But thinking and reflecting about what you did with your team, I mean, what did you notice about how your team approached measuring that object? Yes, this is participation time again. Shout it out. Qualitative, yeah. And I think when I walked around, a number of you were very qualitative. You were describing what you saw. And then we walked around and say, OK, how would you have someone else describe it in a similar way, in a reproducible way? What else did you notice? Yeah. We're trying to identify similarities and differences in one of your pages for a program that we're doing with Google Docs. Yeah, the similarities and differences. Do you ignore some of them? Do you highlight some of them? It becomes really complicated really fast, huh? I mean, the two pieces of candy and string are just different needs. Yeah, absolutely, two pieces of candy and string, like your tools. 
We're going to actually look at that real quick. Tools definitely affect how you measure something and how you think about something. Anyone else? Shout them out. Yes? We felt that the measurement tools were, the only measurements that were irrelevant to what seemed to be the value of that item. Yeah, things that you might actually care about with that, so I'm going to summarize real quick because there's a recording. The tools that you had did not get to the things you wanted to actually know about the objects. You were measuring candy with string and a paper measuring tape, but it didn't get to weight or taste or things that were inherently properties of candy, right? And it's true. What you have to measure may not capture everything you're hoping to capture or may not be the essence of what you're looking at. Yeah, we definitely gave clear wrappered peppermints for people who are elsewhere, and that's really challenging when you're trying to measure the hard object. And that never happens in medicine, right? Measuring things that you can't see, it's a real challenge, so just like this activity, so it goes in life. So I heard a couple of things where people were measuring it across that. Did anyone who, many people actually took it out of the wrapper for that reason, where you're like, ah, the wrapper. Did anyone do anything else than measure the length? What did you do? Just real quick. Volume. How'd you get volume? Math. You used math. Yes! Yeah, absolutely. Yeah, the quality of the wrapper. Sometimes we do these activities, people will count the number of stripes, people that, Amy, right there. So there are other things. We're going to look at a real world example in a second that people have used bananas for. Amy, when we beta tested this, she has small children, I have small children. We gave them paper clips, and believe it or not, they used paper clips and measured the banana with paper clips. Again, not great. Some people have counted spots on things before, so there are many ways to capture this. Many different approaches. You can imagine, usually we switch teams, but for the sake of the room, we didn't. If you were to look at another team's definition, how they put it together, you'd notice that there'd be differences. And I think that's part of it. And like we were saying at the beginning, coming up with a common definition, getting people on the same page of how you're measuring and why you're measuring is really important, because we're going to have different perspectives. And in QI, it's good to bring an entire team on a QI journey and not be using different tools and going in different directions. That's been historically my big frustration when I did QI as a resident. It's less so now that I'm in control as a faculty member, control, air quotes, but, you know. All right. So we hinted at this. How is your definition affected? How is your operational definition affected by your measurement tools? Dramatically. We gave you paper and string partly because it's easier to process, but you wouldn't be measuring length if you didn't have something to measure length with. So we sort of baked that in. If you had a scale, you'd be probably measuring weight. This is actually an example that Dole uses, the banana people, where they look at color and they actually have a scale for color. I am going to hopefully be able to pull this off. It's control, shift. Alt tab. Alt tab. Do you want me to? There it is. Is that it? That is not correct, Amy. We have a video. It's short. There we go. 
There. And here. Amy had more coffee and is more technologically smart than me. All right. Now let's turn to the model of quality in health care. So safety is the one that we can all recognize as, you know, a common aim in health care. Patient-centered work, especially for those of you doing collaborative care, that's a very big principle within the work that we do. We want it to be care that's respectful of and responsive to patients' preferences and values and their needs. Another thing is effectiveness. Obviously we all want good outcomes, we're looking for that, so often the things that we measure are looking at our effectiveness. And efficiency, that's another big theme in health care, trying to improve efficiency, improve value, reduce waste in the system. Also having things be timely, timely for our clinicians, timely for our patients, so they're not waiting, whether that's access to care or wait time for providers. And then obviously a very big issue is equitable care, so reducing disparity and bias in our care. So these are just some common things. I'm sure you're all thinking in your brains, okay, are there things within my health care organization that I'm looking at that might be addressing one of these themes, as maybe an aim that you want to work on. But really, when you're picking an aim for your organization, start thinking about the problems that you have, the problems that are problems for your patients. What are things that you're putting up with day to day that are really irritating to you, that are, as some people say, those rocks in your shoes that you really want to get fixed and addressed? That's, I think, the thing that you're going to be most passionate about working on. Very common themes within our field are access, right? So access is a huge problem, so you may want to work on things like referrals or wait times or no-shows. Or there might be issues around diagnosis or identification of symptoms, how we're using the PHQ-9 or other measurement-based screeners, suicide risk assessments for example. And then there also may be themes around treatment or management: how are we or our organization using evidence-based tools or protocols, how are we implementing things like AIMS testing or antipsychotic metabolic monitoring in our system? Things like that may be things that you're interested in addressing. And then also communication is a big theme, especially for those working in collaborative care or anywhere you're getting referrals or consultation, or you're sending back recommendations to maybe a primary care provider or another provider. How do we improve that communication so that those recommendations actually get acted on? That's another theme that we see in collaborative care a lot, how do we improve that communication. So those are very common themes that we see that you can, you know, maybe identify with and maybe consider addressing. So back to the AIM statement, really narrowing down what problem you're thinking about and what you want to improve. And again, an AIM statement is a single sentence that clearly states the goal of your project, but how do you make it clear and objective?
So again, this is another way a SMART aim can work: you really get your team, if you're working in a team-based setting, aligned on the mission or vision that you're trying to work on. So it's really great to have one single sentence that brings it all together. And again, back to our kind of theme, it's the SMART aim, so having something specific, measurable, attainable, relevant, and time-based. And I'll just say, you know, for maybe me and some other people in the room, I didn't grow up learning quality improvement. This wasn't something that I learned in residency. I think it's different now, so I think they're, you know, starting to incorporate that. But if you think about it, we've all been doing some form of this in our lives, both personal and professional, right? There's a problem and you want to fix it, and this is just a framework to be able to put that into language and really put an aim on it. This graphic actually came from my then 10-year-old daughter's class. They're starting to teach quality improvement in some form, or SMART aims, even in fourth grade. So for her, this idea is like, how am I going to get my homework done on time, how am I going to remember to bring my binder to school? So they really are teaching kids already at this young age how to make specific aim statements, make them measurable and attainable. And this is all really good, right, for both professional and personal, just to be able to really hone in on what it is that we're trying to improve. So, sample AIM statements, just to give you some language around how people build AIM statements. One could be: decrease wait time for a behavioral health intake by 30% in the next six months. Reduce the no-show rate for my group by 25% by the end of the calendar year. In the next four months, increase referrals from the primary care provider to behavioral health by 50%. And increase the rate of PHQ-9 completion at new visits from our current rate of 30% to a goal of 90% by June 2023. Again, you can see in all these examples it's specific, it's measurable, it's time-bound, and hopefully attainable and relevant to your setting.
So those are my quick slides. We're going to move to our small group activity really quick, helping you build this SMART aim. Our goal here is to not only have some interaction, but really for you to think about, what is it that's really a burning passion for you, that you really want to fix in your organization? This is your chance to really practice putting together, okay, this is my AIM statement, this is something I'm going to bring back to my team, my organization, to get them all fired up about wanting to work on this project. Again, it helps align vision and mission and really kind of solidify the goal for your team, so this is a good way to practice. And how much time, everybody? About eight or nine minutes. I think this is a really good time to take that activity, which just kind of got your brain ready to start thinking about how do we measure things, and really be taking this back to, like, what would you like to do in your own clinic system when you go back to wherever you're traveling from San Francisco to. And if you can't think of one off the top of your head that you want to do, you're welcome to use any of the examples. I'll put them back up so that you could use them. We also included a whole bunch of examples on the back, so that in case you were feeling stuck, you could get inspired. Okay, so I'd love for a few of you, if you're willing, to either share your SMART aim, or as far as you got on your SMART aim, with us. And we're going to, in a minute, actually wrap up, so this is partly discussion, partly wrap-up. Because I feel like this is really where the learning from today hopefully goes back to your practice when you go home, which to me is always a big goal of meetings. I feel like sometimes I come and get inspired, and then there's something about that airplane ride home where everything that I learned sort of goes poof, because I start thinking about the inbox that I haven't been attending to, my email that I haven't been attending to, right? So these aim statements can be really helpful as you're transitioning back to, you know, the daily grind, to be thinking about what are some of the things that you really wanted, or got inspired around, to try to improve in your practice. So, if anyone is willing to share. I think I walked around and talked to a few people, and one of the things, you know, we were talking about is sort of what's hard to define. When you start writing down that aim statement, sometimes you quickly get into things like, okay, I know that this is an important concept, but how are we going to measure it, or what even is the meaning? I'll share the example that we were just talking about. We were talking about engagement in a collaborative care program, and we were saying it's so hard to define that, right? Because when a patient doesn't come back, how do we know what's happening, right? Sometimes that means they're not coming back because their depression, anxiety, whatever, has gotten worse. Sometimes that means they didn't like the care manager. Sometimes that means they got better and are like, I'm done with treatment, I don't need to come back, right? So we were sort of talking about more creative solutions around how you might actually measure that. Even one of the things that could be an interesting measure is, did you complete sort of a conclusion to that episode of care, like a relapse prevention plan, which in collaborative care is a common tool that's used to kind of define, hey, we think you're better, how do we sustain that being better, right?
So that, you know, rate of completion of that might be a really interesting definition of engagement or not, right? Because did they complete a full episode of what we'd want to see as care? It's very different in, you know, were they not seen in the last three months, right? Which is a little less specific to the patient experience in collaborative care. So we were brainstorming, like it almost means you have to go back to your team and say, what's our shared understanding of what it means to complete an episode of care, to be engaged in care? So many of you may have encountered things like that. Are there other examples that people? Yeah, great. Yeah. Yeah. So I'm going to try to do my best to summarize this because this is being recorded. So I think the reflection was, you know, often we're focusing on measuring and attaining something, but we might miss the sort of relevance pieces because our perspective on what's most important or needed may or may not be reflected in sometimes the partners that were involved in care. So a specific example, you know, maybe as a psychiatric consultant, you're really wanting to see more of the patients being directly cared for by the primary care providers. But you know, that may or may not be what the PCPs are necessarily seeing as the most important thing. And even if you're in the psychiatric consultant role, maybe even where you are in that program development, that relationship with the PCPs and who's getting referred could evolve. And I think this is a really interesting point because I think, you know, and how involved you could get them in the QI process, right? I think that was kind of the second idea. I think, you know, it's a really interesting thing to think about. You know, who's the QI for, right? Is it from my experience as the psychiatric consultant? Is it for our shared goals with our PCP colleagues? I think you can't really do good QI in isolation. So I guess that's one of the reflections I would have about sort of the comments that you make is you might be able to start it or have some ideas of measures that are important, but you're probably going to really need, especially if you're working in a collaborative care setting, but I would argue probably any clinical setting, need to bring the partners into that conversation. I think it's a really interesting thing because it's almost sometimes not until you get to quality improvement are you actually sometimes having conversations about what are our actual goals of this work we're doing together as a team, right? And sometimes you have to go back and spend some work really making sure you're all on the same page around the vision of what you're doing together. And that conversation can be super helpful, right? You can also sometimes identify potential sources of frustration. I feel like it's one of the beauties of continuous quality improvement is it allows you to revisit, you know, are we accomplishing what we hoped to accomplish together? And I think a lot of you, it sounds like, are working in collaborative care settings. And often, you know, again, the things that are most salient or meaningful to some of our primary care colleagues might not be the things we're necessarily thinking of as the thing that we were, you know, coming in to do in a primary care setting. I think it's also really important to talk about the fact that those goals can shift over time, right? Maybe you started out saying, gosh, what we really want to do is do really high quality depression care. 
But I don't know about you all, what I'm hearing is that the patients that are showing up in primary care are a lot more complex these days and a lot more ill just because we have such lack of access to care. And it may be important to go back and actually revisit with your colleagues, like, okay, who is the population that we're serving? Do we need to like revisit what our goals are? So sometimes that effort and quality improvement can actually help you recognize that, you know, there might need to be a more shared definition of the goal before you can even start to work on quality. So I just, I want to acknowledge that I think some of what you're talking about is really important. But, okay, from which perspective is this a quality measure and how might that mean that we need to go back and have a conversation about it, right? So I appreciate that. Other reflections or aims? Can I make a quick comment on just what you were saying? I think it's back to your earlier point of, you know, you have these candies and you have a tool to measure either like weight or circumference. That measure isn't really what's the value, the value of the candy. It's like you were just saying, you know, finding something that's relevant. Maybe you can measure something, but what about the thing that really matters to you the most? So I think your point is valid in that case. And I think one thing that I'll just say, we didn't cover it in detail, but like process mapping, which is basically kind of the mapping of a workflow, is one way to get buy-in from people that you work with around a certain workflow. So just using your example of, let's say the PCPs are sending me patients that I wish that they would be able to handle or I feel like that they should be able to manage. And if you go through the process of a referral to you, and then, you know, you can show them the flow and then maybe I'm just making things up, but like, you know, you're getting a lot of patients that, you know, maybe could be managed by a PCP. And by taking on those patients, you're full and you're booked out, you know, six months or something. So you can get engagement from those PCPs and say, this is the flow, this is what happened, this was when you referred to me, but this is the outcome, which is, you know, access goes down when I get a flood of patients that could be shared or co-managed with you. And so can we come up with, you know, a intervention, maybe it's I give you some in-service training on how to manage, you know, common themes or, you know, you can, I think process mapping is just one way of really engaging a team around the real and actual workflow that's happening in your system and where the problems arise and then kind of brainstorming together like, okay, where we can intervene and brainstorm how we would measure it. So we would measure, you know, number of referrals or wait time to see you or things like that. So that are things that are relevant, tangible, attainable, measurable types of SMART aims that you can come out of that process mapping. Yeah, I really appreciate that. I think in some ways QI gives you like a neutral territory to explore some sometimes potentially charged topics too, like we're going to process map this together. That feels much less, I think, threatening, maybe the right word, like then saying like, I just don't want you to send me so many patients, which sometimes is a hard conversation to start from. 
But I think, so I really like that and I really think QI can be a really nice neutral place to have some good and sometimes really important conversations about what's actually happening in our processes of care. I think we, you know, we again often are kind of stuck in our own perspective and by sitting together and kind of looking at the system together, it sometimes allows some things to get unstuck and maybe, you know, unfortunately what you realize is there's not enough resources to do everything that you'd like to do, but at least then you can have a shared conversation about how you're going to manage that, right? And you can start to do some very brief tests of change, right? Does training help or not, right? Do some of the ideas that might come up in those conversations, are they helpful? Since we are getting close to the end of our time together, I'm going to transition to questions which we sort of are already doing. We did want to let you know that we included a whole list of QI resources, things that we found helpful as we all have done more QI. We have variable experiences in how much training. Some of our team has a lot of training in QI, some of us have less. I actually don't have any formal training in QI, so I just want to say you don't have to be formally trained in a whole QI certificate to get deeply involved in this, because you can teach yourself through a lot of these resources. So we hope that today was really like, if this is newer concept to you, an introduction to some of the core concepts and ideas, and if you're excited about them, here's a lot of resources. We just tested it last night. If you go into your app, these links are active, so you can actually open them in our PDF that we included in the app. So hopefully these are helpful to you. If you're a more advanced QI person, which I think I heard a few people that sounded pretty advanced in our group, we hope that these just inspired you to think about a few other ways that you might also do QI, but also maybe talk or engage people who are newer to QI in those conversations, because I do think often, I've been saying we need to rebrand QI, like way to advocate in your system with data or something like that, because I think when we say QI, people start going like, oh, that's that thing where they're going to tell me I'm terrible at doing something, right? And so to me, I feel like I really would encourage you, if QI has some bad associations with it, then potentially say, hey, I'd love to work on how we can work better together, or some other way of talking about QI. Still using all these tools, but maybe that isn't as inducing of fear. We have a couple minutes. Are there any questions that people have before we wrap up today? Or if you didn't get a chance to reflect on your aim, and you'd like to get feedback on it or input? Yes. Yeah, go ahead. Use the microphone if you can. That's great. Thank you so much for this. This is really helpful. I was wondering if you can expand a bit more on the comment regarding the patient that's sort of proper for collaborative care, as far as I know we mentioned one end of the spectrum where somebody could really be managed in primary care, but also the reality across the country is that primary care docs are just seeing the level of complexity and acuity that is really way beyond what's proper for primary care. 
Maybe from a QI perspective, how you go about measuring that, sort of the proper setting for a patient, and also working with reality on the ground, that you can't really just say, well, this person that's far too acute, good luck putting them on that six-month wait list and then see what happens. Yeah. Maybe I'll say a second sentence, but I'm actually going to also invite our other panelists to comment on that. I guess I would say there is no one right answer to that question. And in some ways, I think QI can be a nice way to have that conversation with your partners around that reality, right? And what you might identify is that perhaps there are needs that exceed what maybe collaborative care is going to be good at. Where you set that line will probably very much depend on where you are and what the other resources are that are available. I personally feel like what has been most helpful to me when I've talked to practices around that is actually thinking about, even if collaborative care is not the right setting, is there an episode of treatment that we could do that would be helpful? Maybe that's doing a really good facilitated referral. So it's not no collaborative care, yes, collaborative care, but rather the focus is on getting that person to a higher level of care, for example, as opposed to trying to say we're going to address all of their mental health needs in primary care. And I just give that example because I think if you're really in partnership with your PCP colleagues, they need your help, right? We're still probably the people around that can help them the most. How do we do that in a way, though, that's not going to disrupt the system so much that we're not going to want to work together? I know you're thinking about this a lot, Denise, so I'm going to pass the baton to you. Yeah, I think this is a very common theme and one we're dealing with as well. And I just echo what Ana was saying. I think one thing could be like a needs assessment in the beginning is kind of just what are you seeing? And I think that's are you seeing people that need a certain level of care? What do the PCPs need or want? How do we support them? I mean, similar to you, we're seeing all sorts of patients that really need a much higher level of care. But at the same time, we don't really expect the PCPs to know how to get them there, right? So we want to build, on one hand, a process that allows patients to easily flow into our collaborative care program, but at the same time also assist the primary care provider. So out of that, you know, we're thinking can we build a system within our collaborative care team that triages those referrals, helps triage those patients for the PCP? They're not going to know where the IOPs and PHPs are in town. They don't know if this person needs CBT, CBT, or even, you know, moving towards other kind of types of care. So I think really we are the experts to be able to help triage that. But when you do process map, like if you were to do a needs assessment and process map this kind of referral, you really have to think about who's going to do the triage. It's like the who, what, when, where. Who's going to do that triage, with what resources, what time? Because that triaging isn't reimbursable, if you will, right? But at the same time, with collaborative care codes, there's some general BHI codes that you could use for more of a lighter touch. So if you really think about, you know, your system in which you sit, what are the other resources available? 
What kind of tiered or stepped-up care can you provide? And doing some of that triage at the PCP level could be an intervention, but then you'd have to test that. But I think starting with a needs assessment of like, what's the volume? How many patients would you categorize as appropriate for collaborative care or not? So it's kind of that measurement, back to the measurement activity, is how do you measure what's in and what's out? You have to have kind of some agreement on what you consider as this is collaborative care and everything else is not. And then trying to measure that is one way to say like, what percentage is in that bucket, what percentage is out? And then if it's a large percentage that's out of that bucket, then what are we going to do to help solve that problem to be able to get those people to the right place? So I think a needs assessment, process mapping, talking to your colleagues, but then also trying to define the measure of how you're going to measure. It's just like measuring the banana or these cannings, like how do you measure what's in, what's appropriate for collaborative care? That's a discussion amongst your team and not everyone's going to agree on that. But I think having some rubrics or standardization of that can help. And the real power as well of having that measure is that then you have that metadata that you can take to your organization and say, this is what we're seeing and this is how it's changing and this is the challenge and this is the other resources we'd like. And so you actually have another layer where you could go to that next level and start advocating. Exactly. And that's really powerful when you can do that. Yeah. The other thing I'd just add to that is I think it also allows you to start doing small tests of change. If you have data on who all you're seeing, you can say, okay, if we try this thing, does it help us shift that in some way? You know, the number of people that are on the wait list, those kinds of things. And those tools of small tests of change, I think can foster a lot of collaboration if implemented thoughtfully. I was just going to circle us back really quickly to the model for improvement. And this is where thinking of those different types of measures can be really valuable. So you may be really focused on some of the processes that you want to measure, but balancing measures can be really important to thinking about things like caseload size or wait times. And so when you want to think broadly about these measures, in order to figure out what's relevant, you may have not just what do you want to improve, but what are the other factors happening in the clinic. So I think the model for improvement provides a really good framework for thinking about some of these problems and bringing in the quantitative pieces as well as the qualitative. I'm also going to give a quick plug. Many of the resources up here are directly from the Institute for Healthcare Improvement. They're really good. A lot of these links have a short page that's like three paragraphs and a five-minute video. It's super digestible. So if you're interested in getting more into this, check out some of those sites. Or having your colleagues. That's why we included those particular links, because they're approachable. And sometimes we even have to create a common definition of what QI is as a starting place. I had a terrible experience of residency with QI. And I had to become an attending and come here and go, oh, it's actually not evil. 
So just sometimes you have to fight that barrier, too. We're at time, I think. Yeah. So we're going to stop officially. Thank you all for your attention, your participation in the activity. Enjoy a piece of candy. Hopefully this has been helpful. And thank you so much for your participation.
Video Summary
The workshop, led by Amy Bauer, a psychiatrist and health services researcher from the University of Washington, along with her colleagues, focuses on integrating quality improvement (QI) methods to apply collaborative care principles in healthcare settings. This morning session aims to condense a year-long course into an introductory overview, designed to build foundational knowledge about QI methods applicable across healthcare practices. The discussion is structured around the "Model for Improvement," which addresses key questions: What are we trying to accomplish? How will we know if a change is an improvement? And what changes can we make that result in improvement?

Participants engage in activities to learn about developing AIM statements, identifying areas for improvement, and selecting measurable outcomes. The session highlights the importance of precise, shared measurements and consistent definitions while conducting QI initiatives. The information is aimed at encouraging an interactive environment, soliciting participation and sharing experiences.

Key elements discussed include the model's framework, which encourages iterative changes and the importance of reducing practice variation to improve patient outcomes. Emphasis is also placed on choosing relevant, achievable, and time-bound SMART goals to guide quality improvement projects. Participants are encouraged to consider the human aspect and engage team members in the change process, ensuring alignment with their perspectives and needs, facilitating sustainable improvement initiatives. Overall, the workshop seeks to foster a collaborative environment, enabling participants to apply QI methodologies effectively in their settings.
Keywords
quality improvement
collaborative care
healthcare settings
Model for Improvement
AIM statements
measurable outcomes
practice variation
SMART goals
sustainable improvement
team engagement