Want to improve your chances of getting measurable results from your CME programs? The first step is to design educational activities that will be effective in changing physician behavior. And fortunately, there's a wealth of research to help you. Here's how several CME experts (see the list of contributors below) summarized the five main elements proven to be most effective in improving physician practice.
A needs assessment that really identifies the concerns healthcare professionals are facing in their daily practices. This entails more than having someone review the literature. As part of your needs assessment, you need to decide what kind of outcome you're looking to accomplish. A program that aims simply to disseminate information can have a similarly simple educational design, but one that has practice improvement as a goal will need a more complex design. (See MM's December 2002 cover story, “Needs Re-Assessment,” for more.)
Multiple interventions. Research shows that programs that only offer one type of activity, say a lecture, are dramatically less effective in impacting physician behavior than those that combine lectures with interactive sessions, worksheets, reflective activities, and other elements.
Putting content up on a Web site or sending it out in a fax newsletter may expose it to a wider group of participants, but it isn't very interactive. If you put the meeting's content into case-study format and ask participants to come up with solutions, or even just to ask questions, it becomes more interactive and engaging. This gives participants more opportunity for hands-on learning, and a chance to practice and reflect on what they've learned.
Interactive activities (such as case studies, threaded discussions, and learning communities). Healthcare professionals may balk at the idea of interactive sessions, but once they experience interactivity, most will admit it is more effective.
If you want to have your audience put what they've learned into practice, especially if participants are physicians who have been doing something a certain way for a long time, they need to experience the information themselves. They need to be able to compare and contrast what they've learned to the way they're currently working, before they move into a new way of doing things. That takes interactivity and hands-on experience with using the new information in practical, realistic ways they would use it back in their offices.
Feedback and self-reflection. Providing opportunities for healthcare workers to gauge their practices against their peers', and to think about how they can adapt what they've learned to their own work, also increases the effectiveness of an educational intervention.
Enduring learning aids. Handouts, cheat sheets, pocket cards, and other physical reminders that reinforce the learning also are highly effective. These could be a copy of standardized care guidelines for a particular disease state, or practice and office-system tools, such as assessment forms and care checklists that can be attached to patient files to ensure that all the recommended practices are followed for each patient with a particular disease. You also could give them job aids, such as an easy-reference manual or a Web site where they can turn for answers. Or how about a pain scale sheet the receptionist can give a patient to fill out while waiting to see the doctor?
Not every program needs to contain all these elements. The simple lecture that provides a one-way information flow from the faculty to the audience is fine for programs that just want to convey information — just don't expect that physicians will put that information to use back at the office. And it's not practical, or even possible, to incorporate all these elements into every program you do. But when practice improvement is the outcome you're targeting, the more of these elements you can incorporate, the better your chances of making a difference in your attendees' real worlds.

From Theory to Reality
Do you want to see the theories in action? Here's what one organization, the 15-person radiology group called Chambersburg Imaging Associates, in Chambersburg, Pa., headed up by Robert Pyatt, Jr., MD, has been able to accomplish.
CASE 1: An article about diagnosing stroke in the Journal of the American Medical Association found that radiologists were missing subtle strokes on CAT scans 18 percent of the time. Pyatt asked his group's body imaging radiology expert to measure what his group's rate of detection was, which turned out to be very similar to the national average. This physician expert then held a number of different interventions, ranging from one-on-one mentoring sessions to case studies to group meetings — all with Category 1 credit. Several months later, he revisited the data and found that performance had moved from 85 percent accuracy to 99 percent.
Why it worked: Pyatt's group expert was able to identify a problem, set a baseline, design a multifaceted series of interventions, and systematically track the outcomes — a great setup, experts agree. They didn't have to rely on self-reporting for needs assessment, which is a big advantage on the front end. Studies show that objective data tends to be more reliable than self-reported data, because healthcare workers are not always aware of what they need to improve, among other reasons. The program also included multiple interventions, and multiple types of interventions.
CASE 2: The group was getting complaints from their OB/GYN doctors about incomplete reports. The group's expert in ultrasound met with the OB/GYN doctors to find out what they wanted included in every report, then educated the group using multiple types of interventions, including creating a reporting template they could use, says Pyatt. After tracking individual performance, “we went from a very poor compliance rate to almost 100 percent. Now, instead of complaints about the reports, we have happy doctors and complete reports.”
Why it worked: In addition to the factors noted in Case 1, the reporting template the radiologists could take away and use in their daily practice contributed to the positive outcomes these interventions achieved. Nancy Davis, PhD, director, CME, American Academy of Family Physicians, Leawood, Kan., says AAFP is moving in the direction of practice-based reminder systems as well, developing tools like flowcharts that could go in the chart of every patient with that disease, and checklists physicians can use to make sure they're doing everything they should be doing for each patient. “Though it requires some kind of chart audit, it doesn't mean you have to review every single diabetes patient's chart. You can do a random sample to see if you're hitting your targets, report the outcome, and assign credit to it,” she says.
CASE 3: The leader of a local post-polio patient support group came to Pyatt's hospital with complaints about how doctors were treating her group's members. She worked with the CME department and a neurologist on the CME committee to develop an intervention and identify which doctors needed to come. “Three months later, I got a wildly enthusiastic letter from the patient support group saying how all kinds of improvements to their care had been made. As a result, we saw a big reduction in psych consults, in MRI lower-back studies, CAT scans, all these tests and consults that were being done unnecessarily,” Pyatt says. “We had that support group leader present when we got surveyed for accreditation, and she told them from a patient's perspective how outstanding the outcomes of that CME were. That was a fun one.”
Why it worked: Involving the patients was important, says Pyatt. Who could be in a better position to make sure physicians learned what they needed to know? And he suggests that CME providers go further abroad and involve others in the community who are impacted by the specific topic. For example, for an intervention on domestic violence and spousal abuse, he involved local women's shelters, social workers, even local judges — and got results. Physician referrals to women's shelters increased 75 percent within 60 days. “You have to be multidisciplinary,” Pyatt says. “Physicians don't always realize that they operate in a community, and what they do makes a difference.”
OK, admittedly Pyatt's CME department runs like a Ferrari, while most providers find theirs is more in the Hyundai vein. But there are still lots of easy, inexpensive ways to gear up your programs — if you're willing to try something new.
“We don't always stop to think about doing things differently,” says Linda Casebeer, PhD, associate director, Division of CME, University of Alabama School of Medicine, Birmingham. “Typically, we know what the topic is, we get the faculty, and we let them speak. But you lose many opportunities to improve physician practice that way.”
This article is a compilation of wisdom from many experts: Linda Casebeer, PhD, associate director, Division of CME, and Robert Kristofco, director, Division of CME, University of Alabama School of Medicine, Birmingham; Susan Cobb, MSN, RN, president, Meniscus Educational Institute, West Conshohocken, Pa.; Nancy Davis, PhD, director, CME, American Academy of Family Physicians, Leawood, Kan.; Suzanne Murray, president, and Lorna Cochrane, PhD, vice president, Education and Research, AXDEV Global, Norfolk, Va.; Robert Pyatt, Jr., MD, head of Chambersburg Imaging Associates, Chambersburg, Pa.; and Mark Schaffer, EdM, vice president of CME, Professional Postgraduate Services, Thomson Healthcare, Secaucus, N.J.
WHILE CME PROVIDERS are generally pretty good at providing programs aimed toward improving a physician's knowledge and skills, “we pay less attention to attitudes and barriers to care,” says Linda Casebeer, PhD, associate director, Division of CME, University of Alabama School of Medicine, Birmingham.
Suzanne Murray, president, AXDEV Global, Norfolk, Va., says attitudinal issues often can be the “unperceived” need her organization uncovers in its needs assessments, particularly for doctors working with patients with dementia, Alzheimer's disease, depression, and other difficult-to-treat illnesses. “There's a certain stigma involved in these cases,” she says. “The attitude is, ‘We can't help these patients, so why bother?’ They may have the skills and knowledge, but if they're stuck at the attitudinal level, they won't be able to apply them.”
That's why AXDEV Global designs programs to address attitudes that can hinder patient care, as well as how to screen, diagnose, and treat patients with these disorders. The attitude module, which comes first, uses a peer-to-peer approach that includes encouraging peers to share their values and experience in treating these patients, and what satisfaction they've been able to identify. Once physicians conquer the attitudinal barrier, her company follows up with modules to improve the knowledge and skills sides.
Casebeer also tackles attitudinal barriers to care for women with HIV in a workshop her group has put together for the International AIDS Conference for the past several years. “In order to better attune physicians to the issues women with HIV have before we even begin to talk about treatment, we do something a little innovative — we run a photography contest that solicits photos that evoke what women with HIV face around the world,” she says. The black-and-white photos are displayed during a reception held the first hour of the workshop, and participants are asked to vote on the ones they feel are most evocative of the issues women with HIV have. Winning photographers receive cash prizes. “We try to get people immersed in understanding the barriers, the problems — what having HIV really means to women patients in different parts of the world,” she says. The second half of the workshop follows up with cases and lectures on therapies and treatments for women with HIV.
“We certainly saw changes in how people thought about women with HIV, how they addressed the barriers, and their confidence in managing HIV in women. That's one we're proud of.”
In future issues, we will run case studies of successful educational activities and outcomes initiatives. To contribute, contact Executive Editor Sue Pelletier at (978) 448-0377, or e-mail her at email@example.com.