Think of the Essentials and Standards of the Accreditation Council for Continuing Medical Education (ACCME) as a highway system you've traveled for years. It is slow and winding, and it has speed traps at various spots. It is tedious. It is inconvenient. It gets you where you want to go, but only by following a prescribed route. You've been using it for so long that you can anticipate the twists and turns, and you know where all the speed traps are. Still, you dream of other, faster routes.
One day you hear that there will be a new system, with more interesting routes that will get you where you want to go and will even make you a better driver. You attend meetings where you offer your own ideas about what a good route might look like.
Then one morning in July 2000, the old system isn't there anymore. Instead, there are a dozen possible routes. Fast? Scenic? Most economical? Most challenging? Suddenly, where there was routine, there are choices. Depending on who you are and what your circumstances might be, you might be giddy with new-found freedom. Or you might find yourself wishing for that same old road that you knew so well. Either way, it's a shock to the system.
From Essentials to Elements
In a nutshell, the ACCME's new system (originally called System 98) condenses the Seven Essentials into three elements: planning, execution, and measurement of outcomes. These three elements are expected to be linked. To use the ACCME's own example, it is not enough to identify a need to teach physicians CPR and then offer a course. The course must be presented in such a way that when it is over, physicians who attended it can actually perform CPR. Linkage occurs when you can trace how the identified need was turned into an activity that accomplished the desired result.
The new system also calls for a self-study report. This is supposed to be a narrative in which an overview of the CME organization is provided, along with an explanation of how the organization complies with ACCME requirements (with examples), and a plan for future improvements.
Talking with CME providers who have been through the new accreditation process is a Rashomon-like experience. Everyone has a different version of the event. A few themes emerge, nonetheless:
While providers disagree about the end result, they do agree that this is a new process, and it is important to budget time for it.
The new system appears to be scalable: It requires more work from large organizations, less work from small ones.
One thing is definitely unchanged: Organized files are a prerequisite for success.
One thing is definitely different: The attitude of the ACCME toward institutions seeking accreditation.
“I think the new system really is ‘scalable’ in that it allows more flexibility — there isn't that old cookie-cutter approach anymore.”
— Barbara Barnes, MD
Barbara Barnes, MD, is the director of the Center for Continuing Education in the Health Sciences at the University of Pittsburgh. One of her responsibilities is overseeing the accreditation process for the Consortium for Academic CME (CACME), a pilot program, now in its third year, in which a group of Pennsylvania medical schools share a single accreditation. She is also a member of the board of the Integrated Advanced Information Management Systems Program at the university's Center for Biomedical Informatics. All of which is to say that Barnes is an experienced swimmer in the sea of information management as it applies to CME.
Another credential: The Consortium earned a six-year accreditation under the new ACCME system, with exemplary ratings in Essential Area 2 — the area which initially struck dread into CME providers because it seemed to require outcomes measurements. Under Area 2, Educational Planning and Evaluation, providers must, among other things, evaluate the effectiveness of their CME activities and of the organization's overall program.
“Depending on how seriously you take this, it can be very labor-intensive,” Barnes says. “Because it is different from what they've done before, it's taken a lot of time for many institutions to figure this out.” She cites identification of and meeting with stakeholders — those who have an interest in the success of CME — as an example. At a large institution, in particular, if the CME department has never met anyone from, say, the education department — a potential ally, if you're in thinking-outside-the-box mode — then it's going to take time just to make contact, never mind build a relationship. As Barnes lists other potential internal stakeholders — a large institution's quality control department, pharmacy department, and staff training and development department — and external stakeholders — community physicians, county boards of health, insurance companies — it becomes clear that reaching out to them can be a major task in itself. Barnes recommends that providers begin working at this level at least 18 months before initiating the accreditation process under the new system.
Obviously, not every organization works on that scale. For Susan Gardinor, director of education for the American Gastroenterological Association in Bethesda, Md., the time required was considerably less. “We really started focusing on this project a little more than six months prior,” she says. “One top manager spent six weeks of that time working on the accreditation process. As director, I spent about six weeks on it; and our administrative people devoted two weeks to the project.” Looking back, she estimates that the amount of work on the new system was about the same as it was under the old one.
Of the CME directors contacted for this story, the land-speed record-holder is Debra Kuehler, vice president of education for the Texas Heart Institute in Houston, who got the project done in six weeks. Still, she complains that the process is “way too laborious.” She adds that in her case, the main consumer of time was the self-study report. “That narrative was a real bear for me to prepare,” Kuehler says, adding that “something should be done to make it easier for everyone, including the reviewers.”
She also echoes the concerns expressed by providers about outcomes. “The whole matter of measuring educational outcomes as they relate to improving the quality of patient care is very difficult and no one seems to have a good response to the challenge. I'm not sure what CME sponsors are supposed to do.”
But Barnes notes a silver lining to all the work involved in discovering stakeholders: The exercise raises the profile of the CME department in the organization and in the community. And that's a good thing for CME providers who worry that no one outside their department understands what they do.
As may be deduced from the experiences of Barnes at the giant CACME and Kuehler at the small Texas Heart Institute, one pleasing aspect of the new process is that the amount of work required seems to correspond to the size of the organization undergoing the accreditation process.
“It should be much easier to do this in a small specialty society than at a large medical school,” says Barnes. “You have fewer stakeholders, and the breadth of your programming is narrower. I think the new system really is ‘scalable’ in that it allows more flexibility — there isn't that old cookie-cutter approach anymore.”
At least one head of a small CME organization agrees. “The new system is easier,” says Barbara Solomon, director of continuing education for the San Diego Eye Bank. “The wording of the questions is clearer about what is actually wanted. It's a much shorter form.” Solomon comes at the experience from a unique perspective: She had applied too late under the old Essentials system and was asked to re-apply under the new system, so she was able to make a back-to-back comparison.
But not everyone agrees. The AGA's Gardinor says the amount of work required was the same as under the old system. “The bottom line is, the same information was requested,” she says. “To me, the setup offers the appearance of flexibility, but when it really came down to it, the information presented, and the way it's presented, is really not so different.”
Whether or not CME directors think the new system represents a change, they all agree that keeping good files is as important as it ever was. Kuehler, who felt the new system was still too much work, nonetheless says good file keeping made it easier. “Fortunately, our files were in excellent order, so there was no backtracking or ‘creating’ documentation for the reviewers,” she says.
Gardinor speaks of coming in on Saturdays to get files in order before the surveyor visit. “That's half the battle — everything is based on documentation,” she says.
Barnes goes so far as to warn against getting caught up in the self-study document, simply because it's new. “It's easy to get caught up in writing the self-study and forget the documentation file review. That's where the rubber hits the road in terms of demonstrating what you actually do in practice. It's very helpful to go through those files — not to change anything, but to organize them in a way that makes it as easy as possible for the surveyors to get the information they need.
“It's a good opportunity, also, to think about how files should be organized for the future, perhaps using the ACCME's documentation review form as a checklist. Then, by the time your survey comes, you'll know that everything is in compliance.”
“Hi, We're Here to Help”
Of all the issues surrounding the new system, the one where CME providers seem most in agreement is the changed stance of the ACCME.
“The whole matter of measuring educational outcomes as they relate to improving the quality of patient care is very difficult, and no one seems to have a good response to the challenge.”
— Debra Kuehler
“Whether perceptual or real, a lot of providers felt there was a lot of antagonism there,” says Barnes. Especially in the early 1990s, when the Food and Drug Administration published its notorious “draft letter” about enforcing strict guidelines for commercial support of CME, many providers began to look at the ACCME as an enforcer, and at the time that was clearly what the FDA wanted from the organization. Fortunately, those days are gone, and the ACCME looks a lot more benign.
“I appreciated not feeling like anyone in the ACCME was trying to keep secret what they wanted,” says Solomon. “When they set up the site survey, they actually sent me the form that the site surveyors use. I don't remember them doing that before.”
“I don't think the ACCME stresses enough that they really are advocates for us,” says Gardinor. “There's still the lingering sense that they're going to come in and look for mistakes. And that's not at all the angle they come from. The first thing our site surveyors said was, ‘We are here to take all your information and present it in the best positive light.’”
Providers are also very pleased with the way the ACCME has reached out to provide information — especially through its workshops and its Web site.
“I went to the ACCME workshop a few years ago in Chicago. After that, I felt that any time I had a question, I could just call them,” recalls Solomon.
“The ACCME provides a lot more documentation than it used to, and it's all on the Web site,” says Barnes. Looking at the surveyor documents is an especially good exercise, she adds, because looking at the forms gives providers an opportunity to step back and consider the surveyor's perspective. “I think that's a very good way of understanding how to organize things.”
Even Kuehler, who prefers the old system to the new one, agrees that the ACCME's phone service and Web site are helpful, and recommends the Alliance for CME Web site as another good source of accreditation guidance.
The Real Shock: Freedom
Many years ago, the Alliance for CME ran a plenary session at its annual meeting on dealing with change. The speaker asked everyone in the room to stand and face the person next to them, then turn away, change something about their appearance, and turn back again. Everyone stood and did it. There was joking and enthusiasm. Then she asked everyone to do it again. Most people stood again, but this time they were somewhat puzzled at having to go through the exercise a second time. When she asked the audience to do it yet a third time, complaints could be heard, and not nearly as many people stood up to participate.
There were two points to the exercise: One was that change is hard, and the more change you're asked to make, the harder it becomes. The second was that when people turned away from their partners to change, they found themselves facing someone else who was going through exactly the same situation. How many people thought to ask each other for help?
Barnes thinks the ACCME has taken the need for CME providers to ask for help during change to heart. She enumerates the increased availability of ACCME workshops on accreditation, the visibility of ACCME staff at such national meetings as the Alliance for CME annual meeting, the increased availability of useful documentation, and the collegial attitude taken by ACCME staff when asked questions over the telephone, as four good examples of the ACCME's support during a time of change. (For coverage of the ACCME's workshops at the 2001 Alliance for CME Conference, see stories on page 8 and page 43.)
And she respects the difficulty of making changes. “The ACCME will give guidance, but it also wants the providers to do what's best for them. I think this is hard for some providers, who have been in the lock-step model for so long.”
Barnes adds that she thinks there's “a real desire on the part of the ACCME to make its requirements explicit and objective, so there is equity across the system regardless of provider type.”
And it should come as no shock, she says, to discover that the real bottom line is still attitude.
“If people do this purely with the intent of getting accreditation,” says Barnes, “there's not much benefit. Hopefully, getting a clearer idea of the direction of your CME program, and building new relationships with stakeholders, will reap significant benefits to your CME program.”
Achieving Exemplary Results in Essential Area 2: Planning and Evaluation
On the Web: www.PassinAssociates.com
By telephone: (253) 756-1616
In person: The University of Wisconsin Medical School, with the assistance of Steve Passin & Associates, and H.B. Slotnick, PhD, PhD, recently received a six-year accreditation with exemplary ratings in several elements of Essential Area 2. Find out their strategies for success at this conference, April 29 to 30, 2001, at the Westin O'Hare Hotel-Chicago.
The Accreditation Council for Continuing Medical Education
On the Web: www.accme.org
By telephone: (312) 464-2500
In person: The next “Understanding ACCME Accreditation” workshop takes place July 29 to 30 at the Hotel Inter-Continental Chicago. Contact the ACCME for additional information.
The Alliance For Continuing Medical Education
On the Web: www.acme-assn.org
By telephone: (205) 824-1355
In person: The next annual meeting of the Alliance takes place January 30 to February 2, 2002, at Disney's Coronado Springs Resort, Orlando, Fla.
The Alliance for CME also operates e-mail listservs on general CME topics and on best practices. See the Web site for more information.