"My concern is, we do programs we think are terrific, but do they have lasting impact? How do we document [that] we have done a good job changing physician behavior? That's what the folks who fund the programs really want to know."

A familiar question? It sure is. The debate over how to prove outcomes from CME and demonstrate to commercial supporters their ROI or ROE (return on education) has gone on for years, with no solid answers emerging to help providers as they develop programs for the next millennium.

Inside Physicians' Heads But when attendees posed the above question at the AMA's Tenth Annual Conference of the National Task Force on CME Provider/Industry Collaboration held in September, presenter Henry Slotnick, PhD, came up with an unusual answer.

"In documenting CME success in the past, we've looked strictly at whether there has been a clinical change. That's too coarse an observation."

Instead, said Slotnick, professor of neuroscience at the University of North Dakota, providers need to understand how physicians learn. To find out, Slotnick crawled inside doctors' heads--figuratively speaking, of course--and asked what motivated them to seek education. In reporting his findings, Slotnick showed that by understanding the physicians' learning process, providers and industry can develop education that is more attractive to practitioners, and also document its effectiveness.

If you want to get physicians' attention, you need to deal with problems, not topics, Slotnick told providers. Once a physician identifies a problem, he or she goes through a four-stage questioning process.

* Is this a problem for me? A physician first decides whether the problem falls within his practice area, or whether it is something he would refer.

* Does the problem have a solution? Research indicates that doctors underestimate the availability of solutions, Slotnick said, and are likely to bail out if they think no solution exists.

* Are the resources to learn how to solve the problem available?

* Do I want to change my practice? For example, if a physician learns how to handle a difficult case, her colleagues may then start referring all those tough cases to her--and she may not want that.

It's important for providers to know how many attendees are at each stage, Slotnick said, because learning needs vary across the process. When designing a CME activity, providers can tell presenters what percentage of the audience is at stage one--figuring out whether the topic is important to them, versus what percentage is already at stage two--ready to learn the new skill.

Each Step Counts If the answers to all four questions are yes, the physician will then be ready to acquire new knowledge or skills--which certainly could result in changes in clinical behavior. But that change is not the only measuring stick, Slotnick said.

He used the conference as an example, by asking how many of the attendees who had participated in his earlier workshop had showed up at stage one--wondering whether or not they wanted to know about adult learning and its relationship to physicians.

Nine of the 22 attendees responded yes. He then asked how many of those nine had moved on to stage two after the workshop; that is, had decided they wanted to know more about physicians' learning processes. Seven of the nine had moved on.

That movement, Slotnick said, "would not appear as change in clinical practice, but I assert learning took place and movement took place." By making that kind of analysis, Slotnick said, "We can do a better job of documenting the impact of CME." Providers can use such data to show commercial supporters how many people moved from stage one to stage two, and so on.

As for the two people who chose not to seek more information about adult learning--that decision is important, too, Slotnick said. Physicians must decide whether change is justifiable.

For instance, he interviewed an obstetrician who initially wanted to learn how to use pessaries to control incontinence, but changed his mind. At age 62, he felt he wouldn't master the skill well enough before retiring for his patients to benefit.

"Simply showing clinical behavior [change] doesn't mean the decision was suitable," Slotnick said. "That issue needs to be taken into consideration."