Until recently, the Internet has been the Wild West of the CME world. Lots of providers have stampeded the Web with online courses, but little has been done to measure their effectiveness in increasing physician knowledge or changing behaviors.

That was the impetus behind the Study of Continuing Online Physician Education: Physician Knowledge, Attitudes and Reflections on Practice (SCORE), conducted by the CME department of the University of Alabama School of Medicine, Birmingham, in collaboration with CECity, which develops online educational technologies. “There was very little data on the rigorous evaluation of online CME,” says Linda Casebeer, PhD, UAB's associate director of CME and principal investigator of the study.

Funded by the Merck Company Foundation, the SCORE project developed a standardized evaluation instrument that it applied to 30 online CME courses involving 1,800 physicians in numerous areas of practice. Participants were tested immediately before the online course, immediately after it, and again 30 days later; the assessments also asked why they took the course and what changes they intended to make in their practices as a result of the knowledge gained.

“We're still analyzing the change data, because it's clinically specific to each topic area, but we found overall that there were big differences in the behaviors people thought they were going to change at the three different points in time,” says Casebeer. “Immediately before and after the course, they intended to make more changes than they actually reported 30 days later,” she says. “Consistent with other literature about educational interventions that are intended to change physician behavior, we found that they encountered obstacles such as a lack of patient adherence that interfered with what they thought they could change. At the 30-day post-test, they tended to focus behavior change on areas they could control.”

According to Robert Kristofco, associate professor and director of CME at UAB, there were several other interesting findings from the study, which won the 2004 William Campbell Felch/Wyeth Award for Research in CME at the Alliance for CME Annual Conference in January. In addition to behavior-change data, the study found that “a fair percentage of people thought the course was too easy, which indicates that online courses need to be better aligned with the educational needs of the target audience,” he says.

Another important outcome was the demonstration that the evaluation and assessment of online CME can be standardized, regardless of whether the course is for orthopedic surgeons or primary care physicians. “That by itself should encourage people to understand that online measurement across the spectrum is conceivable and doable,” says Kristofco. In fact, adds Casebeer, UAB would be glad to share the templates used in the study with others. “If more people use them, that would only increase the knowledge base. We could compare our results with theirs.”

Another course-design-related result: Case-based courses produced more positive effects than text-based courses. “That's also consistent with other sorts of courses,” says Casebeer. “Focusing on patients and cases helps physicians translate what they learn into practice.” This result should be taken to heart by those who are jumping on the bandwagon and repurposing live events for their Web sites, she adds. “What we're seeing is that many are translating a live CME activity into a didactic, text-based format. It's important to redesign the material for the Web — participants in the study said that interactivity was one of the most valuable and important aspects of Web-based education, and in many cases that's not what they're getting.”

“When we asked physicians to rate the most important characteristic that drives their decision to take an online course, they said it is the quality of the content that drives their choice, not CME credit or faculty, although these things are important too,” says Kristofco. And quality, says Casebeer, has to do with the credibility of the content. “This validates some of the other research we've done on understanding physician use of the Internet,” adds Kristofco. “They want to find the answer to a specific question, and they want to be sure they can rely on the information they get.” Fifty-five percent said they selected a course because they needed to update their knowledge in that area, compared with just 14 percent each who chose a course because they needed CME credit or because they had a patient need in that area.