• Remember that outcomes objectives are not the same as learning objectives, said Dietze. An outcomes objective might be to measure changes in knowledge and competence, then use those results to judge how well you met the learning objectives and to improve future activities.
  • Detail the practical methods you plan to use (QA/QI and patient safety data, commitment-to-change items in an evaluation at the end of the activity, or an audience-response system to ask pre- and post-activity knowledge, confidence, and competence questions). If you have a multi-component initiative, describe how you plan to assess each component. “Going through these details helps you understand the resources you will need,” said Dietze.
  • Don’t be afraid to explain your statistical analysis plan. If you’re going to use descriptive statistics, lay out whether they’ll be percentages, numbers of participants, or mean scores. If you want to assess changes in your pre- and post-activity knowledge scores, describe the statistical tests you’re going to use, said Dietze. (A brief sketch of what such an analysis might look like appears after this list.)
  • Detail when you plan to deliver what you promised. “A lot of people miss the ‘when,’” said Dietze. If you have a two-year online activity, will you produce your first report after two years, or will there be an interim report?
  • Specify how you will deliver the results report. Will it be in PowerPoint? Word? How long will it be? Will it include an executive summary that crystallizes the results (something Dietze and Viereck highly recommend)? And whom do you plan to share it with? Are there internal audiences, such as a CME committee, faculty, or learners, that could benefit from the outcomes data? Why not share your results with the clinical and CME communities?
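For grantees who want a concrete starting point, here is a minimal sketch of what a simple pre/post analysis plan could look like in Python. The scores are hypothetical, and the paired t-test is an illustrative choice rather than a method prescribed by Dietze; substitute whatever descriptive statistics and tests fit your own data and evaluation design.

```python
# Minimal sketch of a pre/post knowledge-score analysis, using hypothetical data.
# Assumes each learner answered the same 10 knowledge questions before and after
# the activity; the paired t-test is one illustrative option, not a prescription.
from statistics import mean
from scipy import stats

# Hypothetical per-learner scores (number of correct answers out of 10)
pre_scores = [4, 5, 6, 3, 7, 5, 4, 6, 5, 4]
post_scores = [7, 8, 8, 6, 9, 7, 6, 8, 7, 6]

# Descriptive statistics: number of participants, mean scores, percent correct
print(f"Participants: {len(pre_scores)}")
print(f"Mean pre-score:  {mean(pre_scores):.1f} ({mean(pre_scores) * 10:.0f}% correct)")
print(f"Mean post-score: {mean(post_scores):.1f} ({mean(post_scores) * 10:.0f}% correct)")

# Inferential statistics: paired t-test on the pre/post change
t_stat, p_value = stats.ttest_rel(pre_scores, post_scores)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
```

Even a short plan like this, written into the proposal, signals that you have thought through how the numbers in your results report will be produced.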