DREAM: Med School Assessments That Are Tried and True

By Jennifer J. Salopek

The Educational Innovation Institute at the Medical College of Georgia at Georgia Regents University, in Augusta, is a new entity, created only three years ago. While newness may bring a blank slate and room for innovation in many things, sometimes you also want the tried-and-true.

As an assistant professor and educational researcher at MCG, Christie Palladino, MD, wanted to assess how well curricula at the new institution were working. “I approached it in a scholarly way, but really struggled to find assessment instruments that had been tested,” she explains. “I was seeing the same instruments over and over again.”

A meeting with colleagues in allied health professions brought the realization that Palladino’s was a common problem. The group discussed assembling a local repository of assessment instruments—then the innovation kicked in.

“We thought, ‘Why not disseminate it more broadly?’” Palladino says.

She approached MedEdPORTAL in November of 2010.

“MedEdPORTAL is the repository for educational resources for medical educators. Its staff had the expertise in creating and curating collections,” she says. “My colleagues and I had expertise in large literature reviews. We could combine all of those skills to create a living, breathing repository.”

They called it DREAM, the Directory and Repository of Educational Assessment Measures. The DREAM collection is intended to serve as the premier database for health sciences education assessment.

One critical issue was bringing the resources together in an accessible form. Therefore, Palladino and her MedEdPORTAL counterpart, John Nash, decided against including instruments that are proprietary or carry a fee. “Although that means the collection isn’t totally comprehensive, we wanted high-quality material that could be instantly downloaded,” she says.

Aha, high quality. How is that ensured? By several means: There is a standard template for submissions that includes a set of inclusion criteria. Instruments must have been published in at least two scholarly publications; must have some validity evidence; and must have been used at least once in a health professions sample in the past 10 years. Many of the tools were initially developed for use in other disciplines, Palladino explains.

Next, each submission is reviewed by a credible volunteer, who crafts a 1,000- to 1,200-word critical analysis of the instrument and its application to the health sciences. The benefit to the volunteer author is that the analysis is treated as a new piece of scholarship that goes through the standard MedEdPORTAL peer-review process.

To populate the collection initially, Palladino and her colleagues searched and reviewed 2,300 instruments. Almost half were in use at a single institution and lacked validation. Validation is critical, says Palladino, to answering the questions, “Am I measuring what I intended to measure?” and “Will this assessment identify the students who need remediation?”

“At the end of the day, people are making big decisions with these assessments. There can be high stakes. Professors’ feedback can influence medical students’ future path, so we’re shaping them by how we assess them. Assessment really drives learning,” Palladino says.

The DREAM Collection goes live in October. Each instrument is contained within a Critical Synthesis Package that includes the instrument itself, supplementary materials, and the critical analysis. The collection will launch with about 100 instruments; the curators are constantly inviting and reviewing new ones, and hope eventually to have 300 to 400.

“We hope that as we do go live, instrument developers will start contacting us,” says Palladino. “In the meanwhile, we have gotten really good at begging.

“We really believe in this and what it can do.”

—Jennifer J. Salopek is managing editor of Wing of Zock. She can be reached at jsalopek@aamc.org.
