MedEdPortal: a great resource
MedEdPortal is a repository of online modules and other tools that are rigorously peer-reviewed and suitable for medical and other health professions education. To find out more about this great resource, go to their short video: www.mededportal.org/about
Exam Wrappers: A novel way to review exams
Here’s a new and very interesting tool called “Exam Wrappers” that you can add to your exam review after mid-terms and even finals. It enables students to think more carefully about their studying and learning. It comes from Marsha C. Lovett’s chapter, “Make Exams Worth More Than the Grade” (Chapter 2), in Using Reflection and Metacognition to Improve Student Learning, edited by Matthew Kaplan et al. (Stylus Publishing, Sterling, Virginia, 2013).
This is a technique that engages students in reflection, metacognition (learning to learn) and self-regulated learning. Prof. Lovett’s approach was to “build metacognitive practice around exams” and in so doing satisfy the many constraints that challenge metacognition in a curriculum.
What are Exam Wrappers?
Exam wrappers are short activities that direct students to review their performance (and the instructor’s feedback) on an exam, with an eye toward adapting their future learning. Exam wrappers ask students three kinds of questions: How did they prepare for the exam? What kind of errors did they make on the exam? What could they do differently next time?
Prof. Lovett provides examples in Appendices A1 and A2 of her book. Here is a summary of her work on the three questions above:
1. How did you prepare for the exam?
Benefits of this question:
• Challenges students to confront their study process and the implicit or explicit choices they made about their studying
• Prompts students to ask themselves whether they studied enough, or with enough lead time
• Focusing on diverse study methods (reviewing notes, solving practice problems, rereading the textbook) points out that there are many approaches they can use next time
2. What kinds of errors did you make?
Benefits of this question:
• Challenges students to move beyond marks: with high marks, they tend to be relieved and move on; with low marks, they may leave the “painful event” behind.
• Allows an opportunity to analyze in greater depth, e.g., considering the level of difficulty of the questions they had problems with and looking for patterns in the types of errors.
• Gives them a lexicon for self-assessment, e.g., “Did they read the question carefully? Did they have trouble setting up the problem? Did they fail to understand the concepts involved?” Or “Did they make mistakes on the required math, chemistry, physiology, anatomy, etc.?”
3. How should you study for the next exam?
Benefits of this question:
• Ties responses from #1 and #2 together
• “A key goal of the third type of question is to help students see the association between their study choices and their exam performance so they can better predict what study strategies will be effective in the future.” (Lovett, 2013)
• Asks students to attribute their problems from #2 to some specific study errors, or look back at #1 and #2 and ask how they would specifically prepare differently.
Benefits of exam wrappers:
1. Impinge minimally on class time.
2. Are easily completed by students within the time they are willing to invest.
3. Are easily adaptable. (Faculty can add their own concerns in #2, for example, asking about test anxiety or other issues). Can be used with other types of graded assessments.
4. Are repeatable yet flexible. (can add new questions or change questions slightly to keep things “fresh”)
5. Exercise the key metacognitive skills instructors want their students to learn: assess strengths and weaknesses, identify strategies for improvement, and generate adjustments.
Steps for Exam Wrappers
1. Hand back exams.
2. Assign “Wrapper” with questions.
3. Students complete, either during the exam review, for homework, or online (non-graded but required element). Students can also share study techniques with classmates.
4. Instructor collects and reviews the wrappers to gain new knowledge of student needs and patterns of behavior (e.g., number of hours spent studying).
5. Hand back wrappers, or remind students about them as they might begin studying for another exam.
6. Repeat for subsequent exams (e.g., you can streamline the wrapper for a later exam).
Thanks to the Tomorrow’s Professor Digest for this idea from Prof. Marsha Lovett.
New resource for electrocardiogram interpretation
Queen’s own Dr. Adrian Baranchuk is the Editor of the newly published Atlas of Advanced Electrocardiogram Interpretation. With Contributing Editors Drs. Hoshiar Abdollah, Damian Redfearn, and Christopher Simpson, one could call this “The Queen’s Atlas of ECG”! The atlas is “a practical guide to recognizing and analysing a wide spectrum of cardiac conditions.” There is free access for the next 25 days at http://asandk.com/ecg/. It’s available for PC and Mac.
The atlas provides:
• Tracings, data, descriptions, interpretations, and tips from the expert contributors
• A straightforward and consistent style that encourages logical, step-wise ECG interpretation, as well as rapid recognition based on the study of repeated patterns
• 100 “real world” tracings with contributions from 100 of the world’s leading cardiologists and electrocardiographers
• Cases divided into 12 chapters covering key disorders and abnormalities
• Bibliographic information to facilitate further reading
• Images from each chapter, available to download to your computer for use as teaching and learning aids
A great teaching idea: The 3-2-1 Assignment
Here is a great teaching idea from Dr. Geraldine Van Gyn, professor in the School of Exercise Science at the University of Victoria.
She writes in the e-zine Faculty Focus about the “Purposeful Reading Assignment” or the “3-2-1” assignment.
It goes like this:
Requirement 1: Students read what is assigned, then choose and describe the three most important aspects (concepts, issues, factual information, etc.) of the reading, justifying their choices.
Requirement 2: Students identify two aspects of the reading they don’t understand, and briefly discuss why these confusing aspects interfered with their general understanding of the reading. Although students may identify more than two confusing elements, they must put them in priority order and limit themselves to the two most important ones. Students seldom understand everything in a reading and, knowing that they must complete this part of the assignment, will reflect on their level of understanding of all the reading’s content.
Requirement 3: Students pose a question to the text’s author, the answer to which should go beyond the reading content and does not reflect the areas of confusion in requirement 2. The question reflects students’ curiosity about the topic and reveals what they think are the implications or applications of the reading content. This last requirement lets you know how well students understood the article’s intention.
This would be a great assignment to try in Health Sciences classes. In Meds, perhaps we could modify it so that the students share with their group Requirement 2 and hand in Requirement 3 for feedback. We could use an e-template to complete these and allow faculty to give quick e-feedback.
Prof. Van Gyn reports that in analyzing her mid- and end-of-term feedback, the purposeful 3-2-1 reading report is the element most frequently cited in all courses (mid-term: 72% of all students, n = 549; end of term: 65% of students, n = 513) as being of greatest benefit to the students’ learning.
If you’d like to learn more about 3-2-1, just drop me a line.
Van Gyn, Geraldine. “It’s the Little Assignment with the Big Impact: Reading, Writing, Critical Reflection, and Meaningful Discussion.” Faculty Focus, May 6, 2013.
Case Reports Database
Dr. Kanji Nakatsu shared this resource with us recently. It’s a bank of case reports from BioMed Central, supplemented by the Journal of Medical Case Reports. It is searchable and freely accessible. This is a resource for physicians, but it may also be used in medical education. “By bringing similar case reports together, through the Cases Database, researchers and clinicians can start to look for new knowledge – new associations, new side effects, new thoughts about disease processes, new understandings about the impact of disease on our patients and our communities.”
Access it by going to
Updated Faculty Resources Community Available
The newly-updated Faculty Resources Community is now available in MEdTech Central. This online resource contains great teaching and assessment ideas, highlights of Curriculum Committee, notes and slides from the retreats, and more.
The resource material available includes refresher instructions on the audio-visual equipment in teaching theatres 132 and 032 (including a map of the numbered student microphones), e-learning resources and links to the small group learning community.
This Faculty Resource Community is open to all faculty at the School of Medicine. For more information, please contact Sheila Pinchin (email@example.com) or Theresa Suart (firstname.lastname@example.org).
Using the IDEAL banks of questions for your assessments
Obtaining IDEAL Consortium Questions
Queen’s School of Medicine has joined the IDEAL Consortium, an international assessment item-sharing collaboration among Schools of Medicine. The Consortium has 27 member schools from 11 countries. Queen’s and UBC are currently the only Canadian members.
The IDEAL Restricted Question Bank contains over 20,625 assessment items including 17,109 MCQs, 539 short-answer questions and 461 OSCE stations. Collectively, members contribute about 4,000 new questions to the restricted and non-restricted question banks annually.
Restricted Bank: Please contact your Curricular Coordinator to request sets of restricted bank questions in your subject area or questions on particular topics. (Zdenka Ko for Year 1, Tara Hartman for Year 2, Jane Gordon for Clerkship Rotations and Candace Trott for “C” courses in clerkship.) Restricted bank questions need to be kept secure, so they can only be used on final examinations. A Word document containing the questions (as well as their answers and “item numbers”) will be couriered to you, or you can request that a secure MEdTech community be created for you to share restricted questions with other faculty members in your course.
To use restricted questions on final exams, simply provide your Curricular Coordinator with the item number of each question and the order in which you would like the questions to appear on the final exam. If you are sharing restricted questions via a secure MEdTech community, you can copy and paste your question selections into a Word document and upload it to the Curriculum Coordinator’s folder in the secure community. It is important that the IDEAL restricted bank questions not be emailed except in password-protected Word files. The restricted questions must not be viewed by students except during the writing of final exams.
You can specify edits to any of the IDEAL items – including OSCE stations. If you edit the items yourself, please highlight your edits so that your Curriculum Coordinator can transfer the edits to the local copy of the IDEAL bank.
The old LXR bank contained many duplicate and triplicate questions, so please let your Curriculum Coordinator know the origin of each exam question (IDEAL? LXR? Original? From a colleague?) We especially need to know, for copyright and item submission reasons, if any questions did not originate at Queen’s. Questions that did not originate at Queen’s will be marked, “Do not submit to IDEAL”, but can be stored in the local copy of the IDEAL bank and used on Queen’s exams.
Unrestricted Bank: Unrestricted bank items can be used in online quizzes, in clicker sessions, on midterms etc. Students can have full access to all unrestricted bank questions. Currently the MEdTech team is creating an interface for the unrestricted bank so that faculty members will have full access to the questions. At present, requests for emailed sets of unrestricted bank question sets can be sent to Catherine Isaacs (email@example.com).
Great Health Care Requires Great Medical Educators
Education is not an industrial process; it is a human one.
In the Dec. 10 edition of The Atlantic Monthly, Richard Gunderman, MD, PhD, examines two sets of components of excellence in medical education: curriculum, instructional methods, and assessment techniques on the one hand, and the creativity, commitment, and inspiration of medical educators on the other. He focuses on the critical importance of fostering a generation of medical educators through support of medical education. For the article see
What do p- and R-values mean anyhow? Understanding how to interpret multiple-choice test scores using statistics.
Have you ever wondered whether or not your multiple-choice questions (MCQs) are too easy? The answer can be found in the p-value, or item difficulty: the proportion of students who answered the question correctly. The difficulty of an MCQ can range from 0.00 to 1.00; the higher the p-value, the easier the question. What we should be concerned about are very difficult questions, those with p-values less than 0.3.
Have you ever wondered which questions tripped up students who otherwise performed well on the test overall? The R-value, or item discrimination, looks at the relationship between how well students performed on a question and their total score. Item discrimination indicates how well a question separates students who know the tested material from those who do not. The higher the R-value, the more discriminating the test question. We should review or remove questions with low discrimination values (R-values near or below 0.3); in particular, a negative R-value means that students who did poorly on the test overall did better on that question than students who performed well, which suggests the item may be flawed.
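To make these statistics concrete, here is a minimal Python sketch, not taken from the newsletter or from any Queen’s exam software, showing how item difficulty (the p-value) and item discrimination (the R-value, computed here as the simple point-biserial correlation between each item and the total score) might be calculated from a small, hypothetical 0/1-scored response matrix. Most exam-scoring packages report these numbers automatically; this only illustrates what they mean.

```python
import numpy as np

# Hypothetical data: rows = students, columns = questions; 1 = correct, 0 = incorrect.
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])

total_scores = responses.sum(axis=1)   # each student's total score on the test

# Item difficulty (p-value): proportion of students answering each question correctly.
p_values = responses.mean(axis=0)

# Item discrimination (R-value): correlation between getting an item right and the
# total score (uncorrected point-biserial, i.e. Pearson r between the 0/1 item
# column and the total score, which here still includes the item itself).
r_values = np.array([
    np.corrcoef(responses[:, j], total_scores)[0, 1]
    for j in range(responses.shape[1])
])

for j, (p, r) in enumerate(zip(p_values, r_values), start=1):
    flag = "  <- review" if (p < 0.3 or r < 0.3) else ""
    print(f"Q{j}: difficulty p = {p:.2f}, discrimination R = {r:.2f}{flag}")
```

In this toy data set the last question comes out with a negative R-value: the weakest students got it right more often than the strongest, which is exactly the pattern that flags an item for review.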
Did you Know?
Multiple-choice questions that use words in the stem such as best, most, first, or most correct require higher-level thinking, but they often confuse students when they are ambiguously worded. Our students have struggled lately with ambiguity in the wording of MCQs on RATs and exams, such as “Which is the most likely….” They assume “most likely” means “most common,” whereas the most likely answer could be an uncommon situation. It’s important to word the question clearly so that students are not confused. So, for example, the question could state, “In light of the clinical information provided above, which diagnosis would you make?”
You can also ask students about “most common”, “most concerning”, or “what is the first test you would perform”, etc., but it is always good to anchor these stems by referring to the data presented previously. Then the key is to require them to choose, evaluate, interpret, judge, infer from data, solve problems, and apply principles.
Did you Know?
The Student Assessment Committee has posted several articles, checklists and PowerPoint slides to assist you with Multiple Choice Questions.
For more guidance on writing high-quality multiple-choice questions, refer to MCQ Guidelines and Writing MCQs in School of Medicine Faculty and Staff Resources at:
Queen’s School of Medicine: Faculty and Staff Resources.
Translating students’ comments on course evaluations
Navigating students’ comments can be one of the most challenging aspects of interpreting course evaluations. In an article in Innovative Higher Education, Linda Hodges and Katherine Stanton (2007) suggest using these comments as “windows into the process of student learning and intellectual development” rather than as reviews of “how they have been entertained” by an instructor.
Hodges is Director of the Harold W. McGraw, Jr. Center for Teaching and Learning at Princeton University; Stanton is the center’s assistant director. They point out that sometimes students’ comments stem from “students’ expectations of or prior experiences with college classes” that “entail teachers standing in front of the room ‘telling.’”
For example, is a comment like “I did not learn in this class because the teacher did not teach” evidence of a lack of effective teaching, or evidence that the style of teaching – including lots of team-based work – wasn’t what the student was expecting? Reframing student comments in this light can ultimately help improve teaching, Hodges and Stanton suggest.
“We may see our evaluations less as judgments of our performance and more as insight into our students’ intellectual growth—insight that may engage us in intellectual growth as teachers and scholars.”
Hodges, L.C., and Stanton, K. (2007). “Translating comments on student evaluations into the language of learning.” Innovative Higher Education 31: 279-286.