Everything you need to know about exam question types in our curriculum!


Are all exam questions created equal? Not really: different types of questions test different levels of understanding. In the UGME program, we use a variety of exam questions to assess student learning, broadly classified as multiple-choice questions (MCQs) and short-answer questions (SAQs). Within these broad categories is a range of question types designed to test different levels of cognition, and we use these different types at different points both within courses and across the program.

Based on Bloom’s Taxonomy

Bloom’s taxonomy is a classification system used to define and distinguish different levels of human cognition: thinking, learning, and understanding. The taxonomy was first developed in the 1950s by Benjamin Bloom and colleagues, and was later revised (in 2001, by Anderson and Krathwohl). Bloom’s original version describes six levels of cognition that characterize learners’ thinking skills and abilities: knowledge, comprehension, application, analysis, synthesis, and evaluation. Educators have used Bloom’s taxonomy to inform or guide the development of assessments, such as the construction of MCQs. MCQs are widely used to measure knowledge, comprehension, and application of learning outcomes. Our curriculum uses MCQs in different assessment formats and for different purposes, as described below.

 


You may hear several acronyms and terms about assessment in our UGME program: RATs, MCQs, SAQs, and Key Features. Here is a brief description of each:

Readiness Assessment Tests (RATs)

RATs in our curriculum typically consist of 10-15 multiple-choice questions linked directly to the assigned readings (and/or prior lectures). A RAT focuses on foundational concepts that will be important for the small-group learning (SGL) activities that follow. MCQs on a RAT test mainly for knowledge (i.e., recall of information) and less for application of knowledge. Verbs used in a question stem to test knowledge include: define, list, label, recall, select, name, outline, or match.

Multiple-choice questions (MCQs) on midterms and finals

An MCQ has three components: the stem, the lead-in question, and a set of options consisting of one correct answer and (typically) three distractors (wrong answers). The stem should be directly linked to a learning objective assigned to the course. MCQs used on midterm and final exams often test for comprehension and application of knowledge, going beyond the simple recall typically tested by MCQs on RATs. Some multiple-choice questions may assess simple recall, depending on the learning objectives of the course, but these should be kept to a minimum. Verbs used in the question stem to test comprehension include: predict, estimate, explain, indicate, distinguish, or give examples. Verbs that test application include: solve, compute, illustrate, interpret, demonstrate, or compare.

Short-answer Questions (SAQs)

SAQs are typically composed of a case scenario followed by a prompt requiring a written answer, which can vary in length from one or two words to several sentences. SAQs often test the higher cognitive skills in Bloom’s taxonomy, and final examinations in our curriculum are typically composed of a mix of MCQs and SAQs. Verbs in the question stem that test analysis include: explain, arrange, select, infer, calculate, or distinguish. Verbs such as develop, design, plan, devise, formulate, or generalize test for synthesis, whereas verbs that test evaluation include: argue, assess, estimate, justify, predict, compare, conclude, or defend.

Key Features Questions

Key-feature problems are used by the Medical Council of Canada to assess clinical decision-making skills on the MCCQE Part 1. A key feature is defined as a critical step in the resolution of a clinical problem; key-feature problems consist of a clinical case scenario followed by two or three questions, each testing one or more of those critical steps. While knowledge is important for effective problem solving, the challenge posed by key-feature problems is the application of knowledge to guide clinical decision-making. For each question, the instructions may require selecting however many responses are appropriate to the clinical task being assessed, and the answer key may contain more than one correct response. The development of key-feature problems for clinical decision-making is being piloted in the clerkship curriculum courses this year.

How do we administer our tests?

Queen’s Undergraduate Medical Education has moved to an electronic exam system, ExamSoft, for the administration of midterm and final exams in both pre-clinical and clerkship courses. Medical students no longer write exams on paper; they complete them on laptops. This greatly streamlines the marking of exams, and it means we are no longer managing huge volumes of paper or deciphering student handwriting.

References:

  1. http://www.nmmu.ac.za/cyberhunts/bloom.htm
  2. https://www.utexas.edu/academic/ctl/assessment/iar/students/plan/method/exams-mchoice-bloom.php
  3. http://www.profoundlearning.com/index.html
  4. Page, G., Bordage, G., & Allen, T. (1995). Developing key-feature problems and examinations to assess clinical decision-making skills. Academic Medicine, 70(3).
  5. http://mcc.ca/wp-content/uploads/CDM-Guidelines.pdf
  6. McEwen, L. A. (2011). MCQ Checklist. OHSE, Queen’s University.

