Evaluating the Student Experience: Assessing satisfaction is important, but not enough

“Universities are centres of learning, not teaching”

These were the words, uttered many years ago, by a former professor and teacher in response to some very demurely and deferentially expressed comments about the quality of lectures being provided in a particular medical school course. The message, directed to me and a couple of my classmates, was pretty clear. The university and faculty would provide opportunities to learn, in whatever manner they felt appropriate. It was not for us, as mere students and consumers, to question the methods. The responsibility for our education was ours.

In fact, in recent discussions with a number of my medical school contemporaries who I’m fortunate to meet with regularly, none of us could recall, during our four years of medical school, ever being asked for feedback of any kind about our educational program. If such processes existed, either internal or external to the school, they were largely invisible to the students of that time. This was certainly not unique to our school. For our generation, medical education was very much a “take it or leave it” proposition.  

This is not to say we didn’t get excellent teaching, role modelling and mentorship. We certainly did, and many of us found our inspiration for education in those early experiences. It’s also almost certainly true that many of the teachers of that time quietly observed and responded to the impact of their methods on their learners. However, the culture of the day simply did not provide methods by which feedback on the student experience could be collected and analyzed.

This rather parochial approach was not exclusive to medical education. Patients of the past were rarely, if ever, surveyed for feedback about the quality of care they received from institutions or individual physicians. Corporations and businesses largely allowed the public to “vote with their feet”. If the product wasn’t good, people wouldn’t buy it, or would simply walk away.

Clearly, things have changed.

In the business world, “Consumer Satisfaction” is an industry in itself. Successful businesses aggressively seek out customer feedback because they have learned that responding to real or even perceived needs drives future spending. IBM has taken this a step further. Rather than simply asking questions, they are building and offering services that track consumer behaviour and provide that information to service and product providers.

In health care, knowledge of the patient experience is now considered essential to a well-run institution. Hospitals are expected, through accrediting processes, to actively seek out patient perspectives.

The Agency for Healthcare Research and Quality operates within the U.S. Department of Health and Human Services. Its mission is “to produce evidence to make health care safer, higher quality, more accessible, equitable, and affordable”. To quote from their site:

“Understanding patient experience is a key step in moving toward patient-centered care. By looking at various aspects of patient experience, one can assess the extent to which patients are receiving care that is respectful of and responsive to individual patient preferences, needs and values. Evaluating patient experience along with other components such as effectiveness and safety of care is essential to providing a complete picture of health care quality.” (https://www.ahrq.gov/cahps/about-cahps/patient-experience/index.html)

They make an important distinction between patient satisfaction and the patient experience. Satisfaction is a subjective impression of a patient’s interaction with an institution or individual, and is largely based on whether their personal expectations were met. The patient experience relates to gathering information, available only through patient reporting, that is relevant to determining whether certain institutional goals are being achieved.

A person test driving a new automobile, for example, is able to report on both the driving experience (acceleration, braking, ease of handling, visibility, etc.) and their personal satisfaction (enjoyment, comfort, excitement) driving the car. To those designing and building the car, evaluating the driving experience allows them to determine if the equipment and concepts they developed are working as expected. Evaluating driver satisfaction determines whether the consumer is getting what was expected from the car, which may not be obvious to the designers. Both are relevant to success. Both are certainly relevant to the likelihood that the consumer will purchase the car.

In medical education, the value of student feedback is widely appreciated, and schools go to considerable effort and expense to collect it. In fact, the systematic collection of feedback is mandated by accreditation standards, and the evidence required to establish compliance with those standards is based largely on student feedback. The distinction between measuring the student experience and measuring student satisfaction is relevant here as well; both are important goals. Systematic Program Evaluation must encompass both.

At Queen’s, we recognize that many goals of our educational program can only be fully assessed with the perspective of those actually experiencing and living the process. We also recognize that a full picture only emerges if many points of feedback are provided. We have therefore put in place many and varied opportunities for students to provide both their personal perspectives and objective observations.

After each course, students are invited (and expected) to provide feedback consisting of responses to questions exploring pre-determined educational objectives, along with narrative commentary in which they can elaborate or explore other aspects. Those end-of-course evaluations also provide an opportunity to give similar feedback regarding the effectiveness of teaching faculty.

We receive and carefully review the results of course-related examinations undertaken by our students, not only to gauge their learning, but also to gauge the effectiveness of the teaching and learning opportunities provided.

We anticipate and review closely the results of external examinations undertaken by our students, such as the Medical Council of Canada Part 1 and 2 examinations, and all National Board of Medical Examiners tests we utilize. These provide valuable comparators to other institutions and, to a limited extent, further feedback about our teaching effectiveness.

The Canadian Graduation Questionnaire is completed annually by all graduating medical students and provides a comprehensive review of all aspects of their educational experience. We review it in great detail, and many aspects of the CGQ are incorporated into the accreditation process.


We have established a Program Evaluation Committee that, for the past few years, has been under the leadership of Dr. John Drover. That group collects, collates and analyzes data from a variety of sources to provide an overarching analysis of our performance relative to our programmatic goals. The PEC recently released a comprehensive report, which has been passed along to the Curriculum Committee for analysis and action. I am very grateful to Dr. Drover, who has generously and effectively provided PEC leadership. He is now passing that role along to Dr. Cherie Jones as she assumes her role as Assistant Dean, Academic Affairs and Programmatic Quality Assurance.

We have also developed a number of more informal ways by which students can provide feedback.

We meet regularly with student leadership and curricular leads to get “on the fly” feedback about courses as they are taught. This often leads us to make adjustments or provide supplemental content even before the course is completed.

We provide numerous ways in which students can report personal distress or incidents of mistreatment at any point during their medical school experience. These range from direct contact with selected faculty members or our external counselor (who can be contacted directly and is completely segregated from faculty and assessment), to submission of reports that can be embargoed until a mutually agreed-upon time. All of these are outlined in our policies and accessible through a convenient “Red Button” on MedTech.

I have found “Town Halls” to be very valuable sources of feedback on all aspects of the MD program. These are held at least once per term with each class and consist of a few “current events” items I provide, followed by “open mike” time when students are invited to bring forward any commentary or questions they may have, about any aspect of the program. The issues that emerge and dialogue among students in attendance can be highly revealing and have certainly provoked new directions and changes over the years.

Recognizing that not all students are comfortable with speaking out, or may not wish to be identified as they raise sensitive issues, a confidential portal was established on MedTech a number of years ago. Students are able to provide their commentary in a completely anonymous fashion if they wish. My commitment is to read and consider (but not necessarily act on) all commentary provided, and to respond personally if students choose to identify themselves. To date, I have received almost 500 such submissions, about 70% of which are provided anonymously. The commentary has been thoughtfully provided and has spanned all aspects of our program and learning environment. Importantly, it has often brought to light issues that had not previously emerged in any other way.

In all these ways, student feedback has become a continuing, multi-faceted component of our school and, more broadly, our learning environment. It goes beyond being a mechanical, mandated exercise and data collection. It is embedded and cultural. It is what we do. It is who we are. 


Med Students’ activities extend beyond the classroom

It’s that time of the new year when the winter doldrums can set in – weather and routine can weigh everyone down. Along with that, there’s that old cliché about “all work and no play”. There’s little risk of our medical students being thought of as anything approaching dull, and they provide great ideas for how to beat the winter blahs. In addition to their full class and study load, they make time for a wide variety of extra-curricular activities for fun, recreation and community involvement.

Aesculapian Society President Rae Woodhouse recently shared some highlights of these endeavours:

In early January, 68 pre-clerks attended the annual MedGames in Montreal and placed 2nd among all teams from outside Quebec. Sponsored by the Canadian Federation of Medical Students (CFMS), MedGames brings together medical students from across the country for friendly sports competition and networking.

Thirty-one second-year students competed in BEWICS, the annual Queen’s intramural sports competition, which features a variety of self-proclaimed “quirky” sports such as water volleyball and rugby basketball. The QMed team placed third overall for competitiveness and spirit.

The Class of 2021 Class Project Committee hosted Queen’s first-ever Scholars at Risk talk (see more on this here).

Pre-clerk students recently competed in Ottawa’s Winterlude Ice Dragon Boat competition, and about 30 went on the annual ski trip to Mont-Tremblant two weekends ago.

And if ice dragon boating and skiing weren’t enough of a challenge, about 45 students from across the four years recently spent a couple of hours learning the basics of curling from a fourth-year student. This is the fourth time for this event!

For Wellness Month, the Wellness Committee put together a month of activities, with each week having a theme: social, physical, mental and nutritional wellness. During physical wellness week, 40 pre-clerks did a CrossFit class and 20 did a spin class taught by the AS Wellness Officer.

The 2nd annual Jacalyn Duffin Health and Humanities conference happened recently and was very well received.

This past weekend, 20 students went to NYC to learn about the history of medicine, led by Dr. Jenna Healey (Hannah History of Medicine Chair) and the What Happened In Medicine Historical Society. 

And over 100 mentorship group members attended trivia at the Grad Club. (Take note of that, it could be a future trivia question!)


Singing the praises of learning objectives

This past Sunday afternoon, I had the pleasure of attending the Kingston Symphony’s matinee performance of Gene Kelly: A Life in Music at the Grand Theatre.  The show featured clips from Kelly’s most memorable performances, with live musical accompaniment by the symphony, under the direction of Evan Mitchell.

Throughout the show, Kelly’s wife and biographer, Patricia Ward Kelly, shared anecdotes and Kelly’s own insights into his choreography and performances.

She talked about the work he put into creating dances, painstakingly writing out the choreography plan, before working with his fellow performers to perfect the dances themselves. “He didn’t just show up and wiggle around on the stage,” she said.

Through my educational developer lens, I instantly compared this to the framework provided by well-written learning objectives. Objectives focus teaching and learning plans, and contribute to authentic assessment.

Yes, this is another blog about learning objectives.

In the abstract, learning objectives seem like just another box on a checklist or hoop to jump through. Used as intended, however, they are signposts that guide learning and teaching plans effectively—whether for a class or a single person—the same way Kelly’s planning delivered award-winning and inspiring choreography.

Yes, there’s a “gold standard” for writing objectives (that I’ve written about previously here). And there are verbs to use—and ones to avoid—and if it doesn’t come naturally to you to think this way, it can be pretty tedious.

What it’s really about is planning: knowing what you’re setting out to do. If you have an objective—a goal—then you can make your plan and communicate it to others effectively.

Well-crafted objectives also simplify assessment, because it’s very clear what you have to measure at the end of the lesson, course, or program.

If you say, “I’m going to get better at taking patient histories” – what does that mean? What does “better” look like? If it means, “I’m going to note down details, or I’m going to ask specific questions, or I’m going to listen more than I have been, or interrupt less…” then you know what you need to work on. You know what the focus needs to be, whether you’re a learner or a teacher.

Eventually, you’ll be able to do a history without thinking things through so deliberately – once you’ve achieved fluidity in that skill. But before it becomes a habit, you need a plan, a checklist: am I hitting all the boxes? Not just “be better”.

For example, one of my plans in 2018 was to read more books that weren’t about medical education and weren’t related to my PhD coursework. “Read more for fun.” That was it. My objective was pretty vague and, as a result, I didn’t create a workable plan. “Read more” didn’t get me very far. I read parts of eight non-work-related and non-course-related books. And three of those were cookbooks.

I set a more specific objective for 2019 that I would read more by spending five minutes every morning before I left for work reading something from my “recreational” “to be read” book stack (mountain).

I’ve finished two books, which is already a 200% improvement over last year. That specificity can make a difference.

And that’s really all objectives are: an outcome statement to focus your plan.

And that’s why we highlight objectives in our competency framework. It’s why we map things to them—learning events, assessments, EPAs—so we can be consistent and everybody knows what the plan is.

How much detail do you need in your objectives? This depends on how granularly you need to communicate your goals in order to be effective.

For his iconic Singin’ in the Rain, Gene Kelly had to map out the location of each of the puddles. His plan needed to be that detailed to get it right.

If you’re wrestling with learning objectives and how these relate to your teaching, give me a call.


Residency Match Day 2019: What our students are experiencing, and how to help them get through it

If life were a roller coaster, our fourth year students have, for the past few months, been on quite a wild ride, slowly rumbling upward, gradually ascending to the summit, stopping for a moment as they stare downward to a distant, small landing point, readying themselves for a rapid and rather scary descent.

The process by which learners transition from undergraduate to postgraduate medical education has evolved into a rather jarring and extremely stressful experience (don’t get me started – a subject for another blog/rant). It has required them not simply to consider what specialties are best suited to their interests and skills, but to engage in an application process that requires strategic selection of elective experiences, preparation of voluminous documents, meeting multiple deadlines (twelve, no less), and commitment of personal time and expense to travel and interviewing that, for many, spans the country in the midst of the Canadian winter.

This year, the roller coaster reaches its summit at 12:00 noon on February 26th. The much anticipated Residency “Match Day” is when all fourth year medical students in Canada learn which postgraduate program they will be entering. By approximately 12:00:05 that day, all students will know their fate.  As you can imagine, there will be much anticipation and anxiety leading up to the release.  For most (hopefully all), the roller coaster ride will end with the exhilaration and satisfaction of having successfully overcome the process. For a few (and hopefully none), it will bring a realization that their efforts to date have not been successful, that their ride is not yet over, and they have to begin again. They will be profoundly disappointed, they will be afraid, they will be confused. They will need the understanding and help of the faculty who are currently supervising their training, and much help from our Student Affairs staff.    

This year, we are again prepared to provide all necessary supports, but there are a few changes to the process which I’d like to clarify for both students and the faculty who will be supervising them that day:

  • Unlike previous years, our Undergraduate Office will not automatically receive match results the day before the full release. However, students have the option of directing CaRMS to release their results the day before (February 25th) if they fail to match. They can do so by going into the CaRMS website and providing the appropriate permission.
  • Any unmatched students who have allowed early release will be contacted directly by me to notify them of the result. This is for three purposes:
    • to arrange for immediate release from clinical duties
    • to allow the student some time to prepare for the release moment the following day when most of their classmates will be hearing positive results
    • to arrange for the student to meet our student counselors who will provide personal support and begin the process for re-application through the second iteration of the residency match. 
  • Unmatched students who did not opt to provide early release will similarly be contacted and offered the same support and services after we get their results on match day.
  • Because we may not have full information in advance, we have decided to release all students from clinical obligations beginning at noon on match day until the following morning.

I’d also like to remind all faculty supervising our fourth year students on or around match day to anticipate that your student will be distracted. Please ensure your student is able to review the results at noon. If you sense he or she is disappointed with the result, please be advised that the student counselors and I are standing by that day to help any student deal with the situation and provide support.

Fortunately, we have an outstanding Student Affairs team which has been working hard to guide the students through the career exploration and match process, and will be standing by to provide support for match day and beyond.


Dr. Renee Fitzpatrick, Assistant Dean, Student Affairs, rf6@queensu.ca

Dr. Erin Beattie, Careers Counselor, ebeattie@queensu.ca
Dr. Josh Lakoff, Careers Counselor, jml7@queensu.ca
Dr. Mike McMullen, Careers Counselor, Michael.mcmullen@kingstonhsc.ca
Erin Meyer, Assistant to Directors, Student Affairs
Lynne Ozanne, Assistant to Directors, Student Affairs

The team can be accessed through our Student Affairs office at learnerwellness@queensu.ca, or 613-533-6000 x78451.

Thanks for your consideration, and please feel free to get in touch with me or any of the Student Affairs Team if you have questions or concerns about Match Day or beyond.
