Assessment and Evaluation
Queen’s PGME is emerging as a national leader in assessment and evaluation. We share some of the exciting activities that have occurred over the past year.
Assessment, Promotion, and Appeals Policy
Recognizing the need for policy to align with and support our move to a competency-based model of residency education, we have initiated a policy review process. Policies related to progress, promotion, and time to completion are particular foci of this review.
Moving Competency-based Assessment Forward
The PGME office hosted a well-attended Assessment and Evaluation Retreat in January of this year. We welcomed Dr. Farhan Bhanji, Associate Director of Assessment at the RCPSC, as our guest speaker. Dean Reznick opened the event by sharing his vision of Queen’s PGME becoming the first fully competency-based Canadian institution. In May, a team of education leaders from Queen’s met with members of the RCPSC to showcase Queen’s readiness to take up the challenge of moving to competency-based PGME. The RCPSC members were very impressed with the Queen’s Rubric Bank. In a side conversation with Associate Dean Dr. Ross Walker, Dr. Jason Frank lamented that the theme for ICRE 2014 was already set, but suggested that rubrics could be an exciting theme for ICRE 2015. Dr. Frank’s enthusiasm for Queen’s assessment innovation was evident in his invitation to Dr. Leslie Flynn, our Vice-Dean, Education, to present the Queen’s Rubric Bank at the June meeting of the RCPSC Clinical Educators group. Members of that audience were equally impressed; one member emailed Dr. Laura McEwen, Director, Assessment and Evaluation, during Dr. Flynn’s presentation to request permission to share the resource handout with members of her own institution.
Dr. Ingrid Harle, program director for Palliative Care, has been working with Dr. Laura McEwen to develop a “Weekend On-call Assessment” rubric. Once piloted, this assessment innovation will likely interest other program directors looking to capture residents’ on-call performance.
Drs. Bob Connelly, Amy Acker, and Laura McEwen in Pediatrics are introducing “Core Activity Rubrics” (CARs). CARs represent core, high-frequency activities that become the focus of assessment in a particular clinical context. Organized by graduated responsibility, Junior (PGY1&2) and Senior (PGY3&4), these assessment tools differ from EPAs in that they capture smaller components of EPAs and do not always involve formal entrustment decisions. Dr. Sue Chamberlain in Obstetrics & Gynecology has adapted the concept of CARs to her clinical context and is busy developing innovative assessment tools for use there.
Over the past year, Dr. Stephanie Baxter, program director for Ophthalmology, has been working with Dr. Laura McEwen to re-envisage resident assessment in her domain. In addition to adapted patient encounter cards and Multi-source and Patient Feedback rubrics, they have developed a suite of procedure-specific rubrics for ophthalmology and a peer assessment rubric designed to elicit feedback from residents about their peers as colleagues.
Dr. Sean Taylor, program director for Neurology, is the most recent adopter of rubrics. He plans to pilot the Consult Encounter Card, Multi-source Feedback, Patient Feedback, and Resident Teaching Feedback rubrics in his program.
Technology Enhanced Competency-based Assessment
Educational leaders continue to work closely with members of the Education Technology Unit to adapt existing functionalities and develop additional ones to support PGME’s move to CBME. Work on the electronic rubric system continues. A major focus over the past year has been developing summary report formats. Once realized, the system will enable users to monitor resident progress over time and across contexts (e.g., inpatient, outpatient) and to automate the generation of ITER-style reports. Development time has been scheduled for Fall 2014 to finalize the system.
Family Medicine (FM) continues to lead the nation in competency-based PGME. Over the past year, our FM program launched its “Entrustable Professional Activities” (EPAs) Field Note (FN) system. The program wrote 35 EPAs to capture postgraduate FM core competencies (e.g., well baby and child care). This assessment innovation allows assessors to immediately access descriptions of three levels of performance for the phase of the clinical encounter (e.g., history, physical) about which they wish to write an FN. In effect, the system provides users with a frame of reference to guide their assessment of the nature of supervision (e.g., close, minimal, ready for independence) they feel a resident requires based on directly observed performance.
Competency-based Assessment Research
According to the Social Sciences and Humanities Research Council (SSHRC), a program of research is a sustained research enterprise, involving multiple projects shaped by broad objectives, that is collaborative in effort and emergent in nature. Our work on the Queen’s Rubric Bank fits SSHRC’s definition of a program of research. Over the past year, a team of researchers has worked to build a validity argument for the rubric bank. Multiple projects contribute to this initiative, including the Multi-source Feedback study, the Surgical Procedure rubric study currently in progress, the Pediatric Case Studies underway, and an emergent collaboration with McGill to explore the transferability of rubrics outside the Queen’s context.
In terms of sharing our assessment innovations, we are happy to report that an article describing the Family Medicine Portfolio Assessment & Support System (PASS) will be published in Academic Medicine. An article about our Multi-source Feedback rubric is also in preparation. Finally, a team of educators will facilitate a pre-conference workshop on CARs (Core Activity Rubrics) at ICRE 2014 in Toronto this Fall.
Dr. Laura McEwen is the Director, Assessment and Evaluation for Postgraduate Medical Education at Queen's University. She can be contacted at email@example.com.