Author: Anthony Sanfilippo
A Brief History of Walls
Are walls effective? As we’re all aware, this seemingly innocent question has become a focus of considerable controversy for our neighbours to the south. Of course, it’s not about the sort of walls that separate rooms of your house, or the barriers around your property that deter trespassers and prevent your dog from molesting your neighbour’s flower bed. Rather it’s about massive barricades erected by political leaders to prevent or control the movement of large populations of people at borders. As it happens, there’s a rather interesting and intriguing history of such structures, both real and mythical.
Publius Aelius Hadrianus Augustus (76-138 AD) ruled when the Roman Empire was at its peak and is considered by many historians to be one of the “good emperors”. He seemed less interested in further expansion than in consolidating and securing his already vast empire. As part of that approach, he commissioned the building of a wall to define and secure the northernmost extent of the empire. Construction of Hadrian’s Wall began in AD 122. The wall is composed mostly of stone, is about 10 feet wide, and rises 10 to 20 feet in height. It connects a series of fortifications located every 5 (Roman) miles. It runs about 73 miles, from the banks of the River Tyne near the North Sea in the east to the Solway Firth on the Irish Sea in the west. It required a garrison of about 1,500 men and was intended to prevent the “barbarians” (ancient Britons and Picts) from troubling Roman Britain.
Hadrian’s successor, Antoninus Pius, seemed to like the concept but felt the boundary should be expanded and so, beginning in AD 142, constructed a second wall about 100 miles to the north. The Antonine Wall was 40 miles in length. Despite the wall, Antoninus was unable to contain the northern tribes, and subsequent emperors abandoned his wall and re-occupied Hadrian’s Wall.
Today, Hadrian’s Wall is a tourist destination. It was declared a World Heritage Site in 1987, but remains unguarded. Tourists commonly climb and stand on the wall, although this is not encouraged for fear of damage to the historic structure.
The Walls of Troy
Troy was an ancient city located on the northwest coast of Turkey.
Archeological research of that site has revealed that it has been inhabited since about 3000 BC. Dutch researcher Gert Jan van Wijngaarden notes in a chapter of “Troy: City, Homer and Turkey” (University of Amsterdam, 2013) that there are at least ten settlements layered on top of each other.
It is not clear whether the ten-year siege by Greeks led by King Agamemnon, described so famously in Homer’s Iliad, is wholly or even partially true, but both the legend and the archaeological evidence indicate that the city was, at one time, surrounded by a rather impressive defensive wall. Van Wijngaarden notes that deep under the surface there is evidence of a “small city surrounded by a defensive wall of unworked stone.” In the period after 2550 BC, the city “was considerably enlarged and furnished with a massive defensive wall made of cut blocks of stone and rectangular clay bricks”.
The legend, of course, indicates that the Trojans were able to hold out for ten years, but the wall was eventually overcome not by force but by clever deception: Ulysses’ famous “Trojan Horse”.
The Walls of Babylon
Babylon was a city and city-state located in Mesopotamia and a dominant presence in the world for over twelve centuries, ending about 600 BC. It was a key commercial and cultural centre and it is believed that, at various times, Babylon was the largest city in the world, and perhaps the first with a population exceeding 250,000.
A prominent feature of Babylon was its extensive walls. Various rulers added successively to the work of their predecessors. Nebuchadrezzar II surpassed most by fortifying the existing double wall and actually adding a third. He also added a separate wall north of the city between the Euphrates and Tigris rivers. Considered to be over 100 feet high at points and extending 41 miles, the walls were remarkable for both their sheer magnitude and their artistic features, notably the associated “hanging gardens”, which are considered one of the “Seven Wonders of the Ancient World”.
Extensive efforts have been made to excavate various components of the ancient city, which has been partially reconstructed as a historic and tourist site. Unfortunately, the reconstruction has been damaged by the development of oil pipelines and military conflicts. In April 2006, American Colonel John Coleman, former Chief of Staff for the 1st Marine Expeditionary Force, issued an apology for the damage done by military personnel under his command.
The Great Wall of China
Perhaps the most famous extant wall in the world was built to protect the then-northern border of China from invasion by various nomadic tribes. The “Great Wall” was actually built in portions over several centuries, beginning in the 7th century BC, and was finally enlarged and united into a single structure with embedded towers and fortifications. The main construction of the existing wall dates to the Ming Dynasty (1368-1644).
In addition to its defensive purpose, the wall also had a border control function, controlling immigration and serving as a tariff collection station for goods being transported along the “Silk Road” between eastern and western markets.
It extends 21,196 km, making it clearly the most extensive wall ever constructed. Whether it is the only man-made structure visible from space is a point of contention. There has never actually been a recorded “sighting” from space, although a Chinese astronaut aboard the space station claims to have photographed it using high-resolution equipment. What is clear is that it is a UNESCO World Heritage Site and a symbol of modern China. Although many portions of the wall are in disrepair and eroding, it remains an extremely popular tourist attraction, arguably the world’s most sought-after selfie opportunity.
The Berlin Wall
A more contemporary example is the Berlin Wall, which physically divided that city between 1961 and 1989. Its history is both fascinating and instructive.
After World War II, the Potsdam Agreement determined that the victorious allies would divide Germany into four zones of occupation controlled by the United States, the United Kingdom, France and the Soviet Union. The German capital, Berlin, was the centre of administrative control of all four powers and so was similarly divided into four sectors. However, Berlin was entirely within the Soviet controlled portion of former Germany. Within a short period of time, political tensions mounted between the Soviets and the other three nations, largely related to the Soviets’ reluctance to agree to the Marshall Plan which called for the reconstruction, self-governance and economic support of post-war Germany. The United States, United Kingdom and France decided to proceed nonetheless, uniting their portions into a single country which came to be called West Germany (officially, the Federal Republic of Germany), with a capital located in Bonn. East Germany (known as the German Democratic Republic) emerged as a separate and Soviet controlled state, with its capital in Berlin. This left Berlin under divided governance but entirely within a separate and rather unfriendly state.
East Germans began to use West Berlin as a means to defect to western countries. It is estimated that 3.5 million circumvented emigration regulations by simply crossing into West Berlin and then on to West Germany and other countries. To prevent this exodus, the GDR (East German) leadership constructed a concrete, militarized wall essentially separating and isolating West Berlin within East Germany. During the time it was in place, over 100,000 people attempted to escape and about 5,000 succeeded in doing so. They were taking serious risks. According to the Centre for Contemporary History, a research institute concentrating on recent European history, at least 140 people are known to have been killed attempting to cross the wall, ranging from a one-year-old child to an 80-year-old woman. Most believe the number to be considerably higher.
Eventually, bowing to anti-communist sentiments in neighbouring countries and civil unrest, the East German government lifted restrictions on movement within Berlin in November of 1989, which led to open and euphoric celebration. People began chipping away parts of the wall until the government removed what was left of it. Germany officially became re-unified on October 3, 1990.
Today, only small segments of the wall remain, including “Checkpoint Charlie”, its best known militarized crossing point. The Berlin wall is seen as a failed attempt by a government to impose its will on its citizens. Because it is so recent in our collective memory and so well documented, it has become a powerful image of oppression and courageous defiance. It too has become a popular tourist destination.
“The Wall” (Game of Thrones version)
The most famous albeit imaginary wall of our time no doubt comes from “Game of Thrones”, a hugely popular HBO series based on the fantasy novels of George R.R. Martin. A key feature is “The Wall”, a massive fortified structure composed of solid ice stretching across the northern border of the “Seven Kingdoms”. It is intended to provide protection from the various miscreants beyond, including “Wildlings” and a wandering army of frozen zombies referred to as the “White Walkers”.
Seemingly inspired by Hadrian’s Wall, this frozen barricade stretches from coast to coast, has fortifications along the way, and is manned by a garrison of exiled misfits referred to as the “Night’s Watch”. Apparently, Wildlings and White Walkers don’t swim or paddle. In any case, the wall has held up for millennia but, guess what happened at the end of last season?
(SPOILER ALERT: stop reading if you’re catching up on the series).
It comes down! Courtesy of a resuscitated and demonically-possessed, fire-breathing dragon, no less! We’ll have to wait until next season to see if it becomes a tourist attraction.
And so, what does all this teach us about massive walls (real or imaginary) intended to separate populations of people? What themes and lessons emerge?
- They don’t work. People (even zombies) are smarter than walls, and are very capable of finding ways to overcome them. This is particularly true of people who are seeking better lives for themselves or their families. Walls are static structures that can be overcome by imagination, determination and technology.
- Walls are hugely symbolic. They serve as a very visible expression of the values and priorities of those who construct them. The fences around our homes may not actually prevent a determined person from entering our property, but they certainly clarify for all the world that uninvited folks are unwelcome.
- They endure over time as artefacts, searched out and studied by historians and archeologists. They express and expose for posterity the true, unvarnished values and motives of those who constructed them. This persists long after they stop providing their original, intended purpose.
- They seem to serve as ideal, although expensive, tourist attractions.
If the planned wall does get built, I can’t help but wonder how future generations will interpret the existence of a massive barricade on the southern border of a nation that also erected this other symbol at its major eastern port, proudly declaring to the world, “Give me your tired, your poor, your huddled masses yearning to breathe free.”
Service Before Self: The Legacy of George H.W. Bush
I’ve always liked George Herbert Walker Bush.
I realize, as I write those words, that it’s somewhat inappropriate and maybe even a little pretentious to use the term “liked” in reference to a former President of the United States whom I never met or knew personally. It implies a familiarity I certainly can’t claim. Words like “respected” or “admired” might be more suitable, and are certainly applicable. But, in truth, “liked” is what comes immediately to mind. So, why is that? I think it’s because what has resonated with me as I’ve watched and read the various tributes since his passing a couple of weeks ago, and what probably resonates with most Canadians, are the fundamental human qualities of honesty and vulnerability that he maintained throughout his life. A few quotations provide insight into the character of the man.
In describing his neurologic symptoms that confined him to a wheelchair during his later years:
“It just affects the legs. It’s not painful. You tell your legs to move and they don’t move. It’s strange, but if you have some bad-sounding disease, this is a good one to get.”
While he was president, he famously indulged a life-long food aversion by banning broccoli on Air Force One:
“I do not like broccoli. I’m president of the United States, and I’m not going to eat any more broccoli.”
In ending a contentious discussion with his Secretary of State James Baker:
“If you’re so smart, Baker, why am I president and you’re not?”
How can you not like someone so genuine?
Despite being what we might term a person of privilege, he seemed and acted like a regular, decent, fair and unfailingly respectful person caught up in powerful roles and great events. In terms of attitude and character he was, one might respectfully conjecture, an American with whom many Canadians can identify and feel a certain kinship.
But none of that should detract from what he did or accomplished through his life. He was, arguably, the most qualified and best-prepared person ever to assume the presidency, having previously served his country as a World War II combat pilot, two terms in Congress, Ambassador to the United Nations, Special Envoy to China, Director of the CIA and two terms as Vice-President.
He advanced environmental concerns and worked to reduce trade barriers in North America. He led the US at a time when it was the only significant superpower in the world and could therefore have exerted unilateral authority. But he chose not to. Instead, he responded to the Iraqi invasion of Kuwait by first seeking the advice of the Canadian Prime Minister of the time, Brian Mulroney, and then working through the United Nations to form a multi-national coalition to engage the threat. When the former Soviet Union collapsed, he cautioned against gloating and maintained a respectful attitude. In a recent statement, current Russian President Vladimir Putin provided the following tribute:
“George Bush Sr. was well aware of the importance of a constructive dialogue between the two major nuclear powers and took great efforts to strengthen Russian-American relations and cooperation in international security.”
He never wrote an autobiography, but wrote thousands of personal letters, casually composed but highly articulate and poignant, cherished by those who received them.
What is perhaps most remarkable about him is that, despite being what we might consider a “person of privilege” who could easily have chosen a life of quiet and private comfort, he made deliberate choices to engage in public service, beginning with his decision to drop out of school and voluntarily enlist in the Navy at the age of 18 against family advice. He became a naval aviator undertaking 58 combat missions, during one of which he was shot down and had to be rescued at sea. That would have been enough for most people. Returning home after the war, he could easily and understandably have entered a comfortable private life as a successful businessman, but instead chose public service, leading to the numerous positions noted above and culminating in the presidency in 1988.
His family members, who have themselves taken up positions of social and political responsibility, remember his exhortation of “Service before Self”.
Perhaps the most revealing GHW Bush quotation is the note he left in the Oval Office for his successor, Bill Clinton, who defeated him in the 1992 presidential election.
The last five sentences are perhaps the most telling of all, and speak volumes about their author:
“You will be our President when you read this note. I wish you well. I wish your family well. Your success is now our country’s success. I am rooting hard for you.”
Truly a life of Service before Self. A legacy and example for his nation. Indeed, for us all.
The Essential Elements of Medical Education Transcend Politics and Culture
How do you judge a medical school? Specifically, how do you know if it’s providing an effective educational experience for its students? There’s no shortage of perspectives on that question. Everyone involved in medical education, from first year students to Deans, will happily weigh in. Theories and opinions abound, ranging from the rigorous application of systematic Program Evaluation involving the collection, processing and consideration of multiple pre-determined sources of data, to the “I know it when I see it” approach. Our accrediting agencies certainly favour a data driven approach, now requiring the analysis of twelve standards which break down to 95 elements requiring the collection and reporting of literally hundreds of individual points of information.
I was recently faced with this question, with the added complexity that the medical school was situated in a country with very different political and social structures than our own, and very different challenges to the delivery of health care. The school was in a large (very large) city in China, and I was part of a small team asked to provide perspectives on a recently developed English language program.
The obvious and perhaps easiest approach is to measure it against our established, North American accreditation standards. However, I found many of the standards, particularly those relating to issues such as diversity, admission procedures, faculty appointments and governance, simply did not translate to that cultural context. So, I decided to concentrate instead on the essentials – those elements that are foundational to any medical education process and should retain relevance regardless of social or political context. With that in mind, I concentrated on four “essential ingredients” of medical education.
The first, and most obvious, is students. Medical education is fundamentally about student learning and their personal development as physicians. They therefore need to be capable of learning and, probably more importantly, motivated by a true commitment of service to their future patients and communities. The students I encountered certainly had those attributes. They were very well-qualified academically, highly-motivated, ambitious and adaptable. They also seemed to have high levels of social responsibility and commitment to utilizing their medical training in the interests of their society. They are also all only children which, I came to learn, puts them under considerable pressure to succeed.
Students need to encounter teaching faculty, basic scientists and clinicians committed to the process of passing along their accumulated knowledge, experience and wisdom to the next generation of physicians. Their commitment must be based not simply on conditions of employment or obligations, but an almost instinctive impulse to teach that they see as part of their professional role and personal mission.
In China, I met numerous clinical faculty and curricular leaders during the visit who were uniformly committed to providing education both through formal teaching and in conjunction with their clinical responsibilities. They saw this as an embedded component of their appointments, and felt supported in their roles through the provision of faculty development. When pressed, however, they admitted that educational responsibilities are provided “over and above” their clinical or academic roles.
Together, students and teachers must encounter patients. Those patients must be accessible, representative of the conditions and circumstances students will eventually encounter, and be willing to participate in the educational process. In the Chinese school I reviewed, there was virtually unlimited and unfettered access to patients of all types. This is the result of the sheer volume of patients and pathology in a city whose population approaches that of all Canada. Whereas many Canadian schools struggle to ensure students are exposed to all clinical problems, clinical instructors in China are able to select patients for students to see and work with based on their educational needs. The Internal Medicine clerkship director pointed out how she is able to first identify what clinical problems any particular student needs to encounter, then select among multiple appropriate patients.
The fourth essential element is resources. These include space for teaching, facilities for basic science instruction and the equipment and technology necessary to provide contemporary medical care. This requires a commitment on the part of school and medical leadership to resource stewardship, along with mechanisms to keep those resources updated and refreshed into the future.
And so, in the end, the similarities were much more significant than the differences. It comes down to students, teachers and patients coming together in an environment providing adequate resources to allow the educational process to flourish. When they do, it seems education just happens, almost spontaneously. If any of the first three is missing, it’s not possible, even with outstanding resources.
The purpose of a medical school and its leadership is to ensure the essential elements are in place and well-supported. Once they are, education happens. The urge to learn and to teach, it would seem, transcends geography, culture and politics.
Engaging Disruptive Innovation. The evolving role of POCUS in clinical medicine and medical education.
Who among us got through high school without regularly reaching for a well-thumbed encyclopedia plucked from a shelf in our parents’ basement or the local library reference room? Not me, to be sure. Whether it was how rubber is manufactured, legislative accomplishments of a long-deceased prime minister, or the agricultural exports of Guatemala, the encyclopedia could always be counted on to provide reliable information, in time for whatever deadline was looming.
The word “encyclopedia” itself has an interesting and revealing etymology. It apparently contains elements of word origins for “circle” (interpreted to mean “complete” or “all-inclusive”), “child” and “education”. We all know the word to refer to a comprehensive, single source that brings together diverse information. An encyclopedia is a one-stop-shop for a little bit of everything you might need to know about anything.
The most venerable example is Encyclopædia Britannica, first published in 1768 (https://www.britannica.com/topic/Encyclopaedia-Britannica-English-language-reference-work). The 2010 edition consisted of 32 volumes and 32,640 pages. It was written by about 100 full-time editors and more than 4,000 contributors. Contributors have included Nobel laureates and five American presidents.
The 2010 edition was its last in print. After 242 continuous years, Encyclopædia Britannica went out of the print business. It was a victim of what has come to be known as Disruptive Innovation.
That concept emerged in the 1990s and is most commonly attributed to Clayton M. Christensen, who has written extensively on the topic as it plays out in the business world, using it to explain the rise and failure of various enterprises.
In a 1995 Harvard Business Review article that is well worth the read (https://hbr.org/1995/01/disruptive-technologies-catching-the-wave), Christensen defines disruptive technologies in the following way:
The technological changes that damage established companies are usually not radically new or difficult from a technological point of view. They do, however, have two important characteristics: First, they typically present a different package of performance attributes—ones that, at least at the outset, are not valued by existing customers. Second, the performance attributes that existing customers do value improve at such a rapid rate that the new technology can later invade those established markets. Only at this point will mainstream customers want the technology. Unfortunately for the established suppliers, by then it is often too late: the pioneers of the new technology dominate the market.
The disruptive innovation that led to the demise of print versions of Encyclopædia Britannica was, of course, Wikipedia. It provided an easily accessible, comprehensive and continually updated source of information at no direct cost to the consumer. The fact that it lacked historical status, cachet or even a reputation for the accuracy of its sources was glossed over by a consuming public very willing to set aside all those considerations for its convenience and economic advantages.
Disruptive Innovation, almost by definition, upsets existing patterns of practice or behaviour and resets the way people go about a common task or access a service. There is always a reaction from those involved in the traditional paradigm, usually characterized by statements such as:
“What’s the proof this is better?”
“There’s no problem with what we’re doing now.”
“It hasn’t been fully researched.”
“There will be unintended consequences.”
The disruptive innovators, for their part, have the courage of their convictions. They believe they understand market forces better than the established providers, and are willing to gamble that they’re right. Basically, they believe in letting the market decide.
The medical world, of course, is certainly not excluded from disruptive innovations. In fact, it has benefited greatly, but not always willingly. An example I’m very familiar with from the cardiology world is Percutaneous Coronary Angioplasty. When first introduced by Dr. Andreas Gruentzig in 1977, this innovation truly set the cardiovascular world on its collective ear. Prior to that, therapies for coronary occlusive disease were limited to medical therapies (provided by cardiologists) and coronary bypass surgery (provided by cardiac surgeons). The dichotomy and division of labour were clear and well accepted. The catheterization laboratory was a place for diagnostic investigations to determine the extent of disease, not a place for therapeutics. Gruentzig’s innovation completely upset the existing paradigm. Moreover, it put interventional cardiologists in the driver’s seat, because they could link the therapeutic intervention to the diagnostic procedure, engaging the issue first and potentially circumventing the role of the cardiac surgeon. The simple intuitive appeal of being able to dilate an obviously obstructed vessel without the need for even a second interventional procedure, much less surgery, was powerfully compelling, and both the medical community and patients were very willing to set aside the usual and well-established need for controlled comparative trials before embracing this new technology enthusiastically.
The development of Hand-Held Ultrasound (HHU) and its clinical counterpart, Point of Care Ultrasound (POCUS), could be considered further disruptive innovations facing the medical community. Ultrasonic imaging, by virtue of its ability to provide information on a variety of structures in a non-invasive, non-toxic manner and at relatively low cost, has taken on a key role in medical diagnostics, ranging from cardiac (where it is known as Echocardiography) to abdominal, thoracic and vascular imaging. It initially required large, complex machines that were not easily transported and produced images and measurements that were imprecise, difficult to obtain, and in need of “expert” recording and interpretation. The technology therefore required third-party interpretation and consultation before results could be reliably utilized to guide patient care.
Over the past decade or so progressive technical advances have made it possible to obtain excellent quality images from small devices that can be carried easily and used at the bedside. This technology is such that it can be used by an individual to guide the diagnostic approach and decision-making process, analogous to how physicians use stethoscopes. Although the HHU technology is not yet able to provide the full package of information that would allow it to completely replicate the comprehensive examination, it’s not unreasonable to expect that will occur in the not-too-distant future.
In addition to challenging the role of ultrasonic imaging as a diagnostic procedure, this technology is also challenging our approach to the clinical examination in medical school, where students and educators are asking very valid questions as to the role of these “competing” technologies.
I recently participated in a symposium at the Canadian Cardiovascular Congress exploring this very topic. Together with my colleague Dr. Amer Johri, as well as Dr. Sharon Mulvagh from Dalhousie, Dr. Rob Arntfield from Western University, and our former Echocardiography Fellow (now staff Cardiologist at McGill) Dr. Hanane Benbarkat, we explored current and future applications of HHU and POCUS, all centred on their fundamental impact on patient care.
Dr. Johri has been active in the development of guidelines for its application in medical education (Journal of the American Society of Echocardiography 2018;31:749), and has been working with Dr. Steven Pang of our department of Biomedical and Molecular Science to introduce the technology within our curriculum.
The session was, as you might imagine, not without controversy. However, I believe the discussion ultimately centred on the only truly relevant issue: how we can utilize emerging technology to better serve the needs of patients. The concluding messages I provided our audience at that symposium were:
- HHU and POCUS are excellent examples of disruptive innovation
- They challenge our conventional approaches, but have considerable potential to bring added value to both the clinical setting and educational process
- They are here to stay, but how they will be used, and who will guide their use, is not yet determined
- They have the potential to evolve from disruptive to sustaining innovations
- The key consideration in assessing value should be the impact on patient care
- Based on work carried out by Dr. Benbarkat during her fellowship at KHSC and hopefully extended to further collaborative studies with other centres, integrated utilization of POCUS by hospital-based Echo Labs is feasible and beneficial.
I’ll conclude with the words of Mr. Christensen who has given much thought to what causes organizations to fail in the face of disruptive innovation. In his book “The Innovator’s Dilemma” he provides a rather disturbing paradox:
“in the case of well managed firms…good management was the most powerful reason they failed to stay atop their industries.”
“widely accepted principles of good management are, in fact, only situationally appropriate.”
In other words, it was, at least in part, a failure to deviate from previously successful practices that prevented well-established firms from engaging disruptive innovations, ultimately to their detriment. Such innovations challenge us to step away from what we consider to be the “tried and true” methods and approaches we have come to rely upon. They will always entail an element of risk and uncertainty, and therefore require what might be termed a leap of faith. In the medical world, that leap is only justified by a considered, clear potential to improve patient outcome. All other considerations must take a back seat.
Well-trained and committed clinicians can indeed succeed in the research world. Celebrating the accomplishments of two Queen’s Grads
Most organizations we join in the course of our professional careers are a natural consequence or requirement of what we do. Others carry some degree of prestige or special recognition, and we may choose to apply to them in the hope of being selected. A few organizations – very few indeed – come looking for special people. These are called “honorific” societies, because they seek out and recognize individuals whose lifetime work merits special recognition.
The Royal Society of Canada is such an organization. According to its website:
The RSC is the recognized pre-eminent body of independent scholars, researchers and creative people in Canada whose Fellows comprise a collegium that can provide intellectual leadership for the betterment of Canada and the world.
RSC Fellows are men and women from all branches of learning who have made remarkable contributions in the arts, the humanities and the sciences, as well as in Canadian public life.
In the United States, the National Academy of Medicine is another such organization. It describes its goals as follows:
- An independent, evidence-based scientific advisor. To carry out our work, we harness the talents and expertise of accomplished, thoughtful volunteers and undertake meticulous processes to avoid and balance bias. Our foundational goal is to be the most reliable source for credible scientific and policy advice on matters concerning human health.
- A national academy with global scope. Although the National Academies were originally created to advise the U.S. government and advance the well-being of the U.S. population, our mandate is now much broader. The NAM includes members from across the globe and partners with organizations worldwide to address challenges that affect us all.
- Committed to catalyzing action and achieving impact. We identify and generate momentum around critical issues in health; marshal diverse expertise to build evidence-based solutions; inspire action through collaboration and public engagement; and foster the next generation of leaders and innovators.
- Collaborative and interdisciplinary. In partnership with the National Academy of Sciences, the National Academy of Engineering, and other stakeholders, the NAM draws on expertise across disciplines and domains to advance science, medicine, technology, and health.
- An honorific society for exceptional leaders. The NAM has more than 2,000 members elected by their peers in recognition of outstanding achievement. Through a commitment to volunteer service, NAM members help guide the work and advance the mission of the NAM and the National Academies.
Recently, two graduates of our medical school have been named to these societies. Dr. Stephen Archer is a friend and classmate from Meds 81. I’ll again quote from the citation provided by the RSC in announcing his appointment:
Stephen Archer is Professor and Head of Medicine at Queen’s University and a world renowned cardiologist and leader in several research fields, including oxygen sensing, vascular biology, and the experimental therapeutics of pulmonary hypertension and, more recently, cancer. He has made numerous discoveries that can undisputedly be considered firsts, particularly in regard to defining the roles of mitochondrial fission/fusion and metabolism in oxygen-sensing and cell proliferation.
Dr. Azad Bonni is a Meds 86 grad and well-remembered by many current faculty. He also has the distinction of being the younger brother of my colleague Dr. Hoshiar Abdollah. Again, I quote from an announcement provided by his current school:
Bonni is the Edison Professor and head of the Department of Neuroscience at Washington University School of Medicine and director of the university’s McDonnell Center for Cellular and Molecular Neurobiology. An international leader in molecular neuroscience, Bonni has made seminal contributions to our understanding of how the brain is built at the level of individual connections between nerve cells, and how deregulation of those mechanisms contributes to neurological diseases. His group has discovered fundamental signaling networks within nerve cells that program neural circuit assembly and function in the developing brain. Using brain development as a guide, the Bonni laboratory also has provided novel insights into neurological disorders including neurodevelopmental disorders of cognition such as intellectual disability and autism spectrum disorders, brain tumors and neurodegenerative diseases.
Neither Archer nor Bonni acquired their research expertise while in medical school. However, I believe both would agree that their ability to formulate important and relevant research questions, and the commitment required to pursue those questions in a scientifically rigorous fashion, were rooted in their understanding of and personal involvement in clinical medicine, and likely fostered by exposure to people and situations they encountered as medical students.
The role of research in undergraduate medical education has always been controversial. In an increasingly packed undergraduate curriculum, it is often sacrificed in favour of the many therapeutic applications and competency objectives medical schools are expected to provide. In fact, many current curricular frameworks have chosen to exclude it completely.
At Queen’s, we made the deliberate decision to include it in our list of essential EPAs, our only departure and addition to the nationally accepted list. We include research involvement as a core component of our curriculum (the Critical Enquiry), provide opportunities for summer research involvement, and integrate aspects of translational research into our teaching in various courses.
We do so not with the expectation that every student will become an independent researcher, but because we believe understanding research methodology makes us all better “consumers” of new information, and that these early experiences may be formative and awaken a passion for research in those who had not previously imagined it either within their reach or as a component of their career.
Congratulations to Drs. Archer and Bonni, and thanks for affirming that solidly trained and committed clinicians can, indeed, achieve great things in the research world.
Adjusting to Medical School
Adjusting to a new environment never comes easily. Our bodies will eventually adapt to seasonal climate changes, travelling to different time zones, or high altitude, but it invariably takes some time, and involves a little discomfort along the way. Adjustments of any kind are easier if anticipated and understood in advance.
Medical school is an adjustment and, unfortunately, not always anticipated by those “taking the plunge”.
What’s the most difficult adjustment for first year medical students?
Asked that question, most would point to issues such as workload, navigating initial patient encounters, or perhaps aspects of technical competence involving physical examination or procedures. All important, to be sure, but these challenges are understood in advance and anticipated by our curriculum, and they are well within the abilities of the young people entering medicine, who are already highly accomplished and were selected with all these issues firmly in mind.
Beyond these anticipated challenges, there are other adjustments that are even more critical to success but much less well-appreciated or even unanticipated by students.
Why do we undertake educational programs? For many undergraduate university students, it is either to pursue an area of personal interest, or to achieve prerequisites or qualification for a subsequent program. That’s certainly the case for students contemplating entry to medical school. These are worthy goals, but they are personal and intended to promote individual objectives. In a professional program such as medicine, the goals of learning shift to encompass the interests of other parties, specifically future patients. The approach and motivation for learning must also shift. In the words of an astute former mentor, “Medicine is a service industry”. Medical school is about preparing young people to provide that service. The learning is facilitated by that goal. In fact, it can’t occur without it.
Students entering medical school have achieved much recognition for their academic and personal accomplishments, the most recent and notable being their success in the admission process. As they undertake their studies together with equally accomplished classmates, and in a system that defines success simply as “pass” with very little numerical grading, external kudos and other tangible evidence of success become increasingly rare. The perception of success must therefore shift from the external to the internal, as will, eventually, the responsibility for ensuring they remain knowledgeable and technically competent.
The expectation of professional behaviour
Medical education is patient-centred. Students learn early that their interactions with patients must be carried out with high standards of confidentiality, respect and personal behaviour. Although that expectation is easily understood within the patient contact itself, it is perhaps less immediately understood that the same expectations are in play with all their interpersonal and social interactions. The lines between their personal and student lives therefore become blurred. For most, this is a novel experience, and perhaps the first realization of what it means to have engaged a professional role.
Dealing with uncertainty
Students, particularly those from backgrounds in the physical or biologic sciences, have come to expect precision and certainty in their studies. The concept of “right” and “wrong” provides reassuring clarity and promotes the expectation that learning is a finite endeavour, culminating with the discovery of that single, correct response. In the study of medicine, they find a much less dichotomous world, where many clinical issues are nuanced and require interpretation based on many variables. They must develop “approaches” based on “best evidence”, always contextualized to the “patient’s unique circumstances”. For those accustomed to singular solutions, this can be quite unsettling.
All this can sound quite daunting but, like any life adjustment, will be eased with patience and support. Fortunately, much support is available. The quick “bonding” with classmates allows for the comforting realization that these challenges are not unique or some critical personal shortcoming, but rather ubiquitous features of the early medical school experience. Interactions with upper class colleagues, both planned and informal, provide further validation. Our Student Affairs programs, mentor groups, observerships and Clinical Skills groups all provide opportunities to discuss transition difficulties.
In the end, the adjustment is not merely about engaging a new educational program, but rather a more clearly defined identity and perspective of one’s role in the world.
Welcoming Queen’s Meds 2022
At precisely 1 p.m. on Monday, November 6th 1854, Dr. James Sampson rose to address the twenty-three students who would become the first medical class entering the Queen’s School of Medicine. They were gathered in an upper room of a former military infirmary at 75 Princess Street, a building that still stands today, currently the site of a popular local hardware store.
Dr. Sampson, an Irish- and British-trained former military surgeon who was instrumental in the development of Kingston General Hospital and would go on to serve multiple terms as Mayor of Kingston, was Professor of Clinical Medicine and Surgery. He was also President (essentially the first Dean) of the medical school. He introduced himself and his five colleagues, who would form the first teaching faculty, and then turned the podium over to Dr. John Stewart, Professor of Anatomy, Physiology and Practical Anatomy, who would deliver the first lecture.
In his book “Medicine at Queen’s: A Peculiarly Happy Relationship”, the late Dr. Tony Travill describes the event in vivid detail. He notes that the room in which they met was “deplorably filthy”, but this did not deter the faculty members, who felt appearances did not matter much “as there are no bacteria then in Kingston”, meaning, presumably, that no epidemic or plague was then active.
In that inaugural address Dr. Stewart spoke of “the importance of anatomy and physiology to the proper practice of surgery and medicine”. He went on to quote Galen, who described anatomy as “the most beautiful hymn which man can chant in honor of his creator”. In finishing, “He recounted the events leading to the school’s founding and exhorted the students to recognize that their future success depended more on themselves than on their professors: the only barrier to that success was idleness.”
Last week, Dr. Sampson’s successor, Dr. Richard Reznick, welcomed the one hundred and sixty-fourth group of students to their studies and to the profession. Dr. Reznick challenged them to be restless in the pursuit of their goals and the betterment of our patients and society.
A few facts about our new colleagues:
They were selected from a pool of 4,836 highly qualified students who submitted applications last fall.
Of the 104 students, the average age is 24 years. Forty-nine members of the class are women and 55 are men. They hail from no fewer than 43 communities across Canada, including: Alma, Belleville, Brampton, Burlington, Cambridge, Dundas, Etobicoke, Golden Lake, Guelph, Kingston, Lively, London, Maple, Markham, Milton, Mississauga, Nepean, Nobleton, North York, Oakville, Odessa, Ottawa, Peterborough, Richmond Hill, Sarnia, Scarborough, Stittsville, Thornhill, Toronto, Whitby, Edmonton, Leduc, Calgary, Vancouver, Maple Ridge, Victoria, Coquitlam, West Vancouver, North Vancouver, Winnipeg, St John’s, New Minas, Halifax.
Eighty-six of our new students have completed an undergraduate degree, and sixteen hold postgraduate degrees, including three PhDs. The universities they have attended and their degree programs are listed below:
Universities of Undergraduate Studies
- Simon Fraser University
- St. Francis Xavier University
- University of Alberta
- University of British Columbia
- University of Calgary
- University of Guelph
- University of Ottawa
- University of Toronto
- University of Victoria
- University of Waterloo
- University of Ontario Institute of Technology
- Wilfrid Laurier University
Undergraduate Degree Majors
- Anatomy and Cell Biology
- Biochemistry and Molecular Biology
- Biomedical Discovery and Commercialization
- Chemical and Physical Biology
- Computer Science and Biology
- English Language and Literature
- Epidemiology and Biostatistics
- Foods and Nutrition
- Health and Disease
- Kinesiology and Health Science
- Mathematics and Physics
- Medical Health Informatics
- Molecular Biology and Genetics
- Occupational and Public Health
An academically diverse and very qualified group, to be sure. Last week, they undertook a variety of orientation activities organized by both faculty and their upper year colleagues.
On their first day, they were called upon to demonstrate commitment to their studies, their profession and their future patients. They were assured that they will have a voice within our school and be treated with the same respect they are expected to provide each other, their faculty and all patients and volunteers they encounter through their medical school careers. In addition to Dr. Reznick, they were welcomed by Ms. Rae Woodhouse, Aesculapian Society President, who spoke on behalf of their upper year colleagues, and by Dr. Rachel Rooney, who provided an introduction to fundamental concepts of medical professionalism.
Over the course of the week, they met the curricular leaders who will be particularly involved in their first year, including Drs. Michelle Gibson and Lindsey Patterson (Year 1 Directors) and Drs. Cherie Jones and Laura Milne (Clinical Skills Directors). They were also introduced to Dr. Renee Fitzpatrick (Director of Student Affairs) and our excellent learner support team, including Drs. Martin Ten Hove, Jason Franklin, Kelly Howse, Mike McMullen, Josh Lakoff, Craig Goldie and Erin Beattie, who oriented them to the Learner Wellness, Career Counseling and Academic Support services that will be provided throughout their years with us. They met members of our superb administrative and educational support teams led by Jacqueline Findlay, Jennifer Saunders, Theresa Suart, Amanda Consack, and first year Curricular Coordinator Corinne Bochsma.
Dr. Susan Moffatt organized and coordinated the very popular and much appreciated “Pearls of Wisdom” session, where fourth year students nominate and introduce faculty members who have been particularly impactful in their education, and invite them to pass on a few words of advice to the new students. This year, Drs. Dale Engen, Debra Hamer, Ingrid Harle, Annette Hay, Michael Leveridge, Joseph Newbigging, Louise Rang and Andy Thomas were selected for this honour.
On Friday, the practical aspects of curriculum, expectations of conduct and promotions were explained by Drs. Michelle Gibson and Lindsey Patterson.
Their Meds 2020 upper year colleagues welcomed them with a number of formal and not-so-formal events. These included sessions intended to promote an inclusive learning environment, as well as orientations to Queen’s and Kingston, introductions to the mentorship program, and a variety of evening social events which, judging by appearances the next morning, were much enjoyed.
For all these arrangements, flawlessly coordinated, I’m very grateful to Rebecca Jozsa, our Admissions Officer, Admissions Assistant Rachel Bauder, and to Rae Woodhouse and her second year colleagues.
I invite you to join me in welcoming these new members of our school and medical community, and end with a quote Dr. Reznick shared with the incoming class, drawn from his favourite poet and recent Nobel Laureate Bob Dylan:
May your heart always be joyful
May your song always be sung
And may you stay forever young
Of Robots, Worms and Youthful Inspiration
The Day the Earth Stood Still is a science fiction movie released in 1951. Filmed entirely in black and white, it is based on a 1940 short story by Harry Bates entitled Farewell to the Master. The story involves an alien visitor to earth named Klaatu, portrayed by Michael Rennie. The real star of the show is an eight-foot tall, death-ray-emitting robot named Gort who accompanies Klaatu. As one might imagine, mayhem ensues.
I was recently surprised to learn that it’s possible to connect that motion picture with the sophisticated systems that are rapidly developing and being used in robotically assisted surgery. As someone who grew up being told reading and viewing science fiction was a waste of time, this was of some interest to me.
That connection begins with one Victor Scheinman.
Mr. Scheinman, who grew up in New York City, recalls being terrified of Gort after seeing The Day the Earth Stood Still for the first time at the age of 9. He hid in his bed, unable to sleep due to nightmares in which he would imagine the robot standing in his room. His father, a psychiatrist who practiced in Manhattan and taught at Columbia, advised him to build a model of the robot as a means of dealing with his fears. In doing so, Mr. Scheinman began to develop mechanisms to animate the arms and legs of his models. This led to a variety of projects that were encouraged by his parents and teachers, and to a series of entries and prizes in various science fairs. He went on to earn admission to the Massachusetts Institute of Technology at the age of 16.
His work at MIT and then Stanford eventually led to development of “The Stanford Arm”.
In her book, “The Robot: The Life Story of a Technology” (2007), Lisa Nocks, writes:
“In contrast to heavy, hydraulic, single-use machines, his Stanford Arm was lightweight, electric, multiprogrammable, and could follow random trajectories instead of fixed ones. Scheinman showed that it was possible to build a machine that could be as versatile as it was autonomous.”
The technology was picked up and advanced by Joseph Engelberger and George Devol, founders of Unimation, the world’s first robotics company, which in 1977, with support from Scheinman and General Motors, developed the Programmable Universal Machine for Assembly (PUMA), the prototype of which now resides in the Smithsonian Institution. The PUMA was quickly introduced to the automotive industry, revolutionizing the assembly line process. The 200 and 500 series PUMAs are of “desktop” size and therefore suited to surgical use; the first recorded surgical applications were for assisting brain biopsies in 1985. In 2000, the da Vinci surgical system became the first robotic surgery system approved by the Food and Drug Administration. A key development that allowed for approval was improved, high-resolution, three-dimensional imaging that allows the operator to use the mechanical arms without laparoscopic guidance.
And so, much has developed from youthful imagination, creativity and energy, suitably nurtured and allowed to develop.
Recently, we’ve seen what might be the beginnings of another such example. Reports describe the very impressive accomplishments of four young people from Toronto. Beginning with an idea inspired by her grandfather’s illness, young Annabel Gravely decided to devote her eighth-grade science project to investigating causes of muscle deterioration in Amyotrophic Lateral Sclerosis (ALS). Hypothesizing a link between the muscle loss in ALS and that which is known to occur during prolonged periods in space, Gravely and her schoolmates (Alice Vlasov, Amy Freeman and Kay Wu) proposed to send a tube of microscopic worms (Caenorhabditis elegans, for those of you taking notes) into space aboard the International Space Station in order to examine the effect of zero gravity on the worms, and particularly on the activity of a specific enzyme, sphingomyelinase (ASM), known to be linked to ALS.
Dr. Jane Batt, a respirologist and scientist at St. Michael’s Hospital, learned of their interest and provided them space in her lab to carry out preparatory work, as well as connections with the space agency. All this resulted in a canister of worms spending a ten-week sojourn aboard the space station, after which it was found they not only survived quite nicely in space, but were longer and larger than their earthbound control group, and expressed lower levels of ASM. Although the link between ASM and ALS-associated muscle loss is not yet clear, the findings support further investigation, and were published last month:
Young Ms. Gravely (now 16 years old) and her colleagues have their first publication citation.
And so, we have two accounts of youthful inspiration, one arising in response to an imaginary threat, the other from the memory of a beloved grandparent. Both bring much credit to the young people involved and remind us that age need be no barrier to creative thinking and dedication to a goal. However, the significance of these success stories goes far beyond the young originators themselves.
Potentially groundbreaking ideas, like seeds cast into the air, must find fertile ground if they’re to flourish. Scheinman and Gravely were able to find such fertile ground in the support and encouragement of their families, schools and communities without which their brilliant insights might have never come to fruition.
Transformative innovation can be thought of as applied inspiration. The originating idea is necessary, but insufficient if not supported.
In the world of medical education, we encounter many potential Scheinmans and Gravelys, who experience their own moments of inspiration. Given the “busyness” and apparent urgency of our educational and clinical lives, it’s easy for them, and for us, to let those opportunities pass in favour of achieving more immediate short-term goals. From time to time, it serves to be reminded that great achievements can start from rather humble origins – such as scary science fiction movies and microscopic worms.
Dynamic Learning Environments – not just for academic centres
Several years ago, the Association of American Medical Colleges (AAMC) developed and publicized a statement on the learning environment.
The statement nicely articulates three key points about effective medical learning environments:
- Medical education and exemplary patient care go hand-in-hand.
- They feature a pervasive atmosphere (dare I say “culture”) of mutual respect and collaboration on the part of all involved in the delivery of patient care.
- Everybody involved is both a learner and a teacher, and feels free and comfortable in both roles.
Lofty goals and expectations, to be sure. In fact, the skeptical among us may consider these to be merely aspirational statements, expressing unachievable ideals.
I’m pleased to report that this is not the case. In my experience, I often encounter learning environments that are nicely meeting those lofty goals. Most commonly, these are in large teaching hospitals where available resources, space and academic focus combine to produce close-to-ideal learning environments. Recently, I had the opportunity to see similar success in a much smaller site.
I attended the Third Annual Georgian Bay Healthcare Wellness Research and Innovation Day held at the Collingwood General and Marine Hospital.
Organized by Collingwood Chief of Staff Dr. Michael Lissi and supported by Dr. Peter Wells and Program Manager Michelle Hunter of the Rural Ontario Medical Program, this year’s theme was Geriatrics and involved a thoughtful panel discussion followed by a series of very well-qualified and engaging speakers.
The hospital cafeteria, re-purposed for the occasion, was standing-room-only as about 150 folks from all areas of the health care community, as well as interested local residents, packed the room and contributed to the discussion. The sessions were live-streamed to several sites.
In addition to the presentations, hospital corridors were used to display about 60 posters featuring studies carried out by local practitioners and learners working in the community.
I was there largely because two of our students are in Collingwood completing placements.
Claire Tardif and Daniel Weadick of Meds 2019 are, by all accounts, both enjoying the experience and learning a great deal. They’re integrating well into that local learning environment, working with multiple physicians, other learners and health care providers. Dan summarized it all rather effectively. In his own words “there’s a lot to like”.
For me, the whole experience was a little surreal. Having grown up in Collingwood and worked at various jobs in and out of the local hospital, I found myself reviewing posters and meeting local physicians in the same rooms and corridors in which I’d made deliveries and portered patients many years ago.
Medical education theorists have described the learning process in many ways, but all agree that the knowledge and skills learned largely through classroom and simulated settings are insufficient unless integrated and applied to real patients. That process of application must be progressive, beginning with highly supervised settings where learners can begin to experience clinical care and decision making in safe and nurturing environments, while allowing them to progress to increasing levels of independence as their skills and growing confidence allow. For the medical student, highly structured and learner-dense academic hospital settings are certainly valuable and essential, but may impose unintentional “ceilings” on professional development, and limit the appreciation of the continuity of care that occurs outside the specialized ward and is so critical to patient outcomes. Community placements in smaller centres can complement their learning by providing that context.
In the end, medical education is fundamentally about providing and identifying environments where motivated, talented students can encounter generous and welcoming practitioners in settings that strive to provide excellent patient care and learning for all involved.
I’m pleased (and perhaps a little proud) to say that my home town is one of those places.
Wolves among the sheep
How does a nurse, working in public hospitals and nursing homes, manage to murder frail, elderly patients without detection?
How does it go on for 20 years, resulting in the deaths of eight patients under her care?
Why did it only come to attention and stop when the perpetrator herself confessed openly to the crimes?
These questions are the focus of an inquest commissioned to investigate the actions of Elizabeth Wettlaufer, a former nurse now serving a life sentence for the crimes to which she has confessed.
The inquest is scheduled to release its final report this summer, but documents recently released reveal a number of very sobering facts that should concern any health care professional and, particularly, those with leadership or administrative positions:
- She was fired from her first nursing job in 1995 for stealing medications. However, following intervention by her nursing professional association, the firing was noted officially as a voluntary resignation.
- Between 2007 and 2014, while working at the Caressant Care centre in Woodstock, Ms. Wettlaufer was reprimanded no fewer than nine times for medical errors and general incompetence, refused recommendations to take leaves of absence and ignored threats from her colleagues that her increasingly suspicious behaviour would be reported to her licensing body.
- She was finally fired in 2014. That firing was again officially noted as a voluntary resignation after her union intervened. As a result of that settlement, Ms. Wettlaufer actually received $2,000 and a letter of recommendation.
- Between 2007 and 2014, while all these concerns were under review, she continued to kill residents of the Caressant Care centre by administering lethal doses of insulin.
- On two occasions, the coroner’s office was notified about deaths at chronic care facilities where Ms. Wettlaufer worked. No autopsies or investigations were ordered.
If Ms. Wettlaufer had not voluntarily confessed her crimes, they might never have come to attention. She has recently spoken out about the loose regulatory processes governing the use of insulin that made it possible for her to administer overdoses without detection.
This is, regrettably, not the first instance of a health care provider using a position of trust to facilitate murder.
Harold Frederick Shipman was an English physician, considered one of the most prolific serial killers of all time. In January of 2000 he was found guilty of killing 15 patients under his care and sentenced to life imprisonment, but a subsequent inquiry linked him to over 250 murders over his thirty-year career. It seems that, in retrospect, numerous warnings of misbehaviour were ignored, including the fact that one of his first victims, an elderly lady previously in good health who was found dead only a few hours after a visit from Dr. Shipman, had recently changed her will to bequeath her entire fortune to him. In fact, most of his patients were in good health prior to visits with him during which injections were administered. It appears that at least three of the murders were directly witnessed by other personnel but nonetheless went unreported.
Michael Joseph Swango is an American who is currently serving three consecutive life sentences, imposed in 2000, for the murder of patients under his care while he was practicing as a physician. It now appears he was responsible for as many as 60 fatal poisonings of both patients and colleagues. In retrospect, it is clear there were signs of very troubling behaviour during medical school. Although considered intellectually brilliant, he exhibited a fascination with dying patients, to the extent of preferring to work as an ambulance attendant rather than attending his classes. At one point, he was found to have submitted falsified documents regarding completion of required tasks. Numerous fellow students and faculty raised concerns about his behaviour and honesty. He was nearly expelled, but was allowed to stay on because one member of a review panel felt he should be given a chance to remediate. He graduated one year after his entering class and, despite a very poor evaluation in his dean’s letter, secured a surgical internship. While he was on clinical rotations, nurses reported multiple instances of apparently healthy patients dying mysteriously while he was on duty. On one occasion, he was caught injecting a substance into a patient who subsequently became very ill. Despite these warnings, no major sanction was imposed, although the program revoked its residency offer. He went on to work as a paramedic and laboratory technician. By changing his name and falsifying documents, he was able to gain entry to a variety of residency programs at medical schools across the United States, and therefore to work as a physician, all the while murdering both patients and co-workers, usually with injections of arsenic. The American Medical Association eventually conducted a thorough background check on one of his applications and uncovered the pattern of previous incidents.
As a result, all 125 American medical schools and over 1,000 teaching hospitals were alerted to his identity and record. Effectively blacklisted from further residencies, he fled to Africa where he secured positions and continued to commit murder. A very complex and thorough investigation eventually led to his extradition, indictment and conviction.
These are, mercifully, very rare and extreme examples. However, they remind us that the intelligent sociopathic personality may find the medical or nursing professions ideal environments in which to prey on the innocent and satisfy the craving to kill. They also remind us that these patterns of deviant behaviour start early and continue without major consequence until the individual is fully empowered. Set amongst trusting patients and innocent, often naive colleagues who would have difficulty even conceiving of such behaviour, these monstrous individuals are like wolves among sheep. They may also benefit from the well-meaning protection of colleagues and supervising faculty whose first instinct will always be to help and cure rather than condemn. As in the case of Ms. Wettlaufer, they may also benefit from professional organizations and legal processes that put the interests of the individual above the potential impact on current and future victims. Unless counterbalanced by administrations and leadership willing to undertake legal challenges and defend the broader interests of the public, the profession, and future patients, these behaviours can go unchecked.
The upcoming inquest report will surely identify several points at which our processes failed to act and put an end to Ms. Wettlaufer's serial murders. However, there are lessons here for all of us who are involved in medical education. The degrees and qualifications we bestow convey an assurance to licensing bodies, institutions and the public that the individuals who hold them are not only knowledgeable and technically qualified, but also trustworthy. We must be vigilant with respect to all those considerations, and be prepared to defend the integrity of our educational and evaluative processes. Our responsibilities extend beyond the individual learner, to the public and to potential future patients.
We must never set wolves among the sheep.