Showing posts with label Evaluation. Show all posts

Tuesday, September 26, 2017

Contradictory evaluations cause trouble, consternation

Two weeks ago I blogged about a resident who had been told she must repeat her fourth year of training. She countered with a lawsuit claiming that the surgery department and the medical school did not follow their policies in mandating her remediation.

She said written evaluations by faculty during her fourth year of residency were generally very good, but some oral feedback she received was negative.

From my experience as a surgical residency program director, I know inconsistent, vague, and unhelpful evaluations from faculty are common. For example, a medical student on Twitter recently posted the following:

The tweet prompted many comments. The best of them follow [names changed]:

That he seems well read and has a good depth of knowledge deprives him of good learning opportunities.

John sometimes stands too close when he presents.
Three months later: I don't understand why John stands so far away when he presents.

Tends to scare off new people./Makes everyone feel welcome & appreciated.

Dr. Doe, don't be too hard on the juniors. One week later: Dr. Doe, don't be too friendly to the juniors.

And my favorite:

Jane should be aware of how she holds her shoulders. It changes the energy in the room.

Do you have similar evaluations to share?

Thanks to Natalie Wall (@nataliemwall) for allowing me to use her tweet.

Monday, February 29, 2016

The ultimate resident evaluation

It comes as no shock to me, and probably many other current and former program directors, that a recent study showed faculty overall performance evaluations of residents do not correlate with their scores on the yearly American Board of Surgery in Training Examination.

According to the JAMA Surgery paper, faculty evaluations encompassed technical skill and the six core competencies—medical knowledge, patient care, interpersonal and communication skills, professionalism, practice-based learning and improvement, and system-based practice.

The paper analyzed data for 150 residents at different levels of training over 4 years and found that even faculty ratings in the medical knowledge category could not predict who would score well or poorly on the test.

It’s great to know that at the authors’ institution, the average annual evaluation scores ranged from just over 75 to 100 with means and medians both slightly above 92—like Garrison Keillor’s mythical Lake Wobegon, “where all the women are strong, all the men are good-looking, and all the children are above average.”

Wednesday, October 17, 2012

Medical School Grading and T-Ball: Everyone Gets A Trophy

"Variation and Imprecision of Clerkship Grading in US Medical Schools” is the understated title of the paper (full text here) in the August 2012 issue of the journal Academic Medicine. The authors, from the department of medicine at Brigham and Women’s Hospital, analyzed 2009-2010 third-year clerkship grades from 119 (97%) of the 123 US medical schools. They found many different grading systems ranging from two levels (pass/fail) to 11 levels of grades.

The terminology used by the schools to describe the different grades is positively comical. To borrow an analogy I’ve used in a previous post about dean’s letters, the citizens of Lake Wobegon would be proud because no student is “average.”

Here are some examples:

High honors, honors, pass, fail (In some schools “honors” is not the highest possible grade).
Honors, satisfactory plus, satisfactory, fail.
Honors, satisfactory, low satisfactory, fail.
Honors, high satisfactory, satisfactory, low satisfactory, unsatisfactory. (Does “unsatisfactory” mean, dare I say it, “fail”?)
Honors, near honors, pass, fail.
Excellent, good, fail.
Honors, advanced, proficient, fail.
Honors, letter of commendation, fail.

The highest grade attainable was awarded to 23% of students (range 5-51%) in schools with three-tiered systems, to 34% (range 2-84%) in four-tiered systems, and to 33% (range 7-93%) in schools with five grade levels.

It gets worse. The authors noted that 97% of all medical students were given one of the top three grades regardless of whether the schools used 4, 5, or 6 levels of grading.

From the paper, “Less than 1% of all US medical students fail required clerkships, regardless of the grading system used.” This raises the question of whether the grade “fail” is even necessary.

Focusing on surgery, an average of about 30% of all students received the highest possible grade in their surgical clerkship, but the percentage of a class receiving the top grade ranged from 7% to 67%. This may account for the paradox found in a paper on surgical resident performance: a significant predictor of the need for remediation was having received honors in the third-year surgery clerkship. It appears that a grade of honors is virtually meaningless.

This is an excellent example of what I call the “T-ball culture”: No one keeps score. All games end in a tie. Everyone gets a trophy.

The authors of the paper recommended that schools consider creating a more consistent, transparent, and reliable system of grading. As a former surgical residency program director who grappled with interpreting the meaning of applicant grades from different schools, I find the need for such a system remarkably clear.

An editorial in the same issue of the journal agreed that grade terminology should be standardized but cautioned that normative grading (establishing a set distribution or “curve” of grades) may not be the answer. The editorialists offered some other possibilities such as criterion-based grading or emphasizing the mastery of a subject as a goal rather than the achieving of a specific grade.

I do not have the background in educational theory to say what is right or wrong. I do know that a grading system with so many variables and such a skewed distribution is of no help whatsoever in evaluating the desirability of an individual applicant to a residency program.