Thursday, October 30, 2014

How to rank surgical residency programs

In September, Doximity, a closed online community of over 300,000 physicians, released its ratings of residency programs in nearly every specialty. Many, including me, took issue with the methodology. Emergency medicine societies met with Doximity's co-founder over the issue and echoed some of the comments I had made about the lack of objectivity and the heavy emphasis on reputation.

I wonder if it is even possible to develop a set of valid criteria to rate residency programs. Every one I can think of is open to question. Let's take a look at some of them.

Reputation is an unavoidable component in any rating system. Unfortunately, it is rarely based on personal knowledge of any program, because there is no way for anyone not directly involved with a program to assess its quality. Reputation is built on history, but all programs have turnover of chairs and faculty. Just as in sports, maintaining a dynasty over many years is difficult. Deciding how much weight should be given to reputation is also problematic.
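
To see why the weighting question matters, here is a toy sketch in Python. The programs, component scores, and weights are all invented for illustration; this is not Doximity's (or anyone's) actual formula.

    # Toy example: the weight given to reputation can reorder a ranking.
    # Programs, component scores, and weights are all hypothetical.
    programs = {
        "Program A": {"reputation": 9.0, "case_volume": 6.0, "board_pass": 7.0},
        "Program B": {"reputation": 6.0, "case_volume": 9.0, "board_pass": 8.5},
    }

    def composite(scores, w_reputation):
        # Split the remaining weight evenly between the other two components.
        w_other = (1.0 - w_reputation) / 2
        return (w_reputation * scores["reputation"]
                + w_other * scores["case_volume"]
                + w_other * scores["board_pass"])

    for w_reputation in (0.6, 0.2):  # heavy vs. light emphasis on reputation
        ranked = sorted(programs, reverse=True,
                        key=lambda p: composite(programs[p], w_reputation))
        print(f"reputation weight {w_reputation}: {ranked}")
    # reputation weight 0.6: ['Program A', 'Program B']
    # reputation weight 0.2: ['Program B', 'Program A']

With reputation weighted at 60%, Program A comes out on top; drop the weight to 20% and Program B takes over. Nothing about the programs changed, only the weights.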

The schools that residents come from might be indicative of a program's quality, but university-based residencies tend to attract applicants from better medical schools. The other problem: who is to say which schools are the best?

Faculty and resident research is easy to measure but may be irrelevant when trying to answer the question of which programs produce the best clinical surgeons. Since professors tend to move from place to place, the current faculty may not be around for the entire 5 years of a surgery resident's training.

The number of residents who obtain subspecialty fellowships, and where those fellowships are, might be worth tracking, but it would penalize programs that attract candidates who may be exceptional yet are happy to become mere general surgeons.

Resident caseloads, including volume and breadth of experience, would be very useful. However, these numbers have to be self-reported by programs, and self-reported data are often unreliable. Here are some examples of why.

For several years, M.D. Anderson has been number one on US News's list of cancer hospitals. It turns out that for 7 of those years, the hospital was counting all patients admitted through its emergency department as transfers, thereby excluding them from its mortality figures. This excluded 40% of M.D. Anderson's admissions, many of whom were likely the sickest patients.
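
To make the arithmetic concrete, here is a back-of-the-envelope sketch with invented numbers; the actual figures were not published in this form.

    # Hypothetical numbers illustrating how excluding the sickest 40% of
    # admissions can shrink a reported mortality rate. Not M.D. Anderson's data.
    total_admissions = 10_000
    excluded = int(0.40 * total_admissions)   # 4,000 counted as "transfers"
    included = total_admissions - excluded    # 6,000 left in the denominator

    deaths_excluded = 300   # assume 7.5% mortality in the sicker, excluded group
    deaths_included = 60    # assume 1.0% mortality in the rest

    true_rate = (deaths_excluded + deaths_included) / total_admissions
    reported_rate = deaths_included / included
    print(f"true mortality:     {true_rate:.1%}")      # 3.6%
    print(f"reported mortality: {reported_rate:.1%}")  # 1.0%

Under these assumptions, the reported rate is barely more than a quarter of the true one, which is exactly the kind of distortion that makes self-reported or self-curated data suspect.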

The number and types of cases done by residents in a program have always been self-reported. The Residency Review Committee for Surgery and the American Board of Surgery have no way of independently verifying the number of cases done by residents, the level of resident participation in any specific case, or whether the minimum numbers for certain complex cases have truly been met.

So where does that leave us?

I'm not sure. I am interested in hearing what you have to say about how residency programs can be ranked.

15 comments:

Vamsi Aribindi said...

Hmm,

One way to do so is to follow the example of the Michigan Bariatric Surgery study: take videos of operations performed by graduating residents and have them evaluated by multiple experienced surgeons. The residents would be judged according to their career goals (aspiring general surgeons would be judged on a video of a hernia repair or cholecystectomy, residents going into a plastic surgery fellowship would be judged on a panniculectomy, etc.).

Come to think of it, perhaps something like this could form part of the board certification exam: half the time currently spent grilling the applicant could instead be spent watching a submitted video and grading the applicant's surgical skills. In this way, perhaps board certification could become a better indicator of a program's ability to build surgical skills.

Respectfully,
Vamsi Aribindi

artiger said...

Not that I'm an expert, but I would favor evaluating the product more than the process. In other words, rather than looking at faculty, test scores, etc., look at how many of the graduating residents are still in practice 10 or so years later without state board action or a lengthy malpractice history, how many are in academia or have teaching roles, how many quit and got MBAs or moved into administrative roles, and so on. I don't know just how valuable that information would be, but it seems like something that could be tracked more objectively. If the goal is to produce competent, practicing surgeons, those kinds of criteria might be more helpful.

Skeptical Scalpel said...

Vamsi, judging the quality of a surgeon involves much more than assessing technical skills. As I pointed out in a post about the Michigan study (http://skepticalscalpel.blogspot.com/2013/11/should-all-surgeons-have-video.html), the logistics of viewing videos may be a deterrent to widespread use. Videos would have to be edited. How much time would it take to view them? Would there be enough time to ask the types of questions currently being asked on the oral boards?

Artiger, it's not a bad suggestion, but I am not sure that what graduates are doing 10 years after leaving a program has any bearing on what the program is like today.

DocInKY said...

Wonder if "The Little Red Book" from the '80s still exists. It was gold for me when I applied.

Skeptical Scalpel said...

The Little Red Book still exists on the ACS website [https://www.facs.org/education/resources/residency-search]. However, it is nothing like the book you remember, which, by the way, was full of self-reported and often incorrect information. I used to enjoy reading it the way I enjoy reading any work of fiction.

artiger said...

Ten years was just an arbitrary number I pulled out of the air. I thought it would be long enough to see the cream rise and the, well, other stuff sink. I wouldn't propose focusing on individual graduates of a program so much as looking at patterns and trends.

Anonymous said...

Just found this site from reading Reader's Digest.

I have read that it is difficult to develop a set of criteria to rate residency programs. Whenever anything is rated, there is always someone somewhere who raises problems with the methods used. But it can't hurt to try and keep working at it.

I am glad to see a medical blog. Thank you!

I am going through the agonizing process of finding an orthopedic back surgeon and setting up consultations. Wow, what a process.

JRSG

Skeptical Scalpel said...

Yes, if you have no connections, it must be difficult to find someone you can trust.

Anonymous said...

For surgery match rankings, one should take into account the programs that went unfilled during the 2014 match. For instance, Washington University in St. Louis had 2 unfilled positions out of the 1,205 positions available nationally (a 99.6% match rate). Current WU students say this is due to the known "malignant" environment, poor leadership, high faculty turnover, and the high number of residents quitting halfway through, which has led to incomplete classes filled with IMGs throughout the program. So do your homework!

Skeptical Scalpel said...

Anon, thank you for commenting. I strongly advise all readers to do their own vetting of each program they are considering.

Skeptical Scalpel said...

What a shame--to throw away an education and a career, and for the victims as well.

Anonymous said...

I am the spouse of a surgical resident halfway through their residency. When I hear the idea of "vetting" a residency program as a med school graduate, it makes me laugh and cringe. It's not really possible.

We were extremely concerned about not getting stuck in a malignant or toxic program. These people must have been aware of how bad the program was, yet we thought the place had the happiest, friendliest people. What a facade.

So now what? We struggle. If it gets too bad I suppose we will try to switch to a new surgical program. Maybe the switch is to a different discipline altogether. Time will tell.

The take-away, perhaps, is to make sure you are lucky, because if you are not, life is hard, and the options to fix it are even harder.

In the meantime, we will show up at each and every required recruitment dinner. We will smile, we will follow the script, and we will lie our asses off. Why? Because if we don't, we will be labeled a "problem" resident and never get a recommendation from the PD to another program. So the cycle continues. Does the program turn out good surgeons? Perhaps, but how many does it lose or, worse, ruin?

Vetting...no. Luck of the draw.

Skeptical Scalpel said...

Thank you for your candid comments. I find them quite believable. Too bad they are buried at the end of an old post; I might use them as the basis for a new one. At any rate, "caveat emptor."

Anonymous said...

I am going through this process now and am extremely concerned about the above. Everyone I talk to says that "everyone lies," and I don't know how to go about this process knowing that! Do you have any further insight on the Doximity reviews? I feel as though the reviews there could be helpful in theory, but there aren't very many of them.

Skeptical Scalpel said...

I looked at the Doximity site again. I am still familiar with a few programs in my area. The data on them are pretty bland and don't offer much insight. Honestly, I don't see it being much help.

I don't know if "everyone lies." At some point, you have to trust your gut instinct and make some decisions. The comment above from 1/28/15 is disconcerting, though.

I've written several posts about the match. Just search "match" in the box above and to your right. Here's one about lying: http://skepticalscalpel.blogspot.com/2014/04/a-match-day-lament-we-will-rank-you.html
