Surgeons submitted a video of their choice depicting their performance of a laparoscopic gastric bypass. Since it was self-selected, it was presumably their best work. At least 10 of their peers, blinded to the surgeon's identity, rated the skills shown in the video, which had been edited to include only the key portions of the case.
Surgeons in the lowest quartile of ratings for surgical skill had significantly more postoperative complications, readmissions, reoperations, and deaths.
A New York Times article about the paper features a couple of short video clips—one from a not-so-skilled and one from a very skilled surgeon. The differences are obvious and dramatic.
According to the discussion section of the paper, the Michigan bariatric surgeons are now watching each other operate and will soon be receiving anonymous feedback about their technique from their peers.
It is not clear whether this will improve the skills of the lower-rated surgeons or have any effect on outcomes.
Many people rightfully praised the research. Some suggested that all surgeons should be scrutinized in this same fashion.
I agree that the study was well-done and shows that technically better surgeons have better outcomes.
But there are some problems with generalizing this to all surgeons.
The American Board of Surgery recently noted that there are almost 30,000 board-certified general surgeons in the US. This raises a number of logistical issues.
Let's say we focus on the most common major surgical procedure, the laparoscopic cholecystectomy. Ten surgeon-raters would have to view at least 15 to 20 minutes of video for each of the 30,000 board-certified general surgeons. How long would that take? Who would collect and edit all the videos? Who would make sure that the ratings were consistent? Who would collate and distribute the results? How would follow-up be done? Who would pay for all of this?
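To put "How long would that take?" in perspective, here is a quick back-of-envelope calculation using only the figures already in the post (10 raters per surgeon, 15 to 20 minutes of edited video, 30,000 board-certified general surgeons). The numbers are illustrative, not from the study itself:

```python
# Rough estimate of total rater workload for nationwide video review.
# All inputs are the assumptions stated in the post above.
surgeons = 30_000            # board-certified general surgeons in the US
raters_per_surgeon = 10      # peer raters viewing each video
minutes_per_video = (15, 20) # low and high estimates of edited video length

# Total viewing time in rater-hours, low and high estimates
low = surgeons * raters_per_surgeon * minutes_per_video[0] / 60
high = surgeons * raters_per_surgeon * minutes_per_video[1] / 60

print(f"Total viewing time: {low:,.0f} to {high:,.0f} rater-hours")
# 75,000 to 100,000 rater-hours of video review, before any time is
# spent collecting, editing, collating, or following up.
```

Even at the low end, that is the equivalent of roughly 35 to 50 full-time work-years of nothing but watching video.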
And that is just for the board-certified general surgeons. What about the general surgeons who are not board-certified and all the other surgical specialists? Maybe gastroenterologists should have their endoscopy procedures scrutinized. Maybe primary care docs should have selected office visits recorded too.
This is similar to the enthusiasm that surrounded the concept of using retired surgeons to coach other surgeons. The idea was based on the experience of one surgeon, who had access to an expert coach and wrote about it. I blogged about the logistical difficulties that would preclude coaching from becoming widespread. To my knowledge, in the two years since I wrote that post, coaching has not caught on as a performance improvement measure.
It's too bad, because in an ideal world, video evaluation of operative procedures and coaching would be great. Unfortunately, we don't live in an ideal world.
16 comments:
Please comment for the blog on the study's observation that the busier surgeons tended to be in the top performance groups, an observation which doesn't mesh with other volume vs. performance studies that you have commented on in the past. Do you suppose this volume/performance correlation is somehow particular to laparoscopic surgery?
This is great commentary on this article, which I think was emailed to me by a half dozen people after its publication. It is interesting to think of trying to extrapolate these results to all physicians rather than just pick on surgeons. (Haven't you read some PCP notes you would have loved to submit for "peer review and critique"?) Hidden within your commentary is a related issue of access to care within our system. Is it even remotely possible for everyone to be operated on by the "extremely skilled" surgeon? Should this be what the healthcare system strives for (essentially setting up clinics and urgent cares around the country to funnel people to huge procedural Meccas in strategic geographic centers)? Isn't it unfortunate that surgeons who are NOT board-certified often drift to areas where competent general surgical care is needed most? (Oftentimes areas where payor mix and unpredictability of workflow mean a surgeon in these locations is financially punished compared to his/her peers.) Just a few thoughts on how this is all interconnected.
An unspoken assumption is that the less skillful surgeons can improve. Can they?
A cursory look into any OR would give the impression that some people just have better psychomotor skills. Sure, they improve with training and practice. But after 5 years of residency and several years of practice, can a practitioner improve much technically?
Robert, that's an excellent question. I really don't know the answer. The volume/outcome question remains unsettled. It is possible that the busiest bariatric surgeons in Michigan are good because they're busy and not the other way around.
Josh, excellent points. I hinted that other specialties should have to undergo this level of scrutiny too. Yes, everyone cannot be operated on by only the most skilled surgeons. It is hard to tell from a sample of 20 surgeons just how many really skilled surgeons there are in the country.
Anon, that's a very good observation. It is not clear that skills of attending surgeons can be significantly improved by videos and coaching. It sounds good, but to my knowledge, it has not been studied.
That brings up another point. Is it realistic to expect that all surgeons will become highly skilled? As more are studied, one would expect regression toward the mean. Also, there will always be above average, average, and below average surgeons. The only category of people I know of that is all above average is medical students. See http://skepticalscalpel.blogspot.com/2012/10/medical-school-grading-and-t-ball.html
I'm not sure that using retired surgeons for "coaching" would work out so well. I mean no offense to present company, but is someone who retired five years ago going to be able to help a newbie out with something that is cutting edge, like robotics or single port laparoscopy, or, gasp, NOTES?
Speaking to Josh above, don't equate board certification with surgical ability. They are NOT the same thing at all.
I think a program of random and anonymous review of videotape of each others' work, with added critique and opportunity for response, would be beneficial. Of course, everyone is going to send in their prize pig for the contest.
Surgeons follow a basic established blueprint for how to proceed with "usual" cases like lap choles. They learn the "steps," and then over time their technique and speed improve. The learning curve will be steeper for more complicated cases (like lap Roux-en-Y gastric bypass) and in higher risk patient populations (like the morbidly obese). Watching a video of highly skilled surgeons could help that learning process. Then viewing oneself, or having a peer review, could establish a baseline to measure improvement. It's a good idea.
I worry that younger surgeons are learning primarily a laparoscopic or robotic approach. So what happens when there is an unexpected problem that requires the patient to be opened? I want my surgeon to know when to convert to a traditional laparotomy, and to be able to confidently execute that switch. I want my surgeon to know how to sew, not just how to deploy the latest endoscopic gadget. Bring on the retired surgeons to mentor. Experience = saving lives. Over-reliance on technology = lawsuit.
DD
The article is behind a paywall, so I will just assume that the bariatric surgery studied is a laparoscopic Roux-en-Y.
That's a tricky procedure where experience and good videogame skills (er, I mean videoscope skills) count a lot.
What about a bread-and-butter operation like a lap chole? Does the surgeon with the 15 minute operation have better (or maybe worse) results than a colleague averaging 45 minutes per case?
Artiger, I agree that board-certification does not necessarily equate to competence. You are probably right about the retired surgeons not being able to help with some cases.
DD, good points, especially your concern about open cases, which by the way, would be more difficult to video.
Anon, yes it was laparoscopic Roux-en-Y gastric bypass. I should have mentioned that. How fast a case is done may not always correlate with outcomes. I have seen surgeons who are fast and sloppy. The patient might be better off with a slow and careful surgeon.
The public (and the "state" on its behalf) have long expected us to evaluate our own (and each other's) performance. Our failure to do so has resulted in regulatory mandates from CMS and The Joint Commission. Thirty years ago, a colleague said of PROs, "This is the last chance we have to do this ourselves...if we don't, the government will come in and do it for us." And so they have.

The study in Michigan should be commended for taking the initiative and defining, or at least exploring, methods to do this effectively and constructively. Hospitals are already being forced to spend dollars on resources for credentialing, OPPE/FPPE, and peer review activities, so some of the logistics could be addressed by putting those existing resources toward making this as easy as possible to accomplish. Data triggers like infection rates, readmissions, and returns to the OR might be used to "select" surgeons for a deeper-dive observation rather than observing all surgeons.

This study is a great step forward, and we should take advantage of it to explore the possibilities. There will always be great, good, fair, and poor surgeons, but this provides us with a tool to try to get everyone to be the best they can be. A worthy goal!
Anon, you make some excellent points. Selective use of the technology might be more feasible and yet worthwhile. I agree we haven't policed ourselves very well.
As someone fresh out of training, it's really vindicating to know that skill does correlate with outcomes. I disagree with DD in that just because you are old/experienced doesn't make you a technically good surgeon. Even now I will defer to someone older for decision-making help (approach/indication/avoid at all costs). When I get in trouble in the operating room, I don't call the guy who's done the most or is the oldest; I call the guy who I think is the best technically. As a resident, you kind of learn who is good technically and try to steal/learn their moves. Watching some of those videos just highlights what I try to teach my residents, which is the importance of exposure and setting yourself up. The best technical surgeon I ever worked with was so good he made the resident look good.
I, for one, being arrogant, welcome an assessment of my skills. Would I want the results to be public? For better or for worse, I would accept it. If I got bad scores, so be it; it would force me to improve technically and provide better care for my patients. If they were good, it would help me get more patients versus the more traditional doctor/referral good ol' boys network. My expectation is that I'm average and that as I get older and do more I will improve technically, but not a lot unless I find myself with a bad score and really have to force myself to re-evaluate things I'm doing now.
Breaking down a baseball player's performance and trying to predict it has become more of a science. Why can't that apply to us? If my job were on the line, you'd better believe I'd get my on-base percentage up.
If we as surgeons don't improve and take self-governed performance standards seriously, we're gonna be like blue-collar unions that rest on seniority as a measure of importance and find ourselves floundering in terms of leverage come negotiation time.
pamchenko, I hear you. I remind everyone that I never said that video rating of surgeons was a bad thing. It's not. It's a good thing. I just don't think it is feasible for every surgeon in the US to be rated this way. Although some have passionately defended the concept, no one has addressed the issues I raised in the post regarding the widespread implementation of this idea.
What about making it a part of what it takes to become a FACS? That way you have independent privileges. Possibly as part of recertification? Then again, this type of pain would only add to my membership costs/annual dues :(
Maintenance of certification (paid for by the doctors) is already very expensive. As I said in the post, "Who would pay for all of this?"
I have a question. Could the patient suffer any repercussions later on as a result of the sloppy suturing? Or is it just a matter of principle and pride to be neat?
Emily, yes, the poorly placed sutures might be more likely to cause a leak than sutures that were placed more accurately.