Imagine you are sick and live in New York City. Your doctor
tells you that you need major surgery. Luckily, you have excellent insurance
and can go anywhere in the city for that operation.
Being a good consumer, you decide to check the HospitalSafetyScore.org
website,
which is sponsored by the Leapfrog Group, a nationally known patient safety
organization.
You pull up a handy map of upper Manhattan and the lower
Bronx, which is near your neighborhood, and check the safety scores of the
hospitals in that area.
A hospital on the Manhattan side (orange arrow) has a safety
score of only "C" while over in the Bronx, there is an "A"
rated hospital (blue arrow).
It's a no-brainer, right?
Clearly the safer of the two is the one with the
"A" rating.
But consider this. The "A" rated hospital is
Lincoln Medical and Mental Health Center, one of 11 hospitals owned and run by
the city of New York. It is a teaching hospital, but little research goes on
there, and hardly any specialty of medicine or surgery has a regionally or
nationally recognized expert practicing there.
The "C" rated hospital is New York Presbyterian,
the main teaching hospital of Columbia University's medical school.
A 2012 patient safety study by Consumer
Reports rated Lincoln as the 16th worst hospital for safety in the NY metro
area. Presbyterian did not make that list of the 30 worst hospitals.
Healthgrades
rates New York Presbyterian as #5 of 203 hospitals in New York State with 15
5-Star ratings and 11 quality awards. Lincoln was ranked at #88 with 3 5-Star
ratings and 1 quality award.
US News & World Report published its "Honor Roll" of the 18 best hospitals in
the country. That list included New York Presbyterian at #7, the highest-rated
hospital in the New York area.
Now which hospital would you choose?
13 comments:
This is painful and not fun. Being in the industry here in NYC, it seems obvious to me that Columbia is better than Lincoln, since two extremes were chosen. I would never go to Lincoln Hospital even though I've never been there. I've rotated at Columbia for a month. Am I a victim of my own bias or of the marketing machinery at Columbia? Assuming Columbia is a better hospital than Lincoln, the question then becomes: where is the flaw in Leapfrog's methodology? Quickly glancing at their methodology and experts, it seems pretty valid. It uses a lot of benchmarks that are popular for assessing "quality." One of their experts is that Birkmeyer guy who authored the paper on surgical technique a couple of posts ago.
I feel like I read a journal article here and am not pleased with the outcome of that paper. Therefore I go back to the article and scrutinize it and still refuse to believe the logical conclusion.
I wonder if there are other industries where quality can be really assessed well. Can anyone definitively say a piece of clothing from Old Navy is inferior to one from Banana Republic?
Being in the health industry, I'm sure we all know what this will mean: more paperwork and more phone calls to remove Foleys, document the necessity of a central line, etc.
Ultimately, is it fruitless to chase the ability to measure hospital quality?
And to answer the question, I would choose to go to CUMC over Lincoln because of my unshakeable bias. God bless those people with commercial insurance who go to Lincoln.
Pam, I believe Leapfrog's methods are flawed because they must be looking at only "process" measures. Some hospitals are very good at gaming those process requirements. And safety is only one element of a hospital's performance.
PS: I don't really believe the other rating systems either, but when they help me make a point, I use them.
Perhaps it matters what sort of surgery and treatment? If it's pretty standard, then you probably don't need a researcher or teacher, and you may have a better outcome from the Bronx hospital, so long as the surgeon is competent.
If the rating system measures inputs (hand washing, whatever), then for standard stuff, the inputs are maybe more likely to have a bigger impact on your recovery.
If the ratings are based on outcomes, then the Columbia hospital may have worse outcomes because it takes way more seriously ill patients?
And if you need some rare or especially difficult surgery, then having the super expert or researcher may be more important than the things the ratings measure (hand washing, whatever).
Anon, thanks for commenting. It's difficult to find out if a surgeon is competent. Outcomes are supposed to be adjusted for disease severity. Most are. Some people quibble about the methods of grading severity. I would not pick a hospital based on "patient safety" alone.
It's important to understand the methodology used to determine any rankings, ratings or scores for hospitals. It's a frequent topic of discussion among journalists. For some examples and more info on what to be aware of:
Should journalists cover hospital ratings? http://healthjournalism.org/blog/2013/10/should-journalists-cover-hospital-ratings/
As consumers see more hospital ‘report cards,’ reporters can explain their limitations: http://healthjournalism.org/blog/2012/06/as-consumers-see-more-hospital-report-cards-reporters-can-explain-their-limitations/
Journalists share tips for weighing hospital rankings: http://healthjournalism.org/blog/2013/04/journalists-share-tips-for-weighing-hospital-rankings-ahcj13/
Pia, thanks very much for commenting. Sorry about the delayed response. I was away.
Great links and quite useful.
When choosing a hospital, do average people most often select a facility based on a) proximity to their home, b) past personal or anecdotal experience, c) their primary doctor's preference, d) wherever their surgeon has privileges, e) academic vs. community based, f) a quick review of data on the hospital's infection rates, g) wherever their insurance company allows them to go, or h) a high grade from a quality or safety ranking system as viewed online?
One or all may apply. But since hospital statistics are getting easier to access thanks to the internet, choice "f" may be more helpful than choice "h".
Just today a patient's wife quizzed me on the "infection rates" at our facility and how they compared to other local hospitals. She was anxious that she had brought him to the 'wrong' hospital. I had to defer, but I found the data in the annual report on our state Department of Health website. Infection rates are calculated by taking the total number of infections, divided by the number of days patients were hospitalized, times 1,000. Our state Department of Health began collecting hospital-acquired infection (HAI) data in 2009, and our (community) hospital did have a (slightly) lower overall rate than two bigger teaching hospitals that are also part of our large university health system. Since the questioner's spouse had HAP (hospital-acquired pneumonia), I doubt my "data" lessened her overall worry. Her initial reasons for bringing him to that hospital? His primary doctor went there/proximity/convenience. DD
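For readers who want to run the same check DD describes, here is a minimal sketch of the infections-per-1,000-patient-days calculation. The hospital labels and counts below are invented purely for illustration; they are not taken from any health department report.

```python
# Hospital-acquired infection (HAI) rate per 1,000 patient-days,
# per the formula DD describes. All numbers here are made up.

def hai_rate_per_1000(infections: int, patient_days: int) -> float:
    """Total infections divided by patient-days, times 1,000."""
    return infections / patient_days * 1000

# Two hypothetical hospitals for comparison
community = hai_rate_per_1000(infections=12, patient_days=18_000)
teaching = hai_rate_per_1000(infections=95, patient_days=110_000)

print(f"Community hospital: {community:.2f} per 1,000 patient-days")  # ~0.67
print(f"Teaching hospital:  {teaching:.2f} per 1,000 patient-days")   # ~0.86
```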
DD, that's a good list of reasons why people choose a hospital. I think c and d are the most common.
It would be hard to go with f because the patient would have to blindly choose a surgeon. An Internet search might help, but would you trust it?
It's a tough problem without a simple solution.
Don't the major academic hospitals such as NY Presbyterian receive a lot of cases from other hospitals because of their complexity? So perhaps Columbia's cases are predisposed to complications due to their inherent complexity, and Lincoln may even refer those complex cases to Columbia to handle.
Hmm,
I'd say scorecards are just objectively awful.
At my institution (USC), we have LA County (catering to the knife and gun club), Keck Hospital (catering more to the country club), as well as several affiliated community hospitals (one of which caters to the nursing home bingo club).
Each of those has a different baseline risk for HAIs, making comparisons of actual rates anywhere from difficult to meaningless; it's hard to adjust when your entire patient population is simply different and requires different procedures.
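To make that case-mix point concrete, here is a small made-up example (every count below is invented for illustration) showing how a hospital that does better within each patient category can still post a worse crude HAI rate simply because it treats more high-risk patients.

```python
# Invented numbers showing how case mix can distort crude HAI rates.
# Within each risk stratum, Hospital A beats Hospital B, yet A's crude
# (overall) rate looks worse because most of its patients are high risk.

hospitals = {
    "Hospital A (referral/trauma center)": {
        "high_risk": (40, 10_000),   # (infections, patient_days)
        "low_risk":  (2, 2_000),
    },
    "Hospital B (community hospital)": {
        "high_risk": (6, 1_000),
        "low_risk":  (15, 12_000),
    },
}

for name, strata in hospitals.items():
    total_inf = sum(inf for inf, _ in strata.values())
    total_days = sum(days for _, days in strata.values())
    crude = total_inf / total_days * 1000
    by_stratum = {k: round(inf / days * 1000, 2) for k, (inf, days) in strata.items()}
    print(f"{name}: crude rate {crude:.2f}/1,000 patient-days, by stratum {by_stratum}")

# Hospital A's crude rate (~3.50) looks worse than B's (~1.62), even though
# A's rate is lower in both the high-risk (4.00 vs 6.00) and low-risk
# (1.00 vs 1.25) strata.
```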
Process measures are similarly flawed and subject to incredible gaming, as seen in that DVT performance measure fiasco:
http://media.jamanetwork.com/news-item/use-post-operative-blood-clot-rate-measure-hospital-quality-may-flawed/
I can't find it now, but an article recently concluded that a primary care physician's recommendation of a hospital and surgeon based on previous experience may actually still be a better measure than anything else right now, provided, of course, that the physician isn't locked into a group and effectively being paid to refer patients to the hospital that owns their practice.
Respectfully,
Vamsi Aribindi
Studdy, yes Columbia receives a lot of transfers. The data is supposed to be adjusted for patient acuity. It is difficult to sort these things out though.
Vamsi, thanks for your very pertinent comments. I had seen that JAMA article you cited. I agree it is an excellent paper that really highlights some of the issues in this area. I also agree with your comments about baseline risks of infection and how people choose their surgeons.
How do surgeons choose the hospitals in which to operate, in locales with several hospitals and when not restricted by the patient's insurance? Is the choice based on hospital outcomes, infection rates, etc.? Or is it more swayed by OR slots, OR accommodation of the surgeon's schedule, or the willingness of the hospital to buy equipment?
Anon, good questions. I think most surgeons choose a hospital to practice in based on location, facilities, OR time, bed availability, equipment, and need for her services. Insurance is rarely factored in. I doubt that anyone bases it on outcomes and infection rates. These are difficult to determine even for physicians.