
Thursday, November 14, 2013

Reality check: Hospital safety scores



Imagine you are sick and live in New York City. Your doctor tells you that you need major surgery. Luckily, you have excellent insurance and can go anywhere in the city for that operation.

Being a good consumer, you decide to check the HospitalSafetyScore.org website, which is sponsored by the Leapfrog Group, a nationally known patient safety organization.

You pull up a handy map of upper Manhattan and the lower Bronx, the area near your neighborhood, to check the safety scores of the hospitals there.

A hospital on the Manhattan side (orange arrow) has a safety score of only "C" while over in the Bronx, there is an "A" rated hospital (blue arrow).


It's a no-brainer, right?

Clearly the safer of the two is the one with the "A" rating.

But consider this. The "A" rated hospital is Lincoln Medical and Mental Health Center, one of 11 hospitals owned and run by the city of New York. It is a teaching hospital, but there is little research going on, and there are virtually no regionally or nationally recognized experts in any specialty of medicine or surgery practicing there.

The "C" rated hospital is New York Presbyterian, the main teaching hospital of Columbia University's medical school.

A 2012 patient safety study by Consumer Reports rated Lincoln as the 16th worst hospital for safety in the NY metro area. Presbyterian did not make that list of 30 such hospitals.

Healthgrades rates New York Presbyterian as #5 of 203 hospitals in New York State, with fifteen 5-star ratings and 11 quality awards. Lincoln was ranked #88, with three 5-star ratings and one quality award.

US News & World Report published its "Honor Roll" of the 18 best hospitals in the country. That list included New York Presbyterian at #7, the highest rated hospital in the New York area.

Now which hospital would you choose?

Tuesday, August 6, 2013

The ultimate hospital rating system



Finding out which hospitals are best is like "a riddle, wrapped in a mystery, inside an enigma."

Are you tired of seeing conflicting ratings from such once-respected sources as Leapfrog, Medicare Compare, HealthGrades and Yelp?

Does it confuse you when a hospital is ranked in the top 10 by US News and World Report, but is "god-awful" according to Consumer Reports?

The Skeptical Scalpel Institute for Advanced Outcomes Research is proud to unveil a new rating system for hospitals.

Advanced metrics and creative statistics are linked with secret Bayesian methodologies, patient surveys and publicly available databases to yield the most powerful and accurate hospital ratings ever imagined.

Just kidding.

Based on word of mouth, innuendo and rumors that have come to our attention through the back channels of the Internets, Skeptical Scalpel Ratings Plus offers unparalleled accuracy in hospital ratings.

For the low, low price of only $29.99, you will receive the Skeptical Scalpel Hospital Ratings Guide.

But wait, there's more. Here's the plus. At no extra charge, we will send you information about a weird trick that will enable you to undergo open heart surgery for just $8.00. Insurance companies are furious.

That's not all. The first 127 people to buy the guide will get a set of the sharpest kitchen knives known to mankind as our gift.

Are you a hospital administrator? Be advised of some special features of our ratings just for you.

If your hospital's ranking is not what you think it deserves, don't do what all the others do and waste time with in-house task forces or ad hoc committees. And don't claim we used bad data. Since we don't use data, that won't work.

Instead, try an on-site consultation from the Skeptical Scalpel Institute. We can show you ways to improve your standing without the need for expensive new patient safety programs.

If you are pleased with your ranking and want to advertise your Skeptical Scalpel designation, you must first contract with us. We sell banners, plaques and lobby displays. Call to discuss details. Operators are standing by.

Fee schedules for consultation and advertising will be provided upon request.

Dial MY-RATINGS.

Monday, July 8, 2013

Breaking news - Turning the tables: Doctors to rate patients online



This just in.


BREAKING NEWS - Turning the tables: Doctors to rate patients online. LINK

Thursday, September 15, 2011

Joint Commission Proves It's as Irrelevant as HealthGrades

The New York Times reports that the Joint Commission has just published a list of its 405 "Top Performing Hospitals." As is typical of these types of evaluations, most of the large, well-known teaching hospitals where knowledgeable folks [like doctors] go for care when they are really sick didn't make the list.

Similar to the HealthGrades list of "top" hospitals [which I have commented on in the past], the JC's list is dominated by small community hospitals without teaching programs. These institutions know how to play the game and have figured out that compliance with process measures results in high marks from the likes of HealthGrades and the JC. For a related post on why process measures don't mean better care, click here.

At least the Times article pointed out that no hospital in New York City made the JC list. What about Chicago? Sorry, just a children's hospital and a VA hospital. Philadelphia? No university hospital but one community hospital. Surely University of Pittsburgh Medical Center? No, not the main hospital but several of its suburban affiliates made the list. St. Louis? Nope, no good hospitals there. Pick any major city. See the list for yourself.

Guess which state had the most top performing hospitals by far? It's a state that immediately comes to mind when one thinks of quality medical care. Of course, Florida with 51, not one of which was a university hospital unless you want to count the University Hospital & Medical Center of Tamarac. A visit to its website fails to reveal the name of the university it is affiliated with. However, Don Shula, former Miami Dolphins coach, endorses the emergency department in a short video.

A survey I would like to see: which hospitals would the executives of HealthGrades and the JC choose if they or a family member had a serious illness? I'm guessing that list would be a lot shorter and would not include more than 400 of the 405 on the JC list of top performers.


Tuesday, March 1, 2011

America’s Top 50 Hospitals. Says Who?

If you are looking for a chuckle, read the recently published HealthGrades list of America’s top 50 hospitals. The Healthgrades people have come up with a formula that selected 50 hospitals based on the following two parameters: “To be recognized with this elite distinction, hospitals must have had risk-adjusted mortality and complication rates that were in the top 5% in the nation for the most consecutive years. On average, patients treated at America’s 50 Best Hospitals had a nearly 30% lower risk of death and 3% lower rate of complications.”

That’s nice but it leads to some curious anomalies. For example, there are no university teaching hospitals on the list. By university teaching hospital, I mean a university hospital that serves as the primary teaching hospital for a medical school. Teaching affiliates rarely have the resources and research backup that university hospitals have. There are a few university-affiliated teaching hospitals on the list such as Hackensack University Medical Center in New Jersey. [Note: for those of you outside the New York City metropolitan area, there is no Hackensack University.] But even if you want to count the university-affiliated hospitals, there are not even 10 on the list. Twelve of America’s top 50 hospitals are in Florida with an impressive six of those located near the apparent Mecca of modern medicine [if you believe HealthGrades], West Palm Beach.

Let’s say I live in New York City and get sick. No hospitals in the states of New York or Connecticut made the top 50 list. My choices for hospital care are either the aforementioned Hackensack University Medical Center or the Community Medical Center of Toms River, both in New Jersey. I have never set foot in either place. They might be great hospitals. But I think I’ll take my chances at Columbia Presbyterian, Cornell or NYU.

Do you live in St. Louis? Sorry, no top 50 hospitals there. But you can go to the only hospital in Missouri on the list, St. Luke’s Hospital in Chesterfield. Pennsylvania? Sorry, no top 50 hospitals in Pittsburgh or Philadelphia but there are four in the Allentown-Bethlehem-Scranton area. Boston? Too bad. No Massachusetts hospital is good enough to be included in the HealthGrades Top 50 list. I hope Mass General, Deaconess and the Brigham don’t have to shut down.

I could go on. But see for yourself.

I posted a similar blog on this topic in August of 2010. I think lists like this are misleading, if not actually harmful. I wish they would stop publishing them. But I guess the unsuspecting public eats this stuff up. Oh, and it is certainly a boon for the public relations people at the 50 lucky hospitals on the list.

Wednesday, February 9, 2011

Hospital and Doctor Ratings: Junk Science? No, No Science at All

Pop statistician/philosopher Malcolm Gladwell takes down the ridiculous college rating process in this week’s New Yorker magazine. While pointing out the many problems with the way U.S. News goes about ranking colleges, he mentions a 2010 study published in Archives of Internal Medicine that similarly debunks the rankings of hospitals. That study found that reputation alone is the key to a hospital receiving a high ranking and that the ranking has nothing to do with quality. This is true of the college ranking system too. A key point regarding the power of reputation: as the rankings are publicized every year, the top hospitals and colleges become even more prestigious, which, of course, enhances their reputations further.

The college ranking methodology takes a number of factors into account and weights them arbitrarily. Gladwell feels that cost is not given enough emphasis, as expensive private universities dominate the top of the list. When it comes to hospital rankings, U.S. News surveys only 250 physicians in each specialty and asks them to name the five best hospitals in their field. It is impossible for a hospital without an established national reputation ever to be ranked highly.
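To see why such a survey measures reputation rather than quality, here is a toy simulation (entirely hypothetical; the hospitals, voter counts and publicity boost are all invented, and this is not U.S. News’s actual method). Physicians name hospitals in proportion to how well known they are, and a high published ranking makes a hospital better known the next year:

```python
import random

random.seed(1)

# 20 hypothetical hospitals; hospital 19 starts out slightly better known.
reputation = [1.0 + 0.1 * i for i in range(20)]

def survey(reputation, voters=250, picks=5):
    """Each surveyed physician names `picks` hospitals, with the odds of
    naming a hospital proportional to its current reputation."""
    votes = [0] * len(reputation)
    for _ in range(voters):
        # (a physician may repeat a name in this crude sketch;
        # it does not change the point)
        for choice in random.choices(range(len(reputation)),
                                     weights=reputation, k=picks):
            votes[choice] += 1
    return votes

for year in range(10):
    votes = survey(reputation)
    ranking = sorted(range(len(votes)), key=lambda i: -votes[i])
    # Publicity effect: making the published top five boosts next year's
    # reputation, which boosts next year's votes, and so on.
    for hospital in ranking[:5]:
        reputation[hospital] *= 1.2
    print(f"year {year}: top five = {ranking[:5]}")
```

Run it and the same handful of initially well-known hospitals locks in the top spots within a few years. A hospital that starts without a national reputation has no realistic way in, which is exactly what the Archives study found.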

This whole charade is carried to an almost comical extreme by the folks at Castle Connolly, who bring you “America’s Top Doctors” and regional offshoots of the same concept. These ratings are eagerly awaited every year and are the subject of lengthy articles in magazines. Advertising revenue is generated as hospitals tout their MDs who have been fortunate enough to make the Castle Connolly list.

Here is a little secret. Castle Connolly sends questionnaires to hospital department chairs and asks them to name the best doctors in not only their fields but every medical specialty. No other criteria are used. Having been a department chairman for over 23 years, I can tell you that it is impossible for me to know anything about the quality of the work of any physician at another hospital or even sometimes another specialty in my own hospital. The vote is strictly by reputation. Many fine doctors make the list because reputations are very often correct. But not always. A surgeon can be well-known for research or involvement in organizations, but she may not necessarily be the best clinician around.

So what is a prospective patient to do? I’ve already blogged about the shortcomings of Healthgrades [here and here] and the CMS [Medicare] Physician Compare website is not ready for prime time. For now, I suggest you ask friends who may have had illnesses similar to yours for recommendations. Or you’ll have to trust the doctor who refers you to a specialist and your instincts when you meet her.

Friday, October 22, 2010

Hospital Ratings Revisited

A recent press release from HealthGrades claims that some 232,442 Medicare patients’ lives could have been saved over a three-year period if all hospitals performed at the level of a HealthGrades five-star hospital. While this is a laudable premise, can it be true? Let’s see.

First you need to know something about HealthGrades and its rating system. Using a large Medicare administrative database (that is, the data are submitted by hospitals for billing purposes), HealthGrades compares hospitals on an observed vs. expected outcomes basis. For some reason, hospitals are rated as five-star (best), three-star (as expected or average) or one-star (poor). There is no mention of four- or two-star ratings. And according to their methodology, “…70% to 80% of hospitals in each procedure/diagnosis were classified as three stars, with actual results not significantly different from predicted results. Approximately 10% to 15% were 1-star hospitals and 10% to 15% were 5-star hospitals.” For non-statisticians, that would be classified as a normal distribution.

Now what would happen if every hospital in the U. S. performed at the level of a five-star hospital? Well, the observed rate of complications and deaths would go down but as long as one compares observed vs. expected outcomes, the distribution of hospital ratings would still be normal with 10%-15% being above average, 70%-80% average and 10%-15% below average.

Therefore, with the possible exception of hospitals in Lake Wobegon (“Welcome to Lake Wobegon, where all the women are strong, all the men are good-looking, and all the children are above average.” [Garrison Keillor]), all hospitals cannot be above average.
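The arithmetic is easy to check. Here is a minimal sketch in Python (invented numbers and a made-up grading rule, not HealthGrades’ actual model) of a relative observed-vs.-expected rating. Stars are handed out by rank, so even if every hospital cut its mortality by 30%, the star counts would not budge:

```python
import random

random.seed(42)

def star_ratings(oe_ratios):
    """Grade on the curve: the best ~12.5% of observed/expected ratios
    get 5 stars, the worst ~12.5% get 1 star, the rest get 3."""
    ranked = sorted(range(len(oe_ratios)), key=lambda i: oe_ratios[i])
    stars = [3] * len(ranked)
    tail = len(ranked) // 8   # roughly 12.5% at each end
    for i in ranked[:tail]:   # lowest O/E = fewer deaths than expected
        stars[i] = 5
    for i in ranked[-tail:]:  # highest O/E = more deaths than expected
        stars[i] = 1
    return stars

# 1,000 hypothetical hospitals with true mortality rates of 2% to 8%.
rates = [random.uniform(0.02, 0.08) for _ in range(1000)]
expected = sum(rates) / len(rates)  # the model's risk-adjusted expectation
before = star_ratings([r / expected for r in rates])

# Every hospital improves to "five-star" performance: mortality down 30%...
improved = [0.7 * r for r in rates]
new_expected = sum(improved) / len(improved)  # ...but "expected" recalibrates
after = star_ratings([r / new_expected for r in improved])

for s in (5, 3, 1):
    print(f"{s} stars: before={before.count(s)}, after={after.count(s)}")
```

Fewer patients die in absolute terms, which is the laudable part, but as long as the grading is relative, 10% to 15% of hospitals must land at the bottom no matter how good everyone gets.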

Then there is the problem of using administrative databases to judge clinical outcomes. HealthGrades’ own description of its methodology lists the following disclaimers:

“Limitations of the Data Models

“It must be understood that while these models may be valuable in identifying hospitals that perform better than others, one should not use this information alone to determine the quality of care provided at each hospital. The models are limited by the following factors:

“Cases may have been coded incorrectly or incompletely by the hospital.

“The models can only account for risk factors that are coded into the billing data - if a particular risk factor was not coded into the billing data, such as a patient’s socioeconomic status and health behavior, then it was not accounted for with these models.

“Although Health Grades, Inc. has taken steps to carefully compile these data using its methodology, no techniques are infallible, and therefore some information may be missing, outdated or incorrect.”

There are a number of peer-reviewed articles questioning the validity of using administrative databases in clinical outcomes research. A study of patients with cerebral aneurysms, from the Bloomberg School of Public Health at Johns Hopkins University, found many large discrepancies between the Maryland state administrative database and the clinical records of the patients at their institution. A paper from Harvard and Tufts concluded “Cardiac surgery report cards using administrative data are problematic compared with those derived from audited and validated clinical data, primarily because of case misclassification and non-standardized end points.” A systematic review of papers on infectious diseases found that administrative databases have “limited validity” for the evaluation of co-morbidities, a key factor in risk adjustment.

Try this with some hospitals you might be familiar with: compare HealthGrades’ ratings with those of “Medicare Hospital Compare,” which one must assume uses the same outcome data, since HealthGrades uses Medicare’s data for its ratings. Here are the results for heart attack outcomes for three hospitals in New York City. (See Table.) The rating scales are comparable: three possible grades each.


I don’t know which one to believe. Do you?

Note: A previous blog post of mine pointed out a few other issues with HealthGrades that everyone should be aware of.

Thursday, August 5, 2010

Hospital Profiling: Is Healthgrades the Answer?

In a word, no. Let’s take a look at their most recent rankings. According to Healthgrades’ America’s Top 50 Hospitals list, not a single medical school-based university hospital made the cut. This means that either Healthgrades or the list recently published by US News and World Report must be completely wrong, since their two lists are mutually exclusive. While I am not so sure US News has it right either, I think I’d take my chances with their top 14 over any of the top 50 listed by Healthgrades.

If you look at Healthgrades ratings by state, you will find some interesting results. Let’s take Maryland for example. There is no question that Johns Hopkins is the best hospital in Maryland and certainly one of the finest in the country. But not according to Healthgrades. Of 20 possible specialty excellence award categories, Johns Hopkins was ranked as one of the top hospitals in Maryland in exactly one, prostatectomy. For fun, search your state and see what turns up.

There is a paucity of scientific research on Healthgrades with very few citations in PubMed. An article published in JAMA in 2002 appears to still be valid today. From the abstract: “Hospital ratings published by a prominent Internet health care quality rating system identified groups of hospitals that, in the aggregate, differed in their quality of care and outcomes. However, the ratings poorly discriminated between any 2 individual hospitals' process of care or mortality rates during the study period. Limitations in discrimination may undermine the value of health care quality ratings for patients or payers and may lead to misperceptions of hospitals' performance.”
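That “poorly discriminated between any 2 individual hospitals” finding is just small-sample statistics at work. A toy two-proportion z-test (my own hypothetical numbers, not the JAMA study’s data) shows why realistic case volumes cannot tell a 4% mortality rate from a 6% one:

```python
from math import sqrt

def z_statistic(deaths_a, n_a, deaths_b, n_b):
    """Two-proportion z-test: how many standard errors apart are the
    two hospitals' observed mortality rates?"""
    p_a, p_b = deaths_a / n_a, deaths_b / n_b
    pooled = (deaths_a + deaths_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hospital A: 8 deaths in 200 cases (4%); Hospital B: 12 in 200 (6%).
z = z_statistic(8, 200, 12, 200)
print(f"z = {z:.2f}")  # about -0.92; nowhere near the 1.96 needed for p < 0.05
```

That is a 50% relative difference in mortality, and the comparison comes nowhere near statistical significance. Ratings that claim to separate such hospitals are mostly ranking noise.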

A 2005 paper raised questions about Healthgrades’ use of administrative databases to define quality of outcomes. The authors concluded: “Substantial variability of reported outcomes is seen in administrative data sets compared with an audited clinical database in the end points of the number of procedures performed and mortality. This variability makes it challenging for the nonclinician unfamiliar with outcomes analysis to make an informed decision.”

There are some other notable issues with Healthgrades. It is not widely known that the company offers consulting services. These are promoted as helping hospitals achieve better quality, but they raise an obvious conflict of interest. It would be like a gymnastics judge charging for advice on how a gymnast could improve her score at the next Olympics. Would it not be in the best interest of the judge to have said gymnast’s score be better?

While we’re on the subject of ethics, Googling Healthgrades yields an interesting finding on the first page of hits: a web site voicing consumer complaints. The majority of these numerous complaints detail Healthgrades’ practice of charging a fee for a report on a doctor or hospital, which is of course OK. What’s not OK is that they sometimes tack on a monthly “watchdog” fee, which is billed to the consumer’s credit card, in what the consumers claim is a hidden charge that they did not agree to. And it is difficult to have the extra charges removed and refunded.

In case you think this is not big business: Healthgrades has apparently just been purchased by another company for $294 million. But even this deal is coming under fire, as some shareholders feel the company was underpriced. An investigation is under way.

So how does one pick a good hospital? I would say find a doctor you trust, ask for recommendations and do your own research. But I would take all ranking systems with a large grain of salt.