Showing posts with label Board exams.

Friday, September 23, 2016

Review: Online question bank for med students and residents


I just finished evaluating a study aid for National Board of Medical Examiners shelf examinations. It’s called ExamGuru, an online question resource for the major specialty rotations encountered by a third-year medical student.

The surgery question bank contains a total of 395 questions. You can create your own multiple-choice tests of any length, timed or untimed, and you can focus on the subsections of surgery you want to emphasize.


What makes this question bank unique is that you not only get the answer; you can also see whether the question is easy or difficult and how you compare with peers who have answered it previously.


Questions that are too hard or too easy are revised or replaced.

Thursday, August 11, 2016

You failed the written boards: What to do now

I graduated this July and took the QE (written general surgery boards) on July 19th. I got my results today and I failed. Not only did I fail but my score placed me in the 5th percentile. Needless to say I'm disappointed. You hear stories about CE (oral exam) failure but never about QE failure. I never blew the ABSITE out of the water (50, 29, 20, 34, 38), but I never would have expected to perform so poorly. Rather than search for blame I'd like to form an effective strategy so that I pass the second time around.

I am sorry to hear of your misfortune. I can’t imagine how you must be feeling.

In your time as a PD did you have a resident or residents fail the QE? Several residents failed the QE. The most notable was a guy who never got below the 60th percentile on the ABSITE. To this day, I cannot understand how that happened.

What became of these people? You will be happy to learn that nearly everyone eventually passed. Patients never ask how many times it took you to pass the boards. They don’t know about that sort of thing. As far as I know, failing the boards on the first attempt has no long-term ramifications except for your program, which is judged by the percentage of residents who pass the boards on their first attempt.

Any advice on how to avoid another failure? In order to help you answer the last question I will tell you that I went through SESAP 15 once. I listened to the audio as well. I stuck to high yield sources and UpToDate to supplement SESAP. I avoided reading any formal textbooks but I did read Cameron front to back during residency.

I have found that everyone learns in different ways. There is no single path to success.

One thing you said caught my attention. “I avoided reading any formal textbooks…” I think that would be a good place to start. You need to get a basic full-sized surgery textbook and read it carefully all the way through. I would advise you to take notes, make flashcards, or whatever else you think might help you to remember important points. Cameron is a great book but in my opinion it is more suited to studying for the oral boards because it is more clinically focused.

SESAP is geared more toward surgeons doing recertifying exams and is probably not worth spending time on for the QE.

Many of my residents used books of practice questions, which may help, but only after you have done a lot of reading.


After you have studied your textbook and are feeling fairly comfortable, you should think about taking an intensive review course a few weeks before the exam. That may help solidify your knowledge. Taking a review course without studying beforehand probably won’t work, because so much information is presented in such a short time that you will not be able to retain it all.

Study hard because the last thing you need is to fail the QE again. That would put tremendous pressure on the third attempt. You don’t want to be in that position.

I hope this helps. Good luck.

Wednesday, July 22, 2015

Review courses and board exams

Four years ago, I wrote a post called "Hints for new residents." Among my 15 tips was this: "Read, read, read. This isn't like school. You can't cram for your boards. You can’t learn 4 or 5 years’ worth of material in a one-week review course. You have to learn it as you go along."

Just published online in the journal Surgery is a paper entitled "Review courses for the American Board of Surgery certifying examination do not provide an advantage" by four officials from the board.

They surveyed new surgeons who took the certifying (oral) exam from October 2012 through June 2013: 1067 first-time takers and 329 who had previously failed the test. The overall response rate was 90%.

The pass rate for first-time takers was significantly higher than that of repeaters, 82.1% vs. 72.6% (p < 0.001). Overall, 77.9% of examinees took a review course: 76.1% of first-time takers compared with 84.6% of repeaters (p = 0.002).
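The reported difference can be sanity-checked with a standard two-proportion z-test. A minimal sketch, with the caveat that the counts below are reconstructed from the published percentages and sample sizes, not the authors' raw data:

```python
import math

# Reconstructed from the paper's reported figures (approximations).
n1, p1 = 1067, 0.821   # first-time takers, pass rate
n2, p2 = 329, 0.726    # repeaters, pass rate

# Pooled proportion and standard error for a two-proportion z-test.
pooled = (n1 * p1 + n2 * p2) / (n1 + n2)
se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

# Two-sided p-value from the normal tail, via the complementary error function.
p_value = math.erfc(abs(z) / math.sqrt(2))
print(f"z = {z:.2f}, p = {p_value:.5f}")  # z ≈ 3.8, consistent with p < 0.001
```

This reproduces the magnitude of the reported significance; the exact figures in the paper may differ slightly because the true pass/fail counts are rounded here.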

Friday, January 30, 2015

It's that time of year again

Hopes are high; everyone is prepared; all the talk is over. The big day is finally here.

No, it's not about the Super Bowl. It's about the American Board of Surgery In-Training Examination (ABSITE).

Every year at the end of January, all surgical residents take a five-hour, 250-question multiple-choice test. For many, it can be a watershed moment because their careers may be on the line.

I have written about the use of the ABSITE as a criterion for resident promotion. Whether you think it should be or not, it is used that way—sometimes as the only criterion. You can bet that in a few weeks, some residency programs will post notices saying they are looking for a categorical PGY-2 or 3 due to an "unexpected" vacancy for July 2015.

Another attending surgeon and I used to take in-house call the night before the examination so that all of the residents could take the test after a decent night's sleep.

Now the test may be given on different days so that the entire group does not have to take it at once.

One difficult situation I faced as a program director was when I had a good clinical resident who just could not do well on a multiple-choice examination. I had to decide whether keeping a resident who scored at the 10th percentile was worth the gamble. Scoring in the 10th percentile or less on a regular basis means that the resident has a good chance of failing the written board examination.

Of course, the very nature of percentiles is that 10% of those who take the test will finish in the 10th percentile or below. Also, the failure rate of the written board examination has hovered around 25% for many years.

The problem for programs is that the Residency Review Committee for Surgery mandates that 65% of a program's graduating residents must pass both parts of the board examination on the first attempt.

Of the many things I do not miss about practicing medicine during this turbulent era, the palpable level of anxiety surrounding the buildup to the exam and waiting for the dreaded results to come back rank high on the list.

I wish all residents who are taking the test the best of luck. I hope you were reading all along and not trying to cram a year's worth of studying into the week before the test.

May you all score above the 50th percentile.

Monday, November 17, 2014

Should resident promotion decisions be based on a written exam?

A few days ago, some surgeons on Twitter discussed the role of the American Board of Surgery In-Training Examination, a test which is given every year in January.

The test was designed to assess residents' knowledge and give them an idea of where their studying should be focused. However, many general surgery program directors (PDs) use the test results in other ways. Some impose remediation programs on residents with low scores and even base resident promotion or retention on them. Some even demand that all residents in their programs maintain scores above the 50th percentile.

The Residency Review Committee (RRC) for Surgery frowns upon these practices and states in its program requirements (Section V.A.2.e) that residents' knowledge should be monitored "by use of a formal exam such as the American Board of Surgery In Training Examination (ABSITE) or other cognitive exams. Test results should not be the sole criterion of resident knowledge, and should not be used as the sole criterion for promotion to a subsequent PG [postgraduate year] level."

The problem for program directors is that the RRC also mandates (Section V.C.2.c) that "as one measure of evaluating program effectiveness" 65% of a residency program's graduates must pass both the American Board of Surgery's Qualifying Examination (written) and Certifying Examination (oral) on their first attempts. I have said before that the "65% on the first attempt rule" does not seem evidence-based.

Tuesday, April 15, 2014

Why aren't all board recertification exams oral?


A loyal reader, who agrees with me that we may be teaching and testing medical students and residents the wrong way, asks why all board recertification examinations aren't given orally. She correctly asserts that oral examinations are better because they assess how people think rather than how much they have memorized.

Here's why it would be difficult to do.

The initial surgery board exam is given in two parts. First a written exam must be passed. Those who pass it are tested orally at one of four or five different times and locations within the following year. Oral exams are quite labor intensive.

Each candidate is examined by three pairs of surgeons, consisting of a senior examiner who is a member of the board itself and a surgeon from the local area. Each session lasts 30 minutes per pair for a total of 90 minutes. Multiply that by 1300+ examinees per year.

I had the privilege of serving as an examiner on one occasion. It's very stressful because one wants to be fair but also not let incompetent surgeons become certified. It's also much harder to standardize oral exams. Scenarios used in the exam are chosen by the board each year, but the individual examiners may have different approaches to the way the questions are asked and answered. For a number of surgical diagnoses, there may be more than one correct way to handle a problem, which makes creating a written exam difficult too.

About 1800 to 2000 general surgeons take the recertifying exams every year. To give each one of them a 90 minute oral exam would be very expensive and time consuming. It would be hard to find practicing surgeons willing to give up so much time to be examiners.
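The scale of the problem is easy to see with back-of-the-envelope arithmetic, using the exam format described above (three half-hour sessions, two examiners per session) and the upper end of the quoted examinee range:

```python
# Rough estimate of examiner workload if recertification were fully oral.
examinees = 2000            # upper end of the range quoted above
sessions_each = 3           # three 30-minute sessions per candidate
hours_per_session = 0.5
examiners_per_session = 2   # senior examiner plus local surgeon

examiner_hours = (examinees * sessions_each
                  * hours_per_session * examiners_per_session)
print(examiner_hours)  # 6000.0 examiner-hours per year
```

Six thousand examiner-hours per year, before any travel or preparation time, would all have to come from practicing surgeons.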

Many surgeons and other specialists are complaining about the cost of maintaining board certification. Taking a written recertification exam now involves going to a testing center and sitting in front of a computer. Many such centers exist, and traveling to them is much less complicated than going to one of the four or five cities where the oral examinations are held every year.

I do not see any way that recertification exams can ever be even partially oral. Until someone finds a way to make computer-based exams more clinically oriented, the ability to memorize facts will remain the basis for all recertification testing. 

There are other issues such as how to deal with surgeons who have specialized in a narrow area of surgery for many years, which is becoming more prevalent with so many graduates of residencies taking fellowships.

I addressed the other maintenance of certification components in a post last year. The concept of maintenance of certification is noble, but the execution is not working for those subjected to the process.

If anyone has a better idea, please comment.

Wednesday, January 16, 2013

Is Maintenance of Physician Board Certification a Sham?




Over the last few years, medical specialty boards have begun to compel physicians to maintain board certification by a number of means. This is an extension of recertification requirements which have been in existence since the mid-1970s.

Here is what the American Board of Surgery (ABS) mandates for Maintenance of Certification (MOC) every three years except where noted:

1. You must have an unrestricted medical license, hospital privileges in surgery, and references from the chief of surgery and the chair of the credentials committee of your hospital. It’s hard to argue with the need to have a license and practice in a hospital. However, if a surgeon had real quality issues, shouldn’t they have come to light before the end of a three-year cycle of MOC?

2. You must document 90 hours of CME credit, 60 of which must include some sort of Q & A testing which must be passed with an average score of at least 75%. I have previously blogged about the inadequacy of most CME programs. Even CMEs that require testing are often laughably simple. The American Board of Internal Medicine offers (for a price) open-book and Internet-based courses. Regarding self-assessed CMEs, the ABS website states, “[t]here is no required minimum number of questions and repeated attempts are permitted.”

3. You must successfully complete a written recertification examination every 10 years. Surely that must be an effective measure? Maybe not. For the last five years, the pass rate for recertification in general surgery has been 94% or greater. The American Board of Internal Medicine (ABIM) recert exams must be a little tougher, or those who take them may not be as smart as surgeons. The pass rates for the ABIM recert exams have been 88% to 92% for the last four years, with similar rates for all of the medical subspecialties.

4. You must participate in a national, regional or local outcomes registry or quality assessment program. Participation in a national outcomes registry sounds great, but none of the available registries have policing powers and many rely on individual surgeon input to track outcomes. As mentioned in the critique of the first requirement, quality issues are far more likely to be discovered at the local level than by a registry that collects data submitted by the surgeon herself.

As if all of the above issues are not enough, how about this for a hot potato from the ABS? “Periodic communication skills assessment based on patient feedback may also be required in the future.” I can’t wait to see how that information is going to be collected. By what criteria will communication skills be judged? And what will happen to someone deemed a poor communicator?

I suppose the boards are doing all of this to forestall government or other regulatory bodies stepping in. Meanwhile, let’s everyone play along.

None of the MOC requirements address another issue, which is fitness for practice. A December 10th article in the Washington Post on aging physicians notes that some hospitals are setting age limits at which doctors are required to have physical and mental evaluations in order to maintain staff privileges. That’s great, but not just for the elderly; every doctor needs to have periodic fitness testing.

Right now, all you have to do to stay on the staff of most hospitals is have a colleague attest to the fact that you are in good health, hardly a rigorous standard.

I’ve known a few physicians well under the age of 65 who could have used a checkup from the neck up.

To answer my own question, maintenance of certification is a sham.


Monday, June 4, 2012

Does size matter? Surgery residency program size & board passage rates


A study in the May 2012 issue of the journal Surgery found that the larger a residency program is, the more likely are its graduates to pass the written and oral board exams of the American Board of Surgery on their first attempt. Over the years 2006-2011, 85% of residents passed the written exam and 83% passed the oral exam on the first try.

The authors show linear regression lines with positive and statistically significant correlations between increasing size of a program and its residents’ first-time passage rates. They say, “This important finding may influence the application patterns and rank lists of medical students matching into general surgery residency programs.”

They mention only one limitation of the study, which is that they did not have first-hand knowledge of how the board passage data were produced. They apparently could think of no other potential confounding factors.

I can think of two right offhand.

One, it is well-known that larger programs, which are more apt to be based at medical schools, attract smarter applicants.

From a paper about factors predicting board passage on the first try:

Significant objective predictors of successful first-attempt completion of the examinations were Alpha Omega Alpha status [the Phi Beta Kappa of med schools], ranking within the top one third of one's medical student class, National Board of Medical Examiners/United States Medical Licensing Examination Step 1 (>200, top 50%) and Step 2 (>186.5, top 3 quartiles) scores, and American Board of Surgery In-Training Examination scores >50th percentile (postgraduate years 1 and 3) and >33rd percentile (postgraduate years 4 and 5).

These are all directly related to the intelligence of the resident. On the basis of this observation alone, first-time failure of the board exams is much more likely to occur among graduates of small programs.

The second confounder has to do with statistics. In his book, “Thinking, Fast and Slow,” Daniel Kahneman points out in Chapter 10 “The Law of Small Numbers” that “extreme outcomes are more likely to be found in small rather than large samples.” He gives an example of a large urn filled with the same number of white and red marbles. If one draws 4 marbles at a time and repeats the drawing many times, one is far more likely to have extremes of distribution, such as all marbles being the same color, than is one who draws 7 marbles at a time.

Imagine that all residents are created equal [Well, try.] and that drawing all red balls represents a resident failing the board exam. This will happen 8 times more often if one draws 4 balls instead of 7 (1/16 vs. 1/128). As you increase program size to, say, 10 residents per year, the disparity is even greater.
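The urn arithmetic is easy to verify. With equal numbers of red and white marbles, each draw is a fair coin flip:

```python
# Probability that ALL marbles drawn are red from an urn containing
# equal numbers of red and white marbles (each draw is a 50/50 event).
def p_all_red(draws: int) -> float:
    return 0.5 ** draws

p4 = p_all_red(4)   # 1/16  = 0.0625
p7 = p_all_red(7)   # 1/128 = 0.0078125
print(f"4 draws: {p4}, 7 draws: {p7}, ratio: {p4 / p7:.0f}x")  # ratio: 8x
```

The all-red outcome is exactly 8 times more likely with 4 draws than with 7, and with 10 draws (1/1024) the gap between small and large "programs" widens further.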

The size of the program is not the issue. Large programs and small programs have been put on probation or discontinued. It’s not about the teaching either. A recent survey revealed that residents in non-university programs felt they got better teaching than those in university programs. [See my blog about this here.]

I’ll tell you a secret. It’s about the individual resident. I’ve had residents who I am certain could have passed the boards given just a textbook and access to patients and operations. I’ve had others whom no amount of teaching, prodding, or remediation could salvage.

Thursday, February 2, 2012

Cheating, written board exams and “recall” questions

CNN "Exclusive: Doctors cheated on exams"

A recent dust-up about radiology residents accessing and memorizing questions from previous board examinations generated 1361 comments on CNN alone from physicians and others. Many more people took to Twitter and vented. Some said it wasn’t really cheating because the test-takers had to memorize the answers. Some said the radiology boards didn’t really sort out who was going to be a good radiologist anyway. Some said the questions were on non-clinical topics like physics. Some were highly indignant that such a thing could happen. I saw many comments suggesting the board simply write a new test every year.

The stockpiling of "recall" questions by residents and programs goes on in all specialties. I know it does in surgery.

Creating a completely new written exam every year is not a practical solution, for several reasons.

My understanding of the way the American Board of Surgery handles questions is this. Questions are recycled because they must be validated by analyzing them after they are used. In surgery, questions may appear on the residents' in-training exam one year, the re-certifying exam the next year and Part I of the boards the following year. Where appropriate, questions are also recycled through the subspecialty exams such as critical care, colorectal and others. As it happens, each test contains some reused questions and some brand new questions.

The board assesses certain things about each question, such as whether junior residents do as well as or better than chief residents on it. An ideal question is one in which the percentage of correct responses increases as the training level of the resident increases. The board also looks at whether questions are framed correctly; for example, a question that generates two or more answers chosen by similar numbers of test takers may be ambiguously worded. Questions with unusual response patterns or ambiguous answers are reformatted or discarded, and they are not counted toward the scores of the test that produced the unusual pattern.
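This kind of item analysis can be sketched in a few lines. The data and the pass/fail-by-level check below are invented for illustration, not the board's actual procedure:

```python
# Toy item analysis: does the fraction of correct responses to one
# question rise with training level? (All response data are made up.)
responses_by_level = {
    1: [1, 0, 0, 1, 0],        # PGY-1: 2/5 correct
    3: [1, 1, 0, 1, 0, 1],     # PGY-3: 4/6 correct
    5: [1, 1, 1, 1, 0],        # PGY-5: 4/5 correct
}

rates = {lvl: sum(r) / len(r) for lvl, r in responses_by_level.items()}
levels = sorted(rates)

# A well-behaved question discriminates: the correct rate strictly
# increases as the training level increases.
discriminates = all(rates[a] < rates[b] for a, b in zip(levels, levels[1:]))
print(rates, discriminates)
```

A question for which junior residents outperform seniors, or for which the rates are flat, would fail this check and be flagged for revision.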

A completely new test would contain quite a few questions that would have to be discarded if they were not validated by prior use.

Another problem is that writing good questions is very difficult. Most educators feel that a five-answer multiple-choice question should have one correct answer, one plausible but not exactly correct answer, two wrong answers, and one really, really wrong answer. The correct answers need to be found in commonly used textbooks. And there are only so many questions that can be asked. Try writing a few questions yourself. You’ll see how hard it can be.

I am conflicted about whether possession and use of copyrighted material from the boards constitutes cheating. Strictly speaking, I suppose it does. But where do you draw the line? When I was a residency program director, my trainees would often ask me what I thought the correct answers to some of the questions were. If I told them, would that be cheating? If they remembered a few of the questions and discussed them among themselves, is that cheating? What if a resident remembered a question and looked up the answer herself? Is that cheating? Or is it learning?

Here’s the good news. I’m not a residency program director any more.

What do you think?