13 April 2010

The Empty Charade of Continuous Certification

Kevin wrote today about his impending recertification in Internal Medicine, and by coincidence, today I took three of the exams needed for my recertification in Emergency Medicine.  Great minds think alike and fools seldom differ, and Kevin and I seem to be in agreement that the Maintenance of Certification (MoC) project is badly misguided and poorly implemented.  And, Kevin, just you wait, because ABIM is not yet as deep into the MoC morass as ABEM is, and it's going to get worse before it gets better. 

For the non-medicos, the way things used to work was that you graduated medical school, maybe (or maybe not) got some specialty training, got your license, and you could hang out your shingle as a physician.  Those who completed a residency or fellowship could take a test given by a member board of the American Board of Medical Specialties, and if they passed, they could advertise themselves as "Board Certified" in a given field or specialty.  Board Certification was supposed to represent a higher level of quality, and studies supported the idea that board certified docs did provide better care.  Over time, board certification evolved into a more or less obligatory merit badge for practice at most hospitals.

In Emergency Medicine, you have to recertify every ten years; the recert exam is a slightly shortened version of the initial certification exam. Apparently in some specialties you used to be able to get lifelong certification but that is no longer the case.  

In recent years, the ABMS has decided that a once-a-decade test was not good enough, and implemented the Continuous Certification program.  For ER docs, this has manifested as a series of obnoxious, poorly written annual exams called the LLSA.  The idea was that they would give you a couple dozen important articles from each year and test you on them to make sure you were staying abreast of developments in the field.  Sounds good, but in practice it's really annoying.  First of all, it's open book, which ought to make Kevin happy, but the questions on the exams are not generally reflective of the critical points in the literature.  They are more commonly picky detail questions to force you to go back through the articles to find the trivia point that is referenced.  (Pro tip for test-takers: get the articles in PDF and do a keyword search and you don't have to read the articles at all!)  Worse, the questions on the EM tests do not appear to have been written by professional test-writers.  They are ambiguous, often with multiple possible "correct" answers, forcing you to infer what the examiners were looking for -- the old "guess what I'm thinking" game.  That's fine for oral teaching rounds, but poor form in an exam.

The whole experience is quite worthless.  I banged out three exams today, in about two hours. I scored 100%, 95% and 100%. And I didn't learn a damn thing.  I hasten to add that this is not because I am so damned brilliant, but because the concept and execution are fatally flawed.

Though I am, you know, pretty damned brilliant, too.  Just saying.

The next step is going to be worse.  An "Assessment of Practice Performance" and "Performance Improvement."  In this, you will have to choose a diagnosis or presenting complaint, review charts, confess your sins (er, find deficiencies), and develop a performance improvement plan.  Then you do a prospective review and see if you are now sin-free (er, performing better).  Sounds like a lot of busy work for very little benefit.  Also, it appears that "Patient feedback" is going to be involved somehow.  To my cynical ear it sounds like Press-Ganey will be involved in my future recertifications.  Great.

Now I should be clear on this: it actually sounds like the ABEM is trying to come up with MoC requirements which are more or less painless for ER docs to meet.  The Practice Performance description explicitly suggests that groups can just rubber-stamp their existing quality committee work, more or less, and have it count.  For that matter, the LLSA exams are also allowed to be taken in groups and are designed to be easy.  I appreciate that.  But by devising these shortcuts, ABEM is entirely undermining the point of the MoC project and revealing it to be an empty activity, a merit-badge to be earned, but not a learning experience.  

Which makes it a horrendous waste of physician time and energy. 

Most of my irritation is directed at ABEM, but it's important to understand that ABEM is just the messenger here.  It's the ABMS that has foisted this obligation onto us.  Looking at the ABMS roadmap for MoC, it's a muddle, and it looks highly painful: 

The Six Core Competencies

  • Patient Care-Provide care that is compassionate, appropriate and effective.... (this sounds bland enough)

  • Medical Knowledge-Demonstrate knowledge about established and evolving .... (standard fare)

  • Interpersonal and Communication Skills-Demonstrate skills that result in effective information exchange and teaming with patients, their families and professional associates (e.g. fostering a therapeutic relationship that is ethically sound, uses effective listening skills with non-verbal and verbal communication; working as both a team member and at times as a leader). (Oh Jeebus, are you kidding me?)

  • Professionalism-Demonstrate a commitment to carrying out professional responsibilities, adherence to ethical principles and sensitivity to diverse patient populations. (What the hell does this even mean? And how will you test for it?)

  • Systems-based Practice-Demonstrate awareness of and responsibility to larger context and systems of healthcare. Be able to call on system resources to provide optimal care (OK, now I know you are joking.  Even you don't know what this means.)

  • Practice-based Learning and Improvement-Able to investigate and evaluate their patient care practices.... (Obligatory self-flagellation)

Blech. What a godawful product of committee-think.  These are all fine ideas, by the way, commendable and appropriate to be included in medical education.  What they are not, is testable.  Not in any meaningful way.  Remember, one defining characteristic of a test is that there is measurement, a standard to be met and a possibility of failure.  The vague buzzword-fest above is unsurprisingly resistant to objective measurement.  And if the Boards do not have the gumption to fail physicians who cannot, say, demonstrate fostering of a therapeutic relationship -- if it's a test where everybody passes -- then it really isn't a test, is it?  And if they are not measuring quality, then what value are these charades providing to patients?

I do think that it's a good idea for doctors to stay current -- on the evolution of the science of healthcare as well as the ancillary skills and topics.  I also think it's a good thing to have to recertify every so often, and a knowledge-based test seems like a fine method.  If, as Kevin suggested, it were open book and designed to induce physicians to integrate dynamic information sources into the testing, that would be better.  I also think it's a good idea for the house of medicine to produce a reasonable mechanism to ensure maintenance of skills before somebody else does it for us.

But this confused, pointless, hollow time-waste of a process should be trashed and rebuilt from the ground up.


8 comments:

Thai said...

I think on top of all of this there is no evidence that this improves care but we know it adds a lot of cost.

Remember, 30% of all health care dollars are admin.

One would think they would have done a small trial on this first, along with a cost/benefit analysis, to see if it even achieves the desired outcomes before forcing so many resources to be committed.

Thai said...

I personally think this has more to do with job security/union protectionist mentalities than anything else and is likely to backfire in the long run

Nice post

Anonymous said...

i imagine the holy grail for press-ganey would be to tie their survey scores to certification requirements.
the day that happens is the day i stop seeing patients.

already pretty agitated that 5 docs in our group were denied raises this year because of scores not meeting a certain cutoff (i was not one of them). these 5 docs were the 4 that work nights exclusively, and the 1 that works fast track exclusively. they are being punished for the shifts that they work, and the type of patients those shifts tend to attract. no mention of p-g being tied to pay in our contracts, but then again our contracts don't guarantee us any pay raises so what can you do?

Anonymous said...

HA! Therapeutic Communication. Remember: never prompt "yes" or "no" answers, never use the word "why", never give advice... (for testing purposes only, LOL).
-SCRN

Anonymous said...

Next time someone complains to me that my generation of docs are not dedicated I'm pointing them to this post. Maybe we are, or maybe we're not, but it's you old farts who let the profession go to shit by not fighting things like this.

Anonymous said...

The only thing that recertification and MOC are good for is giving the boards a way to get money from doctors more than once in a lifetime (sometimes twice in the old days, when they had a written and an oral exam).

Makes me glad I'm an old fart with a permanent certificate.

Anonymous said...

This is all driven by the ABMS -- I think the individual specialty boards recognize that MOC is a bunch of horseshit -- but the member boards all have to fall in line.

I passed my plastic boards in Nov 2009. Every year for the next 10 years, I get to 'contribute' $200 to my MOC fund -- at the end of the 10 years, there is a big party (a test), and I get to collect a bunch of cases to present for QA.
Good times.

Jeffrey said...

If it makes any of you feel better, ABMS is not the only culprit: NCCPA (the certification body for PAs) is floating something very similar for PA MoC, right down to the "assessment of practice performance" and QI requirements.

In this age of evidence-based practice, I find it interesting that there's no evidence being presented that this will improve physician (or PA) practice.