Overworked in the Hospital

Study challenges medical education group's claims that doctors in training are abiding by limits on their hours.
September 6, 2006

In 2003, amid growing pressure from state and federal regulators concerned about excessive hours worked by doctors-in-training and the cascading effects of that overwork on the health of patients and physicians alike, the national body that accredits medical residency programs adopted a set of restrictions on residents' work hours. And ever since, the Accreditation Council for Graduate Medical Education has reported consistently that teaching hospitals and interns themselves are overwhelmingly complying with the new standards.

But independently reported data, released in this week's special edition of the Journal of the American Medical Association on medical education, suggests otherwise. Researchers affiliated with Harvard Medical School's Work Hours, Health and Safety Program found that 83.6 percent of the 1,278 interns they surveyed had violated the accreditation council's standards in at least one month in the 2003-4 academic year, after the new rules took effect.

About two-thirds of the interns reported that they had worked more than 30 consecutive hours; 43 percent said they had averaged more than 80 hours of work over a four-week period at least once; and 43.7 percent said they had not had one day off in seven as required. Violations occurred in more than two in five of the monthly work-hour reports submitted by the interns, the researchers found, and interns at more than 90 percent of teaching hospitals reported having violated at least one of the rules.

Those figures contrast markedly with noncompliance rates reported by the accreditation council itself for 2003-4. Its survey of 100,000 residents found that just 3.3 percent of interns reported having exceeded the 80-hour-a-week limit over a four-week period, and that the council had cited just 5 percent of residency programs for violating the standards. (The figures were 3 percent and 7.1 percent, respectively, in 2004-5, the accrediting group found.)

What might explain the huge gap? Both sides offered answers. In a news release Tuesday, the Accreditation Council for Graduate Medical Education said its "valid and reliable" data are based on confidential responses from more than 100,000 residents as well as interviews and site visits, compared to "self-reporting by 1,278 first-year residents who elected to participate" in the study published in JAMA.

In addition, the accrediting council said, the Harvard study used a "zero tolerance" definition of abuse, counting teaching hospitals as violating the rules if "a single resident worked beyond the duty hour limits." The ACGME, by contrast, cites an institution only when it is found not to be in "substantial compliance" (which is in many ways a judgment call) with the rules.

While the researchers who published the JAMA study acknowledged the much smaller size of their sample and the possibility of some "selection bias," they also had their own explanations for why the accrediting council's numbers may be so low. First, they said, the Harvard researchers asked "open-ended questions principally directed toward accurately measuring work and sleep hours," while the ACGME's survey specifically asked questions ("How many times did you work more than 30 continuous hours?") aimed at ferreting out noncompliance.

Framing the questions in those ways may have discouraged respondents from answering honestly, because they did not want to get themselves or their institutions in trouble, said Christopher P. Landrigan, an assistant professor at Harvard Medical School and associate physician at Brigham and Women's Hospital. Even though responses to the accreditation council's survey are anonymous, he said, residents who have complained about work hours have been retaliated against in the past.

That points to a larger credibility problem with the medical education group's annual reports on residents' work hours, Landrigan said, which was partly responsible for inspiring his group's work: "There's a real disincentive to report hours to ACGME," because it "serves as both data collection repository and enforcement arm for resident work hours."

Landrigan said that he and his colleagues hear often that "things haven't really changed" despite the 2003 changes in the work hour requirements, and that they collected the new data to try to challenge the accreditation council's view that the changes have had a big effect.

He argues that much more significant changes are necessary, because even 30 consecutive hours and 80 hours a week take too heavy a toll on the sharpness and awareness of young physicians. "The existing standards aren't safe," Landrigan said. "We need safer limits, and then we need separate arms for data collection and enforcement." Limiting the work hours of medical residents and interns even further would be expensive up front, he acknowledged, but doing so might lower the costs hospitals now bear from medical errors. (The special issue of JAMA contains another study finding that first-year medical residents were more likely to suffer self-inflicted injuries when they worked excessive hours.)

David G. Leach, executive director of the medical education accreditation council, said in the news release and in a commentary in the special issue of JAMA that the work hours of residents are just one piece of a much larger puzzle. "Resident supervision, the educational curriculum, faculty qualifications, the structure of the residency program and ongoing evaluation of residents to assure their developing competence are other factors promoting good resident learning and safe patient care," Leach said. "It is quite possible to be fully compliant with duty hours and in doing so to compromise both patient care and resident learning. Needed is the redesign of the learning environment in ways that enable improvement of both aims."

The medical education issue of JAMA contains several other features, including these:

  • Researchers at the University of California at San Francisco found that five postbaccalaureate premed programs in the UC system aimed at increasing the flow of members of underrepresented minority groups into medical school were successful. Their study found that participants in the five programs were much likelier than students who applied to the programs but did not enroll in them to go on to enroll in medical school.
  • In a commentary, Jordan J. Cohen, departing president of the Association of American Medical Colleges, and Ann Steinecke, a senior official there, said the California researchers' finding "adds empirical support for the long-held belief that a sturdy scaffold of academic preparation and mentoring can offset at least some of the accumulated disadvantages experienced by many minority students interested in a career in medicine. Their findings should encourage other schools to establish postbaccalaureate programs that have special appeal to minority students."

