DC's fire department is in the same situation as WMATA: it needs a redesign of culture and behavior through a human factors approach
Washington Post photo.
In addition to the FTA report on WMATA released earlier in the week, the DC Auditor released a similarly scathing report about DC Fire and Emergency Medical Services. Since the 2006 death of David Rosenbaum, caused in part by a series of missteps by department personnel, the department has experienced a similar ongoing cycle of escalating failures and more unnecessary deaths.
From the Washington Times article, "Audit finds several deficiencies in D.C. emergency medical services":
D.C. Auditor Kathleen Patterson issued a report Thursday that found of the 36 recommendations made by the Rosenbaum Task Force — chief among them to have all first responders cross-trained with basic firefighting and medical skills — only 11 have been fully implemented.
-- Review of District of Columbia’s Compliance with the Recommendations of the Task Force on Emergency Medical Services (The Rosenbaum Task Force) (Report No. DCA262015)
The task force was convened in 2007 to recommend fixes for the city’s Fire and Emergency Medical Services Department after it was found that a neglectful, botched emergency response contributed to the 2006 death of New York Times journalist David Rosenbaum, who had suffered a head wound after being beaten and robbed.
The one hopeful note is that a few months ago, Mayor Bowser appointed Gregory Dean as chief of the department; Dean had recently retired after running the Seattle Fire Department for 10 years ("District's incoming fire chief, Bowser hope to expand firefighter pool," Washington Post).
Seattle's EMS system is a national leader, having introduced many innovative practices. Hopefully, the current organizational culture won't be resistant to Chief Dean's ministrations.
Note that some of the earliest work in "human factors" addressed firefighting, specifically wildfires.
-- "Human factors and the fire service," Firefighter Nation
One more thing: Brookland activist Dan Wolkoff has made the point for many years that because so many emergency calls involve dealing with drunks, DCFEMS personnel come to believe that almost anyone they attend to is likely to be an alcoholic. In all likelihood that's why Mr. Rosenbaum was misdiagnosed: responders misinterpreted his symptoms as the result of drunkenness rather than of a beating, a fall, and a subsequent head injury.
Also see:
-- The "recent" failures of the DC Fire Department are indicative of much deeper systems failures than people realize (2006)
-- Rationalizing fire and emergency services (2011)
-- Fire and emergency services (in DC) (2013)
-- DC "fire" department continued (2013)
-- Fire department issues in municipalities (2014)
-----
For the most part, the text below repeats the WMATA-related entry from a couple of days ago. It is repeated because it is equally relevant: the same process of human factors-based redesign should be applied to DCFEMS.
-----
About 20 years ago, the New Yorker ran a great piece about the bureaucracy and dysfunction in the Chicago Post Office ("Lost in the Mail"). My sense is that DCFEMS operations are at roughly the same level of dysfunction.
The "human factors" approach of evaluating failure should be applied to DCFEMS as a way to implement necessary process redesign and structural changes (see for example, "How mistakes can save lives: one man's mission to revolutionize the NHS," New Statesman, the RISKS-L online forum, and Computer-Related Risks by Peter Neumann).
The New Statesman explains the human factors approach through the story of an airline pilot who, after his wife died from an avoidable medical error, is working with the British National Health Service to apply aviation safety and crash-analysis protocols to health care to reduce errors and deaths. From the article:
In the 1990s, a cognitive psychologist called James Reason turned this principle into a theory of how accidents happen in large organisations. When a space shuttle crashes or an oil tanker leaks, our instinct is to look for a single, “root” cause. This often leads us to the operator: the person who triggered the disaster by pulling the wrong lever or entering the wrong line of code. But the operator is at the end of a long chain of decisions, some of them taken that day, some taken long in the past, all contributing to the accident; like achievements, accidents are a team effort. Reason proposed a “Swiss cheese” model: accidents happen when a concatenation of factors occurs in unpredictable ways, like the holes in a block of cheese lining up.
James Reason’s underlying message was that because human beings are fallible and will always make operational mistakes, it is the responsibility of managers to ensure that those mistakes are anticipated, planned for and learned from. Without seeking to do away altogether with the notion of culpability, he shifted the emphasis from the flaws of individuals to flaws in organisation, from the person to the environment, and from blame to learning.
The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”
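To make the "holes lining up" intuition concrete, here is a minimal Monte Carlo sketch of Reason's Swiss cheese model. The layer names and failure probabilities below are hypothetical illustrations, not figures from the audit or any incident data; the point is only that when several independent defenses each fail rarely, a catastrophic outcome requires all of them to fail on the same call, so its probability is the product of the per-layer failure rates.

```python
import random

# Hypothetical defensive layers in an emergency-response chain, each with a
# made-up probability that it fails (i.e., "has a hole") on a given call.
LAYER_HOLE_PROB = {
    "dispatch triage": 0.05,
    "on-scene assessment": 0.10,
    "transport decision": 0.08,
    "hospital handoff": 0.05,
}

def all_layers_fail(rng: random.Random) -> bool:
    """In the Swiss cheese model, harm reaches the patient only when
    every layer's hole lines up on the same call."""
    return all(rng.random() < p for p in LAYER_HOLE_PROB.values())

def estimate_accident_rate(trials: int = 1_000_000, seed: int = 1) -> float:
    """Estimate how often all holes line up, via simple Monte Carlo."""
    rng = random.Random(seed)
    accidents = sum(all_layers_fail(rng) for _ in range(trials))
    return accidents / trials

if __name__ == "__main__":
    simulated = estimate_accident_rate()
    analytic = 0.05 * 0.10 * 0.08 * 0.05  # product of independent hole rates
    print(f"simulated accident rate: {simulated:.6f}")
    print(f"analytic accident rate:  {analytic:.6f}")
```

The design implication matches Reason's argument: lowering any single layer's hole rate, or adding an independent layer, multiplicatively reduces the end-to-end accident rate, which is why the model shifts attention from blaming the last operator in the chain to hardening the whole chain.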
Labels: business process redesign, design method, disaster planning, emergency management planning, organizational behavior, risk management and redundancy, systems engineering