
Black Box Thinking: How to Fix the Mistakes Killing Our Companies and Loved Ones

Rob Enderle

I have a book suggestion for current or upcoming executives to read this summer: Black Box Thinking. It examines why safety has improved massively in the airline industry while, in industries like health care, mistakes are so prevalent that if you knew the statistics, you'd never go into a hospital again. In fact, since I now suddenly have a huge fear of hospitals, there is some risk to reading this book. But it speaks to the difference between an industry that aggressively learns from failure, often through the black boxes that survive airline crashes, and pretty much everyone else, who conceal failure for a variety of reasons and so make the same avoidable mistakes over and over.

It amazes me that we can see such sharp contrasts between firms and between executive managers. Dell and Cisco, for instance, do acquisitions very well, while for firms like HP (now HPE), Yahoo, and frankly most others, they are recurring disasters. Or consider why Steve Jobs and Bill Gates were such powerful CEOs but were followed by people who seemed to know them well yet couldn't perform at anywhere near the same level.

It is almost as if people are intentionally avoiding learning. Let’s talk about why that is in the hope that we can learn to stop doing this.

Black Boxes

I learned a lot about these black boxes. Not the technology, which is old, but the practices that surround them. For instance, I didn't know that, by law, they can't be used in civil litigation. You see, where most industries seem to focus first on blame, and the airline industry isn't entirely immune to that, the use of the black boxes is focused on learning from mistakes. This, according to the book, is largely why Captain Sully Sullenberger was so successful: he used practices developed over the years from the lessons of pilots whose mistakes cost lives.


These practices gave him checklists to follow, delegation processes to implement, and drills that honed his skills. They helped provide the confidence he needed to make the hard calls. Most of this was developed by analyzing mistakes over the years, in an industry focused on keeping planes in the air.

The underlying process focuses on understanding and learning from a mistake so it is never repeated, instead of shooting the poor sap who made it, or replacing him with someone who then gets to learn it all over again from the beginning.

The result is that while the mortality rate of pilots in 1912 was between 25 and 50 percent, today there is one accident for every 2.4 million flights.

Health Care

The author compares this to health care, and by proxy almost every other industry, and points out why so many people die needlessly in hospitals and why health care is so expensive: mistakes are covered up. Health care is particularly bad because concerns over malpractice litigation apparently have medical teams actively avoiding documenting mistakes, so those mistakes often never translate into mistake-avoiding learning experiences.

One of the stories he uses to drive this home is of a young mother with young children who goes in for a routine, low-risk operation. The intravenous painkillers the operating team used created the need for a breathing tube, but after the medication took effect they couldn't insert it because her airway wasn't wide enough. A nurse stood prepped to perform a tracheotomy. The team eventually got the tube in, but so much time had passed that the patient had slipped into a coma. She died a few days later of massive brain damage from lack of oxygen.

The only reason this case is documented is that the woman's husband didn't believe the hospital and forced a review. The statistics are frightening. Between 44,000 and 120,000 people die in U.S. hospitals every year due to preventable mistakes. About 3,000 people died in the 9/11 attacks; we are losing up to 40 times that number every year in hospitals to avoidable mistakes. And the initial studies cited may be massively conservative, because the Journal of Patient Safety put the number closer to 400,000. These figures aren't even complete, as they don't include nursing homes, pharmacies, care centers or private offices. Hospital error, according to this research, is the third biggest killer of people in the U.S.

Proper Analysis

One other story hit me where I live. The U.S. was losing so many bombers in WWII that it decided to armor them. But where to put the armor? The government initially looked at the returning bombers, mapped where the bullet holes were, and decided to put the armor there. A mathematician figured out this was wrongheaded: because those bombers had come back, their bullet holes were, by definition, survivable. He instead worked out where the bombers that didn't return must have been hit, and that is where the armor needed to go. The result was far fewer lost bombers. Even when doing analysis, if you don't think it through, you can make things worse.
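The survivorship-bias trap in that story can be sketched with a toy simulation. Everything here is hypothetical (the section names and loss probabilities are invented for illustration, not taken from the book): planes hit in the engine rarely come home, so the holes you see on the airfield cluster everywhere else.

```python
import random

random.seed(42)

SECTIONS = ["engine", "cockpit", "fuselage", "wings"]
# Hypothetical probability that a hit in each section downs the plane.
LOSS_PROB = {"engine": 0.8, "cockpit": 0.7, "fuselage": 0.1, "wings": 0.1}

returned_hits = {s: 0 for s in SECTIONS}

for _ in range(10_000):
    hit = random.choice(SECTIONS)       # each plane takes one hit, uniformly
    if random.random() >= LOSS_PROB[hit]:
        returned_hits[hit] += 1         # plane survives; hole visible back home

# Naive view: armor where the surviving planes show the most holes.
naive = max(returned_hits, key=returned_hits.get)

# Wald-style view: sections under-represented among the survivors are the
# ones where hits were fatal -- armor those instead.
corrected = min(returned_hits, key=returned_hits.get)

print(naive)      # a survivable section (fuselage or wings)
print(corrected)  # engine -- the hits that downed planes
```

The hits were distributed uniformly, so the scarcity of engine holes among returners is itself the signal: the missing holes went down with the missing planes.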

Blame vs. Analysis

Years ago, I researched and issued a report detailing why our division was dying, with recommendations on how to fix the problems. The resulting executive decisions seemed focused on blame, not on understanding or correcting the problems, let alone institutionalizing the lessons so another division wouldn't make the same mistakes.

I think this is at the core of why so many efforts fail repeatedly: too much focus on blame, not enough on ensuring mistakes aren't repeated. That is why the book resonated with me. By contrasting health care with the airline industry, it showcases the massive difference between focusing on improvement and focusing on blame. The book is worth your time and, if you are short on time, there is a far shorter abridged version for your summer reading.

One final thought: Given the billions being spent on insurance settlements, perhaps the easiest way to make health care in the U.S. affordable is to put black boxes into operating rooms. Just saying.

 

Rob Enderle is President and Principal Analyst of the Enderle Group, a forward-looking emerging technology advisory firm.  With over 30 years’ experience in emerging technologies, he has provided regional and global companies with guidance in how to better target customer needs; create new business opportunities; anticipate technology changes; select vendors and products; and present their products in the best possible light. Rob covers the technology industry broadly. Before founding the Enderle Group, Rob was the Senior Research Fellow for Forrester Research and the Giga Information Group, and held senior positions at IBM and ROLM. Follow Rob on Twitter @enderle, on Facebook and on Google+


Jul 26, 2017 2:45 PM, 2kmaro says:
You're talking about "fixing the blame" but not "fixing the problem." One thing most continuous improvement processes do when a problem is identified is find its root cause and take action to prevent, or at least mitigate, it. What you've described is similar to a situation where a problem is seen as originating in some department, and the immediate (non)fix is to replace the department head rather than look for the cause at a deeper level, such as an unmonitored process that is the actual root cause, and take steps to ensure that the process is carried out properly and the results verified before the work moves to the next process in that department.
Aug 3, 2017 3:30 AM, David M. Fishman says:
As it turns out, your recounting of the assessment of damage to WW2 bombers mis-states the method used by the famed statistician Abraham Wald (https://en.wikipedia.org/wiki/Abraham_Wald). Wald had no exposure to destroyed bombers. His analysis, founded in data modeling and rigorous math, proved that the relevant learning about prevention came from the bombers that did not return, strictly by analyzing the survivors. You may find an excellent recap of Wald's work here: https://people.ucsc.edu/~msmangel/Wald.pdf It's a 1984 monograph by Profs. Marc Mangel of UC Santa Cruz (https://users.soe.ucsc.edu/~msmangel/) and Francisco Samaniego (emeritus from UC Davis, https://anson.ucdavis.edu/~samanieg/). Mangel's Twitter feed is worth following, too: https://twitter.com/marcmangel1
