By Dr. Keith Roxo, Guest Writer

One of my overall philosophies is to try to excel. In my typical fighter pilot fashion, I will commonly refer to this as “suck less.” I have always felt it is important to acknowledge the areas where we lack knowledge and experience to avoid becoming complacent, arrogant, or unsafe. One of the things I loved most about military aviation, which also translated well to medicine, was the constant learning required to stay on top of your game.

As a young and inexperienced fighter pilot, I clearly sucked compared to the more experienced ones. As an intern, you tend to suck as a physician because of inexperience. When we think we know everything, regardless of the subject, we close doors around ourselves to learning. Acknowledging that you suck and having a personal drive to not suck is how you can continue to grow. It also helps limit arrogance.

I have read a lot of books about personal finance. As mentioned in my earlier post, Starting a Medical Consulting Business as a Top Gun Fighter Pilot and Physician, I had long known I wanted to work for myself, so I also read plenty of books about business. But I have also read a lot of books about personal growth, success and failure, and problem-solving. The Checklist Manifesto fits in here, as do many of the books by Malcolm Gladwell.

Without a doubt, the single best book I have ever read about how to not suck is Black Box Thinking: Why Some People Never Learn from Their Mistakes—But Some Do by Matthew Syed. While I initially listened to it, I ended up getting it on Kindle as well so I could more easily take notes. As a testament to how important I think this book is, I have a quarterly reminder to re-read my notes.


Part 1 – The Logic of Failure

The opening of the book discusses both aviation and medicine, which is how it came to my attention. The spouse of an airline pilot died on the operating room table during a routine outpatient procedure. The pilot asked what went wrong and was essentially given no answer; he was told that the hospital only conducts investigations if someone sues. He did not want to sue, but coming from an industry where every accident is investigated and the results are published for everyone in aviation to learn from, this was unacceptable to him.

This section of the book primarily argues that failure should be treated as an opportunity to learn. But to do so, the failure must be fully investigated to ascertain what went wrong and what could have been done differently. This section also discusses monitoring complex processes to spot errors and establishing a blame-free culture that encourages the reporting of mistakes and errors so that others may learn.

More information here:

Buying into (or Selling Part of) a Business


Part 2 – Cognitive Dissonance and Confirmation Bias

The second part of the book starts to look at personality traits that lead to continued failure, typically through the concepts of cognitive dissonance and confirmation bias. Cognitive dissonance is the discomfort we feel when evidence conflicts with our established beliefs and values, which often leads us to reject the evidence rather than update the belief. Confirmation bias is our tendency to over-rely on results that support our preconceptions and expectations. Together, these concepts can result in significant denial.

Being unwilling to accept that something has failed, to investigate what went wrong, or to challenge a concept to the point of failure rather than just accepting easy confirmation: all of these behaviors stall knowledge, personal growth, and success. But by challenging assumptions, intentionally seeking out counterarguments, and weighing actual evidence over anecdotal experience, you can open yourself up to the possibility that you may be standing on the peak of Mt. Stupid on the Dunning-Kruger Curve with an entire slope of enlightenment in front of you.


Part 3 – Confronting Complexity

Many things in life are exceedingly complex, even if they may not initially seem so. In Part 3, Syed talks about how important it is to conduct testing and iterative design changes with your target audience. What you thought in the abstract would be perfect may not be what works in real life. If you spend an eternity and a fortune making a “perfect” product that no one ends up using, you have not achieved any level of success. A minimum viable product lets you start selling or using something while seeing how it works in the real world. Then, you can make the changes that are actually necessary.

Syed highlights a couple of examples of this. The first is about the company Unilever making powdered detergent in the 1970s. The process relied heavily on a spray nozzle through which the constituent components were forced to make the final powder product. After initial designs proved less than optimal, the company hired a team of mathematicians and physicists to design the perfect nozzle. Despite all the money spent on design and development, these new nozzles were utter failures. Unilever then turned to a team of biologists who took an iterative design process. They would take one aspect of the nozzle, such as the diameter of the discharge end, and make a batch of nozzles changing only that one thing. After testing them, they settled on what seemed like the best end diameter. Then, they repeated the same process for overall length, then for inlet width, and so on. Eventually, they iterated their way to the most functional nozzle the company had ever produced. This example combines the concepts of the minimum viable product and randomized trials. It also highlights that even the “experts” in fluid dynamics didn’t fully understand the real-world application.
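
In programming terms, what the biologists did amounts to a simple evolutionary search, varying one parameter at a time. Here is a minimal sketch of the idea in Python; the parameter names, numbers, and the test_nozzle scoring function are hypothetical stand-ins for physically building and testing nozzles, not Unilever’s actual process.

```python
import random

def test_nozzle(design: dict) -> float:
    """Hypothetical stand-in for physically building and testing a nozzle.

    Performance here peaks at an arbitrary "ideal" geometry; in reality this
    would be a measurement of the powder the nozzle actually produces.
    """
    ideal = {"outlet_diameter": 4.0, "length": 25.0, "inlet_width": 10.0}
    return -sum((design[k] - ideal[k]) ** 2 for k in design)

# Start from a plausible but unoptimized design (all numbers made up).
design = {"outlet_diameter": 2.0, "length": 10.0, "inlet_width": 5.0}

for generation in range(20):
    for param in ("outlet_diameter", "length", "inlet_width"):
        # Build a batch of variants that change only this one parameter...
        variants = [
            {**design, param: design[param] * random.uniform(0.8, 1.2)}
            for _ in range(10)
        ]
        # ...test them all, and carry the best performer into the next round.
        design = max(variants + [design], key=test_nozzle)

print(design)  # converges toward the "ideal" geometry over the generations
```

No one on the team needs to understand the fluid dynamics for this to work; the repeated test-and-select loop does the understanding for them.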

Syed also introduces the concept of the narrative fallacy: our tendency to simplify complex issues into tidy stories that make them feel more understandable. Doing so makes us feel better about ourselves and our level of understanding, but we neglect the bigger picture and the nuance by not putting in the time and energy required for proper understanding.

An example: in the late 1970s, a crime reduction program called “Scared Straight” was developed, in which at-risk youth were exposed to hardened criminals in prison. The goal was to shock these teenagers into changing their ways, and it became a public phenomenon. Initial results seemed promising: 90% of the kids involved were still out of trouble three months after their trip to a prison. But this was closed-loop thinking without randomized controlled trials; the program’s backers saw the results they expected to see and dug no deeper. When an in-depth analysis was finally done, it showed that the program made more criminals than it prevented. Most of the kids involved were never at risk in the first place, and the ones who truly were on the path to criminal behavior were only emboldened.

Intuition is often wrong, largely because of this tendency toward oversimplification.


Parts 4, 5, and 6 – Putting It All Together

As we get further into the book, fewer new concepts are introduced, and there is more discussion about how to use the knowledge already presented. Part 4 spends time on the value of iterative improvement. If you perform the same task 50,000 times, then shaving a sliver of time off that task adds up to real results. People often look for one big change when it is the cumulative impact of many small changes that really adds up. There is further discussion about the value of gathering data, running randomized controlled trials to determine what actually works best, and how sacrificing customer service for short-term marginal gains tends to have long-term detrimental effects.
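
The arithmetic behind marginal gains is worth making concrete. A minimal sketch, with hypothetical numbers:

```python
# Hypothetical numbers: a task repeated 50,000 times, half a second saved each time.
repetitions = 50_000
seconds_saved_per_task = 0.5

total_hours_saved = repetitions * seconds_saved_per_task / 3600
print(f"{total_hours_saved:.1f} hours saved")  # about 6.9 hours from one tiny tweak
```

Stack a handful of improvements like that on top of each other, and the savings become weeks.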

Part 5 goes deeper into how “The Blame Game” can crush innovation, ruin safety, and create an overall terrible work environment. A just culture, one that seeks to improve everyone rather than punish them, does the opposite.

The final section talks about how expertise is built on study, repetition, and repeated failure. Syed argues that we all need to be open to the idea that there is room for improvement no matter what we do, that embracing failure is the most efficient way to learn, and that we need to be open to criticism and willing to challenge our own point of view.

More information here:

Cut Your Med School Expenses by Living in an RV


Lessons Learned from Black Box Thinking

A healthy amount of this book fits into my existing philosophy, one hard-learned through a few decades of training and professional development in both aviation and medicine. More than anything, the book provided context for that philosophy. But it also highlighted areas where I remained weak.

Before reading Black Box Thinking, I allowed intuition to guide opinions and decisions outside of my areas of expertise. I assumed that a product had to be perfect to launch, and I didn’t fully appreciate the value of conducting randomized trials to make decisions as scientific as possible. The reason I re-read my notes quarterly is to remind myself of the areas where I was weak and could become weak again.

Before I started my consulting business, Wingman Med, I assumed I had to have all of the answers before I began. The concept of the minimum viable product gave me the confidence to begin consulting in my area of expertise knowing that, while I may not have the immediate solution, I do know where and how to get it. The subsequent client interaction forced me to become even more of an expert through repetition and trial and error.

Another concept that I have used in my business is randomized trials, specifically for advertising. Early on, I decided to run the same ad with the same budget over the same period of time on both Google and Facebook for two different aspects of my business. The results showed that for one aspect of the business, Google was significantly better. But for the other, Facebook was orders of magnitude more cost-effective. I continue to use this same method to evaluate other types of advertising while striving to find the most cost-effective use of our advertising budget.
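
For illustration, the comparison boils down to computing cost per conversion for each channel run under identical conditions. The numbers below are made up, not Wingman Med’s actual results:

```python
# Hypothetical results from running the same ad, with the same budget, over the
# same period, on two channels. The numbers are illustrative, not real data.
channels = {
    "Google":   {"spend": 500.00, "conversions": 25},
    "Facebook": {"spend": 500.00, "conversions": 4},
}

for name, result in channels.items():
    cost_per_conversion = result["spend"] / result["conversions"]
    print(f"{name}: ${cost_per_conversion:.2f} per conversion")
```

Keeping the budget, creative, and time period identical is what makes the comparison fair; change more than one variable at a time, and you no longer know what drove the difference.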

The concepts in Black Box Thinking can open your eyes to things you may not realize you are doing that may be inhibiting your growth as a person, physician, and/or business owner. You may unintentionally be falling into the traps of cognitive dissonance, confirmation bias, and narrative fallacy. Being aware of these concepts and how they can impact you is the first step in building a personal strategy to avoid these mistakes. And then you can begin to suck less.

What do you think? What other ways have you learned to spot your mistakes and put an end to them? Are there other personal growth books that have influenced you?

[EDITOR'S NOTE: Dr. Keith Roxo is a Top Gun-trained adversary pilot turned Aerospace Medicine physician. He has more than 2,000 hours in a variety of high-performance military aircraft—including the F/A-18, F-16, and F-5—and he holds multiple military flight instructor qualifications. He also holds airline transport pilot and CFII certificates. His medical qualifications include board certification in both Aerospace and Occupational Medicine, and he is a HIMS-qualified FAA-designated Senior Aviation Medical Examiner. Keith provides aviation medical consulting with Wingman Med. This article was submitted and approved according to our Guest Post Policy. We have no financial relationship.]