Not long ago, I submitted a lengthy, complex policy paper to an executive director for review, confident that I had nailed the final product.
I had clearly articulated the problem, explored the underlying causes, and set out various approaches that could be taken to correct the issue.
Every section of the paper was tight, the discussion flowed beautifully from the background, and my analysis of the legal mechanisms that we might use to implement the solution was logical.
I received comments back from her about a week later.
Most of the feedback was superficial, typically a sign that I had spent enough time thinking about the problem. But as my eyes moved down the final page of the paper, one comment identified a flaw in my analysis so obvious that I nearly went red with embarrassment when I read it.
I'm not going to repeat the error, but let me tell you, I felt stupid for two days for making it.
How could I make such a stupid mistake?
I took to Reddit to find others to commiserate with.
Really, it wasn't that big of a deal, I'm just a perfectionist.
Still, caught up in self-analysis, wondering if I was too stupid to do my job correctly 😁, I remembered an old boss's advice about error reduction and quality control in analysis.
This boss was a non-practicing physician and a smart guy who had adopted techniques from his medical practice and applied them in his management consulting business.
While working for him, he recommended that I read "The Checklist Manifesto: How to Get Things Right."
I never did, though; using checklists seemed too basic to me at the time.
I had better, more complex tools that took many hours to learn and use proficiently.
Checklists? – they seemed like a waste of time, especially with a plethora of apps available for free to help boost productivity and quality control.
A little older now, and more humble, I decided to revisit Atul Gawande's book to see whether the advice in its pages could be applied to the policy analysis and development process.
The Checklist Manifesto, or How to Get Things Right
In his book, Atul Gawande argues that we fail for two main reasons: necessary fallibility and ineptitude.
Necessary fallibility means that some things are beyond our capacity: we don't have the knowledge, skills, and experience to get the job done.
Even with the assistance of technology, our physical and mental powers are limited.
The second type of failure, ineptitude, refers to when we possess the knowledge and skills to get a job done, but we fail to apply our knowledge correctly.
The mistake in my paper was ineptitude: I had the knowledge, but I failed to apply it correctly, leaving a logical hole in my analysis.
Gawande points out that humans have come a long way in the past 100 years or so:
We're smarter, better educated, and harder working than ever before. What's more, we've been highly successful at putting technology to work for us to solve all sorts of complex problems.
However, despite this, success escapes us for avoidable reasons.
The volume and complexity of what we know has exceeded our individual ability to deliver its benefits correctly, safely, or reliably. Knowledge has both saved us and burdened us.
Essentially, the world we live in, the world we've created, is so complex that our minds can't manage all of the inputs.
Obvious mistakes are made all of the time, across all professions.
It's not because we're dumb; it's because our minds weren't designed for a lot of continuous multitasking.
What's more, we're often asked to solve problems very quickly, under absurdly tight timelines. Anyone in the corporate world will have experienced this.
So, to overcome the above problems, we need a strategy, argues Gawande.
The strategy is simple: use checklists.
The History of the Checklist
In his book, Gawande details how checklists got their start in the aviation industry.
On October 30, 1935, at Wright Air Field in Dayton, Ohio, the U.S. Army Air Corps held a flight competition for airplane manufacturers vying to build its next-generation long-range bomber. It wasn’t supposed to be much of a competition. In early evaluations, the Boeing Corporation’s gleaming aluminum-alloy Model 299 had trounced the designs of Martin and Douglas. Boeing’s plane could carry five times as many bombs as the Army had requested; it could fly faster than previous bombers, and almost twice as far. A Seattle newspaperman who had glimpsed the plane called it the “flying fortress,” and the name stuck. The flight “competition,” according to the military historian Phillip Meilinger, was regarded as a mere formality. The Army planned to order at least sixty-five of the aircraft.
A small crowd of Army brass and manufacturing executives watched as the Model 299 test plane taxied onto the runway. It was sleek and impressive, with a hundred-and-three-foot wingspan and four engines jutting out from the wings, rather than the usual two. The plane roared down the tarmac, lifted off smoothly, and climbed sharply to three hundred feet. Then it stalled, turned on one wing, and crashed in a fiery explosion. Two of the five crew members died, including the pilot, Major Ployer P. Hill.
An investigation revealed that nothing mechanical had gone wrong. The crash had been due to “pilot error,” the report said. Substantially more complex than previous aircraft, the new plane required the pilot to attend to the four engines, a retractable landing gear, new wing flaps, electric trim tabs that needed adjustment to maintain control at different airspeeds, and constant-speed propellers whose pitch had to be regulated with hydraulic controls, among other features. While doing all this, Hill had forgotten to release a new locking mechanism on the elevator and rudder controls. The Boeing model was deemed, as a newspaper put it, “too much airplane for one man to fly.” The Army Air Corps declared Douglas’s smaller design the winner. Boeing nearly went bankrupt.
Despite the disaster, the Army still decided to purchase several aircraft from Boeing.
Army officials believed the aircraft was still flyable; they just needed a way to streamline the complexity of the aircraft's operational procedures.
The answer? A checklist.
Using the checklist, pilots went on to fly the Model 299 a total of 1.8 million miles without an accident.
Checklists can be highly effective, argues Gawande. So how do you actually create a good checklist?
There are two key steps to start with:
- Define a clear pause point or break in the workflow in which the checklist is going to be used; and,
- Decide whether you need a "do-confirm" checklist or a "read-do" checklist.
"Do-confirm" lists simply mean do first, confirm after. In other words, complete the task then pause to confirm that every step has been completed.
"Read-do" lists call for doing the tasks in a step-by-step manner, following the directions contained in the checklist.
Gawande argues for keeping the checklist short: around five to nine items, and never more than a page.
Checklist design should also feature clear, precise, and simple language. Moreover, the language used should be familiar to everyone who's going to be working with the checklist.
You should also analyze which items matter most: identify the critical steps that, if forgotten, could cause serious harm to people or processes.
Checklists should always be tested in the real world and working groups established, if necessary, to complete, review, and approve checklist design.
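To make the two styles concrete, here is a minimal sketch (my own illustration, not code from Gawande's book) of a "do-confirm" checklist: the work happens first, and at a defined pause point you check which items are still outstanding. The `Checklist` class and the example items are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Checklist:
    """A simple do-confirm checklist: do the work, then confirm at a pause point."""
    title: str
    items: list[str]
    completed: set[str] = field(default_factory=set)

    def mark_done(self, item: str) -> None:
        # Only items actually on the checklist can be confirmed.
        if item not in self.items:
            raise ValueError(f"unknown item: {item}")
        self.completed.add(item)

    def pause_point(self) -> list[str]:
        # Return the items not yet confirmed, in checklist order.
        return [i for i in self.items if i not in self.completed]

# Hypothetical pre-submission checklist for a policy paper.
policy_review = Checklist(
    title="Policy paper pre-submission review",
    items=[
        "Problem statement matches the evidence cited",
        "Each option addresses the stated problem",
        "Legal mechanism supports the recommended option",
        "Conclusions follow from the analysis",
    ],
)
policy_review.mark_done("Problem statement matches the evidence cited")
print(policy_review.pause_point())  # the three items still to confirm
```

A "read-do" variant would instead iterate over `items` and perform each step as it is read; the data structure stays the same, only the moment of use changes.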
Even after writing this post, checklists still strike me as an overly simplistic productivity tool, yet they're highly effective. There is a mountain of evidence showing that checklists have saved lives, cut costs, and increased productivity.
Personally, I've created a few basic checklists to ensure that I'm not missing any key steps when conducting research and analysis and developing formal policy documents.
I encourage you to check out Gawande's book and think about how you might apply checklists to your work or profession.