Reading – 2018

December 2018

  • How Not to Be Wrong: The Power of Mathematical Thinking NOTES: …
  • The Field Guide to Understanding ‘Human Error’ NOTES: Really, really enjoyed this book. I think a lot of it can and should be applied to after-the-fact corporate decision making and accountability. Things I’ve taken away: 1) look at the overall system, not the individuals; 2) it’s easy to see what the problem was after the fact, but if it had been easy to see at the time of the incident, it probably wouldn’t have happened; 3) the difference between data availability and data observability. Relevant quotes:
    • “1) Put ‘human error’ in quotation marks, because it is merely an attribution after the fact. 2) Understand that this attribution is the starting point, not the conclusion, of an investigation. 3) Do not use the word ‘failure’ but rather ‘experience’ to describe the episode where things went wrong.”
    • “The authority / responsibility mismatch: You cannot fairly ask somebody to be responsible for something he or she had no control over. It is impossible to hold somebody accountable for something over which that person had no authority.”… “This is at the heart of the professional pilot’s eternal conflict,” writes Wilkinson in a comment on the November Oscar case. “Into one ear the airlines lecture, ‘Never break regulations. Never take a chance. Never ignore written procedures. Never compromise safety.’ Yet in the other they whisper, ‘Don’t cost us time. Don’t waste our money. Get your passengers to their destination — don’t find reasons why you can’t.’”
    • “I was visiting the chief executive of a large organization to talk about safety when news came in about an incident that had just happened in their operation. A piece of heavy equipment, that should have been fastened, came loose and caused quite a bit of damage. As I was sitting there, the first reaction around the board table was ‘Who did this?! We must get our hands on this person and teach him a real lesson! We should turn him into an example for others! This is unacceptable!’ After the situation calmed a bit, I suggested that if they really wanted other people to learn from this event, then it could be more profitable to talk to the person in question and ask him to write up his account of what happened and why. And then to publish this account widely throughout the company. If the person would go along with this, then management should drop all further calls for accountability and retribution. It took some effort, but eventually they seemed to agree that this could be a more meaningful way forward…. What is the moral of this encounter? 1) If you hold somebody accountable, that does not have to mean exposing that person to liability or punishment. 2) You can hold people accountable by letting them tell their story, literally ‘giving their account.’ 3) Storytelling is a powerful mechanism for others to learn vicariously from trouble.”
    • On how to create a “just culture” and accountability: “1) Don’t ask who is responsible, ask what is responsible. 2) Link knowledge of the messy details with the creation of justice. 3) Explore the potential for restorative justice. 4) Go from backward to forward-looking accountability. 5) Put second victim support in place.”
    • Page 39: “Reactions to failure focus firstly and predominantly on those people who were closest to producing or potentially avoiding the mishap. It is easy to see these people as the engine of action…. In order to understand error, you have to examine the larger system in which these people worked. You can divide an operational system into a sharp end and a blunt end: 1) At the sharp end (for example, the train cab, the cockpit, the surgical operating table), people are in direct contact with the safety-critical process. 2) The blunt end is the organization or set of organizations that both supports and constrains activities at the sharp end (for example, the hospital, the airline, equipment vendors and regulators).”
    • Questions when doing a “debriefing” (likely similar to a post-mortem in software engineering): 1) Cues: What were you seeing? What were you focusing on? What were you expecting to happen? 2) Interpretation: If you had to describe the situation to your colleague at that point, what would you have told them? 3) Errors: What mistakes were likely at this point? 4) Previous experience / knowledge: Were you reminded of any previous experience? Did this situation fit a standard scenario? Were you trained to deal with this situation? Were there any rules that clearly applied here? Did any other sources of knowledge suggest what to do? 5) Goals: What were you trying to achieve? Were there multiple goals at the same time? Was there time pressure or other limitations on what you could do? 6) Taking Action: How did you judge you could influence the course of events? Did you discuss or mentally imagine a number of options or did you know straight away what to do? 7) Outcome: Did the outcome fit your expectations? Did you have to update your assessment of the situation?
    • “… There is a difference between 1) data availability: what can be shown to have been physically available somewhere in the situation? 2) data observability: what would have been observable given the features of the interface and the multiple interleaving tasks, goals, interests, knowledge and even culture of the people looking at it” page 64. A good reminder for folks building monitoring systems: even if the information is available somewhere, if no one can see it or it’s not part of the workflow, then it’s effectively not there (a small sketch of this idea follows at the end of this list).
    • page 75: “Cause is not something you find. Cause is something you construct. How you construct it, and from what evidence, depends on where you look, what you look for, who you talk to, what you have seen before and likely who you work for…. There is no ‘root cause’: So what is the cause of the accident? This question is just as bizarre as asking what the cause is of not having an accident. There is no single cause — neither for failure, nor for success. In order to push a well-defended system over the edge (or make it work safely), a large number of contributory factors are necessary and only jointly sufficient.”
    • page 80: Explanatory vs. change factors. The story about the hand injuries and big haul trucks… Explanatory factors were time pressure, lack of protective equipment, visibility, etc. Change factors were that more than 80% of the injuries were happening during unscheduled maintenance along the dusty roads of the mine….
    • page 96, story about offering generic rewards (a bottle of wine) for airline crews that execute a missed approach…. “each landing is a failed go-around”.
    • page 101, so true with computer systems as well… “studies on the monitoring of dynamic processes have shown that it is very difficult for people to notice non-events. Things that do not happen are not meaningful or informative phenomena in the monitoring of dynamic processes. Something that is not happening is not a good trigger for human intervention. These studies show that non-events, that is, the lack of change over time, are associated with difficulty for practitioners to detect meaningful phenomena in their monitored processes…. The sensory and neural systems of most organisms, including mammals, are highly attuned to pick up and respond to change, while reserving cognitive resources for other tasks when there is no noticeable change in the environment.” (See the dead man’s switch sketch after this list for one way to make a non-event alertable.)
    • page 137… Drift. “Murphy’s Law is wrong. What can go wrong usually goes right and then we draw the wrong conclusion: that it will go right again and again, even if we borrow a little more from our safety margins.” A strong reminder that we should be constantly on the lookout for failures, even when everything seems to be going well…. “This is why high-reliability organizations (HRO) deal with risk by remaining chronically uneasy. Otherwise they may: 1) be overconfident in past results. In a dynamic, complex system, past results are not a good guide for future safety; 2) suppress minority viewpoints, even though these could show more clearly where risk brews among the messy details of daily practice; 3) give priority to acute performance expectations or production pressures… To guard against these drift-inducing impulses, HRO suggests you stay curious, open-minded, complexly sensitized, inviting of doubt and ambivalent toward the past. People in HROs are described, ideally, as skeptical, wary and suspicious of quiet periods.” Emphasis mine, I love the “chronically uneasy” phrase.
    • page 139, this is true of everything in life: “Success narrows perceptions, changes attitudes, reinforces a single way of doing business, breeds overconfidence in the adequacy of current practices and reduces acceptance of opposing points of view.”
    • page 149 on safety bureaucracies: “1) SBs can sometimes institutionalize and confirm the Old View. They do this, for example, by counting and tabulating negatives (incidents)…. 2) Incentive structures around the absence of negatives can get people to suppress bad news… 3) SBs are often organized around lagging indicators; they measure that which has already been. 4) SBs tend to value technical expertise less than they value protocol and compliance. This can disempower not only the expert operators who do safety-critical work, but their supervisors and middle management as well. 5) The gap between how an organization or operation works and how the bureaucracy believes it works can grow. 6) Paperwork gets in the way of talking and listening to people. 7) There is a self-fulfilling nature about SBs: bureaucratic accountability is demanded because of bureaucratic accountability; paperwork begets paperwork… non-operational positions grow more non-operational positions. 8) Fear of the consequences of curtailing a safety function is combined with a promise of future useful work and reminders of past successes…. There are suggestions in research and from mishaps that growing your safety bureaucracy actually increases your risk of an accident.”
    • New View vs. Old View of safety: “1) power to decide in New View safety lies with experts, not bureaucrats, 2) is driven by insight and context-rich experience from the line, rather than by rule and regulation governed by a staff, 3) creating safety is about giving people who do safety-critical work the room and possibility to do the right thing. This means giving them not only the discretionary space for decision making, but also providing them with error-tolerant and error-resistant designs, workable procedures and the possibility to focus on the job rather than on the bureaucratic accountabilities, 4) work, and its safety, are not just governed by process and rules, but adjusted by mutual coordination, 5) and innovation and better solutions are understood to be only viable if there is a willingness to embrace diversity and occasional, safe-to-fail non-compliance.”
    • On “zero vision”, pages 167-168: “1) Zero vision is defined by its dependent variable: the outcome. It is not defined by its control variables: the safety inputs that everyone in the organization makes. This means that a commitment to zero often leaves many people in the dark about what it is that they are supposed to do. It also encourages numbers games: a manipulation of the dependent variable. 2) Zero vision implies that everything is preventable, otherwise ‘zero’ would be a nonsensical ambition. But if everything is preventable, then everything needs to be investigated and remedies found. This can be a waste of the limited resources your organization has available for investigations. 3) Zero visions can stigmatize incidents and those involved in them. They suggest that ‘human error’ is the source of trouble and the target for intervention. ‘Errors’ can be equated with moral lapses, failures of character. 4) There is no support in the safety research literature that a zero vision is at all achievable. All accident theories from the past decades more or less acknowledge that a world of zero consequences is out of the question. 5) A focus on zero can actually lead to a blindness to real risk. Most organizations which have suffered big calamities over the past decades had exemplary performance on incidents and injuries. Before they blew stuff up and killed scores of their employees, their numbers of minor negative events were really low (or even zero for some time). But while counting and tabulating largely irrelevant low-consequence events, the definition of acceptable engineering risk had been eroding under their noses. They were evidently counting what could be counted, not what counted.”
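
Since the availability/observability distinction above maps so directly onto monitoring systems, here is a minimal sketch of the idea in Python. Everything in it (the metric names, the idea of auditing dashboards and alerts) is my own hypothetical illustration, not anything from the book:

```python
# Hypothetical audit: which collected metrics are actually *observable*,
# i.e., surfaced on a dashboard or wired to an alert, rather than merely
# being stored somewhere nobody looks.

collected_metrics = {"cpu.idle", "disk.latency_p99", "queue.depth", "gc.pause_ms"}

dashboard_metrics = {"cpu.idle", "disk.latency_p99"}  # shown to the on-call
alerted_metrics = {"cpu.idle"}                        # can actually page someone

observable = dashboard_metrics | alerted_metrics
available_but_invisible = collected_metrics - observable

# These metrics are "available" somewhere, but for the people running the
# system they are effectively not there: the availability/observability gap.
for metric in sorted(available_but_invisible):
    print(f"collected but never surfaced: {metric}")
```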
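The non-events observation on page 101 has a standard engineering counter: a “dead man’s switch,” which converts the absence of an expected event into a concrete event that can trigger an alert. A minimal sketch in Python follows; the job names, timeout, and function names are all hypothetical illustrations, not from the book:

```python
# Hypothetical "dead man's switch": alert on silence, because humans are
# bad at noticing that something has *stopped* happening.
import time
from typing import Optional

HEARTBEAT_TIMEOUT_SECONDS = 300  # page if a job is silent for 5 minutes
last_heartbeat = {}  # job name -> unix timestamp of last check-in


def record_heartbeat(job: str) -> None:
    """Called by a monitored job every time it runs successfully."""
    last_heartbeat[job] = time.time()


def jobs_gone_silent(now: Optional[float] = None) -> list:
    """Return jobs whose expected heartbeat has NOT arrived in time.

    The inversion is the point: prolonged silence itself becomes the
    event that triggers human intervention."""
    now = now if now is not None else time.time()
    return [
        job
        for job, seen in last_heartbeat.items()
        if now - seen > HEARTBEAT_TIMEOUT_SECONDS
    ]


# Usage: a nightly backup calls record_heartbeat("backup") on success;
# a scheduler calls jobs_gone_silent() every minute and pages on results.
record_heartbeat("backup")
print(jobs_gone_silent())  # [] while the job keeps checking in
```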

November 2018

  • The Fighters NOTES: a collection of stories about men and women sent into battle in Afghanistan and Iraq. Quotes:
    • After being shot through the face and seeing himself in the mirror for the first time and realizing he has a long recovery ahead: “What the fuck ever, man. I don’t care.”

October 2018

September 2018

August 2018

July 2018

  • Kick the Tires, Light the Fire: An Instructor Looks Back NOTES: written by the dad of a woman I work with at New Relic; a short, fun read about being a fighter pilot.
  • Peak: Secrets from the New Science of Expertise NOTES: I learned a lot from this book. The main point, beaten into you again and again, is that you can likely get good at almost anything (some things, like playing in the NBA, are a function of size) if you practice purposefully for years. Purposeful practice requires focus (you have to give it your full attention), feedback (involve others in your practice; look for coaches when you hit a plateau), and getting out of your comfort zone (you have to be willing to try things in different ways and with different approaches). Specific quotes / pages / references:
    • On the usefulness of mental representations (page 70): “Unlike medical students, expert diagnosticians have built sophisticated mental representations that let them consider a number of different facts at once, even facts that at first might not seem germane. This is a major advantage of highly developed mental representations: you can assimilate and consider a great deal more information at once. Research on expert diagnosticians has found that they tend to see symptoms and other relevant data not as isolated bits of information but as pieces of larger patterns – in much the same way that grandmasters can see patterns among chess pieces rather than a random assortment of pieces.”
    • More on the same and the specific interplay between mental representations and writing: “There was a steady interplay between the writing of the book and our conceptualization of the topic, and as we looked for ways to make our messages clearer to the reader, we could come up with new ways to think about deliberate practice ourselves. Researchers refer to this sort of writing as “knowledge transforming” as opposed to “knowledge telling”, because the process of writing changes and adds to the knowledge that the writer had when starting out.”
    • Page 99, the traits of deliberate practice: “… develops skills that other people have already figured out how to do and for which effective training techniques have been established. The practice regimen should be designed and overseen by a teacher or coach who is familiar with the abilities of expert performers and with how those abilities can be best developed. Deliberate practice takes place outside one’s comfort zone and requires a student to constantly try things that are just beyond his or her current abilities. Thus it demands near-maximal effort, which is generally not enjoyable…. involves well-defined, specific goals and often involves improving some aspect of the target performance; it is not aimed at some vague overall improvement. Once an overall goal has been set, a teacher or coach will develop a plan for making a series of small changes that will add up to the desired larger change. Improving some aspect of the target performance allows a performer to see that his or her performances have been improved by the training… Deliberate practice is deliberate, that is, it requires a person’s full attention and conscious actions. It isn’t enough to simply follow a teacher’s or coach’s directions. The student must concentrate on the specific goal for his or her practice activity so that adjustments can be made.”
  • Redeployment NOTES: a collection of short fictional stories about the war in Iraq, good vacation read.
  • A Deadly Wandering: A Mystery, a Landmark Investigation, and the Astonishing Science of Attention in the Digital Age NOTES: If you EVER text while driving, you should read this book and then you should stop texting while driving. I think I’m going to make my sons read this book before they ever get behind the wheel. Great stuff about attention, dopamine, and addiction in general as well. Quotes & references:
    • “Despite some erosion, money retained value over relatively long periods. Not so with a text. In the case of a text, the information lost one quarter of its value in ten minutes and half its value in five hours. Information loses a lot of value in a short period of time. Money retains its value over time.” This is in reference to a study about how people place value on recency; it goes on to point out that you’re likely to place an especially high value on text messages from close acquaintances and family.
    • Further studies showed that it’s not just getting a text that results in a dopamine hit; sharing or connecting with someone, especially electronically, does too. Basically, Facebook is hacking our brains. (page 170)
    • Variable reinforcement: we constantly refresh Twitter, Facebook, Reddit, email, and our phones because sometimes we get valuable and interesting news… but most of the time we don’t. Our brains are constantly on the hunt for new information.
    • Page 219 talks about how our brain works much like a muscle: work it hard all day and it gets tired, leaving us less able to make good decisions (e.g., choosing a fruit salad over a piece of cake) later in the day.
    • Page 350: “… There’s another connection between Reggie and Terryl: addiction… In Reggie’s case, it’s impossible to conclusively say he was addicted to technology. But he did find the activity almost irresistible, so much so he did it behind the wheel, maybe to fill a social void, and feel connected. For many others, particularly since technology has gotten more powerful, the signs of compulsion are even stronger. If the electronic gadgets are not used in fair moderation, they can take over a person, maybe not to the extent that drugs overtook Danny, but certainly in ways that can change who they are, the kind of parent or friend they become, how they learn, how they attend to the world.”
    • Page 354: “… The sum of these studies, and others along the same lines, offer a pretty clear road map to reaching a state of mind that allows for good decision making and better awareness of one’s life — the precursors to balance and happiness. Take breaks from stimulation. Put another way: Turn off the device for a sustained period — whether hours or days. And then, and here’s the really tough part, don’t fill the void left by absence of stimulation with some other nonstop stimulation. That can be hard to do, the experts told me, when people are so accustomed to the constant stream of pings and external noises. A veritable panic can ensue – What will I do with this void? But it’s there, when things die down, that the learning and memory get strong but, more than that, that the greater powers of decision making come into play. When a person is clearheaded, the frontal lobe becomes freed of the humming and buzz of external pressure. That’s when you can decide what steps are best, what actions are wisest.”

June 2018

May 2018

April 2018

March 2018

February 2018

January 2018
