31.5.08

The Human Factor

The Human Factor: Revolutionizing the Way People Live with Technology

by Kim Vicente

from the back flap (I love this endorsement and summary of the book's message):

"This book can save lives. Strong words? Yes, but this is a strong book: engaging, easy to read, but carrying a powerful message. We have far too long neglected the human and social side of technology. The result is needless accidents in vehicles, hospitals, manufacturing plants and, worse, no way of learning to make life better, safer, more enjoyable. Instead, we rush to find blame, to sue, fire, and penalize people when it's the system that's at fault. THE HUMAN FACTOR can indeed revolutionize the way we live. Read this book." - Donald A. Norman, Author of The Design of Everyday Things, and co-founder of the Nielson Norman Group.


my flags (they aren't prolific, but the book held great value and is a source of influence for me)...

pg 41 As (Robert) Wright says: “Your brain may give birth to any technology, but other brains will decide whether technology thrives. The number of technologies is infinite, and only a few pass this test of affinity with human nature.” The innovations that don’t pass this test get thrown on the scrapheap. But to be successful, candidate technologies need the support of social structures that foster cooperation and coordination amongst individuals or institutions. (Amanda’s note: think of the human factor in the LMS.)

pg 42 The important fourth part of this development cycle is the transitional instability that results as new technologies and social structures arise and are overthrown. This fluid phase is a transitory no-man’s land; the traditional way of thinking has lost its appeal and is leading to social chaos, but a new way of thinking that can lead to social progress has yet to appear on the horizon (sound familiar?). And just when you think things are at their worst and society is totally out of control, real advances are most likely to take place. As Wright put it: “Turbulence and chaos often turn out to be harbingers of new forms of order.” (Amanda’s note: this reminds me of the lyrics from He Got Game... ‘and a new world order…’.)

pg 45 And of course, we should ensure that the design of technological systems is problem-driven, that it aims to fulfill a human or societal need, so that we avoid the Mechanistic tendency to design technology for its own sake. Rather than thinking about the Cyclopean abstractions of “technology without people” or “people without technology”, we can focus our attention on what matters most – the people-technology relationship as it affects human and societal needs.

Some readers will recognize what I’m advocating as an example of systems thinking – a holistic, problem-driven way of looking at the world, an approach that focuses on the relationships between system elements, whatever form those elements happen to take (in our case, people and technology). Rather than following the old Laplacian reductionist doctrine of carving things up into smaller and smaller pieces and examining each tiny element in isolation – the kind of thinking that got us into this mess in the first place – systems thinking focuses on the big picture, the interactions between the elements. … But systems thinking is still a minority view and many people have never heard of it. (Amanda’s note: I am curious if JA believes this comment about ‘minority view’.)

pg 51 So there you have it: a new, reader-friendly term for a new way of thinking that owes a nod to Adam Smith’s genius and has parallels with the systems thinking revolutions that are slowly transforming other areas of society. My deepest professional hope is that this simple word – Human-tech – will help to power a conceptual revolution, and tear down the roadblocks put in our way by antiquated Mechanistic and Humanistic ways of thinking. A Human-tech revolution would completely change how we live with technology, and would do away with the transitional instability that currently engulfs us.

pg 61 The Human-tech Ladder: Design should begin by understanding a human or societal need – and then tailoring the technology to reflect specific human factors.

Bottom: human or societal need, e.g. the music revolution, the knowledge economy, transportation, counter-terrorism, public health, environment…

  1. Human factor – physical. Technology (hard or soft) – Size, shape, location, weight, color, material
  2. Human factor – psychological. Technology (hard or soft) – Information content / structure, cause/effect relations
  3. Human factor – team. Technology (hard or soft) – Authority, communications patterns, responsibilities
  4. Human factor – organizational. Technology (hard or soft) – Corporate culture, reward structure, staffing levels
  5. Human factor – political. Technology (hard or soft) – Policy agenda, budget allocations, laws, regulations

pg 90 This is where my own discipline, human factors engineering, comes most into play. One of the things my colleagues and I do is document the psychological properties of people and the design techniques that can be used to create a fit with those properties. The best-known book on the subject is Don Norman’s bestseller, The Psychology of Everyday Things.

pg 113 In an attempt to salvage the course, I tried to lessen their anxieties by pointing out that they didn’t have to cure all of the world’s problems in one fell swoop; just design a simple product that could make a modest dent in reducing a significant global problem. The key was to pick an environmentally unfriendly activity that was performed many times by many people and focus on the social aspects of the technology it employed. If they could design a product that would lead to a small social improvement, and that product was used frequently, then the benefits to the environment, and thus to quality of life, could slowly but surely add up over time. A tonne of feathers still weighs a tonne.

pg 137 …Nobody knew how many illegal items each inspector let go by undetected. Yet one of the things we know about human psychology is that it’s critical for people to receive timely feedback on their job performance. Doug Harris, a human factors engineer who has studied airport security, came up with a vivid and compelling analogy to illustrate the psychological consequences of lack of feedback: “Consider… how little people would improve their bowling performance and how soon they would stop bowling altogether if there were no system of keeping score and no feedback on how many and which pins were knocked down with each roll of the ball.”

pg 140 Scientific research on other vigilance tasks has shown that people remain more attentive and performance improves if “false signals” are introduced periodically to stimulate people to remain alert. In the case of airport security, images of illegal objects superimposed or blended in with legal objects could occasionally be projected on the monitoring screen.
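(Amanda’s note: to make this concrete for myself, here’s a quick sketch of my own, not from the book, of how this kind of “false signal” injection with immediate scoring might be simulated. The 5% injection rate, the 80% detection probability and all the names are just illustrative assumptions.)

```python
import random

# Toy simulation of periodic "false signals": occasionally blend a
# synthetic threat image into the stream of bags and score the inspector
# on every injection right away, echoing the bowling analogy on pg 137.

INJECTION_RATE = 0.05  # assumed fraction of bags that carry a projected threat

def run_shift(num_bags, detect_prob=0.8, seed=0):
    """Simulate one screening shift; detect_prob stands in for the inspector."""
    rng = random.Random(seed)
    hits = misses = 0
    for _ in range(num_bags):
        if rng.random() < INJECTION_RATE:      # inject a false signal
            caught = rng.random() < detect_prob
            if caught:
                hits += 1
            else:
                misses += 1
            # The feedback step: the result of each projected threat is
            # reported immediately, so vigilance never goes unscored.
            print(f"projected threat {'caught' if caught else 'missed'}; "
                  f"running score {hits}/{hits + misses}")
    return hits, misses

run_shift(200)
```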

pg 149 As Dr. Lucian Leape noted, the road to progress and change is a clear, but difficult, one to follow: “Physicians and nurses need to accept the notion that error is an inevitable accompaniment of the human condition, even among conscientious professionals with high standards. Errors must be accepted as evidence of system flaws not character flaws. Until and unless that happens, it is unlikely that any substantial progress will be made in reducing medical errors.” In other words, unless and until a Human-tech Revolution occurs in health care, the idea of designing systems that recognize the human factor will have a hard time showing up on the radar screen, never mind having a positive impact on patient safety. But modifying an entire profession’s basic assumptions about how the world works – like any other conceptual revolution – will take time, patience and dedicated effort. (Amanda’s note: think ‘mental models’.)

pg 150 To learn more about how medical error contributed to patient injury and death in anaesthesia, Cooper used the “critical incident technique” – the same method that Paul Fitts and Richard Jones had used to understand threats to aviation safety during World War II. Anaesthesiologists were asked to remember and describe incidents that either could have led or did lead to a bad outcome, which might be anything from adding to the length of the patient’s stay in the hospital to permanent disability or death. The anaesthesiologists were then asked to recall the circumstances surrounding the critical incidents. These “incident reports” would then be used to provide other anaesthesiologists with a way to learn from experience, by understanding the reasons bad things had happened, or almost happened, and identifying problems with products (e.g. poor equipment design) or work systems (e.g. long hours) – potentially lethal “invisible hands” that were threatening patient safety. This understanding provided a solid basis for making changes and thus reducing error. (Amanda’s note: this is similar to Instructional Design’s critical error analysis.)
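(Amanda’s note: a rough sketch, mine rather than the book’s, of how critical-incident reports might be structured so that product problems and work-system problems can be tallied across many reports; all the field and function names are illustrative assumptions.)

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import List

# One possible shape for a critical-incident report, following the elements
# described above: the incident, its outcome, the surrounding circumstances,
# and the product or work-system problems identified.
@dataclass
class IncidentReport:
    description: str    # what happened, or almost happened
    outcome: str        # e.g. "near miss", "longer hospital stay", "death"
    circumstances: str  # context recalled by the anaesthesiologist
    product_problems: List[str] = field(default_factory=list)  # e.g. "poor equipment design"
    system_problems: List[str] = field(default_factory=list)   # e.g. "long hours"

def tally_problems(reports: List[IncidentReport]) -> Counter:
    """Count recurring problems across reports: the aggregation step that
    turns individual incidents into a basis for design changes."""
    counts: Counter = Counter()
    for r in reports:
        counts.update(r.product_problems)
        counts.update(r.system_problems)
    return counts
```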

pg 190 Rather than provide engineers with management training, some organizations have hired graduates from business schools to oversee the design or operation of technical systems. Because these individuals don’t usually understand the underlying technology that they’ve been put in charge of and don’t usually consult with technical experts, they have little choice but to apply standard management procedures, regardless of the industry they’re managing. Organizations that use this approach to managing technological systems … have an abysmal long-term performance record, revealing how important it is for effective management to have access to industry-specific technical knowledge. (Amanda’s note: this is applicable to my role in Instructional Design.)

pg 248 … we need to identify the system design levers at the political level that are relevant to the success of complex technological systems. Three levers identified by political scientists include policy aims, legal regulation, and budget allocations.

pg 269 The groundbreaking work of Professor Jens Rasmussen, a Danish engineer, gives us the conceptual tools to make this transition from descriptive understanding to prescriptive intervention. … The crowning achievement of Jens’s life’s work is a two-part, qualitative framework that aims to explain both how accidents occur and how they can be prevented.

pg 277 The only long-term solution to managing risk in a dynamic society like ours appears to involve first of all accepting that external stressors such as budget cuts and market competitiveness aren’t going to go away since they’re the result of persistent human factors. Then we can focus on deliberately building technological systems that can respond and adapt to these pressures without compromising safety. In other words, the goal is to allow systems to operate “at the edge” to maximize competitiveness or efficiency, but without actually breaking the envelope of safety and precipitating accidents. To “operate at the edge,” vertical alignment via feedback across all levels must be achieved so that each person and organization in the system can see the effect their actions have on safety, not just on the bottom line.

pg 286 Human-tech isn’t a household word yet, but it has already had a significant impact on our quality of life, and some basic principles are in place to help bring it to the fore in practice. … Here’s a summary of some of the examples we’ve covered; you’ll find they have wide application in many areas of life, business and industry:

- task analysis

- stimulus-response compatibility

- behaviour-shaping constraints

- feedback design principle

- shape coding

- Aviation Safety Reporting System (ASRS)

- Cockpit Resource Management (CRM)

- critical-incident technique

- Jens Rasmussen’s framework for risk management

pg 291 (What can you do to make a difference?) … if you want to live in a world that celebrates humanity and the human factor, then buy Human-tech products. Begin to distinguish poorly designed products. You’ll stop blaming yourself for being technologically incompetent. Tell your friends about them. Show them how much better a Human-tech gadget, such as a PalmPilot, is than one dominated by 10 million features; your friends will thank you. They too will be more likely to buy products that have an affinity with human nature. And that, in turn, will make the Wizards listen. There’s nothing like market pressure to encourage companies to change the way they do business. Human-tech consumers will eventually drive out Mechanistic design.

pg 292 There doesn’t have to be a trade-off between market share and quality of life – the two can go hand in hand. But in most cases, a full recognition of this and what it might achieve requires profound changes in the way a company designs its systems as much as its products: people’s needs will have to be put ahead of technology for its own sake; potential users of the product need to be consulted and involved from the very start; prototypes need to be built and tested with real users (not Wizards) to see what works and what doesn’t work – and the results from these tests need to be used iteratively to improve the design of the product. (Amanda’s note – this reads like the definition of Google’s approach! Also loads of application to my ID world.)

(Interesting note about the author at the end of the book: in 1999, Kim Vicente was chosen by TIME magazine as one of 25 Canadians under the age of 40 to be a “Leader for the twenty-first Century who will shape Canada’s future.”)


