pg 41 As (Robert) Wright says: “Your brain may give birth to any technology, but other brains will decide whether technology thrives. The number of technologies is infinite, and only a few pass this test of affinity with human nature.” The innovations that don’t pass this test get thrown on the scrapheap. But to be successful, candidate technologies need the support of social structures that foster cooperation and coordination amongst individuals or institutions. (Amanda’s note: think of the human factor in the LMS.)
pg 42 The important fourth part of this development cycle is the transitional instability that results as new technologies and social structures arise and are overthrown. This fluid phase is a transitory no-man’s land; the traditional way of thinking has lost its appeal and is leading to social chaos, but a new way of thinking that can lead to social progress has yet to appear on the horizon (sound familiar?). And just when you think things are at their worst and society is totally out of control, real advances are most likely to take place. As Wright put it: “Turbulence and chaos often turn out to be harbingers of new forms of order.” (Amanda’s note: this reminds me of the lyrics from He Got Game... ‘and a new world order…’.)
- Human factor – physical. Technology (hard or soft) – size, shape, location, weight, color, material
- Human factor – psychological. Technology (hard or soft) – information content/structure, cause-and-effect relations
- Human factor – team. Technology (hard or soft) – authority, communication patterns, responsibilities
- Human factor – organizational. Technology (hard or soft) – corporate culture, reward structure, staffing levels
- Human factor – political. Technology (hard or soft) – policy agenda, budget allocations, laws, regulations
pg 90 This is where my own discipline, human factors engineering, comes most into play. One of the things my colleagues and I do is document the psychological properties of people and the design techniques that can be used to create a fit with those properties. The best-known book on the subject is Don Norman’s bestseller, The Psychology of Everyday Things.
pg 113 In an attempt to salvage the course, I tried to lessen their anxieties by pointing out that they didn’t have to cure all of the world’s problems in one fell swoop; just design a simple product that could make a modest dent in reducing a significant global problem. The key was to pick an environmentally unfriendly activity that was performed many times by many people and focus on the social aspects of the technology it employed. If they could design a product that would lead to a small social improvement, and that product was used frequently, then the benefits to the environment, and thus to quality of life, could slowly but surely add up over time. A tonne of feathers still weighs a tonne.
pg 137 …Nobody knew how many illegal items each inspector let go by undetected. Yet one of the things we know about human psychology is that it’s critical for people to receive timely feedback on their job performance. Doug Harris, a human factors engineer who has studied airport security, came up with a vivid and compelling analogy to illustrate the psychological consequences of lack of feedback: “Consider… how little people would improve their bowling performance and how soon they would stop bowling altogether if there were no system of keeping score and no feedback on how many and which pins were knocked down with each roll of the ball.”
pg 140 Scientific research on other vigilance tasks has shown that people remain more attentive and performance improves if “false signals” are introduced periodically to stimulate people to remain alert. In the case of airport security, images of illegal objects superimposed or blended in with legal objects could occasionally be projected on the monitoring screen.
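(Amanda’s note: a rough sketch of how this “false signal” idea might work in practice. This is my own illustration, not from the book; the injection rate, function names, and the simulated inspector are all assumptions. The point is that injecting occasional synthetic threats gives the screener exactly the kind of immediate, scored feedback Harris says is missing.)

```python
import random

# Illustrative sketch only (not from the book): a vigilance task where a
# synthetic "threat image" is injected on a small fraction of bags, and the
# inspector gets immediate feedback -- the bowling-score idea applied to
# baggage screening.

INJECTION_RATE = 0.05  # assumed: roughly 1 in 20 bags carries a projected threat image


def run_shift(num_bags, inspector_decision):
    """Simulate one shift. `inspector_decision(threat_injected)` returns
    True if the inspector flags the bag for inspection."""
    hits = misses = false_alarms = 0
    for _ in range(num_bags):
        injected = random.random() < INJECTION_RATE
        flagged = inspector_decision(injected)
        if injected and flagged:
            hits += 1
            print("Correct: projected threat detected.")      # timely feedback
        elif injected and not flagged:
            misses += 1
            print("Missed a projected threat image.")          # timely feedback
        elif flagged:
            false_alarms += 1
    return {"hits": hits, "misses": misses, "false_alarms": false_alarms}


if __name__ == "__main__":
    # Hypothetical inspector who notices only 60% of injected threats;
    # the end-of-shift tally is the "score sheet" for the bowler.
    inspector = lambda injected: injected and random.random() < 0.6
    print(run_shift(200, inspector))
```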
pg 149 As Dr. Lucian Leape noted, the road to progress and change is a clear, but difficult, one to follow: “Physicians and nurses need to accept the notion that error is an inevitable accompaniment of the human condition, even among conscientious professionals with high standards. Errors must be accepted as evidence of system flaws not character flaws. Until and unless that happens, it is unlikely that any substantial progress will be made in reducing medical errors.” In other words, unless and until a Human-tech Revolution occurs in health care, the idea of designing systems that recognize the human factor will have a hard time showing up on the radar screen, never mind having a positive impact on patient safety. But modifying an entire profession’s basic assumptions about how the world works – like any other conceptual revolution – will take time, patience and dedicated effort. (Amanda’s note: think ‘mental models’.)
pg 150 To learn more about how medical error contributed to patient injury and death in anaesthesia, Cooper used the “critical incident technique” – the same method that Paul Fitts and Richard Jones had used to understand threats to aviation safety during World War II. Anaesthesiologists were asked to remember and describe incidents that either could have led or did lead to a bad outcome, which might be anything from adding to the length of the patient’s stay in the hospital to permanent disability or death. The anaesthesiologists were then asked to recall the circumstances surrounding the critical incidents. These “incident reports” would then be used to provide other anaesthesiologists with a way to learn from experience, by understanding the reasons bad things had happened, or almost happened, and identifying problems with products (e.g. poor equipment design) or work systems (e.g. long hours) – potentially lethal “invisible hands” that were threatening patient safety. This understanding provided a solid basis for making changes and thus reducing error. (Amanda’s note: this is similar to Instructional Design’s critical error analysis.)
pg 190 Rather than provide engineers with management training, some organizations have hired graduates from business schools to oversee the design or operation of technical systems. Because these individuals don’t usually understand the underlying technology that they’ve been put in charge of and don’t usually consult with technical experts, they have little choice but to apply standard management procedures, regardless of the industry they’re managing. Organizations that use this approach to managing technological systems … have an abysmal long-term performance record, revealing how important it is for effective management to have access to industry-specific technical knowledge. (Amanda’s note: this is applicable to my role in Instructional Design.)
pg 248 … we need to identify the system design levers at the political level that are relevant to the success of complex technological systems. Three levers identified by political scientists include policy aims, legal regulation, and budget allocations.
- task analysis
- stimulus-response compatibility
- behaviour-shaping constraints
- feedback design principle
- shape coding
- Aviation Safety Reporting System (ASRS)
- Cockpit Resource Management (CRM)
- critical-incident technique
- Jens Rasmussen’s framework for risk management
(Interesting note about the author at the end of the book: in 1999, Kim Vicente was chosen by TIME magazine as one of 25 Canadians under the age of 40 named a “Leader for the twenty-first Century who will shape Canada’s future.”)