University of Minnesota
Center for Transportation Studies

Webinar Available

Automation Mania in the Time of Reason: Considerations for Complex Transportation Problems

Stephen Popkin, Director, Human Factors Research and System Applications Center of Innovation, Volpe Center, Research and Innovative Technology Administration (RITA), U.S. Department of Transportation (USDOT)

November 18, 2010

The November 18 Advanced Transportation Technologies seminar featured Stephen Popkin, director of the United States Department of Transportation (USDOT) Human Factors Research and System Applications Center of Innovation. Popkin, who is also an executive agent for the USDOT’s agency-wide Human Factors Coordinating Committee, emphasized the importance of considering the context when designing and introducing new technologies. “Context matters if you’re going to make an impact with the work you do,” he said.           

Human beings should always be the focus of every system, Popkin said; systems without a human focus invite unintended consequences. For example, airline passengers might have accepted full-body scanning immediately after 9/11, but the cultural mindset has since changed, resulting in a public backlash against the use of this technology. If changing mindsets had been a focus when the system was being designed and implemented, this backlash might have been prevented, Popkin said. Another example is the technology in some new cars that gives drivers Internet access while driving; it contributes to driver distraction and reduces safety, an unintended consequence of a design process that lacked a human and contextual focus.

Popkin described two models that provide factors to consider when developing a system.

James Reason, professor emeritus of psychology at the University of Manchester, developed the “Swiss cheese model” of accident causation, which provides a socio-technical context for technology development. The model comprises successive layers of defenses, barriers, and safeguards—the “cheese.” When potential hazards are ignored, they sneak through the holes in each layer, eventually resulting in damage. Neville Moray, professor emeritus in the Department of Psychology at the University of Surrey, elaborates on this model with a hierarchy of layers: 1) the physical system; 2) individual and team behavior; 3) the organizational, management, and labor infrastructure; and 4) the economic, social, legal, and regulatory context.

Amtrak’s high-speed rail service, Acela Express, illustrates how problems can occur at the physical level: because the train’s controls were badly designed and poorly located, many of its drivers have developed back problems. At the level of individual and team behavior, Popkin noted that train conductors, who once communicated by radio with a single dispatcher, now communicate—often only briefly—with multiple teams along the way. In addition, digital communication has reduced situational awareness and information sharing, since dispatchers can no longer overhear their colleagues’ radio conversations.

To illustrate the third level—organizational, management, and labor issues—Popkin discussed fatigue-management devices for trucking and rail: devices placed in trucks were often damaged or covered up because drivers feared the collected information would be used against them. For the fourth level—the economic, social, legal, and regulatory context—Popkin used the example of positive train control (PTC), which uses integrated command, control, communications, and information systems to control train movements precisely and efficiently and thereby improve railroad safety. The Rail Safety Improvement Act of 2008 mandates widespread installation of PTC systems by the end of 2015, but our legal system and society are not necessarily ready to support this system economically, Popkin said.

Three current transportation initiatives that have proven problematic because of contextual difficulties are positive train control, alertness monitoring of drivers, and the development of quieter cars, Popkin continued. But such difficulties can be avoided through human systems integration (HSI), a formal systems engineering discipline that considers the role of the human in operating a system. HSI addresses the first three levels of Moray’s model; the fourth level is addressed by an evaluation methodology that considers implementation, impact, sustainability, and stakeholder involvement.

Popkin next discussed three goals for automation technology: the safe introduction of new technologies, the upgrading of existing technologies without compromising safety, and the efficient and timely implementation of technology. It’s incorrect to think that a system must be either fully automated or fully manual, Popkin said, and it’s also incorrect to apply a single level of automation uniformly to an entire information-processing and control system.

Popkin concluded with a discussion of how to create a safety culture. This involves developing a safety council that assesses the needs of the environment where the safety culture is being built and redesigning how that environment does business. When a safety initiative was piloted in the railroad industry, the result was an 80 percent reduction in injuries. The USDOT is also implementing an agency-wide safety council to develop a safety culture both within the Department of Transportation and externally. As a result of increased awareness, safety is now viewed throughout the agency as a critical national public health issue.