Center for Transportation Studies

Minnesota Specialty Vehicle Initiative

Project Contact: Craig Shankwitz, Dept. of Mechanical Engineering
External Project Contact: Linda Preisen, Center for Transportation Studies
Year Approved: 2000

Human Centered Technology for Handling Low Visibility: Vision Enhancement and other Driver Assistive Technologies

Photos of the snow plow showing: the DGPS correction antenna, rear radar sensors and strobes, and PC104 computers and power supplies; the GPS antenna; the forward radar sensors, Micro-DAS and visibility sensor, HUD, tactile seat, and haptic steering feedback actuator; the DC-AC inverter and side radar sensors; and the high-output alternator, forward radar sensor, and the magnetic sensor behind the blade.

A printable two-page Fact Sheet on this project is also available (PDF file, <1 MB).

The following describes a Field Operational Test funded by the US DOT Intelligent Vehicle Initiative (IVI) in the area of specialty vehicles: vehicles designed to carry out specific purposes other than carrying freight or passengers. The other vehicle platforms in the wider IVI program are light vehicles, heavy vehicles, and transit vehicles. Specialty vehicles include those used for road maintenance and for emergency response, such as police cars, fire trucks, and ambulances. The program described below was funded by the Federal Highway Administration through the Minnesota Department of Transportation as the prime contractor. The University of Minnesota Intelligent Transportation Systems Institute's (ITS Institute) Intelligent Vehicles Laboratory is the technology integrator, with support from its HumanFIRST Program. Other partners include the Minnesota State Patrol, McLeod County, the City of Hutchinson, 3M, International Truck and Engine Corp., and Altra Technologies. The objective of this field operational test is to evaluate new technologies that enhance the driver's ability to see the road and other vehicles while performing necessary functions, whether to keep the roads clear and open for others to follow or to respond to emergencies.

Tasks and Technologies

The key technology areas incorporated into this initiative are:

  • Vehicle position and orientation: Integration of high-precision GPS and inertial measurements
  • Geospatial database: identification and location of relevant fixed elements local to the road
  • Obstacle detection: Radar-based sensing of mobile elements local to the road
  • Human-Machine interface: Presentation of relevant information to the driver to assist in the task of driving under low-visibility conditions, including the use of a new head-up display (HUD) design, auditory feedback, and a vibrating seat
  • Road visibility sensors and a vehicle data acquisition system: for analysis of system performance, road weather conditions and driver behavior.

Global Positioning System (GPS)

Driven by consumer demand, healthy industry competition has resulted in constantly improving GPS system performance and lower prices. Many large trucking firms already employ GPS in their operations. Today, 12-channel GPS receivers claim to achieve accuracy of one to three meters with a differential correction signal. The correction signal is broadcast from transmitters on the ground, enabling the system to compensate for a variety of effects, including signal delays in the upper atmosphere that distort the incoming signal from a GPS satellite.

In a differentially corrected GPS system, or DGPS, a fixed base station receives satellite signals through a GPS receiver, processes them with a correction algorithm, and transmits the resulting corrections over the airwaves via an RF modem. An RF modem at the mobile station receives these corrections, which are then combined with the GPS signals received at the mobile station itself. The mobile station processes this combined information into a position solution.

Schematic of a differentially corrected global positioning system (DGPS) in operation.
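The correction flow in the schematic can be sketched in code. This is an illustrative simplification with hypothetical names and values: a real DGPS link corrects per-satellite pseudoranges (and, for centimeter-level work, carrier-phase measurements) in a standardized message format, but the core idea of "measure the error at a surveyed point, broadcast it, subtract it at the rover" is the same.

```python
# Simplified sketch of differential GPS correction (illustrative only).

def base_station_corrections(known_pos, measured_pseudoranges, satellite_positions):
    """At the fixed base station, compare each measured pseudorange to the
    true geometric range (the base's surveyed position is known exactly)."""
    corrections = {}
    for sat_id, measured in measured_pseudoranges.items():
        sx, sy, sz = satellite_positions[sat_id]
        bx, by, bz = known_pos
        true_range = ((sx - bx)**2 + (sy - by)**2 + (sz - bz)**2) ** 0.5
        # A positive correction means the measurement read long (e.g. due
        # to ionospheric delay); the rover subtracts it.
        corrections[sat_id] = measured - true_range
    return corrections

def apply_corrections(rover_pseudoranges, corrections):
    """At the mobile station, subtract the broadcast corrections before
    solving for position. Atmospheric errors are highly correlated across
    a corridor of this size, so most of the error cancels."""
    return {sat: pr - corrections.get(sat, 0.0)
            for sat, pr in rover_pseudoranges.items()}
```

Because the base and mobile stations see nearly the same atmospheric delays, the subtraction removes the dominant shared error terms, which is what takes accuracy from meters down toward centimeters.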

At the most basic level, GPS requires a receiver which determines its position based on signals received from multiple satellites; uncorrected accuracy in such a system can be on the order of ±10 meters, even after selective availability was turned off in 2000. Differentially corrected GPS, or DGPS, offers tremendous improvements in accuracy. A DGPS system which uses carrier-phase signal processing on dual frequencies can determine location with an accuracy of 2 cm (0.8 inches), at one standard deviation. This level of accuracy is critical for achieving the benefits described here.

Besides high accuracy, the DGPS system must also offer the ability to recover from signal loss caused by passing under bridges, through tunnels, etc. In 1995, GPS systems offered convergence to 20 cm accuracy in three minutes, with a latency of 70 milliseconds. This level of performance was insufficient. More recent DGPS systems offer sub-meter accuracy within one second of the first DGPS signal arriving at the receiver, and two centimeter accuracy within 30 seconds, with a 20 millisecond latency.

Field operational testing along fifty miles of Minnesota Highway 7 uses a network of correction stations to maintain the system accuracy the head-up display requires. A single frequency on which to broadcast corrections was established, and DGPS base stations were positioned to ensure good accuracy and coverage. The cities of Silver Lake, Mayer, and Chanhassen approved the use of their municipal water towers for antenna mounts.


Map of the study area, west of Minneapolis, showing DGPS signal coverage of the Highway 7 corridor. Transmitters are located on the Silver Lake, Mayer, and Chanhassen municipal water towers.

In the event of loss of GPS signal or loss of DGPS correction signals, 3M’s magnetic tape and lateral positioning sensor is used. This technology has been integrated into the design and will be tested together with the DGPS system.

Geospatial database

Although the concept of a “digital map” is appealing as a way to describe the geospatial database used in this project, existing digital maps fall far short of the requirements of this task. For example:

  • The standard TIGER (Topologically Integrated Geographic Encoding and Referencing) maps compiled by the Census Bureau are based on 1:100,000 resolution—giving a functional accuracy of 50–60 meters.
  • The familiar 1:24,000 scale topographical maps created by the United States Geological Survey, stored as digital raster graphics (DRGs), are accurate to roughly 15 meters.
  • USGS Digital Ortho Quad (DOQ) maps, based on digitized aerial photographs, use a 1:12,000 scale, equivalent to three-meter accuracy horizontally.

In contrast, the current project’s geospatial database identifies and locates all relevant fixed landscape elements local to the road, including land boundaries, guardrails, dividers, bridge abutments, and signs, as well as attributes like intersections, speed regulations, etc. The accuracy of this database is 20 cm or better.

A map showing the components of a high-accuracy geospatial database of the test roadway corridor. Geospatially mapped components include signs, signal lights, street lights, guard rails, image locations, vehicle track, and road boundaries. Sources: LambdaTech data and photogrammetry data.

Components of a high-accuracy geospatial database of the test roadway corridor.

Closeup of test roadway map showing vehicle track, road boundaries, and guard rails.

Further, this geospatial database is designed for real-time access by a moving vehicle, which requires minimal latency. Standard GIS tools such as ArcView are not fast enough to provide information to the vehicle’s onboard computer system. The structure of the database and the query engine were designed especially for this application.

To populate the database, the research team employed a number of tools and techniques, including data from the Minnesota Department of Transportation (Mn/DOT) photogrammetric unit, surveying systems from the Mn/DOT Metro GIS unit, and vehicle ‘drive-overs’.

Objects that exist within the database (illustration above right) include:

  • LaneBoundary—the leftmost and rightmost limits of each individual lane (green)
  • RoadShoulder—the extent of any drivable surface (red)
  • RoadIsland—areas within RoadShoulder objects which are not drivable (blue)
  • LaneCenter—Midpoints between lane boundaries (black)
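The need for minimal-latency queries from a moving vehicle, which standard GIS tools could not meet, suggests a purpose-built spatial index. The sketch below is a hypothetical illustration of the general idea, not the project's actual database design: map objects named as in the list above are bucketed into fixed-size grid cells, so fetching the geometry around the vehicle's current position costs a handful of dictionary lookups rather than a full GIS spatial query.

```python
# Hypothetical sketch of a purpose-built, real-time-queryable map index.
# Object kinds follow the list above; fields and cell size are illustrative.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class MapObject:
    kind: str      # "LaneBoundary", "RoadShoulder", "RoadIsland", "LaneCenter"
    points: list   # polyline vertices as (x, y) tuples, in meters

class GridIndex:
    """Bucket map objects into fixed-size grid cells for fast local lookup."""
    def __init__(self, cell_size=50.0):
        self.cell_size = cell_size
        self.cells = defaultdict(list)

    def _cell(self, x, y):
        return (int(x // self.cell_size), int(y // self.cell_size))

    def insert(self, obj):
        # Register the object in every cell one of its vertices touches.
        for (x, y) in obj.points:
            cell = self._cell(x, y)
            if obj not in self.cells[cell]:
                self.cells[cell].append(obj)

    def query(self, x, y):
        """Return objects in the vehicle's cell and its 8 neighbors."""
        cx, cy = self._cell(x, y)
        seen, out = set(), []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for obj in self.cells.get((cx + dx, cy + dy), []):
                    if id(obj) not in seen:
                        seen.add(id(obj))
                        out.append(obj)
        return out
```

A query touches at most nine cells regardless of corridor length, which is the property that keeps lookup time flat as the database grows along the fifty-mile test route.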

Radar Obstacle Detection

The obstacle detection radar system for the test vehicle must address several design challenges, including:

  • Minimal Latency
  • High Accuracy
  • Reliability
  • Low false alarm rate

To reduce the number of “false positives,” the radar, DGPS, and geospatial database systems are integrated, and the results of radar scans are correlated with the database to filter out radar echoes from known objects such as road signs and objects located off the road, such as trees.
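That filtering step can be sketched as a simple gating check. This is an assumption-laden illustration, not the project's actual algorithm: a radar return is suppressed when it falls within a small distance gate of a fixed object already surveyed in the geospatial database, and only unexplained returns survive as candidate moving obstacles. The gate size and function names are hypothetical.

```python
# Illustrative sketch of map-based radar false-positive filtering.
import math

def is_known_fixed_object(detection_xy, known_objects_xy, gate_m=1.5):
    """True if a radar return lies within gate_m of any surveyed fixed
    object (sign, tree, guard rail post), so it can be suppressed."""
    dx, dy = detection_xy
    for (ox, oy) in known_objects_xy:
        if math.hypot(dx - ox, dy - oy) <= gate_m:
            return True
    return False

def filter_detections(detections, known_objects_xy, gate_m=1.5):
    """Keep only returns NOT explained by the database -- these are the
    candidate moving obstacles worth presenting to the driver."""
    return [d for d in detections
            if not is_known_fixed_object(d, known_objects_xy, gate_m)]
```

The accuracy of the DGPS position fix matters here: the tighter the vehicle's own position estimate, the smaller the gate can be without mistakenly suppressing a real obstacle standing near a mapped object.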

Human-Machine Interface: Head-Up Display

The head-up display, or "HUD," is more commonly associated with military jet aircraft than with Minnesota snowplows.

The function of the HUD in the test vehicle is to display relevant information superimposed on the driver’s field of view, including information about the vehicle’s location and any vehicles or other obstacles which affect the operation of the snowplow. By referencing the vehicle and the driver’s eye position within an accurate digital map, it is possible to accurately recreate the field of view from the driver’s perspective.

A figure illustrating the head-up display concept: a virtual projection is overlaid on a screen over the real road, referenced to the driver's eye coordinate frame.

As the vehicle moves along the highway, the vehicle’s position (from the DGPS system) is used to query the geospatial database. The resulting data is fed to the HUD’s graphics processor, which integrates it into a visual representation and computes the projection perspective needed for registration with the driver’s eyes. In other words, as the vehicle moves, the system transforms the objects from the real time accessible database on the vehicle and projects them into the field of view based on a coordinate system centered at the driver’s eyes. In order to avoid eye fatigue, the optical properties of the HUD have been designed with a virtual focus located approximately 12 meters in front of the vehicle.
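The geometric core of that transformation can be sketched in a few lines. This is a deliberately minimal, assumed model: a heading-only rotation into a frame centered at the driver's eyes, followed by a perspective divide onto the virtual focal plane roughly 12 m ahead (the plane distance comes from the text; the axis conventions and two-dimensional simplification are illustration choices, and the real system must also account for eye height and vehicle pitch and roll).

```python
# Minimal sketch of the HUD projection step (2-D, heading-only rotation).
import math

def world_to_eye(point_w, eye_pos_w, heading_rad):
    """Transform a world point (x east, y north) into an eye-centered
    frame where +y looks forward and +x is to the driver's right.
    heading_rad = 0 means the vehicle faces north."""
    dx = point_w[0] - eye_pos_w[0]
    dy = point_w[1] - eye_pos_w[1]
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    right = c * dx - s * dy
    forward = s * dx + c * dy
    return right, forward

def project_to_hud(right, forward, plane_dist=12.0):
    """Perspective projection onto the virtual focal plane ~12 m ahead.
    Returns the lateral offset on that plane, or None for points behind
    the driver (which cannot be drawn)."""
    if forward <= 0.0:
        return None
    return right * plane_dist / forward
```

The division by forward distance is what gives obstacles their correct apparent size and position cues: an object twice as far away at twice the lateral offset lands at the same spot on the display, just as it would in the real scene.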

A photograph of a Heads Up Display system in a snow plow.

Driver's view through the HUD, traveling westbound on Hwy 7 near Old Market Road. (Minneapolis Star-Tribune, February 2001)

The system allows the vehicle operator to see the ‘computed’ road boundaries projected and superimposed upon the ‘actual’ road boundaries, even if the road itself is obscured by snow, rain, or darkness. Icons representing radar-sensed obstacles are projected into the HUD image to provide the driver with correct cueing information (apparent position and apparent size) to determine distance and location of the obstacles in the field of view.

The researchers have documented and quantified the latency and accuracy errors associated with projecting geographic information onto the HUD. Projection frame rates exceeding 20 Hz have been demonstrated; typical display latency values were 6–10 ms.

Prior to the field operational test, during the winter of 2000–2001, thirteen snowplow operators were tested on a five-mile driving course with the windshields covered. All thirteen operators were able to drive the course successfully. Analysis of the data indicated that drivers liked the combination of visual, auditory, and haptic lane departure warnings. Along the straight segments, the lane departure warnings were rarely deployed. On the challenging corners (when the HUD image disappeared), drivers used the auditory and haptic lane departure warnings to successfully guide them through the turn.

A wider shot of a head-up display showing the projector, driver, and resulting display in a snow plow.

Video clips of HUD development

Various stages in the development of the HUD system are illustrated in these video clips.

(September 1997) Local TV news segment on what we were doing with DGPS several years back, developing a system for truck drivers in the event of driver impairment (such as fatigue).

Watch Video


(November 1999) Local TV news segment showing snowplows and an early prototype version of the HUD when we were first starting the Field Operational Test. Taped just before the official announcement date for the Field Operational Test (the vibrations of the HUD on the bouncing truck have since been removed).

Watch Video


(October 2000) Test run showing how the projected lane boundaries line up with the true lane boundaries for a moving snowplow on a clear day with radar off.

Watch Video


(October 2000) Test run showing how the projected lane boundaries line up with the true lane boundaries for a moving snowplow at night with radar on, showing how vehicles in front of the plow are accurately tracked.*

*Note: Only one radar was mounted at the time and so only vehicles in the front forward central 12 degree field of view are tracked. This video illustrates that the system correctly does not generate false positives from radar echoes returning from other road furniture adjacent to the roadway itself. This is accomplished by creating a 'filter' using the same accurate DGPS (a dual frequency unit) and accurate road geospatial database (i.e. a real time accessible digital map on board the vehicle) that is used to generate the projected lane boundaries on the HUD.

Watch Video


(December 2001) Improved HUD display at night, tracking moving vehicles and lane boundaries. The snowplow is moving along a highway approaching an intersection. In the HUD, rectangles with a width equivalent to a truck (the widest vehicle normally on the road) are drawn when triggered by radar on the plow. For a while, iconic representations of guard rails appear on the right and a representation of Jersey barriers appears on the left. These are necessary so that plow operators know that they are there during poor visibility conditions.

At the intersection, the signals turn red and as a result the headway to the vehicle just ahead of the plow is reduced. The radar icon turns red when the vehicle is 50 feet or less from the plow. The traffic signals then turn green and traffic begins to flow normally again.

In the distance, the projected digital map is drawn as green from 350 ft. and out. This is a cue to the driver that this is the maximum range of the radar. Vehicles beyond 350 ft are not sensed by the system and therefore cannot be drawn there.
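The two display rules just described reduce to simple thresholds. The sketch below is a toy encoding for illustration: the 50 ft headway and ~350 ft radar range come from the text, while the function names and the "white" default color are assumptions, not the project's actual rendering logic.

```python
# Toy encoding of the HUD cueing rules described above.

RED_HEADWAY_FT = 50.0    # obstacle icon turns red at or inside this headway
RADAR_RANGE_FT = 350.0   # map geometry beyond radar range is drawn green

def icon_color(headway_ft):
    """Color for a radar-triggered obstacle icon, by headway to the plow."""
    return "red" if headway_ft <= RED_HEADWAY_FT else "white"

def map_color(distance_ft):
    """Color for projected map geometry: green past radar range, as a cue
    that vehicles out there are not being sensed and cannot be drawn."""
    return "green" if distance_ft >= RADAR_RANGE_FT else "white"
```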

Note that the video occasionally stops because of heavy traffic on the Internet which delays transmission of the frames of video data. These delays are not part of the standard operation of the HUD and are not part of the original video.

Watch Video

Key technology issues

Three broad technology issues are being addressed by this research project.

First, the system must be optimized for driver acceptance and use. This need has led to an emphasis on human factors studies in the development of the test system.

Second, the system must be reliable and tolerant of faults. If the driver comes to trust the system, that trust must not be broken. For this reason, the project includes built-in redundancy for critical subsystems, such as multiple radar units.

Third, the system must perform well enough to justify its cost. A cost-benefit analysis is being developed to justify the expense of installation and maintenance in high-demand vehicles such as snowplows.

Beyond snowplows

Numerous applications for lane-keeping technologies exist beyond snowplows. For example, run-off-the-road crashes account for at least 20% of police-reported accidents each year (1.5 to 1.6 million). This is the single largest cause of driving fatalities, accounting for approximately 16,000 of the 42,000 crash deaths annually. In Minnesota, lane departure crashes account for significantly more than the national rate of 30% of fatalities.

Nationally, run-off-the-road crashes occur most often on:

  • Straight roads (76%)
  • Dry roads (62%)
  • In good weather (73%)
  • In rural or suburban areas (75%)