OSU
THE OSU AUTONOMOUS VEHICLE

SUMMARY

The Ohio State University Center for Intelligent Transportation Research (CITR) has developed three automated vehicles demonstrating advanced cruise control, automated steering control for lane keeping, and autonomous behavior, including automated stopping and lane changes in reaction to other vehicles. Various sensors were used, including a radar reflective stripe system and a vision based system for lane position sensing, a radar system and a scanning laser rangefinding system for the detection of objects ahead of the vehicle, and various supporting sensors including side looking radars and an angular rate gyroscope. Whenever multiple sensors were available, data fusion and fault detection were employed to maximize functionality without driver involvement. These vehicles were demonstrated at the National Automated Highway System Consortium (NAHSC) 1997 Technical Feasibility Demonstration in a scenario in which autonomous and manually driven vehicles interacted at highway speeds.
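The fusion and fault detection logic is not described in detail here, but the basic idea can be illustrated with a minimal Python sketch. The function name, disagreement threshold, and fallback policy below are illustrative assumptions, not the actual OSU implementation:

    # Minimal sketch (not the actual OSU implementation) of fusing two
    # lane position measurements with simple fault detection. The
    # disagreement threshold and averaging rule are assumptions.
    def fuse_lateral_offset(vision, radar, max_disagreement=0.15):
        """Each input is (offset_m, valid). Returns (offset_m, status)."""
        v_off, v_ok = vision
        r_off, r_ok = radar
        if v_ok and r_ok:
            if abs(v_off - r_off) > max_disagreement:
                # Sensors disagree by more than ~15 cm: flag a fault.
                return None, "fault"
            # Both systems quote roughly +/- 5 cm accuracy, so a simple
            # average is a reasonable stand-in for weighted fusion.
            return 0.5 * (v_off + r_off), "fused"
        if v_ok:
            return v_off, "vision only"
        if r_ok:
            return r_off, "radar only"
        return None, "failed"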

VEHICLE HARDWARE

The figure below shows the physical layout of the equipment in the vehicle. Steering, throttle, and brake actuator locations are shown. The steering ECUs are mounted under the front of the driver's seat, and the DBW ECU is mounted along the right wall of the trunk. The locations of the video camera (replacing the central rear view mirror), the radar RF components (behind the front bumper shroud), and the laser rangefinder (under the front bumper in the air grill) are indicated. The contents of the trunk, including the image processing computer, vehicle control computer, graphical status display computer, angular rate gyro, radar signal processing components, and interface electronics, are also shown.

The functional block diagram of the vehicle hardware is shown in the figure below. Data communication paths are indicated as well.

VISION BASED LANE POSITION SENSING

An image processing algorithm was developed to extract lane marker information from single camera monochrome image data in a form suitable for automated steering. It assumes a flat, dark roadway with light colored lane markers, either solid or broken, painted on it. The algorithm extracts statistically significant 'bright' regions from the image plane and fits functions to those points which are consistent with the qualitative properties of roadway lanes and with the geometric relationship between the ground and the image plane. Historical information about lane markers located in previous image frames is used to predict the location of lane markers in the current image. An estimate of the location of the virtual centerline of the lane is formed using left and right lane marker information if both are available; otherwise the virtual centerline is formed from the single visible lane marker and an estimate of the lane width. A low order polynomial is fitted to these location estimates, and the position information needed for steering control is extracted from the equation of this virtual centerline.
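As a rough illustration of the centerline estimation step, the Python sketch below fits a low order polynomial to centerline points derived from the detected markers. The road coordinate convention, the nominal lane width, and the polynomial order are assumptions for illustration only:

    import numpy as np

    LANE_WIDTH = 3.6  # assumed nominal lane width in meters

    # Marker points are (x, y) pairs in road coordinates: x is distance
    # ahead of the vehicle, y is lateral position (positive to the left).
    def virtual_centerline(left_pts, right_pts, order=2):
        """Fit a low order polynomial y(x) to centerline estimates."""
        if left_pts and right_pts:
            # Both markers visible: average the lateral positions
            # (assumes points at matching downrange distances).
            xs = [x for x, _ in left_pts]
            ys = [(yl + yr) / 2.0
                  for (_, yl), (_, yr) in zip(left_pts, right_pts)]
        elif left_pts:
            # Only one marker visible: shift by half the lane width.
            xs = [x for x, _ in left_pts]
            ys = [y - LANE_WIDTH / 2.0 for _, y in left_pts]
        elif right_pts:
            xs = [x for x, _ in right_pts]
            ys = [y + LANE_WIDTH / 2.0 for _, y in right_pts]
        else:
            return None
        return np.polyfit(xs, ys, order)

    # The lateral offset used for steering is then the polynomial
    # evaluated at the lookahead distance, e.g. np.polyval(coeffs, 6.0).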

Images are acquired using an inexpensive off the shelf monochrome CCD camera mounted in place of the central rear view mirror of the car. The algorithm is implemented on a Visionex Smarteye I, an inexpensive single board image processing system based on the Texas Instruments TMS320C30 digital signal processor. A block diagram of the Smarteye appears in the figure below. The board has four monochrome video inputs, two monochrome frame grabbers, two image buffers, and an output buffer, all composed of 512 x 512 pixels with a brightness resolution of 8 bits, as well as an overlay buffer which can be used to generate 16 color graphics overlays which, along with the contents of the output buffer, appear on an RGB-Sync video output. The processor runs at 40 MHz with one wait state access to external memory and buffers.

The vision system's on vehicle accuracy is better than +/- 5 cm at a lookahead distance of approximately 6 meters. The measurement update rate is approximately 17 Hz. Results are transmitted to the control computer over an RS-232 serial interface.

Idealized View of the Output of the Image Processing Algorithm

Radar Reflective Stripe Based Lane Position Sensing

The radar sensor measures lateral position by sensing backscattered energy from a frequency selective surface constructed as lane striping and mounted in the center of the lane. The conceptual basis for the radar lane tracking system is shown in the figure below. The radar reflective surface is designed such that radar energy at a particular frequency is reflected back toward the transmitting antenna at a specific elevation angle. Thus, by varying the frequency of the radar signal we can vary the lookahead distance of the sensor.

A block diagram of the radar system appears in the figure below. The radar chirps between 10 and 11 GHz over a 5 millisecond period, transmitting the radar signal from a centrally located antenna cone. Two receive cones, separated by approximately 14 inches, receive the reflected radar energy. The received signal is down-converted into the audio range by mixing with the transmit signal. The lateral offset of the vehicle is found as a function of the amplitudes of the down-converted left and right channel returns at the frequency corresponding to a particular lookahead distance.
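A minimal sketch of this amplitude comparison follows, assuming one sampled chirp per channel. The FFT-based bin selection and the scale factor mapping the normalized amplitude difference to meters are illustrative assumptions, not the actual OSU calibration:

    import numpy as np

    # Sketch (not the actual OSU signal processing) of estimating the
    # lateral offset from the left/right down-converted returns.
    def lateral_offset(left_chan, right_chan, fs, f_lookahead, gain=0.35):
        """left_chan/right_chan: samples of one chirp's down-converted
        signal; fs: sample rate (Hz); f_lookahead: beat frequency (Hz)
        corresponding to the chosen lookahead point on the stripe."""
        n = len(left_chan)
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        k = np.argmin(np.abs(freqs - f_lookahead))
        a_left = np.abs(np.fft.rfft(left_chan))[k]
        a_right = np.abs(np.fft.rfft(right_chan))[k]
        # A vehicle centered over the stripe sees equal returns in both
        # channels; the normalized difference gives a signed offset.
        return gain * (a_right - a_left) / (a_right + a_left)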

In addition, the peak energy in the down-converted signal appears at a frequency which is a function of the distance from the vehicle to an object ahead. It is thus possible to extract the distance to an object ahead of the automated vehicle using the radar hardware already in place for lateral sensing.
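This is the standard linear FMCW relationship between beat frequency and range. Using the sweep parameters given above (a 1 GHz sweep over 5 milliseconds), a sketch of the conversion:

    C = 3.0e8            # speed of light, m/s
    BANDWIDTH = 1.0e9    # 10 to 11 GHz sweep
    CHIRP_TIME = 5.0e-3  # seconds

    def range_from_beat(f_beat_hz):
        """Standard FMCW relation: R = c * f_b * T / (2 * B)."""
        return C * f_beat_hz * CHIRP_TIME / (2.0 * BANDWIDTH)

    # Example: a 10 kHz beat frequency corresponds to an object
    # 3e8 * 1e4 * 5e-3 / (2 * 1e9) = 7.5 meters ahead.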

Finally, by analyzing the signature of the received signal it is possible to determine whether the stripe is dry, wet, or covered with snow or oil.

The stripe itself was manufactured by 3M Corporation as a modification of an existing lane striping product. An aluminum foil layer with the appropriate slotted pattern was sandwiched between reinforced lane striping layers, and one side was coated with an adhesive. The color and texture of the stripe were matched to the road surface to avoid any confusion that might result from the presence of an extra lane marker. The tape is inexpensive, easy to install, and appears to be quite durable.

The radar system's on vehicle accuracy is +/- 5 cm at a lookahead distance of approximately 5 meters. The measurement update rate corresponds to the chirp rate, in this case 200 Hz. Final signal processing of the radar data was done on a small single board computer and the results transmitted to the control computer over an RS-232 serial interface at a rate of 100 Hz.

Control Software Structure

Driver and Passenger Information Panel

A graphical vehicle status display system was installed both to provide the driver with technical status and data and, during demonstrations, to provide the passengers with information about the functioning of the vehicle and the progress of the demonstration. A graphical view of the automated vehicle and its surroundings was designed by OSU and implemented on a Pentium computer by a subcontractor. This interface provides both numerical and graphical information concerning the status (active, inactive, good, or failed) and measurement data of each vehicle sensor.

A sample output screen is shown in the figure below. On this screen you can see the desired vehicle speed of 86 KPH, the desired following distance of 20 meters, the actual vehicle speed of 72 KPH, and the distance to and velocity of two other vehicles. Color coded icons represent the status of the vision system, the forward and side looking radar systems, the laser rangefinder, and the presence or absence of a radar stripe. Some of these icons change shape, size, or position as the value of the sensor measurement changes. The three cars also move relative to each other to realistically indicate the positions in the lanes of the vehicle and two nearby vehicles.
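As a trivial illustration of the color coding, a mapping along these lines could drive the icons. Only the four status values are taken from the description above; the particular colors are assumptions:

    # Hypothetical status-to-color mapping for the display icons; the
    # status values come from the text, the colors are assumptions.
    STATUS_COLORS = {
        "active":   "green",
        "good":     "green",
        "inactive": "gray",
        "failed":   "red",
    }

    def icon_color(status):
        return STATUS_COLORS.get(status, "yellow")  # unknown -> caution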


The NAHSC '97 Demonstration Track - San Diego, CA - I-15

The NAHSC 1997 Technical Feasibility Demonstration was held on an approximately 7.6 mile section of interstate I-15 just north of San Diego, California, during August 7-10, 1997. The road available to the demonstration consisted of a two lane corridor, designed as reversible high occupancy vehicle lanes, isolated from the rest of the freeway by concrete barriers and remote-controlled gates on all entrance and exit ramps. These lanes were open to traffic during normal weekday rush hour periods, so it was necessary to ensure that demonstration related activities did not interfere with public use of the roadway. The figure below shows the demonstration road, which represents realistic highway driving, with substantial hills and valleys and noticeable road curvature. Lane markers were painted on the shoulder side of each lane, and reflectors divided the two lanes of the road. Radar reflective stripe was installed in both lanes in approximately the middle third of the demonstration road.

Vehicle Development and Testing: TRC (Marysville, OH)

Testing of an automated passing maneuver at the Transportation Research Center test facilities in Marysville, Ohio.

Aerial Images of the OSU Scenario on I-15, San Diego, CA

Two automated vehicles (white) and one manually driven vehicle (dark green); the automated vehicles are in ACC mode with automated steering.

The second car decides to pass the slower manual car, and automatically performs a left lane change. The third car begins to close the gap left by the second car to maintain the desired vehicle following distance.

The second car has almost overtaken the lead car, and the third car continues to close the gap.

Selected Bibliography

The Ohio State University Automated Highway System Demonstration Vehicle, Keith A. Redmill and Umit Ozguner, 1998 SAE International Congress and Exposition, February 1998, SAE Paper 980855.

Forward-Looking Radar Navigation System for 1997 AHS Demonstration, David Farkas, Jon Young, Brian Baertlein, and Umit Ozguner, IEEE Conference on Intelligent Transportation Systems, November 1997, pp. 672-676.

A Simple Vision System for Lane Keeping, Keith Redmill, IEEE Conference on Intelligent Transportation Systems, November 1997, pp. 212-217.

Steering and Lane Change: A Working System, Cem Hatipoglu, Keith Redmill, and Umit Ozguner, IEEE Conference on Intelligent Transportation Systems, November 1997, pp. 272-277.

Automated Highway Studies at The Ohio State University - An Overview, Robert Fenton and Robert Mayhan, IEEE Trans. Vehicular Technology, 40(1), February 1991, pp. 100-113.


Comments and Questions: Keith Redmill (redmill@ece.osu.edu)