Self-driving cars and humans must learn a common language

For all the intricate technology required for autonomous cars — the sensors to replicate eyes and ears, the computers and algorithms to serve as the car’s brains, the high-definition 3-D maps to guide them — there’s another factor that computer science alone cannot solve: how these cars will engage with people — passengers, motorists, bicyclists and pedestrians — and vice versa. In the San Francisco Chronicle, Carolyn Said and David R. Baker explore what this actually implies.

The simple vocabulary most cars now employ — turn signals, brake lights, hazard lights, horns — may need to be radically expanded once autonomous driving removes the human element. That holds for everything from seemingly simple situations, like a pedestrian making eye contact with a driver before crossing in front of a car, to more complex ones, like negotiating four-way stops and highway lane changes.

New communication methods could include patterned lights; audible cues (perhaps a polite voice saying “Cross now,” or a musical tone as at some stoplights); rooftop displays showing symbols or words; laser devices to project a message such as a crosswalk on the road ahead to indicate that it’s safe to cross; and cars that wirelessly transmit their intentions to other vehicles.
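The last idea on that list, cars wirelessly transmitting their intentions, is the closest to a defined protocol. As a purely illustrative sketch of what such a broadcast might carry (the message fields, the multicast address, and the use of UDP with JSON are all assumptions here; real vehicle-to-vehicle systems run on dedicated standards such as DSRC or C-V2X), a self-declared intent message could look like this:

```python
import json
import socket
import time

# Hypothetical multicast group for local intent broadcasts; real V2V
# communication uses dedicated radio standards, not plain UDP.
V2V_GROUP = ("239.255.0.42", 47000)

def make_intent_message(vehicle_id: str, intent: str, confidence: float) -> bytes:
    """Encode a vehicle's declared intent as a small JSON payload."""
    payload = {
        "vehicle_id": vehicle_id,
        "intent": intent,          # e.g. "yielding_to_pedestrian", "changing_lane_left"
        "confidence": confidence,  # the planner's confidence in its own declared intent
        "timestamp": time.time(),
    }
    return json.dumps(payload).encode("utf-8")

def broadcast_intent(message: bytes) -> None:
    """Send the intent message to any nearby listeners on the multicast group."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        # TTL of 1 keeps the broadcast on the local link, i.e. "nearby traffic".
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
        sock.sendto(message, V2V_GROUP)

if __name__ == "__main__":
    # A car announces that it is yielding, so another vehicle (or a
    # pedestrian's device) could surface a "safe to cross" signal.
    broadcast_intent(make_intent_message("car-001", "yielding_to_pedestrian", 0.97))
```

The point of the sketch is the content rather than the transport: a machine-readable "I am yielding" signal plays the same role for other software that eye contact plays for a human driver.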

See also Experientia’s interview with Nissan Research’s Melissa Cefkin (featured in the SF Chronicle article).