For all the intricate technology required for autonomous cars (the sensors to replicate eyes and ears, the computers and algorithms to serve as the car's brains, the high-definition 3-D maps to guide them) there's another factor that computer science alone cannot solve: how these cars will engage with people, whether passengers, motorists, bicyclists or pedestrians, and vice versa. Carolyn Said and David R. Baker explore in the SF Chronicle what this actually implies.
The simple vocabulary most cars now employ (turn signals, brake lights, hazard lights, horns) may need to be radically expanded once the human element is removed from driving. That's true for situations ranging from the seemingly simple, like a pedestrian making eye contact with a driver before crossing in front of a car, to the more complex, like negotiating four-way stops and highway lane changes.
New communication methods could include patterned lights; audible cues (perhaps a polite voice saying "Cross now," or a musical tone as at some stoplights); rooftop displays showing symbols or words; laser devices to project a message, such as a crosswalk on the road ahead to indicate that it's safe to cross; and cars that wirelessly transmit their intentions to other vehicles.
See also Experientia’s interview with Nissan Research’s Melissa Cefkin (featured in the SF Chronicle article).