Leave the Driving to Us

By Bryan Hay

Even George Jetson keeps at least one hand gripped on a joystick when he takes his space car out for a spin. With every headline boasting about another step toward launching self-driving cars come more Jetsonic scenarios of passengers spinning their seats around to flip open laptops, sip lattes, or snooze while their omniscient car safely brings them to their destination.

But those scenarios seem far-fetched compared with the present-day realities of the work being done by those on campus studying this automotive advancement.

Consider the municipalities that are starting to grapple with the sudden technological changes.

This spring, the Robert B. and Helen S. Meyner Center for the Study of State and Local Government will present workshops to municipal officials about how to prepare for autonomous vehicles.

John Kincaid, director of the Meyner Center and Robert B. and Helen S. Meyner Professor of Government and Public Service, and David Woglom, associate director for public service, have been actively discussing autonomous vehicles and which topics to include in the workshops.

There are plenty of speed bumps to consider—changes to motor-vehicle ordinances to reflect driverless cars, modifications to school crosswalks and bus stops, and signage embedded with a network of smart devices at circles, intersections, and pedestrian crosswalks.

Kincaid and Woglom have their work cut out for them as they seek to help the limited staffs and elected officials of smaller municipalities. Of the 2,600 municipalities in Pennsylvania, 75 percent have populations of 5,000 or fewer.

“Pennsylvania is the poster boy for inefficient government,” Woglom says. “There’s no question autonomous vehicles are coming. There are many questions to be answered, many scenarios to be addressed by state and local governments.”

One such question is infrastructure.

“From an engineering perspective, separate roads are not something being widely considered,” says Kristen Sanford Bernhardt, associate professor of civil and environmental engineering and chair of engineering studies.

“I don’t think infrastructure is seen as an impediment. The infrastructure is in place, so we will make vehicles and software to handle the situation,” she says. “Instead the discussion is about how to integrate autonomous vehicles and how to improve the computing algorithms to keep them driving in regular traffic streams.”

Michael Nees, associate professor of psychology, expects a protracted period during which vehicles with some degree of automation will interact with vehicles controlled by humans.

Think about something as basic as merging. Then think about autonomous and human-controlled cars merging into something like the Holland Tunnel.

“We’re hearing stories about cautiously programmed automation struggling in situations like merging into heavy traffic—situations where even a defensive-driving human must make an aggressive driving maneuver under the norms of regular, everyday driving behavior,” Nees says.

If simple merging sounds complicated, one of the biggest challenges to full integration is that autonomous vehicles have to be able to deal with people, who are unpredictable.

“Software systems are good at learning rules, but when people don’t follow the rules then it’s harder for the software systems to deal with them,” Sanford Bernhardt says.

The rules can get a bit sticky.

“As autonomous vehicles become more common on the road, how will it affect municipal services like police, EMS, and fire?” Woglom says. “If an autonomous vehicle goes through a red light, who’s to blame?”

As a human-factors psychologist, Nees is skeptical of the assumptions being made about human behavior; he doesn’t believe it’s wise to expect humans to serve successfully as backups if automation fails.

“We can’t get people to put down their phones now when they are fully responsible for the manual control of a vehicle,” he says.

“People’s minds wander. People are texting. There would be no human vigilance,” Nees says. “If a bot is doing all the steering, accelerating, and braking, imagine all the other things people will be doing. Imagine how unprepared they’ll be to take control of the car in the event of an emergency.”

But it’s not fair to blame the driver entirely in this case, he suggests.

“If an engineer designs a structure using materials that can’t support the weight of the structure, is it the fault of the materials when the structure collapses?” Nees asks.

“This is no different. The design is asking the human to do something that both psychological evidence and common sense have shown the human is not able to do well,” he says. “It’s disingenuous to describe failures of, say, Level 2 and 3 vehicle automation as ‘human error’ when the design began with bad assumptions about the capabilities of the human as a component in the system.”

So how can we make self-driving cars a reality?

“There’s a lot being written about your car driving you from place to place while you’re watching a movie or taking a nap or working,” Nees says. “I think we’re a long way from that on a large scale.”

As someone who studies the interface between people and things in the engineered world, Nees imagines a more realistic version of how vehicle automation might unfold that could include geofenced roadways for autonomous vehicles or shuttles.

“You can develop infrastructure that’s attached to the geography of the area,” he says. “Lower speeds and geofencing mean less risk for harm and more predictability. Pilot programs for low-speed driverless shuttles in urban areas and campus-like environments have shown a lot of potential.”

Seems like the self-driving future we all dream of is moving forward in a lower gear than we imagined.