Meet the Regulators Trying to Make Sure Self-Driving Cars Are Safe

The California DMV has the unenviable job of regulating a new cadre of artificial intelligences that are directing 2,000-pound vehicles around the streets. How do they keep us safe without impeding the development of these vehicles?
Google's proposed driverless vehicle, unveiled Tuesday, May 27 (Google)

This week, the California Department of Motor Vehicles released its final regulations for the testing of autonomous vehicles on the state's roads. They create a process for companies like Google, Nissan, Mercedes-Benz, and the rest of the automakers to test cars that can drive themselves under certain circumstances.

By the end of the year, the DMV will issue an even more important set of regulations that will govern how the public can operate these cars. 

This is not an easy task, nor one that the regulators asked for. When the California legislature passed Senate Bill 1298 (Vehicle Code Section 38750), it tasked the agency with creating rules that would both encourage the development of autonomous vehicles and protect the public.

"The State of California, which presently does not prohibit or specifically regulate the operation of autonomous vehicles, desires to encourage the current and future development, testing, and operation of autonomous vehicles on the public roads of the state," the bill reads. "The state seeks to avoid interrupting these activities while at the same time creating appropriate rules intended to ensure that the testing and operation of autonomous vehicles in the state are conducted in a safe manner."

Which sounds reasonable. But how do you create "appropriate rules" that ensure safety without "interrupting" or slowing the development of the technology?

There isn't official federal guidance that the state can lean on. The National Highway Traffic Safety Administration has released a "preliminary" policy statement, but won't have real regulations ready for years. 

For most other safety issues, manufacturers certify that their products meet the safety regulations set by NHTSA (or, as people say it, "Nitsa"). The state of California (or any other state) doesn't get involved.

California's DMV has it particularly tough. The state has taken the regulatory lead in many areas over the years, thanks to its large population and innovative bureaucrats like energy-efficiency pioneer Art Rosenfeld, so other states are looking to California. And, of course, Google and many carmakers are conducting a big chunk of their research in and around Silicon Valley. So the internal pressure is high, the external pressure is high, and...

They have to figure out how to test the safe functioning of complex artificial intelligence systems commanding 2,000-pound vehicles that already kill some 32,000 people a year under human control!

Given that self-driving cars have the medium-term potential to change all kinds of things about the way that Americans, at least, move around, the three people at the DMV leading the charge are some of the most important people shaping our collective future.

Inside a Nissan autonomous vehicle during testing (Nissan).

Regulators, Mount Up

Within the DMV, there are two co-sponsors of the autonomous regulation project: Bernard Soriano, deputy director, and Stephanie Dougherty, chief of strategic planning. The third member of the triumvirate is Brian Soublet, assistant chief counsel at the DMV. 

I was able to speak with Soriano and Soublet at length this week, and they walked me through how the DMV is approaching this task.

"We're kind of in a bind because every vehicle that's on the roadway has to meet Federal motor vehicle safety standards," Soriano said. But there are no Federal regulations or Federal standards for autonomous technologies.

"NITSA is the one who develops the regulations for safety devices on vehicles, but—they admit this—they are years away, years away, from developing regulations for autonomous vehicles."

"So there is also a push at the state level—various states—to come up with regulations because the companies want to come out with the products," Soriano said. "If there are no federal regulations, they are turning to the states, asking, 'What are the states going to allow?' So California is one of a handful states that have passed legislation. We do regulations a lot, but we've never done regulations with regard to safety devices on vehicles."

So not only are they dealing with the novel problems of autonomous vehicles, but they've had to come up with solutions in an area that they aren't used to regulating. 

"This is the first time a state has had to license the testing of specific technology. Usually manufacturers just test their stuff and the state isn't involved," Soublet said. "So, when this bill got passed, it became, 'How do we do this—and how do we do this in less than two years?'"

How to Figure Out How to Regulate Autonomous Vehicles in Two Years or Less

The DMV put together a steering committee composed of representatives from other agencies: the California Highway Patrol, Caltrans, the California State Transportation Agency, the Department of Insurance, and the National Highway Traffic Safety Administration. Notably, their first liaison when they formed the committee was NHTSA deputy administrator Ron Medford, who is now the safety director for Google's self-driving car program.

Along with the governmental partners, they contracted with UC Berkeley's Institute of Transportation Studies and Stanford's Center for Automotive Research. They talked with people at Carnegie Mellon, where a lot of this research took place. And they asked for input from people in the automotive industry who are not affiliated with any particular company, like software engineers who'd helped them understand earlier issues such as the Prius's anti-lock brake software problems.

They also developed relationships with all the different automakers, suppliers like Bosch, and, of course, Google, trying to understand the potential and limitations of all these different research efforts. "We've gotten insight into what [the different companies'] strategies are. So we kind of have an idea about what they'll be doing," Soriano said. "Of course, we don't know the entire picture. They've only let us in on a little of what they're going to do. But with that knowledge, we have to start crafting the regulations."

At the same time, it's part of their remit to ensure that the motoring public is safe. 

"A huge part of our work group and our statewide steering committee is working with the California Highway Patrol," Soublet said. "And we do get concerns from them: How are you going to make sure who is responsible if violations occur? How is an officer going to know if a vehicle is autonomous? How is an officer going to interact with a vehicle that's autonomous?"

So, I asked them, how do you know if the cars are safe? The fundamental problem is that these algorithms are making decisions, and it is really hard to evaluate whether they will make the right ones, especially under novel conditions. How do we know that when an autonomous vehicle approaches an uncontrolled left turn (the kind without a green arrow), it is going to make a good decision?

They considered, but decided against, trying to get direct access to the driving software itself in order to inspect the algorithms at work. They just don't have the expertise, Soublet said, not to mention that the manufacturers would probably fight them tooth and nail if they tried to pry into their proprietary code, and that it's very hard to define the standards you might try to apply to software.

"We're really good at licensing drivers and regulating vehicles and the car sales industry, but we don't have a lot of expertise in developing those types of standards," Soublet said. "So as we start approaching things like that, we have to back off. We don't have the technical ability to do it. We have to come at this from a regulatory perspective of what we as a department are capable of."

Assuming they won't have access to Google's or Volkswagen's algorithms, or the expertise to evaluate them, they've come up with a compromise approach. One set of things they know they can measure is behavioral competencies. They know what a car needs to be able to do to drive on highways, on city streets, or on rural lanes. They can test for those competencies without actually peering under the metaphorical hood of the autonomous vehicle.

Volkswagen-Continental collaboration on the "automated" car.

What Happens When Artificial Intelligence Fails?

Most efforts to regulate an emerging technology encounter opposition from the developers of the technology. This one is no different. The biggest area of contention came in the reporting of failures of the autonomous systems. Not just crashes, which I'm pretty sure everyone would agree must be reported, but what are called "disengagements." They define this in the regulations as "deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle."

Basically: a disengagement signals when the car's AI did not work. "That was something that was very important for us to get," Soriano said. "We had a lot, a lot of pushback on that from the manufacturers." Of course, the companies did not want to report failures to the government agency that would be in charge of their permits. But it seems vital for public safety that someone outside of the companies themselves know the true state of the technology.

They decided to include this requirement, Soublet said, when "we started hearing about this race to have autonomous cars on the road. Nissan says we're going to have something by 2020. Within a week, Mercedes-Benz said they'd have it by 2020. Then Volvo says no one is going to die in our cars."

Regulators worried that the companies would push the limits of what their cars could do in order to be first. In the end, the reporting requirements are not that onerous, but they are interesting, and they will certainly create the best data about the state of self-driving cars that exists anywhere in the world. Each year, the companies will submit a monthly breakdown of their disengagements along with "the circumstances or testing conditions at the time of the disengagements." That will include:

  • The location: interstate, freeway, highway, rural road, street, or parking facility.
  • A description of the facts causing the disengagements, including: weather conditions, road surface conditions, construction, emergencies, accidents or collisions, and whether the disengagement was the result of a planned test of the autonomous technology.
  • The total number of miles each autonomous vehicle tested in autonomous mode on public roads each month.
  • The period of time that elapsed between when the autonomous vehicle test driver was alerted to the technology failure and when the driver assumed manual control of the vehicle.
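
To make those reporting fields concrete, here is a minimal sketch of how one month's disengagement filing might be represented as a data structure. This is purely illustrative: the class and field names are my own paraphrase of the categories above, not an official DMV schema.

```python
# Illustrative sketch only: field names paraphrase the DMV's reporting
# categories described above; this is not an official schema. Python 3.9+.
from dataclasses import dataclass
from datetime import date
from enum import Enum


class Location(Enum):
    INTERSTATE = "interstate"
    FREEWAY = "freeway"
    HIGHWAY = "highway"
    RURAL_ROAD = "rural road"
    STREET = "street"
    PARKING_FACILITY = "parking facility"


@dataclass
class DisengagementRecord:
    """One disengagement event, as a manufacturer might log it."""
    vehicle_id: str                    # hypothetical identifier for the test vehicle
    occurred_on: date
    location: Location
    circumstances: str                 # facts causing the disengagement: weather,
                                       # road surface, construction, emergencies, etc.
    planned_test: bool                 # was this a planned test of the technology?
    seconds_to_manual_control: float   # time from driver alert to manual takeover


@dataclass
class MonthlyReport:
    """A manufacturer's monthly breakdown, bundled into the annual filing."""
    month: str                         # e.g. "2015-01"
    autonomous_miles_on_public_roads: float
    disengagements: list[DisengagementRecord]
```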

These are the benchmarks on which autonomous vehicles will be judged. And they highlight the key anxieties of the technology developers and regulators. What will cause disengagements? How often will they happen? Where will they happen? And how safe was the handoff to a human driver?

Toyota autonomous vehicle prototype.

Speaking of that driver, another contentious issue was the requirement that a human driver be in the car and ready to assume control at all times. They also ask that the autonomous car operators have gone through a training course and have a clean driving record. All sensible stuff. 

But, Soriano said, "The manufacturers understood, yeah, the test drivers need to be safe, but they also had a philosophy—some of them—that we don't just want the skilled drivers to be in there." The companies wanted to see how normal people might react to unexpected situations. But the DMV had an answer for that: try it out on a track or in parking lots.

The last regulatory battle came over an interesting question: should commercial vehicles, trucks, and buses be allowed to experiment with autonomous driving or should the technology be limited to passenger vehicles? 

It's a difficult debate. Some of the most interesting possibilities for self-driving vehicles are in the commercial and logistics space. "Some companies have told us, 'We're looking at having 18-wheelers with this.' Or buses with this," Soriano said. "Even our partners at UC Berkeley have said there is a big benefit."

But no commercial vehicles will be allowed to test just yet in California. "We felt, from a public-perception standpoint, we did not want an 80,000-pound fully laden semi-truck on our roadways being tested with autonomous technologies," he continued. In the end, the DMV decided that coming up with rules to govern these other kinds of vehicles by 2015 would be impossible, so it will probably phase in those regulations later.

They know the regulations they put out in the coming year—both those for testing, which take effect in September, and those for actual operation, which will hit next year—won't be the "end-all and be-all," as Soublet put it.

But they will represent a significant step toward getting autonomous vehicles on the road safely and with proper governmental oversight. 

"It's an issue that draws you in. It's our future. We find it very exciting to work on," Soriano concluded. "Brian [Soublet] and I, we can't believe that we're working on this. It's something that will change the way that we all live."
