At Mobileye, now part of Intel, CEO Amnon Shashua talks about standardizing safety definitions to better assess autonomous driving.
Shashua says the fatal driverless-car accident in Arizona won’t derail the industry timetable
Mobileye’s technology is at the center of the autonomous-driving systems many carmakers are developing. The Israeli sensor and camera specialist, which was acquired by chipmaker Intel in 2017, is keen to put in place a framework that will ensure that driverless-car technology is safe and secure. In an interview with automotiveIT, Mobileye CEO Amnon Shashua says the trend toward autonomous driving will continue, even after the recent deadly accident in Arizona. But, says Shashua, more transparency is needed in the way the industry approaches the technology.
automotiveIT: How will the recent deadly accident with a driverless car in your view affect people’s acceptance of the new technology?
Amnon Shashua: I think the development has moved too far along to be stopped. But it’s necessary to be more transparent about how these machines make their decisions. The only transparency that we have today is how many miles these cars have traveled and how often the safety driver has had to intervene. That doesn’t tell you anything. To get more transparency, we have to define safety in such a way that regulators, the car industry and technology providers can agree on one standard. I’m talking about a standard that provides information on how players define the safety of their decisions, what constitutes a dangerous situation and which driving strategies are the right ones to get out of such a situation.
How can you achieve such a standardization in the industry and what criteria should be used?
We have developed a formal model called Responsibility Sensitive Safety (RSS). This model is our attempt to create a solid mathematical foundation for a standardization discussion. RSS doesn’t favor or disadvantage anybody’s technology. We’re not standardizing an algorithm. What we want to standardize is the criteria: what it means to be in a dangerous situation and what it means to get out of such a situation. You can then do everything you need to do in terms of decision-making to fulfill these criteria, as long as your approach adheres to the definitions in our model.
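To make the idea of a formal, checkable criterion concrete: the published RSS paper (Shalev-Shwartz, Shammah, Shashua, 2017) defines, among other rules, a minimum safe longitudinal distance between two cars in the same lane. The sketch below implements that one formula; the parameter values in the example are hypothetical illustrations, not Mobileye's actual settings.

```python
def rss_safe_longitudinal_distance(
    v_rear: float,        # rear car's speed (m/s)
    v_front: float,       # front car's speed (m/s)
    response_time: float, # rear car's reaction time rho (s)
    a_max_accel: float,   # worst-case acceleration during reaction time (m/s^2)
    b_min_brake: float,   # rear car's guaranteed minimum braking (m/s^2)
    b_max_brake: float,   # front car's maximum possible braking (m/s^2)
) -> float:
    """Minimum gap so the rear car can always stop in time,
    even if the front car brakes as hard as physically possible."""
    # Speed the rear car may reach before it even starts braking.
    v_after_reaction = v_rear + response_time * a_max_accel
    d = (v_rear * response_time
         + 0.5 * a_max_accel * response_time ** 2
         + v_after_reaction ** 2 / (2 * b_min_brake)
         - v_front ** 2 / (2 * b_max_brake))
    # A non-positive value means any gap is safe in this situation.
    return max(d, 0.0)

# Hypothetical example: both cars at 20 m/s (72 km/h),
# 1 s reaction time, 3 m/s^2 accel, 4 m/s^2 vs. 8 m/s^2 braking.
gap = rss_safe_longitudinal_distance(20.0, 20.0, 1.0, 3.0, 4.0, 8.0)
```

Falling below this gap is what the model would call a "dangerous situation"; the corresponding "proper response" (brake at least at `b_min_brake`) is what gets you out of it. Standardizing definitions like these, rather than the planning algorithm itself, is the point Shashua makes above.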
Are such precise and comprehensive definitions actually possible? You’re going to have to define every conceivable situation.
When we researched our formal model we found that driving complexity can be reduced to four principles. There’s the crash typology of the US NHTSA, the National Highway Traffic Safety Administration. It captured 6 million crashes and divided them into 37 scenarios. We ran these 37 scenarios through our model and reached the conclusion that it’s in line with people’s ability to judge. We are now looking for further scenarios from different sources. Until now, all studies that we have conducted confirm the model: it does, indeed, reflect the human ability to judge who is responsible for an accident. We are open to adding further definitions, as well as to changing the definitions we have. But we believe it’s timely to standardize these kinds of definitions together with the regulators.
On a different topic: In late 2016, you announced jointly with BMW and Intel that you would be developing the car of the future. How are you progressing with that?
It’s going very well. Our 40 test vehicles have gathered dozens of petabytes of data. We have expanded the partnership to include Fiat Chrysler, Delphi and Magna, and we will accept further partners. We have worked on the sensor configuration and have figured out the placement of the cameras. And there was a string of sensor-related decisions, for example, who would supply lidar and radar sensors. That all went according to plan. In the course of the year, we will shift the center of our activities toward the US. The idea is to map the city of Santa Clara step by step in concentric circles around the Intel headquarters. That will then make autonomous driving in the city possible. This will happen by the end of 2018. In the same timeframe, we will complete the software stack for the sensor systems, the driving strategy and the RSS. The production hardware will come in 2019.
Mobileye was acquired by Intel a year ago. How has this affected your work and the technologies you are developing?
The merger allowed us to cover further areas in the development chain of autonomous vehicles. For example, we can now build a test fleet of 100 cars. These cars are deployed for a variety of purposes. One of these is to gain experience with autonomous driving in places as diverse as Jerusalem, Santa Clara and Arizona. We will also be able to load all the technologies we’re working on onto these test vehicles, from sensor systems to actuators and driving strategy all the way to issues such as security, communication and mapping. That’s a tremendous logistical effort. For a company the size of Mobileye, such an effort would be too big. But Intel can handle it.