Sunday 25 December 2016

Inside BMW's Effort to Deliver a Self-Driving Machine for 2021

Earlier this year, while celebrating its centennial, BMW told us it plans to offer a self-driving car by 2021 called the iNext. A battery-electric luxury sedan with autonomous capability, developed in partnership with suppliers Intel and Mobileye, the iNext represents an ambitious reach for a company that has—thus far—not been regarded as a leader in either electric cars or self-driving technology. Now we've toured the facilities where this work is underway to see just how big an effort will be required to change that reality.

"Fully autonomous driving is like a mission to Mars," Klaus Fröhlich, BMW board member for research and development, told us at a press briefing in Munich. "We're calling it Project i2.0." Sales and marketing chief Ian Robertson compared it to an "Apollo mission" and cautioned that the company is "in the very, very early stages of this digitalization. We are shifting our horizons dramatically. We have been a classic engineering company, but we are going to become a tech company."
Suggesting how much research will be devoted to the program, Fröhlich noted that BMW's entire R&D program today encompasses 60 petabytes (one petabyte is one million gigabytes) of data; the new project will require 600 petabytes, or a tenfold increase. In five years.
A lot of that will be devoted to developing the artificial intelligence (AI) needed to make a self-driving car possible in that timeframe. Artificial-intelligence research has been at the forefront of many tech companies’ and automakers’ autonomous efforts this year. Companies like Google, IBM, and Intel, all with footholds in the auto industry, launched a best-practices group for the technology in September. Automakers are attempting to keep pace, the latest being Daimler, which announced Thursday it will participate in a research initiative in Stuttgart and Tübingen pursuing AI advances.
"Europe, generally, has not been in the forefront of AI development, but it's what you need to make an autonomous vehicle," Fröhlich said. This is especially so given the short timeframe BMW is allowing itself. While many in the field are invested heavily in vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communications to achieve the aims of driverless mobility, BMW expects to make each of its autonomous cars think and drive like a human being, said Bernd Muster, director of digital strategy for the company. This is because leadership in the field won't come to those who wait for other automakers and governments to implement V2V and V2I standards. Muster commented: "If you you want to do it with the car, it takes five years; if you need to use infrastructure, it will take 20."
Rene Grosspietsch, head of autonomous driving cooperation and ecosystems, said the system needs not only sensors onboard but also a high-definition digital map, updated in real time for changing road and traffic conditions, that is much more detailed and accurate than any available today.
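To give a sense of what "updated in real time" implies for the software, here is a deliberately simple Python sketch, not BMW's actual stack, of one way a car might decide whether a cached map tile is still fresh enough to trust; the tile fields and freshness budget are our own illustrative assumptions.

```python
# Hypothetical sketch: is a cached HD-map tile still fresh enough to rely on?
# Assumes each tile carries a last-updated timestamp and a freshness budget.
from dataclasses import dataclass
from time import time

@dataclass
class HdMapTile:
    tile_id: str
    last_updated: float  # Unix timestamp of the last live update
    max_age_s: float     # freshness budget for this class of tile

def tile_is_usable(tile: HdMapTile, now=None) -> bool:
    """Return True if the tile is recent enough to trust for planning."""
    now = time() if now is None else now
    return (now - tile.last_updated) <= tile.max_age_s

# Example: a construction-zone tile refreshed 90 seconds ago, 5-minute budget.
tile = HdMapTile("munich/altstadt/42", last_updated=time() - 90, max_age_s=300)
print(tile_is_usable(tile))  # True
```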
The grips for the control yoke of BMW's Vision Next 100 autonomous concept collapse together into the dash when unneeded.
As an example of what they hope to teach the artificial intelligence to do, Grosspietsch cited the circumstance of a car approaching a pedestrian standing at a roadside. A human driver would read the body language and perhaps the facial expression of the person on foot to evaluate whether that person intends to stay at the curb or step into the road. A car, he suggested, can be taught to do essentially the same thing, but it's a much more complicated task than just steering between lane markers, so the vehicle must be taught to interpret postures and facial expressions as well as it does traffic signs or turn indicators on other cars and to make projections from those inferences. BMW already has begun what it expects will be more than 30 million miles of testing to record the needed data and teach the AI to function in complex environs. One short-term goal: completely autonomous navigation of a precisely mapped sector of downtown Munich by the end of next year.
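To make that kind of inference concrete, here is a toy Python sketch of how a handful of pose cues might be folded into a single crossing-intent estimate. The feature names, weights, and logistic form are our own illustrative assumptions, not BMW's model.

```python
# Toy illustration only: combine a few pose cues into a rough probability
# that a pedestrian at the curb is about to step into the road.
import math

def crossing_probability(facing_road: float, lean_toward_road: float,
                         walking_speed_mps: float, distance_to_curb_m: float) -> float:
    """facing_road and lean_toward_road are scored from 0.0 (not at all) to 1.0 (fully)."""
    score = (2.0 * facing_road
             + 1.5 * lean_toward_road
             + 1.0 * walking_speed_mps
             - 1.2 * distance_to_curb_m
             - 1.0)                        # bias toward "stays on the curb"
    return 1.0 / (1.0 + math.exp(-score))  # logistic squash into [0, 1]

# A pedestrian facing the road, leaning in slightly, nearly at the curb edge:
print(round(crossing_probability(0.9, 0.7, 0.4, 0.2), 2))  # about 0.88
```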
Even at that, BMW is saying its vehicle in 2021 will achieve Level 3 autonomy. In shorthand, Level 1 means an autonomous feature can assist the driver with some driving tasks; Level 2 means the system can handle some parts of the driving while the driver continuously monitors; and Level 3 means the automated system can handle all driving in some circumstances, such as regular highway driving, with the human driver no longer required to monitor constantly but ready to assume control on request. At Level 4, a vehicle can conduct all operations in certain environments and conditions, without a driver necessary. At Level 5, the car can perform all driving tasks under all conditions.
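For reference, that ladder can be written down compactly; the paraphrasing below follows the article's shorthand rather than the formal SAE J3016 wording.

```python
# The autonomy levels described above, restated as a Python enum.
from enum import IntEnum

class AutonomyLevel(IntEnum):
    L1_DRIVER_ASSIST = 1  # a feature assists with some tasks; the driver drives
    L2_PARTIAL = 2        # the system handles parts of the driving; the driver monitors
    L3_CONDITIONAL = 3    # the system drives in some settings; the driver takes over on request
    L4_HIGH = 4           # the system handles everything in certain environments; no driver needed there
    L5_FULL = 5           # the system handles all driving under all conditions

def driver_must_monitor(level: AutonomyLevel) -> bool:
    """Through Level 2, the human is still the active supervisor."""
    return level < AutonomyLevel.L3_CONDITIONAL
```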
BMW's 2021 car will be fully capable of Level 3 autonomy and, in limited circumstances, could operate at Level 4 for a while. Even at Level 3, the AI would have to know its own limitations: "In conditions of heavy rain, snow, fog, or dense rush-hour traffic, the car is going to say, 'This is beyond me, the driver has to take over.' " To that, add BMW's commitment that its cars will always give owners the choice of taking control to enjoy the experience of driving, even if the car is fully capable of managing on its own.
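A minimal sketch of that "this is beyond me" handover check might look like the following; the condition names come from the quote above, while the structure and threshold-free any-of logic are assumptions made purely for illustration.

```python
# Hypothetical handover check: request the driver take over when conditions
# exceed what the automated system is designed to handle.
from dataclasses import dataclass

@dataclass
class DrivingConditions:
    heavy_rain: bool
    snow: bool
    fog: bool
    dense_rush_hour_traffic: bool

def request_driver_takeover(c: DrivingConditions) -> bool:
    """Return True when the automated system should hand control back."""
    return any((c.heavy_rain, c.snow, c.fog, c.dense_rush_hour_traffic))

print(request_driver_takeover(DrivingConditions(False, False, True, False)))  # True
```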
For Level 4 function, in which the human "driver" might be freed to take a nap, tend to email, or play Uncharted 7, BMW officials say they are not as concerned with the finer ethical questions some raise, such as the Trolley Problem, which asks whether the car should choose to prioritize the lives of its occupants over those outside the vehicle. (Mercedes-Benz already has come down on the side of cars that prefer the preservation of their owners' and occupants' safety, although it later contended that its quotes were misconstrued. We stand by our reporting.) Circumstances that raise such issues, Muster suggested, likely will be even more rare than today's projections imagine. "With very good sensors, and being very conservative in the most dangerous situations, and remembering that the autonomous car needs fractions of a second to react, where a human can take 1.3 to 5.0 seconds, these cars will almost never be asked to make such decisions."
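The arithmetic behind that reaction-time point is easy to show. The 1.3-to-5.0-second human figures come from Muster's quote; the 0.1-second machine latency and the 50-km/h speed in the sketch below are our own illustrative assumptions.

```python
# Distance travelled during the reaction delay, before any braking begins.
def reaction_distance_m(speed_kmh: float, reaction_time_s: float) -> float:
    return (speed_kmh / 3.6) * reaction_time_s  # km/h -> m/s, then d = v * t

for label, t in [("machine (~0.1 s)", 0.1), ("fast human (1.3 s)", 1.3), ("slow human (5.0 s)", 5.0)]:
    print(f"{label}: {reaction_distance_m(50, t):.1f} m before reacting")
# At 50 km/h: about 1.4 m for the machine, 18.1 m to 69.4 m for the human.
```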
As if teaching a robot to drive weren't challenging enough, this artificial intelligence has to be packaged into a car that meets all the other expectations consumers bring to the marketplace while adapting to a rapidly changing set of rules, regulations, and laws globally. "Our mule right now is a 7-series sedan, and its trunk is completely packed with hardware," Grosspietsch said. So, yeah, packaging may be a challenge, too.
This is where Fröhlich expects BMW's experience as an automaker to give it advantages over newcomers like Apple or Google. "We have what startups don't have. They have to learn on the job. We have to be faster [than a traditional car company], act like a startup to get these things done. But when we get that product on the road in 2021, we have to act like a grown-up."
How might a grown-up car company be different from a Silicon Valley Johnny-come-lately? One example Fröhlich cited was over-the-air updates for vehicle software. BMW wants to be able to do that—but only to upgrade its systems. "Automatic software updates are to improve the product for the customer, such as adding functions or options, not to do patches," he said. "The vehicle has to be robust before it enters the market so that we do not need this system for fixing bugs."
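In practice, that policy amounts to a gate on what kinds of packages are allowed to ship over the air. The sketch below is purely illustrative, with invented update-type names; it is not BMW's update system.

```python
# Hypothetical policy gate: OTA delivery is reserved for customer-facing
# improvements, never for bug patches.
ALLOWED_OTA_TYPES = {"new_function", "option_unlock", "map_content"}

def may_ship_over_the_air(update_type: str) -> bool:
    return update_type in ALLOWED_OTA_TYPES

print(may_ship_over_the_air("new_function"))  # True
print(may_ship_over_the_air("bug_patch"))     # False
```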
