November 30, 2017
During this week’s LA Auto Show, Intel and Warner Bros. announced a partnership to develop in-cabin, immersive entertainment experiences for autonomous vehicles. The companies are creating the AV Entertainment Experience, which Intel CEO Brian Krzanich describes as “a first-of-its-kind proof-of-concept car to demonstrate what entertainment in the vehicle could look like in the future.” Since Americans spend an average of 300 hours per year driving, self-driving automobiles would open a wealth of possibilities for using that time differently. The collaboration is looking beyond movies and TV programming to more immersive experiences.
“We imagine riders enjoying immersive experiences never seen before, courtesy of in-cabin virtual reality (VR) and augmented reality (AR) innovations,” writes Krzanich in an editorial posted in the Intel Newsroom. “For example, a fan of the superhero Batman could enjoy riding in the Batmobile through the streets of Gotham City, while AR capabilities render the car a literal lens to the outside world, enabling passengers to view advertising and other discovery experiences.”
Krzanich emphasizes the importance of public perception and adoption, in addition to safety and the need for standards. Riders will need to be comfortable with, and trusting of, the technology behind autonomous systems and related safety features, which Krzanich sees as “the logical extension of seat belts, air bags and anti-lock braking systems.”
“The Mobileye ADAS (advanced driver assistance system) technology on the road today is already saving lives,” he writes. “Current ADAS products from Mobileye have proven to reduce accidents by 30 percent, saved 1,400 lives, prevented 450,000 crashes and saved $10 billion in economic losses.”
Intel is working with the automotive industry and policymakers to develop standards and rules that will address “how safety performance is measured and interpreted for autonomous cars,” according to Krzanich. He says future safety systems will rely upon artificial intelligence and data processing power.
“Now, with the combination of the Mobileye ‘eyes’ and the Intel microprocessor ‘brain,’” notes Krzanich, “we can deliver more than twice the deep learning performance efficiency than the competition.”