
Intel and Warner Bros. Announce Partnership to Develop In-Cabin, Immersive Experiences in Autonomous Cars of the Future

Autonomous Driving is Today’s Biggest Game Changer



So much of the discussion around autonomous driving has naturally focused on the car as a mode of transportation, but as driverless cars become a reality, we must start thinking of the automobile as a new type of consumer space. In fact, we have barely scratched the surface in thinking about the way cars will be designed, the interaction among passengers, and how passengers will spend time while they are riding and not driving. In this respect, autonomous driving is today’s biggest game changer, offering a new platform for innovation from in-cabin design and entertainment to life-saving safety systems.

Advancing what’s possible in autonomous driving, Intel today announced at the Los Angeles Auto Show a collaboration with entertainment company Warner Bros. to develop in-cabin, immersive experiences for autonomous vehicle (AV) settings. Called the AV Entertainment Experience, the project will create a first-of-its-kind proof-of-concept car to demonstrate what in-vehicle entertainment could look like in the future. As a member of Intel’s 100-car test fleet, the vehicle will showcase the potential for entertainment in an autonomous driving world.

The rise of the AV industry will create one of the largest expansions of consumer time available for entertainment in recent memory. As passengers shift from being drivers to riders, their connected-device time, including video-viewing time, will increase. In fact, recent transportation surveys indicate the average American spends more than 300 hours per year behind the wheel.

With this expansion of available time, Warner Bros. and Intel see significant possibilities inside the AV cabin. We envision passengers not only consuming familiar content such as movies and television programming, but also enjoying immersive experiences never seen before, courtesy of in-cabin virtual reality (VR) and augmented reality (AR) innovations. For example, a fan of the superhero Batman could ride through the streets of Gotham City in the Batmobile, while AR capabilities turn the car’s windows into a lens on the outside world, letting passengers view advertising and other discovery experiences.

While the possibilities of in-cabin entertainment are fun to imagine, the ultimate test for the future of autonomous cars is going to be winning over passengers. The technology will not matter if there are no riders who trust and feel comfortable using it.

We believe the technology Intel is bringing to market is not simply about enjoying the ride; it is about saving lives. In fact, autonomous systems are the logical extension of seat belts, air bags and anti-lock braking systems. And the Mobileye ADAS (advanced driver assistance system) technology on the road today is already saving lives. Current Mobileye ADAS products have been shown to reduce accidents by 30 percent, and are credited with saving 1,400 lives, preventing 450,000 crashes and averting $10 billion in economic losses. However, we cannot stop there. Our long-term goal must be zero driving-related fatalities.

To reach this goal, we need standards and solutions that will enable mass production and adoption of autonomous vehicles. For the long period when autonomous vehicles share the road with human drivers, the industry will need standards that definitively assign fault when collisions occur.

To this end, Intel is collaborating with the industry and policymakers on how safety performance is measured and interpreted for autonomous cars. Setting clear rules for fault in advance will bolster public confidence and clarify liability risks for consumers and the automotive and insurance industries. Already, Intel and Mobileye have proposed a formal mathematical model called Responsibility-Sensitive Safety (RSS) to ensure that, from a planning and decision-making perspective, the autonomous vehicle system will not issue a command that leads to an accident.
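As a rough illustration of the kind of rule RSS formalizes (a sketch based on the RSS paper published by Mobileye researchers, with the notation simplified here), the minimum safe longitudinal distance a following vehicle must keep behind a lead vehicle can be written as:

```latex
% RSS-style safe longitudinal distance (illustrative sketch, simplified notation).
% v_r: speed of the rear (following) vehicle,  v_f: speed of the front vehicle,
% \rho: response time of the rear vehicle,
% a_{\max}: maximum acceleration the rear vehicle may apply during \rho,
% b_{\min}: minimum braking the rear vehicle is assumed to apply afterward,
% b_{\max}: maximum braking the front vehicle may apply,
% [x]_+ := \max(x, 0).
d_{\min} = \left[\, v_r \rho + \tfrac{1}{2} a_{\max} \rho^{2}
  + \frac{\left(v_r + \rho\, a_{\max}\right)^{2}}{2 b_{\min}}
  - \frac{v_f^{2}}{2 b_{\max}} \,\right]_{+}
```

The intuition: a following vehicle that always keeps at least this distance, and brakes within its assumed limits once it reacts, cannot cause a rear-end collision even if the lead vehicle brakes as hard as possible; if a collision still occurs, the other party must have violated its own obligations, which is how the model assigns fault.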

And finally, safety systems of the future will rely on highly efficient technologies to handle the enormous amount of data processing that artificial intelligence requires.

Earlier this year, we closed our deal with Mobileye, the world leader in ADAS and creator of algorithms that can achieve better-than-human-eye perception through a camera. Now, with the combination of the Mobileye “eyes” and the Intel microprocessor “brain,” we can deliver more than twice the deep learning performance efficiency of the competition.* That is a huge difference, and one that matters: more than double the deep learning efficiency translates into better fuel economy and less expensive cooling solutions.

From entertainment to safety systems, we view the autonomous vehicle as one of the most exciting platforms today and just the beginning of a renaissance for the automotive industry.

*The comparison is based on Mobileye EyeQ5 TOPS performance expectations vs. NVIDIA’s claimed Xavier platform deep learning performance of 30 TOPS at 30W (source: http://www.nvidia.com/object/drive-px.html). Deep learning Tera Operations Per Second (DL TOPS): typically, 1 multiply-accumulate operation = 2 DL OPS. The widths of the integer matrix multiplication vary by architecture, dedicated hardware and supported topologies, and any claimed DL TOPS number depends on assumptions such as frequency, number of MACs and other hardware specifications. Performance tests and ratings are measured using specific computer systems and/or components and reflect the approximate performance of Intel products as measured by those tests. Any difference in system hardware or software design or configuration may affect actual performance. Buyers should consult other sources of information to evaluate the performance of systems or components they are considering purchasing. For more information on performance tests and on the performance of Intel products, visit Intel Performance Benchmark Limitations. Results have been simulated, were derived using simulations run on an architecture simulator, and are provided for informational purposes only.
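The arithmetic behind such efficiency comparisons is straightforward. The short Python sketch below simply restates the conventions from the footnote above (1 MAC = 2 DL OPS; efficiency measured as TOPS per watt); the MAC count and clock rate in the example are hypothetical placeholders, not published specifications of any product.

```python
# Illustrative arithmetic only, based on the conventions in the footnote above:
# 1 multiply-accumulate (MAC) = 2 deep learning operations (DL OPS).
# The MAC count and clock frequency below are hypothetical placeholders.

def dl_tops(macs_per_cycle: int, clock_hz: float) -> float:
    """Peak DL TOPS = 2 ops per MAC * MACs per cycle * clock rate, in tera-ops/s."""
    return 2 * macs_per_cycle * clock_hz / 1e12

def tops_per_watt(tops: float, watts: float) -> float:
    """Deep learning efficiency expressed as claimed TOPS divided by power draw."""
    return tops / watts

# Hypothetical accelerator: 2,000 MACs per cycle at 1 GHz -> 4.0 DL TOPS.
print(dl_tops(2_000, 1e9))

# Reference point from the footnote: a claimed 30 TOPS at 30W -> 1.0 TOPS per watt.
print(tops_per_watt(30, 30))
```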


(Press Release Image: https://photos.webwire.com/prmedia/6/217377/217377-1.png)







