Everything You Wanted to Know About MOBILEYE and Were Too Embarrassed to Ask | Advanced Driver Assistance Systems
Summary by FFShort
Building on existing collaborations, Mobileye and
STMicroelectronics will jointly develop Mobileye's EyeQ5 system-on-chip,
which will serve as a sensor-fusion central computer for autonomous vehicles.
The fifth-generation system-on-chip is scheduled to sample in the first half of 2018;
prior-generation EyeQ devices are available on vehicles 'now or in
the near future,' performing functions such as lane-departure warning.
On the heels of NXP's announcement Monday of Bluebox, an autonomous-car engine, Mobileye and
STMicroelectronics on Tuesday rushed to reveal a new generation of their
vision SoC, the EyeQ5.
They are touting it as a sensor fusion central
computer for autonomous vehicles.
Unlike Bluebox, which is already sampling today, EyeQ5 is a new SoC
that will not be ready for another two years, according to ST/Mobileye.
NXP and ST/Mobileye -- two competing teams -- are taking different
approaches to seal deals with OEMs in the autonomous vehicles platform
battle.
On one hand, NXP is promoting not only the Bluebox engine,
but also a comprehensive autonomous-vehicle platform with sensor-fusion capabilities and decision-making functions.
Matt Johnson, NXP’s vice president and general manager for automotive microcontrollers
and processors, told EE Times that his company has shipped more than 30 million
ADAS processors worldwide, with eight of the world’s top 10 largest carmakers
using its processors.
On the other hand, the ST/Mobileye team is
angling to enter the sensor-fusion market for the first time. Mobileye long
appeared convinced that vision alone was enough to enable autonomous driving.
Egil Juliussen, director of research, Infotainment & ADAS
at IHS Automotive, told EE Times, "I see that they are changing their tune a
little. I suspect, under pressure from car OEMs, Mobileye is now adding
other sensory data to do sensor fusion on the chip."
Clearly, the ST/Mobileye team hopes to take advantage of the EyeQ chips' dominant
share of the automotive vision-SoC market for Advanced Driver Assistance Systems (ADAS).
Earlier this year, Mobileye co-founder
and Chief Technology Officer Amnon Shashua said one-third of the global car
industry is already using EyeQ chips.
He told the audience at a Mobileye press
conference that Toyota and Daimler are the only two automakers not using
Mobileye’s vision chips.
The ST/Mobileye EyeQ5 announcement is seen by many in the automotive industry as a
pre-emptive strike against NXP's Bluebox.
The two companies are co-developing the next generation of Mobileye’s SoC, with a view
to equipping Fully Autonomous Driving vehicles starting in 2020.
To meet power-consumption and performance targets, the EyeQ5 will be
designed in an advanced FinFET technology node of 10 nm or below and will feature
eight multithreaded CPU cores coupled with eighteen cores of Mobileye's next-generation vision
processors.
Taken together, these enhancements will increase performance
eightfold over the current fourth-generation EyeQ4.
The EyeQ5 will process more than 12 tera operations per second
while keeping power consumption below 5 W to maintain passive cooling. Engineering
samples of the EyeQ5 are expected to be available in the first half of 2018.
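The throughput and power figures quoted above imply a compute efficiency that can be checked with simple arithmetic; the sketch below just divides the announced throughput by the announced power envelope (these are the announcement's numbers, not measurements of actual silicon):

```python
# Back-of-the-envelope efficiency implied by the announced figures:
# more than 12 tera operations per second within a 5 W envelope.
tera_ops_per_second = 12.0  # quoted throughput (TOPS)
power_watts = 5.0           # quoted power ceiling (W)

# Efficiency in tera operations per second per watt.
tops_per_watt = tera_ops_per_second / power_watts
print(f"{tops_per_watt:.1f} TOPS/W")  # prints "2.4 TOPS/W"
```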
Building on its experience in automotive-grade designs, ST
will support state-of-the-art physical implementation, specific memory and
high-speed interfaces, and system-in-package design to ensure the EyeQ5 meets the
full qualification process aligned with the highest automotive standards.
"EyeQ5 is designed to serve as the central processor for future fully autonomous driving, both for the sheer computing density, which can handle around 20 high-resolution sensors, and for increased functional safety," said Prof. Amnon Shashua, co-founder, CTO, and Chairman of Mobileye.
"The EyeQ5 continues the legacy Mobileye began in 2004 with EyeQ1, in which we leveraged our deep understanding of computer-vision processing to develop highly optimized architectures that support extremely intensive computations at power levels below 5 W, allowing passive cooling in an automotive environment."
EyeQ5’s proprietary accelerator cores are optimized for a wide variety of computer-vision, signal-processing, and machine-learning tasks, including deep neural networks.
EyeQ5 features heterogeneous, fully programmable accelerators, with each of the four accelerator types in the chip optimized for its own family of algorithms.
Mobileye, which is also partnering with BMW and Intel for the same reason, claims a leadership position in image processing, localisation, mapping, and machine learning – all key technologies for automated driving.
The central pillar of Mobileye's product strategy is the EyeQ4/5 platform, which embraces sensors, signal processing, and sensor-data fusion to generate 360° awareness for the car.
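As a toy illustration of what sensor-data fusion means at its simplest, the sketch below combines two noisy range estimates for the same object by inverse-variance weighting. Mobileye's actual fusion algorithms are not public; the sensor names, distances, and variances here are invented for the example:

```python
# Illustrative only: minimal inverse-variance fusion of two range
# estimates (e.g., camera and radar) for the same object. The values
# below are made up; this is not Mobileye's algorithm.
def fuse(est_a, var_a, est_b, var_b):
    """Combine two noisy estimates, weighting each by 1/variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is less uncertain
    return fused, fused_var

# Camera estimates 50.0 m (variance 4.0); radar estimates 48.0 m
# (variance 1.0). The fused estimate leans toward the radar, which
# is the less noisy sensor here.
distance, variance = fuse(50.0, 4.0, 48.0, 1.0)
print(round(distance, 2), round(variance, 2))  # prints "48.4 0.8"
```

The same weighting generalizes to any number of sensors, which is why fused estimates improve as more (independent) sensors observe the same object.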
The systems also contain something Mobileye calls the Road Experience Management system, which helps manage vehicle data for real-time map generation.
Another contribution from Delphi's side is the Delphi Multi Domain Controller, which integrates camera, radar, and lidar sensors.
In addition, teams from both partners will jointly develop the next generation of sensor-fusion technologies, as well as driving policies that mimic human decision-making strategies.
Together with Ottomatika's driving-behaviour models and Mobileye's machine-learning system, the system will generate the kind of driving capabilities required in complex urban traffic situations, where coordinating with other traffic participants is essential.