Microprocessor Report (MPR)

Mobileye Increases Car EyeQ

Computer-Vision Processors Will Enable Autonomous Vehicles

July 20, 2015

By Mike Demler


Mobileye supplies EyeQ computer-vision (CV) processors, which hold an 80% share of the market for advanced driver-assistance systems (ADASs). The company’s first-generation chips enabled intelligent headlight control, lane-departure warning, and traffic-sign recognition in BMW-deployed systems. Advances in EyeQ2 supported single-camera forward-collision warning and pedestrian-detection automatic emergency braking (AEB) in Volvos. For models rolling out later this year, Audi is employing the third-generation EyeQ3, which adds object, vehicle, and pedestrian recognition to provide more-comprehensive AEB. In 4Q15, Mobileye will begin sampling its fourth-generation EyeQ4 processor, whose new capabilities are designed to make self-driving cars a reality.

The company primarily supplies its chips with custom application software to Tier One automotive-system integrators such as Delphi, Magna Electronics, and TRW Automotive. The EyeQ3 powers Audi’s A7 autonomous test vehicle, which demonstrated highway self-driving from San Francisco to Las Vegas. The chip can detect animals as well as pedestrians at night, and it can recognize bumps, potholes, and debris as small as 10cm. OEMs have provided EyeQ-powered systems for more than 200 models from car manufacturers in the U.S., China, Europe, and Japan. Tesla is its own systems integrator; it worked directly with the chip developer to build EyeQ3 into its all-electric vehicles. Mobileye also manufactures complete aftermarket automotive-vision systems. That business, however, contributes just 20% of the company’s revenue, and Mobileye expects it to decline to 10% in the near future.

Government agencies that set vehicle-safety standards are driving demand for more-sophisticated ADASs. In the U.S., the National Highway Traffic Safety Administration (NHTSA) runs the New Car Assessment Program (NCAP), which rates vehicle safety using a five-star scoring system. Beginning in 2018, all new cars sold in the U.S. must include backup cameras. The NHTSA has yet to establish requirements for other vision-based systems, but it’s considering adding automatic braking to NCAP. On its Safercar.gov web site, the agency advises consumers to look for cars with forward-collision and lane-departure warning systems. The Euro NCAP is more advanced: to get a four- or five-star rating in Europe, a car must include crash-avoidance technology.

EyeQ4 will take nearly three years to reach volume production, owing to the automotive industry’s long development cycles. Mobileye announced that the processor has already earned a design win at an undisclosed European car manufacturer for vehicles that will be available in early 2018. In 2Q16, the company plans to deliver an EyeQ4 hardware test system to OEMs along with a suite of ADAS application software.

Accelerators Provide Better Braking

EyeQ4 runs a quad cluster of MIPS Warrior I-class CPUs at 1.0GHz with a shared 1MB L2 cache, as Figure 1 shows. Each CPU integrates four independent hardware threads (see MPR 9/29/14, “MIPS Warrior Joins 64-Bit Battle”). An array comprising three types of programmable vector accelerators performs real-time vision-processing functions, and it also provides the hardware to implement deep-learning convolutional neural networks (CNNs). STMicroelectronics has manufactured each EyeQ generation in progressively smaller process nodes, and it will build EyeQ4 in 28nm fully depleted silicon-on-insulator (FD-SOI) technology. The process evolution has enabled Mobileye to keep its new designs under the same 3W maximum power limit as previous versions despite adding considerably more hardware.

Figure 1. Block diagram of Mobileye’s EyeQ4 automotive-vision processor. The chip integrates a heterogeneous array of 10 programmable vector accelerator cores, which enable sophisticated ADAS features such as collision-mitigation braking and semiautonomous driving.

The accelerators include six vector microcode processors (VMPs), a pair of multithreaded processing clusters (MPCs), and two programmable-macro-array (PMA) cores. The VMPs employ essentially the same SIMD/VLIW architecture as the EyeQ2 and EyeQ3 processors, but Mobileye added two more instances to the quad configuration it used in the previous generations. EyeQ4’s VMP also increases parallelism from 64 to 76 multiply-accumulate (MAC) operations per cycle, and the clock frequency doubles to 1.0GHz, yielding a net 2.4x increase in per-VMP MAC throughput.
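The arithmetic behind that figure is simple; in the sketch below, the 500MHz EyeQ3 VMP clock is inferred from the statement that EyeQ4 doubles it to 1.0GHz.

# Per-VMP throughput scaling from EyeQ3 to EyeQ4, using the article's figures.
eyeq3_macs_per_cycle = 64   # MACs per VMP per cycle in EyeQ3
eyeq4_macs_per_cycle = 76   # MACs per VMP per cycle in EyeQ4
eyeq3_clock_ghz = 0.5       # inferred: EyeQ4 doubles the clock to 1.0GHz
eyeq4_clock_ghz = 1.0

eyeq3_gmacs = eyeq3_macs_per_cycle * eyeq3_clock_ghz  # 32 GMAC/s per VMP
eyeq4_gmacs = eyeq4_macs_per_cycle * eyeq4_clock_ghz  # 76 GMAC/s per VMP
print(eyeq4_gmacs / eyeq3_gmacs)                      # 2.375, i.e., ~2.4x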

The company withheld details of the MAC bit width, as well as whether the units are fixed or floating point; because the chip must operate within a 3W power limit, however, we expect they are simpler fixed-point units. Mobileye claims EyeQ4’s performance exceeds that of Nvidia’s Tegra X1, which powers that company’s Drive PX automotive-vision system, but that is true only for fixed-point algorithms. Each ALU in the X1’s Maxwell GPU can execute two 16-bit floating-point operations per cycle (see MPR 5/14/15, “Titan X: 50% More Maxwell”), and Nvidia rates the chip at 1Tflops of peak performance.
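Nvidia’s rating is consistent with the Maxwell GPU’s 256 shader cores (discussed with Drive PX below); the back-of-the-envelope check here assumes a roughly 1.0GHz GPU clock, which Nvidia doesn’t specify.

# Sanity check on Nvidia's 1Tflops peak-FP16 rating for Tegra X1.
shader_cores = 256     # Maxwell GPU shader cores
fp16_per_core = 2      # two-wide FP16 fused multiply-add issued per cycle
flops_per_fma = 2      # a fused multiply-add counts as two flops
clock_ghz = 1.0        # assumed clock; Nvidia doesn't specify it here

peak_gflops = shader_cores * fp16_per_core * flops_per_fma * clock_ghz
print(peak_gflops)     # 1,024 GFLOPS, i.e., ~1Tflops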

The MPCs and PMAs are new to EyeQ4, but the company withheld their architectural details. Whereas some CV processors employ hybrid DSP/GPU architectures blended with fixed hardware accelerators, EyeQ4 divides these functions into separate blocks that are fully programmable. Mobileye says the MPC provides greater flexibility than a GPU for accelerating OpenCL programs, although the core’s performance per watt is slightly lower. Each MPC has its own L1 instruction and data cache, but the company declined to reveal the sizes. The MPCs integrate 32 single-cycle MACs running at 1.0GHz, matching the VMP and CPU clock speeds.

Mobileye designed the PMAs to be more energy efficient than a DSP while providing compute density similar to that of a fixed-function hardware accelerator. The PMAs run at 750MHz, and each core includes 372 single-cycle MACs, which we also expect are fixed point. The 10 accelerator cores share a 2MB L2 cache, and an on-chip SRAM provides 64KB of local storage. All of the processor subsystems connect to an Arteris network-on-chip (NoC) through industry-standard Open Core Protocol (OCP) interfaces (see MPR 5/25/15, “Arteris FlexNoC Gets Physical”).
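Summing the disclosed MAC counts yields a rough estimate of EyeQ4’s raw accelerator throughput. This is our calculation, not a Mobileye specification, and it assumes the 32-MAC figure applies to each MPC individually.

# Rough aggregate MAC throughput across EyeQ4's 10 accelerator cores.
vmp = 6 * 76 * 1.00    # six VMPs, 76 MACs/cycle, 1.0GHz -> 456 GMAC/s
mpc = 2 * 32 * 1.00    # two MPCs, 32 MACs/cycle each (assumed), 1.0GHz
pma = 2 * 372 * 0.75   # two PMAs, 372 MACs/cycle, 750MHz -> 558 GMAC/s

total_gmacs = vmp + mpc + pma
print(total_gmacs)             # ~1,078 GMAC/s
print(total_gmacs * 2 / 1e3)   # ~2.2 tera-ops/s, counting a MAC as two ops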

EyeQ4 also integrates a MIPS M-class 5150 CPU in the peripheral transport manager (PTM) to control I/O transactions (see MPR 3/24/14, “Smallest Warrior Gets Virtualization”). The 5150 uses a 24KB L1 data cache, a 16KB instruction cache, and a 64KB scratchpad SRAM; Mobileye withheld the clock-frequency specification. The EyeQ4 I/Os include a Gigabit Ethernet (GbE) interface, three controller-area-network (CAN) buses that support the Flexible Data-Rate (CAN FD) protocol, and two 32-bit LPDDR3/LPDDR4-3200 ports for high-speed DRAM. CAN is an automotive standard for microcontrollers and in-vehicle diagnostics, but the classic protocol supports only a 1Mbps maximum data rate. Ethernet over twisted-pair wiring supports 100Mbps, which is more suitable for smart-car functions.
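The bandwidth gap explains the partitioning: raw video enters through dedicated camera ports (the MIPI CSI-2 inputs described below), while CAN and Ethernet carry control traffic and processed object data. The camera format in this sketch (1,280x960 monochrome at 36fps) is our assumption, not a Mobileye specification.

# Raw bandwidth for one hypothetical ADAS camera stream.
width, height = 1280, 960   # assumed sensor resolution
bits_per_pixel = 8          # assumed monochrome depth
fps = 36                    # frame rate cited in Mobileye's NN example

camera_mbps = width * height * bits_per_pixel * fps / 1e6
print(camera_mbps)  # ~354Mbps -- far beyond CAN's 1Mbps and even 100Mbps
                    # Ethernet, hence the dedicated video inputs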

Learning to Be Autonomous

One unique feature of previous-generation EyeQ-based systems is that they work with a single 50-degree camera mounted behind the rear-view mirror. Semi- and fully autonomous driving will require the ability to see even more than a human driver can, including both wider and more-distant fields of view, so EyeQ4 provides three MIPI CSI-2 video inputs. As Figure 2 shows, the second camera will use a 180-degree fisheye lens to detect pedestrians and vehicles in areas that the 50-degree camera would miss. The third camera has a narrow 25-degree field of view for long-range imaging. The chip also supports fusion with radar and laser sensors, which increase visibility in darkness, fog, and other inclement weather conditions.

Figure 2. Mobileye’s trifocal system. To enable semi- and fully autonomous driving, the EyeQ4 processor will combine images from three cameras that focus on 180-degree wide-angle, 50-degree near-field, and 25-degree long-distance views.
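The three focal lengths trade coverage for range: angular resolution scales inversely with field of view, as the simple comparison below illustrates. The shared 1,280-pixel horizontal resolution is our assumption.

# Relative pixel density (pixels per degree) for the three lenses,
# assuming all three cameras use the same hypothetical sensor.
sensor_width_px = 1280
for fov_deg in (180, 50, 25):
    print(f"{fov_deg:3d}-degree lens: {sensor_width_px / fov_deg:5.1f} px/deg")

# The 25-degree telephoto puts ~7x more pixels on a distant object than
# the 180-degree fisheye, which is why it handles the long-range view.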

Identification of pedestrians and vehicles can prevent collisions, but a self-driving car will also need to interpret a wide variety of signs and traffic lights. Using a single camera, EyeQ3 can detect and identify 250 different traffic signs from more than 50 countries, including directional arrows painted on pavement. Mobileye’s goal for EyeQ4 is to identify 1,000 signs from more than 100 countries. Even so, the ability to identify signage and traffic lights is insufficient to enable autonomous navigation. For that task, the ADAS must also combine location information from maps and GPS with motion sensors and real-time interpretation of the road.

The company says its approach to autonomous driving differs from that of Google’s self-driving car, which uses sensors to align the vehicle with stored maps. EyeQ4 enables real-time path planning, using its vision accelerators to execute deep-layered neural networks. According to Mobileye’s calculations, running a neural network that interprets a 36fps image stream consumes less than 1% of EyeQ4’s computing power; the predecessor chip, which lacks the new MPC and PMA accelerators, uses 5% to perform the same functions. The EyeQ system creates an environmental model on the fly and uses maps only as secondary information. The company estimates it needs less than 1MB of recorded data to keep an autonomous vehicle aligned for one hour of driving.
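Combining Mobileye’s 1% claim with our earlier throughput estimate gives a sense of the per-frame budget; both inputs are approximations, so the result is only indicative.

# Implied per-frame compute budget for EyeQ4's neural-network workload.
total_gmacs = 1078   # our aggregate accelerator estimate (GMAC/s)
nn_fraction = 0.01   # "less than 1% of EyeQ4's computing power"
fps = 36             # image-stream rate cited by Mobileye

macs_per_frame = total_gmacs * 1e9 * nn_fraction / fps
print(macs_per_frame / 1e6)   # ~300 million MACs available per frame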

Figure 3 shows a frame of information processed by Mobileye’s system in a congested city-driving test. The EyeQ processor classifies each pixel to distinguish whether it belongs to a curb, pavement, a pedestrian, or signage, for example. The green area illustrates the free space that is potentially available as a driving lane. Bounding boxes identify other vehicles and pedestrians. The system recognizes a bicyclist to the right and removes its space from the green area. The path-planning software draws a light-blue line to the vehicle in front, determines that the car can proceed by recognizing the green light, and reads directional arrows in the road.


Figure 3. Video frame from Mobileye’s path-sensing system. Deep-learning networks in the EyeQ processors enable real-time interpretation of the road and objects in the vehicle’s environment. (Source: Mobileye)
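The per-pixel classification that Figure 3 depicts follows the general pattern sketched below. This is purely illustrative of the technique, not Mobileye’s implementation; the class list and score maps are hypothetical.

import numpy as np

# Per-pixel semantic classification and free-space extraction (illustrative).
CLASSES = ["pavement", "curb", "pedestrian", "vehicle", "signage"]

# Hypothetical per-class score maps from a segmentation network: (H, W, C).
scores = np.random.rand(960, 1280, len(CLASSES)).astype(np.float32)

labels = scores.argmax(axis=2)                     # winning class per pixel
free_space = labels == CLASSES.index("pavement")   # candidate driving lane

# Obstacles such as the bicyclist carve their pixels out of the mask,
# leaving the green region that the path planner treats as free space.
print(free_space.mean())   # fraction of the frame judged drivable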

Mobileye says Audi will introduce the first vehicle with its semiautonomous driving system, based on EyeQ3, in 4Q15. Until 2017, that system will support use only on highways. In 2017, the company expects to add the ability to drive semiautonomously on country roads, which typically are less congested than highways and city streets but require navigation without the benefit of lane markings and curbs. In 2018, when EyeQ4 enters volume production, the company expects to enable semiautonomous city driving.

Competitors Enter the Race

Despite Mobileye’s formidable lead, the potential size of the automotive-vision market is attracting new entrants. The company shipped 2.7 million chips last year, but that number represents only 3% of the total global passenger-vehicle market. EyeQ processors command a $45 average selling price, and given the company’s 80% share, the market for automotive-vision chips is already worth roughly $150 million.
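The arithmetic behind that market estimate:

# Market size implied by Mobileye's shipments, ASP, and share.
chips_shipped = 2.7e6   # 2014 unit shipments
asp_dollars = 45        # average selling price per chip
market_share = 0.80

mobileye_revenue = chips_shipped * asp_dollars   # ~$121.5 million
total_market = mobileye_revenue / market_share
print(total_market / 1e6)   # ~$152 million, i.e., roughly $150 million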

Semiconductor vendors that are qualified suppliers for other automotive systems are in the best position to enter the ADAS segment, owing to their established OEM relationships. Freescale is an established automotive supplier, and it recently began sampling the S32V234 ADAS vision processor (see MPR 4/27/15, “Freescale Upgrades Automotive Vision”). That chip uses Cognivue’s Apex-642 CV core, which provides 64 SIMD/VLIW vector computational units. The S32V234 also integrates quad Cortex-A53 CPUs and a Vivante GC3000 GPU, and it supports up to eight video streams.

The S32V234 won’t enter volume production until 2H17, and it lags the computational power of EyeQ4 as well as that of the predecessor EyeQ3, which is already in production. Cognivue recently introduced deep-learning capabilities with its new Opus CV core, but the S32V234 lacks those features. The Freescale chip also consumes much more power than EyeQ, approaching 8W for multicamera applications.

Nvidia has won automotive designs for its Tegra graphics processors in digital dashboards and infotainment systems, and it’s looking to enter ADAS. At the 2015 CES, the company introduced the Drive PX platform, which it calls a self-driving-car computer. The Drive PX board includes two Tegra X1 chips, which can implement deep-learning neural networks with Cuda programs running on the Maxwell GPU’s 256 shader cores. The pair of X1 chips aims to replace the single EyeQ3 that performs those functions in Audi’s zFAS system, which powered the carmaker’s autonomous-driving prototype; that system uses a Tegra K1 to process image data from the four cameras, but EyeQ3 does the vision processing. EyeQ4, however, uses less than half the power of dual Tegra X1s and provides a more-efficient purpose-built architecture.
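A rough efficiency comparison follows from these figures; the EyeQ4 throughput number is our earlier estimate, and the dual-X1 power is only the lower bound that the “less than half” claim implies.

# Rough performance-per-watt comparison (our estimates, not vendor specs).
eyeq4_tops = 2.2            # aggregate accelerator estimate, MAC = 2 ops
eyeq4_watts = 3.0           # EyeQ4's stated maximum power

dual_x1_tflops = 2 * 1.0    # two Tegra X1 chips at 1Tflops FP16 each
dual_x1_watts = 2 * eyeq4_watts   # lower bound implied by "less than half"

print(eyeq4_tops / eyeq4_watts)        # ~0.73 TOPS/W for EyeQ4
print(dual_x1_tflops / dual_x1_watts)  # <=0.33 Tflops/W for dual Tegra X1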

Driverless Cars See 2020

Google has received a lot of publicity for its experimental self-driving cars, but mainstream carmakers are moving closer to putting autonomous vehicles on the road for the general public. Government regulations will determine just how soon self-driving cars become commonplace on public roads, but several testing programs are already underway. The potential reduction in accidents and loss of human life presents a huge incentive (see MPR 11/17/14, “Putting the ‘Auto’ in Automobile”). Annual traffic fatalities in the U.S. alone number roughly 30,000, and some estimates put the worldwide toll at more than one million. Tesla CEO Elon Musk has predicted that human driving may even be banned in the future, once autonomous cars are proven to be safer.

For that to happen, the system costs must come down so that ADAS is affordable in mainstream vehicles, not just $80,000 luxury cars and Teslas. Mobileye’s business model is to provide a complete hardware/software package to OEMs, with pricing determined by the number of software functions included. This approach will enable car manufacturers to differentiate using the same base hardware across multiple models, offering features such as automatic braking in lower tiers and introducing autonomous driving in higher tiers. The chip vendor estimates it can sell EyeQ4 with a full suite of autonomous-driving software for $150, but increased volumes will drive down cost.

Increased competition could also force Mobileye to reduce its prices. Carmakers and OEMs can license computer-vision intellectual-property (IP) cores to build their own ASICs. Synopsys has demonstrated traffic-sign detection running on neural networks in its DesignWare EV cores (see MPR 4/13/15, “Synopsys Embeds Vision Processing”). Also, Ceva has introduced its fourth-generation XM4 vision-processing cores (see MPR 4/27/15, “Ceva Sharpens Computer Vision”). The challenge for these vendors is to match the EyeQ software package and prove their hardware in the field.

The first semiautonomous car will appear later this year, and the first fully autonomous vehicle will likely ship by 2020. We expect new growth markets such as China to accelerate adoption. That country is now the world’s largest car market, and it will probably follow Europe’s NCAP standards for inclusion of collision-prevention systems. By 2025, we expect autonomous driving systems to be affordable additions to most new cars, and lower insurance premiums will compensate for the upfront cost.

Price and Availability

Mobileye has reported that its chip average selling price (ASP) is $45. The company packages the processors with application software, which increases the price according to the feature set. It expects processors plus software for fully autonomous vehicles to command a $150 ASP. EyeQ4 engineering samples will be available in 4Q15. For more information, access www.mobileye.com/technology/processing-platforms/. To view a video of Mobileye’s city-driving demonstration, access https://youtu.be/kp3ik5f3-2c.
