Car Tech

Continental, ZF Debut New Autonomous Driving Tech at CES 2017

Automakers are relative newcomers to the CES spotlight, but the suppliers that manufacture the parts behind every new vehicle certainly aren’t. At the 2017 CES show, the same push towards autonomous technology that has dominated the discourse for OEMs was also on display in the booths of the big parts players from around the world – which is only natural given how heavily self-driving cars depend on sensors and computing power. Two of the leaders in outfitting the industry with the gear required to take human pilots out of the equation are Continental and ZF, and each of these tech giants takes a somewhat different approach to the autonomous question.

ZF, better known to gearheads for supplying transmissions to a long list of major automotive brands, has partnered with Nvidia to produce its own stand-alone ECU devoted entirely to self-driving artificial intelligence. Called “ProAI”, the processor is built on Nvidia’s Drive PX 2 platform, which lets ZF avoid a deep investment in replicating the raw computing power required to manage the vast volume of data generated during autonomous driving. The system, scheduled to hit the market in 2018, can handle input from every type of camera or sensor (including lidar and radar) used on the road, and its machine-learning capability allows ProAI to adapt to new driving situations over time.
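
ZF hasn’t published ProAI’s programming model, but the core idea – one central unit consuming camera, radar, and lidar data in a common format – can be sketched in a few lines. Everything below (the `Detection` record, the confidence-weighted merge, all names and numbers) is invented for illustration; it is not ZF’s or Nvidia’s API.

```python
from dataclasses import dataclass

# Hypothetical sketch of a central ECU fusing heterogeneous sensor feeds.
# None of these names come from ZF or Nvidia; they only illustrate one unit
# consuming camera, radar, and lidar input in a common format.

@dataclass
class Detection:
    sensor: str        # "camera", "radar", or "lidar"
    x: float           # longitudinal distance to object, metres
    y: float           # lateral offset, metres
    confidence: float  # 0.0 .. 1.0, as reported by the sensor driver

def fuse(detections: list[Detection]) -> tuple[float, float]:
    """Confidence-weighted average of per-sensor position estimates."""
    total = sum(d.confidence for d in detections)
    x = sum(d.x * d.confidence for d in detections) / total
    y = sum(d.y * d.confidence for d in detections) / total
    return x, y

if __name__ == "__main__":
    frame = [
        Detection("camera", 42.1, 1.9, 0.70),
        Detection("radar",  41.6, 2.2, 0.90),
        Detection("lidar",  41.9, 2.0, 0.95),
    ]
    print("fused object position: %.1f m ahead, %.1f m left" % fuse(frame))
```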

ZF also debuted a more broad-based tech offering at CES called X2Safe. This algorithm links disparate digital signals from automobiles, watches, phones, and other mobile devices, then manages them spatially as a cloud service. The end goal? To watch for, and warn individuals of, any potential collisions. Given that pedestrians and two-wheelers are disproportionately represented among automotive accident deaths each year, ZF is hoping it can convince OEMs and tech companies alike to participate in the X2Safe system by linking location data under a single safety umbrella. The technology takes its name from the V2X (vehicle-to-everything) communications systems that will be a vital cog in any future autonomous driving strategy.
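
ZF hasn’t detailed how X2Safe works internally, but its basic job – pooling position and velocity reports and warning when two tracks are converging – reduces to a standard closest-point-of-approach check. The sketch below is a guess at that core test, not ZF’s algorithm; the thresholds are arbitrary.

```python
import math

# Illustrative closest-point-of-approach (CPA) check, in the spirit of a
# service like X2Safe pooling reports from cars, phones, and watches.
# Standard geometry; nothing here is ZF's actual code.

def time_of_closest_approach(p1, v1, p2, v2):
    """Time (s) at which two constant-velocity tracks are nearest."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    speed2 = dvx * dvx + dvy * dvy
    if speed2 == 0.0:            # identical velocities: distance never changes
        return 0.0
    return max(0.0, -(dx * dvx + dy * dvy) / speed2)

def should_warn(p1, v1, p2, v2, radius=3.0, horizon=5.0):
    """Warn if the tracks pass within `radius` metres inside `horizon` s."""
    t = time_of_closest_approach(p1, v1, p2, v2)
    if t > horizon:
        return False
    cx = (p2[0] + v2[0] * t) - (p1[0] + v1[0] * t)
    cy = (p2[1] + v2[1] * t) - (p1[1] + v1[1] * t)
    return math.hypot(cx, cy) < radius

if __name__ == "__main__":
    car = ((0.0, 0.0), (14.0, 0.0))          # 14 m/s eastbound
    pedestrian = ((40.0, -6.0), (0.0, 1.5))  # walking north toward the road
    print(should_warn(*car, *pedestrian))    # True: the paths converge
```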

Not to be left out, Continental has also unveiled several systems and sensors intended to deliver functionality similar to what ZF has brought to the table. While it lacks the AI component featured in the ZF/Nvidia collaboration, Continental’s Assisted & Automated Driving Control Unit aims to provide centralized control over vehicle systems, offering a one-stop interface for connecting an automobile to a self-driving system. In addition, Continental’s 3D Flash Lidar joins the supplier’s existing crop of sensor systems (including the vehicle-surroundings radar that earned the company an award at this year’s CES).
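
Continental hasn’t published an interface for the unit, but the “one-stop” idea is easy to picture: instead of a self-driving stack talking to steering, brakes, and other subsystems individually, it addresses a single control unit that fans commands out. The facade below is purely hypothetical – the class and method names are invented, not Continental’s.

```python
# Hypothetical "one-stop" control-unit facade, illustrating the idea behind
# a centralized driving control unit. Subsystem names and methods are
# invented for this sketch; they are not Continental's interface.

class SteeringActuator:
    def set_angle(self, degrees: float) -> None:
        print(f"steering -> {degrees:+.1f} deg")

class BrakeActuator:
    def set_pressure(self, fraction: float) -> None:
        print(f"brakes   -> {fraction:.0%}")

class DrivingControlUnit:
    """Single entry point a self-driving stack connects to."""

    def __init__(self) -> None:
        self._steering = SteeringActuator()
        self._brakes = BrakeActuator()

    def apply(self, steering_deg: float, brake_fraction: float) -> None:
        # One call fans out to every vehicle subsystem involved.
        self._steering.set_angle(steering_deg)
        self._brakes.set_pressure(brake_fraction)

if __name__ == "__main__":
    ecu = DrivingControlUnit()
    ecu.apply(steering_deg=-2.5, brake_fraction=0.15)
```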

As impressive as these efforts might be, there are still a number of challenges facing autonomous driving that technology cannot yet dismiss with the wave of a digital wand. I spoke to Karsten Schulze, an engineer with German firm IAV Automotive Engineering, about a situation near and dear to the hearts of Canadian drivers: how, exactly, are self-driving cars going to deal with blizzard-like road conditions and the harsh winter environments we face on a regular basis?

The answer was not exactly heartening. IAV, which was working with partner Microsoft at CES as part of an autonomous technology showcase, is quite experienced in designing and building sensor systems that enable a vehicle to see the road around it. Schulze explained, however, that there will always – his words – be some situations where sensors cannot do their job, whether because the lines on the road have been obscured by snow, or because the sensors themselves are coated with ice, grime, or something else that blocks their field of view.

According to IAV, there are a few options for dealing with these situations. Surprisingly, Schulze was not of the opinion that better future sensors were the answer, as any electronic eye can suffer impairment given the right (or wrong) set of circumstances. Instead, he detailed how sensor technology must work together with infrastructure communications systems, especially in city environments, to fill in the gaps in the picture when rough weather intervenes. Hyper-accurate maps – well beyond the precision currently available from GPS – are also key to this sensor-assistance strategy.
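
Schulze’s argument – that sensors, infrastructure messages, and hyper-accurate maps must cover for one another – boils down to a source-selection rule. The sketch below invents a simple version: when on-board perception confidence collapses (snowed-over lane lines, an iced sensor), lean on V2X broadcasts or map data instead, and if nothing trustworthy remains, say so – which is where the fallback in the next paragraph comes in. The thresholds and names are assumptions, not IAV’s design.

```python
# Invented illustration of the point that no single source is always
# available: lane information is taken from whichever source still reports
# usable confidence. Threshold and names are assumptions, not IAV's design.

CONFIDENCE_FLOOR = 0.6

def pick_lane_source(camera_conf: float, v2x_conf: float, map_conf: float):
    """Return the best available lane-information source, or None."""
    sources = {
        "camera": camera_conf,   # degraded by snow on markings or the lens
        "v2x": v2x_conf,         # roadside infrastructure broadcasts
        "hd_map": map_conf,      # pre-surveyed, high-precision map data
    }
    name, conf = max(sources.items(), key=lambda kv: kv[1])
    return name if conf >= CONFIDENCE_FLOOR else None

if __name__ == "__main__":
    # Blizzard: camera nearly blind, but city infrastructure still talking.
    print(pick_lane_source(camera_conf=0.2, v2x_conf=0.8, map_conf=0.7))  # v2x
    # Rural whiteout: no infrastructure and a weak map fix.
    print(pick_lane_source(camera_conf=0.2, v2x_conf=0.0, map_conf=0.4))  # None
```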

Still, even with all of these features working together, IAV sees the potential for instances where the safest thing an autonomous vehicle can do is park itself and await assistance. This strategy is preferable, according to Schulze, to trying to power through inclement conditions with an imperfect view of the road. One could argue that the same advice applies equally well to human drivers facing slippery, low-visibility conditions. Yet when discussing a future where robots, AI, and sensors are expected not only to replace but to improve on human drivers, it’s a sobering perspective.
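
Expressed as logic, IAV’s “park and await assistance” position is a minimal-risk fallback: if perception stays below a safe threshold for long enough, stop rather than guess. The trigger below is, once again, only illustrative – the threshold and dwell time are invented, where a production system would derive them from formal safety analysis.

```python
# Illustrative minimal-risk-manoeuvre trigger in the spirit of the
# "park and await assistance" fallback. Threshold and dwell time are
# invented for this sketch.

SAFE_CONFIDENCE = 0.6
MAX_DEGRADED_SECONDS = 10.0

def should_pull_over(confidence_log: list[tuple[float, float]]) -> bool:
    """confidence_log: (timestamp_s, perception_confidence) samples."""
    degraded_since = None
    for t, conf in confidence_log:
        if conf < SAFE_CONFIDENCE:
            if degraded_since is None:
                degraded_since = t          # start the degraded-view timer
            if t - degraded_since >= MAX_DEGRADED_SECONDS:
                return True                 # persistently blind: stop safely
        else:
            degraded_since = None           # view recovered; reset the timer
    return False

if __name__ == "__main__":
    # Confidence collapses at t = 5 s and never recovers.
    log = [(float(t), 0.9 if t < 5 else 0.3) for t in range(20)]
    print(should_pull_over(log))  # True
```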