Ford to launch the next-gen Fusion Hybrid at CES 2017

Ford Fusion Hybrid autonomous development car sneak peek. Image: YouTube.

The Ford Motor Company officially revealed its next-generation Fusion Hybrid autonomous development vehicle today. The car will make its formal debut at the Consumer Electronics Show and the North American International Auto Show, both in January 2017.

The new vehicle runs on the same Ford autonomous vehicle platform but features revamped computer hardware, including upgraded LiDAR sensors with a better design and a broader field of view.

The introduction of this new Fusion Hybrid is the latest move in Ford’s Smart Mobility Plan, under which the company is developing autonomous technologies to compete in the growing self-driving car industry.

Ford recently partnered with Uber to bring a fleet of self-driving first-generation Fusion models to the streets of Pittsburgh.

Ford will triple the size of its autonomous fleet

A press release indicated that the cars now use two sensors instead of four while still capturing the same amount of data from the road. The redesign is paired with greater processing power, though the company did not release a spec sheet.

Ford began producing self-driving Fusion models in early 2013 with an initial batch of ten cars. It tripled that number in 2016 by building 20 more units, and it now plans to make 90 autonomous hybrids in 2017.

Ford’s intention mirrors that of other companies such as Google, which seek to create an entirely autonomous vehicle, certified as Level 4 under the standards of the Society of Automotive Engineers (SAE).

SAE defines the fourth level of self-driving as “High Automation”: the car must perform all aspects of the driving task even if a human driver inside the vehicle does not, or is unable to, take back control.
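
For readers keeping the SAE taxonomy straight, here is a minimal sketch of the J3016 levels. The level names are SAE’s, but the data structure and helper function are purely illustrative and not drawn from Ford or SAE software.

```python
# Illustrative sketch of the SAE J3016 levels of driving automation.
# Level names are SAE's; the dictionary and helper below are hypothetical.

SAE_LEVELS = {
    0: "No Automation",
    1: "Driver Assistance",
    2: "Partial Automation",
    3: "Conditional Automation",
    4: "High Automation",   # the system handles everything within its design domain
    5: "Full Automation",
}

def human_fallback_required(level: int) -> bool:
    """At Levels 0-3 a human driver must be ready to take back control;
    at Level 4 and above the system does not rely on a human fallback."""
    return level < 4

for level, name in SAE_LEVELS.items():
    note = "human fallback required" if human_fallback_required(level) else "no human fallback needed"
    print(f"Level {level}: {name} ({note})")
```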

Ford’s current advancements in sensor technology

In a Medium post, Chris Brewer, Ford’s Chief Program Engineer for the company’s autonomous vehicle program, said his team is working with two different methods of perception: mediated and direct.

Mediated perception is responsible for creating the high-resolution 3D maps the car uses to drive on its own. Direct perception is what helps the car ‘see’ moving objects, which could range from a cyclist on the road, to an obstacle, to hand signals.
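
As a rough illustration of that split, the hypothetical sketch below combines a prior map lookup (mediated perception) with live sensor detections (direct perception) into a single picture of the scene. None of the names or structures here come from Ford’s actual software.

```python
# Hypothetical sketch of combining mediated and direct perception.
from dataclasses import dataclass

@dataclass
class MapFeature:        # mediated: known in advance from the high-resolution 3D map
    kind: str            # e.g. "lane_boundary", "traffic_light", "stop_sign"
    position: tuple      # (x, y) in map coordinates

@dataclass
class Detection:         # direct: observed live by LiDAR, radar, and cameras
    kind: str            # e.g. "cyclist", "pedestrian", "debris"
    position: tuple
    velocity: tuple

def build_world_model(map_features: list[MapFeature], detections: list[Detection]) -> dict:
    """Merge the static prior (mediated) with moving objects seen right now (direct)."""
    return {
        "static": map_features,   # where the road, signs, and signals should be
        "dynamic": detections,    # what is actually moving around the car
    }

# Example: a traffic light known from the map plus a cyclist seen by the sensors.
world = build_world_model(
    [MapFeature("traffic_light", (12.0, 3.5))],
    [Detection("cyclist", (8.0, 1.2), (1.5, 0.0))],
)
print(world)
```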

Ford’s LiDAR (Light Detection and Ranging) sensors can cover an area the length of two football fields in every direction. The central computer, located in the trunk, generates about 1TB of data per hour.
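
Taking those figures at face value, a quick back-of-the-envelope calculation shows the approximate sensor range in metres and how fast that data stream adds up; the test-shift length below is an assumption for illustration, not something Ford has stated.

```python
# Back-of-the-envelope arithmetic from the figures reported above.
FOOTBALL_FIELD_M = 91.44              # one U.S. football field (100 yards) in metres
lidar_range_m = 2 * FOOTBALL_FIELD_M  # "two football fields in every direction"
print(f"Reported LiDAR range: about {lidar_range_m:.0f} m in every direction")

DATA_RATE_TB_PER_HOUR = 1.0           # rate reported by Ford
assumed_shift_hours = 8               # hypothetical test shift length
print(f"Data per {assumed_shift_hours}-hour shift: ~{DATA_RATE_TB_PER_HOUR * assumed_shift_hours:.0f} TB")
```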

Ford wants to build self-driving cars without a steering wheel or pedals. These completely autonomous cars would work as taxis and could hit the road as soon as 2021.

Source: Forbes