On April 2, 2019, Ford’s Chief Technology Officer, Ken Washington, told Kara Swisher on the Recode Decode podcast that by 2021, we would begin to see cars with no one in the driver’s seat. Today, April 22, 2019, Elon Musk showed investors Tesla’s self-driving technology at Autonomy Investor Day, demonstrating that the debate over which technologies make autonomous cars safe remains unsettled.
“Self-driving car technology has made incredible advances over the last five years. Vastly improved vision technology combined with inputs from other sensors are getting us close to full autonomy,” said Bart Selman, professor of computer science at Cornell University. “However, we don’t yet know whether we can reach the level of safety of a human driver within the next three to five years.”
Selman says that in Tesla’s case, significant reliance on computer vision introduces an extra level of difficulty.
“Current computer vision systems can fail in quite unpredictable ways and having multiple sensors, ideally including LiDAR are [..] critical,” said Selman through a press statement from Cornell University. “The challenge remains of how to resolve possibly conflicting information of multiple sensors, as well as the question of how to gracefully handle unexpected situations without needing human input.”
Measuring the safety of fully autonomous cars still has some gray areas, but the California Department of Motor Vehicles has data from 2018 showing the number of miles driven by autonomous vehicles alongside the number of disengagements. The data puts Google’s Waymo in the lead, with the fewest times a human had to take over from the car.
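The metric behind the DMV report reduces to miles driven per disengagement: the more miles a program logs between human takeovers, the better it ranks. A minimal sketch of that ranking logic follows; the company names and figures below are hypothetical placeholders, not the actual 2018 DMV numbers.

```python
# Rank autonomous-vehicle programs by miles per disengagement.
# NOTE: all figures here are HYPOTHETICAL, not actual 2018 California DMV data.

reports = {
    "Company A": {"miles": 1_200_000, "disengagements": 120},
    "Company B": {"miles": 500_000, "disengagements": 400},
    "Company C": {"miles": 30_000, "disengagements": 900},
}

def miles_per_disengagement(report):
    """Higher is better: more miles driven between human takeovers."""
    return report["miles"] / report["disengagements"]

# Sort programs from fewest to most frequent human takeovers per mile.
ranking = sorted(reports.items(),
                 key=lambda kv: miles_per_disengagement(kv[1]),
                 reverse=True)

for name, report in ranking:
    print(f"{name}: {miles_per_disengagement(report):,.0f} miles per disengagement")
```

One caveat the gray areas above hint at: disengagement counts are self-reported and road conditions differ between programs, so this single number is a rough proxy for safety, not a definitive measure.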
“Alert human drivers are surprisingly good at interpreting unexpected events and generally can take the necessary preventive steps to avoid accidents. However, because current autonomous driving systems lack a broader understanding of their environment, it is difficult for those systems to take similar preventive measures,” added Selman. “The question [..] is whether we can develop systems that gracefully reduce risks when faced with unexpected events.”
Teaching cars to perceive as a human does is something that AEye takes seriously. The startup has raised $60 million from Kleiner Perkins, Taiwania Capital, Intel Capital, Airbus Ventures, LG, Hella and Subaru for its Intelligent Detection and Ranging (iDAR) system, which replaces current passive LiDAR sensors.
iDAR is the company’s perception system, which mimics the human brain’s ability to intelligently perceive its surroundings rather than be overwhelmed by them. The company leverages artificial intelligence (AI) to help self-driving cars think more like a robot, but understand like a human.
“While driving, a human pays attention to certain things, like a child walking in front of a car, while filtering out others, like a bird in the sky. And we are creating this same type of filtering or prioritization for autonomous vehicles [..],” said Blair LaCorte, President of AEye.
LaCorte says they started the company with a sheet of paper with one sentence written on it: ‘What’s the best way to deliver artificial perception for robotic and autonomous vehicles?’ The company based its answer not on a technology, but on a model for how to think about the solution: the human visual cortex.
“Humans process some 70% of the spatial and environmental data they receive in the visual cortex, rather than sending the data to the brain. The human visual cortex filters out in real-time, information that the brain doesn’t need to make effective decisions,” said LaCorte. “The complexity of the data being processed in the human visual cortex includes multiple complex dimensions – color, space, time, distance, vector, velocity. In effect, intelligence is pushed to the edge of the human perception network.”
LaCorte says the team then applied that concept to artificial perception, asking: ‘How can we mimic the human visual cortex with technology?’
“The human visual cortex analyzes spatial and temporal aspects of their situation simultaneously, constantly re-interrogating the environment, while identifying objects that are “of interest” or are potential threats,” adds LaCorte. “We attribute intelligence to processing power, but in reality, it starts with what is input into the system. Intelligence begins with how you collect data.”
“It’s standard practice in autonomous vehicle pilot projects to use sensors like LiDAR, cameras and radar to capture data about the environment through which they are moving. The problem is current sensors are like vacuums. They suck up every bit of data they can find (cars, people, trees, sky) – terabytes of data daily – without regard to its relevance or importance,” added LaCorte. “iDAR captures and processes environmental data – just as the human visual cortex does.”
The company uses the term agile LiDAR when talking about its iDAR system. Agile LiDAR allows iDAR to identify what the company calls regions of interest, or situational demands, within a driving scene, then deploy a dense array of pulses that can instantly capture large amounts of data about an object or area, including vector and velocity, allowing it to triage data in real time.
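The region-of-interest idea described above can be sketched as a simple pulse-budgeting step: score each region by priority, then concentrate a fixed per-frame pulse budget on the high-priority regions. This is an illustrative guess at the concept only, not AEye’s actual (proprietary) iDAR implementation; every name, label, and number below is made up.

```python
# Sketch of region-of-interest pulse budgeting, loosely modeled on the
# "agile LiDAR" idea of prioritizing a child in the road over a bird in
# the sky. Hypothetical example, not AEye's real iDAR algorithm.

from dataclasses import dataclass

@dataclass
class Region:
    label: str       # what the perception system thinks this region is
    priority: float  # 0.0 (safe to ignore) .. 1.0 (critical threat)

def allocate_pulses(regions, pulse_budget):
    """Split a fixed per-frame pulse budget across regions in proportion
    to priority, so critical regions get a denser array of pulses."""
    total = sum(r.priority for r in regions) or 1.0
    return {r.label: int(pulse_budget * r.priority / total) for r in regions}

scene = [
    Region("child crossing", 1.0),
    Region("parked car", 0.4),
    Region("bird in sky", 0.05),
]

alloc = allocate_pulses(scene, pulse_budget=10_000)
print(alloc)
```

Under this proportional scheme the child crossing absorbs most of the frame’s pulses while the bird receives only a token scan, which is the filtering-at-the-sensor behavior LaCorte describes.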
“We believe that AEye gives perception engineers their first tool kit to actively explore how to best interrogate different scenes a car encounters as it drives,” said LaCorte.