Driverless cars still need a human touch: People interested in cars who hear industry executives discussing the self-driving technology being built into their vehicles might be forgiven for thinking that robotic cars will soon drive themselves out of showrooms.
At a news media event on January 7 at the company's research laboratory in Silicon Valley, Carlos Ghosn, the chairman and chief executive of the Renault-Nissan Alliance, announced that Nissan would introduce 10 new autonomous vehicles in the next four years.
In a conference call with reporters last week, Elon Musk, the chief executive of Tesla, asserted that the so-called Autopilot feature introduced in the Tesla Model S last autumn was probably better than a human driver right now.
What Ghosn and Musk are describing, cross-country driving hyperbole aside, are cars with smart capabilities that can help drive, or even take over, in tricky situations.
There is little doubt that autonomous cars that do all the work are still at least a decade away from carrying people around a city, like the bubble-shaped vehicles Google has been testing near its Silicon Valley campus.
But today's self-driving cars will still require human supervision on occasion: when they encounter complex driving situations or emergencies, they will in effect tell their human drivers, "Here, you take the wheel."
In the automotive industry, this is often referred to as the handoff problem, and automotive engineers know that there is no easy solution. Designers have not yet found a way to make a driver who may be distracted by texting, reading email or watching a movie perk up and retake control of the car in the fraction of a second required in an emergency.
The danger is that by inducing human drivers to pay even less attention to driving, the safety technology may be creating new hazards.
“The whole issue of interacting with people inside and outside the car exposes real issues in artificial intelligence,” said John Leonard, a professor of mechanical engineering at the Massachusetts Institute of Technology. “The ability to know if the driver is ready and are you giving them enough notice to hand off is a really tricky question.”
The limitations of Autopilot, which Tesla describes as offering the ability to “automatically steer down the highway, change lanes and adjust speed in response to traffic,” were clearly visible in a recent test drive with Sebastian Thrun, a roboticist and artificial-intelligence expert who led the Stanford University team that won the Pentagon’s autonomous vehicle Grand Challenge in 2005 and later founded Google’s self-driving effort.
Although Thrun left Google several years ago, he is still involved in the field of artificial intelligence. He describes himself as an enthusiastic Tesla owner. On a recent test drive, he catalogued a series of the car’s limitations and errors, including those he described as “critical interventions” in which the driver is required to override the car’s behaviour.
The Tesla Autopilot system allows drivers to take their hands off the wheel, but it prompts them to regain control after a certain period. It will also warn a driver to retake control in certain situations.
The Tesla performed well in freeway driving, and the company recently fixed a bug that had caused the car to unexpectedly veer off onto freeway exits. However, on city streets and country roads, Autopilot’s performance could be described as hair-raising. The car, which uses only a camera to track the roadway by identifying lane markers, did not follow the curves smoothly. It also did not slow down when approaching turns.
On a recent 220-mile drive to Lake Tahoe from Palo Alto, California, Thrun said he was forced to intervene more than a dozen times.
The company said that on January 9 it introduced a new version of the Autopilot software that offered both restrictions and improvements in handling.
Like the Tesla, the new autonomous Nissan models will still require human oversight and will not drive autonomously in all conditions. Nissan’s engineers acknowledged that even their most advanced models would not be autonomous in every situation, including snow, heavy rain and even some kinds of nighttime driving.
“There are certain limitations depending on the condition of the weather. For example, if you are in heavy snow or rain, it is impossible to have autonomous driving,” said Tsuyoshi Yamaguchi, Renault-Nissan’s executive vice president for technology development. “We should make sure the vehicle recognises it and gives a caution to the driver.”
The situation is further complicated by laws and regulations that require cars to be controlled by humans. In Europe, the 1968 Vienna Convention requires that "every moving vehicle or combination of vehicles shall have a driver," and that "every driver shall at all times be able to control his vehicle." Although an amendment has been proposed, no new legislation is yet in place.
In the United States, Google began an extensive lobbying campaign with individual states in 2011. But the company acknowledged a setback this year when the California Department of Motor Vehicles issued draft regulations that required a human driver capable of controlling the car to be in the vehicle.
None of that has discouraged some enthusiastic Tesla owners. Doug Carmean, a Microsoft computer designer who commutes daily between Seattle and Redmond, Washington, said he had encountered the Tesla Autopilot off-ramp bug and found it “scary.”
Yet as much as 45 minutes of his commute each day is in slow, stop-and-go traffic, and his car will effortlessly and predictably follow the car ahead, permitting him to surf the Web on the Tesla’s giant display.
“Even though it has these frightening movements, I’ve come to enjoy it,” he said. “It’s a sense of awe and pleasure.”