Are Autopilot Systems In Motor Vehicles Ready For Prime Time?

The marketing of self-driving features may have gotten ahead of the technology

The recent death in Florida of a man driving a Tesla Model S in autopilot mode has raised concerns about how it will affect public perception of self-driving vehicles. The first problem with that framing is that the Tesla is not really a self-driving vehicle. Second, rather than revealing flaws in the concept of autonomous vehicles, the crash may point more to a problem of implementation.

Driving: more difficult than it looks

Many people believe themselves to be good drivers. If they have driven for some length of time, say 10 or 15 years, without an accident, they may take that as proof that they are “safe” drivers. But an accident-free record suggests only that they probably do not engage in risky driving very often; it is not evidence of much more than that.

Most drivers experience moments of inattention. Distracted driving has gained much notoriety recently with the advent of the small computers that also make phone calls. That technology lets drivers engage in a wide variety of distracting behaviors, from reading and writing texts to watching videos, updating Facebook pages and even live-streaming themselves driving drunk.

People have long engaged in ill-conceived activities behind the wheel, such as reading newspapers, shaving, applying makeup and eating. Inattention is a contributing factor in a large percentage of motor vehicle crashes. Cellphones merely provide a convenient and ubiquitous distraction, since most people carry a phone with them at all times.

Other forms of impairment, such as drunk driving, remain a serious concern. About 10,000 people die every year in crashes where alcohol was a factor. Drowsy driving is another problem.

Fatigue can leave a driver as impaired as one who is legally drunk, and our society’s demand for constant availability leads many to push themselves beyond safe limits. Truck drivers, most of whom are paid only when they are in their trucks driving, have even greater incentives to push their own and other motorists’ luck.

What should current technology do?

The problem with the Tesla autopilot, it seems, is that the technology in these vehicles is not yet robust enough to take over full operational responsibility from a human driver. In this case, the sensors failed to recognize a white trailer against a bright sky, and the car drove right into the truck as it crossed the Tesla’s lane.

As Google and other companies working to develop autonomous vehicles have found, driving is a very complex task. Keeping track of road signs and traffic signals, other vehicles, parked cars, trees, shrubs and pedestrians in variable lighting and weather conditions is a demanding activity.

Teaching a vehicle what it needs to pay attention to and what it can ignore is not simple. It requires sophisticated sensors and equally sophisticated processing of sensor data by onboard computers. The sensors need to distinguish a very wide variety of oddly shaped and sized objects, determine their movement relative to the vehicle and correctly adjust the vehicle’s speed and direction. Eventually, vehicles will communicate with each other and with roadside traffic signals and other infrastructure, giving these systems greater awareness of the complete driving environment.

The state of autonomous vehicles

We are not there yet. While Tesla notes its vehicles have driven 130 million miles safely using autopilot, Americans drive more than 3 trillion miles annually. Current vehicles use sensors to keep the car in its lane and provide automatic braking, but beyond that, as the recent crash demonstrates, they have significant limitations.

As more vehicles carry these systems, more robust and sophisticated data sets will be amassed, allowing constant improvement of their abilities. But that will take years, and many more vehicles will need to gather data.

The future?

Many very large, wealthy companies are spending billions on these systems. The federal government is working to develop regulations that will foster this type of innovation. And in the trucking industry, there are significant economic gains to be realized if autonomous trucks can be deployed in the near future.

Nevertheless, until sensor technology is better developed and more capable of dealing with challenges like a white truck against a bright sky, or heavy rain, snow, fog or ice, the more appropriate use of these systems is as a backup for the human driver, helping to keep that person fully engaged in the task of driving. Tesla appears to have recognized this: after a near-accident in China, it is now calling its features a “driver-assist system.”

These electronic systems could then do everything from monitoring the driver’s lane keeping to applying the brakes in emergency situations. Rather than selling these systems to consumers as an “autopilot,” manufacturers should market them as a never-sleeping, never-fatigued, never-impaired copilot, ready to apply instant, if limited, safety measures to help the human driver avoid or minimize a potential crash.