In machines we trust

The other night we saw demonstrations on television of driverless cars. The first demonstration showed a seemingly normal sedan with passengers – but no driver. The car’s steering wheel turned by itself and it accelerated and stopped without human input.

The second demonstration featured something called a “Google Car.” The vehicle looked like a bubble with two people sitting in it. There was no steering wheel or other controls. Apparently, there was no way to override what the car’s computer was telling it to do.

It appears a lot of folks have a great deal of confidence in computers.

We are big fans of computer-aided safety devices in automobiles, but we’re not sure we’re ready to turn the driving over to them completely. A couple of recent incidents reinforced those fears:

On Tuesday, the National Transportation Safety Board blamed pilots for last year's crash of an Asiana Airlines flight at San Francisco International Airport. The report indicated the pilots were confused by the plane's autothrottle system and waited too long to override it and land the plane safely.

Meanwhile, the Washington Post has been doing a series this week on drone crashes and near misses. There have been 47 military drone crashes on U.S. soil since 2001. Civilian drones have crashed 23 times since 2009.

And the Federal Aviation Administration reports 15 incidents in the last two years in which drones nearly collided with commercial aircraft or flew too close to airports.

All of this comes at a time when Congress is pushing the FAA to draft rules that will allow more civilian use of drones.

At the risk of sounding like modern-day Luddites, we'd urge caution in turning over too much control of our lives to machines. We don't want to be riding in a steering-wheel-less car when its computer develops the blue screen of death.

Likewise, a sky packed with drones seems like a recipe for midair collisions.

* Editorials reflect the opinion of the publisher.