Would you challenge a bus over your right-of-way?
Dennis Wyatt

As accidents go, it wasn't much of a fender bender.

A Santa Clara Valley Transportation Authority bus was moving forward at 15 mph.

In the adjacent right lane, a Lexus SUV was getting ready to turn right. Sandbags around a storm drain appeared to be an obstacle, so the car hugged the left side of its lane and struck the right front of the bus.

This isn't the type of accident that would make national news. Unless, of course, the Lexus was a self-driving car being tested by Google.

If you aren't a major critic of driverless cars, such an incident probably doesn't worry you much.

Google said its computers reviewed the accident. That review led engineers to modify the software controlling the car to understand that buses may not be as inclined to yield as other vehicles.

No kidding, Sherlock.

The fact that engineers quickly adjusted the software may make you comfortable with the entire notion of self-driving cars, given that they are a work in progress.

But here's the rub. The engineers are adjusting the programming so the self-driving car doesn't operate in absolutes when presented with an issue but instead chooses among a variety of options. In other words, they are injecting unpredictability into how a self-driving car reacts to a situation.

While Google never said its vehicles would be flawless, it did imply they would take the human component, drivers doing the unpredictable, out of driving. That might work if every car were self-driving and every road obstacle and condition fit into a nice, 100 percent predictable pattern.

The incident poses a serious question that Google and others have never given much thought to: Are humans better drivers than machines programmed by specific humans, simply because they are human?

The odds are most drivers would never try to assert right-of-way in questionable situations when going up against a bus. But what about the person who is distracted, high on some substance, sleepy, or having a heart attack? Wouldn't self-driving vehicles save us - and themselves - from them?

A better answer would be to create vehicles that can detect all of those issues and either aggressively nag the driver to pay attention or refuse to start if they sense alcohol consumption or drowsiness. Coming up with a device to address such concerns is probably easier than perfecting self-driving cars, and it ultimately would be much more effective.

In 2014, the Insurance Institute for Highway Safety noted, there were 32,675 traffic deaths in the United States, or 1.08 for every 100 million miles driven.

While even one death is a tragedy, are self-driving cars the right technology to address the concern? Automobile deaths peaked in 1972 at 54,589, or 4.33 for every 100 million miles driven. Since 1921, when there were 13,253 traffic deaths in this country and the rate was 24.09 for every 100 million vehicle miles driven, the death rate has dropped virtually every year to the point that it is now about 1/22nd of what it once was.

While improved safety features and crumple zones obviously have played key roles in reducing the carnage, there is something else at play here - the human mind.

We drove a collective 2.946 trillion miles in 2014. A vehicle death occurred roughly every 90 million miles.

That's the equivalent of more than 15,000 trips on Interstate 80 back and forth between San Francisco and New York before there's an accident that kills somebody. It would take you driving non-stop at 65 mph around the clock, 24/7, roughly 158 years to rack up that kind of mileage.

The question no one at Google explored is how effective the human mind is at avoiding accidents compared with a computer, since answering it would not produce a product to pad Alphabet's bottom line. Keep in mind that Google cars, with limited road time in and around Mountain View since the spring of 2014, have been involved in about a dozen collisions. In most instances the Google vehicles were rear-ended. In 44 years of driving I've been involved in four accidents, and the only one that was my fault was backing into a vehicle in a parking lot.

Think about the number of near misses you have or see during the course of a day driving or walking around Ceres. Between red-light running, failure to yield, tailgating, zombie pedestrians, motorists yakking on their cell phones at 35 mph, motorists cutting each other off and other bonehead moves, a typical person probably sees 10 times that many near-accidents during the course of a day, and that's just one person.

The human brain is pretty darn effective at avoiding collisions, provided motorists are not distracted or impaired. Computer-controlled airplanes at 20,000 feet or trains on tracks are one thing, given the limited variables they encounter; a car is an entirely different story.

In a world where everyone is in a driverless car and street conditions are as predictable as a video game, I'd have 100 percent faith in self-driving vehicles. But until reality actually mirrors the virtual world of Silicon Valley, our chances of avoiding carnage on the road are better with attentive and unimpaired human drivers.

This column is the opinion of Dennis Wyatt and does not necessarily represent the opinion of Morris Newspaper Corp. of CA.