The Long, Slow March Towards AI Autonomous Cars


Imagine driving your car and coming upon an animal that has suddenly darted into the roadway. Most of us have had this happen. You hopefully were able to take evasive action. Assuming that all went well, the animal was fine, and nobody in your car got hurt either.

In a kind of Predestination movie manner, let’s repeat the scenario, but we will make small changes. Are you ready?

Imagine driving your car and coming upon a particular animal, say a deer, that has suddenly darted into the roadway. Fewer of us have had this happen, though nonetheless it is a somewhat common occurrence for those who live in a region that has deer aplenty.


Would you perform the same evasive actions when coming upon a deer as you would for the unspecified animal that was in the middle of the road in the first scenario?

Let’s iterate the scenario once again and make another change: this time, the animal darting into the roadway is a chicken.

For some drivers, a chicken is a whole different matter than a deer or a cat. If you were going fast while in the car and there wasn’t much latitude to readily avoid the chicken, it is conceivable that you would go ahead and ram the chicken. We generally accept the likelihood of having chicken as part of our meals, thus one less chicken is ostensibly okay, especially in comparison to the risk of possibly rolling your car or veering into a ditch upon sudden braking.

Essentially, you might be more risk-prone if the animal was a deer or a cat and be willing to put yourself at greater risk to save the deer or the cat. But when the situation involves a chicken, you might decide that the personal risk versus the harming of the intruding creature is differently balanced. Of course, some would argue that the chicken, the deer, and the cat are all equal and drivers should not try to split hairs by saying that one animal is more precious than the other.

Let’s make another change. Without having said so, it was likely that you assumed that the weather for these scenarios of the animal crossing into the street was relatively neutral. Perhaps it was a sunny day and the road conditions were rather plain or uneventful.


Adverse Weather Prompts Another Variation

Change that assumption about the conditions and imagine that there have been gobs of rain, and you are in the midst of a heavy downpour. Your windshield wiper blades can barely keep up with the sheets of water, and you are straining mightily to see the road ahead. The roadway is completely soaked and extremely slick.

Do your driving choices alter now that the weather is adverse?

Whereas you might have earlier opted to radically steer around the animal, any such maneuver now, while in the rain, is a lot more unpredictable. The tires might not stick to the roadway due to the coating of water. Your visibility is reduced, and you might not be able to properly judge where the animal is, or what else might be near the street. All in all, the bad weather makes this an even worse situation.

How many such twists and turns can we assume?

We can continue to add or adjust the elements, doing so over and over. Each new instance becomes its own particular consideration. The combination and permutations can be dizzying.
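The dizzying growth of those combinations is easy to quantify. As a minimal sketch (the scenario dimensions and values below are purely illustrative, not an actual taxonomy used by any automaker), even five modest dimensions multiply into hundreds of distinct situations:

```python
from itertools import product

# Hypothetical scenario dimensions; these names and values are
# illustrative only, not any automaker's actual taxonomy.
animals = ["cat", "deer", "chicken", "dog", "turtle"]
weather = ["sunny", "rain", "fog", "snow", "night"]
road_surface = ["dry", "wet", "icy", "gravel"]
speed = ["25 mph", "45 mph", "65 mph"]
traffic = ["none", "light", "heavy"]

# Every combination of the dimensions is its own driving situation.
scenarios = list(product(animals, weather, road_surface, speed, traffic))
print(len(scenarios))  # 5 * 5 * 4 * 3 * 3 = 900
```

Add a few more dimensions, or a few more values per dimension, and the count runs into the millions.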

A newbie teenage driver is often taken aback by the variability of driving. They encounter a situation that they have not faced before and go into a bit of momentary panic. Experienced drivers have seen more and can therefore react as needed, although even that vast store of knowledge about driving situations has its limits.

These examples bring up a debate about the so-called corner cases that can occur when driving a car. A corner case refers to an instance of something considered rare or unusual; these are events that tend to happen only infrequently.

A cat or deer or chicken that wanders into the roadway would be less likely construed as a corner case; it would be a more common, or core, experience. A corner case is extraordinary, while a core case is somewhat commonplace.

Getting back to driving a car, the cat or even a deer that ran into the street proffers a driving incident or event that we probably would agree is somewhere in the core of driving.

In terms of a chicken entering into the roadway, well, unless you live near a farm, this would seem a bit more extreme. On a daily drive in a typical city setting, you probably will not see many chickens charging into the street.

So how will self-driving cars handle corner cases?

Self-driving cars are driven via an AI driving system. There isn’t a need for a human driver at the wheel, nor is there a provision for a human to drive the vehicle.

Some contend that we will never attain true self-driving because of certain corner cases. The argument is that zillions of corner cases will continually arise unexpectedly, and the AI driving system won’t be prepared to handle those instances. This in turn means that self-driving cars will be ill-prepared to adequately perform on our public roadways.

Furthermore, they assert that no matter how tenaciously those heads-down all-out AI developers keep trying to program the AI driving systems, they will always fall short of the mark. There will be yet another corner case to be had. It is like a game of whack-a-mole, wherein another mole will pop up.

The thing is, this is not simply a game, it is a life-or-death matter since whatever a driver does at the wheel of a car can spell life or possibly death for the driver, and the passengers, and for drivers of nearby cars, and pedestrians, etc.

Understanding The Levels Of Self-Driving Cars

As a clarification, true self-driving cars are ones where the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.

These driverless vehicles are considered Level 4 and Level 5, while a car requiring a human driver to co-share the driving effort is usually considered Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).
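The level distinctions can be captured in a few lines of code. This sketch paraphrases the SAE J3016 driving-automation levels the article references; the helper function simply encodes the article's own definition that only Levels 4 and 5 count as true self-driving:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving-automation levels, paraphrased."""
    NO_AUTOMATION = 0           # human performs all driving
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance (ADAS)
    PARTIAL_AUTOMATION = 2      # steering AND speed assistance; human supervises
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # system drives within a limited operational domain
    FULL_AUTOMATION = 5         # system drives anywhere a human could

def is_true_self_driving(level: SAELevel) -> bool:
    # Per the article's definition: Levels 4 and 5 are driverless,
    # while Levels 2 and 3 co-share the task with a human driver.
    return level >= SAELevel.HIGH_AUTOMATION
```

So `is_true_self_driving(SAELevel.CONDITIONAL_AUTOMATION)` is false: a Level 3 car still requires an attentive human at the wheel.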

There is not yet a true self-driving car at Level 5; we don’t yet know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed.

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different from driving conventional vehicles.

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that has been arising lately: despite human drivers posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, nobody should be misled into believing that a driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3 car.

Self-Driving Cars And The Long, Slow March

For Level 4 and Level 5 true self-driving vehicles, there won’t be a human driver involved in the driving task. All occupants will be passengers; the AI is doing the driving.

One aspect to immediately discuss entails the fact that the AI involved in today’s AI driving systems is not conscious. In other words, the AI is altogether a collective of computer-based programming and algorithms, and most assuredly not able to reason in the same manner that humans can.

Why this added emphasis about the AI not being conscious?

Because I want to underscore that when discussing the role of the AI driving system, I am not ascribing human qualities to the AI. Please be aware that there is an ongoing and dangerous tendency these days to anthropomorphize AI. In essence, people are assigning human-like sentience to today’s AI, despite the undeniable and inarguable fact that no such AI exists as yet.

With that clarification, you can envision that the AI driving system won’t natively somehow “know” about the facets of driving. Driving and all that it entails will need to be programmed as part of the hardware and software of the self-driving car.

Let’s dive into the myriad of aspects that come to play on this topic.

If you were to try and program an AI driving system based on each possible instance, this indeed would be a laborious task. Even if you added a veritable herd of ace AI software developers, you can certainly expect this would take years upon years to undertake, likely many decades or perhaps centuries, and still be faced with the fact that there is one more unaccounted edge or corner case remaining.

Most automakers and self-driving tech firms use computer-based simulations to try and ferret out driving situations and get their AI driving systems ready for whatever might arise. Some believe that if enough simulations are run, the totality of whatever will occur in the real world will have already been surfaced and dealt with before entering self-driving cars into the real world.

The other side of that coin is the contention that simulations are based on what humans believe might occur. As such, the real world can be surprising in comparison to what humans might normally envision will occur. Those computer-based simulations will always then fall short and not end up covering all the possibilities, say those critics.
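The critics' point can be made concrete with a toy sketch. In this hypothetical setup (the parameter lists and the "emu" event are illustrative, not drawn from any real simulator), every simulated scenario is sampled from values that humans thought to enumerate, so a real-world surprise outside those lists can never be generated, no matter how many runs are performed:

```python
import random

random.seed(42)

# Scenario parameters the simulation designers thought to include
# (hypothetical lists, for illustration only).
envisioned = {
    "obstacle": ["cat", "deer", "pedestrian", "ball"],
    "weather": ["sunny", "rain", "fog"],
}

def sample_scenario() -> dict:
    # Each simulated scenario draws only from human-envisioned values.
    return {key: random.choice(values) for key, values in envisioned.items()}

# A real-world surprise that nobody listed beforehand.
real_world_event = {"obstacle": "escaped emu", "weather": "dust storm"}

simulated = [sample_scenario() for _ in range(10_000)]
print(real_world_event in simulated)  # False, by construction
```

Ten thousand runs, a million runs, it makes no difference: the unlisted event stays uncovered. That is the essence of the critics' contention.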

Make no mistake, simulations are essential and a crucial tool in the pursuit of AI-based true self-driving cars.

An allied topic entails the use of closed tracks that are purposely set up for the testing of self-driving cars. By being off the public roadways, a proving ground ensures that the public at large is not endangered by whatever waywardness might emerge during driverless testing. The tradeoffs of the closed-track or proving-grounds approach are similar to those mentioned when discussing the use of simulations.

This has taken us full circle and returned us back to the angst over an endless supply of corner cases. It has also brought us squarely back to the dilemma of what constitutes a corner case in the context of driving a car.

This squishiness has another undesirable effect.

Whenever a self-driving car does something amiss, it is easy to excuse the matter by claiming that the act was merely a corner case. This disarms anyone expressing concern about the misdeed. Here’s how that goes: the contention is that any such concern or finger-pointing is misplaced since a corner case is only a corner case, implying a low-priority and less weighty aspect.

Conclusion

There are a lot more twists and turns on this topic.

One perspective is that it makes little sense to try and enumerate all the possible corner cases. Presumably, human drivers do not know all the possibilities and despite this lack of awareness can drive a car and do so safely the preponderance of the time.

You sit at the steering wheel with those macroscopic mental templates and invoke them when a specific instance arises, even if the specifics are somewhat surprising or unexpected. If you’ve dealt with a cat that was loose in the street, you likely have formed a template for when nearly any kind of animal is loose in the street, including deer, chickens, turtles, and so on. You don’t need to prepare beforehand for every animal on the planet.
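One way to picture that template approach in code is a generalization lookup: rather than enumerating every possible animal in advance, any living obstacle maps onto a single generic template that already carries a prepared response. This is a minimal sketch under that assumption; the template names and responses are invented for illustration:

```python
# Hypothetical "mental template" table: a handful of broad situations,
# each with an already-prepared response sequence.
TEMPLATES = {
    "animal_in_road": ["brake", "slow_down", "steer_around_if_safe"],
    "debris_in_road": ["slow_down", "steer_around_if_safe"],
}

def respond(obstacle: str, is_living: bool) -> list:
    # A turtle, an emu, or a moose all land in the same generic template,
    # even if that particular species was never anticipated beforehand.
    template = "animal_in_road" if is_living else "debris_in_road"
    return TEMPLATES[template]

print(respond("emu", is_living=True))   # same response as for a deer
print(respond("tire", is_living=False))  # the debris template instead
```

The point of the sketch is that the table stays tiny even as the set of possible obstacles stays unbounded; the generalization step does the work that per-species enumeration cannot.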

The developers of AI driving systems can presumably try to leverage a similar approach.

One viewpoint is that humans fill in the gaps of what they might know by exploiting their capability of performing common-sense reasoning. This acts as the always-ready contender for coping with unexpected circumstances. Today’s AI efforts have not yet been able to crack open how common-sense reasoning seems to occur, and thus we cannot, for now, rely upon this presumed essential backstop.

Doomsayers would insist that self-driving cars will not be successfully readied for public roadway use until all corner cases have been conquered. In that vein, that future nirvana can be construed as the day and moment when we have covered all the bases that furtively reside in the long tail of autonomous driving.
