Self-driving cars will kill people and we need to accept that

Recently, headlines have been full of speculation about what we should do about the risks of self-driving vehicles. After one of its self-driving vehicles caused a fatal crash, Uber temporarily paused all autonomous vehicle testing in Arizona. In the wake of the accident, Arizona Governor Douglas Ducey reiterated that public safety is his top priority and described the Uber crash as an “unquestionable failure” to preserve it.

Tesla also confirmed that a recent highway crash, which killed the driver of the vehicle, happened while its Autopilot system (a semi-autonomous driver-assist feature) was controlling the car. This is the second fatal accident in which Autopilot was at least partially at fault.

To many consumers, these incidents confirm something they suspected all along: trusting an AI system to handle driving is a mistake, and one that’s destined to kill people. Self-driving cars, they conclude, need to be heavily regulated and scrutinized, and potentially delayed indefinitely, until we can be sure they’ll bring no harm to their drivers and passengers.

This view is inherently flawed. It’s not a good thing that self-driving cars have killed people, but testing them in real-world situations is necessary if we want to keep moving toward a safer, brighter future. And unless we want to jeopardize that future, we need to get over our fears.

Self-driving cars are going to kill people. Period.
First, we need to recognize that no matter what safeguards we put in place or how cautious we are with rolling out self-driving technology, autonomous vehicles are going to be involved in fatal collisions.

There are 325 million people in the United States and more than 260 million registered vehicles. Cars and pedestrians constantly interact in a world full of random variables, from unexpected traffic patterns to wild weather to objects suddenly falling onto the road. With a fleet of vehicles traveling millions of miles, it’s inevitable that some combination of conditions will make an accident unavoidable, no matter how advanced the driving algorithm is.
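To put rough numbers on that inevitability, here is a back-of-envelope sketch. It assumes the commonly cited ballpark of roughly 1.2 traffic deaths per 100 million vehicle-miles for human drivers; that rate and the fleet mileage below are illustrative assumptions, not measured data:

```python
# Back-of-envelope: even a fleet that is *safer* than human drivers will
# still be involved in fatal crashes, purely because of the mileage involved.
# Assumed baseline: ~1.2 fatalities per 100 million vehicle-miles, a commonly
# cited ballpark for U.S. human drivers.

HUMAN_RATE = 1.2 / 100_000_000  # fatalities per vehicle-mile (assumption)

def expected_fatalities(fleet_miles: float, relative_safety: float = 1.0) -> float:
    """Expected fatal crashes for a fleet driving fleet_miles miles.

    relative_safety < 1.0 means the fleet beats the human baseline
    (e.g., 0.5 = half the human fatality rate).
    """
    return fleet_miles * HUMAN_RATE * relative_safety

# A hypothetical fleet driving 10 billion miles a year, twice as safe as humans:
print(expected_fatalities(10_000_000_000, relative_safety=0.5))  # -> 60.0
```

Even a fleet twice as safe as human drivers still produces dozens of fatal crashes at that scale; “safer than humans” and “never kills anyone” are very different claims.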

No matter what, people are going to die at the “hands” of an autonomous vehicle. Read more...
 
As much as I have no desire to own one, I'd trust them over a large number of drivers on our roads!
 
Mixing machines and humans always creates accidents - of course, so does an environment that's nothing but humans. LOL

But the machines can't always anticipate what a human is going to do. There will always be exceptions.

That's not what I wanted to bring up, though. This whole "smart cars", "smart roadways (better sensors for the cars, etc.)", "self-driving cars" push is another mass-transit "solution" that's not going to go the last mile (or longer) to get a lot of people home. Just like buses, trains, etc., they aren't likely to reach our neighborhoods or the people living outside of metro areas, in my mind. Something I can drive will remain mandatory. So now we all have to co-exist.

And once humans have an idea what parameters the "smart" or "self-driving" cars are operating under, we'll start taking advantage of them. LOL How close do I have to get before the car swerves or brakes? Can you see the kids screwing with them? LOL

If the speed limit is 55, is it doing 55, or is it willing to go a bit over or under to help traffic flow? Cuz if it's gonna do 55 in a 55, it's going to be a problem in the traffic patterns I'm most commonly in. People will be passing it constantly, lane changes behind it and in front of it will be commonplace, and the human drivers will be frustrated.
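Just to make that question concrete, here's a toy sketch of the kind of speed policy one of these cars might use. The function name, the 5 mph tolerance, and the flow-matching idea are all made up for illustration; I have no idea what any actual vendor does:

```python
# Toy speed-target policy, purely illustrative.

def target_speed(limit_mph: float, flow_mph: float, tolerance_mph: float = 5.0) -> float:
    """Follow prevailing traffic, but never exceed the posted limit
    by more than tolerance_mph."""
    return min(flow_mph, limit_mph + tolerance_mph)

# Posted 55, traffic flowing at 62: a rigid car does 55 and gets swarmed;
# a tolerant one does 60 and blends in.
print(target_speed(55, 62))  # -> 60
```

Whether regulators would ever allow that tolerance to be anything but zero is the real question.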

Lotsa issues to be worked out on this stuff. It's early. But I don't really see this going mainstream too soon. Kinda like 3D TVs - remember those?
 
I agree and disagree. We still don’t have level 5 autonomous vehicles or the infrastructure to support them. Ultimately, level 5 cars will be safe, but there are still lots of glitches in the current generation of autonomous cars. I’m not sure the public should be exposed to tests unless an alert driver is behind the wheel at all times. A pedestrian being run down by an AI car is a travesty that should not have happened; it shows the limitations of current AI. I’m confident AI will get to the point where it’s much safer, but there’s no hurry until the technology, sensors, and infrastructure improve.

Here’s Why Level 5 Autonomous Cars May Still be a Decade Away
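For reference, here's my shorthand for the SAE J3016 automation levels the article is talking about (the one-line descriptions are my own paraphrase, not SAE's exact wording):

```python
# SAE J3016 driving-automation levels, paraphrased.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # one assist at a time (adaptive cruise OR lane keep)
    PARTIAL_AUTOMATION = 2      # steering + speed together, driver must supervise
    CONDITIONAL_AUTOMATION = 3  # car drives itself, driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a limited domain
    FULL_AUTOMATION = 5         # no driver needed anywhere, in any conditions
```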
 
I guess our cars are level 2 or 3?
 
Again, I don't believe this technology will be rolled to everyone's driveway.
 
It will if the government mandates it as a safety requirement in the distant future, just like airbags and ABS.

And think about how many people have been killed by airbags!
 
Then it's a very, very long way away. They won't be ripping up roads and putting in sensors, etc., out to a LOT of homes for a LONG, long time in the countryside.
 
The core ingredient is involvement. The drivers of these accident-causing cars are doing something else besides DRIVING. Tesla specifically says to keep your hands on the wheel. In the recent Utah accident, the driver had not touched the steering wheel for eighty seconds before the rear-ender; the car sped back up to 60 mph over the 3.5 seconds prior to impact; and the driver admitted that she had been on her cell phone, "looking for route options" (!?). She had been driving her Tesla this way for TWO YEARS! Something similar was going on with the California accident a week later, also involving a Tesla. This is not an "autopilot", it is a driver assist; and Tesla is remiss in labeling their driver assist "Autopilot".
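Some rough arithmetic on those Utah numbers (assuming a steady 60 mph, which is generous since the car was still accelerating for part of that time):

```python
# Rough distance arithmetic for the Utah crash figures quoted above.
# Assumes a constant 60 mph; the actual speed profile isn't public, so
# treat these as order-of-magnitude numbers.

def feet_traveled(speed_mph: float, seconds: float) -> float:
    return speed_mph * 5280 / 3600 * seconds  # mph -> ft/s, times duration

print(feet_traveled(60, 3.5))        # ~308 ft covered in the final 3.5 seconds
print(feet_traveled(60, 80) / 5280)  # ~1.3 miles of hands-off driving before impact
```

Over a mile of road with no hands on the wheel. That is not "driver assist" usage, that's abdication.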

Because of idiots on their cell phones, and because some (an increasing number) of them use their driver assists to free up their hands and eyes, I have rearranged the order of priority in "Merlin the Mad's Six Rules for Defensive Driving". I used to have "watch your mirrors often" as number six. Now it is number four, because we need to keep our maneuvering room as open as possible in case some maroon is closing quickly on us from the rear, head down, letting the car drive for them.

Merlin the Mad's Six Rules for Defensive Driving:

1. Always devote as much conscious attention to your driving as required. I would say at least half of your brain power at all times. Reduce multitasking.

2. Never follow too closely. Be generous in the allowance of space between your car and the vehicle you are following (see the sketch after the list for rough numbers).

3. Reduce speed to existing conditions. This means never use cruise control in slick or icy or restricted visibility conditions.

4. Check your mirrors often. The idiots behind you pose the greatest risk that you have zero control over.

5. Never perform sudden, unsignaled maneuvers. Always be where other drivers expect you to be.

6. Check for side traffic, and double check, before pulling out into traffic or changing lanes.
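For rule 2, the common "three-second rule" puts numbers on "generous". A quick sketch; the 3-second gap is the standard rule of thumb, not my invention, and it assumes dry pavement and an alert driver:

```python
# Minimum following gap under the common "three-second rule".

def following_gap_ft(speed_mph: float, gap_seconds: float = 3.0) -> float:
    """Feet of space to leave ahead at a given speed."""
    return speed_mph * 5280 / 3600 * gap_seconds  # mph -> ft/s, times gap

for mph in (30, 55, 75):
    print(f"{mph} mph -> {following_gap_ft(mph):.0f} ft")
# 30 mph -> 132 ft, 55 mph -> 242 ft, 75 mph -> 330 ft
```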
 

This is always an interesting discussion because everyone's expectation and definition of 'Self Driving' is so varied. We have a bit of self driving today in the form of LKA, Smart cruise, and Blind Spot monitoring. Cobble that all together, make it aggressive, and you have Tesla's autopilot. Throw in some more sensors, give it full start/stop, traffic sign and road sensing, obstruction avoidance and low speed controls, and you have Uber/GM/Google/Waymo's self driving cars. Give it a few years, bump up the tech to make it smarter, start selling it to people, and you'll have door-to-door capable cars.
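To make the "cobble it together" point concrete, here's a toy sketch of that composition. Every name and number in it is invented for illustration; real stacks are vastly more involved:

```python
# Toy composition of today's separate driver assists into one controller.
# Entirely hypothetical structure and gains.

from dataclasses import dataclass

@dataclass
class SensorFrame:
    lane_offset_ft: float      # distance off lane center
    lead_gap_ft: float         # distance to the car ahead
    blind_spot_occupied: bool

def lane_keep(frame: SensorFrame) -> float:
    """Steering correction: nudge back toward lane center."""
    return -0.1 * frame.lane_offset_ft

def smart_cruise(frame: SensorFrame, target_gap_ft: float = 200.0) -> float:
    """Throttle command: close up or back off to hold the gap."""
    return 0.01 * (frame.lead_gap_ft - target_gap_ft)

def lane_change_allowed(frame: SensorFrame) -> bool:
    """Blind-spot monitor as a simple veto."""
    return not frame.blind_spot_occupied

frame = SensorFrame(lane_offset_ft=1.5, lead_gap_ft=150.0, blind_spot_occupied=True)
print(lane_keep(frame), smart_cruise(frame), lane_change_allowed(frame))
# steer back toward center, ease off the throttle, and don't change lanes
```

Each piece exists today as a separate feature; the hard part is making the combination trustworthy enough that nobody has to supervise it.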

The problem is right now, it's not ready, and the middle ground between no automation and full automation is scary. If you're driving yourself (No automation), you know you have to pay attention and watch for problems. You know what you're doing, and can react accordingly. If someone else is driving (Full automation), you don't need to pay attention or react at all. So what happens when you put your 90 year old grandpa with a history of heart attacks and sleep apnea in the driver seat (Partial automation)? Even though you're the passenger now, you still have to be ready to react and take over if he drops dead.

That's pretty hard to do, surprisingly. The human brain is absolutely garbage at passively paying attention. Everyone has 'spaced out' during an important conversation, or a meeting, or a class, only to be suddenly snapped back to it when someone directly addresses them. The person conducting the meeting, the class, or the conversation has no such problem, because their brain is actively involved. Now pretend that was you spacing out while watching the self-driving car, and what snaps you back is a dumbass jumping into the road. You fumble. Is the car going to slow itself down? No? Gotta find the brake! Too late. You hit them.

Plenty of studies float around putting the average, undiverted, passive attention span of a person at ~5 minutes. That drops to ~1 minute if you're tired or bored (both things that happen on long trips). Conversely, active attention has no real limit outside of distraction or tiredness. The study I remember involved a boring freeway driving simulation in which the drivers had to slam on the brakes when they saw an obstacle. Half the drivers steered and controlled throttle and braking themselves; the other half had no controls except a brake. Up to about 5 minutes in, both groups scored near 100% obstacle avoidance. After 5 minutes, the full-control group still scored near 100%, but the brake-only group dropped hard, getting lower the longer it went on, down to about 20% at 15 minutes.
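Here's a toy model of that dropoff, with the breakpoints eyeballed from the study as I remember it (near-100% for the first ~5 minutes, sliding to ~20% by minute 15 for the brake-only group), so treat the exact shape as an illustration, not data:

```python
# Toy vigilance model: obstacle-avoidance success rate vs. minutes of
# monotonous driving. Numbers eyeballed from the study described above.

def passive_avoidance_rate(minutes: float) -> float:
    """Brake-only (passively monitoring) drivers."""
    if minutes <= 5:
        return 1.0
    if minutes >= 15:
        return 0.2
    return 1.0 - 0.8 * (minutes - 5) / 10  # linear slide from 100% to 20%

def active_avoidance_rate(minutes: float) -> float:
    """Full-control drivers stayed near 100% throughout."""
    return 1.0

for t in (3, 7, 10, 15):
    print(t, round(passive_avoidance_rate(t), 2), active_avoidance_rate(t))
# 3 1.0 1.0 / 7 0.84 1.0 / 10 0.6 1.0 / 15 0.2 1.0
```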

So why is that a bad thing? Let's go back to the Tesla examples. Here is how Tesla advertises their 'Autopilot' on their main page:
Full Self-Driving Hardware on All Cars
All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

Now you watch the videos. The driver in their promo videos has his hands off the wheel for 95% of the video. So you think: they're doing it, the videos show it, and they advertise that it's self-driving. You try it out, and it works! Yay! It's great. You try it again and again, babysitting it less and less each time. The car has got this. You keep an eye on it sometimes, but you've gotten over the distrust hump. Then one day, after 2 years, it fucks up and crashes. Should you have been paying attention? Yeah. But a false sense of security and the advertising have made you mostly oblivious to the danger. It's the human element compounded by high expectations and previously good luck.

So we go back to, "Just pay attention when the car is driving itself," and we see that it's basically impossible to expect any human to pay full passive attention for an extended period of time. So it's a catch-22. If you're partially driving, you're relying on luck that you won't need to intervene. If you're driving yourself completely, any kind of drive assist is totally pointless. And we don't have 100% 'no driving' yet.

So what do I think about the future of self-driving cars? What I hope we don't see is a flood of half measures that are advertised as being better than they are (*cough* Tesla), that lead to passive driving, inattention, and accidents by giving their users a false sense of security. Tesla is doing it so wrong. Their tech is neat, I love their cars, but they're not doing anything about the false expectations their customers have about their Autopilot technology. They've never once come out and said, "Well, yeah. Autopilot is just glorified lane keep assist."

I'm more looking at GM, Waymo or Google (Uber is a shit company with shit safety standards that was bound to screw it up out of incompetence), and Waymo is in the lead. I think we will have 100% self driving in the future, with full NHTSA certification and fail safe modes, and that technology will be available in consumer cars not long after, and it will eventually become a standard option.

I fully believe that my daughter (Who is 3 years old right now) will have the option of purchasing a level 4 or 5 self driving car as her first car. (Year 2030ish) Whether or not that involves dropping 100 grand on a high spec halo car, who knows, but I think it will be an option. And I think her kids will be able to get a base model Toyota with level 5 self driving (2055ish).

Now, this is all like 40 years in the future, so who knows, right? I just don't think there's any reason for it to be impossible. Computer tech moves at lightspeed compared to automotive tech, and the self driving ball is in the computer tech court right now.
 
A mix of auto and non-auto personal vehicles is of course a bad idea and will never work in my lifetime (I figure to be alive another 40 years, give or take). Well, unless somehow I get hit by an auto vehicle... then is it the give or the take? In any event, bad idea. :coffee:
 

The goal is that you won't be able to tell. It won't be like getting stuck behind a car on rails, locked at one speed and unyielding. It'll speed up, slow down, follow traffic and construction, move over, and maybe one day flip you off, just like any other car on the road.
 
Flip me off?! So you're saying auto road rage will be a thing in the future?!

Seriously though, it will happen many moons down the road... I just hope it's nowhere near full-on live action while I'm alive. Way too many possibilities with mixed use. And can you imagine all the laws that would be needed? Hmmmm... instead of ambulance chasers, it'll be auto-vehicle chasers!
 