Discussion in 'The Pub' started by Matey, Jan 5, 2020.
Hi. Would like to know people's thoughts on driverless cars. I'm dreading them.
Not really the correct place for this conversation I guess.
Should be in the sub-forum "The Pub".
Sure as hell don't know what this has to do with 20+ year old Commodores?
Coz the way some people drive them they may as well be driverless?
I thought it was maybe because there were lots of driverless VR/S in wreckers yards?
Lol. It's because we're motor enthusiasts who love our cars & driving. Believe me, they are coming sooner than you think, and human driving will be banned.
I'm driving my wife's Santa Fe and it feels like driverless cars are just a few steps away. Adaptive cruise control, lane assist, auto braking: it's almost there. But I doubt that here in Australia, particularly in the West, cars could ever be totally driverless. The computer assists require perfect road conditions; as soon as you drive on unmarked roads, or with other idiot drivers, the assists don't work. So unless the government can design perfect roads and have them maintained 100 per cent of the time, I doubt driverless cars will ever happen.
One thing I'm curious about is if you have an accident with a driverless car, are you automatically at fault, as it will be assumed that the other car was programmed to do nothing wrong? If the other car is found to be at fault, then who is responsible? The owner could say the manufacturer should be responsible for damages, seeing as they programmed it. I see many court cases coming up if we go down the path of driverless cars, especially when someone dies.
I’d like to see how an autonomous car changes a flat tyre
An even more interesting question is how you program it in the first place.
Say a child runs out in front of the car. Do you program the car to swerve into oncoming traffic, potentially harming multiple other people? As humans we have an instinctive reaction to moral decisions (good or bad!) which is hard to program for.
At least driverless cars should be better than some of the zombies currently allowed behind the wheel
A pressure monitoring sensor in the tyre will instruct the vehicle to safely stop in a stopping bay, send a distress signal to that place in Canberra which monitors EPIRBs, who will then notify NRMA/RAA/RAC/RACQ, who will then send a mechanic to rescue the passengers locked in the vehicle for their own safety, and replace the tyre.
Uses run flats, proceeds to nearest tyre shop and toots horn until attendant responds.
It’s important to remember that Google’s idea of a driverless car is one without a steering wheel or control pedals, which is rather different to the somewhat basic driver assist tech we have today. As such, legislation needs to be amended to cater for Google’s view of driverless cars without human controls. So, it’s being discussed in many countries in preparation for legislation changes to accommodate true driverless cars, but it’s something that will take a rather long time to sort out. The current tech is seen as assist tech, which still makes the ‘driver’ responsible.
Regardless, I won’t be an early adopter or alpha tester of any of that stuff
Such moral/ethical questions are already being asked; should driverless cars avoid a fatal crash into a group of young school kids running across a road by veering into a group of old decrepit pensioners?
Doesn't matter, I wasn't driving.
I can recall hearing someone in the industry discussing this very thing a while back, and from memory they were saying that part of the vehicle programming is to determine which party is better off being sacrificed. In other words, the people programming the cars are making decisions like: if someone runs in front of the vehicle and there would be more victims from swerving and hitting another vehicle, then it would hit the person running in front; if the car was going to have a head-on with a bus, then the car would veer off the road and sacrifice the occupants of the car. Basically the cars are being programmed when to kill, to put it bluntly. Sounds bad, and I don't like the idea of a stranger deciding when to sacrifice a family member. This is why I see some pretty interesting stuff happening legally when something happens.
Google scholarly articles on ethical decisions by driverless cars,
or watch the TED Talks video below.
Not sure what the end result was, but from memory a person was killed or injured in the US by a driverless car about 2 years ago... There was a person in the passenger seat at the time too, supervising the testing.
There was a cyclist who was crossing the road at night and was struck and killed by a driverless car some years ago. Is that the one you’re thinking of?
In that case, the death was not in any way related to an algorithm deciding to avoid another more serious crash and sacrificing the cyclist as the best outcome. Also, the human driver of the driverless car was still legally responsible for the crash, but it was determined that the cyclist wasn’t able to be seen in time, so no further action was taken.
Currently, I haven’t heard of any jurisdictions that have specific legislation for driverless cars where blame is assigned to the manufacturer. Driverless cars in all cases to date require a human who is considered in charge and liable for all accidents (as it’s still something being trialled and not for general consumption, so the public are alpha testers in one sense).
Once true driverless cars are available and an accident occurs (which it will), for example with a human (meat bag) driver, apportioning blame will be rather simple, as all the telemetry, video, vehicle position, etc. will be logged and available to the investigators. In almost all situations, it will be the erratic human meat bag at fault.
When these things are mainstream we won't recognise the world that they will create. In some very real ways I'm not looking forward to them, but in others I think they will be amazing.