Move over, SawStop ...

The train schedule, labor contract, and key access process were not available at the time of my posting. Sorry.

Reply to
DerbyDad03

Thinking along the lines of: if I were the programmer for the code, I would have to conclude there is insufficient info and let what happens happen until such time as there is more info.

Reply to
OFWW

I reconsidered my thoughts on this one as well.

The AV should do as it was designed to do, to the best of its capabilities: stay in the lane when there is no option to swerve safely.

There is already a legal reason for that, that being that the swerving driver assumes liability for all the damages that result from his action, including manslaughter.
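
For illustration only, that rule boils down to something like the sketch below: swerve only when a lane change is known to be safe, and default to braking in lane when the information is insufficient. All the names here (choose_lane, lane_is_clear, and so on) are made up for the example, not any real AV system's code.

    from typing import Callable, List

    def choose_lane(current_lane: str,
                    adjacent_lanes: List[str],
                    lane_is_clear: Callable[[str], bool]) -> str:
        """Pick the lane for an emergency stop: swerve only if provably safe."""
        for lane in adjacent_lanes:
            # Leave the current lane only when the swerve is known to be safe;
            # with insufficient information, the default is to brake in lane.
            if lane_is_clear(lane):
                return lane
        return current_lane  # no provably safe option: stay put and brake

    # With no sensor confirmation, nothing is "provably clear", so we stay put.
    print(choose_lane("center", ["left", "right"], lambda lane: False))  # -> center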

Reply to
OFWW

...snip...

OK, have it your way.

"To truly guarantee a pedestrian?s safety, an AV would have to slow to a crawl any time a pedestrian is walking nearby on a sidewalk, in case the pedestrian decided to throw themselves in front of the vehicle," Noah Goodall, a scientist with the Virginia Transportation Research Council, wrote by email."

formatting link

...snip...

Reply to
DerbyDad03

So in the following brake failure scenario, if the AV stays in lane and kills the four "highly rated" pedestrians there are no charges, but if it changes lanes and takes out the 4 slugs, jail time may ensue.

formatting link

Interesting.

Reply to
DerbyDad03

Yes, and I've been warned that by taking evasive action I could cause someone else to respond likewise, and that I would be held accountable for what happened.

Reply to
OFWW

I find the assumption that a fatality involving a robot car would lead to someone being jailed to be amusing. The people who assert this never identify the statute under which someone would be jailed or who, precisely, this someone might be. They seem to assume that because a human driving a car could be jailed for vehicular homicide or criminal negligence or some such, it is automatic that someone else would be jailed for the same offense--the trouble is that the car is legally an inanimate object, and we don't put inanimate objects in jail.

So it comes down to proving either that the occupant was negligent, which is a hard sell given that the government allowed the car to be licensed with the understanding that it would not be controlled by the occupant, or that the engineering team responsible for developing it was negligent, which, given that they can show the logic the thing used and no doubt provide legal justification for the decision it made, will be another tall order.

So who goes to jail?

Reply to
J. Clarke

You've taken it to the next level, into the real-world scenario and out of the programming stage.

Personally I would assume that anything designed would have to co-exist with real world laws and responsibilities. Even the final owner could be held responsible. See the laws regarding experimental aircraft, hang gliders, etc.

But we should be sticking to this hypothetical example given us.

Reply to
OFWW

Experimental aircraft and hang gliders are controlled by a human. If they are involved in a fatal accident, the operator gets scrutinized. An autonomous car is not under human control; it is its own operator, and the occupant is a passenger.

We don't have "real world law" governing fatalities involving autonomous vehicles. The engineering would, initially (I hope) be based on existing case law involving human drivers and what the courts held that they should or should not have done in particular situations. But there won't be any actual law until either the legislatures write statutes or the courts issue rulings, and the latter won't happen until there are such vehicles in service in sufficient quantity to generate cases.

Rather than hang gliders and homebuilts, consider a Globalhawk that hits an airliner. Assuming no negligence on the part of the airliner crew, who do you go after? Do you go after the Air Force, Northrop Grumman, Raytheon, or somebody else? And of what are they likely to be found guilty?

It was suggested that someone would go to jail. I still want to know who and what crime they committed.

Reply to
J. Clarke

Damages would be a tort case; as to who and what crime, that would be determined in court. Some DA looking for publicity would bring charges.

Reply to
Markem

So why do you mention damages?

What charges? To bring charges there must have been a chargeable offense, which means that a plausible argument can be made that some law was violated. So what law do you believe would have been violated? Or do you just _like_ being laughed out of court?

Reply to
J. Clarke

The person who did not stay in their own lane and ended up committing involuntary manslaughter.

In the case you bring up, the AV can currently be overridden at any time by the occupant. There are already AVs operating on the streets.

Regarding your "who's at fault" scenario, just look at the court cases against gun makers, as if guns kill people.

So can we now return to the question or, at the least, woodworking?

Reply to
OFWW

Are you arguing that an autonomous vehicle is a "person"? You really don't seem to grasp the concept. Rather than a car with an occupant, make it a car, say a robot taxicab, that is going somewhere or other unoccupied.

In what case that I brought up? Globalhawk doesn't _have_ an occupant. (When people use words with which you are unfamiliar, you should at least Google them before opining.) There are very few autonomous vehicles, and currently they are for the most part operated with a safety driver, but that is not anybody's long-term plan. Google already has at least one demonstrator with no steering wheel or pedals, and Uber is planning on using driverless cars in its ride-sharing service--ultimately those would also have no controls accessible to the passenger.

I have not introduced a "who's at fault" scenario. I have asked what law would be violated and who would be jailed. "At fault" decides who pays damages, not who goes to jail. I am not discussing damages, I am discussing JAIL. You do know what a jail is, do you not?

You're the one who started feeding the troll.

Reply to
J. Clarke

They can impound your car in a drug bust. Maybe they will impound your car for the offense. We'll build special long-term impound lots for serious offenses and just disconnect the battery for lesser ones.

The programmer will be jailed. Or maybe they will stick a pin in a Voodoo doll to punish him.

The sensible thing would be to gather the most brilliant minds of the TV ambulance chasing lawyers and let them come up with guidelines for liability. Can you think of anything more fair than that?

Reply to
Ed Pawlowski

...snip...

...and then you joined the meal.

Reply to
DerbyDad03

On Nov 24, 2017, OFWW wrote (in article):

GlobalHawk drones do have human pilots. Although they are not on board, they are in control via a satellite link and can be thousands of miles away.


Joe Gwinn

Reply to
Joseph Gwinn

Sure. Build a random number generator into the AI. The AI simply uses the random number to decide who to take out at the time of the incident.

"Step right up, spin the wheel, take your chances."

It'll all be "hit or miss" so to speak.
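
For what it's worth, the "spin the wheel" idea is only a few lines of code. This is a joke sketch with invented path names, not anything a real AV actually does:

    import random

    def pick_path(candidate_paths):
        """'Spin the wheel': choose among unavoidable paths at random."""
        return random.choice(candidate_paths)

    # One spin of the wheel:
    print(pick_path(["stay in lane", "swerve left", "swerve right"]))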

Reply to
DerbyDad03

USATODAY: Self-driving cars programmed to decide who dies in a crash

formatting link

Reply to
Spalted Walt

I am not looking for political office. Ever heard the saying that a DA can indict a ham sandwich?

Reply to
Markem
