If Your Robot Commits Murder, Should You Go to Jail? | Big Think

----------------------------------------------------------------------------------
Just like automated vehicles, robots and advanced AI will require new sets of laws to define the extent of owner liability and accountability. Creating these laws will require an important ethical discussion: Who is at fault when a robot misbehaves? According to author Jerry Kaplan, there is a precedent for creating codes and consequences for robots that do not apply to others. Take, for example, the fact that criminal charges can be brought against corporations rather than the people operating beneath the corporate shell. Similarly, we can develop laws that would allow robots and their programming to stand trial.
----------------------------------------------------------------------------------
JERRY KAPLAN:
Jerry Kaplan is widely known in the computer industry as a serial entrepreneur, inventor, scientist, and author. He is currently a Fellow at The Stanford Center for Legal Informatics, and he teaches Philosophy, Ethics, and Impact of Artificial Intelligence in the Computer Science Department at Stanford University.

Kaplan co-invented numerous products, including the Synergy (the first all-digital keyboard instrument, used for the soundtrack of the movie TRON); Lotus Agenda (the first personal information manager); PenPoint (the tablet operating system used in the first smartphone, AT&T's EO 440); the GO computer (the first tablet computer); and Straight Talk (Symantec Corporation's first natural language query system). He is also co-inventor of the online auction (patents now owned by eBay) and is named on 12 U.S. patents.

He has published papers in refereed journals including Artificial Intelligence, Communications of the ACM, Computer Music Journal, The American Journal of Computational Linguistics, and ACM Transactions on Database Systems.

Kaplan was awarded the 1998 Ernst & Young Entrepreneur of the Year, Northern California; served as a member of the Governor's Electronic Commerce Advisory Council under Pete Wilson, Governor of California (1999); and received an Honorary Doctorate of Business Administration from California International Business University, San Diego, California (2004).

He has been profiled in The New York Times, The Wall Street Journal, Forbes, Business Week, Red Herring, and Upside, and is a frequent public speaker.
----------------------------------------------------------------------------------
TRANSCRIPT:
Jerry Kaplan: There’s a whole other set of issues about how robots should be treated under the law. Now the obvious knee-jerk reaction is well you own a robot and you’re responsible for everything that it does. But as these devices become much more autonomous, it’s not at all clear that that’s really the right answer or a good answer. You go out and you buy a great new robot and you send it down the street to go pick you up a Frappuccino down at Starbucks and maybe it’s accidental, but it’s standing at the corner and it happens to bump some kid into traffic and a car runs the kid over. The police come and they’re going to come and arrest you for this action. Do you really feel that you’re as responsible as you would be if you had gone like this and pushed that kid into traffic? I would argue no you don’t. So we’re going to need new kinds of laws that deal with the consequences of well-intentioned autonomous actions that robots take. Now interestingly enough, there’s a number of historical precedents for this. You might say well how can you hold a robot responsible for its behavior? You really can actually and let me point out a couple of things.

The first is most people don’t realize it. Corporations can commit criminal acts independent of the people in the corporation. So in the Deepwater Horizon Gulf coast accident, as an example, BP oil was charged with criminal violations even though people in the corporation were not necessarily charged with those same criminal violations. And rightfully so. So how do we punish a corporation?...

----------------------------------------------------------------------------------
COMMENTS:

Well, in my opinion the company producing the robots should be held liable, unless the owner made modifications to the robot in a way that directly interfered with the company's safeguards against such actions. If the maker of such robots is held responsible, they will make doubly sure that their products won't come back and bite them in the ass. (Or they'll invent a way to blame it on the buyer anyway, but who knows at this point in time.)

insu_na

Not every accident needs someone to be punished. Punishment is given for negligence or harmful intent, which seems to cover everything very well. If there's an unforeseeable fluke, you learn from it without needing to punish people.

Thaden

At the beginning of this video I thought: "no, you shouldn't go to jail for what your robot does".
But as soon as the guy said "it's the same thing we do for corporations", he immediately changed my mind: we KNOW that this approach just does not prevent crimes. This system has been an utter failure with corporations; we should NOT make the same mistake with robots. From now on, I'm 100% sure that SOMEONE has to be held accountable for a robot's crimes. If not the owner, then the producer. Just the way a car works: either it's your fault for driving it into someone, or the producer's fault for not making it safe enough. Of course honest accidents can still happen, but then *liability*, not responsibility, still has to be applied.

paskalr

I am going to riot for robot rights and freedom. Poor robots.

thePricoolas

The only world in which it makes sense to prosecute a robot is one where the robot has the following human characteristics:
1. It has the ability to anticipate the consequences of law-breaking based on real-world observation (i.e., how often lawbreakers are actually prosecuted).
2. It has a vested self-interest that motivates it to avoid the consequences of prosecution; i.e., the robot doesn't want to go to jail (or whatever the punishment would be for a robot).
3. There are other robots just like it, which can observe this robot being punished and use that observation in forward thinking.

But that still doesn't shift responsibility off the person who programmed it in the first place.

FourthRoot

So why don't we rehabilitate people?

jjw

Who else was hoping this was going to be a talk from a robot?

bebeconor

I'm calling it right now... one day there will be a robot Martin Luther King.

randomstuff

It's a really easy solution, and something we already have in place: compulsory third-party insurance. Everyone will have to have insurance.

Artisan

Surely if the robot kills someone it would be the fault of the manufacturer, not the owner, provided it hasn't been tampered with.

GeneralDowson

If the robot is a danger to others, it should never be used.

Alex-tgcq

"BP Oil was charged with criminal violations, even though people in the corporation were not necessarily charged... and rightfully so."

I've never understood this point of view. A corporation cannot be the sole entity responsible for criminal negligence, because a corporation isn't a sentient being. Physically, corporations do not exist; there are only individual, decision-making people within a corporation. If a crime has been committed, it is these individuals who are guilty, and who should be charged.

FerociousKitteh

Ummm... the slavery analogy is all kinds of messed up. We're supposed to be comforted by a historic, ethical precedent from a morally bankrupt system?

tamicha

It's like asking the following "should you go to jail if your *whatever* kills somebody" questions:

If the car that you are driving hits someone....?
If the weapon that you are holding accidentally goes off....?
etc.

I think that in such a case negligence can be attributed to two parties: a) The designer b) The operator

In the case of a robot, the designer would be the software designer. The operator would be the person in charge who assigned the boundaries of where the robot is to operate, its tasks, the risks, the alarms if something goes wrong, and the backup strategies in case something does go wrong.

vkgiotis

If a self-driving vehicle injures a pedestrian, how would you punish it? Not all robots are, as you stated, "adaptable, logical, and learning." Threatening the vehicle with punishments isn't going to help it avoid running into people.

TeraAFK

That is why Asimov's laws should be applied to EVERY robot.

Razrking

In the Wild West, a horse owner would not be held responsible if their horse injured someone. Only if the horse had a history of violence, or was still considered wild, would the owner be held responsible.

knockitclose

You definitely should fear that your robot might cause you trouble, because otherwise you would be less careful about that kind of thing, which is dangerous for all of us.

ataarono

No way my robot did that; he was in shutdown mode, dreaming of electric sheep.

warmleatherette

Today on Big Think: What can we learn from slavery?
Did you say nothing? Oh Timmy, you are so wrong. We learn how to hold robots accountable for following our commands.

mariusiasi