
Killer Robots: Myth or Reality?

Author : Sputnik | Editor : Indie | November 29, 2017 at 10:21 AM

Last week, Geneva hosted the first official meeting of government officials from various countries to discuss a possible ban on so-called killer robots. However, the event's participants failed to reach any agreement. Sputnik asked Raul Rojas, an expert on artificial intelligence, to comment on the issue.

Killer robots have no ethical principles and are prone to technical errors; this is why machines should not be allowed to make decisions with potentially lethal outcomes without prior approval from a human being, Raul Rojas, a professor at the Free University of Berlin, told Sputnik Germany.

"Eventually, someone should feel responsible for this, and not just blame it on the machines," Rojas stated.

Robots Make Mistakes

Technical errors are quite common in machines, the analyst argued, and this is often evident when countries use drones.

While a human being can always apply his or her intuition and knowledge of the situation, machines are incapable of doing so, Rojas claimed.

"Robots are made to only recognize standard cases, all deviations can't be programmed," the analyst noted.

Rojas also added that machines are "ridiculously easy to deceive." For example, missiles programmed to home in on the heat emitted by planes can be misled if the aircraft starts dropping hot metal objects.

Ethical Concerns

The use of machines in military operations also raises a lot of ethical questions. How will a robotic killer behave if it encounters a group of terrorists among the civilian population? And if there is only one civilian, will the machine sacrifice his or her life for the sake of the operation?

Rojas is confident that such questions require human rational analysis and empathy, qualities that machines do not possess.

"I think that it is impossible to incorporate ethical principles inside the machines," Rojas stated.



Rules and exceptions can be set down in an algorithm, but to make ethical decisions, "robots require human cognitive abilities."

According to the analyst, it is impossible to create a human-like robot, even in the distant future.

"This will never happen with machines, and therefore robots will never be able to behave in accordance with any ethical considerations," the analyst concluded.


Source: Sputnik
