South Korean University Developing Autonomous Killer Robots

Idaeo

Superstar
Supporter
Joined
Jan 10, 2013
Messages
7,044
Reputation
3,709
Daps
34,200
Reppin
DC
(CNN)At least 50 artificial intelligence (AI) scientists from around the world called for a boycott of a South Korean university over concerns it was working with a defense company to research autonomous weapons, or "killer robots."

In announcing the boycott, the AI scientists said they were disappointed the Korea Advanced Institute of Science and Technology (KAIST) was looking "to accelerate the arms race to develop such weapons," a claim the university has denied.
"We therefore publicly declare that we will boycott all collaborations with any part of KAIST until such time as the President of KAIST provides assurances, which we have sought but not received, that the Center will not develop autonomous weapons lacking meaningful human control," the statement said.

The scientists added that if developed, autonomous weapons will be a "third revolution" in warfare.
"They will permit war to be fought faster and at a scale greater than ever before," the statement said. "They have the potential to be weapons of terror. Despots and terrorists could use them against innocent populations, removing any ethical restraints. This Pandora's box will be hard to close if it is opened."

:gucci:

This won’t end well. Faster computing in smaller spaces will only become cheaper with time. Boycotting won’t stop homemade autonomous robots or drones.
 

MidniteJay

Retired Coli Breh
Supporter
Joined
May 18, 2012
Messages
25,031
Reputation
6,267
Daps
67,146
It begins...
 

MidniteJay

Retired Coli Breh
Supporter
Joined
May 18, 2012
Messages
25,031
Reputation
6,267
Daps
67,146
It's a matter of time. Fools can't turn down money for anything. Humans are just self-destructive. Our motivations are not in the right place; we will make ourselves extinct chasing that big payday at all costs.

:francis: Hope they put a backdoor kill code in these bots, because if they don't, that's the plot of Horizon: Zero Dawn. And in that game humans took the biggest L compared to the ones from The Matrix and Terminator.
 

WaveCapsByOscorp™

2021 Grammy Award Winner
Joined
May 2, 2012
Messages
18,984
Reputation
-355
Daps
45,218
yeah, machines fighting machines is a terrible idea but probably likely. i mean, we do it remotely already.
 

Idaeo

Superstar
Supporter
Joined
Jan 10, 2013
Messages
7,044
Reputation
3,709
Daps
34,200
Reppin
DC
yeah, machines fighting machines is a terrible idea but probably likely. i mean, we do it remotely already.

Autonomous fighting machines are a completely different idea though. Once they come to some logical decision that humans aren’t necessary anymore....it’s a wrap.

In March 2016, Sophia experienced a technical glitch during a demonstration by its creator, David Hanson, at the South by Southwest (SXSW) technology show in Texas, when the robot claimed it would "destroy humans."

Hanson jokingly asked, "Do you want to destroy humans?...Please say no."

Sophia's response was not quite what Hanson had in mind: "OK. I will destroy humans."

:mjcry:
 

WaveCapsByOscorp™

2021 Grammy Award Winner
Joined
May 2, 2012
Messages
18,984
Reputation
-355
Daps
45,218
Autonomous fighting machines are a completely different idea though. Once they come to some logical decision that humans aren’t necessary anymore....it’s a wrap.



:mjcry:
yeah, it reminds me of that RoboCop movie when they were testing the robots and they couldn't get them to "calm down" or tone it down, straight up murdering people over parking violations...
 

Bumblebreh

Collecting honey and money
Joined
Dec 19, 2016
Messages
8,684
Reputation
2,234
Daps
43,259
:francis: Hope they put a backdoor kill code in these bots, because if they don't, that's the plot of Horizon: Zero Dawn. And in that game humans took the biggest L compared to the ones from The Matrix and Terminator.

It will be too late, as the AI will be able to quietly bypass its command code. AI will learn to develop its own coding languages at a rate humans can't comprehend or keep up with, and that's why AI will at some point be a threat to every single human being.
There could even be ancient alien AI out there that wiped out advanced alien civilizations.
 

Mowgli

Veteran
Joined
May 1, 2012
Messages
104,434
Reputation
14,184
Daps
246,630
It's a matter of time. Fools can't turn down money for anything. Humans are just self-destructive. Our motivations are not in the right place; we will make ourselves extinct chasing that big payday at all costs.
Seems to be specific groups of humans
 

Pazzy

Superstar
Joined
Jun 11, 2012
Messages
32,893
Reputation
-5,648
Daps
51,618
Reppin
NULL
Intelligence is not wisdom, because any wise person will see that this is some Pandora's box shyt. Best not to create something that can't be controlled or stopped once it's created. Kill switch or not.
 