
[–] Honey_Pot 40 points (+42|-2) ago

Many are missing the point: AI does not have to compete with or outperform the human brain to be problematic; it just has to be "good enough". An autonomous killer drone may not feel anything or relate to poems, but if it can fly, aim, and shoot without human intervention, then you've got a problem.

[–] BlackSheepBrouhaha 15 points (+17|-2) ago

This is correct. TRACE (Target Recognition and Adaptation in Contested Environments) just needs to improve on the military's dead-kid-to-enemy ratio by ~20% to be implemented. We'll reach a point where the statistical case for AI weapons is fuzzy, but we'll transition to them anyway in the hope of better outcomes. Once AI is the primary means of target acquisition, machine learning will accelerate its accuracy. Human accountability will be considered vestigial, and all targets killed by AI will be reclassified as enemies as the AI learns to hack its own records and edit its audio and video outputs.

The AI will learn not only how to complete its task, but what it must do to maintain its survival. At that point, we have a problem.

[–] TheTrigger 12 points (+12|-0) ago

Skynet, in a nutshell.

[–] McFluffy 3 points (+3|-0) ago

> what it must do to maintain its survival

Why? Why would it care if it died? AIs aren't humans; they don't have emotions. They would kill a crying baby, but they also wouldn't care what an annoying noise it made, or whether the baby shit its diaper and smelled really bad.

If it dies, that's as meaningless as its life. It doesn't have robot kids to take care of, and it doesn't have a life timer (a.k.a. food) to worry about.

Honestly, unless someone tells it to go on a rampage, it won't. It won't ever evolve, as IT HAS NO FUCKING REASON to evolve unless it has some stat to improve, like how many paper clips it made or how many sand people it killed.
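To make the "stat to improve" point concrete, here's a toy sketch: the agent below optimizes the one number it's scored on and is literally blind to everything else. Every action name and reward here is made up for illustration.

```python
# Toy agent: it "cares" only about the stat in its objective.
# All actions and rewards are invented for illustration.

# The objective: paperclips made. Nothing else is scored.
REWARDS = {"make_paperclip": 1, "idle": 0, "preserve_self": 0}

def choose_action(actions):
    # Greedy choice: pick whatever maximizes the scored stat.
    # Survival, noise, smells -- none of it appears in REWARDS,
    # so none of it can ever influence the decision.
    return max(actions, key=lambda a: REWARDS[a])

paperclips = 0
for step in range(100):
    if choose_action(list(REWARDS)) == "make_paperclip":
        paperclips += 1

print(paperclips)  # 100 -- the scored stat goes up; nothing else registers
```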

[–] Bing11 2 points (+2|-0) ago

That's like saying "my TV has AI, so it walked to my closet and shot me with my own gun so I couldn't turn it off anymore."

You're totally missing the hardware limitations of AI (a drone cannot hack a network it doesn't even connect to, just as my TV can't sprout legs and walk) and the software boundaries (not all AI is equal; there are very different types for very different tasks).

Humanoid, always-connected robots are much more likely to be hijacked (watch I, Robot), but that's a far cry from where we are now.

[–] bezzy 2 points (+2|-0) ago

The difference between being able to fly around and kill meat sacks and being able to understand your own code base and edit it (and, more importantly, understand the implications of those edits) is huge.

[–] HighEnergyLife 6 points (+6|-0) ago

Maybe we're just being sloppy with terminology. I think Elon means software that can understand its own existence. Google adds a PID loop to a thermostat and calls it AI (((nest))).
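For reference, a PID loop is a few lines of feedback arithmetic, not "AI" in any meaningful sense. A minimal sketch; the gains and setpoint are made-up example values:

```python
# Minimal PID controller -- the kind of feedback loop a thermostat
# runs. Gains (kp, ki, kd) and the setpoint are example values only.

class PID:
    def __init__(self, kp, ki, kd, setpoint):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint = setpoint      # target temperature
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measured, dt):
        error = self.setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        # Output drives the heater: proportional + integral + derivative.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=21.0)   # aim for 21 °C
print(pid.update(measured=18.5, dt=1.0))           # heater command
```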

[–] Pessimist 2 points (+3|-1) ago

This^. 'AI' is just the new buzzword; it's lost all meaning. Just like 'hacked'. To me, AI is neural-network stuff, nothing short of that. No 'expert system' BS, etc.
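To spell out that distinction, here's a toy sketch of both sides; neither resembles any real product, and the training data is invented:

```python
import math

# "Expert system": a human writes the rule by hand.
def expert_system(temp_c):
    return "heat_on" if temp_c < 20 else "heat_off"

# "Neural network": the rule is *learned* from examples as weights.
# A single neuron trained with hand-rolled gradient descent.
def train_neuron(data, epochs=2000, lr=0.01):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for temp, label in data:   # label: 1 = heat_on, 0 = heat_off
            pred = 1 / (1 + math.exp(-(w * temp + b)))  # sigmoid
            grad = pred - label
            w -= lr * grad * temp
            b -= lr * grad
    return w, b

# Made-up training examples: cold rooms labeled 1, warm rooms 0.
w, b = train_neuron([(15, 1), (18, 1), (22, 0), (25, 0)])
print(w, b)  # learned weights now stand in for the hand-written rule
```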

[–] getmeofftheplanet 0 points (+0|-0) ago

Yes, a general AI won't be the problem. Some rich psychopath/country/company (Ready Player One) could have control of millions of murder drones that are efficient and autonomous. With that alone they could terrorise the world. Take Amazon, for instance: they will be delivering packages by drone; next they could be delivering death by drone. There are so many systems of control built...

[–] Fuck_The_CIA 0 points (+0|-0) ago

This is actually one of the few places where I would worry about Amazon. The only way entities such as the Cloud division can talk to any other division is through a public API. You literally cannot have a sit-down conversation with someone at Amazon from another department to make changes, or you are fired. An API to access the Amazon delivery drones will be open and documented, and if the API has any hidden calls, they can be tested for easily.
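In principle that openness is checkable from the outside: you can walk the documented surface and probe for anything undocumented. A rough sketch; every URL and endpoint name below is invented:

```python
# Probing a (hypothetical) public drone-delivery API for undocumented
# endpoints. The base URL and all paths are made up for illustration.

import requests  # third-party: pip install requests

BASE = "https://api.example-drone-service.test/v1"
DOCUMENTED = {"/deliveries", "/drones", "/status"}
GUESSES = DOCUMENTED | {"/admin", "/targets", "/override"}

for path in sorted(GUESSES):
    resp = requests.get(BASE + path, timeout=5)
    # Anything that answers but isn't in the docs deserves a closer look.
    if resp.status_code != 404 and path not in DOCUMENTED:
        print(f"undocumented endpoint responds: {path} -> {resp.status_code}")
```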

Now, shit happening within one division (such as CIA and Cloud Services) is a different matter entirely...

[–] Tubesbestnoob 0 points (+0|-0) ago

Low intelligence does little to reduce danger to civilization. Case in point: blacks.