
Rise of the Machines

 
Old 07-28-2017, 05:57 PM
  #1  
Senior Member
Thread Starter
 
Dyson's Avatar
 
Join Date: Oct 2001
Location: Little Rock, AR
Posts: 218
Likes: 0
Received 0 Likes on 0 Posts
Vehicle: 2007 Accent
Default Rise of the Machines





Russian weapons maker Kalashnikov is working on an automated gun system that uses artificial intelligence to make "shoot/no shoot" decisions. But exactly how this AI or any other decides who is a combatant and who isn't is at the heart of a raging debate over allowing autonomous weapons on battlefields filled with both soldiers and civilians.



The Kalashnikov "combat module" will include a 7.62-millimeter machine gun coupled with a camera attached to a computer system. According to TASS, the module uses "neural network technologies that enable it to identify targets and make decisions." A key feature of neural network technology is the ability to learn from past mistakes.



Neural networks are computer systems that learn much the way animal brains do: they learn from examples and then use that learning to make decisions in the future. A battlefield robot, for example, might store images of soldiers, guerrillas, and unarmed civilians in an onboard database. When its cameras image a human being, the neural network would compare the person it sees against that database. If the person carries a firearm or wears a uniform worn by enemy troops, the robot would open fire. Ideally, if it saw no weapon at all, it would judge the target a civilian and hold fire.
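To make the "compare against stored examples, then decide" loop concrete, here is a toy Python sketch. Everything in it is hypothetical: a real system would run a trained neural network over camera imagery, not nearest-neighbor matching over hand-made feature vectors, and the feature names and labels are invented for illustration.

```python
# Toy sketch of example-based classification: compare a new observation
# against stored example feature vectors and take the nearest label.
# All names, vectors, and thresholds here are hypothetical.
from math import dist

# Stored examples: (feature_vector, label). Imagine the features encode
# signals like "carrying rifle", "wearing uniform", "carrying tool".
EXAMPLES = [
    ((1.0, 1.0, 0.0), "combatant"),   # rifle + uniform
    ((1.0, 0.0, 0.0), "combatant"),   # rifle, no uniform
    ((0.0, 0.0, 1.0), "civilian"),    # tool only (broom, pitchfork)
    ((0.0, 0.0, 0.0), "civilian"),    # unarmed
]

def classify(observation):
    """Return the label of the closest stored example."""
    _, label = min(EXAMPLES, key=lambda ex: dist(ex[0], observation))
    return label

def decide(observation):
    """Shoot/no-shoot decision driven by the classifier's output."""
    return "fire" if classify(observation) == "combatant" else "hold"

print(decide((0.9, 0.8, 0.1)))  # looks armed and uniformed -> fire
print(decide((0.1, 0.0, 0.9)))  # looks like a tool carrier -> hold
```

The sketch also shows where the danger lives: an observation that sits between "rifle" and "tool" in feature space still gets a hard fire/hold answer, which is exactly the ambiguity the debate below is about.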



The problem with example-based learning in warfare is that mistakes in war are permanent and irreversible, and a neural network may never get the chance to apply the lesson. If the robot misjudges a rocket launcher as a broomstick and doesn't fire, the rocket launcher blows it up; that doesn't help the robot, and it won't help future robots distinguish a soldier with a rocket from a civilian with a broomstick. If a civilian's pitchfork is misidentified as a rocket launcher, he gets riddled with bullets. If the neural network then adjusts itself to recognize pitchforks, that's good for the robot. But the civilian is still dead.
Old 07-28-2017, 06:29 PM
  #2  
Super Moderator
 
i8acobra's Avatar
 
Join Date: Dec 2002
Location: Vegas, Baby, Vegas!!!
Posts: 5,735
Received 3 Likes on 3 Posts
Vehicle: '14 Ford F-150
Default

What could go wrong? LOL.



http://www.youtube.com/watch?v=A9l9wxGFl4k
Old 07-28-2017, 06:45 PM
  #3  
Senior Member
 
faithofadragon's Avatar
 
Join Date: Mar 2006
Location: tacos
Posts: 9,533
Likes: 0
Received 0 Likes on 0 Posts
Vehicle: 2000 Elantra
Default

at least it won't get paid leave when it accidentally shoots someone



