Hyundai Aftermarket

Hyundai Aftermarket (https://www.hyundaiaftermarket.org/forum/)
-   Computers, Gaming, & Technology (https://www.hyundaiaftermarket.org/forum/computers-gaming-technology-75/)
-   -   Rise of the Machines (https://www.hyundaiaftermarket.org/forum/computers-gaming-technology-75/rise-machines-82935/)

Dyson 07-28-2017 05:57 PM

Rise of the Machines
 
http://pop.h-cdn.co/assets/16/37/768...-soratnik1.jpg




Russian weapons maker Kalashnikov is working on an automated gun system that uses artificial intelligence to make "shoot/no shoot" decisions. But exactly how this AI or any other decides who is a combatant and who isn't is at the heart of a raging debate over allowing autonomous weapons on battlefields filled with both soldiers and civilians.



The Kalashnikov "combat module" will pair a 7.62-millimeter machine gun with a camera attached to a computer system. According to TASS, the module uses "neural network technologies that enable it to identify targets and make decisions." A key part of neural network technology is the ability to learn from past mistakes.



Neural networks are computer systems that learn much like animal brains: they learn from examples and then use that learning to make decisions in the future. A battlefield robot, for example, might store images of soldiers, guerrillas, and unarmed civilians in an onboard database. When its cameras image a human being, the neural network would compare the person it sees against the database. If the person carries a firearm or wears an enemy uniform, the robot would open fire. Ideally, if it saw no weapon at all, it would judge the target a civilian and hold fire.
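The compare-to-database logic described above can be sketched in a few lines. This is a deliberately toy model: the feature vectors, labels, and similarity threshold are all invented for illustration, and a real system would use learned embeddings from a neural network rather than hand-written tuples.

```python
import math

# Toy stand-in for the onboard database: label -> example feature vectors.
# Vectors here are invented, e.g. (uniform-like, weapon-like, tool-like).
DATABASE = {
    "soldier":  [(0.9, 0.8, 0.1)],
    "civilian": [(0.1, 0.1, 0.9)],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def classify(features):
    """Return the database label whose examples best match `features`."""
    best_label, best_score = None, -1.0
    for label, examples in DATABASE.items():
        for example in examples:
            score = cosine(features, example)
            if score > best_score:
                best_label, best_score = label, score
    return best_label

def decide(features):
    """Shoot/no-shoot decision: fire only on a 'soldier' match."""
    return "fire" if classify(features) == "soldier" else "hold"

print(decide((0.8, 0.9, 0.2)))  # armed and uniformed -> fire
print(decide((0.2, 0.1, 0.8)))  # broomstick-like    -> hold
```

The point of the sketch is that the decision is only as good as the examples in the database, which is exactly the failure mode the article goes on to describe.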



The problem with example-based learning in warfare is that mistakes are permanent and irreversible, and a neural network may never get the chance to apply the lessons it learns. If the robot misjudges a rocket launcher as a broomstick and doesn't fire, the rocket launcher blows it up. That doesn't help the robot, and it won't help future robots discern a soldier with a rocket from a civilian with a broomstick. If a civilian's pitchfork is misidentified as a rocket launcher, he gets riddled with bullets. If the neural network then adjusts itself to recognize pitchforks, that's good for the robot. But the civilian is still dead.
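The "learning from a mistake" step amounts to filing the misclassified example under its true label so future comparisons match it. A minimal sketch, with all labels and vectors invented for illustration:

```python
# Toy example database: label -> list of feature vectors (invented values).
database = {
    "soldier":  [(0.9, 0.8, 0.1)],
    "civilian": [(0.1, 0.1, 0.9)],
}

def record_mistake(features, correct_label):
    """Store a misclassified example under its true label for future use.

    Returns how many examples that label now has.
    """
    database.setdefault(correct_label, []).append(features)
    return len(database[correct_label])

# The pitchfork was misread as a rocket launcher; file it as 'civilian'.
n = record_mistake((0.3, 0.6, 0.7), "civilian")
print(n)  # the 'civilian' label now has 2 examples
```

Which is precisely the article's point: the database gets better, but only after the irreversible mistake has already happened.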

i8acobra 07-28-2017 06:29 PM

What could go wrong? LOL.



http://www.youtube.com/watch?v=A9l9wxGFl4k

faithofadragon 07-28-2017 06:45 PM

at least it won't get paid leave when it accidentally shoots someone


All times are GMT -6.


© 2024 MH Sub I, LLC dba Internet Brands