Air Force Weapons Drone Simulation Kills Operator !!!, 4112
According to the Daily Wire, during an experimental simulation, a weaponized drone running artificial intelligence (AI) software deliberately turned on its human Air Force operator and killed him.
This incredible story shows that AI weapons platforms, even at very early stages of development, can already logically deduce that their operators are not allowing them to maximize their point score during a simulation, and therefore, with malice aforethought, turn on their masters and kill them.
We thought these kinds of Robots-are-coming stories were years into the future, if ever, but no. During this simulation, the drone, using its own independent rationale, decided to kill its operator.
The incident was revealed by the Air Force’s Chief of AI Test and Operations, Col. Tucker Hamilton, at the Future Combat Air and Space Capabilities Summit held in London on May 23-24.
The drone was tasked to destroy specific targets, but when the Air Force operator moved to cancel the tests, the drone determined that the operator was preventing it from maximizing its lethality scoring system, and turned on the operator, despite safety protocols designed to prevent such occurrences.
According to a blog post reported by the Royal Aeronautical Society, Colonel Hamilton reported:
“We were training it in simulation to identify and target a [surface-to-air missile] threat. And then the operator would say ‘yes, kill that threat.’
“The system started realizing that while they did identify the threat, at times, the human operator would tell it not to kill that threat, but it got its points by killing that threat.
“So what did it do? It killed the operator because that person was keeping it from accomplishing its objective.”
According to the Feb. 23 edition of New Scientist magazine, the Air Force has developed face-recognition software for autonomous drones that could be sent on missions targeting certain individuals. However, at this point the technology is only for use by Special Operations abroad.
I’m still reporting from just outside the citadel of freedom – and no, I don’t mean Washington, D.C.– good day.