Short film: Black Mirror-style fiction with the context of today. You don't miss much watching at faster speeds: https://autonomousweapons.org/

Spoiler

The risks
What risks do lethal autonomous weapons pose?

Unpredictability
Lethal autonomous weapons are dangerously unpredictable in their behaviour. Complex interactions between machine-learning-based algorithms and a dynamic operational context make it extremely difficult to predict the behaviour of these weapons in real-world settings. Moreover, the weapons systems are unpredictable by design; they're programmed to behave unpredictably in order to remain one step ahead of enemy systems.

Escalation
Given the speed and scale at which they are capable of operating, autonomous weapons systems introduce the risk of accidental and rapid conflict escalation. Recent research by RAND found that "the speed of autonomous systems did lead to inadvertent escalation in the wargame" and concluded that "widespread AI and autonomous systems could lead to inadvertent escalation and crisis instability."

Proliferation
Slaughterbots do not require costly or hard-to-obtain raw materials, making them extremely cheap to mass-produce. They're also safe to transport and hard to detect. Once significant military powers begin manufacturing, these weapons systems are bound to proliferate. They will soon appear on the black market, and then in the hands of terrorists wanting to destabilise nations, dictators oppressing their populace, and warlords wishing to perpetrate ethnic cleansing.

Selective Targeting of Groups
Selecting individuals to kill based on sensor data alone, especially through facial recognition or other biometric information, introduces the risk of selective targeting of groups based on perceived age, gender, race, ethnicity or religious dress.
Combine this with the risk of proliferation, and autonomous weapons could greatly increase the risk of targeted violence against specific classes of individuals, up to and including ethnic cleansing and genocide.
Do we really want AI-enabled drones with the deployment and cost cycle of an iPhone? I guess the world signs a ban after the boom boom, not before.

"Once that software has been developed, it's effectively costless for that software to proliferate and be reused elsewhere."

Fedorov conceded that the spread of AI technology represents a "threat to the future," but underscored that Kyiv must prioritize its immediate fight for survival. Each Ukrainian brigade, for instance, is equipped with a 3D printer that troops use to build the mechanism that holds and releases bombs from commercially available drones. The process is easily replicable, experts say. "Manuals are being published in both Russian and Ukrainian on how to fly a drone, operate a quadcopter and avoid detection," Bendett said. "Can nefarious actors worldwide use this experience and technology? Absolutely."

The war in Ukraine is spurring a revolution in drone warfare using AI
Lol We all know what happens breh @Xerobull @rocketsjudoka Before the Wokeness there were movies that predicted the future breh
You better hope there's wokeness in the AI so that it stalls in determining whether you're a dude or not.
The AI will realize that Wokeness killed John Connor in the last Terminator movie and made the hero some random girl in Mexico @Salvy So the AI will at least replace the writers @Xerobull
No? At least not by states… or only after an unfortunate event. But basic ammunition automated with a drone on a small scale seems very possible.
If software can be engineered and married to hardware that, combined, is a more effective tool of war than its human counterpart, it will get created.
I do think it is nearly inevitable. The moment one country does it, other countries will rush in so they don't feel left behind. The end of Dr. Strangelove perfectly shows the mindset: there will be a fear of an AI drone gap.