
War in the Future? The Ominous Shadow of Smart Weapons

πŸ“… 2 May 2025 πŸ‘€ Manuel πŸ“‚ Ethics and society ⏱️ 9 min read
AI and autonomous weapons: symbols of advanced military technology

Autonomous weapons that decide over life and death without human control

Imagine a not-too-distant future where decisions over life and death are made not by humans but by sophisticated algorithms embedded in lethal weapon systems. It sounds like science fiction, I know, but that reality is approaching in giant steps.

The application of artificial intelligence in the military sector is opening up new and, let's face it, rather disturbing scenarios. We are not talking about simple remote-controlled drones, but about systems able to operate autonomously: identifying targets and engaging the enemy without any direct human intervention. It is a transformation that raises fundamental questions about the future of humanity.

What autonomous weapons really are

Lethal autonomous weapon systems (LAWS) represent the most advanced evolution of artificial intelligence applied to the military field. These systems generally fall into two categories: semi-autonomous β€œhuman in the loop” systems, which cannot operate without human intervention, and fully autonomous β€œhuman out of the loop” systems, which, once activated, operate in complete independence.

The difference is crucial. While a Predator drone always requires a human operator to decide when to shoot, new-generation autonomous weapons can identify, track, and attack targets based exclusively on their own algorithms. As we explored in our article on AI on a leash, the question of controlling intelligent machines is becoming more and more central.
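To see how much hangs on that distinction, here is a minimal sketch in Python of the two control patterns. Everything in it is a hypothetical abstraction invented for illustration (the Target record, the confidence threshold, the operator_approves callback); it is not a description of any real weapon system, only of where the human-confirmation gate sits in the decision loop.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    ENGAGE = auto()
    HOLD = auto()


@dataclass
class Target:
    track_id: str
    confidence: float  # 0.0-1.0 score from a (hypothetical) perception model


def human_in_the_loop(target: Target, operator_approves) -> Decision:
    """Semi-autonomous: the algorithm proposes, a human disposes."""
    if target.confidence < 0.9:
        return Decision.HOLD
    # The human gate: nothing happens until a person explicitly confirms.
    return Decision.ENGAGE if operator_approves(target) else Decision.HOLD


def human_out_of_the_loop(target: Target) -> Decision:
    """Fully autonomous: once activated, a threshold comparison decides alone."""
    return Decision.ENGAGE if target.confidence >= 0.9 else Decision.HOLD
```

The uncomfortable point is how small the difference is in code: removing the human from the loop deletes a single callback, while the moral and legal difference is enormous.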

Currently, full automation is more a vision than an operational reality, but many countries are investing heavily in this direction. The conflict in Ukraine has been a crucial testing ground, showing how drones and AI-driven autonomous systems can significantly improve military effectiveness.

Artificial intelligence is transforming modern warfare

The arms race of the future has already begun. The United States allocated $1.8 billion to military AI development in 2024, with roughly 685 active projects. China and Russia are developing increasingly sophisticated systems. According to a report by the Stockholm International Peace Research Institute (SIPRI), global investment in autonomous weapon systems exceeded $12 billion in 2024, a 35% increase over the previous year.

A concrete example is the Lavender system used by the Israeli army, which analyzes massive amounts of data to identify potential targets. According to military sources, it can flag up to 100 targets per day, far more than traditional methods. The Gospel system produces automated recommendations on strategic targets, while in Ukraine the company Palantir provides data-analysis tools for the rapid identification of enemy targets.

OpenAI recently removed the prohibition on military use of its models and is now working with Anduril to provide intelligent anti-drone defense systems. Meta, too, has made its Llama model available for national security applications. As we analyzed in our study of algorithmic bias, these systems inevitably inherit the imperfections of the data on which they are trained.

Responsibility in the era of deadly machines


This prospect raises a number of ethical, legal, and practical questions that we cannot ignore. At the center of the debate is the question of responsibility. Who will be liable if an autonomous weapon commits an error, causing collateral damage or striking innocent civilians? The programmer? The military commander who deployed the system? The artificial intelligence itself?

Currently, international humanitarian law rests on the principle of human responsibility in attack decisions. Transferring that decision to a machine undermines the very foundations of the system. The European Parliament has pointed out that AI-enabled systems should allow human beings to exercise meaningful control while retaining responsibility for their use.

The complexity of artificial intelligence algorithms makes it difficult to predict their behavior with certainty in every situation. Entrusting a machine with the power to kill means taking a leap in the dark, with potentially catastrophic consequences. As discussed in our article on the ethics of artificial intelligence, the question of human control is becoming ever more pressing.

Algorithmic bias and wartime discrimination

A particularly troubling aspect concerns the biases in the data on which these artificial intelligences are trained. If the data reflect the inequalities and discrimination of our society, there is a serious risk that autonomous weapons will inherit and amplify those biases.

Imagine a facial recognition system that works less well on certain ethnic groups, or a threat-identification algorithm that associates certain demographic characteristics with a higher level of danger. The risk of algorithmic discrimination in contexts of war is very real, and terribly disturbing.
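The statistical core of that worry fits in a few lines. The sketch below, with entirely invented numbers, evaluates a hypothetical threat classifier separately per demographic group: when the errors in the training data concentrate on one group, so do the model's false positives.

```python
# Hypothetical evaluation records: (group, truly_a_threat, flagged_by_model).
# All numbers are invented for illustration only.
records = [
    ("group_a", False, False), ("group_a", False, False),
    ("group_a", False, True),  ("group_a", True, True),
    ("group_b", False, True),  ("group_b", False, True),
    ("group_b", False, False), ("group_b", True, True),
]


def false_positive_rate(rows):
    """Share of genuinely harmless cases that the model still flags as threats."""
    harmless = [flagged for _, truly_threat, flagged in rows if not truly_threat]
    return sum(harmless) / len(harmless)


for group in ("group_a", "group_b"):
    rows = [r for r in records if r[0] == group]
    print(group, f"FPR = {false_positive_rate(rows):.0%}")
# group_a FPR = 33%  -> one harmless person in three is misflagged
# group_b FPR = 67%  -> two harmless people in three are misflagged
```

In a commercial recommender, a 67% false-positive rate is an annoyance; attached to a weapon, it is lethal discrimination distributed along demographic lines.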

The UN's recent condemnation of Israel's use of AI in the Gaza Strip highlights these risks. More than 15,000 civilian casualties in the first six weeks after 7 October 2023, when artificial intelligence systems were widely used for target selection, raise fundamental questions about the accuracy and ethics of these systems.

As we explored in our article on surveillance and artificial intelligence, algorithmic control can easily turn into systematic oppression.

Key points to remember

  • The weapons are autonomous and can decide for themselves whom to attack without human intervention, direct, undermining the principle of human responsibility
  • The arms race, IA has already begun with a high investment on the part of the superpowers in the world
  • The bias algorithmic may cause discrimination lethal amplifying human prejudices in contexts of war
  • The human control significant remains essential to comply with international humanitarian law and to prevent abuse

Frequently asked questions

Are autonomous weapons already operational today? Semi-autonomous systems are already used in various conflicts, while fully autonomous weapons remain at an advanced stage of development. The border between automation and human control is thinning quickly.

Is there international regulation of autonomous weapons? Currently, the UN Convention on Certain Conventional Weapons includes only limited references. Several organizations, such as Stop Killer Robots, are pressing for a full ban on lethal autonomous weapons.

How can human control over lethal decisions be guaranteed? The European Parliament requires that AI allow meaningful human control, but defining β€œmeaningful” in contexts of war remains an open challenge.

What are the main risks of autonomous weapons? Loss of human control, uncontrolled conflict escalation, algorithmic discrimination, and violations of international humanitarian law.

Towards a responsible future

The debate on autonomous weapons is anything but academic. The logic of deterrence and competition risks prevailing over prudence and ethical reflection. We must prevent technological innovation from dragging us into an uncontrollable spiral in which decisions on war and peace are delegated to machines devoid of conscience and empathy.

It is crucial to promote an open and inclusive international dialogue involving governments, scientists, ethics experts, civil society organisations, and the general public. As pointed out in our article on bioethics and artificial intelligence, we must set clear and binding limits on the development and use of autonomous weapons before it is too late.

This is not about stopping technological progress, but about steering it responsibly and consciously. Artificial intelligence has the potential to bring extraordinary benefits in many fields, but its application to weapons demands particularly serious and thorough reflection. The stakes are too high to allow us to remain inert. The future of war, and perhaps of humanity itself, depends on the choices we make today.


🏷️ Tags: autonomous-weapons military-ethics ai artificial-intelligence war-technology
