Israel uses AI to target Hamas

 Guardian:

The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.

Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.

Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines.

“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
...

Artificial intelligence apparently speeds up the process of identifying targets, which matters when you are trying to stop an enemy attack. But computers are not infallible.

See also:

Biden presses Netanyahu toward cease-fire deal, warns of change in US policy
