Israel’s A.I. Experiments in Gaza War Raise Ethical Concerns

At the end of 2023, Israel was trying to assassinate Ibrahim Biari, a top Hamas commander in the northern Gaza Strip who had helped plan the October 7 massacres. But Israeli intelligence could not find Mr. Biari, who they believed was hiding in the network of tunnels beneath Gaza.

So Israeli officers turned to a new military technology infused with artificial intelligence, three Israeli and American officials briefed on the events said. The technology had been developed a decade earlier but had never been used in battle. Finding Mr. Biari provided a new incentive to improve the tool, and engineers in Israel’s Unit 8200, the country’s equivalent of the National Security Agency, soon integrated A.I. into it, the people said.

Shortly thereafter, Israel listened in on Mr. Biari’s calls and tested the A.I. audio tool, which gave an approximate location for where he was making his calls. Using that information, Israel ordered airstrikes on the area on October 31, 2023, killing Mr. Biari. More than 125 civilians also died in the attack, according to Airwars, a London-based conflict monitor.

The audio tool was just one example of how Israel has used the war in Gaza to rapidly test and deploy A.I.-backed military technologies to a degree that had not been seen before, according to interviews with nine American and Israeli defense officials, who spoke on the condition of anonymity because the work is confidential.

In the past 18 months, Israel has also combined A.I. with facial recognition software to match partly obscured or injured faces to real identities, turned to A.I. to compile potential airstrike targets, and created an Arabic-language A.I. model to power a chatbot that could scan and analyze text messages, social media posts and other Arabic-language data, two people with knowledge of the programs said.

Many of these efforts grew out of a partnership between soldiers enlisted in Unit 8200 and reserve soldiers who work at technology companies such as Google, Microsoft and Meta, three people with knowledge of the technologies said. Unit 8200 set up what became known as “The Studio,” an innovation hub and a place to match experts with A.I. projects, the people said.

Yet even as Israel raced to develop its A.I. arsenal, deployment of the technologies sometimes led to mistaken identifications and arrests, as well as civilian deaths, the Israeli and American officials said. Some officials have struggled with the ethical implications of the A.I. tools, which could result in increased surveillance and other civilian killings.

No other nation has been as active as Israel in experimenting with A.I. tools in real-time battles, European and American defense officials said, giving a preview of how such technologies may be used in future wars, and of how they may go wrong.

“The urgent need to cope with the crisis accelerated innovation, much of it A.I.-powered,” said Hadas Lorber, head of the Institute for Applied Research in Responsible A.I. at Israel’s Holon Institute of Technology and a former senior director at the Israeli National Security Council. “It led to game-changing technologies on the battlefield and advantages that proved critical in combat.”

But the technologies “also raise serious ethical questions,” Ms. Lorber said. A.I. needs checks and balances, she warned, adding that humans should make the final decisions.

A spokeswoman for Israel’s military said she could not comment on specific technologies because of their “confidential nature.” Israel “is committed to the lawful and responsible use of data technology tools,” she said, adding that the military was investigating the strike on Mr. Biari and was “unable to provide any further information until the investigation is complete.”

Meta and Microsoft declined to comment. Google said it has “employees who do reserve duty in various countries around the world. The work those employees do as reservists is not connected to Google.”

Israel previously used conflicts in Gaza and Lebanon to experiment with and advance technological tools for its military, such as drones, phone-hacking tools and the Iron Dome air-defense system, which can help intercept short-range ballistic missiles.

After Hamas launched cross-border attacks into Israel on October 7, 2023, killing more than 1,200 people and taking 250 hostages, A.I. technologies were quickly cleared for deployment, four Israeli officials said. That led to the cooperation between Unit 8200 and the reserve soldiers in “The Studio” to swiftly develop new A.I. capabilities, they said.

Avi Hasson, chief executive of Startup Nation Central, an Israeli nonprofit that connects investors with companies, said reservists from Meta, Google and Microsoft had become crucial in driving innovation in drones and data integration.

“The reservists brought know-how and access to key technologies that weren’t available in the military,” he said.

Israel’s military soon used A.I. to enhance its drone fleet. Aviv Shapira, founder and chief executive of XTEND, a software and drone company that works with the Israeli military, said A.I.-powered algorithms were used to build drones that could lock on and track targets from a distance.

“In the past, homing capabilities relied on zeroing in on an image of the target,” he said. “Now A.I. can recognize and track the object itself, whether it’s a moving car or a person, with deadly precision.”

Mr. Shapira said his main clients, the Israeli military and the United States Department of Defense, were aware of the ethical implications of A.I. in warfare and discussed responsible uses of the technology.

One tool that “The Studio” developed was an Arabic-language A.I. model known as a large language model, three Israeli officers familiar with the program said. (The large language model was earlier reported on by +972 Magazine, an Israeli-Palestinian news site.)

Developers had previously struggled to create such a model because of a scarcity of Arabic-language data to train the technology. When such data was available, it was mostly in standard written Arabic, which is more formal than the dozens of dialects used in spoken Arabic.

The Israeli military did not have that problem, the three officers said. The country had decades of intercepted text messages, transcribed phone calls and posts scraped from social media in spoken Arabic dialects. So Israeli officers created the large language model in the first few months of the war and built a chatbot to run queries in Arabic. They merged the tool with multimedia databases, allowing analysts to run complex searches across images and videos, four Israeli officials said.

When Israel assassinated the Hezbollah leader Hassan Nasrallah in September, the chatbot analyzed the responses across the Arabic-speaking world, three Israeli officers said. The technology differentiated among the dialects spoken in Lebanon to gauge public reaction, helping Israel assess whether there was public pressure for a counterstrike.

At times, the chatbot could not identify some modern slang terms and words that had been transliterated from English to Arabic, two officers said. That required Israeli intelligence officers with expertise in different dialects to review and correct its work, one of the officers said.

The chatbot also sometimes gave incorrect answers, for instance returning photos of pipes instead of guns, two Israeli intelligence officers said. Even so, the A.I. tool significantly sped up research and analysis, they said.

At temporary checkpoints set up between the northern and southern Gaza Strip, Israel also began equipping cameras after the October 7 attacks with the ability to scan and send high-resolution images of Palestinians to an A.I.-backed facial recognition program.

That system, too, sometimes had trouble identifying people whose faces were obscured. That led to arrests and interrogations of Palestinians who were mistakenly flagged by the facial recognition system, two Israeli intelligence officers said.

Israel also used A.I. to sift through data amassed by intelligence officials on Hamas members. Before the war, Israel built a machine-learning algorithm, code-named “Lavender,” that could quickly sort data to hunt for low-level militants. It was trained on a database of confirmed Hamas members and was meant to predict who else might be part of the group. Though the system’s predictions were imperfect, Israel used it at the start of the war in Gaza to help choose attack targets.

Few goals loomed larger than finding and eliminating Hamas’s senior leadership. Near the top of the list was Mr. Biari, the Hamas commander who Israeli officials believed played a central role in planning the October 7 attacks.

Israel’s military intelligence quickly intercepted Mr. Biari’s calls with other Hamas members but could not pinpoint his location. So it turned to the A.I.-backed audio tool, which analyzed different sounds, such as sonic bombs and airstrikes.

After deducing an approximate location for where Mr. Biari was making his calls, Israeli military officials were warned that the area, which included several apartment complexes, was densely populated, two intelligence officers said. An airstrike would need to target several buildings to ensure Mr. Biari was killed, they said. The operation was greenlit.

Since then, Israeli intelligence has also used the audio tool alongside maps and photos of Gaza’s underground tunnel maze to locate hostages. Over time, the tool was refined to more precisely locate individuals, two Israeli officers said.
