Overview
- Israel's AI targeting program 'Lavender', used in the Gaza conflict, raises ethical concerns over its role in high civilian casualties.
- Investigations reveal that the AI's target-identification process is flawed, with a significant error rate and minimal human oversight.
- The civilian death toll in Gaza exceeds 33,000, with reports questioning Israel's adherence to international law and the principle of proportionality.
- A companion program, 'Where's Daddy?', tracks Hamas suspects to their homes, where strikes often kill family members as 'collateral damage'.
- International scrutiny increases over Israel's use of AI in warfare, with calls for more stringent regulations and accountability.