TEHRAN — During the first days of the 2026 U.S.-Israeli aggression against Iran, families gathered at Tehran's Police Park.

Children chased each other across the grass. Parents pushed strollers along shaded paths.

Then the missiles came.

Somewhere in a command center, an artificial intelligence system had scanned satellite imagery and street names, detected the word "police," and flagged the location as a government target.

Even if it had been a police station, striking it would still be criminal aggression. But the error lays bare how callously algorithms, and those who deploy them, turn civilians into targets.

Did a human analyst review the coordinates and pull up photographs showing playground equipment and picnic blankets? Or did the algorithm decide based on the data it was fed?

Either way, the decision was executed. And Iranian families paid with their blood.

This is not a malfunction. This is how war is waged in the twenty-first century — by machines that kill without conscience, enabled by humans who refuse to look.

From Ga
