
"Statisticians speak of something called the Paradox of the False Positive. Here's how that works: imagine that you've got a disease that strikes one in a million people, and a test for the disease that's 99% accurate. You administer the test to a million people, and it will be positive for around 10,000 of them – because for every hundred people, it will be wrong once (that's what 99% accurate means). Yet, statistically, we know that there's only one infected person in the entire sample. That means that your "99% accurate" test is wrong 9,999 times out of 10,000!
Terrorism is a lot less common than one in a million and automated "tests" for terrorism – data-mined conclusions drawn from transactions, Oyster cards, bank transfers, travel schedules, etc – are a lot less accurate than 99%. That means practically every person who is branded a terrorist by our data-mining efforts is innocent."
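The arithmetic in the excerpt is just Bayes' rule, and it can be checked in a few lines. The sketch below uses the figures from the quote (a base rate of one in a million, and "99% accurate" read as both 99% sensitivity and 99% specificity, which is an assumption the quote doesn't spell out):

```python
# Bayes' rule check of the "Paradox of the False Positive" from the quote.
# Assumed reading of "99% accurate": 99% sensitivity AND 99% specificity.

base_rate = 1 / 1_000_000       # P(disease): one in a million
sensitivity = 0.99              # P(positive | disease)
false_positive_rate = 0.01      # P(positive | no disease) = 1 - specificity

# Total probability of testing positive
p_positive = (sensitivity * base_rate
              + false_positive_rate * (1 - base_rate))

# Posterior probability: how likely is disease given a positive test?
p_disease_given_positive = sensitivity * base_rate / p_positive

population = 1_000_000
expected_positives = p_positive * population

print(f"Expected positives per million tests: {expected_positives:,.0f}")
print(f"P(disease | positive test): {p_disease_given_positive:.6f}")
```

Running this gives roughly 10,001 expected positives per million people tested, of whom only about one actually has the disease, so a positive result is wrong more than 99.99% of the time, just as the quote says. Shrink the base rate further (as with terrorism) and the posterior collapses even more.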
Read the whole article: it's short, makes a great point, and is the perfect answer the next time someone insists we need data mining, or illegal wiretapping, or widespread bank-transaction surveillance, or... you get the picture. Statistics are good at making predictions about generic cases, not specific ones: useful as a guide, but never as useful in bringing in an actual perpetrator as hardcore, down-and-dirty police work. Instead of providing the tools needed for law enforcement and terror prevention, as Doctorow points out, this kind of data mining takes needles in haystacks and buries them in deeper haystacks for our law enforcement and intelligence agencies to sift through.