In the data protection industry, we talk a lot about potential risks. "It's not if; it's when you're breached," we warn. We promote data minimization, the idea that you shouldn't take more data than you need and you should delete it when you're done with it.
Sometimes, honestly, it can all seem a little alarmist. Sure, there are data breaches every day. And it's 100 percent annoying if you were affected, and now you have to put a freeze on your credit and purchase identity theft protection.
But this week's story that the Afghan government stored profiles on every person enlisted in the Afghan National Army and Afghan National Police is one of those horror stories come to life. The profiles included 40 data points on each person, and the records remained in the database even after an individual left the service.
MIT Technology Review reported that the database, called the Afghan Personnel and Pay System, was "used by both the Afghan Ministry of Interior and the Ministry of Defense to pay the national army and police," adding that the database "is arguably the most sensitive system of its kind in the country, going into extreme levels of detail about security personnel and their extended networks."
The MIT report continues: "The data is collected 'from the day they enlisted,' says one individual who worked on the system. ... Records could be updated, he added, but there was no deletion or data retention policy—not even in contingency situations, such as a Taliban takeover."
The first time I heard Woody Hartzog, a professor at Northeastern University, call for a total ban on facial recognition technology, it sounded crazy. Was he saying that to be provocative? He couldn't be proposing we ban an entire industry of emerging tech, could he?
If you know Hartzog or follow him on Twitter, you know he wasn't provoking for sport. His call to ban facial recognition technology is serious because the potential ramifications of using it to identify individuals are profound.
Imagine, right now, that you're a former U.S. ally who served in the Afghan forces. People are being dragged in the streets and killed for less. And you can do your best to hide. Perhaps you start dressing differently or shave your facial hair. But the Taliban has access to a scan of your iris. How do you escape that? There's no getting a reissue for an eye. It's yours, and you're inextricably tied to it as an identifier.
Again from the MIT report: "'The Afghan military trusted their international partners, including and led by the U.S., to build a system like this,' says one of the individuals familiar with the system. 'And now that database is going to be used as the [new] government's weapon.'"
For now, I can think only of the danger facing Afghan citizens, both in and outside of the databases at hand. But I hope this story makes visceral to all of us that our choices as data stewards matter. Yes, some decisions are riskier than others. But I've always loved the saying that "the way you do one thing is the way you do everything."
If that's true, we should protect data as if lives depend on its safekeeping. Because sometimes, they do.
We're going to talk about this and more on our next Twitter Spaces chat on Thursday, Sept. 2, at 4 p.m. Eastern, 1 p.m. Pacific. I hope you can join us! Click here to set an in-Twitter reminder.
Enjoy reading, and I'll see you next week!
Biometric databases could put Afghans at risk of Taliban retribution
This week, media sources reported Afghanistan's Taliban captured biometric devices abandoned when the U.S. military evacuated the country. In the end, the captured devices aren't as data-rich as previously understood. But MIT Technology Review reports a "greater threat from Afghan government databases containing sensitive personal information that could be used to identify millions of people around the country," especially Afghans who helped coalition forces.
Read Story
China's aggressive privacy stance could shift global standards
China's new privacy law will help the country establish global standards for data management, South China Morning Post reports. One expert said the law would "push data recipients located outside of the country to comply with Chinese laws more seriously, establishing long-arm jurisdiction." Chinese officials also ordered app platforms to remove 67 apps for infringing users' rights. Plus, China says it will use facial recognition technology to enforce a new law limiting Chinese children to three hours of video games per week.
Read Story
21-year-old claims he hacked T-Mobile to retaliate against US
A 21-year-old American living in Turkey says he is the hacker behind the T-Mobile breach that exposed more than 50 million customers' data. The man, John Binns, said he breached the company "to retaliate against the U.S. for kidnapping and torture of John Erin Binns in Germany by CIA and Turkish intelligence agents in 2019," ZDNet reports.
Read Story
UK taps New Zealand privacy commissioner to replace current data chief
The U.K. has named John Edwards to replace Information Commissioner Elizabeth Denham, TechCrunch reports. Edwards has served as New Zealand's Privacy Commissioner for the last seven years. He's known for his general disdain for Facebook: During the Cambridge Analytica scandal, he tweeted that he had deleted his account, citing the company's skirting of New Zealand privacy laws. The U.K. is in the midst of reforming its privacy law following its departure from the EU and the bloc's governing rules.
Read Story
UK's post-Brexit data privacy plans draw criticism
Just as the U.K. announced its new privacy commissioner, it also announced plans for a new series of data adequacy partnerships meant to allow the U.K. to "drive international trade," ZDNet reports. The Department for Digital, Culture, Media & Sport said it would prioritize partnerships with the U.S., Australia, the Republic of Korea, Singapore, Dubai and Colombia. Critics worry any plans that diverge from the EU's GDPR could jeopardize individuals' privacy rights. A spokesperson for the department said, "It means reforming our own data laws so that they're based on common sense, not box-ticking."
Read Story
Judge denies Clearview AI's move to dismiss biometric privacy suit
Last week, an Illinois court ruled against Clearview AI's request to dismiss a case alleging the company unlawfully collects Illinois residents' faceprints. Clearview argued the state's Biometric Information Privacy Act doesn't apply to faceprints. But the judge disagreed, and the case will proceed.
Read Story