Updated: March 20, 2023
Published: November 13, 2020
When we think of data privacy concerns, we tend to think of tech giants like Facebook or Google selling our browsing habits, age and location to the highest bidder in order to send us targeted ads or, in a more sinister scenario, cooperating with governments to surveil citizens’ communications.
But Latanya Sweeney believes the danger surrounding data misuse goes beyond risks to the isolated individual. For her, data privacy is tightly intertwined with societal issues of discrimination, bias and threats to the most vulnerable among us. Technology is not just a tool we’re constantly developing and using. It’s also a mirror reflecting societal impulses, desires and prejudices, including deeply entrenched attitudes towards race, gender and religion.
Sweeney is no stranger to overcoming obstacles. Just look at her resume for evidence of that. She earned a Ph.D. in computer science from MIT in 2001, making her the first Black woman to do so. In a field where men, and white men in particular, are so predominant, being a Black woman in computer science can be a lonely place. But it also meant Sweeney was perhaps able to identify and address issues in data privacy and the deeper implications of technology in ways that her white male colleagues hadn’t considered. One pillar of her research is the finding that identifiers such as names and zip codes can be misused for discriminatory purposes. Add to that the chilling fact that 87% of the U.S. population can be uniquely identified by birth date, gender and zip code, and the necessity to uncover the biases implicitly and explicitly tied to these markers becomes all the more urgent.
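To make that statistic concrete, here is a minimal Python sketch of how one might measure what share of records in a dataset are unique on the (birth date, gender, zip code) combination. The field names and sample records are hypothetical, not from Sweeney’s study.

```python
from collections import Counter

# Hypothetical records: (birth_date, gender, zip_code) are the
# "quasi-identifiers" Sweeney studied -- none is identifying on its own,
# but together they often pinpoint a single person.
records = [
    ("1974-07-02", "F", "02138"),
    ("1974-07-02", "M", "02138"),
    ("1981-11-30", "F", "60614"),
    ("1981-11-30", "F", "60614"),  # shares its combination with the row above
    ("1990-03-15", "M", "73301"),
]

# Count how many records share each quasi-identifier combination.
combo_counts = Counter(records)

# A record is potentially re-identifiable if its combination occurs once.
unique_records = sum(1 for r in records if combo_counts[r] == 1)
print(f"{unique_records / len(records):.0%} of records are unique "
      "on (birth date, gender, ZIP)")  # -> 60% for this toy data
```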
Sweeney is the founding Director of the Data Privacy Lab, launched in 2001 at Carnegie Mellon University, where she was a professor of Computer Science, Technology and Policy. The lab moved to Harvard University in 2011, where Sweeney is professor of the practice of government and technology. The Data Privacy Lab, currently incubating more than 100 projects, researches how data privacy problems can exacerbate social issues and identifies the nature and extent of privacy risks as society becomes increasingly technologically empowered. Its aim is to propose data-sharing practices that maintain privacy and confidentiality.
The effects that data privacy, or the lack thereof, can have on discriminatory practices became apparent in her groundbreaking 2013 study on discrimination in online advertising. In “Discrimination in Online Ad Delivery,” Sweeney’s research revealed that ads delivered by Google AdSense were more likely to be related to criminal activity when the name being searched was one typically associated with Black people. In fact, a Black-identifying name such as Leroy, Darnell, or Keisha was 25% more likely to get an ad suggestive of an arrest record. The negative bias implicit in these search engine results could have detrimental consequences for people applying for jobs, awards or any other situation where their name would be searched online.
Sweeney says she first became aware of the issue when a colleague Googled her to find an old paper and was presented with an ad that said, “Latanya Sweeney. Arrested?”
“I was shocked,” she wrote. “I have never been arrested, and after clicking the link and paying the requisite fee, I found the company had no arrest record for anyone with my name either. We then entered his name, Adam Tanner, a white male name, and an ad for the same company appeared, except the ad for him had no mention of arrest or a criminal record.”
Her work uncovering discrimination in algorithms and data collection doesn’t stop there, though. As the founding Editor-in-Chief of the Journal of Technology Science, she has reported on various kinds of discriminatory practices facilitated by data collection, including the revelation that SAT prep services charged zip codes with high proportions of Asian residents nearly double the average price, regardless of residents’ actual income.
Sweeney has also made waves in the medical data field. In a 2015 article entitled “Only You, Your Doctor, And Many Others May Know,” she reported on an issue she had been concerned with since the 1990s. Through her research, Sweeney found it was alarmingly easy to link people with their supposedly anonymized health records. During a 1997 study on healthcare data security, she successfully linked then-Massachusetts Governor William Weld to his medical records using publicly accessible data, a process known as “re-identification”: matching details in a de-identified dataset to specific individuals, precisely enough to be able to contact them.
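For the curious, here is a rough Python sketch of the linkage attack at the heart of re-identification: joining a de-identified dataset against a public record on shared quasi-identifiers. Sweeney’s Weld re-identification worked along these lines, linking hospital discharge data with a voter roll on zip code, birth date and sex; all names and values below are invented.

```python
# A toy linkage attack. Two hypothetical datasets: de-identified health
# records (no names) and a public voter roll (names included). Joining
# them on shared quasi-identifiers re-attaches names to "anonymous" rows.
deidentified_health = [
    {"zip": "02138", "dob": "1970-01-01", "sex": "M", "diagnosis": "A"},
    {"zip": "60614", "dob": "1981-11-30", "sex": "F", "diagnosis": "B"},
]

public_voter_roll = [
    {"name": "Jane Roe", "zip": "60614", "dob": "1981-11-30", "sex": "F"},
    {"name": "John Doe", "zip": "73301", "dob": "1990-03-15", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def link(health_rows, voter_rows):
    """Yield (name, health_record) pairs where the quasi-identifiers
    match exactly one voter-roll entry, i.e. an unambiguous match."""
    index = {}
    for voter in voter_rows:
        key = tuple(voter[q] for q in QUASI_IDENTIFIERS)
        index.setdefault(key, []).append(voter["name"])
    for row in health_rows:
        names = index.get(tuple(row[q] for q in QUASI_IDENTIFIERS), [])
        if len(names) == 1:
            yield names[0], row

for name, record in link(deidentified_health, public_voter_roll):
    print(f"Re-identified {name}: diagnosis {record['diagnosis']}")
# -> Re-identified Jane Roe: diagnosis B
```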
Her discovery prompted swift action, most notably around the Health Insurance Portability and Accountability Act (HIPAA) of 1996, which was established to protect sensitive patient information. Regulators overhauled HIPAA’s de-identification standards to close the gap Sweeney had exposed, and her work on demographics shaped how the HIPAA Privacy Rule treats demographic fields.
Nonetheless, Sweeney found that issues persisted: Re-identification still proved easy to do through both newspaper stories and public health records. “The goal is not to stop data-sharing,” she wrote in a 2015 article. Instead, the “goal is to be smarter about how we perform data sharing. This is particularly important as the top buyers of statewide databases are not researchers but private companies, especially those constructing data profiles on individuals.” She suggested solutions such as more stringent access requirements and stronger encryption methods.
Although Sweeney’s work might initially seem more concerned with identifying and picking apart privacy issues than with setting up intelligent data-sharing systems, the latter can’t exist without the former, she explained in her work: “It is an evolutionary cycle. First, a re-identification vulnerability becomes known, which leads to improved practices and technical solutions, which in turn leads to other re-identifications, and so on, until eventually we achieve robust technical, policy, or administrative solutions,” she wrote.
By identifying these underlying vulnerabilities, she uncovered massive data privacy leaks in 2016 on online voter registration sites, leaks that opened up the possibility of voter identity theft attacks during that particularly contentious election. These are significant discoveries with far-reaching implications for democratic systems and, on a very fundamental level, human rights.
But instead of despairing or falling into paranoia and anger, Sweeney advises a policy of continuous improvement: “Silence and fear break the development cycle in data privacy. Without an ability to learn about data sharing risks, knowledge stagnates and society blindly repeats the same errors in the face of increased technological vulnerabilities,” she wrote. Her continuous work identifying vulnerabilities, flaws and outright misuse has led her to spearhead technological developments such as “k-anonymity,” a privacy protection model for databases, as well as “Scrub,” a medical-informatics process that locates and replaces personally identifying information in order to protect sensitive patient records.
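To illustrate the core idea of k-anonymity: a table is k-anonymous when every combination of quasi-identifier values is shared by at least k records, so no individual stands out within their group. Here is a minimal sketch with hypothetical, already-generalized data; real-world generalization strategies are considerably more sophisticated.

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every combination of quasi-identifier values appears
    in at least k rows, so no record stands out within its group."""
    counts = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in counts.values())

# Generalizing values (e.g., truncating ZIP codes, bucketing birth years)
# is how a raw table is typically coerced into k-anonymity.
rows = [
    {"zip": "021**", "birth_year": "1970-1979", "sex": "M"},
    {"zip": "021**", "birth_year": "1970-1979", "sex": "M"},
    {"zip": "606**", "birth_year": "1980-1989", "sex": "F"},
    {"zip": "606**", "birth_year": "1980-1989", "sex": "F"},
]

print(is_k_anonymous(rows, ("zip", "birth_year", "sex"), k=2))  # -> True
```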
By uncovering vulnerabilities and discriminatory practices, Sweeney pushes open the space to create data-sharing solutions that help, instead of harm, our society.
Luisa Rollenhagen is a German-Argentinian journalist who writes about politics and culture. She’s written for GQ, The Nation, Vulture, and BuzzFeed, among others, and is based in Berlin, Germany.