Updated: March 20, 2023
Published: July 30, 2021
Here at Osano, we name our major product feature releases after privacy heroes. Here's why our July feature release, Text Customization for Consent Manager, is named after Helen Nissenbaum.
One of the reasons it can be difficult for people to understand their privacy rights or for companies to make intelligent decisions is that privacy isn't easily defined. That's why Helen Nissenbaum's contribution to privacy theory is so significant.
Nissenbaum developed one of the most critical theoretical underpinnings for regulating data uses. While the U.S. framework currently relies on giving users "notice" and "choice" when companies seek to collect their data, Nissenbaum's work argues that the focus should instead be on "contextual integrity."
Using contextual integrity means companies should evaluate whether the personal data they're collecting will be used in ways other than the initial purpose for which it was collected. Defining a customer's "reasonable" expectation of how their data will be treated can get slippery for companies. Let's say you collect data from a customer to provide them a service, and then you want to sell that data to a company seeking to serve personalized ads. Contextual integrity theory would say that you need to look at whether the customer would reasonably expect the data to be used for this purpose in light of the overall context in which they provided it. Telling customers that you do this in a privacy policy may help, but it may not be enough on its own, given that customers generally don't read privacy policies and likely wouldn't expect their data to be used this way.
A real-life example of this might be helpful here. Think about Facebook's News Feed feature, which debuted in 2006 and "fed" users real-time updates on what their friends were posting: whether they'd broken up with their boyfriends, whether they'd added any new connections, and other notifications. Not everyone liked the intrusion. One million people joined a "Facebook News Feed protest group."
The crux of the problem was that users didn't expect Facebook to peer so closely into their lives and display what it saw to their online connections. Offline, we understand how much privacy we can expect based on context. If I close the door to my office, everyone understands I want space and privacy. If I'm sitting at a cafe, close to a friend and chatting quietly, no one would assume it'd be fine to stand nearby and cock an ear our way. But online, that context is difficult to perceive. So when Facebook's News Feed broadcasts that I broke up with my boyfriend, it suddenly feels invasive. Just because I signed up for Facebook and clicked "yes" to whatever I had to so I could join doesn't mean I expect my personal details to be treated like a news scroll on cable TV.
In their article, “Contextual gaps: Privacy issues on Facebook,” Gordon Hull, Celine Latulipe, and Heather Richter Lipford write, "Offline, privacy is mediated by highly granular social contexts. Online contexts, including social networking sites, lack much of this granularity. These contextual gaps are at the root of many of the sites' privacy issues."
And that's what Nissenbaum would have advised Facebook to avoid out of respect for consumer privacy. (In the end, Facebook ignored the protests, and the feature lives on.) In an age when companies often seem more interested in dodging liability than in gaining customer trust, Nissenbaum's theory asks companies to do better. A contextual integrity framework looks at the data use holistically. It requires data collectors to take an ethical approach that considers societal norms, which will always evolve. It asks data collectors to look at the context of the situation to determine what that "norm" is. Is the data subject in this context a patient? An online shopper? An existing customer? Is the data collector an online retailer? A doctor? What kind of information is being collected, and how will it be used? Once you take all of those factors into account, you can start to apply what the societal "norm" would be in the context at hand.
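To make this concrete, here is a minimal, hypothetical sketch (in Python) of how a team might encode that kind of contextual check for a proposed data use. The roles, the norms table, and the function names below are illustrative assumptions made for this article; they are not part of Nissenbaum's formal framework or of any particular product.

from dataclasses import dataclass

# Hypothetical model of a data flow: who the data is about, who collects it,
# what kind of information it is, and the purpose it will serve.
@dataclass(frozen=True)
class DataFlow:
    subject_role: str      # e.g., "patient", "online shopper", "existing customer"
    collector_role: str    # e.g., "doctor", "online retailer"
    info_type: str         # e.g., "health record", "purchase history"
    purpose: str           # e.g., "treatment", "fulfill order", "personalized ads"

# Illustrative table of contextual norms: for a given (subject, collector,
# info type) context, which purposes would a person reasonably expect?
EXPECTED_PURPOSES = {
    ("online shopper", "online retailer", "purchase history"): {"fulfill order", "customer support"},
    ("patient", "doctor", "health record"): {"treatment", "billing"},
}

def respects_contextual_integrity(flow: DataFlow) -> bool:
    """Return True if the proposed purpose falls within the norms of the
    context in which the data was originally collected."""
    context = (flow.subject_role, flow.collector_role, flow.info_type)
    allowed = EXPECTED_PURPOSES.get(context, set())
    return flow.purpose in allowed

# The ad-sale example from above: data collected to provide a service,
# then proposed for sale to an ad-targeting company.
ad_sale = DataFlow("online shopper", "online retailer", "purchase history", "personalized ads")
print(respects_contextual_integrity(ad_sale))  # False: outside the collection context

A static lookup table obviously can't capture societal norms that keep evolving; the point of the sketch is only that the question being asked is "does this use fit the context in which the data was collected?" rather than "did the user click accept?"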
If Facebook looked at its plans through a contextual integrity lens, it likely would have decided the News Feed didn't align with users' expectations in joining the site and forking over their data.
In addition to her ethical contributions to the field, Nissenbaum, a professor of information science at Cornell Tech in New York City, directs the Digital Life Initiative. Launched in 2017, it aims to study the "societal tensions arising from existing and emerging digital technologies," taking into consideration ethics, policy, politics, and quality of life.
Among other accolades, Nissenbaum received the International Association for Computing and Philosophy's 2021 Covey Award for her contributions to computing, ethics and philosophy.
Her books include Obfuscation: A User's Guide to Privacy and Protest; Values at Play in Digital Games; and Privacy in Context: Technology, Policy and the Integrity of Social Life.
Osano Staff is a pseudonym used by team members when authorship may not be relevant. Osanians are a diverse team of free thinkers who enjoy working as part of a distributed team, with the common goal of making the internet more transparent.