5 Privacy Trends for 2025: What to Watch For
Updated: October 14, 2024
Published: February 9, 2023
Data privacy has never been more top of mind. From regulators to businesses, privacy professionals to consumers, and more, everyone has a stake in data privacy.
With all this attention and focus, the data privacy world is evolving at a breakneck pace—not just in terms of legislation, but also in terms of best practices, awareness, and risk. Here are the top 5 trends we’re seeing unfold in 2024.
Without a doubt, AI will be making every 2024 data privacy trends list you come across.
In late 2022, OpenAI released a free research preview of its generative AI project ChatGPT. Then, in 2023, ChatGPT was upgraded to use the more robust GPT-4 model as its core AI. Before long, ChatGPT and other generative AI models were virtually ubiquitous in digital spaces.
Tech conversations were dominated by AI over the course of 2023. But now that the dust has settled somewhat, the world has had (some) time to digest the impact of generative AI on regulations and business ethics.
No doubt AI technology is going to advance exponentially over the next few years, and there will almost certainly be another massive leap in functionality, just like we saw in 2023. But when that does happen, we’ll be better prepared; now, regulators, academics, and technologists are hard at work determining how to mitigate unethical uses of AI today and in the future.
There are plenty of ways AI can be misused, but data privacy is of particular concern when it comes to the (un)ethical use of AI.
The first thing that might come to mind is the accidental exposure of private information. Generative AI models like ChatGPT scrape huge amounts of data from a variety of sources in order to “train” the AI algorithm and generate human-like responses to queries.
Here’s what ChatGPT had to say about its own training dataset: when asked, it offered little in the way of specifics. So, less than helpful.
However, estimates suggest that ChatGPT’s training set consisted of around 570GB of data obtained from internet-available texts—that’s a lot of data, and some of it is likely personal information. If an AI model is trained on personal information, then there is a good chance that information could be exposed in its outputs. And that’s not to mention the fact that any personal information you provide in a chat with ChatGPT is also fair game for use as training data.
If that data and its data subject are protected by a data privacy regulation, then the AI developer will be in violation unless they first take the appropriate steps, such as asking for consent.
But toothpaste isn’t easy to put back into the tube, and personal data isn’t easy to delete from a trained AI model; as a result, the remedy may be deleting the model itself. The FTC, for instance, has the power to demand the deletion of such models through an action called “algorithm disgorgement.”
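One way developers try to reduce this risk in the first place is to filter obvious identifiers out of text before it ever reaches a training set. Here’s a minimal, hypothetical sketch of that idea; the regex patterns, sample text, and function are illustrative assumptions only, not a description of how any particular AI vendor actually sanitizes its data.

```python
import re

# Hypothetical patterns; real PII detection is far more involved than this.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b")

def redact(text: str) -> str:
    """Replace obvious emails and phone numbers before text is stored for training."""
    text = EMAIL.sub("[EMAIL]", text)
    return PHONE.sub("[PHONE]", text)

scraped = "Contact Jane Doe at jane.doe@example.com or 555-867-5309 for details."
print(redact(scraped))
# -> Contact Jane Doe at [EMAIL] or [PHONE] for details.
# Note that the name "Jane Doe" slips straight through this naive filter.
```

Even this toy filter shows the limits of the approach: the name sails right through, which is part of why personal information so easily ends up inside trained models.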
While the exposure of private information in an AI model is a major concern, it’s not the only conflict between AI and data privacy. Some AI applications can be used to scrape biometric information from photos and videos on the web.
Set aside the fact that a dataset of biometric information is dangerous in the wrong hands, such as a domestic abuser's; even if such datasets were reserved for law enforcement, the typical use case for this technology, they would effectively place everyone in a perpetual police lineup. This is exactly what happened with Clearview AI, and it drew exactly this criticism.
Lastly, AI can be a powerful tool to access sensitive personal information that would have otherwise been protected. Again, some might not object to security-breaking AI tools being in the hands of “the good guys”—but technology has never remained solely in the hands of those capable of using it responsibly for long.
Privacy-enhancing technologies, or PETs, have been around for a while. However, the recent surge in AI technology means they’ll be more important than ever.
PETs are all about reducing or even eliminating a system’s access to personal information without affecting its functionality. For generative AI systems that rely on huge training datasets, then, PETs will be an essential way to comply with regulations while still producing a useful model.
In fact, the PET market is forecast to reach $25.8 billion by 2033. Over that period, we can expect to see growing attention paid to PETs such as differential privacy, federated learning, homomorphic encryption, and synthetic data.
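As a rough illustration of what one of these techniques looks like in practice, here’s a minimal sketch of differential privacy using the Laplace mechanism. The records, query, and epsilon value below are illustrative assumptions, not any specific product’s implementation or a legal standard.

```python
import numpy as np

def private_count(matches: list[bool], epsilon: float = 1.0) -> float:
    """Return a noisy count of matching records.

    A counting query has sensitivity 1 (one person joining or leaving the
    dataset changes the count by at most 1), so Laplace noise with scale
    sensitivity / epsilon gives epsilon-differential privacy.
    """
    sensitivity = 1.0
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return sum(matches) + noise

# Hypothetical query: how many users in a small sample opted in to marketing.
opted_in = [True, False, True, True, False, True]
print(f"Noisy opt-in count: {private_count(opted_in, epsilon=0.5):.1f}")
```

The design point is that the noisy answer stays useful in aggregate while no individual’s opt-in status can be confidently inferred from it.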
Greater enforcement was on our list for last year’s data privacy trends, and this year is no exception. As more laws come online, more enforcement is inevitable.
Especially when it comes to data privacy regulations, state attorneys general, the FTC, the CPPA, and EU data protection authorities are eager to prove that they have bite to match their bark. Throughout 2024, we expect enforcement of state laws to ramp up, especially from the CPPA as it enforces the CPRA, which became enforceable on July 1, 2023.
Note, however, that the July 1, 2023 date applies only to the additional rulemaking the CPPA has gone through; anything in the statutory text of the CCPA/CPRA itself has been fair game for enforcement all along (as we saw with the 2022 Sephora enforcement action).
In 2024, several other state laws go into effect and are therefore enforceable, including those in Texas, Oregon, Montana, and Florida. Furthermore, numerous state laws have already gone into effect in previous years and remain enforceable, including those in California, Virginia, Colorado, Connecticut, and Utah.
All of these laws are enforced by state attorneys general eager to make an example of violators. If you want to review an updated list of U.S. privacy laws and their associated characteristics, check out U.S. Data Privacy Laws: A Guide to the 2024 Landscape.
Investment decisions and corporate identities are increasingly influenced by environmental, social, and governance (ESG) factors. In fact, asset managers’ total ESG-related assets under management (AuM) are forecasted to reach US$33.9 trillion by 2026—that’s nothing to sneeze at.
The 2004 report titled "Who Cares Wins", which popularized the term “ESG,” introduced the concept like so:
Ultimately, successful investment depends on a vibrant economy, which depends on a healthy civil society, which is ultimately dependent on a sustainable planet. In the long-term, therefore, investment markets have a clear self-interest in contributing to better management of environmental and social impacts in a way that contributes to the sustainable development of global society.
As a result, argued the report, investors and businesses should prioritize ESG factors. In doing so, they would promote a sustainable planet, a healthy civil society, and a vibrant economy.
Few could argue that full-throated support of individuals’ data privacy rights does not contribute to a healthy civil society. The ESG approach, however, has its roots in that 2004 report, whereas data privacy, though a long-standing issue, only really reached the public consciousness in 2016, when the GDPR was passed. Perhaps in part because of this gap, data privacy has been viewed as a parallel, but distinct, concern rather than an ESG factor in its own right.
Increasingly, however, organizations are recognizing that data privacy is very much an integral aspect of ESG. External ESG rating agencies, which investors rely on to identify ESG-focused investment opportunities, often include privacy and cybersecurity as components of a business’s overall ESG score.
What this trend really means is that data privacy is becoming a brand statement. The public is more and more aware of their data privacy rights, as well as the data privacy wrongs companies have committed in the past. Businesses, in turn, are more aware of their responsibility to act as ethical data stewards.
In 2024, we can expect this trend to intensify; businesses that wish to be known as ESG-driven and responsible participants in the economy will tout their respect for their customers’ data privacy.
Research shows that targeted advertising is only about 4% more profitable than “dumb” advertising, but ad-based businesses are still fighting hard to hold onto that extra 4%. That’s evident in Meta’s experiment with a model dubbed “Pay or Okay” in the EU.
Under this approach, Meta offers EU users a choice: either pay a monthly subscription fee for an ad-free experience, or consent to the processing of their personal data for targeted advertising and continue using the service for free.
As of this writing, TikTok is already following Meta’s lead with a limited test of the same model. If Meta and TikTok are successful, then it wouldn’t be a surprise if other ad-supported social media platforms offered an ad-free subscription tier.
This development isn’t sitting well with privacy advocates, however, who characterize the choice between paying a fee or consenting to data processing as coercive. If that’s the case, then users couldn’t be considered to have “freely given” consent, as required by the GDPR.
If every social media system followed this model, then it could become very expensive indeed to simply wish not to be tracked. Does that mean businesses should provide their services for free? Of course not. Instead, they could simply offer non-targeted advertisements, forgoing the extra 4% of profitability in favor of a lower risk profile and better brand reputation.
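To sketch what that alternative might look like in code, here’s a hypothetical consent-gated ad selector that falls back to contextual, non-targeted creative when a user hasn’t consented to tracking. The Ad type, inventory, and consent flag are assumptions for illustration; they don’t describe any platform’s actual ad stack.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    creative: str
    requires_profile: bool  # True if the ad depends on behavioral/tracking data

# Hypothetical inventory with one behavioral ad and one contextual fallback.
INVENTORY = [
    Ad("Running shoes you viewed last week", requires_profile=True),
    Ad("Shoes on sale this month", requires_profile=False),
]

def select_ad(has_consented_to_tracking: bool) -> Ad:
    """Serve a behavioral ad only with consent; otherwise fall back to contextual."""
    for ad in INVENTORY:
        if has_consented_to_tracking or not ad.requires_profile:
            return ad
    raise LookupError("No eligible ad in inventory")

print(select_ad(has_consented_to_tracking=False).creative)
# -> Shoes on sale this month
```

The point of the sketch is simply that the service keeps running and keeps earning ad revenue; it just gives up the behavioral targeting for users who decline to be tracked.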
Currently, the “pay or okay” model is being challenged in European courts, and TikTok’s experiment is still ongoing. Over the course of 2024, we’ll see whether or not this approach is legal in the eyes of the EU authorities and tolerable in the eyes of EU citizens.
Really, these different trends are all the same trend: Data privacy compliance is becoming more important for modern businesses. Consumers are becoming savvier and better equipped, enforcement authorities are on the prowl, businesses of all sizes are building privacy programs, the risk inherent to data collection has become more obvious, and companies are paying attention to the space. All of this adds up to a level of focus on data privacy that has never been higher.
At the same time, there is also widespread confusion around where to start. There is a lot to compliance, and prioritization is a challenge. For businesses and privacy professionals looking for guidance on what comes next, we recommend checking out Osano’s action plan for compliance with 2024’s privacy laws.
Matt Davis is a writer at Osano, where he researches and writes about the latest in technology, legislation, and business to spread awareness about the most pressing issues in privacy today. When he’s not writing about data privacy, Matt spends his time exploring Vermont with his dog, Harper; playing piano; and writing short fiction.
Osano is used by the world's most innovative and forward-thinking companies to easily manage and monitor their privacy compliance.
With Osano, building, managing, and scaling your privacy program becomes simple. Schedule a demo or try a free 30-day trial today.