Data privacy, also called "information privacy," is the principle that a person should have control over their personal data, including the ability to decide how organizations collect, store and use their data.
Businesses regularly collect user data like email addresses, biometrics and credit card numbers. For organizations in this data economy, supporting data privacy means taking steps like obtaining user consent before processing data, protecting data from misuse and enabling users to actively manage their data.
Many organizations have a legal obligation to uphold data privacy rights under laws like the General Data Protection Regulation (GDPR). Even in the absence of formal data privacy legislation, companies may benefit from adopting privacy measures. The same practices and tools that protect user privacy can defend sensitive data and systems from malicious hackers.
Data privacy and data security are distinct but related disciplines. Both are core components of a company's broader data governance strategy.
Data privacy focuses on the individual rights of data subjects—that is, the users who own the data. For organizations, the practice of data privacy is a matter of implementing policies and processes that allow users to control their data in accordance with relevant data privacy regulations.
Data security focuses on protecting data from unauthorized access and misuse. For organizations, the practice of data security is largely a matter of deploying controls to prevent hackers and insider threats from tampering with data.
Data security reinforces data privacy by ensuring that only the right people can access personal data for the right reasons. Data privacy reinforces data security by defining the "right people" and "right reasons" for any set of data.
In many organizations, data privacy is overseen by an interdisciplinary team with representatives from the legal, compliance, IT and cybersecurity departments. These teams craft data management policies that govern how their organizations collect, use and protect personal data in light of users' privacy rights. They also design processes for users to exercise their rights and implement technical controls to secure data.
Organizations can use a variety of data privacy frameworks to guide their data policies, including the NIST Privacy Framework1 and the Fair Information Practice Principles.2 Moreover, the specifics of any organization's data governance strategy depend heavily on the privacy laws the company must comply with, if any.
That said, there are a few general data privacy principles that appear in most frameworks and regulations. These principles inform many organizations' data privacy policies, processes and controls.
Users have a right to know what data a company holds. Users should be able to access their personal data on demand. They should be able to update or amend that data as needed.
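As a rough illustration of what access and rectification requests involve, the Python sketch below handles both against a small in-memory store. The store, function names and record fields are illustrative assumptions, not any particular product's API.

```python
# Minimal sketch of data subject access and rectification requests.
# The in-memory store, function names and fields are illustrative only.
from copy import deepcopy

_user_records = {
    "user-123": {"email": "jane@example.com", "postal_address": "12 Elm St"},
}

def handle_access_request(user_id: str) -> dict:
    """Return a copy of everything held about the requesting user."""
    return deepcopy(_user_records.get(user_id, {}))

def handle_rectification_request(user_id: str, updates: dict) -> dict:
    """Apply user-supplied corrections to their own record."""
    record = _user_records.setdefault(user_id, {})
    record.update(updates)
    return deepcopy(record)

print(handle_access_request("user-123"))
print(handle_rectification_request("user-123", {"postal_address": "34 Oak Ave"}))
```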
Users have a right to know who has their data and what they do with it. At the point of data collection, organizations should clearly communicate what they are collecting and how they intend to use it. After collecting data, organizations should keep users informed about key data processing details, including any changes to how data is used and any third parties the data is shared with.
Internally, organizations should maintain up-to-date inventories of all the data they hold. Data should be classified based on type, level of sensitivity, compliance requirements and other relevant factors. Access control and usage policies should be enforced based on these classifications.
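Classification-driven access rules can be expressed directly in code. The categories and role-to-clearance mapping below are hypothetical, offered only as a sketch of the idea.

```python
# Hypothetical sketch: enforce access policies based on data classification.
from enum import Enum

class Classification(Enum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4  # for example, biometrics or payment data

# Illustrative mapping of roles to the highest classification they may read.
ROLE_CLEARANCE = {
    "marketing_analyst": Classification.INTERNAL,
    "support_agent": Classification.CONFIDENTIAL,
    "privacy_officer": Classification.RESTRICTED,
}

def can_access(role: str, data_class: Classification) -> bool:
    clearance = ROLE_CLEARANCE.get(role, Classification.PUBLIC)
    return data_class.value <= clearance.value

print(can_access("marketing_analyst", Classification.RESTRICTED))  # False
print(can_access("privacy_officer", Classification.RESTRICTED))    # True
```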
Organizations should get user consent for data storage, collection, sharing or processing whenever possible. If an organization keeps or uses personal data without the subject's consent, it should have a compelling reason to do so, such as a public interest use or a legal obligation.
Data subjects should have a way to raise concerns about, or object to, the handling of their data. They should be able to withdraw their consent at any time.
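One way to make consent auditable and revocable is to record it explicitly per purpose, with the most recent event deciding whether processing may proceed. The ledger below is a simplified, hypothetical model, not a compliance tool.

```python
# Simplified, hypothetical consent ledger: consent is recorded per purpose
# and can be withdrawn at any time; processing checks the latest state.
from datetime import datetime, timezone

consent_ledger: list[dict] = []

def record_consent(user_id: str, purpose: str, granted: bool) -> None:
    consent_ledger.append({
        "user_id": user_id,
        "purpose": purpose,
        "granted": granted,
        "timestamp": datetime.now(timezone.utc),
    })

def has_consent(user_id: str, purpose: str) -> bool:
    events = [e for e in consent_ledger
              if e["user_id"] == user_id and e["purpose"] == purpose]
    return bool(events) and events[-1]["granted"]  # the latest event wins

record_consent("user-123", "email_marketing", granted=True)
record_consent("user-123", "email_marketing", granted=False)  # withdrawal
print(has_consent("user-123", "email_marketing"))  # False
```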
Organizations should strive to ensure the data they collect and keep is accurate. Inaccuracies can lead to privacy violations. For example, if a company has an old address on file, it could accidentally mail sensitive documents to the wrong person.
An organization should have a definite purpose for any data it collects. It should communicate this purpose to users and only use the data for this purpose. The organization should only collect the minimum amount of data necessary for its stated purpose and keep the data only until that purpose is fulfilled.
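Retention tied to purpose can be automated. Here is a minimal sketch, assuming a simple record format and a single retention period per purpose; real retention schedules are usually far more granular.

```python
# Minimal sketch: keep data only until its stated purpose is fulfilled.
# The record format and retention periods are assumptions for illustration.
from datetime import datetime, timedelta, timezone

RETENTION = {"order_fulfillment": timedelta(days=90)}

records = [
    {"user_id": "user-123", "purpose": "order_fulfillment",
     "collected_at": datetime.now(timezone.utc) - timedelta(days=120),
     "data": {"shipping_address": "12 Elm St"}},
]

def purge_expired(records: list) -> list:
    """Drop records whose retention period for their stated purpose has passed."""
    now = datetime.now(timezone.utc)
    return [r for r in records
            if now - r["collected_at"] <= RETENTION.get(r["purpose"], timedelta(0))]

records = purge_expired(records)
print(records)  # [] -- the 120-day-old record exceeded its 90-day retention
```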
Privacy should be the default state of every system and process in the organization. Any products the organization designs or implements should treat user privacy as a core feature and key concern. Data collection and processing should be opt-in rather than opt-out. Users should maintain control of their data at every step.
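In practice, privacy by default often comes down to how defaults are coded. A hypothetical settings object might look like the following, with every optional form of collection or sharing disabled until the user explicitly opts in.

```python
# Hypothetical sketch of privacy-by-default settings: every optional form of
# collection or sharing starts disabled and requires an explicit opt-in.
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    analytics_tracking: bool = False
    personalized_ads: bool = False
    share_with_partners: bool = False

settings = PrivacySettings()          # defaults protect the user
settings.analytics_tracking = True    # enabled only after an explicit opt-in
print(settings)
```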
Organizations should implement processes and controls to protect the confidentiality and integrity of user data.
At the process level, organizations can take steps like training employees on compliance requirements and only working with vendors and service providers that respect user privacy.
At the level of technical controls, organizations can use a number of tools to safeguard data. Identity and access management (IAM) solutions can enforce role-based access control policies so only authorized users can access sensitive data. Strict authentication measures like single sign-on (SSO) and multi-factor authentication (MFA) can keep hackers from hijacking legitimate users' accounts.
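A role-based access check with a step-up MFA requirement for sensitive data might be sketched as follows. The roles, permission strings and MFA flag are illustrative, not a specific IAM product's API.

```python
# Illustrative role-based access control check with an MFA requirement
# for sensitive personal data; not any specific IAM product's API.
ROLE_PERMISSIONS = {
    "support_agent": {"read:customer_profile"},
    "billing_admin": {"read:customer_profile", "read:payment_data"},
}

def authorize(role: str, permission: str, mfa_verified: bool) -> bool:
    if permission not in ROLE_PERMISSIONS.get(role, set()):
        return False
    if permission == "read:payment_data" and not mfa_verified:
        return False  # step-up authentication required for sensitive data
    return True

print(authorize("support_agent", "read:payment_data", mfa_verified=True))   # False
print(authorize("billing_admin", "read:payment_data", mfa_verified=False))  # False
print(authorize("billing_admin", "read:payment_data", mfa_verified=True))   # True
```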
Data loss prevention (DLP) tools can discover and classify data; monitor usage; and prevent users from inappropriately altering, sharing or deleting data. Data backup and archiving solutions can help organizations recover lost or damaged data.
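At a very small scale, the discovery-and-classification part of DLP resembles pattern scanning. The regular expressions below are deliberately simplistic examples; commercial DLP tools use far richer detection methods.

```python
# Simplified sketch of DLP-style discovery: scan text for patterns that look
# like personal data. Real DLP tools go well beyond these example regexes.
import re

PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15,16}\b"),
}

def classify(text: str) -> dict:
    """Return which categories of personal data appear in the text."""
    return {label: pattern.findall(text)
            for label, pattern in PATTERNS.items() if pattern.search(text)}

sample = "Contact jane@example.com, SSN 123-45-6789."
print(classify(sample))  # {'email': ['jane@example.com'], 'us_ssn': ['123-45-6789']}
```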
Organizations may also use data security tools designed specifically for regulatory compliance. These tools often include features like encryption, automated policy enforcement and audit trails tracking all relevant data activity.
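For example, field-level encryption and an audit trail can be combined in a few lines. This sketch uses the third-party Python cryptography package; the audit record format is an assumption for illustration, not a standard.

```python
# Sketch combining field-level encryption with a simple audit trail.
# Uses the third-party "cryptography" package (pip install cryptography);
# the audit record format here is an assumption, not a standard.
from datetime import datetime, timezone
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in production, keys live in a key management system
fernet = Fernet(key)
audit_log: list[dict] = []

def store_encrypted(user_id: str, field: str, value: str) -> bytes:
    ciphertext = fernet.encrypt(value.encode("utf-8"))
    audit_log.append({"actor": "app", "action": "encrypt_and_store",
                      "user_id": user_id, "field": field,
                      "timestamp": datetime.now(timezone.utc)})
    return ciphertext

token = store_encrypted("user-123", "national_id", "123-45-6789")
print(fernet.decrypt(token).decode("utf-8"))  # "123-45-6789"
print(audit_log[-1]["action"])                # "encrypt_and_store"
```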
The average organization today collects a massive amount of consumer data. Organizations have a responsibility to ensure the privacy of this data—not out of the goodness of their hearts, but as a matter of regulatory compliance, security posture and competitive advantage.
Institutions like the United Nations3 recognize privacy as a fundamental human right, and many countries have adopted privacy regulations that enshrine this right in law. Most of these regulations come with harsh penalties for non-compliance.
The European Union's General Data Protection Regulation (GDPR) is considered one of the most comprehensive data privacy laws in the world. It sets strict rules that any company—based in or outside of Europe—must follow when processing EU residents' data. Violators can be fined up to EUR 20 million or 4% of the company's global annual revenue, whichever is higher.
Countries outside the EU have similar regulatory requirements, including the UK GDPR, Canada's Personal Information Protection and Electronic Documents Act (PIPEDA), and India's Digital Personal Data Protection Act.
The U.S. does not have any federal data protection laws as sweeping as the GDPR, but it does have a few pieces of more targeted legislation. The Children's Online Privacy Protection Act (COPPA) sets rules for collecting and processing the personal data of children under 13. The Health Insurance Portability and Accountability Act (HIPAA) covers how healthcare organizations and related entities handle personal health information.
Penalties under these laws can be significant. In 2022, for example, Epic Games was fined a record USD 275 million for COPPA violations.4
The U.S. also has state-level privacy regulations like the California Consumer Privacy Act (CCPA), which gives consumers in California more control over how and when their data is processed. While the CCPA is perhaps the most well-known state privacy law, it has inspired others, such as the Virginia Consumer Data Protection Act (VCDPA) and the Colorado Privacy Act (CPA).
Organizations today collect a lot of personally identifiable information (PII), like users' Social Security numbers and banking details. This data is a target for hackers, who can use it to commit identity theft, steal money or sell it on the dark web.
Additionally, companies have their own proprietary sensitive data that hackers may be after, such as intellectual property or financial data.
According to IBM's Cost of a Data Breach 2023 report, the average breach costs a company USD 4.45 million. Many factors contribute to this price tag, including lost business due to system downtime and the costs of detecting and remediating the breach.
Many of the same tools that support data privacy can also reduce the threat of breaches and strengthen overall cybersecurity posture. For example, IAM solutions that prevent unauthorized access can help stop hackers while enforcing privacy policies. Data security tools can often detect suspicious activity that may signal a cyberattack in progress, allowing the incident response team to act faster.
Similarly, employees and consumers can defend against some of the most damaging social engineering attacks by adopting data privacy best practices. Scammers often scour social media apps to find personal data they can use to craft convincing business email compromise (BEC) and spear phishing ruses. By sharing less information and locking down their accounts, users can cut scammers off from one potent source of ammunition.
Respecting users' privacy rights can sometimes grant organizations a competitive advantage.
Consumers may lose trust in businesses that do not adequately protect their personal data. For example, Facebook's reputation took a significant hit in the wake of the Cambridge Analytica scandal.5 Consumers are often less willing to share their valuable data with businesses that have fallen short on privacy in the past.
Conversely, businesses with a reputation for protecting data privacy may have an easier time obtaining and leveraging user data.
Furthermore, in the interconnected global economy, data often flows between organizations. A company may send the personal data it collects to a cloud database for storage or a consulting firm for processing. Adopting data privacy principles and practices can help organizations shield user data from misuse even when that data is shared with third parties. Under some regulations, such as the GDPR, organizations are legally responsible for ensuring their vendors and service providers keep data secure.
Finally, new generative artificial intelligence technologies can pose significant data privacy challenges. Any sensitive data fed to these tools can become part of their training data, and the organization may be unable to control how it is used. For example, engineers at Samsung unintentionally leaked proprietary source code by entering the code into ChatGPT to optimize it.6
Additionally, if organizations don't have users' permission to run their data through generative AI, this could constitute a privacy violation under certain regulations.
Formal data privacy policies and controls can help organizations adopt these AI tools and other new technologies without breaking the law, losing user trust or accidentally leaking sensitive information.
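One such control is to redact obvious personal data from prompts before they reach an external model. The sketch below reuses simple patterns and a placeholder send_to_model function; both are hypothetical, not a real AI provider's API.

```python
# Hypothetical sketch: redact obvious personal data before a prompt is sent
# to an external generative AI service. send_to_model is a placeholder, not
# a real API, and the patterns are deliberately simplistic.
import re

REDACTIONS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(prompt: str) -> str:
    for placeholder, pattern in REDACTIONS.items():
        prompt = pattern.sub(placeholder, prompt)
    return prompt

def send_to_model(prompt: str) -> str:
    # Placeholder for a call to a generative AI API.
    return f"(model response to: {prompt})"

user_prompt = "Summarize this ticket from jane@example.com about SSN 123-45-6789."
print(send_to_model(redact(user_prompt)))
```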
1 NIST Privacy Framework, NIST.
2 Fair Information Practice Principles, Federal Privacy Council.
3 Universal Declaration of Human Rights, United Nations.
4 Fortnite Video Game Maker Epic Games to Pay More Than Half a Billion Dollars over FTC Allegations of Privacy Violations and Unwanted Charges, Federal Trade Commission, 19 December 2022.
5 The Cambridge Analytica Files, The Guardian.
6 Whoops, Samsung workers accidentally leaked trade secrets via ChatGPT, Mashable, 6 April 2023.