
The Major Developments in 2024 in Cybersecurity and Data Privacy

    Client Alerts
  • January 22, 2025

Companies continued to face a patchwork of state data privacy laws, federal agencies targeted companies' collection of sensitive consumer information, and a handful of states passed artificial intelligence-related regulations for developers and deployers of "high-risk AI." Those are a few of the 2024 takeaways from members of Parker Poe's Cybersecurity and Data Privacy Team. 

Our team analyzed third-party data sharing litigation, comprehensive state laws aimed at protecting consumer health data outside the scope of HIPAA, and state enforcement actions targeting improper data collection and sharing practices. Below are highlights of our year-in-review, including what these developments mean for companies. On Monday, January 27, be on the lookout as our team provides a three-part series to kick off Data Privacy Week, with further insights for companies on cybersecurity and data privacy. 

Expansion of State Privacy Laws: 2024, like 2023, saw continued growth in states passing privacy laws. As states continued to take matters into their own hands, we saw one of the largest expansions of data privacy laws to date.  

Notably, the California Privacy Rights Act went into effect, extending the California Consumer Privacy Act to employees as a type of consumer. It also introduced a new set of data rights, such as the ability to opt out of the "sharing" of personal information and to limit the use of "sensitive personal information." Not to be outdone by California, Texas's new privacy law, the Data Privacy and Security Act (TXDPSA), also went into effect in 2024. With one of the broadest applicability scopes of any state data privacy law thus far, the TXDPSA applies not only to any company that does business in Texas or sells Texans' consumer data, but also to any company whose products and services are consumed by Texans.  

New Jersey, New Hampshire, Kentucky, Nebraska, Maryland, Minnesota, and Rhode Island each passed their own comprehensive data privacy laws. Maryland stood out among these for its stricter data minimization standards and its prohibition on the sale of sensitive data, with no consumer opt-in or opt-out exception.  

Other key changes in state law privacy happened in areas such as: 

  • Third-Party Data Sharing: In a decision likely to ripple across the various multi-party consent states, the Supreme Judicial Court of Massachusetts narrowed the scope of the state's wiretapping law. Previously, Massachusetts, along with California and Pennsylvania, had been a chosen forum for class actions against companies' use of website tracking technologies, such as cookies, pixels, or chatbots. As courts grow tired of repeated attempts to extract settlements from companies under decades-old wiretapping laws, other states may follow Massachusetts's example and adjust the scope of their laws to exclude these otherwise industry-standard tools. 
     
  • Health and Biometrics: As the first major law of its kind, the Illinois Biometric Information Privacy Act (BIPA) has served as the primary vehicle for class actions related to biometric data, causing headaches for employers and businesses. But in August 2024, the Illinois Supreme Court and, subsequently, the state legislature limited the scope of BIPA to only the first collection of biometric data lacking consent. Focused on more general health data, Washington's My Health My Data Act (MHMDA) and Nevada's Consumer Health Data Privacy Law went into effect in March 2024. These laws were the first comprehensive state laws aimed at protecting consumer health data outside the scope of HIPAA. Washington cast a wide net with the MHMDA, extending its scope to any company that conducts business in Washington and collects health data from state residents. Likewise, the MHMDA's definition of health data includes any information related to an individual's physical or mental health condition, whereas Nevada's definition covers only data actively used to identify health status. While there have been limited enforcement actions and no large-scale litigation under the MHMDA, the curtailing of BIPA and state wiretapping laws could make the MHMDA's broad scope the plaintiffs' bar's next target.  
     
  • "High-Risk" Artificial Intelligence: State legislatures caught the artificial intelligence fever that swept the world in 2024, passing AI regulations in record time. Never ones to be left off the privacy vanguard, California and Colorado were the first states to announce AI-specific legislation, with Utah following soon after. The Colorado Artificial Intelligence Act regulates developers and deployers of "high-risk AI." Modeled after the EU AI Act's risk-based hierarchy, the Colorado AI Act requires these groups to implement various safeguards and disclosures and creates a duty of reasonable care to avoid algorithmic discrimination. California, taking a more transparency-based approach to AI regulation, passed multiple AI-related laws requiring disclosures, detection tools, and labeling of AI-generated content. Utah enacted the Utah Artificial Intelligence Policy Act, mirroring the disclosure requirements of Colorado and California in large part but extending liability for an AI tool's output to the company using the tool. As a result, companies accused of violating the Utah AI Policy Act cannot argue as a defense that the AI tool was independently responsible for the violation. Even though other states did not adopt comprehensive AI laws like those in Colorado and Utah, various states and even cities passed or updated legislation to restrict AI use in legally significant decisions, such as housing, employment, and health care. 
     
  • State Enforcement: California and Texas were out front with enforcement actions targeting improper data collection and sharing practices. California performed a number of sweeps of certain business types, such as streaming services and data brokers. Texas sent notice letters to over 100 data brokers and initiated various investigations through its new Consumer Protection Taskforce, targeting automotive and AI companies. Similarly, Virginia, Connecticut, and other states ramped up their own enforcement efforts, focusing on compliant privacy notices, data broker practices, and the processing of children's data.  

Increased Federal Trade Commission Enforcement  

  • Data Brokers: One area of Federal Trade Commission (FTC) focus in 2024 was the collection, use, and sale of precise location data by data brokers. The FTC challenged the notion that aggregated location data is anonymous, arguing that it can often be used to identify individuals. That capability, the agency noted, could expose sensitive information about visits to locations related to health or religion. Enforcement actions required data brokers to implement "sensitive location data programs" that filter out sensitive location data and to obtain clear consent before processing precise location data. 
     
  • Sensitive Data: The FTC emphasized that its concern over sensitive data, like precise location data, extended to genetic, biometric, and children's data. By testing the accuracy of companies' claims regarding privacy protocols, consent procedures, and security measures, the FTC intended to put companies "on notice of its expectation that security be in line with the sensitivity of the data." The FTC doubled down by expanding the Health Breach Notification Rule to categorize as security breaches health care providers' sharing of consumer health data with analytics and advertising services without proper consent. In its multiple enforcement actions against companies accused of improperly collecting or using children's data, the FTC made similar calls to expand the Children's Online Privacy Protection Act. 
     
  • Unfair AI Use: 2024 marked the first time the FTC took an enforcement action against a company for using AI in an unfair manner, providing insight into the FTC's approach to regulating AI. Whether requiring companies accused of AI violations to disgorge all data collected via the violating AI tools or teaming up with the SEC to clamp down on exaggerated claims of AI product capabilities, the FTC considers AI to be squarely within the realm of its authority and a tool ripe for unfair or deceptive business practices.   

While the FTC stands out in its 2024 enforcement, it is certainly not alone as the Securities and Exchange Commission, Department of Justice, and other regulators made themselves known in the privacy and security space. Businesses should anticipate more scrutiny across regulators in 2025 and beyond. 

For more information, please contact us or your regular Parker Poe contact. You can also subscribe to our latest alerts and insights here. 