What to Expect in U.S. Privacy for 2024

In 2023, the privacy landscape saw several jurisdictions enact comprehensive state data privacy laws, a few of which have also taken effect. Although many of the laws are similar, businesses must continue to assess them individually, accounting for the nuances in their compliance obligations and the rights afforded to consumers. On the federal level, there has been no significant movement on an omnibus privacy law despite the American Data Privacy and Protection Act, introduced in 2022.

In 2024, we will likely see additional states enact similar laws in the absence of a federal framework. Furthermore, as regulating authorities continue to develop, enforcement actions and settlements are likely on the horizon. As such, businesses must continue to allocate adequate resources and make complying with data privacy laws a priority.

In addition, Artificial Intelligence ("AI") has seen a huge expansion in terms of technology, adoption, proposed regulation, enforcement, and even an Executive Order from President Biden. In 2024, we will likely see the expansion of AI policies and regulation, particularly at the state level.

Upcoming State Data Privacy Laws

On December 31, 2023, Utah's Consumer Privacy Act will take effect. The law applies to businesses that have at least US$25 million in annual revenue, and either (a) control or process the personal information of 100,000 or more Utah consumers during a calendar year, or (b) derive more than 50 percent of their gross revenue from the sale of personal information and control or process the personal information of 25,000 or more Utah consumers. Though the law provides a 30-day cure period, covered businesses should endeavor to meet their compliance obligations before the end of the year.

On July 1, 2024, Florida's Digital Bill of Rights, Oregon's Consumer Privacy Act, and Texas' Data Privacy and Security Act will take effect. On October 1, 2024, Montana's Consumer Data Privacy Act will also take effect. Here are a few key takeaways of each upcoming law:

  • Florida's Digital Bill of Rights has a narrow jurisdictional scope because, among other requirements, it applies primarily to businesses with annual global revenue greater than US$1 billion. Additionally, consumers must be given the ability to opt out of the collection of their personal information obtained through the use of voice or facial recognition features. Furthermore, Florida's Digital Bill of Rights takes a unique approach to children's privacy: it prohibits online platforms (e.g., social media platforms, online games, online gaming platforms, etc.) from processing children's personal information if there is a substantial risk of harm to their privacy, requires platforms to justify the necessity of profiling children and to ensure the presence of adequate safeguards, and limits the collection, selling, and sharing of personal information and precise geolocation data.
  • Oregon's Consumer Privacy Act does not have a revenue threshold for entities to be subject to privacy obligations. The law applies to businesses that conduct business in the state, or produce products or services targeted to state residents, and that either control or process the personal information of at least 100,000 state residents, or control or process the personal information of at least 25,000 state residents and derive more than 25 percent of their gross revenue from selling personal information. Additionally, non-profit entities are not exempt from the law, but have an additional year – until July 1, 2025 – to comply.
  • Texas' Data Privacy and Security Act will sweep a broader array of businesses under its jurisdiction because it contains neither a revenue threshold nor a minimum number of consumers whose personal information must be processed or sold for the law to apply. However, small businesses, as defined by the U.S. Small Business Administration, are generally exempt, unless a small business engages in the sale of sensitive data, in which case it must first obtain consumer consent before selling that data.
  • Montana's Consumer Data Privacy Act is similar to Oregon's Consumer Privacy Act in that it also does not have a revenue threshold for entities to be subject to privacy obligations. The law applies to businesses that conduct business in the state, or produce products or services targeted to state residents, and that either control or process the personal information of at least 50,000 state residents, or control or process the personal information of at least 25,000 state residents and derive more than 25 percent of their gross revenue from selling personal information.
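The Oregon- and Montana-style applicability tests above share a common two-prong structure (a volume prong, or a lower-volume prong combined with a sales-revenue prong). As a purely illustrative sketch – not a compliance determination, and with the function name and parameterization our own – the logic can be expressed as:

```python
def meets_two_prong_threshold(
    residents_processed: int,
    revenue_share_from_sales: float,
    volume_threshold: int,
    sale_prong_volume: int = 25_000,
    sale_prong_revenue_share: float = 0.25,
) -> bool:
    """Illustrative two-prong applicability check.

    Prong one: the business processes personal information of at least
    `volume_threshold` state residents. Prong two: it processes personal
    information of at least `sale_prong_volume` residents AND derives more
    than `sale_prong_revenue_share` of gross revenue from selling it.
    Thresholds are parameters because they differ by state.
    """
    prong_one = residents_processed >= volume_threshold
    prong_two = (
        residents_processed >= sale_prong_volume
        and revenue_share_from_sales > sale_prong_revenue_share
    )
    return prong_one or prong_two
```

For example, under Oregon's 100,000-resident volume threshold, a business processing data of 30,000 residents with 30 percent of revenue from sales would be captured by the second prong; actual applicability analyses of course require counsel, not arithmetic.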

Upcoming Privacy Rules

In 2024, businesses should expect to see additional rules promulgated by regulatory authorities, particularly by the Federal Trade Commission ("FTC"), Consumer Financial Protection Bureau ("CFPB"), and the California Privacy Protection Agency ("CPPA").

Federal Trade Commission

In November 2023, the FTC issued a final rule amending the Gramm-Leach-Bliley Act's Safeguards Rule to require financial institutions to notify the FTC within 30 days of discovering a data breach involving the unauthorized acquisition of unencrypted customer information of at least 500 customers, unless a law enforcement exception applies. The rule becomes effective on May 13, 2024.
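The amended rule's reporting trigger reduces to two numeric conditions and a deadline. As a minimal sketch only – the function name is ours, the rule's actual application turns on facts and counsel, and we assume no law enforcement exception applies – the thresholds can be modeled as:

```python
from datetime import date, timedelta

NOTIFY_THRESHOLD = 500    # unencrypted customer records affected
NOTIFY_WINDOW_DAYS = 30   # days from discovery of the breach

def ftc_notice_deadline(discovery: date, customers_affected: int, encrypted: bool):
    """Illustrative only: return the FTC notification deadline under the
    amended Safeguards Rule, or None if the event does not meet the
    reporting threshold (encrypted data, or fewer than 500 customers)."""
    if encrypted or customers_affected < NOTIFY_THRESHOLD:
        return None
    return discovery + timedelta(days=NOTIFY_WINDOW_DAYS)
```

For instance, a breach of unencrypted records of 600 customers discovered on June 1, 2024 would put the notification deadline 30 days out, on July 1, 2024.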

In June 2023, the FTC proposed amendments to its Health Breach Notification Rule ("HBNR") to, in part, clarify the rule's scope (including by explicitly noting that many health applications fall within its jurisdiction) and to clarify that a breach of security includes both data security breaches and unauthorized disclosures. Although the rule has not yet been finalized, the FTC already acted against several health care companies with mobile applications in 2023 under its existing authority under the American Recovery and Reinvestment Act of 2009 and the existing HBNR (e.g., in actions relating to GoodRx and Easy Healthcare). As such, entities processing health data should be aware of the potential for enforcement actions in 2024 relating to their use of personal health records.

Consumer Financial Protection Bureau

In October 2023, the CFPB, under the Dodd-Frank Consumer Financial Protection Act, proposed the Personal Financial Data Rights Rule, which would govern consumers' and data aggregators' access to personal financial information held by financial institutions. The proposed rule would require applicable entities (e.g., banks) to make transaction data available to consumers, as well as to establish and maintain systems that can receive data access revocation requests, track duration-limited authorizations, and delete data when required. The CFPB anticipates finalizing the proposed rule by Fall 2024.

California Privacy Protection Agency

In recent months, the CPPA has advanced several proposals that would make it easier for consumers to exercise their statutory rights, all of which are likely to be finalized and could become effective in 2024.

  • In December 2023, the CPPA Board unanimously approved a legislative proposal that would require browser vendors (e.g., Google Chrome, Apple Safari, etc.) to include a feature allowing users to automatically opt out of the sale and/or sharing of their personal information.
  • In addition, in November 2023 the CPPA released draft regulations regarding the use of automated decision-making technology ("ADMT"). In short, the regulations, if adopted, would require businesses to provide consumers with additional information and rights, including: notice that the business intends to use ADMT (and how); the right to opt out of such use; and the right to access information about how the business used ADMT to make decisions about the consumer. The draft language is extremely broad and could cover a myriad of use cases. Absent federal AI legislation, we expect to see similar proposals in other states.
  • The CPPA is also considering requiring businesses to address, within their data risk assessments, their intended use of ADMT, including by explaining its benefits, the personal information to be processed, any limitations on its use, and the need for human involvement. At its December Board meeting, the Board asked CPPA staff to further evaluate the draft regulations, particularly their impact in the employment context and the increased compliance costs of conducting data risk assessments under the additional requirements. Thus, we should expect further guidance relating to ADMT from the CPPA in 2024.

Enforcement Actions

In 2023, regulating authorities launched enforcement sweeps and compliance inquiries that could continue into 2024. On the federal level, the FTC took action against several businesses relating to data breaches, unfair and deceptive disclosures about the sharing of health data, the failure to obtain parental consent for the collection of children's data, and the use of dark patterns affecting children's privacy.

The FTC has also taken action against the use of artificial intelligence deployed without appropriate safeguards to the detriment of consumers. In December 2023, the FTC announced a settlement (still subject to court approval) with Rite Aid, which reportedly used facial recognition software for surveillance purposes in a way that erroneously led Rite Aid employees to accuse innocent consumers of wrongdoing. If the settlement is approved, Rite Aid will be prohibited from using facial recognition technology for five years and must take additional remedial measures, including deleting images collected through its facial recognition systems (including algorithms developed using those images), notifying consumers when their biometric information is used for surveillance purposes, implementing a data security program, and obtaining third-party assessments of that program. In 2024, the FTC will likely continue to police alleged misuse of biometric information, which it identified as a high priority earlier in 2023.

On the state level, both California and Colorado have announced sweeps to enforce their respective data privacy laws. For example, California's Attorney General sent inquiries to certain California employers relating to their processing of employee personal information and the CPPA initiated a review of data privacy practices of connected vehicle manufacturers and related technologies. Likewise, Colorado's Attorney General sent inquiries to entities relating to their processing of sensitive data.

In a noteworthy announcement, in December 2023, the Federal Communications Commission and four state attorneys general (Connecticut, Illinois, New York, and Pennsylvania) entered into Memoranda of Understanding to strengthen and formalize cooperation on privacy, data protection, and cybersecurity investigations and enforcement. The coordination will lead to information sharing and the pooling of government resources.

As additional privacy laws become effective and cross-jurisdictional collaborations are announced, we can expect regulators, including the FTC, FCC, CPPA, and state attorneys general, to enforce compliance and levy fines pursuant to data privacy laws in 2024.

Privacy Best Practices Checklist

As we enter 2024, we offer a few reminders for businesses to consider in light of the constantly evolving U.S. data privacy framework:

  • As new data types and processing activities are swept into existing and upcoming data privacy laws, perform data mapping exercises to determine whether data collection, sharing, and processing practices have changed or have become subject to new requirements;
  • Review external privacy policies to ensure appropriate disclosures are included and that disclosures remain accurate and account for consumer rights afforded under applicable laws;
  • Ensure opt-out preference signals are implemented and recognized by websites in additional states where such obligations will become effective in 2024, to the extent those laws are applicable to the business;
  • Conduct a data protection impact assessment ("DPIA") on the processing of personal information, to the extent required by laws applicable to the business, or ensure existing DPIAs comply with relevant laws, and consider all relevant factors identified in applicable laws and accompanying regulations (e.g., use of sensitive information, ADMT, etc.); and
  • Review and analyze the use of artificial intelligence tools in accordance with published frameworks, including relating to privacy (e.g., the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence; the White House's Blueprint for an AI Bill of Rights; and NIST's Artificial Intelligence Risk Management Framework); understand all model inputs and outputs and ensure appropriate data and IP compliance; maintain an internal AI policy that covers employee use of AI tools; review all external AI statements for truth and transparency; and build privacy, bias, ethics, and safety review into AI products.
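On the opt-out preference signal item above: the most widely discussed such signal is the Global Privacy Control ("GPC"), which browsers transmit as a `Sec-GPC: 1` request header (and expose to page scripts as `navigator.globalPrivacyControl`). As a minimal, purely illustrative server-side sketch – the helper name is ours, and honoring the signal involves far more than detecting it – recognition might look like:

```python
def gpc_opt_out_requested(headers: dict) -> bool:
    """Illustrative check for a Global Privacy Control signal on a request.

    Per the GPC proposal, the signal is the request header `Sec-GPC: 1`.
    HTTP header names are case-insensitive, so normalize before lookup.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

A business honoring the signal would then treat the request as an opt-out of "sale" and "sharing" – for example, by suppressing third-party ad-tech tags and recording the preference – in whatever manner the applicable state law requires.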


White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.

This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.

© 2023 White & Case LLP