The 2025 legislative cycle marked a pivotal year in US privacy law, defined not only by continued nationwide expansion into Artificial Intelligence (AI) governance, children’s and teen privacy and online safety, and emerging data categories, but also by a major restructuring of California’s privacy enforcement infrastructure. California’s introduction of the Delete Request and Opt-out Platform (DROP) system, the nation’s first centralized, statewide platform for managing consumer deletion requests, combined with sweeping reforms to the Consumer Privacy Fund, will materially increase CalPrivacy and attorney general enforcement capacity on a recurring, self-replenishing basis. These developments accompany completion of a far-reaching rulemaking package that imposes detailed obligations for Data Protection Impact Assessments (DPIAs or risk assessments), cybersecurity governance and Automated Decision-Making Technology (ADMT). At the same time, states beyond California have enacted targeted statutory reforms addressing neurotechnology, data-broker practices and minors’ online safety, underscoring that – absent federal preemption – state-driven models will continue to shape the national privacy compliance landscape in 2026. By January 2026, there will be 20 state consumer privacy laws in effect, several with unique material obligations. We detail what enterprises need to be prepared for in 2026 and explain why we believe next year will be a watershed period for consumer privacy in the US.
National Trends: AI, Expansion of Sensitive Data and Focus on High-Risk Processing
Across the country, states expanded the substantive boundaries of privacy law, shifting from broad principles toward deeper regulation of automated systems, data-intensive practices and novel information categories. Several themes dominated:
- AI and automated decision-making – While Colorado has delayed the effectiveness of its comprehensive AI Act, and may materially revise it, states advanced transparency requirements for training datasets, profiling restrictions and guardrails for high-risk algorithmic systems. For instance, California has passed AI transparency laws, and New York has introduced, but has not yet passed, even more robust AI training-data disclosure requirements. Federal efforts for a moratorium on, or preemption of, state AI laws have been met with resistance and litigation, and enforcement based on existing consumer protection, anti-discrimination and tort laws is becoming more common.
- Neurotechnology and biometric expansion – California added neural data as sensitive personal information in 2024, a change reflected in the most recent regulations update. Montana followed by expressly regulating neurotechnology data through S 297, significantly revising the Montana Consumer Data Privacy Act (MCDPA) to cover early-stage risks associated with brain-computer interfaces and other neural-signal technologies. Massachusetts, which lacks a comprehensive privacy law, is considering a stand-alone Neural Data Privacy Protection Act (H103), in addition to including neural data in a draft comprehensive privacy bill, reflecting growing nationwide interest in governing neural data as a distinct category of sensitive information in need of protection.
- Children’s and teen privacy – California has added personal information of youth under 16 as a sensitive category, following the lead of other states. Utah, Louisiana and North Carolina advanced comprehensive measures restricting how platforms collect, use, profile or engage with minors’ data, reflecting a broader trend toward age-specific data protections. Texas, Utah, Louisiana and California have passed app store age verification laws with parental controls over certain online activities. Age-appropriate design and risk assessment laws, or amendments to comprehensive privacy laws, continue to be litigated in the courts. Congress is considering child and teen privacy and safety bills that could potentially set a uniform federal standard next year.
- Data brokers and targeted advertising – Texas and California pursued heightened transparency, registration and data-deletion requirements for data brokers and AI-driven profiling systems. California’s DROP system, discussed below, represents the most significant development in centralized consumer-deletion workflows to date.
Collectively, these initiatives reflect a future in which privacy law governs not only the collection and use of personal data, but also the design, behavior and oversight of the automated systems that process it, and in which sensitive data is given heightened protection and scrutiny.
California: Legislative and Regulatory Developments With National Impact
California led the privacy landscape in 2025, coupling extensive rulemaking, new technical infrastructure and a redesigned financial model for privacy enforcement that will materially expand regulatory capacity beginning in 2026.
New Regulatory Requirements Effective in 2026
CalPrivacy finalized a sweeping rulemaking package that introduces some of the country’s most detailed operational requirements. As outlined in our firm’s analysis, seven areas will drive 2026 compliance obligations:
- Risk assessments – Beginning January 1, businesses must complete detailed assessments before initiating high-risk activities, including targeted advertising, selling or sharing personal information, as well as processing sensitive personal information. Ongoing activities must be assessed by year-end 2027, and the first executive attestation is due April 1, 2028 (for calendar years 2026 and 2027).
- Opt-out confirmation – Businesses must give consumers a way to confirm opt-out status, including honoring Global Privacy Control (GPC) signals.
- Expanded access rights – Request-to-know workflows must inform consumers that they may access any personal information collected on or after January 1, 2022, if a look-back limit is applied.
- Correction rights – Businesses must identify or notify the source of inaccurate information or require the source to correct the data.
- Maintaining accuracy – Once corrected, personal information cannot be overwritten by subsequently acquired third-party data, underscoring the need for robust data-governance controls.
- Health data disputes – When consumers dispute the accuracy of health-related data, the dispute must follow the information downstream.
- Youth data sensitivity – Personal information of individuals under 16 is now classified as sensitive personal information, triggering heightened assessment and processing restrictions.
These rules materially raise the compliance bar and foreshadow increased supervisory inquiries and targeted enforcement activity in 2026.
California’s DROP System: A Model with Global Implications
California’s growing focus on data brokers is highlighted by the rollout of CalPrivacy’s DROP system, a first-of-its-kind statewide platform designed to streamline consumer deletion requests and standardize how more than 500 registered data brokers process those requests.
- How DROP Works for Consumers
Consumers will verify state residency through Login.gov or the California Identity Gateway, create a profile, select the identifiers they want deleted (e.g., names, emails, mobile advertising IDs, CTV IDs or VINs) and submit a request tied to a unique DROP ID. Requests remain active and can be updated over time. Authorized agents may file on behalf of consumers.
- How DROP Works for Data Brokers
Under the DELETE Act, data brokers must register with CalPrivacy, pay an annual US$6,000 fee, issue annual activity reports and comply with ongoing deletion obligations unless exempt under specific statutes (e.g., Fair Credit Reporting Act (FCRA), Health Insurance Portability and Accountability Act (HIPAA)). Beginning August 1, brokers must either download new deletion-request lists every 45 days or integrate with DROP via an Application Programming Interface (API). DROP will store consumer identifiers as hashed values; brokers must hash their own datasets and match them, then delete all non-exempt personal information, including inferences, when a match occurs.
- Penalties
The DELETE Act imposes fines of US$200 per day per unfulfilled deletion request beginning January 31, 2026. Inaction on 100,000 consumer requests for one year would exceed US$7.3 billion in fines.
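The hash-and-match workflow and the statutory exposure described above can be sketched as follows. This is a minimal illustration only: the hash algorithm (SHA-256 here), the normalization rules and all identifiers are hypothetical assumptions, and the actual matching specification and API are defined by CalPrivacy.

```python
import hashlib

def hash_identifier(value: str) -> str:
    # Normalize, then hash. SHA-256 and lowercase/trim normalization are
    # illustrative assumptions, not DROP's published specification.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

# Hypothetical deletion-request list downloaded from DROP (hashed identifiers).
drop_hashes = {
    hash_identifier("consumer@example.com"),
    hash_identifier("555-0100"),
}

# The broker hashes its own dataset with the same algorithm and compares.
broker_records = {
    "rec-001": "consumer@example.com",
    "rec-002": "someone.else@example.com",
}

# Any matching record must be deleted, including derived inferences.
matches = {
    rec_id for rec_id, identifier in broker_records.items()
    if hash_identifier(identifier) in drop_hashes
}
print(matches)

# Statutory exposure under the DELETE Act: US$200 per day per unfulfilled
# request. Ignoring 100,000 requests for one year:
exposure = 200 * 100_000 * 365
print(f"US${exposure:,}")  # US$7,300,000,000
```

The arithmetic confirms the figure cited above: US$200 per day across 100,000 unfulfilled requests compounds to US$7.3 billion over a single year.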
If DROP succeeds, it may serve as a template for other states and potentially for international regulators seeking to harmonize data-subject rights workflows across jurisdictions.
Expanding Enforcement Capacity in 2026 and Beyond – Revised Consumer Privacy Fund Structure
California’s amendments to Civil Code §§ 1798.155 and 1798.160 through A 137 fundamentally restructure how privacy enforcement is financed. The prior model diverted 91% of fine revenue into a state investment fund, leaving limited resources available for privacy operations. The new model eliminates that investment structure and creates three dedicated subfunds:
- Consumer Privacy Subfund (CalPrivacy)
- Attorney General Consumer Privacy Enforcement Subfund
- Consumer Privacy Grant Subfund
Under the new system, 95% of administrative fines and civil penalties remain within the privacy regulatory ecosystem, dramatically increasing the funding available for investigations, audits, litigation, risk-assessment review, ADMT oversight, cybersecurity inspections, technical infrastructure and staffing. Both CalPrivacy and the attorney general now benefit from a self-replenishing, enforcement-driven funding model rather than relying on annual appropriations.
Over time, this structure will generate year-over-year growth in enforcement capacity. As CalPrivacy’s oversight obligations expand, particularly around high-risk processing, youth data and automated decision-making, fine revenue will rise, increasing available funding and enabling further expansion of enforcement operations. This creates a compounding feedback loop: more enforcement generates more penalties; more penalties fund more enforcement. In combination with DROP and the one-time fiscal year 2025–2026 fund infusion, California now has the most robust privacy enforcement framework in the US.
Looking Ahead
The national privacy landscape is becoming more complex and prescriptive, and California continues to set the pace. With significant statutory reforms, sweeping new operational rules, a statewide deletion-request system and a growing, self-sustaining enforcement budget, 2026 is poised to bring heightened regulatory expectations for businesses operating within or affecting California. Other states are developing parallel requirements, particularly in AI, neurodata, youth privacy and safety, and data-broker regulation, suggesting that deeper, more technical state-level obligations will continue to proliferate.
Businesses should expect:
- Increased review of risk assessments
- More frequent and detailed investigations into high-risk data practices, with a continued focus on cookies and targeted advertising
- Greater scrutiny of sensitive information, especially youth data and neuro/biometrics
- Active enforcement around ADMT and AI-driven systems
- Expanded multijurisdictional enforcement coordination supported by new grant funding
