A new line of text is popping up on some online checkout pages in New York this season: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.”
The message, a state-mandated disclosure, stems from a New York law that took effect in November, the first state statute to directly regulate the controversial practice of “personalized pricing.”
Also called algorithmic or surveillance pricing, the tactic uses artificial intelligence and a customer’s personal data, such as browsing history, device type and past purchases, to set an individual price, potentially charging people different amounts for the same item.
If a business uses personal consumer data in an algorithm to determine specific prices, it now needs to show the mandated label, illuminating a process that usually happens in the digital shadows.
“Algorithmic pricing bills are probably the next big battleground in A.I. regulation,” said Goli Mahdavi, a lawyer at Bryan Cave Leighton Paisner who focuses on artificial intelligence and data privacy, in an interview with The New York Times.
The push for regulation stems from rapid advances in data harvesting and AI, but the concept itself is not new. A foundational case dates back to 2012, when The Wall Street Journal reported that the travel site Orbitz was showing Mac users, whom it statistically associated with higher incomes, pricier hotel options than it showed PC users. The tactic was considered innovative at the time; by today’s standards it looks primitive.
A key federal report earlier this year drew out how sophisticated and pervasive these systems have become. In January 2025, the Federal Trade Commission (FTC) published a report detailing how companies can track subtle consumer behavior, warning, “a person’s precise location or browser history can be frequently used to target individual consumers with different prices for the same goods and services.”
Dynamic Pricing
Per the new law, the disclosure must appear at or near the price offered. Retailers who fail to provide the required disclosure can be penalized by the state — up to $1,000 per violation.
“This new law shines the light on hidden online pricing tactics that take advantage of consumers,” said New York Governor Kathy Hochul.
“[The law] ensures consumers understand when algorithms and personal data are influencing the prices they see,” echoed Assemblymember Nily Rozic in a statement.
Dynamic pricing is already familiar in industries like ride-hailing and airline tickets, where prices shift with demand, availability or time. What has changed is scale and sophistication: modern algorithms can aggregate browsing history, past purchase behavior, device information, location data, loyalty status and even subtle digital footprints to customize prices.
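To make the mechanism concrete, here is a deliberately crude sketch of how such a system might combine signals into a price. This is an illustrative toy, not any retailer’s actual algorithm; the signal names and multipliers are invented for the example (the “premium device” heuristic echoes the Orbitz case described above).

```python
# Toy model of personalized pricing: a base price adjusted by simple
# multipliers inferred from consumer signals. All weights are hypothetical.
def personalized_price(base_price: float, signals: dict) -> float:
    """Return a price adjusted by crude, invented signal weights."""
    multiplier = 1.0
    if signals.get("device") == "premium":
        multiplier *= 1.10   # premium device proxies for higher income
    if signals.get("repeat_buyer"):
        multiplier *= 0.95   # assume loyalty earns a small discount
    if signals.get("browsed_competitors"):
        multiplier *= 0.90   # comparison shopping signals price sensitivity
    return round(base_price * multiplier, 2)

print(personalized_price(100.0, {"device": "premium"}))  # 110.0
print(personalized_price(100.0, {"repeat_buyer": True,
                                 "browsed_competitors": True}))  # 85.5
```

Real systems learn such weights from data at far greater scale, but the structure is the same: the displayed price becomes a function of the shopper’s profile, which is exactly what the New York disclosure is meant to surface.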
This personalization arguably enables more precise price discrimination, the economic practice of adjusting price to each buyer’s willingness, or perceived ability, to pay. In economic theory, well-executed price discrimination can increase total social surplus by extracting more surplus from high-willingness buyers while allowing bargain prices for price-sensitive buyers.
At the same time, it raises fairness and equity concerns when there is information asymmetry, a lack of transparency or first-degree discrimination, i.e. fully individualized pricing.
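The surplus argument can be checked with a toy two-buyer market (all numbers hypothetical): with one posted price, the seller may find it more profitable to price out the low-willingness buyer; with perfect first-degree discrimination, both buyers are served and total surplus rises, but every dollar of it goes to the seller.

```python
# Two buyers, one good. Hypothetical numbers for illustration only.
COST = 50          # seller's unit cost
WTP = [120, 60]    # each buyer's willingness to pay

def uniform_outcome(price):
    """Profit and total surplus when one price is posted to everyone."""
    buyers = [w for w in WTP if w >= price]
    profit = (price - COST) * len(buyers)
    consumer_surplus = sum(w - price for w in buyers)
    return profit, profit + consumer_surplus

# Seller posts the profit-maximizing uniform price (checking each WTP).
best_price = max(WTP, key=lambda p: uniform_outcome(p)[0])
uniform_profit, uniform_total = uniform_outcome(best_price)

# First-degree discrimination: charge each buyer exactly their WTP.
discrim_profit = sum(w - COST for w in WTP)

print(best_price, uniform_profit, uniform_total)  # 120 70 70: the $60 buyer is priced out
print(discrim_profit)  # 80: both served, but all surplus goes to the seller
```

Total surplus rises from 70 to 80, yet consumer surplus falls from zero to zero-with-no-bargains-left, which is the fairness tension the law’s critics and defenders are arguing over.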
In classic economic models of imperfect information, disclosure requirements help correct for asymmetric information. Here, the law’s transparency mandate gives consumers a signal that the price is personalized based on their data profile. That disclosure may alter consumer behavior (shopping around, using VPNs, switching devices, clearing cookies), which in turn may change how retailers build their pricing algorithms.
Legal Backlash
The law, signed in mid-2025, triggered immediate legal pushback. The National Retail Federation (NRF) sued, arguing that the disclosure requirement violated retailers’ free-speech rights and mischaracterized pricing as deceptive.
“This law interferes with retailers’ ability to provide their customers with the highest value and best shopping experience they can,” NRF Chief Administrative Officer and General Counsel Stephanie Martz said.
“Algorithms are created by humans, not computers, and they are an extension of what retailers have done for decades, if not centuries, to use what they know about their customers to serve them better.”
On October 8, 2025, a federal judge dismissed the case, ruling the statute is constitutional and stating that the disclosure is factual and serves a legitimate consumer-protection interest.
The disclosure “serves to ameliorate consumer confusion or deception by ensuring that consumers are better informed about how a merchant has set the displayed price,” U.S. District Judge Jed Rakoff wrote in a 28-page decision.
With judicial backing, New York now offers a blueprint for other states, several of which have drafted or are considering similar bills.
Unresolved Questions
Despite its strengths, the statute exempts some uses of personal data from the disclosure requirement, including certain ride-share fare calculations that rely solely on location data, financial products, insurance and subscription-based pricing. As a result, some forms of price differentiation, even algorithmic ones, may remain opaque.
The law empowers the New York Attorney General to enforce compliance, but enforcement largely hinges on consumers recognizing and reporting violations. A November 2025 consumer alert encourages New Yorkers to file complaints if they suspect non-disclosure. Some warn that many consumers may not notice or understand pricing disclosures, especially in mobile apps or confusing checkout flows.
From a retailer’s point of view, algorithmic pricing may enable better matching of supply, demand and individual willingness to pay, potentially boosting revenue and investment in personalization. For consumers, especially those with less bargaining power, it may deepen inequality or shade into price gouging.
Because the law does not outlaw such pricing, the core ethical and economic question persists: is disclosure enough to prevent abuse, or does it merely document abuse after the fact?
What to Watch Next
Several states have pending bills that would either mandate similar disclosures or outright ban personalized pricing based on sensitive consumer data. Personalized pricing sits at the intersection of antitrust, data privacy, consumer protection and fairness; as states experiment, a patchwork regulatory environment may emerge, one that could ultimately push toward a federal standard or consumer-data regulation at scale.
Also, if enough consumers notice and respond to disclosures — for instance, by switching platforms, using privacy tools or avoiding data-heavy retailers — retailers may adjust or abandon aggressive personalized-pricing algorithms altogether.
Enforcement will also matter. If violations trigger significant penalties or reputational costs, firms may self-regulate or lobby to soften the law. If not, the law may prove a symbolic victory with limited real-world effect.
On paper, New York’s law could alter the economics of e-commerce. If enforced well, what was buried in code and opaque checkout flows could become visible to every consumer. Whether this legal move rebalances power and information between consumers and retailers, or simply makes pricing messier and more complex, remains to be seen.
