Consumers have long been wary of dynamic pricing. Now, with algorithms quietly setting prices based on individual data points — such as location, browsing history and purchase behavior — skepticism is giving way to outright suspicion. That mistrust came to a head in May when New York State passed the first law in the nation requiring retailers to disclose when an algorithm is used to determine the price tag.
Set to take effect July 8, the law mandates clear signage: “This price was set by an algorithm using your personal data.” Retailers are now fighting back. The National Retail Federation (NRF) has filed a federal lawsuit in Manhattan, arguing that the disclosure law is unconstitutional, harmful to business and misleading to consumers. The group contends that the law stigmatizes a tool that, in its view, simply automates the time-honored practice of tailoring prices to meet customer needs.
Legally, the NRF may well prevail. Courts have shown sensitivity to compelled speech and are likely to scrutinize the law’s “arbitrary exemptions,” as the suit alleges. But even if retailers win in court, they still face a deeper challenge: regaining the confidence of the customers they serve.
At the heart of the controversy is a fundamental disconnect between the technology and how consumers perceive it. Retailers see algorithmic pricing as efficient, personalized and often pro-consumer. Critics — including New York lawmakers — see something far murkier. As the state’s press release warned, “Today’s technology means corporations are able to collect mountains of personal data, feed it into algorithms and generate a price that’s individual to a consumer.” That, they say, erodes transparency and “strips consumers of their ability to comparison shop and plan.”
This phenomenon has even earned a name from the Federal Trade Commission: surveillance pricing.
For shoppers, the issue isn’t just the use of data. It’s the feeling of being manipulated by an invisible system that may charge one person more than another for the exact same item — based not on demand, but on what a machine thinks that person is willing to pay. That unease is only magnified by the secrecy that often surrounds algorithmic decision-making.
The NRF counters that these systems aren’t nefarious, just misunderstood. “Algorithms are created by humans, not computers,” said NRF general counsel Stephanie Martz. “They are an extension of what retailers have done for decades, if not centuries.” And it’s true: Personalized promotions, variable pricing and even haggling are as old as commerce itself.
But today’s algorithms operate at a scale, speed and opacity that consumers haven’t encountered before. Without disclosure or transparency, price personalization starts to feel like discrimination.
Retailers are right to defend tools that help manage margin and inventory in real time — especially in a volatile supply chain environment. But brushing aside consumer concerns or fighting transparency measures in court sends the wrong message. Instead of asking, “How do we block this disclosure?” the better question might be: “How do we explain this to our customers in a way that builds trust?”
Retailers can communicate that algorithmic pricing helps deliver discounts, manage inventory, and prevent overstock and waste. They can offer opt-ins for personalized deals, provide clear FAQs, or even give customers a choice between personalized and standard pricing.
Yes, the lawsuit may be won on technical or constitutional grounds. But that victory will be pyrrhic unless retailers also win their shoppers’ trust.