Legal Challenges to Algorithmic Pricing May Undermine Market-Process Improvements

Cite this Article
Alden Abbott, Legal Challenges to Algorithmic Pricing May Undermine Market-Process Improvements, Truth on the Market (December 08, 2025), https://truthonthemarket.com/2025/12/08/legal-challenges-to-algorithmic-pricing-may-undermine-market-process-improvements/

Recent private and government antitrust challenges to individual firms’ use of “algorithmic pricing” may discourage the use of algorithms, and thereby reduce market efficiency, to the ultimate detriment of producers and consumers. A proposed U.S. Justice Department (DOJ) settlement with RealPage—a provider of commercial revenue-management software and services for the conventional multifamily rental-housing industry—is an exercise in micromanagement that could deter the development of welfare-enhancing algorithms. New state laws go even further in limiting the use of pricing algorithms.

Federal and state antitrust enforcers and legislators may want to reconsider their algorithm-skeptical positions, which if maintained could retard technology-driven improvements in the working of U.S. markets. In assessing recent government litigation aimed at limiting algorithmic-pricing freedom, keep in mind that the government’s record in seeking to regulate business pricing is dismal and rife with economically harmful failures, as reflected in shortages, black markets, inflation, and reduced innovation.

Simply put, price controls undermine the economy. They prevent market participants from reacting to constantly changing market conditions, thereby misallocating resources and reducing the quality of goods and services over time. “Softer” forms of pricing oversight through antitrust litigation may be less draconian, but it should not be assumed that they are without economic costs.

Efficient Business Price Setting

Businesses are forever seeking new sources of market information to improve pricing, including data on competitors, customers, and market trends. This intelligence allows them to use such strategies as dynamic pricing, cost-plus pricing, and value-based pricing to set competitive and profitable prices. Analyzing market conditions, competitor actions, and customer willingness to pay helps businesses optimize prices for better margins and sales.

Algorithms have become a valuable instrument to improve the efficiency of business pricing. Algorithmic pricing is the automated use of computer programs to set prices for products and services based on real-time data. Algorithms analyze a range of factors—including consumer behavior, competitor prices, and market demand—to automatically adjust prices to maximize profits or achieve other business goals.

Using algorithms to set prices can enhance both firm-level performance and overall market efficiency. Algorithmic pricing allows firms to rapidly incorporate real-time data—such as demand fluctuations, inventory conditions, competitor pricing, and cost changes—to form more accurate and responsive price signals. This reduces the likelihood of persistent mispricing, minimizes shortages or surpluses, and improves allocative efficiency.
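As a stylized illustration (not drawn from any case record or any particular vendor’s product), a minimal firm-level dynamic-pricing rule of this kind might look like the following sketch, which nudges a price toward balancing demand with inventory using only the firm’s own sales data. The function name and parameters are hypothetical:

```python
def dynamic_price(base_price: float, units_sold: int, units_stocked: int,
                  target_sell_through: float = 0.8, step: float = 0.05) -> float:
    """Nudge a price toward balancing demand with inventory.

    If the observed sell-through rate exceeds the target, demand is
    outrunning supply and the price rises; if it falls short, the price
    drops to help clear excess inventory. All inputs are the firm's own
    data -- no competitor information is used.
    """
    if units_stocked <= 0:
        raise ValueError("units_stocked must be positive")
    sell_through = units_sold / units_stocked
    # Scale the adjustment by how far demand deviates from the target rate.
    adjustment = step * (sell_through - target_sell_through) / target_sell_through
    return round(base_price * (1 + adjustment), 2)

# A hot-selling unit (95% sell-through) gets a small markup, while a
# slow mover (40% sell-through) gets a discount.
print(dynamic_price(100.0, 95, 100))
print(dynamic_price(100.0, 40, 100))
```

Even a toy rule like this shows the efficiency logic at work: prices respond to observed demand rather than to heuristics or inertia, which is the behavior the paragraph above describes.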

From an operational standpoint, algorithms reduce the cost and delay associated with manual price adjustments. They enable finer-grained pricing strategies that reflect heterogeneous customer preferences and temporal-demand cycles, often increasing total welfare by matching products with consumers who value them most. Dynamic algorithms can also help firms smooth demand peaks, optimize capacity utilization, and lower marginal costs.

Furthermore, algorithmic pricing fosters greater market transparency and predictability in competitive environments. By adjusting prices in response to real conditions, rather than heuristics or inertia, firms can reduce information asymmetries and mitigate the risk of price volatility. When properly governed and monitored, algorithmic pricing contributes to more efficient markets, improved consumer access to goods and services, and enhanced productivity across sectors.

Antitrust Challenges to Price-Setting Algorithms

Despite these economic-efficiency benefits, both private and public antitrust suits have challenged the use of algorithmic pricing, arguing that these tools facilitate illegal price fixing and collusion among competitors, particularly when they use shared, nonpublic data. Cases involving software providers and their users in industries like real estate, hotels, and insurance are at the forefront of applying antitrust laws to modern technology, although legal standards are still being developed.

Plaintiffs in algorithmic antitrust cases have received encouragement from federal antitrust enforcers. In March 2024, the DOJ and the Federal Trade Commission (FTC) filed a very aggressive statement of interest in a private suit alleging anticompetitive algorithmic pricing, which argued that AI algorithms can effectuate illegal price collusion even without a specific agreement among the parties.

AI algorithms “trained” on industry pricing practices may help firms predict how their competitors are likely to set prices in the future. If firms purchase the same algorithmic software, they may be able to avoid price wars and adjust instantly to competitors’ price changes. The agencies argue that this could stabilize and fix prices. Furthermore, the DOJ/FTC statement stressed that:

[A]n agreement to use shared pricing recommendations, list prices or pricing algorithms is still unlawful even when co-conspirators retain some pricing discretion. Setting or recommending initial starting prices can still violate the antitrust laws even if those are not the prices that consumers ultimately pay.

In short, the statement can be read broadly to suggest that information exchanges regarding industry prices (which may effectuate more efficient individual-firm pricing) may constitute antitrust violations, even when firms retain pricing discretion and charge different prices than those suggested by the algorithms. Such a reading, if accepted by some courts, would create enormous disincentives for firms to employ algorithms, even when the algorithms were merely helping the firms align their pricing strategies with rapidly changing market conditions.

Jay Ezrielev’s Perspective

In a piece for the Fall 2025 edition of the American Bar Association’s Antitrust Magazine, economist and former FTC adviser Jay Ezrielev delved into the economic harms that may flow from far-reaching theories of antitrust liability in common-data-algorithm (CDA) cases, which involve allegations that algorithmic-pricing software facilitates collusion among users when it applies an algorithm to a common dataset to calculate pricing recommendations for competing firms.

As Ezrielev explained, the courts in two private CDA antitrust suits (RealPage and Duffy) adopted flawed economic reasoning and failed to evaluate the defendants’ incentives to engage in allegedly collusive conduct. In particular, he notes that:

[S]uccessful price-fixing conspiracies require plausible mechanisms for forcing participants to adhere to the fixed price. Such mechanisms are missing from the courts’ findings in RealPage and Duffy.

The piece further bemoans the inconsistency between theories of CDA competitive harm and CDAs’ actual effects:

One troubling issue for antitrust theories of CDA-based collusion is that it is difficult to rationalize firms’ participation in such collusion. While there are economic models that rationalize firms’ participation in hub-and-spoke conspiracies, the models’ assumptions are inconsistent with the typical CDA fact pattern.

Ezrielev also noted economic research that counsels against expansive legal theories of CDA harm:

CDAs can generate significant efficiencies by calculating prices that more accurately reflect local market conditions. Such prices can more effectively balance market supply and demand, thus avoiding shortages and excess supply. Avoiding shortages and excess supply amounts to a more efficient allocation of resources. …

There is no economic evidence suggesting that prohibiting CDAs from using nonpublic data of multiple firms would reduce the likelihood of collusion. There are also significant potential benefits of aggregating nonpublic data across multiple entities. Aggregating nonpublic data across multiple providers allows the algorithm to discover information that is otherwise unavailable from public sources or from nonpublic information of a single entity. …

Sharing nonpublic information among competitors may raise concerns about facilitating tacit collusion because access to such information may make monitoring of deviations from collusive schemes more feasible. However, algorithms’ pooling of nonpublic data would not raise this concern because combining nonpublic data of different firms into a consolidated dataset that is visible only by the algorithm would not enable users of the algorithm to detect competitors’ deviations from tacit collusion. …

Moreover, discouraging pricing algorithms from using nonpublic data may incentivize firms to disclose private data publicly to mitigate antitrust risks. Such public disclosures may facilitate tacit collusion by making detection of deviations from collusion easier.

The DOJ’s RealPage Settlement

The DOJ last month filed a proposed settlement of its CDA lawsuit against RealPage Inc., a provider of commercial revenue-management software and services for the conventional multifamily rental-housing industry. As alleged in the plaintiffs’ complaint, RealPage’s revenue-management software has relied on nonpublic, competitively sensitive information shared by landlords to set rental prices. The software has also included features designed to limit rental-price decreases and otherwise align pricing among competitors.

If approved by the court, the settlement would require RealPage to:

  • Cease having its software use competitors’ nonpublic, competitively sensitive information to determine rental prices in runtime operation;
  • Cease using active lease data for purposes of training the models underlying the software, limiting model training to historic or backward-looking nonpublic data that has been aged for at least 12 months;
  • Not use models that determine geographic effects narrower than the state level, which is broader than the markets alleged in the complaint;
  • Remove or redesign features that limit price decreases or align pricing between competing users of the software;
  • Cease conducting market surveys to collect competitively sensitive information;
  • Refrain from discussing market analyses or trends based on nonpublic data, or pricing strategies, in RealPage meetings relating to revenue-management software;
  • Accept a court-appointed monitor to ensure compliance with the terms of the consent judgment; and
  • Cooperate in the DOJ’s lawsuit against property-management companies that have used its software.

The RealPage settlement, which goes far beyond barring collusive contacts among competitors, is highly regulatory. Its broad limitations on the use of lease data and nonpublic information, market surveys, design features focused on pricing, and narrow geographic models would prevent RealPage from developing maximally efficient algorithms. This effect would be magnified by the presence of a court-appointed monitor who would be able to micromanage design decisions.

While avoiding direct price controls, this “softer” set of limitations on pricing practices imposes a regulatory constraint on pricing freedom. Like many other government limitations on business practices that are not well understood by public officials, it bespeaks a bureaucratic “pretense of knowledge” that may be expected to reduce market efficiency.

The RealPage settlement’s greatest harm may be its effect on market efficiency and innovation. Firms that have begun to utilize (or are considering utilizing) algorithms to better align their pricing decisions with current market conditions may avoid or limit their use of algorithms to avoid antitrust scrutiny. What’s more, software firms may turn away from investing in new and improved pricing algorithms for the same reason. This reduced innovation will tend to render market processes less efficient, harming both sellers and buyers.

The States Make Their Move

Meanwhile, state-law antitrust challenges to algorithmic pricing are gearing up, and their effects may dwarf those of federal antitrust litigation that indirectly discourages the efficient use of algorithms. Moreover, recent state legislation directly threatens the use of all pricing algorithms. New York and California have passed new laws that heavily regulate algorithmic pricing, and other states are expected to follow.

On their face, the New York and California laws are designed to substantially reduce, if not eliminate, the use of algorithms in pricing, although the California provisions preserve more leeway for efficient uses of algorithmic pricing.

New York State

Amendments to New York’s antitrust statute passed in October take direct aim at pricing algorithms:

  • “[R]esidential rental property owners and managers [are prohibited] from using pricing recommendations generated by tools that collect and analyze data from multiple property owners.”
  • “Third-party vendors who operate or license algorithmic tools used by New York landlords [are also targeted], even if the vendors are located outside of New York.”
  • “[A]ny conduct that affects residential rental units located within the state, regardless of where the actor is physically located,” is covered.
  • “[D]irect adoption of algorithmic recommendations [is not required] to trigger liability.”
  • “[T]he law does not distinguish between public and non-public data sources, nor does it require proof of actual harm to competition.”
  • Companies doing business in NY must display “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA” if they dynamically price goods/services using consumer data.
  • Finally, New York City has established an Office of Algorithmic Accountability that will review, audit, and advise on algorithmic tools used by NYC agencies, focusing on bias, transparency, and public input.

California

Newly enacted California statutory prohibitions are “zeroing in” on algorithmic-pricing abuses, although they are less draconian than the New York legal changes. They include:

  • Using Common Pricing Algorithms: It is unlawful to use technology that pools competitor data (prices, supply, etc.) to recommend, stabilize, or set prices for two or more entities.
  • Collusion via Algorithms: The law clarifies that using such tools to restrain trade is illegal under the Cartwright Act (California’s antitrust law).
  • Coercing Price Adoption: Forcing another party to adopt a price or term from a common algorithm is also banned.
  • Data Use: Using algorithms trained with competitor data, even historical data, to coordinate prices is prohibited.

Looking Forward

Possible “hub-and-spoke” conspiracies, whereby competitors scheme to use the same third-party algorithm as a means to collude on price, certainly merit antitrust prosecution when the facts warrant. But the more expansive theories advanced in recent litigation, egged on by federal pronouncements, threaten to discourage the use of efficiency-enhancing price algorithms and undermine desirable algorithmic innovation.

State regulation of algorithms would exacerbate the problem. The ultimate losers in this story would not be algorithmic cartelists, but rather, market transactors who would be denied welfare-enhancing improvements in market processes.

Federal and state enforcers and legislators may want to take a closer look at how recent attacks on algorithmic pricing may harm the U.S. economy. The Trump administration has highlighted its opposition to anticompetitive federal and state regulation. Neo-Luddite laws and ill-considered antitrust cases that undermine algorithm-supported improvements in market processes may fall into that category.