Nicole Polisar
Associate Editor
Loyola University Chicago School of Law, JD 2027
Retail pricing is undergoing a significant technological shift. Instead of relying on fixed price tags, many businesses now use dynamic pricing systems that adjust prices automatically based on real-time data. These systems analyze factors such as demand, competitor pricing, inventory levels, and consumer behavior to determine what price to display at a given moment. Dynamic pricing is already prevalent in many industries, such as live entertainment, airlines, hotels, and ride-sharing platforms, all of which routinely adjust prices in response to changing demand. Increasingly, retailers and e-commerce platforms are adopting similar strategies in everyday consumer markets. As this practice expands, regulators are evaluating how existing consumer protection, antitrust, and data privacy laws apply to algorithm-driven pricing models.
How dynamic pricing systems work
Dynamic pricing refers to the use of automated tools or algorithms to frequently modify prices based on changing market conditions. Instead of setting a single price for a product, businesses may update prices throughout the day as inputs such as supply levels or consumer demand shift. Many pricing systems rely on large datasets and predictive analytics to forecast consumer behavior. These tools may incorporate information about competitor pricing, seasonal demand patterns, or the timing of purchases. For instance, ride-sharing companies increase fares during periods of high demand, while airlines adjust ticket prices as flight availability changes.
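The logic described above can be made concrete with a short sketch. This is a purely hypothetical, simplified pricing rule, not any company's actual algorithm; the thresholds, multipliers, and function name are illustrative assumptions chosen only to show how market signals such as demand, inventory, and competitor prices might feed into an automated price adjustment.

```python
# Hypothetical demand-based dynamic pricing rule. All thresholds and
# multipliers below are illustrative assumptions, not a real system.

def dynamic_price(base_price: float, demand_ratio: float,
                  inventory_ratio: float, competitor_price: float) -> float:
    """Return an adjusted price from simple market signals.

    demand_ratio:    current demand divided by typical demand
    inventory_ratio: units in stock divided by target stock level
    """
    price = base_price

    # Raise the price when demand outpaces the norm (surge pricing);
    # lower it when demand is unusually weak.
    if demand_ratio > 1.5:
        price *= 1.25
    elif demand_ratio < 0.75:
        price *= 0.90

    # Discount to clear excess inventory.
    if inventory_ratio > 1.5:
        price *= 0.95

    # Stay within 10% of a competitor's observed price.
    price = min(price, competitor_price * 1.10)
    return round(price, 2)


# A demand spike doubles typical demand, so the price surges.
print(dynamic_price(100.0, demand_ratio=2.0, inventory_ratio=1.0,
                    competitor_price=120.0))  # 125.0
```

Even this toy version shows why prices can differ from hour to hour for reasons invisible to the shopper: each displayed price is the output of several interacting inputs rather than a fixed tag.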
More advanced pricing systems may also incorporate consumer-specific data, a practice often referred to as “surveillance pricing.” For example, some companies experiment with personalized pricing, in which algorithms estimate a consumer’s willingness to pay from available data. The Federal Trade Commission (FTC) has reported that companies consider factors such as browsing behavior, purchasing history, and geographic location when setting individualized prices. While these systems can improve pricing efficiency, they also raise regulatory questions about transparency and fairness. Because pricing decisions are generated by automated systems that consumers cannot see, individuals often have little insight into how or why the price offered to them was determined.
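To illustrate why surveillance pricing raises transparency concerns, consider a minimal sketch of a personalized markup. Every field name and weight here is a hypothetical assumption, not a description of any real system; the point is only that two shoppers can be quoted different prices for the same item based on data they never see used.

```python
# Illustrative sketch of "surveillance pricing": a personalized markup
# derived from consumer-specific signals. All profile fields and
# weights are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class ConsumerProfile:
    item_views: int        # how often this shopper viewed the item
    past_purchases: int    # prior orders in this category
    high_income_zip: bool  # inferred from geographic location


def personalized_price(list_price: float, profile: ConsumerProfile) -> float:
    """Estimate willingness to pay and adjust the displayed price."""
    markup = 0.0
    if profile.item_views >= 3:      # repeat views suggest strong intent
        markup += 0.05
    if profile.past_purchases >= 5:  # loyal buyers may tolerate more
        markup += 0.03
    if profile.high_income_zip:      # location-based inference
        markup += 0.04
    return round(list_price * (1 + markup), 2)


# Two shoppers see different prices for the same item, with no
# indication of why -- the opacity regulators have flagged.
print(personalized_price(50.0, ConsumerProfile(4, 6, True)))   # 56.0
print(personalized_price(50.0, ConsumerProfile(0, 0, False)))  # 50.0
```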
Consumer protection and pricing transparency
One of the primary compliance issues associated with dynamic pricing is transparency. When prices change frequently, consumers may find it difficult to determine whether the price presented is consistent or comparable across transactions. Section 5 of the Federal Trade Commission Act (FTC Act) prohibits unfair or deceptive acts or practices, so businesses must avoid pricing practices that misrepresent the cost of a product or service. Recent regulatory developments have emphasized the importance of clear price disclosures. The FTC’s Rule on Unfair or Deceptive Fees, which took effect in 2025, requires covered businesses to clearly disclose the total price of goods or services upfront when advertising or offering them for sale. Although the rule does not prohibit variable pricing models, companies must ensure that the price consumers see accurately reflects the amount they will pay. If a business advertises one price but later applies algorithmic adjustments or hidden fees, that practice may be deceptive under the FTC Act.
Data use and surveillance pricing
Another emerging compliance issue is surveillance pricing, the use of personal data to determine how much an individual customer may be willing to pay. In 2024, the FTC launched a study examining how companies use consumer data, artificial intelligence, and algorithmic systems to set individualized prices, and it found that some businesses draw on browsing behavior, purchase history, and geographic location in determining the prices consumers see. Because these systems often rely on detailed behavioral and demographic data, regulators have begun examining whether the use of consumer data in pricing decisions complies with privacy and consumer protection laws. Personalized pricing models may also create legal risk if they rely on sensitive characteristics: pricing differences tied to protected attributes such as race, gender, or disability could raise concerns under civil rights laws and state consumer protection statutes.
Antitrust considerations in algorithmic pricing
Dynamic pricing tools also raise antitrust issues. In 2024, the U.S. Department of Justice (DOJ) filed a lawsuit against RealPage, alleging that the company’s rent-setting software facilitated illegal price coordination among competing landlords in violation of the Sherman Act. According to the complaint, RealPage’s algorithm used nonpublic rental data from multiple property managers to generate pricing recommendations that landlords were encouraged to follow. The government argued that this effectively aligned rental prices across competing properties, and that coordination accomplished through software is no less problematic under antitrust law than coordination through direct communication. The case reflects growing regulatory concern that algorithmic pricing tools could enable price fixing among competitors even without any direct contact between them. In November 2025, the DOJ announced a proposed settlement that would require RealPage to stop sharing competitively sensitive information, restrict how it uses nonpublic data in its products, and end practices the government said distorted competition in rental housing markets.
What needs to happen
As dynamic pricing technologies become more widespread, policymakers need to establish clearer rules governing how businesses use consumer data in pricing decisions. Without additional safeguards, algorithm-driven pricing models may undermine price transparency and consumer trust and entrench discrimination in the marketplace. To protect consumers, the law should draw a clear line: companies should be completely prohibited from using personal data to set individualized prices. Prices for consumer goods and services should remain uniform across communities rather than fluctuate based on data profiles that may reflect income, race, location, or other sensitive characteristics. Without such a bright-line rule, there is too much risk that algorithmic pricing will reproduce bias and deepen inequality in the marketplace. Dynamic pricing that responds only to current market conditions, by contrast, may not require a complete ban, but it should be subject to much stricter rules, including clear disclosure requirements, limits on how often prices can change, and regular auditing to detect discriminatory effects.