FTC Attempts to Keep Up with Tech: Proposed Ban on Using AI to Impersonate Individuals

Jossie Ward

Associate Editor

Loyola University Chicago School of Law, JD 2025

 

The Federal Trade Commission (FTC) enforces federal consumer protection laws that prevent fraud and other unfair business practices. On February 15, 2024, FTC Chair Lina M. Khan released a joint statement with Commissioners Rebecca Kelly Slaughter and Alvaro M. Bedoya announcing that the FTC had finalized its rule prohibiting both government and business impersonation schemes and is seeking public comment on a supplemental notice of proposed rulemaking that would extend the ban to the impersonation of individuals as well. Specifically, the finalized rule and the proposed addition would prohibit the use of artificial intelligence (AI) tools for impersonation. For consumers, the rule and the proposed addition provide much-needed protection against fraudsters using AI-generated deepfakes. Given the increase in consumer fraud, it is imperative that the FTC expand consumer protection regulation in tandem with evolving technology.

Finalized rule prohibiting government and business impersonation

The Government and Business Impersonation Rule prohibits the impersonation of government agencies, businesses, or their officials. In 2023 alone, consumers reported losing more than $10 billion to fraud, a 14% increase over reported losses in the previous year. The new regulation is aimed at preventing scammers from using government seals or business logos when communicating with consumers, whether online or through the mail. Additionally, the rule prohibits the use of spoofed websites and email addresses, including lookalike email addresses or websites that use “.gov” or rely on slight misspellings of a business’s name. The regulation also prohibits scammers from falsely implying affiliation with a government agency or business. The rule responds to the 2021 United States Supreme Court ruling in AMG Capital Management, LLC v. FTC, which limited the FTC’s ability to require defendants to return money to scammed consumers. The Government and Business Impersonation Rule allows the FTC to file claims directly in federal court to recover money that scammers obtained from consumers through government or business impersonation schemes.

Proposed rule prohibiting impersonation of individuals

The proposed addition to the Government and Business Impersonation Rule is part of the FTC’s response to receiving fraud reports from 2.6 million consumers last year, with imposter scams being the most commonly reported. With the proliferation of technology and artificial intelligence programs like ChatGPT, impersonating entities and individuals is becoming easier and easier. The FTC announced it is seeking public comment on whether the revised rule should prohibit a firm, such as an AI platform that generates images, video, or text, from providing goods or services that it knows or has reason to know are being used to harm consumers through impersonation. On the same day, OpenAI, the company behind ChatGPT, released Sora, a new AI tool that allows creators to generate a sixty-second video from a written prompt. While Sora is not yet available to the public, technology like it could give fraudsters a new tool for creating deepfake videos impersonating real individuals if not properly monitored.

FTC pushes for accountability

With emerging technology and new AI tools being developed and released at unprecedented rates, the FTC’s attempts to keep up with technology are noble. AI tools have been linked to everything from politics to the arts, but as AI becomes more sophisticated and commonplace, consumers may struggle to detect scams. Giving the FTC the power to hold AI firms accountable will encourage those firms to be proactive and self-regulate to prevent illegal use of the tools they provide. Some AI firms, like ElevenLabs, already self-regulate to protect consumers from being misled; ElevenLabs recently banned the account of a creator responsible for generating an audio deepfake of U.S. President Joe Biden urging people not to vote in the New Hampshire primary. Providing a direct way for the FTC to file suit and recover damages for injured consumers is a natural extension of the agency’s objective of protecting consumers. With the 2024 election approaching and consumer fraud reports on the rise, the FTC’s rules against AI-enabled impersonation push for much-needed accountability in regulation and consumer protection.