By Tom Budzyn

On February 8, 2024, the Federal Communications Commission (“FCC”) issued a unanimous declaratory ruling giving agency guidance on the applicability of the Telephone Consumer Protection Act (“TCPA”) to unwanted and illegal robocalls using artificial intelligence.[1] In this ruling, the FCC stated its belief that unwanted spam and robocalls making use of artificial intelligence are in violation of existing consumer protections.[2] The FCC’s analysis focused on protecting consumers from the novel and unpredictable threats posed by artificial intelligence.[3] It may be a harbinger of things to come, as other agencies (and various tribunals) are forced to consider the applicability of older consumer protection laws to the unique challenge of artificial intelligence.[4] As federal agencies are often the first line of defense for consumers against predation,[5] the onus is on them to react to the dangers posed by artificial intelligence.

The FCC considered the TCPA, passed in 1991, which prohibits the use of “artificial” or “prerecorded” voices to call any residential phone line if the recipient has not previously consented to receiving such a call.[6] This blanket prohibition is effective unless there is an applicable statutory exception, or it is otherwise exempted by an FCC rule or order.[7] However, the statute does not define what an “artificial” or “prerecorded” voice is.[8] Thus, on November 16, 2023, the FCC solicited comments from the public as to the applicability of the TCPA to artificial intelligence in response to the technology’s fast and ongoing developments.[9] In its preliminary inquiry, the FCC noted that some artificial intelligence-based technologies such as voice cloning[10] facially appear to violate the TCPA.[11]

Following this initial inquiry, the FCC confirmed its original belief that phone calls made using artificial intelligence-generated technologies without the prior consent of the recipient violate the TCPA.[12] In doing so, the FCC looked to the rationale underlying the TCPA and its immediate applicability to artificial intelligence.[13] As a consumer protection statute, the TCPA safeguards phone users from deceptive, misleading, and harassing phone calls.[14] Artificial intelligence, and the almost limitless technological possibilities it offers,[15] presents a uniquely dangerous threat to consumers. While most phone users today are well-equipped to recognize and deal with robocalls or unwanted advertisements, they are likely much less able to deal with the shock of hearing the panicked voice of a loved one asking for help.[16] Pointing to these severe dangers, the FCC found that the TCPA must extend to artificial intelligence to adequately protect consumers.[17]

As a result, the FCC contemplates future enforcement of the TCPA against callers who use artificial intelligence technology without the prior consent of call recipients.[18] The threat of enforcement looms large, as twenty-six state attorneys general wrote to the FCC in support of the decision and, more strikingly, there is near-unanimous accord among the state attorneys general in their understanding of this law.[19]

It is worth noting that the FCC’s ruling is possibly not legally binding.[20] The ruling serves to explain the agency’s interpretation of the TCPA and, as such, is not necessarily binding on the agency.[21] Moreover, the possible downfall of Chevron would mean that the FCC’s interpretation of the TCPA would likely be afforded little, if any, deference.[22] Legal technicalities notwithstanding, the FCC’s common-sense declaratory ruling states the obvious: unsolicited phone calls using artificial intelligence-generated voices are covered by the TCPA’s prohibition on “artificial” or “prerecorded” voices.[23] If there were any doubt before that callers should avoid using artificial intelligence without the consent of call recipients, it is gone now.

Perhaps the most interesting part of the FCC’s ruling is its straightforward application of the facts to the law. Other federal agencies will certainly be asked to undertake similar analyses in the future as artificial intelligence becomes ever more ubiquitous. In the TCPA context, the analysis is straightforward; it is much less so in the context of other consumer protection statutes.[24] For example, 15 U.S.C. § 45 authorizes the Federal Trade Commission (“FTC”) to prevent “persons, partnerships, or corporations” from using unfair methods of competition affecting commerce or unfair or deceptive acts affecting commerce.[25] Unsurprisingly, “person” is not defined by the statute,[26] as the law was originally enacted in 1914.[27] If the statute remains in its current form, it could exclude artificial intelligence from one of the most important consumer protections in the modern United States. While artificial intelligence has not been recognized as a person in other contexts,[28] it should be recognized as such where it can do as much harm as a person could, if not more.

This statute is only one of many traditional consumer protection statutes that, as written, may not adequately protect consumers from the dangers of artificial intelligence.[29] While amending the law is certainly possible, legislative gridlock and inherent delays place greater importance on agencies responding proactively to artificial intelligence developments. The FCC’s ruling is a step in the right direction, a sign that agencies will not wait for artificial intelligence to run rampant before seeking to rein it in. Hopefully, other agencies will follow suit and issue similar guidance, using existing laws to protect consumers from new threats.

[1] F.C.C., CG Docket No. 23-362, Declaratory Ruling (2024) [hereinafter F.C.C. Ruling].

[2] Id.

[3] Id.

[4] Fed. Trade Comm’n, FTC Chair Khan and Officials from DOJ, CFPB, and EEOC Release Joint Statement on AI (2024).

[5] See, e.g., J. Harvie Wilkinson III, Assessing the Administrative State, 32 J. L. & Pol. 239 (2017) (discussing the modern administrative state and its goals, including stabilizing financial institutions, making homes affordable, and protecting the rights of employees to unionize).

[6] Telephone Consumer Protection Act of 1991, 47 U.S.C. § 227.

[7] Id.

[8] Id.

[9] F.C.C. Ruling, supra note 1.

[10] See Fed. Trade Comm’n, Preventing the Harms of AI-enabled Voice Cloning (2024).

[11] F.C.C. Ruling, supra note 1.

[12] Id.

[13] Id.

[14] See Telephone Consumer Protection Act of 1991, 47 U.S.C. § 227.

[15] See, e.g., Cade Metz, What’s the Future for AI?, N.Y. Times (Mar. 31, 2023).

[16] Ali Swenson & Will Weissert, New Hampshire investigating fake Biden robocall meant to discourage voters ahead of primary, Associated Press (Jan. 22, 2024).

[17] F.C.C. Ruling, supra note 1.

[18] Id.

[19] Id.

[20] Azar v. Allina Health Servs., 139 S. Ct. 1804, 1811 (2019) (explaining that interpretive rules, which are exempt from notice and comment requirements under the Administrative Procedure Act, “merely advise” the public of the agency’s interpretation of a statute).

[21] Chang Chun Petrochemical Co. Ltd. v. United States, 37 Ct. Int’l Trade 514, 529 (2013) (“Unlike a statute or regulations promulgated through notice and comment procedures, an agency’s policy is not binding on itself.”).

[22] See generally Caleb B. Childers, The Major Question Left for the Roberts Court: Will Chevron Survive?, 112 Ky. L.J. 373 (2023).

[23] F.C.C. Ruling, supra note 1.

[24] See 15 U.S.C. §§ 1601–1616 (consumer credit cost disclosure statute defines “person” as a “natural person” or “organization”).

[25] 15 U.S.C. § 45.

[26] Id.

[27] Id.  

[28] See Thaler v. Hirshfeld, 558 F. Supp. 3d 328 (E.D. Va. 2021) (affirming United States Patent and Trademark Office’s finding that the term “individual” in the Patent Act referred only to natural persons, and thus artificial intelligence could not be considered an inventor of patented technology).

[29] See, e.g., 15 U.S.C. §§ 1601–1616, supra note 24.

By Madelyn Strohm

On December 20, 2023, the Federal Trade Commission (“FTC”) announced proposed changes to its Children’s Online Privacy Protection Act Rule (“Rule” or “COPPA Rule”).[1] The proposed changes aim to keep up with changes in technology and how businesses are using children’s information collected online.[2] The updated Rule aims to further enhance children’s privacy online by (1) expanding the definition of “personal information” under the Rule to include biometric identifiers, (2) preventing companies from abusing exceptions to consent policies, and (3) requiring companies to establish an express policy for retention and deletion of children’s personal information.[3]

History of the Act and Previous Changes to the Rule

The Children’s Online Privacy Protection Act was enacted in 1998.[4] The Act was passed in direct response to FTC research finding that websites for children were collecting their personal information without requesting permission from their parents, with the information ranging from their names to their parents’ income, or even Social Security Numbers.[5] The goal of the Act was to enhance children’s safety by ensuring their parents had knowledge and control of the collection of information from their children.[6]

Congress gave the FTC the power to issue and enforce regulations under the Act, and the FTC issued the COPPA Rule.[7] The Rule applies to operators of websites directed at children or any online service operators who have actual knowledge that they are collecting personal information from a child.[8] The Rule requires that these websites provide notice about their collection of personal information from children, how they use the information, and their practices for disclosure, and requires parental consent before they collect, use or disclose the information.[9] Personal information includes various types of information, like a child’s name, address, contact information, or Social Security Number.[10]

In its latest amendments to the Rule in 2013, the FTC expanded personal information to include “persistent identifiers,” like IP addresses, device serial numbers, or precise geolocations.[11] In 2019, the FTC issued a Rule Review Invitation and received more than 175,000 comments that helped it identify several areas for improvement, which are reflected in the proposed changes.[12]

Key Aspects of the Proposed Changes to the Rule

  1. Expanding the definition of personal information to include biometric data.

The first proposed expansion of the Rule would modify the Rule’s definition of “personal information” to include biometric identifiers that companies can use for automated recognition of individuals, including fingerprints, handprints, retina and iris patterns, genetic data, or data derived from voice data, gait data, or facial data.[13] Many of the comments by the public addressing biometric data supported its inclusion in the definition of personal information, noting that biometric data is uniquely sensitive because of its “permanent and unalterable nature.”[14] Additionally, some states have expanded their definition of personal information to include biometric data, as have other federal laws, like the Department of Education’s Family Educational Rights and Privacy Act (“FERPA”) regulations.[15] In his remarks on the proposed updates to the Rule, FTC Commissioner Bedoya cited “the beginning of an era of biometric fraud” and concern for how companies are protecting children’s biometric data against breaches, fraud, and abuse.[16]

  2. Narrowing current exceptions to parental consent policies to prevent websites from encouraging children to engage with online platforms without parental consent.

The FTC also proposed expanding its list of use restrictions for persistent identifiers.[17] The 2013 amendments allowed for an exception to the prohibition on the use of persistent identifiers, like IP address, if they were used for the “sole purpose of providing support for the internal operations of the website or online service.”[18] In the most recent round of comments, organizations expressed concern that this exception is overly broad and allows companies to shirk their COPPA obligations.[19] Commenters argued, and the FTC agreed, that this exception should not be allowed to promote increased site usage by children without verifiable parental consent.[20] This change would prevent companies from using or disclosing persistent identifiers to increase children’s attention and engagement, including using the information to send push notifications to the child reminding or encouraging them to engage with the company’s website or service without consent from parents.[21]

  3. Prohibiting companies from retaining children’s personal information indefinitely and requiring a written policy on data retention.

Finally, the proposed updates clarify the portion of the 2013 amendment that prohibits indefinite retention of children’s personal information.[22] Commenters from consumer groups argued that allowing companies to use the permissive “only as long as is reasonably necessary” standard for retention of children’s information allowed operators to both retain data unnecessarily and use it for purposes other than those understood by parents in their initial consent.[23] As such, the FTC intends to clarify that operators can only retain children’s personal information for the specific purpose for which it was collected, and not for any secondary uses.[24] The FTC will also require operators to establish a written data retention policy addressing businesses’ need for retention of children’s personal information, as well as a timeframe for deletion to prevent indefinite retention and further use.[25]


Though the proposed changes do not incorporate every request from commenters, they reflect the most prominent concerns raised by commenters across a variety of industries and platforms.[26] At the end of the notice of the proposed changes, the FTC highlighted questions it was still considering and asked for additional comments, indicating that additional changes are still possible.[27] The public has sixty days from the proposal’s publication in the Federal Register to submit additional comments.[28]


[1] Fed. Trade Comm’n, Statement of Commissioner Bedoya on the Issuance of Notice of Proposed Rulemaking to Update the Children’s Online Privacy Protection Rule (Dec. 20, 2023).

[2] Children’s Online Privacy Protection Rule, 89 Fed. Reg. 2034, 2034 (Jan. 11, 2024) (to be codified at 16 C.F.R. pt. 312).

[3] Fed. Trade Comm’n, supra note 1.

[4] 15 U.S.C. §§ 6501 et seq.

[5] Fed. Trade Comm’n, Privacy Online: A Report to Congress 31-42 (1998).

[6] Id. at iii-iv; Fed. Trade Comm’n, supra note 1.

[7] 15 U.S.C. § 6502(b); 16 C.F.R. § 312 (2013).

[8] 16 C.F.R. § 312.3.

[9] Id.

[10] Id. § 312.2.

[11] Id.

[12] Children’s Online Privacy Protection Rule, 89 Fed. Reg. at 2035.

[13] Id. at 2041.

[14] Id.

[15] Id.; 34 C.F.R. § 99.3 (1988).

[16] Fed. Trade Comm’n, supra note 1.

[17] Children’s Online Privacy Protection Rule, 89 Fed. Reg. at 2042-43.

[18] Id. at 2044.

[19] Id.

[20] Id. at 2045.

[21] Id.

[22] Id. at 2062.

[23] Id.

[24] Id.

[25] Id.

[26] Id. at 2044, 2062.

[27] Id. at 2069; Allison Grande, FTC Targets Data Profits with Kids’ Privacy Rule Changes, Law360 (Dec. 20, 2023, 7:56 PM).

[28] Children’s Online Privacy Protection Rule, 89 Fed. Reg. at 2034.