By Kyle Brantley

It’s that time of day.  Your child is positioning the antenna just right in order to catch their favorite broadcast TV show.  No, that doesn’t sound quite right.  They are actually tuning the dial on the old FM radio for their favorite weekly jamboree!  No, that’s definitely not happening.  Instead, kids today consume their entertainment through mobile devices: a recent study estimates that 90 percent of children have cell phones by the age of eleven and that on average they spend over three hours on that device per day.[1]

Given the realities of how today’s children access content, one would think that the legal doctrine for policing explicit TV/radio content would morph to accommodate the internet.  However, there is a double standard currently in place.  A high bar for obscene, indecent, and profane content exists on the broadcast airwaves.[2]  In contrast, there is no discernible regulation of expression on the internet.[3]

The lack of internet content policing stems from the First Amendment right to freedom of expression.[4]  While the First Amendment has a broad baseline standard,[5] the government limits what can be said in a few key areas including (but not limited to) fighting words,[6] incitement,[7] obscenity,[8] and indecent speech that invades the privacy of the home.[9]  The overarching authority for the latter still has its roots in FCC v. Pacifica Foundation.[10]  In Pacifica, a New York radio station aired a previously recorded monologue by the comedian George Carlin entitled Filthy Words, in which he recited all of the curse words that he thought were disallowed on the public airwaves.[11]  The Supreme Court took issue with the airing of that segment in the middle of the afternoon and homed in on two overriding motivators for censoring the curse words it contained: (1) the uniquely pervasive presence of the broadcast airwaves, and (2) the susceptibility of children to exposure to the content.[12]

Those overarching reasons delineated in Pacifica still form the basis for FCC guidance that broadcast providers must follow.[13]  The FCC currently prohibits indecent conduct that “portrays sexual or excretory organs” and profane conduct like “‘grossly offensive’ language that is considered a public nuisance.”[14]  Notably, these rules only apply to the major broadcast TV stations (e.g., ABC, NBC, FOX, PBS)[15] and FM/AM radio from 6:00 a.m. to 10:00 p.m.[16]  Cable and satellite TV are excluded since those are pay-for-service options.[17]

Nearly twenty years later, the federal government saw a need to implement baseline measures for explicit content that children could access on the internet when it included specific protection provisions for “indecent and patently offensive communications” within the Communications Decency Act.[18]  The Supreme Court struck down that portion of the act in Reno v. ACLU[19] when it reasoned that, “[u]nlike communications received by radio or television, ‘the receipt of information on the Internet requires a series of affirmative steps more deliberate and directed than merely turning a dial.  A child requires some sophistication and some ability to read to retrieve material and thereby to use the Internet unattended.’”[20]  The Court then dug in its heels by saying “the Internet is not as ‘invasive’ as radio or television”[21] and that “users seldom encounter [sexually explicit] content accidentally.”[22]

Times have changed since the Court decided Reno in 1997.  Today, internet access is often unrestricted, and one can easily stumble upon far more sexually explicit material than could be fathomed on the traditional broadcast airwaves.[23]  How many deliberate and affirmative steps does it take for a TikTok video to pop up in front of your face?[24]  How about an Instagram post as you scroll down your home page?  What about a tailored ad on the side of an otherwise mundane web page?  Apps like TikTok and Instagram automatically present an endless stream of new revealing and potentially vulgar images and sounds: a new video simply appears after the previous one ends.[25]

Another example of a potential hazard that a child can stumble upon is pornography.  Porn’s online proliferation has been well documented; Pornhub, the world’s largest porn site, has 100 billion video views per year[26] and 130 million unique viewers per day.[27]  Twenty-five percent of those users are between the ages of eighteen and twenty-four.[28]  In contrast, only 4 percent of users are over the age of sixty-five.[29]  Its user traffic exceeds that of both Netflix and Yahoo.[30]  Eighty percent of that traffic comes from mobile devices.[31]  This pervasive medium can be accessed with as little as two clicks from Google’s homepage or an errant link from social media.[32]

While the effects of easily accessible porn on children are still being studied, early research suggests that heavy porn consumption can lead to body shaming, eating disorders, and low self-esteem.[33]  Many other issues with porn access beyond the mental effects on children are actively being debated, including Pornhub’s lack of adequate age screening for its users and its blatantly illegal acts of profiting off child pornography.[34]  Big Tech is also finally getting the hint that it has skin in the game, as companies begrudgingly start to put age-verification safeguards of their own in place.[35]

When reevaluating the factors employed in Pacifica, it becomes clear that the two-prong test originally used for radio broadcasts is now satisfied on the internet.[36]  The ubiquitous access children have to the internet via smartphones demonstrates that the medium is pervasive.[37]  Children are susceptible to exposure to indecent content because of the ease of access through two quick clicks from Google,[38] automatic video recommendations on social media,[39] and the sheer popularity of porn content among their peers who are just a few years older than they are.[40]  The concern in Reno around the lack of a “series of affirmative steps” needed to access illicit content on the internet[41] is outdated because of the content that loads automatically on apps like TikTok and Instagram.[42]  Similarly, most children own smartphones by the age of seven and have the sophistication to seamlessly access the internet, even though they may not fully understand the ramifications of some of their content choices.[43]

Balancing the government’s interest in limiting children’s exposure to indecency and profanity with the right to express ideas freely online is no easy task.[44]  However, other countries have found ways to regulate the extreme ends of the porn industry and children’s access to such content.[45]  No matter where one stands on the issue, it is abundantly clear that the traditional view of mundane curse words encountered on broadcast television is not compatible with the endless explicit content that is so easily displayed on smartphones.  Both are uniquely pervasive and are accessible to children with minimal effort or “steps.”[46]  One of the two doctrines should evolve. 


[1] See Most Children Own Mobile Phone by Age of Seven, Study Finds, The Guardian (Jan. 29, 2020, 7:01 PM EST), https://www.theguardian.com/society/2020/jan/30/most-children-own-mobile-phone-by-age-of-seven-study-finds.

[2] See Obscene, Indecent and Profane Broadcasts, FCC, https://www.fcc.gov/consumers/guides/obscene-indecent-and-profane-broadcasts (Jan. 13, 2021) [hereinafter Obscene, Indecent and Profane Broadcasts].

[3] See Rebecca Jakubcin, Comment, Reno v. ACLU: Establishing a First Amendment Level of Protection for the Internet, 9 U. Fla. J.L. & Pub. Pol’y 287, 292 (1998).

[4] See id.; U.S. Const. amend. I.

[5] See Jakubcin, supra note 3, at 288.

[6] See Cohen v. California, 403 U.S. 15, 20 (1971); Chaplinsky v. New Hampshire, 315 U.S. 568, 572, 574 (1942).

[7] See Brandenburg v. Ohio, 395 U.S. 444, 447, 449 (1969); Schenck v. United States, 249 U.S. 47, 52 (1919).

[8] See Miller v. California, 413 U.S. 15, 24 (1973).

[9] See 18 U.S.C. § 1464.

[10] 438 U.S. 726 (1978).

[11] Id. at 729–30.

[12] See id. at 748–50.

[13] Obscene, Indecent and Profane Broadcasts, supra note 2.

[14] Id.

[15] Id.

[16] Id.

[17] Id.

[18] See Am. C.L. Union v. Reno, 929 F. Supp. 824, 850 (E.D. Pa. 1996), aff’d, Reno v. Am. C.L. Union, 521 U.S. 844, 849 (1997).

[19] Reno, 521 U.S. at 854.

[20] Id. (emphasis added) (quoting Am. C.L. Union, 929 F. Supp. at 845).

[21] Id. at 869.

[22] Id. at 854.

[23] See Byrin Romney, Screens, Teens, and Porn Scenes: Legislative Approaches to Protecting Youth from Exposure to Pornography, 45 Vt. L. Rev. 43, 49 (2020).

[24] See generally Inside TikTok’s Algorithm: A WSJ Video Investigation, Wall St. J. (July 21, 2021, 10:26 AM), https://www.wsj.com/articles/tiktok-algorithm-video-investigation-11626877477 (demonstrating how TikTok’s algorithm pushes users towards more extreme content with recommendations that load automatically without any additional clicks).

[25] Id.

[26] Pornhub, https://www.pornhub.com/press (last visited Nov. 16, 2021).

[27] The Pornhub Tech Review, Pornhub: Insights (Apr. 8, 2021), https://www.pornhub.com/insights/tech-review.

[28] The 2019 Year in Review, Pornhub: Insights (Dec. 11, 2019), https://www.pornhub.com/insights/2019-year-in-review.

[29] Id.

[30] Joel Khalili, These Are the Most Popular Websites Right Now – And They Might Just Surprise You, TechRadar (July 13, 2021), https://www.techradar.com/news/porn-sites-attract-more-visitors-than-netflix-and-amazon-youll-never-guess-how-many.

[31] The Pornhub Tech Review, supra note 27.

[32] See Gail Dines, What Kids Aren’t Telling Parents About Porn on Social Media, Thrive Global (July 15, 2019), https://thriveglobal.com/stories/what-kids-arent-telling-parents-about-porn-on-social-media/.

[33] Id.

[34] Nicholas Kristof, The Children of Pornhub, N.Y. Times (Dec. 4, 2020), https://www.nytimes.com/2020/12/04/opinion/sunday/pornhub-rape-trafficking.html.

[35] See David McCabe, Anonymity No More? Age Checks Come to the Web, N.Y. Times (Oct. 27, 2021), https://www.nytimes.com/2021/10/27/technology/internet-age-check-proof.html.

[36] See FCC v. Pacifica Found., 438 U.S. 726, 748–50 (1978).

[37] See Most Children Own Mobile Phone by Age of Seven, Study Finds, supra note 1.

[38] Dines, supra note 32.

[39] See Inside TikTok’s Algorithm: A WSJ Video Investigation, supra note 24.

[40] See, e.g., The 2019 Year in Review, supra note 28.

[41] See Reno v. Am. C.L. Union, 521 U.S. 844, 854 (1997).

[42] See Inside TikTok’s Algorithm: A WSJ Video Investigation, supra note 24.

[43] See Most Children Own Mobile Phone by Age of Seven, Study Finds, supra note 1.

[44] See Romney, supra note 23, at 97.

[45] See Raphael Tsavkko Garcia, Anti-Porn Laws in Europe Bring Serious Privacy Issues, Yet They’re Fashionable As Ever, CyberNews (Nov. 30, 2020), https://cybernews.com/editorial/anti-porn-laws-in-europe-bring-serious-privacy-issues-yet-theyre-fashionable-as-ever/.

[46] Cf. Reno, 521 U.S. at 854; FCC v. Pacifica Found., 438 U.S. 726, 749–50 (1978).


Post image by ExpectGrain on Flickr.