By Lauren E. Douglas

The year 2021 marks the forty-eighth anniversary of the mobile phone[1] and the eighty-third anniversary of the programmable computer.[2]  It is no secret that mobile devices are significantly more powerful than their inventors could have ever predicted.[3]  What started as clunky, cumbersome machinery has transformed into the backbone of society as we know it.[4]

Many aspects of corporate success are rooted in the evolution of the mobile device.[5]  To aid in their success, the majority of employers now permit the cross-use of electronic devices for personal and professional purposes—a movement known as “Bring Your Own Device” and lovingly referred to as “BYOD” for short.[6]  More specifically, the BYOD phenomenon is a practice whereby employees[7] use their own electronic devices[8] to do their jobs on the employers’ platforms.[9]  The BYOD practice is loved by employees and employers alike[10] and drives corporate activity in ways nearly unimaginable even ten years ago.[11]  In fact, in 2018, the BYOD market was predicted to reach nearly $367 billion by 2022.[12]  In the wake of the COVID-19 pandemic, the true number is likely even higher.[13]  Pandemic or no pandemic, it is clear that the BYOD era is here to stay.[14]

Despite the rising popularity of the BYOD workstyle, very few businesses actually enforce formal BYOD policies and instead simply allow employees to access corporate data on their own devices free of internal regulation.[15]  A study conducted in 2016 revealed that, even though roughly 70 percent of employees conduct work-related activities on their personal devices, only 39 percent of companies have a formal BYOD policy in place.[16]  This is dangerous; if companies do not regulate their BYOD practices, they risk breaches of sensitive corporate data and violations of federal privacy laws.[17]

The Tension Between Privacy Interests and Data Protection

Perhaps the most problematic issue that comes with implementing a BYOD policy is determining to what extent an employer can legally monitor and access its employees’ personal devices.[18]  The American Bar Association describes balancing the employers’ interests in data security with the employees’ rights of privacy as “the single greatest challenge” to a successful BYOD program.[19]  With that said, it must be recognized that employees enjoy a heightened right to privacy in information stored on their own personal devices as compared to employer-provided devices.[20]  Such protection may be found in both statutory and common law.

Specifically, the Computer Fraud and Abuse Act of 1986 (“CFAA”)[21] imposes criminal and civil penalties on individuals and companies that “intentionally access[] a computer without authorization or exceed[] authorized access” to obtain “information from any protected computer.”[22]  The CFAA began as a means to protect government computers from hackers, but today the law reaches every computer connected to the internet.[23]  Of course, in 2021, computers are not the only devices used to conduct work.  Luckily, federal case law establishes that, in addition to desktop and laptop computers, the CFAA protects devices such as cell phones,[24] tablet devices, and even videogame systems.[25]  To supplement the CFAA, employees might also claim a violation of the Stored Communications Act (“SCA”), a subsection of the Electronic Communications Privacy Act (“ECPA”) that protects the privacy of electronic communications in electronic storage.[26]

So far, courts addressing the new BYOD privacy dichotomy seem reluctant to construe statutory protection broadly in favor of plaintiff-employees who had their personal devices wiped clean of all information.  For instance, in Rajaee v. Design Tech Homes, Ltd.,[27] a company remotely deleted all of an employee’s work and personal files from his mobile device after he resigned.[28]  The device was connected to the employer’s data server, allowing for remote access to email and calendar platforms.[29]  The employee sued for damages under the ECPA and the CFAA, but the court rejected both claims, reasoning that the personal data lost was not “electronic storage” as defined under the ECPA and was not a qualified “loss” under the CFAA.[30]  Such stringent readings of these statutes make asserting a viable claim for lost personal data caused by the employer a very difficult feat for employees.[31]

Because technology develops far faster than the law,[32] the CFAA and the ECPA are merely the statutes that come closest to the BYOD issue; there is currently no federal or state legislation directly addressing BYOD policies.[33]  Similarly, the relevant case law is negligible.[34]  It is thus vital for employers to create and implement comprehensive, transparent, and officially documented BYOD programs.  Thorough BYOD policies are important because, with the lack of statutory law on point, courts in many cases will look directly to the provisions in the BYOD policy to “determine the bounds of permissible employer conduct.”[35]

Private employees generally retain privacy interests in common law so long as their expectations are “reasonable.”[36]  When considering whether an employee has a reasonable expectation of privacy in electronic communications, courts tend to balance the following factors: (1) account ownership;[37] (2) device ownership;[38] (3) the security level of the communication;[39] and (4) published employer policies and whether they were routinely enforced.[40]  No one factor is dispositive and different courts may weigh factors differently.[41]  Because two of the four factors essentially look for whether a BYOD program exists, employers can cover a significant number of bases simply by drafting a formal policy.[42]

Best Practices for Employers

Despite the lack of concrete legal guidelines surrounding the BYOD phenomenon, an employer can mitigate the majority of its concerns by implementing a formal BYOD policy.  An effective policy should “emphasize security and contain clear instructions” regarding acceptable and unacceptable activities on personally owned devices that have access to corporate information systems.[43]  Additionally, an employer’s BYOD policy should make clear that “any and all company information and emails” on all personal devices remain “the sole property” of the company, and that the employer “in its sole discretion” may access and delete any company data that “may be in jeopardy.”[44]  No threat is too small when it comes to business use of personal devices, and providing express notice to employees is the very best way to mitigate legal liability surrounding BYOD policies.[45]

In addition to implementing a comprehensive, written BYOD policy, many companies—especially those that handle sensitive information—are utilizing Mobile Device Management (“MDM”) service providers to help mitigate the technical risks associated with allowing employees to access company data on their own devices.[46]  In essence, MDM broadens the scope of BYOD protection and is “designed to fill security gaps” in employee use of personal devices.[47]  MDM allows employers to exercise control over the device, monitor applications, and remotely wipe the device if it is lost or stolen—clearly a worthwhile addition to any BYOD policy.[48]

The Bottom Line

An astonishing number of employees and employers engage in the BYOD workstyle.  While BYOD practices often increase employee satisfaction and performance, and decrease employer costs, they come with inherent risks that cannot be ignored, such as data breaches and violations of federal privacy law.  Employers can mitigate many of these risks by implementing a strong written BYOD policy that clearly delineates the employer’s rights of access to each device.  Now more than ever, actively and uniformly enforcing a BYOD policy is the best mechanism to alleviate legal risks while the law plays catch-up with the modern technological workplace.

[1] Charlee Dyroff, Here’s How Much Cellphones Have Actually Changed Over the Years, Insider (July 25, 2018, 12:42 PM),

[2] When Was the First Computer Invented?, Computer Hope (June 30, 2020),

[3] See Dyroff, supra note 1 (“Over the past half-century, the cell phone has . . . evolved to connect us in ways that [its inventors] perhaps never imagined.”).  In fact, it is estimated that “the number of mobile-connected devices now exceeds the number of people on Earth.”  Danielle Richter, “Bring Your Own Device” Programs: Employer Control Over Employee Devices in the Mobile E-Discovery Age, 82 Tenn. L. Rev. 443, 458 (2015) (quoting Stephen Wu, A Legal Guide to Enterprise Mobile Device Management: Managing Bring Your Own Device (BYOD) and Employer-Issued Device Programs 1 (2013)).

[4] See Lindsey Blair, Note, Contextualizing Bring Your Own Device Policies, 44 J. Corp. L. 151, 152 (2018) (describing the impact of mobile devices on the corporate environment).

[5] See Richter, supra note 3, at 443.

[6] See Blair, supra note 4, at 152 (“BYOD policies rapidly expanded during the 2010s in [information-technology] communities and have [since] become increasingly common in other professional fields.”).

[7] For purposes of this post, the terms “employer” and “employers” refer only to private employers.  Similarly, the terms “employee” and “employees” refer only to employees of private employers.

[8] Although the term “BYOD” may refer to personal use by employees of employer-owned devices, BYOD is more often understood as employee use of a personally owned device to conduct work activities.  With BYOD, the Genie Is Out of the Bottle, So Deal with It, 23 No. 1 N.C. Emp. L. Letter 5 (M. Lee Smith ed., 2013).  The latter is the only form of BYOD that is discussed in this post.

[9] Id.  Such platforms include company email, calendar, and data servers.  Id.

[10] Employees favor BYOD policies because they provide the “freedom to ‘work and collaborate the way they prefer’” on devices familiar to them; employers favor BYOD policies because they cut costs and allow for a “‘more mobile, productive, [accessible], and satisfied’ workforce.”  Melinda L. McLellan et al., Wherever You Go, There You Are (With Your Mobile Device): Privacy Risks and Legal Complexities Associated with International “Bring Your Own Device” Programs, 21 Rich. J. L. & Tech., no. 3, 2014, at 1, 1.

[11] Id.

[12] Anna Johansson, Growth of BYOD Proves It’s No Longer an Optional Strategy, BetaNews, (last visited Mar. 11, 2021).

[13] See Alice Selvan, Adopting a BYOD Policy Amid the COVID-19 Era, ManageEngine (Dec. 3, 2020),  (describing how COVID-19 forced companies previously against the BYOD concept to accept it, as a substantial amount of remote work would not even be possible without it).

[14] See Richter, supra note 3, at 459 (“The growth of technology [in the] workplace is unavoidable and imminent.”).

[15] Id. at 445.

[16] Q4 2016: BYOD Trends & Practices, Trustlook Insights, (last visited Mar. 11, 2021).

[17] Id.  Such risks arise in large part because employers lose control when employees use their own devices and networks to store and transmit company data.  Bring Your Own Device (BYOD) . . . At Your Own Risk, Priv. Rts. Clearinghouse, (Oct. 1, 2014).

[18] Fredric D. Bellamy & Arturo Gonzalez, Crafting Bring Your Own Device (“BYOD”) Policies to Protect Your Company Data and Ensure Compliance With the Law, Nat’l L. Rev. (Oct. 11, 2018),

[19] Pedro Pavón, Risky Business: “Bring-Your-Own-Device” and Your Company, Am. Bar Ass’n (Sept. 30, 2013),

[20] Bellamy & Gonzalez, supra note 18.  In contrast, employers retain significantly greater control and access to company-owned devices that are merely provided to employees.  Richter, supra note 3, at 458.  However, the scope of employee rights surrounding employer-provided devices is outside the purview of this post and will not be discussed further.

[21] 18 U.S.C. § 1030.

[22] Id. § 1030(a)(2)(C).

[23] Brenda R. Sharton et al., Key Issues in Computer Fraud and Abuse Act (CFAA) Civil Litigation, Thomson Reuters Prac. L. 1 (2018),

[24] See United States v. Nosal, 844 F.3d 1024, 1050–51 n.2 (9th Cir. 2016) (recognizing protection of cell phones under the CFAA); see also United States v. Mitra, 405 F.3d 492, 495 (7th Cir. 2005) (same).

[25] See United States v. Nosal, 676 F.3d 854, 861 (9th Cir. 2012).

[26] 18 U.S.C. §§ 2510–2523.

[27] No. H-13-2517, 2014 WL 5878477 (S.D. Tex. Nov. 11, 2014).

[28] Id. at *1.

[29] Id.

[30] Id. at *2 (citing Garcia v. City of Laredo, Tex., 702 F.3d 788, 791 (5th Cir. 2012)).

[31] CFAA claims are only cognizable if the plaintiff can show that the unauthorized access to his or her computer resulted in a loss of at least $5,000 in a one-year period.  See 18 U.S.C. § 1030(c)(4)(A).  This is often a high hurdle for plaintiff-employees to meet.  See, e.g., Rajaee, 2014 WL 5878477, at *3–4 (determining that plaintiff’s loss of personal photos, cell phone contacts, text messages, notes, and emails did not satisfy the CFAA’s “loss” requirement).

[32] See Richter, supra note 3, at 458 (“It is a common principle that ‘law follows technology.’”) (quoting Fabio E. Marino & Teri H.P. Nguyen, Perils of the “Bring Your Own Device” Workplace, Nat’l L. J. (Nov. 18, 2013, 12:00 AM),

[33] See McLellan et al., supra note 10, at 6.

[34] See Mendez v. Piper, No. H041122, 2017 WL 1350770, at *14 (Cal. Ct. App. Apr. 12, 2017) (recognizing that the law surrounding an employee’s “rights of ownership and privacy in personal information” stored on a personal device is actively evolving).

[35] Bellamy & Gonzalez, supra note 18.

[36] Blair, supra note 4, at 162–63.  Private employees do not enjoy the right to privacy under the Fourth Amendment; this is reserved for public employees.  Id.

[37] See, e.g., Mintz v. Mark Bartelstein & Assocs., Inc., 906 F. Supp. 2d 1017, 1033 (C.D. Cal. 2012) (concluding that the plaintiff had an expectation of privacy in his personal email account despite the fact that he used the account for work-related matters).

[38] See, e.g., Sitton v. Print Direction, Inc., 718 S.E.2d 532, 537 (Ga. Ct. App. 2011) (deciding that the use of an employee’s laptop to review that employee’s emails did not invade the employee’s privacy).

[39] See, e.g., Mintz, 906 F. Supp. 2d at 1033 (explaining the appropriateness of the plaintiff’s use of a password on his email account).

[40] See, e.g., In re Asia Global Crossing, Ltd., 322 B.R. 247 (Bankr. S.D.N.Y. 2005) (applying this four-prong test to determine whether employee privacy was breached).

[41] Blair, supra note 4, at 162–63.

[42] Id.

[43] Pavón, supra note 19.

[44] See H.J. Heinz Co. v. Starr Surplus Lines Ins. Co., No. 2:15-CV-00631-AJS, 2015 WL 12791338, at *4 (W.D. Pa. July 28, 2015) (describing a similar BYOD policy that gave Heinz Co. custody and control of any company data present on employees’ personal mobile devices), report and recommendation adopted, No. 2:15-CV-00631-AJS, 2015 WL 12792025 (W.D. Pa. July 31, 2015).

[45] See, e.g., Muick v. Glenayre Elec., 280 F.3d 741, 743 (7th Cir. 2002) (no reasonable expectation of privacy in workplace computer files where employer had announced that he could inspect the computer); Thygeson v. U.S. Bancorp, No. CV-03-467-ST, 2004 WL 2066746, at *20 (D. Or. Sept. 15, 2004) (no reasonable expectation of privacy in computer files and email where employee handbook explicitly warned of employer’s right to monitor files and email).

[46] Pavón, supra note 19.

[47] What Is the Difference Between BYOD and MDM?, Centre Technologies (Mar. 17, 2015),

[48] Priv. Rts. Clearinghouse, supra note 17.

Post Image by Ryan Adams on Flickr.

By Mary Catherine Young

Last month, an Azerbaijani journalist was forced to deactivate her social media accounts after receiving sexually explicit and violent threats in response to a piece she wrote about Azerbaijan’s cease-fire with Armenia.[1] Some online users called for the Azerbaijan government to revoke columnist Arzu Geybulla’s citizenship—others called for her death.[2] Days later, an Irish man, Brendan Doolin, was criminally charged for online harassment of four female journalists.[3] The charges came on the heels of a three-year jail sentence handed down in 2019 for stalking six female writers and journalists online, one of whom reported receiving over 450 messages from Doolin.[4] Online harassment of journalists is palpable on an international scale.

Online harassment of journalists abounds in the United States as well, with women bearing the brunt of the abuse.[5] According to a 2019 survey conducted by the Committee to Protect Journalists, 90 percent of female or gender nonconforming American journalists said that online harassment is “the biggest threat facing journalists today.”[6] Fifty percent of those surveyed reported that they have been threatened online.[7] While online harassment plagues journalists around the world, the legal ramifications of such harassment are far from uniform.[8] Before diving into how the law can protect journalists from this abuse, it is necessary to expound on what online harassment actually looks like in the United States.

In a survey conducted in 2017 by the Pew Research Center, 41 percent of 4,248 American adults reported that they had personally experienced harassing behavior online.[9] The same study found that 66 percent of Americans said that they have witnessed harassment targeted at others.[10] Online harassment, however, takes many shapes.[11] For example, people may experience “doxing,” which occurs when one’s personal information is revealed on the internet.[12] Or, they may experience a “technical attack,” which includes harassers hacking an email account or preventing traffic to a particular webpage.[13] Much of online harassment takes the form of “trolling,” which occurs when “a perpetrator seeks to elicit anger, annoyance or other negative emotions, often by posting inflammatory messages.”[14] Trolling can encompass situations in which harassers intend to silence women with sexualized threats.[15]

The consequences of online harassment of internet users can be significant, invoking mental distress and sometimes fear for one’s physical safety.[16] In the context of journalists, however, the implications of harassment commonly reach beyond the individual journalist—the free flow of information in the media is frequently disrupted by journalists’ fear of cyberbullying.[17] How legal systems punish those who harass journalists online varies greatly both internationally and domestically.[18]

For example, the United States provides several federal criminal and civil paths to recourse for victims of online harassment, though not specifically geared toward journalists.[19] In terms of criminal law, provisions protecting individuals against cyber-stalking are included in 18 U.S.C. § 2261A, which criminalizes stalking in general.[20] According to this statute, “[w]hoever . . . with the intent to kill, injure, harass, intimidate, or place under surveillance with intent to . . . harass, or intimidate another person, uses . . . any interactive computer service . . . [and] causes, attempts to cause, or would be reasonably expected to cause substantial emotional distress to a person . . .” may be imprisoned.[21] In terms of civil law, plaintiffs may be able to allege defamation or copyright infringement claims.[22] For example, when the harassment takes the form of sharing an individual’s self-taken photographs without the photographer’s consent, whether they are explicit or not, the circumstances may allow the victim to pursue a claim under the Digital Millennium Copyright Act.[23]

Some states provide their own online harassment criminal laws, though states differ in whether the provisions are included in anti-harassment legislation or in their anti-stalking laws.[24] For example, Alabama,[25] Arizona,[26] and Hawaii[27] all provide for criminal prosecution for cyberbullying in their laws against harassment, whereas Wyoming,[28] California,[29] and North Carolina[30] include anti-online harassment provisions in their laws against stalking.[31] North Carolina’s stalking statute, however, was recently held unconstitutional as applied under the First Amendment after a defendant was charged with posting a slew of Google Plus posts about his bizarre wishes to marry the victim.[32] The North Carolina Court of Appeals decision in Shackelford seems to reflect a distinctly American general reluctance to interfere with individuals’ ability to freely post online out of extreme deference to First Amendment rights.

Other countries have taken more targeted approaches to legally protecting journalists from online harassment.[33] France, in particular, has several laws pertaining to cyberbullying and online harassment in general, and these laws have recently provided relief for journalists.[34] For example, in July 2018, two perpetrators were given six-month suspended prison sentences after targeting a journalist online.[35] The defendants subjected Nadia Daam, a French journalist and radio broadcaster, to months of online harassment after she condemned users of an online platform for harassing feminist activists.[36] Scholars who examine France’s willingness to prosecute perpetrators of online harassment against journalists and non-journalists alike point to the fact that while the country certainly holds freedom of expression in high regard, this freedom is held in check against other rights, including individuals’ right to privacy and “right to human dignity.”[37]

Some call for more rigorous criminalization of online harassment in the United States, particularly against journalists, to reduce the potential for online harassment to create a “crowding-out effect” that prevents actually helpful online speech from being heard.[38] It seems, however, that First Amendment interests may prevent many journalists from finding relief—at least for now.

[1] Aneeta Mathur-Ashton, Campaign of Hate Forces Azeri Journalist Offline, VOA (Jan. 8, 2021),

[2] Id.

[3] Tom Tuite, Dubliner Charged with Harassing Journalists Remanded in Custody, The Irish Times (Jan. 18, 2021),

[4] Brion Hoban & Sonya McLean, ‘Internet Troll’ Jailed for Sending Hundreds of Abusive Messages to Six Women, The (Nov. 14, 2019),

[5] Lucy Westcott & James W. Foley, Why Newsrooms Need a Solution to End Online Harassment of Reporters, Comm. to Protect Journalists (Sept. 4, 2019),

[6] Id.

[7] Id.

[8] See Anya Schiffrin, How to Protect Journalists from Online Harassment, Project Syndicate (July 1, 2020),

[9] Maeve Duggan, Online Harassment in 2017, Pew Rsch. Ctr. (July 11, 2017),

[10] Id.

[11] Autumn Slaughter & Elana Newman, Journalists and Online Harassment, Dart Ctr. for Journalism & Trauma (Jan. 14, 2020),

[12] Id.

[13] Id.

[14] Id.

[15] Id.

[16] Duggan, supra note 9.

[17] Law Libr. of Cong., Laws Protecting Journalists from Online Harassment 1 (2019),

[18] See id. at 3–4; Marlisse Silver Sweeney, What the Law Can (and Can’t) Do About Online Harassment, The Atl. (Nov. 12, 2014),

[19] Hollaback!, Online Harassment: A Comparative Policy Analysis for Hollaback! 37 (2016),

[20] 18 U.S.C. § 2261A.

[21] § 2261A(2)(B).

[22] Hollaback!, supra note 19, at 38.

[23] Id.; see also 17 U.S.C. §§ 1201–1332.

[24] Hollaback!, supra note 19, at 38–39.

[25] Ala. Code § 13A-11-8.

[26] Ariz. Rev. Stat. Ann. § 13-2916.

[27] Haw. Rev. Stat. § 711-1106.

[28] Wyo. Stat. Ann. § 6-2-506.

[29] Cal. Penal Code § 646.9.

[30] N.C. Gen. Stat. § 14-277.3A.

[31] Hollaback!, supra note 19, at 39 (providing more states that cover online harassment in their penal codes).

[32] State v. Shackelford, 825 S.E.2d 689, 701 (N.C. Ct. App. 2019). After meeting the victim once at a church service, the defendant promptly made four separate Google Plus posts in which he referenced the victim by name. Id. at 692. In one post, the defendant stated that “God chose [the victim]” to be his “soul mate,” and in a separate post wrote that he “freely chose [the victim] as his wife.” Id. After nearly a year of increasingly invasive posts in which he repeatedly referred to the victim as his wife, defendant was indicted by a grand jury on eight counts of felony stalking. Id. at 693–94.

[33] Law Libr. of Cong., supra note 17, at 1–2.

[34] Id. at 78–83.

[35] Id. at 83.

[36] Id.

[37] Id. at 78.

[38] Schiffrin, supra note 8.

Post Image by Kaur Kristjan on Unsplash.

9 Wake Forest L. Rev. Online 21

Alexander W. Prunka*

I. Introduction

In the era of the #MeToo movement, there has been a dramatic push to name names and expose individuals accused of sexual misconduct and harassment across the world.[1] Before Harvey Weinstein was first accused and the #MeToo movement stormed onto the scene, though, college campuses were already predicting what was to come.[2]

For example, in 2014, on the heels of recent changes to the federal government’s interpretation of Title IX as it relates to peer-to-peer sexual misconduct, advocates founded the It’s On Us campaign to end sexual assault.[3] In 2015, a shocking documentary premiered detailing the prevalence of sexual assault on college campuses and institutional failure to address the issue.[4] The documentary featured prestigious universities, including the University of North Carolina at Chapel Hill (“UNC”).

The Daily Tar Heel (“DTH”), UNC’s campus newspaper, has long argued that UNC should disclose the names of individuals found responsible for sexual misconduct by the University.[5] DTH has a history of seeking access to student disciplinary records: it took its 1996 attempt to publicize Honor Court proceedings and declassify their records to the North Carolina Court of Appeals.[6] DTH has been so dedicated to exposing UNC’s shortcomings in addressing sexual misconduct, it once published the details of victims’ complaints to the Department of Education against the victims’ wishes and without their consent.[7] So what happens when a student news organization allows its desire to spite its university and publicly shame those accused of sexual misconduct to drive its reporting agenda? Groundbreaking litigation, apparently.[8]

The Family Educational Rights and Privacy Act of 1974[9] (“FERPA”) is a comprehensive statute protecting the privacy of student records.[10] With its broad protections, FERPA can be seen as a shield: protecting students from unwarranted invasions of privacy at all educational levels.[11] FERPA does, however, have some narrow exceptions.[12] The North Carolina Public Records Act[13] (“Public Records Act”), on the other hand, requires disclosure of a broadly defined class of public records, and exceptions or exemptions are narrowly construed.[14]

On April 17, 2018, the North Carolina Court of Appeals issued a landmark decision in a lawsuit brought by DTH against UNC.[15] Reversing the superior court’s judgment in favor of UNC, the court of appeals’ decision compels UNC to disclose records identifying students found responsible by the University for virtually any violation of sexual misconduct policies over a nearly ten-year period. Thus, the court of appeals effectively endorsed DTH’s attempt to weaponize FERPA—a protective statute—through a misleading interpretation of a particular FERPA exception read in conjunction with the Public Records Act.

Part II discusses the history and background of FERPA, the Public Records Act, and Title IX of the Education Amendments of 1972 (“Title IX”). Part III discusses the case of DTH Media Corp. v. Folt[16] and the decision by the North Carolina Court of Appeals. Finally, Part IV argues the court of appeals was fundamentally incorrect in deciding for DTH. This Note concludes the North Carolina Supreme Court should properly determine that FERPA grants UNC discretion in determining whether to release the records in question, the Public Records Act is in conflict with that discretion, and FERPA preempts the Public Records Act to the extent it conflicts with the discretion given by FERPA. Further, this Note analyzes some of the public policy implications of the court of appeals decision to illustrate the need to reverse.

II. Background

FERPA and the Public Records Act form the basis of the legal question before the North Carolina Supreme Court in DTH Media Corp. v. Folt.[17] However, without recent interpretations of Title IX and subsequent changes to universities’ Title IX enforcement policies regarding peer-to-peer sexual misconduct,[18] the push to expose inadequacies in institutional responses to sexual misconduct may not have materialized. Thus, Title IX is indirectly at the heart of the litigation as well.

A. Student Disciplinary Records and FERPA

FERPA has two major purposes: to ensure access to student records for parents and students and “to protect [students’ and families’] right to privacy by limiting the transferability of their [educational] records without their consent.”[19] Educational records are “those records, files, documents, and other materials which contain information directly related to a student and are maintained by an educational agency or institution or by a person acting for such agency or institution.”[20] The statute provides only a handful of narrow exceptions.[21]

FERPA protects student privacy through an exercise of Congress’ spending power.[22] However, because FERPA’s statutory scheme and enforcement mechanisms do not confer a private right of action for violations,[23] the only avenue for enforcement is for aggrieved students to file a complaint with the Department of Education.[24] While the Department of Education has broad authority to withhold funding from institutions in violation of FERPA,[25] no school has ever lost funding.[26]

FERPA has been substantively amended several times.[27] In 1990, a section of the Student Right-to-Know and Campus Security Act modified FERPA by inserting a provision which permits institutions of higher education to disclose the outcome of disciplinary proceedings to the victims of crimes of violence.[28] The Higher Education Amendments Act of 1998 amended FERPA further, creating an exception and giving institutions of higher education the authority to disclose to anyone the final result of a disciplinary proceeding conducted against a student who was alleged to have committed a crime of violence or nonforcible sex offense and has been determined to have violated the institution’s rules pertaining to such offenses (hereinafter the “final result exception”).[29] The final result exception, while narrow and limited in scope, includes a broad list of crimes.[30]

The day after the House of Representatives voted in favor of the final result exception, Representative Mark Foley, the amendment’s primary sponsor, made a statement on the floor of the House,[31] claiming the amendment was designed to provide balance “between one student’s right of privacy to another student’s right to know about a serious crime in his or her college community,”[32] and that it would make reporting on such records “subject to the State laws that apply.”[33] Representative Foley discussed the allegation that schools were using student disciplinary hearings to conceal crime issues on campuses.[34] He stated the amendment was important “[b]ecause . . . parents and community leaders and others deserve to know the statistical problems that are being experienced on our Nation’s campuses.”[35]

In the mid-1990s, a years-long battle between news media and Miami University began over student disciplinary records.[36] After the Miami Student successfully convinced the Ohio Supreme Court that student disciplinary records were not student records protected by FERPA, The Chronicle of Higher Education sought the disclosure of disciplinary records, “fraught with personally identifiable information and virtually untainted by redaction.”[37] In 2002, the Sixth Circuit held student disciplinary records were protected under FERPA, in part because of the final result exception.[38] Because Ohio’s public records law did not apply to federally-protected records, disclosure was prohibited.[39] In its decision, the Sixth Circuit opined about the significant weight Congress has placed on student privacy rights through its creation of FERPA.[40]

B. North Carolina’s Public Records Law

Until 1935, North Carolina had no public records statute and relied on common law principles to govern citizen access to public records.[41] The statute enacted that year provided significantly greater access rights, but it was passed primarily for historical preservation purposes; citizen access was an afterthought.[42]

In 1975, North Carolina passed a new public records law providing for much broader access to state and local government records.[43] The law as it is now is incredibly broad.[44] Any document created by a public agency constitutes a public record, with the main limitation being specific statutory exceptions.[45] While the General Assembly has provided broad protection to the educational records of elementary and secondary students,[46] no similar provision exempting records of students within the UNC system or the North Carolina Community College system exists.[47]

It is difficult to imagine that this lack of an exception was anything other than deference to FERPA[48] or a mere oversight. As Ryan Fairchild explained, the wording of the Public Records Act is so broad and liberal that its application could conceivably require absurd disclosures.[49] Despite the potential for absurdity, the North Carolina Supreme Court has been clear that “whether [exceptions] should be made is a question for the legislature, not the Court.”[50]

The North Carolina Court of Appeals first addressed FERPA’s protection of student disciplinary records in the UNC system twenty years ago in DTH Publishing Corp. v. University of North Carolina.[51] There, it held that student disciplinary proceedings were validly held in closed session under the state open meetings law because the proceedings required divulging student records.[52] The court reasoned that “FERPA was adopted to address systematic . . . violations of students’ privacy and confidentiality rights through unauthorized releases of sensitive educational records,”[53] and FERPA’s conditional funding therefore rendered the records “privileged or confidential.”[54] The court held that the minutes of disciplinary proceedings were exempt from the Public Records Act because release would “frustrate the purpose” of a closed session.[55] While DTH Publishing dealt broadly with student disciplinary records,[56] the issue of records falling under the final result exception has not been addressed by North Carolina courts until now.

C. Title IX and Sexual Misconduct

Title IX declares: “No person in the United States shall, on the basis of sex, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any education program or activity receiving Federal financial assistance . . .”[57] On April 4, 2011, in response to a growing epidemic of sexual misconduct on college campuses,[58] Vice President Joe Biden and Secretary of Education Arne Duncan announced a “Dear Colleague” letter outlining the Department of Education’s interpretations of how peer-to-peer sexual misconduct relates to Title IX.[59] The significant policy pivots in the letter were not subject to notice and comment rulemaking procedures.[60]

In response, universities refined how they addressed peer-to-peer sexual misconduct.[61] Along with new policies came a substantial increase in disciplinary enforcement of sexual misconduct policies.[62] Since the release of the Dear Colleague Letter, complaints of noncompliance to the Office for Civil Rights have increased exponentially each year,[63] and to date, the Office has opened more than 500 investigations into universities’ handling of sexual misconduct allegations.[64]

Accompanying these changes has been a host of litigation against universities by students accused or disciplined in Title IX sexual misconduct proceedings.[65] The plaintiff in Doe v. The Ohio State University[66] claimed that The Ohio State University’s disciplinary procedures relating to Title IX sexual misconduct allegations would violate an accused student’s right to privacy.[67] The district court, noting that such a claim would not be ripe without disclosure, concluded the claim was without merit because all parties, the district court, and the Sixth Circuit Court of Appeals were in agreement that student disciplinary records produced in Title IX disciplinary proceedings were protected under FERPA.[68] The court noted that there was no concern about disclosure under the final result exception because the records in question did not constitute the final result of a disciplinary proceeding.[69]

Since the beginning of President Donald Trump’s term, the Department of Education has turned back the clock on the interpretation of how Title IX applies to peer-to-peer sexual misconduct. In September 2017, the administration rescinded the Dear Colleague Letter and subsequent clarifying guidance,[70] issuing interim guidance that gives colleges and universities more flexibility in crafting peer-to-peer sexual misconduct policies and allows the use of the more stringent clear and convincing evidence standard in disciplinary proceedings.[71] These changes were implemented in hopes of making the process fairer for all parties and with the intention that official rules would be promulgated in the future.[72]

In November 2018, the Department of Education proposed new rules.[73] The proposed rules feature more protections for the accused and narrow the definition of actionable sexual misconduct.[74] Further, universities would have discretion in determining whether to investigate allegations of off-campus sexual misconduct.[75] While the exact impact of these changes is unclear,[76] it is plain that Title IX will remain the driving force behind universities’ enforcement of peer-to-peer sexual misconduct policies.

III. The Case: DTH Media Corp. v. Folt

On September 30, 2016, DTH sent a letter to UNC requesting “copies of all public records made or received by [UNC] in connection with a person having been found responsible for rape, sexual assault or any related or lesser included sexual misconduct.”[77] In a column days later, DTH Editor-in-Chief Jane Wester argued disclosure of names was necessary because she “badly want[ed] to know” how many people UNC has found responsible for sexual assault and what sanctions were being imposed.[78]

UNC denied the request, and DTH filed a declaratory judgment action on November 21, 2016.[79] Eventually, the Superior Court entered judgment in favor of UNC, concluding that FERPA grants universities discretion in determining whether to release records to the public under the final result exception and that this grant of discretion preempted required disclosure under the Public Records Act.[80] DTH appealed, and the North Carolina Court of Appeals issued its shocking decision on April 17, 2018.[81] The court reasoned that under proper canons of statutory interpretation, FERPA and the Public Records Act should be read to avoid conflict.[82] Reading the statutes in such a way, the court concluded the final result exception did not grant public universities absolute discretion in making disclosures.[83] The court determined that DTH was entitled to the records to the fullest extent they fell under the § 1232g(b)(6)(B) exception, fully granting the request except as to the date of the offenses.[84] Finally, the court explained its belief that FERPA did not preempt the Public Records Act in this case.[85]

IV. FERPA Preempts the Public Records Act

The North Carolina Supreme Court should first determine that the final result exception is a grant of discretionary power to universities to disclose particular records. Next, it should determine that the Public Records Act does not yield to the final result exception because the exception does not serve as an express statutory exemption which prohibits disclosure of the records in question. Finally, the court should conclude that FERPA and the Public Records Act conflict, and FERPA’s grant of discretion preempts the Public Records Act through implicit conflict preemption.

The court of appeals’ interpretation of the final result exception is based on the exception’s plain language.[86] However, the reasoning suggests the court’s interpretation of FERPA’s text relies on the conclusion that FERPA is in pari materia with the Public Records Act, and that they must be read in context with one another.[87] Statutes are considered in pari materia when they share a common aim or purpose or when they speak on the same subject.[88] When the text of a statute under consideration is clear, though, statutes in pari materia should not control construction.[89]

Even assuming, arguendo, the court of appeals read the statutes in pari materia to resolve ambiguity, such a reading would be improper because FERPA and the Public Records Act cannot reasonably be considered in pari materia. FERPA is a shield, providing comprehensive protections to students by preventing disclosure of student records.[90] The Public Records Act, on the other hand, is a sword, broadly requiring disclosure of a vast array of records.[91] No matter how the subjects, purposes, and aims of the statutes are framed, they will never be in pari materia.[92] Since much of the court’s analysis of the final result exception rests upon the faulty notion that it must be read in context with the Public Records Act,[93] it is a fair assumption that the mistake substantially and fatally flawed the court’s entire analysis.

A. The Meaning of the Section 1232g(b)(6)(B) Exception

1. The Plain Text

North Carolina courts have long followed the plain language rule in statutory interpretation: “If the language of the statute is clear and is not ambiguous, we must conclude that the legislature intended the statute to be implemented according to the plain meaning of its terms.”[94]

While the court of appeals concluded that nothing in the text of the final result exception[95] “required” UNC to exercise discretion in determining whether to disclose results within the final result exception,[96] a plain reading of the statute indicates the final result exception grants universities the discretion to determine whether to make such disclosures.

The language “[n]othing in this section shall be construed to prohibit . . .” indicates that the conduct is allowed, but not required.[97] The exception creates a discretionary decision: the university may choose whether to engage in the excepted conduct.[98] Thus, a university clearly has a discretionary choice of whether to disclose the final result of certain disciplinary proceedings.[99]

The court of appeals ignores this common-sense reading, arguing the only hint of discretion within the final result exception is the limiting condition that the exception applies only when “the institution determines as a result of that disciplinary proceeding that the student committed a violation of the institution’s rules or policies with respect to such crime or offense.”[100] Further, the court of appeals insists that FERPA’s judicial order exception demonstrates that the FERPA exception does not grant institutions discretion in determining whether to release records.[101]

The court’s logic misses the mark, ignoring that the judicial order exception is an independent exception.[102] “Just as Congress’ choice of words is presumed to be deliberate, so too are its structural choices.”[103] In 1998, Congress chose to amend FERPA to add the final result exception.[104] The court should have presumed Congress was deliberate in its structural placement and wording of the final result exception, rather than focus on such a circular argument.[105]

2. Legislative Intent Demonstrates That Discretion is Appropriate

Although the meaning of the final result exception is plain on its face, even if the language is ambiguous, FERPA evinces a legislative intent to leave the decision to disclose records under the exception within the discretion of universities. Our supreme court notes that “legislative intent controls the meaning of a statute” and directs that to determine intent, “a court must consider the act as a whole, weighing the language of the statute, its spirit, and that which the statute seeks to accomplish.”[106]

Because we must presume that Congress was deliberate in its wording of the final result exception,[107] it is telling that Congress crafted a permissive exception.[108] Under the court of appeals’ decision and the language of the Public Records Act, virtually any request for records falling within the final result exception would trigger mandatory disclosure by the sixteen constituent universities of the UNC system; for North Carolina’s public universities, the permissive exception would become a disclosure requirement. Where Congress chose not to require disclosure of these records, such a requirement is surely inconsistent with the intent of the law.

Requiring disclosures in such a way is grossly inconsistent with the spirit and goals of FERPA. The court of appeals places great emphasis on the statement Representative Foley made the day after the provision was approved by the House of Representatives.[109] Regarding this type of misguided reliance, Justice Scalia said it best: “Arguments based on subsequent legislative history, like arguments based on antecedent futurity, should not be taken seriously, not even in a footnote.”[110] Even assuming, for the sake of discussion, that Representative Foley’s statement has a scintilla of importance in determining the intent of Congress, the statement clearly demonstrates that the intent of the amendment was to balance the interest “between one student’s right of privacy to another student’s right to know about a serious crime in his or her college community.”[111] Balance requires the measurement and offsetting of competing interests to achieve the most desirable result,[112] and universities would be in the best position to balance the interests of the community against the privacy interest of the students.[113] It is preposterous to conclude that Congress expected that the law governing these records would require blind disclosure without any balancing of interests.

B. The Public Records Act Does Not Yield to the Discretion Granted by the Final Result Exception

Because the conflicting law exemption found in section 132-1(b) of the Public Records Act is construed so narrowly,[114] our supreme court should not determine that the Public Records Act yields to FERPA. Construing this provision narrowly, the court should note that while FERPA itself would specifically provide a broad exemption for student records under the Public Records Act,[115] the final result exception removes certain records from that category. Thus, the final result exception does not “otherwise provide” that records within the exception may not be disclosed. Instead, because the final result exception permits disclosure of the records, they are subject to section 132-1(b)’s disclosure requirements unless preempted by FERPA.

C. FERPA’s Grant of Discretion to Colleges and Universities Preempts the Public Records Act

The supreme court should determine that the Public Records Act is in conflict with the final result exception of FERPA, and therefore FERPA implicitly preempts the Public Records Act to the extent it requires disclosure of records within the final result exception.[116] The court of appeals relies on the notion that it should presume both that the Public Records Act does not conflict with FERPA[117] and that federal preemption does not apply.[118] While it would be logical to presume that two statutes enacted by the same sovereign are not meant to contradict one another, there is little sense in assuming that two unrelated legislatures would avoid conflict to any extent.[119]

Federal preemption may be either express or implied.[120] Courts have taken two avenues of analysis of implicit conflict preemption: “obstacle” preemption occurs when a state statute “stands as an obstacle to the accomplishment and execution of the full purposes and objectives of Congress,”[121] while “impossibility” preemption occurs when compliance with both state and federal law is a “physical impossibility.”[122]

It has been argued that the federal judiciary has grossly misapplied implicit conflict preemption through a broad reading of purposes and objectives preemption.[123] Since at least 2000, Supreme Court justices have warned of this overwhelming expansion.[124] Advocates for change often argue for a much stronger presumption against preemption, an increased reliance on the nuanced and cumbersome “physical impossibility” analysis, or both.[125] Rather than face the seemingly artificial choice between the ridiculously broad and the uncompromisingly narrow analyses, courts should treat implicit preemption analysis simply as “an inquiry into whether the ordinary meanings of state and federal law conflict.”[126]

Such a plain text approach to implicit preemption analysis requires a full understanding of the purposes underlying the Supremacy Clause.[127] The Supremacy Clause contains a rule of applicability, requiring that federal law apply in state courts with the same force as state law,[128] and a rule of priority, requiring that federal law apply over state law when the two conflict.[129] These two rules, without further historical understanding, leave the final phrase of the Supremacy Clause, “anything in the Constitution or laws of any State to the contrary notwithstanding,”[130] seemingly redundant.[131]

Understood in the context of the ratification debates, however, this phrase was critically necessary to the success of the Supremacy Clause.[132] At the time of ratification, courts applied a presumption against reading statutes in a manner that produced conflict, because such a conflict would work an implied repeal.[133]

In response to the presumption against implied repeals, legislatures sometimes included a non obstante provision to signal to courts that new legislation might indeed contradict other statutes and that possible conflict should not skew the meaning of the statute.[134] The language of such clauses often dictated that the statute would apply “any law to the contrary notwithstanding,” or used similar wording to the same effect.[135] State courts, left to interpret the Supremacy Clause’s rule of priority on their own, might still have applied the presumption and stretched the meaning of a federal statute to avoid conflict and implied repeal. To foreclose that possibility, the drafters of the Constitution closed the Supremacy Clause with the phrase “anything in the Constitution or laws of any State to the contrary notwithstanding” as a universal non obstante clause: it applies to all federal laws, specifically contemplates potential conflict with state law, and cautions interpreting courts not to stretch their reading of federal statutes.[136] A plain text approach to implicit preemption, free from judicial policymaking, gives meaning to the framers’ express words and to their intent that courts should not strain to find harmony between apparently conflicting state and federal statutes.[137]

In 2011, the Supreme Court came its closest to implementing a plain text approach to implicit preemption, guided by the Supremacy Clause’s non obstante provision. In PLIVA, Inc. v. Mensing,[138] Justice Thomas delivered the opinion of the Court.[139] Although the critical implied preemption analysis was only a plurality portion of the opinion, the time may soon arrive when our nation’s courts finally do away with difficult and nuanced tests for conflict preemption.[140]

Though PLIVA specifically discusses judicial speculation about actions which could reconcile federal and state law under an impossibility preemption analysis,[141] it stands for a broader textualist approach to conflict preemption: “The non obstante provision of the Supremacy Clause indicates that a court need look no further than the ordinary meaning of federal law, and should not distort federal law to accommodate conflicting state law.”[142]

Taking a textual approach to implicit conflict preemption simply requires determining whether the text of the state law conflicts with the text of the federal law.[143] Focusing on the text of statutes would simplify the analysis by removing the need to classify the conflict in terms of obstacle or impossibility. A clear rule based in a textual analysis will remove the need to speculate and stretch meaning, producing more consistent results and comporting more fully with the non obstante provision of the Supremacy Clause.

It is clear that the Public Records Act conflicts with FERPA to the extent that it would require blind disclosure of all records falling within the final result exception. The ordinary language of the exception clearly reveals Congress’ intent to grant universities discretion in disclosing these records.[144] Because the Public Records Act would require UNC to blindly disclose the records, it interferes with UNC’s ability to exercise the discretion the final result exception grants.

D. Policy Implications

North Carolina courts generally defer questions of public policy to the General Assembly.[145] Though the North Carolina Supreme Court need not give much weight to policy considerations, it is important to consider some potential implications of affirming the court of appeals.

The most troubling policy consideration is that the release of records identifying students as responsible for “rape, sexual assault or any related or lesser included sexual misconduct” could create constitutional privacy issues. Doe v. The Ohio State University left open the possibility that, if Title IX investigation records were not protected, an accused student might have a cognizable substantive due process claim under the United States Constitution.[146] Named students certainly would have a legitimate concern: the Southern District of Ohio framed it as “the interest in avoiding disclosure of highly personal matters.”[147] State-run universities would be required to disclose their conclusions that individuals committed crimes, conclusions often based on “investigations” with low evidentiary standards and limited due process rights.

Furthermore, what of the negative effects that required blind disclosure would have upon the goals of Title IX’s peer-to-peer sexual misconduct policy enforcement? Confidentiality in the process is at the crux of Title IX and a major reason why victims often prefer reporting to their university rather than to the police.[148]

Finally, there are instances where false accusations occur.[149] In a system where, for at least the majority of the last ten years, the federal government has required universities to adjudicate these allegations under the low preponderance of the evidence standard,[150] are we ready to risk upending lives by labeling people as predators[151] and rolling back progress made for victims?[152] The Duke Lacrosse and Rolling Stone cases show that such risks should be considered.

These few concerns raise the question: with so much at stake, and a grant of discretion so clear, is there any need to weaponize the final result exception in conjunction with the Public Records Act?

V. Conclusion

In the end, what would truly serve the interests of progress and student welfare would be a release of detailed, non-personally identifiable information about sexual misconduct on campus. Indeed, Wester has gone on the record several times describing the needs allegedly at the heart of DTH’s request.[153] These needs do not require naming names. Even Representative Foley, who sponsored the final result exception, noted the importance of using statistics to inform the community.[154]

It is frustrating that no exception to the Public Records Act is on the books for student records in the University of North Carolina system.[155] The General Assembly could have created such a provision and still could moot this litigation by fixing it now. Perhaps Congress, too, should reconsider the need for the final result exception.

For now, the question is before the North Carolina Supreme Court. With a proper textual approach to statutory construction, our supreme court should conclude that the final result exception does give discretion to universities, and therefore the Public Records Act’s requirement to disclose is in conflict with FERPA. Without acrobatic harmonizing, the supreme court should find that FERPA preempts the Public Records Act to the extent this conflict exists, and reverse the court of appeals.

* J.D. Candidate 2020, Wake Forest University School of Law. Many thanks to my family, the Michael Bublé Fan Club, and most importantly my ever-patient fiancée, Kelsie. Additional thanks to Ms. Andie Harrelle and Dr. Tamika Wordlow-Williams for giving me the opportunity to work at the Office of Student Rights and Responsibilities at East Carolina University where I gained appreciation for student conduct topics.

    1. See generally Christen A. Johnson & KT Hawbaker, #MeToo: A Timeline of Events, Chicago Tribune (Mar. 7, 2019, 9:43 AM), (outlining the history of the #MeToo movement).

    2. Lena Felton, How Colleges Foretold the #MeToo Movement, Atlantic (Jan. 17, 2018),

    3. Our Story, It’s On Us, (last visited Dec. 20, 2018).

    4. See The Hunting Ground (The Weinstein Company 2015). The author notes the painful irony of the fact that Harvey Weinstein’s company was behind a film on this subject.

    5. Jane Wester, Column: We Should Know Who’s Found Responsible for Sexual Assault, Daily Tar Heel (Oct. 2, 2016, 11:47 PM),

    6. DTH’s arguments fell flat at the court of appeals. See DTH Publ’g Corp. v. Univ. of North Carolina, 496 S.E.2d 8 (N.C. Ct. App. 1998).

    7. See Tyler Kingkade, The Daily Tar Heel Published Details of Rape Victims’ Federal Complaint Without Consent, Huffington Post (Jan. 29, 2013),

    8. See, e.g., DTH Media Corp. v. Folt, 816 S.E.2d 518 (N.C. Ct. App. 2018) (bringing suit against university to compel disclosure of records naming those found responsible for sexual misconduct by the university); DTH Publ’g Corp., 496 S.E.2d at 10 (bringing suit against university to compel disclosure of records university was allegedly wrongly withholding).

    9. 20 U.S.C. § 1232g (2012).

    10. See infra Part II.A.

    11. Id.

    12. See, e.g., 20 U.S.C. § 1232g(b)(6)(A)–(B) (2012) (permitting the release of records in certain instances).

    13. N.C. Gen. Stat. § 132-1 et seq. (2017).

    14. See infra Part II.B.

    15. DTH Media Corp. v. Folt, 816 S.E.2d 518, 518–21 (N.C. Ct. App. 2018).

    16. 816 S.E.2d 518 (N.C. Ct. App. 2018).

    17. See generally id. at 523–26 (deciding whether FERPA and the North Carolina Public Records act conflicted and whether the records must be released).

    18. See infra Part II.C.

    19. See 120 Cong. Rec. 39,862 (daily ed. Dec. 13, 1974) (joint statement of Sens. Buckley and Pell). FERPA, introduced as a floor amendment in the Senate, was never considered by a committee and thus lacks much of the typical legislative history, such as committee reports and hearings. See Robert W. Futhey, Note, The Family Educational Rights & Privacy Act of 1974: Recommendations for Realigning Educational Privacy with Congress’ Original Intent, 41 Creighton L. Rev. 277, 311 (2008). As discussed later, subsequent legislative history is nearly worthless in determining legislative intent. See infra Part IV.A.2. This is, however, the only dependable signal of the legislative intent behind FERPA.

    20. 20 U.S.C. § 1232g(a)(4)(A) (2012).

    21. See 20 U.S.C. § 1232g(a)(4)(B) (2012).

    22. See 20 U.S.C. § 1232g(b) (2012) (“No funds shall be made available under any applicable program to any educational agency or institution which has a policy or practice of permitting the release of educational records . . . .”).

    23. See Gonzaga Univ. v. Doe, 536 U.S. 273, 287 (2002).

    24. See 34 C.F.R. § 99.63 (2008); see also Gonzaga Univ., 536 U.S. at 287.

    25. See 20 U.S.C. § 1232g(b) (2012).

    26. See Tyler Kingkade, Why Colleges Hide Behind this One Privacy Law All the Time, Huffington Post (Feb. 1, 2016, 6:44 PM),

    27. See Lynn M. Daggett, Bucking Up Buckley I: Making the Federal Student Records Statute Work, 46 Cath. U. L. Rev. 617, 617 (1996–1997).

    28. See Student Right-to-Know and Campus Security Act, Pub. L. No. 101-542, § 204, 104 Stat. 2381, 2385-87 (1990) (codified as amended at 20 U.S.C. § 1232g(b)(6) (2012)); Daggett, supra note 27, at 621.

    29. In the student disciplinary record disclosure system there are two separate, yet equally important exceptions: 20 U.S.C. § 1232g(b)(6)(A), which applies only to disclosure to the victims, and 20 U.S.C. § 1232g(b)(6)(B), which applies to disclosure to anyone. As noted supra, and as applies infra, this is Section 1232g(b)(6)(B)’s story only. See, e.g., Law & Order: Point of View (NBC television broadcast Nov. 25, 1992) (providing the framework for this witty citation and serving as the world’s introduction to the legendary Detective Lennie Briscoe).

    30. Included in the list of offenses falling into the category of “crime of violence” are arson; burglary; criminal homicide; destruction, damage, or vandalism of property; kidnapping or abduction; robbery; forcible sex offenses; and, perhaps broadest of all, “assault offenses.” 34 C.F.R. § 99.39 (2000).

    31. See 144 Cong. Rec. 8435 (daily ed. May 7, 1998) (statement of Rep. Foley) (“It did pass yesterday. We hope the Senate will consider the amendment.”).

    32. See id. at 8434.

    33. See id.

    34. See id.

    35. See id. at 8435.

    36. See United States v. Miami Univ., 294 F.3d 797, 803 (6th Cir. 2002).

    37. Id. at 803–04, 811.

    38. See id. at 811–13.

    39. See id.

    40. See id. at 807.

    41. See Thomas H. Moore, You Can’t Always Get What You Want: A Look at North Carolina’s Public Records Law Comments, 72 N.C. L. Rev. 1527, 1543 (1993–1994).

    42. Id.

    43. See Moore, supra note 41, at 1544–45.

    44. See id. at 1544. The law encompasses:

      all documents, papers, letters, maps, books, photographs, films, sound recordings, magnetic or other tapes, electronic data-processing records, artifacts, or other documentary material, regardless of physical form or characteristics, made or received pursuant to law or ordinance in connection with the transaction of public business by any agency of North Carolina government or its subdivisions. Agency of North Carolina government or its subdivisions shall mean and include every public office, public officer or official (State or local, elected or appointed), institution, board, commission, bureau, council, department, authority or other unit of government of the State or of any county, unit, special district or other political subdivision of government

      N.C. Gen. Stat. § 132-1(a) (2017).

    45. See News & Observer Pub. Co., Inc. v. Poole, 412 S.E.2d 7, 12 (N.C. 1992); see also Ryan C. Fairchild, Giving Away the Playbook: How North Carolina’s Public Records Law Can Be Used to Harass, Intimidate, and Spy, 91 N.C. L. Rev. 2117, 2126 (2013).

    46. See N.C. Gen. Stat. § 115C-402(e) (2017).

    47. The only exceptions on the books for records of UNC are for personally identifying information from or about an applicant to a constituent institution, or records pertaining to liability insurance programs of constituent institutions. See Fairchild, supra note 45, at 2129–30; N.C. Gen. Stat. §§ 132-1.1(f), 116-222 (2017).

    48. Some states’ failure to enact student privacy laws may be the result of a belief that FERPA adequately provides robust protection for student privacy rights, or that the federal government has occupied the field. See Lynn M. Daggett, FERPA in the Twenty-First Century: Failure to Effectively Regulate Privacy for All Students, 58 Cath. U. L. Rev. 59, 113 (2008).

    49. See Fairchild, supra note 45, at 2130–31. Such disclosures could include football playbooks, academic exams, and academic research work. See id.

    50. News & Observer Pub. Co., Inc., 412 S.E.2d at 18.

    51. 496 S.E.2d 8, 8 (N.C. Ct. App. 1998).

    52. See id. at 13.

    53. Id. at 12 (quoting Smith v. Duquesne Univ., 612 F. Supp. 72, 80 (1985), aff’d, 787 F.2d 583 (1986)). The “privileged and confidential” status of the records allowed disciplinary hearings to be held in closed session under an exception to the state open meetings law. See id.

    54. See id.; see also N.C. Gen. Stat. § 143-318.11(a) (2017).

    55. See DTH Publ’g Corp., 496 S.E.2d at 13; see also N.C. Gen. Stat. § 143-318.10(e) (2017).

    56. See DTH Publ’g Corp., 496 S.E.2d at 8–9 (discussing the factual background of the case).

    57. Education Amendments of 1972, Pub. L. No. 92-318, § 901, 86 Stat. 373 (1972) (codified as amended at 20 U.S.C. 1681(a) (2012)).

    58. See, e.g., Christopher P. Krebs et al., The Campus Sexual Assault (CSA) Study at xiii (2007) (stating that almost twenty percent of women report being victims of sexual assault since entering college). For a more thorough discussion on the issue of sexual misconduct on college campuses, see Brian A. Pappas, Out from the Shadows: Title IX, University Ombuds, and the Reporting of Campus Sexual Misconduct, 94 Denv. L. Rev. 71, 74–75 (2016–2017).

    59. See Press Release, U.S. Dep’t of Educ., Vice President Biden Announces New Administration Effort to Help Nation’s Schools Address Sexual Violence (Apr. 4, 2011); U.S. Dep’t of Educ. Office for Civil Rights, Dear Colleague Letter (Apr. 4, 2011) [hereinafter Dear Colleague Letter]. The Dear Colleague Letter dramatically altered prior understanding of Title IX by requiring universities to address allegations of sexual misconduct originating on and off campus and by prescribing required knowledge and a preponderance of the evidence standard in addressing such allegations. See Brian A. Pappas, Dear Colleague: Title IX Coordinators and Inconsistent Compliance with the Laws Governing Campus Sexual Misconduct, 52 Tulsa L. Rev. 121, 127 (2016); Dear Colleague Letter, supra note 59, at 11.

    60. See Lance Toron Houston, Title IX Sexual Assault Investigations in Public Institutions of Higher Education: Constitutional Due Process Implications of the Evidentiary Standard Set Forth in the Department of Education’s 2011 Dear Colleague Letter, 34 Hofstra Lab. & Emp. L.J. 321, 333 (2017). The Dear Colleague Letter was designated a “significant guidance document.” Dear Colleague Letter, supra note 59, at 1 n.1. The Dear Colleague Letter thus purported to create interpretive rules of general applicability rather than new regulations. See generally Final Bulletin for Agency Good Guidance Practices, 72 Fed. Reg. 3432 (Jan. 25, 2007) (defining and discussing significant guidance documents).

    61. See Erin E. Buzuvis, Title IX and Procedural Fairness: Why Disciplined-Student Litigation Does Not Undermine the Role of Title IX in Campus Sexual Assault, 78 Mont. L. Rev. 71, 71 (2017).

    62. See id. at 71–72. Victims choose to pursue investigations with universities for a host of reasons: the confidentiality of the process, misunderstanding of the law, fear they would not be believed by the police, and lack of control in the criminal justice system. See Eliza Gray, Why Victims of Rape in College Don’t Report to the Police, Time (June 23, 2014),

    63. See Buzuvis, supra note 61, at 82.

    64. See Title IX: Tracking Sexual Assault Investigations, Chronicle of Higher Educ., (last visited Dec. 18, 2018).

    65. See Buzuvis, supra note 61, at 85. No disciplined student has ever prevailed against a university defendant in a Title IX suit. See id.

    66. Doe v. The Ohio State Univ., 136 F. Supp. 3d 854 (S.D. Ohio 2016).

    67. See id. at 860, 868.

    68. Id. at 869.

    69. See generally id. at 864.

    70. Stephanie Saul & Kate Taylor, Betsy DeVos Reverses Obama-Era Policy on Campus Sexual Assault Investigations, N.Y. Times (Sept. 22, 2017),

    71. See generally U.S. Dep’t of Educ. Office for Civil Rights, Q&A on Campus Sexual Misconduct (Sept. 2017) (discussing interim interpretations of Title IX).

    72. See Press Release, U.S. Dep’t of Educ., Department of Education Issues New Interim Guidance on Campus Sexual Misconduct (Sept. 22, 2017),

    73. See Nondiscrimination on the Basis of Sex in Education Programs or Activities Receiving Federal Financial Assistance, 83 Fed. Reg. 61,642 (proposed Nov. 28, 2018) (to be codified at 34 C.F.R. pt. 106).

    74. Sophie Tatum, Education Dept. Unveils New Protections for Those Accused of Sexual Misconduct on Campuses, CNN (Nov. 16, 2018, 1:17 PM),

    75. Andrew Kreighbaum, What the DeVos Title IX Rule Means for Misconduct Off Campus, Inside Higher Educ. (Nov. 27, 2018),

    76. Sarah Brown & Katherine Mangan, What You Need to Know About the Proposed Title IX Regulations, Chronicle of Higher Educ. (Nov. 16, 2018, 4:40 PM),

    77. Transcript of Record at 29, DTH Media v. Folt, 816 S.E.2d 518 (N.C. Ct. App. 2018) (No. 17-871).

    78. See Wester, supra note 5. Ironically, based on the explanation Wester gives in her column, all of the needs underlying the request could be met without identifying students. See id.

    79. Transcript of Record, supra note 77, at 6–7. Wester again went on the record and demonstrated that the needs underlying the request did not require identifying students, saying

      It would help us tremendously into figuring out basically how seriously UNC is taking these cases, how many of the cases that enter the system get resolved — because we can’t really even see that right now — so basically, there’s stuff we can report, we can talk to survivors and stuff without the record, but we really need to see more on UNC’s side of it.

      Katie Rice, The Daily Tar Heel Files Lawsuit against UNC to Obtain Campus Sexual Assault Records, Daily Tar Heel (Nov. 22, 2016, 12:52 AM),

    80. Transcript of Record, supra note 77, at 37–39.

    81. DTH Media Corp. v. Folt, 816 S.E.2d 518, 518 (N.C. Ct. App. 2018).

    82. See id. at 523.

    83. See id. at 524.

    84. See id. at 521, 526.

    85. See id. at 526–29.

    86. See id. at 524.

    87. See id. at 523–24.

    88. See Hous. Auth. of City of Greensboro v. Farabee, 200 S.E.2d 12, 15–16 (N.C. 1973).

    89. See id. at 16.

    90. See, e.g., Daggett, supra note 27, at 617–19. See also supra Part II.A.

    91. See supra Part II.B.

    92. While the court of appeals does not explain how, exactly, the statutes are in pari materia, the mere fact that they both speak to “records” in some respect would be decidedly insufficient to support a threshold determination that they are on the same subject.

    93. See DTH Media Corp. v. Folt, 816 S.E.2d 518, 523–24 (N.C. Ct. App. 2018).

    94. Lanvale Props., LLC v. Cty. of Cabarrus, 731 S.E.2d 800, 809 (N.C. 2012) (internal quotations and citations omitted).

    95. The final result exception reads as follows:

      Nothing in this section shall be construed to prohibit an institution of postsecondary education from disclosing the final results of any disciplinary proceeding conducted by such institution against a student who is an alleged perpetrator of any crime of violence . . . or a nonforcible sex offense, if the institution determines as a result of that disciplinary proceeding that the student committed a violation of the institution’s rules or policies with respect to such crime or offense.

      20 U.S.C. § 1232g(b)(6)(B) (2012).

    96. See DTH Media Corp., 816 S.E.2d at 524–25.

    97. Prohibit is defined as “to forbid by law.” Prohibit, Black’s Law Dictionary (10th ed. 2014). Because the subsequent action is not forbidden, merely not required, it is allowed. Indeed, the Department of Education notes that the final result exception is a permissive exception. See Family Educational Rights and Privacy, 65 Fed. Reg. 41,852, 41,860 (July 6, 2000) (to be codified at 34 C.F.R. pt. 99). The comment response notes that the new provision does not require a university to disclose any records under the FERPA exception, but concludes that FERPA “does not prevent” disclosure required under state public records laws. Id. For the reasons described in Part IV.A, the court’s conclusion would make little sense in this case because the North Carolina Public Records Act would require disclosure of all records falling under the FERPA exception.

    98. See 20 U.S.C. § 1232g(b)(6)(B). Discretion is defined as “[w]ise conduct and management exercised without constraint.” Discretion, Black’s Law Dictionary (10th ed. 2014).

    99. Comparing UNC to federally funded private universities outside the reach of the Public Records Act reinforces this rationale. For private universities, administrators would obviously be required to make a decision about whether to release records under the final result exception. Their decision would be an exercise of discretion, despite the lack of language requiring the exercise of discretion. Why would FERPA treat public and private universities differently without explicit wording to such an effect?

    100. See DTH Media Corp., 816 S.E.2d at 524.

    101. See id. at 524–25. This logic is rather circular. The court focuses on the conclusion that because the judicial order exception does not differentiate between judicial orders which require disclosure and those which merely authorize disclosure, an institution could not lose funding for complying with a judicial order requiring disclosure of records under the final result exception. See id. Further, in this portion of the analysis the court of appeals appears to confuse and side-step the true issue, twice turning its conclusions on whether disclosures under these two exceptions would leave an institution in violation of FERPA. See id. at 525. The question is not whether release of records under the final result exception would violate FERPA: it decidedly would not. The question is whether the Public Records Act can completely annihilate the discretion FERPA gives. Answering the first question says nothing about the second.

    102. Existing in its own independent sub-sub-sub section, the judicial order exception is broader than the final result exception and encompasses records well outside the scope of the final result exception. See 20 U.S.C. § 1232g(b)(2)(B) (2012).

    103. Univ. of Tex. Sw. Med. Ctr. v. Nassar, 570 U.S. 338, 353 (2013).

    104. See supra note 29 and accompanying text.

    105. Theoretically a judicial order could compel the release of records under the final result exception. See DTH Media Corp., 816 S.E.2d at 598 (remanding the litigation to superior court to issue a judicial order compelling disclosure). That said, it makes very little sense to stretch a whimsical argument about lack of distinction in the judicial order exception into substantive support for the incorrect conclusion that the final result exception does not grant institutions discretion in deciding whether to disclose records.

    106. North Carolina Ins. Guar. Ass’n v. Bd. of Trs. of Guilford Tech. Cmty. Coll., 691 S.E.2d 694, 699 (N.C. 2010) (internal quotations and citations omitted).

    107. Univ. of Tex. Sw. Med. Ctr., 570 U.S. at 353.

    108. See supra Part IV.A.1.

    109. See DTH Media Corp., 816 S.E.2d at 527.

    110. Sullivan v. Finkelstein, 496 U.S. 617, 632 (1990) (Scalia, J., concurring). Justice Scalia is hardly alone in this belief. See also Clarke v. Sec. Indus. Ass’n, 479 U.S. 388, 407 (1987) (noting that the Court does not attach substantial weight to statements made by sponsors of legislation after the passage of an act); Wallace v. Jaffree, 472 U.S. 38, 86–87 (1985) (Burger, C.J., dissenting) (arguing that statements made by a bill’s sponsor after its passing do not offer a “shred of evidence” that the body shared the sponsor’s intentions in passing the legislation). For a detailed explanation of reasons underlying the uselessness of post-passage legislative history, see Covalt v. Carey Canada Inc., 860 F.2d 1434, 1438 (7th Cir. 1988).

    111. See 144 Cong. Rec. H2984 (daily ed. May 7, 1998) (statement of Rep. Foley).

    112. See Balance, Black’s Law Dictionary (10th ed. 2014) (third definition).

    113. Universities are in a position to know the case facts, the severity of the offense, and the community’s need to know, whereas an appellate court is not in a position to balance the interests in what is now eleven years’ worth of disciplinary records. The court of appeals paid special attention to the language “make reporting subject to state laws that apply.” See 144 Cong. Rec. H2984 (daily ed. May 7, 1998) (statement of Rep. Foley). Had Congress intended that the final result exception would require disclosure, as the Public Records Act allegedly requires, it would have chosen language conveying such an intent.

    114. See supra Part II.B. In brief, UNC contends that FERPA and the Public Records Act can be reconciled by applying the deference the Public Records Act affords to conflicting laws. See Brief of Defendant-Appellees at 19–21, DTH Media v. Folt, 816 S.E.2d 518 (N.C. Ct. App. 2018) (No. 17-871). This argument should be unpersuasive, however, because the court should narrowly construe the meaning of “unless otherwise specifically provided by law.” See News & Observer Pub. Co., Inc. v. Poole, 412 S.E.2d 7, 18 (N.C. 1992).

    115. See generally 20 U.S.C. § 1232g (2012) (withholding substantial funding from institutions that impermissibly disclose student records).

    116. While there may be some merit to the argument that FERPA preempts the Public Records Act through implicit field preemption, the argument would be more complex and less compelling than a conflict preemption argument based on a clear-cut conflict.

    117. See DTH Media Corp. v. Folt, 816 S.E.2d 518, 524 (N.C. Ct. App. 2018).

    118. See id. at 526. In describing its presumption against federal preemption, the court of appeals relies on State ex rel. Utilities Comm’n v. Carolina Power & Light Co., but neglects to address the subsequent explanation therein that such a presumption exists when the field supposedly preempted is one traditionally occupied by the states, which are those fields relating to the exercise of a state’s police powers over health and welfare. See 614 S.E.2d 281, 287 (N.C. 2005) (citing Hillsborough Cty. v. Auto. Med. Labs., Inc., 471 U.S. 707, 715 (1985)). It is difficult to see how the Public Records Act is an exercise of North Carolina’s police power over health and welfare and equally difficult to understand how North Carolina has traditionally occupied the field of the privacy of student records when there is only one provision in all of the general statutes relating to the confidentiality of student records. See supra note 47 and accompanying text.

    119. See Caleb Nelson, Preemption, 86 Va. L. Rev. 225, 233 (2000).

    120. See Gade v. Nat’l Solid Wastes Mgmt. Ass’n, 505 U.S. 88, 98 (1992).

    121. Hines v. Davidowitz, 312 U.S. 52, 67 (1941).

    122. See Fla. Lime & Avocado Growers, Inc. v. Paul, 373 U.S. 132, 142–43 (1963).

    123. See Wyeth v. Levine, 555 U.S. 555, 583 (2009) (Thomas, J., concurring in judgment); see also Nelson, supra note 119, at 229.

    124. See, e.g., Geier v. Am. Honda Motor Co., Inc., 529 U.S. 861, 907 (2000) (Stevens, J., dissenting) (opining about the potentially limitless application of purposes and objectives preemption).

    125. See Nelson, supra note 119, at 230–31.

    126. Wyeth, 555 U.S. at 588 (Thomas, J., dissenting) (internal quotations and citations omitted).

    127. U.S. Const. art. VI, cl. 2.

    128. Nelson, supra note 119, at 246.

    129. See id. at 250. This rule of priority would displace the traditional rule, which would apply the law of the statute more recently passed in the event of a conflict. See id.

    130. U.S. Const. art. VI, cl. 2.

    131. Nelson, supra note 119, at 254.

    132. See id. at 255.

    133. See id. at 241–42.

    134. See id.

    135. Id. at 238.

    136. See id. at 255.

    137. See Geier v. Am. Honda Motor Co., Inc., 529 U.S. 861, 911 (2000) (Stevens, J., dissenting).

    138. PLIVA, Inc. v. Mensing, 564 U.S. 604 (2011).

    139. See id. at 622–23.

    140. Following the death of Justice Scalia in 2016 and Justice Kennedy’s retirement in 2018, Justices Gorsuch and Kavanaugh have been elevated to the high court, leaving the court even more conservative than it was at the time of PLIVA. See, e.g., Adam Liptak, Confirming Kavanaugh: A Triumph for Conservatives, but a Blow to the Court’s Image, N.Y. Times (Oct. 6, 2018),

    141. See PLIVA, 564 U.S. at 623.

    142. Id. (internal quotations, punctuation, and citations omitted).

    143. See Wyeth v. Levine, 555 U.S. 555, 588 (2009) (Thomas, J., concurring in judgment).

    144. See supra Part IV.A.

    145. See Martin v. N.C. Hous. Corp., 175 S.E.2d 665, 671 (N.C. 1970).

    146. Doe v. The Ohio State Univ., 136 F. Supp. 3d 854, 869 (S.D. Ohio 2016) (concluding that because Title IX investigation records were protected under FERPA, plaintiff did not have a substantive due process claim).

    147. Id.

    148. See supra note 62 and accompanying text. UNC notes that victims could sometimes be identified just through the release of their attacker’s identity. Transcript of Record, supra note 77, at 17.

    149. There is some dispute as to the prevalence of false allegations of sexual misconduct, but most reports suggest they are fairly rare. See Rowan Scarborough, False Sex Assault Reports Not as Rare as Reported, Studies Show, Wash. Times (Oct. 7, 2018),

    150. See supra Part II.C.

    151. William D. Cohan, The Duke Lacrosse Player Still Outrunning His Past, Vanity Fair (Mar. 24, 2014, 8:49 PM),

    152. Kurtis Lee, Fallout from Rolling Stone Feared by Advocates for Sex Assault Victims, L.A. Times (Apr. 6, 2015, 1:38 PM),

    153. See supra notes 78–79 and accompanying text.

    154. See supra notes 34–35 and accompanying text.

    155. See supra text accompanying note 47.

By Dan Menken

Today, in the civil case of Covey v. Assessor of Ohio County, a published opinion, the Fourth Circuit reversed the district court’s dismissal of Christopher and Lela Covey’s suit against government officials for entering the curtilage of their house without a search warrant.

Question of Fourth Amendment Protection From Unreasonable Government Intrusion

The Court was asked to decide whether government officials violated the Coveys’ Fourth Amendment right to protection from unreasonable government intrusion when the officials entered the curtilage of the Coveys’ home in search of marijuana without a warrant.

Government Tax Assessor Relayed Information to Police Regarding Marijuana Plants

On October 21, 2009, a field deputy for the tax assessor of Ohio County, West Virginia, entered the Coveys’ property to collect data to assess the value of the property for tax purposes. The tax assessor entered the Coveys’ property despite seeing “No Trespassing” signs, in violation of West Virginia law. When searching the property, the tax assessor found marijuana on the Coveys’ walk-out basement patio. The tax assessor then contacted the police.

When the police arrived, they entered the curtilage of the Coveys’ residence and proceeded to the area where the marijuana was located. As they were searching the property, they encountered Mr. Covey. The officers detained Mr. Covey and continued their search. The officers then waited several hours to obtain a warrant to search the house. During that time, Mrs. Covey returned home and was warned that she would be arrested if she entered the house, after which she left the premises. Upon returning an hour later, Mrs. Covey was seized and interrogated. After the police received the search warrant, the Coveys were arrested and jailed overnight.

On March 30, 2010, Mr. Covey pleaded guilty in state court to manufacturing marijuana in exchange for the government’s promise that it would not initiate prosecution against Mrs. Covey. He was sentenced to home confinement for a period of not less than one year and not more than five years. On October 20, 2011, the Coveys brought this suit pro se. The claims, brought under 42 U.S.C. § 1983 and Bivens, alleged that several defendants violated the Coveys’ Fourth Amendment rights by conducting an unreasonable search. The district court dismissed the Coveys’ claims, concluding that none of the defendants violated the Fourth Amendment. This appeal followed.

Fourth Amendment Protects Curtilage of Home

The Court reviewed the district court’s grant of a motion to dismiss de novo. To survive a motion to dismiss, a plaintiff must “state a claim to relief that is plausible on its face.” Ashcroft v. Iqbal. A claim is plausible if “the plaintiff pleads factual content that allows the court to draw the reasonable inference that the defendant is liable for the misconduct alleged.” Id.

According to Oliver v. United States (1984), the Fourth Amendment protects homes and the “land immediately surrounding and associated” with homes, known as curtilage, from unreasonable government intrusions. Probable cause is the appropriate standard for searches of the curtilage, and warrantless searches of the curtilage are presumptively unreasonable. The knock-and-talk exception to the Fourth Amendment’s warrant requirement allows an officer, without a warrant, to approach a home and knock on the door, just as any ordinary citizen could do. An officer may bypass the front door when circumstances reasonably indicate the officer might find the homeowner elsewhere on the property. The right to knock and talk does not entail a right to conduct a general investigation on a home’s curtilage.

The Complaint Presented Plausible Claims For Violations of the Fourth Amendment

Properly construed in the Coveys’ favor, the complaint alleges that the officers saw Mr. Covey only after they entered the curtilage. Thus, applying the Rule 12(b)(6) standard, the Court found that the Coveys plausibly alleged that the officers violated their Fourth Amendment rights by entering and searching the curtilage of their home without a warrant. The district court erred by accepting the officers’ account of events, in which they stated that they saw Mr. Covey prior to entering the curtilage.

Turning to the tax assessor, the Court believed that his entry onto the property, although illegal, was not a per se violation of the Fourth Amendment. In this case, the Court believed that the governmental interest in the search for tax purposes was minimal, while the Coveys’ privacy interest was significant. Therefore, the Fourth Circuit held that the Coveys pleaded a plausible claim that the tax assessor conducted an unreasonable search of their home and curtilage.

Defendants’ Affirmative Defenses

According to Ashcroft v. al-Kidd (2011), qualified immunity “shields federal and state officials from money damages unless a plaintiff pleads facts showing (1) that the official violated a statutory or constitutional right, and (2) that the right was ‘clearly established’ at the time of the challenged conduct.” As to the police officers, the Court stated that they should be aware that a warrantless search of the home, absent consent or exigency, is presumptively unconstitutional. Additionally, the Court noted that the Fourth Circuit has, for over a decade, recognized that the curtilage of the home is entitled to Fourth Amendment protection. The Court felt that the tax assessor presented a closer case. Although there was no case law that spoke to a similar set of facts, the tax assessor should have been aware that he was violating a constitutional right by searching the property, so the Court ruled that the tax assessor was not entitled to qualified immunity.

Finally, the defendants claimed that the Coveys’ § 1983 and Bivens claims are barred by Heck v. Humphrey (1994). There are two requirements for Heck to bar the Coveys’ claims. First, “a judgment in favor of the plaintiff [must] necessarily imply the invalidity of [a plaintiff’s] conviction or sentence.” Second, the claim must be brought by a claimant who is either (i) currently in custody or (ii) no longer in custody because the sentence has been served, but nevertheless could have practicably sought habeas relief while in custody. The Court concluded that Mr. Covey’s claims did not necessarily imply the invalidity of his conviction and thus were not necessarily barred by Heck. The Court remanded to the district court for further analysis under Heck.

Reversed and Remanded

Thus, the Fourth Circuit reversed the district court’s grant of dismissal and remanded the case for further proceedings.

By Michael Mitchell

Today, in Lynn v. Monarch Recovery Management, Inc., the Fourth Circuit affirmed summary judgment for Kevin Lynn on his claim that Monarch Recovery Management violated the Telephone Consumer Protection Act (“TCPA”).  On appeal, the Court rejected Monarch’s argument that it was exempt from the TCPA as a debt collector.

Under 47 U.S.C. § 227, the TCPA prohibits “making any call . . . [using] an artificial or prerecorded voice . . . to any telephone number assigned to a . . . cellular telephone service . . . or any service for which the called party is charged for the call.”   The United States District Court for the District of Maryland, at Baltimore, found that Monarch’s calls to Lynn violated the TCPA because Lynn was individually charged for each call.  Monarch made these calls to Lynn in its capacity as a debt collection company.

Affirming the district court’s grant of summary judgment to Lynn in an unpublished per curiam decision, the Fourth Circuit rejected Monarch’s attempt to avoid liability under the call-charged provision of the TCPA.  Specifically, Monarch argued that the FCC’s regulation excepted debt collectors from the TCPA’s prohibition on “call[s] to any residential telephone line using an artificial or prerecorded voice to deliver a message.”

The Court relied on its review of legislative intent in denying Monarch’s assertion that it was exempt from the call-charged provision of the TCPA.  Citing Clodfelter v. Republic of Sudan, 720 F.3d 199 (4th Cir. 2013), the Court held that Congress did not intend for companies like Monarch to use the TCPA to limit their liability.  Thus, the Fourth Circuit has maintained civil liability for debt collectors under the call-charged provision of the TCPA.

By Joshua P. Bussen

Today in United States v. Mitchell, the Fourth Circuit, in a per curiam opinion, affirmed the conviction of Sidney Mitchell for unlawful possession of a firearm by a felon. Mitchell entered a conditional plea of guilty in the Middle District of North Carolina, reserving his right to appeal the judgment of the district court. Mitchell contends that the district court erred in denying his motion to suppress evidence of a firearm that was found while police were conducting a search of his vehicle. Mitchell was sentenced to twenty-six months in prison.

In the waning hours of sunlight on November 20, 2012, a North Carolina police officer stopped Mitchell’s car on a suspicion that the tint on the vehicle’s windows was darker than allowed under North Carolina law. While performing a test that would gauge the level of tint on Mitchell’s windows—a process that involves placing a device on the inside of the vehicle—the officer claims he noticed the smell of “burnt marijuana.” Though Mitchell denied smoking marijuana, he consented to a search of his person. After searching Mitchell the officer turned to the vehicle, discovering a small amount of marijuana resin and a gun on the driver’s side floorboard.

In the Middle District of North Carolina, Mitchell moved to suppress the firearm due to an improper search and seizure. The district court found that the tint on Mitchell’s windows gave the officer reasonable cause to pull the car over, and the smell of burnt marijuana subsequently warranted probable cause to search the vehicle. On appeal, Mitchell did not question the officer’s motivation for detaining the vehicle, but disputed the “lawfulness of the subsequent search of the [inside of the] car.”

The Fourth Circuit, relying on United States v. Scheetz, 293 F.3d 175, 184 (4th Cir. 2002), held that the odor of marijuana emanating from a car warranted sufficient probable cause to search the vehicle. Mitchell’s final argument that the officer’s credibility should be questioned fell on deaf ears; the Circuit judges were not willing to disturb the factual findings of the district court because “the district court is so much better situated to evaluate these matters.”

By Steven I. Friedland

“The world isn’t run by weapons anymore, or energy, or money.  It’s run by little ones and zeroes, little bits of data.  It’s all just electrons.”[1]

We live in an era of mass surveillance. Advertisers, corporations and the government engage in widespread data collection and analysis, using such avenues as cell phone location information, the Internet, camera observations, and drones.  As technology and analytics advance, mass surveillance opportunities continue to grow.[2]

The growing surveillance society is not necessarily harmful[3] or unconstitutional.  The United States must track people and gather data to defend against enemies and malevolent actors.  Defenses range from stopping attempts to breach government computers and software programs,[4] to identifying and thwarting potential terroristic conduct and threats at an embryonic stage.

Yet, without lines drawn to limit mass data gathering, especially in secret, unchecked government snooping likely will continue to expand.  John Kerry, the sitting Secretary of State, even recently acknowledged that the government has “sometimes reached too far” with its surveillance.[5] The stakes for drawing lines demarcating privacy rights and the government’s security efforts have never been higher or more uncertain.

This Article argues that the forgotten Third Amendment, long in desuetude, should be considered to harmonize and intersect with the Fourth Amendment to potentially limit at least some mass government surveillance.  While the Fourth Amendment has been the sole source of search and seizure limitations, the Third Amendment should be added to the privacy calculus,[6] because it provides a clear allocation of power between military and civil authorities and creates a realm of privacy governed by civil law.

Consequently, in today’s digital world it would be improper to read the words of the Third Amendment literally, merely as surplusage. Instead, the Amendment’s check on government tyranny should be viewed as restricting cybersoldiers from focusing surveillance instrumentalities[7] on and around private residences or businesses in an intrusive way—or using proxies to do so—that would serve as the functional equivalent of military quartering in the civil community.

I.  Mass Surveillance

Imagine an America with continual domestic drones, which collected camera and cell phone surveillance of every person in a particular residential subdivision, business headquarters, or city high-rise building.  The surveillance would be mostly secret but “in public,” capturing people sitting on rocking chairs on their front porches, unloading bags of groceries from their cars, opening their wallets to pay bills, and anything visible through windows in private residences and businesses.  People who go to sporting events or the supermarket would have their faces matched to an existing database. The metadata from Internet use, cell phone location data and other sources, including hyper-local observations, would be fed into computers for complex analysis and combined with other surveillance information.[8]  This information, all gathered and utilized outside the private space protected by the physical walls and doors of houses, would present a fairly intimate picture of these individuals over time, creating in essence a virtual window to what is occurring within the house or building, as well as without.[9]

Such a day is not far off.  Drones and robots are currently being employed domestically in the skies,[10] on land, and in the seas[11] for various purposes, although apparently not yet on a continual and widespread basis.  Yet expansion of their use seems inevitable.[12]  Most unmanned aircraft systems fly high overhead, out of sight, but as more information is released and people look more carefully, we will know they are there.  The government also is developing the Biometric Optical Surveillance System (“BOSS”), which will have tremendous capabilities for identifying people from distances of up to 100 meters.  This system was scheduled for testing at a public hockey game in the State of Washington in 2013.[13]  To supplement the information acquired directly, the government obtains considerable amounts of information through the consent of third parties.[14]

While surveillance is not overly intrusive when deployed in public places, where being watched can be expected, it still can be dangerous.[15]  Surveillance, when taken as a whole with information and data gathering, can form a mosaic of intrusion in a manner similar to that described by Justices Alito and Sotomayor in their concurrences in the GPS tracking device case United States v. Jones.[16]  Pursuant to this “mosaic theory,” a privacy violation does not require a physical trespass.  One commentator noted the following:

Today’s police have to follow hunches, cultivate informants, subpoena ATM camera footage. . . . Tomorrow’s police . . . might sit in an office or vehicle as their metal agents methodically search for interesting behavior to record and relay. Americans can visualize and experience this activity as a physical violation of their privacy.[17]

Significantly, surveillance also is an expression of power: an accumulation of data that can be used against persons, even creating that intimate picture of what occurs inside a house when the cybersleuth never actually sets foot in it.  As another commentator has observed about possible power abuses, “We cannot have a system, or even the appearance of a system, where surveillance is secret, or where decisions are made about individuals by a Kafkaesque system of opaque and unreviewable decision makers.”[18]

II.  The Third Amendment’s Place In Constitutional Orthodoxy

“[N]o Soldier shall, in time of peace be quartered in any house, without the consent of the Owner, nor in time of war, but in a manner to be prescribed by law.”[19]

A.     Origins and Interpretations

The Third Amendment might have an obscure[20] and obsolete[21] place in constitutional law orthodoxy, yet it draws on a rich history.  The bright-line Amendment[22] traces its origins to pre-Revolutionary War England, where repeated abuses by the king in quartering soldiers, the royal entourage, and their horses in private residences led to laws prohibiting quartering in England.[23]  These laws were enacted in part to avoid maintaining a standing army, especially during peacetime.[24]  For example, in 1689, the British Parliament enacted the Mutiny Act, which outlawed the quartering of troops in private homes without the owner’s consent.[25]  A standing army was thought to provide a slippery slope to tyranny, and it was the confluence of military with civil authority that was the real problem, not simply the taking of private resources by the King.

Continued quartering abuses in the colonies led to the adoption of the Third Amendment.  Patrick Henry argued for the amendment because it offered rule by civil authority, not military force,[26] as did Samuel Adams, who objected to soldiers quartered “in the body of a city” and not just in houses.[27]

Perhaps the amendment’s desuetude is attributable in part to the fact that it has been the subject of Supreme Court cases only in passing, such as in Griswold v. Connecticut,[28] and of just one significant direct judicial interpretation, Engblom v. Carey,[29] a 1982 Second Circuit Court of Appeals case.  In Engblom, the court was confronted with a claim by two correctional officers that their Third Amendment rights were infringed when the State of New York quartered national guardsmen in their dormitory-style residences during a strike by the guards at an upstate New York prison.[30]  The guards were renting their rooms from the State.[31]

The court first applied the Third Amendment to the State of New York through the incorporation doctrine of the Fourteenth Amendment.[32]  Significantly, the court viewed several of the key terms in the amendment expansively. The court considered the national guardsmen to be “soldiers” and held that the Third Amendment applied to the guardsmen as “tenants,” even if they did not own their quarters, despite the express language in the amendment.[33]

B.     The Relationship Between the Third and Fourth Amendments

The Second Circuit in Engblom also used an analysis borrowed from the Fourth Amendment, setting forth a standard of a “legitimate expectation of privacy” to determine if Third Amendment rights were triggered.[34] It noted that the amendment’s objective was to protect the fundamental right to privacy in conjunction with the use and enjoyment of property rights.[35]

The Engblom analysis at least implicitly recognized the interlocking nature of the Third and Fourth Amendments and the primary role of the Fourth Amendment as the privacy standard bearer. As one noted commentator observed, “If the Fourth Amendment had never been enacted, the Third Amendment might have provided the raw material for generating something like an anti-search and seizure principle.”[36]

Constitutionally, courts have used the Fourth Amendment to protect against government snooping on others, but the Fourth Amendment has been strapped with textual limits, given its language protecting only against unreasonable, not all, searches and seizures, and with interpretive limits authored by a reticent Supreme Court that has stuck by rules created in predigital cases.[37]  Also, while the Fourth Amendment protects against United States government spying, it does not apply to such conduct by foreign governments, which can and do swap data with the United States,[38] nor does it reach the government’s apparent data swaps with thousands of technology, finance, and manufacturing companies.[39]

Preoccupation with Fourth Amendment doctrine, combined with a Gresham’s Law style of constitutional application in which general principles often end up marginalizing specific provisions, helps explain the Third Amendment’s disuse.  A contextual interpretation of this amendment in the digital era could offer a significant link in a system of digital checks and balances.

III.  Interpretation

The Third Amendment’s relevancy to surveillance privacy depends on its interpretation,[40] both in terms of its themes and its words.  The amendment’s broad themes resonate in the world of “Big Data” and the Internet.  The amendment provides a bright-line allocation of power, with a clear distinction that limits the military and protects homes from intrusion without consent.  As evidenced by Due Process, Equal Protection, and other constitutional doctrines such as the Eighth Amendment, the Court often takes into account evolving facts and cultural transformations over time.  A more specific analysis of each component of the Amendment follows.

A.     War and Peace

The wartime/peacetime distinction in the amendment provides a useful contrast about the expansiveness of government power at different times.  Unlike the Fourth Amendment’s open-ended reasonableness standard, the Third Amendment draws a clear line between what is permissible in times of war and in times of peace.

B.     Soldiers

History is instructive.  Early English case law reflects the concern over forced accommodations and board not only by soldiers, but also by the royal court and its entourage.[41]  The prohibition extended to the soldiers’ instrumentalities, namely their horses.[42]  In the late 1700s, soldiers honorably fought in uniform, generally within full view of the enemy.  Times have changed.  In Engblom, national guardsmen were considered soldiers, even though they were guarding a domestic prison.  Today, the definition would certainly include cyber agents: military personnel who are paid to hack and disrupt another country’s software and hardware and to protect our own.  Instead of horses, these cyber soldiers use code or metal instrumentalities to invade others’ cyber spaces.[43]  Using stealth and remote access to obtain and crunch data is the new face of warfare; these soldiers disrupt and disable various aspects of a country to keep it off balance and vulnerable.  For example, deployment of the Stuxnet worm, placed on computers in Iran to disrupt its quest for nuclear weapons, is but one illustration of the new military.

C.         Quartering

Quartering historically came to mean an “act of a government in billeting or assigning soldiers to private houses, without the consent of the owners of such houses, and requiring such owners to supply them with board or lodging or both.”[44]  Billeting can mean either the letter ordering the assignment or the assignment itself.  This definition yields some insights.  Significantly, quartering is a military intrusion by soldiers into home life, that is, civilian life, which is why early English analysis incorporated the forced provision of board and the tethering of horses as part of quartering.  Thus, it is the intrusion into, and diminishment of, civil authority and life that matter, even if accomplished through remote access rather than the physical presence of soldiers.  An unmanned drone is the equivalent of a piloted plane.  Would military personnel stationed regularly at businesses, or operating cameras on the rooftops of private residences or businesses, or even on all public mailboxes, generate intimidation or intrusion into daily life?  Would the intrusions still be significant if the soldiers were outside the houses and businesses, in the curtilages, peering inside or doing the equivalent?  Especially if seen or heard, electronic surveillance devices could significantly interfere with civilian community life and intrude on civilian authority.  As one commentator has noted, “[G]overnment or industry surveillance of the populace with drones would be visible and highly salient.  People would feel observed, regardless of how or whether the information was actually used.”[45]

Quartering today also can involve proxies, where the U.S. government knows and promotes the equivalent of private or foreign quartering for its own gain.  One illustration of proxy quartering might involve an agreement between countries to swap sensitive data on each other’s citizens, revealing the intricacies of civil life inside the cities and their residences or businesses.[46]

D.    Any Houses

The term “any houses” on its face appears highly restrictive.[47]  Yet, at least after Engblom, it also encompasses tenancies.  While tenancies ordinarily refer to residences, today there is a proliferation of buildings housing businesses, which fall within the types of civil occupancies where sensitive and confidential civil life occurs.  In the digital world, invasions of these buildings can occur regularly without physical entry; the term should be judged in that light, which is in keeping with the intent of the framers.

While the term “any houses” could be more broadly construed to mean all private chattel or real property, including electronic devices,[48] this likely would expand the meaning of the amendment into a version of the Fifth Amendment Takings Clause, a result not likely intended by the Third Amendment’s “houses” distinction, particularly when the Fourth Amendment protects not only “houses,” but also “persons, papers, and effects.”

E.     Without Consent

Although the amendment permits quartering in peacetime with consent, if quartering extends to businesses, government–private business partnerships raise questions about the voluntariness of those relationships.  This is especially the case if the government inserts employees into private business locations.  Such a relationship might not generate adequate voluntary consent.[49]


The Third Amendment will no longer be the forgotten amendment if it is considered to interlock with the Fourth Amendment to provide a check on some domestic mass surveillance intruding on civil life, particularly within the home or business, or the curtilage of each.  In the digital era, the dual purposes of the Amendment should be understood to potentially limit the reach of cyber soldiers and protect the enjoyment of a private tenancy without governmental incursion.

        [1].   Sneakers (Universal Pictures 1992).
        [2].   See, e.g., Quentin Hardy, Big Data’s Little Brother, N.Y. Times, Nov. 12, 2013, at B1 (“Collecting data from all sorts of odd places and analyzing it much faster than was possible even a couple of years ago has become one of the hottest areas of the technology industry. . . . Now Big Data is evolving, becoming more ‘hyper’ and including all sorts of sources.”).
        [3].   Contra Neil Richards, The Dangers of Surveillance, 126 Harv. L. Rev. 1934 passim (2013) (arguing that surveillance is a direct threat to “intellectual privacy,” or the notion that ideas develop best in private).
        [4].   China allegedly attempts to hack U.S. computers on a daily basis. See Keith Bradsher, China Blasts Hacking Claim by Pentagon, N.Y. Times (May 7, 2013),
        [5].   Mark Memmott, U.S. Spying Efforts Sometimes ‘Reached Too Far,’ Kerry Says, The Two Way, Nat’l Pub. Radio (Nov. 1, 2013), -sometimes-reached-too-far-kerry-says (quoting John Kerry as saying that “some of the electronic surveillance programs of the National Security Agency have been on ‘automatic pilot’ in recent years and have inappropriately ‘reached too far’”).  Google’s Executive Chairman, Eric Schmidt, was less restrained about secret government spying, calling reports of National Security Agency (“NSA”) interception of the main communication links used by Google and Yahoo to connect to their data centers “outrageous.”  See Eyder Peralta, Google’s Eric Schmidt Says Reports of NSA Spying ‘Outrageous,’  The Two Way, Nat’l Pub. Radio (Nov. 4, 2013), /242960648/googles-eric-schmidt-says-reports-of-nsa-spying-are-outrageous (“There clearly are cases where evil people exist, but you don’t have to violate the privacy of every single citizen of America to find them.”).
        [6].   For the dual rationales of the Amendment, see Geoffrey M. Wyatt, The Third Amendment in the Twenty-First Century: Military Recruiting on Private Campuses, 40 New Eng. L. Rev. 113, 122–24 (2005).
        [7].   Instrumentalities include malware such as the “Stuxnet” computer worm, tracking devices, cookies, and more. The Stuxnet worm was allegedly used by several countries to infiltrate and infect Iran’s nuclear facilities. See Alan Butler, When Cyberweapons End Up on Private Networks: Third Amendment Implications for Cybersecurity Policy, 62 Am. U. L. Rev. 1203, 1204–05 (2013).
        [8].   Indeed, the NSA alone gathers 20 billion “record events” per day. James Risen & Laura Poitras, N.S.A. Examines Social Networks of U.S. Citizens, N.Y. Times, Sept. 29, 2013, at A1.
        [9].   This off-the-wall versus through-the-wall distinction was advanced in Kyllo v. United States, 533 U.S. 27 (2001), where the Court found that the police unconstitutionally used an infrared heat detection device to determine whether heat lamps were being used in the house to grow marijuana. Id. at 40.
      [10].   In fact, Robert Mueller, the current F.B.I. Director, recently conceded at a Senate hearing that drones indeed have been used for some “very minimal” domestic surveillance operations. Phil Mattingly, FBI Uses Drones in Domestic Surveillance, Mueller Says, Bloomberg (June 19, 2013), -sureillance-mueller-says.html.
      [11].   William Herkewitz, Ocean Drones Plumb New Depths, N.Y. Times, Nov. 12, 2013, at D1.
      [12].   M. Ryan Calo, The Drone As Privacy Catalyst, 64 Stan. L. Rev. Online 29, 30–31 (2011). Calo notes that there are several counties where drone use is occurring; however, there are also several restrictions that limit use of drones. See Operation and Certification of Small Unmanned Aircraft Systems (SUAS), 76 Fed. Reg. 40,107, 40,107–08 (July 7, 2011), available at
      [13].   Eddie Keogh, DHS to Test Facial Recognition Software at Hockey Game, Reuters (Sept. 18, 2013),
      [14].   Another way the government obtains information is through warrants and requests under FISA.  See Foreign Intelligence Surveillance Act, 50 U.S.C. §§ 1801–1885 (2010).
      [15].   See Neil Richards, supra note 3, at 1952–58. Professor Richards organizes his argument as follows: “Part II shows how surveillance menaces our intellectual privacy and threatens the development of individual beliefs in ways that are inconsistent with the basic commitments of democratic societies. Part III explores how surveillance distorts the power relationships between the watcher and the watched, enhancing the watcher’s ability to blackmail, coerce, and discriminate against the people under its scrutiny.” Id. at 1936.
      [16].   132 S. Ct. 945, 956 (2012) (Sotomayor, J., concurring); Id. at 961 (Alito, J., concurring).  The case involved the placement of a GPS device on a private individual’s car.  Id. at 948 (majority opinion).  Writing for the majority, Justice Scalia found that the installation of the device was a search within the meaning of the Fourth Amendment.  Id. at 952.
      [17].   Calo, supra note 12, at 32.
      [18].   Neil M. Richards & Jonathan H. King, Three Paradoxes of Big Data, 66 Stan. L. Rev. Online 41, 43 (2013).  The authors discuss the paradox of power associated with Big Data, stating that “[b]ig data will create winners and losers, and it is likely to benefit the institutions that wield its tools over the individuals being mined, analyzed, and sorted. Not knowing the appropriate legal or technical boundaries, each side is left guessing. Individuals succumb to denial while governments and corporations get away with what they can by default, until they are left reeling from scandal after shock of disclosure.”  Id. at 45.
      [19].   U.S. Const. amend. III.
      [20].   William S. Fields & David T. Hardy, The Third Amendment and the Issue of the Maintenance of Standing Armies: A Legal History, 35 Am. J. Legal Hist. 393, 429 (1991).
      [21].   Morton Horwitz, Is the Third Amendment Obsolete?, 26 Val. U. L. Rev. 209 passim (1991).
      [22].   This provision firmly states its singular prohibition.  Interestingly, it still arguably has been violated on multiple occasions. See, e.g., B. Carmon Hardy, A Free People’s Intolerable Grievance, in The Bill of Rights, A Lively Heritage 67, 69 (1987); Tom W. Bell, “Property” in the Constitution: A View From the Third Amendment, 20 Wm. & Mary Bill Rts. J. 1243, 1276 (2012).
      [23].   See, e.g., B. Carmon Hardy, A Free People’s Intolerable Grievance – The Quartering of Troops and the Third Amendment, 33 Va. Cavalcade 126 (1984); J. Alan Rogers, Colonial Opposition to the Quartering of Troops During the French and Indian War, 34 Mil. Aff. 7, 7–11 (1970).
      [24].   Fields & Hardy, supra note 20, at 395; Hardy, supra note 23; Rogers, supra note 23.
      [25].   Horwitz, supra note 21, at 210.
      [26].   Patrick Henry, Patrick Henry’s Objections to a National Army and James Madison’s Reply, Virginia Convention (June 16, 1788), in 2 The Debate on the Constitution 695, 696–97 (Bernard Bailyn ed., 1993).
      [27].   Samuel Adams, Letter to the Editor, Bos. Gazette, Oct. 17, 1768, reprinted in 5 The Founders’ Constitution 215, 215 (Philip B. Kurland & Ralph Lerner eds., 1987) (“No man can pretend to say that the peace and good order of the community is so secure with soldiers quartered in the body of a city as without them.”).
      [28].   381 U.S. 479, 484 (1965) (discussing the Third Amendment as a part of the penumbras forming a constitutional privacy right).
      [29].   677 F.2d 957 (2d Cir. 1982).
      [30].   Id. at 958–59.
      [31].   Id. at 959–60.
      [32].   Id. at 961.
      [33].   Id. at 961–62.
      [34].   See William Sutton Fields, The Third Amendment: Constitutional Protection from the Involuntary Quartering of Soldiers, 124 Mil. L. Rev. 195, 207 & n.108 (1989); Ann Marie C. Petrey, Comment, The Third Amendment’s Protection Against Unwanted Military Intrusion, 49 Brook. L. Rev. 857, 857–64 (1983).
      [35].   Engblom, 677 F.2d at 962.
      [36].   See Horwitz, supra note 21, at 214.
      [37].   See, e.g., the physical trespass test used in United States v. Jones, 132 S. Ct. 945, 950–52 (2012); Id. at 955 (Sotomayor, J., concurring) (“[T]he trespassory test applied in the majority’s opinion reflects an irreducible constitutional minimum.”). The case involved the placement of a GPS device on a private individual’s car. Id. at 948 (majority opinion).  Justice Scalia found that doing so without a warrant unconstitutionally violated Mr. Jones’s property rights. Id. at 949.
      [38].   A prime illustration is the relationship between England and the United States.  They have swapped sensitive data on each other’s citizens, doing indirectly what is not permitted directly.  British Spy Agency Taps Cables, Shares with U.S. NSA – Guardian, Reuters (June 21, 2013), -idUKBRE95K10620130621.
      [39].   Michael Riley, U.S. Agencies Said to Swap Data with Thousands of Firms, Bloomberg (June 15, 2013), -14/u-s-agencies-said-to-swap-data-with-thousands-of-firms.html.
      [40].   Most scholars believe that words in the Constitution require interpretation.  Originalism, for example, looks to ground the meaning of the words based on the era and its sources.  Construction can have varying levels of strictness. For example, Justice Scalia believes that “[w]ords have meaning. And their meaning doesn’t change.” Jennifer Senior, In Conversation: Antonin Scalia, N.Y. Mag. (Oct. 6, 2013).
      [41].   Tom W. Bell, The Third Amendment: Forgotten but Not Gone, 2 Wm. & Mary Bill Rts. J. 117, 121 (1993).
      [42].   Id. at 123 n.46 (citing Coram Rege Roll, no. 564 (Easter 1402), m. 28d, at Westminster in Middlesex, reprinted in VII Select Cases in the Court of King’s Bench 121-23 (G.O. Sayles ed., 1971)).
      [43].   See Butler, supra note 7, at 1231–33, for an argument that it does trigger the Third Amendment.
      [44].   Quartering Soldiers, The Law Dictionary, /quartering-soldiers/ (last visited Jan. 13, 2013).
      [45].   Calo, supra note 12, at 33.
      [46].   See supra note 38.
      [47].   Given the rejection of an alternative Amendment that would have limited it only to private and not public houses, the Framers opted for a broader approach.  Compare Bell, supra note 41, at 129 n.105, with U.S. Const. amend. III.
      [48].   A recent commentator has provided the Amendment with a similar construction. See Butler, supra note 7, at 1230.
      [49].   The government also pays and partners with companies to produce and swap data. Riley, supra note 39.

By Beverly Cohen*


On June 23, 2011, the United States Supreme Court, in Sorrell v. IMS Health Inc.,[1] determined that Vermont’s law prohibiting pharmacies from selling prescription data to “data-mining companies” violated the Free Speech Clause of the First Amendment.[2]  Data miners purchased the prescription data to aggregate and resell it to pharmaceutical manufacturers for marketing purposes.[3]  Drug manufacturers used the information to target physicians for face-to-face visits (“detailing”) by salesmen to convince the physicians to prescribe more of the manufacturers’ costly brand-name drugs.[4]  The prescription information purchased from the data miners enabled the manufacturers to target particular physicians who were not prescribing their brand-name drugs or who were prescribing competing drugs.[5]

Several states objected to drug manufacturers’ use of prescription information for detailing, contending that it increased sales of brand-name drugs and drove up healthcare costs.[6]  When these states passed laws preventing the pharmacies’ sale of the prescription information to data-mining companies and the use of this information by drug manufacturers,[7] the data miners and drug manufacturers sued.[8]

When the challenge to Vermont’s data-mining law reached the Supreme Court, the Court invalidated it on the grounds that it violated the Free Speech Clause.[9]  The Court held that the law could not survive heightened scrutiny.  It prohibited the use of prescription information with a particular content (prescriber histories) by particular speakers (data miners and detailers)[10] and did not advance Vermont’s asserted goals of ensuring physician privacy, improving the public health, and containing healthcare costs in a permissible way.[11]

The Federal Privacy Rule,[12] implementing the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”),[13] is similar to the data-mining laws in its restrictions on the disclosure of private health information.[14]  This Article applies the HIPAA Privacy Rule to the practice of data mining and, surprisingly, finds that HIPAA restricts it.[15]  The Privacy Rule flatly prohibits any unauthorized use or disclosure of protected health information for marketing purposes.[16]  Nevertheless, the practice of data mining continues despite HIPAA.  In fact, at least one court has recently declared that nothing in HIPAA restricts data mining.[17]

The question post-Sorrell is whether the marketing provisions of the Privacy Rule, like Vermont’s data-mining law, also violate freedom of speech.[18]  Although there are obvious similarities between HIPAA’s marketing provisions and the marketing restrictions of Vermont’s data-mining law, there are also substantial differences.[19]  The structure of the Privacy Rule is quite unlike that of the data-mining law in that the discriminatory intent and impact that the Supreme Court found objectionable in Sorrell are largely absent from HIPAA.[20]  Unlike Vermont’s data-mining law, the Privacy Rule does not target disclosures with particular content or by particular speakers.[21]  Therefore, this Article concludes that it is likely that application of the Sorrell analysis to the Privacy Rule would yield a different answer.[22]

This Article explains the practices of data mining and detailing[23] and describes the state laws that sprang up to prohibit them.[24]  It next discusses the various judicial outcomes of the data miners’ challenges to those laws,[25] culminating in the Supreme Court’s invalidation of Vermont’s data-mining law in Sorrell.[26]  The Article then applies the HIPAA Privacy Rule to the Sorrell facts and finds that the marketing provisions of HIPAA disallow the unauthorized use of such information for sale to data miners.[27]  Finally, the Article compares the HIPAA Privacy Rule to the data-mining law invalidated in Sorrell and finds that their structures considerably differ.[28]  Based on these differences, the Article opines that HIPAA presents a substantially different question from that considered in Sorrell and likely yields a different answer.[29]

I.  The Targeted Practices: Data Mining and Detailing[30]

Every time a pharmaceutical prescription is filled, the pharmacy retains information describing the transaction.[31]  These records generally include the identification of the patient; identification of the prescribing physician, including his name, address, and phone number; the drug prescribed, its dosage, and its refill information; price; and insurance information.[32]  In many cases, state law requires this information to be collected and maintained by the pharmacies[33] so that the state can monitor cases of illicit prescriptions and fraudulent prescribing practices by physicians.[34]

Companies, such as IMS Health Inc. and Verispan, LLC,[35] are in the business of “mining” this pharmacy data.[36]  They purchase from the pharmacies prescription data that the pharmacies’ computer software has collected and encrypted so that individual patients cannot be identified by name.[37]  The prescription information that data miners purchase is estimated to encompass several billion prescriptions per year.[38]  The data miners then aggregate the entries,[39] group the information by prescriber, and cross-reference the prescribing history with information on each prescriber available through publicly accessible databases, such as the American Medical Association’s database of physician specialists.[40]  The ultimate reports that the data miners produce show each prescriber’s identity, medical specialty, and a complete history of the drugs he or she prescribed over a given period of time.

The data miners’ customers for these reports are the pharmaceutical manufacturers because the reports are useful in facilitating the drug manufacturers’ practice of “detailing.”  This practice consists of drug-sales representatives visiting physicians and their staffs in a particular region where specific drugs are being marketed.[41]  At these face-to-face meetings, the sales representatives give the physicians “details” about their drugs (use, side effects, and risks) to convince the physicians that they are a better choice for their patients.[42]  Described as a “valuable tool,”[43] the data-mining reports allow the drug representatives to pinpoint prescribers who might be persuaded to switch to the manufacturer’s drugs or to prescribe the manufacturer’s drugs more frequently.[44]  The data-mining reports also enable the representatives to tailor their presentations based on the particular physician’s prescribing practices to maximize the effectiveness of their sales efforts:

That [data-mining] information enables the detailer to zero in on physicians who regularly prescribe competitors’ drugs, physicians who are prescribing large quantities of drugs for particular conditions, and “early adopters” (physicians with a demonstrated openness to prescribing drugs that have just come onto the market).  The information also allows the detailer to tailor her promotional message in light of the physician’s prescribing history.[45]

Merck’s use of data mining to market Vioxx provides an example of the usefulness of data mining to sell a particular drug:

When Merck marketed Vioxx, for example, it used a wealth of prescriber-identifying data to create monthly reports on individual prescribers in each detailer’s assigned territory.  The reports showed how many Merck versus non-Merck drugs the prescriber prescribed and estimated how many of these prescriptions could be substituted for Merck products.  Merck then tracked its detailers’ progress in converting prescribers in their territories to the Merck brand and gave detailers bonuses based on Merck’s sales volume and market share in the detailer’s territory.[46]

Detailing has been described as “a massive and expensive undertaking for pharmaceutical manufacturers.”[47]  Manufacturers reportedly spent $4 billion in 2000 for detailing,[48] employing some 90,000 sales representatives to make the physician office visits.[49]  The detailers often arrive with small gifts for the physicians and their staffs and drop off free drug samples for the physicians to try with their patients.[50]  It has been estimated that a single physician is visited by an average of twenty-eight detailers a week, and a specialist is visited by an average of fourteen detailers.[51]  Because of the time involved and high cost of detailing, drug manufacturers usually reserve it for marketing high-cost, brand-name drugs,[52] as opposed to lower-cost, generic drugs.[53]  Sales representatives try to convince physicians to switch from generic drugs to their brand-name drug, to utilize it instead of a competing brand-name drug, or to remain loyal to the brand-name drug when the patent expires and generic versions become available.[54]

II.  States’ Objections to Drug Manufacturers’ Use of Data Mining for Detailing

Some states, including New Hampshire, Maine, and Vermont, perceived that pharmaceutical manufacturers’ use of pharmacy data to enhance their detailing efforts increased the cost of prescription drugs with no concomitant improvement to the public health.[55]  These perceptions emanated from several factors.

First, the states became convinced that data mining improved the success of detailing.[56]  These states perceived that “detailers armed with prescribing histories enjoyed a significant marketing advantage, resulting in greater leverage, [and] increased sales of brand-name drugs.”[57]  This “leverage” refers to the detailer’s ability to target physicians who prescribe large quantities of generics, the ability to “zero in” on a physician’s particular prescribing choices, and the ability to “punish” physicians who abandon their loyalty to certain brand-name drugs.[58]  Thus, “prescribing histories helped the detailer to become more adversarial in her presentation and to focus on the weakness of the physician’s erstwhile drug of choice as opposed to the clinical virtues of the detailed drug.”[59]

Second, the states believed that the success of detailing often resulted from less than accurate and balanced information.  Vermont negatively characterized the detailers’ provision of information to physicians on pharmaceutical safety and efficacy as “frequently one-sided,” “incomplete,” and “biased.”[60]  The Vermont legislature found that the “[p]ublic health is ill served by the massive imbalance in information presented to doctors and other prescribers.”[61]  Vermont held detailers’ use of data mining responsible for creating “an unbalanced marketplace of ideas that undermines the state’s interests in promoting public health, protecting prescriber privacy, and reducing healthcare costs.”[62]

Third, the states perceived that detailing improperly influenced physicians’ prescription choices and unnecessarily raised the cost of prescription drugs.  New Hampshire viewed detailing as having a “pernicious effect” upon drug prescribing.[63]

The states’ “common sense” conclusion was that detailing worked to induce physicians to prescribe larger quantities of more expensive brand-name drugs.[64]  The fact “that the pharmaceutical industry spends over $4 billion annually on detailing bears loud witness to its efficacy.”[65]  Despite the much higher cost of detailed drugs, New Hampshire concluded that, based upon “competent evidence,” drugs that were aggressively marketed through detailing “provide no benefit vis-à-vis their far cheaper generic counterparts.”[66]  The State maintained that “detailers armed with prescribing histories encouraged the overzealous prescription of more costly brand-name drugs regardless of both the public health consequences and the probable outcome of a sensible cost/benefit analysis.”[67]

Finally, doctors themselves voiced “a predominantly negative view of detailing.”[68]  A 2006 survey by the Maine Medical Association reported that “a majority of Maine physicians did not want pharmaceutical manufacturers to be able to use their individual prescribing histories for marketing purposes.”[69]

III.  State Laws Regulating Data Mining[70]

In the interests of protecting prescriber privacy, safeguarding the public health, and containing healthcare costs,[71] New Hampshire in 2006 became the first state to enact a law limiting drug prescription data mining, known as the Prescription Information Law.[72]  The law prohibited the sale, transfer, use, or licensing of prescription records by pharmacies and insurance companies for any commercial purpose,[73] except for listed health-related purposes, such as pharmacy reimbursement, healthcare management, utilization review by a healthcare provider, and healthcare research.[74]  The statute did not prohibit the transfer of prescription information to fill patients’ prescriptions[75] and placed no restrictions upon prescription information that did not identify the patient or the prescriber.[76]

Shortly thereafter, Vermont followed suit, enacting Act 80, section 17 of the Vermont General Statutes to restrict the use of pharmacy records for drug marketing.[77]  Vermont’s policy goals, compatible with those of New Hampshire, were:

[T]o advance the state’s interest in protecting the public health of Vermonters, protecting the privacy of prescribers and prescribing information, and to ensure costs are contained in the private health care sector, as well as for state purchasers of prescription drugs, through the promotion of less costly drugs and ensuring prescribers receive unbiased information.[78]

Unlike the flat prohibition of New Hampshire’s statute, however, Vermont’s law adopted an “opt-out” approach, prohibiting insurers and pharmacies from selling or transferring prescription data for marketing purposes unless the prescriber opted out of the prohibition by consenting to the use.[79]  The law also prohibited pharmacy manufacturers from using the data for marketing absent prescribers’ consent.[80]  The law defined “marketing” as advertising or any activity that influenced the sale of a drug or influenced prescribing behavior.[81]  The statute contained a number of exceptions to the prohibition, most of which facilitated healthcare treatment and reimbursement, such as dispensing prescriptions, pharmacy reimbursement, patient care management, utilization review by healthcare professionals, healthcare research, and communicating treatment options to patients.[82]  The law also created a program to educate healthcare professionals on therapeutic and cost-effective drug prescribing.[83]

In 2008, Maine enacted similar legislation.  Its goals, like those of New Hampshire and Vermont, were “to improve the public health, to limit annual increases in the cost of healthcare and to protect the privacy of . . . prescribers in the healthcare system of this State.”[84]  Unlike Vermont’s “opt-out” approach, Maine passed an “opt-in” version, making it unlawful for a pharmacy to use, sell, or transfer prescription drug information for any marketing[85] purpose when the information identified the prescriber and the prescriber had opted in by registering for the statute’s protection.[86]  The law included a number of health-related exceptions to the definition of “marketing,” including pharmacy reimbursement, patient care management, utilization review by a healthcare provider, and healthcare research.[87]

IV.  Challenges to the Data-Mining Laws

With a number of other states considering enactment of similar laws,[88] and facing the loss of billions of dollars in business annually, several data-mining companies[89] and an association of pharmaceutical manufacturers[90] challenged the constitutionality of New Hampshire’s, Vermont’s, and Maine’s data-mining laws.[91]  The claims that survived to appeal included that the statutory prohibition violated the Free Speech Clause of the First Amendment, was unconstitutionally vague and overbroad under the First and Fourteenth Amendments, and offended the Commerce Clause.[92]

In 2008, the United States Court of Appeals for the First Circuit ruled in IMS Health Inc. v. Ayotte[93] that the New Hampshire statute regulated conduct, not speech, and therefore did not abridge the First Amendment rights of the data miners.[94]  Alternatively, the First Circuit ruled that even if New Hampshire’s law amounted to a regulation of protected speech, New Hampshire’s action to protect cost-effective healthcare passed constitutional muster.[95]  Utilizing the Central Hudson test,[96] the court found that healthcare cost containment was a substantial governmental interest,[97] data mining increased the success of detailing,[98] detailing increased the cost of prescription drugs,[99] and the statute was sufficiently tailored to achieve its objectives.[100]  The court summarily disposed of the remaining claims for vagueness[101] and violation of the Commerce Clause.[102]

When the challenge to the Maine statute reached the First Circuit in IMS Health Inc. v. Mills[103] approximately two years later, the court unsurprisingly relied upon its prior New Hampshire Ayotte ruling.[104]  The court rejected the First Amendment claim,[105] the vagueness claim,[106] and the Commerce Clause challenge[107] for the same reasons stated in Ayotte.

Four months after Mills, the United States Court of Appeals for the Second Circuit ruled on the same issues with regard to the Vermont statute in IMS Health Inc. v. Sorrell.[108]  In Sorrell, the Second Circuit disagreed with nearly every basis for the First Circuit’s two prior decisions.  Applying the Central Hudson test,[109] the court found that although Vermont did have a substantial interest in lowering healthcare costs and protecting the public health,[110] the statute did not directly advance those interests.[111]  Rather, the court characterized Vermont’s law as an attempt “to bring about indirectly some social good or alter some conduct by restricting the information available to those whose conduct the government seeks to influence.”[112]  Moreover, the court found that Vermont had “more direct, less speech-restrictive means available” to accomplish its goals.[113]  As less restrictive alternatives, the court suggested that the State could have assessed the results of its campaign to encourage the use of generics or could have mandated the use of generic drugs as a first course of treatment.[114]  Failing these critical prongs of the Central Hudson test, the court ruled that Vermont’s law unconstitutionally restricted freedom of speech.[115]

With the First Circuit and Second Circuit Courts of Appeal thus directly at odds on the constitutionality of the data-mining laws, the United States Supreme Court granted certiorari to consider Vermont’s appeal in Sorrell v. IMS Health Inc.[116]

V.  The Supreme Court’s Decision

In June 2011, the Supreme Court, in a six to three ruling,[117] held that Vermont’s drug prescription data-mining law violated the First Amendment.[118]  While conceding that Vermont’s asserted policy goals of containing pharmacy prescription costs and protecting public health were legitimate concerns,[119] the Court held that the statute was a broad, content-based rule[120] that did not satisfy strict scrutiny.[121]

Initially, the Court unequivocally held that the Vermont law was content and speaker based, as it prohibited the sale of pharmaceutical prescription data only for marketing purposes[122] and only to pharmaceutical manufacturers.[123]  Because the law “impose[d] burdens that are based on the content of speech and that are aimed at a particular viewpoint,” the Court ruled that it must apply strict scrutiny.[124]

The Court flatly rejected Vermont’s argument that the law regulated conduct as opposed to speech.[125]  Instead, the Court ruled that “[f]acts, after all, are the beginning point for much of the speech that is most essential to advance human knowledge and to conduct human affairs.  There is thus a strong argument that prescriber-identifying information is speech for First Amendment purposes.”[126]

Applying the Central Hudson test, whereby “[t]here must be a ‘fit between the legislature’s ends and the means chosen to accomplish those ends,’”[127] the Court held that none of the State’s asserted justifications—prescriber privacy, protecting public health, and reducing healthcare costs—withstood scrutiny.[128]  First, because the law permitted disclosure of prescription information for a number of other purposes and applied the ban only to marketing, the Court rejected the privacy justification.[129]  The Court ruled that Vermont’s statute “permits extensive use of prescriber-identifying information and so does not advance the State’s asserted interest in physician confidentiality.”[130]  In particular, the Court objected to the State’s own ability to use the same prescription information to engage in “counter-detailing” efforts to promote generic drugs.[131]  Moreover, the Court observed that privacy remedies less restrictive of speech were available.[132]  For example, prescribers could simply decline to meet with detailers.[133]  Even though physicians might find the use of their prescription histories by detailers to be “underhanded” or tantamount to “spying,”[134] the Court declared that “[s]peech remains protected even when it may . . . ‘inflict great pain.’”[135]

In similar fashion, the Court declared that Vermont’s stated policy goals of improving public health and reducing healthcare costs did not withstand scrutiny under the Central Hudson test,[136] because the law “does not advance them in a permissible way.”[137]  The law sought to protect patients’ health and cost-effectiveness only indirectly, aimed at the fear that physicians, admittedly sophisticated consumers,[138] would make poor purchasing decisions if given truthful information by detailers.[139]

In short, the Court viewed the statute as a means for the State to advance its own views over those of pharmacy manufacturers by stifling protected speech.[140]  The Court stated that if the statute had provided for only a few narrowly tailored exceptions to its ban on the sale or disclosure of prescription information, then its position that it was not targeting a disfavored speaker and disfavored content might be stronger.[141]  But, here, the law permitted disclosure of the same information to countless others and even to the State itself to persuade physicians to prescribe generic drugs.[142]  The Court declared that free access to and use of privately held information is “a right too essential to freedom to allow its manipulation to support just those ideas the government prefers.”[143]  The Court concluded that “the State has left unburdened those speakers whose messages are in accord with its own views.  This the State cannot do.”[144]

Several days after the Sorrell decision was issued, the Supreme Court vacated the First Circuit’s finding that the Maine data-mining laws were valid and remanded the case to the court for further consideration in light of Sorrell.[145]  Three months later, the New Hampshire District Court issued an order declaring that New Hampshire’s data-mining laws were invalid in light of Sorrell.[146]

VI.  Applying the HIPAA Privacy Rule to Data Mining

Since New Hampshire, Vermont, and Maine each enacted state laws that prohibited pharmacies from selling prescription information to data miners for use in detailing, presumably these states perceived that such laws were necessary to ban the practice.  This necessity apparently stemmed from the states’ belief that nothing in the HIPAA Privacy Rule prohibited these data sales by the pharmacies.  This Part of the Article explains why that belief is not supported by the text of HIPAA.

A.     Provisions of the HIPAA Privacy Rule

The HIPAA Privacy Rule[147] regulates covered entities’ use and disclosure of protected health information.[148]  The covered entities regulated by HIPAA include most health plans and healthcare providers.[149]  The term “provider” is defined by the Rule as “a provider of medical or health services . . . and any other person or organization who furnishes, bills, or is paid for health care in the normal course of business.”[150]

Under HIPAA, any time a covered entity uses or discloses protected health information, the use or disclosure must comply with HIPAA’s privacy provisions.[151]  The term “use” is broadly defined as “the sharing, employment, application, utilization, examination, or analysis” of health information protected by HIPAA.[152]  “Disclosure” is also broadly defined as “the release, transfer, provision of, access to, or divulging in any other manner of information outside the entity holding the information.”[153]

The health information protected by the Privacy Rule includes any information relating to healthcare treatment or payment[154] that has a potential to identify the patient to whom the information applies.[155]  Identifiers that can render health information protected include, inter alia, the patient’s name, address, social security number, phone number, photograph, zip code, treatment date, employer, and names of spouse and children.[156]  Furthermore, any identifier that is not specifically named in the Privacy Rule but, due to its uniqueness, has a potential to identify the subject of the information also renders the information protected.[157]

Under the Privacy Rule, any use or disclosure of protected health information by a covered entity must be explicitly permitted or required by HIPAA.[158]  The Privacy Rule requires disclosure in only two instances: (1) when the subject of the protected health information (“the individual”)[159] requests access to his own healthcare information,[160] and (2) when the Secretary of the Department of Health and Human Services (“HHS”) requests access in order to enforce HIPAA.[161]  All other uses and disclosures authorized by the Privacy Rule are permissive.[162]

Most of the permissive uses and disclosures under the Privacy Rule fall into two broad categories.[163]  First, covered entities may use and disclose protected health information for “treatment, payment, or healthcare operations.”[164]  “Treatment” is defined as the rendering of healthcare services to individuals or managing their care.[165]  “Payment” comprises paying insurance premiums and reimbursing providers.[166]  “Health care operations” broadly encompasses operating the business of healthcare entities, including such activities as business management and administrative activities, quality assessment, evaluating the credentials of providers, customer service, and obtaining legal and auditing services.[167]  Thus, treatment, payment, and healthcare operations cover the myriad activities that allow the healthcare industry to function.

The second broad category of permissive uses allows covered entities to use and disclose protected health information for twelve public-interest activities.[168]  These include, inter alia, participating in public-health activities to prevent or control disease; reporting abuse, neglect, or domestic violence; complying with healthcare audits and investigations; assisting law enforcement activities; engaging in healthcare research; and assisting national security and intelligence activities.[169]

If a covered entity’s use or disclosure of protected health information does not fit within one of the Privacy Rule’s enumerated required or permitted uses and disclosures, then the use or disclosure may not occur[170] unless the individual authorizes the use or disclosure in writing.[171]

As the primary goal of HIPAA is to protect the privacy of individuals’ healthcare information,[172] HIPAA grants individuals rights of access to their own information and rights to control its uses and disclosures by covered entities.  These rights include the following:
(1) A right of individuals to access upon request their own protected health information,[173] along with a right to appeal denials of access;[174]

(2) A right of individuals to seek to amend their protected health information possessed by covered entities,[175] as well as a right to submit a written statement disagreeing with a denial of an amendment;[176]

(3) A right of individuals to receive an accounting of certain disclosures of their protected health information made by covered entities;[177]

(4) A right of individuals to request covered entities to restrict certain permissible uses and disclosures of their protected health information;[178]

(5) A right of individuals to request confidential communications of protected health information from providers and health plans,[179] which providers must accommodate[180] and which health plans must accommodate if the individuals state that they will be in danger unless accommodation is made;[181]

(6) A right of individuals to agree or object before covered entities make certain disclosures;[182]

(7) A right of individuals to authorize disclosures to third parties;[183] and

(8) A right of individuals to receive a Notice of Privacy Practices from covered entities, describing the covered entities’ uses and disclosures of their protected health information and the individuals’ rights thereunder.[184]

B.     HIPAA’s De-identification and Marketing Provisions

HIPAA’s de-identification and marketing provisions are especially relevant to data mining.  In Sorrell, the data mining involved de-identification because, when the pharmacies’ computer software collected the raw prescription data, the software encrypted or stripped out the patients’ identifying information.[185]  Therefore, when the pharmacies sold the information to the data miners, it had been de-identified because the patients’ names could no longer be identified.[186]

The Privacy Rule provides that once protected health information is de-identified, it is no longer protected by HIPAA and thus is not subject to HIPAA’s use and disclosure restrictions.[187]  HIPAA gives explicit instructions on what information must be removed from protected health information to render it de-identified.[188]  Further, HIPAA specifically permits covered entities to de-identify protected health information.[189]  Moreover, HIPAA defines “health care operations,” one of the permissive uses and disclosures of protected health information under the Privacy Rule,[190] to include a covered entity’s creation of de-identified information when the de-identification relates to a “covered function.”[191]
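The de-identification process described above — pharmacy software stripping out patient identifiers while leaving prescriber and drug information intact — can be illustrated with a minimal sketch.  This is purely illustrative: the field names are hypothetical and the snippet is not a complete implementation of the Privacy Rule’s de-identification standards.

```python
# Illustrative sketch only: hypothetical field names, not an actual
# pharmacy system and not a full list of the Privacy Rule's identifiers.

# Patient identifiers of the kind enumerated in the Privacy Rule
# (name, address, phone number, etc.) that must be removed before
# the record can be treated as de-identified.
PATIENT_IDENTIFIERS = {
    "patient_name",
    "patient_address",
    "patient_phone",
    "patient_ssn",
    "patient_dob",
}

def de_identify(record: dict) -> dict:
    """Return a copy of a prescription record with patient
    identifiers stripped out; prescriber and drug fields remain."""
    return {k: v for k, v in record.items() if k not in PATIENT_IDENTIFIERS}

record = {
    "patient_name": "Jane Doe",
    "patient_ssn": "000-00-0000",
    "prescriber": "Dr. Smith",  # prescriber identity is retained --
    "drug": "ExampleBrand",     # which is precisely what made the data
    "quantity": 30,             # valuable to detailers in Sorrell
}

cleaned = de_identify(record)
```

Note that, as in Sorrell, the prescriber-identifying fields survive the process; only the patient’s identity is removed, which is why the de-identified data remained useful for targeting individual physicians.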

HIPAA’s marketing provisions are also particularly relevant to data mining.  The data miners purchased the prescription information to market their aggregations and reports to pharmaceutical manufacturers.[192]  The drug manufacturers, in turn, purchased the prescription information to more effectively market their brand-name drugs to prescribers.[193]

HIPAA expressly provides that covered entities’ uses and disclosures of protected health information for the purpose of “marketing” are subject to heightened restrictions.[194]  HIPAA defines “marketing” in two ways.  First, marketing includes “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service.”[195]  However, this definition excludes communications made for description of plan benefits, for treatment of the individual, or for case management or care coordination of the individual.[196]  Second, marketing includes a covered entity’s sale of protected health information to a third party to assist that party in marketing its products.[197]

HIPAA’s primary marketing restriction is that whenever a covered entity uses or discloses protected health information for marketing purposes, the individual must expressly authorize the use or disclosure.[198]  This mandate is stated emphatically: “Notwithstanding any provision of this subpart,[199] other than the transition provisions in § 164.532,[200] a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”[201]

The Rule states only two exceptions to this requirement that the individual must authorize the marketing uses and disclosures.  First, an authorization is not needed if the marketing consists of a face-to-face communication between the covered entity and the individual.[202]  Second, an authorization is not needed if the marketing consists of a promotional gift of nominal value provided by the covered entity.[203]  The affected individual must authorize all other marketing uses and disclosures.[204]

C.     How HIPAA’s Marketing and De-identification Rules Impact Data Mining

While Vermont and other states apparently believed that it was necessary to enact a law to prohibit pharmacies from selling prescription information to data miners, surprisingly, such laws were probably not necessary.  HIPAA already appears to have prohibited those sales, rendering the state laws inconsequential.

As explained above, HIPAA requires an authorization from every affected individual before his protected health information can be used or disclosed by covered entities for marketing purposes.[205]  Each Vermont pharmacy qualifies as “a provider of medical or health services” and as an entity that “furnishes, bills, or is paid for health care in the normal course of business.”[206]  Thus, the pharmacies are covered entity providers under HIPAA.[207]  The prescription information collected and retained by the pharmacies constitutes “protected health information,” as it includes the patients’ names and addresses, as well as other identifying information.[208]

Moreover, the pharmacies’ disclosures of prescription information to the data miners appear to have been “for marketing.”[209]  The pharmacies made the disclosures to data miners to enable the data miners to sell their aggregations and reports of pharmacy data to their customers, including drug manufacturers.[210]  Further, the data miners disclosed the prescription information to drug manufacturers to use in marketing their brand-name drugs to physicians.[211]  Selling prescription information for these purposes appears to qualify as marketing under the Privacy Rule’s broad definition: “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service.”[212]

The “rub” with this analysis, however, is that according to the facts in Sorrell, the pharmacies did not disclose “protected health information” to the data miners because the information had been de-identified by the pharmacies’ computer software prior to the sale.[213]  As stated above,[214] HIPAA expressly permits covered entities to de-identify protected health information, thereby removing it from any constraints that HIPAA imposes.[215]  Therefore, the pharmacies’ de-identification of their prescription information may have removed the information from the category of “protected health information”[216] and thereby enabled the pharmacies to make whatever use they wished of the information without violating HIPAA.[217]

But HIPAA’s marketing restrictions do not prohibit only unauthorized disclosures of protected health information.  The restrictions also prohibit the pharmacies from even using protected health information for marketing purposes.[218]  Creating de-identified information from protected health information appears to constitute a use of the protected health information because the pharmacies must “employ” and “utilize” the protected information in order to de-identify it.[219]  In fact, the Privacy Rule itself refers to the “use” of protected health information to create de-identified information.[220]  Here, the purpose of the pharmacies’ de-identification of the prescription information was to facilitate sales of the information to the data miners and to enable sales of the data miners’ aggregations and reports to the drug manufacturers, all for the purpose of marketing brand-name drugs to prescribers.[221]  Therefore, the de-identification itself appears to qualify as a marketing use,[222] so that the pharmacies would be prohibited from such use without the individuals’ express authorization under the Privacy Rule.[223]

Further, the Privacy Rule’s explicit statement that covered entities may de-identify protected health information[224] does not negate the authorization requirement.  The requirement that covered entities must obtain an authorization before any use or disclosure related to marketing expressly states that this requirement is imposed “[n]otwithstanding any provision of this subpart.”[225]  HIPAA’s de-identification provisions, on the other hand, lack this vital “notwithstanding” language.[226]  Therefore, the requirement to obtain individuals’ authorizations for any use or disclosure related to marketing trumps the de-identification provisions.  Although HIPAA expressly permits covered entities and their business associates to de-identify protected health information,[227] it appears that any such use (i.e., de-identification) of protected health information for marketing purposes may not occur without written authorizations from the affected individuals.[228]

Under this reading of HIPAA, whenever the purpose of the de-identification is marketing, the pharmacy must first obtain a written authorization from every individual whose protected health information is being so used before any pharmacy de-identifies its prescription information.[229]  Granted, the requirement to obtain authorizations is not an outright prohibition against the de-identification, subsequent sale, or ultimate use of the information for marketing.  However, the requirement to obtain an authorization from every individual—where billions of prescriptions are being disclosed[230]—places such an enormous burden on the pharmacies that, for all practical purposes, it quashes use of the information for data mining.[231]  Not only will pharmacies need to obtain written authorizations from every individual before his information may be de-identified or disclosed, but it is likely that most of these patients will either refuse to furnish the authorizations or not bother to execute them.[232]  Consequently, HIPAA’s authorization requirement adds so much additional effort and cost to data mining that drug companies will probably no longer find it a cost-effective tool for detailing.[233]

D.     Continuing Failure to Use HIPAA to Restrict Data Mining

It is apparent that parties continue to fail to apply the marketing provisions of the Privacy Rule to restrict data mining.  In a recent case, Steinberg v. CVS Caremark Corp.,[234] prescription drug purchasers sued a pharmacy chain for, inter alia, its disclosures of the purchasers’ prescription information to data miners.  Plaintiffs challenged the pharmacies for accepting remuneration from drug manufacturers for (1) sending letters to the consumers’ physicians suggesting that they prescribe alternate drugs, and (2) selling de-identified prescription information directly to the drug manufacturers and data companies.[235]  The plaintiffs brought state law claims for violation of Pennsylvania’s Unfair Trade Practices and Consumer Protection Law, unjust enrichment, and invasion of privacy.[236]

The United States District Court for the Eastern District of Pennsylvania dismissed the complaint for failure to state a claim.[237]  In so doing, the court made erroneous findings that nothing in the Privacy Rule restricted the defendants’ activities.[238]  First, the court declared that the pharmacies’ sale of de-identified drug prescription information to pharmaceutical manufacturers and data companies for marketing purposes did not offend the HIPAA Privacy Rule because the information had been de-identified prior to sale.[239]  Second, the court stated that the pharmacies’ use of the plaintiffs’ protected health information to send marketing notices to the plaintiffs’ physicians did not violate HIPAA because this constituted permissible healthcare operations.[240]

In fact, the Privacy Rule does not permit either activity.  Without authorizations from the affected individuals, the pharmacies could not use the plaintiffs’ protected health information (even when such use is de-identification) for marketing purposes,[241] thereby rendering the unauthorized de-identification itself illicit.  Further, without the appropriate authorizations, the pharmacies could not disclose protected health information to the plaintiffs’ physicians for marketing purposes,[242] even under the guise of suggesting “treatment alternatives.”[243]  While the court correctly observed that HIPAA does not provide a private right of action, thereby precluding the plaintiffs from bringing a claim directly under HIPAA,[244] the HIPAA violations could arguably have served as bases for the plaintiffs’ state law claims.

VII.  Applying the Sorrell Analysis to the HIPAA Privacy Rule

Both the data-mining laws and HIPAA impose restrictions on the use of health information for marketing.[245]  Both restrict pharmacies from selling de-identified prescription information to data miners.[246]  Although the Supreme Court invalidated the Vermont data-mining law in Sorrell,[247] HIPAA still effectively prevents pharmacies from using protected health information for marketing purposes, including de-identifying it for sale to data miners.[248]  Thus, the Sorrell holding raises the question of whether the marketing restrictions in the HIPAA Privacy Rule, like the data-mining law in Sorrell, violate the First Amendment rights of the data miners and drug manufacturers to obtain access to prescription information for marketing purposes.

At first glance, aspects of the data-mining laws and HIPAA’s marketing provisions appear quite similar.  Both bodies of law were motivated by substantial governmental interests—prescriber privacy, public health, and healthcare cost containment for the data-mining laws,[249] and privacy of patients’ medical information for the HIPAA Privacy Rule.[250]  Both laws seek to restrain the use of health information for marketing purposes.[251]  Both define marketing in similar ways.[252]  And both list a number of healthcare-related exceptions to their marketing restrictions.[253]

Despite these similarities, there are substantial grounds to argue that important distinctions between the data-mining laws and the HIPAA Privacy Rule predominate in any comparison.  First, the parties who sought protection are quite different.  The data-mining laws aimed to maintain the privacy of prescribers,[254] many of whom had complained that allowing drug manufacturers access to their prescribing history allowed the detailers “to target them for unwelcome marketing calls.”[255]  The Sorrell Court observed, however, that physicians are hardly hapless victims of detailing.  The Court noted, for instance, that “many listeners find detailing instructive,”[256] and physicians could “simply decline to meet with detailers.”[257]  In fact, the Court characterized prescribing physicians as “sophisticated and experienced consumers.”[258]

Quite unlike the physicians in Sorrell,[259] the parties the HIPAA Privacy Rule seeks to protect are private patients facing unwarranted invasions into their most private medical information.[260]  Contrasted to physicians, private healthcare patients are typically much more in need of protection.  Medical records are amassed on their behalf involuntarily, as a necessary byproduct of seeking medical treatment.[261]  Not only are many individual patients undoubtedly less sophisticated than physicians, but they may also be unable to monitor illicit uses of their medical records, particularly if they are ill or aged.  In fact, most patients are probably unaware of the many uses and disclosures of their medical information by covered entities that HIPAA permits.[262]  To address these concerns, HIPAA sets clear limits on covered entities’ uses and disclosures of individuals’ protected health information.[263]  It provides a means whereby covered entities must obtain individuals’ authorization for uses and disclosures that are not expressly permitted by HIPAA,[264] and whereby patients can prevent certain uses and disclosures prior to their occurrence.[265]  As a result, a strong argument can be made that the HIPAA Privacy Rule, unlike the data-mining laws, is a reasoned response to the critical need to protect patients’ medical privacy.

Second, the Supreme Court criticized Vermont’s data-mining law for attempting to advance its goals in too indirect a way.[266]  The State restricted access to prescription information in order to restrict data mining, which in turn would impair detailing, which in turn would result in physicians writing fewer prescriptions for brand-name drugs, which in turn would contain healthcare costs and avoid unnecessary health risks.[267]  HIPAA, on the other hand, directly accomplishes its goal of protecting individuals’ medical privacy by conferring upon the individuals themselves the ability to control, within certain limits,[268] the uses and disclosures of their own protected health information by covered entities.[269]

Third, there were readily available less restrictive alternatives to the data-mining laws that could have accomplished the asserted purposes of achieving prescriber privacy, protecting the public health, and containing pharmaceutical costs.  The Sorrell Court observed that physicians could easily refuse to meet with detailers, thereby preventing the detailers from using the physicians’ prescriber histories to pressure them into purchasing expensive brand-name drugs.[270]  Further, Vermont’s law authorized funds for a drug education program to provide physicians with information on “cost-effective utilization of prescription drugs.”[271]  Accordingly, before prohibiting data mining of pharmacy prescriptions, the State could have waited to see if that program was successful in limiting sales of nongeneric drugs.[272]

In contrast, with regard to HIPAA, there is no readily ascertainable less restrictive means to protect the privacy of patients’ medical records other than to permit limited uses and disclosures and to require patients’ consent for everything else.[273]  Congress, with limited exceptions,[274] conferred upon individuals the ability to control uses and disclosures of their own protected health information by covered entities.[275]  Requiring individuals to authorize uses and disclosures that are not otherwise needed to allow the healthcare industry to operate[276] and enable critical public interest activities[277] is therefore a direct means of achieving that control.  As a reasonable exercise of that control, HIPAA requires individuals to authorize any uses or disclosures of their protected health information to sell items or services that are not related to the individuals’ own healthcare management.[278]  As marketing third-party items and services is not critical either to providing and paying for individuals’ treatment or to enabling public interest activities, the authorization requirement for marketing uses and disclosures is necessary to achieve HIPAA’s privacy goal.

Fourth, the discriminatory impact of the data-mining laws that offended the Supreme Court in Sorrell[279] is largely absent in HIPAA.  The Sorrell Court characterized Vermont’s data-mining law as pointedly aimed at “diminish[ing] the effectiveness of marketing by manufacturers of brand-name drugs.”[280]  Convinced that detailing increased prescriptions for expensive brand-name drugs over equally effective and cheaper generic alternatives, the State sought to discourage detailing:

“In its practical operation,” Vermont’s law “goes even beyond mere content discrimination, to actual viewpoint discrimination.”  Given the legislature’s expressed statement of purpose, it is apparent that [the Vermont law] imposes burdens that are based on the content of speech and that are aimed at a particular viewpoint.[281]

The Sorrell Court found the State’s eradication of pharmacy data mining to be value based because “the State . . . engage[d] in content-based discrimination to advance its own side of a debate.”[282]  The law prohibited the communication of accurate information by detailers even though some prescribers found the information to be helpful.[283]  Also, the Court found that some brand-name drugs may be better for patients than their generic equivalents.[284]  Nevertheless, the State restricted access to prescription information to suppress speech with which it did not agree, while allowing access for itself and others to promote generics.[285]

This pointedly discriminatory goal and impact of Vermont’s data-mining law is absent with the HIPAA Privacy Rule.  Although marketing is not included in HIPAA’s list of permitted uses and disclosures,[286] it falls within a very broad category of all nonpermissive uses and disclosures for which an authorization is required.[287]  Admittedly, HIPAA singles out marketing for special restrictions,[288] as it comprises one of only two uses specified in HIPAA where protected health information may not even be de-identified absent the individual’s authorization.[289]  Here, however, it is all marketing that is so treated, not the more pointed restriction of a particular use by a particular speaker that was present in Sorrell.[290]

Consequently, the overall structure of the data-mining laws and the HIPAA Privacy Rule is markedly different.  Amid the thousands of uses and disclosures to which medical information is subject,[291] Vermont’s data-mining law pointedly prohibited only one—pharmacies’ disclosure of prescription information for marketing and the use of that information by drug manufacturers to market their drugs.[292]  Any nonmarketing use of prescription information was therefore permitted.[293]  Even with regard to marketing uses, exceptions allowed the information to be utilized for “health care research,” to enforce “compliance” with health insurance preferred drug lists, for “care management educational communications” provided to patients on treatment options, for law enforcement operations, and as “otherwise provided by law.”[294]  Pharmacies could sell the information to insurers, researchers, journalists, the State, and others.[295]  The State itself could use the information for “counterdetailing” activities.[296]  Accordingly, the Court concluded that while the law “permits extensive use of prescriber-identifying information,”[297] it targeted only one use (marketing) and one user (drug manufacturers) for its prohibition.[298]

In contrast, the HIPAA Privacy Rule regulates from the reverse vantage point.  It declares at the outset that no use or disclosure of protected health information may occur unless it is specifically permitted by the Rule.[299]  Therefore, opposite to the structure of the data-mining laws, the prohibitions are virtually limitless, while the allowable uses are distinctly limited.[300]  Generally, the Privacy Rule permits uses and disclosures that fall within two broad categories[301]: (1) those that are related to healthcare treatment, payment, and business operations of the covered entities[302] and (2) those that are related to public interest activities that are so critical to society’s well-being that Congress deemed they should not be hindered by medical privacy concerns.[303]  All nonpermitted uses must be authorized.[304]  While, like the data-mining laws, HIPAA earmarks marketing for special restrictions,[305] even those limitations are more broadly drawn in HIPAA, applying to all types of marketing, not just marketing of brand-name drugs by pharmaceutical manufacturers.[306]  This is quite different from Vermont’s prohibition applying solely to pharmacies’ and insurers’ sales of prescription information for drug marketing.[307]  In fact, the Sorrell Court itself pointed out the marked differences between the structure of Vermont’s data-mining law and the HIPAA Privacy Rule:

[T]he State might have advanced its asserted privacy interest by allowing the information’s sale or disclosure in only a few narrow and well-justified circumstances.  See, e.g., Health Insurance Portability and Accountability Act of 1996, 42 U.S.C. §1320d-2; 45 CFR pts. 160 and 164 (2010).  A statute of that type would present quite a different case than the one presented here.[308]


While several states found it necessary to pass laws prohibiting pharmacies from selling de-identified prescription information to data miners for use by drug manufacturers to market their brand-name drugs, a solid argument can be made that the HIPAA Privacy Rule already restricts such sales.  HIPAA prohibits covered entities, including pharmacies, from using protected health information for marketing purposes without the individuals’ authorization.  As a result, it appears that the Privacy Rule restricts pharmacies from even de-identifying protected health information for marketing purposes unless the affected individuals authorize such use.

The recent Sorrell holding, invalidating Vermont’s data-mining law on the ground that it violates the Free Speech Clause of the First Amendment, raises the question of whether the marketing provisions of the HIPAA Privacy Rule could be deemed invalid for similar reasons.  Both the data-mining laws and the HIPAA Privacy Rule restrict pharmacies from selling de-identified prescription information to data miners for marketing purposes.

However, it is evident that there are fundamental distinctions between the data-mining laws and HIPAA’s marketing restrictions.  The two laws protect different parties and are structured very differently.  Most significantly, the discriminatory intent and effect of the data-mining laws are largely absent in HIPAA.  These distinctions present a substantially different question regarding HIPAA from that considered in Sorrell and likely would yield a different answer.[309]

          *     Professor of Law, Albany Law School.  I would like to express my sincere gratitude to Robert Emery, who recently retired as Associate Director and Head of Reference from the Albany Law School Schaffer Law Library, for the outstanding research assistance he has given me over the years.  He has been my research “go-to” person ever since I came to Albany Law School as a student in 1984.  He provided invaluable research expertise throughout my fifteen years of private law practice in Albany and during my past eleven years as a professor at the school.  Each of the articles I have produced while at Albany Law bears his imprint.  I do not believe there is a finer, or more patient and helpful, research expert to be found than Bob Emery.
         [1].   131 S. Ct. 2653 (2011).
         [2].   Id. at 2659.
         [3].   See infra Part I.
         [4].   See infra Part I.
         [5].   See infra Part I.
         [6].   See infra Part II.
         [7].   See infra Part III.
         [8].   See infra Part IV.
         [9].   See infra Part V.
       [10].   See infra Part V.
       [11].   See infra Part V.
       [12].   See 45 C.F.R. pts. 160, 164 (2010).
       [13].   42 U.S.C. § 1320d-2 (Supp. IV 2011).  Hereinafter, the Privacy Rule will be referred to as “the Privacy Rule,” “the Rule,” or “HIPAA” interchangeably.
       [14].   See infra Parts VI.A–B.
       [15].   See infra Part VI.C.
       [16].   See infra Part VI.B–C.
       [17].   See infra Part VI.D.
       [18].   See infra Part VII.
       [19].   See infra Part VII.
       [20].   See infra Part VII.
       [21].   See infra Part VII.
       [22].   See infra Part VII.
       [23].   See infra Part I.
       [24].   See infra Part III.
       [25].   See infra Part IV.
       [26].   See infra Part V.
       [27].   See infra Part VI.
       [28].   See infra Part VII.
       [29].   See infra Part VII.
       [30].   See Marcia M. Boumil et al., Prescription Data Mining, Medical Privacy and the First Amendment: The U.S. Supreme Court in Sorrell v. IMS Health Inc., 21 Annals Health L. 447, 449–51 (2012) (describing the practices of data mining and detailing).
       [31].   See, e.g., Brief for the United States as Amicus Curiae Supporting Petitioners at 4–5, Sorrell v. IMS Health, Inc., 131 S. Ct. 2653 (2011) (No. 10-779) (stating that Vermont requires each pharmacy to maintain a “patient record system” that records the patient’s name, address, telephone number, age or date of birth, gender, name and strength of each drug prescribed, quantity, date received, prescription number, and name of the prescriber).
       [32].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“When filling prescriptions, pharmacies in Vermont collect information including the prescriber’s name and address, the name, dosage, and quantity of the drug, the date and place the prescription is filled, and the patient’s age and gender.”), aff’d, 131 S. Ct. 2653 (2011); IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (describing the “potpourri” of prescription information retained by pharmacies, including “the name of the patient, the identity of the prescribing physician, the drug, its dosage, and the quantity dispensed”), abrogated by Sorrell, 131 S. Ct. 2653.
       [33].   See, e.g., N.Y. Educ. Law § 6810(5) (McKinney 2010) (“Records of all prescriptions filled or refilled shall be maintained for a period of at least five years and upon request made available for inspection and copying by a representative of the department.  Such records shall indicate date of filling or refilling, doctor’s name, patient’s name and address and the name or initials of the pharmacist who prepared, compounded, or dispensed the prescription.  Records of prescriptions for controlled substances shall be maintained pursuant to requirements of article thirty-three of the public health law.”).
       [34].   See, e.g., Al Baker & Joseph Goldstein, Focus on Prescription Records Leads to Arrest in 4 Killings, N.Y. Times, June 23, 2011, at A18 (reporting arrests stemming from information derived from prescription records: “A prosecutor in the Office of the Special Narcotics Prosecutor for New York City re-examined prescription records that the office had in its possession, another law enforcement official said.  Those records are part of continuing long-term investigations into prescription drug diversion, the official said”); see also Questions and Answers for Practitioners Regarding the New Official Prescription Program, N.Y. St. Dep’t Health, (last visited Aug. 28, 2012) (discussing section 21 of the New York Public Health Law, requiring prescriptions written in New York to be issued on official New York State prescription forms, to “combat the growing problem of prescription fraud.  Official prescriptions contain security features specifically designed to prevent alterations and forgeries that divert drugs for sale on the black market.  Some of these contaminated drugs end up in patients’ medicine cabinets.  By preventing fraudulent claims, the law will also save New York’s Medicaid program and private insurers many millions of dollars every year”).
       [35].   IMS and Verispan were plaintiffs in the Vermont, Maine, and New Hampshire data-mining cases.  See Sorrell, 630 F.3d at 263; IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011); Ayotte, 550 F.3d at 42.
       [36].   Data miners have been described as “prescription drug information intermediaries that mine [purchase and process] specialized data.” Mills, 616 F.3d at 15–16.
       [37].   Sorrell, 630 F.3d at 267 (“The PI [prescriber-identifiable] data sold by the data-mining appellants is stripped of patient information, to protect patient privacy.”); Mills, 616 F.3d at 16 (stating that pharmacies’ computer software collects prescription data, encrypts the patient identifiers so that patients cannot be identified by name, and sends the information to the data miners who have purchased the information);Ayotte, 550 F.3d at 45 (stating that patients’ names are encrypted, “effectively eliminating the ability to match particular prescriptions with particular patients”).
       [38].   Ayotte, 550 F.3d at 45 (stating that “[t]he scope of the [data-mining] enterprise is mind-boggling” and noting that IMS and Verispan organize several billion prescriptions each year).
       [39].   Id.
       [40].   Mills, 616 F.3d at 16 (“They [data miners] assemble a complete picture of individual prescribers’ prescribing histories by cross-referencing prescriber names with publicly available databases, including the AMA’s database of medical doctors’ specialties.”); Ayotte, 550 F.3d at 45 (“[Data miners] group [the data] by prescriber, and cross-reference each physician’s prescribing history with physician-specific information available through the American Medical Association.”).
       [41].   Sorrell, 630 F.3d at 267 (“‘Detailing’ refers to visits by pharmaceutical representatives, called detailers, to individual physicians to provide information on specific prescription drugs.”); Ayotte, 550 F.3d at 46 (“Detailing involves tailored one-on-one visits by pharmaceutical sales representatives with physicians and their staffs.”).
       [42].   Sorrell, 630 F.3d at 267 (explaining that detailers provide information to physicians “including the use, side effects, and risks of drug interactions”); Mills, 616 F.3d at 14 (stating that detailers distribute “promotional materials and pamphlets about the different conditions their particular products can be used to treat”); Ayotte, 550 F.3d at 46 (“The detailer comes to the physician’s office armed with handouts and offers to educate the physician and his staff about the latest pharmacological developments . . . [thereby] holding out the promise of a convenient and efficient means for receiving practice-related updates.”).  The Maine drug prescription data-mining law defines “‘detailing’ as ‘one-to-one contact with a prescriber or employees or agents of a prescriber for the purpose of increasing or reinforcing the prescribing of a certain drug by the prescriber.’”  See Mills, 616 F.3d at 14 (citing Me. Rev. Stat. tit. 22, § 1711-E(1)(A-2) (2005)).
       [43].   Mills, 616 F.3d at 14 (“Prescriber-identifying data is a valuable tool in a detailer’s arsenal of sales techniques.”).
       [44].   Sorrell, 630 F.3d at 267 (“Pharmaceutical manufacturers use [the mined] data to identify audiences for their marketing efforts, to focus marketing messages for individual prescribers, [and] to direct scientific and safety messages to physicians most in need of that information.”); Mills, 616 F.3d at 14 (“With [data-mining reports], pharmaceutical manufacturers can pinpoint the prescribing habits of individual prescribers in a region and target prescribers who might be persuaded to switch brands or prescribe more of a detailer’s brand of products.”); Ayotte, 550 F.3d at 44–45 (explaining that data-mining reports enable “detailers . . . to target particular physicians and shape their sales pitches accordingly”).
       [45].   Ayotte, 550 F.3d at 47; see also Mills, 616 F.3d at 14 (“Detailers use prescriber-identifying data to [market their drugs] more effectively; every sales pitch can be tailored to what the detailer knows of the prescriber based on her prescribing history.”).
       [46].   Mills, 616 F.3d at 14 n.3.
       [47].   Id. at 14; see also Sorrell, 630 F.3d at 267 (“[P]harmaceutical industry spending on detailing has increased exponentially along with the rise of data mining.”).
       [48].   Ayotte, 550 F.3d at 46.
       [49].   Mills, 616 F.3d at 14 (“[Pharmaceutical manufacturers] have some 90,000 pharmaceutical sales representatives make weekly or monthly one-on-one visits to prescribers nationwide.”).  Data mining is lucrative for the miners as well.  IMS alone reported revenues of $1.75 billion in 2005.  Id. at 16.
       [50].   Id. at 14 (“[D]etailers distribute upwards of $1 million worth of free product samples per year.”); Ayotte, 550 F.3d at 46 (“[D]etailers typically distribute an array of small gifts to physicians and their staffs. . . . [I]n the year 2000, an estimated $1,000,000,000 in free drug samples flowed from detailers to physicians.”).
       [51].   Mills, 616 F.3d at 14 (“A single prescriber is visited by an average of twenty-eight detailers a week; an average of fourteen detailers a week call on a single specialist.”); Ayotte, 550 F.3d at 47 (“[T]he average primary care physician interacts with no fewer than twenty-eight detailers each week and the average specialist interacts with fourteen.”).
       [52].   Sorrell, 630 F.3d at 268 (“[W]hile a brand-name drug is not necessarily better than its generic version, the brand-name drug is typically more expensive.”).
       [53].   Ayotte, 550 F.3d at 46 (“[Detailing] is time-consuming and expensive work, not suited to the marketing of lower-priced bioequivalent generic drugs.”).  Generic drugs are described as “drugs that are pharmacologically indistinguishable from their brand-name counterparts save for potential differences in rates of absorption.”  Id.
       [54].   Id. (“[D]etailing is employed where a manufacturer seeks to encourage prescription of a patented brand-name drug as against generic drugs, or as against a competitor’s patented brand-name drug, or as a means of maintaining a physician’s brand loyalty after its patent on a brand-name drug has expired.”).
       [55].   See Boumil et al., supra note 30, at 450–53 (describing criticisms of data mining and detailing).
       [56].   See, e.g.Ayotte, 550 F.3d at 56–57 (discussing the effectiveness of data mining as a marketing tool by detailers).
       [57].   Id. at 56.
       [58].   Id.
       [59].   Id.  Indeed, promotional literature from IMS marketed its data reports for efficacy in detailing.  Id.
       [60].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 270 (2d Cir. 2010) (internal quotation marks omitted), aff’d, 131 S. Ct. 2653 (2011); see alsoAyotte, 550 F.3d at 57 (discussing a study finding that eleven percent of detailers’ statements to physicians were “demonstrably inaccurate” (citing Michael G. Ziegler et al., The Accuracy of Drug Information from Pharmaceutical Sales Representatives, 273 J. Am. Med. Ass’n 1296, 1297 (1995))).
       [61].   Sorrell, 630 F.3d at 270 (internal quotation marks omitted).
       [62].   Id.
       [63].   Ayotte, 550 F.3d at 47.
       [64].   Id. at 56 (stating that the “common sense conclusion[]” is that “detailing substantially increases physicians’ rates of prescribing brand-name drugs”).
       [65].   Id.
       [66].   Id. at 57–58.
       [67].   Id. at 58.
       [68].   IMS Health Inc. v. Mills, 616 F.3d 7, 15 (1st Cir. 2010) (quoting Robert A. Musacchio & Robert J. Hunkler, More Than a Game of Keep-Away, Pharmaceutical Executive, May 2006, at 150) (“[P]hysicians ‘complain bitterly’ about detailers ‘who wave data in their faces’ and challenge them with their own prescribing stories when they fail to prescribe more of the product the detailer has been advertising.”), vacated, IMS Health, Inc. v. Schneider, 131 S. Ct. 3091 (2011); see also Boumil et al.,supra note 30, at 452–53 (describing doctors’ dissatisfaction with detailing).
       [69].   Mills, 616 F.3d at 15.
       [70].   See generally Boumil et al., supra note 30, at 453–57 (describing states’ legislative responses to pharmacy data mining).
       [71].   Ayotte, 550 F.3d at 47.
       [72].   N.H. Rev. Stat. Ann. § 318:47-f (2011); Boumil et al., supra note 30, at 453 (explaining New Hampshire was the first state to enact legislation to limit the use of prescription information for commercial or marketing purposes, followed closely by Vermont and Maine).
       [73].   In relevant part, the statute provides:Records relative to prescription information containing patient-identifiable and prescriber-identifiable data shall not be licensed, transferred, used, or sold by any pharmacy benefits manager, insurance company, electronic transmission intermediary, retail, mail order, or Internet pharmacy or other similar entity, for any commercial purpose, except for the limited purposes of pharmacy reimbursement; formulary compliance; care management; utilization review by a health care provider, the patient’s insurance provider or the agent of either; health care research; or as otherwise provided by law.  Commercial purpose includes, but is not limited to, advertising, marketing, promotion, or any activity that could be used to influence sales or market share of a pharmaceutical product, influence or evaluate the prescribing behavior of an individual health care professional, or evaluate the effectiveness of a professional pharmaceutical detailing sales force.§ 318:47-f.
       [74].   Ayotte, 550 F.3d at 47 (quoting § 318:47-f).
       [75].   Id.
       [76].   Id.
       [77].   Vt. Stat. Ann. tit. 18, § 4631(a) (2011); IMS Health Inc. v. Sorrell, 630 F.3d 263, 269 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
       [78].   Sorrell, 630 F.3d at 269 (quoting tit. 18, § 4631(a) (2011)).
       [79].   Id.see Boumil et al., supra note 30, at 455–56 (explaining that the Vermont “opt-in” approach differed from approaches used by other states).  The statute reads in relevant part as follows:A health insurer, a self-insured employer, an electronic transmission intermediary, a pharmacy, or other similar entity shall not sell, license, or exchange for value regulated records containing prescriber-identifiable information, nor permit the use of regulated records containing prescriber-identifiable information for marketing or promoting a prescription drug, unless the prescriber consents as provided in subsection (c) of this section.  Pharmaceutical manufacturers and pharmaceutical marketers shall not use prescriber-identifiable information for marketing or promoting a prescription drug unless the prescriber consents as provided in subsection (c) of this section.tit. 18, § 4631(d).
       [80].   Sorrell, 630 F.3d at 269–70.
       [81].   Id. at 270 (quoting tit. 18, § 4631(b)(5)) (“The law defines ‘marketing’ to include ‘advertising, promotion, or any activity that is intended to be used or is used to influence sales or the market share of a prescription drug, influence or evaluate the prescribing behavior of an individual health care professional to promote a prescription drug, market prescription drugs to patients, or to evaluate the effectiveness of a professional pharmaceutical detailing sales force.’”).
       [82].   Id. at 270 (citing tit. 18, § 4631(e)(1)–(7)) (“The statute expressly permits the sale, transfer, or use of PI [prescriber-identifiable] data for multiple other purposes, including the limited purposes of pharmacy reimbursement; prescription drug formulary compliance; patient care management; utilization review by a health care professional, the patient’s health insurer, or the agent of either; health care research; dispensing prescription medications; the transmission of prescription data from prescriber to pharmacy; care management; educational communications provided to a patient, including treatment options, recall or safety notices, or clinical trials; and for certain law enforcement purposes as otherwise authorized by law.”).
       [83].   Id. at 271 n.3 (citing Vt. Stat. Ann. tit. 33, §§ 2004, 2466a (2011)).
       [84].   IMS Health Inc. v. Mills, 616 F.3d 7, 17 (1st Cir. 2010) (citing Me. Rev. Stat. tit. 22, § 1711-E(1-A) (2010)) (internal quotation marks omitted),vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
       [85].   The statute defines marketing to include “‘advertising, publicizing, promoting or selling a prescription drug;’ ‘activities undertaken for the purpose of influencing the market share of a prescription drug or the prescribing patterns of a prescriber, a detailing visit or a personal appearance;’ ‘[a]ctivities undertaken to evaluate or improve the effectiveness of a professional detailing sales force;’ or ‘[a] brochure, media advertisement or announcement, poster or free sample of a prescription drug.’”  Id. at 16 n.6 (quoting tit. 22, § 1711-E(1)(F-1)).
       [86].   Id. at 16 (citing tit. 22, § 1711-E(2-A)) (“[A] carrier, pharmacy or prescription drug information intermediary . . . may not license, use, sell, transfer, or exchange for value, for any marketing purpose, prescription drug information that identifies a prescriber who has filed for confidentiality protection.”).
       [87].   Id. at 16 n.6 (quoting tit. 22, § 1711-E(1)(F-1)) (“‘Marketing’ does not include pharmacy reimbursement, formulary compliance, pharmacy file transfers in response to a patient request or as a result of the sale or purchase of a pharmacy, patient care management, utilization review by a health care provider or agent of a health care provider or the patient’s health plan or an agent of the patient’s health plan, and health care research.”).
       [88].   Tom Ramstack, Drug Companies Seek Supreme Court Permission for “Data Mining,” (Apr. 26, 2011, 11:41 AM), (“Twenty-five states are considering similar laws[.]”); James Vicini, Supreme Court Strikes Down State Drug Data-Mining Law, Reuters (June 23, 2011, 1:48 PM), (“[S]imilar measures have been proposed in about 25 states in the last three years[.]”).
       [89].   The data miners include IMS, Verispan, and Source Healthcare Analytics, Inc.  See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 269 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
       [90].   The association was Pharmaceutical Research and Manufacturers of America.  See,
       [91].   See IMS Health Inc. v. Sorrell, 631 F. Supp. 2d 434, 440 (D. Vt. 2009), rev’d, 630 F.3d 263 (2d Cir. 2010); IMS Health Corp. v. Rowe, No. CV-07-127-B-W, 2007 U.S. Dist. LEXIS 94268, at *27 (D. Me. Dec. 21, 2007), rev’d, IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010), vacated, IMS Health, Inc. v. Schneider, 131 S. Ct. 3091 (2011); IMS Health Inc. v. Ayotte, 490 F. Supp. 2d 163, 174 (D.N.H. 2007), rev’d and vacated, 550 F.3d 42 (1st Cir. 2008), abrogated by Sorrell, 131 S. Ct. 2653.
       [92].   See Sorrell, 630 F.3d at 266; Mills, 616 F.3d at 13; Ayotte, 550 F.3d at 47–48.
       [93].   Ayotte, 550 F.3d at 42.
       [94].   Id. at 45.
       [95].   Id.
       [96].   The First Circuit described the Central Hudson test as follows:

                 Under Central Hudson—so long as the speech in question concerns an otherwise lawful activity and is not misleading—statutory regulation of that speech is constitutionally permissible only if the statute is enacted in the service of a substantial governmental interest, directly advances that interest, and restricts speech no more than is necessary to further that interest.

Id. at 55 (citing Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 566 (1980)).
       [97].   Id. (“Fiscal problems have caused entire civilizations to crumble, so cost containment is most assuredly a substantial governmental interest.”).
       [98].   Id. at 56–57 (discussing evidence showing that “prescribing histories made detailing more efficacious”).
       [99].   Id. at 56 (finding that it was a “‘common-sense conclusion[]’” that detailing increases the prescriptions of brand-name drugs).
      [100].   Id. at 58 (“[W]hile a state legislature does not have unfettered discretion ‘to suppress truthful, nonmisleading information for paternalistic purposes’ . . . there is in this area ‘some room for the exercise of legislative judgment.’”) (internal quotation marks omitted) (citation omitted).
     [101].   Id. at 60–61 (ruling that the voidness claim “need not detain us,” as it was “sufficiently clear to withstand the plaintiffs’ vagueness challenge”).
     [102].   Id. at 64 (ruling that the plaintiffs’ Commerce Clause argument was unavailing, as the court was “confident that the New Hampshire Supreme Court would interpret the Prescription Information Law to affect only domestic transactions”).
     [103].   616 F.3d 7 (1st Cir. 2010), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [104].   See id. at 13.
     [105].   Id. at 18–19 (“Plaintiffs’ [First Amendment] claims fail for the same reasons we rejected their nearly identical First Amendment challenge to New Hampshire’s similar statute in Ayotte. . . . Even assuming arguendo that the Maine law restricts protected commercial speech and not conduct, we hold that it directly advances the substantial purpose of protecting opted-in prescribers from having their identifying data used in unwanted solicitations by detailers, and thus Maine’s interests in lowering health care costs.”).
     [106].   Id. at 23 (“Even if there were possible ambiguity in [the statute’s] terms, the law is still not void for vagueness . . . [as it] surely provides enough of a benchmark to satisfy due process.”).
     [107].   Id. at 24–25 (“[T]he statute applies to plaintiffs’ out-of-state use or sale of opted-in Maine prescribers’ identifying data and that the statute does so constitutionally. . . . Plaintiffs have not shown any disproportionate burden on interstate commerce, and the law creates substantial in-state benefits for those Maine prescribers who have affirmatively asked Maine to protect their identifying data and for Maine in its efforts to lower health care costs.”).
     [108].   630 F.3d 263 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
      [109].   The Second Circuit described the Central Hudson test as follows:

                 [T]he government may regulate commercial speech when (1) “the communication is neither misleading nor related to unlawful activity;” (2) the government “assert[s] a substantial interest to be achieved” by the regulation; (3) the restriction “must directly advance the state interest;” and finally (4) “if the governmental interest could be served as well by a more limited restriction on commercial speech, the excessive restrictions cannot survive.”

Id. at 275 (quoting Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 564 (1980)).
     [110].   Id. at 276 (“[W]e agree with the district court that Vermont does have a substantial interest in both lowering health care costs and protecting public health.  However, the state’s asserted interest in ‘medical privacy’ is too speculative to satisfy the second prong of Central Hudson.”).
     [111].   Id. at 277 (“The Vermont statute cannot be said to advance the state’s interests in public health and reducing costs in a direct and material way.”).
     [112].   Id.
     [113].   Id. at 280.
     [114].   Id.
     [115].   Id. at 282.
     [116].   131 S. Ct. 2653 (2011).
      [117].   Id. at 2658–59.  Justice Kennedy delivered the opinion of the Court, with Chief Justice Roberts and Justices Scalia, Thomas, Alito, and Sotomayor joining.  Justice Breyer filed a dissenting opinion, in which Justices Ginsburg and Kagan joined.
     [118].   Id. at 2659.
     [119].   Id. (“Vermont argues that its prohibitions safeguard medical privacy and diminish the likelihood that marketing will lead to prescription decisions not in the best interests of patients or the State.  It can be assumed that these interests are significant.”).  The Court noted, however, that, at oral argument, the State declined to affirm that its purpose in enacting the law was to discourage detailing and influence drug prescribing.  Id. at 2670.  The Court concluded that “[t]he State’s reluctance to embrace its own legislature’s rationale reflects the vulnerability of its position.”  Id.  Nevertheless, the Court held that “[t]he text of § 4631(d), associated legislative findings, and the record developed in the District Court establish that Vermont enacted its law” to inhibit drug marketing schemes that increase the prescriptions for expensive brand-name drugs.  Id. at 2672.
     [120].   Id. at 2663 (“On its face, Vermont’s law enacts content- and speaker-based restrictions on the sale, disclosure, and use of prescriber-identifying information.”).
     [121].   Id. at 2659 (“Vermont’s statute must be subjected to heightened judicial scrutiny.  The law cannot satisfy that standard.”).
     [122].   Id. at 2656 (“The statute thus disfavors marketing, i.e., speech with a particular content.”); see also id. at 2668 (“Under Vermont’s law, pharmacies may share prescriber-identifying information with anyone for any reason save one: They must not allow the information to be used for marketing.”).
     [123].   Id. at 2663 (“[T]he statute disfavors specific speakers, namely pharmaceutical manufacturers. . . . Detailers are . . . barred from using the information for marketing, even though the information may be used by a wide range of other speakers.”).
     [124].   Id. at 2664 (“Act 80 [Vermont’s data-mining law] is designed to impose a specific, content-based burden on protected expression.  It follows that heightened judicial scrutiny is warranted.” (citation omitted)).
     [125].   Id. at 2666 (“The State also contends that heightened judicial scrutiny is unwarranted in this case because sales, transfer, and use of prescriber-identifying information are conduct, not speech.”).
     [126].   Id. at 2667.
     [127].   Id. at 2668 (citation omitted).
     [128].   See id. at 2668–72.
     [129].   Id. at 2668 (“The explicit structure of the statute allows the information to be studied and used by all but a narrow class of disfavored speakers.”).
     [130].   Id. at 2669.
     [131].   Id. at 2663 (“[I]t appears that Vermont could supply academic organizations with prescriber-identifying information to use in countering the messages of brand-name pharmaceutical manufacturers and in promoting the prescription of generic drugs.”); see also id. at 2660–61 (discussing the counter-detailing provisions in the Vermont law).
     [132].   See id. at 2670–71.
     [133].   Id. at 2669 (“Physicians can, and often do, simply decline to meet with detailers, including detailers who use prescriber-identifying information.”).
     [134].   Id. at 2670.
     [135].   Id.
     [136].   Id. at 2667–68 (citing, inter alia, Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 566 (1980)) (“[T]he State must show at least that the statute directly advances a substantial governmental interest and that the measure is drawn to achieve that interest.”).
     [137].   Id. at 2670.
     [138].   Id. at 2671 (characterizing physicians as “‘sophisticated and experienced’ consumers” (citation omitted)).
     [139].   Id. at 2670–71 (“The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers. . . . [T]he ‘fear that people would make bad decisions if given truthful information’ cannot justify content-based burdens on speech . . . when the audience, in this case prescribing physicians, consists of ‘sophisticated and experienced’ consumers.” (citations omitted)).
     [140].   Id. at 2672 (“[T]he State cannot engage in content-based discrimination to advance its own side of a debate.”).
     [141].   Id. (“If Vermont’s statute provided that prescriber-identifying information could not be sold or disclosed except in narrow circumstances then the State might have a stronger position.”); see also id. at 2668 (“[T]he State might have advanced its asserted privacy interest by allowing the information’s sale or disclosure in only a few narrow and well-justified circumstances.” (citations omitted)).
     [142].   Id. at 2672 (“[T]he State itself can use the information to counter the speech it seeks to suppress.”).
     [143].   Id.
     [144].   Id.
     [145].   IMS Health, Inc. v. Schneider, 131 S. Ct. 3091, 3091 (2011).
     [146].   IMS Health Inc. v. Ayotte, No. 06-cv-280-PB, 2011 U.S. Dist. LEXIS 116595, at *2–3 (D.N.H. Oct. 7, 2011) (“The parties agree that the Supreme Court’s recent decision in Sorrell v. IMS Health, Inc., 131 S. Ct. 2653, 180 L. Ed. 2d 544 (2011) requires ‘invalidation of N.H. Rev. Stat. Ann. §§ 318:47-f and 318-B:12 to the extent that they prohibit the transfer, use, sale, or licensing of prescriber-identifiable data.’  Accordingly, they have asked me to reinstate the court’s May 7, 2007 judgment for the plaintiffs.  I have reviewed Sorrell and agree that it requires the invalidation of the above-referenced statutes because they improperly restrict speech protected by the First Amendment.”).
     [147].   45 C.F.R. pts. 160, 164 (2010).
     [148].   See, e.g., 45 C.F.R. § 164.502(a) (2011) (regulating covered entities’ use and disclosure of protected health information); see also id. § 160.103 (defining “protected health information” to mean “individually identifiable health information . . . that is: (i) Transmitted by electronic media; (ii) Maintained in electronic media; or (iii) Transmitted or maintained in any other form or medium”).
     [149].   See 42 U.S.C. §§ 1320d(5), 1320d-1(a) (2006) (applying the Act to most health plans, healthcare providers, and other covered entities).  The Rule’s definition of a “covered entity” includes, inter alia, “[a] health plan” and “[a] health care provider who transmits any health information in electronic form in connection with a transaction covered by this subchapter.”  45 C.F.R. § 160.103.
     [150].   45 C.F.R. § 160.103.
     [151].   Id. § 164.502(a).
     [152].   Id. § 160.103.
     [153].   Id.
     [154].   See 42 U.S.C. § 1320d(4)(B) (defining “health information” as any information that “relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual”).
     [155].   See id. § 1320d(6) (defining “individually identifiable health information” as “any information, including demographic information collected from an individual, that . . . relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual, and . . . (i) identifies the individual; or (ii) with respect to which there is a reasonable basis to believe that the information can be used to identify the individual”); see also 45 C.F.R. § 160.103 (defining “[i]ndividually identifiable health information” as “information that is a subset of health information, including demographic information collected from an individual, and: (1) Is created or received by a health care provider, health plan, employer, or health care clearinghouse; and (2) Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual; and (i) That identifies the individual; or (ii) With respect to which there is a reasonable basis to believe the information can be used to identify the individual”).
     [156].   See 45 C.F.R. § 164.514(b)(2)(i) (listing the elements of health information that must be removed to de-identify the information).
     [157].   Id. § 164.514(b)(2)(i)(R) (requiring the removal of “any other unique identifying number, characteristic, or code” for de-identification of protected health information).
     [158].   Id. § 164.502(a).
     [159].   An “individual” is a “person who is the subject of protected health information.”  Id. § 160.103.
     [160].   Id. § 164.502(a)(2)(i).
     [161].   Id. § 164.502(a)(2)(ii).
     [162].   Id. § 164.502(a)(1) (providing a number of permitted uses and disclosures under HIPAA).
      [163].   In addition to the two broad categories of permissive uses described herein, there are also several minor categories of permissive uses.  Covered entities are permitted to disclose protected health information to the individual (who is the subject of the information) even when the individual does not specifically request disclosure.  See id. § 164.502(a)(1)(i).  Covered entities are permitted to inadvertently disclose protected health information when the disclosure occurs during another required or permitted use or disclosure (an “incident to” disclosure).  See id. § 164.502(a)(1)(iii).  Finally, once covered entities have obtained the agreement of the individual, they are permitted to use and disclose protected health information to list the individual as a patient in a healthcare facility directory, to inform the individual’s visitors and members of the clergy that the individual is a patient in the facility, and to disclose protected health information to family and friends of the individual who are involved in the individual’s care or payment.  See id. §§ 164.502(a)(1)(v), 164.510.
     [164].   Id. § 164.502(a)(1)(ii).
     [165].   Id. § 164.501.
     [166].   Id.
     [167].   Id.
     [168].   See id. § 164.512.
     [169].   Id. (permitting covered entities to disclose protected health information “without the written authorization of the individual . . . or the opportunity for the individual to agree or object” for disclosures that are (a) required by law, (b) for public health activities, (c) about victims of abuse, neglect, or domestic violence, (d) for health oversight activities, (e) for judicial and administrative proceedings, (f) for law enforcement purposes, (g) about decedents, (h) for cadaveric organ, eye, or tissue donation purposes, (i) for research purposes, (j) to avert a serious threat to health or safety, (k) for specialized government functions, and (l) for workers’ compensation).
     [170].   Id. § 164.502(a).
     [171].   Id. § 164.502(a)(1)(iv) (allowing covered entities to disclose protected health information “[p]ursuant to and in compliance with a valid authorization”).  The subsection of the Privacy Rule that describes the twelve permitted public interest activities specifically provides that “[a] covered entity may use or disclose protected health information without the written authorization of the individual.”  Id. § 164.512.
     [172].   See, e.g., Prot. & Advocacy Sys., Inc. v. Freudenthal, 412 F. Supp. 2d 1211, 1220 (D. Wyo. 2006) (“The primary purpose of HIPAA’s Privacy Rule is to safeguard the privacy of medical protected health information.”).
     [173].   45 C.F.R. § 164.524(a)(1).
     [174].   Id. § 164.524(a)(3).
     [175].   Id. § 164.526(a)(1).
     [176].   Id. § 164.526(d)(2).
     [177].   Id. § 164.528(a)(1) (providing individuals a right “to receive an accounting of disclosures of protected health information made by a covered entity in the six years prior to the date on which the accounting is requested”).
     [178].   Id. § 164.522(a)(1)(i)(A)–(B) (granting individuals a right to request that the covered entity restrict uses or disclosures related to “treatment, payment, or health care operations” or disclosures to which individuals have a right to agree or object under 45 C.F.R. § 164.510(b)).  Covered entities need not comply with all of these requests.  See id. § 164.522(a)(1)(ii) (“A covered entity is not required to agree to a restriction.”).  However, where the request relates to health care operations and not treatment, and the protected health information pertains solely to a health care item or service for which the provider has already been fully reimbursed, then the covered entity must comply with the request.  See 42 U.S.C. § 17935(a) (Supp. IV 2010).
     [179].   45 C.F.R. § 164.522(b)(1).
     [180].   Id. § 164.522(b)(1)(i) (requiring providers to “accommodate reasonable requests”).
     [181].   Id. § 164.522(b)(1)(ii).
     [182].   Id. § 164.510(a)–(b) (giving an individual the right to agree or object before a covered entity lists the individual’s name in a facility directory, gives information to the individual’s visitors or members of the clergy, or discloses information to friends or family members who are concerned with the individual’s treatment or payment).
     [183].   Id. § 164.502(a)(1)(iv).
     [184].   Id. § 164.520(a)(1).
     [185].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“The [prescription] data sold by the data-mining appellants is stripped of patient information, to protect patient privacy.”), aff’d, 131 S. Ct. 2653 (2011).
      [186].   Id.; see also, e.g., IMS Health Inc. v. Mills, 616 F.3d 7, 16 (1st Cir. 2010) (“The [pharmacies’] software encrypts patient-identifying data so that plaintiffs cannot identify individual patients by name . . . .”), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011); IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (“To protect patient privacy, prescribees’ names are encrypted, effectively eliminating the ability to match particular prescriptions with particular patients.”), abrogated by Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011).
     [187].   45 C.F.R. § 164.502(d)(2) (“The requirements of this subpart do not apply to information that has been de-identified in accordance with the applicable requirements of § 164.514 . . . .”).
     [188].   See id. § 164.502(d)(2) (“Health information that meets the standard and implementation specifications for de-identification under § 164.514(a) and (b) is considered not to be individually identifiable health information, i.e., de-identified.”); id. § 164.514(b)(2)(i)(A)–(R) (stating the identifiers that must be removed from protected health information for de-identification).
     [189].   Id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information . . . whether or not the de-identified information is to be used by the covered entity.”).
     [190].   Id. § 164.502(a)(1)(ii) (listing “health care operations” as a permitted use).
     [191].   Id. § 164.501 (“Health care operations means any of the following activities of the covered entity to the extent that the activities are related to covered functions: . . . (6) Business management and general administrative activities of the entity, including, but not limited to: . . . (v) Consistent with the applicable requirements of § 164.514, creating de-identified health information or a limited data set, and fundraising for the benefit of the covered entity.”).  The term “covered function” is not explicitly defined in HIPAA, but presumably refers to the treatment, payment, and health care operations functions of covered entities.  See id. § 164.502(a)(1)(ii) (permitting a covered entity to use or disclose protected health information for “treatment, payment, or health care operations”).
      [192].   See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”), aff’d, 131 S. Ct. 2653 (2011).
      [193].   See, e.g., Sorrell, 131 S. Ct. at 2660 (“Detailers, who represent the [drug] manufacturers . . . use the [data-mining] reports to refine their marketing tactics and increase sales.”).
     [194].   See generally 45 C.F.R. § 164.508(a)(3).
     [195].   Id. § 164.501(1).
     [196].   See id. § 164.501(1)(i)–(iii) (describing exclusions from the definition of marketing).
     [197].   See id. § 164.501(1)(i) (defining “marketing” as “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service,” excluding communications made for the purpose of describing an individual’s benefits in a health plan or relating to the individual’s treatment or case management); id. § 164.501(2) (defining “marketing” to include “[a]n arrangement between a covered entity and any other entity whereby the covered entity discloses protected health information to the other entity, in exchange for direct or indirect remuneration, for the other entity or its affiliate to make a communication about its own product or service that encourages recipients of the communication to purchase or use that product or service”).
     [198].   See id. § 164.508(a)(3).
     [199].   Subpart E of the Privacy Rule encompasses 45 C.F.R. §§ 164.500–164.534.  For Subpart E’s table of contents, see id. § 164.102.
     [200].   The transition provisions in 45 C.F.R. 164.532 refer to the effect of authorizations and contracts that existed prior to the effective date of the Privacy Rule.  For example, authorizations executed prior to HIPAA are deemed to be effective post-HIPAA as long as the authorization specifically permits the use or disclosure and there is no agreement between the covered entity and the individual restricting the use or disclosure.  See id. § 164.532 (“Effect of prior authorization for purposes other than research.  Notwithstanding any provisions in § 164.508, a covered entity may use or disclose protected health information that it created or received prior to the applicable compliance date of this subpart pursuant to an authorization or other express legal permission obtained from an individual prior to the applicable compliance date of this subpart, provided that the authorization or other express legal permission specifically permits such use or disclosure and there is no agreed-to restriction in accordance with § 164.522(a).”).
     [201].   Id. § 164.508(a)(3)(i).
     [202].   Id. § 164.508(a)(3)(i)(A).
     [203].   Id. § 164.508(a)(3)(i)(B).
     [204].   See generally id. § 164.508(a)(3)(i).
     [205].   See id.
     [206].   Id. § 160.103.  It should be noted that not every provider is a covered entity under HIPAA.  The Privacy Rule provides that a covered entity includes only those providers “who transmit[] any health information in electronic form in connection with a transaction covered by this subchapter.”  Id.  However, because virtually all pharmacies currently send health care claims and other covered transactions electronically, they qualify as covered entities under HIPAA.
     [207].   See id.
     [208].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“When filling prescriptions, pharmacies in Vermont collect information including the prescriber’s name and address, the name, dosage, and quantity of the drug, the date and place the prescription is filled, and the patient’s age and gender.”), aff’d, 131 S. Ct. 2653 (2011).
     [209].   See 45 C.F.R. § 164.508(a)(3)(i) (imposing requirements on uses and disclosures of protected health information “for marketing”).
      [210].   See, e.g., Sorrell, 630 F.3d at 267 (“Pharmacies sell this PI [prescriber-identifiable] data to the data mining appellants. . . . These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”).
      [211].   See, e.g., Sorrell, 131 S. Ct. at 2660 (“Detailers, who represent the [drug] manufacturers . . . use the [data-mining] reports to refine their marketing tactics and increase sales.”).
     [212].   45 C.F.R. § 164.501.  This marketing definition does not specify who must make the communication, or who must be the recipient of the communication.  Therefore, on its face, the definition does not require the covered entity making the use or disclosure to be either the communicator or the marketer, or that the recipient of the communication be the individual whose protected health information is being used or disclosed.  However, later additions to the Privacy Rule, enacted by Congress on February 17, 2009, appear to equate “recipient” with “individual.”  American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 13406, 123 Stat. 115, 266–70 (2010).  In ARRA provisions relating to marketing, the law states that “the covered entity making such communication obtains from the recipient of the communication . . . a valid authorization . . . with respect to such communication.”  42 U.S.C. § 17936(a)(2)(B)(ii) (Supp. IV 2010).  In the context of the Privacy Rule, authorizations are obtained only from individuals. See 45 C.F.R. § 164.508(c)(1)(vi) (requiring an authorization to be signed by the “individual”).  Further, in proposed rules to implement ARRA, the HHS also appears to assume that the recipient of marketing communications is the individual.  See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,884 (July 14, 2010) (to be codified at 45 C.F.R. pt. 160) (“The Privacy Rule requires covered entities to obtain a valid authorization from individuals before using or disclosing protected health information to market a product or service to them.” (emphasis added) (citation omitted)).  Nevertheless, HIPAA does not explicitly state that the recipient of a marketing communication must be the individual.  See 45 C.F.R. 
§ 164.501 (defining marketing as “mak[ing] a communication about a product or service that encourages recipients of the communication to purchase or use the product or service”); id. § 164.508(a)(3) (providing that “a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing”).  Moreover, the disclosures of prescription information described in Sorrell ultimately resulted in sales of brand-name drugs to individuals whose privacy information may have been used to market the drugs.  Even if this were not the case, there is nothing unreasonable about reading the Privacy Rule precisely as it is written—requiring individuals to authorize any use of their protected health information to sell items or services, no matter the product, no matter the seller, and no matter the buyer.
     [213].   See, e.g., IMS Health Inc. v. Mills, 616 F.3d 7, 16 (1st Cir. 2010) (stating that pharmacies’ computer software collects prescription data, encrypts the patient identifiers so that patients cannot be identified by name, and sends the information to the data miners who have purchased the information), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [214].   See supra Part VI.B.
     [215].   See 45 C.F.R. § 164.502(d) (permitting a covered entity to use protected health information to create de-identified information, and providing that the Privacy Rule does not apply to de-identified information).
     [216].   See id. § 160.103 (defining “protected health information” to mean “individually identifiable health information”).
     [217].   See id. § 164.502(d)(2) (providing that the Privacy Rule does not apply to de-identified information).
     [218].   See id. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [219].   See id. § 160.103 (broadly defining “use” of protected health information as “the sharing, employment, application, utilization, examination, or analysis of such information within an entity that maintains such information”).
     [220].   See id. § 164.502(d)(1) (permitting a covered entity to “use protected health information to create information that is not individually identifiable health information”).
     [221].   See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“Pharmacies sell this PI [prescriber-identifiable] data to the data mining appellants. . . . These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”), aff’d, 131 S. Ct. 2653 (2011).
     [222].   See 45 C.F.R. § 160.103 (defining “covered entity” to include “[a] health care provider who transmits any health information in electronic form in connection with a transaction covered by this subchapter”).  A “health care provider” is defined as “a provider of medical or health services . . . and any other person or organization who furnishes, bills, or is paid for health care in the normal course of business.”  Id.
     [223].   See id. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [224].   Id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information . . . whether or not the de-identified information is to be used by the covered entity.”).
     [225].   Id. § 164.508(a)(3)(i).
      [226].   See id.  There is no reason to believe that the HHS, the drafter of the Privacy Rule, meant anything by this “notwithstanding” language other than what the language unambiguously states.  The Agency used similar language in another provision of the Privacy Rule to require an authorization before any use or disclosure of psychotherapy notes, subject to limited exceptions.  See id. § 164.508(a)(2).  However, where the Agency intended a more limited impact of its use of the term “notwithstanding,” it clearly restricted its reach to particular provisions within the Privacy Rule.  See id. § 164.502(g)(3)(ii) (“Notwithstanding the provisions of paragraph (g)(3)(i) of this section”); id. § 164.502(g)(5) (“Notwithstanding a State law or any requirement of this paragraph to the contrary”); id. § 164.532(b) (“Notwithstanding any provisions in § 164.508”); id. § 164.532(c) (“Notwithstanding any provisions in §§ 164.508 and 164.512(i)”).
     [227].   See id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information or disclose protected health information only to a business associate for such purpose, whether or not the de-identified information is to be used by the covered entity.”).
     [228].   See id. § 164.508(a)(3)(i) (“Notwithstanding any provision of this subpart . . . a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [229].   See id.
     [230].   See IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (stating that IMS and Verispan organize several billion prescriptions each year), abrogated by Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011).
     [231].   See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,907 (July 14, 2010) (to be codified at 45 C.F.R. pts. 160, 164) (opining on the effect of proposed HIPAA privacy rules that would expand the requirement of covered entities to obtain written authorizations prior to marketing disclosures and sales of protected health information: “Even if covered entities attempted to obtain authorizations in compliance with the proposed modifications, we believe most individuals would not authorize these types of disclosures.  It would not be worthwhile for covered entities to continue to attempt to obtain such authorizations, and as a result, we believe covered entities would simply discontinue making such disclosures.”).
     [232].   See id.
     [233].   See American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 17935(d)(1), 123 Stat. 115 (2009) (adding the Privacy Rule restrictions on covered entities’ sale of protected health information and requiring covered entities to obtain an authorization from the affected individuals prior to selling their protected health information for any purpose).  Exceptions to the authorization requirement apply for activities such as public health activities, research, treatment, and healthcare operations.  See id. § 17935(d)(2)(A)–(G).  In addition, ARRA provides that communications by a covered entity encouraging the recipients to purchase or use a product or service may not be considered a health care operation, which would avoid the authorization requirement.  See id. § 17936(a)(1).  Rules proposed to implement ARRA underscore the Agency’s continuing concerns about covered entities’ use of protected health information for marketing purposes.  See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. at 40,868.  The HHS declared: “We believe Congress intended with these provisions [marketing and sale] to curtail a covered entity’s ability to use the exceptions to the definition of ‘marketing’ in the Privacy Rule to send communications to the individual that were motivated more by commercial gain or other commercial purpose rather than for the purpose of the individual’s health care, despite the communication’s being about a health-related product or service.”  Id. at 40,884.  While ARRA restricts sales of protected health information, it does not prohibit sales of de-identified information; while it restricts marketing-related disclosures, it does not restrict marketing-related uses.
Therefore, there is nothing in the text of ARRA that explicitly prohibits a covered entity from first de-identifying protected health information and then selling it to a third party for any purpose without obtaining authorizations from the affected individuals.  Nevertheless, ARRA leaves unaltered HIPAA’s preexisting marketing requirement that covered entities must obtain authorizations from individuals before engaging in any marketing-related use or disclosure of their protected health information.  See 45 C.F.R. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [234].   No. 11-2428, 2012 U.S. Dist. LEXIS 19372 (E.D. Pa. Feb. 15, 2012).
     [235].   Id. at *1–2.
     [236].   Id. at *1.
     [237].   Id. at *12.
     [238].   Id. at *14.
     [239].   Id. at *17 (“Under the Privacy Rule, healthcare providers are permitted to ‘de-identify’ Protected Health Information.  Once information is de-identified, it is no longer considered Protected Health Information.”).
     [240].   Id. (“[F]ederal regulations permit the disclosure of protected Health Information under certain circumstances, including for ‘treatment, payment, or health care operations.’  The term ‘health care operations’ is defined to include ‘contacting of health care providers and patients with information about treatment alternatives.’”).
     [241].   See 45 C.F.R. § 164.508(a)(3)(i) (2011).
     [242].   Id.
     [243].   The defendants’ letters to the plaintiffs’ physicians suggesting drug prescription alternatives should be characterized as marketing rather than treatment.  The drug manufacturers paid the pharmacies for sending the letters.  Steinberg, 2012 U.S. Dist. LEXIS 19372, at *6.  While the manufacturers stood to benefit when the physicians prescribed the suggested alternative drugs, the pharmacies had no motivation to send the communications other than their remuneration from the manufacturers.  In fact, the American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 17936(a)(1), 123 Stat. 115 (2009), provides that communications by a covered entity encouraging the recipients to purchase or use a product or service may not be considered a health care operation, thereby avoiding the authorization requirement.  The HHS, in proposed rules to implement the ARRA, declared its intent “to curtail a covered entity’s ability to use the exceptions to the definition of ‘marketing’ in the Privacy Rule to send communications to the individual that were motivated more by commercial gain . . . rather than for the purpose of the individual’s health care, despite the communication’s being about a health-related product or service.”  Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,884 (July 14, 2010) (to be codified at 45 C.F.R. pt. 160).
     [244].   Steinberg, 2012 U.S. Dist. LEXIS 19372, at *13.
     [245].   See discussion of the states’ data-mining laws supra Part III, and discussion of the marketing provisions of the Privacy Rule supra Part VI.B; see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 33 (“There are a number of federal statutory and regulatory provisions that regulate the dissemination or use of information by private parties for various reasons, including to protect individual privacy. . . . For instance, the Health Insurance Portability and Accountability Act of 1996 (‘HIPAA’) and its implementing regulations limit the nonconsensual dissemination and use of patient-identifiable health information by health plans . . . and most health care providers.”).
     [246].   See supra Part VI.C.
     [247].   See supra Part V.
     [248].   See supra Part VI.C.
     [249].   See, e.g., Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2668 (2011) (“[T]he State contends that its law is necessary to protect medical privacy, including physician confidentiality, avoidance of harassment, and the integrity of the doctor-patient relationship . . . [and] improved public health and reduced healthcare costs.”); IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“The Vermont legislature passed Act 80 in 2007, intending to protect public health, to protect prescriber privacy, and to reduce health care costs.”), aff’d, 131 S. Ct. 2653 (2011).
     [250].   See, e.g., Prot. & Advocacy Sys. v. Freudenthal, 412 F. Supp. 2d 1211, 1220 (D. Wyo. 2006) (“The primary purpose of HIPAA’s Privacy Rule is to safeguard the privacy of medical protected health information.”); Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“The governmental interest in protecting patient privacy is clearly a substantial one.”).
     [251].   Compare Me. Rev. Stat. Ann. tit. 22, § 1711-E (2009) (making it unlawful for a pharmacy to use, sell, or transfer prescription information where the prescriber had registered for confidentiality protection), and N.H. Rev. Stat. Ann. § 318:47-f (2006) (prohibiting pharmacies and insurance companies from selling or licensing prescription data for any commercial purpose), and Vt. Stat. Ann. tit. 18, § 4631 (2010) (prohibiting the sale or disclosure of pharmacy records for marketing purposes and prohibiting drug manufacturers from using the records for marketing unless the prescribers consented), with 45 C.F.R. § 164.508(a)(3)(i) (2011) (prohibiting covered entities from using or disclosing protected health information for marketing purposes without the individual’s authorization).
     [252].   Compare Me. Rev. Stat. Ann. tit. 22, § 1711-E(1)(F-1) (defining marketing as advertising, publicizing, promoting, or selling a prescription drug), and N.H. Rev. Stat. Ann. § 318:47-f (defining commercial purpose as advertising, marketing, or any activity that influences sales), and Vt. Stat. Ann. tit. 18, § 4631(b)(5) (defining marketing as advertising or any activity that influences the sale of a drug or influences prescribing behavior), with 45 C.F.R. § 164.501 (defining marketing as a communication that encourages the listener to purchase or use the item or service).
     [253].   Compare Me. Rev. Stat. Ann. tit. 22, § 1711-E(1)(F-1) (excluding a number of health-related activities from the definition of “marketing,” including pharmacy reimbursement, patient care management, utilization review by a healthcare provider, and healthcare research), and N.H. Rev. Stat. Ann. § 318:47-f (exempting from the marketing prohibition disclosures of prescription information for health-related purposes, such as pharmacy reimbursement, care management, utilization review by a healthcare provider, or healthcare research), and Vt. Stat. Ann. tit. 18, § 4631 (excluding from the definition of marketing certain health-related purposes, including pharmacy reimbursement, healthcare management, utilization review by a healthcare provider, and healthcare research), with 45 C.F.R. § 164.501 (exempting from the definition of marketing communications to describe the benefits in a health plan, uses and disclosures for treatment, and case management).
     [254].   See Brief for Respondent Pharmaceutical Research and Manufacturers of America at 48, Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011) (No. 10-779) (“[T]he State did not defend its law below on the basis of patient privacy.”).
     [255].   IMS Health Inc. v. Mills, 616 F.3d 7, 15 (1st Cir. 2010) (“[P]hysicians ‘complain bitterly’ about detailers ‘who wave data in their faces’ and challenge them with their own prescribing histories when they fail to prescribe more of the product the detailer has been advertising.” (citations omitted)), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [256].   Sorrell, 131 S. Ct. at 2671.
     [257].   Id. at 2669.
     [258].   Id. at 2671.
     [259].   The Second Circuit in Sorrell found that the privacy of patients’ medical information was not at issue.  IMS Health Inc. v. Sorrell, 630 F.3d 263, 276 (2d Cir. 2010) (“[T]he state’s asserted interest in medical privacy is too speculative to qualify as a substantial state interest. . . . Vermont has not shown any effect on the integrity of the prescribing process or the trust patients have in their doctors from the use of PI [prescriber-identifiable] data in marketing.”), aff’d, 131 S. Ct. 2653 (2011).
     [260].   See 45 C.F.R. § 164.502(a) (2011) (regulating the uses and disclosures of protected health information by covered entities).
     [261].   See, e.g., Brief for Petitioners at 23, Sorrell, 131 S. Ct. 2653 (No. 10-779) (characterizing pharmacies’ prescription information as nonpublic, “particularly where the information has been produced involuntarily”); Reply Brief for Petitioners at 3, Sorrell, 131 S. Ct. 2653 (No. 10-779) (“Doctors and patients do not voluntarily provide prescriptions to pharmacies; by law, they must provide this sensitive information to obtain medicine.”).
     [262].   See, e.g., 45 C.F.R. § 164.502(a)(1)(ii) (permitting a covered entity to use or disclose protected health information for “treatment, payment, or health care operations”).  Treatment includes coordination of healthcare, managing healthcare, consultations among providers, and referrals.  Payment includes insurers’ collection of insurance premiums, providers’ obtaining reimbursement for providing healthcare, determining eligibility for coverage, adjudicating health benefit claims, risk adjusting, billing and collections, reviewing healthcare services to determine medical necessity, utilization review, and making disclosures to consumer reporting agencies.  Healthcare operations include quality assessment; reviewing the competence or qualifications of healthcare professionals; underwriting; conducting or arranging for medical review, legal services, and auditing, including fraud and abuse detection and compliance; business planning, business management and administrative activities; customer service; resolution of internal grievances; sale, transfer, merger, or consolidation of the covered entity with another entity; and fundraising.  See id. § 164.501.
     [263].   See id. § 164.502 (providing the permitted and required uses and disclosures of protected health information by covered entities).
     [264].   See id. § 164.502(a) (prohibiting covered entities from using or disclosing protected health information “except as permitted or required by [the Privacy Rule]”); id. § 164.502(a)(1)(iv) (allowing covered entities to disclose protected health information “[p]ursuant to and in compliance with a valid authorization”).
     [265].   See id. § 164.510 (requiring a covered entity, prior to certain uses and disclosures of an individual’s protected health information, to inform the individual in advance of the use or disclosure and provide the individual an opportunity to agree, or to prohibit, or to restrict the use or disclosure).
     [266].   See Sorrell, 131 S. Ct. at 2670–71 (2011) (“[T]he ‘state’s own explanation of how [the data-mining law] advances its interests cannot be said to be direct.’  The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers—that is, by diminishing detailers’ ability to influence prescription decisions.  Those who seek to censor or burden free expression often assert that disfavored speech has adverse effects.  But the ‘fear that people would make bad decisions if given truthful information’ cannot justify content-based burdens on speech.” (citations omitted)).
     [267].   See id. at 2661 (discussing the impact of pharmacies’ sales of prescription information upon cost containment and the public health).
     [268].   See 45 C.F.R. § 164.502 (listing permitted uses and disclosures of protected health information by covered entities that do not require an authorization from the affected individuals).
     [269].   See id. (providing individuals with rights of access and rights to control certain uses and disclosures of their protected health information by covered entities); see also supra Part VI.A (explaining individuals’ rights of access and control over their protected health information under the Privacy Rule).
     [270].   Sorrell, 131 S. Ct. at 2669 (“Physicians can, and often do, simply decline to meet with detailers, including detailers who use prescriber-identifying information.”).
     [271].   Id. at 2660–61.  But see id. at 2681 (Breyer, J., dissenting) (noting that the education program funded by Vermont’s data-mining law “does not make use of prescriber-identifying data”).
     [272].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 280 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011) (“The state could wait to assess what the impact of its newly funded counter-speech program will be.”).
     [273].   See 45 C.F.R. § 164.502(a)(1)(ii), (iv) (listing permitted and required uses, and permitting any other use or disclosure “[p]ursuant to and in compliance with a valid authorization”); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“HIPAA and other such federal statutes directly advance substantial federal interests in a narrowly and reasonably tailored way.”).
     [274].   See 45 C.F.R. § 164.502(a)(1)(i)–(iii), (v)–(vi), (2)(i)–(ii) (listing permitted and required uses and disclosures for which an authorization is not required).
     [275].   See id. § 164.508(a)(1) (“Except as otherwise permitted or required by this subchapter, a covered entity may not use or disclose protected health information without an authorization that is valid under this section.”).
     [276].   See id. § 164.502(a)(1)(ii) (listing “treatment, payment, or health care operations” as a permitted use and disclosure).
     [277].   See id. § 164.502(a)(1)(vi) (listing permitted uses and disclosures “[a]s permitted by and in compliance with . . . § 164.512,” which, in turn, describes twelve public interest activities pursuant to which covered entities may use or disclose protected health information without obtaining an authorization from the affected individuals).
     [278].   See id. § 164.508(a)(3) (restricting marketing uses and disclosures); id. § 164.501 (providing health related exceptions to the definition of marketing); see also Reply Brief for Petitioners, supra note 261, at 21–22 (“Doctors and patients expect and intend these [health-related] uses of healthcare information, but they do not expect (or even know) that third parties purchase the information and use it as a marketing tool.”).
     [279].   Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2663–67 (2011).
     [280].   Id. at 2663.
     [281].   Id. at 2663–64 (citation omitted).
     [282].   Id. at 2672.
     [283].   Id. at 2671 (“[S]ome Vermont doctors view targeted detailing based on prescriber-identifying information as ‘very helpful’ because it allows detailers to shape their messages to each doctor’s practice.”).
     [284].   Id. (“[T]he United States, which appeared here in support of Vermont, took care to dispute the State’s ‘unwarranted view that the danger of [n]ew drugs outweigh their benefits to patients.’”); IMS Health Inc. v. Sorrell, 630 F.3d 263, 280 (2d Cir. 2010) (observing that the state law precludes the use of pharmacy information for marketing brand-name drugs “no matter how efficacious and no matter how beneficial those drugs may be compared to generic alternatives”), aff’d, 131 S. Ct. 2653 (2011).
     [285].   Sorrell, 131 S. Ct. at 2672 (concluding that the State “restrict[ed] the information’s use by some speakers and for some purposes, even while the State itself can use the information to counter the speech it seeks to suppress”).
     [286].   See 45 C.F.R. § 164.502(a)(1)–(2) (2011) (listing the permitted and required uses and disclosures of protected health information by covered entities).
     [287].   See id. § 164.508(a)(1) (“Except as otherwise permitted or required by this subchapter, a covered entity may not use or disclose protected health information without an authorization.”).
     [288].   See id. § 164.508(a)(3) (requiring an individual’s authorization for marketing uses and disclosures of protected health information by covered entities).
     [289].   The Privacy Rule provides that, notwithstanding any other provision in the Rule, a covered entity may not use or disclose protected health information for marketing and may not use or disclose protected health information in psychotherapy notes.  See id. § 164.508(a)(2)–(3).
     [290].   Sorrell, 131 S. Ct. at 2658 (“The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers.”).  Moreover, HIPAA similarly restricts all uses and disclosures of psychotherapy notes unless authorized by the individual.  45 C.F.R. § 164.508(a)(2) (“Notwithstanding any provision of this subpart . . . a covered entity must obtain an authorization for any use or disclosure of psychotherapy notes,” subject to limited exceptions, including, inter alia, uses for treatment by the psychotherapist, uses and disclosures for the psychotherapist’s training programs, and uses or disclosures to allow the psychotherapist to defend himself in a legal action brought by the individual).
     [291].   See 45 C.F.R. § 164.501 (describing the myriad permissible uses and disclosures of protected health information that comprise treatment, payment, and healthcare operations); id. § 164.502(a)(1)(ii) (indicating that permitted uses and disclosures of protected health information include treatment, payment, or healthcare operations).
     [292].   See Sorrell, 131 S. Ct. at 2660 (quoting the law as providing that pharmacies may not disclose pharmacy information for marketing and that drug manufacturers may not use the information for marketing, unless the prescriber consents).
     [293].   See id. at 2668 (“Under Vermont’s law, pharmacies may share prescriber-identifying information with anyone for any reason save one: They must not allow the information to be used for marketing.”).
     [294].   Id. at 2660.
     [295].   Id. at 2668 (“Exceptions further allow pharmacies to sell prescriber-identifying information for certain purposes, including ‘health care research.’  And the measure permits insurers, researchers, journalists, the State itself, and others to use the information.” (citations omitted)).
     [296].   See id. at 2663 (“[I]t appears that Vermont could supply academic organizations with prescriber-identifying information to use in countering the messages of brand-name pharmaceutical manufacturers and in promoting the prescription of generic drugs.”).  But see id. at 2680 (Breyer, J., dissenting) (noting that the record “contains no evidentiary basis for the conclusion that any such individualized counterdetailing is widespread, or exists at all, in Vermont”).
     [297].   Id. at 2669 (majority opinion); see also Brief of Respondents IMS Health Inc., Verispan, LLC, & Source Healthcare Analytics, Inc. at 9, Sorrell, 131 S. Ct. 2653 (No. 10-779) (“In stark contrast to HIPAA and other federal statutes and regulatory regimes that protect important personal privacy interests, Act 80 [Vermont’s data-mining law] contains numerous exceptions that freely permit the wide distribution of prescribers’ commercial prescription history information.”).
     [298].   See Sorrell, 131 S. Ct. at 2663 (“The statute thus disfavors marketing, that is, speech with a particular content.  More than that, the statute disfavors specific speakers, namely pharmaceutical manufacturers.”).
     [299].   See 45 C.F.R. § 164.502(a) (2011) (“A covered entity may not use or disclose protected health information, except as permitted or required by this subpart. . . .”).
     [300].   See id. § 164.502(a)(1)–(2) (listing the permissible and required uses and disclosures under the Privacy Rule).
     [301].   The Privacy Rule also permits uses and disclosures in several other areas and requires disclosures in two instances.  See supra Part VI.A.
     [302].   See 45 C.F.R. § 164.502(a)(1)(ii) (listing “treatment, payment or health care operations” as a permissible basis for covered entities to use or disclose protected health information); id. § 164.501 (defining the activities that comprise treatment, payment, and healthcare operations).
     [303].   See id. § 164.502(a)(1)(vi) (listing as permissible uses and disclosure of protected health information by covered entities those that are “permitted by and in compliance with . . . § 164.512”); id. § 164.512 (listing twelve public interest activities that comprise permissible uses of protected health information by covered entities, including uses and disclosures required by law; uses and disclosures for public health activities; disclosures about victims of abuse, neglect, or domestic violence; uses and disclosures for health oversight activities; disclosures for judicial and administrative proceedings; disclosures for law enforcement purposes; uses and disclosures about decedents; uses and disclosures for cadaveric organ, eye, or tissue donation purposes; uses and disclosures for research purposes; uses and disclosures to avert a serious threat to health or safety; uses and disclosures for specialized government functions; and disclosures for workers’ compensation); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“HIPAA’s regulations directly advance that interest, because they permit the nonconsensual disclosure or use of patient-identifiable information only in limited circumstances such as ‘treatment, payment, or health care operations,’ or national ‘public health activities . . . .’” (citations omitted)).
     [304].   See 45 C.F.R. § 164.502(a)(1) (listing permissible uses and disclosure of protected health information by covered entities); id. § 164.502(a)(1)(iv) (permitting disclosures pursuant to an authorization).
     [305].   See id. § 164.508(a)(3) (imposing special restrictions upon marketing uses and disclosures); see also discussion of marketing restrictions supra Part VI.B.
     [306].   See id. § 164.508(a)(3)(i) (providing generally that “a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing”).  Under the American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111-5, § 17935(d)(1), 123 Stat. 115 (2009), sales of protected health information must be authorized, but this limit is broadly framed to apply to all sales of health information, both marketing and nonmarketing.
     [307].   See Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2660 (2011) (quoting the law as providing that pharmacies may not disclose pharmacy information for marketing and drug manufacturers may not use the information for marketing, unless the prescriber consents).
     [308].   Id. at 2668.
     [309].   See, e.g., Brief for Petitioners, supra note 261, at 36 (“The protection of free speech should not restrict reasonable consumer privacy protections that give consumers control over nonconsensual uses of their information.”); Brief of Respondents IMS Health Inc., Verispan, LLC, & Source Healthcare Analytics, Inc., supra note 297, at 32, 44 (“There is no dispute that, although genuine privacy measures restrict free speech by prohibiting the disclosure of factual information, they satisfy First Amendment scrutiny because they are tailored to further a substantial interest in protecting an important expectation of privacy. . . . Vermont errs in relying on several statutes and regulatory regimes that prohibit private parties from disclosing information.  All those measures satisfy constitutional scrutiny because they are not intended to restrict speech but instead consistently protect an important privacy interest.  The Solicitor General all but acknowledges that, in light of all the contradictions in Vermont law, Act 80 does not function as a genuine privacy statute.”); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34–35 (“[T]his Court’s analysis of the ‘fit’ between the Vermont statute and the State’s legislative objectives should not affect those federal provisions [like HIPAA].”); Reply Brief for Petitioners, supra note 261, at 8 (“If respondents were correct, then privacy laws generally would be subject to strict scrutiny. . . . This position is plainly untenable . . . .”).

By Derek E. Bambauer

Cyberlaw is plagued by the myth of perfection.

Consider three examples: censorship, privacy, and intellectual property.  In each, the rhetoric and pursuit of perfection has proved harmful, in ways this Essay will explore.  And yet the myth persists—not only because it serves as a potent metaphor, but because it disguises the policy preferences of the mythmaker.  Scholars should cast out the myth of perfection, as Lucifer was cast out of heaven.  In its place, we should adopt the more realistic, and helpful, conclusion that often good enough is . . . good enough.

Start with Internet censorship. Countries such as China, Iran, and Vietnam use information technology to block their citizens from accessing on-line material that each government dislikes.  Democracies, too, filter content: Britain blocks child pornography using the Cleanfeed system,{{1}} and South Korea prevents users from reaching sites that support North Korea’s government.{{2}}  This filtering can be highly effective: China censors opposition political content pervasively,{{3}} and Iran blocks nearly all pornographic sites (along with political dissent).{{4}}  However, even technologically sophisticated systems, like China’s Golden Shield, are vulnerable to circumvention.  Users can employ proxy servers or specialized software, such as Tor, to access proscribed sites.{{5}}  This permeability has led many observers to conclude that effective censorship is impossible, because censorship is inevitably imperfect.{{6}}  Filtering is either trivially easy to bypass, or doomed to failure in the arms race between censors and readers.  The only meaningful censorship is perfect blocking, which is unattainable.

And yet, leaky Internet censorship works.  Even in authoritarian countries, few users employ circumvention tools.{{7}} Governments such as China’s capably block access to most content about taboo subjects, such as the Falun Gong movement{{8}} or coverage of the Arab Spring uprisings.{{9}}  Those who see imperfect censorship as useless make three errors.  First, they ignore offline pressures that users face.  Employing circumvention tools is like using a flashlight: it helps find what you seek, but it draws attention to you.  China has become adept at detecting and interfering with Tor,{{10}} and Iran recently purchased a sophisticated surveillance system for monitoring Internet communications.{{11}}  Bypassing censorship in cyberspace may have adverse consequences in realspace.  Second, most Internet users are not technologically sophisticated.  They use standard software, and the need to install and update specialized circumvention tools may be onerous.{{12}}  Finally, governments do not need perfect censorship to attain their goals.  They seek to prevent most people from obtaining prohibited content, not to banish it entirely.  Censorship that constrains the average user’s ordinary web browsing generally suffices.

Privacy discourse too is obsessed with perfection.  The reidentification wars have pitted researchers who assert that anonymizing data is impossible{{13}} against those who argue the risk of breaching properly sanitized datasets is vanishingly small.{{14}}  While the arguments are dauntingly technical (for those unfamiliar with advanced statistics), the empirical evidence points toward the less threatening conclusions.  The only rigorous study demonstrating an attack on a properly de-identified dataset under realistic circumstances revealed but 2 out of 15,000 (.013%) participants’ identities.{{15}}  Moreover, critics of anonymized data overlook the effects of incorrect matches. Attackers will have to weed out false matches from true ones, complicating their task.

Opponents make three mistakes by focusing on the theoretical risk of re-identification attacks on properly sanitized data.  First, the empirical evidence for their worries is slight, as the data above demonstrates.  There are no reports of such attacks in practice, and the only robust test demonstrated minimal risk.  Second, anonymized data is highly useful for socially beneficial purposes, such as predicting flu trends, spotting discrimination, and analyzing the effectiveness of medical and legal interventions.{{16}} Finally, the most significant privacy risk is from imperfectly sanitized data: organizations routinely release, deliberately or inadvertently, information that directly identifies people, or that enables an attacker to do so without advanced statistical knowledge.  Examples are legion, from the California firm Biofilm releasing the names and addresses of 200,000 customers who asked for free Astroglide samples{{17}} to AOL’s disclosure of user queries that allowed researchers to link people to their searches.{{18}}  Concentrating on whether perfect anonymization is possible distracts from far more potent privacy threats emanating from data.

Intellectual property (“IP”) in the digital age is similarly obsessed with perfection.  IP owners argue that with the advent of perfect digital copies, high-speed networks, and distributed dissemination technologies, such as peer-to-peer file-sharing software, any infringing copy of a protected work will spread without limit, undermining incentives to create.  This rhetoric of explosive peril has resulted in a perpetual increase in the protections for copyrighted works and in the penalties for violating them.{{19}}

The quest for perfect safeguards for IP predates the growth of the commercial Internet.  In September 1995, President Clinton’s administration released its White Paper, which argued that expanded copyright entitlements were necessary for content owners to feel secure in developing material for the nascent Information Superhighway.{{20}}  Without greater protection, the Paper argued, the Superhighway would be empty of content, as copyright owners would simply refuse to make material available via the new medium.

This prediction proved unfounded but remained persuasive.  In the last fifteen years, Congress has reinforced technological protection measures such as Digital Rights Management with stringent legal sanctions;{{21}} has augmented penalties for copyright infringement, including criminal punishments;{{22}} has pressed intermediaries, such as search engines, to take down allegedly infringing works upon notification by the copyright owner;{{23}} and has dedicated executive branch resources to fighting infringement.{{24}}  And yet, pressures from content owners for ever-greater protections continue unrelentingly.  In the current Congress, legislation introduced in both the House of Representatives and the Senate would, for the first time in American history, have authorized filtering of sites with a primary purpose of aiding infringement{{25}} and would have enabled rights owners to terminate payment processing and Internet advertising services for such sites.{{26}}  These proposals advanced against a backdrop of relatively robust financial health for the American movie and music industries.{{27}}

Thus, the pursuit of perfection in IP also contradicts empirical evidence.  Content industries have sought to prohibit, or at least hobble, new technologies that reduce the cost of reproduction and dissemination of works for over a century—from the player piano{{28}} to the VCR{{29}} to the MP3 player{{30}} to peer-to-peer file-sharing software.{{31}}  And yet each of these advances has opened new revenue horizons for copyright owners.  The growth in digital music sales is buoying the record industry,{{32}} and the VCR proved to be a critical profit source for movies.{{33}}  New copying and consumption technologies destabilize prevailing business models, but not the production of content itself.{{34}}

Moreover, perfect control over IP-protected works would threaten both innovation and important normative commitments.  The music industry crippled Digital Audio Tapes{{35}} and failed to provide a viable Internet-based distribution mechanism until Apple introduced the iTunes Music Store.{{36}}  The movie industry has sought to cut off supply of films to firms such as Redbox that undercut its rental revenue model,{{37}} and Apple itself has successfully used copyright law to freeze out companies that sold generic PCs running MacOS.{{38}}  And the breathing room afforded by the fair use and de minimis doctrines, along with exceptions to copyright entitlements, such as cover licenses, enables a thriving participatory culture of remixes, fan fiction, parody, criticism, and mash-ups.  Under a system of perfect control, copyright owners could withhold consent from derivative creators who produced works of which they disapproved, such as critical retellings of beloved classics, for example Gone With The Wind,{{39}} or could price licenses to use materials beyond the reach of amateur artists.{{40}}  Perfection in control over intellectual property is unattainable and undesirable.

The myth of perfection persists because it is potent.  It advances policy goals for important groups—even, perhaps, groups on both sides of a debate.  For censorship, the specter of perfect filtering bolsters the perceived power of China’s security services.  It makes evasion appear futile.  For those who seek to hack the Great Firewall, claiming to offer the technological equivalent of David’s slingshot is an effective way to attract funding from Goliath’s opponents.  Technological optimism is a resilient, seductive philosophical belief among hackers and other elites{{41}} (though one that is increasingly questioned).{{42}}

Similarly, privacy scholars and advocates fear the advent of Big Data: the aggregation, analysis, and use of disparate strands of information to make decisions—whether by government or by private firms—with profound impacts on individuals’ lives.{{43}}  Their objections to disclosure of anonymized data are one component of a broader campaign of resistance to changes they see as threatening to obviate personal privacy.  If even perfectly anonymized data poses risks, then restrictions on data collection and concomitant use gain greater salience and appeal.

Finally, concentrating on the constant threat to incentives for cultural production in the digital ecosystem helps content owners, who seek desperately to adapt business models before they are displaced by newer, more nimble competitors.  They argue that greatly strengthened protections are necessary before they can innovate.  Evidence suggests, though, that enhanced entitlements enable content owners to resist innovation rather than embrace it.  The pursuit of perfection turns IP law into a one-way ratchet: protections perpetually increase, and are forever insufficient.

We should abandon the ideal of the sublime in cyberlaw.  Good enough is, generally, good enough.  Patchy censorship bolsters authoritarian governments.  Imperfectly anonymized data generates socially valuable research at little risk.  And a leaky IP system still supports a thriving, diverse artistic scene.  Pursuing perfection distracts us from the tradeoffs inherent in information control, by reifying a perspective that downplays countervailing considerations.  Perfection is not an end; it is a means—a political tactic that advances one particular agenda.  This Essay argues that the imperfect—the flawed—is often both effective and desirable as an outcome of legal regulation.

*    Associate Professor of Law, Brooklyn Law School (through spring 2012); Associate Professor of Law, University of Arizona James E. Rogers College of Law (beginning fall 2012).  Thanks for helpful suggestions and discussion are owed to Jane Yakowitz Bambauer, Dan Hunter, Thinh Nguyen, Derek Slater, and Chris Soghoian.  The author welcomes comments at

[[1]]   Richard Clayton, Failures in a Hybrid Content Blocking System, in Privacy Enhancing Technologies: 5th International Workshop, PET 2005 78 (George Danezis & David Martin eds., 2006).[[1]]

[[2]]   Eric S. Fish, Is Internet Censorship Compatible With Democracy? Legal Restrictions of Online Speech in South Korea, Asia-Pac. J. Hum. Rts. & the L. (forthcoming 2012), available at

[[3]]   China, OpenNet (June 15, 2009),

[[4]]   Iran, OpenNet (June 16, 2009),

[[5]]   See, e.g., James Fallows, “The Connection Has Been Reset”, The Atlantic (Mar. 2008),

[[6]]   See, e.g., Oliver August, The Great Firewall: China’s Misguided—and Futile—Attempt to Control What Happens Online, Wired (Oct. 23, 2007),‑11/ff_chinafirewall?currentPage=all; Troy Hunt, Browsing the broken Web: A Software Developer Behind the Great Firewall of China, Troy Hunt’s Blog (Mar. 16, 2012),‑broken‑web‑software‑developer.html; Weiliang Nie, Chinese Learn to Leap the “Great Firewall”, BBC News (Mar. 19, 2010),[[6]]

[[7]]   Erica Naone, Censorship Circumvention Tools Aren’t Widely Used, Tech. Rev. (Oct. 18, 2010),[[7]]

[[8]]   China, supra note 3.[[8]]

[[9]]   Richard Fontaine & Will Rogers, China’s Arab Spring Cyber Lessons, The Diplomat (Oct. 3, 2011),

[[10]]   Tim Wilde, Knock Knock Knockin’ on Bridges’ Doors, Tor (Jan. 7, 2012),[[10]]

[[11]]   Phil Vinter, Chinese Sell Iran £100m Surveillance System Capable of Spying on Dissidents’ Phone Calls and Internet, Daily Mail (Mar. 23, 2012),‑2119389/Chinese‑sell‑Iran‑100m‑surveillance-capable-spying-dissidents-phone-calls-internet.html.[[11]]

[[12]]   See generally Nart Villeneuve, Choosing Circumvention: Technical Ways to Get Round Censorship, in Reporters Without Borders, Handbook for Bloggers and Cyberdissidents 63 (2005), available at

[[13]]   See, e.g., Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701, 1752 (2010); Latanya Sweeney, Patient Identifiability in Pharmaceutical Marketing Data, (Data Privacy Lab, Working Paper No. 1015, 2011), available at[[13]]

[[14]]   See, e.g., Jane Yakowitz, The Tragedy of the Data Commons, 25 Harv. J.L. & Tech. 1, 52 (2011); Khaled El Emam et al., A Systematic Review of Re-identification Attacks on Health Data, PLoS One (Dec. 2011),[[14]]

[[15]]   Deborah Lafky, Program Officer, Dep’t Health and Human Servs., The Safe Harbor Method of De-Identification: An Empirical Test, ONC Presentation (Oct. 9, 2009), available at

[[16]]   See Yakowitz, supra note 14.[[16]]

[[17]]   Christopher Soghoian, Astroglide Data Loss Could Result in $18 Million Fine, DubFire (July 9, 2007),

[[18]]   Katie Hafner, Leaked AOL Search Results Create Ethical Dilemma for Researchers, N.Y. Times (Aug. 23, 2006),

[[19]]   See generally Robert Levine, Free Ride: How Digital Parasites are Destroying the Culture Business, and How the Culture Business Can Fight Back (2011); Jessica Litman, Digital Copyright (2001); Mike Masnick, Why Is The MPAA’s Top Priority “Fighting Piracy” Rather Than Helping the Film Industry Thrive?, Techdirt (Feb. 22, 2011),

[[20]]   Pamela Samuelson, The Copyright Grab, Wired (Jan. 1996),[[20]]

[[21]]   17 U.S.C. § 1201 (2006).[[21]]

[[22]]  17 U.S.C. § 1204 (2006); No Electronic Theft (NET) Act, Pub. L. No. 105-147, 111 Stat. 2678 (1997).[[22]]

[[23]]  17 U.S.C. § 512(c) (2006).[[23]]

[[24]]  Prioritizing Resources and Organization for Intellectual Property (PRO IP) Act, Pub. L. No. 110-403, 122 Stat. 4256 (2008). [[24]]

[[25]]   PROTECT IP Act of 2011, S. 968, 112th Cong. (2012).[[25]]

[[26]]   Stop Online Piracy Act of 2011, H.R. 3261, 112th Cong. (2012).[[26]]

[[27]]   Robert Andrews, Music Industry Can See The Light After “Least Negative” Sales Since 2004, Time (Mar. 26, 2012),/03/26/music-industry-can-see-the-light-after-least-negative-sales-since-2004/; Brooks Barnes, A Sliver of a Silver Lining for the Movie Industry, N.Y. Times (Mar. 22, 2012),-silver-lining-for-the-movie-industry/#; Bob Lefsetz, Movie Industry Is Making Money from Technologies It Claimed Would KILL Profits, The Big Picture (Jan. 30, 2012, 4:30 PM),

[[28]]   See White-Smith Music Publ’g Co. v. Apollo Co., 209 U.S. 1, 13–14 (1908) (holding that a piano roll does not infringe composer’s copyright because the perforated sheets are not copies of the sheet music).[[28]]

[[29]]   See Sony v. Universal Studios, 464 U.S. 417, 442 (1984) (holding that the manufacture of a VCR does not constitute contributory copyright infringement because it “is widely used for legitimate, unobjectionable purposes”).[[29]]

[[30]]   See Recording Indus. Ass’n of Am. v. Diamond Multimedia Sys., 180 F.3d 1072, 1081 (9th Cir. 1999) (upholding a district court denial of preliminary injunction against the manufacture of the Rio MP3 player because the Rio is not subject to the Audio Home Recording Act of 1992).[[30]]

[[31]]   See Metro-Goldwyn-Mayer Studios v. Grokster, 545 U.S. 913, 918 (2005) (holding that distributor of peer-to-peer file sharing network is liable for contributory copyright infringement when “the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement”).[[31]]

[[32]]   Andrews, supra note 27.[[32]]

[[33]]   Michelle Schusterman, Infographic: Why the Movie Industry is So Wrong About SOPA, Matador (Jan. 17, 2012),

[[34]]   See generally Mark A. Lemley, Is the Sky Falling on the Content Industries?, 9 J. Telecomm. & High Tech. L. 125 (2011) (explaining that while the introduction of new technologies in the past may have disrupted certain industries, the new technology did not stop the creation of new content).[[34]]

[[35]]   See generally Tia Hall, Music Piracy and the Audio Home Recording Act, 2002 Duke L. & Tech. Rev. 0023 (2002).[[35]]

[[36]]   Derek Slater et al., Content and Control: Assessing the Impact of Policy Choices on Potential Online Business Models in the Music and Film Industries (Berkman Center for Internet & Society at Harvard Law School, Research Publication No. 2005-10, 2005), available at

[[37]]   Paul Bond, Warner Bros., Redbox Divided on DVD Terms, The Hollywood Reporter (Feb. 29, 2012),

[[38]]   See Apple Inc. v. Psystar Corp., 658 F.3d 1150, 1162 (9th Cir. 2011).[[38]]

[[39]]   See SunTrust Bank v. Houghton Mifflin Co., 268 F.3d 1257, 1275 (11th Cir. 2001) (denying a preliminary injunction because a fair use defense would prevent the plaintiff, owner of the copyright of Gone With the Wind, from preventing the defendant from publishing a novel that critiques Gone With the Wind).[[39]]

[[40]]   See generally Derek E. Bambauer, Faulty Math: The Economics of Legalizing The Grey Album, 59 Ala. L. Rev. 345 (2007) (contending the economics of the derivative works right prevents the creation of new works and stifles the re-mix culture).[[40]]

[[41]]   John Gilmore averred that “[t]he Net interprets censorship as damage and routes around it.”  Philip Elmer-Dewitt, First Nation in Cyberspace, Time, Dec. 6, 1993, at 62.[[41]]

[[42]]   See generally Evgeny Morozov, The Net Delusion (2011) (arguing that the Internet makes it easier for dictators to prevent democratic uprisings).[[42]]

[[43]]   See generally Julie Cohen, Configuring the Networked Self (2011) (making the case that flows of private information are not restricted and proposing legal reforms to address the problem); Jessica Litman, Information Privacy / Information Property, 52 Stan. L. Rev. 1283 (2000) (contending that industry’s self-regulation of information privacy has failed and proposing that torts may be the best available avenue to improve privacy rights); danah boyd & Kate Crawford, Six Provocations for Big Data, Symposium, A Decade in Internet Time: Symposium on the Dynamics of Internet and Society, Oxford Internet Inst. (Sept. 2011), available at_id=1926431 (proposing six questions about the potential negative effects of Big Data).[[43]]

By Margot Kaminski

My friends, who are generally well educated and intelligent, read a lot of garbage.  I know this because since September 2011, their taste in news about Justin Bieber, Snooki, and the Kardashians has been shared with me through “social readers” on Facebook.{{1}}  Social readers instantaneously list what you are reading on another website, without asking for your approval before disclosing each individual article you read.  They are an example of what Facebook calls “frictionless sharing,” where Facebook users ostensibly influence each other’s behavior by making their consumption of content on other websites instantly visible to their friends.{{2}}  Many people do not think twice about using these applications, and numerous publications have made them available, including the Washington Post, Wall Street Journal, and Guardian.{{3}}

I intend to prompt conversation about social readers on three fronts.  First, social readers are part of a shift toward real name policies online, and, for a number of reasons, should remain opt-in rather than becoming the default setting.  Second, if people do choose to use these applications, they should know that they are making that choice against a backdrop of related battles in privacy law concerning the right to consume content without a third party sharing your activity more broadly.  And third, when individuals choose to use these applications, they may be sharing their habits more widely than they think.

I.  Social Readers and Online Real-Name Policies

Social readers are part of a larger trend toward linking online activity to Internet users’ real identities.  Unlike America Online’s use of invented screen names, the two major social networks, Facebook and Google+, require users to register with their real names or verified pseudonyms.{{4}}  Both Google and Facebook aim to link user activity outside of the social network to one identifiable, real name profile, although Google’s aspirations currently appear limited to other Google services, while Facebook’s ambitions are broader.{{5}}  This real-name model is desirable to online companies and their supporting advertisers because it is easier to advertise to someone if you know who he or she is, and know all of his or her online behavior.

Business concerns are not the only motivating factor behind the shift toward real-name policy.  There is also an argument that real-name policies on comment forums may make people behave more civilly toward each other, because they are part of a social community that imports accountability into the online context.{{6}}  This has been compelling to some newspapers.  The Huffington Post, for example, has a Social News feature that encourages readers to log in through their Facebook accounts and comment on articles under their real identities.{{7}}  However, shifting to a real-name policy creates other problems, such as preventing pseudonymic or anonymous whistleblowing by commenters, and chilling more controversial or critical speech.{{8}}

Social readers are part of this potential collapse of anonymous or pseudonymic online activity.  It used to be the case that reading an article on the New York Times was a separate activity from communicating to your friends on a social network.  Social readers, however, import your reading activity from the newspaper website into your social network and broadcast it instantaneously under your real name.  Your presence on the other website is no longer anonymous.

Despite the potential benefits to companies, the decision to allow instantaneous sharing of all content consumed elsewhere connected to a user’s real identity should remain firmly in the hands of Internet users.  As individuals, we construct discrete identities for different circumstances: one for work, one for home, one for our closest friends.{{9}}  This is in fact the idea behind Google+’s “Circles” feature, which allows a user to tailor the parts of his or her identity that are visible to each “Circle,” whether it be friends, co-workers, or family.{{10}}  For a social network to retain value by mirroring reality, it needs to allow us to retain these distinctions.  Before social readers, one’s decision to read US Weekly at the gym would not be broadcast to one’s coworkers.  If one is forced to sign up for a US Weekly social reader, however, one’s network would see every article read.  This first point is about one’s relationship as an individual to other individuals: we should each be able to control the parts of our identity we want shown to other people.  We do this in real life; we should be able to do this online.  There may also be a benefit to media companies in allowing individuals to be pickier in their sharing: friends might take recommendations more seriously if they are deliberate and limited, rather than a list of everything their mutual friend haphazardly read.{{11}}

Already, some companies have experienced a backlash from making such pervasive sharing the default option for their software.  For example, in September 2011, the music service Spotify announced a partnership with Facebook that would allow new users to sign up only if they have a Facebook account.{{12}}  Users started seeing their music playlists automatically shared on Facebook; they could opt out of the service, but only by manually disabling the sharing feature.{{13}}  In response to a strong negative reaction, Spotify rolled out a more visible new privacy feature to allow users to “hide their guilty pleasures,” according to the Spotify CEO.{{14}}  The strong reaction to Spotify’s automatic frictionless sharing, and the fact that many newspapers have decided not to create social reader applications at all, shows that if users’ interests are kept in mind, frictionless sharing should remain an option, not the default.

II.  Social Readers and Other Privacy Law Battles

Coincidentally or consequentially, the legal debate over privacy and media consumption has taken on new dimensions at the same time that companies move toward frictionless sharing.  As people on Facebook allow the Washington Post to broadly share every article they have ever read, others are fighting to protect reader records from third parties.

First, it’s important to address whether, and why, reader privacy is important.  Librarians are adamant about the importance of reader privacy.{{15}}  The American Library Association has affirmed a right to privacy for readers since 1939,{{16}} and states that “one cannot exercise the right to read if the possible consequences include damage to one’s reputation, ostracism from the community or workplace, or criminal penalties.  Choice requires both a varied selection and the assurance that one’s choice is not monitored.”{{17}}  This concern comes in part from a historical awareness of how the government might abuse knowledge of citizens’ reading material.  Reading material can be used by the government to track dissidence.  Famously, Joseph McCarthy released a list of allegedly pro-communist authors, and the State Department ordered overseas librarians to remove such books from their shelves.{{18}}  Imagine if social readers had existed during the McCarthy era—the government would have been able to check each person’s virtual bookshelf for blacklisted material.  With the advent of data mining, the reading choices that seem innocuous to you can cumulatively be indicative of patterns, intent, or allegiances to others, including law enforcement.{{19}}

The United States has surprisingly scattered law on the question of readers’ privacy.  There is no federal statute explicitly protecting it.  This means that companies are not specifically prohibited on a federal level from sharing your reading history with others.  In practice, librarians usually require a court order for the government to obtain reader records, and most states make that requirement explicit.{{20}} The PATRIOT Act famously raised ire from librarians by permitting the government under certain circumstances to request library patron records secretly and without judicial oversight.{{21}}

Although there is no federal reader privacy statute, related laws concerning library patrons exist in forty-eight states.{{22}}  Recently, there has been a push at the state level to expand protections for reader privacy beyond libraries.  The California Reader Privacy Act, which was signed into law in October 2011 and took effect in January 2012, extends the type of protections traditionally afforded to library patrons to all books and e-books, although it does not extend to other types of reading online.{{23}}  Government entities must obtain a warrant before accessing reader records, and booksellers or providers must be afforded an opportunity to contest the request.  Booksellers must report the number and type of requests that they receive.{{24}}  Requests made in the context of civil actions must show that the requesting parties are using the “least intrusive means” and have a “compelling interest” in the records, and must obtain a court order.

The First Amendment could arguably protect readers from the discovery of their reading history by the government or by third parties using the court system to obtain the information.  A series of cases have given rise to a standard protecting the anonymity of online speakers.{{25}}  Julie E. Cohen has suggested that the First Amendment should extend its protections to a similar right to read anonymously.{{26}}  However, there has not yet been a case where a litigant has successfully made this argument to protect digital reader records under the First Amendment.

We do have one federal law protecting user privacy during content consumption: the Video Privacy Protection Act (“VPPA”),{{27}} which prohibits the disclosure of personally identifiable video rental information to third parties without a user’s specific consent, and prohibits disclosure of the same to police officers without a warrant.{{28}}  This strangely precise piece of law arose after Supreme Court nominee Robert Bork had his video rental records disclosed in a newspaper.{{29}}

Companies have realized, however, that VPPA is a hurdle to their business models.  In December 2011, the House of Representatives passed H.R. 2471, amending VPPA to allow the disclosure of video rental records with consent given in advance and until that consent is withdrawn by the consumer.{{30}}  This change would allow companies such as Netflix to get a one-time blanket consent to disclose user records through frictionless sharing on Facebook.  The Senate Judiciary Committee held a hearing on H.R. 2471 on January 31, 2012, at which many privacy concerns were raised.{{31}}

III.  Oversharing

Those who currently use social readers may be sharing their reading activity far more broadly than they expect.  Your close friends are not the only ones who can see your Facebook profile.  A Freedom of Information Act (“FOIA”) lawsuit by the Electronic Frontier Foundation revealed that law enforcement agencies use social media to obtain information about people by going undercover on social media sites to gain access to nonpublic information.{{32}}  And even if no police officer or other informant has posed as a friend of yours, using a social network to broadcast your reading records means you have shared those records with a third party—the social network itself—which under United States v. Miller means the police may not need a warrant to obtain those records from the social network.{{33}}

Perhaps more significantly, even if we get rid of the Miller doctrine, as Justice Sotomayor recently suggested, the wholesale sharing of your reading history with Facebook friends may ultimately impact the Supreme Court’s understanding of what constitutes a “reasonable expectation of privacy.”{{34}}  In Katz v. United States, the seminal 1967 Supreme Court case on wiretapping, Katz placed a phone call in a public phone booth with the door closed, and was found to have a reasonable expectation of privacy in the phone call, so a warrant was required to wiretap the phone.{{35}}  Justice Alito recently contemplated that we may be moving toward a world in which so many people share information with so many friends that social norms no longer indicate a reasonable expectation of privacy in that information.{{36}}  Without a reasonable expectation of privacy, there will be no warrant requirement for law enforcement to obtain that information.  This analysis is troubling; sharing information with your friends should not mean that you expect it to be shared with law enforcement.  This would be like saying that just because you sent wedding invitations to 500 of your closest friends, the government is justified in opening the envelope.  The size of the audience for a private communication should not change the fact that it is private.

The recent trend toward social readers and other types of frictionless sharing may at first glance seem innocuous, if inane.  But it has occurred just as privacy advocates are pushing to create more privacy protections for readers through state laws, and may result in the loss of VPPA, the one federal law that protects privacy in content consumption.  And users may not understand that sharing what they read with friends may mean sharing what they read with the government, as well.  That is a whole lot more serious than just annoying your friends with your taste for celebrity gossip.  Indeed, it may be another step toward the death of the Fourth Amendment by a thousand cuts.{{37}}

* Research Scholar in Law and Lecturer in Law at Yale Law School, and Executive Director of the Information Society Project at Yale Law School. She thanks Kevin Bankston of the Center for Democracy and Technology for his review and helpful comments.

[[1]] See, e.g., Ian Paul, Wall Street Journal Social on Facebook: A First Look, Today @PCWorld Blog (Sept. 20, 2011, 7:02 AM),

[[2]] Jason Gilbert, Facebook Frictionless App Frenzy Will Make Your Life More Open, Huffington Post (Jan. 18, 2012),

[[3]] See The Washington Post Social Reader, Wash. Post, (last visited Feb. 26, 2012); Press Release, The Guardian, Guardian Announces New App on Facebook to Make News More Social (Sept. 23, 2011), available at-press-office/guardian-launches-facebook-app; Paul, supra note 1.[[3]]

[[4]] Facebook requires real names as user names, allowing its users to sign into other sites and comment there—although it has just recently started allowing celebrities to use pseudonyms. See Somini Sengupta, Rushdie Runs Afoul of Web’s Real-Name Police, N.Y. Times (Nov. 14, 2011),‑or‑using‑your‑name‑online-and-who-decides.html; see also Nathan Olivarez-Giles, Facebook Verifying Celebrity Accounts, Allowing Pseudonyms, L.A. Times (Feb. 16, 2012),‑fi‑tn‑facebook‑verified‑accounts‑nicknames-pseudonyms-20120216,0,3899048.story.  Google’s social network, Google+, uses real names and now pseudonyms, but only if you can prove to Google that you are in fact known by that name elsewhere.  See Claire Cain Miller, In a Switch, Google Plus Now Allows Pseudonyms, N.Y. Times Bits Blog (Jan. 23, 2012, 4:08 PM),

[[5]] Google’s new privacy policy is an example of this. The new privacy policy states that “[w]e may use the name you provide for your Google Profile across all of the services we offer that require a Google Account. In addition, we may replace past names associated with your Google Account so that you are represented consistently across all our services. If other users already have your email, or other information that identifies you, we may show them your publicly visible Google Profile information, such as your name and photo.” Preview: Privacy Policy, Google, (last visited Feb. 29, 2012).[[5]]

[[6]] See, e.g., Lawrence Lessig, Code and Other Laws of Cyberspace 80 (1999) (“Just as anonymity might give you the strength to state an unpopular view, it can also shield you if you post an irresponsible view. Or a slanderous view. Or a hurtful view.”).[[6]]

[[7]] See Frequently Asked Questions, Huffington Post, (last visited Feb. 26, 2012).[[7]]

[[8]] Stone v. Paddock Publications, Electronic Frontier Found., (last visited Feb. 26, 2012) (noting that the Illinois Court of Appeals recognized the potential harms in the “chilling effect on the many citizens who choose to post anonymously on the countless comment boards for newspapers, magazines, websites and other information portals”).[[8]]

[[9]] See, e.g., Jan E. Stets & Michael M. Harrod, Verification Across Multiple Identities: The Role of Status, 67 Soc. Psych. Quart. 155 (2004) (investigating status verification across three identities: the worker identity, academic identity, and friend identity).[[9]]

[[10]] See, e.g., Google+ Overview, Google,+/learnmore/ (last visited Feb. 29, 2012) (“You share different things with different people. But sharing the right stuff with the right people shouldn’t be a hassle. Circles make it easy to put your friends from Saturday night in one circle, your parents in another, and your boss in a circle by himself, just like real life.”).[[10]]

[[11]] Jeff Sonderman, With ‘Frictionless Sharing,’ Facebook and News Orgs Push Boundaries of Online Privacy, Poynter (Sept. 29, 2011),‑news/media‑lab/social‑media/147638/with‑frictionless-sharing-facebook-and-news-orgs-push-boundaries-of-reader-privacy/ (noting that “[i]f everything is shared automatically, nothing has significance”).[[11]]

[[12]] See Sarah Jacobsson Purewal, Spotify Adds Facebook Requirement, Angering Users, Today @PCWorld Blog (Sept. 27, 2011),[[12]]

[[13]] See Zack Whittaker, Spotify’s ‘Frictionless Sharing’ Bows to Facebook Privacy Pressure, ZDNet Between the Lines Blog (Sept. 30, 2011), -frictionless-sharing-bows-to-facebook-privacy-pressure/59408.[[13]]

[[14]] Id.[[14]]

[[15]] See, e.g., An Interpretation of the Library Bill of Rights, Am. Library Ass’n, /ContentManagement/ContentDisplay.cfm&ContentID=88625 (last visited Feb. 26, 2012).[[15]]

[[16]] Id.[[16]]

[[17]] Privacy and Confidentiality, Am. Library Ass’n, /offices/oif/ifissues/privacyconfidentiality (last visited Feb. 26, 2012).[[17]]

[[18]] Robert Griffith, The Politics of Fear: Joseph R. McCarthy and the Senate 215–16 (1970).[[18]]

[[19]] See, e.g., Stephen L. Baker, The Numerati (2008).[[19]]

[[20]] See State Privacy Laws Regarding Library Records, Am. Library Ass’n, /stateprivacy (last visited Feb. 26, 2012) (stating that “[l]ibraries should have in place procedures for working with law enforcement officers when a subpoena or other legal order for records is made. Libraries will cooperate expeditiously with law enforcement within the framework of state law.”).[[20]]

[[21]] The USA Patriot Act, Am. Library Ass’n, /advleg/federallegislation/theusapatriotact (last visited Feb. 26, 2012) (observing that “[l]ibraries cooperate with law enforcement when presented with a lawful court order to obtain specific information about specific patrons; however, the library profession is concerned some provisions in the USA PATRIOT Act go beyond the traditional methods of seeking information from libraries.”); see also Resolution on the USA PATRIOT Act and Libraries, Am. Library Ass’n (June 29, 2005), /colresolutions/PDFs/062905-CD20.6.pdf (explaining that “Section 215 of the USA PATRIOT Act allows the government to secretly request and obtain library records for large numbers of individuals without any reason to believe they are involved in illegal activity” and “Section 505 of the USA PATRIOT Act permits the FBI to obtain electronic records from libraries with a National Security Letter without prior judicial oversight”).[[21]]

[[22]] State Privacy Laws Regarding Library Records, Am. Library Ass’n, /stateprivacy (last visited Feb. 28, 2012).[[22]]

[[23]] See Joe Brockmeier, California Gets Reader Privacy Act: Still Not Enough, ReadWrite Enterprise (Oct. 3, 2011),[[23]]

[[24]] See Rebecca Jeschke, Reader Privacy Bill Passes California Senate—Moves on to State Assembly, Electronic Frontier Found. (May 9, 2011), -privacy-bill-passes-california-senate-moves.[[24]]

[[25]] See, e.g., Dendrite Int’l, Inc. v. John Doe No. 3, 775 A.2d 756 (N.J. Super. Ct. App. Div. 2001).[[25]]

[[26]] Julie E. Cohen, A Right to Read Anonymously: A Closer Look at “Copyright Management” In Cyberspace, 28 Conn. L. Rev. 981 (1996).[[26]]

[[27]] 18 U.S.C. § 2710 (2006).[[27]]

[[28]] Id.; see also Video Privacy Protection Act, Electronic Privacy Info. Center, (last visited Feb. 28, 2012) (providing an overview of the VPPA).[[28]]

[[29]] See Video Privacy Protection Act, Electronic Privacy Info. Center, (last visited Feb. 28, 2012).[[29]]

[[30]] See H.R. 2471, 112th Cong. (1st Sess. 2011).[[30]]

[[31]] The Senate Judiciary Committee had a hearing on the VPPA in January.  See The Video Privacy Protection Act: Protecting Viewer Privacy in the 21st Century: Hearing Before the Senate Committee on the Judiciary, Subcommittee on Privacy, Technology, and the Law, 112th Cong. (2d Sess. 2012), available at 3be6d4e412d460f. See also Grant Gross, Lawmakers Question Proposed Change to Video Privacy Law, PCWorld (Jan. 31, 2012),[[31]]

[[32]] Jaikumar Vijayan, IRS, DOJ Use Social Media Sites to Track Deadbeats, Criminal Activity, Computerworld (Mar. 16, 2010),[[32]]

[[33]] 425 U.S. 435, 443 (1976).[[33]]

[[34]] United States v. Jones, No. 10–1259, slip op. at 3–6 (U.S. Jan. 23, 2012) (Sotomayor, J., concurring).[[34]]

[[35]] 389 U.S. 347, 348, 352 (1967); see also id. at 361 (Harlan, J., concurring) (developing the reasonable expectation of privacy test).  Later Courts would adopt the reasonable expectation of privacy test.  See Smith v. Maryland, 442 U.S. 735, 740 (1979).[[35]]

[[36]] Jones, slip op. at 10 (Alito, J., concurring in judgment). In his Jones concurrence, Alito noted that “even if the public does not welcome the diminution of privacy that new technology entails, they may eventually reconcile themselves to this development as inevitable.”  Id.  At oral argument, Alito remarked that “[t]echnology is changing people’s expectations of privacy. Suppose we look forward 10 years, and maybe 10 years from now 90 percent of the population will be using social networking sites and they will have on average 500 friends and they will have allowed their friends to monitor their location 24 hours a day, 365 days a year, through the use of their cell phones. Then—what would the expectation of privacy be then?”  Transcript of Oral Argument at 44, United States v. Jones, 565 U.S. ___ (2012) (No. 10–1259).[[36]]

[[37]] See Alex Kozinski & Stephanie Grace, Pulling the Plug on Privacy: How Technology Helped Make the 4th Amendment Obsolete, The Daily (June 22, 2011),[[37]]

By: M. Ryan Calo

Professor Patricia Sánchez Abril opens her article, Private Ordering: A Contractual Approach to Online Interpersonal Privacy, with a profound insight: online interpersonal privacy suffers from a case of broken windows.[1] By “broken windows,” Professor Abril refers to the well-evidenced phenomenon that instances of minor disrepair can promote an overall environment of antisocial behavior.[2] Just as a building with one broken window will almost certainly have many more, so could other contexts degenerate if small infractions go visibly unaddressed.


Professor Abril invokes the metaphor of broken windows in the context of online interpersonal privacy to illustrate “the role of norms vis-à-vis legal rules in shaping human behavior.”[3] Specifically, she believes that some combination of four factors—the dominance of sharing-culture, the lack of close-knit groups, the dearth of opportunities for user control over data, and the misapplication of contract law—“conspire to create a public perception of ambivalence toward breaches of interpersonal privacy, perpetuating social disorder.”[4] Professor Abril ultimately “calls on the power of contract to create context and thereby address many online interpersonal privacy concerns.”[5]

I agree with much of Professor Abril’s sophisticated reframing of the problem.  I also see promise in her proposal to leverage contracts to combat a perception that “anything goes” on the Internet.[6] In particular, I appreciate the role Professor Abril has in mind for contract law: not just to create enforceable rights, but more importantly, to signal the solemnity of the transaction.  In her words: “Even when not readily enforceable by legal means, the mere existence of a contract serves the important role of expressing and establishing social norms.”[7]

Indeed, one way to challenge Professor Abril’s argument is to question whether contracts formed in the way she describes will carry any legal water at all.  Arguably, they will not.  Creative Commons gains force from copyright law, which provides an affirmative right that the right holder may then pare back or renounce.[8] The same is not true where, as in interpersonal privacy, there is no underlying statutory right.[9]  Even if we agree that opening an e-mail or accepting a friend request can constitute both consideration and assent for purposes of contract formation, it is hard to imagine how damages might be calculated.  Professor Abril concedes at length that damages may prove an insurmountably high hurdle to a successful claim.[10]

In many ways, observing that a system of interpersonal contracts may in practice be unenforceable misses the point.  What Professor Abril seems to be after is “a new set of norms;”[11] she wants to leverage the formality of contract law to “create context.”[12] Regardless of whether a court would permit recovery for a breach of Professor Abril’s protocol, the very use of that protocol—its mere existence—tends to combat the prevailing cavalier attitude toward interpersonal privacy that we see today.[13] It helps mend the broken windows.

But this observation raises a second question: if all we are doing is signaling, ought we not to prefer a nonbinding, norms-based approach to online communication such as that championed by Jonathan Zittrain and Lauren Gelman?[14] These authors eschew the use of law per se in favor of a model based outright on principles of neighborliness.  There may be times when upholding an agreement of confidentiality is not in the public interest and even the mantle of law is overkill.  For instance, what if a student’s use of a social network reveals that she is a danger to herself or others, but her peers have all contracted not to say anything?[15]

Both models suffer, incidentally, from a common limitation.  No matter how user-friendly the signal is, when faced with too much signaling, users may begin to tune it out.  What effect will a sea of icons, or the need to click assent for every bit of content, have on the average user?  Judging by the literature on information overload generally, and “wear out” specifically, there is a danger users will become inured to even a standardized system of online communication.[16]

This brings me to a final point about how best to improve a social environment.  What is interesting about the broken windows theory is not necessarily that vandalism influences norms; presumably there are many phenomena that influence norms.[17] It is that broken windows are features of the physical environment—they are a form of architecture.[18]

Professor Abril is hardly unaware of the importance of design, as evidenced by her condition that user-to-user contracts be “user-friendly” and “standardized.”[19] Importantly, she is also aware of the existence of literature in psychology suggesting that the form of a social interaction helps dictate its content.[20] She nevertheless underemphasizes what I consider to be a crucial point: the very design of a website has a powerful effect on user experience.  Many of the problems we face online result directly from design decisions that we could—and in some cases, ought to—revisit.

Consider the broken window of oversharing.  Design is instrumental both to promote and to combat this ostensible problem.  Social networks in particular are built to make sharing as attractive as possible.  Status-update fields loom large at the top of the screen, beckoning participation.  Comment fields are prepopulated with the user’s picture as though she has already begun to comment (might as well do so!).  These design decisions are not accidental.[21] Meanwhile, sharing content online has immediate, positive effects, whereas the downsides to sharing are not immediately felt.[22] The undergraduate deciding to share pictures of last night’s party probably has his fraternity brothers, not prospective employers, in mind.  He experiences positive feedback in the form of comments and “likes”; he may never know why he did not get that job.

Or consider the insight that people are more likely to disclose personal details to websites that are casual in design, as opposed to formal.  In a study by Leslie John and her colleagues, subjects were more likely to admit to controversial conduct when the study was presented in a silly, playful format.[23] This insight has policy repercussions.  We are ostensibly most concerned with the online disclosure behavior of children, for instance, so much so that we have a special law around it.[24] And yet what are the most casual websites on the Internet, including with respect to online forms that collect information?  The kids don’t stand a chance.

A more direct and potentially more effective way to address online privacy’s broken windows is to examine the design of websites themselves—windows, in a sense, onto the Internet.  What we need in privacy, I believe, is a set of architectural values—both aesthetic and systematic—capable of transforming the web experience in ways that promote public policy goals such as privacy and security.  In the 1970s, architect Oscar Newman revolutionized public housing by introducing the concept of defensible space.[25] We need an Oscar Newman for online privacy.

How might the design of websites, phones, energy meters, and other products help provide the user with an accurate mental model of data practice?  How might we empower users to frame their content in ways that limit abuse without recourse to contracts or even words?  These are the challenges that Acquisti, Nancy Kim,[26] Woodrow Hartzog,[27] and others have started to address in their work (and which underpin my own notion of nonlinguistic or “visceral” notice).[28] In a sense, this Article is not a response to Professor Abril’s thought-provoking and well-argued article.  It is an invitation.  Professor Abril ought to take her own metaphor more literally.

[1]. Patricia Sánchez Abril, Private Ordering: A Contractual Approach to Online Interpersonal Privacy, 45 Wake Forest L. Rev. 689, 690 (2010).

[2]. Id.; see also id. at 690 n.11 (citing evidence of the broken windows phenomenon).

[3]. Id. at 691.

[4]. Id. at 694.

[5]. Id.

[6]. Id. at 695, 719, 726.

[7]. Id. at 707.

[8]. 17 U.S.C. § 106 (2006) (granting various exclusive rights in copyrighted works to the copyright owner).

[9]. Federal and state law protects privacy not through a single, baseline statute, but in piecemeal through a series of sector or activity-specific statutes, common law torts, and constitutional doctrines.  See, e.g., Privacy Laws, California Office of Privacy Protection, _laws.htm (providing a list of some of the state and federal privacy laws of the United States).

[10]. Abril, supra note 1, at 716–19.

[11]. Id. at 723.

[12]. Id. at 691, 694.

[13]. Id. at 719 (“Although the ideal would be a legally enforceable contract, not all promises of confidentiality must be formal contracts in order to effectively safeguard privacy and counteract an ‘anything goes’ attitude toward online privacy.  Sociolegal scholarship indicates that the very existence of a promise or obligation can change social norms.”).

[14]. See Lauren Gelman, Privacy, Free Speech, and “Blurry Edged” Social Networks, 50 B.C. L. Rev. 1315, 1342 (2009) (suggesting “a tool for users to express and exercise privacy preferences over uploaded content.  It would permit users to express their intentions by tagging any uploaded content with an icon that immediately conveys privacy preferences to third parties.”); Jonathan Zittrain, Privacy 2.0, 2008 U. Chi. Legal F. 65, 106–09 (discussing the application of “code-based norms” to privacy).

[15]. Professor Abril assures us that her system, though it restricts information flow, “will not chill speech.”  Abril, supra note 1, at 722.  It may promote interpersonal intimacy, but it will also invoke the force and solemnity of law to limit sharing.  Id.

[16]. See, e.g., Christine Jolls & Cass R. Sunstein, Debiasing through Law, 35 J. Legal Stud. 199, 212 (2006) (describing “wear out” as the phenomenon “in which consumers learn to tune out messages that are repeated too often”).

[17]. For a detailed discussion, see Lawrence Lessig, The New Chicago School, 27 J. Legal Stud. 661 (1998).

[18]. Id. at 663.

[19]. Abril, supra note 1, at 720.

[20]. Id. at 699 n.74.

[21]. Nor are they intrinsically harmful.  The point of the service is, after all, to communicate.

[22]. Carnegie Mellon Professor Alessandro Acquisti evidences this phenomenon in forthcoming work.

[23]. Leslie K. John, Alessandro Acquisti & George Loewenstein, Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information, 37 J. Consumer Res. 858, 868–69 (2011).

[24]. See Children’s Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501–6506 (2006).

[25]. Oscar Newman, Defensible Space: Crime Prevention Through Urban Design (1972) (arguing that architecture and urban design influence negative social behavior).

[26]. See, e.g., Nancy Kim, Online Contracts: Form as Function (2010) (unpublished manuscript) (on file with author); cf. Nancy Kim, Website Proprietorship and Online Harassment, 2009 Utah L. Rev. 993, 1014–17 (2009) (describing contractual and architectural techniques to constrain online harassment).

[27]. See, e.g., Woodrow Hartzog, Promises and Privacy: Promissory Estoppel and Confidential Disclosure in Online Communities, 82 Temp. L. Rev. 891, 907–08 (2009); Woodrow Hartzog, Website Design as Contract, 60 Am. U. L. Rev. (forthcoming 2011).

[28]. See, e.g., Steve Lohr, Redrawing the Route to Online Privacy, N.Y. Times, Feb. 28, 2010, at BU4 (“M. Ryan Calo, . . . at the Center for Internet and Society at the Stanford Law School, is exploring technologies that deliver ‘visceral notice.’  His research involves voice and animation technology that emulates humans.”).