By Beverly Cohen*

Introduction

On June 23, 2011, the United States Supreme Court, in Sorrell v. IMS Health Inc.,[1] determined that Vermont’s law prohibiting pharmacies from selling prescription data to “data-mining companies” violated the Free Speech Clause of the First Amendment.[2]  Data miners purchased the prescription data to aggregate and resell it to pharmaceutical manufacturers for marketing purposes.[3]  Drug manufacturers used the information to target physicians for face-to-face visits (“detailing”) by sales representatives to convince the physicians to prescribe more of the manufacturers’ costly brand-name drugs.[4]  The prescription information purchased from the data miners enabled the manufacturers to target particular physicians who were not prescribing their brand-name drugs or who were prescribing competing drugs.[5]

Several states objected to drug manufacturers’ use of prescription information for detailing, contending that it increased sales of brand-name drugs and drove up healthcare costs.[6]  When these states passed laws preventing the pharmacies’ sale of the prescription information to data-mining companies and the use of this information by drug manufacturers,[7] the data miners and drug manufacturers sued.[8]

When the challenge to Vermont’s data-mining law reached the Supreme Court, the Court invalidated it on the grounds that it violated the Free Speech Clause.[9]  The Court held that the law did not survive strict scrutiny: it prohibited the use of prescription information with particular content (prescriber histories) by particular speakers (data miners and detailers)[10] and did not advance Vermont’s asserted goals of ensuring physician privacy, improving the public health, and containing healthcare costs in a permissible way.[11]

The Federal Privacy Rule,[12] implementing the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”),[13] is similar to the data-mining laws in its restrictions on the disclosure of private health information.[14]  This Article applies the HIPAA Privacy Rule to the practice of data mining and, surprisingly, finds that HIPAA restricts it.[15]  The Privacy Rule flatly prohibits any unauthorized use or disclosure of protected health information for marketing purposes.[16]  Nevertheless, the practice of data mining continues despite HIPAA.  In fact, at least one court has recently declared that nothing in HIPAA restricts data mining.[17]

The question post-Sorrell is whether the marketing provisions of the Privacy Rule, like Vermont’s data-mining law, also violate freedom of speech.[18]  Although there are obvious similarities between HIPAA’s marketing provisions and the marketing restrictions of Vermont’s data-mining law, there are also substantial differences.[19]  The structure of the Privacy Rule is quite unlike that of the data-mining law: the discriminatory intent and impact that the Supreme Court found objectionable in Sorrell are largely absent from HIPAA.[20]  Unlike Vermont’s data-mining law, the Privacy Rule does not target disclosures with particular content or by particular speakers.[21]  This Article therefore concludes that applying the Sorrell analysis to the Privacy Rule would likely yield a different answer.[22]

This Article explains the practices of data mining and detailing[23] and describes the state laws that sprang up to prohibit them.[24]  It next discusses the various judicial outcomes of the data miners’ challenges to those laws,[25] culminating in the Supreme Court’s invalidation of Vermont’s data-mining law in Sorrell.[26]  The Article then applies the HIPAA Privacy Rule to the Sorrell facts and finds that the marketing provisions of HIPAA disallow the unauthorized use of such information for sale to data miners.[27]  Finally, the Article compares the HIPAA Privacy Rule to the data-mining law invalidated in Sorrell and finds that their structures considerably differ.[28]  Based on these differences, the Article opines that HIPAA presents a substantially different question from that considered in Sorrell and likely yields a different answer.[29]

I.  The Targeted Practices: Data Mining and Detailing[30]

Every time a pharmaceutical prescription is filled, the pharmacy retains information describing the transaction.[31]  These records generally include the identification of the patient; identification of the prescribing physician, including his name, address, and phone number; the drug prescribed, its dosage, and its refill information; price; and insurance information.[32]  In many cases, state law requires this information to be collected and maintained by the pharmacies[33] so that the state can monitor cases of illicit prescriptions and fraudulent prescribing practices by physicians.[34]

Companies, such as IMS Health Inc. and Verispan, LLC,[35] are in the business of “mining” this pharmacy data.[36]  They purchase from the pharmacies prescription data that the pharmacies’ computer software has collected and encrypted so that individual patients cannot be identified by name.[37]  The prescription information that data miners purchase is estimated to encompass several billion prescriptions per year.[38]  The data miners then aggregate the entries,[39] group the information by prescriber, and cross-reference the prescribing history with information on each prescriber available through publicly accessible databases, such as the American Medical Association’s database of physician specialists.[40]  The ultimate reports that the data miners produce show each prescriber’s identity, medical specialty, and a complete history of the drugs he or she prescribed over a given period of time.
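The grouping and cross-referencing described above can be pictured, in simplified form, as an ordinary aggregation and lookup operation over pharmacy records.  The following Python sketch is purely illustrative: the field names, the specialty lookup table, and the function name are hypothetical stand-ins for the data miners’ proprietary systems, not a description of any actual product.

# Minimal sketch only; the record field names and the specialty_db lookup
# are hypothetical assumptions, not drawn from any actual data-mining system.
from collections import defaultdict

def build_prescriber_reports(prescription_records, specialty_db):
    """Group pharmacy records by prescriber and attach publicly available
    specialty information (e.g., from an AMA-style database), mirroring the
    aggregation described in the text."""
    histories = defaultdict(list)
    for rec in prescription_records:
        # Patient identifiers are assumed to have been encrypted or stripped
        # by the pharmacy before sale; only prescriber-identifying fields
        # are used here.
        histories[rec["prescriber_id"]].append(
            {"drug": rec["drug"], "dosage": rec["dosage"], "date": rec["date"]}
        )
    reports = []
    for prescriber_id, entries in histories.items():
        reports.append({
            "prescriber": prescriber_id,
            "specialty": specialty_db.get(prescriber_id, "unknown"),
            "prescribing_history": sorted(entries, key=lambda e: e["date"]),
        })
    return reports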

The data miners’ customers for these reports are the pharmaceutical manufacturers because the reports are useful in facilitating the drug manufacturers’ practice of “detailing.”  This practice consists of drug-sales representatives visiting physicians and their staffs in a particular region where specific drugs are being marketed.[41]  At these face-to-face meetings, the sales representatives give the physicians “details” about their drugs (use, side effects, and risks) to convince the physicians that those drugs are a better choice for their patients.[42]  Described as a “valuable tool,”[43] the data-mining reports allow the drug representatives to pinpoint prescribers who might be persuaded to switch to the manufacturer’s drugs or to prescribe the manufacturer’s drugs more frequently.[44]  The data-mining reports also enable the representatives to tailor their presentations based on the particular physician’s prescribing practices to maximize the effectiveness of their sales efforts:

That [data-mining] information enables the detailer to zero in on physicians who regularly prescribe competitors’ drugs, physicians who are prescribing large quantities of drugs for particular conditions, and “early adopters” (physicians with a demonstrated openness to prescribing drugs that have just come onto the market).  The information also allows the detailer to tailor her promotional message in light of the physician’s prescribing history.[45]

Merck’s use of data mining to market Vioxx provides an example of the usefulness of data mining to sell a particular drug:

When Merck marketed Vioxx, for example, it used a wealth of prescriber-identifying data to create monthly reports on individual prescribers in each detailer’s assigned territory.  The reports showed how many Merck versus non-Merck drugs the prescriber prescribed and estimated how many of these prescriptions could be substituted for Merck products.  Merck then tracked its detailers’ progress in converting prescribers in their territories to the Merck brand and gave detailers bonuses based on Merck’s sales volume and market share in the detailer’s territory.[46]

Detailing has been described as “a massive and expensive undertaking for pharmaceutical manufacturers.”[47]  Manufacturers reportedly spent $4 billion in 2000 for detailing,[48] employing some 90,000 sales representatives to make the physician office visits.[49]  The detailers often arrive with small gifts for the physicians and their staffs and drop off free drug samples for the physicians to try with their patients.[50]  It has been estimated that a single physician is visited by an average of twenty-eight detailers a week, and a specialist is visited by an average of fourteen detailers.[51]  Because of the time involved and high cost of detailing, drug manufacturers usually reserve it for marketing high-cost, brand-name drugs,[52] as opposed to lower-cost, generic drugs.[53]  Sales representatives try to convince physicians to switch from generic drugs to their brand-name drug, to utilize it instead of a competing brand-name drug, or to remain loyal to the brand-name drug when the patent expires and generic versions become available.[54]

II.  States’ Objections to Drug Manufacturers’ Use of Data Mining for Detailing

Some states, including New Hampshire, Maine, and Vermont, perceived that pharmaceutical manufacturers’ use of pharmacy data to enhance their detailing efforts increased the cost of prescription drugs with no concomitant improvement to the public health.[55]  These perceptions emanated from several factors.

First, the states became convinced that data mining improved the success of detailing.[56]  These states perceived that “detailers armed with prescribing histories enjoyed a significant marketing advantage, resulting in greater leverage, [and] increased sales of brand-name drugs.”[57]  This “leverage” refers to the detailer’s ability to target physicians who prescribe large quantities of generics, the ability to “zero in” on a physician’s particular prescribing choices, and the ability to “punish” physicians who abandon their loyalty to certain brand-name drugs.[58]  Thus, “prescribing histories helped the detailer to become more adversarial in her presentation and to focus on the weakness of the physician’s erstwhile drug of choice as opposed to the clinical virtues of the detailed drug.”[59]

Second, the states believed that the success of detailing often resulted from less than accurate and balanced information.  Vermont negatively characterized the detailers’ provision of information to physicians on pharmaceutical safety and efficacy as “frequently one-sided,” “incomplete,” and “biased.”[60]  The Vermont legislature found that the “[p]ublic health is ill served by the massive imbalance in information presented to doctors and other prescribers.”[61]  Vermont held detailers’ use of data mining responsible for creating “an unbalanced marketplace of ideas that undermines the state’s interests in promoting public health, protecting prescriber privacy, and reducing healthcare costs.”[62]

Third, the states perceived that detailing improperly influenced physicians’ prescription choices and unnecessarily raised the cost of prescription drugs.  New Hampshire viewed detailing as having a “pernicious effect” upon drug prescribing.[63]

The states’ “common sense” conclusion was that detailing worked to induce physicians to prescribe larger quantities of more expensive brand-name drugs.[64]  The fact “that the pharmaceutical industry spends over $4 billion annually on detailing bears loud witness to its efficacy.”[65]  Despite the much higher cost of detailed drugs, New Hampshire concluded that, based upon “competent evidence,” drugs that were aggressively marketed through detailing “provide no benefit vis-à-vis their far cheaper generic counterparts.”[66]  The State maintained that “detailers armed with prescribing histories encouraged the overzealous prescription of more costly brand-name drugs regardless of both the public health consequences and the probable outcome of a sensible cost/benefit analysis.”[67]

Finally, doctors themselves voiced “a predominantly negative view of detailing.”[68]  A 2006 survey by the Maine Medical Association reported that “a majority of Maine physicians did not want pharmaceutical manufacturers to be able to use their individual prescribing histories for marketing purposes.”[69]

III.  State Laws Regulating Data Mining[70]

In the interests of protecting prescriber privacy, safeguarding the public health, and containing healthcare costs,[71] New Hampshire in 2006 became the first state to enact a law limiting drug prescription data mining, known as the Prescription Information Law.[72]  The law prohibited the sale, transfer, use, or licensing of prescription records by pharmacies and insurance companies for any commercial purpose,[73] except for listed health-related purposes, such as pharmacy reimbursement, healthcare management, utilization review by a healthcare provider, and healthcare research.[74]  The statute did not prohibit the transfer of prescription information to fill patients’ prescriptions[75] and placed no restrictions upon prescription information that did not identify the patient or the prescriber.[76]

Shortly thereafter, Vermont followed suit, enacting Act 80, section 17 of the Vermont General Statutes to restrict the use of pharmacy records for drug marketing.[77]  Vermont’s policy goals, compatible with those of New Hampshire, were:

[T]o advance the state’s interest in protecting the public health of Vermonters, protecting the privacy of prescribers and prescribing information, and to ensure costs are contained in the private health care sector, as well as for state purchasers of prescription drugs, through the promotion of less costly drugs and ensuring prescribers receive unbiased information.[78]

Unlike the flat prohibition of New Hampshire’s statute, however, Vermont’s law adopted an “opt-out” approach, prohibiting insurers and pharmacies from selling or transferring prescription data for marketing purposes unless the prescriber opted out of the prohibition by consenting to the use.[79]  The law also prohibited pharmaceutical manufacturers from using the data for marketing absent prescribers’ consent.[80]  The law defined “marketing” as advertising or any activity that influenced the sale of a drug or influenced prescribing behavior.[81]  The statute contained a number of exceptions to the prohibition, most of which facilitated healthcare treatment and reimbursement, such as dispensing prescriptions, pharmacy reimbursement, patient care management, utilization review by healthcare professionals, healthcare research, and communicating treatment options to patients.[82]  The law also created a program to educate healthcare professionals on therapeutic and cost-effective drug prescribing.[83]

In 2008, Maine enacted similar legislation.  Its goals, like those of New Hampshire and Vermont, were “to improve the public health, to limit annual increases in the cost of healthcare and to protect the privacy of . . . prescribers in the healthcare system of this State.”[84]  Unlike Vermont’s “opt-out” approach, Maine passed an “opt-in” version, making it unlawful for a pharmacy to use, sell, or transfer prescription drug information for any marketing[85] purpose when the information identified the prescriber and the prescriber had opted in by registering for the statute’s protection.[86]  The law included a number of health-related exceptions to the definition of “marketing,” including pharmacy reimbursement, patient care management, utilization review by a healthcare provider, and healthcare research.[87]

IV.  Challenges to the Data-Mining Laws

With a number of other states considering enactment of similar laws,[88] and facing the loss of billions of dollars in business annually, several data-mining companies[89] and an association of pharmaceutical manufacturers[90] challenged the constitutionality of New Hampshire’s, Vermont’s, and Maine’s data-mining laws.[91]  The claims that survived to appeal included that the statutory prohibition violated the Free Speech Clause of the First Amendment, was unconstitutionally vague and overbroad under the First and Fourteenth Amendments, and offended the Commerce Clause.[92]

In 2008, the United States Court of Appeals for the First Circuit ruled in IMS Health Inc. v. Ayotte[93] that the New Hampshire statute regulated conduct, not speech, and therefore did not abridge the First Amendment rights of the data miners.[94]  Alternatively, the First Circuit ruled that even if New Hampshire’s law amounted to a regulation of protected speech, New Hampshire’s action to protect cost-effective healthcare passed constitutional muster.[95]  Utilizing the Central Hudson test,[96] the court found that healthcare cost containment was a substantial governmental interest,[97] data mining increased the success of detailing,[98] detailing increased the cost of prescription drugs,[99] and the statute was sufficiently tailored to achieve its objectives.[100]  The court summarily disposed of the remaining claims for vagueness[101] and violation of the Commerce Clause.[102]

When the challenge to the Maine statute reached the First Circuit in IMS Health Inc. v. Mills[103] approximately two years later, the court unsurprisingly relied upon its prior New Hampshire Ayotte ruling.[104]  The court rejected the First Amendment claim,[105] the vagueness claim,[106] and the Commerce Clause challenge[107] for the same reasons stated in Ayotte.

Four months after Mills, the United States Court of Appeals for the Second Circuit ruled on the same issues with regard to the Vermont statute in IMS Health Inc. v. Sorrell.[108]  In Sorrell, the Second Circuit disagreed with nearly every basis for the First Circuit’s two prior decisions.  Applying the Central Hudson test,[109] the court found that although Vermont did have a substantial interest in lowering healthcare costs and protecting the public health,[110] the statute did not directly advance those interests.[111]  Rather, the court characterized Vermont’s law as an attempt “to bring about indirectly some social good or alter some conduct by restricting the information available to those whose conduct the government seeks to influence.”[112]  Moreover, the court found that Vermont had “more direct, less speech-restrictive means available” to accomplish its goals.[113]  As less restrictive alternatives, the court suggested that the State could have assessed the results of its campaign to encourage the use of generics or could have mandated the use of generic drugs as a first course of treatment.[114]  Failing these critical prongs of the Central Hudson test, the court ruled that Vermont’s law unconstitutionally restricted freedom of speech.[115]

With the First Circuit and Second Circuit Courts of Appeal thus directly at odds on the constitutionality of the data-mining laws, the United States Supreme Court granted certiorari to consider Vermont’s appeal in Sorrell v. IMS Health Inc.[116]

V.  The Supreme Court’s Decision

In June 2011, the Supreme Court, in a six to three ruling,[117] held that Vermont’s drug prescription data-mining law violated the First Amendment.[118]  While conceding that Vermont’s asserted policy goals of containing pharmacy prescription costs and protecting public health were legitimate concerns,[119] the Court held that the statute was a broad, content-based rule[120] that did not satisfy strict scrutiny.[121]

Initially, the Court unequivocally held that the Vermont law was content and speaker based, as it prohibited the sale of pharmaceutical prescription data only for marketing purposes[122] and only to pharmaceutical manufacturers.[123]  Because the law “impose[d] burdens that are based on the content of speech and that are aimed at a particular viewpoint,” the Court ruled that it must apply strict scrutiny.[124]

The Court flatly rejected Vermont’s argument that the law regulated conduct as opposed to speech.[125]  Instead, the Court ruled that “[f]acts, after all, are the beginning point for much of the speech that is most essential to advance human knowledge and to conduct human affairs.  There is thus a strong argument that prescriber-identifying information is speech for First Amendment purposes.”[126]

Applying the Central Hudson test, whereby “[t]here must be a ‘fit between the legislature’s ends and the means chosen to accomplish those ends,’”[127] the Court held that none of the State’s asserted justifications—prescriber privacy, protecting public health, and reducing healthcare costs—withstood scrutiny.[128]  First, because the law permitted disclosure of prescription information for a number of other purposes and applied the ban only to marketing, the Court rejected the privacy justification.[129]  The Court ruled that Vermont’s statute “permits extensive use of prescriber-identifying information and so does not advance the State’s asserted interest in physician confidentiality.”[130]  In particular, the Court objected to the State’s own ability to use the same prescription information to engage in “counter-detailing” efforts to promote generic drugs.[131]  Moreover, the Court observed that privacy remedies less restrictive of speech were available.[132]  For example, prescribers could simply decline to meet with detailers.[133]  Even though physicians might find the use of their prescription histories by detailers to be “underhanded” or tantamount to “spying,”[134] the Court declared that “[s]peech remains protected even when it may . . . ‘inflict great pain.’”[135]

In similar fashion, the Court declared that Vermont’s stated policy goals of improving public health and reducing healthcare costs did not withstand scrutiny under the Central Hudson test,[136] because the law “does not advance them in a permissible way.”[137]  The law sought to protect patients’ health and contain costs only indirectly, resting on the fear that physicians, admittedly sophisticated consumers,[138] would make poor purchasing decisions if given truthful information by detailers.[139]

In short, the Court viewed the statute as a means for the State to advance its own views over those of pharmaceutical manufacturers by stifling protected speech.[140]  The Court stated that if the statute had provided for only a few narrowly tailored exceptions to its ban on the sale or disclosure of prescription information, then the State’s position that it was not targeting a disfavored speaker and disfavored content might be stronger.[141]  But, here, the law permitted disclosure of the same information to countless others and even to the State itself to persuade physicians to prescribe generic drugs.[142]  The Court declared that free access to and use of privately held information is “a right too essential to freedom to allow its manipulation to support just those ideas the government prefers.”[143]  The Court concluded that “the State has left unburdened those speakers whose messages are in accord with its own views.  This the State cannot do.”[144]

Several days after the Sorrell decision was issued, the Supreme Court vacated the First Circuit’s finding that the Maine data-mining laws were valid and remanded the case to the court for further consideration in light of Sorrell.[145]  Three months later, the New Hampshire District Court issued an order declaring that New Hampshire’s data-mining laws were invalid in light of Sorrell.[146]

VI.  Applying the HIPAA Privacy Rule to Data Mining

Because New Hampshire, Vermont, and Maine each enacted state laws prohibiting pharmacies from selling prescription information to data miners for use in detailing, these states presumably perceived that such laws were necessary to ban the practice.  This necessity apparently stemmed from the states’ belief that nothing in the HIPAA Privacy Rule prohibited these data sales by the pharmacies.  This Part of the Article explains why that belief is not supported by the text of HIPAA.

A.     Provisions of the HIPAA Privacy Rule

The HIPAA Privacy Rule[147] regulates covered entities’ use and disclosure of protected health information.[148]  The covered entities regulated by HIPAA include most health plans and healthcare providers.[149]  The term “provider” is defined by the Rule as “a provider of medical or health services . . . and any other person or organization who furnishes, bills, or is paid for health care in the normal course of business.”[150]

Under HIPAA, any time a covered entity uses or discloses protected health information, the use or disclosure must comply with HIPAA’s privacy provisions.[151]  The term “use” is broadly defined as “the sharing, employment, application, utilization, examination, or analysis” of health information protected by HIPAA.[152]  “Disclosure” is also broadly defined as “the release, transfer, provision of, access to, or divulging in any other manner of information outside the entity holding the information.”[153]

The health information protected by the Privacy Rule includes any information relating to healthcare treatment or payment[154] that has a potential to identify the patient to whom the information applies.[155]  Identifiers that can render health information protected include, inter alia, the patient’s name, address, social security number, phone number, photograph, zip code, treatment date, employer, and names of spouse and children.[156]  Furthermore, any identifier that is not specifically named in the Privacy Rule but, due to its uniqueness, has a potential to identify the subject of the information also renders the information protected.[157]

Under HIPAA, any time a covered entity uses or discloses protected health information, the use or disclosure must comply with HIPAA’s privacy provisions.[151]  The term “use” is broadly defined as “the sharing, employment, application, utilization, examination, or analysis” of health information protected by HIPAA.[152]  “Disclosure” is also broadly defined as “the release, transfer, provision of, access to, or divulging in any other manner of information outside the entity holding the information.”[153]

Most of the permissive uses and disclosures under the Privacy Rule fall into two broad categories.[163]  First, covered entities may use and disclose protected health information for “treatment, payment, or healthcare operations.”[164]  “Treatment” is defined as the rendering of healthcare services to individuals or managing their care.[165]  “Payment” comprises paying insurance premiums and reimbursing providers.[166]  “Health care operations” broadly encompasses operating the business of healthcare entities, including such activities as business management and administrative activities, quality assessment, evaluating the credentials of providers, customer service, and obtaining legal and auditing services.[167]  Thus, treatment, payment, and healthcare operations cover the myriad activities that allow the healthcare industry to function.

The second broad category of permissive uses allows covered entities to use and disclose protected health information for twelve public-interest activities.[168]  These include, inter alia, participating in public-health activities to prevent or control disease; reporting abuse, neglect, or domestic violence; complying with healthcare audits and investigations; assisting law enforcement activities; engaging in healthcare research; and assisting national security and intelligence activities.[169]

If a covered entity’s use or disclosure of protected health information does not fit within one of the Privacy Rule’s enumerated required or permitted uses and disclosures, then the use or disclosure may not occur[170] unless the individual authorizes the use or disclosure in writing.[171]

As the primary goal of HIPAA is to protect the privacy of individuals’ healthcare information,[172] HIPAA grants individuals rights of access to their own information and rights to control its uses and disclosures by covered entities.  These rights include the following:
(1) A right of individuals to access upon request their own protected health information,[173] along with a right to appeal denials of access;[174]

(2) A right of individuals to seek to amend their protected health information possessed by covered entities,[175] as well as a right to submit a written statement disagreeing with a denial of an amendment;[176]

(3) A right of individuals to receive an accounting of certain disclosures of their protected health information made by covered entities;[177]

(4) A right of individuals to request covered entities to restrict certain permissible uses and disclosures of their protected health information;[178]

(5) A right of individuals to request confidential communications of protected health information from providers and health plans,[179] which providers must accommodate[180] and which health plans must accommodate if the individuals state that they will be in danger unless accommodation is made;[181]

(6) A right of individuals to agree or object before covered entities make certain disclosures;[182]

(7) A right of individuals to authorize disclosures to third parties;[183] and

(8) A right of individuals to receive a Notice of Privacy Practices from covered entities, describing the covered entities’ uses and disclosures of their protected health information and the individuals’ rights thereunder.[184]

B.     HIPAA’s De-identification and Marketing Provisions

HIPAA’s de-identification and marketing provisions are especially relevant to data mining.  In Sorrell, the data mining involved de-identification because, when the pharmacies’ computer software collected the raw prescription data, the software encrypted or stripped out the patients’ identifying information.[185]  Therefore, when the pharmacies sold the information to the data miners, it had been de-identified because the patients could no longer be identified by name.[186]

The Privacy Rule provides that once protected health information is de-identified, it is no longer protected by HIPAA and thus is not subject to HIPAA’s use and disclosure restrictions.[187]  HIPAA gives explicit instructions on what information must be removed from protected health information to render it de-identified.[188]  Further, HIPAA specifically permits covered entities to de-identify protected health information.[189]  Moreover, HIPAA defines “health care operations,” one of the permissive uses and disclosures of protected health information under the Privacy Rule,[190] to include a covered entity’s creation of de-identified information when the de-identification relates to a “covered function.”[191]
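In simplified terms, de-identification amounts to removing an enumerated set of identifying fields from each record before the information is used or sold.  The following Python sketch is illustrative only: the field names are hypothetical, the identifier set is abbreviated from the identifiers noted above, and the Privacy Rule’s actual de-identification standards are considerably more detailed.

# Minimal sketch only; the identifier set is abbreviated and the record
# field names are hypothetical assumptions, not the Rule's full list.
IDENTIFIER_FIELDS = {
    "patient_name", "address", "phone_number", "social_security_number",
    "photograph", "zip_code", "treatment_date", "employer",
    "spouse_name", "children_names",
}

def de_identify(record):
    """Return a copy of a pharmacy record with the listed identifying fields
    removed, approximating the stripping of patient identifiers described in
    the text."""
    return {field: value for field, value in record.items()
            if field not in IDENTIFIER_FIELDS}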

HIPAA’s marketing provisions are also particularly relevant to data mining.  The data miners purchased the prescription information to market their aggregations and reports to pharmaceutical manufacturers.[192]  The drug manufacturers, in turn, purchased the prescription information to more effectively market their brand-name drugs to prescribers.[193]

HIPAA expressly provides that covered entities’ uses and disclosures of protected health information for the purpose of “marketing” are subject to heightened restrictions.[194]  HIPAA defines “marketing” in two ways.  First, marketing includes “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service.”[195]  However, this definition excludes communications made for description of plan benefits, for treatment of the individual, or for case management or care coordination of the individual.[196]  Second, marketing includes a covered entity’s sale of protected health information to a third party to assist that party in marketing its products.[197]

HIPAA’s primary marketing restriction is that whenever a covered entity uses or discloses protected health information for marketing purposes, the individual must expressly authorize the use or disclosure.[198]  This mandate is stated emphatically: “Notwithstanding any provision of this subpart,[199] other than the transition provisions in § 164.532,[200] a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”[201]

The Rule states only two exceptions to this requirement that the individual must authorize the marketing uses and disclosures.  First, an authorization is not needed if the marketing consists of a face-to-face communication between the covered entity and the individual.[202]  Second, an authorization is not needed if the marketing consists of a promotional gift of nominal value provided by the covered entity.[203]  The affected individual must authorize all other marketing uses and disclosures.[204]
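The combined effect of the authorization requirement and its two exceptions can be restated, for illustration, as a single decision rule.  The following Python sketch is a toy restatement under the assumptions stated in its comments; the parameter names are hypothetical, and the sketch is not a compliance tool and omits the Rule’s definitional nuances.

def marketing_use_requires_authorization(is_marketing, face_to_face=False,
                                         nominal_gift=False):
    """Toy restatement of the authorization requirement quoted above."""
    if not is_marketing:
        # Non-marketing uses and disclosures are governed by the Rule's
        # other required and permissive provisions, not by this requirement.
        return False
    # The two exceptions noted above: a face-to-face communication with the
    # individual, or a promotional gift of nominal value.
    return not (face_to_face or nominal_gift)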

C.     How HIPAA’s Marketing and De-identification Rules Impact Data Mining

While Vermont and other states apparently believed that it was necessary to enact a law to prohibit pharmacies from selling prescription information to data miners, surprisingly, such laws were probably not necessary.  HIPAA already appears to have prohibited those sales, rendering the state laws inconsequential.

As explained above, HIPAA requires an authorization from every affected individual before his protected health information can be used or disclosed by covered entities for marketing purposes.[205]  Each Vermont pharmacy qualifies as “a provider of medical or health services” and as an entity that “furnishes, bills, or is paid for health care in the normal course of business.”[206]  Thus, the pharmacies are covered entity providers under HIPAA.[207]  The prescription information collected and retained by the pharmacies constitutes “protected health information,” as it includes the patients’ names and addresses, as well as other identifying information.[208]

Moreover, the pharmacies’ disclosures of prescription information to the data miners appear to have been “for marketing.”[209]  The pharmacies made the disclosures to data miners to enable the data miners to sell their aggregations and reports of pharmacy data to their customers, including drug manufacturers.[210]  Further, the data miners disclosed the prescription information to drug manufacturers to use in marketing their brand-name drugs to physicians.[211]  Selling prescription information for these purposes appears to qualify as marketing under the Privacy Rule’s broad definition: “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service.”[212]

The “rub” with this analysis, however, is that according to the facts in Sorrell, the pharmacies did not disclose “protected health information” to the data miners because the information had been de-identified by the pharmacies’ computer software prior to the sale.[213]  As stated above,[214] HIPAA expressly permits covered entities to de-identify protected health information, thereby removing it from any constraints that HIPAA imposes.[215]  Therefore, the pharmacies’ de-identification of their prescription information may have removed the information from the category of “protected health information”[216] and thereby enabled the pharmacies to make whatever use they wished of the information without violating HIPAA.[217]

But HIPAA’s marketing restrictions do not prohibit only unauthorized disclosures of protected health information.  The restrictions also prohibit the pharmacies from even using protected health information for marketing purposes.[218]  Creating de-identified information from protected health information appears to constitute a use of the protected health information because the pharmacies must “employ” and “utilize” the protected information in order to de-identify it.[219]  In fact, the Privacy Rule itself refers to the “use” of protected health information to create de-identified information.[220]  Here, the purpose of the pharmacies’ de-identification of the prescription information was to facilitate sales of the information to the data miners and to enable sales of the data miners’ aggregations and reports to the drug manufacturers, all for the purpose of marketing brand-name drugs to prescribers.[221]  Therefore, the de-identification itself appears to qualify as a marketing use,[222] so that the pharmacies would be prohibited from such use without the individuals’ express authorization under the Privacy Rule.[223]

Further, the Privacy Rule’s explicit statement that covered entities may de-identify protected health information[224] does not negate the authorization requirement.  The requirement that covered entities must obtain an authorization before any use or disclosure related to marketing expressly states that this requirement is imposed “[n]otwithstanding any provision of this subpart.”[225]  HIPAA’s de-identification provisions, on the other hand, lack this vital “notwithstanding” language.[226]  Therefore, the requirement to obtain individuals’ authorizations for any use or disclosure related to marketing trumps the de-identification provisions.  Although HIPAA expressly permits covered entities and their business associates to de-identify protected health information,[227] it appears that any such use (i.e., de-identification) of protected health information for marketing purposes may not occur without written authorizations from the affected individuals.[228]

Under this reading of HIPAA, whenever the purpose of the de-identification is marketing, a pharmacy must obtain a written authorization from every individual whose protected health information is being so used before it de-identifies its prescription information.[229]  Granted, the requirement to obtain authorizations is not an outright prohibition against the de-identification, subsequent sale, or ultimate use of the information for marketing.  However, the requirement to obtain an authorization from every individual—where billions of prescriptions are being disclosed[230]—places such an enormous burden on the pharmacies that, for all practical purposes, it quashes use of the information for data mining.[231]  Not only will pharmacies need to obtain written authorizations from every individual before his information may be de-identified or disclosed, but it is likely that most of these patients will either refuse to furnish the authorizations or not bother to execute them.[232]  Consequently, HIPAA’s authorization requirement adds so much additional effort and cost to data mining that drug companies will probably no longer find it a cost-effective tool for detailing.[233]

D.     Continuing Failure to Use HIPAA to Restrict Data Mining

It is apparent that parties continue to overlook the marketing provisions of the Privacy Rule as a means to restrict data mining.  In a recent case, Steinberg v. CVS Caremark Corp.,[234] prescription drug purchasers sued a pharmacy chain for, inter alia, its disclosures of the purchasers’ prescription information to data miners.  Plaintiffs challenged the pharmacies for accepting remuneration from drug manufacturers for (1) sending letters to the consumers’ physicians suggesting that they prescribe alternate drugs, and (2) selling de-identified prescription information directly to the drug manufacturers and data companies.[235]  The plaintiffs brought state law claims for violation of Pennsylvania’s Unfair Trade Practices and Consumer Protection Law, unjust enrichment, and invasion of privacy.[236]

The United States District Court for the Eastern District of Pennsylvania dismissed the complaint for failure to state a claim.[237]  In so doing, the court made erroneous findings that nothing in the Privacy Rule restricted the defendants’ activities.[238]  First, the court declared that the pharmacies’ sale of de-identified drug prescription information to pharmaceutical manufacturers and data companies for marketing purposes did not offend the HIPAA Privacy Rule because the information had been de-identified prior to sale.[239]  Second, the court stated that the pharmacies’ use of the plaintiffs’ protected health information to send marketing notices to the plaintiffs’ physicians did not violate HIPAA because this constituted permissible healthcare operations.[240]

In fact, the Privacy Rule does not permit either activity.  Without authorizations from the affected individuals, the pharmacies could not use the plaintiffs’ protected health information (even when such use is de-identification) for marketing purposes,[241] thereby rendering the unauthorized de-identification itself illicit.  Further, without the appropriate authorizations, the pharmacies could not disclose protected health information to the plaintiffs’ physicians for marketing purposes,[242] even under the guise of suggesting “treatment alternatives.”[243]  While the court correctly observed that HIPAA does not provide a private right of action, thereby precluding the plaintiffs from bringing a claim directly under HIPAA,[244] the HIPAA violations could arguably have served as bases for the plaintiffs’ state law claims.

VII.  Applying the Sorrell Analysis to the HIPAA Privacy Rule

Both the data-mining laws and HIPAA impose restrictions on the use of health information for marketing.[245]  Both restrict pharmacies from selling de-identified prescription information to data miners.[246]  Although the Supreme Court invalidated the Vermont data-mining law in Sorrell,[247] HIPAA still effectively prevents pharmacies from using protected health information for marketing purposes—that is, de-identifying it for sale to data miners.[248]  Thus, the Sorrell holding raises a question of whether the marketing restrictions in the HIPAA Privacy Rule, like the data-mining law in Sorrell, violate the First Amendment rights of the data miners and drug manufacturers to obtain access to prescription information for marketing purposes.

At first glance, aspects of the data-mining laws and HIPAA’s marketing provisions appear quite similar.  Both bodies of law were motivated by substantial governmental interests—prescriber privacy, public health, and healthcare cost containment for the data-mining laws,[249] and privacy of patients’ medical information for the HIPAA Privacy Rule.[250]  Both laws seek to restrain the use of health information for marketing purposes.[251]  Both define marketing in similar ways.[252]  And both list a number of healthcare-related exceptions to their marketing restrictions.[253]

Despite these similarities, there are substantial grounds to argue that important distinctions between the data-mining laws and the HIPAA Privacy Rule predominate in any comparison.  First, the parties who sought protection are quite different.  The data-mining laws aimed to maintain the privacy of prescribers,[254] many of whom had complained that allowing drug manufacturers access to their prescribing history allowed the detailers “to target them for unwelcome marketing calls.”[255]  The Sorrell Court observed, however, that physicians are hardly hapless victims of detailing.  The Court noted, for instance, that “many listeners find detailing instructive,”[256] and physicians could “simply decline to meet with detailers.”[257]  In fact, the Court characterized prescribing physicians as “sophisticated and experienced consumers.”[258]

Quite unlike the physicians in Sorrell,[259] the HIPAA Privacy Rule seeks to protect private patients from unwarranted invasions into their most private medical information.[260]  Contrasted to physicians, private healthcare patients are typically much more in need of protection.  The medical records amassed about them are compiled involuntarily, as a necessary byproduct of seeking medical treatment.[261]  Not only are many individual patients undoubtedly less sophisticated than physicians, but they may also be unable to police illicit uses of their medical records, particularly if they are ill or aged.  In fact, most patients are probably unaware of the many uses and disclosures of their medical information by covered entities that HIPAA permits.[262]  To address these concerns, HIPAA sets clear limits on covered entities’ uses and disclosures of individuals’ protected health information.[263]  It provides a means whereby covered entities must obtain individuals’ authorization for uses and disclosures that are not expressly permitted by HIPAA,[264] and whereby patients can prevent certain uses and disclosures prior to their occurrence.[265]  As a result, a strong argument can be made that the HIPAA Privacy Rule, unlike the data-mining laws, is a reasoned response to the critical need to protect patients’ medical privacy.

Second, the Supreme Court criticized Vermont’s data-mining law for attempting to advance its goals in too indirect a way.[266]  The State restricted access to prescription information in order to restrict data mining, which in turn would impair detailing, which in turn would result in physicians writing fewer prescriptions for brand-name drugs, which in turn would contain healthcare costs and avoid unnecessary health risks.[267]  HIPAA, on the other hand, directly accomplishes its goal of protecting individuals’ medical privacy by conferring upon the individuals themselves the ability to control, within certain limits,[268] the uses and disclosures of their own protected health information by covered entities.[269]

Third, there were readily available less restrictive alternatives to the data-mining laws that could have accomplished the asserted purposes of achieving prescriber privacy, protecting the public health, and containing pharmaceutical costs.  The Sorrell Court observed that physicians could easily refuse to meet with detailers, thereby preventing the detailers from using the physicians’ prescriber histories to pressure them into purchasing expensive brand-name drugs.[270]  Further, Vermont’s law authorized funds for a drug education program to provide physicians with information on “cost-effective utilization of prescription drugs.”[271]  Accordingly, before prohibiting data mining of pharmacy prescriptions, the State could have waited to see if that program was successful in limiting sales of nongeneric drugs.[272]

In contrast, with regard to HIPAA, there is no readily ascertainable less restrictive means to protect the privacy of patients’ medical records other than to permit limited uses and disclosures and to require patients’ consent for everything else.[273]  Congress, with limited exceptions,[274] conferred upon individuals the ability to control uses and disclosures of their own protected health information by covered entities.[275]  Requiring individuals to authorize uses and disclosures that are not otherwise needed to allow the healthcare industry to operate[276] and enable critical public interest activities[277] is therefore a direct means of achieving that control.  As a reasonable exercise of that control, HIPAA requires individuals to authorize any uses or disclosures of their protected health information to sell items or services that are not related to the individuals’ own healthcare management.[278]  As marketing third-party items and services is not critical either to providing and paying for individuals’ treatment or to enabling public interest activities, the authorization requirement for marketing uses and disclosures is necessary to achieve HIPAA’s privacy goal.

Fourth, the discriminatory impact of the data-mining laws that offended the Supreme Court in Sorrell[279] is largely absent in HIPAA.  The Sorrell Court characterized Vermont’s data-mining law as pointedly aimed at “diminish[ing] the effectiveness of marketing by manufacturers of brand-name drugs.”[280]  Convinced that detailing increased prescriptions for expensive brand-name drugs over just as effective and cheaper generic alternatives, the State sought to discourage detailing:

“In its practical operation,” Vermont’s law “goes even beyond mere content discrimination, to actual viewpoint discrimination.”  Given the legislature’s expressed statement of purpose, it is apparent that [the Vermont law] imposes burdens that are based on the content of speech and that are aimed at a particular viewpoint.[281]

The Sorrell Court found the State’s eradication of pharmacy data mining to be value based because “the State . . . engage[d] in content-based discrimination to advance its own side of a debate.”[282]  The law prohibited the communication of accurate information by detailers even though some prescribers found the information to be helpful.[283]  Also, the Court found that some brand-name drugs may be better for patients than their generic equivalents.[284]  Nevertheless, the State restricted access to prescription information to suppress speech with which it did not agree, while allowing access for itself and others to promote generics.[285]

This pointedly discriminatory goal and impact of Vermont’s data-mining law is absent with the HIPAA Privacy Rule.  Although marketing is not included in HIPAA’s list of permitted uses and disclosures,[286] it falls within a very broad category of all nonpermissive uses and disclosures for which an authorization is required.[287]  Admittedly, HIPAA singles out marketing for special restrictions,[288] as it comprises one of only two uses specified in HIPAA where protected health information may not even be de-identified absent the individual’s authorization.[289]  Here, however, it is all marketing that is so treated, not the more pointed restriction of a particular use by a particular speaker that was present in Sorrell.[290]

Consequently, the overall structure of the data-mining laws and the HIPAA Privacy Rule is markedly different.  Amid the thousands of uses and disclosures to which medical information is subject,[291] Vermont’s data-mining law pointedly prohibited only one—pharmacies’ disclosure of prescription information for marketing and the use of that information by drug manufacturers to market their drugs.[292]  Any nonmarketing use of prescription information was thus permitted.[293]  Even with regard to marketing uses, exceptions allowed the information to be utilized for “health care research,” to enforce “compliance” with health insurance preferred drug lists, for “care management educational communications” provided to patients on treatment options, for law enforcement operations, and as “otherwise provided by law.”[294]  Pharmacies could sell the information to insurers, researchers, journalists, the State, and others.[295]  The State itself could use the information for “counterdetailing” activities.[296]  Accordingly, the Court concluded that while the law “permits extensive use of prescriber-identifying information,”[297] it targeted only one use (marketing) and one user (drug manufacturers) for its prohibition.[298]

In contrast, the HIPAA Privacy Rule regulates from the reverse vantage point.  It declares at the outset that no use or disclosure of protected health information may occur unless it is specifically permitted by the Rule.[299]  Therefore, opposite to the structure of the data-mining laws, the prohibitions are virtually limitless, while the allowable uses are distinctly limited.[300]  Generally, the Privacy Rule permits uses and disclosures that fall within two broad categories[301]: (1) those that are related to healthcare treatment, payment, and business operations of the covered entities[302] and (2) those that are related to public interest activities that are so critical to society’s well-being that Congress deemed they should not be hindered by medical privacy concerns.[303]  All nonpermitted uses must be authorized.[304]  While, like the data-mining laws, HIPAA earmarks marketing for special restrictions,[305] even those limitations are more broadly drawn in HIPAA, applying to all types of marketing, not just marketing of brand-name drugs by pharmaceutical manufacturers.[306]  This is quite different from Vermont’s prohibition applying solely to pharmacies’ and insurers’ sales of prescription information for drug marketing.[307]  In fact, the Sorrell Court itself pointed out the marked differences between the structure of Vermont’s data-mining law and the HIPAA Privacy Rule:

[T]he State might have advanced its asserted privacy interest by allowing the information’s sale or disclosure in only a few narrow and well-justified circumstances.  See, e.g., Health Insurance Portability and Accountability Act of 1996, 42 U.S.C. §1320d-2; 45 CFR pts. 160 and 164 (2010).  A statute of that type would present quite a different case than the one presented here.[308]

Conclusion

While several states found it necessary to pass laws prohibiting pharmacies from selling de-identified prescription information to data miners for use by drug manufacturers to market their brand-name drugs, a solid argument can be made that the HIPAA Privacy Rule already restricted such sales.  HIPAA prohibits covered entities, including pharmacies, from using protected health information for marketing purposes without the individuals’ authorization.  As a result, it appears that the Privacy Rule restricts pharmacies from even de-identifying protected health information for marketing purposes unless the affected individuals authorize such use.

The recent Sorrell holding, invalidating Vermont’s data-mining law on the ground that it violates the Free Speech Clause of the First Amendment, raises the question of whether the marketing provisions of the HIPAA Privacy Rule could be deemed invalid for similar reasons.  Both the data-mining laws and the HIPAA Privacy Rule restrict pharmacies from selling de-identified prescription information to data miners for marketing purposes.

However, it is evident that there are fundamental distinctions between the data-mining laws and HIPAA’s marketing restrictions.  The two laws protect different parties and are structured very differently.  Most significantly, the discriminatory intent and effect of the data-mining laws are largely absent in HIPAA.  These distinctions present a substantially different question regarding HIPAA from that considered in Sorrell and likely would yield a different answer.[309]


          *     Professor of Law, Albany Law School.  I would like to express my sincere gratitude to Robert Emery, who recently retired as Associate Director and Head of Reference from the Albany Law School Schaffer Law Library, for the outstanding research assistance he has given me over the years.  He has been my research “go-to” person ever since I came to Albany Law School as a student in 1984.  He provided invaluable research expertise throughout my fifteen years of private law practice in Albany and during my past eleven years as a professor at the school.  Each of the articles I have produced while at Albany Law bears his imprint.  I do not believe there is a finer, or more patient and helpful, research expert to be found than Bob Emery.
         [1].   131 S. Ct. 2653 (2011).
         [2].   Id. at 2659.
         [3].   See infra Part I.
         [4].   See infra Part I.
         [5].   See infra Part I.
         [6].   See infra Part II.
         [7].   See infra Part III.
         [8].   See infra Part IV.
         [9].   See infra Part V.
       [10].   See infra Part V.
       [11].   See infra Part V.
       [12].   See 45 C.F.R. pts. 160, 164 (2010).
       [13].   42 U.S.C. § 1320d-2 (Supp. IV 2011).  Hereinafter, the Privacy Rule will be referred to as “the Privacy Rule,” “the Rule,” or “HIPAA” interchangeably.
       [14].   See infra Parts VI.A–B.
       [15].   See infra Part VI.C.
       [16].   See infra Part VI.B–C.
       [17].   See infra Part VI.D.
       [18].   See infra Part VII.
       [19].   See infra Part VII.
       [20].   See infra Part VII.
       [21].   See infra Part VII.
       [22].   See infra Part VII.
       [23].   See infra Part I.
       [24].   See infra Part III.
       [25].   See infra Part IV.
       [26].   See infra Part V.
       [27].   See infra Part VI.
       [28].   See infra Part VII.
       [29].   See infra Part VII.
       [30].   See Marcia M. Boumil et al., Prescription Data Mining, Medical Privacy and the First Amendment: The U.S. Supreme Court in Sorrell v. IMS Health Inc., 21 Annals Health L. 447, 449–51 (2012) (describing the practices of data mining and detailing).
       [31].   See, e.g., Brief for the United States as Amicus Curiae Supporting Petitioners at 4–5, Sorrell v. IMS Health, Inc., 131 S. Ct. 2653 (2011) (No. 10-779) (stating that Vermont requires each pharmacy to maintain a “patient record system” that records the patient’s name, address, telephone number, age or date of birth, gender, name and strength of each drug prescribed, quantity, date received, prescription number, and name of the prescriber).
       [32].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“When filling prescriptions, pharmacies in Vermont collect information including the prescriber’s name and address, the name, dosage, and quantity of the drug, the date and place the prescription is filled, and the patient’s age and gender.”), aff’d, 131 S. Ct. 2653 (2011); IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (describing the “potpourri” of prescription information retained by pharmacies, including “the name of the patient, the identity of the prescribing physician, the drug, its dosage, and the quantity dispensed”), abrogated by Sorrell, 131 S. Ct. 2653.
       [33].   See, e.g., N.Y. Educ. Law § 6810(5) (McKinney 2010) (“Records of all prescriptions filled or refilled shall be maintained for a period of at least five years and upon request made available for inspection and copying by a representative of the department.  Such records shall indicate date of filling or refilling, doctor’s name, patient’s name and address and the name or initials of the pharmacist who prepared, compounded, or dispensed the prescription.  Records of prescriptions for controlled substances shall be maintained pursuant to requirements of article thirty-three of the public health law.”).
       [34].   See, e.g., Al Baker & Joseph Goldstein, Focus on Prescription Records Leads to Arrest in 4 Killings, N.Y. Times, June 23, 2011, at A18 (reporting arrests stemming from information derived from prescription records: “A prosecutor in the Office of the Special Narcotics Prosecutor for New York City re-examined prescription records that the office had in its possession, another law enforcement official said.  Those records are part of continuing long-term investigations into prescription drug diversion, the official said”); see also Questions and Answers for Practitioners Regarding the New Official Prescription Program, N.Y. St. Dep’t Health, http://www.health.ny.gov/professionals/narcotic/official_prescription_program/questions_and_answers_for_practitioners.htm (last visited Aug. 28, 2012) (discussing section 21 of the New York Public Health Law, requiring prescriptions written in New York to be issued on official New York State prescription forms, to “combat the growing problem of prescription fraud.  Official prescriptions contain security features specifically designed to prevent alterations and forgeries that divert drugs for sale on the black market.  Some of these contaminated drugs end up in patients’ medicine cabinets.  By preventing fraudulent claims, the law will also save New York’s Medicaid program and private insurers many millions of dollars every year”).
       [35].   IMS and Verispan were plaintiffs in the Vermont, Maine, and New Hampshire data-mining cases.  See Sorrell, 630 F.3d at 263; IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011); Ayotte, 550 F.3d at 42.
       [36].   Data miners have been described as “prescription drug information intermediaries that mine [purchase and process] specialized data.” Mills, 616 F.3d at 15–16.
       [37].   Sorrell, 630 F.3d at 267 (“The PI [prescriber-identifiable] data sold by the data-mining appellants is stripped of patient information, to protect patient privacy.”); Mills, 616 F.3d at 16 (stating that pharmacies’ computer software collects prescription data, encrypts the patient identifiers so that patients cannot be identified by name, and sends the information to the data miners who have purchased the information);Ayotte, 550 F.3d at 45 (stating that patients’ names are encrypted, “effectively eliminating the ability to match particular prescriptions with particular patients”).
       [38].   Ayotte, 550 F.3d at 45 (stating that “[t]he scope of the [data-mining] enterprise is mind-boggling” and noting that IMS and Verispan organize several billion prescriptions each year).
       [39].   Id.
       [40].   Mills, 616 F.3d at 16 (“They [data miners] assemble a complete picture of individual prescribers’ prescribing histories by cross-referencing prescriber names with publicly available databases, including the AMA’s database of medical doctors’ specialties.”); Ayotte, 550 F.3d at 45 (“[Data miners] group [the data] by prescriber, and cross-reference each physician’s prescribing history with physician-specific information available through the American Medical Association.”).
       [41].   Sorrell, 630 F.3d at 267 (“‘Detailing’ refers to visits by pharmaceutical representatives, called detailers, to individual physicians to provide information on specific prescription drugs.”); Ayotte, 550 F.3d at 46 (“Detailing involves tailored one-on-one visits by pharmaceutical sales representatives with physicians and their staffs.”).
       [42].   Sorrell, 630 F.3d at 267 (explaining that detailers provide information to physicians “including the use, side effects, and risks of drug interactions”); Mills, 616 F.3d at 14 (stating that detailers distribute “promotional materials and pamphlets about the different conditions their particular products can be used to treat”); Ayotte, 550 F.3d at 46 (“The detailer comes to the physician’s office armed with handouts and offers to educate the physician and his staff about the latest pharmacological developments . . . [thereby] holding out the promise of a convenient and efficient means for receiving practice-related updates.”).  The Maine drug prescription data-mining law defines “‘detailing’ as ‘one-to-one contact with a prescriber or employees or agents of a prescriber for the purpose of increasing or reinforcing the prescribing of a certain drug by the prescriber.’”  See Mills, 616 F.3d at 14 (citing Me. Rev. Stat. tit. 22, § 1711-E(1)(A-2) (2005)).
       [43].   Mills, 616 F.3d at 14 (“Prescriber-identifying data is a valuable tool in a detailer’s arsenal of sales techniques.”).
       [44].   Sorrell, 630 F.3d at 267 (“Pharmaceutical manufacturers use [the mined] data to identify audiences for their marketing efforts, to focus marketing messages for individual prescribers, [and] to direct scientific and safety messages to physicians most in need of that information.”); Mills, 616 F.3d at 14 (“With [data-mining reports], pharmaceutical manufacturers can pinpoint the prescribing habits of individual prescribers in a region and target prescribers who might be persuaded to switch brands or prescribe more of a detailer’s brand of products.”); Ayotte, 550 F.3d at 44–45 (explaining that data-mining reports enable “detailers . . . to target particular physicians and shape their sales pitches accordingly”).
       [45].   Ayotte, 550 F.3d at 47; see also Mills, 616 F.3d at 14 (“Detailers use prescriber-identifying data to [market their drugs] more effectively; every sales pitch can be tailored to what the detailer knows of the prescriber based on her prescribing history.”).
       [46].   Mills, 616 F.3d at 14 n.3.
       [47].   Id. at 14; see also Sorrell, 630 F.3d at 267 (“[P]harmaceutical industry spending on detailing has increased exponentially along with the rise of data mining.”).
       [48].   Ayotte, 550 F.3d at 46.
       [49].   Mills, 616 F.3d at 14 (“[Pharmaceutical manufacturers] have some 90,000 pharmaceutical sales representatives make weekly or monthly one-on-one visits to prescribers nationwide.”).  Data mining is lucrative for the miners as well.  IMS alone reported revenues of $1.75 billion in 2005.  Id. at 16.
       [50].   Id. at 14 (“[D]etailers distribute upwards of $1 million worth of free product samples per year.”); Ayotte, 550 F.3d at 46 (“[D]etailers typically distribute an array of small gifts to physicians and their staffs. . . . [I]n the year 2000, an estimated $1,000,000,000 in free drug samples flowed from detailers to physicians.”).
       [51].   Mills, 616 F.3d at 14 (“A single prescriber is visited by an average of twenty-eight detailers a week; an average of fourteen detailers a week call on a single specialist.”); Ayotte, 550 F.3d at 47 (“[T]he average primary care physician interacts with no fewer than twenty-eight detailers each week and the average specialist interacts with fourteen.”).
       [52].   Sorrell, 630 F.3d at 268 (“[W]hile a brand-name drug is not necessarily better than its generic version, the brand-name drug is typically more expensive.”).
       [53].   Ayotte, 550 F.3d at 46 (“[Detailing] is time-consuming and expensive work, not suited to the marketing of lower-priced bioequivalent generic drugs.”).  Generic drugs are described as “drugs that are pharmacologically indistinguishable from their brand-name counterparts save for potential differences in rates of absorption.”  Id.
       [54].   Id. (“[D]etailing is employed where a manufacturer seeks to encourage prescription of a patented brand-name drug as against generic drugs, or as against a competitor’s patented brand-name drug, or as a means of maintaining a physician’s brand loyalty after its patent on a brand-name drug has expired.”).
       [55].   See Boumil et al., supra note 30, at 450–53 (describing criticisms of data mining and detailing).
       [56].   See, e.g., Ayotte, 550 F.3d at 56–57 (discussing the effectiveness of data mining as a marketing tool by detailers).
       [57].   Id. at 56.
       [58].   Id.
       [59].   Id.  Indeed, promotional literature from IMS marketed its data reports for efficacy in detailing.  Id.
       [60].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 270 (2d Cir. 2010) (internal quotation marks omitted), aff’d, 131 S. Ct. 2653 (2011); see also Ayotte, 550 F.3d at 57 (discussing a study finding that eleven percent of detailers’ statements to physicians were “demonstrably inaccurate” (citing Michael G. Ziegler et al., The Accuracy of Drug Information from Pharmaceutical Sales Representatives, 273 J. Am. Med. Ass’n 1296, 1297 (1995))).
       [61].   Sorrell, 630 F.3d at 270 (internal quotation marks omitted).
       [62].   Id.
       [63].   Ayotte, 550 F.3d at 47.
       [64].   Id. at 56 (stating that the “common sense conclusion[]” is that “detailing substantially increases physicians’ rates of prescribing brand-name drugs”).
       [65].   Id.
       [66].   Id. at 57–58.
       [67].   Id. at 58.
       [68].   IMS Health Inc. v. Mills, 616 F.3d 7, 15 (1st Cir. 2010) (quoting Robert A. Musacchio & Robert J. Hunkler, More Than a Game of Keep-Away, Pharmaceutical Executive, May 2006, at 150) (“[P]hysicians ‘complain bitterly’ about detailers ‘who wave data in their faces’ and challenge them with their own prescribing stories when they fail to prescribe more of the product the detailer has been advertising.”), vacated, IMS Health, Inc. v. Schneider, 131 S. Ct. 3091 (2011); see also Boumil et al.,supra note 30, at 452–53 (describing doctors’ dissatisfaction with detailing).
       [69].   Mills, 616 F.3d at 15.
       [70].   See generally Boumil et al., supra note 30, at 453–57 (describing states’ legislative responses to pharmacy data mining).
       [71].   Ayotte, 550 F.3d at 47.
       [72].   N.H. Rev. Stat. Ann. § 318:47-f (2011); Boumil et al., supra note 30, at 453 (explaining New Hampshire was the first state to enact legislation to limit the use of prescription information for commercial or marketing purposes, followed closely by Vermont and Maine).
       [73].   In relevant part, the statute provides:
Records relative to prescription information containing patient-identifiable and prescriber-identifiable data shall not be licensed, transferred, used, or sold by any pharmacy benefits manager, insurance company, electronic transmission intermediary, retail, mail order, or Internet pharmacy or other similar entity, for any commercial purpose, except for the limited purposes of pharmacy reimbursement; formulary compliance; care management; utilization review by a health care provider, the patient’s insurance provider or the agent of either; health care research; or as otherwise provided by law.  Commercial purpose includes, but is not limited to, advertising, marketing, promotion, or any activity that could be used to influence sales or market share of a pharmaceutical product, influence or evaluate the prescribing behavior of an individual health care professional, or evaluate the effectiveness of a professional pharmaceutical detailing sales force.
§ 318:47-f.
       [74].   Ayotte, 550 F.3d at 47 (quoting § 318:47-f).
       [75].   Id.
       [76].   Id.
       [77].   Vt. Stat. Ann. tit. 18, § 4631(a) (2011); IMS Health Inc. v. Sorrell, 630 F.3d 263, 269 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
       [78].   Sorrell, 630 F.3d at 269 (quoting tit. 18, § 4631(a) (2011)).
       [79].   Id.see Boumil et al., supra note 30, at 455–56 (explaining that the Vermont “opt-in” approach differed from approaches used by other states).  The statute reads in relevant part as follows:A health insurer, a self-insured employer, an electronic transmission intermediary, a pharmacy, or other similar entity shall not sell, license, or exchange for value regulated records containing prescriber-identifiable information, nor permit the use of regulated records containing prescriber-identifiable information for marketing or promoting a prescription drug, unless the prescriber consents as provided in subsection (c) of this section.  Pharmaceutical manufacturers and pharmaceutical marketers shall not use prescriber-identifiable information for marketing or promoting a prescription drug unless the prescriber consents as provided in subsection (c) of this section.tit. 18, § 4631(d).
       [80].   Sorrell, 630 F.3d at 269–70.
       [81].   Id. at 270 (quoting tit. 18, § 4631(b)(5)) (“The law defines ‘marketing’ to include ‘advertising, promotion, or any activity that is intended to be used or is used to influence sales or the market share of a prescription drug, influence or evaluate the prescribing behavior of an individual health care professional to promote a prescription drug, market prescription drugs to patients, or to evaluate the effectiveness of a professional pharmaceutical detailing sales force.’”).
       [82].   Id. at 270 (citing tit. 18, § 4631(e)(1)–(7)) (“The statute expressly permits the sale, transfer, or use of PI [prescriber-identifiable] data for multiple other purposes, including the limited purposes of pharmacy reimbursement; prescription drug formulary compliance; patient care management; utilization review by a health care professional, the patient’s health insurer, or the agent of either; health care research; dispensing prescription medications; the transmission of prescription data from prescriber to pharmacy; care management; educational communications provided to a patient, including treatment options, recall or safety notices, or clinical trials; and for certain law enforcement purposes as otherwise authorized by law.”).
       [83].   Id. at 271 n.3 (citing Vt. Stat. Ann. tit. 33, §§ 2004, 2466a (2011)).
       [84].   IMS Health Inc. v. Mills, 616 F.3d 7, 17 (1st Cir. 2010) (citing Me. Rev. Stat. tit. 22, § 1711-E(1-A) (2010)) (internal quotation marks omitted),vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
       [85].   The statute defines marketing to include “‘advertising, publicizing, promoting or selling a prescription drug;’ ‘activities undertaken for the purpose of influencing the market share of a prescription drug or the prescribing patterns of a prescriber, a detailing visit or a personal appearance;’ ‘[a]ctivities undertaken to evaluate or improve the effectiveness of a professional detailing sales force;’ or ‘[a] brochure, media advertisement or announcement, poster or free sample of a prescription drug.’”  Id. at 16 n.6 (quoting tit. 22, § 1711-E(1)(F-1)).
       [86].   Id. at 16 (citing tit. 22, § 1711-E(2-A)) (“[A] carrier, pharmacy or prescription drug information intermediary . . . may not license, use, sell, transfer, or exchange for value, for any marketing purpose, prescription drug information that identifies a prescriber who has filed for confidentiality protection.”).
       [87].   Id. at 16 n.6 (quoting tit. 22, § 1711-E(1)(F-1)) (“‘Marketing’ does not include pharmacy reimbursement, formulary compliance, pharmacy file transfers in response to a patient request or as a result of the sale or purchase of a pharmacy, patient care management, utilization review by a health care provider or agent of a health care provider or the patient’s health plan or an agent of the patient’s health plan, and health care research.”).
       [88].   Tom Ramstack, Drug Companies Seek Supreme Court Permission for “Data Mining,” GantDaily.com (Apr. 26, 2011, 11:41 AM), http://gantdaily.com/2011/04/26/drug-companies-seek-supreme-court-permission-for-data-mining (“Twenty-five states are considering similar laws[.]”); James Vicini, Supreme Court Strikes Down State Drug Data-Mining Law, Reuters (June 23, 2011, 1:48 PM), http://www.reuters.com/article/2011/06/23/us-usa-healthcare-privacy-idUSTRE75M3T720110623 (“[S]imilar measures have been proposed in about 25 states in the last three years[.]”).
       [89].   The data miners include IMS, Verispan, and Source Healthcare Analytics, Inc.  See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 269 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
       [90].   The association was Pharmaceutical Research and Manufacturers of America.  See, e.g., id.
       [91].   See IMS Health Inc. v. Sorrell, 631 F. Supp. 2d 434, 440 (D. Vt. 2009), rev’d, 630 F.3d 263 (2d Cir. 2010); IMS Health Corp. v. Rowe, No. CV-07-127-B-W, 2007 U.S. Dist. LEXIS 94268, at *27 (D. Me. Dec. 21, 2007), rev’d, IMS Health Inc. v. Mills, 616 F.3d 7 (1st Cir. 2010), vacated, IMS Health, Inc. v. Schneider, 131 S. Ct. 3091 (2011); IMS Health Inc. v. Ayotte, 490 F. Supp. 2d 163, 174 (D.N.H. 2007), rev’d and vacated, 550 F.3d 42 (1st Cir. 2008), abrogated by Sorrell, 131 S. Ct. 2653.
       [92].   See Sorrell, 630 F.3d at 266; Mills, 616 F.3d at 13; Ayotte, 550 F.3d at 47–48.
       [93].   Ayotte, 550 F.3d at 42.
       [94].   Id. at 45.
       [95].   Id.
       [96].   The First Circuit described the Central Hudson test as follows:
Under Central Hudson—so long as the speech in question concerns an otherwise lawful activity and is not misleading—statutory regulation of that speech is constitutionally permissible only if the statute is enacted in the service of a substantial governmental interest, directly advances that interest, and restricts speech no more than is necessary to further that interest.
Id. at 55 (citing Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 566 (1980)).
       [97].   Id. (“Fiscal problems have caused entire civilizations to crumble, so cost containment is most assuredly a substantial governmental interest.”).
       [98].   Id. at 56–57 (discussing evidence showing that “prescribing histories made detailing more efficacious”).
       [99].   Id. at 56 (finding that it was a “‘common-sense conclusion[]’” that detailing increases the prescriptions of brand-name drugs).
      [100].   Id. at 58 (“[W]hile a state legislature does not have unfettered discretion ‘to suppress truthful, nonmisleading information for paternalistic purposes’ . . . there is in this area ‘some room for the exercise of legislative judgment.’”) (internal quotation marks omitted) (citation omitted).
     [101].   Id. at 60–61 (ruling that the voidness claim “need not detain us,” as it was “sufficiently clear to withstand the plaintiffs’ vagueness challenge”).
     [102].   Id. at 64 (ruling that the plaintiffs’ Commerce Clause argument was unavailing, as the court was “confident that the New Hampshire Supreme Court would interpret the Prescription Information Law to affect only domestic transactions”).
     [103].   616 F.3d 7 (1st Cir. 2010), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [104].   See id. at 13.
     [105].   Id. at 18–19 (“Plaintiffs’ [First Amendment] claims fail for the same reasons we rejected their nearly identical First Amendment challenge to New Hampshire’s similar statute in Ayotte. . . . Even assuming arguendo that the Maine law restricts protected commercial speech and not conduct, we hold that it directly advances the substantial purpose of protecting opted-in prescribers from having their identifying data used in unwanted solicitations by detailers, and thus Maine’s interests in lowering health care costs.”).
     [106].   Id. at 23 (“Even if there were possible ambiguity in [the statute’s] terms, the law is still not void for vagueness . . . [as it] surely provides enough of a benchmark to satisfy due process.”).
     [107].   Id. at 24–25 (“[T]he statute applies to plaintiffs’ out-of-state use or sale of opted-in Maine prescribers’ identifying data and that the statute does so constitutionally. . . . Plaintiffs have not shown any disproportionate burden on interstate commerce, and the law creates substantial in-state benefits for those Maine prescribers who have affirmatively asked Maine to protect their identifying data and for Maine in its efforts to lower health care costs.”).
     [108].   630 F.3d 263 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011).
      [109].   The Second Circuit described the Central Hudson test as follows:
[T]he government may regulate commercial speech when (1) “the communication is neither misleading nor related to unlawful activity;” (2) the government “assert[s] a substantial interest to be achieved” by the regulation; (3) the restriction “must directly advance the state interest;” and finally (4) “if the governmental interest could be served as well by a more limited restriction on commercial speech, the excessive restrictions cannot survive.”
Id. at 275 (quoting Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 564 (1980)).
     [110].   Id. at 276 (“[W]e agree with the district court that Vermont does have a substantial interest in both lowering health care costs and protecting public health.  However, the state’s asserted interest in ‘medical privacy’ is too speculative to satisfy the second prong of Central Hudson.”).
     [111].   Id. at 277 (“The Vermont statute cannot be said to advance the state’s interests in public health and reducing costs in a direct and material way.”).
     [112].   Id.
     [113].   Id. at 280.
     [114].   Id.
     [115].   Id. at 282.
     [116].   131 S. Ct. 2653 (2011).
      [117].   Id. at 2658–59.  Justice Kennedy delivered the opinion of the Court, with Chief Justice Roberts and Justices Scalia, Thomas, Alito, and Sotomayor joining.  Justice Breyer filed a dissenting opinion, in which Justices Ginsburg and Kagan joined.
     [118].   Id. at 2659.
     [119].   Id. (“Vermont argues that its prohibitions safeguard medical privacy and diminish the likelihood that marketing will lead to prescription decisions not in the best interests of patients or the State.  It can be assumed that these interests are significant.”).  The Court noted, however, that, at oral argument, the State declined to affirm that its purpose in enacting the law was to discourage detailing and influence drug prescribing.  Id. at 2670.  The Court concluded that “[t]he State’s reluctance to embrace its own legislature’s rationale reflects the vulnerability of its position.”  Id.  Nevertheless, the Court held that “[t]he text of § 4631(d), associated legislative findings, and the record developed in the District Court establish that Vermont enacted its law” to inhibit drug marketing schemes that increase the prescriptions for expensive brand-name drugs.  Id. at 2672.
     [120].   Id. at 2663 (“On its face, Vermont’s law enacts content- and speaker-based restrictions on the sale, disclosure, and use of prescriber-identifying information.”).
     [121].   Id. at 2659 (“Vermont’s statute must be subjected to heightened judicial scrutiny.  The law cannot satisfy that standard.”).
     [122].   Id. at 2656 (“The statute thus disfavors marketing, i.e., speech with a particular content.”); see also id. at 2668 (“Under Vermont’s law, pharmacies may share prescriber-identifying information with anyone for any reason save one: They must not allow the information to be used for marketing.”).
     [123].   Id. at 2663 (“[T]he statute disfavors specific speakers, namely pharmaceutical manufacturers. . . . Detailers are . . . barred from using the information for marketing, even though the information may be used by a wide range of other speakers.”).
     [124].   Id. at 2664 (“Act 80 [Vermont’s data-mining law] is designed to impose a specific, content-based burden on protected expression.  It follows that heightened judicial scrutiny is warranted.” (citation omitted)).
     [125].   Id. at 2666 (“The State also contends that heightened judicial scrutiny is unwarranted in this case because sales, transfer, and use of prescriber-identifying information are conduct, not speech.”).
     [126].   Id. at 2667.
     [127].   Id. at 2668 (citation omitted).
     [128].   See id. at 2668–72.
     [129].   Id. at 2668 (“The explicit structure of the statute allows the information to be studied and used by all but a narrow class of disfavored speakers.”).
     [130].   Id. at 2669.
     [131].   Id. at 2663 (“[I]t appears that Vermont could supply academic organizations with prescriber-identifying information to use in countering the messages of brand-name pharmaceutical manufacturers and in promoting the prescription of generic drugs.”); see also id. at 2660–61 (discussing the counter-detailing provisions in the Vermont law).
     [132].   See id. at 2670–71.
     [133].   Id. at 2669 (“Physicians can, and often do, simply decline to meet with detailers, including detailers who use prescriber-identifying information.”).
     [134].   Id. at 2670.
     [135].   Id.
     [136].   Id. at 2667–68 (citing, inter alia, Cent. Hudson Gas & Elec. Corp. v. Pub. Serv. Comm’n, 447 U.S. 557, 566 (1980)) (“[T]he State must show at least that the statute directly advances a substantial governmental interest and that the measure is drawn to achieve that interest.”).
     [137].   Id. at 2670.
     [138].   Id. at 2671 (characterizing physicians as “‘sophisticated and experienced’ consumers” (citation omitted)).
     [139].   Id. at 2670–71 (“The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers. . . . [T]he ‘fear that people would make bad decisions if given truthful information’ cannot justify content-based burdens on speech . . . when the audience, in this case prescribing physicians, consists of ‘sophisticated and experienced’ consumers.” (citations omitted)).
     [140].   Id. at 2672 (“[T]he State cannot engage in content-based discrimination to advance its own side of a debate.”).
     [141].   Id. (“If Vermont’s statute provided that prescriber-identifying information could not be sold or disclosed except in narrow circumstances then the State might have a stronger position.”); see also id. at 2668 (“[T]he State might have advanced its asserted privacy interest by allowing the information’s sale or disclosure in only a few narrow and well-justified circumstances.” (citations omitted)).
     [142].   Id. at 2672 (“[T]he State itself can use the information to counter the speech it seeks to suppress.”).
     [143].   Id.
     [144].   Id.
     [145].   IMS Health, Inc. v. Schneider, 131 S. Ct. 3091, 3091 (2011).
     [146].   IMS Health Inc. v. Ayotte, No. 06-cv-280-PB, 2011 U.S. Dist. LEXIS 116595, at *2–3 (D.N.H. Oct. 7, 2011) (“The parties agree that the Supreme Court’s recent decision in Sorrell v. IMS Health, Inc., 131 S. Ct. 2653, 180 L. Ed. 2d 544 (2011) requires ‘invalidation of N.H. Rev. Stat. Ann. §§ 318:47-f and 318-B:12 to the extent that they prohibit the transfer, use, sale, or licensing of prescriber-identifiable data.’  Accordingly, they have asked me to reinstate the court’s May 7, 2007 judgment for the plaintiffs.  I have reviewed Sorrell and agree that it requires the invalidation of the above-referenced statutes because they improperly restrict speech protected by the First Amendment.”).
     [147].   45 C.F.R. pts. 160, 164 (2010).
     [148].   See, e.g., 45 C.F.R. § 164.502(a) (2011) (regulating covered entities’ use and disclosure of protected health information); see also id. § 160.103 (defining “protected health information” to mean “individually identifiable health information . . . that is: (i) Transmitted by electronic media; (ii) Maintained in electronic media; or (iii) Transmitted or maintained in any other form or medium”).
     [149].   See 42 U.S.C. §§ 1320d(5), 1320d-1(a) (2006) (applying the Act to most health plans, healthcare providers, and other covered entities).  The Rule’s definition of a “covered entity” includes, inter alia, “[a] health plan” and “[a] health care provider who transmits any health information in electronic form in connection with a transaction covered by this subchapter.”  45 C.F.R. § 160.103.
     [150].   45 C.F.R. § 160.103.
     [151].   Id. § 164.502(a).
     [152].   Id. § 160.103.
     [153].   Id.
     [154].   See 42 U.S.C. § 1320d(4)(B) (defining “health information” as any information that “relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual”).
     [155].   See id. § 1320d(6) (defining “individually identifiable health information” as “any information, including demographic information collected from an individual, that . . . relates to the past, present, or future physical or mental health or condition of an individual, the provision of health care to an individual, or the past, present, or future payment for the provision of health care to an individual, and . . . (i) identifies the individual; or (ii) with respect to which there is a reasonable basis to believe that the information can be used to identify the individual”); see also 45 C.F.R. § 160.103 (defining “[i]ndividually identifiable health information” as “information that is a subset of health information, including demographic information collected from an individual, and: (1) Is created or received by a health care provider, health plan, employer, or health care clearinghouse; and (2) Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual; and (i) That identifies the individual; or (ii) With respect to which there is a reasonable basis to believe the information can be used to identify the individual”).
     [156].   See 45 C.F.R. § 164.514(b)(2)(i) (listing the elements of health information that must be removed to de-identify the information).
     [157].   Id. § 164.514(b)(2)(i)(R) (requiring the removal of “any other unique identifying number, characteristic, or code” for de-identification of protected health information).
     [158].   Id. § 164.502(a).
     [159].   An “individual” is a “person who is the subject of protected health information.”  Id. § 160.103.
     [160].   Id. § 164.502(a)(2)(i).
     [161].   Id. § 164.502(a)(2)(ii).
     [162].   Id. § 164.502(a)(1) (providing a number of permitted uses and disclosures under HIPAA).
      [163].   In addition to the two broad categories of permissive uses described herein, there are also several minor categories of permissive uses.  Covered entities are permitted to disclose protected health information to the individual (who is the subject of the information) even when the individual does not specifically request disclosure.  See id. § 164.502(a)(1)(i).  Covered entities are permitted to inadvertently disclose protected health information when the disclosure occurs during another required or permitted use or disclosure (an “incident to” disclosure).  See id. § 164.502(a)(1)(iii).  Finally, once covered entities have obtained the agreement of the individual, they are permitted to use and disclose protected health information to list the individual as a patient in a healthcare facility directory, to inform the individual’s visitors and members of the clergy that the individual is a patient in the facility, and to disclose protected health information to family and friends of the individual who are involved in the individual’s care or payment.  See id. §§ 164.502(a)(1)(v), 164.510.
     [164].   Id. § 164.502(a)(1)(ii).
     [165].   Id. § 164.501.
     [166].   Id.
     [167].   Id.
     [168].   See id. § 164.512.
     [169].   Id. (permitting covered entities to disclose protected health information “without the written authorization of the individual . . . or the opportunity for the individual to agree or object” for disclosures that are (a) required by law, (b) for public health activities, (c) about victims of abuse, neglect, or domestic violence, (d) for health oversight activities, (e) for judicial and administrative proceedings, (f) for law enforcement purposes, (g) about decedents, (h) for cadaveric organ, eye, or tissue donation purposes, (i) for research purposes, (j) to avert a serious threat to health or safety, (k) for specialized government functions, and (l) for workers’ compensation).
     [170].   Id. § 164.502(a).
     [171].   Id. § 164.502(a)(1)(iv) (allowing covered entities to disclose protected health information “[p]ursuant to and in compliance with a valid authorization”).  The subsection of the Privacy Rule that describes the twelve permitted public interest activities specifically provides that “[a] covered entity may use or disclose protected health information without the written authorization of the individual.”  Id. § 164.512.
     [172].   See, e.g., Prot. & Advocacy Sys., Inc. v. Freudenthal, 412 F. Supp. 2d 1211, 1220 (D. Wyo. 2006) (“The primary purpose of HIPAA’s Privacy Rule is to safeguard the privacy of medical protected health information.”).
     [173].   45 C.F.R. § 164.524(a)(1).
     [174].   Id. § 164.524(a)(3).
     [175].   Id. § 164.526(a)(1).
     [176].   Id. § 164.526(d)(2).
     [177].   Id. § 164.528(a)(1) (providing individuals a right “to receive an accounting of disclosures of protected health information made by a covered entity in the six years prior to the date on which the accounting is requested”).
     [178].   Id. § 164.522(a)(1)(i)(A)–(B) (granting individuals a right to request that the covered entity restrict uses or disclosures related to “treatment, payment, or health care operations” or disclosures to which individuals have a right to agree or object under 45 C.F.R. § 164.510(b)).  Covered entities need not comply with all of these requests.  See id. § 164.522(a)(1)(ii) (“A covered entity is not required to agree to a restriction.”).  However, where the request relates to health care operations and not treatment, and the protected health information pertains solely to a health care item or service for which the provider has already been fully reimbursed, then the covered entity must comply with the request.  See 42 U.S.C. § 17935(a) (Supp. IV 2010).
     [179].   45 C.F.R. § 164.522(b)(1).
     [180].   Id. § 164.522(b)(1)(i) (requiring providers to “accommodate reasonable requests”).
     [181].   Id. § 164.522(b)(1)(ii).
     [182].   Id. § 164.510(a)–(b) (giving an individual the right to agree or object before a covered entity lists the individual’s name in a facility directory, gives information to the individual’s visitors or members of the clergy, or discloses information to friends or family members who are concerned with the individual’s treatment or payment).
     [183].   Id. § 164.502(a)(1)(iv).
     [184].   Id. § 164.520(a)(1).
     [185].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“The [prescription] data sold by the data-mining appellants is stripped of patient information, to protect patient privacy.”), aff’d, 131 S. Ct. 2653 (2011).
      [186].   Id.; see also, e.g., IMS Health Inc. v. Mills, 616 F.3d 7, 16 (1st Cir. 2010) (“The [pharmacies’] software encrypts patient-identifying data so that plaintiffs cannot identify individual patients by name . . . .”), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011); IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (“To protect patient privacy, prescribees’ names are encrypted, effectively eliminating the ability to match particular prescriptions with particular patients.”), abrogated by Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011).
     [187].   45 C.F.R. § 164.502(d)(2) (“The requirements of this subpart do not apply to information that has been de-identified in accordance with the applicable requirements of § 164.514 . . . .”).
     [188].   See id. § 164.502(d)(2) (“Health information that meets the standard and implementation specifications for de-identification under § 164.514(a) and (b) is considered not to be individually identifiable health information, i.e., de-identified.”); id. § 164.514(b)(2)(i)(A)–(R) (stating the identifiers that must be removed from protected health information for de-identification).
     [189].   Id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information . . . whether or not the de-identified information is to be used by the covered entity.”).
     [190].   Id. § 164.502(a)(1)(ii) (listing “health care operations” as a permitted use).
     [191].   Id. § 164.501 (“Health care operations means any of the following activities of the covered entity to the extent that the activities are related to covered functions: . . . (6) Business management and general administrative activities of the entity, including, but not limited to: . . . (v) Consistent with the applicable requirements of § 164.514, creating de-identified health information or a limited data set, and fundraising for the benefit of the covered entity.”).  The term “covered function” is not explicitly defined in HIPAA, but presumably refers to the treatment, payment, and health care operations functions of covered entities.  See id. § 164.502(a)(1)(ii) (permitting a covered entity to use or disclose protected health information for “treatment, payment, or health care operations”).
      [192].   See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”), aff’d, 131 S. Ct. 2653 (2011).
      [193].   See, e.g., Sorrell, 131 S. Ct. at 2660 (“Detailers, who represent the [drug] manufacturers . . . use the [data-mining] reports to refine their marketing tactics and increase sales.”).
     [194].   See generally 45 C.F.R. § 164.508(a)(3).
     [195].   Id. § 164.501(1).
     [196].   See id. § 164.501(1)(i)–(iii) (describing exclusions from the definition of marketing).
     [197].   See id. § 164.501(1)(i) (defining “marketing” as “a communication about a product or service that encourages recipients of the communication to purchase or use the product or service,” excluding communications made for the purpose of describing an individual’s benefits in a health plan or relating to the individual’s treatment or case management); id. § 164.501(2) (defining “marketing” to include “[a]n arrangement between a covered entity and any other entity whereby the covered entity discloses protected health information to the other entity, in exchange for direct or indirect remuneration, for the other entity or its affiliate to make a communication about its own product or service that encourages recipients of the communication to purchase or use that product or service”).
     [198].   See id. § 164.508(a)(3).
     [199].   Subpart E of the Privacy Rule encompasses 45 C.F.R. §§ 164.500–164.534.  For Subpart E’s table of contents, see id. § 164.102.
      [200].   The transition provisions in 45 C.F.R. § 164.532 refer to the effect of authorizations and contracts that existed prior to the effective date of the Privacy Rule.  For example, authorizations executed prior to HIPAA are deemed to be effective post-HIPAA as long as the authorization specifically permits the use or disclosure and there is no agreement between the covered entity and the individual restricting the use or disclosure.  See id. § 164.532 (“Effect of prior authorization for purposes other than research.  Notwithstanding any provisions in § 164.508, a covered entity may use or disclose protected health information that it created or received prior to the applicable compliance date of this subpart pursuant to an authorization or other express legal permission obtained from an individual prior to the applicable compliance date of this subpart, provided that the authorization or other express legal permission specifically permits such use or disclosure and there is no agreed-to restriction in accordance with § 164.522(a).”).
     [201].   Id. § 164.508(a)(3)(i).
     [202].   Id. § 164.508(a)(3)(i)(A).
     [203].   Id. § 164.508(a)(3)(i)(B).
     [204].   See generally id. § 164.508(a)(3)(i).
     [205].   See id.
     [206].   Id. § 160.103.  It should be noted that not every provider is a covered entity under HIPAA.  The Privacy Rule provides that a covered entity includes only those providers “who transmit[] any health information in electronic form in connection with a transaction covered by this subchapter.”  Id.  However, because virtually all pharmacies currently send health care claims and other covered transactions electronically, they qualify as covered entities under HIPAA.
     [207].   See id.
     [208].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“When filling prescriptions, pharmacies in Vermont collect information including the prescriber’s name and address, the name, dosage, and quantity of the drug, the date and place the prescription is filled, and the patient’s age and gender.”), aff’d, 131 S. Ct. 2653 (2011).
     [209].   See 45 C.F.R. § 164.508(a)(3)(i) (imposing requirements on uses and disclosures of protected health information “for marketing”).
      [210].   See, e.g., Sorrell, 630 F.3d at 267 (“Pharmacies sell this PI [prescriber-identifiable] data to the data mining appellants. . . . These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”).
      [211].   See, e.g., Sorrell, 131 S. Ct. at 2660 (“Detailers, who represent the [drug] manufacturers . . . use the [data-mining] reports to refine their marketing tactics and increase sales.”).
      [212].   45 C.F.R. § 164.501.  This marketing definition does not specify who must make the communication, or who must be the recipient of the communication.  Therefore, on its face, the definition does not require the covered entity making the use or disclosure to be either the communicator or the marketer, or that the recipient of the communication be the individual whose protected health information is being used or disclosed.  However, later additions to the Privacy Rule, enacted by Congress on February 17, 2009, appear to equate “recipient” with “individual.”  American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 13406, 123 Stat. 115, 266–70 (2010).  In ARRA provisions relating to marketing, the law states that “the covered entity making such communication obtains from the recipient of the communication . . . a valid authorization . . . with respect to such communication.”  42 U.S.C. § 17936(a)(2)(B)(ii) (Supp. IV 2010).  In the context of the Privacy Rule, authorizations are obtained only from individuals.  See 45 C.F.R. § 164.508(c)(1)(vi) (requiring an authorization to be signed by the “individual”).  Further, in proposed rules to implement ARRA, the HHS also appears to assume that the recipient of marketing communications is the individual.  See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,884 (July 14, 2010) (to be codified at 45 C.F.R. pt. 160) (“The Privacy Rule requires covered entities to obtain a valid authorization from individuals before using or disclosing protected health information to market a product or service to them.” (emphasis added) (citation omitted)).  Nevertheless, HIPAA does not explicitly state that the recipient of a marketing communication must be the individual.  See 45 C.F.R. § 164.501 (defining marketing as “mak[ing] a communication about a product or service that encourages recipients of the communication to purchase or use the product or service”); id. § 164.508(a)(3) (providing that “a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing”).  Moreover, the disclosures of prescription information described in Sorrell ultimately resulted in sales of brand-name drugs to individuals whose private health information may have been used to market the drugs.  Even if this were not the case, there is nothing unreasonable about reading the Privacy Rule precisely as it is written—requiring individuals to authorize any use of their protected health information to sell items or services, no matter the product, no matter the seller, and no matter the buyer.
     [213].   See, e.g., IMS Health Inc. v. Mills, 616 F.3d 7, 16 (1st Cir. 2010) (stating that pharmacies’ computer software collects prescription data, encrypts the patient identifiers so that patients cannot be identified by name, and sends the information to the data miners who have purchased the information), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [214].   See supra Part VI.B.
     [215].   See 45 C.F.R. § 164.502(d) (permitting a covered entity to use protected health information to create de-identified information, and providing that the Privacy Rule does not apply to de-identified information).
     [216].   See id. § 160.103 (defining “protected health information” to mean “individually identifiable health information”).
     [217].   See id. § 164.502(d)(2) (providing that the Privacy Rule does not apply to de-identified information).
     [218].   See id. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [219].   See id. § 160.103 (broadly defining “use” of protected health information as “the sharing, employment, application, utilization, examination, or analysis of such information within an entity that maintains such information”).
     [220].   See id. § 164.502(d)(1) (permitting a covered entity to “use protected health information to create information that is not individually identifiable health information”).
     [221].   See, e.g., IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“Pharmacies sell this PI [prescriber-identifiable] data to the data mining appellants. . . . These data mining companies . . . aggregate the data to reveal individual physician prescribing patterns and sell it . . . primarily to pharmaceutical manufacturers.”), aff’d, 131 S. Ct. 2653 (2011).
     [222].   See 45 C.F.R. § 160.103 (defining “covered entity” to include “[a] health care provider who transmits any health information in electronic form in connection with a transaction covered by this subchapter”).  A “health care provider” is defined as “a provider of medical or health services . . . and any other person or organization who furnishes, bills, or is paid for health care in the normal course of business.”  Id.
     [223].   See id. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [224].   Id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information . . . whether or not the de-identified information is to be used by the covered entity.”).
     [225].   Id. § 164.508(a)(3)(i).
      [226].   See id.  There is no reason to believe that the HHS, the drafter of the Privacy Rule, meant anything by this “notwithstanding” language other than what the language unambiguously states.  The Agency used similar language in another provision of the Privacy Rule to require an authorization before any use or disclosure of psychotherapy notes, subject to limited exceptions.  See id. § 164.508(a)(2).  However, where the Agency intended a more limited impact of its use of the term “notwithstanding,” it clearly restricted its reach to particular provisions within the Privacy Rule.  See, e.g., id. § 164.502(g)(3)(ii) (“Notwithstanding the provisions of paragraph (g)(3)(i) of this section”); id. § 164.502(g)(5) (“Notwithstanding a State law or any requirement of this paragraph to the contrary”); id. § 164.532(b) (“Notwithstanding any provisions in § 164.508”); id. § 164.532(c) (“Notwithstanding any provisions in §§ 164.508 and 164.512(i)”).
     [227].   See id. § 164.502(d)(1) (“A covered entity may use protected health information to create information that is not individually identifiable health information or disclose protected health information only to a business associate for such purpose, whether or not the de-identified information is to be used by the covered entity.”).
      [228].   See id. § 164.508(a)(3)(i) (“Notwithstanding any provision of this subpart . . . a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [229].   See id.
     [230].   See IMS Health Inc. v. Ayotte, 550 F.3d 42, 45 (1st Cir. 2008) (stating that IMS and Verispan organize several billion prescriptions each year), abrogated by Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011).
     [231].   See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,907 (July 14, 2010) (to be codified at 45 C.F.R. pts. 160, 164) (opining on the effect of proposed HIPAA privacy rules that would expand the requirement of covered entities to obtain written authorizations prior to marketing disclosures and sales of protected health information: “Even if covered entities attempted to obtain authorizations in compliance with the proposed modifications, we believe most individuals would not authorize these types of disclosures.  It would not be worthwhile for covered entities to continue to attempt to obtain such authorizations, and as a result, we believe covered entities would simply discontinue making such disclosures.”).
     [232].   See id.
      [233].   See American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 17935(d)(1), 123 Stat. 115 (2010) (adding the Privacy Rule restrictions on covered entities’ sale of protected health information and requiring covered entities to obtain an authorization from the affected individuals prior to selling their protected health information for any purpose).  Exceptions to the authorization requirement apply for activities such as public health activities, research, treatment, and healthcare operations.  See id. § 17935(d)(2)(A)–(G).  In addition, ARRA provides that communications by a covered entity encouraging the recipients to purchase or use a product or service may not be considered a health care operation, which would avoid the authorization requirement.  See id. § 17936(a)(1).  Rules proposed to implement ARRA underscore the Agency’s continuing concerns about covered entities’ use of protected health information for marketing purposes.  See Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. at 40,868.  The HHS declared:
We believe Congress intended with these provisions [marketing and sale] to curtail a covered entity’s ability to use the exceptions to the definition of “marketing” in the Privacy Rule to send communications to the individual that were motivated more by commercial gain or other commercial purpose rather than for the purpose of the individual’s health care, despite the communication’s being about a health-related product or service.
Id. at 40,884.  While ARRA restricts sales of protected health information, it does not prohibit sales of de-identified information; while it restricts marketing-related disclosures, it does not restrict marketing-related uses.  Therefore, there is nothing in the text of ARRA that explicitly prohibits a covered entity from first de-identifying protected health information and then selling it to a third party for any purpose without obtaining authorizations from the affected individuals.  Nevertheless, ARRA leaves unaltered HIPAA’s preexisting marketing requirement that covered entities must obtain authorizations from individuals before engaging in any marketing-related use or disclosure of their protected health information.  See 45 C.F.R. § 164.508(a)(3)(i) (“[A] covered entity must obtain an authorization for any use or disclosure of protected health information for marketing . . . .”).
     [234].   No. 11-2428, 2012 U.S. Dist. LEXIS 19372 (E.D. Pa. Feb. 15, 2012).
     [235].   Id. at *1–2.
     [236].   Id. at *1.
     [237].   Id. at *12.
     [238].   Id. at *14.
     [239].   Id. at *17 (“Under the Privacy Rule, healthcare providers are permitted to ‘de-identify’ Protected Health Information.  Once information is de-identified, it is no longer considered Protected Health Information.”).
     [240].   Id. (“[F]ederal regulations permit the disclosure of protected Health Information under certain circumstances, including for ‘treatment, payment, or health care operations.’  The term ‘health care operations’ is defined to include ‘contacting of health care providers and patients with information about treatment alternatives.’”).
     [241].   See 45 C.F.R. § 164.508(a)(3)(i) (2011).
     [242].   Id.
      [243].   The defendants’ letters to the plaintiffs’ physicians suggesting drug prescription alternatives should be characterized as marketing rather than treatment.  The drug manufacturers paid the pharmacies for sending the letters.  Steinberg, 2012 U.S. Dist. LEXIS 19372, at *6.  While the manufacturers stood to benefit when the physicians prescribed the suggested alternative drugs, the pharmacies had no motivation to send the communications other than their remuneration from the manufacturers.  In fact, the American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111–5, § 17936(a)(1), 123 Stat. 115 (2010), provides that communications by a covered entity encouraging the recipients to purchase or use a product or service may not be considered a health care operation, thereby avoiding the authorization requirement.  The HHS, in proposed rules to implement the ARRA, declared its intent:
to curtail a covered entity’s ability to use the exceptions to the definition of ‘marketing’ in the Privacy Rule to send communications to the individual that were motivated more by commercial gain . . . rather than for the purpose of the individual’s health care, despite the communication’s being about a health-related product or service.
Modifications to the HIPAA Privacy, Security, and Enforcement Rules Under the Health Information Technology for Economic and Clinical Health Act, 75 Fed. Reg. 40,868, 40,884 (July 14, 2010) (to be codified at 45 C.F.R. pt. 160).
     [244].   Steinberg, 2012 U.S. Dist. LEXIS 19372, at *13.
      [245].   See discussion of the states’ data-mining laws supra Part III, and discussion of the marketing provisions of the Privacy Rule supra Part VI.B; see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 33 (“There are a number of federal statutory and regulatory provisions that regulate the dissemination or use of information by private parties for various reasons, including to protect individual privacy. . . . For instance, the Health Insurance Portability and Accountability Act of 1996 (‘HIPAA’) and its implementing regulations limit the nonconsensual dissemination and use of patient-identifiable health information by health plans . . . and most health care providers.”).
     [246].   See supra Part VI.C.
     [247].   See supra Part V.
     [248].   See supra Part VI.C.
     [249].   See, e.g., Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2668 (2011) (“[T]he State contends that its law is necessary to protect medical privacy, including physician confidentiality, avoidance of harassment, and the integrity of the doctor-patient relationship . . . [and] improved public health and reduced healthcare costs.”); IMS Health Inc. v. Sorrell, 630 F.3d 263, 267 (2d Cir. 2010) (“The Vermont legislature passed Act 80 in 2007, intending to protect public health, to protect prescriber privacy, and to reduce health care costs.”), aff’d, 131 S. Ct. 2653 (2011).
     [250].   See, e.g., Prot. & Advocacy Sys. v. Freudenthal, 412 F. Supp. 2d 1211, 1220 (D. Wyo. 2006) (“The primary purpose of HIPAA’s Privacy Rule is to safeguard the privacy of medical protected health information.”); Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“The governmental interest in protecting patient privacy is clearly a substantial one.”).
     [251].   Compare tit. 22, § 1711-E (2009) (making it unlawful for a pharmacy to use, sell, or transfer prescription information where the prescriber had registered for confidentiality protection), and § 318:47-f (2006) (prohibiting pharmacies and insurance companies from selling or licensing prescription data for any commercial purpose), and tit. 18, § 4631 (2010) (prohibiting the sale or disclosure of pharmacy records for marketing purposes and prohibiting drug manufacturers from using the records for marketing unless the prescribers consented), with 45 C.F.R. § 164.508 (a)(3)(i) (2011) (prohibiting covered entities from using or disclosing protected health information for marketing purposes without the individual’s authorization).
     [252].   Compare tit. 22, § 1711-E(1)(F-1) (defining marketing as advertising, publicizing, promoting, or selling a prescription drug), and § 318:47-f (defining commercial purpose as advertising, marketing, or any activity that influences sales), and tit. 18, § 4631(b)(5) (defining marketing as advertising or any activity that influences the sale of a drug or influences prescribing behavior), with 45 C.F.R. § 164.501 (defining marketing as a communication that encourages the listener to purchase or use the item or service).
     [253].   Compare tit. 22, § 1711-E(1)(F-1) (excluding a number of health-related activities from the definition of “marketing,” including pharmacy reimbursement, patient care management, utilization review by a healthcare provider, and healthcare research), and § 318:47-f (exempting from the marketing prohibition disclosures of prescription information for health-related purposes, such as pharmacy reimbursement, care management, utilization review by a healthcare provider, or healthcare research), and tit. 18, § 4631 (excluding from the definition of marketing certain health-related purposes, including pharmacy reimbursement, healthcare management, utilization review by a healthcare provider, and healthcare research), with 45 C.F.R. § 164.501 (exempting from the definition of marketing communications to describe the benefits in a health plan, uses and disclosures for treatment, and case management).
     [254].   See Brief for Respondent Pharmaceutical Research and Manufacturers of America at 48, Sorrell v. IMS Health Inc., 131 S. Ct. 2653 (2011) (No. 10-779) (“[T]he State did not defend its law below on the basis of patient privacy.”).
     [255].   IMS Health Inc. v. Mills, 616 F.3d 7, 15 (1st Cir. 2010) (“[P]hysicians ‘complain bitterly’ about detailers ‘who wave data in their faces’ and challenge them with their own prescribing histories when they fail to prescribe more of the product the detailer has been advertising.” (citations omitted)), vacated, IMS Health Inc. v. Schneider, 131 S. Ct. 3091 (2011).
     [256].   Sorrell, 131 S. Ct. at 2671.
     [257].   Id. at 2669.
     [258].   Id. at 2671.
     [259].   The Second Circuit in Sorrell found that the privacy of patients’ medical information was not at issue.  IMS Health Inc. v. Sorrell, 630 F.3d 263, 276 (2d Cir. 2010) (“[T]he state’s asserted interest in medical privacy is too speculative to qualify as a substantial state interest. . . . Vermont has not shown any effect on the integrity of the prescribing process or the trust patients have in their doctors from the use of PI [prescriber-identifiable] data in marketing.”), aff’d, 131 S. Ct. 2653 (2011).
     [260].   See 45 C.F.R. § 164.502(a) (2011) (regulating the uses and disclosures of protected health information by covered entities).
     [261].   See, e.g., Brief for Petitioners at 23, Sorrell, 131 S. Ct. 2653 (No. 10-779) (characterizing pharmacies’ prescription information as nonpublic, “particularly where the information has been produced involuntarily”); Reply Brief for Petitioners at 3, Sorrell, 131 S. Ct. 2653 (No. 10-779) (“Doctors and patients do not voluntarily provide prescriptions to pharmacies; by law, they must provide this sensitive information to obtain medicine.”).
     [262].   See, e.g., 45 C.F.R. § 164.502(a)(1)(ii) (permitting a covered entity to use or disclose protected health information for “treatment, payment, or health care operations”).  Treatment includes coordination of healthcare, managing healthcare, consultations among providers, and referrals.  Payment includes insurers’ collection of insurance premiums, providers’ obtaining reimbursement for providing healthcare, determining eligibility for coverage, adjudicating health benefit claims, risk adjusting, billing and collections, reviewing healthcare services to determine medical necessity, utilization review, and making disclosures to consumer reporting agencies.  Healthcare operations include quality assessment; reviewing the competence or qualifications of healthcare professionals; underwriting; conducting or arranging for medical review, legal services, and auditing, including fraud and abuse detection and compliance; business planning, business management and administrative activities; customer service; resolution of internal grievances; sale, transfer, merger, or consolidation of the covered entity with another entity; and fundraising.  See id. § 164.501.
     [263].   See id. § 164.502 (providing the permitted and required uses and disclosures of protected health information by covered entities).
     [264].   See id. § 164.502(a) (prohibiting covered entities from using or disclosing protected health information “except as permitted or required by [the Privacy Rule]”); id. § 164.502(a)(1)(iv) (allowing covered entities to disclose protected health information “[p]ursuant to and in compliance with a valid authorization”).
     [265].   See, e.g., id. § 164.510 (requiring a covered entity, prior to certain uses and disclosures of an individual’s protected health information, to inform the individual in advance of the use or disclosure and provide the individual an opportunity to agree, or to prohibit, or to restrict the use or disclosure).
     [266].   See Sorrell, 131 S. Ct. at 2670–71 (2011) (“[T]he ‘state’s own explanation of how [the data-mining law] advances its interests cannot be said to be direct.’  The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers—that is, by diminishing detailers’ ability to influence prescription decisions.  Those who seek to censor or burden free expression often assert that disfavored speech has adverse effects.  But the ‘fear that people would make bad decisions if given truthful information’ cannot justify content-based burdens on speech.” (citations omitted)).
     [267].   See id. at 2661 (discussing the impact of pharmacies’ sales of prescription information upon cost containment and the public health).
     [268].   See 45 C.F.R. § 164.502 (listing permitted uses and disclosures of protected health information by covered entities that do not require an authorization from the affected individuals).
     [269].   See id. (providing individuals with rights of access and rights to control certain uses and disclosures of their protected health information by covered entities); see also supra Part VI.A (explaining individuals’ rights of access and control over their protected health information under the Privacy Rule).
     [270].   Sorrell, 131 S. Ct. at 2669 (“Physicians can, and often do, simply decline to meet with detailers, including detailers who use prescriber-identifying information.”).
     [271].   Id. at 2660–61.  But see id. at 2681 (Breyer, J., dissenting) (noting that the education program funded by Vermont’s data-mining law “does not make use of prescriber-identifying data”).
     [272].   IMS Health Inc. v. Sorrell, 630 F.3d 263, 280 (2d Cir. 2010), aff’d, 131 S. Ct. 2653 (2011) (“The state could wait to assess what the impact of its newly funded counter-speech program will be.”).
     [273].   See 45 C.F.R. § 164.502(a)(1)(ii), (iv) (listing permitted and required uses, and permitting any other use or disclosure “[p]ursuant to and in compliance with a valid authorization”); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“HIPAA and other such federal statutes directly advance substantial federal interests in a narrowly and reasonably tailored way.”).
     [274].   See 45 C.F.R. § 164.502(a)(1)(i)–(iii), (v)–(vi), (2)(i)–(ii) (listing permitted and required uses and disclosures for which an authorization is not required).
     [275].   See id. § 164.508(a)(1) (“Except as otherwise permitted or required by this subchapter, a covered entity may not use or disclose protected health information without an authorization that is valid under this section.”).
     [276].   See id. § 164.502(a)(1)(ii) (listing “treatment, payment, or health care operations” as a permitted use and disclosure).
     [277].   See id. § 164.502(a)(1)(vi) (listing permitted uses and disclosures “[a]s permitted by and in compliance with . . . § 164.512,” which, in turn, describes twelve public interest activities pursuant to which covered entities may use or disclose protected health information without obtaining an authorization from the affected individuals).
     [278].   See id. § 164.508(a)(3) (restricting marketing uses and disclosures); id. § 164.501 (providing health related exceptions to the definition of marketing); see also Reply Brief for Petitioners, supra note 261, at 21–22 (“Doctors and patients expect and intend these [health-related] uses of healthcare information, but they do not expect (or even know) that third parties purchase the information and use it as a marketing tool.”).
     [279].   Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2663–67 (2011).
     [280].   Id. at 2663.
     [281].   Id. at 2663–64 (citation omitted).
     [282].   Id. at 2672.
     [283].   Id. at 2671 (“[S]ome Vermont doctors view targeted detailing based on prescriber-identifying information as ‘very helpful’ because it allows detailers to shape their messages to each doctor’s practice.”).
     [284].   Id. (“[T]he United States, which appeared here in support of Vermont, took care to dispute the State’s ‘unwarranted view that the danger of [n]ew drugs outweigh their benefits to patients.’”); IMS Health Inc. v. Sorrell, 630 F.3d 263, 280 (2d Cir. 2010) (observing that the state law precludes the use of pharmacy information for marketing brand-name drugs “no matter how efficacious and no matter how beneficial those drugs may be compared to generic alternatives”), aff’d, 131 S. Ct. 2653 (2011).
     [285].   Sorrell, 131 S. Ct. at 2672 (concluding that the State “restrict[ed] the information’s use by some speakers and for some purposes, even while the State itself can use the information to counter the speech it seeks to suppress”).
     [286].   See 45 C.F.R. § 164.502(a)(1)–(2) (2011) (listing the permitted and required uses and disclosures of protected health information by covered entities).
     [287].   See id. § 164.508(a)(1) (“Except as otherwise permitted or required by this subchapter, a covered entity may not use or disclose protected health information without an authorization.”).
     [288].   See id. § 164.508(a)(3) (requiring an individual’s authorization for marketing uses and disclosures of protected health information by covered entities).
     [289].   The Privacy Rule provides that, notwithstanding any other provision in the Rule, a covered entity may not use or disclose protected health information for marketing and may not use or disclose protected health information in psychotherapy notes.  See id. § 164.508(a)(2)–(3).
     [290].   Sorrell, 131 S. Ct. at 2658 (“The State seeks to achieve its policy objectives through the indirect means of restraining certain speech by certain speakers.”).  Moreover, HIPAA similarly restricts all uses and disclosures of psychotherapy notes unless authorized by the individual.  45 C.F.R. § 164.508(a)(2) (“Notwithstanding any provision of this subpart . . . a covered entity must obtain an authorization for any use or disclosure of psychotherapy notes,” subject to limited exceptions, including, inter alia, uses for treatment by the psychotherapist, uses and disclosures for the psychotherapist’s training programs, and uses or disclosures to allow the psychotherapist to defend himself in a legal action brought by the individual).
     [291].   See 45 C.F.R. § 164.501 (describing the myriad permissible uses and disclosures of protected health information that comprise treatment, payment, and healthcare operations); id. § 164.502(a)(1)(ii) (indicating that permitted uses and disclosures of protected health information include treatment, payment, or healthcare operations).
     [292].   See Sorrell, 131 S. Ct. at 2660 (quoting the law as providing that pharmacies may not disclose pharmacy information for marketing and that drug manufacturers may not use the information for marketing, unless the prescriber consents).
     [293].   See id. at 2668 (“Under Vermont’s law, pharmacies may share prescriber-identifying information with anyone for any reason save one: They must not allow the information to be used for marketing.”).
     [294].   Id. at 2660.
     [295].   Id. at 2668 (“Exceptions further allow pharmacies to sell prescriber-identifying information for certain purposes, including ‘health care research.’  And the measure permits insurers, researchers, journalists, the State itself, and others to use the information.” (citations omitted)).
     [296].   See id. at 2663 (“[I]t appears that Vermont could supply academic organizations with prescriber-identifying information to use in countering the messages of brand-name pharmaceutical manufacturers and in promoting the prescription of generic drugs.”).  But see id. at 2680 (Breyer, J., dissenting) (noting that the record “contains no evidentiary basis for the conclusion that any such individualized counterdetailing is widespread, or exists at all, in Vermont”).
     [297].   Id. at 2669 (majority opinion); see also Brief of Respondents IMS Health Inc., Verispan, LLC, & Source Healthcare Analytics, Inc. at 9, Sorrell, 131 S. Ct. 2653 (No. 10-779) (“In stark contrast to HIPAA and other federal statutes and regulatory regimes that protect important personal privacy interests, Act 80 [Vermont’s data-mining law] contains numerous exceptions that freely permit the wide distribution of prescribers’ commercial prescription history information.”).
     [298].   See Sorrell, 131 S. Ct. at 2663 (“The statute thus disfavors marketing, that is, speech with a particular content.  More than that, the statute disfavors specific speakers, namely pharmaceutical manufacturers.”).
     [299].   See 45 C.F.R. § 164.502(a) (2011) (“A covered entity may not use or disclose protected health information, except as permitted or required by this subpart. . . .”).
     [300].   See id. § 164.502(a)(1)–(2) (listing the permissible and required uses and disclosures under the Privacy Rule).
     [301].   The Privacy Rule also permits uses and disclosures in several other areas and requires disclosures in two instances.  See supra Part VI.A.
     [302].   See 45 C.F.R. § 164.502(a)(1)(ii) (listing “treatment, payment or health care operations” as a permissible basis for covered entities to use or disclose protected health information); id. § 164.501 (defining the activities that comprise treatment, payment, and healthcare operations).
     [303].   See id. § 164.502(a)(1)(vi) (listing as permissible uses and disclosure of protected health information by covered entities those that are “permitted by and in compliance with . . . § 164.512”); id. § 164.512 (listing twelve public interest activities that comprise permissible uses of protected health information by covered entities, including uses and disclosures required by law; uses and disclosures for public health activities; disclosures about victims of abuse, neglect, or domestic violence; uses and disclosures for health oversight activities; disclosures for judicial and administrative proceedings; disclosures for law enforcement purposes; uses and disclosures about decedents; uses and disclosures for cadaveric organ, eye, or tissue donation purposes; uses and disclosures for research purposes; uses and disclosures to avert a serious threat to health or safety; uses and disclosures for specialized government functions; and disclosures for workers’ compensation); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34 (“HIPAA’s regulations directly advance that interest, because they permit the nonconsensual disclosure or use of patient-identifiable information only in limited circumstances such as ‘treatment, payment, or health care operations,’ or national ‘public health activities . . . .’” (citations omitted)).
     [304].   See 45 C.F.R. § 164.502(a)(1) (listing permissible uses and disclosure of protected health information by covered entities); id. § 164.502(a)(1)(iv) (permitting disclosures pursuant to an authorization).
     [305].   See id. § 164.508(a)(3) (imposing special restrictions upon marketing uses and disclosures); see also discussion of marketing restrictions supra Part VI.B.
     [306].   See § 164.508(a)(3)(i) (providing generally that “a covered entity must obtain an authorization for any use or disclosure of protected health information for marketing”).  Under the American Recovery and Reinvestment Act of 2009 (“ARRA”), Pub. L. No. 111-5, § 17935(d)(1), 123 Stat. 115 (2009), sales of protected health information must be authorized, but this limit is broadly framed to apply to all sales of health information, both marketing and nonmarketing.
     [307].   See Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2660 (2011) (quoting the law as providing that pharmacies may not disclose pharmacy information for marketing and drug manufacturers may not use the information for marketing, unless the prescriber consents).
     [308].   Id. at 2668.
     [309].   See, e.g., Brief for Petitioners, supra note 261, at 36 (“The protection of free speech should not restrict reasonable consumer privacy protections that give consumers control over nonconsensual uses of their information.”); Brief of Respondents IMS Health Inc., Verispan, LLC, & Source Healthcare Analytics, Inc., supra note 297, at 32, 44 (“There is no dispute that, although genuine privacy measures restrict free speech by prohibiting the disclosure of factual information, they satisfy First Amendment scrutiny because they are tailored to further a substantial interest in protecting an important expectation of privacy. . . . Vermont errs in relying on several statutes and regulatory regimes that prohibit private parties from disclosing information.  All those measures satisfy constitutional scrutiny because they are not intended to restrict speech but instead consistently protect an important privacy interest.  The Solicitor General all but acknowledges that, in light of all the contradictions in Vermont law, Act 80 does not function as a genuine privacy statute.”); see also Brief for the United States as Amicus Curiae Supporting Petitioners, supra note 31, at 34–35 (“[T]his Court’s analysis of the ‘fit’ between the Vermont statute and the State’s legislative objectives should not affect those federal provisions [like HIPAA].”); Reply Brief for Petitioners, supra note 261, at 8 (“If respondents were correct, then privacy laws generally would be subject to strict scrutiny. . . . This position is plainly untenable . . . .”).

By Randall K. Johnson

Introduction

Does lawsuit data collection deter police misconduct lawsuits? One might think so, judging from recent scholarship on police accountability and deterrence.[1] The best of this work argues that police learn from lawsuit data collection, without actually proving the point.[2] While I agree with the premise that law enforcement agencies may learn from better and more complete information, there is little proof that lawsuit data collection deters police misconduct lawsuits.[3] As a result, additional research is necessary to support or refute this claim.

I modeled and tested this claim in a recent paper: Do Police Learn from Lawsuit Data?[4] My paper introduced a new § 1983 dataset[5] in order to determine if lawsuit data collection correlates with better deterrence of published misconduct cases.  This dataset drew on 10,044 cases that were brought against twenty-six U.S. law enforcement agencies.[6] I matched these published cases with police employment data[7] in order to compute officer-to-lawsuit ratios.[8] These computations were done for all twenty-six law enforcement agencies and three separate groups of departments.[9] After comparing these average ratios, at the individual and group levels, I found that departments that consistently gather lawsuit data do not perform better than other law enforcement agencies.[10] This finding indicates that police may not learn from lawsuit data collection.[11] As a result, law enforcement agencies may need to identify a more promising approach. One approach, which is often overlooked by departments, is third-party data collection.

This Essay argues that third-party data collection, particularly of administrative complaints and departmental audit information, holds greater promise than lawsuit data collection. It asserts that third-party data collection is more useful for three reasons. First, third-party data collection may prevent manipulation by individual police officers and law enforcement agencies. Second, it may ensure that police behavioral trends are identified. Lastly, third-party data collection may help to deter published § 1983 cases. This Essay, however, models and tests only the final claim.

I.  Methodology

This Essay models and tests one claim: that police may learn from third-party data collection. In doing so, it draws on the same § 1983 dataset that I used to find out if police learn from lawsuit data collection. As in my earlier work, better deterrence is equated with higher officer-to-lawsuit ratios. Less effective deterrence, in contrast, is equated with lower average ratios. By comparing these average ratios, at the individual and group levels,[12] I found a baseline for each subset and another for the entire population. The baselines helped me to determine two things: whether the departments belong to the same population and whether their ratios follow a normal distribution.

This approach complements regression analysis in several ways. First, officer-to-lawsuit ratios provide a simple way to test new hypotheses. Second, this approach shows whether lawsuits have been deterred. Third, officer-to-lawsuit ratios account for differences in department size. Finally, this approach captures the effect of changes in litigation strategy, such as no-settlement policies.[13]

The preceding analysis indicates that officer-to-lawsuit ratios may be useful, even with a relatively small population.[14] This approach, however, will not be valid when law enforcement agencies do not meet a minimum “size” threshold.[15] The minimum size, at least in this Essay, is 330 officers. These departments also must face more than a nominal number of published § 1983 cases. Failure to meet either requirement means that a department is excluded from this Essay’s analysis.[16] These two requirements, and other potential problems, are addressed deliberately in order to avoid methodological pitfalls.[17]

Within this context, I evaluate a single claim: that law enforcement agencies with greater access to third-party data are, on average, more effective in deterring published § 1983 cases. This claim is evaluated by determining whether law enforcement agencies with greater access to third-party data have higher officer-to-lawsuit ratios than other departments (with less access to third-party data). This comparison will substantiate or refute the claim that police may learn from third-party data collection.

II.  Results

As I stated earlier in this Essay, my § 1983 dataset has 10,044 cases. These cases were published by LexisNexis between 2006 and 2012. I restricted these data by year (2006 to 2012), jurisdiction (federal district court), and cause of action (§ 1983). Next, these cases were matched with police employment data in order to compute officer-to-lawsuit ratios for twenty-six law enforcement agencies.  I also used this dataset to compute average ratios for three groups of departments (law enforcement agencies with access to complaint data and audit data, departments without access to third-party data, and a control group, which has access to complaint data or audit data).  These officer-to-lawsuit ratios are given, individually and by department group, in Tables 1, 2, 3, and 4.
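
The arithmetic behind these ratios can be illustrated with a short sketch. The following Python fragment is not the code used for the underlying study, and its variable and function names are purely illustrative; it simply reworks the Table 2 figures to show how the individual and group figures are derived. Note that the group figure reported in Table 2 (sixty-two to one) corresponds to the mean of the eight individual department ratios, not to the ratio of the column averages.

```python
# Minimal sketch (not the study's actual code): officer-to-lawsuit ratios
# recomputed from the Table 2 figures.

# Department: (officers, average published § 1983 cases, 2006-2011), per Table 2
table_2 = {
    "New York":     (36118, 363),
    "Boise":        (330, 5),
    "Philadelphia": (6832, 105),
    "San Jose":     (1342, 21),
    "New Orleans":  (1646, 26),
    "Chicago":      (13129, 235),
    "Albuquerque":  (951, 20),
    "Denver":       (1405, 41),
}

def officer_to_lawsuit_ratio(officers, avg_cases):
    """Officers per published § 1983 case; a higher ratio is read as better deterrence."""
    return officers / avg_cases

ratios = {dept: officer_to_lawsuit_ratio(o, c) for dept, (o, c) in table_2.items()}
for dept, r in ratios.items():
    print(f"{dept}: {round(r)} to 1")   # New York: 99 to 1, Boise: 66 to 1, etc.

# Group figure: the mean of the individual ratios (reproduces Table 2's 62 to 1).
group_average = sum(ratios.values()) / len(ratios)
print(f"Group average: {round(group_average)} to 1")
```

The same computation, applied to the non-asterisked rows of Tables 3 and 4, yields the forty-three-to-one and fifty-one-to-one group figures discussed below.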

As illustrated in Table 2, law enforcement agencies with access to complaint and audit data had an average ratio of sixty-two to one.[18] Departments without access to third-party data,[19] which are described in Table 3, had an officer-to-lawsuit ratio of forty-three to one.

The control group,[20] which is highlighted in Table 4, had an average ratio of fifty-one to one. When these ratios are compared, it is clear that departments with more access to third-party data perform better than others. This finding supports the claim that police learn from third-party data collection.

Conclusion

This Essay demonstrates that law enforcement agencies with greater access to third-party data are, on average, more effective in deterring published § 1983 cases. As a result, police may learn from more third-party data collection. These law enforcement agencies, however, should avoid situations that distort third-party data. For example, third-party data may be less accurate when regulators and police officers share office space.[21] It also may have limited usefulness when data collection is not done in a timely manner or employs substandard procedures.[22] Lastly, third-party data may be less effective when there are costly barriers to reporting police misconduct.[23]

Fortunately, each of these data-collection issues may be overcome by employing solutions that are grounded in practice. Several examples may be found in legal clinics, especially when law students are used to collect and analyze third-party data.[24] Other examples arise in regulatory settings and draw on public resources, staffing, and expertise.[25]  Lastly, additional examples may emerge over time, especially if new legislation calls for more robust third-party data collection.[26]

In summary, it is clear why police may learn from third-party data collection. First, it may provide better and more complete information about the underlying causes of misconduct. Second, third-party data collection may be useful for modeling actual police behavior. Lastly, third-party data collection may help departments overcome heuristic biases and other informational failures.

Table 1.  Background Information for Twenty-Six Law Enforcement Agencies

Jurisdiction Third Party Consistently Gathers Complaints[27] Departmental Audits[28] Ratio of Officers to § 1983 cases
*Villa Rica *No *No *206 to 1
L.A. County No Yes 129 to 1
*Farmington *No *Yes *125 to 1
New York Yes Yes 99 to 1
Washington, D.C. Yes No 93 to 1
Boise Yes Yes 66 to 1
Philadelphia Yes Yes 65 to 1
San Jose Yes Yes 64 to 1
New Orleans Yes Yes 63 to 1
Buffalo No No 58 to 1
Chicago Yes Yes 56 to 1
Cincinnati No No 52 to 1
Nashville No Yes 51 to 1
Albuquerque Yes Yes 48 to 1
Prince George County No No 41 to 1
Portland No Yes 40 to 1
Detroit No No 39 to 1
New Jersey No No 37 to 1
Seattle No Yes 35 to 1
Denver Yes Yes 34 to 1
Los Angeles No No 30 to 1
Oakland Yes No 22 to 1
Pittsburgh Yes No 19 to 1
Sacramento No Yes 18 to 1
*Steubenville *No *No *17 to 1
*Wallkill *No *No *17 to 1
* Indicates that data for that department are not used to compute group-level averages.

 Table 2. Law Enforcement Agencies with Access to Complaint Data and Departmental Audit Data

Jurisdiction Number of Officers[29] 2006 Published  § 1983  Cases[30] 2007  Published § 1983 Cases[31] 2008 Published § 1983 Cases[32] 2009 Published § 1983 Cases[33] 2010 Published § 1983 Cases[34] 2011  Published § 1983 Cases[35] Average Number of Published  § 1983 Cases Ratio of Officers to Published § 1983 Cases
New York 36118 309 303 320 358 452 436 363 99 to 1
Boise 330 5 3 4 4 9 3 5 66 to 1
Philadelphia 6832 93 106 95 110 95 133 105 65 to 1
San Jose 1342 13 18 19 27 24 24 21 64 to 1
New Orleans 1646 20 25 31 27 20 32 26 63 to 1
Chicago 13129 164 165 210 215 297 358 235 56 to 1
Albuquerque 951 22 11 19 31 22 17 20 48 to 1
Denver 1405 32 25 38 40 58 55 41 34 to 1
Average 7720 83 82 92 102 123 131 102 62 to 1

Table 3. Law Enforcement Agencies Without Access to Complaint Data or Departmental Audit Data

Jurisdiction Number of Officers[36] 2006 Published § 1983 Cases[37] 2007 Published § 1983 Cases[38] 2008 Published § 1983 Cases[39] 2009 Published § 1983 Cases[40] 2010 Published § 1983 Cases[41] 2011 Published § 1983 Cases[42] Average Number of Published § 1983 Cases Ratio of Officers to Published § 1983 Cases
*Villa Rica *35 *1 *0 *0 *0 *0 *0 *0 *206 to 1
Buffalo 750 4 10 18 5 18 23 13 58 to 1
Cincinnati 1048 25 20 21 18 15 19 20 52 to 1
Prince George County 1344 17 24 23 38 45 53 33 41 to 1
Detroit 3512 68 73 77 101 125 102 91 39 to 1
New Jersey 2768 62 63 92 63 74 94 75 37 to 1
Los Angeles 9099 145 229 297 390 386 403 308 30 to 1
*Steubenville *50 *2 *5 *3 *2 *2 *3 *3 *17 to 1
*Wallkill *33 *3 *0 *4 *1 *1 *3 *2 *17 to 1
Average 2071 37 48 60 69 74 78 61 43 to 1

Table 4. Law Enforcement Agencies with Access to Complaint Data or Departmental Audit Data

Jurisdiction Number of Officers[43] 2006 Published  § 1983  Cases[44] 2007  Published § 1983 Cases[45] 2008 Published § 1983 Cases[46] 2009 Published § 1983 Cases[47] 2010 Published § 1983 Cases[48] 2011  Published § 1983 Cases[49] Average Number of Published  § 1983 Cases Ratio of Officers to Published § 1983 Cases
LA County 8239 49 30 53 77 92 83 64 129 to 1
*Farmington *125 *1 *0 *1 *1 *1 *3 *1 *125 to 1
Washington, D.C. 3800 39 38 38 37 43 52 41 93 to 1
Nashville 1212 18 15 23 16 30 41 24 51 to 1
Portland 1050 21 31 19 31 23 31 26 40 to 1
Seattle 1248 39 39 31 43 35 29 36 35 to 1
Oakland 803 29 30 41 37 47 35 37 22 to 1
Pittsburgh 892 26 33 42 54 62 67 47 19 to 1
Sacramento 677 28 42 26 34 49 42 37 18 to 1
Average 2006 28 29 31 37 43 43 35 51 to 1

              *   J.D. 2012, University of Chicago Law School; M.U.P. 2006, New York University; M.Sc. 2003, London School of Economics; B.A. 2000, University of Michigan. Special thanks to Amos Jones, Taimoor Aziz, and Lionel Foster.

        [1].   See, e.g., Myriam E. Gilles, In Defense of Making Government Pay: The Deterrent Effect of Constitutional Tort Remedies, 35 Ga. L. Rev. 845, 853 (2001).

        [2].   See, e.g., Joanna C. Schwartz, Myths and Mechanics of Deterrence: The Role of Lawsuits in Law Enforcement Decisionmaking, 57 UCLA L. Rev. 1023, 1086 (2010) [hereinafter Schwartz, Myths and Mechanics]; Joanna C. Schwartz, What Police Learn from Lawsuits, 33 Cardozo L. Rev. 841, 890 (2012) [hereinafter Schwartz, What Police Learn].

        [3].   See generally Victor E. Kappeler, Critical Issues in Police Civil Liability (3d ed. 2001).

        [4].   Randall K. Johnson, Do Police Learn from Lawsuit Data?, 40 Rutgers L. Rec. 30, 36 (2012).

        [5].   “The primary vehicle for asserting federal claims against local public entities and public employees is the Civil Rights Act of 1871, 42 U.S.C. § 1983. [The statute’s] broad language . . . led to its present status as the primary source of redress for a wide variety of governmental abuses.” Robert W. Funk et al., Civil Rights Liability, in Illinois Municipal Law: Contracts, Litigation and Home Rule (2012 ed.).

        [6].   Johnson, supra note 4, at 35. I used LexisNexis Advance to perform the research, and I searched using the following legal search terms: Villa /s Rica /s Police; Farmington /s Police; New /s York /s Police; District /s Columbia /s Police; Boise /s Police; Philadelphia /s Police; San /s Jose /s Police; New /s Orleans /s Police; Buffalo /s Police; Chicago /s Police; Cincinnati /s Police; Nashville /s Police; Albuquerque /s Police; Prince /s Georges /s County /s Police; Portland /s Police; Detroit /s Police; Seattle /s Police; Denver /s Police; Los /s Angeles /s Police; Oakland /s Police; Pittsburgh /s Police; Sacramento /s Police; Steubenville /s Police; Wallkill /s Police; Los /s Angeles /s County /s Sheriff and New /s Jersey /s State /s Trooper. These results were restricted by jurisdiction (U.S. Federal), citation (42 U.S.C. § 1983), and timeline (six intervals were used: 01/01/2006 to 01/01/07; 01/01/07 to 01/01/08; 01/01/08 to 01/01/09; 01/01/09 to 01/01/10; 01/01/10 to 01/01/11; 01/01/11 to 01/01/2012).

        [7].   See Brian A. Reaves, Census of State & Local Law Enforcement Agencies, 2004, Bureau Just. Stat. Bull. (June 2007), http://bjs.ojp.usdoj.gov/content/pub/pdf/csllea04.pdf.

        [8].   Johnson, supra note 4, at 34 & n.25 (“Ratios describe the relationship between two quantities, as expressed by one number being divided by the other.”).

        [9].   Id. at 38–42 (noting that the groups are law enforcement agencies that consistently gather lawsuit data, law enforcement agencies that ignore lawsuit data, and a control group, which inconsistently gathers lawsuit data).

      [10].   Id. at 37.

      [11].   Id.

      [12].   The three groups are law enforcement agencies with access to complaint data and audit data, law enforcement agencies without access to third-party data, and a control group, which has access to one type of third-party data.

      [13].   See, e.g., Heather Kerrigan, Chicago’s Police Misconduct Cases Go to Court, Governing (Feb. 2011), http://www.governing.com/topics/public-justice-safety/Chicagos-Police-Misconduct-Cases-Go-to-Court.html.

      [14].   Johnson, supra note 4, at 33 (“In addition to [the] restrictions [described above], only published cases are used so as to exclude frivolous claims, settlements and textbook applications of § 1983. Each of these precautions are necessary, in order to [test Schwartz’s hypothesis.]”). Nothing, however, would preclude departments from providing information about the full “universe” of § 1983 cases. By doing so, law enforcement agencies would increase the target population size, individual sample sizes, and the reliability of this indirect measure of police misconduct.

      [15].   See Baruch Lev & Shyam Sunder, Methodological Issues in the Use of Financial Ratios, 1 J. of Acct. & Econ. 187, 187–88 (1979).

      [16].   Examples are Farmington, Steubenville, Wallkill, and Villa Rica. Data for each department are accompanied by an asterisk (*), which indicates that data for that department are not used to compute group-level averages.

      [17].   Johnson, supra note 4, at 35 (“Selection effects are addressed by testing only [certain departments] . . . , which have similar histories of police misconduct. Omitted variables are accounted for by creating a control group[, which is roughly the same size as the other two groups]. Reverse causation is addressed by treating the time period [as either an independent variable or] as a dependent variable.”).

      [18].   These law enforcement agencies are New York, Boise, Philadelphia, San Jose, New Orleans, Chicago, Albuquerque, and Denver.

      [19].   These law enforcement agencies are Villa Rica, Buffalo, Cincinnati, Prince George’s County, Detroit, New Jersey, Los Angeles PD, Steubenville, and Wallkill.

      [20].   These law enforcement agencies are Los Angeles County, Farmington, Washington, D.C., Nashville, Portland, Seattle, Oakland, Pittsburgh, and Sacramento.

      [21].   See, e.g., Rob Wildeboer, Police Oversight Agency Moving from Chicago’s South Side, WBEZ91.5 (Oct. 6, 2011), http://www.wbez.org/story/police-oversight-agency-moving-chicagos-south-side-92881.

      [22].   See, e.g., Al Baker & Joseph Goldstein, Police Tactic: Keeping Crime Reports Off the Books, N.Y. Times, Dec. 31, 2011, at A1.

      [23].   See, e.g., Cal. Civ. Code § 47.5 (2005); Cal. Penal Code § 148.6 (2008).

      [24].   See, e.g., Craig B. Futterman et al., The Use of Statistical Evidence to Address Police Supervisory and Disciplinary Practices: The Chicago Police Department’s Broken System, 1 DePaul J. of Soc. Just. 251, 252 (2008).

      [25].   See, e.g., City of New York, Office of the Comptroller, Claims Report Fiscal Years 2009 & 2010, at 1-2, 34–35 (2011).

      [26].   See, e.g., N.Y. City Council, Int. No. 130 (2010).

      [27].   Johnson, supra note 4, at 43–45.

      [28].   Schwartz, Myths and Mechanics, supra note 2, at 1090.

      [29].   Reaves, supra note 7, at app. 2, 4.

      [30].   Johnson, supra note 4, at 38–42.

      [31].   Id.

      [32].   Id.

      [33].   Id.

      [34].   Id.

      [35].   Id.

      [36].   See Reaves, supra note 7, at 9–10; Johnson, supra note 4, at 41–42.

      [37].   Johnson, supra note 4, at 41–42.

      [38].   Id.

      [39].   Id.

      [40].   Id.

      [41].   Id.

      [42].   Id.

      [43].   Reaves, supra note 7, at app. 2, 4.

      [44].   Johnson, supra note 4, at 38–42.

      [45].   Id.

      [46].   Id.

      [47].   Id.

      [48].   Id.

      [49].   Id.

By Derek E. Bambauer

Cyberlaw is plagued by the myth of perfection.

Consider three examples: censorship, privacy, and intellectual property.  In each, the rhetoric and pursuit of perfection has proved harmful, in ways this Essay will explore.  And yet the myth persists—not only because it serves as a potent metaphor, but because it disguises the policy preferences of the mythmaker.  Scholars should cast out the myth of perfection, as Lucifer was cast out of heaven.  In its place, we should adopt the more realistic, and helpful, conclusion that often good enough is . . . good enough.

Start with Internet censorship. Countries such as China, Iran, and Vietnam use information technology to block their citizens from accessing on-line material that each government dislikes.  Democracies, too, filter content: Britain blocks child pornography using the Cleanfeed system,{{1}} and South Korea prevents users from reaching sites that support North Korea’s government.{{2}}  This filtering can be highly effective: China censors opposition political content pervasively,{{3}} and Iran blocks nearly all pornographic sites (along with political dissent).{{4}}  However, even technologically sophisticated systems, like China’s Golden Shield, are vulnerable to circumvention.  Users can employ proxy servers or specialized software, such as Tor, to access proscribed sites.{{5}}  This permeability has led many observers to conclude that effective censorship is impossible, because censorship is inevitably imperfect.{{6}}  Filtering is either trivially easy to bypass, or doomed to failure in the arms race between censors and readers.  The only meaningful censorship is perfect blocking, which is unattainable.

And yet, leaky Internet censorship works.  Even in authoritarian countries, few users employ circumvention tools.{{7}} Governments such as China’s capably block access to most content about taboo subjects, such as the Falun Gong movement{{8}} or coverage of the Arab Spring uprisings.{{9}}  Those who see imperfect censorship as useless make three errors.  First, they ignore offline pressures that users face.  Employing circumvention tools is like using a flashlight: it helps find what you seek, but it draws attention to you.  China has become adept at detecting and interfering with Tor,{{10}} and Iran recently purchased a sophisticated surveillance system for monitoring Internet communications.{{11}}  Bypassing censorship in cyberspace may have adverse consequences in realspace.  Second, most Internet users are not technologically sophisticated.  They use standard software, and the need to install and update specialized circumvention tools may be onerous.{{12}}  Finally, governments do not need perfect censorship to attain their goals.  They seek to prevent most people from obtaining prohibited content, not to banish it entirely.  Censorship that constrains the average user’s ordinary web browsing generally suffices.

Privacy discourse too is obsessed with perfection.  The reidentification wars have pitted researchers who assert that anonymizing data is impossible{{13}} against those who argue the risk of breaching properly sanitized datasets is vanishingly small.{{14}}  While the arguments are dauntingly technical (for those unfamiliar with advanced statistics), the empirical evidence points toward the less threatening conclusions.  The only rigorous study demonstrating an attack on a properly de-identified dataset under realistic circumstances revealed but 2 out of 15,000 (.013%) participants’ identities.{{15}}  Moreover, critics of anonymized data overlook the effects of incorrect matches. Attackers will have to weed out false matches from true ones, complicating their task.

Opponents make three mistakes by focusing on the theoretical risk of re-identification attacks on properly sanitized data.  First, the empirical evidence for their worries is slight, as the data above demonstrates.  There are no reports of such attacks in practice, and the only robust test demonstrated minimal risk.  Second, anonymized data is highly useful for socially beneficial purposes, such as predicting flu trends, spotting discrimination, and analyzing the effectiveness of medical and legal interventions.{{16}} Finally, the most significant privacy risk is from imperfectly sanitized data: organizations routinely release, deliberately or inadvertently, information that directly identifies people, or that enables an attacker to do so without advanced statistical knowledge.  Examples are legion, from the California firm Biofilm releasing the names and addresses of 200,000 customers who asked for free Astroglide samples{{17}} to AOL’s disclosure of user queries that allowed researchers to link people to their searches.{{18}}  Concentrating on whether perfect anonymization is possible distracts from far more potent privacy threats emanating from data.

Intellectual property (“IP”) in the digital age is similarly obsessed with perfection.  IP owners argue that with the advent of perfect digital copies, high-speed networks, and distributed dissemination technologies, such as peer-to-peer file-sharing software, any infringing copy of a protected work will spread without limit, undermining incentives to create.  This rhetoric of explosive peril has resulted in a perpetual increase in the protections for copyrighted works and in the penalties for violating them.{{19}}

The quest for perfect safeguards for IP predates the growth of the commercial Internet.  In September 1995, President Clinton’s administration released its White Paper, which argued that expanded copyright entitlements were necessary for content owners to feel secure in developing material for the nascent Information Superhighway.{{20}}  Without greater protection, the Paper argued, the Superhighway would be empty of content, as copyright owners would simply refuse to make material available via the new medium.

This prediction proved unfounded, but still persuasive.  In the last fifteen years, Congress has reinforced technological protection measures such as Digital Rights Management with stringent legal sanctions;{{21}} has augmented penalties for copyright infringement, including criminal punishments;{{22}} has pressed intermediaries, such as search engines, to take down allegedly infringing works upon notification by the copyright owner;{{23}} and has dedicated executive branch resources to fighting infringement.{{24}}  And yet, pressures from content owners for ever-greater protections continue unrelentingly.  In the current Congress, legislation introduced in both the House of Representatives and the Senate would, for the first time in American history, have authorized filtering of sites with a primary purpose of aiding infringement{{25}} and would have enabled rightsowners to terminate payment processing and Internet advertising services for such sites.{{26}}  These proposals advanced against a backdrop of relatively robust financial health for the American movie and music industries.{{27}}

Thus, the pursuit of perfection in IP also contradicts empirical evidence.  Content industries have sought to prohibit, or at least hobble, new technologies that reduce the cost of reproduction and dissemination of works for over a century—from the player piano{{28}} to the VCR{{29}} to the MP3 player{{30}} to peer-to-peer file-sharing software.{{31}}  And yet each of these advances has opened new revenue horizons for copyright owners.  The growth in digital music sales is buoying the record industry,{{32}} and the VCR proved to be a critical profit source for movies.{{33}}  New copying and consumption technologies destabilize prevailing business models, but not the production of content itself.{{34}}

Moreover, perfect control over IP-protected works would threaten both innovation and important normative commitments.  The music industry crippled Digital Audio Tapes{{35}} and failed to provide a viable Internet-based distribution mechanism until Apple introduced the iTunes Music Store.{{36}}  The movie industry has sought to cut off supply of films to firms such as Redbox that undercut its rental revenue model,{{37}} and Apple itself has successfully used copyright law to freeze out companies that sold generic PCs running MacOS.{{38}}  And, the breathing room afforded by the fair use and de minimis doctrines, along with exceptions to copyright entitlements, such as cover licenses, enables a thriving participatory culture of remixes, fan fiction, parody, criticism, and mash-ups.  Under a system of perfect control, copyright owners could withhold consent to derivative creators who produced works of which they disapproved, such as critical retellings of beloved classics, for example Gone With The Wind,{{39}} or could price licenses to use materials beyond the reach of amateur artists.{{40}}  Perfection in control over intellectual property is unattainable, and undesirable.

The myth of perfection persists because it is potent.  It advances policy goals for important groups—even, perhaps, groups on both sides of a debate.  For censorship, the specter of perfect filtering bolsters the perceived power of China’s security services.  It makes evasion appear futile.  For those who seek to hack the Great Firewall, claiming to offer the technological equivalent of David’s slingshot is an effective way to attract funding from Goliath’s opponents.  Technological optimism is a resilient, seductive philosophical belief among hackers and other elites{{41}} (though one that is increasingly questioned).{{42}}

Similarly, privacy scholars and advocates fear the advent of Big Data: the aggregation, analysis, and use of disparate strands of information to make decisions—whether by government or by private firms—with profound impacts on individuals’ lives.{{43}}  Their objections to disclosure of anonymized data are one component of a broader campaign of resistance to changes they see as threatening to obviate personal privacy.  If even perfectly anonymized data poses risks, then restrictions on data collection and concomitant use gain greater salience and appeal.

Finally, concentrating on the constant threat to incentives for cultural production in the digital ecosystem helps content owners, who seek desperately to adapt business models before they are displaced by newer, more nimble competitors.  They argue that greatly strengthened protections are necessary before they can innovate.  Evidence suggests, though, that enhanced entitlements enable content owners to resist innovation, rather than embracing it.  The pursuit of perfection turns IP law into a one-way ratchet: protections perpetually increase, and are forever insufficient.

We should abandon the ideal of the sublime in cyberlaw.  Good enough is, generally, good enough.  Patchy censorship bolsters authoritarian governments.  Imperfectly anonymized data generates socially valuable research at little risk.  And a leaky IP system still supports a thriving, diverse artistic scene.  Pursuing perfection distracts us from the tradeoffs inherent in information control, by reifying a perspective that downplays countervailing considerations.  Perfection is not an end, it is a means—a political tactic that advances one particular agenda.  This Essay argues that the imperfect—the flawed—is often both effective and even desirable as an outcome of legal regulation.


*    Associate Professor of Law, Brooklyn Law School (through spring 2012); Associate Professor of Law, University of Arizona James E. Rogers College of Law (beginning fall 2012).  Thanks for helpful suggestions and discussion are owed to Jane Yakowitz Bambauer, Dan Hunter, Thinh Nguyen, Derek Slater, and Chris Soghoian.  The author welcomes comments at [email protected].

[[1]]   Richard Clayton, Failures in a Hybrid Content Blocking System, in Privacy Enhancing Technologies: 5th International Workshop PET 2005 78 (George Danezis & David Martin eds., 2006).[[1]]

[[2]]   Eric S. Fish, Is Internet Censorship Compatible With Democracy? Legal Restrictions of Online Speech in South Korea, Asia-Pac. J. Hum. Rts. & the L. (forthcoming 2012), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1489621.[[2]]

[[3]]   China, OpenNet (June 15, 2009), http://opennet.net/research/profiles/china-including-hong-kong.[[3]]

[[4]]   Iran, OpenNet (June 16, 2009), http://opennet.net/research/profiles/iran.[[4]]

     [[5]]   See, e.g., James Fallows, “The Connection Has Been Reset”, The Atlantic (March 2008), http://www.theatlantic.com/magazine/archive/2008/03/-ldquo-the-connection-has-been-reset-rdquo/6650/.[[5]]

      [[6]]   See, e.g., Oliver August, The Great Firewall: China’s Misguided—and Futile—Attempt to Control What Happens Online, Wired (Oct. 23, 2007), http://www.wired.com/politics/security/magazine/15-11/ff_chinafirewall?currentPage=all; Troy Hunt, Browsing the broken Web: A Software Developer Behind the Great Firewall of China, Troy Hunt’s Blog (Mar. 16, 2012), http://www.troyhunt.com/2012/03/browsing-broken-web-software-developer.html; Weiliang Nie, Chinese Learn to Leap the “Great Firewall”, BBC News (Mar. 19, 2010), http://news.bbc.co.uk/2/hi/8575476.stm.[[6]]

      [[7]]   Erica Naone, Censorship Circumvention Tools Aren’t Widely Used, Tech. Rev. (Oct. 18, 2010), http://www.technologyreview.com/web/26574/.[[7]]

      [[8]]   China, supra note 3.[[8]]

      [[9]]   Richard Fontaine & Will Rogers, China’s Arab Spring Cyber Lessons, The Diplomat (Oct. 3, 2011), http://the-diplomat.com/2011/10/03/china%E2%80%99s-arab-spring-cyber-lessons/.[[9]]

      [[10]]   Tim Wilde, Knock Knock Knockin’ on Bridges’ Doors, Tor (Jan. 7, 2012), https://blog.torproject.org/blog/knock-knock-knockin-bridges-doors.[[10]]

      [[11]]   Phil Vinter, Chinese Sell Iran £100m Surveillance System Capable of Spying on Dissidents’ Phone Calls and Internet, Daily Mail (Mar. 23, 2012), http://www.dailymail.co.uk/news/article-2119389/Chinese-sell-Iran-100m-surveillance-capable-spying-dissidents-phone-calls-internet.html.[[11]]

      [[12]]   See generally Nart Villeneuve, Choosing Circumvention: Technical Ways to Get Round Censorship, in Reporters Without Borders, Handbook for Bloggers and Cyberdissidents 63 (2005), available at http://www.rsf.org/IMG/pdf/handbook_bloggers_cyberdissidents-GB.pdf.[[12]]

      [[13]]   See, e.g., Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. Rev. 1701, 1752 (2010); Latanya Sweeney, Patient Identifiability in Pharmaceutical Marketing Data (Data Privacy Lab, Working Paper No. 1015, 2011), available at http://dataprivacylab.org/projects/identifiability/pharma1.html.[[13]]

      [[14]]   See, e.g., Jane Yakowitz, The Tragedy of the Data Commons, 25 Harv. J.L. & Tech. 1, 52 (2011); Khaled El Emam et al., A Systematic Review of Re-identification Attacks on Health Data, PLoS One (Dec. 2011), http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.002807.[[14]]

      [[15]]   Deborah Lafky, Program Officer, Dep’t Health and Human Servs., The Safe Harbor Method of De-Identification: An Empirical Test, ONC Presentation (October 9, 2009), available at http://www.ehcca.com/presentations/HIPAAWest4/lafky_2.pdf.[[15]]

      [[16]]   See Yakowitz, supra note 14.[[16]]

      [[17]]   Christopher Soghoian, Astroglide Data Loss Could Result in $18 Million Fine, DubFire (July 9, 2007), http://paranoia.dubfire.net/2007/07/astroglide-data-loss-could-result-in-18.html.[[17]]

      [[18]]   Katie Hafner, Leaked AOL Search Results Create Ethical Dilemma for Researchers, N.Y. Times (Aug. 23, 2006), http://www.nytimes.com/2006/08/23/technology/23iht-search.2567825.html?pagewanted=all.[[18]]

      [[19]]   See generally Robert Levine, Free Ride: How Digital Parasites are Destroying the Culture Business, and How the Culture Business Can Fight Back (2011); Jessica Litman, Digital Copyright (2001); Mike Masnick, Why Is The MPAA’s Top Priority “Fighting Piracy” Rather Than Helping the Film Industry Thrive?, Techdirt (Feb. 22, 2011), http://www.techdirt.com/articles/20110221/15024713194/why-is-mpaas-top-priority-fighting-piracy-rather-than-helping-film-industry-thrive.shtml.[[19]]

      [[20]]   Pamela Samuelson, The Copyright Grab, Wired (Jan. 1996), http://www.wired.com/wired/archive/4.01/white.paper.html.[[20]]

      [[21]]   17 U.S.C. § 1201 (2006).[[21]]

      [[22]]   17 U.S.C. § 1204 (2006); No Electronic Theft (NET) Act, Pub. L. No. 105-147, 111 Stat. 2678 (1997).[[22]]

      [[23]]   17 U.S.C. § 512(c) (2006).[[23]]

      [[24]]   Prioritizing Resources and Organization for Intellectual Property (PRO IP) Act, Pub. L. No. 110-403, 122 Stat. 4256 (2008).[[24]]

      [[25]]   PROTECT IP Act of 2011, S. 968, 112th Cong. (2012).[[25]]

      [[26]]   Stop Online Piracy Act of 2011, H.R. 3261, 112th Cong. (2012).[[26]]

      [[27]]   Robert Andrews, Music Industry Can See The Light After “Least Negative” Sales Since 2004, Time (Mar. 26, 2012), http://business.time.com/2012/03/26/music-industry-can-see-the-light-after-least-negative-sales-since-2004/; Brooks Barnes, A Sliver of a Silver Lining for the Movie Industry, N.Y. Times (Mar. 22, 2012), http://mediadecoder.blogs.nytimes.com/2012/03/22/a-sliver-of-a-silver-lining-for-the-movie-industry/#; Bob Lefsetz, Movie Industry Is Making Money from Technologies It Claimed Would KILL Profits, The Big Picture (Jan. 30, 2012, 4:30 PM), http://www.ritholtz.com/blog/2012/01/movie-industry-is-making-money-from-technologies-it-claimed-would-kill-profits/.[[27]]

      [[28]]   See White-Smith Music Publ’g Co. v. Apollo Co., 209 U.S. 1, 13–14 (1908) (holding that a piano roll does not infringe composer’s copyright because the perforated sheets are not copies of the sheet music).[[28]]

      [[29]]   See Sony v. Universal Studios, 464 U.S. 417, 442 (1984) (holding that the manufacture of a VCR does not constitute contributory copyright infringement because it “is widely used for legitimate, unobjectionable purposes”).[[29]]

      [[30]]   See Recording Indus. Ass’n of Am. v. Diamond Multimedia Sys., 180 F.3d 1072, 1081 (9th Cir. 1999) (upholding a district court denial of preliminary injunction against the manufacture of the Rio MP3 player because the Rio is not subject to the Audio Home Recording Act of 1992).[[30]]

      [[31]]   See Metro-Goldwyn-Mayer Studios v. Grokster, 545 U.S. 913, 918 (2005) (holding that a distributor of a peer-to-peer file sharing network is liable for contributory copyright infringement when it distributes the product with “the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement”).[[31]]

      [[32]]   Andrews, supra note 27.[[32]]

      [[33]]   Michelle Schusterman, Infographic: Why the Movie Industry is So Wrong About SOPA, Matador (Jan. 17, 2012), http://matadornetwork.com/change/infographic-why-the-movie-industry-is-so-wrong-about-sopa/.[[33]]

      [[34]]   See generally Mark A. Lemley, Is the Sky Falling on the Content Industries?, 9 J. Telecomm. & High Tech. L. 125 (2011) (explaining that while the introduction of new technologies in the past may have disrupted certain industries, the new technology did not stop the creation of new content).[[34]]

      [[35]]   See generally Tia Hall, Music Piracy and the Audio Home Recording Act, 2002 Duke L. & Tech. Rev. 0023 (2002).[[35]]

      [[36]]   Derek Slater et al., Content and Control: Assessing the Impact of Policy Choices on Potential Online Business Models in the Music and Film Industries (Berkman Center for Internet & Society at Harvard Law School, Research Publication No. 2005-10, 2005), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=654602.[[36]]

      [[37]]   Paul Bond, Warner Bros., Redbox Divided on DVD Terms, The Hollywood Reporter (Feb. 29, 2012), http://www.hollywoodreporter.com/news/warner-bros-redbox-dvd-ultraviolet-flixster-kevin-tsujihara-296071.[[37]]

      [[38]]   See Apple Inc. v. Psystar Corp., 658 F.3d 1150, 1162 (9th Cir. 2011).[[38]]

      [[39]]   See SunTrust Bank v. Houghton Mifflin Co., 268 F.3d 1257, 1275 (11th Cir. 2001) (denying a preliminary injunction because a fair use defense would prevent the plaintiff, owner of the copyright of Gone With the Wind, from preventing the defendant from publishing a novel that critiques Gone With the Wind).[[39]]

      [[40]]   See generally Derek E. Bambauer, Faulty Math: The Economics of Legalizing The Grey Album, 59 Ala. L. Rev. 345 (2007) (contending the economics of the derivative works right prevents the creation of new works and stifles the re-mix culture).[[40]]

      [[41]]   John Gilmore averred that “[t]he Net interprets censorship as damage and routes around it.”  Philip Elmer-Dewitt, First Nation in Cyberspace, Time, Dec. 6, 1993, at 62.[[41]]

      [[42]]   See generally Evgeny Morozov, The Net Delusion (2011) (arguing that the Internet makes it easier for dictators to prevent democratic uprisings).[[42]]

      [[43]]   See generally Julie Cohen, Configuring the Networked Self (2011) (making the case that flows of private information are not restricted and proposing legal reforms to address the problem); Jessica Litman, Information Privacy / Information Property, 52 Stan. L. Rev. 1283 (2000) (contending that industry’s self-regulation of information privacy has failed and proposing that torts may be the best available avenue to improve privacy rights); danah boyd & Kate Crawford, Six Provocations for Big Data, Symposium, A Decade in Internet Time: Symposium on the Dynamics of Internet and Society, Oxford Internet Inst. (Sept. 2011), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926431 (proposing six questions about the potential negative effects of Big Data).[[43]]