
III. REIMAGINING PROTECTIONS FOR INTIMATE INFORMATION

A. Reframing the Conversation
B. Special Protections for Intimate Information
  1. Limits on Collection
  2. Use Restrictions
  3. Remedies: Halt Processing and the Data Death Penalty
C. Objections
  1. Market
  2. Free Speech

This Part sketches some guiding principles for the protection of intimate information in the commercial sector. My goal is fourfold: to situate data privacy as a matter of civil rights; to stem the tidal wave of data collection; to restrict certain uses of intimate data; and to expand the suite of remedies available to courts.

A. Reframing the Conversation

In the United States, information privacy is viewed through a consumer protection lens.269 The central theme is notice and choice.270 So long as businesses provide notice of their data practices, consumers are treated as having elected to trade their data for commercial services.271 The U.S. approach has been described as “privacy self-management” and “privacy work.”272

The consumer protection model—as it is currently constructed—is both descriptively and conceptually flawed.273 Firms provide “notice” in privacy policies while “consent” is inferred from people’s decisions to visit sites, download apps, and purchase goods.274 Both are fictions. In practice, notice rarely provides individuals with relevant information that they can understand and use. It rarely, if ever, provides details about third-party marketing. It does not seek express, written consent in a form designed to inform people about a firm’s practices, and it does not give them the option of declining the collection of their personal information while still using the service.



269 Woodrow Hartzog, The Inadequate, Invaluable Fair Information Practices, 76 MD. L. REV. 952 (2017); Woodrow Hartzog, The Case Against Idealising Control, 4 EUR. DATA PROTECTION REV. 423 (2018); Neil Richards & Woodrow Hartzog, The Pathologies of Consent, 96 WASH. U. L. REV. 1461 (2019).

270 The California Online Privacy Protection Act requires that entities notify California residents about their data collection practices. Most companies follow CalOPPA because almost any service is likely to collect information from California residents.

271 JULIE COHEN, CONFIGURING THE NETWORKED SELF: LAW, CODE, AND THE PLAY OF EVERYDAY LIFE (2012).

272 Daniel J. Solove, Privacy Self-Management and the Consent Dilemma, 126 HARV. L. REV. 1880 (2013); Daniel J. Solove, The Myth of the Privacy Paradox, GEO. WASH. L. REV. (forthcoming). Alice Marwick invokes the concept of privacy work, drawing on a feminist framework that aptly captures uncompensated work that is disproportionately shouldered by women and marginalized communities. See ALICE MARWICK, HIDDEN: NETWORKED PRIVACY AND THOSE LEFT OUT (forthcoming).

273 Paul Schwartz, Privacy and Democracy in Cyberspace, 52 VAND. L. REV. 1609, 1658 (1999).

274 Richards & Hartzog, The Pathologies of Consent, supra note, at.


Even when firms make an effort to notify individuals directly about their practices, the consent provided is hardly meaningful. Lived experience casts doubt on the proposition that people have really consented to the trade of their personal data for services.275 When a pop-up appears online, people tend to click “I Agree” because it is less onerous than reading dense privacy policies.276 Evan Selinger and Brett Frischmann rightly describe this as a form of manufactured consent.277 Individuals also have difficulty appreciating low-probability harms that nonetheless befall a significant percentage of people.

Further complicating the ability to secure meaningful consent is the fact that companies have every incentive, in the words of Woodrow Hartzog, to “hide the risks in their data practices through manipulative design, vague abstractions, and complex words.”278 Firms’ website interfaces and default settings are designed to maximize data collection. As Hartzog further explains, businesses “engineer . . . [interactions] to expedite the transfer of rights and relinquishment of protections.”279

A consumer protection approach not only fails to satisfy its goal of notice and choice; it also insufficiently captures the stakes.280 To be sure, a firm’s collection of intimate data might constitute deception if its privacy policy says one thing and the firm does another. But, in addition, it might undermine the crucial values that sexual privacy protects and impede a fair chance to work, obtain housing, afford insurance, and express oneself.281 The consumer protection model lacks the capacity and even the vocabulary with which to protect these interests.282


275 Woodrow Hartzog, The Inadequate, Invaluable Fair Information Practices, 76 MD. L. REV. 952, 953 (2017).

276 WOODROW HARTZOG, PRIVACY’S BLUEPRINT: THE BATTLE TO CONTROL THE DESIGN OF NEW TECHNOLOGIES 130 (2018).

277 BRETT FRISCHMANN & EVAN SELINGER, RE-ENGINEERING HUMANITY 209-210 (2018) (explaining that people feel compelled to agree, undermining any desire to object, and thus informed consent is really manufactured or manipulated consent). Shoshana Zuboff talks about the notice and consent regime as a kind of psychic numbing. ZUBOFF, supra note, at.

278 Testimony of Woodrow Hartzog, “Policy Principles for Federal Data Privacy Framework,” Senate Committee on Commerce, Science, and Transportation (February 17, 2019), available at https://www.commerce.senate.gov/services/files/8B9ADFCC-89E6-4DF3-9471-5FD287051B53.

279 Id.

280 See Danielle Keats Citron & Mary Anne Franks, The Internet as Speech Conversion Machine and Other Myths Confounding Section 230 Reform, U. CHI. LEGAL F. (forthcoming); Citron, Sexual Privacy, supra note; Citron, Cyber Civil Rights, supra note.

281 JULIE COHEN, CONFIGURING THE NETWORKED SELF: LAW, CODE, AND THE PLAY OF EVERYDAY LIFE (2012).

282 NEIL M. RICHARDS, WHY PRIVACY MATTERS (forthcoming) (on file with author).


In certain contexts, law protects crucial life opportunities and social goods as civil rights.283 Federal and state civil rights laws secure the ability to work, attend school, use the telephone, obtain housing, and vote on equal terms.284 I am not suggesting that civil rights laws apply to the private-sector surveillance of intimate data, which they mostly do not.285 Nonetheless, a civil rights framing brings into focus that far more than consumer choices are in jeopardy when firms amass intimate information.286 The ability to engage in life’s crucial activities hangs in the balance, especially for women, sexual minorities, and racial minorities, and often on an intersectional basis.

Situating sexual privacy in the civil rights conversation is important.287 Law plays a crucial expressive role.288 It teaches us why certain interests matter and why they warrant law’s protection.289 A civil rights framing would attest to the close relationship between reservoirs of intimate data and opportunities essential for human flourishing. I am not suggesting that civil rights laws cover all freedoms and social goods in need of protection (they do not).290 Their reach is further limited by the state action doctrine.291 These limitations curtail the expressive power of those laws.292 Nonetheless, situating private sector surveillance of intimate life as a matter of civil rights and not just consumer choices helps begin the conversation about what those freedoms should be in the context of privacy law specifically and civil rights law more generally.


283 Title VII; FMLA; Title IX; Americans with Disabilities Act.

284 Danielle Citron & Mary Anne Franks, Cyber Civil Rights in the Time of COVID-19, HARV. L. REV. BLOG (May 14, 2020), https://blog.harvardlawreview.org/cyber-civil-rights-in-the-time-of-covid-19/.

285 Of course, Title VII might be understood to ban discriminatory hiring practices that involve relying on intimate information, as I have suggested in previous work. See Citron, Hate Crimes in Cyberspace, supra note, at. There is also the Genetic Information Nondiscrimination Act, which bans employers from using genetic information in employment decisions.

286 As scholars have explored, antidiscrimination laws like Title VII are ill-suited to address the use of discriminatory algorithms in employment matters. See Deborah Hellman, Measuring Algorithmic Fairness, 106 VA. L. REV. 811 (2020); Pauline Kim, Data-Driven Discrimination at Work, 58 WILLIAM & MARY L. REV. 857 (2017); Solon Barocas & Andrew Selbst, Big Data’s Disparate Impact, 104 CAL. L. REV. 671 (2016).

287 I will be further exploring this argument in later work. Danielle Keats Citron & Courtney Hinkle, Privacy, a Matter of Civil Rights (in progress).

288 CITRON, HATE CRIMES IN CYBERSPACE, supra note, at; Danielle Keats Citron, Law’s Expressive Value in Combating Cyber Gender Harassment, 108 MICH. L. REV. 373 (2009).

289 Citron, Law’s Expressive Value, supra note, at.

290 In her important new book, Robin West calls for a transformative understanding of civil rights that does not merely prohibit discrimination but that entails rights essential to the justice of the nation. ROBIN L. WEST, CIVIL RIGHTS: RETHINKING THEIR NATURAL FOUNDATION (2019).

291 Id. (exploring the various ways that civil rights laws have failed to fulfill their potential to protect social goods themselves).

292 As Alice Walker eloquently explained, “’Civil rights’ is a term that did not evolve out of black culture, but rather out of American law. As such, it is a term of limitation. It speaks only to physical possibilities—necessary and treasured, of course—but not of the spirit. Even as it promises assurance of greater freedoms it narrows the area in which people might expect to find them.” ALICE WALKER, IN SEARCH OF OUR MOTHERS’ GARDENS 335 (1983).


Some legislators and law enforcers have underscored the connection between privacy and civil rights. New York Attorney General Letitia James attributed her investigation of the gay dating app Jack’d to the special importance of privacy to the LGBTQ community. As she noted, “[a]pproximately 80 percent of the app’s users were individuals of color and had reason to fear discrimination from the exposure of their personal information or private photographs.”293

Understanding privacy as a matter of civil rights provides inspiration for reform. Data protection laws tend to focus on process, such as notice of an entity’s data practices and the ability to correct mistakes.294 By contrast, civil rights law moves in a more substantive direction by limiting certain conduct and imposing affirmative obligations.295 Civil rights law requires the caretakers of crucial spaces to maintain them in ways that promote equal access, and it holds them accountable when they fail to do so.296 School administrators, private employers, hotel proprietors, and restaurant owners have responsibilities to ensure that their spaces are free of discrimination and abuse.297 Educational institutions and employers must craft and enforce anti-discrimination policies, and they must respond to credible complaints of sexual harassment or racial abuse. Hotels and restaurants must ensure that individuals are not denied service on the basis of protected characteristics.


293 Press Release, Attorney General James Announces Settlement With Dating App for Failure To Secure Private And Nude Photos (June 28, 2019), https://ag.ny.gov/press-release/2019/attorney-general-james-announces-settlement-dating-app-failure-secure-private-and.

294 Woodrow Hartzog & Neil Richards, Privacy’s Constitutional Moment and the Limits of Data Protection, 61 B.C. L. REV. 1687 (2020).

295 Woody Hartzog and Neil Richards have laid important groundwork for a substantive turn for privacy law in their coauthored scholarship. See, e.g., id.; Richards & Hartzog, The Duty of Loyalty, supra note.

296 Citron & Franks, Cyber Civil Rights in the Time of COVID-19, supra note.

297 Title VII, Title IX.


Privacy law should follow this substantive turn. We see a measured move in that direction in the wake of the COVID-19 pandemic. A bill recently proposed by Senators Warner and Blumenthal frames the recognition of a “right to privacy” in “emergency health data” as a civil rights matter.298 It requires the adoption of reasonable safeguards against unlawful discrimination based on emergency health data. It prohibits discriminating against people in, or otherwise making unavailable, “goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation,” as well as the right to vote, on the basis of “emergency health data.” The bill would have the Secretary of Health and Human Services work with the U.S. Commission on Civil Rights and the FTC to submit a report examining how the collection, use, and disclosure of COVID-19 health information affects civil rights.

Recognizing privacy as a matter of civil rights may provide support for stronger privacy protections at the federal and state level. Information privacy is having a zeitgeist moment. Dozens of federal privacy bills are under consideration. At the state level, privacy laws are being proposed at a rapid clip.299 A civil rights framing might encourage lawmakers to adopt robust privacy protections rather than watering bills down or letting them die in committee. If privacy bills are described as a consumer protection matter, then lawmakers will be more comfortable arguing that the profitability of firms should be balanced against consumer interests. By contrast, lawmakers would be less inclined to barter away civil rights against discrimination to protect firms’ profits or to reduce administrability costs.300 Indeed, in some circumstances, civil rights do not allow for any bartering at all—this is certainly true of voting.


298 Public Health Emergency Privacy Act, https://epic.org/privacy/covid/Public-Health-Emergency-Privacy-Act.pdf.

299 CCPA; Washington state.

300 We see some accommodation of economic interests in Title VII in its exclusion of small firms. I am grateful to Neil Richards for exploring this point with me.


In recognizing privacy’s centrality to human flourishing, the U.S. would move in a direction that much of the world has already taken. In the European Union, information privacy (better known as data protection) is a “fundamental right” essential for “dignity, personality, and informational self-determination.”301 This is not a wholesale endorsement of the EU’s General Data Protection Regulation—its overall tack is overly focused on procedural commitments.302 It is instead to note that most of the world views data privacy as a human right.

B. Special Protections for Intimate Information

Before turning to the special protections owed to intimate information, I must note the need for strong baseline protections for all personal data collected in the private sector.303 All of the reasons why we need sexual privacy support comprehensive data protection in the United States. Technological advances may soon enable firms to turn innocuous personal data into sensitive information—including intimate information—with a high degree of accuracy.304 Paul Ohm and Scott Peppet have memorably termed this prospect “when everything reveals everything.”305 We need to stem the tide of over-collection and to restrict downstream use, sharing, and storage of personal data in part to protect intimate information.

No matter: whether or not lawmakers move on any of the countless comprehensive data privacy bills under consideration at the federal and state levels, intimate information warrants special protection right now. This section focuses on areas worthy of reform. Certain activity should be off limits, including the collection and use of intimate information in certain contexts. Additional remedies should be available to address violations, including a “stop processing” order until violations are fixed. We might also consider reserving the possibility of a data death penalty.


301 Paul M. Schwartz & Karl-Nikolaus Peifer, Transatlantic Data Privacy Law, 106 GEO. L.J. 115 (2017); see also Paul M. Schwartz, Global Data Privacy Law: The EU Way, NYU L. REV. (2018).

302 Bert-Jaap Koops, The Trouble with European Data Protection Law, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2505692.

303 Personally identifiable information is a central concept in privacy law. Paul M. Schwartz & Daniel J. Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 NYU L. REV. 1814 (2011). Federal and state laws address what information constitutes personal information in different ways. An organizing principle is whether an individual is identified or can be reasonably identified.

304 Paul Ohm & Scott Peppet, What If Everything Reveals Everything?, in BIG DATA IS NOT A MONOLITH 53 (Cassidy Sugimoto et al. eds., 2016).

305 Id. That possibility certainly supports the call for strong baseline rules for the handling of personal information.


1. Limits on Collection

The default assumptions around the handling of intimate information must change. The norm of collection is not inevitable, unless law and society make it so. The status quo undermines the values that sexual privacy protects and risks people’s well-being.

To be sure, the collection of intimate information can produce more upside than downside in certain contexts. Law should work to ensure that collection occurs in those contexts and no others. Of course, no legal approach can guarantee this outcome. The following reforms, however, are offered with that goal in mind.

Firms should be required to obtain meaningful consent before collecting intimate information. The “gold standard of consent” combines the “knowing and voluntary” waiver standard from constitutional law and the informed consent standard from biomedical ethics.306 Requests for consent also must be “infrequent [and] the risks of giving consent must be vivid and easy to envision.”307 Last, firms can only seek consent to collect intimate data for a legitimate business purpose.

As to the knowing requirement, requests for consent should be clear and understandable. They should explain what intimate data would be collected, how it would be used by the firm to provide its service, and how long it would be retained. Requests for consent should be conspicuous. Where possible, they should be made separately from the process of signing up for a service. They should be designed in a way that enhances the likelihood that people will understand them.308 Lessons from design psychology can be leveraged to make it more likely that people consider the question rather than simply clicking “I Agree.”309


306 Richards & Hartzog, supra note, at 1465, 1475.

307 Id. at 1492. Richards and Hartzog also argue that for consent to be meaningful, it must occur in contexts where people have the incentive to take the request seriously. For platforms collecting sensitive information like dating apps, they argue that people may be more inclined to consider the risks so long as requests do not arrive in dribs and drabs. Id. at 1498.

308 On this score, see the important work of Ryan Calo. See, e.g., M. Ryan Calo, Against Notice Skepticism in Privacy (And Elsewhere), 87 NOTRE DAME L. REV. 1027 (2012). Calo explores various mechanisms for delivering notice that rely on consumer experience rather than entirely words or symbols. Id. at 1039-47.

309 See European Data Protection Board, Guidelines 05/2020 on consent under Regulation 2016/679 (Adopted on May 4, 2020), https://edpb.europa.eu/sites/edpb/files/files/file1/edpb_guidelines_202005_consent_en.pdf.


As for voluntariness, requests for consent must not be “take it or leave it” if a firm can provide that service without collecting intimate data. Adult sites, for instance, do not need to track people’s searches to provide their services. Thus, people should be able to decline collection requests and still be able to browse adult sites. Firms also should not make it difficult for people to deny requests or engage in other activity designed to “coerce, wheedle, and manipulate people to grant it.”310

The context of the request should enable people to answer with care. Firms should not be permitted to bombard people with requests;311 they must keep their requests infrequent. When requests for consent are infrequent, individuals have time to consider them and likely will not feel overwhelmed. When requests are frequent, individuals just agree to stop being hassled.312 Firms also should spell out the risks in concrete and vivid terms so that individuals understand what happens if their intimate data is leaked or improperly used or shared.

Under this approach, first-party data collectors would have to obtain people’s meaningful consent before amassing intimate information. They could only request consent to collect intimate data for a legitimate business reason. Sometimes, however, the collection of intimate data is necessary for the service to function at all. This is true of dating, fertility, and period-tracking apps. In such cases, requests for collection would have to make clear that the service depends upon the collection of intimate data and that the data will be used only to provide that service and for no other reason. Firms could then decline to provide services to people who reject their requests.

Not so for third-party data collectors, who must make clear that individuals can decline their requests without consequence. This recommendation would alter the ground rules for the marketplace of intimate information. At present, third-party advertisers and data brokers do not have to ask people for permission to track their intimate data. If this proposal were adopted, they would not only have to seek permission from individuals, but their requests would have to be framed so that people could easily refuse, knowing that refusal would have no consequences (beyond not receiving personalized ads).


310 Richards & Hartzog, supra note, at 1489.

311 Id. at 1494.

312 Id.


Admittedly, this requirement would be a significant setback for advertisers and data brokers. Data brokers would have to seek explicit consent before collecting intimate information. So would advertisers that track intimate information on porn sites, period-tracking services, and dating apps. To be clear, the meaningful-consent requirement would apply only to intimate information. The advertising and data brokerage industries would not end. Instead, the default presumption that intimate information can be collected unbeknownst to individuals and without their permission would have to end. The sky will not fall.

My experience working with companies and lawmakers on the nonconsensual hosting of nude images informs this approach. Cyber Civil Rights Initiative President and my frequent coauthor Mary Anne Franks has long argued that nude images should not be posted online without written consent. After the first California Cyber Exploitation Task Force in-person meeting in the spring of 2015, Franks suggested as much to a tech company safety official. Her suggestion, wise then and wise now, was met with shock and dismay. The safety official—a thoughtful person with extensive content moderation experience—explained that social media companies could not possibly require prior written consent before nude images were posted online. Why not, we asked? The official responded that if written consent was required, then it might be more likely that nude photos would not be posted because the subjects of those photos would not give their consent.

Then, as now, we wondered what the problem was.313 As we noted then, written consent would not prevent the posting of nude photos, just nude photos where the subject did not consent (or at least where the poster was not willing to sign something saying that the subject consented to the posting). This sentiment applies not only to sites trafficking in nonconsensual pornography and deep fake sex videos, but also data brokers and advertisers. If firms want to collect intimate information, then they should obtain people’s meaningful consent to do so.


313 Of course, we knew the problem was that online platforms optimize for likes, clicks, and shares so that they can earn advertising income.


Privacy laws covering certain sensitive information often include affirmative consent requirements, though they fall short of the “gold standard.” The Illinois Biometric Information Privacy Act conditions the collection of biometric data on consent given after a firm informs consumers that biometric information is being collected and stored, the reason for the collection, use, and storage, and the duration of the storage.314 HIPAA’s Privacy Rule permits uses of health data necessary for treatment, payment, or health care operations and requires consent for any uses beyond those purposes. Under federal law, cable providers generally may not disclose subscribers’ information to anyone without subscribers’ consent.315

An alternative approach to seeking meaningful consent would be to limit the collection of intimate information to instances where entities have a legitimate, reasonable basis for collecting intimate data and where individuals would reasonably expect it.316 The advertising industry would surely prefer this approach. Advertisers have a legitimate business reason for collecting personal data and their practices might comport with people’s reasonable expectations depending on the context. The outcome would be different for data brokers. People do not reasonably expect that unknown shadowy actors are amassing their intimate information in digital dossiers. In my view, this approach is far less compelling than requiring meaningful consent. The data collection imperative for intimate data would continue with too little friction restraining it.

Certain collection practices should be off-limits. Law should prohibit services whose raison d’être is the nonconsensual collection of intimate data. Period, no exceptions. Software that “undresses” women in photographs runs afoul of this mandate; so do apps that facilitate the secret and undetectable monitoring of someone’s cellphone and sites hosting nonconsensual pornography and deep fake sex videos.317 To ensure that this reform would reach revenge porn sites and their ilk, Congress should amend the federal law shielding online services from liability for user-generated content, as I have long argued it should.318


314 740 Ill. Comp. Stat. 14/20(2).

315 Cable Communications Policy Act of 1984, 47 U.S.C. § 551. The European Union’s General Data Protection Regulation requires opt-in consent for the placement of tracking cookies. For sensitive information, including information about individuals’ sexuality, companies can collect such information only with explicit, affirmative consent.

316 See the thoughtful proposals of Cameron F. Kerry, Proposed Standards for Data Collection in Privacy Legislation, LAWFARE, https://www.lawfareblog.com/data-collection-standards-privacy-legislation-proposed-language (“Collection and processing [defined terms] of personal data shall have a reasonable, articulated basis that takes into account reasonable business needs of the [covered entity/controller/etc.] engaged in the collection balanced with the intrusion on the privacy and the interests of persons whom the data relates to”). Kerry noted, and I agree, that his proposal would “take provisions or rulemaking that exclude certain sensitive data fields or targeting to establish boundaries for behavioral advertising. . . . [E]ven if behavioral advertising in general is considered a reasonable business purpose, this collection language could be construed as barring Target’s processing of purchasing data to deliver ads for maternity products to a secretly pregnant teenager as an excessive intrusion on her privacy and interests.”

317 Such a rule would reinforce the legal practice of pornography—the recording and sharing of nude imagery with the subject’s explicit consent.

318 Section 230 of the Communications Decency Act secures a shield from liability for sites that under- or over-filter content provided by another information content provider. My prior work has explored suggestions for amending Section 230, so I will not belabor the point here. See CITRON, HATE CRIMES IN CYBERSPACE, supra note; Danielle Keats Citron & Mary Anne Franks, The Internet as a Speech Conversion Machine and Other Myths Confounding Section 230 Reform, U. CHI. LEGAL F. (forthcoming); Danielle Keats Citron, Cyber Mobs, Disinformation, and Death Videos: The Internet As It Is (And As It Should Be), 118 MICH. L. REV. 1073 (2020); Danielle Keats Citron & Benjamin Wittes, The Internet Will Not Break: Denying Bad Samaritans Section 230 Immunity, 86 FORDHAM L. REV. 401 (2017); Citron, Cyber Civil Rights, supra note.


We have recognized no-collection zones in other contexts. American law has long banned the collection of information crucial to the exercise of civil liberties. Under the Privacy Act of 1974, for instance, federal agencies are precluded from collecting information that exclusively concerns individuals’ First Amendment activities. In NAACP v. Alabama, the Supreme Court struck down a court order requiring the civil rights group to produce its membership lists on the ground that privacy in group associations is indispensable to preserving the freedom to associate.319 Apps and services designed to facilitate the collection of intimate information without individuals’ permission are an equal affront to civil rights and civil liberties, and they should be prohibited.

To wrap up this discussion, it is worth noting the synergy between limits on collection and limits on the retention of intimate information. Restrictions on collection should be paired with an obligation to delete or otherwise destroy intimate information as soon as it is no longer needed to fulfill the purpose prompting its collection. This obligation would minimize the potential for leaks or the sale of intimate data.320 The Fair Credit Reporting Act and the Video Privacy Protection Act similarly require the destruction of records from background checks or movie watching as soon as practicable.321 Under the EU’s General Data Protection Regulation, personal data can be kept only for as long as is necessary to fulfill the original basis for its collection and processing.322


319 NAACP v. Alabama, 357 U.S. 449, 466 (1958).

320 Seda Gürses et al., Engineering Privacy by Design Reloaded, available at https://iapp.org/media/pdf/resource_center/Engineering-PbD-Reloaded.pdf.

321 15 U.S.C. 1681w (discussing disposal of records in consumer financial information context); 18 U.S.C. 2710(e) (requiring destruction of old records in context of video rental or sale records).

322 Article 5, Principles Related to the Processing of Personal Data, General Data Protection Regulation, section 1(e) (“personal data shall be . . . kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed (‘storage limitation’)”).


2. Use Restrictions

Policymakers should restrict the uses of personal data to protect the values secured by sexual privacy and reduce the risks to well-being. Companies collect massive quantities of personal information on the expectation that someday it will generate significant returns. As Paul Ohm observes, “chasing profits, companies hoard data for future, undefined uses; redistribute it to countless third parties; and repurpose it in ways their customers never imagined.”323

Personal data collected for a legitimate business purpose should not be repurposed to infer people’s intimate information without obtaining separate consent. This mirrors the approach of the Fair Information Practice Principles (FIPPs).324 The FIPPs are the foundation of most privacy laws in the United States and around the world, as well as of most understandings of information ethics. Under the FIPPs, information obtained for one purpose cannot be used or made available for other purposes without the person’s consent.325 That restriction is often referred to as a “secondary use limitation.”

Under this approach, a social media company could not use its subscribers’ personal data to infer their sexuality, HIV status, and miscarriages without seeking meaningful consent. It could not use subscribers’ intimate information to infer other intimate information without seeking meaningful consent. Subscribers’ intimate information, of course, could be used for the purpose for which it was collected and for which firms obtained meaningful consent. This would include allowing subscribers to message each other and to post intimate information.


323 Paul Ohm, Sensitive Information, 88 S. CAL. L. REV. 1125, 1128 (2015).

324 The FIPPs were first articulated by privacy scholar Alan Westin in 1967 and popularized by the U.S. Department of Health, Education, and Welfare in 1973. See ALAN WESTIN, PRIVACY AND FREEDOM (1967); https://epic.org/privacy/consumer/code_fair_info.html.

325 https://epic.org/privacy/consumer/code_fair_info.html; Privacy Policy Guidance Memorandum Department of Homeland Security (December 29, 2008) https://www.dhs.gov/sites/default/files/publications/privacy-policy-guidancememorandum-2008-01.pdf.


We need clear rules against the exploitation of intimate information to manipulate people to act in ways consistent with another’s ends rather than their own. As explored in Part II, law enforcers have investigated uses of personal data to target the vulnerabilities of protected groups as unfair commercial practices.326 Such cases, however, remain rare. A ban would make clear that such practices are unlawful and would encourage enforcement actions directed at exploitative practices.327 More broadly, privacy law should require firms to act in the best interest of the individuals whose intimate data they have collected, consistent with duties of loyalty and care.328

Strong use restrictions would protect the values that sexual privacy secures and prevent the harms explored in this piece. Individuals would not have their sexual autonomy undermined by a dating app secretly sharing their HIV status, sexual fantasies, or sex toy use with advertisers. They would not suffer blows to their self-esteem due to the posting of their nude photos on revenge porn sites or the inclusion of their sexual assault in data brokers’ dossiers. They would not be chilled from using reproductive health apps for fear that their struggles with painful periods or infertility would undermine their job opportunities or raise their insurance premiums.

3. Remedies: Halt Processing and the Data Death Penalty

Injunctive relief against improper processing of intimate data should be part of the suite of remedies for the very worst offenders.329 Privacy debates of late have focused on the wisdom of recognizing civil actions for damages or administrative fines.330 Injunctive relief has not been a key part of the discussion, but it should be.


326 HARTZOG, supra note, at 131 (explaining that UDAP laws are designed to prevent the exploitation of human vulnerabilities).

327 Jaime Luguri & Lior Strahilevitz, Shining a Light on Dark Patterns, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3431205.

328 Richards & Hartzog, Duty of Loyalty, supra note, at; Richards & Hartzog, Pathologies of Consent, supra note, at 1500 (arguing that lawmakers should create rules designed to protect our trust—meaning “being discreet with our data, honest about the risk of data practices, protective of our personal information, and, above all, loyal to us, the data subjects”).

329 The topic of privacy remedies has not attracted sustained attention, with notable exceptions. For one such exception, see the important work of Lauren Henry Scholz. See, e.g., Lauren Henry Scholz, Privacy Remedies, 94 IND. L.J. 1 (2019). Scholz argues for the recognition of restitution as a privacy remedy.


Privacy legislation should recognize judicial power to order injunctive relief in cases involving serial offenders. In such cases, injunctive relief should be mandatory to assure meaningful protection of sexual privacy and to make clear its priority over competing interests.331

As with substantive duties, so with remedies: civil rights law provides a model for reform. Injunctive relief is a core feature of civil rights law.332 Federal, state, and local anti-discrimination statutes permit injunctive relief,333 and courts have employed equitable remedies in flexible and creative ways.334 In workplace sexual harassment cases, courts have ordered employers to implement anti-harassment policies and procedures, provide training, retain personnel records, and install security cameras.335

Lawmakers should recognize courts’ power to order repeat offenders to halt the processing of intimate information. Figuring out whether a firm qualifies as a repeat offender would entail several steps. First, the court would issue an order directing the party to fulfill its legal obligations. Second, if the court is presented with clear evidence that the party has violated the first order, the court would issue a second order directing the firm to stop processing intimate data until compliance has been achieved, as shown by an independent third-party audit.336 Third, if the court is shown clear evidence that the party has failed to comply a third time, then and only then would the court impose what can be called the data death penalty—an order permanently barring the firm from processing intimate information.


330 The debate has largely centered on private rights of action. Industry lobbyists strongly oppose privacy bills that include private rights of action. Private rights of action are essential given the limited resources available to federal and state law enforcers.

331 Lawmakers must make clear that such injunctive relief is automatic. In the absence of clear legislative intent, courts are reluctant to order equitable remedies. Winter v. Natural Resources Defense Council, 555 U.S. 7, 24 (2008). There is an extensive scholarly debate about whether courts should be required to issue injunctions to remedy statutory violations. Michael T. Morley, Enforcing Equality: Statutory Injunctions, Equitable Balancing under eBay, and the Civil Rights Act of 1964, U. CHI. L. FORUM 177 (2014). In the environmental context, Daniel Farber argues that when statutes impose absolute duties on people, injunctive relief is essential to prevent future violations. Daniel A. Farber, Equitable Discretion, Legal Duties, and Environmental Injunctions, 45 U. PITT. L. REV. 513, 515 (1984).

332 OWEN M. FISS, THE CIVIL RIGHTS INJUNCTION 6 (1978) (explaining that injunctive relief was understood after Brown v. Board of Education as the most effective way to guarantee civil rights). For a thoughtful exploration of how courts exercise their equitable powers granted under Title VII, see Michael T. Morley, Enforcing Equality: Statutory Injunctions, Equitable Balancing under eBay, and the Civil Rights Act of 1964, U. CHI. L. FORUM 177 (2014).

333 See, e.g., Civil Rights Act of 1964, 204(a); 43 Pa. Stat. 962(c)(3); Availability of Injunctive Relief under State Civil Rights Acts, 24 U. CHI. L. REV. 174, 174, 180 (1956). In some civil rights statutes, injunctions are the only available remedy. For instance, Title III of the Americans with Disabilities Act allows only injunctive relief, as opposed to monetary damages. Dudley v. Hannaford Brothers Co., 333 F.3d 299, 304 (1st Cir. 2003).

334

335 See, e.g., United States v. Greenwood Community School Corp. (S.D. Ind.); Carey v. O’Reilly Auto. Stores, 2019 WL 3412170 (S.D. Fla. May 31, 2019).

336 A schedule would be set to report the auditor’s findings to the court.


Under a stop-processing order, providers of cyberstalking apps and sites devoted to nonconsensual pornography would have to halt their services.337 An adult site would be ordered to stop collecting individuals’ searches without meaningful consent. Such orders would be crucial to securing an effective remedy for individuals whose sexual privacy has been repeatedly violated.

There is nothing novel about a halt-processing remedy. Under Article 58 of the GDPR, data protection authorities have authority to impose temporary or permanent bans on the processing of personal data. Halt-processing orders must be “appropriate, necessary, and proportionate” to ensure compliance with legal obligations.338 In 2019, the Hamburg Commissioner for Data Protection and Freedom of Information (Hamburg Commissioner) started an administrative procedure to stop Google employees and contractors from listening to voice recordings of Google Home device subscribers for three months.339 The Hamburg Commissioner explained that “effective protection of those affected from eavesdropping, documenting, and evaluating private conversations by third parties can only be achieved by prompt execution.”340 Google responded by pledging not to transcribe voice recordings collected from its personal assistant device.341


337 In the case of revenge porn sites and their ilk, such relief would depend upon changes to Section 230 as explored in note.

338 Recital 129 of the GDPR.

339 Hamburg Commissioner for Data Protection and Freedom of Information, Speech Assistant Systems Put to the Test (August 1, 2019), available at https://datenschutz-hamburg.de/assets/pdf/2019-08-01_press-release-Google_Assistant.pdf. The GDPR permits data protection authorities to take measures to protect the rights of data subjects for a period not to exceed three months. Id.

340 Id. Recall that whistleblowers reported that Google Home was inadvertently recording private and intimate conversations and that contractors were transcribing those conversations in order to analyze whether the device was correctly processing information.

341 Id. Google seemingly has not altered its position.


EU data protection authorities had been issuing halt-processing orders even before the GDPR’s adoption. For instance, Ireland’s data protection authority ordered Loyaltybuild to halt processing personal data for three months after learning that the firm’s data breach involved the personal data of 1.5 million people. The firm was directed to notify clients about the security breach, delete certain data, and achieve compliance with PCI-DSS standards for the processing of credit card data.342 It took the company seven months to fulfill those obligations.

To be sure, even temporary stop-processing orders exact significant costs. Loyaltybuild lost millions of euros in revenue, a considerable blow to the firm.343 For some entities, halting processing for even a month might cause their collapse. New entrants will no doubt find it more challenging to absorb the costs of stop-processing orders than established entities.344 But the grave risk to individuals and society posed by the mishandling of intimate information warrants strong remedies.

C. Objections

The new compact will raise questions about the market and free speech. This section addresses some concerns about the broader social welfare consequences of my reform proposals. It explains why the reform proposals enhance free speech values and would withstand First Amendment challenge.

1. Market

These proposals would surely change the value proposition for many online services. A significant number of the apps and services explored above do not charge fees because they earn advertising revenue. In some markets, third parties have invested heavily in these services, as we have seen in the sexual wellness and dating markets.345


342 https://iapp.org/news/a/cease-processing-orders-under-the-gdpr-how-the-irish-dpa-views-enforcement/

343 Id. The behemoth Google halted transcriptions of conversations captured by personal devices with little impact on its bottom line.

344 At a faculty workshop, my colleagues David Webber and Michael Meurer asked me about potential perverse incentives of stop-processing orders. Might new entrants collect intimate information in violation of the law and then just shut down and restart in a game of endless whack-a-mole? That is surely possible depending on the start-up costs and availability of necessary financing. Criminals have certainly engaged in this sort of whack-a-mole activity in the face of shutdown orders, as in the case of AnonIB. See supra note. Nonetheless, the reputational costs of this strategy would be significant. New entrants seeking third-party capitalization would be less inclined to engage in this sort of behavior.

345 Dana Olsen, The top 13 VC investors in femtech startups, PITCHBOOK (November 2, 2016), available at https://pitchbook.com/news/articles/the-top-13-vc-investors-in-femtech-startups (explaining that a decade ago only $23 million worth of venture capital was invested in the global femtech industry whereas there was nearly $400 million in venture capital funding in 2018); Kate Clark, Dating startup raises VC as Facebook enters the relationship biz, PITCHBOOK (May 4, 2018), available at https://pitchbook.com/news/articles/dating-app-raises-vc-as-facebook-enters-the-relationship-biz (explaining that app-based dating services have attracted venture funding, including apps like Happn, Hinge, Clover, and The League). 2018 set records for investment in apps devoted to women’s and men’s health issues. Dana Olsen, This year is setting records for femtech funding, PITCHBOOK (October 31, 2018), available at https://pitchbook.com/news/articles/this-year-is-setting-records-for-femtech-funding. Two venture capital funds have emerged that are devoted exclusively to investing in women’s health enterprises. Id. One of those firms, Astarte, invested in Lola, which provides subscription-based delivery of organic tampons; Flo, the period-tracking app; and Future Family, a business offering reproductive-health services. Id.


Firms would look to other revenue sources if advertising fees and outside funding dropped significantly. They might charge subscription fees. They might keep basic services at low or no cost and increase the costs for premium or add-on services. A nontrivial number of people might not be able to afford these services.

Non-profit organizations might support efforts to provide some services free of charge. The femtech market seems a likely candidate. Reproductive justice organizations might fund period-tracking apps that provide helpful and truthful information. LGBTQ advocacy groups might hire technologists to create dating apps for community members.

Some gaps would remain, leaving some people unable to afford dating apps, period-tracking services, and subscriptions to adult sites. But failing to protect intimate data exacts too great a cost on sexual privacy, even if protection means that services tracking intimate life remain out of reach for some.

More broadly, we should not discount the role that privacy plays in enhancing market operations. As Ryan Calo has explored, a firm’s commitment to privacy engenders trust.346 Individuals may be more inclined to use services because they believe that a firm’s service is worth the price.347


346 Ryan Calo, Privacy and Markets: A Love Story, 91 NOTRE DAME L. REV. 649 (2015).

347 Id.


2. Free Speech

The proposed reforms will garner objections on free speech grounds. For some scholars, all data privacy laws regulate “speech” and thus may be inconsistent with the First Amendment.348 These arguments illustrate what Leslie Kendrick has criticized as “First Amendment expansionism”: the “tendency to treat speech as normatively significant no matter the actual speech in question.”349 As Kendrick underscored, freedom of speech is a “term of art that does not refer to all speech activities, but rather designates some area of activity that society takes, for some reason, to have special importance.”350

Just because activity can be characterized as speech does not mean that the First Amendment protects it from government regulation.351 Neil Richards helpfully explains that free speech protections hinge on whether government regulations of commercial data flows are “particularly threatening to longstanding First Amendment values.”352 Indeed.

The assertion that all speech (or all data) has normative significance elides the different reasons why speech (or data) warrants protection from particular government regulations but not others.353 Some government regulations censor speech central to self-governance or the search for truth while others raise no such concerns. Some government regulations imperil speech crucial to self-expression while others pose no such threat.354

The proposed reforms would not threaten First Amendment values. The nonconsensual surveillance of intimate life is not necessary for the public to figure out how to govern itself. Requiring explicit consent to handle data about people’s HIV status, abortion, sex toy use, or painful cramps would have little impact on discourse about political, cultural, or other matters of societal concern. People’s miscarriages, erectile dysfunction, abortions, and sexual fantasies have nothing to do with art, politics, or social issues. Nude photos posted without consent contribute nothing to discussions about issues of broad societal interest. Someone’s abortion, miscarriage, and rape are not facts or ideas to be debated in the service of truth.


348 Eugene Volokh, Freedom of Speech and Information Privacy: The Troubling Implications of a Right to Stop People from Speaking About You, 52 STAN. L. REV. 1049, 1050-51 (2000) (arguing that government-imposed fair information practice rules restricting speakers’ ability to communicate truthful data about others are inconsistent with basic First Amendment principles); Jane Bambauer, Is Data Speech?, 66 STAN. L. REV. 57, 63 (2014) (arguing that “for all practical purposes, and in every context relevant to the current debates in information law, data is speech.”).

349 Leslie Kendrick, First Amendment Expansionism, 56 WILLIAM & MARY L. REV. 1199, 1212 (2015).

350 Id.

351 Id.

352 Neil M. Richards, Why Data Privacy Law Is (Mostly) Constitutional, 56 WILLIAM & MARY L. REV. 1501, 1507 (2015).

353 Kendrick, supra note, at.

354 Id.


Regulating the surveillance of intimate life with explicit consent requirements and narrow no-collection zones would not chill self-expression but rather secure the conditions for self-expression.355 The nonconsensual collection of people’s sex toy habits or porn site searches undermines their willingness to engage in sexual expression. People whose nude photos appear on revenge porn sites have difficulty interacting with others and often retreat from online engagement and self-expression.356

The Supreme Court has made clear the inextricable tie between the absence of privacy protections and the chilling of self-expression. In Bartnicki v. Vopper, the Supreme Court observed that “the fear of public disclosure of private conversations might well have a chilling effect on private speech.”357 In Carpenter v. United States, the Court recognized that pervasive, persistent police surveillance of location information enables inferences about one’s sexuality and intimate partners and thereby chills “familial, political, professional, religious, and sexual associations.”358

With the proposed reforms, people would be less fearful of engaging in intimate expression and interaction. If individuals trust firms to use intimate information only for the purpose for which it was collected and no other unless they say otherwise, then they will be more willing to use those services to experiment with ideas. They will be more inclined to browse sites devoted to gender experimentation and to express themselves on dating apps.

For all of these reasons, the Court has made clear that laws regulating speech about “purely private matters” do not raise the same constitutional concerns as laws restricting speech on matters of public interest.359 As the Court explained in Snyder v. Phelps, speech on public matters enjoys rigorous protection “to prevent the stifling of debate essential to democratic self-governance.”360 In contrast, speech about “purely private matters” receives “less stringent” protection because the threat of liability would not risk chilling the “meaningful exchange of ideas” and “robust debate on public issues.”361 Its restriction “does not pose the risk of a reaction of self-censorship on matters of public import.” To illustrate a “purely private matter,” the Court pointed to an individual’s credit report and to videos showing someone engaged in sexual activity.362 The reforms proposed here relate to purely private matters, including videos showing someone engaged in sexual activity.


355 Danielle Keats Citron & Neil M. Richards, Four Principles for Digital Expression (You Won’t Believe #3!), 95 WASH. U. L. REV. 1353, 1379 (2018).

356 CITRON, HATE CRIMES IN CYBERSPACE, supra note, at 195.

357 532 U.S. 514 (2001). See Citron, Hate Crimes in Cyberspace, supra note, at 208-210 (discussing the Court’s recognition in Bartnicki v. Vopper that privacy protections foster private speech).

358 Carpenter v. United States, 138 S. Ct. 2206 (2018). See also David Gray & Danielle Keats Citron, The Right to Quantitative Privacy, 98 MINN. L. REV. 62, 77 (2013) (exploring the chilling effect of indiscriminate, continuous police collection of geolocation data).

359 As Kenneth Abraham and G. Edward White argue, the “all speech is free speech” view devalues the special cultural and social salience of speech about matters of public concern. Kenneth S. Abraham & G. Edward White, First Amendment Imperialism and the Constitutionalization of Tort Liability, TEX. L. REV. (forthcoming), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3437289.

360 Snyder v. Phelps, 131 S. Ct. 1207 (2011). For an extended discussion of Snyder v. Phelps, see CITRON, HATE CRIMES IN CYBERSPACE, supra note, at 215.

361 Snyder, 131 S. Ct. at 1216 (noting that the “content of a particular person’s credit report ‘concerns no public issue’ and was speech solely in the individual interest of the speaker and its particular business audience” and that “videos of an employee engaging in sexually explicit acts did not address a public concern” because it “did nothing to inform the public about any aspect of the [employing agency’s] functioning or operation”).

362 The employee’s loss of public employment was constitutionally permissible because the videos shed no light on the employer’s operation and instead concerned speech on purely private matters.


The proposed reforms comport with First Amendment doctrine.363 Rules governing the collection of information raise few, if any, First Amendment concerns.364 These rules “prohibit information collection by separating the public sphere from the private.”365 Trespass laws, the intrusion on seclusion tort, and video-voyeurism statutes have withstood constitutional challenge.366 Courts have upheld laws requiring informed consent before entities can collect personal data, such as the Fair Credit Reporting Act (FCRA), federal and state wiretapping laws, and the Children’s Online Privacy Protection Act (COPPA).367 It is also worth noting that the reform proposals turn on people’s explicit consent. The Court has held that “private decision making can avoid government partiality and insulate privacy measures from First Amendment challenge.”368 Indeed, explicit consent is part and parcel of data collection laws like FCRA and COPPA.


363 Richards, supra note, at.

364 Neil M. Richards, Reconciling Data Privacy and the First Amendment, 52 UCLA L. Rev. 1149, 1182 (2005).

365 Id.

366NEIL M. RICHARDS, INTELLECTUAL PRIVACY (2015).

367 Id.

368 Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2669 (2011) (citing Rowan v. Post Office, 397 U.S. 728 (1970)).


As Neil Richards argues, “information collection rules do not fall within the scope of the First Amendment under either current First Amendment doctrine or theory.”369 These rules “are of general applicability, neither discriminating against nor significantly impacting the freedoms guaranteed by the First Amendment.”370 The Supreme Court has held that even media defendants enjoy no privilege against the application of ordinary private law in their efforts to collect newsworthy information.371

Trespassers cannot avoid liability by contending that they infringed others’ property rights in order to collect information.372 Computer hackers cannot avoid criminal penalties by insisting that they were only trying to obtain information.373 Websites cannot avoid responsibility under COPPA by insisting that they should not have to ask for parental consent because they need access to children’s online information. Employers cannot avoid liability under FCRA by arguing that they are just trying to learn about people and so should not have to ask for permission to see their credit reports.

Reform proposals restricting the use of intimate information without explicit consent would not run afoul of the First Amendment. Countless laws restrict certain uses of personal information, from state and federal anti-discrimination laws and trade secret laws to FCRA and census rules.374 Laws restricting secondary uses of information have not been held to violate the First Amendment.375 In Bartnicki v. Vopper, the Supreme Court assessed the First Amendment implications of the Wiretap Act’s prohibition on the use or disclosure of intercepted communications. The Court underscored that “the prohibition on the ‘use’ of the contents of an illegal interception . . . [is] a regulation of conduct,” whereas the prohibition on the disclosure or publication of information amounts to a regulation of speech.376


369 Neil M. Richards, Reconciling Data Privacy and the First Amendment, 52 UCLA L. REV. 1149, 1186 (2005).

370 Id.

371 Id. at 1188 (noting that in Cohen v. Cowles Media Co., the Supreme Court held that “the press may not with impunity break and enter an office or dwelling to gather news”).

372 Id.

373 Id.

374 Id. at 1190-91.

375 Id. at 1194.

376 Bartnicki, 532 U.S. at 527.


Sorrell v. IMS Health,377 decided in 2011, does not cast doubt on the likely constitutionality of the collection and use restrictions suggested here. In Sorrell, the Court struck down a Vermont law banning two types of activity. First, the law prohibited pharmacies, health insurers, and similar entities from disclosing doctors’ prescription data for marketing purposes. Second, the law prohibited pharmaceutical companies and health data brokers from using doctors’ prescription data for marketing purposes unless the prescriber consented.378 Data brokers and an association of pharmaceutical companies challenged the regulations on the ground that they violated their free-speech rights.

Justice Kennedy, writing for the majority, struck down the law on First Amendment grounds. Under First Amendment doctrine, discrimination against particular speakers or messages—known as viewpoint-based discrimination—is “virtually always invalid.”379 The Court found that the law did precisely that. It held that the law “imposes a burden based on the content of the speech and the identity of the speaker.”380 The majority underscored that the law “imposed content- and speaker-based restrictions on the availability and use of prescriber-identifying information.”381

As the majority found, the law told pharmacies and regulated entities that they could not sell or give away prescription data for marketing purposes, even though the data could be sold or given away for purposes other than marketing.382 Under the law, pharmacies could share prescriber information with academics and other private entities. The Court explained, “The State has burdened a form of protected expression it has found too persuasive. At the same time, the State has left unburdened those speakers whose messages are not in accord with its own views. This the State cannot do.”

The Court found viewpoint discrimination in the law’s targeting of specific speakers (data brokers and pharmaceutical companies) and not others. As the majority noted, academic institutions could buy prescription data “in countering the messages of brand-name pharmaceutical manufacturers and in promoting the prescription of generic drugs,” but pharmaceutical companies and detailers were denied “the means of purchasing, acquiring, or using prescriber-identifying information.”383


377 131 S. Ct. 2653 (2011).

378 Sorrell, 131 S. Ct. at 2660.

379 RICHARDS, INTELLECTUAL PRIVACY, supra note, at.

380 Sorrell, 131 S. Ct. at 2665.

381 Id.

382 Sorrell, 131 S. Ct. at 2662.

383 Sorrell, 131 S. Ct. at 2663.


The majority rejected the state’s argument that the consent provision insulated the law’s use restriction from constitutional concerns.384 The problem was that the “state gave doctors a contrived choice: Either consent, which will allow your prescriber-identifying information to be disseminated and used without constraint; or, withhold consent, which will allow your information to be used by those speakers whose message the State supports.” The majority explained that doctors could choose privacy only if they “acquiesce[d] in the State’s goal of burdening disfavored speech by disfavored speakers.”385

The Court held that the state failed to provide a sufficiently compelling reason to justify the law, that the state’s interest was not proportional to the burdens placed on speech, and that the law sought to “suppress a disfavored message.” The law failed to advance the state’s asserted interest in medical privacy given that it did not restrict the sale or use of prescriber data for countless purposes other than marketing.386 The majority emphasized that the law “allowed prescriber data to be studied and used by all but a narrow class of disfavored speakers.”

Some have suggested that Sorrell casts doubt on the constitutionality of data protection laws given its recognition that “a strong argument exists that prescriber-identifying information is speech for First Amendment purposes.”387 But the majority went out of its way to say that its finding did not spell the end for all privacy law. Instead, Justice Kennedy, in dictum, seemingly affirmed the constitutionality of sectoral privacy laws like the federal health privacy law. He explained that if Vermont had “advanced its asserted privacy interest by allowing information’s sale or disclosure in only a few narrow and well-justified circumstances,” as in HIPAA, the law would have been constitutional.388

Neil Richards contends that the Sorrell holding is quite narrow. In his telling, the Court struck down the law not because it regulated data flows amounting to protected speech but because it lacked a “more coherent policy” and imposed impermissible viewpoint restrictions.389 Richards has the better reading here. The majority explained that it had “no need to determine whether all speech hampered by [the law] is commercial” or pure speech.390 Instead, it focused on the viewpoint discrimination (that the law sought to “suppress a disfavored message”) and on the state’s failure to show that the law directly advanced a substantial government interest and that the measure was drawn to achieve that interest.391 Crucially, as Richards explains, the Court made clear that the “law would have been less problematic if it had imposed greater duties of confidentiality” (as well as requirements of explicit consent and use restrictions) on the data.392


384 Id. at 2669.

385 Id.

386 Id.

387 Jane Bambauer argues that if data is speech, then privacy regulations always burden the production of knowledge. Bambauer, supra note, at 63.

388 Sorrell, 131 S. Ct. at 2668.

389 RICHARDS, INTELLECTUAL PRIVACY, supra note, at.

390 Sorrell, 131 S. Ct. at 2668.

391 Id.

392 Richards, supra note, at 1523.