Professor Citron Interviews Digital Privacy Scholar


Andrew Allard '25
Staff Editor


This past Thursday, February 9, the Law School’s LawTech Center and Law, Innovation, Security & Technology (LIST) hosted an interview with Chris Gilliard, a writer, speaker, and inaugural member of the Just Tech Fellows at the Social Science Research Council. Gilliard’s scholarship focuses on digital privacy and the intersections of race, class, and technology. The interview was led by the Law School’s own Professor Danielle Citron, whose scholarship also centers on privacy and civil rights. The two discussed the proliferation of products that monitor us and our activity, such as smart home and fitness tracking devices, and their implications for privacy.

To give you a sense of Gilliard’s views on these devices—which he pointedly terms “luxury surveillance”—he has compared Apple Watches and Fitbits to ankle monitors. “What is the difference between an ankle monitor and a Fitbit?” asked Gilliard, facetiously. “One of them collects a lot more data.” Spoiler alert: It’s not the ankle monitor.

I found myself surprised at my own skepticism while listening to Gilliard and Professor Citron’s conversation. For my own part, I suppose I’m somewhere in the middle of the Luddite-tech bro spectrum. I own a Fitbit, which I wear daily. I don’t understand the point of having an Echo. But still, I found it difficult to accept what Gilliard was saying. Could my beloved Fitbit really be that harmful?

This, I suppose, is what worries Gilliard so much about these technologies: They’re insidious. It is difficult to convince those who are already invested in these technologies, particularly when they think they have nothing to hide. “There’s a segment of people who think they’re always going to be on the right end of the camera,” explained Gilliard.

This acceptance is facilitated in part by something called the “Borg Complex,” Gilliard explained. The term was coined by L.M. Sacasas, another tech writer. Star Trek fans will quickly understand, but for the Star Trek-uninitiated, think of it as a kind of tech fatalism. The Borg Complex is a criticism of the modern tendency to assume that resistance to new technologies is futile because they will be inevitably incorporated into our lives. But is this necessarily true? “We don’t walk around with plutonium!” Professor Citron quipped. So why do we so easily accept other (potentially) harmful technologies?

Maybe it was just the Star Trek reference that won me over, but the Borg Complex seemed to me a well-placed criticism. Gilliard cited the recent fervor over ChatGPT as an illustrative example. In a recent article in Slate, he chided the slew of articles declaring ChatGPT’s inevitable destruction of our education system: “The End of High-School English,” “The College Essay Is Dead,” “AI will almost certainly help kill the college essay,” and so forth.[1] An exasperated Gilliard asks, “Why do we keep doing this?”

On its face, what Gilliard argues for is eminently reasonable—that we should actually consider whether we want to accept new technologies into our homes and our daily lives. It is at least plausible that we can refuse these intrusive new gadgets. We ban things all the time—or at least attempt to. But while Gilliard’s warning against blind acceptance of the new is easy to accept, his cost-benefit analysis is probably less palatable to the general public. Asked whether there are ways in which surveillance could be beneficial to society, Gilliard was quick to say no. “The idea that we’re going to somehow leverage these systems that are in the hands of very powerful institutions with a seemingly endless supply of money is pure fantasy.”

This seems like an awfully lofty thing to say about a watch that tells me how many steps I’ve walked. To be sure, there are some serious legal consequences to sharing your personal data with tech companies. Your smart watch data can be used to determine your health conditions. Were it not for the Affordable Care Act’s protections for those with pre-existing conditions, that data could be sold to health insurance companies and be used to deny you coverage.[2] And under the third-party disclosure rule, established by Smith v. Maryland and United States v. Miller, cops may be able to access the data you’ve shared with your fitness app.[3] Professor Citron also mentioned concerns about law enforcement accessing health data from apps that track menstrual cycles—concerns that have proliferated in the wake of Dobbs.[4]

Fortunately, we do have the Affordable Care Act. The third-party disclosure rule has been narrowed in recent years, with Justice Gorsuch even suggesting it should be overturned.[5] And Congress may well extend HIPAA to apply to health and fitness apps.[6] But Gilliard argues that these problems are beyond regulation. “Often, when we’re talking about policy, there’s a discussion about how to ameliorate something. There are things I don’t think are best made less harmful. I think they’re best smashed into bits.”

With such broad adoption of these technologies, it’s hard to imagine the complete rejection that Gilliard describes. Indeed, he made notably little mention of the data collected by our smartphones, perhaps because he knows he would need a crowbar to pry them away from most people. Ultimately, although it is descriptively useful, the Borg Complex is deceptively simple. It’s not as though people accept these technologies without agency and without weighing their costs and benefits. We do that every time we choose to buy—or not buy—the latest gizmo. Certainly, it wouldn’t hurt to think more carefully about which technologies we do and don’t want to adopt. And admittedly, there is an illusion of choice when it comes to those technologies that everyone is expected to use. But I remain skeptical that smashing these technologies to bits is the most plausible or even the most effective solution to our contemporary privacy woes.

Personally, I won’t be taking a hammer to my Fitbit. But perhaps when its battery finally gives out, I’ll consider a conventional watch.


---
tya2us@virginia.edu


[1] https://slate.com/technology/2023/02/chat-gpt-cheating-college-ai-detection.html

[2] https://blog.avast.com/what-fitbit-knows-about-you-avast

[3] Thank you, Professor Armacost. Unless I’m wrong, in which case, sorry.

[4] https://www.propublica.org/article/period-app-privacy-hipaa

[5] See Carpenter v. United States, 138 S. Ct. 2206 (2018).

[6] https://techcrunch.com/2022/07/08/house-oversight-letter-abortion-period-apps-data-brokers/