Andrew Allard '25
Executive Editor
This past Tuesday, March 21, the American Constitution Society (ACS) at UVA hosted a conversation with Professor Elizabeth Rowe to discuss her Stanford Technology Law Review article, “Regulating Facial Recognition Technology in the Private Sector.”[1] JJ Citron ’24, Programming Co-director for ACS at UVA, moderated the conversation.
Professor Rowe explained that she wrote her article in response to the convergence of two concerns—the diversity of interests in facial recognition technology on the one hand, and the lack of federal regulation of that technology on the other. The idea sprang from her experience advising on data privacy issues in the private sector, including for “a very large amusement park.” Professor Rowe saw that complex and ever-evolving facial recognition technologies were outpacing the law’s ability to react and adapt.
In her article, Professor Rowe examines the “common interests and common areas of concern among the various stakeholders, including developers of the technologies, business users, and consumers.”[2] She suggests that consumers and developers alike have good reason to support federal regulation.
Consumer concerns are familiar, ranging from the unwitting collection of biometric data to the potential for misuse, inaccuracy, or racial bias. Developers, too, may benefit from federal regulation—and some companies, including Amazon, are even advocating for it.
Professor Rowe said that the current state-by-state approach to data privacy law amounts to a regulatory headache for businesses. “The cost of compliance for this patchwork of state [laws] is just too high. Which then leads [businesses] to say, ‘Please give us federal regulation. We’d rather have one law for the whole country.’”
But what federal regulation would look like remains an open question. The Commercial Facial Recognition Privacy Act, introduced in the Senate in 2019, has yet to make it out of committee.[3] And on the commercial side, Amazon’s policy team has drafted and lobbied for its own legislation.[4] Amazon’s efforts have been met with some skepticism in light of the company’s interest in the industry.
Ultimately, Professor Rowe recommends a differentiated regulatory framework, meaning that regulations should be tailored to each industry and use case.[5] Professor Rowe noted that a similar framework has been adopted by the European Union.
To guide regulators, Professor Rowe suggested that trade secret law could serve as a model for data privacy protections. “If we flip the hypothetical, and what we’re talking about is the equivalent of company faces, company fingerprints—that’s trade secret law . . . That is, as the courts have said, a fundamental right to commercial privacy. Nobody can snoop at it.” But because the law does not currently treat biometric data as an individual’s property, consumers can’t assert the same privacy rights that companies can.
While consumers may benefit from increased regulation, Professor Rowe recognized that getting them to agree on a path forward is no easy task. “We have a love-hate relationship with these technologies,” said Professor Rowe. “If anyone tells us: ‘Put away your phone for just one day,’ we’ll all probably start shaking and having seizures from withdrawal.” With that challenge in mind, Professor Rowe suggested that “regulation in this area may merit reconceptualizing who the ‘public’ is and what ‘they’ want.”[6]
Hearing Professor Rowe talk about her research, one gets a sense of the daunting challenges of regulating in this area—and the potentially severe consequences of getting it wrong. Businesses and government actors alike already have extensive collections of biometric data, explained Professor Rowe. “All of that is being stored somewhere. And we trust that it will be safe. It’s really not much a question of whether we’ll have these kinds of vulnerabilities, but when.”
Professor Rowe suggested that government actors should think of biometric data privacy as a national security concern. “Over the last few years, the U.S. government has elevated trade secrecy and the protection of commercial information to the level of national security . . . [The government] has spent a tremendous amount of resources, time, and regulation thinking about it from that perspective. We’re not there yet with personal data.”
After the event, I spoke with Professor Rowe about the Biden administration’s efforts to force a sale of TikTok, the social media app owned by the Chinese company ByteDance. The Biden administration has expressed concerns about “countries, including China, seeking to leverage digital technologies and Americans’ data in ways that present unacceptable national security risks.”[7]
Professor Rowe said that there are heightened concerns when Americans’ personal data is in the hands of foreign-owned companies. But she explained that transferring that data to an American company, without implementing nationwide data privacy regulations, would likely provide only a marginal benefit to consumers.
---
tya2us@virginia.edu
[1] Elizabeth A. Rowe, Regulating Facial Recognition Technology in the Private Sector, 24 Stan. Tech. L. Rev. 1 (2020), https://law.stanford.edu/publications/regulating-facial-recognition-technology-in-the-private-sector/.
[2] Id. at 1.
[3] S. 847, 116th Cong. (2019).
[4] See Rowe, supra note 1, at 37 (citing Kori Hale, Amazon Pitches Shady Facial Recognition Laws, Forbes (Oct. 1, 2019), https://perma.cc/S33R-MS4K).
[5] Id. at 48–51.
[6] Id. at 53.
[7] Press Gaggle, Olivia Dalton, Principal Deputy Press Sec’y, The White House (Feb. 28, 2023), https://www.whitehouse.gov/briefing-room/press-briefings/2023/02/28/press-gaggle-by-principal-deputy-press-secretary-olivia-dalton/.