The following article was featured in the University of North Carolina Law School’s Journal of Law & Technology on April 19, 2021, and was authored by Lynn Norton-Ramirez and her daughter, UNC law student Marissa Ramirez.
This blog addresses Massachusetts’ shift away from the “all or nothing” approach many states have taken toward law enforcement’s use of facial recognition technology. In Massachusetts, law enforcement agencies have been using official and unofficial databases to conduct facial recognition searches largely without any legal controls or regulation. For example, police have been using the state’s database of driver’s license photographs to identify unknown facial images, as well as accessing private facial recognition databases such as Clearview AI. To curb this unfettered access to facial recognition databases while still acknowledging the legitimate value of this “powerful investigative tool,” legislators in Massachusetts have negotiated a middle ground between the interests of law enforcement and privacy concerns.
The proposal comes after the Massachusetts ACLU filed hundreds of public records requests, gathering information about facial recognition technology used by state agencies, schools, private companies, and others. The ACLU’s findings were concerning, leading it to launch a campaign against the use of this technology. One noteworthy document was an advertisement from Clearview AI offering a local police department season’s greetings and a free trial of technology the ad claimed was like the “Google Search for faces.” The tool takes only seconds to search a suspect’s image against databases composed of “mug shots, social media, news articles, and other publicly available sources.”
The ACLU’s campaign in Massachusetts garnered significant support, capturing the attention of the Boston Celtics. In an op-ed in the Boston Globe, Celtics players voiced their approval of the regulations and encouraged the governor to sign the bill, emphasizing the need for balanced regulation, primarily because this technology is known to be faulty and biased.
The relevant section of the bill precludes “public agenc[ies],” including law enforcement, from utilizing or accessing any “biometric surveillance systems” or obtaining this information from any third parties, thereby ending any further affiliation with companies such as Clearview AI. There are, however, narrow exceptions that will allow law enforcement, with the permission of a judge, to run facial recognition software. Even when an exception applies, only the state police or the FBI can run the search, and only using the technology acquired by the state’s department of motor vehicles. This effectively takes the technology out of the hands of local law enforcement and away from publicly available databases.
Some of the circumstances in which a facial recognition search can be run include: (1) to execute a warrant based on “probable cause that such a search will lead to evidence of the commission of a violent felony offense under” Massachusetts law; and (2) “upon reasonable belief that an emergency involving immediate danger of death or serious physical injury to any individual or group of people requires the performance of a facial recognition search without delay.” All emergency searches must be “narrowly tailored to address the emergency.” Furthermore, a reporting requirement mandates that law enforcement agencies document each facial recognition search conducted and provide the documentation quarterly to the executive office of public safety and security.

Notwithstanding the statute, a law enforcement agency may still: (1) “acquire and possess personal electronic devices . . . that utilizes facial recognition technology for the sole purpose of user authentication;” (2) “acquire, possess and use automated video or image redaction software; provided, that such software does not have the capability of performing facial recognition or other remote biometric recognition;” and (3) “receive evidence related to the investigation of a crime derived from a biometric surveillance system; provided, that the use of a biometric surveillance system was not knowingly solicited by or obtained with the assistance of a public agency or any public official in violation of that statute.”
The legislation still leaves many questions: Will leaving the door open for law enforcement to use evidence derived from biometric surveillance systems provide an opportunity to work around these new regulations? What types of situations did the legislature intend to constitute a reasonable belief that an emergency exists?
Only time will tell how effective this legislation will be, whether these limitations are practical, and whether additional safeguards may be needed. Facial recognition databases are valuable investigative tools for law enforcement, but regulation is critical to ensure that data is utilized appropriately and that privacy rights are protected. Companies like Clearview AI should also take note of this shift: if other jurisdictions begin to enact similar legislation, such companies may quickly become obsolete.
For more information on facial recognition software, some of the fundamental flaws associated with the technology, and the federal and state actions taken to address these issues, see my previous blog post here.
Marissa Flack & Lynn Norton-Ramirez