ICE's Mobile Fortify Facial Recognition App Sparks Backlash and Legal Challenges

Immigration and Customs Enforcement agents across the United States are increasingly deploying a controversial smartphone application equipped with facial recognition technology, known as Mobile Fortify. This tool allows officers to simply point a phone camera at individuals and scan their faces, accessing data from multiple federal and state databases. However, this practice has ignited significant backlash, including protests in cities like Minneapolis and legal actions challenging its use.

How Mobile Fortify Operates and Its Widespread Use

The Mobile Fortify app enables agents to pull detailed information on targets by scanning their faces, drawing from databases that some federal courts have previously deemed too unreliable for arrest warrants. According to a lawsuit filed by Illinois and Chicago against the Department of Homeland Security earlier this month, DHS has utilised Mobile Fortify for over 100,000 face and fingerprint scans in the field. This represents a dramatic shift from earlier uses of facial recognition, which were largely confined to investigations and ports of entry.

Nathan Freed Wessler, deputy director of the ACLU's Speech, Privacy, and Technology Project, expressed grave concerns, stating, "Here we have ICE using this technology in exactly the confluence of conditions that lead to the highest false match rates." He warned that a false result could devastate someone's life and noted the broader democratic implications, adding, "ICE is effectively trying to create a biometric checkpoint society."

Accuracy Issues and Legal Challenges

Underpinning the resistance to ICE's use of facial recognition are serious doubts about the technology's efficacy. Research indicates higher error rates in identifying women and people of colour compared to white faces. The app's use often occurs in fast-moving, high-pressure situations where factors like poor lighting or individuals turning away from officers can increase the likelihood of misidentification.
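The scale of deployment is what turns a small error rate into a large problem. A minimal back-of-envelope sketch illustrates this: the 100,000-scan figure comes from the Illinois lawsuit cited above, but the error rates below are hypothetical, chosen only to show how the arithmetic plays out.

```python
# Back-of-envelope illustration: expected misidentifications at scale.
# The 100,000-scan figure is from the Illinois lawsuit; the false-match
# rates here are hypothetical, not measurements of Mobile Fortify.

def expected_false_matches(scans: int, false_match_rate: float) -> float:
    """Expected number of incorrect identifications across all scans."""
    return scans * false_match_rate

scans = 100_000
for rate in (0.001, 0.01, 0.05):  # 0.1%, 1%, 5% hypothetical rates
    n = expected_false_matches(scans, rate)
    print(f"{rate:.1%} false-match rate -> ~{n:,.0f} misidentifications")
```

Even at a 0.1% error rate, a sketch like this implies on the order of a hundred misidentified people, and field conditions such as poor lighting tend to push real-world rates higher, not lower.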

The Illinois lawsuit specifically targets DHS's use of Mobile Fortify, arguing it exceeds congressional authorisation for biometric data collection. The complaint cites instances where federal agents allegedly scanned US citizens in Illinois without consent. In response, Democratic lawmakers introduced a bill on 15 January to ban DHS from using such apps except at entry points, following a September letter from senators demanding more information and warning of threats to privacy and free speech.

Protests and Public Backlash

Use of the app has inspired backlash on multiple fronts, including street protests where demonstrators employ tactics like recording masked agents and using burner phones. The existence of Mobile Fortify was first uncovered by 404 Media through leaked emails last summer, with subsequent reports revealing internal DHS documents stating people cannot refuse scans. According to 404 Media, the app's database contains approximately 200 million images.

Jake Laperruque, deputy director of the security and surveillance project at the Center for Democracy & Technology, emphasised that facial recognition should be a starting point, not a definitive ID. "If you treat this as an endpoint – as a definitive ID – you're going to have errors, and you're going to end up arresting and jailing people that are not actually who the machine says it is," he said.

Regulatory and Policy Responses

Despite DHS's assertion that Mobile Fortify operates with a high matching threshold and does not violate constitutional rights, observers note that agents frequently do not seek consent for scans and may disregard contradictory documentation. ICE has also been documented treating a biometric match as a definitive determination of citizenship when a person lacks ID, with no additional vetting required.

In a notable case reported by 404 Media this month, Mobile Fortify misidentified a detained woman during an immigration raid, providing two incorrect names. This highlights broader concerns, as even police departments have pushed back against over-reliance on the technology, with at least 15 states enacting laws to limit its use. San Francisco set a precedent in 2019 by banning facial recognition for police and local agencies.

DHS issued a directive in September 2023 requiring bias testing and opt-out options for citizens, but it appears to have been rescinded in February of last year. Meanwhile, ICE's stops, often referred to as "Kavanaugh stops" after a concurring opinion by Justice Brett Kavanaugh, face ongoing litigation, including a recent ACLU lawsuit accusing federal authorities of racial profiling and unlawful arrests.