DHS plans to collect biometric data from migrant children “down to the infant”

The US Department of Homeland Security (DHS) plans to collect and analyze photos of the faces of migrant children at the border in a bid to improve facial recognition technology, MIT Technology Review can reveal. This includes children “down to the infant,” according to John Boyd, assistant director of the department’s Office of Biometric Identity Management (OBIM), where a key part of his role is to research and develop future biometric identity services for the government. 

As Boyd explained at a conference in June, the key question for OBIM is, “If we pick up someone from Panama at the southern border at age four, say, and then pick them up at age six, are we going to recognize them?”

Facial recognition technology (FRT) has traditionally not been applied to children, largely because training data sets of real children’s faces are few and far between, and consist of either low-quality images drawn from the internet or small sample sizes with little diversity. Such limitations reflect the significant sensitivities regarding privacy and consent when it comes to minors. 

In practice, the new DHS plan could effectively solve that problem. According to Syracuse University’s Transactional Records Access Clearinghouse (TRAC), 339,234 children arrived at the US-Mexico border in 2022, the last year for which numbers are currently available. Of those children, 150,000 were unaccompanied—the highest annual number on record. If the face prints of even 1% of those children had been enrolled in OBIM’s craniofacial structural progression program, the resulting data set would dwarf nearly all existing data sets of real children’s faces used for aging research.
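For a rough sense of that scale, here is a back-of-the-envelope calculation using only figures cited in this story; the 1% enrollment rate is hypothetical, not a DHS projection.

```python
# Back-of-the-envelope comparison using figures cited in this article.
children_at_border_2022 = 339_234        # TRAC count of children arriving in 2022
enrolled_fraction = 0.01                 # hypothetical 1% enrollment rate
hypothetical_enrollment = int(children_at_border_2022 * enrolled_fraction)

celebrity_aging_data_set = 305           # celebrity-face aging data set cited below
citer_cohort = 231                       # CITeR longitudinal study cohort cited below

print(hypothetical_enrollment)                               # -> 3392 children
print(hypothetical_enrollment / celebrity_aging_data_set)    # roughly 11x larger
print(hypothetical_enrollment / citer_cohort)                # roughly 15x larger
```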

It’s unclear to what extent the plan has already been implemented; Boyd tells MIT Technology Review that to the best of his knowledge, the agency has not yet started collecting data under the program, but he adds that as “the senior executive,” he would “have to get with [his] staff to see.” He could only confirm that his office is “funding” it. Despite repeated requests, Boyd did not provide any additional information. 

Boyd says OBIM’s plan to collect facial images from children under 14 is possible due to recent “rulemaking” at “some DHS components,” or sub-offices, that have removed age restrictions on the collection of biometric data. US Customs and Border Protection (CBP), the US Transportation Security Administration, and US Immigration and Customs Enforcement declined to comment before publication. US Citizenship and Immigration Services (USCIS) did not respond to multiple requests for comment. OBIM referred MIT Technology Review back to DHS’s main press office. 

DHS did not comment on the program prior to publication, but sent an emailed statement afterward: “The Department of Homeland Security uses various forms of technology to execute its mission, including some biometric capabilities. DHS ensures all technologies, regardless of type, are operated under the established authorities and within the scope of the law. We are committed to protecting the privacy, civil rights, and civil liberties of all individuals who may be subject to the technology we use to keep the nation safe and secure.”

Boyd spoke publicly about the plan in June at the Federal Identity Forum and Exposition, an annual identity management conference for federal employees and contractors. But close observers of DHS that we spoke with—including a former official, representatives of two influential lawmakers who have spoken out about the federal government’s use of surveillance technologies, and immigrants’ rights organizations that closely track policies affecting migrants—were unaware of any new policies allowing biometric data collection of children under 14. 

That is not to say that all of them are surprised. “That tracks,” says one former CBP official who has visited several migrant processing centers on the US-Mexico border and requested anonymity to speak freely. He says “every center” he visited “had biometric identity collection, and everybody was going through it,” though he was unaware of a specific policy mandating the practice. “I don’t recall them separating out children,” he adds.

“The reports of CBP, as well as DHS more broadly, expanding the use of facial recognition technology to track migrant children is another stride toward a surveillance state and should be a concern to everyone who values privacy,” Justin Krakoff, deputy communications director for Senator Jeff Merkley of Oregon, said in a statement to MIT Technology Review. Merkley has been an outspoken critic of both DHS’s immigration policies and the federal government’s use of facial recognition technologies.

Beyond concerns about privacy, transparency, and accountability, some experts also worry about testing and developing new technologies using data from a population that has little recourse to provide—or withhold—consent. 

Could consent “actually take into account the vast power differentials that are inherent in the way that this is tested out on people?” asks Petra Molnar, author of The Walls Have Eyes: Surviving Migration in the Age of AI. “And if you arrive at a border … and you are faced with the impossible choice of either: get into a country if you give us your biometrics, or you don’t.”

“That completely vitiates informed consent,” she adds.

This question becomes even more challenging when it comes to children, says Ashley Gorski, a senior staff attorney with the American Civil Liberties Union. DHS “should have to meet an extremely high bar to show that these kids and their legal guardians have meaningfully consented to serve as test subjects,” she says. “There’s a significant intimidation factor, and children aren’t as equipped to consider long-term risks.”

Murky new rules

The Office of Biometric Identity Management, previously known as the US Visitor and Immigrant Status Indicator Technology Program (US-VISIT), was created after 9/11 with the specific mandate of collecting biometric data—initially only fingerprints and photographs—from all non-US citizens who sought to enter the country. 

Since then, DHS has begun collecting face prints, iris and retina scans, and even DNA, among other modalities. It is also testing new ways of gathering this data—including through contactless fingerprint collection, which is currently deployed at five sites on the border, as Boyd shared in his conference presentation. 

Since 2023, CBP has been using a mobile app, CBP One, for asylum seekers to submit biometric data even before they enter the United States; users are required to take selfies periodically to verify their identity. The app has been riddled with problems, including technical glitches and facial recognition algorithms that are unable to recognize darker-skinned people. This is compounded by the fact that not every asylum seeker has a smartphone. 

Then, just after crossing into the United States, migrants must submit to collection of biometric data, including DNA. For a sense of scale, a recent report from Georgetown Law School’s Center on Privacy and Technology found that CBP has added 1.5 million DNA profiles, primarily from migrants crossing the border, to law enforcement databases since it began collecting DNA “from any person in CBP custody subject to fingerprinting” in January 2020. The researchers noted that an overrepresentation of immigrants—the majority of whom are people of color—in a DNA database used by law enforcement could subject them to over-policing and lead to other forms of bias. 

Generally, these programs only require information from individuals aged 14 to 79. DHS attempted to change this back in 2020, with proposed rules for USCIS and CBP that would have expanded biometric data collection dramatically, including by age. (USCIS’s proposed rule would have doubled the number of people from whom biometric data would be required, including any US citizen who sponsors an immigrant.) But the USCIS rule was withdrawn in the wake of the Biden administration’s new “priorities to reduce barriers and undue burdens in the immigration system.” Meanwhile, for reasons that remain unclear, the proposed CBP rule was never enacted. 

This would make it appear “contradictory” if DHS were now collecting the biometric data of children under 14, says Dinesh McCoy, a staff attorney with Just Futures Law, an immigrant rights group that tracks surveillance technologies. 

Neither Boyd nor DHS’s media office would confirm which specific policy changes he was referring to in his presentation, though MIT Technology Review has identified a 2017 memo, issued by then-Secretary of Homeland Security John F. Kelly, that encouraged DHS components to remove “age as a basis for determining when to collect biometrics.” 

DHS’s Office of Inspector General (OIG) referred to this memo as the “overarching policy for biometrics at DHS” in a September 2023 report, though none of the press offices MIT Technology Review contacted—including the main DHS press office, OIG, and OBIM, among others—would confirm whether this was still the relevant policy; we have not been able to confirm any related policy changes since then. 

The OIG audit also found a number of fundamental issues related to DHS’s oversight of biometric data collection and use—including that its 10-year strategic framework for biometrics, covering 2015 to 2025, “did not accurately reflect the current state of biometrics across the Department, such as the use of facial recognition verification and identification.” Nor did it provide clear guidance for the consistent collection and use of biometrics across DHS, including age requirements. 

But there is also another potential explanation for the new OBIM program: Boyd says it is being conducted under the auspices of DHS’s undersecretary of science and technology, whose office leads much of the agency’s research efforts. Because it is for research, rather than to be used “in DHS operations to inform processes or decision making,” many of the standard restrictions on DHS’s use of face recognition and face capture technologies do not apply, according to a DHS directive.

Do you have any additional information on DHS’s craniofacial structural progression initiative? Please reach out with a non-work email to tips@technologyreview.com or securely on Signal at 626.765.5489. 

Some lawyers argue that changing the age limit for data collection through department policy, rather than through a federal rule, which would require a public comment period, is problematic. McCoy, for instance, says any lack of transparency here amplifies the already “extremely challenging” task of “finding [out] in a systematic way how these technologies are deployed”—even though that is key for accountability.

Advancing the field

At the identity forum and in a subsequent conversation, Boyd explained that this data collection is meant to advance the development of effective FRT algorithms. Boyd leads OBIM’s Future Identity team, whose mission is to “research, review, assess, and develop technology, policy, and human factors that enable rapid, accurate, and secure identity services” and to make OBIM “the preferred provider for identity services within DHS.” 

Driven by high-profile cases of missing children, there has long been interest in understanding how children’s faces age. At the same time, there have been technical challenges to doing so, both before FRT existed and since. 

At its core, facial recognition identifies individuals by comparing the geometry of various facial features in an original face print with subsequent images. Based on this comparison, a facial recognition algorithm assigns a percentage likelihood that there is a match. 

But as children grow and develop, their bone structure changes significantly, making it difficult for facial recognition algorithms to identify them over time. (These changes tend to be even more pronounced in children under 14. In contrast, as adults age, the changes tend to be in the skin and muscle, and have less variation overall.) More data would help solve this problem, but there is a dearth of high-quality data sets of children’s faces with verifiable ages. 
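For illustration, here is a minimal sketch of the kind of comparison described above, assuming a system that reduces each face image to a fixed-length embedding (a “face print”) and scores candidate matches with cosine similarity. The 128-dimension embeddings, function name, and threshold logic are hypothetical, not OBIM’s or any vendor’s actual algorithm.

```python
# Illustrative only: a generic embedding-comparison match score, not any
# agency's or vendor's actual facial recognition pipeline.
import numpy as np

def match_score(enrolled_print: np.ndarray, new_print: np.ndarray) -> float:
    """Cosine similarity between two face embeddings, rescaled to a 0-100 score."""
    cos = np.dot(enrolled_print, new_print) / (
        np.linalg.norm(enrolled_print) * np.linalg.norm(new_print)
    )
    return float((cos + 1) / 2 * 100)

rng = np.random.default_rng(0)
enrolled = rng.normal(size=128)                      # face print taken at enrollment
later = enrolled + rng.normal(scale=0.8, size=128)   # same face years later, with drift

print(f"match likelihood: {match_score(enrolled, later):.1f}%")
# An operator-chosen threshold decides whether this counts as a match; the larger
# the drift (e.g., a child's changing bone structure), the lower the score.
```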

“What we’re trying to do is to get large data sets of known individuals,” Boyd tells MIT Technology Review. That means taking high-quality face prints “under controlled conditions where we know we’ve got the person with the right name [and] the correct birth date”—or, in other words, where they can be certain about the “provenance of the data.” 

For example, one data set used for aging research consists of 305 celebrities’ faces as they aged from five to 32. But these photos, scraped from the internet, contain too many other variables—such as differing image qualities, lighting conditions, and distances at which they were taken—to be truly useful. Plus, speaking to the provenance issue that Boyd highlights, their actual ages in each photo can only be estimated. 

Another tactic is to use data sets of adult faces that have been synthetically de-aged. Synthetic data is considered more privacy-preserving, but it too has limitations, says Stephanie Schuckers, director of the Center for Identification Technology Research (CITeR). “You can test things with only the generated data,” Schuckers explains, but the question remains: “Would you get similar results to the real data?”

(Hosted at Clarkson University in New York, CITeR brings together a network of academic and government affiliates working on identity technologies. OBIM is a member of the research consortium.) 

Schuckers’s team at CITeR has taken another approach: an ongoing longitudinal study of a cohort of 231 elementary and middle school students from the area around Clarkson University. Since 2016, the team has captured biometric data every six months (save for two years of the covid-19 pandemic), including facial images. They have found that the open-source face recognition models they tested can in fact successfully recognize children three to four years after they were initially enrolled. 

But the conditions of this study aren’t easily replicable at scale. The study images are taken in a controlled environment, all the participants are volunteers, the researchers sought consent from parents and the subjects themselves, and the research was approved by the university’s Institutional Review Board. Schuckers’s research also promises to protect privacy by requiring other researchers to request access, and by providing facial data sets separately from other data that have been collected. 

What’s more, this research still has technical limitations, including that the sample is small and overwhelmingly Caucasian, meaning models tested on it might be less accurate when applied to other races. 

Schuckers says she was unaware of DHS’s craniofacial structural progression initiative. 

Far-reaching implications

Boyd says OBIM takes privacy considerations seriously, and that “we don’t share … data with commercial industries.” Still, OBIM has 144 government partners with which it does share information, and it has been criticized by the Government Accountability Office for poorly documenting whom it shares information with and what privacy-protecting measures are in place. 

Even if the data does stay within the federal government, OBIM’s findings regarding the accuracy of FRT for children over time could nevertheless influence how—and when—the rest of the government collects biometric data, as well as whether the broader facial recognition industry may also market its services for children. (Indeed, Boyd says sharing “results,” or the findings of how accurate FRT algorithms are, is different than sharing the data itself.) 

That this technology is being tested on people who are offered fewer privacy protections than would be afforded to US citizens is just part of the wider trend of using people from the developing world, whether they are migrants coming to the border or civilians in war zones, to help improve new technologies. 

In fact, Boyd previously helped advance the Department of Defense’s biometric systems in Iraq and Afghanistan, where he acknowledged that individuals lacked the privacy protections that would have been granted in many other contexts, despite the incredibly high stakes. Biometric data collected in those war zones—in some areas, from every fighting-age male—was used to identify and target insurgents, and being misidentified could mean death. 

These projects subsequently played a substantial role in influencing the expansion of biometric data collection by the Department of Defense, which now happens globally. And architects of the program, like Boyd, have taken important roles in expanding the use of biometrics at other agencies. 

“It’s not an accident” that this testing happens in the context of border zones, says Molnar. Borders are “the perfect laboratory for tech experimentation, because oversight is weak, discretion is baked into the decisions that get made … it allows the state to experiment in ways that it wouldn’t be allowed to in other spaces.” 

But, she notes, “just because it happens at the border doesn’t mean that that’s where it’s going to stay.”

Update: This story was updated to include comment from DHS.
