Public Housing Contractors Are Using Federal Money To Inflict Biometric Surveillance Misery On Their Tenants

Most of us wouldn’t argue that private companies can’t run their businesses the way they prefer. The gold standard has been the right to refuse service to anyone — something that covers everything from refusing paper checks from certain customers to booting people off social media services for refusing to stop behaving like inveterate assholes.
When private companies do things, they rarely mess with constitutional protections. There are guardrails in place to prevent discrimination against minorities and other historically oppressed groups, but for the most part, companies are free (and protected by the Constitution!) to choose whom they do business with.
All well and good, but it appears more private companies are mixing their autonomy with government funding. The end result isn’t the peanut butter and chocolate that gave Reese’s more market share. It’s more a blend of swamp water and tech-enabled discrimination, as Douglas MacMillan reports for the Washington Post. (h/t Techdirt Insider Samuel Abram for pointing this out in the Techdirt Insider Chat)

In public housing facilities across America, local officials are installing a new generation of powerful and pervasive surveillance systems, imposing an outsize level of scrutiny on some of the nation’s poorest citizens. Housing agencies have been purchasing the tools — some equipped with facial recognition and other artificial intelligence capabilities — with no guidance or limits on their use, though the risks are poorly understood and little evidence exists that they make communities safer.
In rural Scott County, Va., cameras equipped with facial recognition scan everyone who walks past them, looking for people barred from public housing. In New Bedford, Mass., software is used to search hours of recordings to find any movement near the doorways of residents suspected of violating overnight guest rules. And in tiny Rolette, N.D., public housing officials have installed 107 cameras to watch up to 100 residents — a number of cameras per capita approaching that found in New York’s Rikers Island jail complex.

Public housing and prisons tend to involve a host of private government contractors. The federal government tends to turn over the day-to-day business of managing these facilities to the lowest bidder, allowing private companies to become partners in government surveillance, while still maintaining a limited amount of plausible deniability when it comes to questionable surveillance efforts.
The government can always claim its contractors are exceeding their authority. Private contractors can always claim the government wants them to do these things. At the bottom of it all are the people negatively affected by these efforts — people treated like criminals worthy of incessant surveillance just because it’s the only housing they can afford.
These are the means. And they’re mostly punitive and vindictive. The government-purchased and mandated surveillance tech puts residents of these buildings in unblinking crosshairs. The end result is the government and public housing administrators making life even more miserable for people who are unable to escape their circumstances.

The U.S. Department of Housing and Urban Development has helped facilitate the purchase of cameras through federal crime-fighting grants. Those grants are meant to keep residents safer, and housing agencies say they do. But the cameras are also being used to generate evidence to punish and evict public housing residents, sometimes for minor violations of housing rules, according to interviews with residents and legal aid attorneys, a review of court records, and interviews and correspondence with administrators at more than 60 public housing agencies that received the grants in 27 states.

At this point, most public housing is still owned by the government. But privatization is becoming the new normal as the federal government seeks to divest itself of holdings it would rather not be held responsible for, and as legislators find ways to reward donors with additional property that comes with a built-in federal excuse for refusing to do anything more than subject tenants to increased surveillance.
Federally owned property utilizing these forms of surveillance should be facing intense scrutiny from lawmakers who have objected to facial recognition tech and its well-documented tendency to inherit the biases of those creating and training this form of AI.
Instead, we’re getting a federal shrug from lawmakers, who continue to fund questionable surveillance tech while separating themselves from the least profitable parts of the federal government portfolio. The end result is a vacuum of oversight — something that encourages abuse and ensures violations will go unnoticed and unpunished.
If there’s any upside, it’s that the government is still somewhat capable of recognizing its mistakes. The downside is that it takes outside parties like the Washington Post to point out these problems to the agencies that should have been on top of this from day one.

Last month, after The Washington Post presented HUD with evidence of the growing use of sophisticated surveillance tools by local housing authorities, the agency said it would no longer permit future recipients to spend security grants on facial recognition. These tools “are not fool proof,” and their mistakes can adversely impact public housing residents, Dominique Blom, HUD general deputy assistant secretary of public and Indian housing, said in an interview.

This is good, but I’ll believe it when it’s codified. All this means at this point is that HUD will vet tech purchases for public housing a bit more closely until it believes the heat has died down. If the agency isn’t willing to advocate for a federal ban of this surveillance tech, it’s basically saying it will remain “concerned” for as long as it takes to outlast this current news cycle.
While public housing administrators may claim the cameras and biometric surveillance help prevent crime, residents aren’t actually seeing the trends the government and its contractors claim are the result of increased surveillance. Instead, they’re seeing the cameras and AI being deployed to engage in useless bullshit, like flagging an ex-spouse for spending too many nights at a residence where they weren’t on the lease, even though the tenant (the other spouse) had permitted the arrangement so their kids could get to school on mornings when she had to be at work before the kids left.
That’s how these systems are used. Lots of actual crime goes ignored because addressing it asks far too much of largely passive property holders. They’d rather go after the people who can’t fight back and don’t require the engagement of law enforcement to remove them from the property. It’s a way to maximize profits for public housing that shouldn’t be in the business of turning a profit. It allows landlords to eject tenants, end leases, and hoover up their deposits over minor violations that would have been overlooked if it weren’t for always-on surveillance and supporting tech allowing housing owners to target people with a couple of clicks of a mouse button.
Certainly the government and its landlords have an obligation to protect residents from criminal activity. But this report shows that little of this funding, or the tech purchased with it, targets actual criminals. Instead, it’s AI-enabled navel-gazing that allows landlords to hassle tenants for minor lease violations because it’s far easier than actually trying to protect them from the criminal element surrounding them.
If HUD wants to be taken seriously, it needs to get a ban on this tech codified into law. And it needs to ensure its contractors use tech responsibly to help tenants, rather than harm them.

https://www.techdirt.com/2023/05/23/public-housing-contractors-are-using-federal-money-to-inflict-biometric-surveillance-misery-on-their-tenants/
