
New Orleans debates real-time facial recognition legislation

Technology
  • New Orleans has emerged as a flashpoint in debates over real-time facial recognition technology. The city’s leaders are weighing a landmark ordinance that, if passed, would make New Orleans the first U.S. city to formally legalize continuous facial surveillance by police officers.

    The move follows revelations that, for two years, the New Orleans Police Department (NOPD) quietly used automated alerts from Project NOLA, a privately operated camera network, in a way that bypassed the city’s 2022 ordinance, which explicitly banned such practices. Project NOLA is a non-profit surveillance network founded by former police detective Bryan Lagarde.

    Despite the ban, Project NOLA’s network was configured to scan public spaces continuously and automatically. Every face that passed within view was compared in real time, and officers were pinged via an app whenever a watchlist match occurred, with no requirement for supervisory oversight, independent verification, or adherence to reporting standards.

    Opponents argue that automated surveillance everywhere in public spaces raises profound threats to privacy, civil rights, and due process. The American Civil Liberties Union (ACLU) of Louisiana described the system as a “facial recognition technology nightmare” that enables the government to “track us as we go about our daily lives.”

    The wrongful arrest of Randal Reid, based on a misidentification from still-image facial recognition, is frequently cited as an example of the technology’s real-world dangers. Reid is a 29‑year‑old Black logistics analyst from Georgia who was wrongfully arrested in late 2022 and held for six days due to a false facial recognition match.

    The ACLU has urged the City Council to reimpose a moratorium and demand an independent audit covering privacy compliance, algorithmic bias, evidence admissibility, record retention, and public awareness. The organization said that NOPD currently lacks any system for logging or disclosing facial-recognition-derived evidence, and Project NOLA operates outside official oversight entirely.

    A vote by the City Council is expected later this month. If the ordinance passes, NOPD and any authorized third party would be legally empowered to scan live public feeds using facial recognition, provided reports are submitted according to the new law.

    NOPD is awaiting the outcome of its internal audit, and Superintendent Anne Kirkpatrick has stated that policy revisions will be guided by the council’s decisions. Meanwhile, the ACLU and its partners are preparing to escalate their opposition, pushing for either an outright prohibition or deeply strengthened accountability measures.

    The decision facing New Orleans encapsulates the broader tension between embracing AI-based public safety tools and protecting civil liberties. Proponents emphasize the edge that real-time intelligence can provide in stopping violent crime and responding to emergencies, while critics warn that indiscriminate surveillance erodes privacy, civil rights, and due-process safeguards.

    A few things I feel are very important that none of the recent June articles about this mention:

    1. The city has managed to keep this all relatively under wraps. Few people are even aware of this, and even those who are don’t realize the extent of the surveillance.

    2. This seems to be being kept in the dark even by people we should be able to trust. I only found out about the City Council vote this month because I make a habit of searching for updates every so often. I can’t find any information about when the vote is actually scheduled, just “sometime at the end of June.” This is the last week of June, so presumably this week?

    3. State Police and ICE can’t be regulated by city government. There is a permanent state police force in New Orleans (Troop NOLA) that was established last year by Governor Landry.

    I believe they have continued using this technology however they please, and there is no real way for the city to regulate how they use it or who they share it with.

    EDIT:
    The city council meeting is this coming Thursday

    Thursday, June 26

    10:00a City Council Facial Recognition Meeting – City Council Chamber, 1300 Perdido St., Second Floor West

    Livestream link

  • Just a reminder that it's illegal to wear masks in New Orleans unless it's Carnivale.

  • Thanks to the KKK, I assume.

  • Wow, I didn't even know that. Louisiana state law. That's some dumb fucking bullshit.

    They make an exception for medical masks, but I also saw that video of a protestor getting hassled and arrested for a medical mask recently.

    I guess I'm just going to have to start using face paint to trick their cams

  • TLDR: New Orleans is poised to become the first U.S. city to legalize real-time police facial recognition surveillance, despite a 2022 ban. The push follows revelations that NOPD secretly used Project NOLA’s 200+ AI cameras for two years, making 34+ arrests without oversight. Proponents argue it’s vital for crime-fighting, citing Bourbon Street shootings and jailbreaks, while critics warn of dystopian privacy erosion and racial bias, referencing wrongful arrests like Randal Reid’s. With 70% public approval but fierce ACLU opposition, the vote could set a dangerous precedent: privatized mass surveillance with zero accountability.

  • Also unless you claim to be a member of ICE, I assume

  • ~2012ish: Palantir receives contract with city of New Orleans

    2015: Privately owned Project NOLA surveillance cam program created

    2018: City cancels very shady contract with Palantir that helped them create and test their predictive policing tech

    2020: Peter Thiel becomes major investor in Clearview AI facial recognition technology. Free trials are given to ICE and multiple local law enforcement agencies across the U.S.

    Late 2020: Ban on facial recognition tech and predictive policing in New Orleans

    2022: ~18 months later, Cantrell requests City Council lift the ban, and it is replaced with a shady surveillance ordinance giving the city some very concerning privileges in certain circumstances

    2024: Cantrell says she won't fight Landry establishing Troop Nola as a permanent police presence in the city, despite concerns from civil rights advocacy groups

    Feb 2025: Forbes reports that Clearview AI remains unprofitable due to multiple ongoing lawsuits and previous inability to secure federal contracts. The company says future focus will be large federal contracts.

    May 2025: Washington Post reveals NOPD has been ignoring the fairly lax rules regarding facial recognition tech in the 2022 surveillance ordinance while working with Project NOLA. NOPD pauses use of the tech, but Troop NOLA and federal agencies continue use because they're not under city jurisdiction

  • oh I know they're not the first ones to get it, but I do believe they're the first to set the legal precedent, or at least that's what the article says.

  • I wasn't arguing with you, everything you said is correct.

    Just adding more details and the timeline of events that makes this all even more "what the actual fuck is happening?"

  • ahh very good, sorry for the misunderstanding.

  • I mean, Louisiana's best governor was the Kingfish, in my opinion, and that guy was corrupt as fuck.

    If there's a state that should really Thank God for Mississippi, it's Louisiana.

  • Palantir had a contract with New Orleans starting around ~2012 to create their predictive policing tech, which scans surveillance cameras for very vague details and still misidentifies people.

    It's very similar to Lavender, the tech the IDF uses to identify members of Hamas and attack them with drones. This misidentifies targets ~10% of the time, according to the IDF (the real misidentification rate is likely much higher).

    Palantir picked Louisiana over somewhere like San Francisco because they knew it would be a lot easier to violate rights and privacy here and get away with it.

    Whatever they decide in New Orleans on Thursday, during this Council meeting that nobody cares about, will likely be the first on-the-books legal basis of its kind for tracking civilians in the U.S., and would allow the federal government to take control of that capability whenever it wants. This could also set a precedent for use in other states.

    Guess who's running the entire country right now, and just gave high-ranking Army contracts to Palantir employees for "no reason," while Palantir is also receiving a multimillion-dollar federal contract to create an insane database on every American and giant data centers are being built all across the country.
