New Orleans debates real-time facial recognition legislation

Technology
  • New Orleans has emerged as a flashpoint in debates over real-time facial recognition technology. The city’s leaders are weighing a landmark ordinance that, if passed, would make New Orleans the first U.S. city to formally legalize continuous facial surveillance by police officers.

    The move follows revelations that, for two years, the New Orleans Police Department (NOPD) quietly relied on automated alerts from a privately operated camera network known as Project NOLA, bypassing the restrictions of the city’s 2022 surveillance ordinance, which explicitly banned such practices. Project NOLA is a non-profit surveillance network founded by former police detective Bryan Lagarde.

    Despite the ban, Project NOLA’s network was configured to scan public spaces continuously and automatically. Every face that passed within view was compared against a watchlist in real time, and officers were pinged via an app whenever a match occurred, with no requirement for supervisory oversight, independent verification, or adherence to reporting standards.
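    For readers curious what “automated alerts” look like in practice, below is a minimal, purely illustrative sketch of a real-time watchlist pipeline. Every name in it (WatchlistEntry, match_threshold, send_officer_alert) is hypothetical and not drawn from Project NOLA’s actual system; the point is only that an alert fires automatically once a similarity score crosses a threshold, with no human verification step in the loop.

    ```python
    # Purely illustrative: a hypothetical real-time watchlist alert loop.
    # None of these names come from Project NOLA or any vendor's real code.
    from dataclasses import dataclass
    from typing import Iterable, List
    import math


    @dataclass
    class WatchlistEntry:
        person_id: str
        embedding: List[float]  # face embedding enrolled on the watchlist


    def cosine_similarity(a: List[float], b: List[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0


    def send_officer_alert(camera_id: str, person_id: str, score: float) -> None:
        # A deployed system would push a notification to an officer's phone app.
        print(f"ALERT camera={camera_id} watchlist_id={person_id} score={score:.2f}")


    def process_frame(camera_id: str,
                      detected_faces: Iterable[List[float]],
                      watchlist: List[WatchlistEntry],
                      match_threshold: float = 0.85) -> None:
        # Every detected face is compared against every watchlist entry in real time.
        # Note what is absent: no supervisory review, no independent verification,
        # no audit log. The alert fires automatically above the threshold.
        for face in detected_faces:
            for entry in watchlist:
                score = cosine_similarity(face, entry.embedding)
                if score >= match_threshold:
                    send_officer_alert(camera_id, entry.person_id, score)


    if __name__ == "__main__":
        watchlist = [WatchlistEntry("W-001", [0.10, 0.90, 0.30])]
        # One simulated camera frame containing one detected face embedding.
        process_frame("cam-42", [[0.11, 0.88, 0.31]], watchlist)
    ```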

    Opponents argue that blanket automated surveillance of public spaces poses profound threats to privacy, civil rights, and due process. The American Civil Liberties Union (ACLU) of Louisiana described the system as a “facial recognition technology nightmare” that enables the government to “track us as we go about our daily lives.”

    The wrongful arrest of Randal Reid, based on a misidentification from still-image facial recognition, is frequently cited as an illustration of the technology’s real-world dangers. Reid, a 29‑year‑old Black logistics analyst from Georgia, was arrested in late 2022 and held for six days because of a false facial recognition match.

    The ACLU has urged the City Council to reimpose a moratorium and demand an independent audit covering privacy compliance, algorithmic bias, evidence admissibility, record retention, and public awareness. The organization said that NOPD currently lacks any system for logging or disclosing facial-recognition-derived evidence, and that Project NOLA operates entirely outside official oversight.

    A vote by the City Council is expected later this month. If the ordinance passes, NOPD and any authorized third party would be legally empowered to scan live public feeds using facial recognition, provided reports are submitted in accordance with the new law.

    NOPD, for its part, is awaiting the outcome of its internal audit, and Superintendent Anne Kirkpatrick has stated that policy revisions will be guided by the council’s decisions. Meanwhile, the ACLU and its partners are preparing to escalate their opposition, pushing for either an outright prohibition or substantially strengthened accountability measures.

    The decision facing New Orleans encapsulates the broader tension between embracing AI-based public safety tools and protecting civil liberties. Proponents emphasize the edge that real-time intelligence can provide in stopping violent crime and responding to emergencies, while critics warn that indiscriminate surveillance erodes privacy, civil rights, and due-process safeguards.

    A few things I feel are very important that none of the recent June articles about this mention:

    1. The city has managed to keep this all relatively under wraps. Few people are even aware of it, and those who are don’t realize the extent of the surveillance.

    2. This seems to be kept in the dark even by people we should be able to trust. I only found out about the City Council vote this month because I make a habit of searching for updates every so often. I cannot find any information about when the vote is actually scheduled, just that it’s sometime at the end of June. This is the last week of June, so presumably this week?

    3. The State Police and ICE can’t be regulated by city government. There is a permanent state police force in New Orleans that was established last year by Governor Landry.

    I believe they have continued using this technology however they please, and there is no real way for the city to regulate how they use it or who they share it with.

    EDIT:
    The City Council meeting is this coming Thursday.

    Thursday, June 26

    10:00 a.m. City Council Facial Recognition Meeting – City Council Chamber, 1300 Perdido St., Second Floor West

    Livestream link

  • Just a reminder that it’s illegal to wear masks in New Orleans unless it’s Carnival.

  • Thanks to the KKK, I assume.

  • Wow, I didn’t even know that. Louisiana state law. That’s some dumb fucking bullshit.

    They make an exception for medical masks, but I also recently saw that video of a protester getting hassled and arrested for wearing a medical mask.

    I guess I’m just going to have to start using face paint to trick their cams.

  • TLDR: New Orleans is poised to become the first U.S. city to legalize real-time police facial recognition surveillance, despite a 2022 ban. The push follows revelations that NOPD secretly used Project NOLA’s 200+ AI cameras for two years, making 34+ arrests without oversight. Proponents argue it’s vital for crime-fighting, citing Bourbon Street shootings and jailbreaks, while critics warn of dystopian privacy erosion and racial bias, referencing wrongful arrests like Randal Reid’s. With 70% public approval but fierce ACLU opposition, the vote could set a dangerous precedent: privatized mass surveillance with zero accountability.

  • Also, unless you claim to be a member of ICE, I assume.

  • ~2012: Palantir receives a contract with the city of New Orleans

    2015: Privately operated Project NOLA surveillance camera program created

    2018: City cancels very shady contract with Palantir that helped them create and test their predictive policing tech

    2020: Peter Thiel becomes major investor in Clearview AI facial recognition technology. Free trials are given to ICE and multiple local law enforcement agencies across the U.S.

    Late 2020: Ban on facial recognition tech and predictive policing in New Orleans

    2022: ~18 months later, Mayor Cantrell requests that the City Council lift the ban, and it is replaced with a shady surveillance ordinance giving the city some very concerning privileges in certain circumstances

    2024: Cantrell says she won’t fight Landry establishing Troop NOLA as a permanent police presence in the city, despite concerns from civil rights advocacy groups

    Feb 2025: Forbes reports that Clearview AI remains unprofitable due to multiple ongoing lawsuits and previous inability to secure federal contracts. The company says future focus will be large federal contracts.

    May 2025: The Washington Post reveals NOPD has been ignoring the fairly lax rules on facial recognition tech in the 2022 surveillance ordinance while working with Project NOLA. NOPD pauses its use of the tech, but Troop NOLA and federal agencies continue using it because they’re not under city jurisdiction.

  • Oh, I know they’re not the first ones to get it, but I do believe they’re the first to set the legal precedent, or at least that’s what the article says.

  • I wasn’t arguing with you; everything you said is correct.

    Just adding more details and the timeline of events that make this all even more “what the actual fuck is happening?”

  • Ahh, very good. Sorry for the misunderstanding.
