
What Does Palantir Actually Do?

Technology
  • Palantir is often called a data broker, a data miner, or a giant database of personal information. In reality, it’s none of these—but even former employees struggle to explain it.

    Palantir sends its employees to work inside client organizations essentially as consultants, helping to customize their data pipelines, troubleshoot problems, and fix bugs. It calls these workers “forward deployed software engineers,” a term that appears to be inspired by the concept of forward-deployed troops, who are stationed in adversarial regions to deter nearby enemies from attacking.

    Crucially, Palantir doesn’t reorganize a company's bins and pipes, so to speak, meaning it doesn’t change how data is collected or how it moves through the guts of an organization. Instead, its software sits on top of a customer’s messy systems and allows them to integrate and analyze data without needing to fix the underlying architecture. In some ways, it’s a technical band-aid. In theory, this makes Palantir particularly well suited for government agencies that may use state-of-the-art software cobbled together with programming languages dating back to the 1960s.
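
    As a loose illustration of that "sits on top" idea, here is a minimal sketch. Everything in it is invented for illustration (the sources, schemas, and function names are not Palantir's actual design): two legacy systems keep their own formats, and a thin overlay maps both into one common record shape without touching the underlying storage.

```python
import csv
import io
import sqlite3

# Hypothetical sketch of an integration layer: the underlying systems are
# never modified; the overlay only normalizes what it reads from them.

LEGACY_CSV = "id,name,city\n1,Ada,London\n2,Grace,Arlington\n"

def load_csv_source(text):
    # Source A: a flat-file export from an older system.
    return [dict(row) for row in csv.DictReader(io.StringIO(text))]

def load_db_source():
    # Source B: a relational database with a different schema.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE people (pid INTEGER, full_name TEXT, location TEXT)")
    con.execute("INSERT INTO people VALUES (3, 'Alan', 'Manchester')")
    rows = con.execute("SELECT pid, full_name, location FROM people").fetchall()
    con.close()
    return [{"id": str(pid), "name": name, "city": city} for pid, name, city in rows]

def unified_view():
    # The overlay: map both sources into one shape. The "mess" below it
    # (CSV exports, mismatched column names) is left exactly as it was.
    return load_csv_source(LEGACY_CSV) + load_db_source()
```

    The point of the sketch is only that integration can happen at read time, in a layer above the sources, rather than by rebuilding the pipelines underneath.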

    Palantir’s software is designed with nontechnical users in mind. Rather than relying on specialized technical teams to parse and analyze data, Palantir allows people across an organization to get insights, sometimes without writing a single line of code. All they need to do is log into one of Palantir’s two primary platforms: Foundry, for commercial users, or Gotham, for law enforcement and government users.

    Foundry focuses on helping businesses use data to do things like manage inventory, monitor factory lines, and track orders. Gotham, meanwhile, is an investigative tool specifically for police and government clients, designed to connect people, places, and events of interest to law enforcement. There’s also Apollo, which is like a control panel for shipping automatic software updates to Foundry or Gotham, and the Artificial Intelligence Platform, a suite of AI-powered tools that can be integrated into Gotham or Foundry.

    Foundry and Gotham are similar: Both ingest data and give people a neat platform to work with it. The main difference between them is what data they’re ingesting. Gotham takes any data that government or law enforcement customers may have, including things like crime reports, booking logs, or information they collected by subpoenaing a social media company. Gotham then extracts every person, place, and detail that might be relevant. Customers need to already have the data they want to work with—Palantir itself does not provide any.
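
    The extract-and-link step described above can be caricatured in a few lines. The records, field names, and matching logic below are invented toy stand-ins (real entity resolution is far harder): pull the people and places out of heterogeneous records, then index which records mention which entity.

```python
from collections import defaultdict

# Toy stand-ins for heterogeneous law-enforcement records.
RECORDS = [
    {"id": "report-1", "type": "crime_report", "person": "J. Doe", "place": "5th & Main"},
    {"id": "booking-7", "type": "booking_log", "person": "J. Doe", "place": "County Jail"},
    {"id": "report-2", "type": "crime_report", "person": "R. Roe", "place": "5th & Main"},
]

def build_entity_index(records):
    # Map each extracted entity (person or place) to the records citing it.
    index = defaultdict(set)
    for rec in records:
        index[("person", rec["person"])].add(rec["id"])
        index[("place", rec["place"])].add(rec["id"])
    return index

def linked_records(index, entity):
    # Records connected through a shared entity -- the "connect people,
    # places, and events" step, reduced to a set lookup.
    return sorted(index.get(entity, set()))
```

    Even this trivial version shows why the input data matters so much: the index can only ever connect what the customer already fed in.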

    Since leaving Palantir, Pinto says he’s spent a lot of time reflecting on the company’s ability to parse and connect vast amounts of data. He’s now deeply worried that an authoritarian state could use this power to “tell any narrative they want” about, say, immigrants or dissidents it may be seeking to arrest or deport. He says that software like Palantir’s doesn’t eliminate human bias.

    People are the ones who choose how to work with data, what questions to ask of it, and what conclusions to draw. Their choices can have positive outcomes, like ensuring enough Covid-19 vaccines are delivered to vulnerable areas. They can also have devastating ones, like launching a deadly airstrike or deporting someone.

    In some ways, Palantir can be seen as an amplifier of people’s intentions and biases. It helps them make ever more precise and intentional decisions, for better or for worse. But this may not always be obvious to Palantir’s users. They may only experience a sophisticated platform, sold to them using the vocabulary of warfare and hegemony. It may feel as if objective conclusions are flowing naturally from the data. When Gotham users connect disparate pieces of information about a person, it could seem like they are reading their whole life story, rather than just a slice of it.


  • It blows my mind that the role Michael Kratsios played as CTO during Trump's first administration has been basically ignored by the media.

    He was brought into the White House by Thiel to help the president with "technology issues." He is quoted in interviews as early as 2017-2018 saying the administration was trying to gain access to large protected government databases in order to train AI.

    Thiel was planning for government data to be ready for Palantir to use ~8 years before the current administration began handing Palantir billions of dollars in contracts and giving employees military rank.

    Kratsios is now science adviser to the president, but somehow still barely receives press coverage. The rare coverage he does receive is never critical. Do you remember the big, scary "Elon Musk is running the White House and stealing our data, and we should all be terrified" media narrative?

    Musk was only executing the plans Kratsios made during the first Trump administration, and he stepped down as soon as Kratsios was confirmed by the Senate.

    It's like we can state the obvious, "This could be a way for an authoritarian regime to destroy civil liberty." But nobody will just come out and say "Peter Thiel has already built a platform that will allow an authoritarian regime to destroy civil liberty and crush dissent, and he started planning it nearly a decade ago. Michael Kratsios is the flying monkey who made it possible for him to build it, and continues to quietly do his bidding."


  • For reference, Palantir is not the only company selling supposedly fancy UIs on top of data messes; there are many others, such as Talend, Dataiku, and Alteryx. Those platforms are not magic: they still need a lot of little hands to wrangle the data mess into a useful state, if there is ever an actual motivation to do that rather than just pretending to be "data driven." The data engineering community usually hates these graphical tools, which are designed to convince executives rather than help engineers, because no UI can be as powerful as a code base in this field.
    What makes Palantir special is how well it has convinced the executives of police forces and government administrations.


  • I've heard enough about Palantir to know that something is rotten in the state of Denmark, but I had been wondering about the specifics. Funnily enough, I saw a Palantir commercial interspersed with friendly, shiny Apple commercials just the other day. (Sorry for the crap photo. I didn't want to be so obvious about recording our sinister digital overlords.) The ad makes a vague assertion about "revitalizing the American industrial base."


  • Enable fascism


  • Also into AI hallucinations for script ideas.

  • Palantir is often called a data broker, a data miner, or a giant database of personal information. In reality, it’s none of these—but even former employees struggle to explain it.

    Palantir sends its employees to work inside client organizations essentially as consultants, helping to customize their data pipelines, troubleshoot problems, and fix bugs. It calls these workers “forward deployed software engineers,” a term that appears to be inspired by the concept of forward-deployed troops, who are stationed in adversarial regions to deter nearby enemies from attacking.

    Crucially, Palantir doesn’t reorganize a company's bins and pipes, so to speak, meaning it doesn’t change how data is collected or how it moves through the guts of an organization. Instead, its software sits on top of a customer’s messy systems and allows them to integrate and analyze data without needing to fix the underlying architecture. In some ways, it’s a technical band-aid. In theory, this makes Palantir particularly well suited for government agencies that may use state-of-the-art software cobbled together with programming languages dating back to the 1960s.

    Palantir’s software is designed with nontechnical users in mind. Rather than relying on specialized technical teams to parse and analyze data, Palantir allows people across an organization to get insights, sometimes without writing a single line of code. All they need to do is log into one of Palantir’s two primary platforms: Foundry, for commercial users, or Gotham, for law enforcement and government users.

    Foundry focuses on helping businesses use data to do things like manage inventory, monitor factory lines, and track orders. Gotham, meanwhile, is an investigative tool specifically for police and government clients, designed to connect people, places, and events of interest to law enforcement. There’s also Apollo, which is like a control panel for shipping automatic software updates to Foundry or Gotham, and the Artificial Intelligence Platform, a suite of AI-powered tools that can be integrated into Gotham or Foundry.

    Foundry and Gotham are similar: Both ingest data and give people a neat platform to work with it. The main difference between them is what data they’re ingesting. Gotham takes any data that government or law enforcement customers may have, including things like crime reports, booking logs, or information they collected by subpoenaing a social media company. Gotham then extracts every person, place, and detail that might be relevant. Customers need to already have the data they want to work with—Palantir itself does not provide any.
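    As a purely illustrative sketch (not Palantir's actual implementation), the pattern described above—reading records from differently shaped source systems and pulling out every person and place, without modifying those systems—might look like this in Python; all field names and sample records here are hypothetical:

    ```python
    # Toy sketch of ingest-and-extract: read heterogeneous records,
    # index every person and place mentioned, leave the sources untouched.
    # Illustrative only; field names ("suspect", "officer", "location")
    # are assumptions, not a real schema.

    def extract_entities(records):
        """Pull person/place mentions out of mixed-format records."""
        index = {"person": set(), "place": set()}
        for record in records:
            # Each source system keeps its own fields; we only read them.
            for field, kind in (("suspect", "person"),
                                ("officer", "person"),
                                ("location", "place")):
                value = record.get(field)
                if value:
                    index[kind].add(value)
        return index

    # Two "systems" with different schemas, neither one restructured:
    crime_report = {"suspect": "J. Doe", "location": "5th & Main"}
    booking_log = {"suspect": "J. Doe", "officer": "Sgt. Roe",
                   "location": "Precinct 9"}

    index = extract_entities([crime_report, booking_log])
    print(sorted(index["person"]))  # ['J. Doe', 'Sgt. Roe']
    print(sorted(index["place"]))   # ['5th & Main', 'Precinct 9']
    ```

    The point of the sketch is the shape of the approach: the analysis layer builds its own index on top of whatever the source systems already contain, rather than changing how those systems store data.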

    Since leaving Palantir, Pinto says he’s spent a lot of time reflecting on the company’s ability to parse and connect vast amounts of data. He’s now deeply worried that an authoritarian state could use this power to “tell any narrative they want” about, say, immigrants or dissidents it may be seeking to arrest or deport. He says that software like Palantir’s doesn’t eliminate human bias.

    People are the ones who choose how to work with data, what questions to ask about it, and what conclusions to draw. Their choices could have positive outcomes, like ensuring enough Covid-19 vaccines are delivered to vulnerable areas. They could also have devastating ones, like launching a deadly airstrike or deporting someone.

    In some ways, Palantir can be seen as an amplifier of people’s intentions and biases. It helps them make ever more precise and intentional decisions, for better or for worse. But this may not always be obvious to Palantir’s users. They may only experience a sophisticated platform, sold to them using the vocabulary of warfare and hegemony. It may feel as if objective conclusions are flowing naturally from the data. When Gotham users connect disparate pieces of information about a person, it could seem like they are reading their whole life story, rather than just a slice of it.

    Engineers can laterally move to more prestigious or challenging projects if they prove worthy based on their skills and connections. One former staffer tells WIRED that this made the company feel like a meritocracy where the best people, and the best ideas, naturally rise to the top.

    I am very interested in the culture and psychology of these supposedly "meritocratic" companies. Personally, I don't believe we have anything close to the hyper-efficient, merit-based resource allocation promoted by the ultra-rich.

    Usually I find these so-called "meritocratic" policies do not encourage good ideas but instead enable hyper-competitive environments.

    These kinds of environments likely do not support solid, well-thought-out proposals; instead, they push the quick implementation of mediocre ideas (a.k.a. move fast and break things). A hyper-competitive environment can also discourage collaboration, which can often be crucial to "solving the hard problems".

    And the article mentions that this environment boosts employee retention, which I find extremely interesting. I wonder if the constant competition keeps triggering a sense of "winning" and "accomplishment" in an otherwise mundane job.

  • And to think it won't even let me hack all kinds of bullshit with my phone like CtOS products in Watch Dogs...

  • So Palantir sells a data management tool and deployment support. That shouldn't really surprise anyone who knows the first thing about data science.

    The interesting thing about Palantir isn't what they sell but how they sell it and who buys it. They clearly market their unremarkable software as an autocrat's wet dream.

    And police and military departments across Europe and the US buy their shit, which says more about those police and military departments than about the software.

  • it does what pornhub monetizes