OpenAI wins $200m contract with US military for ‘warfighting’
-
wrote on 25 June 2025, 16:53, last edited by
This post did not contain any content.
-
This post did not contain any content.
wrote on 25 June 2025, 17:06, last edited by
I'm really curious what brainrot is in that man's head.
-
This post did not contain any content.
wrote on 25 June 2025, 17:20, last edited by
The war they are fighting is against you.
-
This post did not contain any content.
wrote on 25 June 2025, 17:24, last edited by
OpenAI wants the money, and the military wants to never have to deal with accountability. That way, when they bomb wherever they want, they can just say "it wasn't my decision, it was the AI," and then OpenAI can say "we need more money to make it more reliable. Also, we need more training data from the military so it won't happen again. Can we have it all?"
-
This post did not contain any content.
wrote on 25 June 2025, 17:28, last edited by vanth@reddthat.com
Use of the terms "warfighter" or "warfighting" is one of the biggest red flags in my life due to the industry I'm in. Big cringe. Might as well just say "I wanna make the world more White and Christian. I'm not in the military but love tacticool fashion. 'Murrica."
-
This post did not contain any content.
wrote on 25 June 2025, 17:29, last edited by
OpenAI’s core message was “we can’t release our GPT model because people will try to use it for war”.
Fucking hypocrites.
-
This post did not contain any content.
wrote on 25 June 2025, 17:39, last edited by
We gunna have to start holding programmers accountable for war crimes.
-
We gunna have to start holding programmers accountable for war crimes.
wrote on 25 June 2025, 17:50, last edited by
Did you live under the impression that all the smart missiles, smart guns, smart everything didn't already require programmers?
-
I'm really curious what brainrot is in that man's head.
wrote on 25 June 2025, 17:58, last edited by
There is a device that allows for the head to be easily separated for a more thorough analysis. I say we start building some.
-
We gunna have to start holding programmers accountable for war crimes.
wrote on 25 June 2025, 18:01, last edited by
For that, they'll need to license and protect the profession like other engineering vocations.
-
I'm really curious what brainrot is in that man's head.
wrote on 25 June 2025, 18:15, last edited by
When you pay money to be at the Nazi inauguration, it shouldn't be a surprise when you then accept the Nazi blood money.
-
OpenAI wants the money, and the military wants to never have to deal with accountability. That way, when they bomb wherever they want, they can just say "it wasn't my decision, it was the AI," and then OpenAI can say "we need more money to make it more reliable. Also, we need more training data from the military so it won't happen again. Can we have it all?"
wrote on 25 June 2025, 18:42, last edited by
I've dreamed of a moment when society could no longer go on without clear and unambiguous personal responsibility.
This is that moment. In the old days, even if an apparatus made a decision, that apparatus still consisted of people. Now it's possible for the mechanism to involve no people at all. Garbage decisions aside, that's something new, or, more precisely, something forgotten long ago: it goes back to the times when birds' entrails and lambs' bones were read for strategic decisions. I suppose back then such fortunetelling was a mechanism for making decisions random enough to avoid dangerous predictability and to keep traitors from steering the outcome.
The problem with AI, or "AI", is that it is not logically the same as that fortunetelling.
And on personal responsibility: in ancient Greece (and Rome), an unfortunate result of such decision-making was not blamed on the gods. It was blamed on the leader, for lacking the gods' favor, or perhaps on the fortuneteller, for failing to interpret the gods' will (which, if they could influence the result, was fair). Or sometimes on the whole unit, the whole army, or the whole city-state. So the main trait of any human mechanism, that there is a responsible party, was present.
To at least match that old way, either a clearly identifiable set of people at the company providing the program, or the operator, or the officer making the decisions, or all of them, should be held responsible when an "AI" is used.
-
There is a device that allows for the head to be easily separated for a more thorough analysis. I say we start building some.
wrote on 25 June 2025, 19:14, last edited by
Ah, I see I've found a fellow member of la révolution. I applaud your scientific curiosity.
-
This post did not contain any content.
wrote on 25 June 2025, 19:26, last edited by
Welp, it was nice knowing you all.
-
Use of the terms "warfighter" or "warfighting" is one of the biggest red flags in my life due to the industry I'm in. Big cringe. Might as well just say "I wanna make the world more White and Christian. I'm not in the military but love tacticool fashion. 'Murrica."
wrote on 25 June 2025, 19:39, last edited by
Or might as well say "Yes, I like money and want to sell to the DoD." Source: may have used it in a slide deck once. Not actually sure, as the phrase wasn't as popular back then.
-
Did you live under the impression that all the smart missiles, smart guns, smart everything didn't already require programmers?
wrote on 25 June 2025, 19:40, last edited by
They were not making targeting decisions.
-
OpenAI’s core message was “we can’t release our GPT model because people will try to use it for war”.
Fucking hypocrites.
wrote on 25 June 2025, 19:41, last edited by
"If people use it for war then people won't pay us to use it for war"
-
OpenAI’s core message was “we can’t release our GPT model because people will try to use it for war”.
Fucking hypocrites.
wrote on 25 June 2025, 19:42, last edited by
Capitalists will do anything for money. Nothing is off the table.
-
This post did not contain any content.
wrote on 25 June 2025, 19:43, last edited by
I've only ever used ChatGPT infrequently, and I find it mediocre at best, so I have no trouble abandoning it completely. Not giving them any more free training.
-