Your public ChatGPT queries are getting indexed by Google and other search engines
-
Here's a good one
As for georges bank, it was made a national monument and are large ancient volanos that would be perfect for a alien base. There's also a line on google maps going from woods hole to there, and obama has a mantion off if king point road looking out into the ocean
Omfg
-
Oh, boy. More.
Have you heard of timecube? Well here's mirrorcube
Ten times ten thousand pairs of opposite reflected extensions of you are doing the same thing - throwing the ball away from themselves toward their opposites and away from themselves, each one of each pair being the reverse of its opposite, and acting in reverse. YOU NOW KNOW WHAT THE ELECTRIC CURRENT IS, and that should tell you what RADAR is. Likewise it explains RADIO and TELEVISION. [See Principle of Regeneration, 3.13 - Reciprocals and Proportions of Motions and Substance, 7.3 - Law of Love - Reciprocal Interchange of State on Multiple Subdivisions]
It's so fucking insane
-
If you don't want your conversations to be public, how about you don't tick the checkbox that says "make this public." This isn't OpenAI's problem, it's an idiot user problem.
-
If you don't want your conversations to be public, how about you don't tick the checkbox that says "make this public." This isn't OpenAI's problem, it's an idiot user problem.
If you don't want corporations to use your chats as data, don't use corporate-hosted language models.
Even non-public chats are archived by OpenAI, and the terms of service of ChatGPT essentially give OpenAI the right to use your conversations in any way that they choose.
You can bet they'll find ways to monetize your data eventually. If you think Google Ads is powerful, wait until people's assistants are trained with every manipulative technique we've ever invented and set loose to sell you breakfast cereal or boner pills...
You can't uncheck that box except by not using it in the first place. But people will sell their soul to a company to avoid learning a little about self-hosting.
-
I assumed this was a given. Anything offered to tech overlords will be monetized and packaged for profit at every possible angle. Nice to know it's official now, I guess.
-
Here's a good one
As for georges bank, it was made a national monument and are large ancient volanos that would be perfect for a alien base. There's also a line on google maps going from woods hole to there, and obama has a mantion off if king point road looking out into the ocean
Omfg
Yum, a nicely mixed word salad! Lmfao
-
If you don't want corporations to use your chats as data, don't use corporate-hosted language models.
Even non-public chats are archived by OpenAI, and the terms of service of ChatGPT essentially give OpenAI the right to use your conversations in any way that they choose.
You can bet they'll find ways to monetize your data eventually. If you think Google Ads is powerful, wait until people's assistants are trained with every manipulative technique we've ever invented and set loose to sell you breakfast cereal or boner pills...
You can't uncheck that box except by not using it in the first place. But people will sell their soul to a company to avoid learning a little about self-hosting.
Hi there, I’m thinking about getting into self-hosting. I already have a Jellyfin server set up at home but nothing beyond that really. If you have a few minutes, how can self-hosting help in the context of OP's post? Do you mean hosting LLMs on Ollama?
-
Hi there, I’m thinking about getting into self-hosting. I already have a Jellyfin server set up at home but nothing beyond that really. If you have a few minutes, how can self-hosting help in the context of OP's post? Do you mean hosting LLMs on Ollama?
Yes, Ollama or a range of other backends (Ooba, Kobold, etc.) can run LLMs locally. Hugging Face hosts a huge number of models suited to different tasks like coding, storywriting, general-purpose chat, and so on. If you run both the backend and the frontend locally, no one monetizes your data.
The part I'd argue the previous poster is glossing over a bit is performance. Unless you have an enterprise-grade GPU cluster sitting in your basement, you're going to make compromises on speed and/or quality relative to the giant models running on commercial services.
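For what it's worth, here's a minimal sketch of what "both ends local" looks like against Ollama's REST API. It assumes Ollama is already installed and running on its default port (11434) with a model pulled (e.g. `ollama pull llama3`); the model name and prompt are just examples:

```python
# Query a locally hosted LLM through Ollama's /api/generate endpoint.
# Everything stays on your machine -- the request only goes to localhost.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build the JSON body for a non-streaming generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(model: str, prompt: str) -> str:
    """Send the prompt to the local Ollama server and return its reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask_local_llm("llama3", "In one sentence, what is self-hosting?"))
```

Swap the frontend for anything you like (Open WebUI, a script, curl); as long as it talks to localhost, your chat logs never leave the house.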
-
Even through duck.ai?
-
I mean... they are public. duh
-
Oh, boy. More.
Have you heard of timecube? Well here's mirrorcube
Ten times ten thousand pairs of opposite reflected extensions of you are doing the same thing - throwing the ball away from themselves toward their opposites and away from themselves, each one of each pair being the reverse of its opposite, and acting in reverse. YOU NOW KNOW WHAT THE ELECTRIC CURRENT IS, and that should tell you what RADAR is. Likewise it explains RADIO and TELEVISION. [See Principle of Regeneration, 3.13 - Reciprocals and Proportions of Motions and Substance, 7.3 - Law of Love - Reciprocal Interchange of State on Multiple Subdivisions]
It's so fucking insane
Have you heard of timecube?
No, but I've heard of þe time knife.
-
Have you heard of timecube?
No, but I've heard of þe time knife.
Warning, it gets racist the deeper you read.