Google quietly released an app that lets you download and run AI models locally
-
All the time I spent trying to get rid of Gemini, just to now download this. Am I stupid?
-
You never heard of Ollama or Docker Model Runner?
This one runs on Android and iOS.
-
All the time I spent trying to get rid of Gemini, just to now download this. Am I stupid?
I wouldn't think so - it depends on your priorities.
The open source and offline nature of this, without the pretense of "hey, we're going to use every query you give us as a data point to shove more products in your face," seems very appealing over Gemini. There's also the fact that Gemini is constantly shoved in our faces and comes preinstalled, whereas this is a completely optional download.
-
In a few years you won't be able to anyway
I'm just reaching the endgame faster, then.
-
There is already GPT4All.
Convenient graphical interface, any model you like (for Llama fans - of course it's there), fully local, easy to opt in or out of data collection, and no fuss to install - it's just a Linux/Windows/macOS app.
For Linux folks, it is also available as a flatpak for your convenience.
-
god i can't wait for the ai bubble to pop
-
Wonder what this has over its competitors; I hesitate to think they released this just for fun, though.
-
Typically the phone-home is looking for a response to unlock.
Use a packet sniffer to see what the request/response is, then replicate it with a proxy or a response server.
This is also known as a man-in-the-middle (MITM).
It takes skill and knowledge to do, but once you've done a few dozen it's pretty easy, since most software phone-homes are looking for static, non-encrypted responses.
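For what it's worth, here's a minimal sketch of that idea as a mitmproxy addon in Python. The hostname and the "unlock" response body are made-up placeholders; you'd substitute whatever the packet capture shows the real server actually returning.

```python
# fake_home.py - minimal mitmproxy addon that impersonates a phone-home server.
# Run with: mitmdump -s fake_home.py
# NOTE: "licensing.example.com" and the canned body below are hypothetical
# placeholders, not any real product's endpoint or protocol.
from mitmproxy import http

PHONE_HOME_HOST = "licensing.example.com"  # hypothetical phone-home host
CANNED_RESPONSE = b'{"status": "ok", "licensed": true}'  # hypothetical unlock reply


def request(flow: http.HTTPFlow) -> None:
    # Intercept the phone-home request before it leaves the machine and
    # answer it locally with the static response the software expects.
    if flow.request.pretty_host == PHONE_HOME_HOST:
        flow.response = http.Response.make(
            200,                                   # status code
            CANNED_RESPONSE,                       # body
            {"Content-Type": "application/json"},  # headers
        )
```

You'd then point the device's traffic at the proxy (or redirect DNS for that host) so the request never reaches the real server. This only works when the check is plain HTTP or the client doesn't pin its TLS certificate, which is exactly the "static, non-encrypted" case described above.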
Thanks for the info!
-
it does if you make it work in those conditions.
software that "phones home" is easy to fool.
Tell that to my Red Dead Redemption 2 install.
-
Google hosting their shit on Microsoft's servers and telling you to sideload instead of using their own software distribution method for their own OS is kind of crazy if you think about it.
-
There is already GPT4All.
Convenient graphical interface, any model you like (for Llama fans - of course it's there), fully local, easy to opt in or out of data collection, and no fuss to install - it's just a Linux/Windows/macOS app.
For Linux folks, it is also available as a flatpak for your convenience.
So it doesn't require internet access at all? I would only use these on a disconnected part of my network.
-
So it doesn't require internet access at all? I would only use these on a disconnected part of my network.
Yes, it works perfectly well without internet access. I've tried it both on a physically disconnected PC and on a laptop in airplane mode.
-
I'm not gonna use my smartphone as a local LLM machine.
Okay man.