Google quietly released an app that lets you download and run AI models locally
-
Ollama can’t run on Android
Llama.cpp (which Ollama runs on) can, and many chat programs for phones can use it.
-
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
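For anyone curious, Ollama exposes a small HTTP API (port 11434 by default), so any client on your network can talk to a self-hosted server. A minimal sketch in Python; the server IP and model name below are placeholders for whatever you actually run:

```python
# Quick sanity check of a self-hosted Ollama server from another
# machine on the LAN. The IP and model name are placeholders;
# 11434 is Ollama's default API port.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # your server's address

payload = json.dumps({
    "model": "llama3",  # any model you've pulled on the server
    "prompt": "Say hello in five words.",
    "stream": False,    # return one JSON object instead of a stream
}).encode()

req = urllib.request.Request(
    OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Note that Ollama binds to localhost by default, so you'd need to set OLLAMA_HOST to expose it on the LAN; Android clients in this setup are basically thin wrappers around that same API.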
How does Ollama compare to GPT models? I used the paid tier for work and I'm curious how this stacks up.
-
Just firewall the software or is there anything more fancy i would need to do?
Typically the phone-home is looking for a response to unlock.
Use a packet sniffer to see what the request/response is and replicate it with a proxy or response server.
This is also known as a man-in-the-middle (MITM).
It takes skill and knowledge to do, but once you've done a few dozen it's pretty easy, since most software phone-homes are looking for static, non-encrypted responses.
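As a concrete sketch of the "response server" half: once a packet capture shows the app expects some static reply, a few lines of Python can stand in for the real server. Everything app-specific here (the canned JSON body, the port) is hypothetical and has to come from your own capture:

```python
# Minimal fake "phone home" endpoint: after sniffing the static
# response the app expects, serve that same response locally and
# redirect the app here (hosts file, DNS override, or firewall rule).
# The response body below is hypothetical; use what your capture shows.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = b'{"status": "ok", "licensed": true}'  # copied from a real capture

class PhoneHomeHandler(BaseHTTPRequestHandler):
    def _reply(self):
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(CANNED)))
        self.end_headers()
        self.wfile.write(CANNED)

    def do_GET(self):
        self._reply()

    def do_POST(self):
        # Drain the request body so the client doesn't stall,
        # then answer with the same canned response.
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        self._reply()

if __name__ == "__main__":
    # Port 80 needs root; pick a high port and redirect to it if you prefer.
    HTTPServer(("0.0.0.0", 80), PhoneHomeHandler).serve_forever()
```

This only covers plain-HTTP, static check-ins; anything using TLS with certificate pinning is a much harder fight.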
-
How does Ollama compare to GPT models? I used the paid tier for work and I'm curious how this stacks up.
It's decent, with the DeepSeek model anyway. It's not as fast and has a lower parameter count, though. You might just need to try it and see if it fits your needs or not.
-
You've never heard of Ollama or Docker Model Runner?
-
Duck.ai doesn't data mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
I've been using duck.ai recently myself and quite like it. My only complaint is that the chats have a length limit, so if you're working on complex projects you can run into those limits pretty quickly. I use it for worldbuilding for a novel I'm working on, and I have to use ChatGPT for thematic stuff because it has a better memory, but otherwise it's great for quick/small things.
-
Excellent, I will be sure not to use this, like all Google shit.
-
Excellent, I will be sure not to use this, like all Google shit.
In a few years you won't be able to anyway
-
All the time I spent trying to get rid of Gemini, just to now download this. Am I stupid?
-
You've never heard of Ollama or Docker Model Runner?
Android and iOS.
-
All the time I spent trying to get rid of Gemini, just to now download this. Am I stupid?
I wouldn't think so - it depends on your priorities.
The open source and offline nature of this, without the pretense of "Hey, we're gonna use every query you give us as a data point to shove more products down your throat," seems very appealing compared to Gemini. There's also the fact that Gemini is constantly being shoved in our faces and preinstalled, whereas this is a completely optional download.
-
In a few years you won't be able to anyway
I'm just reaching the end game faster, then.
-
There is already GPT4All.
Convenient graphical interface, any model you like (for Llama fans - of course it's there), fully local, easy to opt in or out of data collection, and no fuss to install - it's just a Linux/Windows/macOS app.
For Linux folks, it is also available as a flatpak for your convenience.
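Worth adding that GPT4All also ships a Python binding (the gpt4all package on PyPI) if you'd rather script it than use the GUI. A minimal sketch; the model filename is just an example, downloaded once on first run and fully local afterwards:

```python
# GPT4All's Python binding (pip install gpt4all) runs the same
# local models without the GUI. The model file is downloaded once,
# then everything runs fully offline.
from gpt4all import GPT4All

# Example model name; any GGUF model GPT4All supports works here.
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

with model.chat_session():
    print(model.generate("Name three uses for a local LLM.", max_tokens=128))
```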
-
God, I can't wait for the AI bubble to pop.
-
Wonder what this has over its competitors. I hesitate to think they released this just for fun, though.
-
Typically the phone-home is looking for a response to unlock.
Use a packet sniffer to see what the request/response is and replicate it with a proxy or response server.
This is also known as a man-in-the-middle (MITM).
It takes skill and knowledge to do, but once you've done a few dozen it's pretty easy, since most software phone-homes are looking for static, non-encrypted responses.
Thanks for the info!
-
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
Tell that to my Red Dead Redemption 2 install.
-
Google hosting their shit on Microsoft's servers, and telling you to sideload instead of using their own software distribution method for their own OS, is kind of crazy if you think about it.
-
There is already GPT4All.
Convenient graphical interface, any model you like (for Llama fans - of course it's there), fully local, easy to opt in or out of data collection, and no fuss to install - it's just a Linux/Windows/macOS app.
For Linux folks, it is also available as a flatpak for your convenience.
So it doesn't require internet access at all? I would only use these on a disconnected part of my network.
-
So it doesn't require internet access at all? I would only use these on a disconnected part of my network.
Yes, it works perfectly well without internet. I tried it both on a physically disconnected PC and on a laptop in airplane mode.