Google quietly released an app that lets you download and run AI models locally
-
Is the chat uncensored?
And unmonitored? Don't trust anything from Google anymore.
What makes this better than Ollama?
-
And unmonitored? Don't trust anything from Google anymore.
What makes this better than Ollama?
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it, set the device to airplane mode, and never go online again - they won't be able to monitor anything, even if there's code for that included.
-
This post did not contain any content.
Why would I use this over Ollama?
-
Everything is unmonitored if you don't connect to the network.
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it, set the device to airplane mode, and never go online again - they won't be able to monitor anything, even if there's code for that included.
That is exactly what Ollama does too.
-
Why would I use this over Ollama?
Ollama can’t run on Android
-
Ollama can’t run on Android
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
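A minimal sketch of that setup, for anyone curious: Ollama serves an HTTP API on port 11434 by default, so any client on the LAN can query it. The host address and model name below are placeholders for whatever you run, and the server needs `OLLAMA_HOST=0.0.0.0` set so it accepts connections from other devices.

```python
# Query a self-hosted Ollama server from any client on the LAN.
# 192.168.1.50 is a placeholder; swap in your server's address.
import requests

resp = requests.post(
    "http://192.168.1.50:11434/api/generate",
    json={
        "model": "llama3.2",              # any model already pulled on the server
        "prompt": "Why is the sky blue?",
        "stream": False,                  # single JSON response instead of a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```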
-
Is the chat uncensored?
Censoring is model-dependent, so you can select one of the models without guardrails.
-
Everything is unmonitored if you don't connect to the network.
But not everything works in those conditions.
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
never go online again - they won't be able to monitor anything, even if there's code for that included.
Sounds counterintuitive on a smartphone, where you most likely want to be online again at some point.
-
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
Yes, that's my setup. But this will be useful for cases where the internet connection is not reliable.
-
This post did not contain any content.
Enclave on iOS does the trick for the rare times I need a local LLM.
-
Ollama can’t run on Android
You can use it in Termux.
-
But not everything works in those conditions.
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
-
Enclave on iOS does the trick for the rare times I need a local LLM.
Didn't know about this. Checking it out now, thanks!
-
You can use it in Termux.
Has this actually been done? If so, I assume it would only be able to use the CPU.
-
This post did not contain any content.
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
-
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
Yeah, Duck is all I've bothered with since it came out, since you don't even need to log in to use it.
-
Has this actually been done? If so, I assume it would only be able to use the CPU.
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but idk.
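If you want a number instead of a feel, the `/api/generate` response includes generation stats you can turn into tokens per second. A quick sketch, assuming a model you've already pulled:

```python
# Rough tokens-per-second check against a local Ollama instance (e.g. in Termux).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3.2", "prompt": "Write a haiku about rain.", "stream": False},
    timeout=300,
).json()

tokens = resp["eval_count"]             # tokens generated
seconds = resp["eval_duration"] / 1e9   # Ollama reports duration in nanoseconds
print(f"{tokens} tokens in {seconds:.1f}s -> {tokens / seconds:.1f} tok/s")
```

A small model on a phone CPU usually lands in the single-digit tok/s range, so that would line up with the "feels like CPU speed" impression.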
-
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
Just firewall the software, or is there anything more fancy I would need to do?