Google quietly released an app that lets you download and run AI models locally
-
wrote on 31 May 2025, 16:03, last edited
This post did not contain any content.
-
This post did not contain any content.
wrote on 31 May 2025, 16:16, last edited
Alibaba also provides an open-source app; it even supports their multimodal voice-chat model Qwen2.5-Omni:
https://github.com/alibaba/MNN
-
This post did not contain any content.
wrote on 31 May 2025, 17:04, last edited
Is the chat uncensored?
-
Is the chat uncensored?
wrote on 31 May 2025, 17:26, last edited
And unmonitored? Don't trust anything from Google anymore.
What makes this better than Ollama?
-
And unmonitored? Don't trust anything from Google anymore.
What makes this better than Ollama?
wrote on 31 May 2025, 17:35, last edited
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
-
This post did not contain any content.
wrote on 31 May 2025, 17:35, last edited
Why would I use this over Ollama?
-
wrote on 31 May 2025, 17:40, last edited
Everything is unmonitored if you don't connect to the network.
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
wrote on 31 May 2025, 17:56, last edited
That is exactly what Ollama does too.
-
Why would I use this over Ollama?
wrote on 31 May 2025, 18:01, last edited
Ollama can’t run on Android
-
Ollama can’t run on Android
wrote on 31 May 2025, 18:08, last edited
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
-
Is the chat uncensored?
wrote on 31 May 2025, 18:55, last edited
Censoring is model-dependent, so you can select one of the models without guardrails.
-
Everything is unmonitored if you don't connect to the network.
wrote on 31 May 2025, 19:21, last edited
But not everything works in those conditions.
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
wrote on 31 May 2025, 19:45, last edited
never go online again - they won't be able to monitor anything, even if there's code for that included.
Sounds counter-intuitive on a smartphone, where you most likely want to be online again at some point.
-
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
wrote on 31 May 2025, 21:46, last edited
Yes, that's my setup. But this will be useful for cases where the internet connection is not reliable.
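The self-hosted setup described above comes down to calling the Ollama server's HTTP API from the phone (or any client). A minimal sketch, assuming a hypothetical server at 192.168.1.10:11434 with a model such as llama3.2 already pulled; adjust the address and model name to your own setup:

```python
import json
import urllib.request

# Hypothetical LAN address of the self-hosted Ollama server.
OLLAMA_URL = "http://192.168.1.10:11434"

def make_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False returns one complete JSON object instead of a token stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """POST a prompt to the server and return the generated text."""
    body = json.dumps(make_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires the server to be running):
#   print(generate("llama3.2", "Why is the sky blue?"))
```

Any Android client that speaks this API can then point at the server over Wi-Fi or a VPN such as WireGuard.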
-
This post did not contain any content.
wrote on 31 May 2025, 22:13, last edited
Enclave on iOS does the trick for the rare times I need a local LLM
-
Ollama can’t run on Android
wrote on 31 May 2025, 22:15, last edited
You can use it in Termux
-
But not everything works in those conditions.
wrote on 31 May 2025, 23:12, last edited
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
-
Enclave on iOS does the trick for the rare times I need a local LLM
wrote on 31 May 2025, 23:25, last edited
Didn't know about this. Checking it out now, thanks!
-
You can use it in termux
wrote on 31 May 2025, 23:30, last edited
Has this actually been done? If so, I assume it would only be able to use the CPU.
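One commonly reported route is to run Ollama inside a proot Linux distribution under Termux, since Ollama's official install script targets glibc-based Linux rather than Termux's own environment. A setup sketch, not verified on any specific device, and yes, CPU-only:

```shell
# Inside Termux: install a Debian environment via proot-distro.
pkg install proot-distro
proot-distro install debian
proot-distro login debian

# Inside the Debian environment: install and run Ollama.
apt update && apt install -y curl
curl -fsSL https://ollama.com/install.sh | sh
ollama serve &
ollama run llama3.2   # pick a small model; phone RAM is the practical limit
```

Expect it to be slow compared to a desktop, since there is no GPU acceleration in this setup.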
-
This post did not contain any content.
wrote on 1 June 2025, 01:15, last edited
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.