Google quietly released an app that lets you download and run AI models locally
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
wrote on 31 May 2025, 17:56 (last edited):
That is exactly what Ollama does too.
-
Why would I use this over Ollama?
wrote on 31 May 2025, 18:01 (last edited):
Ollama can’t run on Android
-
Ollama can’t run on Android
wrote on 31 May 2025, 18:08 (last edited):
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
-
Is the chat uncensored?
wrote on 31 May 2025, 18:55 (last edited):
Censoring is model-dependent, so you can select one of the models without guardrails.
-
everything is unmonitored if you don't connect to the network.
wrote on 31 May 2025, 19:21 (last edited):
But not everything works in those conditions.
-
Quote: "all running locally, without needing an internet connection once the model is loaded. Experiment with different models, chat, ask questions with images, explore prompts, and more!"
So you can download it and set the device to airplane mode, never go online again - they won't be able to monitor anything, even if there's code for that included.
wrote on 31 May 2025, 19:45 (last edited):
never go online again - they won't be able to monitor anything, even if there's code for that included.
Sounds counter-intuitive on a smartphone, where you most likely want to be online again at some point in time.
-
That's fair, but I think I'd rather self-host an Ollama server and connect to it with an Android client in that case. Much better performance.
wrote on 31 May 2025, 21:46 (last edited):
Yes, that's my setup. But this will be useful for cases where the internet connection is not reliable.
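A minimal sketch of what that client-to-server call can look like, assuming Ollama's default HTTP API on port 11434; the host address and model name here are placeholders:

# Minimal Python client for a self-hosted Ollama server (sketch, not a full Android app).
# Assumes the server is reachable on the LAN and already has the model pulled.
import json
import urllib.request

OLLAMA_HOST = "http://192.168.1.50:11434"  # placeholder address of the Ollama box
MODEL = "llama3"                           # placeholder model name

def ask(prompt: str) -> str:
    payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("Why does local inference help with privacy?"))

Most Android clients for Ollama wrap this same /api/generate (or /api/chat) endpoint, so anything that can reach the server over HTTP works.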
-
This post did not contain any content.
wrote on 31 May 2025, 22:13 (last edited):
Enclave on iOS does the trick for the rare times I need a local LLM.
-
Ollama can’t run on Android
wrote on 31 May 2025, 22:15 (last edited):
You can use it in Termux.
-
But not everything works in those conditions.
wrote on 31 May 2025, 23:12 (last edited):
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
-
Enclave on iOS does the trick for the rare times I need a local LLM.
wrote on 31 May 2025, 23:25 (last edited):
Didn't know about this. Checking it out now, thanks!
-
You can use it in Termux.
wrote on 31 May 2025, 23:30 (last edited):
Has this actually been done? If so, I assume it would only be able to use the CPU.
-
This post did not contain any content.
wrote on 1 June 2025, 01:15 (last edited):
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
-
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
wrote on 1 June 2025, 02:05 (last edited):
Yeah, Duck is all I've bothered with since it came out, since you don't even need to log in to use it.
-
Has this actually been done? If so, I assume it would only be able to use the CPU.
wrote on 1 June 2025, 04:21 (last edited):
Yeah, I have it in Termux. Ollama is in the package repos for Termux. The speed it generates at does feel like CPU speed, but I don't know for sure.
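If you want a rough number instead of a feel, the non-streaming /api/generate response includes timing fields you can turn into tokens per second. A small sketch, assuming Ollama is listening on its default local port inside Termux; the model name is a placeholder:

# Rough tokens-per-second check against a local Ollama instance (sketch).
# eval_count and eval_duration (nanoseconds) are reported in the non-streaming response.
import json
import urllib.request

MODEL = "llama3.2:1b"  # placeholder; use whichever small model you pulled

payload = json.dumps({"model": MODEL, "prompt": "Count to ten.", "stream": False}).encode()
req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    data = json.loads(resp.read())

tokens = data.get("eval_count", 0)
seconds = data.get("eval_duration", 0) / 1e9
if seconds > 0:
    print(f"{tokens} tokens in {seconds:.1f} s -> {tokens / seconds:.1f} tok/s")

Without GPU or NPU offload, generation on a phone is CPU-bound, which would fit the speed you are seeing.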
-
It does if you make it work in those conditions.
Software that "phones home" is easy to fool.
wrote on 1 June 2025, 08:08 (last edited):
Just firewall the software, or is there anything more fancy I would need to do?
-
Duck.ai doesn't data-mine, and it has o3-mini, which I have found to be very good. It's got some extra functionality, like lines to break up text.
wrote on 1 June 2025, 08:52 (last edited):
Nice! I saw Mozilla also added an AI chat in the browser recently (not in the phone version, as far as I've seen).
It is too bad Duck.ai only runs the small models. GPT-4o-mini is not very good; it can be very inaccurate and very inconsistent.
I would like to see 4.1-mini instead: faster, better, and it has function calling, so it can do web searches, for example. o3-mini can't, so it only knows what it knew as of 2023. But thanks for the information, I will be looking out for when 4.1 is added!
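For reference, "function calling" just means the model returns a structured request for your own code to run a tool (such as a web search) instead of answering directly. A minimal sketch with the OpenAI Python SDK, where the web_search tool and the model name are illustrative placeholders:

# Sketch of OpenAI-style function calling; web_search is a made-up tool your code would implement.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

tools = [{
    "type": "function",
    "function": {
        "name": "web_search",
        "description": "Search the web for up-to-date information.",
        "parameters": {
            "type": "object",
            "properties": {"query": {"type": "string"}},
            "required": ["query"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4.1-mini",  # placeholder model name
    messages=[{"role": "user", "content": "What is in the tech news today?"}],
    tools=tools,
)

# When the model decides a search is needed, it returns a tool call rather than an answer.
for call in resp.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)

Duck.ai itself is just a chat front end, so this is only to show what the capability mentioned above would look like.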
-
never go online again - they won't be able to monitor anything, even if there's code for that included.
Sounds counter-intuitive on a smartphone, where you most likely want to be online again at some point in time.
wrote on 1 June 2025, 10:15 (last edited):
So trust them. If you don't and you still want to use this, buy a separate device for it, or use a VM.
Can't? Then this is not for you.
-
So trust them. If you don't and you still want to use this, buy a separate device for it, or use a VM.
Can't? Then this is not for you.
wrote on 1 June 2025, 11:16 (last edited):
I'm not going to use my smartphone as a local LLM machine.
-
Ollama can’t run on Android
wrote on 1 June 2025, 11:19 (last edited):
Is there any useful model you can run on a phone?
-