It can run on my GPU. I have Qwen 3.6 27b and 25b, but they need RAM, and I was too lazy to free up some RAM for them, especially for something that seemed rather trivial. I'm sadly stuck with ollama right now for (stupid) reasons. The other is definitely the way to go in the future.
Though I'm waiting for the real open-source AI !fosai@lemmy.world
whaaaat what’s the stupid reasons?