

I don’t know what’s typical, but when I run AI locally I’ve been using llama.cpp with models grabbed from HF (e.g. QwenCoder). Then in my VS Code plugin (RooCode) I use the “OpenAI Compatible” option to point it at my local server.
Not sure how hard that would be to get working here, but my hope is that “OpenAI Compatible” support makes it straightforward.
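For anyone curious what “OpenAI Compatible” actually buys you: once llama-server is running, any OpenAI-style client can talk to it just by swapping the base URL. A minimal sketch in Python, assuming llama-server on its default port 8080; the model name and prompt are only illustrative:

    from openai import OpenAI  # pip install openai

    # Start the server first with something like:
    #   llama-server -m qwen-coder.gguf --port 8080
    # It exposes an OpenAI-compatible API under /v1.
    client = OpenAI(
        base_url="http://localhost:8080/v1",
        api_key="sk-no-key-required",  # llama.cpp doesn't check the key by default
    )

    resp = client.chat.completions.create(
        model="qwen-coder",  # llama-server serves whatever model it was launched with
        messages=[{"role": "user", "content": "Write hello world in Rust."}],
    )
    print(resp.choices[0].message.content)

RooCode’s “OpenAI Compatible” provider is doing essentially the same thing under the hood, which is why pointing it at the local server works.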

I’ve invented a new type of vegetarianism: instead of eating veggies for every meal, occasionally you’ll add meat to your diet as well. It’s really the best of both worlds.