There are some videos on YouTube of people running local LLMs on the newer M4 chips, which have pretty good AI performance. Obviously, a 5090 is going to destroy it in raw compute power, but the large unified memory on Apple Silicon is nice.
That being said, there are plenty of small ITX cases at about 13–15 L that can fit a large Nvidia GPU.
As someone who dual-boots Asahi and macOS, I really hope they can figure out how to move forward. What they do is essential to keeping Apple Silicon MacBooks open to user choice of OS.