

What is stopping people from bringing RISC-V to the desktop now? Major distros already support it and you can run x86 programs with box64.
What is not fast enough then?
All the calculations could be done beforehand and stored, so the only thing left in the delayed draw is to set the buffer.
I haven’t looked at the code yet, so I’m not sure how much, if anything, it would save though.
You could also group pixels that are far away from each other into a single call; while a compromise, I think it would maintain the effect.
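A minimal sketch of the precompute idea, assuming the image is an RGBA `ImageData`-style byte array; the helper names (`precomputePixels`, `drawFrame`) are hypothetical, not from the original code:

```javascript
// Do the expensive math once, up front: turn each point into a
// ready-to-use byte offset plus its colour.
function precomputePixels(points, width) {
  return points.map(p => ({
    offset: (p.y * width + p.x) * 4, // 4 bytes per RGBA pixel
    rgba: [p.r, p.g, p.b, 255],      // fully opaque
  }));
}

// The delayed draw is then only buffer writes
// (data would be ImageData.data in the browser).
function drawFrame(precomputed, data) {
  for (const { offset, rgba } of precomputed) {
    data[offset] = rgba[0];
    data[offset + 1] = rgba[1];
    data[offset + 2] = rgba[2];
    data[offset + 3] = rgba[3];
  }
}
```

The per-frame cost is then independent of how the positions and colours were computed.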
A random suggestion would be to draw to multiple canvases and use a CSS animation for the delay.
Also, not sure if you are already doing this, but it might be more performant to use the raw buffer instead of draw functions.
Alternatively you could look into WebGPU; it is meant for this kind of thing.
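To illustrate the raw-buffer suggestion: instead of one `ctx.fillRect` per pixel, write RGBA bytes into a single array and upload it once with `ctx.putImageData`. `setPixel` here is a hypothetical helper, and the commented browser part assumes a 2D context `ctx`:

```javascript
// Write one pixel's RGBA bytes directly into a flat buffer.
function setPixel(data, width, x, y, r, g, b, a = 255) {
  const i = (y * width + x) * 4; // 4 bytes per pixel: R, G, B, A
  data[i] = r;
  data[i + 1] = g;
  data[i + 2] = b;
  data[i + 3] = a;
}

// Per frame, in the browser (sketch, not runnable outside a page):
//   const img = ctx.createImageData(width, height);
//   ...call setPixel(img.data, width, ...) for every pixel...
//   ctx.putImageData(img, 0, 0); // one call instead of thousands of fillRects
```

Batching everything into one `putImageData` avoids the per-call overhead of the path/fill drawing API.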
Well, yes, the performance certainly hasn’t caught up to x86 yet, but the strongest RISC-V CPU on the market, as far as I know, has 64 cores at 2 GHz. More than enough to run a desktop.