

Fuck bots.


That’s unfortunate. Idk what I did, but it’s just worked for me for a long time now. I definitely feel in the minority in having success with an Nvidia GPU. I run three monitors: one vertical, one ultrawide, and one on top. It remembers positions, colors and refresh rates are on point, and other than needing to restart Steam from time to time, it’s just worked for me. My next card still won’t be Nvidia, though.


I’ve had no performance degradation moving to Wayland after the aforementioned update. I read what you said. I’ve noticed no difference in performance, and anyone fretting about 120 fps vs 110 fps is just splitting hairs for the sake of being pedantic. I concede that there are some missing features, but they don’t apply to me or the vast majority of users; for most of us, Wayland is just as good, if not better, in the common cases.
No need to be hostile my dude. Chill.


I get very good fps in every game I play on my 2080ti that’s very much not new. I’ve had to download a fix for one game I have, and Sony doesn’t allow multiplayer in Ghost of Tsushima, but every other game literally loads and runs better than they ever have. 100fps+ in most games, and maybe less than 60 in the most intense. Arch + kde + Wayland is easily the best computing experience I’ve ever had.


I have KDE + Wayland on Arch, and everything works just fine even with my Nvidia card. About 1.5 years ago there was a big Wayland update that took it from literally unusable to good, if maybe a little buggy. Now it’s as stable as anything I’ve ever used.
I use Spectacle for screenshots without issue. Switching windows, switching desktops, snapping windows, etc. all work perfectly for me. The other features you mentioned I can’t personally speak to, but not having every feature on a rolling release is to be expected with anything. Its early phases were bad, though.


As a dev who recently transitioned from a decade of sysadmin experience, to two years as a ServiceNow admin/developer/everything else, to now full stack development, I have found AI useful for some things. I asked it how to do a thing, and it regurgitated a bunch of code that didn’t do what I was looking for; however, it did give me a framework for which files I needed to modify. I then put nose to the grindstone and wrote all of the rest of the code myself, researching the docs when needed, and I got it done.
For me, if I use AI to assist with code, I always type everything out myself, whether it’s right or not, because like taking notes, typing it out helps me learn what I’m doing instead of just finding a solution and running with it. I’ve disabled most of the autocomplete Copilot garbage in Visual Studio because it would generate huge blocks of code that may or may not be correct, and the accept button is the tab key, which I use frequently. I still have some degree of autocomplete for single lines, but that’s it.
My advice would be to use AI as a prompt to get ideas or steer direction, but if you want to get better at coding and problem solving, try to find solutions yourself, because digging through docs will be far more beneficial to your growth. AI does a good job of filling the gaps in packages or frameworks when you’re ignorant of all the functions and such, but striving to understand them instead of relying on unreliable tools will make you a much better developer long term.


I did read the source, but that doesn’t change what I said. Lines of code is a shit metric. The source even specifically says, “…line of code is of course a questionable metric” then says how 400k lines of rust are auto generated in bindings.
So after re-reading, the intent of the article is to compare lines of code between different languages and their percentage of gnome. It’s apples and oranges, and a meaningless, shit metric.
There are definitely use cases where something like C is still the best option because it’s faster. For most consumer software it’s unnecessary, but it’s not obsolete for all applications.


Lines of code is such a shit metric.
In low-level languages like C and Rust, it takes two to three times as many lines to do the same thing. It’s a sensationalist way to try to share information, and I think the intent is disingenuous rather than ignorant.


Bad code is bad code. We’ll see more of this as rust is implemented into core software.


It literally went from .1 fps to workable across multiple monitors overnight. I’ve had one issue since, where an update broke multi-monitor support: if I changed monitor inputs without first removing the monitor in display settings, it hard-locked my computer. After a couple weeks that was fixed. Now I periodically have to restart Steam after an Nvidia update, and sometimes do a full system restart, but not always, and that’s still fewer restarts than the equivalent on Windows.
I’ve got a few friends who are considering a full jump, but a couple still play LoL or other anti-cheat titles and aren’t willing to jump ship yet. The fight is real and I’m still pushing, though. I’ve been all in for 2+ years now, and other than the pre-Plasma 6 days on KDE, I’ve had one game I had to tweak some settings for, and Ghost of Tsushima didn’t have multiplayer support, but the rest have been perfect out of the box. For 90%+ of people, gaming on Linux will just work, regardless of GPU.


I had issues until about 18 months ago. I went from Wayland being completely unusable with my 2080 Ti to “it just works,” with Plasma 6 I think.
I read it as C, but it’s also true for JavaScript. The code implies that x was declared as an int sometime previously, or, if JavaScript, it’s just `undefined` until it’s assigned a value that gives it a type.
I know. OP asked what x was before the loop, and I just said it’s an int. The int can be any value because as you pointed out it will be set to 0 in the first loop iteration.
An int. Value doesn’t matter because it’s overwritten.


Unfortunately it’s a necessary evil if you want voice and video chat. I think people and companies that use it professionally for customer support and similar are making a terrible choice. If there were a competing product that was as simple to set up and had even the most basic voice and chat features without the extra Discord bloat, I’d jump ship, but that’s not really an option right now.
I’m running a 2080ti and was able to switch to Wayland once plasma 6 was released. Prior to that it was completely unusable for me. Since then I had one issue earlier this year where multi monitor detection would hard lock my computer if I changed inputs on the monitor but didn’t update display settings first.
I’d say no. The effort to set up a dual boot and then hope it never breaks isn’t worth it. I’d recommend installing into a virtual machine and running from there. If you break something in your install, it’s easy to start over, and it’s way easier for initial setup.
Or League or a slew of EA games.
This is something I’ve very consciously tried to correct in my own behavior. It’s unfortunately and surprisingly hard to just shut up and let someone finish, but I’ve gotten a lot better at it. Now I find myself nodding and physically moving, ready to jump in the moment I can.
I work with one guy who is the worst at this. You can be in the middle of a sentence telling a story, and he’ll hijack it and pivot. He then will ramp up his volume as he talks to control the convo. It’s annoying, but I can empathize with him to an extent, and I don’t think he knows he’s doing it. It’s a tough line to walk sometimes.