

If propaganda worked on everyone, why is there anyone trying to counter it?


Was probably just sufficiently satisfied with life at that point.


Looks like the frames were extended. If you zoom in, there’s a pretty obvious vertical line on both sides, guessing that shows where someone used some AI tool to extend them.
Edit: actually, it’s even more obvious zoomed out; you can see on Picard’s sleeve exactly where reality ends and the slop begins.
Whoever made this must have low standards if they think “they made” an improvement.


Yeah IMO getting popular ruined reddit.
The RJ plugs are my least favourite. They still snag, but the plastic tab that snags is feeble enough to break off easily, and then the plug has nothing holding it in the port. And those covers usually make it harder to fit the plug through holes intended for ethernet cables, as well as harder to unclip it from the port.
The reason that 25 number came up is that’s how old the cohort in the brain-development study had gotten when the funding was cut. There’s no reason not to believe brains continue developing all our lives, or that, even if the study had found a “cut-off point”, it would be the same from person to person.


I recently added the pesticide of just throwing out my infested plants and starting over from scratch. And if I buy another store plant, it’s getting quarantined, despite me not really having any space to do so. It’ll sit on the stairs or something until I can be sure there are no bugs.


Just realized that even if there is no mechanism to get the exact date from any of these age-tracking systems, they’ll be able to infer it by watching for when a user/device transitions to the next bracket: the birthday that starts that bracket must fall somewhere between the last check and the current one.
Though maybe that data can be poisoned by occasionally transitioning backwards, so it looks like the user is editing their age back and forth or something. On the other hand, missing or poisoned data is going to be a flag on its own at some point (if not already).
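To make the bracket-transition inference concrete, here’s a toy sketch. The brackets, check dates, and observations are all made up for illustration; the point is just that two consecutive checks straddling a bracket change pin the birth date to a narrow window.

```python
from datetime import date, timedelta

# Hypothetical tracker observations: (check date, reported age bracket).
checks = [
    (date(2025, 3, 1), "13-15"),
    (date(2025, 6, 1), "13-15"),
    (date(2025, 9, 1), "16-17"),  # bracket transition observed here
]

# The 16th birthday must fall after the last "13-15" check and on or
# before the first "16-17" check, so the birth date sits in a ~3-month
# window 16 years earlier.
last_young = checks[1][0]
first_old = checks[2][0]

def years_earlier(d: date, years: int) -> date:
    # Shift a date back by whole years (toy helper; ignores Feb 29).
    return d.replace(year=d.year - years)

window_start = years_earlier(last_young, 16) + timedelta(days=1)
window_end = years_earlier(first_old, 16)
print(window_start, "to", window_end)  # → 2009-06-02 to 2009-09-01
```

The more frequent the checks, the tighter that window gets; daily checks would recover the exact birthday.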


Not sure it will, as it would have to handle users older than that, so there’s no reason for the default age to be that. It also depends on the UI (my Steam bday is something like Jan 1 1900 because that’s the default date already entered).


It’s not even a junior dev. It might “understand” a wider and deeper set of things than a junior dev does, but at least junior devs might have a sense of coherency to everything they build.
I use gen AI at work (because they want me to) and holy shit is it “deceptive”. In quotes because it has no intent at all, but it is just good enough to make it seem like it mostly did what was asked, but you look closer and you’ll see it isn’t following any kind of paradigms, it’s still just predicting text.
The amount of context it can include in those predictions is impressive, don’t get me wrong, but it has zero actual problem solving capability. What it appears to “solve” is just pattern matching the current problem to a previous one. Same thing with analysis, brainstorming, whatever activity can be labelled as “intelligent”.
Hallucinations are just cases where it matches a pattern that isn’t grounded in truth (either mispredicting or predicting a lie). But it also goes the other way: it misses patterns that are there, which is horrible for programming if you care at all about efficiency and accuracy.
It’ll do things like write a great helper function that it uses once but never again, maybe even writing a second copy of it the next time it would use it. Or forgetting instructions (in a context window of 200k, a few lines can easily get drowned out).
Code quality is going to suffer as AI gets adopted more and more. And I believe the problem is fundamental to the way LLMs work. The LLM-based patches I’ve seen so far aren’t going to fix it.
Also, as much as it’s nice not having to write a whole lot of code, my software dev skills aren’t being used very well. It’s like I’m babysitting an expert programmer with Alzheimer’s who thinks they’re still in their prime and doesn’t realize they’ve forgotten what they did 5 minutes ago, while my company pays them big money, gets upset if we don’t use their expertise, and probably intends to use my AI chat logs to train my replacement, since everything I know can be parsed out of those conversations.


Actually, I think that’s Windows 11. Though, despite it never trying to get you to install Win 11, it’s still worse than the one that does.


Of magma or plasma, whichever is most convenient.
Apparently the win 12 rumours were just a hoax. Even Microslop isn’t that out of touch (at this point in time).


It’s because those words were sponsored by oil lobbyists and their offshoots. So much of the economy is based on it that it might actually be accurate (not that I think it should be perpetuated even if it would be painful to truly move on from oil).


Ah, that’s efficiency of use, and it depends more on how familiar you are with the software, as well as on the design and the task. Editing an image or video, for example, is going to be a lot easier with a GUI than a command-line interface (other than generating slop, I guess).
When people talk about how efficient software is, it’s usually referring more to the amount of resources it uses (including time) to run its processes.
E.g. an Electron app is running a browser that is manipulating and rendering HTML elements driven by JavaScript (or other scripted/semi-compiled code). There’s an interpreter that has to process that code to do the manipulation, and then an HTML renderer to turn the result into an image on the screen. The interpreter and renderer run as machine code on the CPU, interacting with the window manager and the kernel.
A native app skips the interpreter and HTML renderer: it runs as machine code on the CPU itself and interacts with the window manager and kernel directly. That saves a bunch of memory, since there’s no intermediate HTML state to store, and time, by cutting out the interpreting and HTML rendering steps.
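As a toy illustration of the interpreter-overhead point (nothing like a real Electron-vs-native benchmark, just the general idea): compare calling a function directly with re-interpreting its source text on every call.

```python
import timeit

def add(a, b):
    return a + b

# Direct call: the work is already compiled down to bytecode and just runs.
direct = timeit.timeit(lambda: add(2, 3), number=50_000)

# Interpreted path: the source string has to be parsed and evaluated
# on every single call before the same work can happen.
interpreted = timeit.timeit(lambda: eval("add(2, 3)"), number=50_000)

print(f"direct: {direct:.4f}s, interpreted: {interpreted:.4f}s")
```

On any machine, the `eval` version comes out several times slower, and that gap is the same kind of tax an extra interpretation layer charges on everything an app does.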


Can you elaborate on that? I disagree but would like to understand why you think that. Maybe you’re referring to something I wouldn’t disagree with.


Yeah, for things that will likely be used, caching is good. I just have a problem with the “memory is free, so find more stuff to cache to fill it” and “we have gigabytes of RAM, so it doesn’t matter how memory-efficient any program I write is” attitudes.


I don’t want my PC wasting resources trying to guess every possible next action I might take. Even I don’t know for sure what games I’ll play tonight.


Inb4 “uNusEd RAm iS wAStEd RaM!”
No, unused RAM keeps my PC running fast. I remember the days when accidentally hitting the Windows key while in a game meant waiting a minute for it to swap the desktop’s pages in, only to have it swap the game’s pages right back when you immediately clicked back into it, all while expecting it to either crash your computer or disconnect you from whatever server you were on. Fuck that shit.
I’ve thought for a while that Tolkien was a great world builder but a meh storyteller. His big thing was breaking that new ground. Not that I would do any better, but many other authors since have.
Rowling doesn’t even have the excuse of breaking new ground. I don’t get why her shit got so popular in the first place; I lost interest when the first movie was basically going down a list of pre-Tolkien fantasy tropes like it was a checklist.