Mastodon: @misk@lewacki.space
Lemmy: @misk@sopuli.xyz

Opinions exclusively of my own and of voices in my head.

Autism, communism, arthritism, cannabism.

  • 6 Posts
  • 11 Comments
Joined 7 months ago
Cake day: March 15th, 2025



  • I haven’t had a computer running Windows in my home for a decade or so, but I get exposure to it through working at large corpos. Frankly, LTSC + a proper policy set by administrators is okay for day-to-day work. It is kind of annoying and decaying in terms of usability, but the core experience hasn’t changed that much. My partner works at a company that doesn’t use LTSC and that’s a big oof - unwanted features get shoved in your face all the time, breaking basic functionality like search etc. I can’t even imagine what it looks like in a regular consumer version.




  • If you’re a publisher only, then RSS/Atom makes much more sense, especially if you want to retain ownership and control over the content you created. If control is not important to you and you want to incorporate a social aspect into your website (comments, reactions), then ActivityPub seems like a convenient way to go about it.

    One thing I strongly disagree with the author of the blog post on is the discoverability aspect. Yes, initial federation kind of sucks, but at the same time your readers on Mastodon will be able to boost your posts and increase your reach.
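
    For the publisher-only route, "retaining control" can be as simple as serving a static feed file from your own domain. A minimal RSS 2.0 sketch (all titles and URLs here are placeholders, not from any real site):

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Blog</title>
        <link>https://example.com/</link>
        <description>Posts I host and control myself.</description>
        <item>
          <title>First post</title>
          <link>https://example.com/first-post</link>
          <guid>https://example.com/first-post</guid>
          <pubDate>Sat, 15 Mar 2025 12:00:00 GMT</pubDate>
        </item>
      </channel>
    </rss>
    ```

    Readers subscribe directly to the file; there is no third party between you and them, which is the ownership argument in a nutshell.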


  • misk@piefed.social to Programming@programming.dev · *Permanently Deleted* · edited 1 month ago

    Currently everything on the Internet is assumed to be free. Robots.txt is just a suggestion and not legally enforceable. I assume RSL is supposed to communicate terms of use explicitly, like a EULA.

    Robots.txt is just a suggestion and so is this, because scrapers never cared about the legality of things. All this does is make the license more easily accessible - but then, do we even want to make it easy for them in the first place? Make scrapers work for it.
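
    The "just a suggestion" point is easy to demonstrate: compliance lives entirely on the client side. A minimal sketch using Python's stdlib `urllib.robotparser` (the rules and URLs are made-up placeholders):

    ```python
    from urllib import robotparser

    # A toy robots.txt: disallow one directory for every crawler.
    rp = robotparser.RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /private/",
    ])

    # A *compliant* client asks before fetching...
    allowed = rp.can_fetch("*", "https://example.com/private/page")
    print(allowed)  # False - the rules say no

    # ...but nothing enforces the answer. A scraper that never calls
    # can_fetch() simply fetches the page anyway; the file has no teeth.
    ```

    RSL or any other machine-readable license sits in the same boat: it states terms explicitly, but only clients that choose to read it are bound by anything.
    
    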




  • I’ve worked in business process automation for a bit. In most cases I’ve seen, no/low-code is introduced when IT can no longer service technical debt and won’t deliver any new features within reasonable timeframes (usually the result of decades of underfunding). No „real” developer will ever fix no/low-code solutions either, because why would they? Are they going to fix something in an alien tech stack? Are they going to implement the functionality properly? No, because they never had the resources to do that.