

The day it happens, we’ll just have to defederate them, which other social networks can’t do.
Thx, but I’ll stay with alternativeto.net. At least that one is not an ad in a trenchcoat.
Fair point. I do agree with the “click to execute challenge” approach.
As for the terminal browser, that has more to do with it not respecting web standards than with Anubis not working on it.
As for old hardware, I agree that a time delay could be a good idea, if it weren’t so easy to circumvent. In that case bots would just wait in the background and resume once the timer runs out, which would vastly decrease Anubis’s effectiveness, since waiting costs them almost no power. There isn’t really much that can be done here.
As for the CUDA solution, that will depend on the implemented hash algorithm. Some of them (like the one used by Monero) are designed to be vastly less efficient on a GPU than on a CPU. Moreover, GPU servers are far more expensive to run than CPU ones, so the result would be the same: crawling would be more expensive.
In any case, the best solution would be by far to make respecting robots.txt a legal requirement, but for now legislators prefer to look the other way.
Whether they solve it or not doesn’t change the fact that they have to use more resources for crawling, which is the objective here. And by contrast, the website sees a lot less load than before it used Anubis. Either way, I see it as a win.
But despite that, it has its detractors, like any solution that becomes popular.
But let’s be honest, what are the arguments against it?
It takes a bit longer to access the first time? Sure, but it’s not like you have to click anything or type anything.
It executes foreign code on your machine? Literally 90% of the web does these days. Just disable JavaScript and see how many websites are still functional. I’d be surprised if even a handful are.
The only ones who gain anything from the absence of Anubis are web crawlers, be they AI bots, indexing bots, or script kiddies looking for a vulnerable target.
Anubis is not a challenge like a captcha. Anubis is a resource waster, forcing crawlers to solve a crypto challenge (basically like mining Bitcoin) before being allowed in. That’s how it defends so well against bots: they don’t want to waste their resources on needless computing, so they just cancel the page load before it even happens and go crawl elsewhere.
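The idea is plain proof-of-work: the server hands out a random seed, the client burns CPU until it finds a nonce whose hash meets a difficulty target, and the server verifies the answer with a single hash. A minimal sketch in Python (function names and the hex-prefix target are mine, not Anubis’s actual protocol):

```python
import hashlib
import secrets

def solve_challenge(seed: str, difficulty: int) -> int:
    """Client side: grind nonces until sha256(seed + nonce) starts
    with `difficulty` zero hex digits. This is the wasted work."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(seed: str, nonce: int, difficulty: int) -> bool:
    """Server side: checking an answer costs exactly one hash."""
    digest = hashlib.sha256(f"{seed}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

seed = secrets.token_hex(8)          # per-visitor challenge from the server
nonce = solve_challenge(seed, 4)     # thousands of hashes for the visitor
assert verify(seed, nonce, 4)        # one hash for the server
```

The asymmetry is the whole point: solving is cheap enough for one human visit, but multiplied by millions of crawled pages it becomes a real bill.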
Yeah, that shit that bores you to death. Who needs it anyway? Ignorance is strength.
/s
Yeah, but where is the manual to read the fucking manual?
A domain without a TLD (me@home) is a valid address. I once saw an email server being used as an MQTT-like broker this way (it is very old and predates that software).
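As a quick sanity check, Python’s standard library address parser accepts it without complaint (a minimal sketch; `me@home` is just the example address from above):

```python
from email.headerregistry import Address

# RFC 5321 does not require a dot in the domain part, so a bare
# hostname like "home" is a syntactically valid domain.
addr = Address(addr_spec="me@home")
print(addr.username, addr.domain)
```

An address with an outright invalid spec (say, a space in the domain) would raise instead, so this isn’t the parser being silently permissive.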
I did try to work for an open-source company, but strangely none of them accepted .NET as valid experience. So I had to either find an entry-level Java position and cut my paycheck in half, or keep working where I am while changing things from the inside.
I’ve already managed to introduce some open-source tools here and there (we now use DBeaver instead of SSMS, Insomnia instead of Postman, among others), and I intend to continue for as long as I can.
As for the appointment, in about 70 years, according to current life expectancy.
You are talking about me, aren’t you?
If so, no, I don’t work for Mistral at all, but I do work for a company selling M$ products to businesses. You know, to pay rent, food, things like that.
But M$ requires us to be certified to get prospects from them, and as such we are encouraged to do at least all the basic certifications relevant to our field, which includes AI, Azure, C#, and the like.
That’s why I knew that using the Shavian alphabet is mostly useless, as even a basic free AI can mostly decipher it. If a free one can, I’ll leave to your imagination what a more advanced one can do.
Now why did I use Mistral? Simply because it happened to be installed on my phone for testing purposes. I rarely use it, but I have to admit it is useful in specific scenarios. But once I can install a hardware-accelerated local AI on my phone, Mistral can eat shit.
Basically, a blog/website platform, similar to WordPress, but without the drama.
Having worked with both, SAP is by far the worst of the two.
But Sage is in another league. Want an API? Sure, here you go. Oh, you want it to do something useful? I’m afraid we can’t do that.
It’s so bad their clients ask actual third parties to create custom APIs just to be able to actually do something.
If you’re lucky you’ll have a good third party; if you’re not, you’ll be like me, trying to get something done with no docs and API data points that make no sense unless you have said missing docs.
Those fuckers can’t even settle on a format for their IDs. Sometimes it’s a string of length 7, sometimes 13, sometimes an int.
As someone who escaped from that hell, I pity you.
Then I remember that I’ve since experienced far worse, and Dynamics seems not that bad in retrospect.
Is it still asking for the root password everywhere?
You just had to wait 2 more hours for that.
My dad has been using LibreOffice for years and still calls it OpenOffice 😂.
You mean OwnCloud, don’t you?
You don’t need 4 drives for redundancy; 2 are enough. You only need a minimum of 4 for RAIDZ2 or RAID 10.
My setup uses a pair of SSDs in a mirror for apps, and a 5-disk RAIDZ2 setup (4 data disks + a hot spare) as the main data backbone.
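For reference, a layout like that can be declared in ZFS roughly like this (a sketch only: the pool and device names are placeholders, not my actual disks, and this obviously needs root and real drives):

```shell
# Mirrored pair of SSDs for apps -- 2 drives is enough for redundancy
zpool create apps mirror /dev/sda /dev/sdb

# Main data pool: 4-disk raidz2 vdev plus one hot spare (5 disks total)
zpool create tank raidz2 /dev/sdc /dev/sdd /dev/sde /dev/sdf \
    spare /dev/sdg
```

In practice you’d use `/dev/disk/by-id/` paths rather than `sdX` names, so the pool survives devices being reordered at boot.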
On the other hand, algae do not produce shade, I’m not sure they filter atmospheric pollutants, and trees provide all sorts of other services to the local ecosystem.
Maybe this invention can be used in places where trees cannot live, but I’d still take a city with trees over a city full of green tanks.
I once deleted /dev/urandom. I didn’t want uncertainty in my life.
Well, I was in for a surprise.