pleroma.debian.social

Right. Now that the crawlers are dealt with, next order of business: try and classify some legit robots, so that my "unclassified" view has less clutter.

Although... if I just filter for "Mozilla" in the user-agent, a lot of the legit bots are gone. Maybe I'll do this some other time.
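For the record, the kind of one-off filter I have in mind is roughly the sketch below. It assumes a standard nginx/Apache "combined" log format where the user-agent is the last quoted field; the log path and the "no Mozilla means robot" rule are placeholders, not my actual setup.

```python
# Rough sketch: bucket user-agents that don't claim to be Mozilla,
# on the theory that most legit crawlers don't pretend to be a browser.
import re
from collections import Counter

LOG = "/var/log/nginx/access.log"  # hypothetical path
ua_re = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user-agent

bots = Counter()
for line in open(LOG, encoding="utf-8", errors="replace"):
    m = ua_re.search(line)
    if not m:
        continue
    ua = m.group(1)
    if "Mozilla" not in ua:
        bots[ua] += 1

for ua, n in bots.most_common(20):
    print(f"{n:6d}  {ua}")
```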

Oh, there are still some suspicious ones. This time, from Tencent. But they're tiny, and they're mostly crawling garbage stuff anyway.

So I'm going to leave these for a while, for real. I'll dig into logs again if I see the load on my server increase, or if the bad visitors fall below 90% of my visitors.

Looking at recent logs, the sneaky ones are operating at 1 request every 15 minutes.
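(For the curious: that rate estimate comes from something like the sketch below, assuming combined-format logs with the usual `[10/Oct/2025:13:55:36 +0000]` timestamps. The path, the 10-request minimum and the 10-minute threshold are made up for illustration.)

```python
# Rough sketch: average seconds between requests per client IP,
# flagging the slow-and-steady crawlers.
import re
from collections import defaultdict
from datetime import datetime

LOG = "/var/log/nginx/access.log"  # hypothetical path
line_re = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\]')

times = defaultdict(list)
for line in open(LOG, encoding="utf-8", errors="replace"):
    m = line_re.match(line)
    if not m:
        continue
    ip, ts = m.groups()
    times[ip].append(datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z"))

for ip, stamps in times.items():
    if len(stamps) < 10:
        continue
    stamps.sort()
    avg = (stamps[-1] - stamps[0]).total_seconds() / (len(stamps) - 1)
    if avg > 600:  # ~900s would be the "one request every 15 minutes" crowd
        print(f"{ip}: {len(stamps)} requests, one every {avg/60:.1f} min")
```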

I am so not going to care about that.

@algernon congratulations. I need to catch up on your work and put mitigations in place on my VPS at some point. A little niche MediaWiki instance is now shifting 3 GiB/day.