@neil Seems like an interesting idea... I'm in!
@neil Count me in. Haven't got around to implementing human.json myself yet, but I like the idea. I used to like XFN too.
Edit: I use Actual Idiot to produce my blog.
@neil Excellent work, Neil, thank you. It's on my todo list :)
@neil https://michal.sapka.pl/human.json I'll add you to mine later on ;-)
@neil
I'm in.
Three out of four....
@neil I think I qualify, if you consider "I know you" as a pass
@neil then I'm happy to add you to my list, if you'll have me.
edit: done
@neil I am in!
@neil That sounds like me and @Slash909uk , do please tag us here:
https://www.ashbysoft.com
Time to sort out something similar the other way, methinks!
@neil define "know"...
@neil
@neil We've met in person very briefly at a SCLUG meeting. Does that count for point one?
@neil And you're still willing to talk to me? Gosh.
@neil Hi. My blog is https://diziet.dreamwidth.org/ and I don't and won't use "AI". Please add me to your list :-)
@neil I love this idea :) it's now on my website to-do list!
I'm not sure how stringent you wish to be on point 1, but I'm happy to be included if you know me well enough. writing.straysong.co.uk
Is it human.json or humans.json ? Your link doesn't have the s.
@neil Yay humans! I finally got around to adding mine today.
@neil Hello! My stuff is on blog.jsbarretto.com. I don't post a huge amount, and it's often eclectic, it's all very sincerely human, mistakes and all.
@neil Is this like PGP for humanity? Are we going to exchange public keys in "meatings"?
@neil I'm not sure what the threshold is for the first criterion, but smathermather.com is my blog, and you're welcome to add it / I'd be honored if you add it, if you feel so inclined.
@neil what a great idea!
@neil I get uneasy seeing ALL_CAPS variables in shell scripts. The reason built-in shell variables with special behavior like PATH or RANDOM or PS1 or TIME are ALL SHOUTY is so they don't clash with regular variables used in scripts. Making your own variables uppercase defeats the whole point.
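The convention described above can be shown in a minimal sketch (the variable names here are illustrative, not from any particular script):

```shell
#!/bin/sh
# Special shell and environment variables (PATH, HOME, PS1, RANDOM) are
# uppercase by convention. Keeping your own variables lowercase guarantees
# they can never shadow one of them.

backup_dir="/tmp/backups"  # safe: lowercase, clashes with nothing special
path="$backup_dir/today"   # safe: lowercase "path" is distinct from PATH

# By contrast, overwriting PATH by accident would break every subsequent
# command lookup in the script:
# PATH="$backup_dir"       # after this, ls, cp, grep etc. are no longer found

echo "$path"
```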
@neil Well I guess I qualify depending on your definition of "know"
@neil I meet all but the first requirement but I shall follow you in the interest of us becoming acquainted.
@neil that's pretty cool
@neil I only Fedi-know you, but otherwise I meet the criteria! My site is 110% human so feel free to add me. No worries if not though!
@neil
I'd be happy to be included, it might give me some incentive because I really need to update my blog. https://pureandapplied.com.au
@neil My blog is at https://blog.liw.fi/ and I'd like to be in your human.json if I may.
@neil I rarely update anymore, because, you know, the Sickness, but my proper blog lives here: https://serenitywomble.wordpress.com/
@neil We're not mutuals, but: https://blog.peter-b.co.uk/
@neil Does Patreon count? Mine is here https://www.patreon.com/cw/rubyjones
I also have a one on Medium: https://medium.com/@QueenRubyJones
I'm sure it's well intentioned, but at this stage, do we really want to give the robots more guidance on what kind of performance to emulate and which identities to appropriate? I'd be inclined to fill such a file with bot references instead.
@neil
@neil shouldn't being a human be part of that list :-) kind of implied I guess.
@neil Do you know me? All the other bits are true (janeishly.com) and I feel I at least know that you exist on Mastodon!
@osma @neil I agree. We should be approaching this pragmatically:
The most critical task is for the open internet to survive, and we can't do that if nobody can afford to run servers anymore, because they die under the staggering load of AI company crawlers.
We need community-run detection, databases and block lists first.
Big Tech stopped playing nice and embraced fascist techniques a long time ago.
@neil posts approximately annually: https://drt24.user.srcf.net/blog/
My concern is that the .json is pointing bots towards content not meant for the bots. That seems counterproductive. If blocking the bots was effective, I'd have less concern, but at this time, it is not.
@neil @gimulnautti
Sure. That wasn't my argument, so I won't defend it either.
@neil @gimulnautti
@neil This is a neat idea. Need to look at getting it added to my site at some point.
@neil https://brunty.me/human.json added you to mine, I don't know you personally but I feel through all you've done in the online space you're legit
@neil thank you. I shall reciprocate once I get back to a desktop, with your permission of course
@neil To those of us who remember the PGP web of trust this seems hopelessly optimistic. Still, if I ever make it to a TVLUG…
@neil I don't know if you know me but, if you do, hi! My blog is blog.allpurposem.at
I do not use generative "AI" in any way for anything in my life, and I don't know what substack is.
I'd like to add a human.json to my sites as well, that's a great idea!
@neil Feels a bit like you are singling out interesting human-generated web sites for AI bots to concentrate their harvesting on.
@neil how do I get this on my blog? I'm not very techy and my blog is built using WordPress (which has AI features that I do not use). I saw your GitHub page, I just need a pointer to where I can find instructions to use it on my blog... Thanks!
@neil I only fail your first criterion. Any way to remedy that?
@neil@mastodon.neilzone.co.uk is that sort of thing the same sort of thing as this sort of thing?
https://xmlns.com/foaf/spec/
@neil I like this idea. Please feel free to add me and I'll add you to mine when I have one.
@neil Thank you! I have the Heir to the Overdraft visiting for a few days but hopefully I'll get mine sorted out in the next few days.
@neil that's so cool that's a thing, hopefully it becomes more of a standard across the indie web
@neil If you feel like you know me sufficiently I would be delighted to be included! https://spod.cx
@neil I think I would qualify. Haven't had time to put up a humans.json file yet, but it's on my list.
@neil I have vouched for you: https://sam.pikesley.org/human.json
@neil That URL returns 404 for me.
I don't know whether you feel you know me but https://blog.firedrake.org/ now has a human.json file.
@neil Mutual, thanks. I imagine I'll have to sort out the CORS stuff, but that's a job for Later Roger. (I hate that guy.)
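For anyone facing the same CORS chore: if the file needs to be readable by scripts on other sites, the server has to send an `Access-Control-Allow-Origin` header. A minimal sketch, assuming the site is served by nginx and the file is meant to be world-readable:

```nginx
# Assumption: nginx serves the site and human.json lives at the web root.
# Allow any origin to fetch the file cross-origin.
location = /human.json {
    add_header Access-Control-Allow-Origin "*";
}
```

Other servers (Apache, Caddy, static hosts) have equivalent per-path header settings; the key is that the header must be present on the response for the file itself.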
@neil Bit late to the party, because I had to actually get the blog up (something I've been trying to get round to for _months_!). But it is created entirely without AI:
It should also have a full RSS feed - I'd be grateful for feedback on whether that actually works (outside my own testing) :)
@neil Wow, that was quick - thanks so much for checking! I'll have to think about how I can fix that issue, as I suspect it will take some plumbing to do it cleanly...