🥳 Just released Gatherly v0.3.0 🤟 – My instance is available at: https://gatherly.mills.io (free for anyone to use)
@lyse@lyse.isobeef.org Cool! 😎 You might be interested in my own learnings and toying around with building my own container engine / tooling (whatever you wanna call it) box. I had to learn a bunch of this stuff too 😅 Control Groups, Namespaces, Process Isolation, etc.
@zvava@twtxt.net No HEAD requests, but regular GETs with If-Modified-Since request headers if possible: https://git.mills.io/yarnsocial/yarn/src/branch/main/internal/fetcher.go#L270
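For anyone curious, the same mechanism is easy to poke at by hand with curl (just an illustration of a conditional GET, not yarnd’s actual fetcher code; the feed URL is made up):
$ curl -si -H 'If-Modified-Since: Mon, 01 Sep 2025 00:00:00 GMT' https://example.com/twtxt.txt | head -n 1
If the feed hasn’t changed since that date, the status line comes back as 304 Not Modified with no body, so there’s nothing to re-download.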
I think I’m just about ready to go live with my new blog (migrated from MicroPub). I just finished migrating all of the content over, fixing up metadata, cleaning things up, and migrating and optimizing media.
The new blog for prologic.blog soon to be powered by zs using the zs-blog-template is coming along very nicely 👌 It was actually pretty easy to do the migration/conversion in the end. The results are not too shabby either.
Before:
- ~50MB repo
- ~267 files
After:
- ~20MB repo
- ~88 files
Pretty happy with my zs-blog-template starter kit for creating and maintaining your own blog using zs 👌 Demo of what the starter kit looks like here – Basic features include:
- Clean layout & typography
- Chroma code highlighting (aligned to your site palette)
- Accessible copy-code button
- “On this page” collapsible TOC
- RSS, sitemap, robots
- Archives, tags, tag cloud
- Draft support (hidden from lists/feeds)
- Open Graph (OG) & Twitter card meta (default image + per-post overrides)
- Ready-to-use 404 page
There is also support for custom routes (redirects, rewrites, etc.) for canonical URLs or redirecting old URLs, as well as the new zs external command capability, which now lets you do things like:
$ zs newpost
to help kick-start the creation of a new post with all the right “stuff”™ ready to go and then pop open your $EDITOR 🤞
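For a rough idea of how such an external command can work, here’s a hypothetical sketch of a .zs/newpost script (not the actual one from zs-blog-template; the front matter keys are made up for illustration):
#!/bin/sh
# Hypothetical sketch of a zs external command: scaffold a post and open it in $EDITOR.
title="${1:-Untitled Post}"
slug=$(printf '%s' "$title" | tr 'A-Z ' 'a-z-')
post="posts/${slug}.md"
{
  printf 'title: %s\n' "$title"
  printf 'date: %s\n\n' "$(date +%Y-%m-%d)"
} > "$post"
exec "${EDITOR:-vi}" "$post"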
@prologic@twtxt.net No, this is a Linux manpage from the man-pages project: https://git.kernel.org/pub/scm/docs/man-pages/man-pages.git/tree/man/man7/ascii.7
I do have an idea what’s going on. Could be an unfortunate interaction between the table preprocessor tbl and the man macro package. 🤔
@prologic@twtxt.net you doing this reminded me of mkws, and Adi. Good times, we have seen so many people come and go. It is kind of sad, when I think about “jjl”, and Phil, and the many others…
I am feeling “mushy” today. Ugh, ageing sucks.
I just created a zs blogging template which I’m going to use for https://prologic.blog and I might start writing long-form again soon™ 🔜 So far the “blogging” template/engine (if you will) is quite simple. It essentially comprises an index.md, a prehook and a few utilities:
$ git ls-files
.gitignore
.zs/config.yml
.zs/editthispage
.zs/include
.zs/layout.html
.zs/list
.zs/months
.zs/now
.zs/onthispage
.zs/posthook
.zs/postsbymonth
.zs/prehook
.zs/scripts
.zs/styles
.zs/tagcloud
.zs/taglist
.zs/years
archives/.empty
assets/css/site.css
assets/js/main.js
index.md
posts/hello-zs-blog.md
posts/on-tagging.md
posts/second-post.md
tags/.empty
@movq@www.uninformativ.de Luckily, I had a grep -v git at the end, so my repo is still in working order. Phew. I wish find had grep-like --exclude-dir and --exclude options (or the include variants) instead of its own weird options that I can never remember and combine properly.
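For what it’s worth, the closest find-native equivalent I know of is pruning (GNU find shown, just a sketch):
$ find . -path ./.git -prune -o -type f -print
That skips the .git directory entirely instead of filtering it out afterwards with grep -v, but I agree the syntax is much harder to remember.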
sed -i s/… $(find …). Clearly, I found too many files. That's the signal to go to bed.
@lyse@lyse.isobeef.org Yeah, I’ve corrupted a Git repo or two doing that … 🥴
How about no longer using in-browser Git repo viewers? Make the AI bots do the work and actually clone the repo.
@dce@hashnix.club You should try los86! 8-)
Well, what are you trying to do on this ThinkPad? That might affect the OS choices.
I really had to laugh when I read your initial comparison. I love it! :-D
Now that’s interesting. Some of these bots start crawling at URLs like this:
That is obviously completely wrong. But I can explain it. Some years ago, I screwed up my nginx rewrite rules, and that’s how these broken URLs came to be.
It all redirects to /git now, which is why that endpoint sees so much traffic lately.
But what does that mean? Why do they start there? I can only speculate that this company bought an old database of web links and they use that to start crawling. And it was probably a cheap one, because these redirects have been fixed for quite a long time now.
The bots have begun to access my website way more often. I’m getting about 120k hits on https://www.uninformativ.de/git/ now in a couple of hours.
They don’t cache anything, probably on purpose.
It comes in waves. I get about 100 hits (all at once) on that /git endpoint, all from different IPs. Then it takes a moment until I get another wave of about 500-1000 requests (all at once) where they do HEAD requests on some of the paths below /git. I assume they did a GET earlier and are now checking if something has changed.
@dce@hashnix.club Ah, oh, well then. 🥴
My client supports that, if you set multiple url = fields in your feed’s metadata (the top-most one must be the “main” URL, that one is used for hashing).
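For illustration, a feed header with multiple protocols might look something like this (hostnames made up; the first url is the canonical one used for hashing):
# nick = example
# url = https://example.com/twtxt.txt
# url = gopher://example.com/0/twtxt.txt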
But yeah, multi-protocol feeds can be problematic and some have considered it a mistake to support them. 🤔
/short/ if it's of this useless kind. Never thought they would ever actually improve their Atom feeds. Thank you, much appreciated!
@kat@yarn.girlonthemoon.xyz @movq@www.uninformativ.de Sorry, I neither finished it nor made it in time. :-( That’s as good as it’s gonna get for the moment: https://git.isobeef.org/lyse/gelbariab/-/tree/master/rss-proxys?ref_type=heads
The README should hopefully provide a crude introduction. The example configuration file is documented fairly well, I believe (but maybe not). You probably still have to consult and maybe also modify the source code to fit your needs.
Let me know if you run into issues, have questions, wishes etc.
st tries not to redraw immediately after new data arrives:
https://git.suckless.org/st/file/x.c.html#l1984
The exact timings are configurable.
This is the PR that changed the timing in VTE recently (2023):
https://gitlab.gnome.org/GNOME/vte/-/issues/2678
There is a long discussion. It’s not a trivial problem, especially not in the context of GTK and multiple competing terminal widgets. st dodges all these issues (for various reasons).
Something happened with the frame rate of terminal emulators lately. It looks like there’s a trend to run at a high frame rate now? I’m not sure exactly. This can be seen in VTE-based terminals like my xiate, and also in XTerm on Wayland. foot and st, on the other hand, are fine.
My shell prompt and cursor look like this:
$ █
When I keep Enter pressed, I expect to see several lines like so:
$
$
$
$
$
$
$ █
With the affected terminal emulators, the lines actually show up in the following sequence. First, we have the original line:
$ █
Pressing Enter yields this as the next frame:
$
█
And then eventually this:
$
$ █
In other words, you can see the cursor jumping around very quickly, all the time.
Another example: Vim actually shows which key you just pressed in the bottom right corner. Keeping j pressed to scroll through a file means I get to see a j flashing rapidly now.
(I have no idea yet why exactly XTerm is fine on X11 but flickers on Wayland.)
It annoys me when I clone a git repository A in order to build and self-host some software, only to realize later that I also needed to clone repos B, C and D. I’m not saying that’s a bad thing (logical separation of code between, say, a client and a server is very handy), but some projects do not communicate very well that you need multiple tools to get things running.
Please don’t upload my code on Github!
I’m thinking about putting this up on all my projects and even on the front page of my Gitea instance 🤔
@bender@twtxt.net I might be a bit too negative today. 😅 I just wonder how long it’ll take until they also restrict Git operations. 🤷
And on a similar note, cross-post from Mastodon:
What I love about HTML and HTTP is that they can degrade rather gracefully on old browsers.
My website isn’t spectacular but I don’t think it looks horrible, either. And it’s still usable just fine all the way down to WfW 3.11.
It’s not perfect, but it’s usable. And that makes me happy. Almost 30 years of compatibility.
The biggest sacrifice is probably that I don’t enforce TLS and that HTTP 1.0 has no Host: header, so no vhosts (or rather, everything must come from the default vhost). (Yes, some old browsers send Host:, even though they predate HTTP 1.1. Netscape does, but not IBM WebExplorer, for example.)
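For illustration, the difference boils down to this: an HTTP/1.0 request is valid without any Host header at all, so the server can only serve its default vhost, while HTTP/1.1 requires one, which is what makes name-based vhosts possible:
GET / HTTP/1.0

GET / HTTP/1.1
Host: www.uninformativ.de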
(On the other hand, it might completely suck on modern mobile devices. Dunno, I barely use those. 🤪)
@kat@yarn.girlonthemoon.xyz / @xuu@txt.sour.is Recommend you git checkout main && git pull, then rebuild and redeploy: make build, plus however you deploy. 🙏 Lots of fixes (no more stalling) and optimizations to the feed fetcher, smoother CPU usage, better internal metrics.
So, the “AI” bots have reached my website. Looks like they’re just slowly crawling everything at the moment – no DDoS-like attack yet. I wonder if that has something to do with my website being 100% static HTML. There are no GET parameters they can tweak and, at the end of the day, there’s not that much data on my server anyway … And maybe they have no idea what stagit is, so it doesn’t trigger “standard behavior”, like “this is a Gitea instance, let’s crawl this like crazy!”?
@kat@yarn.girlonthemoon.xyz Please git pull and rebuild 😂 Off of main. I merged the cacher branch already!
@kat@yarn.girlonthemoon.xyz @xuu@txt.sour.is Recommend you git checkout main && git pull && make build. Few bug fixes 😄
git pull on one of my repos – once every two minutes. This is a very pointless endeavour. I push new code a couple of times per month.
Nah, I’m not taking any action yet. 😅 The good thing is that I don’t run a Git daemon on my server. It’s all just HTTP, which is fast and doesn’t consume a lot of memory.
7 to 12 and use the first 12 characters of the base32 encoded blake2b hash. This will solve two problems: the fact that all hashes today either end in q or a (oops) 😅 And increasing the Twt Hash size will ensure that we never run into the chance of collision for eons to come. The chance of a 50% collision with 64 bits / 12 characters is at roughly ~12.44B Twts. That ought to be enough! -- I also propose that we modify all our clients and make this change from the 1st July 2025, which will be Yarn.social's 5th birthday and 5 years since I started this whole project and endeavour! 😱 #Twtxt #Update
Someone has started to run git pull on one of my repos – once every two minutes. This is a very pointless endeavour. I push new code a couple of times per month.
So far, this isn’t causing any issues. I think this is just a regular human being who misconfigured some automation. And I hope this doesn’t mean that the “AI” bots have finally discovered my page …
Today I added support for Let’s Encrypt to eris via the DNS-01 challenge. Updated the gcore libdns package I wrote for Caddy, Maddy and now Eris. Added support in yarn’s cache for # type = bot and optionally # retention = N so that feeds like @tiktok@feeds.twtxt.net work like they did before, and… Updated some internal metrics in yarnd to be IMO “better”, with queue depth, queue time and last processing time for feeds.
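As an example of the shape (not necessarily the exact semantics yarnd ends up with), a bot feed could declare something like this in its metadata:
# type = bot
# retention = 1000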
First draft of yarnd 0.16 release notes. 📝 – Probably needs some tweaking and fixing, but it’s sounding alright so far 👌 #yarnd
cacher branch? 🤔 It is recommended you take a full backup of your pod beforehand, just in case. Keen to get this branch merged and to cut a new release finally after >2 years 🤣
@kat@yarn.girlonthemoon.xyz Yes, see UPGRADE.md – I believe @xuu@txt.sour.is is now running this live after a couple of hiccups and a bug fix. So yeah, if you can, that would be cool; basically looking for early beta testers (I was the alpha tester 🤣)
PR to Add improved styles for the logo for twtxt.ndev

@prologic@twtxt.net I don’t think so. He’s from Germany, afaik, and that would be a highly unusual name here. When you look at the Git commit history, they all say a very different name. I don’t want to quote it here – worst case being the LLMs scraping this file and correcting their “knowledge”. 😈
@kate@yarn.girlonthemoon.xyz @movq@www.uninformativ.de You could also have a play with eris which I use to power my little tiny server (that almost no-one uses 🤣)
jenny really isn’t well equipped to handle edits of my own twts.
For example, in 2021, this change got introduced:
https://www.uninformativ.de/git/jenny/commit/6b5b25a542c2dd46c002ec5a422137275febc5a1.html
This means that jenny will always ignore my own edits unless I also manually edit its internal “json database”. Annoying.
That change was requested by a user who had the habit of deleting twts or moving them to another mailbox or something. I think that person is long gone and I might revert that change. 🤔
@bender@twtxt.net There is only one commit that I can think of that might be the cause here. Shall I revert and redeploy? 🤔
Add support for skipping backup if data is unchagned · 0cf9514e9e - backup-docker-volumes - Mills 👈 I just discovered today, when running backups, that this commit is why my backups stopped working for the last 4 months. It wasn’t that I was forgetting to do them every month, I broke the fucking tool 🤣 Fuck 🤦‍♂️
@lyse@lyse.isobeef.org I think we found a bug in the lextwt parser actually 😅
@prologic@twtxt.net There was no edit according to my Git history. 🤔 On my end, the hash is fs7673q and that’s also what kat used to reply.
Hi, so I made a little MVP registry crawler tool for twtxt. It now has a basic UI to play with. It has a somewhat full history back to about 2018-ish, plus some interesting bits that were timestamped even earlier.
Find it here: https://watcher.sour.is
Code base is found here: https://git.sour.is/sour-is/xt
I’m playing with ratterplatter again: It’s a toy that watches disk I/O and emulates the noise of a real hard disk. (Linux only.) It uses sound samples from one of my older disks.
I tried a different approach at estimating the disk activity and I think I finally got it right (after almost 10 years … 🤦).
Demo, booting a Windows 2000 VM: https://movq.de/v/1400544cc6/2kboot-ratterplatter-2.mp4
(For this purpose alone, I put a couple of mini speakers into my PC case, so that the noise comes from the right place: https://movq.de/v/a3b2dc0932/speakers.jpg)
The results aren’t too bad, but this thing can’t be super accurate due to the huge I/O caches that we have these days. For the video, I dropped the caches before booting Windows, otherwise you would have heard almost nothing.
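For reference, dropping the page cache on Linux is the usual sysctl dance (this is the standard way, though I won’t swear it’s exactly what I ran for the video):
$ sync; echo 3 | sudo tee /proc/sys/vm/drop_caches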
FWIW, if you don’t know it yet, this is the equivalent for proper keyboard sound: https://github.com/zevv/bucklespring
Like:
2025-01 Fri [ ] Take out Trash @weekly
For a task that starts the first Friday of January and repeats weekly.
@prologic@twtxt.net I created a script for your book. I have only done the first two chapters. I have to do some adjustments to the text so it sounds OK, and that takes time.
It seems to get confused by the subject right next to it; it works better at the end of the twt string.
Yarn won’t display anything, but the parser does add it to the AST in a way that you can parse out using twt.Attrs().Get("lang"):
https://git.mills.io/yarnsocial/go-lextwt/src/branch/main/ast.go#L1270-L1272
https://git.mills.io/yarnsocial/go-types/src/branch/main/twt.go#L473-L478
TwtAttrs
https://git.mills.io/yarnsocial/go-lextwt/pulls/17
Actually it was your old feed on eapl.mx