Awk to take lines from Plan 9’s /lib/unicode and prepend the actual glyph and a tab: awk '{cmd=sprintf("unicode %s", $1); cmd | getline c; close(cmd); printf("%s\t%s\n", c, $0)}'
Colored Highlighter - A terminal tool to highlight specific words in your command output with colors
I needed to take a look at some live logs and quickly analyze some old ones, but I couldn’t find anything effective to highlight terms, except for esoteric sed and awk commands.
So I built ch - Colored Highlighter - a simple terminal tool to highlight specific words in your command output with colors. Perfect for tailing logs, debugging, and making command output more readable.
Try it out, all feedback is welcome!
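For comparison, the kind of esoteric awk command the post alludes to looks roughly like this (a minimal sketch; the log file, word and color code are arbitrary examples):
$ tail -f app.log | awk -v w="ERROR" '{ gsub(w, "\033[1;31m" w "\033[0m"); print; fflush() }'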
@lyse@lyse.isobeef.org You might enjoy this one: https://github.com/TheMozg/awk-raycaster
Hmm, gnu.org is slow as heck. Shorter HTML pages load in about ten seconds. This complete AWK manual, all in one large HTML page, took a full minute: https://www.gnu.org/software/gawk/manual/gawk.html Is there maybe some anti-AI shenanigans going on?
In any case, I find the user guide super interesting. My AWK skills are basically non-existent, so I finally decided to change that. This document is incredibly well written and makes it really fun to keep reading and learning. I’m very impressed. So far, I made it to section 1.6, happy to continue.
** Make awk rawk **
A friend online recently replied to something I wrote about awk by saying:
[…] it’s a danged shame [awk] didn’t continue to evolve the way Ruby, Python, PHP have evolved over the decades.
I had exactly this thought while working on my slightly unhinged “let’s see if I can implement a basic scheme using awk by writing an assembler and VM in awk” project, skwak. Which eventually led me to start noodling on how to layer some modern niceties into awk, without breaking awk’s portability.
… ⌘ Read more
Then I purged every invocation of dkv rm ... I ever made from my shell history, to make sure it’s never again so easily accessible there (^R):
$ awk '
/^#/ { ts = $0; next }                 # remember the epoch-timestamp line bash writes before each command
/^dkv rm/ { next }                     # drop every "dkv rm ..." invocation
{ if (ts) print ts; ts = ""; print }   # print the pending timestamp, then the command itself
' ~/.bash_history > ~/.bash_history.tmp && mv ~/.bash_history.tmp ~/.bash_history && history -r
** Styrofoam cups and awk **
I like writing these posts for my website, but I’ve sat down to write this one like 11 times and it either takes on a tone of totally encompassing dread and dystopian navel gazing or I feel like I’m burying my head in the sand and ignoring reality as it happens around me.
…I finished reading Victor LaValle’s The Changeling. It was engaging, and I was interested in where it was going, but I found that where it went wasn’t interesting. The dialogue and prose were lively and contemporary, which is what r … ⌘ Read more
@prologic@twtxt.net Regarding the new way of generating twt-hashes, to me it makes more sense to use tabs as the separator instead of spaces, since then you can just copy/paste a line directly from a twtxt file, which already has a tab between timestamp and message. But tabs might be hard to “type” when you are in a terminal, since it will activate autocomplete…🤔
Another thing: it seems that you suggest we only use the domain in the hash creation and not the full path to the twtxt.txt
$ echo -e "https://example.com 2024-09-29T13:30:00Z Hello World!" | sha256sum - | awk '{ print $1 }' | base64 | head -c 12
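For comparison, a tab-separated variant of the same example (just illustrating the suggestion above; the separator choice is the open question here):
$ echo -e "https://example.com\t2024-09-29T13:30:00Z\tHello World!" | sha256sum - | awk '{ print $1 }' | base64 | head -c 12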
https://ismailefe.org/blog/org-awk-anki/ How to use #awk to turn #orgmode into #anki flash cards
For fun, and to remove extra deps, I rewrote the prx twtxt2atom AWK script as a simple core Perl script: twtfeed
#!/bin/sh
# Validate environment
if ! command -v msgbus > /dev/null; then
printf "missing msgbus command. Use: go install git.mills.io/prologic/msgbus/cmd/msgbus@latest"
exit 1
fi
if ! command -v salty > /dev/null; then
printf "missing salty command. Use: go install go.mills.io/salty/cmd/salty@latest"
exit 1
fi
if ! command -v salty-keygen > /dev/null; then
printf "missing salty-keygen command. Use: go install go.mills.io/salty/cmd/salty-keygen@latest"
exit 1
fi
if [ -z "$SALTY_IDENTITY" ]; then
export SALTY_IDENTITY="$HOME/.config/salty/$USER.key"
fi
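# get_user: pull the user name from the "# user: ..." line in the identity file, falling back to $USER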
get_user () {
user=$(grep user: "$SALTY_IDENTITY" | awk '{print $3}')
if [ -z "$user" ]; then
user="$USER"
fi
echo "$user"
}
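# stream: read a msgbus JSON message from stdin, base64-decode the payload and decrypt it with our salty identity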
stream () {
if [ -z "$SALTY_IDENTITY" ]; then
echo "SALTY_IDENTITY not set"
exit 2
fi
jq -r '.payload' | base64 -d | salty -i "$SALTY_IDENTITY" -d
}
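# lookup: fetch nick@domain's salty discovery document from their /.well-known/salty/ endpoint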
lookup () {
if [ $# -lt 1 ]; then
printf "Usage: %s nick@domain\n" "$(basename "$0")"
exit 1
fi
user="$1"
nick="$(echo "$user" | awk -F@ '{ print $1 }')"
domain="$(echo "$user" | awk -F@ '{ print $2 }')"
curl -qsSL "https://$domain/.well-known/salty/${nick}.json"
}
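# readmsgs: subscribe to a msgbus topic (defaulting to our own user) and feed each message back through this script for decryption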
readmsgs () {
topic="$1"
if [ -z "$topic" ]; then
topic=$(get_user)
fi
export SALTY_IDENTITY="$HOME/.config/salty/$topic.key"
if [ ! -f "$SALTY_IDENTITY" ]; then
echo "identity file missing for user $topic" >&2
exit 1
fi
msgbus sub "$topic" "$0"
}
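# sendmsg: look up the recipient, encrypt the message to their public key and publish it to their msgbus endpoint/topic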
sendmsg () {
if [ $# -lt 2 ]; then
printf "Usage: %s nick@domain.tld <message>\n" "$(basename "$0")"
exit 0
fi
if [ -z "$SALTY_IDENTITY" ]; then
echo "SALTY_IDENTITY not set"
exit 2
fi
user="$1"
message="$2"
salty_json="$(mktemp /tmp/salty.XXXXXX)"
lookup "$user" > "$salty_json"
endpoint="$(jq -r '.endpoint' < "$salty_json")"
topic="$(jq -r '.topic' < "$salty_json")"
key="$(jq -r '.key' < "$salty_json")"
rm "$salty_json"
message="[$(date +%FT%TZ)] <$(get_user)> $message"
echo "$message" \
| salty -i "$SALTY_IDENTITY" -r "$key" \
| msgbus -u "$endpoint" pub "$topic"
}
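# make_user: generate a new salty identity and print the well-known JSON document to publish for discovery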
make_user () {
mkdir -p "$HOME/.config/salty"
if [ $# -lt 1 ]; then
user=$USER
else
user=$1
fi
identity_file="$HOME/.config/salty/$user.key"
if [ -f "$identity_file" ]; then
printf "user key exists!"
exit 1
fi
# Check for msgbus env... could probably fall back to looking for a config file?
if [ -z "$MSGBUS_URI" ]; then
printf "missing MSGBUS_URI in environment\n"
exit 1
fi
salty-keygen -o "$identity_file"
echo "# user: $user" >> "$identity_file"
pubkey=$(grep key: "$identity_file" | awk '{print $4}')
cat <<- EOF
Create this file in your webserver well-known folder. https://hostname.tld/.well-known/salty/$user.json
{
"endpoint": "$MSGBUS_URI",
"topic": "$user",
"key": "$pubkey"
}
EOF
}
# check if streaming: stdout is not a terminal, so assume a message is being piped in and decrypt it
if [ ! -t 1 ]; then
stream
exit 0
fi
# Show Help
if [ $# -lt 1 ]; then
printf "Commands: send read lookup"
exit 0
fi
CMD=$1
shift
case $CMD in
send)
sendmsg "$@"
;;
read)
readmsgs "$@"
;;
lookup)
lookup "$@"
;;
make-user)
make_user "$@"
;;
esac
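A quick usage sketch (the script name and addresses here are placeholders):
$ ./salty-chat make-user alice
$ ./salty-chat lookup bob@example.com
$ ./salty-chat send bob@example.com "hello bob"
$ ./salty-chat read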
https://si3t.ch/Logiciel-libre/awk.html code shell awk
started advent of code 2021 using awk | gemini://compudanzas.net/advent_of_code_2021.gmi
a simple Makefile for forwarding internet traffic to your local machine:
SSH_HOST=https://xuu.me
PRIV_KEY=~/.ssh/id_ed25519
forward:
	LOCAL_PORT=$(HOST_PORT); sh -c "$(shell http --form POST $(SSH_HOST) pub=@$(PRIV_KEY).pub | grep ^ssh | head -1 | awk '{ print "ssh -T -p " $$4 " " $$5 " -R " $$7 " -i $(PRIV_KEY)" }')"
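Presumably invoked with the remote port given on the command line, something like this (HOST_PORT is an assumption; it isn't defined in the snippet):
$ make forward HOST_PORT=8080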
https://box.matto.nl/cgi-with-awk-on-openbsd-httpd.html cgi httpd openbsd
Well, it was not a proper fix, more like a duct-tape mend; the right thing to do is to add a BSD branch and fix the calls to BSD’s awk and fmt so they produce the data in the way the rest of the code expects it. #txtnish #gnu #bsd
@mdom@domgoergen.com: I’m using txtnish on FreeBSD and I had to switch it to gawk (not sure why BSD awk fails) and disable color. Just fyi. I didn’t look into it any further.
GitHub - geophile/osh: Osh (Object SHell) is a command-line and API toolkit combining cluster access, database access, and data slicing and dicing. Sort of like awk and cssh morsels wrapped up in a Python crust. https://github.com/geophile/osh
@nblade@nblade.sdf.org I’ll take a look; it should work with every POSIX-compliant awk, so at least oawk, nawk and gawk should run. What’s the error message? Feel free to add an issue.
Aho, Kernighan, and Weinberger https://archive.org/download/pdfy-MgN0H1joIoDVoIC7/The_AWK_Programming_Language.pdf
@dave@davebucklin.com Which OS are you using? Can you check if you also have this awk problem?
@dave@davebucklin.com Okay, I fixed the awk and xargs problem, but it seems awk on macOS is weird: printf "foo\n" | awk '{gsub(/[[:cntrl:]]/," ");print}' => " f o o "
How to get seconds since the epoch in POSIX without C: PATH=$(getconf PATH) awk 'BEGIN{srand();print srand()}' (srand() with no argument seeds from the current time of day and returns the previous seed, so the second call prints the time captured by the first).
Does cat /proc/fs/xfs/stat | awk '/^ig/{print $1}' == 0 really mean that I’m not using the inode cache?
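Note that in /proc/fs/xfs/stat the first field of each line is the label itself (“ig” here), so a sketch that prints the first actual counter on that line would be:
awk '/^ig/ { print $2 }' /proc/fs/xfs/stat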