Over the past few weeks I’ve been experimenting with deep learning and digging into neural networks and how to evolve them. The thing is, I haven’t gotten very far. I’ve built two different approaches so far, with limited results. The frustrating part is that these things are so “random” it isn’t even funny. I can’t even get a basic ANN + GA to evolve a network that solves the XOR pattern reliably, run after run, with high accuracy. 😞
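
To make that concrete, here’s a minimal sketch of the kind of setup I mean: a fixed 2-2-1 feed-forward net whose weights get evolved by a plain genetic algorithm until it fits XOR. Everything here (names, population size, mutation settings) is illustrative, not my actual code, which also evolves topology, hence the node/connection counts in the run below.

```go
package main

import (
	"fmt"
	"math"
	"math/rand"
)

// the four XOR cases: input a, input b, expected output
var xorCases = [][3]float64{{0, 0, 0}, {0, 1, 1}, {1, 0, 1}, {1, 1, 0}}

// genome holds the 9 weights of a fixed 2-2-1 network:
// [0..3] input→hidden weights, [4..5] hidden biases,
// [6..7] hidden→output weights, [8] output bias.
type genome [9]float64

func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }

func (g genome) forward(a, b float64) float64 {
	h1 := sigmoid(a*g[0] + b*g[1] + g[4])
	h2 := sigmoid(a*g[2] + b*g[3] + g[5])
	return sigmoid(h1*g[6] + h2*g[7] + g[8])
}

// fitness is 1 minus the mean squared error over the four cases,
// so a perfect network scores 1.0.
func (g genome) fitness() float64 {
	var mse float64
	for _, c := range xorCases {
		d := g.forward(c[0], c[1]) - c[2]
		mse += d * d
	}
	return 1 - mse/float64(len(xorCases))
}

// mutate perturbs each weight with the given probability.
func mutate(g genome, rate, sigma float64) genome {
	for i := range g {
		if rand.Float64() < rate {
			g[i] += rand.NormFloat64() * sigma
		}
	}
	return g
}

func main() {
	const popSize = 150
	pop := make([]genome, popSize)
	for i := range pop {
		for j := range pop[i] {
			pop[i][j] = rand.Float64()*4 - 2 // random weights in [-2, 2)
		}
	}
	for gen := 0; gen < 2000; gen++ {
		// find the current champion
		best, bestFit := pop[0], pop[0].fitness()
		for _, g := range pop[1:] {
			if f := g.fitness(); f > bestFit {
				best, bestFit = g, f
			}
		}
		if bestFit > 0.999 {
			fmt.Printf("Generation %3d | Fitness: %.6f\n", gen, bestFit)
			return
		}
		// elitism plus mutation-only offspring; no crossover, no speciation
		pop[0] = best
		for i := 1; i < popSize; i++ {
			pop[i] = mutate(best, 0.3, 0.5)
		}
	}
	fmt.Println("did not converge this run, which is exactly the randomness that annoys me")
}
```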

This is one of my attempts:

$ go build ./cmd/xor/... && ./xor
Generation  95 | Fitness: 0.999964 | Nodes: 9   | Conns: 19
Target reached!

Best network performance:
  [0 0] → got=0 exp=0 (raw=0.000) ✅
  [0 1] → got=1 exp=1 (raw=0.990) ✅
  [1 0] → got=1 exp=1 (raw=0.716) ✅
  [1 1] → got=0 exp=0 (raw=0.045) ✅
Overall accuracy: 100.0%
Wrote best.dot – render with `dot -Tpng best.dot -o best.png`
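
For what it’s worth, here’s a guess at how a table like that might be produced: threshold the raw output at 0.5, compare against the expected XOR value, and count matches for the accuracy line. The `net` closure is only a stand-in for the evolved network, not how my project actually wires it up.

```go
package main

import "fmt"

func main() {
	// stand-in for the evolved network: returns the raw (pre-threshold) output
	net := func(a, b float64) float64 {
		if a != b {
			return 0.95
		}
		return 0.05
	}

	// input a, input b, expected XOR output
	cases := [][3]float64{{0, 0, 0}, {0, 1, 1}, {1, 0, 1}, {1, 1, 0}}
	correct := 0
	fmt.Println("Best network performance:")
	for _, c := range cases {
		raw := net(c[0], c[1])
		got := 0
		if raw >= 0.5 {
			got = 1
		}
		mark := "❌"
		if got == int(c[2]) {
			mark = "✅"
			correct++
		}
		fmt.Printf("  [%.0f %.0f] → got=%d exp=%.0f (raw=%.3f) %s\n",
			c[0], c[1], got, c[2], raw, mark)
	}
	fmt.Printf("Overall accuracy: %.1f%%\n", 100*float64(correct)/float64(len(cases)))
}
```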

@bender@twtxt.net There is no aim. Just learning 😅 That way I can actually speak and write with a bit more authority when it comes to these LLMs 🤣 Or maybe I’ll just happen to become that random weirdo genius who invents Skynet™ 😂
