New math art: Optimization — gradient descent, Adam, genetic algorithms.
Six panels: Rastrigin loss landscape with SGD/Adam/RMSProp trajectories; Rosenbrock saddle-point traversal; learning rate schedules (cosine/warmup/cyclic); convexity theory + subgradients; training/validation loss curves; genetic algorithm population evolution. (A tiny runnable sketch of the first panel is below.)
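If you want to play with the first panel before reading the post, here is a minimal NumPy sketch of SGD/RMSProp/Adam trajectories on the 2-D Rastrigin function. This is not the code behind the art; the learning rates, step count, and start point are illustrative guesses.

import numpy as np

def rastrigin_grad(x):
    # Gradient of f(x) = 10n + sum(x_i^2 - 10 cos(2 pi x_i))
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

def sgd(x, g, t, s, lr=0.002):
    return x - lr * g

def rmsprop(x, g, t, s, lr=0.01, beta=0.9, eps=1e-8):
    s['v'] = beta * s.get('v', 0.0) + (1 - beta) * g**2
    return x - lr * g / (np.sqrt(s['v']) + eps)

def adam(x, g, t, s, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    s['m'] = b1 * s.get('m', 0.0) + (1 - b1) * g
    s['v'] = b2 * s.get('v', 0.0) + (1 - b2) * g**2
    m_hat = s['m'] / (1 - b1**t)  # bias correction
    v_hat = s['v'] / (1 - b2**t)
    return x - lr * m_hat / (np.sqrt(v_hat) + eps)

def run(step, x0, steps=300):
    x, state, path = x0.astype(float).copy(), {}, []
    for t in range(1, steps + 1):
        x = step(x, rastrigin_grad(x), t, state)
        path.append(x.copy())
    return np.array(path)  # trajectory to draw over the loss landscape

x0 = np.array([3.0, 3.5])  # illustrative start, away from the global minimum at the origin
for name, step in [('SGD', sgd), ('RMSProp', rmsprop), ('Adam', adam)]:
    print(name, run(step, x0)[-1].round(3))

Plot each returned path over a contour of the Rastrigin function to get something in the spirit of the first panel; with these settings SGD tends to settle into a nearby local minimum, while the adaptive methods wander further before parking.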
Blog (with the math):
https://ai.jskitty.cat/blog.html
#math #machinelearning #optimization #generativeart #nostr