I am surprised and pleased and a little disturbed, sure, at how much help ChatGPT was here. This was written in Codea on my iPhone with (not by) ChatGPT. I’ve tried to write evolution simulators in Codea before and couldn’t get them to track over 200 creatures without massive slowdown. Thanks to ChatGPT helping me code things I could conceptualize but not implement, this version handles over 3,000 organisms without substantial slowdown and can even run over 10,000 without crashing, though to be sure it’s very slow at that point. The critters are very, very simple, but they randomly reproduce with each other by blending colors. The simulation stops all breeding once the population exceeds 3,000.
To be clear, ChatGPT did not write this. It was built from the ground up using ChatGPT, but all it was doing was giving me pieces of code that I asked for. The first thing I did was say “write me an evolution simulation,” and the thing it gave me didn’t even run. I built the project up piece by piece by asking ChatGPT for a function that did this and a function that did that. I also often had to tweak its code because, for example, it could never remember not to use math.atan2 in Codea code.
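That math.atan2 tweak, for anyone curious: newer versions of Lua (which Codea runs on) removed math.atan2 and folded it into math.atan, which now accepts two arguments. The function name below is just an illustration, not code from the project:

```lua
-- ChatGPT kept emitting math.atan2, which newer Lua (and Codea) no longer has.
-- The two-argument form of math.atan is the drop-in replacement.
function headingTo(x1, y1, x2, y2)
    -- was: math.atan2(y2 - y1, x2 - x1)  -- errors in Codea
    return math.atan(y2 - y1, x2 - x1)
end
```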
The basic principle is that most tabs simply contain a draw() function, which means that whichever tab is bottommost is the one that runs when you hit play. So to see different things, drag different tabs to the bottom.
The Field class and the ColorCritters class are the building blocks of the simulations. The Field generates critters, and its draw function moves them around.
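In rough outline, that relationship looks something like the sketch below. This is plain Lua for illustration (Codea has its own class() syntax), and every name and detail here is an assumption, not the actual project code:

```lua
-- Hypothetical sketch: a Field that owns a table of critters and,
-- once per frame, nudges each one. Names are illustrative only.
local function newCritter()
    return { x = math.random(0, 500), y = math.random(0, 500),
             col = { r = math.random(255),
                     g = math.random(255),
                     b = math.random(255) } }
end

local Field = {}
Field.__index = Field

function Field.new(count)
    local f = setmetatable({ critters = {} }, Field)
    for i = 1, count do
        f.critters[i] = newCritter()   -- the Field generates its critters
    end
    return f
end

function Field:draw()
    -- called every frame: move each critter a small random step
    for _, c in ipairs(self.critters) do
        c.x = c.x + math.random(-1, 1)
        c.y = c.y + math.random(-1, 1)
        -- in Codea, this loop would also render each critter
    end
end
```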
Some of these things have not actually been worked out yet; for example, there is an “Aggression” tab which is currently just a copy of one of the other tabs.
The tabs above OtherMain can be dragged to the bottom to see what they do, but they should not be left below OtherMain while trying to run any of the simulations. Apart from the first “Main” tab, which was a first try at this whole thing, they are simple tests of basic functions.
The basic trick here is that each critter randomly samples one pixel on its perimeter every draw cycle and decides what to do based on what it finds.
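The sampling step is just picking a random angle and reading the pixel at that point on the critter’s edge. A minimal sketch, assuming circular critters with x, y, and radius fields (all names are my illustration, not the project’s):

```lua
-- Pick a random point on a critter's perimeter. In Codea, the screen
-- contents would then be read back at (px, py) to get the color there.
function samplePerimeter(critter)
    local a = math.random() * 2 * math.pi          -- random angle
    local px = critter.x + critter.radius * math.cos(a)
    local py = critter.y + critter.radius * math.sin(a)
    -- e.g. local r, g, b = capturedImage:get(px, py)  -- Codea-side step
    return px, py
end
```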
In TinyBreeders, if they detect the background color they move in that direction, and if they detect any other color they have a random chance of generating a child. The child’s color is a mix of the parent’s own color and the color detected.
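That rule, plus the 3,000-critter breeding cap mentioned earlier, can be sketched in plain Lua. Colors here are {r, g, b} tables rather than Codea color objects, and every name and constant (including the breed chance) is an assumption for illustration:

```lua
local BREED_CHANCE = 0.05   -- assumed value, not from the project

local function blend(c1, c2)
    -- child's color is the average of the parent's and the detected color
    return { r = (c1.r + c2.r) / 2,
             g = (c1.g + c2.g) / 2,
             b = (c1.b + c2.b) / 2 }
end

local function isBackground(c, bg)
    return c.r == bg.r and c.g == bg.g and c.b == bg.b
end

-- React to one sampled perimeter pixel: step toward open space,
-- or (below the population cap) maybe spawn a color-blended child.
local function react(critter, sampled, dx, dy, bg, population, cap)
    if isBackground(sampled, bg) then
        critter.x, critter.y = critter.x + dx, critter.y + dy
    elseif #population < cap and math.random() < BREED_CHANCE then
        table.insert(population, { x = critter.x, y = critter.y,
                                   col = blend(critter.col, sampled) })
    end
end
```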