A self contained evolutionary ecosystem written in Rust, with Neural Nets and Genetic Evolution


aaronik/evolution


A self contained evolutionary ecosystem

Based on this amazing YouTube video: https://youtu.be/N3tRFayqVtk?t=1433

What is it?

I wanted to make some kind of ecosystem with lifeforms or automata that each have a neural net and that evolve over time from generation to generation. I wanted there to be continuity of time, just like in our universe, where generations of lifeforms can overlap with each other. This is in contrast to the system in the YouTube video above, where every generation lives for a certain number of cycles, and then the population is measured and culled/reproduced.

This is the result -- a small world with lifeforms, food, and danger. The lifeforms reproduce when they eat enough. The danger is radioactive and hunts down lifeforms (scary, right?). The lifeforms can attack each other, costing them both health.

The UI is terminal based. It uses tui-rs.

Here's a video of it in action, but note that it mostly demonstrates the UI: every time you run it and let it get to 500,000 or 1,000,000 iterations, different behaviors start to appear, so one video cannot do it justice :)

evolution.compressed.mp4

Properties of this app

  • Cyclic neural nets
  • Blazingly fast, written in Rust, with care taken to be efficient
  • Parallelized using rayon
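
The parallelism is a natural fit because each lifeform's net can be updated independently within a tick. Here's a minimal sketch of that idea using only the standard library's scoped threads (the app itself uses rayon's parallel iterators; the `LifeForm` type and its fields here are illustrative, not the repo's actual definitions):

```rust
use std::thread;

// Illustrative lifeform state: each carries its neuron activations.
struct LifeForm {
    activations: Vec<f32>,
}

impl LifeForm {
    // Stand-in for the per-lifeform neural net update.
    fn update(&mut self) {
        for a in self.activations.iter_mut() {
            *a = a.tanh();
        }
    }
}

// Each lifeform's net is independent of the others within a tick,
// so they can all be updated in parallel (rayon's par_iter_mut in the app).
fn tick(lifeforms: &mut [LifeForm]) {
    thread::scope(|s| {
        for lf in lifeforms.iter_mut() {
            s.spawn(move || lf.update());
        }
    });
}
```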

Discoveries I made running it

  • In one evolution I saw the lifeforms coordinate to keep the danger in the corner by every so often sacrificing one of themselves as bait. This is incredible evolutionary behavior: it demonstrates how group-level strategies can evolve even when they cost the individual.
  • The neural calculation algorithm paired with the Rust programming language worked out splendidly. The program performs parallel recursive network calculations very efficiently, running thousands of iterations per second for dozens of lifeforms, each with its own neural net. This holds even on Dellbert, the mid-grade, years-old laptop I built this on, with lifeforms that have 10 inner neurons and a genome of size 75.

A little about the neural nets

There are three groups of neurons:

  • Input Neurons, which get their values from the environment,
  • Output Neurons, which, once their values are computed, determine what actions the lifeform is going to take, and
  • Inner Neurons, the interesting ones. When running the app from the CLI, you can choose how many inner neurons there will be. The inner neurons take their inputs from either the input neurons or other inner neurons, and they output to either output neurons or other inner neurons. They are also free to output to themselves, so the neuron graph becomes cyclic. There aren't layers like in a deep learning net; there's just one big blob of inner neurons that are free to interconnect however they want. I thought this was more representative of how it works in biology.
  • Initially all of the lifeforms have the same set of neurons, which aren't connected to each other. They all have the same input neurons, output neurons, and number of inner neurons. It's the Genome that represents the connections between neurons. The genome is comprised of an unordered list of genes, each of which is { from: <neuron_id>, to: <neuron_id>, weight: f32 }.
  • As time goes on, it's the Genome that gets selected for under the evolutionary pressures.
  • Lifeforms aren't chosen for their fitness after a certain period of time passes. There's no grade or score that determines whether a lifeform reproduces and passes on its genome: if it eats enough and naturally reproduces, then it passes on its genes.
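
A gene and genome as described above might be sketched in Rust like this (the names `NeuronId`, `Gene`, and `Genome` are illustrative; the repo's actual definitions may differ):

```rust
// Neurons are referenced by id; a gene is one weighted connection.
type NeuronId = usize;

#[derive(Clone, Debug)]
struct Gene {
    from: NeuronId, // an input or inner neuron
    to: NeuronId,   // an inner or output neuron
    weight: f32,
}

// The genome is just an unordered list of genes. It is the only thing
// that differs between lifeforms, and the thing selection acts on.
type Genome = Vec<Gene>;
```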

Interesting bits of code

  • Recursion-less recursive neural net calculations. Recursion is elegant and beautiful, but in languages that cannot guarantee tail-call optimization, like Rust, loops are faster. The idea of recursion is still great, though, especially because in this case it mimics the real-world analog, a brain. So this app puts together an ordered list of genes to be followed one by one, with the neural net calculation (which looks like tanh(sum(inputs))) done on each. Find that here(ish), with the code that walks that vector around here.
  • Within that neural net calculation function is a data structure I'm calling NeuronGraph. I think this is cool because it's an infinitely recursive graph.
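
A minimal sketch of that tanh(sum(inputs)) step, walked over a pre-ordered gene list with a loop rather than recursion (the types here are illustrative, and the genuinely interesting part in the repo, ordering the genes of a cyclic graph, is omitted):

```rust
use std::collections::HashMap;

type NeuronId = usize;

struct Gene {
    from: NeuronId,
    to: NeuronId,
    weight: f32,
}

// Walk a pre-ordered gene list once, with no recursion: accumulate each
// destination neuron's weighted input sum, then squash it with tanh.
// `values` holds the current activation of every neuron, with the input
// neurons pre-filled from the environment.
fn calculate(ordered_genes: &[Gene], values: &mut HashMap<NeuronId, f32>) {
    let mut sums: HashMap<NeuronId, f32> = HashMap::new();
    for gene in ordered_genes {
        let input = values.get(&gene.from).copied().unwrap_or(0.0);
        *sums.entry(gene.to).or_insert(0.0) += input * gene.weight;
    }
    for (id, sum) in sums {
        values.insert(id, sum.tanh()); // tanh(sum(inputs))
    }
}
```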

Things left undone

  • Letting the lifeforms evolve the number of genes and inner neurons they have. Right now those values are fixed, but it'd be really cool to see if there were some ideal values, or at least local maxima/minima.
  • Separating the main thread from the UI drawing thread. Instead, I've left a less ideal solution: the drawing can be paused from within the app itself.
  • Ability to save the evolutions. They can evolve thousands of generations in only a few minutes, so it hasn't really been that important. But nonetheless, it'd be interesting to see how they'd look after a million generations!

For Next Time

  • The console UI is fun, but I'd definitely like some visual medium that is more expressive.
  • I'd love to have a richer set of output actions. Maybe some that can facilitate more social behaviors.
