
Evolving Convolutional Neural Networks to Play Tetris

This page presents research on evolving Convolutional Neural Networks (CNNs) to play Tetris using the indirect encoding HyperNEAT. This work extends previous research that applied HyperNEAT to Tetris but evolved only fully connected networks. The research makes use of the MM-NEAT software package, which contains a variant of HyperNEAT that has been extended to support CNNs. State evaluators are evolved to play Tetris using raw screen inputs (taken from the 10 by 20 grid) with both standard Full-HyperNEAT and the new CNN-HyperNEAT. The videos on this page can be viewed in a playlist here.
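As a rough illustration of how an evolved state evaluator drives play, the agent enumerates the legal placements of the current piece, simulates the board that would result from each, and picks the placement whose afterstate the network scores highest. The helper names below (legal_placements, simulate, network.evaluate) are hypothetical stand-ins, not MM-NEAT's actual API:

```python
# Sketch of the decision loop, assuming hypothetical helpers:
# legal_placements(board, piece) yields candidate (rotation, column) actions,
# and simulate(board, piece, action) returns the 10x20 board after the drop.

def choose_placement(network, board, piece):
    """Return the action whose simulated afterstate the network scores highest."""
    best_action, best_score = None, float("-inf")
    for action in legal_placements(board, piece):
        afterstate = simulate(board, piece, action)
        inputs = [cell for row in afterstate for cell in row]  # 200 raw inputs
        score = network.evaluate(inputs)  # evolved afterstate evaluator
        if score > best_score:
            best_action, best_score = action, score
    return best_action
```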

CNN-HyperNEAT with Depth 1 Width 1 Architecture

This Tetris-playing agent uses an evolved Convolutional Neural Network (CNN) as an afterstate evaluator to decide which moves to make. The CNN is evolved with the indirect encoding HyperNEAT, but has a shallow architecture: just one hidden layer consisting of only one substrate/feature map. The network was constructed using threshold link encoding and Multi-Spatial Substrates. It earns a score of 2901 before finally losing.
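Threshold link encoding determines which substrate links exist: the CPPN is queried with the coordinates of a source and target neuron, a link is expressed only if the output magnitude exceeds a threshold, and surviving outputs are rescaled into the usable weight range. A minimal sketch, in which the cppn call signature and the specific threshold and weight range are assumptions for illustration:

```python
def query_link(cppn, src, dst, threshold=0.2, max_weight=3.0):
    """Express a substrate link only if the CPPN output clears the threshold.

    src and dst are (x, y) neuron coordinates on their substrates; the CPPN
    is assumed to map (x1, y1, x2, y2) to a raw output in roughly [-1, 1].
    """
    out = cppn(src[0], src[1], dst[0], dst[1])
    if abs(out) <= threshold:
        return None  # below threshold: the link is not expressed at all
    # Rescale the surviving magnitude into (0, max_weight].
    magnitude = (abs(out) - threshold) / (1.0 - threshold) * max_weight
    return magnitude if out > 0 else -magnitude
```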

CNN-HyperNEAT with Depth 1 Width 4 Architecture

This Tetris-playing network is also evolved with CNN-HyperNEAT, and still has only one hidden layer, but now with four separate substrates/feature maps within that layer. Generally, CNNs with this architecture earn scores similar to those with only one substrate in the hidden layer, though this specific network in this specific trial earns a score of 542. This lower score is due more to the noisiness of evaluation and the high variance of evolved performance in the Tetris domain than to the wider architecture.
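Multi-Spatial Substrates, mentioned above, matter once a layer contains multiple substrates: each substrate pairing gets its own dedicated CPPN output, rather than one output defining every connectivity pattern. A sketch of how the four hidden maps might be wired, where the substrate names and per-pairing output indexing are assumptions for illustration:

```python
# Hypothetical pairings for a depth-1, width-4 architecture:
# input -> each hidden feature map, and each hidden feature map -> output.
PAIRS = ([("input", f"hidden{i}") for i in range(4)] +
         [(f"hidden{i}", "output") for i in range(4)])

def build_links(cppn, coords):
    """Wire every substrate pairing using its own dedicated CPPN output.

    cppn(x1, y1, x2, y2) is assumed to return one value per output node,
    so pairing i reads output i; coords maps substrate name -> neuron list.
    """
    links = {}
    for i, (src_sub, dst_sub) in enumerate(PAIRS):
        for src in coords[src_sub]:
            for dst in coords[dst_sub]:
                # Select this pairing's output, then apply the threshold
                # expression from the sketch above.
                weight = query_link(lambda *args: cppn(*args)[i], src, dst)
                if weight is not None:
                    links[(src_sub, src, dst_sub, dst)] = weight
    return links
```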

Full-HyperNEAT with Depth 1 Width 4 Architecture

Evolving fully connected networks with HyperNEAT rather than CNNs in Tetris is consistently less successful, by at least an order of magnitude. This particular network uses one hidden layer with four separate processing substrates, but earns a dismal score of 33. The extra links are simply more parameters to optimize, which is hard to accomplish. This is a particularly bad example, however: the approach can reach scores around 200, as previous work demonstrated with a similar architecture containing only one substrate in the hidden layer.
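The scale of the difference is easy to quantify. Assuming 10 by 20 substrates, four hidden feature maps, a single output node, and (for the convolutional case) a 3 by 3 receptive field, the fully connected architecture has roughly twenty times as many links; the receptive-field size and output connectivity are assumptions for illustration:

```python
BOARD = 10 * 20          # 200 neurons per substrate
HIDDEN_MAPS = 4
RECEPTIVE_FIELD = 3 * 3  # assumed 3x3 convolutional window

# Fully connected: every input neuron links to every hidden neuron,
# then every hidden neuron links to a single assumed output node.
fc_links = HIDDEN_MAPS * BOARD * BOARD + HIDDEN_MAPS * BOARD
# Convolutional: each hidden neuron sees only its local window
# (ignoring border effects, which only reduce the count further).
cnn_links = HIDDEN_MAPS * BOARD * RECEPTIVE_FIELD + HIDDEN_MAPS * BOARD

print(fc_links)   # 160800
print(cnn_links)  # 8000
```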

Full-HyperNEAT with Depth 4 Width 4 Architecture

Making fully connected networks even deeper does not help them, as this simply adds more links to optimize. This architecture has four hidden layers, each with four substrates in parallel, but only earns a score of 80 in this trial.

CNN-HyperNEAT with Depth 4 Width 4 Architecture

Increasing the depth of a CNN architecture produces better results than the corresponding fully connected architecture, but worse results than the shallow evolved CNN architectures. This CNN has a depth of four with four substrates/feature maps per layer, and earns a score of 247. A deep CNN has fewer parameters to optimize than a fully connected network of the same depth, but more than a shallow CNN, making it correspondingly harder to optimize.
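Extending the earlier link-counting sketch to depth four makes this concrete, under the same assumptions (10 by 20 substrates, a single output node, and a 3 by 3 receptive field for the convolutional case):

```python
def link_count(depth, width, fan_in, board=200):
    """Total links from the 10x20 input through `depth` hidden layers of
    `width` feature maps each, to a single assumed output node."""
    input_links = width * board * fan_in
    between_links = (depth - 1) * width * width * board * fan_in
    output_links = width * board
    return input_links + between_links + output_links

FULLY_CONNECTED, CONV_3x3 = 200, 9  # fan-in per source map (3x3 assumed)
print(link_count(4, 4, FULLY_CONNECTED))  # 2080800: deep fully connected
print(link_count(4, 4, CONV_3x3))         # 94400: deep CNN
print(link_count(1, 1, CONV_3x3))         # 2000: shallow depth-1 width-1 CNN
```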

Miscellaneous Content

  • Spring 2019: Patience, Grit, and an Open Mind: an article about my former student Lauren Gillespie that references me and our work on both Tetris and AnimationBreeder.
  • Fall 2018: Evolutionary Computation Applied to Digital Entertainment and the Arts, a poster presented at the President's Appreciation Celebration for Southwestern University donors.
  • Summer 2018: Neuroevolution in Video Games, a "Mad Science Monday" presentation made by my SCOPE Summer research students for other SCOPE students.
  • Spring 2018: Comparing Direct and Indirect Encodings Using Both Raw and Hand-Designed Features in Tetris, a presentation by Lauren Gillespie at the Southwestern University Undergraduate Research & Creative Works Symposium.
  • Summer 2016: Computer Science Majors Develop Artificial Intelligence for Video Games, an article about my Summer SCOPE research.
  • Summer 2016: Evolutionary Computation for Creativity and Intelligence, a "Mad Science Monday" presentation made by my SCOPE Summer research students for other SCOPE students.

Last Updated: 5/28/2019