Richard Savery

Shimon Scores Films

Automatic Film Scoring


Shimon’s development as a film composer came out of a 2017 collaboration with Janet Biggs, based on her concept of a film scored by a robot. The project uses common deep learning computer vision techniques, including main-character tracking, emotion recognition, and object detection. These metrics are used to learn the narrative arc of the film and are combined with visually derived director aesthetic choices, including pacing and levels of movement. These parameters are then used to generate a full score for short films, played by Shimon.
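The movement-level parameter could be approximated with a simple frame-differencing pass over the film. A minimal sketch of that idea, using synthetic grayscale frames rather than the project's actual pipeline:

```python
import numpy as np

def movement_level(frames):
    """Mean absolute inter-frame pixel difference,
    a rough proxy for on-screen movement per frame pair."""
    return [
        float(np.mean(np.abs(frames[i + 1].astype(float) - frames[i].astype(float))))
        for i in range(len(frames) - 1)
    ]

# Synthetic example: a static scene followed by a sudden full-frame change.
still = np.zeros((4, 4))
changed = np.full((4, 4), 255.0)
print(movement_level([still, still, changed]))  # [0.0, 255.0]
```

In a real system the frames would come from decoded video, and this per-frame signal could be smoothed over time to estimate pacing.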

Fragility Curves excerpt from Janet Biggs

Fragility Curves single channel excerpt from Janet Biggs

Press and Reviews

The premiere of the film took place at the Museos de Tenerife in the Canary Islands.

Review: Janet Biggs: Like Walking on Mars

This project was featured in Georgia Tech’s President’s Update: Together, Man and Robot Write Film Score


After developing Shimon the Film Composer, I turned the system into a software-only automatic composer that allows human input and choice of melody. You can see and hear a System Demo


Shimon the Robot Film Composer and DeepScore

Computer Simulation of Musical Creativity, 2018

Richard Savery, Gil Weinberg

Abstract: Composing for a film requires developing an understanding of the film, its characters and the film aesthetic choices made by the director. We propose using existing visual analysis systems as a core technology for film music generation. We extract film features including main characters and their emotions to develop a computer understanding of the film’s narrative arc. This arc is combined with visually analyzed director aesthetic choices including pacing and levels of movement. Two systems are presented, the first using a robotic film composer and marimbist to generate film scores in real-time performance. The second software-based system builds on the results from the robot film composer to create narrative driven film scores.