Participants play a simple game to collectively create music on real musical instruments over the internet. Players can join together on a video chat application to see and hear the band of acoustic, electric, and electronic instruments respond in real time.
Play a part, together. bandy consists of 5 main parts:
- A mobile web app where players play a simple paddle ball game
- Another web application collects all of the players’ game states and creates musical events from them
- Software processes the musical events and distributes them to controllable musical instruments
- A band of commercial and custom acoustic, electric, and electronic musical instruments reacts to the player gestures to create musical compositions
- A live video feed allows players and visitors to see and hear the instruments responding to the players’ actions in real time

Over the past few years I’ve moved geographically further away from my friends and musical collaborators. I was missing making music with them and looking for ways to play music remotely. Our past year spent inside has given me the opportunity and the push to learn some of the new available technologies and see how they apply to playing music together.
Some of my main goals in these explorations were maintaining the sense of connection and the joy of discovery in playing and improvising together, making it simple for less technologically adept musicians to play music together remotely, and allowing anyone, not only trained musicians, to engage in making music together.

The web applications for bandy are written in standard HTML5/CSS/JavaScript. They take advantage of the Firebase hosting platform and realtime database to serve the content and collect the status of each player’s game. I implemented a simple paddle ball game where players contribute to a musical composition by bouncing a ball against colored boxes. Each time a box is hit, it generates an event that is sent to a piece of software acting as a performance controller.
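The post doesn’t spell out the event schema, so as a sketch: each box hit could become a small record pushed to the Firebase Realtime Database, where the performance controller picks it up. The `makeHitEvent` shape and field names here are hypothetical, not bandy’s actual schema.

```javascript
// Sketch of the browser-side event write (hypothetical schema).
// Each box hit becomes a small record for the performance controller.
function makeHitEvent(playerId, boxIndex, velocity) {
  return {
    player: playerId,   // anonymous id assigned when the player joins
    box: boxIndex,      // which colored box the ball struck
    velocity: velocity, // ball speed at impact, 0.0-1.0
    ts: Date.now(),     // client timestamp for ordering
  };
}

// With the Firebase Web SDK loaded, the game would push each event and
// the performance controller would listen on the same path, e.g.:
//   push(ref(db, "games/" + roomId + "/events"),
//        makeHitEvent(playerId, boxIndex, velocity));
```

Pushing one small record per hit keeps the realtime database doing what it is good at: fanning a stream of tiny writes out to a single listener.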
The performance controller uses the Magenta machine learning library’s Piano Genie model to create musical information based on the events from players’ games. Piano Genie is trained on 1400 pieces of music and simplifies the 88 keys of the piano down to only 8 buttons. This means that simple gestures can be translated into note sequences. As game events enter the performance controller, Piano Genie translates them into musical events in MIDI format.
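In the @magenta/music library, Piano Genie’s `next()` call takes a button index (0–7) and returns a piano key index in the range 0–87; turning that into a MIDI note is just an offset by the lowest piano key (A0 = MIDI note 21). A minimal sketch of that translation step, with the box index standing in for a Piano Genie button:

```javascript
// Piano Genie maps one of 8 buttons to an 88-key piano key index (0-87).
// MIDI note numbers for the piano run from 21 (A0) to 108 (C8).
const LOWEST_PIANO_MIDI = 21;

function pianoKeyToMidi(keyIndex) {
  if (keyIndex < 0 || keyIndex > 87) {
    throw new RangeError("piano key index out of range: " + keyIndex);
  }
  return keyIndex + LOWEST_PIANO_MIDI;
}

// In the real controller the key index would come from the model, e.g.:
//   const keyIndex = genie.next(boxIndex); // boxIndex in 0-7
//   sendNoteOn(pianoKeyToMidi(keyIndex), velocity);
```

This offset is why an 8-button interface can still produce melodies that span the full piano range.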
The MIDI information from Piano Genie is processed in various ways for a set of controllable instruments to create specific compositions. The band of instruments can be viewed via video conference so players can see and hear the music they are creating in real time and interact with their bandmates. The intent is that players will use the web browser on their phones to play, so that they can join the video conference on a computer. The games can also be played using any modern browser on most platforms.
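The post doesn’t detail how events are distributed among the instruments; one plausible sketch is a simple channel map, where each instrument in the band listens on its own MIDI channel and the controller tags each note accordingly. The instrument names and channel assignments below are illustrative, not bandy’s actual routing.

```javascript
// Hypothetical orchestration table: route each note to an instrument
// by assigning a MIDI channel (0-15 in the raw status byte).
const INSTRUMENT_CHANNELS = {
  piano: 0,     // e.g. a player piano listening on channel 1
  guitarBot: 1, // the custom MIDI-controlled electric guitar
  synth: 2,
};

// Build a raw MIDI note-on message [status, pitch, velocity];
// 0x90 is the note-on status nibble.
function noteOnFor(instrument, pitch, velocity) {
  const channel = INSTRUMENT_CHANNELS[instrument];
  if (channel === undefined) {
    throw new Error("unknown instrument: " + instrument);
  }
  return [0x90 | channel, pitch & 0x7f, velocity & 0x7f];
}
```

Keeping the orchestration in one table like this is what would let different “compositions” reuse the same event stream with different instrumentation.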
The music generated by the system is of a particular aesthetic because of the music the machine learning model is trained on. This is an intentional decision to make the result feel familiar to the expected audience. Since the system can accommodate any number of players, the player count can have a significant effect on the musical result. Limitations are intentionally imposed by the game mechanics, allowing multiple players to interact at the same time while keeping the musical result varied and dynamic. The “compositions” are differentiated by the choice of instrumentation and how the musical events are orchestrated within the ensemble. Some of the most exciting moments are when the system breaks out of its western harmonic patterns and modulates seemingly at random, or breaks into a streaming cadenza when a player’s ball gets stuck between blocks and creates a flurry of events.

It took about two weeks to create the basic software for bandy. During those two weeks I was also able to create a prototype of a MIDI-controlled electric guitar to add to the band. Following development of the software components I’ve been playtesting with different numbers of players. Watching people of different ages, backgrounds, and technical skill levels has led to a few adjustments to bandy. I will be adding a simpler game mode for those less adept at manipulating the phone, or those who want to pay more attention to what the instruments are doing. I’m also adjusting the interface to offer more information about the events this will be a part of, like the Maker Music Festival. At the moment, bandy is intended as a facilitated performance event because of its dependence on the physical instrument setup.

Interactive performances where you can join in and play with bandy will be at the following times:

Saturday, May 15th
12 noon PST - Live Zoom Performance
1 PM PST - YouTube LiveStream Performance

Sunday, May 16th
12 noon PST - Live Zoom Performance
1 PM PST - YouTube LiveStream Performance

Information for connecting will be available before the performances through the bandy web app at

Zoom Performances can also be accessed through
YouTube Live Performances can also be accessed through

Frequently Asked Questions
What inspired you to do this?
I've been thinking about ways to let people, regardless of musical ability, experience the joy of musical collaboration as a way of connecting during the pandemic.
How long did it take to make it?
It took me about two weeks to build the software and set up the hardware to get through my first couple of playtests. I'm continuing to make changes and improvements.
How long have you been doing things like this?
I've been making interactive musical artwork for over 20 years now. I've been working with electronics and software as part of the process for over 15 of those.
How much did this cost to do?
Luckily I had most of the electronics and instruments already available. Even the app technology platform is still using the free version of Firebase. The only thing I spent money on for this particular project was the servos for the "guitar robot". If I had to put it all together from scratch, though, it would probably run around $5,000, plus the $30,000 Yamaha Disklavier piano which I am lucky enough to have access to.
Have you done other things like this?
I have created many interactive artworks and technology platforms for theme parks, museums, corporate offices, and performance events. Over the past couple of years I have created some physical installations of generative and interactive music using the Magenta machine learning libraries, which were a big influence on this piece.
What’s next?
I plan to continue to make improvements to bandy and hopefully have some more public showings and performances. I have some ideas on how to use this concept to generate musical scores in real time for musicians to perform live as well. I have two other related projects that I'm working on: an internet-connected zither that two people can play together, and a video conferencing platform specifically for musicians to collaborate remotely.
The Piano Genie project has a great demo with available source code that was the starting point for bandy:
I also used Firebase as the hosting/database platform:
Audio and MIDI processing is done with Ableton Live and Max/MSP:

For the robot guitar -
Custom circuit boards were made with Eagle CAD and manufactured by OSH Park:
The microcontroller is a Teensy from PJRC:
Servos are from ServoCity:

Thadeus Frazier-Reed : Composer and Creative Technologist
Thadeus Frazier-Reed is an interactive designer, composer, and creative technologist. With a degree in music composition from California Institute of the Arts and a background in music and dance performance, Thadeus builds technology platforms for the creation of visual and performing arts.

He has developed custom media platforms for The Museum of Contemporary Art, Los Angeles; Universal Studios Hollywood Halloween Horror Nights VIP Tour; Britney Spears' "Piece of Me"; and San Diego Comic-Con's Godzilla Encounter. He founded and directed the Interactive Design department at Thinkwell Group, an experience design firm in Los Angeles where he created interactive exhibits for Warner Bros. Studio Tour London: The Making of Harry Potter, The Hunger Games: The Exhibition, and Google.

Thadeus is currently a Technical Integration Lead for Experience Studio at Google, where he creates custom technology for physical spaces.

Connect with Thadeus Frazier-Reed
How I can help you:
I am a composer and creative technologist who creates tools for artists of all kinds. I am interested in artistic collaboration, experimental performance, and physical, interactive installations. At this point in my life I am interested in building community around experimenting and performing with technology, art, dance, and music. I can offer my expertise and experience to collaborative artistic endeavors.
How you can help me:
You can view my work at my website, on YouTube, and on Instagram. You can listen to and purchase my music at as well as most streaming services. I am also interested in other opportunities for public presentation, installation, and performance.