Conductive Melody
Conductive Melody combines conductive fabric with the magic of #MachineLearning to interactively create music.
High fashion as an interactive performance piece and a wearable musical instrument, Conductive Melody combines conductive copper fabric with the magic of machine learning to interactively create music. We packed in a 12-channel capacitive touch sensor board, an Arduino, and a Raspberry Pi (gotta love those ruffles). The Raspberry Pi runs Piano Genie, an intelligent musical interface built with TensorFlow Magenta, to expand the 8 "keys" on the sleeve into a full 88-key keyboard based on the previously played notes.
Uniting music, clothing, and artistic performance, Conductive Melody produces music and light that is played by the wearer and shared with an audience. Conductive Melody uses a sleeve of laser-cut conductive fabric as a capacitive touch musical interface. The conductive fabric interface is inspired by the strings of a harp and was designed to be scrolling and organic. The music is made through machine learning, a form of AI where a computer algorithm analyzes and stores data over time and then uses this data to make decisions and predict future outcomes. An Arduino microcontroller captures touch inputs, and a mini computer, the Raspberry Pi, uses Piano Genie, a machine learning music interface, to expand the 8 capacitive touch inputs to a full 88-key piano in real time. The capacitive touch sensors also control the color and display of the lights on the front and back of the garment. The music can be played from the wearable speaker or through a computer or sound system.
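For readers curious how the pieces talk to each other, here is a minimal Python sketch of the Raspberry Pi side, not the project's actual code. It assumes the Arduino streams touch events over USB serial as simple lines like "T3" (touch on channel 3) and "R3" (release), and it stands in for Piano Genie with a hypothetical PianoGenieWrapper class, since loading the real model depends on the Magenta setup; MIDI output goes through the mido library.

```python
# Hypothetical Raspberry Pi-side sketch: read 8 touch "keys" from the Arduino
# over serial and expand them to an 88-key piano with a Piano Genie-style model.
# Assumptions: the Arduino prints lines like "T3" on touch and "R3" on release,
# and PianoGenieWrapper is a placeholder for however the Magenta model is loaded.
import serial  # pyserial
import mido    # MIDI output


class PianoGenieWrapper:
    """Placeholder for the real Piano Genie model (TensorFlow Magenta)."""

    def next_note(self, button: int) -> int:
        # The real model maps one of 8 buttons to one of 88 piano keys
        # (MIDI 21-108) based on the previously played notes.
        # Here we fake it with a fixed spread, for illustration only.
        return 21 + button * 12


genie = PianoGenieWrapper()
arduino = serial.Serial("/dev/ttyACM0", 115200, timeout=1)  # Arduino over USB
synth = mido.open_output()                                  # default MIDI port

active = {}  # touch channel -> currently sounding MIDI note
while True:
    line = arduino.readline().decode(errors="ignore").strip()
    if not line:
        continue
    event, channel = line[0], int(line[1:])
    if event == "T" and channel < 8:          # touch: ask the model for a note
        note = genie.next_note(channel)
        synth.send(mido.Message("note_on", note=note, velocity=100))
        active[channel] = note
    elif event == "R" and channel in active:  # release: stop that note
        synth.send(mido.Message("note_off", note=active.pop(channel)))
```

The same touch events could also drive the LED colors; that part is omitted here to keep the sketch short.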
Frequently Asked Questions
What inspired you to do this?
We love fashion tech because of the possibilities for expanding the meaning of clothes beyond visual messages. This project brings machine learning into the human experience of wearing clothes and artistic performance. Machine learning is a form of AI where a computer algorithm analyzes and stores data over time and then uses this data to make decisions and predict future outcomes. The simple inputs from the garment are expanded into a full musical piece.
How long did it take to make it?
Overall, it took about three months of work on both the dress and the electronics.
How long have you been doing things like this?
We made our first tech couture art dress in 2015, and we do about one dress a year, occasionally more.
How much did this cost to do?
Probably a little over a hundred dollars in fabric and electronics.
Have you done other things like this?
Over the years our dresses have been becoming more and more performance pieces.
What did you wish you knew before you started this?
Ensuring good sound quality was a challenge in creating this piece. To get good sound we used a Raspberry Pi, which is a mini computer, instead of a simpler system like the microcontrollers we use for most of our designs. We needed to do a lot of research into other projects that use the Raspberry Pi to make music and how they worked.
Are there plans available to make this? Do you sell this?
Check out our website; we have a system diagram available, and we will be adding the code to our GitHub account.
What’s next?
We're looking to make the output agnostic, with a choice of audio, Bluetooth MIDI, and Open Sound Control.
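As a rough sketch of what that could look like, the snippet below fans a single note event out to either a MIDI port or an OSC receiver; the OSC path uses the python-osc library and the MIDI path uses mido, while the "/conductive_melody/note" address, the IP address, and the "Bluetooth MIDI" port name are made-up placeholders rather than anything from the project.

```python
# Hypothetical output-agnostic dispatcher: the same note event can go to a
# Bluetooth/USB MIDI port or to an Open Sound Control (OSC) receiver.
# The OSC address and port names below are placeholders, not the project's.
import mido
from pythonosc.udp_client import SimpleUDPClient

osc = SimpleUDPClient("192.168.1.50", 8000)  # laptop running an OSC-aware synth


def send_note(note: int, velocity: int, backend: str = "osc") -> None:
    if backend == "midi":
        # Works for any MIDI port the Pi exposes; the name varies by setup.
        with mido.open_output("Bluetooth MIDI") as port:
            port.send(mido.Message("note_on", note=note, velocity=velocity))
    else:
        # One OSC message carries pitch and velocity; the address is arbitrary.
        osc.send_message("/conductive_melody/note", [note, velocity])


send_note(60, 100)  # middle C over OSC
```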
Sahrye Cohen & Hal Rodriguez: Designers
The maker: Sahrye Cohen & Hal Rodriguez
A tech couture studio melding historical fashion, couture techniques, and modern technology. Authors of Make It, Wear It! Available now!
How I can help you:
Want to make clothes or cosplay that light up, react to sound, or use 3D printing? Our fun book, Make It, Wear It, has easy-to-follow projects that show you how to design and make fashion-forward wearable electronics using sewing, crafting, and electronics techniques. Create your own stylish, electronics-based wearables, for all experience levels!
How you can help me:
Follow us on Instagram!