Manual Music (experimental v0.3): Guidonian Hand Readings with AI

  1. Allow access to your camera (no data is collected)
  2. Make sure your hand is in view
  3. Solmize using the Guidonian hand...
  4. ...and watch how this very experimental demo struggles.

Ehmmm... nothing happens?!

If the red dot at the top left of the video does not turn green when your hand moves into view, I'm afraid the app doesn't work in your browser yet. You could try a desktop browser: the app works fine in Chrome, Firefox and Safari on my MacBook.

The Guidonian hand

For hundreds of years, musicians were trained to sight-read music using a solmization method that revolved around the Guidonian hand. Simplifying things a bit, the idea is that every joint of the hand corresponds to a note. Singers would touch the joint of the note they were singing with their thumb. In this way, they learned to associate notes with a physical gesture: a deeply embodied way of learning solmization. This is all quite similar to how instrumentalists associate notes with positions on their instrument. In fact, hands could even be used as instruments themselves: we know that choirmasters used the same gestures to indicate which notes the singers were supposed to sing.
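To make the idea concrete, here is one way the joint-to-note mapping could be represented in code; a minimal TypeScript sketch with an illustrative, not historically complete, assignment of notes to joints.

```ts
// A sketch of the Guidonian mapping as a data structure. The assignment
// below is illustrative only: the historical hand spirals through roughly
// twenty notes starting from Gamma-ut, and the exact joint layout varies
// between sources.
type Finger = "thumb" | "index" | "middle" | "ring" | "little";
type Part = "tip" | "upper" | "middle" | "base";

interface JointNote {
  finger: Finger;
  part: Part;
  note: string; // solmization name, e.g. "Gamma ut"
}

const guidonianHand: JointNote[] = [
  { finger: "thumb", part: "tip", note: "Gamma ut" },
  { finger: "thumb", part: "middle", note: "A re" },
  { finger: "thumb", part: "base", note: "B mi" },
  // ...the remaining joints continue the spiral across the palm
];

// Look up the note for a touched joint:
function noteAt(finger: Finger, part: Part): string | undefined {
  return guidonianHand.find((j) => j.finger === finger && j.part === part)?.note;
}
```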

About this project

Manual Music is an experiment by Bas Cornelissen that tries to turn the hand into an instrument. The project revolves around a model developed by Google that can detect the position of a hand in images and videos. The difficult part is recognizing the different gestures. The project is in an early stage, but it already allows you to 'play' the Guidonian hand. However, the detection method is not yet very reliable (see the updates below). Interested in joining the project? Please get in touch!
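Concretely, hand detection with Google's MediaPipe (the library listed in the credits below) might look roughly like this. The asset URLs and options in this sketch are assumptions, not this project's actual configuration.

```ts
// A minimal sketch of hand landmark detection with MediaPipe Tasks Vision.
// The URLs and options here are assumptions, not this project's setup.
import { FilesetResolver, HandLandmarker } from "@mediapipe/tasks-vision";

const vision = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm",
);
const landmarker = await HandLandmarker.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath:
      "https://storage.googleapis.com/mediapipe-models/hand_landmarker/hand_landmarker/float16/1/hand_landmarker.task",
  },
  runningMode: "VIDEO",
  numHands: 1,
});

// Call this once per video frame, e.g. from requestAnimationFrame:
function onFrame(video: HTMLVideoElement, timestampMs: number) {
  const result = landmarker.detectForVideo(video, timestampMs);
  if (result.landmarks.length > 0) {
    // Each detected hand has 21 landmarks with normalized x, y, z coordinates.
    const hand = result.landmarks[0];
    console.log("index fingertip:", hand[8]); // landmark 8 is the index tip
  }
}
```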

Updates

  • Sep 12, 2024: I trained a simple network (1 layer, 64 units) to classify gestures, using the (64 most informative) distances between joints as input features. The test accuracy and F1 score are around 90%, but in practice this approach is still far from perfect. It may be worthwhile to develop gestures that are easier to discriminate. (A sketch of the distance features follows this list.)
  • Sep 10, 2024: Sound! A synthesizer now plays the detected gestures... (a minimal sketch also follows the list).
  • Sep 10, 2024: A gesture recorder has been implemented. This makes it easy to collect training data for the gesture classifier.
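To give a rough idea of the Sep 12 update, the sketch below computes Euclidean distances between all pairs of hand landmarks. Selecting the 64 most informative of these (which happens during training) and feeding them to a one-layer network is the approach described above; the types and the selection step are assumptions about the implementation.

```ts
// A sketch of the distance features: Euclidean distances between pairs of
// the 21 hand landmarks. Which 64 pairs are "most informative" would be
// determined from training data; here we simply compute all pairs.
interface Landmark {
  x: number;
  y: number;
  z: number;
}

function pairwiseDistances(hand: Landmark[]): number[] {
  const features: number[] = [];
  for (let i = 0; i < hand.length; i++) {
    for (let j = i + 1; j < hand.length; j++) {
      const dx = hand[i].x - hand[j].x;
      const dy = hand[i].y - hand[j].y;
      const dz = hand[i].z - hand[j].z;
      features.push(Math.hypot(dx, dy, dz));
    }
  }
  return features; // 21 * 20 / 2 = 210 distances for 21 landmarks
}
```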
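And a sketch of the Sep 10 "Sound!" update: once a gesture is classified as a note, the Web Audio API can play it. The gain envelope and the mapping from gesture to MIDI number are illustrative, not the project's actual synthesizer.

```ts
// A minimal Web Audio sketch: play a short tone for a detected note.
// The envelope and note mapping are illustrative assumptions.
// Note that browsers require a user gesture before audio can start.
const ctx = new AudioContext();

function midiToHz(midi: number): number {
  return 440 * Math.pow(2, (midi - 69) / 12);
}

function playNote(midi: number, durationSec = 0.4): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = midiToHz(midi);
  gain.gain.setValueAtTime(0.2, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + durationSec);
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationSec);
}

// For example, if the classifier recognizes the gesture for C4:
playNote(60);
```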

You can find all code on GitHub. Built using Mediapipe Solutions, Fresh, Deno, Tailwind, Preact and more.

Copyright © Bas Cornelissen, 2024.