motionEmotion is an emotion & gesture-based arpeggiator and synthesizer.
It uses the webcam to detect points of motion in the frame and tracks the user's emotion. Triangles are drawn between the points of motion, and each triangle represents a note in an arpeggio/scale determined by the user's emotion.
- clmtrackr.js: face tracking / emotion detection
- Tone.js: sound synthesis / sequencing
- delaunay.js: Delaunay triangulation
- Karen Peng: motion detection and visual design
- Jason Sigal: sound design
- Special thanks to T.K. Broderick and Yotam Mann for music suggestions and inspiration.