
Lab 4. Interaction


Now that we've trained networks to classify sounds, what can we do with that?

Interactive Elements in Python & Jupyter Notebooks

In the Interaction.ipynb file in 05_Interaction, we have a bunch of input and output elements that might be useful for your final interaction.
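As a minimal sketch of the kind of widget plumbing those elements build on, here is a small ipywidgets example (the actual elements in Interaction.ipynb will differ):

```python
# A minimal input/output widget sketch with ipywidgets.
import ipywidgets as widgets
from IPython.display import display

slider = widgets.FloatSlider(description="Volume", min=0.0, max=1.0, value=0.5)
button = widgets.Button(description="Play")
output = widgets.Output()

def on_click(b):
    # Anything printed inside this context lands in the Output widget.
    with output:
        print(f"Playing at volume {slider.value:.2f}")

button.on_click(on_click)
display(slider, button, output)
```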

Fancy Cats!

As an exercise, modify the Cats vs. Dogs notebook so that something fun happens when the sound heard by the mic is a cat or a dog!
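One possible shape for that reaction, assuming your notebook exposes a classifier call (here a hypothetical `classify_clip()` returning "cat" or "dog", with assumed local sound files):

```python
# A hedged sketch of a fun reaction to a classification result.
from IPython.display import Audio, display

def react(clip):
    label = classify_clip(clip)  # hypothetical: your trained classifier
    if label == "cat":
        print("Meow! A cat was heard.")
        display(Audio("purr.wav", autoplay=True))  # assumed local sound file
    elif label == "dog":
        print("Woof! A dog was heard.")
        display(Audio("bark.wav", autoplay=True))  # assumed local sound file
```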

Try Model Zoo

We can also try using other pre-trained models from Model Zoo, both as recognition/classification engines and as bases for last-layer retraining.
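The general last-layer-retraining pattern looks like this Keras sketch; MobileNetV2 is a stand-in base model (an image model, not the specific Model Zoo model from this lab) and the class count and dataset are placeholders:

```python
# Last-layer retraining: freeze a pre-trained base, train a new head.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg",
                                         input_shape=(224, 224, 3))
base.trainable = False  # keep the pre-trained weights fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(2, activation="softmax"),  # e.g., cat vs. dog
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, epochs=5)  # train_ds: your own labeled dataset
```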

Musical Interaction

In the Musical_Interaction.ipynb file in 05_Interaction, we show more examples of how to synthesize and output MIDI files.
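For a taste of what the notebook covers, here is a minimal MIDI-writing sketch using music21 (listed in the resources below); the notes and filename are just examples:

```python
# Build a tiny score and write it out as a MIDI file with music21.
from music21 import stream, note, chord

s = stream.Stream()
s.append(note.Note("C4", quarterLength=1))
s.append(note.Note("E4", quarterLength=1))
s.append(chord.Chord(["C4", "E4", "G4"], quarterLength=2))
s.write("midi", fp="example.mid")  # writes example.mid next to the notebook
```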

New! Added an Interaction_osc.ipynb file, also in 05_Interaction, for Open Sound Control buffs.
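If you want to experiment with OSC outside the notebook, a minimal sender with the python-osc package looks like this; the host, port, and address patterns are assumptions you would match to whatever receiver (a synth, a Max patch, etc.) you point it at:

```python
# Send classifier results over OSC with python-osc (pip install python-osc).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 5005)           # receiver's host and port
client.send_message("/classifier/label", "cat")       # send a string value
client.send_message("/classifier/confidence", 0.92)   # send a float value
```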

Musical coloring (work in progress)

We can try using the emotional transforms from this Intel project (GitHub link) to color the musical output of existing songs.

Useful resources

ipywidgets

Music21 [GitHub] Toolkit for Computer-Aided Musicology

Google Magenta [GitHub] An open source research project exploring the role of machine learning as a tool in the creative process

Google WaveNet [GitHub] A Generative Model for Raw Audio