Controlling a Synth using a Neural Network
Video tutorial introducing the FluidMLPRegressor neural network.
A Brief Synopsis
This tutorial guides you through building a bespoke synthesis controller. It leverages a type of Neural Network in the Fluid Corpus Manipulation toolkit, the MLPRegressor, to map a set of input data to a set of output data. This workflow allows you to train the MLPRegressor to learn a mapping between input and output data of any dimensionality: a single fader could control ten other faders (sketched in code after the list below), or a joystick could drive a complex collection of parameters. Fundamentally, the Neural Network doesn’t care what data you give it, which means you get to decide, and thereby shape, the relationship between the inputs and outputs. This can be applied musically in a number of contexts. For example, you might:
- Map a controller of any dimensionality to something else, such as a synth, a VST effect, or a sequencer
- Associate gestures with each other using data from a Leap Motion Controller or a phone.
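To make the single-fader example concrete, here is a minimal conceptual sketch in Python using scikit-learn's MLPRegressor as a stand-in for the FluCoMa object (the data, layer size, and training settings are illustrative assumptions, not the toolkit's API). A handful of example points pair one input fader position with ten output fader positions; after fitting, the network interpolates ten output values for an unseen input position.

```python
# Conceptual sketch only: scikit-learn's MLPRegressor stands in for
# FluCoMa's MLPRegressor object; the data and layer sizes are invented
# for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

# Training examples: each row pairs ONE input fader position (0-1)
# with TEN output fader positions (0-1).
X = np.array([[0.0], [0.25], [0.5], [0.75], [1.0]])   # 1-D input
y = np.random.default_rng(0).random((5, 10))          # 10-D targets (placeholder values)

# A small multilayer perceptron regressor with one hidden layer of 8 units.
mlp = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                   max_iter=5000, random_state=0)
mlp.fit(X, y)

# An unseen fader position is mapped to ten outputs, interpolating
# between the trained examples.
print(mlp.predict([[0.6]]))   # -> array of 10 predicted fader values
```

The same idea scales to any input and output dimensionality: replace the single input column with joystick axes or sensor data, and the ten targets with whatever synthesis parameters you want the network to drive.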