Date/Time: Monday, February 11, 6:30 PM - 8:30 PM
Location: Room 1045, Janet Ayers Academic Center (1500 Wedgewood Blvd), Belmont University. Free parking under the building.
Food & Drinks provided.
We’ve been talking about doing this for a few months now, and it’s actually going to happen!
Wekinator is an open-source, cross-platform, easy-to-use machine learning tool for controlling (electronic) musical instruments with almost anything: game controllers, cameras, motion sensors, even your face, all without writing any code. You just train the system to imitate what you want it to do.
This event, we’ll have a couple of student assistants. It’d be particularly good if people can bring in some input and output devices to chain together. These could be physical peripherals, instruments, or purely software. Take a look at http://www.wekinator.org/ for sample devices, protocols, modules, etc.
What to bring: Yourself and your laptop, plus anything else you might want to hook up for input or output, like instruments. For example, I’ve got a guitar synth modulator I’ve been dying to hook up to a facial-geometry controller, and I’ve got a couple of Xbox 360 controllers. (Otherwise, there are a few free software utilities you can use.) Wekinator outputs OSC messages, which are like MIDI but not quite the same, and there are free OSC-to-MIDI converters we can download and run.
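If you’re curious what an OSC message actually looks like on the wire, here’s a minimal sketch in Python using only the standard library. It builds a packet by hand following the OSC 1.0 encoding rules (null-padded address and type-tag strings, then big-endian 32-bit floats). The "/wek/outputs" address is an assumption based on Wekinator’s default output path; check the Wekinator docs for your setup.

```python
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, *values: float) -> bytes:
    # An OSC message: padded address, padded type-tag string (","
    # followed by one "f" per float), then big-endian 32-bit floats.
    # Note the floats: unlike MIDI's 7-bit integers, OSC carries
    # full-precision continuous values.
    tags = "," + "f" * len(values)
    data = osc_pad(address.encode()) + osc_pad(tags.encode())
    for v in values:
        data += struct.pack(">f", v)
    return data

# A hypothetical packet like Wekinator might emit for one output value:
packet = osc_message("/wek/outputs", 0.5)
```

In practice you wouldn’t build packets by hand; a library like python-osc (or Wekinator itself) handles this, but seeing the bytes makes the MIDI comparison concrete.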
If you want to get started early, check out Wekinator creator Rebecca Fiebrink’s free online course: https://youtu.be/SdT0EwzZTsI