Workshop: Personalized Music Tagging via Vibrary
Date/Time: Wednesday October 2, 6:30 PM - 8:30 PM
Location: Room 1045, Janet Ayers Academic Center (1500 Wedgewood Ave), Belmont University. Free parking under the building.
Food & Drinks provided.
Description: Come and learn how to train your own machine learning application for personalized music tagging, and help contribute to the testing datasets for an ongoing ASPIRE project.
What to bring: Your Mac laptop (sorry, Windows users) and a hard drive with audio samples for various instruments, genres, moods & more.
The Story: One of the projects born out of our group is a utility for producers and composers to re-index their libraries of samples and loops. Ethan Henley proposed it as an application of Scott Hawley’s audio classifier; it was developed further at HackMT 2018, and since then it has been an official project of the Incubator Lab collaboration between Scott and Art+Logic, with help from a few friends such as Kyle Baker!
This application – still in development – has become known as “Vibrary,” and will be presented at AES in October.
For this workshop, we’ll work with the Vibrary client and see what kinds of musical categories – instruments, genres, moods, keys, time signatures, …?? – the system is able to learn to classify.
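The flyer doesn’t describe Vibrary’s internals, but the core idea of learning user-defined tags from a labeled sample library can be illustrated with a toy sketch. The example below is an assumption for illustration only (it is not Vibrary’s actual pipeline): it stands in for real audio features with small synthetic vectors and tags a new clip by nearest-centroid matching.

```python
# Toy illustration only -- NOT Vibrary's actual method.
# A real system would extract spectrogram-based features from audio
# and train a neural classifier; here we use tiny synthetic feature
# vectors and a nearest-centroid tagger to show the idea.
import numpy as np

def train_tagger(labeled_features):
    """Average each tag's feature vectors into one centroid per tag."""
    return {tag: np.mean(vecs, axis=0) for tag, vecs in labeled_features.items()}

def predict_tag(centroids, features):
    """Return the tag whose centroid is closest to the given features."""
    return min(centroids, key=lambda t: np.linalg.norm(centroids[t] - features))

# Hypothetical "sample library": two user-defined tags, each with a
# small cluster of feature vectors (stand-ins for analyzed audio clips).
library = {
    "drums":  [np.array([1.0, 0.0]) + 0.1 * np.array([i, -i]) for i in range(3)],
    "synths": [np.array([0.0, 1.0]) + 0.1 * np.array([-i, i]) for i in range(3)],
}

centroids = train_tagger(library)
print(predict_tag(centroids, np.array([0.9, 0.1])))  # a drum-like clip
```

The workshop’s actual exercise replaces the synthetic vectors with features computed from your own samples, which is why bringing a drive of labeled audio matters.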
For more info, read the AES Engineering Brief about Vibrary.