Beat-Based Rhythm Quantization of MIDI Performances

Main: 1 page · Bibliography: 1 page · 1 table
Abstract

We propose a transformer-based rhythm quantization model that incorporates beat and downbeat information to quantize MIDI performances into metrically aligned, human-readable scores. A beat-based preprocessing method transfers both score and performance data into a unified token representation. We optimize the model architecture and data representation, and train on piano and guitar performances. Our model exceeds the state of the art as measured by the MUSTER metric.
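To make the core idea of beat-based rhythm quantization concrete, here is a minimal sketch of the baseline operation such systems refine: snapping performed MIDI onset times to the nearest subdivision of the tracked beat grid. The function name, grid resolution, and data layout are illustrative assumptions, not the paper's actual token representation or model.

```python
def quantize_onsets(onsets, beats, subdivisions=4):
    """Snap performed onset times (seconds) to the nearest beat subdivision.

    onsets: performed note onset times, in seconds
    beats: beat times from a beat tracker, in seconds, sorted ascending
    subdivisions: grid slots per beat (4 = sixteenth notes in 4/4)
    Returns metrical positions as (beat_index, subdivision) pairs.
    """
    positions = []
    for t in onsets:
        # Find the beat interval [beats[i], beats[i+1]] containing t,
        # clamping to the valid range at the edges.
        i = max(0, min(len(beats) - 2, sum(b <= t for b in beats) - 1))
        span = beats[i + 1] - beats[i]
        frac = (t - beats[i]) / span          # fractional position in the beat
        step = round(frac * subdivisions)     # nearest grid slot
        # A rounded step equal to `subdivisions` rolls over to the next beat.
        positions.append((i + step // subdivisions, step % subdivisions))
    return positions

# A slightly rushed offbeat snaps to the nearest eighth-note slot:
print(quantize_onsets([0.1, 0.72, 1.0], [0.0, 0.5, 1.0, 1.5], subdivisions=2))
```

A purely grid-based snap like this ignores tempo drift and expressive timing, which is why learned models conditioned on beat and downbeat estimates, as proposed above, can produce more readable scores than fixed-grid quantization.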
