Following Section 6.2.1 of [Müller, FMP, Springer 2015], we introduce in this notebook the concepts of beat and tempo.
Temporal and structural regularities are perhaps the most important incentives for people to get involved and to interact with music. It is the beat that drives music forward and provides the temporal framework of a piece of music. Intuitively, the beat corresponds to the pulse a human taps along to when listening to music. The beat is often described as a sequence of perceived pulse positions, which are typically equally spaced in time and specified by two parameters: the phase and the period. The term tempo refers to the rate of the pulse and is given by the reciprocal of the beat period. The following figure illustrates these notions using the beginning of "Another One Bites the Dust" by Queen.
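To make these two parameters concrete, the following short sketch constructs a beat sequence from a given phase and period and derives the tempo as the reciprocal of the period. The numerical values are made up for illustration and are not taken from the Queen recording.

import numpy as np

# Illustrative beat parameters (hypothetical values)
phase = 0.2   # position of the first beat (seconds)
period = 0.5  # time between consecutive beats (seconds)

# Beat positions form an equally spaced pulse sequence
beats_sec = phase + period * np.arange(8)
print('Beat positions (sec):', beats_sec)

# The tempo is the reciprocal of the beat period, usually expressed in BPM
tempo_bpm = 60 / period
print('Tempo: %.1f BPM' % tempo_bpm)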
The extraction of tempo and beat information from audio recordings is a challenging problem, in particular for music with weak note onsets and local tempo changes. For example, in the case of romantic piano music, the pianist often takes the freedom of speeding up and slowing down the tempo, an artistic means also referred to as tempo rubato. There is a wide range of music where the notions of tempo and beat remain rather vague or are even nonexistent. Sometimes, the rhythmic flow of music is deliberately interrupted or disturbed by syncopation, where certain notes outside the regular grid of beat positions are stressed. The following audio examples indicate some of these challenges:
Music with weak onsets (Borodin, String Quartet No. 2, 3rd movement):
Romantic music with local tempo fluctuations (rubato) and global tempo changes (Chopin, Op.68, No. 3):
Music with syncopation (Fauré, Op.15):
To make the problem of tempo and beat tracking feasible, most automated approaches rely on two basic assumptions: first, that beat positions occur at note onset positions, and second, that beat positions are more or less equally spaced, at least over certain periods of time.
Even though both assumptions may be violated and inappropriate for certain types of music, they are convenient and reasonable for a wide range of music including most rock and popular songs.
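As a small illustration of the second assumption, one can inspect the inter-beat intervals of a set of beat annotations: if the beats are roughly equally spaced, the intervals cluster around a common period, whose reciprocal yields the tempo. The beat times below are made up for illustration.

import numpy as np

# Hypothetical beat annotations (seconds); small deviations mimic human timing
beats_sec = np.array([0.50, 1.01, 1.49, 2.00, 2.52, 3.00])

# Inter-beat intervals; under the second assumption these are nearly constant
ibi = np.diff(beats_sec)
period = np.median(ibi)  # robust estimate of the beat period
tempo_bpm = 60 / period

print('Inter-beat intervals (sec):', ibi)
print('Estimated period: %.3f sec (tempo: %.1f BPM)' % (period, tempo_bpm))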
In the following code cell, we present for each of the above examples a visualization of a spectral-based novelty function along with manually annotated beat positions (quarter-note level). Furthermore, a sonification indicates the annotated beats by short click sounds superimposed on the original music recordings.
import os
import sys
import numpy as np
from scipy import signal
from matplotlib import pyplot as plt
import librosa
import IPython.display as ipd
import pandas as pd
sys.path.append('..')
import libfmp.b
import libfmp.c2
import libfmp.c6
%matplotlib inline
def plot_sonify_novelty_beats(fn_wav, fn_ann, title=''):
    # Read the beat annotations (quarter-note positions, in seconds)
    ann, label_keys = libfmp.c6.read_annotation_pos(fn_ann, label='onset', header=0)
    df = pd.read_csv(fn_ann, sep=';', keep_default_na=False, header=None)
    beats_sec = df.values.flatten()  # 1D array of annotated beat times
    # Load the audio recording
    Fs = 22050
    x, Fs = librosa.load(fn_wav, sr=Fs)
    x_duration = len(x) / Fs
    # Compute a spectral-based novelty function
    nov, Fs_nov = libfmp.c6.compute_novelty_spectrum(x, Fs=Fs, N=2048, H=256,
                                                     gamma=1, M=10, norm=1)
    # Plot the novelty function together with the annotated beat positions
    figsize = (8, 1.5)
    fig, ax, line = libfmp.b.plot_signal(nov, Fs_nov, color='k', figsize=figsize,
                                         title=title)
    libfmp.b.plot_annotation_line(ann, ax=ax, label_keys=label_keys,
                                  nontime_axis=True, time_min=0, time_max=x_duration)
    plt.show()
    # Sonification: superimpose click sounds at the annotated beat positions
    x_beats = librosa.clicks(times=beats_sec, sr=Fs, click_freq=1000, length=len(x))
    ipd.display(ipd.Audio(x + x_beats, rate=Fs))
#print('Carlos Gardel: Por Una Cabeza')
#fn_ann = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_PorUnaCabeza_quarter.csv')
#fn_wav = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_PorUnaCabeza.wav')
#plot_sonify_novelty_beats(fn_wav, fn_ann)
title = 'Borodin: String Quartet No. 2, 3rd movement'
fn_ann = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Borodin-sec39_RWC_quarter.csv')
fn_wav = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Borodin-sec39_RWC.wav')
plot_sonify_novelty_beats(fn_wav, fn_ann, title)
title = 'Chopin: Op.68, No. 3'
fn_ann = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Chopin.csv')
fn_wav = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Chopin.wav')
plot_sonify_novelty_beats(fn_wav, fn_ann, title)
title = 'Fauré: Op.15'
fn_ann = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Faure_Op015-01-sec0-12_SMD126.csv')
fn_wav = os.path.join('..', 'data', 'C6', 'FMP_C6_Audio_Faure_Op015-01-sec0-12_SMD126.wav')
plot_sonify_novelty_beats(fn_wav, fn_ann, title)