The FMP notebooks offer a collection of educational material that closely follows the textbook Fundamentals of Music Processing (FMP). This is the starting website, which opens when you visit https://www.audiolabs-erlangen.de/FMP. Besides giving an overview, this website provides information on the license and the main contributors.
The FMP notebooks are a collection of educational material for teaching and learning Fundamentals of Music Processing (FMP) with a particular focus on the audio domain. Covering well-established topics in Music Information Retrieval (MIR) as motivating application scenarios, the FMP notebooks provide detailed textbook-like explanations of central techniques and algorithms in combination with Python code examples that illustrate how to implement the theory. All components including the introductions of MIR scenarios, illustrations, sound examples, technical concepts, mathematical details, and code examples are integrated into a consistent and comprehensive framework based on Jupyter notebooks. The FMP notebooks are suited for studying the theory and practice, for generating educational material for lectures, as well as for providing baseline implementations for many MIR tasks, thus addressing students, teachers, and researchers.
The text and figures of these notebooks are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (see the corresponding license file). The Python package libfmp (i.e., the content of the directory libfmp) is licensed under the MIT license (see the file libfmp_LICENSE) and is available on GitHub. As for the audio material, the respective original licenses apply. This site contains material (text passages, figures) from the book Fundamentals of Music Processing. If you use code or material from this site, please give a reference to this book (e.g., Figure 1.1 from [Müller, FMP, Springer 2021]). If you publish results obtained with these Python notebooks, please consider the following references:
If a static view of the FMP notebooks is enough for you, the exported HTML versions can be used right away without any installation. All material including the explanations, the figures, and the audio examples can be accessed by just following the HTML links. If you want to execute the Python code cells, you have to download the notebooks (along with the data), create an environment, and start a Jupyter server. You then need to follow the IPYNB links within the Jupyter session. The necessary steps are explained in detail in the FMP notebook on how to get started.
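The setup steps above can be sketched as follows. This is a minimal sketch, not the authoritative instructions: the directory name `FMP`, the file `environment.yml`, and the environment name `FMP` are assumptions; please follow the getting-started notebook for the exact steps.

```shell
# Minimal setup sketch (directory and environment names are assumptions;
# see the FMP getting-started notebook for the authoritative steps).
cd FMP                               # directory with the downloaded notebooks and data
conda env create -f environment.yml  # create the Python environment
conda activate FMP                   # activate the (assumed) environment name
jupyter notebook                     # start the Jupyter server,
                                     # then follow the IPYNB links
```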
The collection of FMP notebooks is organized along the eight chapters of the textbook [Müller, FMP, Springer 2021]. The following table gives an overview of these chapters and provides links. Furthermore, in Part B, we provide basic information on the Python programming language, introduce the Jupyter framework, and describe various tools used throughout the FMP notebooks.
| Chapter | Title | Notions, Techniques & Algorithms | HTML | IPYNB |
| --- | --- | --- | --- | --- |
| B | Basics | Get started; Jupyter framework; Anaconda; multimedia; Python programming; visualization; audio; Numba; annotations; libfmp; MIR resources | [html] | [ipynb] |
| 0 | Overview | Overview of the notebooks (this notebook/website) | [html] | [ipynb] |
| 1 | Music Representations | Music notation; MIDI; audio signal; waveform; pitch; loudness; timbre | [html] | [ipynb] |
| 2 | Fourier Analysis of Signals | Discrete/analog signal; sinusoid; exponential; Fourier transform; Fourier representation; DFT; FFT; STFT | [html] | [ipynb] |
| 3 | Music Synchronization | Chroma feature; dynamic programming; dynamic time warping (DTW); alignment; user interface | [html] | [ipynb] |
| 4 | Music Structure Analysis | Similarity matrix; repetition; thumbnail; homogeneity; novelty; evaluation; precision; recall; F-measure; visualization; scape plot | [html] | [ipynb] |
| 5 | Chord Recognition | Harmony; music theory; chords; scales; templates; hidden Markov model (HMM); evaluation | [html] | [ipynb] |
| 6 | Tempo and Beat Tracking | Onset; novelty; tempo; tempogram; beat; periodicity; Fourier analysis; autocorrelation | [html] | [ipynb] |
| 7 | Content-Based Audio Retrieval | Identification; fingerprint; audio matching; version identification; cover song | [html] | [ipynb] |
| 8 | Musically Informed Audio Decomposition | Harmonic/percussive separation; signal reconstruction; instantaneous frequency; fundamental frequency (F0); trajectory; nonnegative matrix factorization (NMF) | [html] | [ipynb] |
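To give a flavor of the kind of baseline implementations the notebooks provide, the following is a minimal, self-contained toy sketch of dynamic time warping (DTW), one of the techniques listed under Music Synchronization. This sketch is ours and not taken from the notebooks; the libfmp package contains the full-featured implementations.

```python
import math

def dtw_cost(x, y, dist=lambda a, b: abs(a - b)):
    """Total DTW cost between two sequences via the accumulated-cost matrix.

    Toy illustration only: uses the standard step sizes (1,0), (0,1), (1,1)
    and an absolute-difference local cost; no path backtracking.
    """
    n, m = len(x), len(y)
    # D[i][j] = minimal cost of aligning x[:i] with y[:j]
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c = dist(x[i - 1], y[j - 1])
            D[i][j] = c + min(D[i - 1][j],      # insertion
                              D[i][j - 1],      # deletion
                              D[i - 1][j - 1])  # match
    return D[n][m]

print(dtw_cost([1, 2, 3], [1, 2, 2, 3]))  # → 0.0 (the repeated 2 aligns for free)
```

The chroma-based music synchronization pipeline in the notebooks replaces the scalar absolute difference with a distance between chroma feature vectors, but the dynamic-programming recursion is the same.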
The notebooks are based on results, material, and insights that have been obtained in close collaboration with different people. I would like to express my gratitude to my former and current students, collaborators, and colleagues who have influenced and supported me in creating these notebooks. Also, various people have contributed to the code examples of the notebooks; credits are given in the notebooks' acknowledgement sections. Here, I confine myself to mentioning only the names of the main contributors, in alphabetical order:
Furthermore, some of the code examples have been inspired by or are based on code provided by other code collections. In particular, I want to mention the following excellent sources: