The FMP notebooks offer a collection of educational material closely following the textbook Fundamentals of Music Processing (FMP). This is the starting website, which opens when you visit https://www.audiolabs-erlangen.de/FMP. Besides giving an overview, this website provides information on the license, the main contributors, and some links.
The FMP notebooks are a collection of educational material for teaching and learning Fundamentals of Music Processing (FMP), with a particular focus on the audio domain. Covering well-established topics in Music Information Retrieval (MIR) as motivating application scenarios, the FMP notebooks provide detailed textbook-like explanations of central techniques and algorithms, combined with Python code examples that illustrate how to implement the theory. All components, including the introductions of MIR scenarios, illustrations, sound examples, technical concepts, mathematical details, and code examples, are integrated into a consistent and comprehensive framework based on Jupyter notebooks. The FMP notebooks are suited for studying theory and practice, for generating educational material for lectures, and for providing baseline implementations for many MIR tasks, thus addressing students, teachers, and researchers alike.
The code, text, and figures of these notebooks are licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. As for the audio material, the respective original licenses apply. This site contains material (text passages, figures) from the book Fundamentals of Music Processing. If you use code or material from this site, please reference this book (e.g., Figure 1.1 from [Müller, FMP, Springer 2015]). If you publish results obtained using these Python notebooks, please cite:
The FMP notebooks are maintained by Meinard Müller. For comments, please email meinard.mueller@audiolabs-erlangen.de. I am grateful for any feedback and suggestions.
If you just want to view the FMP notebooks (exported HTML versions), you can start right away. All material, including the explanations, the figures, and the audio examples, can be accessed by simply following the HTML links. If you want to execute the Python code cells, you have to download the notebooks (along with the data), create an environment, and start a Jupyter server. You then need to follow the IPYNB links within the Jupyter session. The necessary steps are explained in detail in a dedicated notebook.
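The steps above can be sketched as the following command sequence (a minimal sketch only: the download location, environment file name, and environment name `FMP` are assumptions; the dedicated notebook gives the authoritative instructions):

```shell
# Sketch of the setup steps, assuming the notebooks and data have already
# been downloaded and unpacked into the current directory, and that an
# environment file (here assumed to be named environment.yml) is provided.

# Create a conda environment from the provided environment file
conda env create -f environment.yml

# Activate the environment (the name "FMP" is an assumption)
conda activate FMP

# Start the Jupyter server from the notebook directory,
# then follow the IPYNB links within the Jupyter session
jupyter notebook
```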
The collection of notebooks is organized along the eight chapters of the textbook [Müller, FMP, Springer 2015]. The following table gives an overview of these chapters and provides links. Furthermore, we provide some basic information on the Python programming language and the Jupyter framework underlying these notebooks.
| Chapter | Title | Notions, Techniques & Algorithms | HTML | IPYNB |
| --- | --- | --- | --- | --- |
| – | Basics | Basic information on the Python programming language, the Jupyter notebook, and other topics including the Anaconda package management system, Python environments, and Git | [html] | [ipynb] |
| – | Overview | Overview of the notebooks (this notebook/website) | [html] | [ipynb] |
| 1 | Music Representations | Music notation, MIDI, audio signal, waveform, pitch, loudness, timbre | [html] | [ipynb] |
| 2 | Fourier Analysis of Signals | Discrete/analog signal, sinusoid, exponential, Fourier transform, Fourier representation, DFT, FFT, STFT | [html] | [ipynb] |
| 3 | Music Synchronization | Chroma feature, dynamic programming, dynamic time warping (DTW), alignment, user interface | [html] | [ipynb] |
| 4 | Music Structure Analysis | Similarity matrix, repetition, thumbnail, homogeneity, novelty, evaluation, precision, recall, F-measure, visualization, scape plot | [html] | [ipynb] |
| 5 | Chord Recognition | Harmony, music theory, chords, scales, templates, hidden Markov model (HMM), evaluation | [html] | [ipynb] |
| 6 | Tempo and Beat Tracking | Onset, novelty, tempo, tempogram, beat, periodicity, Fourier analysis, autocorrelation | [html] | [ipynb] |
| 7 | Content-Based Audio Retrieval | Identification, fingerprint, indexing, inverted list, matching, version, cover song | [html] | [ipynb] |
| 8 | Musically Informed Audio Decomposition | Harmonic/percussive separation, signal reconstruction, instantaneous frequency, fundamental frequency (F0), trajectory, nonnegative matrix factorization (NMF) | [html] | [ipynb] |
The notebooks are based on results, material, and insights obtained in close collaboration with different people. I would like to express my gratitude to my former and current students, collaborators, and colleagues who have influenced and supported me in establishing these notebooks. Various people have also contributed to the code examples of the notebooks; credits are given in the notebooks' acknowledgment sections. Here, I confine myself to mentioning the main contributors in alphabetical order:
Furthermore, some of the code examples have been inspired by or are based on code provided by other code collections. In particular, I want to mention the following excellent sources: