Information

  • Publication Type: Bachelor Thesis
  • Workgroup(s)/Project(s):
  • Date: December 2013
  • Date (Start): 15 October 2012
  • Date (End): 20 December 2013
  • Matriculation Number: 0725136
  • First Supervisor: Stefan Ohrhallinger
  • Keywords: feature extraction, music, visualization

Abstract

The aim of this bachelor’s thesis is to point out ways to extract distinct pieces of information from a song and to combine them into single parameters that reflect the emotion the song currently conveys. It presents approaches for extracting information and data from MIDI and audio files, with a strong focus on MIDI, that can then be used to create a more physics-based and natural-feeling visualization than those shipped with today’s common music player software. For example, the current scale should have an impact on the visualization’s color, as should the current tempo, dynamics, or aggressiveness. Representing these attributes as input parameters that can be used by a visualization application should ultimately result in a better visualization experience for the viewer, because it creates the feeling that what is seen on screen matches the music currently playing. Besides defining such input parameters for visualizations, this thesis also provides a short evaluation of music feature extraction libraries and frameworks that help in reaching the mentioned goal, as well as a few concrete implementations of algorithms, based on the jMusic API framework, that can be used to extract such features.
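
To make the kind of extraction described above concrete, the following minimal Java sketch uses the jMusic API named in the abstract. It loads a MIDI file into a jMusic Score and derives two illustrative visualization parameters: an animation speed from the tempo and a brightness from the mean note velocity. The file name song.mid and both mappings are assumptions chosen for illustration, not the thesis’s actual algorithms.

import jm.music.data.Note;
import jm.music.data.Part;
import jm.music.data.Phrase;
import jm.music.data.Score;
import jm.util.Read;

// Hypothetical sketch: derive simple visualization parameters from a
// MIDI file with jMusic. The mappings below are illustrative only.
public class MidiFeatureSketch {
    public static void main(String[] args) {
        Score score = new Score();
        Read.midi(score, "song.mid"); // parse the MIDI file into a Score

        double tempo = score.getTempo(); // beats per minute

        // Mean MIDI velocity (0-127) over all notes as a rough dynamics proxy.
        double velocitySum = 0;
        int noteCount = 0;
        for (Part part : score.getPartArray()) {
            for (Phrase phrase : part.getPhraseArray()) {
                for (Note note : phrase.getNoteArray()) {
                    if (note.getPitch() >= 0) { // skip rests (negative pitch)
                        velocitySum += note.getDynamic();
                        noteCount++;
                    }
                }
            }
        }
        double meanVelocity = noteCount > 0 ? velocitySum / noteCount : 0;

        // Illustrative parameter mapping for a visualization front end.
        double speed = tempo / 120.0;             // 1.0 at a moderate 120 BPM
        double brightness = meanVelocity / 127.0; // normalized to 0..1

        System.out.printf("tempo=%.1f BPM, speed=%.2f, brightness=%.2f%n",
                tempo, speed, brightness);
    }
}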

Weblinks

No further information available.

BibTeX

@bachelorsthesis{hauer_alex-2013-ba,
  title =      "Physics-based Music Visualization",
  author =     "Alex Hauer",
  year =       "2013",
  abstract =   "The aim of this bachelor’s thesis is to point out ways on
               how to extract distinct bits of information out of a song
               and how to combine them to create single parameters that
               reflect the currently transported emotion of the song. It
               presents approaches on how to extract certain information
               and data from MIDI and audio files that can then be used to
               create a more physics-based and naturally feeling
               visualization than the one that gets shipped with today’s
               common music player software, with a strong focus on MIDI.
               For example, the currently used scale should have an impact
               on the visualization’s color, as well as the current
               tempo, dynamic or aggressivity. Representing these
               attributes as input parameters that can be used by a
               visualization application should ultimately result in a
               better visualization experience for the viewer, because it
               creates a feeling that the things seen on screen match with
               the music currently playing. Besides defining such input
               parameters for visualizations, this paper also provides a
               short evaluation of music feature extraction libraries and
               frameworks that help in reaching the men- tioned goal, as
               well as a few concrete implementations of algorithms that
               can be used to extract such features based on the jMusic API
               framework.",
  month =      dec,
  address =    "Favoritenstrasse 9-11/E193-02, A-1040 Vienna, Austria",
  school =     "Institute of Computer Graphics and Algorithms, Vienna
               University of Technology ",
  keywords =   "feature extraction, music, visualization",
  URL =        "https://www.cg.tuwien.ac.at/research/publications/2013/hauer_alex-2013-ba/",
}