{"id":2818,"date":"2014-11-03T12:20:19","date_gmt":"2014-11-03T12:20:19","guid":{"rendered":"https:\/\/irsg.bcs.org\/informer\/?p=2818"},"modified":"2014-11-03T12:20:19","modified_gmt":"2014-11-03T12:20:19","slug":"music-information-retrieval-at-the-johannes-kepler-university-linz-austria","status":"publish","type":"post","link":"https:\/\/archive-irsg.bcs.org\/informer\/?p=2818","title":{"rendered":"Music Information Retrieval at JKU Linz"},"content":{"rendered":"<p>The <a href=\"http:\/\/www.cp.jku.at\">Department of Computational Perception<\/a> at the <a href=\"http:\/\/www.jku.at\">Johannes Kepler University Linz<\/a> was founded in October 2004, with the appointment of <a href=\"http:\/\/www.cp.jku.at\/people\/widmer\/\">Prof. Gerhard Widmer<\/a>. Its mission is to develop computational models and algorithms that permit computers to perceive and &#8220;understand&#8221; aspects of the external world, where we interpret &#8220;perception&#8221; in the widest sense of the word, as the extraction of useful high-level information from complex, possibly low-level data, including\u00a0text,\u00a0audio, video, image, and sensor data.<\/p>\n<p><!--more-->While the research carried out at the department is highly interdisciplinary, connecting fields like machine learning, music understanding, signal processing, and web mining, the area of music information retrieval (MIR) has always been a focus.\u00a0In the following, a selection of our research directions related to <strong>music retrieval and recommendation<\/strong> is described and corresponding prototype applications are presented. Supporting videos are available from our <a href=\"https:\/\/www.youtube.com\/user\/CPJKU\">YouTube channel<\/a>.<\/p>\n<h1>Audio content-based search, retrieval, and browsing<\/h1>\n<p>The development of <strong>highly efficient music feature extractors<\/strong> from audio, which describe for instance aspects of timbre or rhythm, is one of our core research topics. 
Given such features, we then develop approaches to <strong>automatic playlist generation<\/strong>, <strong>browsing music collections<\/strong>, and <strong>similarity-based search and retrieval<\/strong>, among others. Our algorithms have several times shown superior results in the <a href=\"http:\/\/www.music-ir.org\/mirex\/wiki\/2013:Audio_Music_Similarity_and_Retrieval_Results\">audio music similarity and retrieval<\/a> task of the annual <a href=\"http:\/\/www.music-ir.org\/mirex\/\">Music Information Retrieval Evaluation eXchange (MIREX)<\/a> competition. A prototype application for similarity-based music retrieval is the &#8220;Wolperdinger&#8221; music search engine, developed by <a href=\"http:\/\/www.schnitzer.at\/dominik\/\">Dominik Schnitzer<\/a>, a former PhD student. This interface retrieves the songs most similar, in terms of audio content, to the one currently playing. To this end, the user just needs to click on the seed song and is immediately presented with a new playlist of similar songs.<\/p>\n<figure id=\"attachment_2895\" aria-describedby=\"caption-attachment-2895\" style=\"width: 389px\" class=\"wp-caption aligncenter\"><a rel=\"attachment wp-att-2895\" href=\"https:\/\/irsg.bcs.org\/informer\/2014\/11\/music-information-retrieval-at-the-johannes-kepler-university-linz-austria\/wolperdinger\/\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2895 \" src=\"https:\/\/irsg.bcs.org\/informer\/wp-content\/uploads\/wolperdinger.jpg\" alt=\"\" width=\"389\" height=\"278\" \/><\/a><figcaption id=\"caption-attachment-2895\" class=\"wp-caption-text\">The &quot;Wolperdinger&quot; system for music search and retrieval.<\/figcaption><\/figure>\n<p>A prototype for browsing collections of music that is potentially unknown to the user is the &#8220;<a href=\"http:\/\/www.cp.jku.at\/projects\/nepTune\/\">nepTune<\/a>&#8221; interface. 
Given an arbitrary collection of digital music files, &#8220;nepTune&#8221; performs content-based clustering to create a virtual island landscape in which the user can freely navigate the collection and hear the closest sounds via a surround sound system. A demo video is available <a href=\"https:\/\/www.youtube.com\/watch?v=3tTVjQovY-U\">here<\/a>. This research area is led by <a href=\"http:\/\/www.cp.jku.at\/people\/knees\/\">Peter Knees<\/a>.<\/p>\n<figure id=\"attachment_2937\" aria-describedby=\"caption-attachment-2937\" style=\"width: 504px\" class=\"wp-caption aligncenter\"><a rel=\"attachment wp-att-2937\" href=\"https:\/\/irsg.bcs.org\/informer\/2014\/11\/music-information-retrieval-at-the-johannes-kepler-university-linz-austria\/neptune-3\/\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2937 \" src=\"https:\/\/irsg.bcs.org\/informer\/wp-content\/uploads\/neptune2.jpg\" alt=\"\" width=\"504\" height=\"263\" \/><\/a><figcaption id=\"caption-attachment-2937\" class=\"wp-caption-text\">The &quot;nepTune&quot; interface to browse music collections.<\/figcaption><\/figure>\n<h1>Music identification from audio and from performance<\/h1>\n<p>Exploiting other kinds of audio features, we further develop methods to automatically segment an audio stream and <strong>identify music pieces<\/strong> therein, even when the input signal is distorted or exhibits changes in tempo. To this end, <a href=\"http:\/\/www.cp.jku.at\/people\/sonnleitner\/\">Reinhard Sonnleitner<\/a> develops tempo- and pitch-invariant fingerprinting methods and systems that are able to correctly identify music tracks from a short audio recording even in very noisy environments like live mixes by DJs, where tempo and pitch are usually manipulated in various ways. 
In this context, <strong>music segmentation<\/strong> techniques, which we also research, play an important role in detecting individual tracks within audio streams.<\/p>\n<p>Furthermore, similar techniques can be used in the domain of classical music, where the goal is to <strong>identify the piece that is being performed<\/strong> based on a database of sheet music. Here, we mainly focus on live piano performances and develop methods that are able to achieve correct identification within a few seconds. Coupled with a <strong>score following<\/strong> algorithm, this technology can be used as a versatile music identification and tracking system. Without needing to search for the sheet music, a musician can simply sit down at the piano and play a few bars. The system identifies the piece, finds the correct position, tracks the progress over time, and even turns the pages automatically. A demo video is available <a href=\"https:\/\/www.youtube.com\/watch?v=SUBtND_MJZs\">here<\/a>. Research in this direction, including working prototype applications, is conducted by <a href=\"http:\/\/www.cp.jku.at\/people\/arzt\/\">Andreas Arzt<\/a>.<\/p>\n<figure id=\"attachment_2922\" aria-describedby=\"caption-attachment-2922\" style=\"width: 506px\" class=\"wp-caption aligncenter\"><a rel=\"attachment wp-att-2922\" href=\"https:\/\/irsg.bcs.org\/informer\/2014\/11\/music-information-retrieval-at-the-johannes-kepler-university-linz-austria\/screenshot-tracker\/\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2922  \" src=\"https:\/\/irsg.bcs.org\/informer\/wp-content\/uploads\/Screenshot-Tracker.png\" alt=\"\" width=\"506\" height=\"316\" \/><\/a><figcaption id=\"caption-attachment-2922\" class=\"wp-caption-text\">The &quot;Piano Music Companion&quot; for music identification and following.<\/figcaption><\/figure>\n<h1>User-aware recommendation and playlist generation<\/h1>\n<p>Taking into account contextual aspects 
of the listener is vital to create <strong>user-aware models of listening behavior<\/strong>. We develop and use such models to specifically address the tasks of <strong>automatic playlist adaptation<\/strong>, <strong>music recommendation<\/strong>, and <strong>music browsing<\/strong>, according to the listener&#8217;s current preferences. These preferences are influenced by a variety of factors, such as time, weather, activity, location, and social context (alone, with friends or family). A person might want to listen to an energetic rock song when doing sports, for instance, but prefer some relaxing reggae music when at the beach on a sunny day. This line of research is directed by <a href=\"http:\/\/www.cp.jku.at\/people\/schedl\/\">Markus Schedl<\/a>.<\/p>\n<h2>Playlist generation on smart phones<\/h2>\n<p>Given today&#8217;s wide availability of smart mobile devices, we implemented one of our approaches to automatic playlist generation and adaptation in an Android application called &#8220;<a href=\"http:\/\/www.cp.jku.at\/projects\/MMG\/\">Mobile Music Genius<\/a>&#8221;. This music player monitors more than 100 aspects of the listener&#8217;s context while she interacts with the player. It learns and constantly improves models that describe the relationship between the user context and her preferred artist or song. Feedback such as play, pause, or skip events is used to infer whether a user likes or dislikes an item.<\/p>\n<p>Playlists can be created manually or automatically by defining a seed song and some properties of the playlist (e.g., number of songs or whether songs by the seed artist should be included). In the latter case, the playlist is populated with the songs most similar to the seed, which are identified using a model of tag-based similarity.<\/p>\n<p>During playback, the player continuously monitors the user context and compares it to the preceding context. 
Once the discrepancy between the two exceeds a sensitivity threshold, a playlist update is triggered after the current song. In this case, the new context information is fed into a classifier trained on the user&#8217;s previous context and music preference data. The classifier then outputs a list of songs that were listened to in similar contexts, which are in turn added to the playlist.<\/p>\n<figure id=\"attachment_2859\" aria-describedby=\"caption-attachment-2859\" style=\"width: 231px\" class=\"wp-caption aligncenter\"><a rel=\"attachment wp-att-2859\" href=\"https:\/\/irsg.bcs.org\/informer\/2014\/11\/music-information-retrieval-at-the-johannes-kepler-university-linz-austria\/mmg_playlist_generation-2\/\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2859   \" src=\"https:\/\/irsg.bcs.org\/informer\/wp-content\/uploads\/mmg_playlist_generation1.png\" alt=\"\" width=\"231\" height=\"411\" \/><\/a><figcaption id=\"caption-attachment-2859\" class=\"wp-caption-text\">Playlist generation in &quot;Mobile Music Genius&quot;.<\/figcaption><\/figure>\n<h2>Geospatial music recommendation from social media data<\/h2>\n<p>While our research on user-aware playlist generation takes into account a wide variety of contextual listener aspects from a relatively small amount of data (a few thousand records), we also exploit the abundance of information present in social media data (hundreds of millions of records), even though far fewer aspects of the user context are directly available. As our focus is on MIR, we concentrate on <a href=\"http:\/\/www.last.fm\/\">Last.fm<\/a> and <a href=\"http:\/\/www.twitter.com\">Twitter<\/a>, from which we mine <strong>music- and listener-related information<\/strong>. 
This work has already resulted in two data sets available for research, namely &#8220;<a href=\"http:\/\/www.cp.jku.at\/datasets\/MusicMicro\/\">MusicMicro<\/a>&#8221; and the &#8220;<a href=\"http:\/\/www.cp.jku.at\/datasets\/MMTD\/\">Million Musical Tweets Dataset<\/a>&#8221;. In addition to the posting text, tweets frequently carry attached GPS information, which enables interesting research tasks, such as geospatial music recommendation, interfaces to explore music listening behavior around the world, and music popularity estimation.<br \/>\nUsing the aforementioned data sources, in particular information about listening events and listeners&#8217; characteristics, we develop methods to enhance music recommendation algorithms: content-based approaches, collaborative filtering, and hybrids. We are particularly interested in <strong>hybrid fusion schemes<\/strong> and in analyzing the <strong>influence of user characteristics<\/strong> on the performance of music recommendation algorithms.<\/p>\n<h2>Browsing interfaces to explore listening behavior<\/h2>\n<p>Another related task is the development of browsing interfaces to explore music listening events and music preferences on a worldwide scale. Such interfaces should be capable of <strong>analyzing differences between regions or countries<\/strong>. One of our prototypes is the &#8220;<a href=\"http:\/\/www.cp.jku.at\/projects\/MusicTweetMap\/\">Music Tweet Map<\/a>&#8221; web application, which allows its users to <strong>retrieve and explore the listening events of microbloggers<\/strong> by time, location, genre, artist, and track. 
It was developed and is maintained by <a href=\"http:\/\/www.cp.jku.at\/people\/hauger\/\">David Hauger<\/a>.<\/p>\n<figure id=\"attachment_2857\" aria-describedby=\"caption-attachment-2857\" style=\"width: 473px\" class=\"wp-caption aligncenter\"><a rel=\"attachment wp-att-2857\" href=\"https:\/\/irsg.bcs.org\/informer\/2014\/11\/music-information-retrieval-at-the-johannes-kepler-university-linz-austria\/mtm_location\/\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2857 \" src=\"https:\/\/irsg.bcs.org\/informer\/wp-content\/uploads\/mtm_location.jpg\" alt=\"\" width=\"473\" height=\"288\" \/><\/a><figcaption id=\"caption-attachment-2857\" class=\"wp-caption-text\">Exploring listening events by location and genre using &quot;Music Tweet Map&quot;.<\/figcaption><\/figure>\n","protected":false},"excerpt":{"rendered":"<p>The Department of Computational Perception at the Johannes Kepler University Linz was founded in October 2004, with the appointment of Prof. Gerhard Widmer. 
Its mission is to develop computational models and algorithms that permit computers to perceive and &#8220;understand&#8221; aspects of the external world, where we interpret &#8220;perception&#8221; in the widest sense of the word,&hellip; <a class=\"more-link\" href=\"https:\/\/archive-irsg.bcs.org\/informer\/?p=2818\">Continue reading <span class=\"screen-reader-text\">Music Information Retrieval at JKU Linz<\/span><\/a><\/p>\n","protected":false},"author":37,"featured_media":2935,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[183,204],"tags":[],"class_list":["post-2818","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-autumn-2014","category-org-overview","entry"],"_links":{"self":[{"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/wp\/v2\/posts\/2818","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/wp\/v2\/users\/37"}],"replies":[{"embeddable":true,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2818"}],"version-history":[{"count":0,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/wp\/v2\/posts\/2818\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=\/"}],"wp:attachment":[{"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2818"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2818"},{"taxonomy":"post_tag","embeddable":tr
ue,"href":"https:\/\/archive-irsg.bcs.org\/informer\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2818"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}