
Satellite events




Satellite workshops (Saturday 13 October)

Two satellite workshops will take place at the very end of ISMIR 2012, on Saturday 13 October 2012.

Whether or not you are attending ISMIR, have a look at their respective webpages for details on how to attend and how to get involved:




Evaluation initiatives

Three distinct and complementary MIR-related evaluation initiatives are running in 2012 (see details below).

Although these initiatives are independent of ISMIR 2012 and are organised by different institutions and groups of individuals, a special session of the conference will be dedicated to reporting on them and to reflecting on evaluation methodologies in MIR:

Panel session on Evaluation Initiatives in MIR

Chair
Panelists
Invited speaker
  • Gareth Jones (Dublin City University) - "Searching for a Music REtrieval Conference (MREC)"

The aim of this panel is to discuss the methodologies currently used in MIR evaluations and to compare them with the evaluation practices of other research fields. To this end, we have invited key actors of the current MIR benchmarking initiatives (MIREX, the Million Song Dataset Challenge and MusiClef/MediaEval), of MIR meta-evaluations, and one of the key actors of IR evaluation. Among the potential topics to be discussed are:

Definition of the tasks to be evaluated
What methodology should be used to define a task (bottom-up vs. top-down)? For which purpose should a task be evaluated: low-level tasks (functionality-oriented, such as beat or chord estimation) vs. full-system tasks (use-case-oriented, such as music recommendation systems)? The tasks that are part of large-scale international evaluations de facto define the topics that new contributors to the MIR field will work on, so the methodology followed to define them is of utmost importance.
Evaluation
How should a specific task be evaluated? Which data and which measures should be used, and how reliable are the results obtained?
Data
How can we obtain more data? How should we deal with data availability (not only music collections, but also raw system outputs, judgments and annotations)? Should we move towards low-cost evaluation methodologies (see the TREC Million Query Track, 2007-2009)? Currently, most MIR systems are concerned with audio-only or symbolic-only scenarios. Multi-modal systems (for example, those aggregating information from audio content, lyrics and web mining) should also make it possible to assess the impact of each technology on the final user application.
Methodology
What is the best methodology to drive improvements? What kind of evaluation framework should be used (open vs. closed evaluation)? What could be improved relative to previous evaluation initiatives? How can we make results reproducible? How can we make MIR evaluation sustainable over time?




Evaluation initiative links

MIREX 2012

The 2012 Music Information Retrieval Evaluation eXchange (MIREX) will be the eighth iteration of MIREX. The International Music Information Retrieval Systems Evaluation Laboratory (IMIRSEL) at the Graduate School of Library and Information Science (GSLIS), University of Illinois at Urbana-Champaign (UIUC) is the principal organizer of MIREX 2012.

To participate, or to propose new tasks, please consult the MIREX wiki page: http://www.music-ir.org/mirex/wiki/2012

Million Song Dataset Challenge

The Million Song Dataset Challenge aims to be the best possible offline evaluation of a music recommendation system. It is a joint effort between the Computer Audition Lab at UC San Diego and LabROSA at Columbia University. Follow-up evaluations will be conducted by IMIRSEL at the Graduate School of Library and Information Science at UIUC as part of the Music Information Retrieval Evaluation eXchange (MIREX).

The user data for the challenge, like much of the data in the Million Song Dataset, was generously donated by The Echo Nest, with additional data contributed by SecondHandSongs, musiXmatch, and Last.fm.

For specific details on the MSD Challenge, please see:
http://www.kaggle.com/c/msdchallenge
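
As an illustration of what such an offline evaluation can look like, the sketch below scores a ranked list of recommended songs for each user against that user's held-out listening history using a truncated mean average precision. This is only a minimal sketch under assumed inputs (the dictionary structures, the cutoff of 500 and the function names are illustrative assumptions), not the challenge's official scoring code.

    def average_precision_at_k(ranked_songs, relevant_songs, k=500):
        """Truncated average precision for one user's recommendation list.

        ranked_songs: song IDs ordered by predicted preference.
        relevant_songs: set of song IDs in the user's held-out history.
        """
        hits = 0
        precision_sum = 0.0
        for i, song in enumerate(ranked_songs[:k]):
            if song in relevant_songs:
                hits += 1
                precision_sum += hits / (i + 1)
        # Normalise by the number of relevant items recoverable within the cutoff.
        denom = min(len(relevant_songs), k)
        return precision_sum / denom if denom else 0.0

    def mean_average_precision(recommendations, held_out, k=500):
        """Mean of per-user truncated average precision over all test users."""
        scores = [
            average_precision_at_k(recommendations[user], held_out[user], k)
            for user in held_out
        ]
        return sum(scores) / len(scores) if scores else 0.0

    # Example with two hypothetical users and tiny lists:
    recs = {"u1": ["s3", "s1", "s7"], "u2": ["s2", "s9"]}
    truth = {"u1": {"s1", "s7"}, "u2": {"s4"}}
    print(mean_average_precision(recs, truth))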

MusiClef: Multimodal Music Tagging Task

MusiClef is a benchmarking initiative on Multimodal Music Tagging that will run as a MediaEval 2012 "Brave New Task". The task complements existing benchmarking initiatives and fosters less explored methodological directions in Music Information Retrieval. MusiClef deals with a concrete use case, encourages multimodal approaches, and strives to make results as transparent as possible. Transparency is encouraged at several levels and stages, from the feature extraction procedure up to the evaluation phase, in which a dedicated categorization of ground-truth tags will be used to deepen the understanding of the relation between the proposed approaches and the experimental results.

The organisers of MusiClef are Nicola Orio (University of Padua), Geoffroy Peeters (IRCAM Paris), Markus Schedl (Johannes Kepler University Linz) and Cynthia Liem (Delft University of Technology). For details on MusiClef, please see: 
http://www.multimediaeval.org/mediaeval2012/newtasks/music2012/
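
Since the task is a multi-label tagging problem scored against ground-truth tags, a per-tag precision/recall/F-measure computation gives a feel for how such runs might be assessed. The snippet below is only a hedged sketch with assumed data structures (dictionaries mapping track IDs to tag sets); the actual MusiClef evaluation protocol is defined on the task webpage above.

    def per_tag_f_measure(predicted, ground_truth, tags):
        """Per-tag precision, recall and F-measure for multi-label tagging.

        predicted and ground_truth map track IDs to sets of tags;
        tags is the list of tag names to evaluate.
        """
        results = {}
        for tag in tags:
            tp = fp = fn = 0
            for track, true_tags in ground_truth.items():
                has_pred = tag in predicted.get(track, set())
                has_true = tag in true_tags
                if has_pred and has_true:
                    tp += 1
                elif has_pred:
                    fp += 1
                elif has_true:
                    fn += 1
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            f_measure = (2 * precision * recall / (precision + recall)
                         if precision + recall else 0.0)
            results[tag] = {"precision": precision,
                            "recall": recall,
                            "f_measure": f_measure}
        return results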

 
