TY - GEN
T1 - Semantic music information retrieval using collaborative indexing and filtering
AU - Leung, C. H. C.
AU - Chan, W. S.
N1 - Copyright:
Copyright 2011 Elsevier B.V., All rights reserved.
PY - 2010
Y1 - 2010
N2 - With the rapid development of multimedia technology, digital music has become increasingly available and now constitutes a significant component of multimedia content on the Internet. Since digital music can be represented in various forms, formats, and dimensions, searching such information is far more challenging than text-based search. While some basic forms of music retrieval are available on the Internet, these tend to be inflexible and have significant limitations. Currently, most of these music retrieval systems rely only on shallow music information (e.g., metadata, album title, lyrics, etc.). Here, we present an approach for deep content-based music information retrieval, which focuses on high-level human perception, incorporating subtle nuances and emotional impressions of the music (e.g., music style, tempo, genre, mood, instrumental combinations, etc.). We also provide a critical evaluation of the most common current Music Information Retrieval (MIR) approaches and propose an innovative adaptive method for music information search that overcomes the current limitations. Our approach focuses on music discovery and recovery through collaborative semantic indexing and user relevance feedback analysis. Through successive use of our indexing model, novel music content indexing can be built from deep user knowledge incrementally and collectively by accumulating users' judgment and intelligence.
AB - With the rapid development of multimedia technology, digital music has become increasingly available and now constitutes a significant component of multimedia content on the Internet. Since digital music can be represented in various forms, formats, and dimensions, searching such information is far more challenging than text-based search. While some basic forms of music retrieval are available on the Internet, these tend to be inflexible and have significant limitations. Currently, most of these music retrieval systems rely only on shallow music information (e.g., metadata, album title, lyrics, etc.). Here, we present an approach for deep content-based music information retrieval, which focuses on high-level human perception, incorporating subtle nuances and emotional impressions of the music (e.g., music style, tempo, genre, mood, instrumental combinations, etc.). We also provide a critical evaluation of the most common current Music Information Retrieval (MIR) approaches and propose an innovative adaptive method for music information search that overcomes the current limitations. Our approach focuses on music discovery and recovery through collaborative semantic indexing and user relevance feedback analysis. Through successive use of our indexing model, novel music content indexing can be built from deep user knowledge incrementally and collectively by accumulating users' judgment and intelligence.
KW - Collaborative filtering
KW - multimedia indexing
KW - music information retrieval
UR - http://www.scopus.com/inward/record.url?scp=78651543789&partnerID=8YFLogxK
U2 - 10.1007/978-90-481-9794-1_65
DO - 10.1007/978-90-481-9794-1_65
M3 - Conference proceeding
AN - SCOPUS:78651543789
SN - 9789048197934
T3 - Lecture Notes in Electrical Engineering
SP - 345
EP - 350
BT - Computer and Information Sciences - Proceedings of the 25th International Symposium on Computer and Information Sciences
T2 - 25th International Symposium on Computer and Information Sciences, ISCIS 2010
Y2 - 22 September 2010 through 24 September 2010
ER -