dc.description.abstract |
Music emotion analysis has attracted increasing attention in the music information retrieval
community over the past two decades. Since music emotion describes the specific emotional
meaning of a music clip, it has become an important aspect by which listeners can
organize their personal music collections. Collecting such subjectively perceived data is important for further research in music emotion recognition and in the field of music data mining.
A systematic literature review was carried out, resulting in several key findings: the majority
of existing studies on emotion metadata have focused on the Western world, whereas culture-specific
musical content such as Sri Lankan folk melodies remains largely unexplored despite its richness in
emotional expression; discrete and dimensional emotion models have been the most commonly used; and,
when collecting human annotations, greater attention should be paid to demographics, as individual
differences strongly influence music emotion perception. Based on these
findings, and to open up avenues for future researchers to computationally explore these
under-studied melodies, this study presents a music dataset comprising an initial set of 76
music stimuli verified by a panel of experts. A platform was developed to collect emotion
annotations, supporting both categorical and dimensional emotional ratings. The annotator profile
takes into consideration demographic factors such as age, gender, ethnicity, religion,
and educational background in music, enabling the platform to be used for future research
across multiple disciplines. It is believed that this study will facilitate intelligent, large-scale
music analysis in the music information retrieval field by introducing a novel dataset and a
platform for collecting emotion annotations. |
en_US |