Please use this identifier to cite or link to this item:
https://hdl.handle.net/10316/94353
Title: Emotionally-Relevant Features for Classification and Regression of Music Lyrics
Authors: Malheiro, Ricardo; Panda, Renato Eduardo Silva; Gomes, Paulo; Paiva, Rui Pedro Pinto de Carvalho e
Keywords: lyrics feature extraction; lyrics music; lyrics music classification; lyrics music emotion recognition; music information retrieval
Issue Date: 2018
Publisher: IEEE
Project: info:eu-repo/grantAgreement/FCT/5876-PPCDTI/102185/PT/MOODetector - A System for Mood-based Classification and Retrieval of Audio Music
Journal: IEEE Transactions on Affective Computing (TAFFC)
Volume: 9
Issue: 2
Abstract: This research addresses the role of lyrics in the music emotion recognition process. Our approach is based on several state-of-the-art features complemented by novel stylistic, structural and semantic features. To evaluate our approach, we created a ground-truth dataset containing 180 song lyrics, annotated according to Russell's emotion model. We conducted four types of experiments: regression and classification by quadrant, arousal and valence categories. Compared to the state-of-the-art features (n-grams, the baseline), adding the other features, including the novel ones, improved the F-measure from 69.9%, 82.7% and 85.6% to 80.1%, 88.3% and 90%, respectively, for the three classification experiments. To study the relation between features and emotions (quadrants), we performed experiments to identify the best features for describing and discriminating each quadrant. To further validate these experiments, we built a validation set comprising 771 lyrics extracted from the AllMusic platform, achieving a 73.6% F-measure in the classification by quadrants. We also conducted experiments to identify interpretable rules that show the relation between features and emotions and the relation among features. Regarding regression, results show that, compared to similar studies for audio, we achieve similar performance for arousal and much better performance for valence.
URI: https://hdl.handle.net/10316/94353
ISSN: 1949-3045
DOI: 10.1109/TAFFC.2016.2598569
Rights: embargoedAccess
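The abstract above evaluates classification by the quadrants of Russell's valence/arousal model and reports results as F-measures. The following is a minimal sketch of that scheme, not the paper's actual pipeline: the quadrant numbering convention, the toy annotations and the scikit-learn scoring call are assumptions made only for illustration.

```python
# Minimal sketch (not the paper's code): mapping Russell's valence/arousal
# plane to four quadrants and scoring a classifier with an F-measure,
# as in the classification-by-quadrant experiments described above.
from sklearn.metrics import f1_score

def russell_quadrant(valence: float, arousal: float) -> str:
    """Common convention: Q1 = V+ A+, Q2 = V- A+, Q3 = V- A-, Q4 = V+ A-."""
    if valence >= 0:
        return "Q1" if arousal >= 0 else "Q4"
    return "Q2" if arousal >= 0 else "Q3"

# Hypothetical lyric annotations: (valence, arousal) values in [-1, 1]
ground_truth = [(0.7, 0.5), (-0.4, 0.6), (-0.6, -0.3), (0.5, -0.2)]
y_true = [russell_quadrant(v, a) for v, a in ground_truth]

# Hypothetical classifier predictions for the same lyrics
y_pred = ["Q1", "Q2", "Q3", "Q1"]

# Macro-averaged F-measure over the four quadrant classes
print(f1_score(y_true, y_pred, average="macro"))
```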
Appears in Collections: I&D CISUC - Artigos em Revistas Internacionais
Files in This Item:
File | Description | Size | Format
---|---|---|---
Malheiro et al. - 2018 - Emotionally-Relevant Features for Classification and Regression of Music Lyrics.pdf | | 629.22 kB | Adobe PDF
Scopus citations: 44 (checked on Oct 14, 2024)
Web of Science citations: 28 (checked on Nov 2, 2024)
Page views: 239 (checked on Oct 29, 2024)
Downloads: 356 (checked on Oct 29, 2024)
This item is licensed under a Creative Commons License