Published in

SERSC, International Journal of Multimedia and Ubiquitous Engineering, 6(10), pp. 99-112, 2015

DOI: 10.14257/ijmue.2015.10.6.10

Automatic Instrumental Raaga – A Minute Observation to Find Out Discrete System for Carnatic Music

This paper is made freely available by the publisher.


Abstract

The objective of this paper is to evolve a system that automatically mines the raaga of an Indian classical music piece. In the first step, note transcription is applied to a given audio file in order to generate the sequence of notes used to play the song. In the next step, the features related to the Arohana – Avarohana are extracted. The features of two or three songs per raaga are then selected at random and given as input to the training system. In total, songs of 72 melakartha raagas and 45 janya raagas are considered. Subsequently, testing is done by extracting the features of one or two songs of each raaga that were given as inputs in the training part. The generated output indicates the identification of each raaga. A unique label has been assigned to each raaga so that the system can identify the set of trained raagas. In this work, 7 instruments, namely Veena, Saxophone, Violin, Nadaswaram, Mandolin, Flute and Piano, are used. The generated database is trained and tested using (1) Gaussian Mixture Model, (2) Hidden Markov Model, and (3) k-Nearest Neighbor with cosine distance and Earth Mover's Distance, in order to draw appropriate conclusions.
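The third classifier mentioned in the abstract, k-Nearest Neighbor with cosine distance, can be illustrated with a minimal sketch. The note-histogram features, swara names, and raaga labels below are illustrative assumptions, not the paper's actual Arohana–Avarohana feature set; the point is only to show how transcribed note sequences could be compared by cosine distance and assigned the label of the nearest trained raaga.

```python
from collections import Counter
import math

# Illustrative 12-swara vocabulary (assumed, not taken from the paper).
SWARAS = ["S", "R1", "R2", "G2", "G3", "M1", "M2", "P", "D1", "D2", "N2", "N3"]

def note_histogram(notes):
    """Normalized frequency of each swara in a transcribed note sequence.
    A crude stand-in for the Arohana-Avarohana features described in the paper."""
    counts = Counter(notes)
    total = len(notes)
    return [counts.get(s, 0) / total for s in SWARAS]

def cosine_distance(a, b):
    """1 - cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

def knn_classify(query, training, k=3):
    """training: list of (feature_vector, raaga_label) pairs.
    Returns the majority label among the k nearest neighbors."""
    neighbors = sorted(training, key=lambda t: cosine_distance(query, t[0]))[:k]
    labels = [label for _, label in neighbors]
    return Counter(labels).most_common(1)[0][0]

# Toy example: two raagas represented by short hypothetical note sequences.
train = [
    (note_histogram(["S", "R2", "G3", "M1", "P", "D2", "N3", "S"]), "Shankarabharanam"),
    (note_histogram(["S", "R1", "G3", "M1", "P", "D1", "N3", "S"]), "Mayamalavagowla"),
]
query = note_histogram(["S", "R2", "G3", "P", "D2", "N3"])
print(knn_classify(query, train, k=1))  # nearest neighbor by cosine distance
```

In the paper's setup, the training set would instead hold labeled feature vectors for all 117 raagas (72 melakartha and 45 janya), and Earth Mover's Distance could be substituted for `cosine_distance` as the neighbor metric.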