Publication: A system for tuning instruments using recorded music instead of theory-based frequency presets
| dc.contributor.author | Bozkurt, Barış | |
| dc.contributor.institution | Bozkurt, Barış, Department of Electrical and Electronic Engineering, Bahçeşehir Üniversitesi, Istanbul, Turkey | |
| dc.date.accessioned | 2025-10-05T16:43:32Z | |
| dc.date.issued | 2012 | |
| dc.description.abstract | Musical instrument tuners are devices that help musicians to adjust their instruments such that the played notes have the desired fundamental frequencies. In a conventional tuner, the reference tuning frequencies are preset, where the presets are obtained from tuning (musical scale) theory, such as twelve-tone equal temperament, or are user-specified temperaments. For many kinds of music in oral traditions, especially non-Western music, widely accepted theoretical presets for tuning frequencies are not available because of the use of non-standard tunings. For such contexts, the reference is a master musician or a recording of a master musician. In this article, a tuning method and technology are presented that help the musician to tune the instrument according to a given (user-provided) recording. The method makes use of simultaneous audio and visual feedback during the tuning process, in which novel approaches are used for both modalities. For audio feedback, loopable stable frames, obtained automatically from the recording, are looped and played continuously. For visual feedback, a superimposed plot of the auto-difference functions is displayed instead of the conventional tuner's approach of detecting frequencies and displaying the amount of frequency difference between the input and the reference. © 2012 Massachusetts Institute of Technology. | |
| dc.identifier.doi | 10.1162/COMJ_a_00128 | |
| dc.identifier.endpage | 56 | |
| dc.identifier.issn | 0148-9267 | |
| dc.identifier.issn | 1531-5169 | |
| dc.identifier.issue | 3 | |
| dc.identifier.scopus | 2-s2.0-84865522165 | |
| dc.identifier.startpage | 43 | |
| dc.identifier.uri | https://doi.org/10.1162/COMJ_a_00128 | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14719/13439 | |
| dc.identifier.volume | 36 | |
| dc.language.iso | en | |
| dc.publisher | MIT Press Journals | |
| dc.relation.source | Computer Music Journal | |
| dc.subject.authorkeywords | Tuners | |
| dc.subject.authorkeywords | Visual Communication | |
| dc.subject.authorkeywords | Audio Feedbacks | |
| dc.subject.authorkeywords | Difference Functions | |
| dc.subject.authorkeywords | Frequency Differences | |
| dc.subject.authorkeywords | Fundamental Frequencies | |
| dc.subject.authorkeywords | Musical Scale | |
| dc.subject.authorkeywords | Oral Tradition | |
| dc.subject.authorkeywords | Tuning Frequency | |
| dc.subject.authorkeywords | Visual Feedback | |
| dc.subject.authorkeywords | Audio Recordings | |
| dc.subject.indexkeywords | Tuners | |
| dc.subject.indexkeywords | Visual communication | |
| dc.subject.indexkeywords | Audio feedbacks | |
| dc.subject.indexkeywords | Difference functions | |
| dc.subject.indexkeywords | Frequency differences | |
| dc.subject.indexkeywords | Fundamental frequencies | |
| dc.subject.indexkeywords | Musical scale | |
| dc.subject.indexkeywords | Oral tradition | |
| dc.subject.indexkeywords | Tuning frequency | |
| dc.subject.indexkeywords | Visual feedback | |
| dc.subject.indexkeywords | Audio recordings | |
| dc.title | A system for tuning instruments using recorded music instead of theory-based frequency presets | |
| dc.type | Article | |
| dcterms.references | Türk Musikisi Nazariyatı Dersleri, (1968), Kültür Bakanlığı Yayınları, (1993), Bello, Juan Pablo, A tutorial on onset detection in music signals, IEEE Transactions on Speech and Audio Processing, 13, 5, pp. 1035-1046, (2005), Bozkurt, Barış, An automatic pitch analysis method for Turkish maqam music, Journal of New Music Research, 37, 1, pp. 1-13, (2008), Bozkurt, Barış, Weighing diverse theoretical models on Turkish maqam music against pitch measurements: A comparison of peaks automatically derived from frequency histograms with proposed scale tones, Journal of New Music Research, 38, 1, pp. 45-70, (2009), SWIPE: A Sawtooth Waveform Inspired Pitch Estimator for Speech and Music, (2007), Tanburi Cemil Bey, (1994), de Cheveigné, Alain, YIN, a fundamental frequency estimator for speech and music, Journal of the Acoustical Society of America, 111, 4, pp. 1917-1930, (2002), Journal of the International Folk Music Council, (1964), Pitch Determination of Speech Signals, (1983) | |
| dspace.entity.type | Publication | |
| local.indexed.at | Scopus | |
| person.identifier.scopus-author-id | 23476648700 |
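
The abstract above describes visual feedback built on superimposed auto-difference functions rather than explicit frequency detection. The article does not spell out its exact formulation here, but the record cites the YIN estimator (de Cheveigné 2002), whose difference function d(τ) = Σ_t (x[t] − x[t+τ])² is the standard form. A minimal sketch under that assumption, with illustrative signal parameters (a 200 Hz sine at 8 kHz sampling, so the period is exactly 40 samples):

```python
import numpy as np

def difference_function(x, max_lag):
    """Auto-difference function d(tau) = sum_t (x[t] - x[t+tau])^2,
    in the style of the YIN pitch estimator. A lag where d is near
    zero marks a period of the signal; a tuner can plot such curves
    for the reference and the live input on top of each other."""
    n = len(x)
    d = np.zeros(max_lag)
    for tau in range(1, max_lag):
        diff = x[: n - tau] - x[tau:]
        d[tau] = np.dot(diff, diff)
    return d

# Toy signal: 200 Hz sine sampled at 8 kHz -> period = 8000 / 200 = 40 samples.
sr, f0 = 8000, 200.0
t = np.arange(0, 0.1, 1.0 / sr)
x = np.sin(2.0 * np.pi * f0 * t)

d = difference_function(x, 200)
# Search a plausible lag window (20..60 samples) to avoid the trivial
# dip at tau = 0 and the multiples of the period at larger lags.
best_lag = int(np.argmin(d[20:60])) + 20
print(best_lag)  # -> 40, the period in samples
```

When the instrument being tuned matches the reference, the minima of the two superimposed curves line up; a mistuning shifts the live curve's dips left or right, which is visible without ever computing a frequency value explicitly.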
