Using articulatory feature detectors in progressive networks for multilingual low-resource phone recognition

Mahir Morshed, Mark Hasegawa-Johnson

Research output: Contribution to journal › Article › peer-review

Abstract

Systems inspired by progressive neural networks are described, transferring information from end-to-end articulatory feature detectors to similarly structured phone recognizers. These networks, which connect the corresponding recurrent layers of pre-trained feature detector stacks and newly introduced phone recognizer stacks, were trained on data from four Asian languages, with experiments testing the system on those languages and on four African languages. Later adjustments to these networks include the use of contrastive predictive coding layers at the inputs to the networks' recurrent portions. Such adjustments allow performance differences to be attributed to the presence or absence of individual feature detectors (for consonant place/manner and vowel height/backness). Some of these differences emerge from feature-level comparisons of recognizer outputs, and others from variations and ablations of the architecture and training setup. These differences encourage further exploration of methods to reduce errors on phones with specific articulatory features, as well as further architectural modifications.
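The core architectural idea in the abstract, lateral connections from a frozen, pre-trained column (the articulatory feature detector stack) into the corresponding layers of a newly introduced column (the phone recognizer stack), can be illustrated with a minimal NumPy sketch. All dimensions, weight names, and the simple tanh recurrence below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: T frames, d_in input features, d_h hidden units per column
T, d_in, d_h = 5, 8, 6

# Frozen column: stands in for a pre-trained articulatory feature detector layer
W_af = rng.standard_normal((d_in, d_h))
U_af = rng.standard_normal((d_h, d_h))

# New column: phone recognizer layer, plus lateral weights L from the frozen column
W_ph = rng.standard_normal((d_in, d_h))
U_ph = rng.standard_normal((d_h, d_h))
L = rng.standard_normal((d_h, d_h))

x = rng.standard_normal((T, d_in))  # a short sequence of acoustic feature frames
h_af = np.zeros(d_h)
h_ph = np.zeros(d_h)
for t in range(T):
    # Frozen detector column: weights W_af, U_af are never updated
    h_af = np.tanh(x[t] @ W_af + h_af @ U_af)
    # Lateral connection: the phone column receives the detector's hidden state
    h_ph = np.tanh(x[t] @ W_ph + h_ph @ U_ph + h_af @ L)

print(h_ph.shape)  # → (6,)
```

In training, only the phone column's weights (here `W_ph`, `U_ph`, `L`) would be updated, so the new recognizer can exploit the detector's representations without overwriting them, which is the defining property of a progressive network.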

Original language: English (US)
Pages (from-to): 3411-3421
Number of pages: 11
Journal: Journal of the Acoustical Society of America
Volume: 156
Issue number: 5
DOIs
State: Published - Nov 2024

Keywords

  • Phonetics
  • Speech recognition
  • Consonants
  • Vowel systems
  • Convolutional neural network
  • Artificial neural networks
  • Machine learning
  • Graphics processing units
  • Group theory
  • Organs

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Acoustics and Ultrasonics

