NEURAL SPEECH SYNTHESIS ON A SHOESTRING: IMPROVING THE EFFICIENCY OF LPCNET

Jean-Marc Valin, Umut Isik, Paris Smaragdis, Arvindh Krishnaswamy

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Neural speech synthesis models can synthesize high-quality speech, but typically require high computational complexity to do so. In previous work, we introduced LPCNet, which uses linear prediction to significantly reduce the complexity of neural synthesis. In this work, we further improve the efficiency of LPCNet, targeting both algorithmic and computational improvements, to make it usable on a wide variety of devices. We demonstrate an improvement in synthesis quality while operating 2.5x faster. The resulting open-source LPCNet algorithm can perform real-time neural synthesis on most existing phones and is even usable in some embedded devices.
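The linear-prediction idea behind LPCNet can be illustrated with a toy sketch: a short-term linear predictor (LPC coefficients solved via Levinson-Durbin) removes most of the waveform's energy, leaving only a small residual for the neural network to model. This is an illustrative assumption-laden example, not the paper's implementation; the signal, the predictor order, and the helper names are chosen purely for demonstration.

```python
import math

def autocorrelation(x, order):
    """Biased autocorrelation r[0..order] of signal x."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) for k in range(order + 1)]

def levinson_durbin(r, order):
    """Solve the Toeplitz normal equations for LPC coefficients.

    Returns a[1..order] such that x[n] is predicted as
    sum(a[j] * x[n - j] for j in 1..order).
    """
    a = [0.0] * (order + 1)
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
        k = acc / err
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a = new_a
        err *= (1.0 - k * k)
    return a[1:]

# Toy signal: a pure tone is almost perfectly predictable by a 2nd-order LPC.
x = [math.sin(0.1 * n) for n in range(400)]
order = 2
lpc = levinson_durbin(autocorrelation(x, order), order)

# Prediction residual: the part a neural vocoder in the LPCNet family
# would model instead of the raw waveform.
residual = [
    x[n] - sum(lpc[j] * x[n - 1 - j] for j in range(order))
    for n in range(order, len(x))
]
signal_energy = sum(s * s for s in x[order:])
residual_energy = sum(e * e for e in residual)
print(residual_energy / signal_energy)  # the predictor removes most of the energy
```

Because the network only has to capture the low-energy residual rather than the full waveform, it can be made much smaller, which is the source of the complexity savings the abstract describes.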

Original language: English (US)
Title of host publication: 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 8437-8441
Number of pages: 5
ISBN (Electronic): 9781665405409
DOIs
State: Published - 2022
Externally published: Yes
Event: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Virtual, Online, Singapore
Duration: May 23, 2022 - May 27, 2022

Publication series

Name: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2022-May
ISSN (Print): 1520-6149

Conference

Conference: 47th IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022
Country/Territory: Singapore
City: Virtual, Online
Period: 5/23/22 - 5/27/22

Keywords

  • LPCNet
  • WaveRNN
  • neural vocoder

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering
