arXiv:2309.15223

Low-rank Adaptation of Large Language Model Rescoring for Parameter-Efficient Speech Recognition

26 September 2023
Yu Yu, Chao-Han Huck Yang, J. Kolehmainen, Prashanth Gurunath Shivakumar, Yile Gu, Sungho Ryu, Roger Ren, Qi Luo, Aditya Gourav, I-Fan Chen, Yi-Chieh Liu, Tuan Dinh, Ankur Gandhe, Denis Filimonov, Shalini Ghosh, A. Stolcke, Ariya Rastrow, I. Bulyko
Abstract

We propose a neural language modeling system based on low-rank adaptation (LoRA) for speech recognition output rescoring. Although pretrained language models (LMs) like BERT have shown superior performance in second-pass rescoring, the high computational cost of scaling up the pretraining stage and adapting the pretrained models to specific domains limits their practical use in rescoring. Here we present a method based on low-rank decomposition to train a rescoring BERT model and adapt it to new domains using only a fraction (0.08%) of the pretrained parameters. The inserted low-rank matrices are optimized through a discriminative training objective along with a correlation-based regularization loss. The proposed low-rank adaptation Rescore-BERT (LoRB) architecture is evaluated on LibriSpeech and internal datasets, reducing training times by factors of 3.6 to 5.4.
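As a rough illustration of the low-rank adaptation idea described in the abstract, the sketch below wraps a frozen linear projection with a trainable low-rank update in PyTorch. The module name, rank, and scaling factor are illustrative assumptions, not the authors' implementation, and the discriminative training objective and correlation-based regularization loss from the paper are not shown.

```python
# Minimal LoRA sketch: freeze a pretrained linear layer W and learn a
# low-rank update B @ A on top of it. Names and hyperparameters are
# assumptions for illustration only.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    """Frozen base projection plus a trainable low-rank correction."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        # Freeze the pretrained weights; only the low-rank factors train.
        for p in self.base.parameters():
            p.requires_grad = False
        self.lora_A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = W x + b + scaling * B (A x)
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


if __name__ == "__main__":
    layer = LoRALinear(nn.Linear(768, 768), rank=8)
    out = layer(torch.randn(4, 768))
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable fraction: {trainable / total:.4%}")
```

With a small rank relative to the hidden size, the trainable parameters amount to a small fraction of the frozen weights, which is the effect the abstract's 0.08% figure refers to (the exact fraction depends on which layers are adapted and the rank chosen).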
