P³LM: Probabilistically Permuted Prophet Language Modeling for Generative Pre-Training

22 October 2022
Junwei Bao, Yifan Wang, Jiangyong Ying, Yeyun Gong, Jing Zhao, Youzheng Wu, Xiaodong He
Abstract

Conventional autoregressive left-to-right (L2R) sequence generation faces two issues during decoding: it is limited to unidirectional modeling of the target sequence, and it is constrained by strong local dependencies. To address these problems, we propose P³LM, a probabilistically permuted prophet language model, which strengthens the modeling of bidirectional information and long-range token dependencies for sequence generation. Specifically, P³LM learns to generate tokens in permuted order with an order-aware transformer decoder, and to generate the corresponding future N tokens with a multi-stream attention mechanism. Extensive experiments are conducted on the GLGE benchmark, which includes four datasets for summarization, two for question generation, one for conversational question answering, and one for dialog response generation; P³LM achieves state-of-the-art results compared with strong publicly available generative pre-training methods.
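The abstract combines two ingredients: a permuted generation order and a prophet-style objective that, at each step, also predicts the next N tokens. The sketch below is a minimal, hypothetical PyTorch illustration of how such training targets and the multi-stream loss could be assembled; it is not the authors' released code, and the names permuted_prophet_targets, prophet_loss, and IGNORE are illustrative. A full implementation would additionally need the order-aware decoder and multi-stream attention that produce one logit stream per prediction offset.

```python
import torch
import torch.nn.functional as F

IGNORE = -100  # ignore_index used by F.cross_entropy for padded positions

def permuted_prophet_targets(target_ids: torch.Tensor, n_future: int):
    """Sample a generation order and build per-stream token targets:
    stream 0 (main) predicts the current token in permuted order;
    stream k predicts the token k steps ahead in that order.
    Assumes n_future < len(target_ids)."""
    T = target_ids.size(0)
    perm = torch.randperm(T)        # sampled permutation of target positions
    ordered = target_ids[perm]      # target tokens in permuted order
    streams = []
    for k in range(n_future + 1):
        shifted = torch.full((T,), IGNORE, dtype=torch.long)
        shifted[: T - k] = ordered[k:]  # token k steps ahead; tail is ignored
        streams.append(shifted)
    return perm, torch.stack(streams)   # perm: (T,), targets: (n_future+1, T)

def prophet_loss(stream_logits: torch.Tensor, stream_targets: torch.Tensor):
    """Cross-entropy averaged over the main stream and all future streams.
    stream_logits: (n_streams, T, vocab); stream_targets: (n_streams, T)."""
    n_streams, T, V = stream_logits.shape
    return F.cross_entropy(
        stream_logits.reshape(-1, V),
        stream_targets.reshape(-1),
        ignore_index=IGNORE,
    )

# Usage sketch: a real model would condition each stream on the sampled
# permutation via order-aware positional encodings and multi-stream
# attention; random logits stand in here just to exercise the loss.
vocab, T, n_future = 100, 8, 2
target_ids = torch.randint(0, vocab, (T,))
perm, targets = permuted_prophet_targets(target_ids, n_future)
logits = torch.randn(n_future + 1, T, vocab)
loss = prophet_loss(logits, targets)
```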

View on arXiv: https://arxiv.org/abs/2210.12339