Compression via Pre-trained Transformers: A Study on Byte-Level Multimodal Data

7 October 2024
David Heurtel-Depeiges
Anian Ruoss
Joel Veness
Tim Genewein
Papers citing "Compression via Pre-trained Transformers: A Study on Byte-Level Multimodal Data"

1 / 1 papers shown
FineZip: Pushing the Limits of Large Language Models for Practical Lossless Text Compression
Fazal Mittu
Yihuan Bu
Akshat Gupta
Ashok Devireddy
Alp Eren Ozdarendeli
Anant Singh
Gopala Anumanchipalli
25 Sep 2024