arXiv:2410.22480
Scaling LLM Inference with Optimized Sample Compute Allocation
29 October 2024
Kexun Zhang
Shang Zhou
Danqing Wang
William Yang Wang
Lei Li
Papers citing "Scaling LLM Inference with Optimized Sample Compute Allocation"

Optimizing Temperature for Language Models with Multi-Sample Inference
Weihua Du
Yiming Yang
Sean Welleck
07 Feb 2025