ResearchTrend.AI


Apriel-Nemotron-15B-Thinker

13 August 2025
Shruthan Radhakrishna
S. Parikh
Gopal Sarda
Anil Turkkan
Quaizar Vohra
Raymond Li
Dhruv Jhamb
Kelechi Ogueji
Aanjaneya Shukla
Oluwanifemi Bamgbose
Toby Liang
Luke Kumar
Oleksiy Ostapenko
Shiva Krishna Reddy Malay
Aman Tiwari
Tara Bogavelli
Vikas Yadav
Jash Mehta
Saloni Mittal
Akshay Kalkunte
Pulkit Pattnaik
Khalil Slimi
Anirudh Sreeram
Jishnu Sethumadhavan Nair
Akintunde Oladipo
Shashank Maiya
Khyati Mahajan
Rishabh Maheshwary
Masoud Hashemi
Sai Rajeswar Mudumba
Sathwik Tejaswi Madhusudhan
Torsten Scholak
Sébastien Paquet
Sagar Davasam
Srinivas Sunkara
Topics: LLM, AG, MoE, LRM
Links: arXiv (abs) · PDF · HTML · GitHub (524★)
Main: 10 pages, 4 figures
Bibliography: 2 pages, 4 tables
Appendix: 1 page
Abstract

While large language models (LLMs) have achieved remarkable reasoning capabilities across domains such as code, math, and other enterprise tasks, their significant memory and computational costs often preclude their use in practical enterprise settings. To this end, we introduce Apriel-Nemotron-15B-Thinker, a 15-billion-parameter model in the ServiceNow Apriel SLM series that matches the performance of medium-sized state-of-the-art models such as o1-mini, QwQ-32B, and EXAONE-Deep-32B while maintaining only half their memory footprint. The Apriel-Nemotron-15B-Thinker model is trained in a four-stage pipeline: 1) base model upscaling, 2) continual pre-training, 3) supervised fine-tuning (SFT), and 4) reinforcement learning using GRPO. Comprehensive evaluations across a diverse suite of benchmarks consistently demonstrate that Apriel-Nemotron-15B-Thinker matches or exceeds the performance of its 32-billion-parameter counterparts, despite being less than half their size.
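The final stage of the pipeline uses GRPO (Group Relative Policy Optimization). The paper does not show code here, but the defining idea of GRPO can be sketched: instead of a learned value critic, each sampled completion's reward is normalized against the statistics of its own group of completions for the same prompt. A minimal illustrative sketch (the function name and reward values are hypothetical, not from the paper):

```python
def grpo_advantages(rewards, eps=1e-8):
    """Group-relative advantages for one prompt's G sampled completions.

    rewards: list of scalar rewards (e.g. from a verifier or reward model).
    Returns one advantage per completion: (r - group mean) / group std.
    """
    mean = sum(rewards) / len(rewards)
    var = sum((r - mean) ** 2 for r in rewards) / len(rewards)
    std = var ** 0.5
    # eps guards against a zero std when all rewards in the group are equal
    return [(r - mean) / (std + eps) for r in rewards]

# Example: four sampled answers, two judged correct (reward 1) and two not
advs = grpo_advantages([1.0, 0.0, 1.0, 0.0])
```

Correct completions get positive advantages and incorrect ones negative, so the policy gradient reinforces the better half of each group without needing a separate value network.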
