ResearchTrend.AI
arXiv:2409.11600
No Saved Kaleidoscope: A 100% Jitted Neural Network Coding Language with Pythonic Syntax

17 September 2024
Augusto Seben da Rosa
Marlon Daniel Angeli
Jorge Aikes Junior
Alef Iury Ferreira
L. Gris
Anderson da Silva Soares
Arnaldo Candido Junior
Frederico Santos de Oliveira
Gabriel Trevisan Damke
Rafael Teixeira Sousa
Abstract

We developed a JIT-compiled language for training artificial neural networks, implemented with C++, LLVM, and CUDA. It features object-oriented characteristics, strong typing, parallel workers for data pre-processing, Pythonic syntax for expressions, PyTorch-like model declaration, and automatic differentiation. We implement caching and pooling mechanisms to manage VRAM, and use cuBLAS for high-performance matrix multiplication and cuDNN for convolutional layers. In our experiments with residual convolutional neural networks on ImageNet, we reach similar speed but degraded accuracy. The GRU network experiments show similar accuracy, but our compiler has degraded speed on that task. However, our compiler demonstrates promising results on the CIFAR-10 benchmark, where we reach the same accuracy and about the same speed as PyTorch. We make the code publicly available at: https://github.com/NoSavedDATA/NoSavedKaleidoscope
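To make the "automatic differentiation" claim concrete for readers unfamiliar with it: the core idea is to record, for every value produced during the forward pass, which inputs it came from and the local derivative with respect to each, then replay that graph backwards to accumulate gradients. The sketch below is a minimal, hypothetical scalar illustration of reverse-mode autodiff in Python; it is not taken from the paper's C++/LLVM implementation, whose internals are not described in this abstract.

```python
class Var:
    """A scalar value that records how it was computed, for backpropagation."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        # Each parent is a (Var, local_gradient) pair recorded at creation time.
        self._parents = parents

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        """Accumulate d(output)/d(self) into .grad, then propagate to parents."""
        self.grad += seed
        for parent, local_grad in self._parents:
            parent.backward(seed * local_grad)


# z = x*y + x, so dz/dx = y + 1 and dz/dy = x.
x, y = Var(3.0), Var(4.0)
z = x * y + x
z.backward()
# x.grad -> 5.0, y.grad -> 3.0
```

A production autodiff system operates on tensors rather than scalars and sorts the graph topologically instead of recursing, but the record-then-replay structure is the same idea the abstract refers to.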
