ResearchTrend.AI

arXiv:2301.10540
Modelling Long Range Dependencies in ND: From Task-Specific to a General Purpose CNN

25 January 2023
David M. Knigge
David W. Romero
Albert Gu
E. Gavves
Erik J. Bekkers
Jakub M. Tomczak
Mark Hoogendoorn
J. Sonke
    3DV
Abstract

Performant Convolutional Neural Network (CNN) architectures must be tailored to specific tasks in order to account for the length, resolution, and dimensionality of the input data. In this work, we tackle the need for problem-specific CNN architectures. We present the Continuous Convolutional Neural Network (CCNN): a single CNN able to process data of arbitrary resolution, dimensionality, and length without any structural changes. Its key components are its continuous convolutional kernels, which model long-range dependencies at every layer and thus remove the need of current CNN architectures for task-dependent downsampling and depths. We showcase the generality of our method by using the same architecture for tasks on sequential (1D), visual (2D), and point-cloud (3D) data. Our CCNN matches and often outperforms the current state-of-the-art across all tasks considered.
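The core idea behind continuous convolutional kernels can be illustrated with a minimal NumPy sketch: a small coordinate network maps relative positions to kernel values, so the same parameters can be sampled into a discrete kernel of any length or resolution. This is an illustrative toy, not the paper's exact parameterization; the network sizes, the sine nonlinearity, and all function names here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coordinate MLP: maps a relative position in [-1, 1]
# to a scalar kernel value. One hidden layer with a sine nonlinearity.
W1 = rng.normal(size=(1, 32))
b1 = rng.normal(size=32)
W2 = rng.normal(size=(32, 1))

def continuous_kernel(num_taps):
    """Sample the continuous kernel at `num_taps` equally spaced positions."""
    coords = np.linspace(-1.0, 1.0, num_taps)[:, None]  # (num_taps, 1)
    hidden = np.sin(coords @ W1 + b1)                    # (num_taps, 32)
    return (hidden @ W2).squeeze(-1)                     # (num_taps,)

def conv1d(signal, kernel):
    """'Same'-padded 1D convolution with the sampled kernel."""
    return np.convolve(signal, kernel, mode="same")

# The same learned parameters produce a valid kernel for any input length,
# so no architectural change is needed when resolution or length changes.
short_out = conv1d(rng.normal(size=64), continuous_kernel(64))
long_out = conv1d(rng.normal(size=1024), continuous_kernel(1024))
```

Because the kernel spans the whole normalized coordinate range [-1, 1] regardless of how many taps are sampled, it is global: every layer can model dependencies across the entire input, which is what removes the need for task-specific downsampling schedules.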
