ResearchTrend.AI
Transformers are Deep Infinite-Dimensional Non-Mercer Binary Kernel Machines

arXiv:2106.01506 · 2 June 2021
Matthew A. Wright, Joseph E. Gonzalez

Papers citing "Transformers are Deep Infinite-Dimensional Non-Mercer Binary Kernel Machines"

7 papers shown

Learning in Feature Spaces via Coupled Covariances: Asymmetric Kernel SVD and Nyström method
Qinghua Tao, F. Tonin, Alex Lambert, Yingyi Chen, Panagiotis Patrinos, Johan A. K. Suykens
13 Jun 2024

Learning Functional Transduction
Mathieu Chalvidal, Thomas Serre, Rufin VanRullen
01 Feb 2023

Random Fourier Features for Asymmetric Kernels
Ming-qian He, Fan He, Fanghui Liu, Xiaolin Huang
18 Sep 2022

Choose a Transformer: Fourier or Galerkin
Shuhao Cao
31 May 2021

Dropout: Explicit Forms and Capacity Control
R. Arora, Peter L. Bartlett, Poorya Mianjy, Nathan Srebro
06 Mar 2020

GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
Alex Jinpeng Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman
20 Apr 2018

Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu H. Pham, Christopher D. Manning
17 Aug 2015