ResearchTrend.AI

FlowerFormer: Empowering Neural Architecture Encoding using a Flow-aware Graph Transformer
v2 (latest)
19 March 2024
Dongyeong Hwang
Hyunju Kim
Sunwoo Kim
Kijung Shin
    AI4CE
ArXiv (abs) · PDF · HTML · GitHub (14★)

Papers citing "FlowerFormer: Empowering Neural Architecture Encoding using a Flow-aware Graph Transformer"

5 / 5 papers shown
Learning to Flow from Generative Pretext Tasks for Neural Architecture Encoding
Sunwoo Kim
Hyunjin Hwang
Kijung Shin
AI4CE
21 Oct 2025
Loss Functions for Predictor-based Neural Architecture Search
Han Ji
Yuqi Feng
Jiahao Fan
Yanan Sun
06 Jun 2025
CARL: Causality-guided Architecture Representation Learning for an Interpretable Performance Predictor
Han Ji
Yuqi Feng
Jiahao Fan
Yanan Sun
OOD, CML
04 Jun 2025
Multi-View Encoders for Performance Prediction in LLM-Based Agentic Workflows
Patara Trirat
Wonyong Jeong
Sung Ju Hwang
26 May 2025
Simple Path Structural Encoding for Graph Transformers
Louis Airale
Antonio Longa
Mattia Rigon
Baptiste Caramiaux
Roberto Passerone
13 Feb 2025