ResearchTrend.AI

DeepPINK: reproducible feature selection in deep neural networks
Yang Young Lu, Yingying Fan, Jinchi Lv, William Stafford Noble (4 September 2018) [FAtt]

Papers citing "DeepPINK: reproducible feature selection in deep neural networks"

14 / 14 papers shown
  • Error-controlled non-additive interaction discovery in machine learning models. Winston Chen, Yifan Jiang, William Stafford Noble, Yang Young Lu (17 Feb 2025)
  • Feature Selection as Deep Sequential Generative Learning. Wangyang Ying, Dongjie Wang, Haifeng Chen, Yanjie Fu (06 Mar 2024)
  • Enabling tabular deep learning when d ≫ n with an auxiliary knowledge graph. Camilo Ruiz, Hongyu Ren, Kexin Huang, J. Leskovec (07 Jun 2023)
  • Graph Convolutional Network-based Feature Selection for High-dimensional and Low-sample Size Data. Can Chen, Scott T. Weiss, Yang-Yu Liu (25 Nov 2022)
  • Model-X Sequential Testing for Conditional Independence via Testing by Betting. Shalev Shaer, Gal Maman, Yaniv Romano (01 Oct 2022)
  • Sequential Attention for Feature Selection. T. Yasuda, M. Bateni, Lin Chen, Matthew Fahrbach, Gang Fu, Vahab Mirrokni (29 Sep 2022)
  • Learning to Increase the Power of Conditional Randomization Tests. Shalev Shaer, Yaniv Romano (03 Jul 2022) [CML]
  • Error-based Knockoffs Inference for Controlled Feature Selection. Xuebin Zhao, H. Chen, Yingjie Wang, Weifu Li, Tieliang Gong, Yulong Wang, Feng Zheng (09 Mar 2022)
  • Variable Selection with the Knockoffs: Composite Null Hypotheses. Mehrdad Pournaderi, Yu Xiang (06 Mar 2022)
  • High-Dimensional Knockoffs Inference for Time Series Data. Chien-Ming Chi, Yingying Fan, C. Ing, Jinchi Lv (18 Dec 2021) [AI4TS]
  • Deep neural networks with controlled variable selection for the identification of putative causal genetic variants. P. H. Kassani, Fred Lu, Yann Le Guen, Zihuai He (29 Sep 2021)
  • More Powerful Selective Kernel Tests for Feature Selection. Jen Ning Lim, M. Yamada, Wittawat Jitkrittum, Y. Terada, S. Matsui, Hidetoshi Shimodaira (14 Oct 2019)
  • Evaluating Explanation Without Ground Truth in Interpretable Machine Learning. Fan Yang, Mengnan Du, Xia Hu (16 Jul 2019) [XAI] [ELM]
  • Simple stopping criteria for information theoretic feature selection. Shujian Yu, José C. Príncipe (29 Nov 2018)