Normalizing Flow based Feature Synthesis for Outlier-Aware Object Detection
arXiv:2302.07106 · 1 February 2023
Nishant Kumar, Sinisa Segvic, Abouzar Eslami, Stefan Gumhold

Papers citing "Normalizing Flow based Feature Synthesis for Outlier-Aware Object Detection"

7 / 7 papers shown
Dream-Box: Object-wise Outlier Generation for Out-of-Distribution Detection
Brian K. S. Isaac-Medina, T. Breckon · OODD · 65 · 0 · 0 · 25 Apr 2025

Captured by Captions: On Memorization and its Mitigation in CLIP Models
Wenhao Wang, Adam Dziedzic, Grace C. Kim, Michael Backes, Franziska Boenisch · 79 · 0 · 0 · 11 Feb 2025

Towards Open-World Object-based Anomaly Detection via Self-Supervised Outlier Synthesis
Brian K. S. Isaac-Medina, Yona Falinie A. Gaus, Neelanjan Bhowmik, T. Breckon · 21 · 2 · 0 · 22 Jul 2024

Towards Computational Performance Engineering for Unsupervised Concept Drift Detection -- Complexities, Benchmarking, Performance Analysis
Elias Werner, Nishant Kumar, Matthias Lieber, Sunna Torge, Stefan Gumhold, W. Nagel · 13 · 4 · 0 · 17 Apr 2023

VOS: Learning What You Don't Know by Virtual Outlier Synthesis
Xuefeng Du, Zhaoning Wang, Mu Cai, Yixuan Li · OODD · 174 · 220 · 0 · 02 Feb 2022

On the Importance of Gradients for Detecting Distributional Shifts in the Wild
Rui Huang, Andrew Geng, Yixuan Li · 171 · 324 · 0 · 01 Oct 2021

Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning
Y. Gal, Zoubin Ghahramani · UQCV, BDL · 247 · 9,042 · 0 · 06 Jun 2015