One Map Does Not Fit All: Evaluating Saliency Map Explanation on Multi-Modal Medical Images

11 July 2021 · arXiv:2107.05047
Weina Jin, Xiaoxiao Li, Ghassan Hamarneh
FAtt

Papers citing "One Map Does Not Fit All: Evaluating Saliency Map Explanation on Multi-Modal Medical Images"

3 / 3 papers shown
Influence based explainability of brain tumors segmentation in multimodal Magnetic Resonance Imaging
Tommaso Torda, Andrea Ciardiello, Simona Gargiulo, Greta Grillo, Simone Scardapane, Cecilia Voena, S. Giagu
05 Apr 2024

Transparency of Deep Neural Networks for Medical Image Analysis: A Review of Interpretability Methods
Zohaib Salahuddin, Henry C. Woodruff, A. Chatterjee, Philippe Lambin
01 Nov 2021

Towards A Rigorous Science of Interpretable Machine Learning
Finale Doshi-Velez, Been Kim
XAI, FaML
28 Feb 2017