ResearchTrend.AI
SPIDER: A Comprehensive Multi-Organ Supervised Pathology Dataset and Baseline Models

4 March 2025
Dmitry Nechaev
Alexey Pchelnikov
Ekaterina Ivanova
Abstract

Advancing AI in computational pathology requires large, high-quality, and diverse datasets, yet existing public datasets are often limited in organ diversity, class coverage, or annotation quality. To bridge this gap, we introduce SPIDER (Supervised Pathology Image-DEscription Repository), the largest publicly available patch-level dataset covering multiple organ types, including Skin, Colorectal, Thorax, and Breast, with comprehensive class coverage for each organ. SPIDER provides high-quality annotations verified by expert pathologists and includes surrounding context patches, which enhance classification performance by providing spatial context.

Alongside the dataset, we present baseline models trained on SPIDER using the Hibou-L foundation model as a feature extractor combined with an attention-based classification head. The models achieve state-of-the-art performance across multiple tissue categories and serve as strong benchmarks for future digital pathology research. Beyond patch classification, the model enables rapid identification of significant areas and quantitative tissue metrics, and establishes a foundation for multimodal approaches.

Both the dataset and trained models are publicly available to advance research, reproducibility, and AI-driven pathology development. Access them at: this https URL
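The abstract describes the baseline architecture only at a high level: frozen Hibou-L embeddings for a central patch and its surrounding context patches, pooled by an attention-based classification head. As a rough illustration of that idea, the sketch below implements a generic attention pooling over patch embeddings in NumPy. All specifics here (embedding dimension, number of context patches, hidden size, random stand-in features) are hypothetical and not taken from the paper.

```python
import numpy as np

def attention_pool(features, w, v):
    """MIL-style attention pooling (a generic sketch, not the paper's exact head):
    score each patch embedding, softmax-normalize the scores, and return the
    attention-weighted sum of the embeddings."""
    scores = np.tanh(features @ w) @ v          # one scalar score per patch
    weights = np.exp(scores - scores.max())     # numerically stable softmax
    weights /= weights.sum()
    return weights @ features                   # pooled embedding, shape (dim,)

rng = np.random.default_rng(0)
dim, n_patches = 16, 9                          # hypothetical: center patch + 8 neighbors
features = rng.standard_normal((n_patches, dim))  # stand-in for Hibou-L patch embeddings
w = rng.standard_normal((dim, 8))               # hypothetical attention projection
v = rng.standard_normal(8)                      # hypothetical attention vector

pooled = attention_pool(features, w, v)
print(pooled.shape)                             # (16,)
```

In a real pipeline, the pooled vector would feed a linear classifier over the organ-specific class set, letting the context patches influence the label assigned to the central patch.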

@article{nechaev2025_2503.02876,
  title={SPIDER: A Comprehensive Multi-Organ Supervised Pathology Dataset and Baseline Models},
  author={Dmitry Nechaev and Alexey Pchelnikov and Ekaterina Ivanova},
  journal={arXiv preprint arXiv:2503.02876},
  year={2025}
}