Multi-label Relation Extraction Using Transformer Layers

24 February 2025
Ngoc Luyen Le
Gildas Tagny Ngompé
Abstract

In this article, we present the BTransformer18 model, a deep learning architecture designed for multi-label relation extraction in French texts. Our approach combines the contextual representation capabilities of pre-trained language models from the BERT family - such as BERT, RoBERTa, and their French counterparts CamemBERT and FlauBERT - with the power of Transformer encoders to capture long-term dependencies between tokens. Experiments conducted on the dataset from the TextMine'25 challenge show that our model achieves superior performance, particularly when using CamemBERT-Large, with a macro F1 score of 0.654, surpassing the results obtained with FlauBERT-Large. These results demonstrate the effectiveness of our approach for the automatic extraction of complex relations in intelligence reports.
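The sketch below illustrates the kind of architecture the abstract describes: a pre-trained French BERT-family backbone (CamemBERT here), additional Transformer encoder layers applied to its token representations, and a multi-label classification head trained with binary cross-entropy. This is not the authors' released code; the class name, the number of extra encoder layers, the pooling strategy, and the label count (18, echoing the "18" in BTransformer18 but not confirmed by the abstract) are illustrative assumptions.

# Minimal sketch of a BTransformer18-style model (assumptions noted above).
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class MultiLabelRelationExtractor(nn.Module):
    def __init__(self, base_model="camembert-base", num_labels=18,
                 num_encoder_layers=2, num_heads=8):
        super().__init__()
        # Pre-trained contextual encoder (BERT / RoBERTa / CamemBERT / FlauBERT).
        self.backbone = AutoModel.from_pretrained(base_model)
        hidden = self.backbone.config.hidden_size
        # Extra Transformer encoder layers to model long-range dependencies
        # between tokens on top of the backbone's representations.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=num_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_encoder_layers)
        # One logit per relation label; sigmoid scores allow several labels at once.
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.backbone(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Padded positions are masked out of the extra encoder layers.
        encoded = self.encoder(hidden_states,
                               src_key_padding_mask=(attention_mask == 0))
        # Pool the first (<s>/[CLS]) token for a sentence-level prediction.
        return self.classifier(encoded[:, 0])

# Usage: multi-label training with binary cross-entropy over sigmoid logits.
tokenizer = AutoTokenizer.from_pretrained("camembert-base")
model = MultiLabelRelationExtractor()
batch = tokenizer(["Texte d'exemple décrivant une relation."],
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
targets = torch.zeros_like(logits)  # placeholder multi-hot label vector
loss = nn.BCEWithLogitsLoss()(logits, targets)

Using independent sigmoid logits with BCEWithLogitsLoss, rather than a softmax over classes, lets a single text receive several relation labels simultaneously, which is the multi-label setting the paper targets.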

@article{le2025_2502.15619,
  title={Extraction multi-étiquettes de relations en utilisant des couches de Transformer},
  author={Ngoc Luyen Le and Gildas Tagny Ngompé},
  journal={arXiv preprint arXiv:2502.15619},
  year={2025}
}