A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks

14 November 2018
Victor Sanh, Thomas Wolf, Sebastian Ruder

Papers citing "A Hierarchical Multi-task Approach for Learning Embeddings from Semantic Tasks"

26 papers
Federated Communication-Efficient Multi-Objective Optimization
Baris Askin, Pranay Sharma, Gauri Joshi, Carlee Joe-Wong
21 Oct 2024 · FedML

FAME-ViL: Multi-Tasking Vision-Language Model for Heterogeneous Fashion Tasks
Xiaoping Han, Xiatian Zhu, Licheng Yu, Li Zhang, Yi-Zhe Song, Tao Xiang
04 Mar 2023 · VLM

Modular Deep Learning
Jonas Pfeiffer, Sebastian Ruder, Ivan Vulić, E. Ponti
22 Feb 2023 · MoMe, OOD

MLink: Linking Black-Box Models from Multiple Domains for Collaborative Inference
Mu Yuan, Lan Zhang, Zimu Zheng, Yi-Nan Zhang, Xiang-Yang Li
28 Sep 2022

Multimodal Crop Type Classification Fusing Multi-Spectral Satellite Time Series with Farmers Crop Rotations and Local Crop Distribution
Valentin Barrière, M. Claverie
23 Aug 2022

RobustAnalog: Fast Variation-Aware Analog Circuit Design Via Multi-task RL
Wei Shi, Hanrui Wang, Jiaqi Gu, Mingjie Liu, David Z. Pan, Song Han, Nan Sun
13 Jul 2022

Modeling Task Interactions in Document-Level Joint Entity and Relation Extraction
Liyan Xu, Jinho Choi
04 May 2022

Crude Oil-related Events Extraction and Processing: A Transfer Learning Approach
Meisin Lee, Lay-Ki Soon, Eu-Gene Siew
01 May 2022

Knowledge Distillation from BERT Transformer to Speech Transformer for Intent Classification
Yiding Jiang, Bidisha Sharma, Maulik C. Madhavi, Haizhou Li
05 Aug 2021

Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith
16 Jun 2021

A Comprehensive Survey on Graph Anomaly Detection with Deep Learning
Xiaoxiao Ma, Jia Wu, Shan Xue, Jian Yang, Chuan Zhou, Quan Z. Sheng, Hui Xiong, Leman Akoglu
14 Jun 2021 · GNN, AI4TS

One Semantic Parser to Parse Them All: Sequence to Sequence Multi-Task Learning on Semantic Parsing Datasets
Marco Damonte, Emilio Monti
08 Jun 2021 · AIMat

Better Call the Plumber: Orchestrating Dynamic Information Extraction Pipelines
M. Y. Jaradeh, Kuldeep Singh, M. Stocker, A. Both, Sören Auer
22 Feb 2021

Cross-Domain Multi-Task Learning for Sequential Sentence Classification in Research Papers
Arthur Brack, Anett Hoppe, Pascal Buschermöhle, Ralph Ewerth
11 Feb 2021

LiteMuL: A Lightweight On-Device Sequence Tagger using Multi-task Learning
S. Kumari, Vibhav Agarwal, B. Challa, Kranti Chalamalasetti, Sourav Ghosh, Harshavardhana, Barath Raj Kandur Raja
15 Dec 2020

Multi-Task Learning with Deep Neural Networks: A Survey
M. Crawshaw
10 Sep 2020 · CVBM

Learning Functions to Study the Benefit of Multitask Learning
Gabriele Bettgenhauser, Michael A. Hedderich, Dietrich Klakow
09 Jun 2020

AdapterFusion: Non-Destructive Task Composition for Transfer Learning
Jonas Pfeiffer, Aishwarya Kamath, Andreas Rucklé, Kyunghyun Cho, Iryna Gurevych
01 May 2020 · CLL, MoMe

Multi-Task Learning for Dense Prediction Tasks: A Survey
Simon Vandenhende, Stamatios Georgoulis, Wouter Van Gansbeke, Marc Proesmans, Dengxin Dai, Luc Van Gool
28 Apr 2020 · CVBM

Learning Sparse Sharing Architectures for Multiple Tasks
Tianxiang Sun, Yunfan Shao, Xiaonan Li, Pengfei Liu, Hang Yan, Xipeng Qiu, Xuanjing Huang
12 Nov 2019 · MoE

Hierarchical Multi-Task Natural Language Understanding for Cross-domain Conversational AI: HERMIT NLU
Andrea Vanzo, E. Bastianelli, Oliver Lemon
02 Oct 2019

BAM! Born-Again Multi-Task Networks for Natural Language Understanding
Kevin Clark, Minh-Thang Luong, Urvashi Khandelwal, Christopher D. Manning, Quoc V. Le
10 Jul 2019

Improving Sentiment Analysis with Multi-task Learning of Negation
Jeremy Barnes, Erik Velldal, Lilja Øvrelid
18 Jun 2019

Better, Faster, Stronger Sequence Tagging Constituent Parsers
David Vilares, Mostafa Abdou, Anders Søgaard
28 Feb 2019

What you can cram into a single vector: Probing sentence embeddings for linguistic properties
Alexis Conneau, Germán Kruszewski, Guillaume Lample, Loïc Barrault, Marco Baroni
03 May 2018

Reference-Aware Language Models
Zichao Yang, Phil Blunsom, Chris Dyer, Wang Ling
05 Nov 2016