
Distilling Knowledge for Fast Retrieval-based Chat-bots

23 April 2020
Amir Vakili Tahami, Kamyar Ghajar, A. Shakery

Papers citing "Distilling Knowledge for Fast Retrieval-based Chat-bots"

2 / 2 papers shown
Learn What Is Possible, Then Choose What Is Best: Disentangling One-To-Many Relations in Language Through Text-based Games
Benjamin Towle, Ke Zhou
14 Apr 2023
Pretrained Transformers for Text Ranking: BERT and Beyond
Jimmy J. Lin, Rodrigo Nogueira, Andrew Yates
13 Oct 2020