QAID: Question Answering Inspired Few-shot Intent Detection
Authors: Asaf Yehudai, Matan Vetzler, Yosi Mass, Koren Lazar, Doron Cohen, Boaz Carmeli
ICLR 2023
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | Our results on three few-shot intent detection benchmarks achieve state-of-the-art performance. |
| Researcher Affiliation | Collaboration | IBM Israel Research Lab, Hebrew University of Jerusalem |
| Pseudocode | No | The paper describes the framework and training process in text and with equations, but does not include any clearly labeled pseudocode or algorithm blocks. |
| Open Source Code | No | The paper links to external libraries and architectures used (e.g., Hugging Face transformers, ColBERT, SupContrast, Faiss) but does not provide a direct link to their own implementation of QAID or explicitly state that their code for this paper is released. |
| Open Datasets | Yes | We experiment with three widely studied few-shot intent detection datasets... For ready use: https://github.com/jianguoz/Few-Shot-Intent-Detection/tree/main/Datasets |
| Dataset Splits | Yes | Table 1: Data statistics of the three intent detection datasets from DialoGLUE. It lists #Train, #Valid, and #Test counts for each dataset. |
| Hardware Specification | Yes | Our fine-tuning takes only ten minutes on one NVIDIA V100 GPU |
| Software Dependencies | No | The paper mentions software like 'Hugging Face transformers library', 'ColBERT architecture', and 'Faiss Index', but does not specify their version numbers or any other software dependencies with specific versions. |
| Experiment Setup | Yes | We train our encoder for 20 epochs with a batch size of 64, a learning rate of 1e-5, a temperature parameter τ of 0.07 and λ = 0.1... We train our model for 10 epochs... We set the batch size to 32... We set the temperature to 0.07. We also set λ_class and λ_mlm to 0.1 and 0.05, respectively. |
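
The reported hyperparameters can be grouped into a single configuration object for anyone attempting a reproduction. A minimal sketch follows; the values come from the Experiment Setup row above, but the class and field names are assumptions (the paper does not publish a config schema), and the two stages reflect the two training setups quoted in the paper.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QAIDTrainingConfig:
    """Hyperparameters reported in the QAID paper's experiment setup.

    Field names are illustrative; only the numeric values are taken
    from the paper.
    """
    # Encoder pre-training stage
    encoder_epochs: int = 20
    encoder_batch_size: int = 64
    learning_rate: float = 1e-5
    temperature: float = 0.07   # contrastive temperature tau
    lambda_weight: float = 0.1  # lambda in the combined loss
    # Fine-tuning stage
    finetune_epochs: int = 10
    finetune_batch_size: int = 32
    lambda_class: float = 0.1   # classification loss weight
    lambda_mlm: float = 0.05    # masked-language-modeling loss weight

cfg = QAIDTrainingConfig()
```

Freezing the dataclass keeps a reproduction run from silently mutating hyperparameters mid-experiment.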