
Leveraging BERT for Natural Language Understanding of Domain-Specific Knowledge

EasyChair Preprint no. 13203

6 pages
Date: May 6, 2024

Abstract

Natural Language Understanding (NLU) is a core task when building conversational agents, aiming to understand the user's goal and to detect any valuable information related to it. NLU comprises Intent Detection and Slot Filling, which together semantically parse the user's utterance. One caveat when training a deep learning model for domain-specific NLU is the lack of domain-specific datasets, which leads to poorly performing models. To overcome this, we experiment with fine-tuning BERT to jointly detect the user's intent and the related slots, using a custom-generated dataset built around an organization-specific knowledge base. Our results show that well-constructed datasets lead to high detection performance, and the resulting model has the potential to enhance a future task-oriented dialogue system.

Keyphrases: BERT, intent detection, natural language understanding, slot filling, task-oriented dialogue system
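
The abstract describes fine-tuning BERT to jointly perform intent detection and slot filling. Below is a minimal, illustrative sketch of such a joint architecture using the Hugging Face transformers library; the checkpoint name, the intent and slot label sets, and the toy utterance are assumptions for illustration only and do not reflect the paper's actual dataset, labels, or training setup.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

INTENT_LABELS = ["ask_schedule", "ask_contact"]        # hypothetical intents
SLOT_LABELS = ["O", "B-person", "I-person", "B-date"]  # hypothetical BIO slot tags


class JointBert(nn.Module):
    """BERT encoder with two heads: sequence-level intent classification
    (on the pooled [CLS] representation) and token-level slot tagging."""

    def __init__(self, num_intents: int, num_slots: int):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.intent_head = nn.Linear(hidden, num_intents)
        self.slot_head = nn.Linear(hidden, num_slots)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        intent_logits = self.intent_head(out.pooler_output)    # (batch, num_intents)
        slot_logits = self.slot_head(out.last_hidden_state)    # (batch, seq_len, num_slots)
        return intent_logits, slot_logits


tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = JointBert(len(INTENT_LABELS), len(SLOT_LABELS))

# Single toy utterance; during fine-tuning, a cross-entropy loss over intents
# and one over slot tags would typically be summed and optimized end to end.
enc = tokenizer("When does John arrive?", return_tensors="pt")
intent_logits, slot_logits = model(enc["input_ids"], enc["attention_mask"])
print(intent_logits.shape, slot_logits.shape)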

BibTeX entry
BibTeX does not have a dedicated entry type for preprints; the following is a workaround that produces a correct reference:
@Booklet{EasyChair:13203,
  author = {Vasile Ionut Iga and Gheorghe Cosmin Silaghi},
  title = {Leveraging BERT for Natural Language Understanding of Domain-Specific Knowledge},
  howpublished = {EasyChair Preprint no. 13203},
  year = {EasyChair, 2024}}