Kata.ai
Publication year: 2019

Pretrained language model transfer on neural named entity recognition in Indonesian conversational texts

Written by:
Rezka Leonandya, Fariz Ikhwantri

Abstract

Named entity recognition (NER) is an important task in NLP, and it is all the more challenging in the conversational domain because of its noisy nature. Moreover, conversational texts are often available only in limited amounts, making standard supervised training infeasible. Learning from small data requires strong inductive biases. Previous work relied on hand-crafted features to encode these biases until transfer learning emerged. Here, we explore a transfer learning method, namely language model pretraining, for the NER task on Indonesian conversational texts. We pretrain on large unlabeled generic-domain data and transfer the resulting model to conversational texts, enabling supervised training on limited in-domain data. We report two transfer learning variants, namely supervised model fine-tuning and unsupervised pretrained LM fine-tuning. Our experiments show that both variants outperform baseline neural models when trained on small data (100 sentences), yielding an absolute improvement of 32 points in test F1 score. Furthermore, we find that the pretrained LM encodes part-of-speech information, which is a strong predictor for NER.
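To make the abstract's transfer recipe concrete, below is a minimal, hypothetical PyTorch sketch of the unsupervised variant (pretrained LM fine-tuning): an LSTM language model is pretrained on generic-domain text, and its embedding and recurrent layers are then reused as the encoder of an NER tagger fine-tuned on the small in-domain set. All class names, dimensions, and the toy data are assumptions for illustration, not the paper's implementation; the supervised variant would instead pretrain the tagger itself on a large labeled out-of-domain NER corpus before fine-tuning.

```python
# Hypothetical sketch of LM-pretraining transfer for NER (not the paper's code).
import torch
import torch.nn as nn

class GenericLM(nn.Module):
    """Word-level LSTM language model, pretrained on large unlabeled generic text."""
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)  # next-word prediction head

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))
        return self.out(h)

class NERTagger(nn.Module):
    """Sequence tagger that reuses the pretrained LM's embedding and LSTM weights."""
    def __init__(self, lm, num_tags):
        super().__init__()
        self.embed, self.lstm = lm.embed, lm.lstm  # transferred encoder
        self.tag_head = nn.Linear(self.lstm.hidden_size, num_tags)

    def forward(self, tokens):
        h, _ = self.lstm(self.embed(tokens))
        return self.tag_head(h)  # per-token tag logits

# After pretraining GenericLM with a next-word cross-entropy objective,
# fine-tune the whole tagger on the small (~100-sentence) labeled set.
vocab_size, num_tags = 10_000, 9          # e.g. BIO tags for 4 entity types
tagger = NERTagger(GenericLM(vocab_size), num_tags)

tokens = torch.randint(0, vocab_size, (2, 12))  # toy batch: 2 sentences x 12 tokens
gold = torch.randint(0, num_tags, (2, 12))      # toy gold BIO tags
logits = tagger(tokens)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, num_tags), gold.reshape(-1))
loss.backward()  # fine-tunes encoder and tag head jointly
```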


Other Papers

BERT Goes Brrr: A Venture Towards the Lesser Error in Classifying Medical Self-Reporters on Twitter
Publication year: 2021

IndoCollex: A Testbed for Morphological Transformation of Indonesian Word Colloquialism
Publication year: 2021

Benchmarking Multidomain English-Indonesian Machine Translation
Publication year: 2020
