- Journal of Artificial Intelligence and Data Science
- Volume:1 Issue:2
Performance Trade-Off for Bert Based Multi-Domain Multilingual Chatbot Architectures
Authors : Davut Emre TAŞAR, Şükrü OZAN, Seçilay KUTAL, Oğuzhan ÖLMEZ, Semih GÜLÜM, Fatih AKCA, Ceren BELHAN
Pages : 144-149
Publication Date : 2021-12-30
Article Type : Research Paper
Abstract : Text classification is a natural language processing (NLP) problem that aims to classify previously unseen texts. In this study, the Bidirectional Encoder Representations from Transformers (BERT) architecture is preferred for text classification. The classification is aimed explicitly at a chatbot that can give automated responses to website visitors' queries. BERT is trained to reduce the need for RAM and storage by replacing multiple separate models for different chatbots on a server with a single model. Moreover, since a pre-trained multilingual BERT model is preferred, the system further reduces the need for system resources: it handles multiple chatbots in multiple languages simultaneously. The model determines a class for a given input text. The classes correspond to specific answers in a database, and the bot selects an answer and replies. For multiple chatbots, a special masking operation is performed to select a response from within the corresponding answer bank of a chatbot. We tested the proposed model on 13 simultaneous classification problems on a data set of three languages, Turkish, English, and German, with 333 classes in total. We report the accuracies for the individually trained models and for the proposed model, together with the savings in system resources.
Keywords : BERT, classification, chatbot, memory gain, multi-domain, multilingual
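The masking operation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the shared model emits one logit per class over all 333 classes, and that each chatbot owns a known subset of class indices (its answer bank). The function name `masked_prediction` and the toy scores are hypothetical.

```python
import numpy as np

def masked_prediction(logits, bot_class_ids):
    """Restrict a shared classifier's logits to one chatbot's own classes.

    logits: 1-D array of scores over all classes from the shared model.
    bot_class_ids: indices of the classes (answer bank) owned by this chatbot.
    """
    mask = np.full_like(logits, -np.inf)  # suppress classes of other chatbots
    mask[bot_class_ids] = 0.0             # leave this bot's classes unchanged
    return int(np.argmax(logits + mask))  # best class within the answer bank

# Hypothetical example: 6 classes in total, the bot owns classes 2..4.
scores = np.array([3.0, 5.0, 1.0, 2.5, 0.4, 4.2])
print(masked_prediction(scores, [2, 3, 4]))  # 3
```

Because the mask is additive in logit space, the globally highest-scoring classes (here 1 and 5) are excluded whenever they belong to a different chatbot, and the reply is always drawn from the querying bot's own answer bank.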