3-4 July 2024
CSIR ICC
Africa/Johannesburg timezone

Context-Based Question Answering using Large Language BERT Variant Models for Low Resourced Sotho sa Leboa Language.

4 Jul 2024, 13:05
20m
ICC-G-Ruby - Ruby Auditorium (CSIR ICC)

Talk Session

Speaker

Hlaudi Masethe (Tshwane University of Technology)

Description

Reading a text and responding to questions about it is challenging for machines, since it requires both a grasp of natural language and knowledge of the outside world (Akhila et al., 2023). Question answering systems (QAS) are among the most difficult areas of information retrieval and natural language processing. The goal of a question answering system is to use a provided context or knowledge base to answer the user's questions in natural language. Answers can be produced in both closed and open domains: a closed-domain system's responses are limited to a specific situation, whereas an open-domain system can provide answers in human-readable language from a vast knowledge base. A further challenge is producing answers that depend on a particular situation, since each question can have several interpretations and responses depending on the context to which it relates (Kumari et al., 2022). To the best of our knowledge, this research is the first effort to extract answers from a context in the low-resourced Sesotho sa Leboa language. Bidirectional Encoder Representations from Transformers (BERT) variant models, namely ALBERT and DistilBERT, are used as the language models in this research study.
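
The study's own models and data are not provided here, but the following minimal Python sketch illustrates the general approach the abstract describes: extractive, context-based question answering with a BERT-variant reader via the Hugging Face Transformers pipeline. The checkpoint name, example context, and question are illustrative assumptions, not the authors' Sesotho sa Leboa resources.

# Minimal sketch of extractive (context-based) question answering with a
# BERT-variant reader. The English SQuAD-tuned DistilBERT checkpoint is a
# placeholder; the study's Sesotho sa Leboa fine-tuned ALBERT/DistilBERT
# models are not named in the abstract.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",  # placeholder checkpoint
)

context = (
    "Sesotho sa Leboa (Northern Sotho) is one of the official languages "
    "of South Africa and is considered low-resourced for NLP."
)
question = "Which language is considered low-resourced for NLP?"

result = qa(question=question, context=context)
print(result["answer"], result["score"])  # answer span extracted from the context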

Primary author

Mrs Mosima Annan Masethe (Sefako Makgatho Health Sciences University)

Co-author

Hlaudi Masethe (Tshwane University of Technology)

Presentation Materials

There are no materials yet.