# This CITATION.cff file was generated with cffinit.
# Visit https://bit.ly/cffinit to generate yours today!
cff-version: 1.2.0
title: >-
  Bidirectional Encoder Representation from Transformer
  Tagalog Part of Speech Tagger
message: 'If you use this software, please cite it as below.'
type: software
authors:
  - given-names: Kenth
    family-names: Saya-ang
    email: [email protected]
  - given-names: Mary Grizelle
    family-names: Hamor
  - given-names: Denise Julianne
    family-names: Gozum
  - given-names: Ria Karen
    family-names: Mabansag
repository-code: 'https://github.com/syke9p3/bert-tagalog-pos-tagger'
abstract: >-
  This model addresses the need for an efficient and
  accurate Part-of-Speech (POS) tagger for the Filipino
  language by utilizing the Bidirectional Encoder
  Representations from Transformers (BERT) model's
  capability for contextual analysis of language. The
  methodology involved fine-tuning Jiang's pre-trained
  Tagalog BERT Base Uncased model for POS tagging,
  resulting in an accuracy of 96.4835% and highlighting
  BERT's ability to capture the syntactic structures and
  contextual nuances of the Filipino language.
  Nevertheless, concerns about potential overfitting
  arose, limiting the model's generalizability beyond the
  specific dataset used for training and evaluation.
keywords:
  - Natural Language Processing
  - Bidirectional Encoder Representations from Transformers
  - Part-of-Speech Tagging
  - Filipino