DETEKSI BERITA PALSU MENGGUNAKAN BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT)

Listia Ningrum (2024) DETEKSI BERITA PALSU MENGGUNAKAN BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (BERT). S1 thesis, Universitas Pendidikan Indonesia.

Files:
- S_KOM_2008084_Title.pdf (462 kB)
- S_KOM_2008084_Chapter1.pdf (316 kB)
- S_KOM_2008084_Chapter2.pdf (1 MB, restricted to library staff)
- S_KOM_2008084_Chapter3.pdf (131 kB)
- S_KOM_2008084_Chapter4.pdf (952 kB, restricted to library staff)
- S_KOM_2008084_Chapter5.pdf (34 kB)
- S_KOM_2008084_Appendix.pdf (129 kB, restricted to library staff)
Official URL: https://repository.upi.edu/

Abstract

With the rapid advancement of technology and the growth of internet usage in Indonesia, the spread of fake news has become increasingly widespread, causing negative impacts in various areas. To address this issue, automating fake news detection with Natural Language Processing (NLP) technology is crucial. This study focuses on applying transformer-based deep learning models, IndoBERT and MBERT, to fake news detection. The aim is to develop an Indonesian fake news detection system by fine-tuning the IndoBERT and MBERT models and to evaluate the system's performance across various scenarios.
The results show that, after fine-tuning, the fake news detection system based on BERT models achieves excellent performance. The optimal parameters for IndoBERT are 3 epochs, batch size 4, learning rate 1e-5, dropout 0.3, and weight decay 0.1, while for MBERT they are 4 epochs, batch size 4, learning rate 2e-5, dropout 0.3, and weight decay 0.1. IndoBERT outperforms MBERT with an accuracy of 97.64%, precision of 97.64%, recall of 97.64%, and F1-score of 97.63%, compared to MBERT's accuracy of 97.26%, precision of 97.27%, recall of 97.26%, and F1-score of 97.27%. The optimal training data size is 100% for IndoBERT and 75% for MBERT. Punctuation usage does not significantly affect IndoBERT but slightly improves MBERT's accuracy.
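The thesis code itself is not published on this page, so the following is only an illustrative sketch: it records the optimal fine-tuning hyperparameters reported in the abstract and shows, in pure Python, how the reported accuracy, precision, recall, and F1 metrics are computed for a binary fake/real labeling task. All function and variable names here are the sketch's own assumptions, not the author's implementation.

```python
# Optimal fine-tuning hyperparameters as reported in the abstract.
HYPERPARAMS = {
    "indobert": {"epochs": 3, "batch_size": 4, "learning_rate": 1e-5,
                 "dropout": 0.3, "weight_decay": 0.1},
    "mbert":    {"epochs": 4, "batch_size": 4, "learning_rate": 2e-5,
                 "dropout": 0.3, "weight_decay": 0.1},
}

def binary_metrics(y_true, y_pred, positive=1):
    """Accuracy, precision, recall, and F1 for binary fake-news labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}

# Toy example: 1 = fake, 0 = real (not the thesis dataset).
truth = [1, 0, 1, 1, 0, 0, 1, 0]
preds = [1, 0, 1, 0, 0, 0, 1, 1]
print(binary_metrics(truth, preds))  # accuracy, precision, recall, F1 all 0.75
```

In practice the fine-tuning itself would be driven by a framework such as Hugging Face Transformers, feeding these hyperparameters into its training configuration; the metric definitions above are the standard ones regardless of framework.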

Item Type: Thesis (S1)
Additional Information: Google Scholar profile: https://scholar.google.com/citations?hl=en&user=NseQdUIAAAAJ. Supervisor SINTA IDs: Yudi Wibisono: 260167; Yaya Wihardi: 5994413.
Uncontrolled Keywords: Fake news detection, Fine-tuning, IndoBERT, MBERT, Transformers.
Subjects: Q Science > Q Science (General)
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Divisions: Fakultas Pendidikan Matematika dan Ilmu Pengetahuan Alam > Program Studi Ilmu Komputer
Depositing User: Listia Ningrum
Date Deposited: 02 Sep 2024 09:26
Last Modified: 02 Sep 2024 09:26
URI: http://repository.upi.edu/id/eprint/122281
