COMPARISON OF PUBLIC SENTIMENT ANALYSIS USING THE LONG SHORT-TERM MEMORY (LSTM) AND INDONESIAN BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (INDOBERT) MODELS (CASE STUDY: THE MSIB PROGRAM)

    Alya Sahrani and Rizki Hikmawan (2025) COMPARISON OF PUBLIC SENTIMENT ANALYSIS USING THE LONG SHORT-TERM MEMORY (LSTM) AND INDONESIAN BIDIRECTIONAL ENCODER REPRESENTATIONS FROM TRANSFORMERS (INDOBERT) MODELS (CASE STUDY: THE MSIB PROGRAM). S1 thesis, Universitas Pendidikan Indonesia.

    Abstract

    This study compares the performance of Long Short-Term Memory (LSTM) and Indonesian Bidirectional Encoder Representations from Transformers (IndoBERT) for sentiment analysis of public opinion about the Magang dan Studi Independen Bersertifikat (MSIB) program. In the digital era, social media has become a primary source of public opinion; however, the resulting texts are often unstructured and contain sarcasm, code-switching, and informal language, which pose challenges for conventional models. The research adopts a quantitative, descriptive-comparative approach using 3,593 public opinion texts collected by crawling the social media platform X. After preprocessing, the sentiment distribution was 61.5% negative and 38.5% positive. Both the LSTM and IndoBERT models were trained and evaluated using standard classification metrics. LSTM achieved 85% accuracy, 91% precision, 83% recall, and an 87% F1-score, while IndoBERT significantly outperformed it with 95% accuracy, 95% precision, 97% recall, and a 96% F1-score. These findings demonstrate that IndoBERT is more effective in handling the complex text characteristics typical of Indonesian social media.
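    As a point of reference for the figures reported above, the sketch below shows how the four classification metrics named in the abstract (accuracy, precision, recall, F1-score) are typically computed for a binary sentiment task with scikit-learn. The label arrays and the helper name evaluate_sentiment_model are illustrative assumptions and are not taken from the thesis.

        # Illustrative sketch (not from the thesis): computing the evaluation
        # metrics named in the abstract for a binary sentiment classifier.
        from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

        def evaluate_sentiment_model(y_true, y_pred, positive_label=1):
            """Return accuracy, precision, recall, and F1 for the positive class."""
            return {
                "accuracy": accuracy_score(y_true, y_pred),
                "precision": precision_score(y_true, y_pred, pos_label=positive_label),
                "recall": recall_score(y_true, y_pred, pos_label=positive_label),
                "f1_score": f1_score(y_true, y_pred, pos_label=positive_label),
            }

        if __name__ == "__main__":
            # Placeholder gold labels and predictions (1 = positive, 0 = negative);
            # real labels would come from the annotated MSIB opinion dataset.
            y_true = [1, 0, 1, 1, 0, 0, 1, 0]
            y_pred = [1, 0, 1, 0, 0, 0, 1, 1]
            for name, value in evaluate_sentiment_model(y_true, y_pred).items():
                print(f"{name}: {value:.2f}")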

    S_PSTI_2102985_Title.pdf - Download (3MB)
    S_PSTI_2102985_Chapter 1.pdf - Download (297kB)
    S_PSTI_2102985_Chapter 2.pdf - Restricted to Library Staff - Download (514kB)
    S_PSTI_2102985_Chapter 3.pdf - Download (431kB)
    S_PSTI_2102985_Chapter 4.pdf - Restricted to Library Staff - Download (1MB)
    S_PSTI_2102985_Chapter 5.pdf - Download (279kB)
    S_PSTI_2102985_Appendix.pdf - Restricted to Library Staff - Download (1MB)
    Official URL: https://repository.upi.edu/
    Item Type: Thesis (S1)
    Additional Information: https://scholar.google.com/citations?hl=en&user=udrP4hgAAAAJ; Supervisor's SINTA ID (Rizki Hikmawan): 6122897
    Uncontrolled Keywords: Sentiment Analysis, LSTM, IndoBERT, MSIB Program, Machine Learning.
    Subjects: T Technology > T Technology (General)
    Divisions: UPI Kampus Purwakarta > S1 Pendidikan Sistem Teknologi dan Informasi
    Depositing User: Alya Sahrani
    Date Deposited: 01 Sep 2025 05:00
    Last Modified: 01 Sep 2025 05:00
    URI: http://repository.upi.edu/id/eprint/136629
