COMPARISON OF LONG SHORT TERM MEMORY AND TRANSFORMER FOR AUTOMATIC SUMMARIZATION OF INDONESIAN NEWS TEXT

    Christina Prilla Rosaria Ardyanti (2023) PERBANDINGAN LONG SHORT TERM MEMORY DAN TRANSFORMER UNTUK PERINGKASAN TEKS BERITA BERBAHASA INDONESIA SECARA OTOMATIS. S1 thesis, Universitas Pendidikan Indonesia.

    Abstract

    The growth of information on the internet has caused textual data such as news to spread ever more widely in society, and the sheer volume of data makes it difficult for humans to process information quickly. Summaries help readers grasp large amounts of information in less time, so automatic text summarization is needed to save the time and effort of summarizing text manually. In this study, an encoder-decoder architecture was implemented on the IndoSum dataset using Long Short Term Memory (LSTM) with an additional attention mechanism and using a Transformer. Experiments were also carried out by fine-tuning the pre-trained T5-Small and BART-Small models, and the experiments compared two dataset scenarios: one with preprocessing and one without. The IndoSum dataset was additionally tested against the pre-trained models without any fine-tuning. Based on the experiments, the LSTM-Attention model performed poorly, reaching its highest ROUGE-L score of 14.2 on the preprocessed dataset, while the highest ROUGE score overall, 66.2, was obtained by fine-tuning T5-Small.
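
    A minimal sketch of the evaluation idea described in the abstract: generating a summary with a pre-trained T5-Small checkpoint (no fine-tuning) and scoring it against a reference summary with ROUGE-L. The checkpoint name "t5-small", the placeholder article, and the reference text below are assumptions for illustration only; the thesis works with the IndoSum dataset, which is not loaded here, and this is not the author's code.

    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
    from rouge_score import rouge_scorer

    # Assumed checkpoint; the thesis may use a different T5-Small variant.
    model_name = "t5-small"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    # Placeholder article and gold summary; in the thesis these come from IndoSum.
    article = "Contoh artikel berita berbahasa Indonesia ..."
    reference = "Contoh ringkasan rujukan ..."

    # T5 is a text-to-text model, so the summarization task is signalled with a prefix.
    inputs = tokenizer("summarize: " + article, return_tensors="pt",
                       truncation=True, max_length=512)
    summary_ids = model.generate(**inputs, num_beams=4, max_length=128)
    hypothesis = tokenizer.decode(summary_ids[0], skip_special_tokens=True)

    # ROUGE-L measures the longest common subsequence overlap with the reference.
    scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=False)
    print(scorer.score(reference, hypothesis)["rougeL"].fmeasure)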

    Files:
    S_KOM_1900575_Title.pdf (538kB)
    S_KOM_1900575_Chapter1.pdf (393kB)
    S_KOM_1900575_Chapter2.pdf (1MB, restricted to library staff)
    S_KOM_1900575_Chapter3.pdf (641kB)
    S_KOM_1900575_Chapter4.pdf (1MB, restricted to library staff)
    S_KOM_1900575_Chapter5.pdf (287kB)
    Official URL: http://repository.upi.edu/
    Item Type: Thesis (S1)
    Additional Information: Google Scholar: https://scholar.google.com/citations?user=gdiW3PgAAAAJ&hl=en; supervisors' SINTA IDs: Yudi Wibisono (260167), Rani Megasari (5992674)
    Uncontrolled Keywords: Text Summarization, Natural Language Processing, Deep Learning, LSTM, Transformer, Attention Mechanism, ROUGE
    Subjects: L Education > L Education (General)
    Q Science > Q Science (General)
    Q Science > QA Mathematics > QA75 Electronic computers. Computer science
    Divisions: Fakultas Pendidikan Matematika dan Ilmu Pengetahuan Alam > Program Studi Ilmu Komputer
    Depositing User: Christina Prilla Rosaria Ardyanti
    Date Deposited: 07 Sep 2023 08:18
    Last Modified: 07 Sep 2023 08:18
    URI: http://repository.upi.edu/id/eprint/103343
