IMPROVING KNOWLEDGE OF HOW ARTIFICIAL INTELLIGENCE LEARNS HUMAN LANGUAGE TO MOTIVATE STUDENTS TO LEARN ENGLISH
Abstract
Abstract: English is today's internationally agreed language and the most widely used language in the world. However, learning English is not easy for many students in general, and for the students of SMA Negeri 3 Kota Tegal in particular. In this community service activity, we developed a web-based application to facilitate English learning with a method inspired by the way artificial intelligence learns human language. Then, 36 students were taught about the AI methods that have succeeded in understanding human language and how those AI models learn. This activity is important for inspiring and facilitating students in learning English. In the education session, the students took a pretest and a posttest of 15 questions, designed to simulate training on data: each training session's score should be better than the previous session's, just as AI learns. The highest pretest score was 80%, while the highest posttest score reached 100%, showing the students that learning a language repeatedly, as AI does, improves their ability. The expected benefits for the students from this activity are improved hard skills in English and soft skills in terms of confidence in communicating and interacting with others.
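The pretest/posttest design mirrors how a language model's score on the same evaluation rises with each pass over its training data. The toy sketch below (not the article's actual web application; the corpus, quiz, and function names are illustrative assumptions) uses the same fill-in-the-blank idea behind masked language models such as BERT: a learner that memorizes bigram counts answers more cloze questions correctly as it sees more sentences per pass.

```python
from collections import Counter, defaultdict

# Toy corpus and a fixed quiz of cloze items: given the left-hand word,
# predict the word that follows it (a miniature masked-word task).
corpus = [
    "the cat sat on the mat",
    "she reads a book every day",
    "they play football after school",
    "he drinks coffee in the morning",
]
quiz = [
    ("the", "cat"), ("reads", "a"), ("play", "football"), ("drinks", "coffee"),
]

def train_and_score(sentences):
    """Count bigrams in the given sentences, then score the fixed quiz (0-100)."""
    bigrams = defaultdict(Counter)
    for s in sentences:
        words = s.split()
        for a, b in zip(words, words[1:]):
            bigrams[a][b] += 1
    correct = 0
    for left, answer in quiz:
        guesses = bigrams.get(left)
        # Predict the most frequently observed continuation of `left`.
        if guesses and guesses.most_common(1)[0][0] == answer:
            correct += 1
    return 100 * correct // len(quiz)

# Each "training session" sees more data, so the score on the same quiz rises,
# just as the students' posttest scores exceeded their pretest scores.
for session in range(1, len(corpus) + 1):
    print(f"after session {session}: {train_and_score(corpus[:session])}% on the quiz")
```

The analogy is deliberately simple: repetition and more exposure raise the score on an unchanged test, which is the point the pretest/posttest simulation makes to the students.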
DOI: https://doi.org/10.31764/jmm.v8i5.25744
Copyright (c) 2024 Mirza Alim Mutasodirin, Muchammad Sofyan Firmansyah
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
________________________________________________________________
JMM (Jurnal Masyarakat Mandiri) p-ISSN 2598-8158 & e-ISSN 2614-5758