Psychometric Evaluation of Science Literacy Instrument Items on Momentum and Impulse Based on the Three-Parameter Logistic (3PL) IRT Model
DOI: https://doi.org/10.54082/jupin.2321

Keywords: 3PL IRT analysis, item characterization, science literacy, momentum and impulse, instrument validation

Abstract
This study characterizes the items of a science literacy instrument on momentum and impulse using the three-parameter logistic (3PL) model of Item Response Theory (IRT). The study followed a research and development (R&D) design at the limited-trial stage, involving 114 senior high school students who had already studied momentum and impulse. The instrument is a multiple-choice test built on science literacy indicators covering the abilities to explain phenomena scientifically, interpret data, and evaluate science-related issues. The unidimensionality assumption was checked with exploratory factor analysis (EFA), which revealed a single dominant factor (KMO = 0.781; RMSEA = 0.063). The 3PL IRT analysis, performed in jMetrik, yielded estimates of the discrimination (a), difficulty (b), and guessing (c) parameters. Most items fell in the moderate-to-good range, but several items showed low discrimination, one item showed negative discrimination, some had extreme difficulty values, and some had relatively high guessing values. The test information function indicated that measurement precision is highest in the middle of the ability range. Overall, the instrument shows good potential but requires revision of several items before large-scale use. The study contributes to the development of IRT-based science literacy instruments with improved validity and measurement precision.
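For reference, "3PL" here denotes the standard three-parameter logistic item response function; the formulas below are the textbook forms (some implementations add a scaling constant D ≈ 1.7 in the exponent), not equations quoted from the article:

$$P_i(\theta) = c_i + \frac{1 - c_i}{1 + e^{-a_i(\theta - b_i)}}$$

where $\theta$ is the latent ability and $a_i$, $b_i$, $c_i$ are the discrimination, difficulty, and guessing (lower-asymptote) parameters of item $i$. The item and test information functions, which locate the ability range where the test is most precise, are

$$I_i(\theta) = a_i^2\,\frac{1 - P_i(\theta)}{P_i(\theta)}\left(\frac{P_i(\theta) - c_i}{1 - c_i}\right)^2, \qquad I(\theta) = \sum_i I_i(\theta).$$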
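To make the test information function concrete, here is a minimal Python sketch of the computation, independent of jMetrik; the item parameters are invented placeholders, not the article's estimates:

```python
import numpy as np

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b, c):
    """Fisher information of a single 3PL item at ability theta."""
    p = p_3pl(theta, a, b, c)
    return a**2 * ((1.0 - p) / p) * ((p - c) / (1.0 - c))**2

# Illustrative parameters (NOT the article's estimates): a, b, c per item.
a = np.array([1.2, 0.8, 0.3, 1.5])
b = np.array([-0.5, 0.0, 2.8, 0.4])
c = np.array([0.20, 0.25, 0.30, 0.15])

theta = np.linspace(-4, 4, 161)
# Test information = sum of item informations; its peak marks the
# ability range where the test measures most precisely.
tif = sum(item_information(theta, ai, bi, ci) for ai, bi, ci in zip(a, b, c))
print("Peak precision near theta =", theta[np.argmax(tif)])
```

Because information scales with a squared, an item whose discrimination is near zero contributes almost nothing to test precision, which is consistent with the abstract's recommendation to revise the low- and negatively discriminating items.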
License
Copyright (c) 2026 Della Apriyani Kusuma Putri, Ina Ana Khoeriah, Muhammad Firdaus

This article is licensed under the Creative Commons Attribution 4.0 International License.



