Metacognitive Skills Assessment in Research-Proposal Writing (MSARPW) in the Indonesian University Context: Scale Development and Validation Using Multidimensional Item Response Models

Bahrul Hayat, Wardani Rahayu, Muhammad Dwirifqi Kharisma Putra, Iva Sarifah, Tina Deviana, Valendra Granitha Shandika Puri, Khairunesa Isa

Abstract


This study aimed to develop the Metacognitive Skills Assessment in a Research Proposal Writing Context (MSARPW), a multidimensional measure based on student perspectives for assessing metacognitive skills in thesis writing, and to evaluate the instrument's psychometric properties using item factor analysis (IFA) and multidimensional item response models. The 40-item MSARPW was administered to 602 Indonesian university students (Mage = 25.254, SDage = 6.854). The IFA showed that the two-dimensional factor structure of the MSARPW was satisfactory; however, only 24 of the 40 items fit the model. The multidimensional graded response model (MGRM) was then applied to the resulting 24-item MSARPW, which showed that one item (Item 4) did not satisfy the fit criteria. The estimated subscale reliabilities (0.891 and 0.902) indicated that the 23-item MSARPW has good internal consistency. In conclusion, the 23-item MSARPW appears to be a valid and reliable tool for assessing metacognitive skills in the context of research-proposal writing among Indonesian university students.
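The graded response model underlying the MGRM analysis assigns each ordered response category a probability derived from cumulative boundary curves. As a minimal illustration (not the authors' analysis code, which used multidimensional estimation), the unidimensional category probabilities can be sketched in Python; the function name and parameter values below are purely illustrative:

```python
import math

def grm_category_probs(theta, a, thresholds):
    """Category response probabilities under Samejima's graded response model.

    theta      : latent trait value for one respondent
    a          : item discrimination parameter
    thresholds : ordered boundary parameters b_1 < ... < b_{K-1}
    Returns a list of K probabilities, one per response category.
    (The multidimensional version replaces a*theta with a weighted sum
    over several latent dimensions.)
    """
    # P*(k): probability of responding in category k or higher,
    # with P*(lowest) = 1 and P*(beyond highest) = 0 by definition.
    star = ([1.0]
            + [1.0 / (1.0 + math.exp(-a * (theta - b))) for b in thresholds]
            + [0.0])
    # P(category k) = P*(k) - P*(k + 1); adjacent cumulative curves differ.
    return [star[k] - star[k + 1] for k in range(len(star) - 1)]

# A respondent at the mean of the trait (theta = 0) on a 4-category item:
probs = grm_category_probs(theta=0.0, a=1.5, thresholds=[-1.0, 0.0, 1.0])
```

With symmetric thresholds and theta at their midpoint, the category probabilities are symmetric and sum to one, which is a quick sanity check for any GRM implementation.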


Keywords


Item factor analysis, metacognitive skills, multidimensionality, graded response model, MSARPW questionnaire


DOI: 10.15408/jp3i.v12i1.31679


Copyright (c) 2023 Bahrul Hayat, Wardani Rahayu, Muhammad Dwirifqi Kharisma Putra, Iva Sarifah, Tina Deviana, Valendra Granitha Shandika Puri

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.