The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course
<p>This study was carried out to develop and evaluate the practicality of a lab report scoring checklist for assessing practical skills indirectly in an undergraduate optics course (UOC). The study employed a quantitative approach supported by qualitative data. A scoring checklist was developed, based on the context obtained from a needs analysis, using the ADDIE instructional design model. The checklist was validated by six experts from the fields of physics and physics education. Face validity was analysed using descriptive statistics, while content validity was analysed using the Content Validation Index (CVI). A pilot test was conducted in three cycles to establish the reliability of the developed checklist; three raters from different educational levels evaluated 35 UOC lab reports. Inter-rater agreement was first analysed using the Fleiss kappa coefficient, and Cohen's kappa was then employed to determine the reliability of inter-rater agreement for each cycle. After that, the practicality of the checklist was evaluated by two lecturers while they marked 32 UOC lab reports, with the results analysed using descriptive statistics.</p><p>Findings indicated that the developed scoring checklist had satisfactory face validity, content validity (S-CVI/Ave = 0.98, S-CVI/UA = 0.87), inter-rater agreement, and test-retest reliability. The checklist also achieved a satisfactory level of practicality among the course lecturers. In conclusion, a lab report scoring checklist with satisfactory validity, reliability, and practicality for the indirect assessment of practical skills in the UOC was successfully developed. The study implies that the developed checklist could reduce the practical-skill assessment load faced by UOC lecturers, guide students in writing professional lab reports, and demonstrate that indirect assessment can be used to assess practical skills in the physics laboratory.</p>
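The content-validity figures quoted in the abstract (S-CVI/Ave and S-CVI/UA) follow the standard content-validity index procedure: each expert rates each checklist item as relevant or not, the item-level CVI (I-CVI) is the proportion of experts rating the item relevant, S-CVI/Ave is the mean of the I-CVIs, and S-CVI/UA is the proportion of items on which every expert agrees. A minimal sketch, using a hypothetical ratings matrix rather than the thesis's data:

```python
# Content Validation Index (CVI) sketch.
# ratings[item][expert] = 1 if the expert rated the item "relevant"
# (e.g. 3 or 4 on a 4-point relevance scale), else 0.
# This matrix is hypothetical, not the study's expert ratings.
ratings = [
    [1, 1, 1, 1, 1, 1],   # item 1: all six experts agree
    [1, 1, 1, 1, 1, 0],   # item 2: five of six agree
    [1, 1, 1, 1, 1, 1],   # item 3: all agree
]

def cvi(ratings):
    n_experts = len(ratings[0])
    # I-CVI: proportion of experts rating each item relevant
    i_cvi = [sum(item) / n_experts for item in ratings]
    # S-CVI/Ave: mean of the item-level CVIs
    s_cvi_ave = sum(i_cvi) / len(i_cvi)
    # S-CVI/UA: proportion of items with universal agreement
    s_cvi_ua = sum(all(item) for item in ratings) / len(ratings)
    return i_cvi, s_cvi_ave, s_cvi_ua

i_cvi, ave, ua = cvi(ratings)
```

With six experts, a single dissent on an item drops its I-CVI to 5/6 and removes it from the universal-agreement count, which is why S-CVI/UA (0.87 in the study) sits below S-CVI/Ave (0.98).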
Saved in:
Main Author: | Muhamad Zulhelmi Othman |
---|---|
Format: | thesis |
Language: | eng |
Published: | 2023 |
Subjects: | QC Physics |
Online Access: | https://ir.upsi.edu.my/detailsg.php?det=10590 |
Tags: | |
id |
oai:ir.upsi.edu.my:10590 |
---|---|
record_format |
uketd_dc |
institution |
Universiti Pendidikan Sultan Idris |
collection |
UPSI Digital Repository |
language |
eng |
topic |
QC Physics |
spellingShingle |
QC Physics Muhamad Zulhelmi Othman The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
description |
<p>This study was carried out to develop and evaluate the practicality of a lab report scoring checklist for assessing practical skills indirectly in an undergraduate optics course (UOC). The study employed a quantitative approach supported by qualitative data. A scoring checklist was developed, based on the context obtained from a needs analysis, using the ADDIE instructional design model. The checklist was validated by six experts from the fields of physics and physics education. Face validity was analysed using descriptive statistics, while content validity was analysed using the Content Validation Index (CVI). A pilot test was conducted in three cycles to establish the reliability of the developed checklist; three raters from different educational levels evaluated 35 UOC lab reports. Inter-rater agreement was first analysed using the Fleiss kappa coefficient, and Cohen's kappa was then employed to determine the reliability of inter-rater agreement for each cycle. After that, the practicality of the checklist was evaluated by two lecturers while they marked 32 UOC lab reports, with the results analysed using descriptive statistics.</p><p>Findings indicated that the developed scoring checklist had satisfactory face validity, content validity (S-CVI/Ave = 0.98, S-CVI/UA = 0.87), inter-rater agreement, and test-retest reliability. The checklist also achieved a satisfactory level of practicality among the course lecturers. In conclusion, a lab report scoring checklist with satisfactory validity, reliability, and practicality for the indirect assessment of practical skills in the UOC was successfully developed. The study implies that the developed checklist could reduce the practical-skill assessment load faced by UOC lecturers, guide students in writing professional lab reports, and demonstrate that indirect assessment can be used to assess practical skills in the physics laboratory.</p> |
format |
thesis |
qualification_name |
|
qualification_level |
Master's degree |
author |
Muhamad Zulhelmi Othman |
author_facet |
Muhamad Zulhelmi Othman |
author_sort |
Muhamad Zulhelmi Othman |
title |
The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
title_short |
The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
title_full |
The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
title_fullStr |
The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
title_full_unstemmed |
The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
title_sort |
development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course |
granting_institution |
Universiti Pendidikan Sultan Idris |
granting_department |
Fakulti Sains dan Matematik |
publishDate |
2023 |
url |
https://ir.upsi.edu.my/detailsg.php?det=10590 |
_version_ |
1804890578292310016 |
spelling |
oai:ir.upsi.edu.my:10590 2024-07-10 The development and evaluation of a lab report scoring checklist to assess the practical skills indirectly in an undergraduate optics course 2023 Muhamad Zulhelmi Othman QC Physics <p>This study was carried out to develop and evaluate the practicality of a lab report scoring checklist for assessing practical skills indirectly in an undergraduate optics course (UOC). The study employed a quantitative approach supported by qualitative data. A scoring checklist was developed, based on the context obtained from a needs analysis, using the ADDIE instructional design model. The checklist was validated by six experts from the fields of physics and physics education. Face validity was analysed using descriptive statistics, while content validity was analysed using the Content Validation Index (CVI). A pilot test was conducted in three cycles to establish the reliability of the developed checklist; three raters from different educational levels evaluated 35 UOC lab reports. Inter-rater agreement was first analysed using the Fleiss kappa coefficient, and Cohen's kappa was then employed to determine the reliability of inter-rater agreement for each cycle. After that, the practicality of the checklist was evaluated by two lecturers while they marked 32 UOC lab reports, with the results analysed using descriptive statistics. Findings indicated that the developed scoring checklist had satisfactory face validity, content validity (S-CVI/Ave = 0.98, S-CVI/UA = 0.87), inter-rater agreement, and test-retest reliability. The checklist also achieved a satisfactory level of practicality among the course lecturers. In conclusion, a lab report scoring checklist with satisfactory validity, reliability, and practicality for the indirect assessment of practical skills in the UOC was successfully developed. The study implies that the developed checklist could reduce the practical-skill assessment load faced by UOC lecturers, guide students in writing professional lab reports, and demonstrate that indirect assessment can be used to assess practical skills in the physics laboratory.</p> 2023 thesis https://ir.upsi.edu.my/detailsg.php?det=10590 https://ir.upsi.edu.my/detailsg.php?det=10590 text eng closedAccess Masters Universiti Pendidikan Sultan Idris Fakulti Sains dan Matematik |