Assessing novice programmers’ performance in programming exams via computer-based test

Gürsoy, D. (2016) Assessing novice programmers’ performance in programming exams via computer-based test.

Abstract: Problem: Students registered in computer science programs have difficulty answering programming exam questions in which they are expected to write code on paper. Students are accustomed to writing code in code editors; in paper-based tests they cannot take advantage of editor features such as syntax highlighting, automatic indentation and code auto-completion. They spend extra time checking their code and placing punctuation marks correctly, struggle to insert new code between lines already written, and find it harder to trace code without syntax highlighting. When students take a computer-based test (CBT) in a programming course, they are expected to finish earlier because they benefit from a code editor; with the extra time they can check their answers, so CBT might create a significant exam score difference between students who take CBT and those who take a paper-based test (PBT). To overcome the problems students face in paper-based programming exams, students were introduced to a computer-based test that lets them write their answers in a code editor. Method: The study is a mixed-methods study consisting of quantitative and qualitative parts, with the qualitative part used to clarify the results of the quantitative part. The quantitative part consists of two exams taken by 44 students (28 male, 16 female) at the Middle East Technical University in an experimental setting, together with data from a TAM questionnaire measuring the Perceived Ease of Use, Perceived Usefulness and Attitude Towards Use constructs. The quantitative part asks which group performs better in the tests and completes them earlier, and what the correlations between the TAM variables are. It is also checked whether exam score and time spent affect the perceived ease of use of the computer-based test.
The qualitative part consists of semi-structured interviews conducted with four participants (three male, one female) at the University of Twente, and aims to find the factors affecting students' attitude towards the computer-based test. Attitude is measured because, in the literature, it indicates whether students accept computer-based tests. Data analyses: Independent-samples t-tests were used to compare exam scores and time spent in the exam between the control and experimental groups. Reliability of the exams was checked with the improved split-half method, and reliability of the TAM questionnaire with Cronbach's alpha. Correlations between the TAM variables were analysed with Spearman's rho. Interviews were recorded, transcribed, and analysed by assigning descriptive codes and deriving themes from those codes. Results: Quantitative analysis shows no statistically significant exam score difference between students who took the CBT and those who took the PBT, but students who took the CBT completed the exam in significantly less time. Correlation analysis among the TAM variables shows that perceived ease of use correlates positively and significantly with perceived usefulness and with attitude towards use; however, perceived usefulness does not correlate significantly with attitude towards use of the CBT. Exam score and time spent in the exam do not correlate with the perceived ease of use of the CBT. Qualitative analysis shows that the code editor, the design of the software and the pictures in the questions are the factors affecting students' attitude towards the CBT.
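As a minimal sketch of how the reported analyses could be run, the snippet below applies an independent-samples t-test, Cronbach's alpha, and Spearman's rho using SciPy/NumPy. The data are randomly generated placeholders, not the thesis's actual scores, and the group sizes and scale ranges are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical exam scores for the two groups (not the study's real data)
cbt_scores = rng.normal(70, 10, 22)  # experimental group: computer-based test
pbt_scores = rng.normal(68, 10, 22)  # control group: paper-based test

# Independent-samples t-test comparing exam scores between groups
t_stat, p_value = stats.ttest_ind(cbt_scores, pbt_scores)

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                       # number of items
    item_vars = items.var(axis=0, ddof=1)    # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Spearman's rho between two TAM constructs (hypothetical Likert responses)
peou = rng.integers(1, 8, 44).astype(float)      # perceived ease of use
pu = peou + rng.integers(-1, 2, 44)              # loosely related construct
rho, rho_p = stats.spearmanr(peou, pu)
```

The t-test assumes roughly normal, independent samples; Spearman's rho is used instead of Pearson's r because Likert-scale TAM responses are ordinal.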
Item Type: Essay (Master)
Faculty: BMS: Behavioural, Management and Social Sciences
Subject: 81 education, teaching
Programme: Educational Science and Technology MSc (60023)
Link to this item: http://purl.utwente.nl/essays/70114