How early is early enough: Correlating student performance with final grades
Pittman J.M.; Titus K.; Williams L.
2021
Journal of Higher Education Theory and Practice
0
10.33423/jhetp.v21i6.4370
Student retention is one of the greatest challenges facing computer science programs. Difficulties in an introductory programming class often snowball, leading students to perform poorly or to drop the major entirely. In this paper, we present an analysis of 197 students across 6 semesters and 11 sections of an introductory programming class at a private four-year liberal arts university in the southeastern United States. The goal of this research was to find the earliest point in the assessment sequence that could predict final grade outcomes. Accordingly, we measured the degree of correlation between student performance on quizzes, labs, programs, and tests and the final course grade. Overall, the results show a strong positive correlation for all four assessment modalities. These results are significant for educators and researchers because they extend the body of computing education research by evaluating how effectively early-semester subsets of each of the four categories of student data model class outcomes. © 2021, North American Business Press. All rights reserved.
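The abstract does not name the specific correlation statistic used; a minimal sketch of the kind of analysis described, assuming Pearson correlation computed with NumPy on hypothetical early-semester quiz averages and final grades, might look like:

```python
# Hedged sketch of correlating an early-semester assessment category with
# final course grade. The scores below are hypothetical, not the paper's data,
# and Pearson's r (via np.corrcoef) is an assumption about the statistic used.
import numpy as np

quiz_avg = np.array([70.0, 85.0, 60.0, 90.0, 75.0])  # hypothetical quiz averages
final_grade = np.array([72.0, 88.0, 58.0, 95.0, 74.0])  # hypothetical final grades

# np.corrcoef returns the 2x2 correlation matrix; the off-diagonal entry
# is the Pearson correlation between the two score series.
r = np.corrcoef(quiz_avg, final_grade)[0, 1]
print(f"Pearson r = {r:.3f}")
```

In practice the same computation would be repeated for each assessment modality (quizzes, labs, programs, tests) and for progressively larger early-semester subsets, to locate the earliest point at which the correlation with the final grade becomes strong.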
Educational data mining; Grades; Learning analytics; Retention; Student performance
Bornat R., Dehnadi S., Mental models, consistency and programming aptitude, Proceedings of the Tenth Conference on Australasian Computing Education, 78, pp. 53-61, (2008)
Butcher D.F., Muth W.A., Predicting performance in an introductory computer science course, Communications of the ACM, 28, 3, pp. 263-268, (1985)
Gorson J., O'Rourke E., Why do CS1 students think they're bad at programming? Investigating self-efficacy and self-assessments at three universities, Proceedings of the 2020 ACM Conference on International Computing Education Research, pp. 170-181, (2020)
Guo A., Analysis of Factors and Interventions Relating to Student Performance in CS1 and CS2, (2020)
Ihantola P., Rivers K., Rubio M.A., Sheard J., Skupas B., Spacco J., Petersen A., Educational data mining and learning analytics in programming: Literature review and case studies, Proceedings of the 2015 ITiCSE on Working Group Reports (ITICSE-WGR '15), pp. 41-63, (2015)
Leu K., Beginning College Students Who Change Their Majors within 3 Years of Enrollment, (2017)
Quille K., Bergin S., CS1: How will they do? How can we help? A decade of research and practice, Computer Science Education, 29, 2-3, pp. 254-282, (2019)
Salguero A., McAuley J., Simon B., Porter L., A longitudinal evaluation of a best practices CS1, Proceedings of the 2020 ACM Conference on International Computing Education Research, pp. 182-193, (2020)
Sobral S.R., CS1 student grade prediction: Unconscious optimism vs insecurity?, 4th International Conference on Education and Distance Learning (ICEDL2020), (2020)
Werth L.H., Predicting student performance in a beginning computer science class, ACM SIGCSE Bulletin, 18, 1, pp. 138-143, (1986)
North American Business Press
Article
All Open Access; Bronze Open Access
Scopus