Scaling the Student Journey from Course-Level Information to Program Level Progression and Graduation: A Model
Keywords: program analytics, matrix model, program pathways, learning journey
No course exists in isolation, so examining student progression through courses within the broader program context is an important step in integrating course-level and program-level analytics. Integrating the two allows us to see the impact of course-level changes on the program and to identify the points in the program structure where course interventions matter most. Here we highlight the significance of program-level learning analytics, where the relationships between courses become clear and the impact of early-stage courses on program outcomes such as graduation or drop-out can be understood. We present a matrix model of student progression through a program as a tool for gaining insight into program continuity and design. We demonstrate its use on a real program and examine how progression and graduation rates would change if course-level interventions were made early in the program. We also extend the model to more complex scenarios, such as multiple program pathways and simultaneous courses. Importantly, the model also allows integration with course-level models of student performance.
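To make the idea concrete, a stage-structured matrix model of this kind can be sketched in a few lines. The sketch below is a minimal illustration, not the authors' actual model: the stage names, transition probabilities, and ten-year horizon are all hypothetical, chosen only to show how a column-stochastic transition matrix projects a cohort forward and yields program-level graduation and drop-out rates.

```python
import numpy as np

# Hypothetical 5-stage program: three year levels plus two absorbing
# states (graduated, dropped out). Column j holds the probabilities of
# moving from stage j to each stage in one year, so columns sum to 1.
stages = ["Year 1", "Year 2", "Year 3", "Graduated", "Dropped out"]
T = np.array([
    #  Y1    Y2    Y3   Grad  Drop   (from)
    [0.10, 0.00, 0.00, 0.00, 0.00],  # to Year 1 (repeat year)
    [0.75, 0.10, 0.00, 0.00, 0.00],  # to Year 2
    [0.00, 0.80, 0.05, 0.00, 0.00],  # to Year 3
    [0.00, 0.00, 0.85, 1.00, 0.00],  # to Graduated (absorbing)
    [0.15, 0.10, 0.10, 0.00, 1.00],  # to Dropped out (absorbing)
])

# Start with a cohort of 100 first-year students.
n = np.array([100.0, 0.0, 0.0, 0.0, 0.0])

# Project the cohort forward year by year: n(t+1) = T @ n(t).
for year in range(10):
    n = T @ n

graduation_rate = n[3] / 100.0
dropout_rate = n[4] / 100.0
print(f"10-year graduation rate: {graduation_rate:.1%}")
print(f"10-year drop-out rate:   {dropout_rate:.1%}")
```

A course-level intervention can then be tested by editing a single column, for example raising the Year 1 to Year 2 transition while lowering the Year 1 drop-out probability, and re-running the projection to see the effect on the program-level graduation rate.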