Using Temporal Analytics to Detect Inconsistencies Between Learning Design and Students’ Behaviours
Extensive research in the learning sciences has established the importance of time management in online learning. Recently, learning analytics (LA) has shed further light on the temporal characteristics of learning by allowing researchers to capture authentic digital footprints of students’ learning behaviours. Nonetheless, students’ timing of engagement, and its relation to learning design (LD) and academic performance, has received limited attention. This study investigates to what extent students’ timing of engagement aligned with the instructors’ learning design, and how engagement varied across different levels of performance. Our findings revealed a mismatch between how instructors designed for learning and how students actually studied. In most weeks, students spent less time studying the assigned materials in the virtual learning environment (VLE) than the number of hours recommended by instructors. The timing of engagement also varied, ranging from studying in advance to catching up. High-performing students spent more time studying in advance, while low-performing students spent a higher proportion of their time on catching-up activities. By incorporating the pedagogical context into learning analytics, we can understand not only what, why, and when students engage, but also how their behaviours are influenced by the way instructors design for learning.
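The two comparisons the abstract describes — weekly study time against the hours recommended in the learning design, and the split between in-advance and catch-up engagement — can be sketched as follows. This is an illustrative sketch using hypothetical data, weeks, and function names, not the study's actual analysis pipeline.

```python
# Illustrative sketch (hypothetical data, not the paper's pipeline):
# compare students' weekly VLE study time against the hours recommended
# in the learning design, and split each student's time into
# in-advance / on-time / catch-up engagement.

from collections import defaultdict

# Hypothetical learning design: recommended study hours per course week.
recommended_hours = {1: 5.0, 2: 6.0, 3: 4.0}

# Hypothetical VLE logs: (student, week the material belongs to,
# week it was actually studied, hours spent).
vle_logs = [
    ("s1", 1, 1, 4.0),   # week-1 material studied during week 1 (on time)
    ("s1", 2, 1, 1.5),   # week-2 material studied early (in advance)
    ("s2", 1, 2, 2.0),   # week-1 material studied late (catching up)
    ("s2", 2, 2, 3.0),
]

def weekly_gap(logs, design):
    """Total hours studied per design week minus the recommended hours."""
    studied = defaultdict(float)
    for _, design_week, _, hours in logs:
        studied[design_week] += hours
    return {wk: studied[wk] - rec for wk, rec in design.items()}

def timing_profile(logs):
    """Per-student hours spent in advance, on time, and catching up."""
    profile = defaultdict(lambda: {"advance": 0.0, "on_time": 0.0, "catch_up": 0.0})
    for student, design_week, study_week, hours in logs:
        if study_week < design_week:
            profile[student]["advance"] += hours
        elif study_week == design_week:
            profile[student]["on_time"] += hours
        else:
            profile[student]["catch_up"] += hours
    return dict(profile)
```

With this toy data, `weekly_gap` shows students collectively falling short of the recommended hours in weeks 2 and 3, and `timing_profile` distinguishes a student who works ahead (`s1`) from one who catches up (`s2`) — the kind of pattern the abstract reports differing between high- and low-performing students.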
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.