Learning Design and Learning Analytics

GUEST EDITORS

Leah Macfadyen, The University of British Columbia, Canada, leah.macfadyen@ubc.ca
Lori Lockyer, University of Technology Sydney, Australia, lori.lockyer@uts.edu.au
Bart Rienties, Open University, UK, bart.rienties@open.ac.uk

AIMS & SCOPE

As early as 2011, Lockyer & Dawson argued that learning analytics has the potential to provide educators with the evidence they need that their efforts and innovations in learning design are ‘worth it’ in terms of improving teaching practice and learning:

The integration of research related to both learning design and learning analytics provides the necessary contextual overlay to better understand observed student behavior and provide the necessary pedagogical recommendations where learning behavior deviates from pedagogical intention. (p. 155)

How might learning analytics actually inform learning design decisions? How should learning design influence learning analytics integration and use? Are there frameworks or principles that best illuminate connections between learning analytics and learning design? Since 2011, a small but growing number of studies have sought to explore this area of work.

Several large data studies have confirmed, for example, that learning design significantly influences learner engagement and academic outcomes. Rienties & Toetenel (2016) examined the activity of 111,256 students in 151 courses at the UK Open University using multiple regression models, and found that learning design choices strongly predicted students' virtual learning environment (VLE) behaviour and performance. In particular, learning activities labelled ‘communication’ (i.e., student to student, teacher to student, student to teacher) significantly predicted VLE engagement and academic retention over time. Follow-up fine-grained analyses of weekly engagement within 38 courses found that 69% of student engagement was primarily predicted by how teachers designed their respective courses (Nguyen et al., 2017). In a US study, Fritz (2016) similarly identified strong correlations between implementation of LMS-based course design elements and learner engagement and outcomes.

Other research has focussed on providing tools or visualizations to educators to inform learning design choices. For example, Toetenel & Rienties (2016) found that providing educators with exemplar learning designs, combined with visualizations of initial learning design decisions, encouraged design of fewer learning activities that were classified as assimilative (e.g., reading, watching, listening), and more learning activities that were learner-centred, such as learner-to-learner interaction or finding information. A number of other systems and tools have sought to connect learning analytics with learning design (see for example, Corrin et al., 2016; Law et al., 2017; Persico & Pozzi, 2015).

Since 2011, a number of authors have proposed frameworks that connect learning analytics and learning design (Bakharia et al., 2016; Donald et al., 2016; Schmitz et al., 2017). Corrin et al. (2018) argue, however, that while such frameworks emphasize the potentially valuable connections between learning analytics and learning design, they are at such a high descriptive level that meaningful operationalization is difficult. Mangaroska & Giannakos (2018) similarly conclude their recent systematic review on learning design and learning analytics by stressing that “future research should consider developing a framework on how to capture and systematize learning design data grounded in learning analytics and learning theory, and document what learning design choices made by educators influence subsequent learning activities and performances over time”.

TOPICS OF INTEREST

Building on Mangaroska & Giannakos’ (2018) recommendations, we invite submissions to a special section of JLA on learning analytics and learning design that may:

  • Offer conceptual frameworks that connect learning design and learning analytics, that bridge the gap between theory and practice, and/or that offer guidance in use, interpretation of and reflection on learning analytics for refinement and redesign of learning activities.
  • Detail examples or use cases in which learning design changes (design of learning environments, choice of pedagogical approach, just-in-time adaptation of design while teaching) driven or informed by learning analytics are positively (or negatively) influencing learning or the learner experience.
  • Explore how educators plan, implement and evaluate learning designs, and how they might take greater advantages of learning analytics to do so (and vice versa).
  • Trace how progress and insights in the field of learning analytics are contributing to learning design decisions.
  • Investigate effective interpretations of learning analytics to offer insights into learning processes that can be theoretically grounded in ways that inform theory and learning design.
  • Explore ways in which educators may effectively share learning design decisions with learners, or collaborate with learners in learning design.
  • Consider whether and how learning analytics might contribute to personalization and flexibility vs. scalability and standardization of learning.
  • Track the impact of learning analytics-informed learning design over time.

SUBMISSION PROCEDURE

Prospective authors may contact the section editors with queries. Final submissions will take place through JLA’s online submission system at http://learning-analytics.info. When submitting a paper, select the section “Special Section: Learning Design and Learning Analytics”. All submissions should follow JLA’s standard manuscript guidelines and template available on the journal website, and will undergo double-blind peer review.

TIMELINE

  • Full manuscripts due: September 15th 2019
  • Publication of special issue anticipated mid-2020

REFERENCES

Bakharia, A., Corrin, L., de Barba, P., Kennedy, G., Gasevic, D., Mulder, R., Williams, D., Dawson, S., & Lockyer, L. (2016). A conceptual framework linking learning design with learning analytics. In T. Reiners, B. R. von Konsky, D. Gibson, V. Chang, L. Irving, & K. Clarke (Eds.), Proceedings of the 6th International Conference on Learning Analytics and Knowledge (pp. 409-413). New York: ACM.

Corrin, L., Law, N. W. Y., Ringtved, U., & Milligan, S. (2018). DesignLAK18: Evaluating systems and tools that link learning analytics and learning design. In A. Pardo et al. (Eds.), Companion Proceedings of the 8th International Conference on Learning Analytics & Knowledge (LAK’18): Towards User-Centred Learning Analytics, Sydney, NSW, Australia, 5-9 March 2018.

Corrin, L., Kennedy, G., de Barba, P., Lockyer, L., Gasevic, D., Williams, D., Dawson, S., Mulder, R., Copeland, S., & Bakharia, A. (2016). Completing the loop: Returning meaningful learning analytics data to teachers. Sydney: Office for Learning and Teaching.

Donald, C., Gunn, C., McDonald, J., Blumenstein, M., & Milne, J. (2016). Matching the rhythms of teaching to learning analytics. Paper presented at the International Consortium of Educational Developers, University of Cape Town.

Fritz, J. (2016). LMS course design as learning analytics variable. In J. Greer, M. Molinaro, X. Ochoa & T. McKay (Eds.), Proceedings of the 1st Learning Analytics for Curriculum and Program Quality Improvement Workshop, hosted by the Learning Analytics and Knowledge Conference 2016, University of Edinburgh, Scotland (April 25) (pp. 15-19).

Law, N., Li, L., Farias Herrera, L., Chan, A., & Pong, T. C. (2017). A pattern language based learning design studio for an analytics informed inter-professional design community. Interaction Design and Architecture(s), 33, 92-112.

Lockyer, L. & Dawson, S. (2011). Learning designs and learning analytics. In P. Long, G. Siemens, G. Conole & D. Gasevic (Eds.), Proceedings of the 1st International Conference on Learning Analytics and Knowledge (pp. 153-156). New York, NY, USA: ACM.

Mangaroska, K. & Giannakos, M. (2018). Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning. IEEE Transactions on Learning Technologies. Advance online publication. https://doi.org/10.1109/TLT.2018.2868673

Nguyen, Q., Rienties, B., Toetenel, L., Ferguson, R., & Whitelock, D. (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior, 76, 703–714.

Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230-248.

Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333-341.

Schmitz, M., van Limbeek, E., Greller, W., Sloep, P., & Drachsler, H. (2017). Opportunities and Challenges in Using Learning Analytics in Learning Design. In Proceedings, European Conference on Technology Enhanced Learning (pp. 209-223). Springer, Cham.

Toetenel, L. & Rienties, B. (2016). Learning Design–creative design to visualise learning activities. Open Learning: The Journal of Open, Distance and e-learning, 31(3), 233-244.