Journal of Learning Analytics
https://learning-analytics.info/index.php/JLA
The Journal of Learning Analytics is a peer-reviewed, open-access journal disseminating the highest-quality research in the field (Journal Impact Factor 3.9). The journal is published three times a year and is the official publication of the Society for Learning Analytics Research (SoLAR) (http://solaresearch.org/). With an international Editorial Board (https://learning-analytics.info/index.php/JLA/about/editorialTeam) of leading scholars, it is the first journal dedicated to research into the challenges of collecting, analysing, and reporting data with the specific intent to improve learning. "Learning" is broadly defined across a range of contexts, including informal learning on the internet, formal academic study in institutions (primary/secondary/tertiary), and workplace learning.

The journal seeks to connect learning analytics researchers, developers, and practitioners who share a common interest in using data traces to better understand and improve learning through the creation and implementation of new tools and techniques, and the study of the transformations they engender. The journal's interdisciplinary focus recognizes that computational, pedagogical, institutional, policy, and social perspectives must be brought into dialogue with one another to ensure that interventions and organizational systems serve the needs of all stakeholders. Each of these communities brings a valuable lens for ongoing input, evaluation, and critique of the conceptual, technical, and practical advances of the field.

The Journal of Learning Analytics welcomes papers that either describe original research or review the state of the art in a particular area. It also welcomes practice-focused papers that detail learning analytics applications in real-world settings, provided they offer innovative insights for advancing the field. Other accepted paper types include Data Reports, Tool Reports, Open Peer Commentary, and Book Reviews. See the journal's Focus and Scope (https://learning-analytics.info/index.php/JLA/focusandscope) for details.

Manuscripts can be submitted to the Journal of Learning Analytics at any time. Only manuscripts for special sections must be submitted by the date defined in the call for papers of the special section. Special-section calls can be found in the Announcements section (https://learning-analytics.info/index.php/JLA/announcement).

The Journal of Learning Analytics provides immediate open access to its content on the principle that making research freely available to the public supports a greater global exchange of knowledge. The journal does not charge authors submission or processing fees; costs are covered by the Society for Learning Analytics Research.

Publisher: SoLAR | Language: en-US | ISSN: 1929-7750

Modelability as a Strategy for Improving the Generalizability and Scalability of Predictive Models
https://learning-analytics.info/index.php/JLA/article/view/9099
Learning analytics has the potential to enhance education through data-informed decision-making, but persistent challenges around generalizability and scalability continue to limit its real-world impact. In this paper, we introduce the concept of a modelable world: a learning ecosystem purposefully designed to support the development of predictive models that generalize across diverse contexts. We outline three core design principles of modelability: (1) valid and interpretable measurements, (2) scalable and stable implementation, and (3) a collaborative research–practice–technology ecosystem. We then illustrate how these principles can be operationalized in the real world through a case study of CourseKata, a platform offering a fully instrumented online textbook adopted across a wide range of institutions and disciplines. Using CourseKata data, we developed early prediction models of students' final course grades from behavioral measures and tested model generalizability across institutions (something rarely done in the modeling literature). Results show that a system designed with modelability in mind can produce predictive models that generalize effectively across diverse educational contexts.

Alice Xu, Yunyi Zhang, Adam Blake, James Stigler
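For readers who want a concrete picture of the cross-institution generalizability test this abstract describes, here is a minimal sketch using scikit-learn's leave-one-group-out splitter. The feature names and synthetic data are hypothetical illustrations, not CourseKata's actual schema or the authors' pipeline.

```python
# Sketch: leave-one-institution-out evaluation of an early-prediction model.
# Feature names and synthetic data are illustrative, not CourseKata's schema.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "pages_viewed": rng.poisson(40, n),
    "exercises_attempted": rng.poisson(25, n),
    "institution": rng.choice(["A", "B", "C"], n),
})
df["final_grade"] = 50 + 0.5 * df["pages_viewed"] + rng.normal(0, 5, n)

X = df[["pages_viewed", "exercises_attempted"]]
y, groups = df["final_grade"], df["institution"]

# Train on all institutions but one, test on the held-out institution.
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    model = RandomForestRegressor(random_state=0)
    model.fit(X.iloc[train_idx], y.iloc[train_idx])
    held_out = groups.iloc[test_idx].iloc[0]
    score = r2_score(y.iloc[test_idx], model.predict(X.iloc[test_idx]))
    print(f"held-out institution {held_out}: R^2 = {score:.2f}")
```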
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-14 | Journal of Learning Analytics 13(1), 89–109 | DOI: 10.18608/jla.2026.9099

Beyond Time on Task
https://learning-analytics.info/index.php/JLA/article/view/8725
Student workload analysis can play a crucial role in providing actionable insights to inform course design and curricular adjustments that promote student learning and well-being. While numerous studies have emphasized the need to analyze workload beyond single-value metrics, such as credit hours, the interpretation and practical application of these metrics for educational interventions remain unclear. In this study, we explore the interplay between time-on-task measurements and student-perceived learning and difficulty. We move beyond average indicators of time-on-task by proposing and examining various metrics related to the dynamics of workload over time. Across 14 engineering courses taught at Pontificia Universidad Católica de Chile, we analyze three different sources of data: (1) self-reported time-on-task and perceived difficulty obtained through a weekly timesheet survey, (2) interactions with the learning management system (LMS), and (3) perceived learning attainment obtained from the course evaluation survey. Our results show that LMS-based and self-reported time-on-task were highly correlated. Workload dynamics metrics, such as the presence of workload peaks, were also highly correlated with perceived learning and perceived difficulty. This study thus provides evidence in support of considering workload dynamics, rather than average measures of time-on-task, to predict variables related to student learning. The metrics proposed by this framework could be used to implement practical tools for educators and administrators seeking to optimize course design and improve learning attainment.

Paul V. Sargent, Isabel Hilliger, Jorge A. Baier
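The idea of workload-dynamics metrics, such as counting weeks whose reported hours spike well above a course's average, can be sketched as follows. The column layout, the 1.5x peak threshold, and the synthetic data are illustrative assumptions, not the paper's exact metric definitions.

```python
# Sketch: workload-dynamics metrics from weekly time-on-task reports.
# The 1.5x peak threshold and synthetic data are illustrative assumptions.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
courses = [f"C{i}" for i in range(14)]
# Rows: courses; columns: weeks 1-15; values: mean reported hours.
weekly = pd.DataFrame(rng.gamma(4, 1.5, size=(14, 15)),
                      index=courses, columns=range(1, 16))

mean_tot = weekly.mean(axis=1)                         # average time-on-task
peaks = weekly.gt(1.5 * mean_tot, axis=0).sum(axis=1)  # weeks far above average
variability = weekly.std(axis=1) / mean_tot            # coefficient of variation

perceived_learning = pd.Series(rng.uniform(3, 5, 14), index=courses)
rho, p = spearmanr(peaks, perceived_learning)
print(f"peak count vs. perceived learning: rho = {rho:.2f} (p = {p:.3f})")
```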
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-18 | Journal of Learning Analytics 13(1), 110–129 | DOI: 10.18608/jla.2026.8725

12 Heuristics for Learning Analytics in Simulation-Based Professional Learning
https://learning-analytics.info/index.php/JLA/article/view/9141
This study aims to develop a set of heuristics tailored for evaluating learning analytics in simulation-based professional learning, focusing on the following research questions: (1) What heuristics are appropriate for evaluating learning analytics in simulation-based professional learning contexts? (2) How can theoretical frameworks and empirical findings be combined in the development of such heuristics? (3) How can expert evaluation inform their refinement and applicability? The study combines a top-down approach, drawing on a theoretical framework for learning experience design, with a bottom-up analysis of empirical findings from prior studies in the context of a design project. An initial set of heuristics was iteratively reviewed and refined in collaboration with experts in user and learning experience design. The outcome is a detailed heuristic framework that supports the evaluation of learning analytics in simulation-based settings and accounts for the technological, pedagogical, and social dimensions of professional learning.

Susan Harrington, Charlott Sellberg
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-20 | Journal of Learning Analytics 13(1), 130–142 | DOI: 10.18608/jla.2026.9141

Learning Analytics to Uncover Ethnic Bias in Educational Texts
https://learning-analytics.info/index.php/JLA/article/view/8905
Online learning platforms have expanded access to education but also raise concerns about biased content, particularly in text-based learning materials such as textbooks, lesson plans, and course excerpts. Such biases can perpetuate discrimination, can harm student outcomes, and are often difficult to detect, as identification typically relies on time-consuming human review. Learning analytics (LA) can enhance this process by supporting human reviewers through automated detection, offering a scalable solution while retaining human judgment for nuanced evaluations. Accordingly, this LA study explores two research questions: RQ1: Which features might support the identification of ethnic bias in text-based online learning materials? RQ2: Which classification approaches might be suitable for identifying ethnic bias in text-based online learning materials? First, we identified features signalling potential ethnic bias (presence or absence) in textual content using a dataset (N = 345) labelled by 193 students from diverse ethnic backgrounds. Then, we evaluated multiple machine learning (ML) models for their effectiveness in bias classification. The results suggest significant correlations between perceived bias and content from the social sciences. Additionally, through bootstrap analysis, support vector machine and random forest classifiers showed consistent performance in bias identification (with F1 scores of 0.71 and 0.70 on the test set, respectively). In contrast, the naive Bayes (NB) model demonstrated the highest precision (0.75 on the test set). We discuss these findings and their implications for LA, emphasizing the importance of high-quality and inclusive educational tools. As an initial step toward automated bias classification in education, this study provides a foundation for spotting ethnic bias in learning content, supporting fairer technologies for more inclusive learning environments.

Josmario Albuquerque, Bart Rienties, Martin Hlosta, Wayne Holmes
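A minimal sketch of the kind of bootstrap comparison the abstract reports, using scikit-learn classifiers on TF-IDF features. The tiny synthetic corpus and TF-IDF features are illustrative stand-ins for the study's labelled dataset and engineered bias signals.

```python
# Sketch: bootstrap comparison of classifiers for binary bias labels.
# The toy corpus stands in for the study's dataset (N = 345).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import SVC

texts = ["neutral passage about algebra", "stereotyped claim about a group",
         "balanced history excerpt", "one-sided cultural depiction"] * 40
labels = np.array([0, 1, 0, 1] * 40)  # 1 = perceived ethnic bias present

X = TfidfVectorizer().fit_transform(texts)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                          random_state=0)

rng = np.random.default_rng(0)
for name, clf in [("SVM", SVC()),
                  ("RF", RandomForestClassifier(random_state=0)),
                  ("NB", MultinomialNB())]:
    preds = clf.fit(X_tr, y_tr).predict(X_te)
    # Bootstrap the test set to gauge the stability of the F1 score.
    scores = [f1_score(y_te[idx], preds[idx])
              for idx in (rng.integers(0, len(y_te), len(y_te))
                          for _ in range(200))]
    print(f"{name}: F1 = {np.mean(scores):.2f} +/- {np.std(scores):.2f}")
```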
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-02-25 | Journal of Learning Analytics 13(1), 143–162 | DOI: 10.18608/jla.2026.8905

Human-Centred Development of Indicators for Self-Service Learning Analytics
https://learning-analytics.info/index.php/JLA/article/view/8921
The aim of learning analytics (LA) is to turn educational data into insights, decisions, and actions to improve learning and teaching. The reasoning behind the provided insights, decisions, and actions is often not transparent to end-users, which can lead to trust and acceptance issues when interventions, feedback, and recommendations fail. In this paper, we shed light on achieving transparent LA by following a transparency-through-exploration approach. To this end, we present the design, implementation, and evaluation details of the Indicator Editor, which aims to support self-service learning analytics (SSLA) by empowering end-users to take control of the indicator implementation process. We systematically designed and implemented the Indicator Editor through an iterative human-centred design (HCD) approach. Further, we conducted a qualitative user study (n = 15) to investigate the impact of following an SSLA approach on users' perceptions of and interactions with the Indicator Editor. Our study provides qualitative evidence that supporting user interaction and providing user control in the indicator implementation process can have positive effects on crucial aspects of LA, namely transparency, trust, satisfaction, and acceptance.

Shoeb Joarder, Mohamed Amine Chatti
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-22 | Journal of Learning Analytics 13(1), 163–189 | DOI: 10.18608/jla.2026.8921

Temporal Network Analysis of Collaborative Prototyping from a Knowledge Creation Perspective
https://learning-analytics.info/index.php/JLA/article/view/8959
This study analyzed data from prototyping sessions to support learners' engagement in collaborative problem-solving through design-mode thinking. Knowledge creation (KC) is essential for addressing complex 21st-century problems, yet learners struggle to share incomplete ideas due to unfamiliarity with design-mode thinking. Although prototyping is known to be effective for sharing and improving incomplete ideas in the business sector, few studies analyze design activities during prototyping from a KC perspective across different design disciplines. This study examined three teams (engineering, product design, service design) during 30-minute prototyping sessions. Using a novel combination of temporal socio-semantic network analysis (tSSNA) and ordered network analysis (ONA), we analyzed teams' shared epistemic agency, segmenting activities into three phases based on idea improvement patterns. Engineers focused on generative collaborative actions, product designers emphasized creating shared understanding, and service designers concentrated on alleviating lack of knowledge. Statistical tests revealed significant differences between teams and phases. The findings suggest three design principles for KC practice: attending to disciplinary differences in interdisciplinary teams, providing timely educator support for concept creation, and using short task durations to encourage the sharing of incomplete ideas. Furthermore, this work demonstrates the potential of combining tSSNA and ONA for analyzing collaborative KC processes.

Ayano Ohsaki, Yuanru Tan, Brendan Eagan, David Williamson Shaffer, Zachari Swiecki
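For readers unfamiliar with ordered network analysis, its core idea (directed edges between temporally adjacent coded actions) can be illustrated with networkx. This is a generic sketch on a hypothetical coded transcript, not the authors' tSSNA/ONA implementation.

```python
# Sketch: a directed network of consecutive coded actions, in the spirit of
# ordered network analysis (not the authors' tSSNA/ONA implementation).
from collections import Counter
import networkx as nx

# Hypothetical coded transcript: one code per team utterance, in time order.
codes = ["share_idea", "build_on", "question", "share_idea", "build_on",
         "evaluate", "share_idea", "question", "build_on", "evaluate"]

transitions = Counter(zip(codes, codes[1:]))  # ordered pairs of adjacent codes
G = nx.DiGraph()
for (src, dst), w in transitions.items():
    G.add_edge(src, dst, weight=w)

# Edge weights show which moves tend to follow which, preserving direction.
for src, dst, data in G.edges(data=True):
    print(f"{src} -> {dst}: {data['weight']}")
```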
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-22 | Journal of Learning Analytics 13(1), 190–207 | DOI: 10.18608/jla.2026.8959

Function Art
https://learning-analytics.info/index.php/JLA/article/view/9037
This study explores how students across Grades 8 to 12 engage with mathematical functions in creative, visual ways through function art, an innovative STEAM-based educational approach. Grounded in the Trends in International Mathematics and Science Study (TIMSS) framework and employing a design-based research methodology, the project involved 400 students from the Philippines who created digital artworks using GeoGebra. To uncover learner profiles, a person-centred clustering method (hierarchical clustering on principal components) was applied to variables representing the number and types of functions used. The results revealed three distinct student profiles: Repetitivists (high function quantity, low diversity), Simplists (low quantity and diversity), and Multifunctionists (high diversity, low quantity). Further analysis showed meaningful associations between cluster membership, grade level, and function strategies. Qualitative evaluation using the TIMSS cognitive domains of Knowing, Applying, and Reasoning highlighted that students' use of mathematical strategies and precision in transformations varied widely, often independently of the quantity or diversity of functions used. These findings suggest that function art, when analyzed through learning analytics, provides a rich lens for understanding students' mathematical thinking and offers valuable insights for tailoring interdisciplinary instruction in STEAM education.

Guillermo Bautista, Roderick Cacuyong, Zsolt Lavicza, Barbara Sabitzer, Mona Emara
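Hierarchical clustering on principal components can be approximated in Python by chaining PCA with Ward-linkage agglomeration. A minimal sketch on synthetic data; the Poisson-generated feature matrix is purely illustrative, not the study's GeoGebra-derived variables.

```python
# Sketch: hierarchical clustering on principal components (HCPC-style),
# approximated with scikit-learn and scipy on synthetic feature counts.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical per-student features: counts of each function type used.
rng = np.random.default_rng(0)
X = rng.poisson(lam=[5, 2, 1, 3], size=(400, 4)).astype(float)

# Project standardized features onto principal components, then cluster.
Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")
for k in np.unique(clusters):
    print(f"profile {k}: n = {(clusters == k).sum()}")
```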
Copyright (c) 2026 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-30 | Journal of Learning Analytics 13(1), 208–231 | DOI: 10.18608/jla.2026.9037

Evaluating 21st-Century Competencies in Postsecondary Curricula with Large Language Models
https://learning-analytics.info/index.php/JLA/article/view/9127
The growing emphasis on 21st-century competencies in postsecondary education, intensified by the transformative impact of generative artificial intelligence (GenAI) on the economy and society, underscores the urgent need to evaluate how these competencies are embedded in curricula and how effectively academic programs align with evolving workforce and societal demands. Curricular analytics, particularly recent advancements powered by GenAI, offers a promising data-driven approach to this challenge. However, the analysis of 21st-century competencies requires pedagogical reasoning beyond surface-level information retrieval, and the capabilities of large language models (LLMs) in this context remain underexplored. In this study, we extend prior research on curricular analytics of 21st-century competencies across a broader range of curriculum documents, competency frameworks, and models. Using 7,600 manually annotated curriculum–competency alignment scores (38 competencies and 200 courses across five curriculum document types), we evaluate the informativeness of different curriculum document sources, benchmark the performance of general-purpose LLMs on mapping curricula to competencies, and analyze error patterns. We further introduce a reasoning-based prompting strategy, curricular chain-of-thought (CoT), to strengthen LLMs' pedagogical reasoning. Our results show that detailed instructional activity descriptions are the most informative type of curriculum document for competency analytics. Open-weight LLMs achieve accuracy comparable to proprietary models on coarse-grained tasks, demonstrating their scalability and cost-effectiveness for institutional use. However, no model reaches human-level precision in fine-grained pedagogical reasoning. Our proposed curricular CoT yields modest improvements by reducing bias in instructional keyword inference and improving the detection of nuanced pedagogical evidence in long texts. Together, these findings highlight the untapped potential of institutional curriculum documents and provide an empirical foundation for advancing AI-driven curricular analytics.

Zhen Xu, Xin Guan, Chenxi Shi, Qinhao Chen, Renzhe Yu
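The curricular chain-of-thought strategy can be pictured as a prompt that makes the model enumerate instructional activities before scoring alignment. A sketch under stated assumptions: the prompt wording is illustrative rather than the authors' actual template, and `call_llm` is a hypothetical stand-in for any chat-completion client.

```python
# Sketch: a curricular chain-of-thought prompt for competency alignment
# scoring. The wording is illustrative, not the authors' exact template.
def build_curricular_cot_prompt(course_doc: str, competency: str) -> str:
    return (
        "You are analyzing a university course document.\n"
        f"Competency: {competency}\n"
        f"Course document:\n{course_doc}\n\n"
        "Reason step by step before answering:\n"
        "1. List the instructional activities described in the document.\n"
        "2. For each activity, state what students actually do.\n"
        "3. Judge whether those actions provide evidence of the competency.\n"
        "4. Output an alignment score from 0 (none) to 3 (strong), "
        "with a one-sentence justification."
    )

prompt = build_curricular_cot_prompt(
    course_doc="Weekly labs require teams to present and defend "
               "design trade-offs.",
    competency="Collaborative communication",
)
# response = call_llm(prompt)  # hypothetical chat-completion client call
print(prompt)
```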
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-02-25 | Journal of Learning Analytics 13(1), 7–30 | DOI: 10.18608/jla.2026.9127

Profiling Pre-service Teachers’ Computational Thinking
https://learning-analytics.info/index.php/JLA/article/view/9077
Computational thinking (CT) is a vital skill set for pre-service teachers, who will need to foster computational literacy in K–12 classrooms, yet the factors influencing their CT skills remain less understood than those of K–12 students or in-service teachers. This study leverages multimodal data to investigate how pre-service teachers (n = 128) differ in CT skills, the predictive role of metacognitive strategies and prior coding experience, and variations in online behaviours. Using latent profile analysis, we identified three profiles based on digital literacy, problem-solving, and coding comfort (Novice, Developing, and Proficient), revealing heterogeneity in CT and supporting non-linear skill acquisition. Linear discriminant analysis revealed that metacognitive strategies and prior coding experience significantly predict profile membership, validating the interplay of technical and cognitive factors in the development of CT skills. Behavioural data from an interactive problem-solving task showed that, compared to Novice and Developing learners, Proficient learners were more task efficient and perceived fewer challenges during task completion. We discuss implications for designing a learning analytics dashboard that visualizes profiles and behavioural metrics to support adaptive, equitable, and personalized teacher training, thereby enhancing pre-service teachers' readiness to integrate CT into K–12 education.

Tanya Chichekian, Maria Cutumisu, Annie Savard, Yi-Mei Zhang
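A rough sketch of the two-stage analysis the abstract describes: a Gaussian mixture model stands in for latent profile analysis (commonly fitted with R's mclust), followed by linear discriminant analysis predicting profile membership. All indicator and predictor variables here are synthetic placeholders, not the study's measures.

```python
# Sketch: profile extraction and membership prediction. GaussianMixture is a
# rough stand-in for latent profile analysis; all variables are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.mixture import GaussianMixture
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical indicators: digital literacy, problem-solving, coding comfort.
indicators = StandardScaler().fit_transform(rng.normal(size=(128, 3)))
profiles = GaussianMixture(n_components=3,
                           random_state=0).fit_predict(indicators)

# Do metacognition and prior coding experience predict profile membership?
predictors = rng.normal(size=(128, 2))  # hypothetical predictor scores
lda = LinearDiscriminantAnalysis().fit(predictors, profiles)
print(f"classification accuracy: {lda.score(predictors, profiles):.2f}")
```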
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-14 | Journal of Learning Analytics 13(1), 31–41 | DOI: 10.18608/jla.2026.9077

Assessing Patterns of Students’ Attainment of Professional Standards in Higher Education
https://learning-analytics.info/index.php/JLA/article/view/9035
It is widely recognized that higher education (HE) graduates require a broad range of professional skills and abilities to succeed in their future careers. Despite this acknowledgement, however, assessment practices in HE remain focused on content-based knowledge. This narrow emphasis limits the capacity to effectively and holistically evaluate a student's professional competency and readiness for employment. The issue is particularly acute for HE degrees that require graduates to demonstrate attainment of externally regulated professional standards. While curricula are mapped to professional standards for accreditation purposes, demonstrating a student's attainment of these standards is not straightforward and has mostly been done through self-reported surveys. This study offers a novel curriculum analytics method for mapping assessment grades to the attainment of professional standards across a teacher education program. Specifically, we present an approach that uses psychometric modelling and learning analytics to identify distinct patterns in learners' acquisition of professional standards. The method does not alter current assessment practices in HE. Instead, it offers a scalable, automated means to infer a learner's attainment of documented professional standards, complementing current measures of academic success such as GPA. The study underscores the advantages of complementing current HE assessment practices with the outlined curriculum analytics approach, providing a holistic representation of a student's learning progress.

Abhinava Barthakur, Jelena Jovanović, Ryan Baker, Vitomir Kovanović, Christopher C. Deneen, Shane Dawson
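One way to picture inferring standards attainment from existing grades is a mapping-matrix weighting, sketched below. The weights, names, and arithmetic are hypothetical simplifications; the paper's psychometric modelling is considerably more sophisticated than this weighted average.

```python
# Sketch: inferring standard attainment from assessment grades through a
# curriculum mapping matrix. All weights and names are hypothetical.
import numpy as np

# Rows: assessments; columns: professional standards (accreditation mapping).
mapping = np.array([[1.0, 0.0, 0.5],
                    [0.0, 1.0, 0.5],
                    [0.5, 0.5, 1.0]])
grades = np.array([0.85, 0.60, 0.75])  # one student's normalized grades

# Attainment per standard: mapping-weighted average of the relevant grades.
attainment = grades @ mapping / mapping.sum(axis=0)
for name, score in zip(["Standard A", "Standard B", "Standard C"], attainment):
    print(f"{name}: {score:.2f}")
```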
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-15 | Journal of Learning Analytics 13(1), 42–56 | DOI: 10.18608/jla.2026.9035

Assessing 21st Century Competencies
https://learning-analytics.info/index.php/JLA/article/view/9131
The growing emphasis on competency-based education (CBE) has heightened the need for clearly defined metrics and robust assessment frameworks to evaluate 21st-century competencies. Curriculum analytics (CA) provides a promising avenue for assessing learning outcomes (LOs) and informing continuous improvement in higher education. However, challenges persist in differentiating academic performance from actual LO development and in translating assessment data into meaningful program-level actions. This study examines how CA tools support the direct assessment of LOs and contribute to continuous improvement processes in higher education. Using a two-case study design, we analyzed CA implementation in two universities through interviews, cognitive walkthroughs, and institutional document analysis. Data triangulation identified 18 themes, nine of which reached full consensus among the three researchers. Findings indicate that CA tools effectively support the assessment of LOs aligned with 21st-century competencies by generating actionable insights that guide faculty toward more authentic and reflective teaching practices. The study contributes to the LA field by providing empirical evidence of how CA tools can bridge assessment and pedagogical improvement, offering both theoretical and practical implications for researchers and practitioners.

Mónica Hernández-Campos, Isabel Hilliger, Francisco-José García-Peñalvo
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-15 | Journal of Learning Analytics 13(1), 57–73 | DOI: 10.18608/jla.2026.9131

A GAI-Based Chatbot for Scaffolding Self-Regulated Learning
https://learning-analytics.info/index.php/JLA/article/view/9143
Generative AI (GAI) is increasingly integrated into education, particularly through chatbots that support students without direct human intervention. While these tools show promise as personalized learning companions, concerns persist about their potential to foster overreliance, limit creativity, and hinder the development of critical thinking. These risks highlight the need to strengthen students' metacognitive skills and promote structured self-reflection. This paper presents the design and iterative development of a GAI-based chatbot aimed at scaffolding self-regulated learning. Through three design cycles involving 276 students in 10 courses, the chatbot evolved from a static assistant into a dynamic, course-integrated tool capable of supporting personalized, Socratic-style dialogue. Thematic analysis of diverse qualitative data sources revealed that students seek scaffolded support, such as human tutoring, and require explicit guidance to engage meaningfully with AI. Findings emphasize the importance of dialogic competence, personalization, and educator involvement in shaping effective AI-mediated reflection. This study underscores the need for pedagogically grounded AI tools that position chatbots as collaborative agents, complementing rather than replacing the roles of teachers and learners. It advocates for reflective teaching practices that clearly define the responsibilities of students, educators, and AI systems to ensure that GAI enhances deep learning and independent thought.

Isabel Hilliger, Mar Pérez-Sanagustín, Carlos González, Esteban Villalobos, Sergio Celis
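The Socratic-style dialogue the chatbot supports can be sketched as a system prompt plus a turn-taking loop. The prompt text is illustrative, and `call_llm` (commented out) is a hypothetical chat client, not the study's implementation.

```python
# Sketch: a Socratic-style system prompt and dialogue loop. The prompt is
# illustrative; `call_llm` is a hypothetical stand-in for a chat client.
SOCRATIC_SYSTEM_PROMPT = (
    "You are a course tutor. Never give the final answer directly. "
    "Respond with one guiding question that helps the student take the "
    "next step themselves, referencing course concepts where relevant."
)

def socratic_turn(history: list[dict], student_message: str) -> list[dict]:
    """Append the student's message and a model reply to the dialogue."""
    history = history + [{"role": "user", "content": student_message}]
    # reply = call_llm(messages=[{"role": "system",
    #                             "content": SOCRATIC_SYSTEM_PROMPT}] + history)
    reply = "What would happen if you tested your claim on a small example?"
    return history + [{"role": "assistant", "content": reply}]

dialogue = socratic_turn([], "Can you just tell me the answer to exercise 3?")
print(dialogue[-1]["content"])
```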
Copyright (c) 2024 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-15 | Journal of Learning Analytics 13(1), 74–88 | DOI: 10.18608/jla.2026.9143

Advancing 21st-Century Professional Competencies with Learning Analytics in the Age of Generative AI
https://learning-analytics.info/index.php/JLA/article/view/9743
The rise of generative artificial intelligence (GenAI) and accelerated globalization have necessitated a fundamental recalibration of higher education to prioritize domain-agnostic, 21st-century professional competencies. While institutional commitment to these skills is high, their systematic integration into curricula and evaluation remains fragmented, highlighting a critical gap between traditional academic success metrics and demonstrated workforce readiness. This special issue presents five complementary studies that investigate how the intersection of learning analytics (LA) and GenAI can bridge the gap between institutional rhetoric and demonstrated professional readiness. The contributions collectively advance a research agenda across four dimensions: (1) benchmarking large language models (LLMs) for curricular–competency alignment using reasoning-based prompting, (2) the iterative design of Socratic-style GenAI chatbots to scaffold self-regulated learning, (3) the application of psychometric modelling and latent profile analysis to quantify 21st-century professional competencies, and (4) institutional governance and adoption of curriculum analytics. Collectively, these studies advocate for an epistemological shift toward process-sensitive assessments that move beyond static, episodic indicators toward dynamic, longitudinal representations of learner capability. We conclude by outlining the sociotechnical infrastructure, including robust governance and interdisciplinary collaboration, required to responsibly transition these AI-driven innovations from research prototypes to sustainable enterprise infrastructure, ensuring that analytics serve the evolving needs of students, educators, and professional bodies.

Abhinava Barthakur, Olga Viberg, René F. Kizilcec, Ryan S. Baker, Shane Dawson
Copyright (c) 2026 Journal of Learning Analytics
https://creativecommons.org/licenses/by/4.0
Published: 2026-03-30 | Journal of Learning Analytics 13(1), 1–6 | DOI: 10.18608/jla.2026.9743