JLA | Note with ref number | Goal | Stakeholder | Type | Theme | Theme 2
9.2 | Learners' time-of-day (TOD) preferences for engaging in learning have been studied extensively, but how these preferences are impacted when different modalities such as computers and mobiles are available needs better understanding. [1] | STA | NO
9.2 | Our results suggest that learning sessions from various modalities are significantly associated with the time of day and day of the week (weekday or weekend), and this holds true for students who made extensive use of different modalities to complete their learning activities (computer-dominant and intensive learners) and those who sparingly used them (limited-computer learners). [1] | FIN | UND
9.2 | Overall, mobile sessions were found to be more frequent in the afternoon and short computer sessions at night. [1] | FIN | UND
9.2 | The modality-TOD [time of day] associations were similar on weekdays and weekends for computer-dominant and limited-computer learners, two groups that are strikingly different in terms of their academic performance. [1] | FIN | UND
9.2 | Designers can use the results to target delivery of notifications, i.e., to optimize "the right information at the right time from the right modality." [1] | EDU | LD | Finding | Feedback | Feedback / Scaffolding
9.2 | Researcher–practitioner collaborations offer an exciting way for schools and technology companies to unlock the potential of data collected and stored within digital learning environments. [2] | STA | NO
9.2 | Practitioners and technology developers can play an important role in developing validity evidence for claims and inferences based on behavioural indicators derived from digital learning environment data. [2] | STA | NO
9.2 | Complex data sources from digital learning environments can be made useful to practitioners through thoughtfully designed, authentic data analysis and interpretation activities. [2] | STA | NO
9.2 | The presented learning analytics tool (i.e., the HeuristicsMiner algorithm) is applicable to small samples (e.g., for analyses at the classroom level). [3] | EDU | LD | Tool | Sequencing | Sequencing
9.2 | The information mined by the Heuristics Miner algorithm can be directly used for educational practice (e.g., to script sequences of effective learning activities). [3] | EDU | LD | Tool | Sequencing | Sequencing
9.2 | The analyses presented here identify productive and unproductive problem-solving strategies, which help students to prepare effectively for future learning (e.g., in later instruction). [3] | EDU | Instructors | Finding | Problem Solving | Problem Solving
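The Heuristics Miner mentioned in the notes for [3] builds a dependency graph from how often one activity directly follows another. A minimal sketch of its core dependency measure in Python; the activity labels and data are illustrative, not the cited study's code:

```python
from collections import Counter

# Toy activity sequences, one per student session (hypothetical labels).
sequences = [
    ["read", "example", "quiz", "reflect"],
    ["read", "quiz", "example", "quiz"],
    ["example", "read", "quiz", "reflect"],
]

# Count direct successions a > b across all sequences.
follows = Counter()
for seq in sequences:
    for a, b in zip(seq, seq[1:]):
        follows[(a, b)] += 1

def dependency(a, b):
    """Heuristics Miner dependency measure a => b, in [-1, 1]."""
    ab, ba = follows[(a, b)], follows[(b, a)]
    return (ab - ba) / (ab + ba + 1)

print(dependency("read", "quiz"))  # strength of "read, then quiz"
```

Edges with dependency values close to 1 indicate reliable orderings; that is the kind of mined information [3] suggests can be used to script sequences of learning activities.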
9.2 | The need to monitor the quality of online tutoring has increased significantly in recent years. [4] | STA | NO
9.2 | An approach using sequential pattern mining and decision trees is proposed to monitor online one-to-one primary math tutoring. [4] | STA | TL
9.2 | Effective online tutors present statistically significantly more frequent appropriate hint provision and proactive planning behaviours in their sessions. [4] | EDU | Instructors | Finding | Instructor Behavior | Instructor Behavior
9.2 | Effective online tutors sequence monitoring actions with appropriate pausing and initiation of learner self-correction. [4] | EDU | Instructors | Finding | Instructor Behavior | Instructor Behavior
9.2 | Machine learning can be used to design affirmative action policies that take into account changes in applicants' behaviour from one admission cycle to the next. [5] | EDU | Admissions | Methodology | Diversity | Diversity
9.2 | Simple strategies such as computing the optimal bonus policy based on the last few years of historical data perform well if enough data is available. [5] | RES
9.2 | For newer programs or in the absence of historical data, a machine learning method may be preferable. [5] | RES
9.2 | Due to fluctuations in applicant behaviour, an affirmative action policy that has a desirable effect in one year may have an undesirable effect in another year. Evidence-based policies should be based on multiple years of data, and uncertainty should be taken into account. [5] | EDU | Admissions | Findings | Diversity | Diversity
9.2 | When applied in a targeted manner, affirmative action policies may lead to a moderate increase in the admission rate of underrepresented groups, while having a minimal effect on the average admission score of admitted applicants. [5] | EDU | Admissions | Findings | Diversity | Diversity
9.2 | This work demonstrates that a human-in-the-loop approach to analyzing students' problem-solving sequences results in more interpretable insights than purely quantitative methods do. [6] | RES
9.2 | Visualizing the output of a clustering algorithm allows stakeholders to identify learning patterns in and across sequences. [6] | RES
9.2 | The visualized output also allows stakeholders to better understand how the algorithm interprets the sequences, which can be used to adjust the algorithm's parameters. [6] | RES
9.2 | Results suggest that allowing the stakeholder to analyze the visualized sequences and iteratively adjust the algorithm produces clustered sequences that better match an expert's interpretation of the data. [6] | RES
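One plausible reading of the human-in-the-loop workflow in [6] is: compute pairwise distances between students' problem-solving sequences, cluster them, visualize, and let an expert adjust parameters. A sketch under those assumptions; the action codes and parameter choices are invented:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def edit_distance(s, t):
    """Levenshtein distance via dynamic programming."""
    d = np.zeros((len(s) + 1, len(t) + 1), dtype=int)
    d[:, 0] = np.arange(len(s) + 1)
    d[0, :] = np.arange(len(t) + 1)
    for i in range(1, len(s) + 1):
        for j in range(1, len(t) + 1):
            d[i, j] = min(d[i - 1, j] + 1, d[i, j - 1] + 1,
                          d[i - 1, j - 1] + (s[i - 1] != t[j - 1]))
    return d[len(s), len(t)]

# Each string encodes one student's sequence of problem-solving actions.
seqs = ["HHGSS", "HGSS", "GGSSH", "SSGH", "HHGS"]
dist = np.array([[edit_distance(a, b) for b in seqs] for a in seqs])

# Condensed distance matrix -> average-linkage clustering; the linkage
# matrix Z can be drawn as a dendrogram for expert inspection.
Z = linkage(squareform(dist), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(labels)  # cluster assignment per sequence
```

An expert could inspect the dendrogram, then re-run with a different linkage method or cluster count, iterating toward clusters that match their interpretation, as [6] describes.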
9.2 | Curriculum Modelling and Learner Simulation (CMLS) is a method for making principled quantitative projections of the impact curricular structures and arrangements will have on student learning. [7] | STA
9.2 | CMLS [Curriculum Modelling and Learner Simulation] draws on general principles of curriculum design, human learning, and simulation modelling, with assumptions that are explicit, transparent, and easily adjusted. [7] | STA
9.2 | We implement CMLS [Curriculum Modelling and Learner Simulation] in this work as coloured Petri nets using CPN Tools, a widely used and freely available simulation development environment. [7] | STA
9.2 | CMLS [Curriculum Modelling and Learner Simulation] can be applied both to test novel curriculum designs and to inform redesign work by simulating how modifications to a curriculum are likely to impact student learning. [7] | STA
9.2 | The CMLS [Curriculum Modelling and Learner Simulation] methods we describe both support existing methods in learning analytics and significantly extend our capacity for theoretically motivated work in curriculum design and development. [7] | RES
9.2 | Dropout in higher education (HE) is a worldwide societal problem, with a negative impact on the reputation and function of higher education institutions (HEIs). [8] | STA
9.2 | Institutional analytics (IA) is a promising approach for addressing dropout in HE. [8] | STA
9.2 | In this article, we identify reasons students drop out and map IA [Institutional analytics] solutions that can inform HEIs' strategic planning. [8] | EDU | Administrators | Methodology | Retention | Retention
9.2 | Results suggest that focusing only on maximizing student performance does not help reduce dropout. Beyond classic indicators based on student academic history or engagement, other factors such as curriculum and institutional and social support should be considered to predict student retention. [8] | EDU | Administrators | Methodology | Retention | Retention
9.2 | To reduce dropout, HEIs can implement IA [Institutional analytics] solutions such as student dropout prediction models, competence-based models, or intelligent recommender systems that propose interventions to support students' academic performance or overcome their academic struggles. [8] | EDU | Administrators | Methodology | Retention | Retention
9.2 | While IA [Institutional analytics] solutions are evidence based, there is a need for contextualization. As such, we envision that a human should mediate the interpretation of the analysis before triggering any intervention. [8] | RES
9.2 | Institution-wide mentoring programs, introductory courses, counselling, and curriculum improvements have great potential for addressing the dropout risks identified through IA [Institutional analytics]. [8] | STA
9.2 | Change terminology regarding early identification systems from survive to thrive, to reflect students' potential realization. [9] | EDU | Administrators | Recommendation | Students-at-risk / Wellbeing | Students-at-risk / Wellbeing
9.2 | Combine quantitative and qualitative approaches for model construction to achieve reliable, authentic learning analytics. [9] | RES
9.2 | Set institutional policies regarding a wide range of data types, to support effective data-driven decision-making. [9] | EDU | Administrators | Methodology | Adoption | Adoption
9.2 | Current research on the state of the field points to the relative absence of published research in LA from the African context. [10] | STA
9.2 | Very little, if anything, is known about the state of the field of LA on the African continent. The present research is a first attempt to map this. [10] | STA
9.2 | In line with a commitment from SoLAR, as well as the International Conference on Learning Analytics and Knowledge (LAK), to increase participation from marginalized groups and communities, and to diversify understandings of the potential and practice of LA, it is crucial to establish a baseline. [10] | RES
9.2 | Design analytics looks at overall trends and gaps in online communities of teachers and other designers of learning experiences. [11] | STA
9.2 | We show how supervised machine learning can automatically categorize learning designs from such online communities at scale, according to high-level pedagogical features (e.g., Bloom's taxonomy of learning activities). [11] | RES
9.2 | The machine learning algorithm to use (e.g., interpretable, black box, or a combination of both) should be chosen based on evaluations of its respective performance and the needs of the target users (e.g., teachers need to understand the reasoning behind design recommendations suggested automatically by the system). [11] | RES
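[11] describes supervised categorization of learning designs by high-level pedagogical features. A minimal interpretable baseline of the kind the notes contrast with black-box models, using scikit-learn; the design descriptions and Bloom-style labels are fabricated for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical learning-design descriptions with Bloom-style categories.
texts = [
    "students list and define key terms from the reading",
    "groups critique two competing designs and justify a choice",
    "learners build a working prototype and test it",
    "recall the steps of the procedure in order",
]
labels = ["remember", "evaluate", "create", "remember"]

# Interpretable baseline: TF-IDF n-gram features + logistic regression.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(texts, labels)
print(model.predict(["students judge the quality of each argument"]))
```

With a linear model, the learned n-gram weights can be shown to teachers, addressing [11]'s point that users need to understand the reasoning behind automatic recommendations.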
9.2 | Practice informed by solid theoretical principles is crucial in driving the agenda; practitioners do not have the time/resources to reflect on theory. [12] | STA
9.2 | Focusing on good visualization and design based on feedback has been very effective in better supporting all stakeholders and empowering them in their jobs. [12] | RES
9.2 | Systematic adoption requires a top-down vision as well as buy-in from stakeholders; the SHEILA framework is useful in explicitly articulating issues and actions and reflecting on the adoption/development process. [12] | EDU | Administrators | Methodology | Adoption | Adoption
9.2 | Collaboration and reflection on processes are essential in making theory accessible to practitioners and using ideas to foster improvement. [12] | RES
9.2 | The involvement of stakeholders is essential in designing dashboards and developing effective processes to support student goal setting and self-regulation. [12] | RES
9.2 | Prior work identified ethical and legal concerns when using data to enhance learning processes ("learning analytics" and "curriculum analytics") and developed a Learning Analytics Code of Practice (Sclater & Bailey, 2018) to address these. More recently, the need for, and possibility of, "wellbeing analytics" using data to enhance student and staff wellbeing processes has also been identified. [13] | STA
9.2 | This paper describes how an in-depth study of European data protection law was used to develop a complementary Wellbeing Analytics Code of Practice (Cormack & Reeve, 2020), building on the Learning Analytics Code, to guide the investigation and routine use of data by institutions to improve support for wellbeing. [13] | STA
9.2 | We conclude that it should be possible to research, investigate, and conduct wellbeing analytics ethically, lawfully, and safely. First, this must be led by health theory (as learning analytics should be led by pedagogical theory). Second, design and testing of models and systems are critical and best viewed as independent processes. Third, a wider range of data sources will likely be required, but this requires particular care. Although our work is based on European law, that law is increasingly considered a global "gold standard," so we believe our work should be applicable elsewhere. [13] | FUT
9.2 | These should be fruitful topics for research, and important enablers for practice, with the Code of Practice providing a framework within which both research and practice can be conducted safely. [13] | FUT
9.2 | The Community of Inquiry and ICAP frameworks have been widely used to design and analyze student learning experiences and to understand the benefits of participation in online discussions. In previous work, the framework constructs were shown to be correlated with learning gains. We used them as independent proxy measures for the cognitive quality of student participation. [14] | STA
9.2 | This study looked at how various attributes of online discussions, such as text complexity and the threaded dialogue structure, were aligned with the framework constructs. We found that messages that were more deeply nested in discussion threads tended to be associated with greater quality in both frameworks. Messages that were posted later in time showed no such association. This result suggests that students should be rewarded for extending existing message threads, rather than for asking additional novel, but unrelated, questions. [14] | EDU | Instructors | Findings | Discussion | Discourse
9.2 | We also found that the frameworks were not closely aligned with each other, suggesting that they measure different aspects of student experience in online discussions. Thus, using their constructs in combination in future studies would be expected to provide richer insights than using either one alone. [14] | RES
9.2 | FoLA2 [Fellowship of Learning Activities and Analytics] is a method of considering learning analytics (LA) while designing curricula and learning activities for any subject. [15] | STA
9.2 | Educational theory, pedagogical background, and the design of learning activities differ in each organization. FoLA2 [Fellowship of Learning Activities and Analytics] has two aims: on an individual level, it aims to increase knowledge about and awareness of LA; on a group level, it facilitates shared terminology and understanding among team members to improve the co-creation of LA-supported learning activities. [15] | RES
9.1 | No notes. [16] | NUL
9.1 | Degree and eigenvector centrality measures can be consistent indicators of performance in settings where course design emphasizes collaboration. [17] | STA
9.1 | The correlation between degree and eigenvector centrality measures and academic achievement was reproducible regardless of the number of students, number of interactions, year of study, or course subject. [17] | FIN
9.1 | Closeness and betweenness centralities showed inconsistent correlation with performance. [17] | FIN
9.1 | Although our context was homogeneous, there was moderate heterogeneity in the pooled effect sizes, indicating the diversity of CSCL as a medium. [17] | FIN
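The centrality-performance correlations in [17] can be checked on any interaction network with networkx; a minimal sketch with a fabricated forum network and grades:

```python
import networkx as nx
from scipy.stats import spearmanr

# Hypothetical who-replied-to-whom edges from a course discussion forum.
edges = [("ana", "ben"), ("ana", "cho"), ("ben", "cho"),
         ("cho", "dee"), ("dee", "ana"), ("ben", "dee")]
G = nx.Graph(edges)

degree = nx.degree_centrality(G)
eigen = nx.eigenvector_centrality(G)

grades = {"ana": 82, "ben": 74, "cho": 90, "dee": 68}  # invented
students = sorted(grades)

# Rank correlation between each centrality measure and achievement.
for name, cent in [("degree", degree), ("eigenvector", eigen)]:
    rho, p = spearmanr([cent[s] for s in students],
                       [grades[s] for s in students])
    print(f"{name}: rho={rho:.2f} (p={p:.2f})")
```

On real course data this is the comparison [17] reports as reproducible for degree and eigenvector centrality, and inconsistent for closeness and betweenness.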
9.1 | A six-minute timeframe is not the gold standard, but the "shorter the better" approach applies to video design. [18] | EDU | LD | Findings | Video | Video
9.1 | The use of screen recordings or full screen has a strong impact on engagement with video. [18] | EDU | LD | Findings | Video | Video
9.1 | The types of instructions in videos relate to the allocation patterns of learners' collective attention. [18] | EDU | LD | Findings | Video | Video
9.1 | Learners are more likely to engage with videos when they are already paying attention rather than at the start or end of their online learning session. [18] | EDU | LD | Findings | Video | Video
9.1 | Bridging theories of affordance networks (network representations that describe the affordances for collaboration), the complementary theory of proxemics (which describes the physical structure of these networks during collaboration), and social network analysis (SNA), we conceptualize collaborative opportunity networks (CONs). The CONs use inter-learner proximity to create networks that indicate where collaborative engagement is possible. [19] | STA
9.1 | We show that clustering algorithms can be applied to CONs to detect different types of social groupings (in our learning environment, we identified four different component social subgroup structures: singleton, coterie, crowd, and club). [19] | FIN
9.1 | This article defines a large group's collaborative opportunity temperature (COT) as an instantaneous "reading" of the potential for collaboration, as defined by the current mix of social subgroup structures. [19] | STA
9.1 | COT can be ethically used in public settings because it works without storing identity information or tracking individuals, ensuring their anonymity. [19] | RES
9.1 | The case study we present serves as a proof of concept that COT can be useful to both designers and researchers to highlight social patterns in large-group immersive learning environments, both as a quick way to contrast group behaviours and as a complementary triangulation to data that requires lengthier analysis. [19] | FUT
9.1 | Monitoring motivation fuels student learning during early phases of studying. [20] | EDU | Instructors | Findings | Motivation | Motivation
9.1 | Acknowledging monitoring has positive effects on other monitoring events. [20] | EDU | Instructors | Findings | Monitoring | Monitoring
9.1 | Modelling within-person variance and temporal network analysis offer a possible solution to the limitations of group-level methods. [20] | RES
9.1 | This study empirically supports the theoretical framework of knowledge reservoirs, which assumes knowledge is embedded in networks consisting of different modes. [21] | RES
9.1 | The transfer of complex knowledge (about educational innovations) can succeed in two-mode networks between higher education teachers and innovative teaching projects. [21] | STA
9.1 | Thus, educational reform programs should continue to support knowledge transfer regarding innovative teaching projects through funding, networking events, or training. [21] | FUT
9.1 | No notes. [22] | NUL
9.1 | No notes. [23] | NUL
8.3 | No notes. [24] | NUL
8.3 | Multimodal learning analytics (MMLA) is a set of analytic techniques that can be used to better contextualize a given learning analytics project. [25] | STA
8.3 | The commitments described in this paper constitute best practices that researchers should follow in collecting, analyzing, and disseminating MMLA research that may be conducted in schools and laboratories. [25] | RES
8.3 | These commitments highlight the goal of making MMLA relevant to practice and ensuring that educators' and students' voices are authentically taken into consideration. [25] | RES
8.3 | Practitioners can use these commitments to hold researchers accountable for the ways that they enact MMLA innovations in educational settings. [25] | EDU | General | Recommendation | Adoption | Adoption
8.3 | There is a lack of evidence in the literature exploring specifically how tutors interpret and action learning analytics when they are made available to them. [26] | STA
8.3 | In this implementation, STEM tutors reacted in several ways: 1) they used the predictive learning analytics in five distinct ways, and their interpretation and associated actions were subjective and context driven; 2) they considered descriptive learning analytics to be more useful than predictive learning analytics when presented alongside one another; 3) they accessed the learning analytics at specific times despite different module contexts; and 4) they did not consider this implementation to have significantly contributed to their practice in supporting students. [26] | FIN
8.3 | Higher education institutions hoping to introduce actionable learning analytics implementations should be aware of the challenges inherent in using complex, non-transparent predictive algorithms, decontextualized data, and socio-technical systems that do not directly address the problems tutors face. [26] | EDU | Administrators | Recommendation | Adoption | Adoption
8.3 | Social informatics, and the identification of "shadow practices," should be considered as a valuable methodological approach when attempting to evaluate tutor practice with technologies. [26] | RES
8.3 | At K–12 schools, educators play different roles, such as teachers and instructional coaches. Understanding how these roles offer distinct vantage points toward data and how they shape sensemaking is fundamental to developing human-centred learning analytics. [27] | RES
8.3 | Training coaches to facilitate sensemaking with teachers is key for K–12 instructional improvement. Providing a common language to observe, categorize, and discuss reactions to data allows coaches and school leaders to design professional development interventions attuned to teachers' practices and needs. [27] | EDU | Teacher Trainers | Recommendation | Adoption | Adoption
8.3 | Visual analytics tools need to account for the possibility of "over-fitting" (the normalization of new instructional information into teachers' pre-existing views) and facilitate question-asking and information-seeking behaviours to mitigate this risk. [27] | RES
8.3 | The spacing and testing effects are well-known and robust findings in cognitive psychology, but they are difficult to translate to educational practice. In this study, we deployed an adaptive fact-learning system to exploit these effects in an undergraduate cognitive psychology course. Study behaviour and exam performance were recorded for two consecutive cohorts of students using the system. [28] | STA
8.3 | The estimated model parameter of the adaptive fact-learning system predicted exam grades up to two weeks in advance (Section 4.2.3). In practice, this information could be used as an early warning signal to students and their instructors that additional study is likely required for a passing grade. [28] | EDU | Instructors | Tool | Students-at-risk / Wellbeing | Students-at-risk / Wellbeing
8.3 | Performance on individual items during practice was predictive of their eventual recall on the exam (Section 4.2.4). Thus, the adaptive system could provide tailored feedback to students and help them assess their mastery of the material. Furthermore, instructors may use this information to select appropriate materials for the exam, or even forgo exams altogether by relying exclusively on the assessments provided by the adaptive system. [28] | EDU | Instructors | Tool | Assessment | Assessment
8.3 | In terms of the expected gains in exam performance, students who have an additional hour to spend studying are better off spending it on the adaptive learning system than on other self-reported study activities (Section 4.2.6). [28] | EDU | Instructors | Tool | Assessment | Assessment
8.3 | One strength of the system deployed here is that it is agnostic to the material studied, which makes it applicable in a wide range of educational settings. In general, adaptive learning systems are a viable way of translating spacing and testing effects to educational practice, and they also provide useful insights to students and instructors. [28] | EDU | Instructors | Tool | Assessment | Assessment
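The adaptive fact-learning system in [28] estimates a per-learner memory parameter and uses it to schedule practice. The sketch below is a generic ACT-R-style activation scheduler with illustrative constants, not the authors' exact model:

```python
import math

def activation(times_seen, now, decay=0.5):
    """ACT-R-style memory activation from past presentation times."""
    if not times_seen:
        return float("-inf")
    return math.log(sum((now - t) ** -decay for t in times_seen))

def next_item(history, now, threshold=-0.8):
    """Pick the seen item closest to being forgotten, else a new one."""
    seen = {k: activation(v, now) for k, v in history.items() if v}
    due = {k: a for k, a in seen.items() if a < threshold}
    if due:
        return min(due, key=due.get)  # weakest (most forgotten) first
    unseen = [k for k, v in history.items() if not v]
    return unseen[0] if unseen else min(seen, key=seen.get)

# Presentation timestamps (seconds) per fact; fact_c is not yet studied.
history = {"fact_a": [1.0, 30.0], "fact_b": [5.0], "fact_c": []}
print(next_item(history, now=120.0))
```

In the cited study, it is the fitted per-learner parameter (a rate of forgetting, analogous to `decay` here) that predicted exam grades weeks in advance.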
8.3 | We use predictive analytics to investigate, analyze, and understand the demographic, institutional, learning-environment, and temporal variables affecting undergraduate student retention rates. [29] | STA
8.3 | Early alert systems (EAS) can form an important mechanism to address student retention, but the EAS must also be designed with retention in mind. The wellness engine was and remains primarily focused on identification of support needs, not on student retention. [29] | EDU | Administrators | Tool | Students-at-risk / Wellbeing | Students-at-risk / Wellbeing
8.3 | Temporal models, such as survival analysis, are required to capture the complexities of change over time. Treatment-effects models serve an important function in decomposing effects when randomized controlled trials are not an option, as is often the case with student retention and the allocation of support resources. [29] | RES
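[29] argues that temporal models such as survival analysis are needed for retention. A sketch using the lifelines library; the columns and records are hypothetical:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical student records: duration = semesters observed,
# event = 1 if the student dropped out, 0 if still enrolled/graduated.
df = pd.DataFrame({
    "semesters": [2, 6, 1, 8, 3, 8, 4, 8],
    "dropout":   [1, 0, 1, 0, 1, 0, 1, 0],
    "gpa":       [2.1, 3.4, 3.0, 3.8, 2.5, 2.4, 2.2, 3.6],
    "support":   [0, 1, 0, 1, 1, 0, 1, 1],  # received support services
})

# Small penalizer keeps the fit stable on tiny toy data.
cph = CoxPHFitter(penalizer=0.1)
cph.fit(df, duration_col="semesters", event_col="dropout")
cph.print_summary()  # hazard ratios for gpa and support
```

Unlike a snapshot classifier, the Cox model expresses how covariates shift the hazard of dropping out over time; treatment-effects models would be the complementary tool [29] names for non-randomized support allocation.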
8.3 | Does confusion facilitate or hamper learning? That may depend on whether it is resolved or not, but it is not clear what factors influence confusion resolution. [30] | STA
8.3 | This study examined the interplay between metacognitive strategies (MS) and confusion. It found that confusion was accompanied by self-regulated behaviour, suggesting that creating situations that may evoke confusion for students may promote learning regulation. However, we could not establish that such situations would necessarily result in enhanced learning. [30] | FUT
8.3 | In order to use confusion to promote learning, instructors need to provide students with support to help them resolve confusion. This study found that the use of MS did not improve confusion resolution. Thus, providing students with only MS support may be insufficient to take advantage of confusion. Student cognitive skills and motivation must also be considered. [30] | EDU | Instructors | Finding | Confusion | Confusion
8.3 | Confusion rarely occurred naturally in this study. Even within the high-confusion group, on average, only 6% of the emotion observed was confusion. Thus, instructors may need to create activities that spark student confusion if they want to use confusion to enhance learning, while also ensuring sufficient support for confusion resolution. [30] | EDU | Instructors | Finding | Confusion | Confusion
8.3 | Learning analytics (LA) is about human decisions, which need to be carefully inspected, justified, and even challenged to serve the purpose of enhancing learning. [31] | STA
8.3 | LA is about reflecting on factors that contribute to observed patterns and prompting actions as a result. [31] | STA
8.3 | LA is about the social process in which power is negotiated as data flows. [31] | STA
8.3 | Calls to action (CTAs) in feedback messages offer a mechanism for capturing data on students' engagement with customized feedback messages. [32] | EDU | LD | Finding | Feedback | Feedback / Scaffolding
8.3 | Student characteristics aid predictions of student engagement with technology-mediated feedback that contains a CTA [call to action]. [32] | RES
8.3 | The emerging patterns could help instructors and educational designers narrow the feedback gap and improve feedback recipience. [32] | FUT
8.3 | This paper presents a conceptual framework for the development of educational tools that would promote both the development of evaluative judgment and research into it, along with a referenced implementation of that framework. [33] | RES
8.3 | The framework provides explicit guidance to educational technology developers and instructors through a co-design process to facilitate the development of tools and metrics to ascertain the impact of pedagogically supported strategies for developing students' evaluative judgment. [33] | RES
8.3 | In order to support the development of evaluative judgment and research into it, evaluative judgment strategies and learning analytics/metrics to verify their effect need to be integrated into educational tools and learning design from the outset. [33] | EDU | LA Designers | Recommendation | LA Design | LA Design
8.3 | The increase in the volume and complexity of student data makes it challenging for instructors to use conventional learning analytics dashboards (LADs) to make sense of students' learning processes and to decide on pedagogical actions to enhance learning. [34] | STA
8.3 | Instructors may be better supported in such decision making by using predictive learning analytics, which can identify at-risk students and automate the process of providing pedagogical interventions. However, use of predictive models comes with growing concerns about the fairness, accountability, transparency, and ethics (FATE) of such models. [34] | STA
8.3 | The approach presented in this paper recommends to instructors which students should be considered in more detail due to the highest deviation in performance or learning process from their classmates. The approach responds to concerns related to FATE by refraining from automatically labelling students; instead, it guides instructors to make data-informed decisions via recommendations. [34] | EDU | Instructors | Recommendation | Students-at-risk / Wellbeing | Students-at-risk / Wellbeing
8.3 | Instructors can gain interesting insights when drilling down into student data in LADs on the basis of the recommendations. However, a risk of inequity still exists. Rigorous evaluation of our proposed approach is required before it can be integrated into existing dashboards. [34] | FUT
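The FATE-aware recommendation approach in [34] surfaces the students who deviate most from their classmates instead of labelling anyone at risk. That core step can be sketched as a standardized-deviation ranking; the indicator names and values are invented:

```python
import numpy as np
import pandas as pd

# Hypothetical weekly indicators per student.
df = pd.DataFrame(
    {"quiz_avg": [72, 85, 40, 78, 90, 55],
     "logins":   [12, 15, 3, 14, 18, 20],
     "forum":    [4, 6, 0, 5, 7, 1]},
    index=["s1", "s2", "s3", "s4", "s5", "s6"],
)

# z-score each indicator, then measure distance from the class profile.
z = (df - df.mean()) / df.std(ddof=0)
df["deviation"] = np.sqrt((z ** 2).sum(axis=1))

# Recommend the top-k most atypical students for instructor review,
# without automatically labelling anyone "at risk" (cf. FATE concerns).
print(df.sort_values("deviation", ascending=False).head(3))
```

The final judgment stays with the instructor, which is the data-informed (rather than automated) decision-making [34] advocates.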
8.2 | No notes. [35] | NUL
8.2 | Teacher dashboards are a specific application of learning analytics in which visual displays provide teachers with information about their students; for example, concerning student progress and performance on tasks during lessons or lectures. [36] | STA
8.2 | In the two case studies presented here, teacher dashboard use in K–12 does not seem to be related to the teacher's age, gender, years of teaching experience, or technological self-efficacy. [36] | FIN
8.2 | A framework to guide future work for understanding the relationship between teacher characteristics and teacher dashboard use is presented. [36] | FUT
8.2 | Inquiry-based learning (IBL) has many documented benefits, but its demanding orchestration hinders adoption by teachers. [37] | STA
8.2 | We identified teachers' orchestration needs in an IBL platform and explored how to fulfill those needs with solutions that align learning design and learning analytics. [37] | STA
8.2 | Researchers, designers, and developers of information and communications technology (ICT) for IBL orchestration can benefit from the elicited IBL orchestration needs and design guidelines reported. [37] | RES
8.2 | Reflection is essential for effective problem solving and learning in game-based learning environments (GBLEs). [38] | STA
8.2 | Multimodal data captured during game-based learning may provide insight into the quantity and quality of adolescents' reflections and their relation to learning and performance with GBLEs via game-learning analytics. [38] | STA
8.2 | Our findings suggest that designing reflection prompts based on learning goals in GBLEs [game-based learning environments] provides insight into adolescents' learning and problem solving to guide instructional decision making in the classroom to support reflection, learning, and performance. [38] | EDU | LD | Finding | Feedback / Scaffolding | Feedback / Scaffolding
8.2 | Exchanging and combining data from two or more technologies requires common understandings of how data are generated by students. [39] | RES
8.2 | There are few established strategies for making sense of learner event data across different technologies. This gap signals the potential importance of higher-bandwidth interactions between and among researchers, technology developers, and school practitioners. [39] | RES
8.2 | De-identified educational data may reveal sensitive information about students when linked with publicly available data sets. [40] | STA
8.2 | Learning analytics and complete student privacy are competing goals that cannot be settled by technology alone. [40] | STA
8.2 | Solutions should combine technology, sensible data policies that are backed up by regulations and legal agreements, and appropriate training for teachers and other educational stakeholders. [40] | FUT
8.2 | We provide an overview of analytics and visualization techniques from the perspective of a system designer. [41] | STA
8.2 | Most of the presented techniques are also relevant for teachers and other stakeholders. Two interesting research questions are as follows: What is the best way to present this type of analysis to teachers? How can we make these analyses actionable? [41] | STA
8.2 | We highlight biases and caveats in learning data. Understanding these issues is necessary in order to correctly interpret research results. [41] | RES
8.2 | Learning analytics dashboards are a technological genre that provides real-time monitoring to teachers so they can track students' progress and make pedagogical decisions on the fly to best support learning. [42] | STA
8.2 | Our dashboard for science guides teachers to support students on inquiry practices by providing alerts with fine-grained data and visualizations in real time. [42] | FIN
8.2 | This study shows that the dashboard's real-time alerts were effective in helping teachers respond to student difficulties with science inquiry practices and, correspondingly, in promoting student improvement on science practices after receiving teacher help. [42] | EDU | Instructors | Finding | Orchestration | Orchestration
8.2 | Analyses of the discourse data showed that there were variations in the types of support that resulted in student improvement, with students benefiting most from higher-level support on how to understand and execute science practices, in addition to content support. [42] | EDU | Instructors | Finding | Orchestration | Orchestration
8.2 | Most of the selected empirical studies conduct surveys to understand respondents' perceptions, perspectives, attitudes, or views on the topic. [43] | STA
8.2 | Most of these respondents represent institutional rather than student views. [43] | STA
8.2 | Qualitative data is most often collected via individual interviews and electronic questionnaires. [43] | STA
8.2 | Empirical research on the ethics of learning analytics systems in higher education is, so far, concentrated in a few countries. [43] | STA
8.2 | The top three ethical areas most often addressed in the selected literature are transparency, privacy, and informed consent. [43] | STA
8.2 | The ethics of enabling interventions triggered by analytics needs to be further investigated in future studies. [43] | FUT
8.2 | Studies on justice, equality, bias, ethical dissonance, moral discomfort, and intellectual freedom will help further develop research on the ethics of LA in higher education. [43] | FUT
8.1 | No notes. [44] | NUL
8.1 | Studies of educational contexts using collaboration analytics are being implemented, but some analytics practices may unintentionally fail to fully capture the comprehensive context (e.g., when users without sufficient contextual information must make final decisions based on data collected without predefined learning models). Furthermore, definitions and perspectives on collaboration analytics vary across disciplines and underlying perspectives on collaboration. [45] | STA
8.1 | In this article, we restrict the context to a collaborative classroom setting and explore comprehensive models for collaboration analytics. Specifically, we draw upon Pei-Ling Tan and Koh's ecological lens (2017, Situating learning analytics pedagogically: Towards an ecological lens. Learning: Research and Practice, 3(1), 1–11. https://doi.org/10.1080/23735082.2017.1305661), which highlights the co-emergence of three interactions among students, teachers, and content, interwoven with time, in a classroom setting. Based on this lens, we suggest several factors to consider in each interaction when analyzing collaboration. [45] | STA
8.1 | Practitioners and researchers can produce better results and deliver them to stakeholders by adopting the framework and considering a list of factors for each interaction between the involved actors and content. [45] | RES
8.1 | Multiple studies provide ways to analyze collaboration, but most approaches are post hoc and fail to provide feedback that might promote collaboration literacy. [46] | STA
8.1 | This paper details different dimensions of collaboration, as viewed by university students, as well as how to design a system to support learning collaboration literacy. [46] | STA
8.1 | The dimensions of collaboration and the example platform could help steer the direction of future collaboration analytics tools for students and teachers. [46] | FUT
8.1 | This paper provides a framework that combines log data analyses and natural language processing to understand how CPS [collaborative problem-solving] and regulation are employed in tasks of varying levels of difficulty and scaffolding. [47] | STA
8.1 | Results indicate that students engaged in more socially shared regulation (SSR) and productive collaboration in more challenging, open-ended tasks than in scaffolded tasks. SSR also correlated with more productive co-construction of knowledge, leading to higher performance scores. [47] | EDU | Instructors | Finding | SRL | SRL
8.1 | Our findings help build a better understanding of regulation processes and co-construction of knowledge during computational modelling in K–12 science. This understanding has the potential to inform the design of future environments and tasks to foster better collaboration and learning in computational scientific modelling and beyond. [47] | FUT
8.1 | Collaboration analytics often focuses on assessing and monitoring individuals during collaborative problem-solving. [48] | STA
8.1 | As part of these efforts, researchers, educators, and learners interact with models and representations of collaborative processes. [48] | STA
8.1 | Because interdependence is a critical feature of collaborative problem-solving, models that account for interdependence are often necessary. However, these models are complex, making them potentially difficult for researchers, educators, and learners to use. [48] | STA
8.1 | To make decisions regarding when interdependent models are necessary, we need a way to measure the impact of interdependence. [48] | STA
8.1 | This paper describes the development and testing of a novel quantitative measure of the impact of interdependence that can be used to guide modelling decisions and address specific research questions about collaborative contexts. [48] | STA
8.1 | This article shows how time-intensive process data can be used to measure the degree of interdependence among the actions of group members. [49] | RES
8.1 | The proposed measures of interdependence predict group performance in mathematics above and beyond (a) the mathematical ability of the group members and (b) the total number of interactions among group members, which suggests the importance of these measures for understanding small-group collaborations. [49] | RES
8.1 | The reliability of the proposed measures increases as the number of interactions among members increases, implying that short-duration learning activities are not well suited for studying interdependence. [49] | RES
8.1 | The study of groups with three or more members is important for understanding the contributions that individual members make to group dynamics; in dyads, the influence of one group member is not distinguishable from the responsiveness of the other. [49] | STA
8.1 | Knowing the inquiry-based learning (IBL) phase of the learners' process may help with the provision of timely guidance for computer-supported collaborative inquiry-based learning (CSCIL). [50] | EDU | Instructors | Finding | Collaborative Learning | Collaborative Learning
8.1 | Instead of using classical algorithms to analyze computer-mediated interaction, we present computational models based on advanced natural language processing techniques for automatically coding orientation, conceptualization, investigation, conclusion, and discussion phases from authentic face-to-face conversations. [50] | STA
8.1 | Our methodology shows the potential of word embeddings and deep networks with attention mechanisms for the content analysis of face-to-face conversations. [50] | RES
8.1 | Our results suggest our methodology may be combined with automatic speech recognition methods to analyze face-to-face conversations and support real-time orchestration of CSCIL activities. [50] | RES
8.1 | Collaboration analytics partially or fully automate the analysis of data captured from group/team settings. These include co-located, face-to-face groups and platform-mediated groups (synchronous and asynchronous). [51] | STA
8.1 | The results of such analytics should help the team, educators, researchers, or other stakeholders gain insight into the activity and inform action that improves collaboration processes and products. [51] | STA
8.1 | Collaboration analytics require the integration of three key inputs: (i) theoretically sound concepts, (ii) well-designed tasks that generate useful data traces to inform analytics, and (iii) human-centred design that empowers stakeholders to shape the analytics and produces effective user interfaces. [51] | RES
8.1 | Four cases demonstrate that when these elements are aligned, data traces from activity sensors can be designed in principled ways to serve as evidence of learners' capabilities, rendered via user interfaces that make sense to users. [51] | RES
8.1 | Collaboration analytics of this sort, which go beyond simply counting and graphing low-level events, are predominantly research prototypes and rare to find in products. [51] | RES
7.3 | No notes. [52] | NUL
7.3 | No notes. [53] | NUL
7.3 | The systematic analysis of learning activities and their effects on students' behavioural and academic engagement revealed that student-idiosyncratic learning analytics data have an important role in analytics-informed learning designs. [54] | RES
7.3 | Large gains can be attributed to collaborative learning, as well as learning and teaching approaches where students have the opportunity to reflect on their progress and are provided with scaffolds to support time management, self-efficacy, and online presence. [54] | EDU | Instructors | Finding | Collaborative Learning | Collaborative Learning
7.3 | Caution must be exercised when interpreting effect sizes reported in educational research because the impact on student outcomes and learning gains relies on contextual factors, as well as the actions of the teacher. [54] | RES
7.3 | It is important to discuss and understand the potential of learning-analytics-informed personalization for the benefit of students' well-being and relatedness rather than focusing on performance measures alone. [54] | EDU | General | Recommendation | Wellbeing | Wellbeing
7.3 | The effective use of learning analytics to support learning design requires a combination of different levels of learning analytics to provide richer insight into student online learning processes. [55] | RES
7.3 | The specific learning analytics visualizations should consider the pedagogical intentions of the course (i.e., to make learning design decisions based on theory) and the needs of the teachers, rather than specific analytics or other technical considerations. [55] | RES
7.3 | For learning analytics visualizations to meet the intended purpose of supporting learning design, they should be provided in a timely and simplified manner. [55] | RES
7.3 | Learning analytics implementation is a framework by which learning analytics metrics can be connected to learning design to best serve learners. [56] | STA
7.3 | In an undergraduate writing course, length of discussion entries, but not page views, predicts final grade. [56] | FIN
7.3 | Differences among instructors can provide programmatic insights into the student experience of interacting with instructional content. [56] | RES
7.3 | Learning analytics has provided tools and methods to understand how instructors design for learning. [57] | STA
7.3 | The learning design process was strongly influenced by institutional policies and management. [57] | FIN
7.3 | Study skills, workload, and subject disciplines are important factors in designing online courses. [57] | STA
7.3 | Co-designing and redesigning are prominent in the learning design process of online education. [57] | STA
7.3 | A mixed-method approach can provide an in-depth understanding of how and why instructors design their courses for online and distance education. [57] | STA
7.3 | Multimodal learning analytics gathers insights into learners' actual behaviour and their cognitive-affective states, expanding the context and impact of learning analytics research by posing new methods and techniques to move forward the synergy between learning analytics and learning design. [58] | STA
7.3 | Measures from multimodal data can expand learning analytics that have the capacity to align with learning design and learning theories from the learning sciences, are consequential for learning, and can inform and influence the design (or its refinement) of learning activities and objectives in digital settings. [58] | RES
7.3 | Use of predictive modelling based on combinations of multimodal data streams affords generation of relevant learning analytics measures and showcases how the alignment between learning analytics and learning design can be achieved in a computing education context. [58] | RES
7.3 | For learning analytics (LA) to be effectively deployed to support learning and teaching, it needs to be fully integrated into the learning design (LD) process and not added in as an afterthought. An LA design component should be an integral part of an LD design framework. [59] | EDU | LD | Recommendation | Adoption | Adoption
7.3 | LD is a complex, hierarchically nested, multilevel process from the course level all the way down to the detailed designs within each task, including social organization and individual resources. Coherence and alignment across the different levels and elements of the design are important for the LD to be effective. [59] | EDU | LD | Recommendation | Learning Design | Learning Design
7.3 | LD decisions for different design elements at different levels should be underpinned by learning-sciences-grounded pedagogic principles. [59] | EDU | LD | Recommendation | Learning Design | Learning Design
7.3 | LD frameworks should provide explicit support to guide teachers and other LD professionals through the multilevel (co-)design process and facilitate the construction of mechanisms to promote pedagogical alignment across the design levels and elements. [59] | EDU | LD | Recommendation | Learning Design | Learning Design
7.3 | An LA-integrated LD framework should support the documentation of Learning Analytics integrated Curriculum Component Design Patterns (LA-CCDP) to achieve effective learning for specific learning outcomes. These patterns could serve as the "low-hanging fruit" for teachers' adoption of LA in their teaching practice. [59] | EDU | LD | Recommendation | Adoption | Adoption
7.3 | Confusion is an epistemic emotion that is unlikely to be avoided in complex learning tasks. [60] | STA
7.3 | Confusion can promote a deeper understanding of the concepts because its resolution often requires critical thinking, inquiry, and effortful problem-solving. [60] | STA
7.3 | Simulation-based predict-observe-explain (POE) environments can promote a degree of difficulty that can potentially confuse students. [60] | EDU | Instructors | Finding | Confusion | Confusion
7.3 | One of the likely moments of students' confusion is during the observe phase of the POE [predict-observe-explain] learning design. [60] | EDU | Instructors | Finding | Confusion | Confusion
7.3 | Students with misconceptions or incorrect prior knowledge are more likely to become confused during the observe phase [of a predict-observe-explain sequence] than students with correct prior knowledge. [60] | EDU | Instructors | Finding | Confusion | Confusion
7.3 | Students' confidence in their prior knowledge can strongly influence how confused they become. [60] | EDU | Instructors | Finding | Confusion | Confusion
7.3 | Students who have high confidence but make incorrect responses are more likely to show signs of confusion; their effort associated with confusion seems to positively influence their learning outcomes. [60] | EDU | Instructors | Finding | Confusion | Confusion
7.3 | No notes. [61] | NUL
7.2 | Machine learning classifiers have been widely used in predictive learning analytics (PLAs) in higher education, an approach that requires extensive work on feature engineering and a large course with many failing students. [62] | STA
7.2 | This study finds that, compared with conventional machine learning models, LSTM networks can be used to predict student course performance with higher accuracy and generalizability using time-series dependencies between student daily click frequencies in the learning management system. [62] | RES
7.2 | The LSTM [long short-term memory] approach uses a simple feature for prediction, which is more likely to be successfully applied in a wide range of courses. [62] | RES
7.2 | The LSTM [long short-term memory] approach can be an effective screening tool for detecting at-risk students regardless of course type, which improves the efficiency and affordability of in-process course evaluations. [62] | RES
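A minimal version of the single-feature LSTM described in [62] (daily click counts in, pass/fail probability out) using Keras; all data here are synthetic:

```python
import numpy as np
from tensorflow import keras

# Synthetic data: 200 students x 90 days x 1 feature (daily clicks).
rng = np.random.default_rng(0)
X = rng.poisson(lam=3.0, size=(200, 90, 1)).astype("float32")
# Toy label: students with more total activity tend to pass.
y = (X.sum(axis=(1, 2)) + rng.normal(0, 30, 200) > 270).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(90, 1)),                   # one click count per day
    keras.layers.LSTM(32),                        # time-series dependencies
    keras.layers.Dense(1, activation="sigmoid"),  # P(pass)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```

Because the only input is one click count per day, the same architecture transfers across courses without feature engineering, which is the portability argument in [62].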
7.2 | This paper advances our knowledge of the challenges of time management and the characteristics of students' progress within single activities in classrooms. [63] | FIN
7.2 | Several estimators of students' progress rates and the proportion of time required by each subtask of an activity are proposed and tested on student data from 12 university courses ranging from 64 to 241 students. [63] | STA
7.2 | The estimators have an average error of less than 2% when predicting the average progress of the class, which can help teachers optimize their use of class time. [63] | RES
7.2 | Explorations of the detected learning tactics and strategies need to consider both sequential and temporal characteristics. [64] | RES
7.2 | Learning tactics and strategies are context dependent; therefore, specific learning tactics and strategies have to be interpreted in the particular learning context from which the data originate. [64] | RES
7.2 | Detected learning tactics should reflect the instructional course design. [64] | RES
7.2 | This paper explains how predictive learning analytics can help with the selection of students who need support. [65] | STA
7.2 | A motivational intervention delivered via text, phone, and email was tested. [65] | STA
7.2 | Students receiving the intervention had statistically significantly better retention outcomes compared to the control group. [65] | STA
7.2 | Student Support Teams could use predictive learning analytics to target specific students and better support them. [65] | EDU | Advisors | Methodology | Students-at-risk / Wellbeing | Students-at-risk / Wellbeing
7.2 | No notes. [66] | NUL
7.1 | No notes. [67] | NUL
7.1 | Mastery behaviours such as persisting through difficulty, exerting effort, striving after failure, and seeking challenge can promote success in school. [68] | STA
7.1 | The persistence, effort, resilience, and challenge-seeking (PERC) task assesses these four mastery behaviours in a single computer activity that takes about 10 minutes to complete and does not depend on language ability or subject-specific knowledge. [68] | STA
7.1 | Students with higher PERC [persistence, effort, resilience, and challenge-seeking] scores have more adaptive learning mindsets and higher self-reported mastery behaviours. [68] | EDU | Instructors | Finding | Mindset | Mindset
7.1 | PERC [persistence, effort, resilience, and challenge-seeking] scores also predict high school grade point average (GPA) independent of learning mindsets and demographic factors. [68] | EDU | Instructors | Finding | Mindset | Mindset
7.1 | PERC [persistence, effort, resilience, and challenge-seeking] may be a useful tool for assessing mastery behaviours at scale. [68] | EDU | Instructors | Tool | Mindset | Mindset
7.1 | Overview: Learning analytics research is beginning to focus on non-cognitive constructs that lend themselves to more immediate measures, including fine-grained measures of affective states and behavioural disengagement. More distal constructs have also been identified as important in the sociological and social-psychological research, including self-discipline (Duckworth & Seligman, 2005), social belonging (Walton & Cohen, 2011), and academic identities. Despite their known importance, there have been fewer publications that use student interactions (e.g., click-stream data and language production) to model these distal constructs. [69] | STA
7.1 | Summary of contributions: This study examines how self-concept, interest, and value relate to student behaviours within mathematics learning software. The study finds support for the notion that math identity constructs are correlated with math success and that language features in student texts are predictive of math identity constructs. The study finds some support for the use of click-stream variables to predict math identity, but these variables are generally less predictive. The study finds no evidence that math identity grows (or declines) across the year of study analyzed here. [69] | EDU | Instructors | Findings | Mindset | Mindset
7.1 | Key implications: This study provides a foundation for examining non-cognitive variables using language features along with click-stream variables that can assist practitioners and software designers in identifying socially relevant math behaviours within online learning systems. [69] | RES
7.1 | Through machine learning models, GCA [group communication analysis] overcomes some of the constraints inherent to manual qualitative analysis or summative data streams and provides novel perspectives by quantifying intra- and interpersonal dynamics involved in CPS [collaborative problem-solving]. [70] | STA
7.1 | GCA [group communication analysis] provides educational stakeholders with a domain-independent, scalable framework to investigate how roles are constructed and maintained through the dynamic sociocognitive processes within STEM CPS interactions. Further, the novel sociocognitive dimensions within GCA provide new indicators of CPS outcomes and learner CPS [collaborative problem-solving] competencies not captured in current practices. [70] | RES
7.1 | The classification of types of CPS [collaborative problem-solving] skills and behaviours into a smaller set of sociocognitive roles reduces the analytic complexity of capturing CPS skills within a social system and facilitates the comparative study of populations across time and setting. [70] | RES
7.1 | Research on the assessment of collaborative learning is still ongoing, considering the advancement of online technologies and new theoretical considerations such as collaborative cognitive load theory (CCLT). [71] | FUT
7.1 | Collaboration in learning is currently mainly assessed through summative evaluation of individuals. [71] | STA
7.1 | We draw upon CCLT [collaborative cognitive load theory] to suggest a quantitative instrument for the assessment of the collaboration process in learning. [71] | STA
7.1 | We evaluate the collective rather than the individual, and the process rather than the outcome. [71] | STA
7.1 | The CLaP [Collaborative Learning as a Process] approach introduced here gives teachers a glimpse into the process of collaboration rather than its summative outcomes by viewing the changes occurring in the relationship between interactivity gains and coordination costs. [71] | EDU | Instructors | Methodology | Collaborative Learning | Collaborative Learning
7.1 | Schools can increase enrolment and income by strategically arranging the combinations of elective subjects on offer. [72] | EDU | Admissions | Recommendation | Enrollment | Enrollment
7.1 | This [strategically arranging the combinations of elective subjects on offer] can be achieved using the free optimization software described in this article. [72] | EDU | Administrators | Tool | Program Sequencing | Program Sequencing
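The notes for [72] mention free optimization software for arranging elective combinations. The underlying decision has the shape of a small integer program; a sketch with the PuLP library (not the article's software), using invented enrolment projections:

```python
from pulp import LpProblem, LpVariable, LpMaximize, lpSum

# Hypothetical elective combinations and projected enrolments.
combos = {"art+bio": 24, "art+chem": 18, "bio+chem": 30, "chem+cs": 27}
max_offerings = 2  # staffing allows only two combinations

x = {c: LpVariable(f"offer_{i}", cat="Binary")
     for i, c in enumerate(combos)}

prob = LpProblem("elective_combinations", LpMaximize)
prob += lpSum(combos[c] * x[c] for c in combos)  # maximize enrolment
prob += lpSum(x.values()) <= max_offerings       # capacity constraint
prob.solve()

print([c for c in combos if x[c].value() == 1])
```

A real timetabling model would add clash, room, and staffing constraints, but the objective (pick the offerings that maximize projected enrolment) matches the recommendation in [72].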
6.3 | No notes. [73] | NUL
6.3 | No notes. [74] | NUL
6.3 | No notes. [75] | NUL
6.3 | No notes. [76] | NUL
6.3 | No notes. [77] | NUL
6.3 | No notes. [78] | NUL
6.3 | No notes. [79] | NUL
6.3 | This paper analyzes responses from around the world and identifies seven factors that must be taken into account when implementing learning analytics: power, pedagogy, validity, regulation, complexity, ethics, and affect. [80] | RES
6.3 | Responses revealed widespread unease about how analytics may develop. Bringing people together to engage with and understand the issues will be one way of addressing this problem. [80] | FUT
6.3 | It is also important for different communities to discuss and understand the value of their personal data and to know how they can be used, developed, and protected. [80] | EDU | General | Recommendation | Privacy | Privacy
6.3 | Evidence that learning analytics delivers benefits is not yet convincing to experts in the field. [80] | RES
6.3 | Major investment of thoughtful effort is required in terms of research agendas and funding, policy and regulation, and developing and informing practice among all those who engage with learning analytics. [80] | FUT
6.3 | Knowledge building aims to recreate education as a knowledge-creating enterprise; as such, explorations are not circumscribed by assigned activities with a predetermined structure and endpoint, but cross disciplinary and grade-level boundaries. [81] | STA
6.3 | A new analytic tool, crisscrossing topics, shows whether students think and theorize about topics in an interdisciplinary way, leading them to address concepts at and beyond their grade level. [81] | RES
6.3 | The research reported indicates that integrating work across topics leads to more productive discourse threads. [81] | EDU | Instructors | Recommendation | Discussion | Discourse
6.3 | Response time is a type of data that complements response correctness (or score) in learning analytics. Similar to how response correctness data is transformed into a learner's latent ability or skill mastery, response time can be transformed into a learner's slowness (or, alternatively, speed). Recently, several learner models have been proposed that make use of response times. [82] | RES
6.3 | We find a way to extract user response times on assessment questions in HarvardX courses and fit them using a log-normal model via likelihood maximization. The model quantifies each user's slowness on assessment questions, controlling for each question's intrinsic time-intensity. [82] | RES
6.3 | We find that a user's tendency to be slower on assessment items is correlated with higher completion rates, higher final grades in the course, and higher certification rates. This provides a basis for future interventions in online courses that would encourage fast users to slow down. [82] | EDU | LD | Recommendation | Online Quizzes | Online Quizzes
6.3 | Our model also outputs characteristics of each assessment item, which are of interest to course content creators. [82] | EDU | LD | Tool | Online Quizzes | Online Quizzes
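The log-normal model in [82] is transparent on the log scale: each log response time decomposes into a user slowness term plus an item time-intensity term, and maximum likelihood reduces to least squares. A numpy sketch on synthetic data, fit by alternating updates:

```python
import numpy as np

rng = np.random.default_rng(1)
n_users, n_items = 50, 20
slowness = rng.normal(0, 0.4, n_users)   # true per-user slowness
intensity = rng.normal(3, 0.5, n_items)  # true per-item time-intensity

# Observed log response times: log t = slowness_u + intensity_q + noise.
u = rng.integers(0, n_users, 2000)
q = rng.integers(0, n_items, 2000)
log_t = slowness[u] + intensity[q] + rng.normal(0, 0.3, 2000)

# Alternating least squares recovers both parameter sets (up to a
# constant shift between them, pinned here by centring slowness).
s_hat = np.zeros(n_users)
i_hat = np.zeros(n_items)
for _ in range(50):
    for j in range(n_items):
        i_hat[j] = np.mean(log_t[q == j] - s_hat[u[q == j]])
    for k in range(n_users):
        s_hat[k] = np.mean(log_t[u == k] - i_hat[q[u == k]])
    s_hat -= s_hat.mean()

print(np.corrcoef(s_hat, slowness)[0, 1])  # close to 1
```

The recovered `s_hat` is the per-user slowness that [82] correlates with completion, grades, and certification, and `i_hat` gives the per-item characteristics of interest to content creators.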
6.3 Contract cheating is a significant issue for the higher education sector. There is a strong need for reliable methods to detect or prevent contract cheating. Using keystroke logging and clickstream data shows potential for this purpose. [83] STA 6.3 The current study examined whether it is possible to detect whether a student is creating their own work or transcribing from another source. [83] STA 6.3 The results indicate that it is possible to use patterns of writing activity and speed to accurately distinguish between students creating authentic writing and transcribing from another source. [83] EDU Instructors Findings Cheating Cheating 6.3 These findings point to a promising approach for detecting contract cheating and plagiarism when students transcribe work from another source. [83] EDU Instructors Findings Cheating Cheating 6.3 Adaptive learning systems (ALSs) dynamically adjust the level or type of instruction based on individual student abilities or preferences to provide a customized learning experience. [84] STA 6.3 However, ALSs [adaptive learning systems] require access to a large repository of learning resources, which are commonly created by domain experts. This makes them expensive to develop and challenging to scale across different domains. [84] RES 6.3 We introduce an ALS called RiPPLE that uses a crowdsourcing approach to engage students in creating, moderating, and evaluating learning resources. This both reduces the cost of content generation and has the potential to foster higher-order learning across many domains. [84] RES 6.3 Initial results suggest that using RiPPLE may lead to measurable learning gains and that students perceived the platform as beneficially supporting their learning. [84] EDU General Findings LA Tools LA Tools 6.3 Data about constructs related to self-regulated learning, such as motivation, emotion, and metacognitive experiences, is highly relevant in learning analytics applications. [85] RES 6.3 While this data [about constructs related to self-regulated learning] is difficult to capture automatically, learning diaries can gather process data about students' internal states. [85] RES 6.3 The main contribution of this article is to present a methodology in which concept maps are used as learning diaries to gather meaningful data for learning analytics applications. [85] RES 6.3 In the future, this method of data gathering [concept maps are used as learning diaries] can be combined with different kinds of learning analytics interventions, including personalized feedback at scale. [85] FUT 6.2 The process of embedding innovative technology in authentic contexts is as much a human challenge — cognitive, social, organizational, political — as it is a technical challenge. [86] STA 6.2 Human-centredness has been identified in other fields as a characteristic of systems that have been carefully designed by identifying the critical stakeholders, their relationships, and the contexts in which those systems will function. [86] STA 6.2 This approach opens up new possibilities for companies and academia to work together to design effective learning analytics (LA). [86] FUT 6.2 The papers in this special section provide a snapshot of how human-centred approaches are currently being applied to LA. [86] STA 6.2 Human-centred learning analytics (HCLA) addresses problems with implementation and take-up that are associated with other design approaches. [86] STA 6.2 Related work in other fields can be adopted and adapted for use with LA, but challenges specific to learning and teaching must be addressed. [86] RES 6.2 There are key differences between various terms used to describe collaboration with stakeholders in learning analytics, including user-centred design, human-centred design, participatory design, and co-creation. [87] RES 6.2 The framework of communication, usage, and service encounters is a practical lens through which to consider engagement with participatory design partners. Through this framework, practitioners can symmetrically design, implement, and evaluate interactions with users that generate value. [87] RES 6.2 Flexible LA tools — such as the Student Relationship Engagement System discussed here — allow teachers to co-create LA and develop expertise, thereby allowing them to contribute in a more meaningful way to participatory design and co-creation of LA in the future. [87] EDU Instructors Tool LA Design LA Design 6.2 Approaching participatory design in LA as co-creation of a dynamic set of actions, supported through a platform to co-create the teachers' practice, may alleviate issues surrounding technical knowledge and expertise. [87] EDU Instructors Findings LA Design LA Design 6.2 The field of Learning Analytics (LA) is beginning to explore new methods and strategies for co-designing LA tools with critical stakeholders such as teachers, students, and parents.
Effectively implementing LA co-design processes requires drawing upon design and prototyping methods from a range of disciplines, and in some cases creating new ones. However, demonstrations of successful co-design processes for LA tools remain very rare in the literature. [88] STA 6.2 This article provides an end-to-end demonstration and methodological recommendations for how non-technical stakeholders can meaningfully participate throughout the design of a complex LA tool. For example, when implementing co-design processes, designers of LA tools should centre initial discussions on stakeholder needs, rather than specific analytics, visualizations, or other technical considerations. In addition, to ensure the usefulness and usability of the resulting designs, designers should scaffold stakeholders in reflecting on how particular LA displays might inform (or fail to inform) instructional decision-making and action in the context of specific tasks and scenarios. [88] RES 6.2 The present case study illustrates the importance of carefully considering which roles to augment, which to automate, and which to leave alone when designing LA tools for use in social contexts such as classrooms. Working closely with key stakeholders such as teachers can help designers to better understand their values and the nuances of the contexts in which LA technologies will be used. In turn, this can help in understanding where automation or augmentation via LA can help more than hurt. [88] RES 6.2 Integrating the use of learning analytics into teaching practices to inform instructional decisions takes time; the gap from "interesting" to "actionable" is the most important to bridge with pedagogical support. [89] RES 6.2 Instructors' use of analytics can be stimulated and supported through opportunities for collaborative interpretation and the development of ongoing support networks with other instructors. [89] EDU Administrators Findings Adoption Adoption 6.2 Learning analytics tools can support processes of use through targeted features such as question generation, flagging patterns for later review/action, and prompts to check impact. [89] RES 6.2 Learning analytics tools can support actionability by aligning information structures with common pedagogical concerns and offering relevant reference points for comparison. [89] RES 6.2 Involving instructors throughout analytic tool development and conducting early studies of analytics use in situ can provide important insight into tool design for local actionability. [89] RES 6.2 Prior learning analytics work has used various techniques from human-centred design, ranging from user interviews to engaging practice partners in low-fidelity prototyping. A framework of design practice that is deeply embedded in partnerships with our intended users is needed. [90] STA 6.2 Designing within the context of research–practice partnerships and improvement science initiatives helps dashboard designers balance the tension between making interface–interaction decisions and ensuring that design aligns with existing work processes, data interpretation goals, and improvement aims. [90] RES 6.2 Common data collection techniques, such as user interviews and think-alouds, can be structured and analyzed for insights into practitioners' data sensemaking needs, in addition to usability analyses to inform interface or product changes. [90] RES 6.2 The purpose of learning analytics design work should not be limited only to fidelity of implementation or adoption.
Rather, an indicator of success is whether productive adaptations and local needs for LA tools can be embedded in the design itself. Partnership approaches offer unique advantages to achieving these design goals. [90] RES 6.2 The purpose of this study was to determine how effective our LA program has been at influencing change on our campus thus far, as we cultivate a data-informed culture for the purpose of improving overall student success. The primary focus of our program is to move from research to actions that will impact student performance, persistence, retention, and graduation rates. [91] STA 6.2 Because of the purpose of this study and the evolving nature of our program, we used Action Research strategies to understand, inform, and continually improve the meaningful application of learning analytics by faculty and the staff and administrators who support them. [91] STA 6.2 In the future, a summative evaluation will consider other aspects of the program, including the administrative perspective and a richer exploration of the individual faculty projects. [91] FUT 6.2 The types of analytics that faculty used in their research encompass both learning and academic analytics as defined by Long and Siemens (2011). Taken together, Fellows’ projects analyze course, department, and institutional data that benefit students, faculty, and administrators alike. [91] FIN 6.2 No notes. [92] NUL 6.2 Administrative data in higher education is not typically ready “out of the box” for learning analytics researchers. How can institutions leverage their own resources and knowledge of their data to reduce time-intensive and repetitive data-cleaning efforts to make the most of their own data? [93] STA 6.2 This article presents the case study of the LARC [Learning Analytics Architecture] dataset, as well as the decision points and process that the committee members undertook to build and maintain the data. This kind of effort remains a major hurdle for many higher education institutions. [93] STA 6.2 A framework is provided for other institutions interested in building a similar dataset for learning analytics research. Six areas of consideration are discussed: (1) facilitating partnerships across departments, roles, and levels; (2) level setting and arriving at a common level of understanding among team members; (3) obtaining buy-in from relevant stakeholders and data stewards; (4) designing for the needs of the specific institutional context; (5) utilizing national and international standards whenever possible; and (6) understanding the institutional landscape of learning analytics and designing datasets accordingly. [93] EDU Administrators Tool Adoption Adoption 6.2 We propose the Social Semantic Server (SSS) as a service-based infrastructure for workplace and professional learning analytics (LA) that focuses on knowledge creation theories. [94] RES 6.2 We identify the requirements for the SSS [Social Semantic Server] and present its design and development. [94] FIN 6.2 We evaluated the SSS [Social Semantic Server] by integrating a set of learning tools and LA applications into the SSS and using it in 4 authentic workplace learning situations involving 57 participants. [94] FIN 6.2 The size and complexity of massive open online course (MOOC) data present an overwhelming challenge to many researchers. [95] STA 6.2 The package crsra tidies and performs preliminary analysis on Coursera data. 
[95] STA 6.2 The advantages of the [crsra] package are faster loading of data, an efficient method for combining data from multiple courses and institutions, and provision of a set of functions for analyzing student behaviours. [95] RES 6.2 Advanced educational technologies, including simulations, games, and intelligent tutoring systems, continually assess students in order to provide them with appropriate activities and to determine their mastery of the topics presented. [96] STA 6.2 The assessment embedded in adaptive systems is a type of formative assessment, but we can also use it to make summative conclusions about what a student has learned. [96] EDU Instructors Recommendation Assessment Assessment 6.2 We show that process data collected from students using MATHia, an intelligent tutoring system, over the course of a year can predict high-stakes test scores over and above the ability of a prior-year test to predict these scores. [96] EDU Instructors Findings Assessment Assessment 6.2 Models learned on data from a single academic year can be used to predict outcomes for students in other academic years, suggesting that significant predictors of student outcomes remain relatively stable from year to year. [96] EDU LA Designers Findings LA Design LA Design 6.2 The ability to predict high-stakes exam scores is a necessary (though insufficient) step towards replacing such exams with embedded formative assessments, but even if high-stakes exams remain in place, predictive tools can provide important information about learner readiness for such exams. [96] EDU General Recommendation Assessment Assessment
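A minimal sketch of the "over and above" comparison described in the notes for [96] (simulated data and hypothetical feature names; not the authors' analysis): fit one model on the prior-year test alone and one that adds tutoring-system process features, then compare cross-validated R².

```python
# Sketch of an incremental-validity check in the spirit of [96]:
# does process data add predictive power beyond a prior-year test?
# All data below is simulated purely for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
prior = rng.normal(size=(n, 1))        # prior-year test score
process = rng.normal(size=(n, 3))      # e.g., hint use, errors, mastery pace
y = 0.6 * prior[:, 0] + 0.3 * process[:, 0] + rng.normal(scale=0.5, size=n)

base = cross_val_score(LinearRegression(), prior, y, cv=5, scoring="r2").mean()
full = cross_val_score(LinearRegression(), np.hstack([prior, process]), y,
                       cv=5, scoring="r2").mean()
print(f"prior test only:   R^2 = {base:.2f}")
print(f"plus process data: R^2 = {full:.2f}")  # the gain is the "over and above"
```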
6.1 Continuous-representation visualization can produce a high-level view of emergent student behaviour online without the need to define features or tags. [97] RES 6.1 Differential visualization of passing and non-passing student course behaviours can help identify deep- and shallow-learning strategies and provide instructors with essential information for modifying the curricula to discourage strategies associated with failure. [97] EDU Instructors Tool Students-at-risk / Wellbeing Students-at-risk / Wellbeing 6.1 Involving instructors in the tuning of the visualization and model parameters produces analyses with a desirable mixture of expected and unexpected, but explainable, patterns. [97] RES 6.1 Layering on additional data, such as when students create a discussion post, further contextualizes insight into student learning strategies from visualizations. [97] RES 6.1 No notes. [98] NUL 6.1 Low attendance in lectures is common in higher education, and so students who are absent may be using online materials as an important source of course content. There are differences in the temporal patterns of student engagement with materials online and in lectures. [99] STA 6.1 This work finds that more students participate online than are present in lectures, but that the rates of both forms of attendance decrease over time. [99] EDU Instructors Findings Online Courses Online Courses 6.1 Students refine their online behaviour, and rare behaviours disappear over time. [99] EDU Instructors Findings Online Courses Online Courses 6.1 Instructors can use the methods described in this paper to assess how students engage with their course materials relative to our benchmarks, and to diagnose possible reasons that students disengage. [99] EDU Instructors Methodology Students-at-risk / Wellbeing Students-at-risk / Wellbeing 6.1 Questions provide important insights into students' level of knowledge, but coding schemes are lacking to study this phenomenon. [100] STA 6.1 After providing a bottom-up coding scheme of student questions in a blended environment, we analyzed the relationship between the questions asked and the student profiles. [100] STA 6.1 Profiling students based on their questions over a year allows us to predict the profiles of future students to help the teacher understand who asks what. [100] RES 6.1 These results provide both a coding scheme that can be reused in various contexts involving questions, and a methodology that can be replicated in any context where students ask many questions, in particular to help the teacher in prioritizing them according to their own criteria. [100] RES 6.1 Teachers need to focus on the nature of the questions their students ask, because these questions can reveal information about student profiles (attendance, activity, etc.). [100] EDU Instructors Recommendation Discourse Discourse 6.1 The uptake and integration of learning analytics in most higher education institutions is limited, requiring knowledge about how to implement PLA [predictive learning analytics] at scale. [101] STA 6.1 Analysis of the perspectives of 20 educational stakeholders about the adoption of PLA at the Open University UK led to a set of recommendations about how to overcome resistance and implement PLA [predictive learning analytics] in higher education. [101] EDU Administrators Recommendation Adoption Adoption 6.1 The proposed set of guidelines [related to predictive learning analytics] needs to be tested in a variety of contexts, including campus-based universities and other distance learning institutions. [101] FUT 6.1 Further research is needed to understand and overcome teacher resistance to using PLA [predictive learning analytics] in higher education. [101] FUT 5.3 No notes. [102] NUL 5.3 This paper presents a framework that can be used to assist with strategic planning and policy processes for learning analytics. [103] EDU Administrators Methodology Adoption Adoption 5.3 This research builds on the RAPID Outcome Mapping Approach (ROMA) and adapts it by including elements of actions, challenges, and policy prompts. [103] STA 5.3 The proposed framework was developed based on the experiences of learning analytics adoption at 51 European higher education institutions. [103] STA 5.3 The proposed framework will enhance systematic adoption of learning analytics on a wide scale. [103] EDU Administrators Methodology Adoption Adoption 5.3 We developed and tested a system that prompts students with previous questions to encourage retrieval practice in a MOOC. [104] STA 5.3 Retrieval practice showed no effect on learning outcomes or engagement. [104] FIN 5.3 Learners who passed the course scored similarly on a knowledge post-test to learners who did not pass. [104] FIN 5.3 Both passing and non-passing learners retained approximately two-thirds of the knowledge from the course after a two-week testing delay. [104] FIN 5.3 The co-enrollment networks evaluated here demonstrate power-law degree distributions common to many other types of networks (see the sketch below). [105] RES 5.3 Structural models and, in particular, multi-view ensembles of structural models and traditional "flat" models can improve over the performance of non-network models for student grade prediction in residential higher education courses. [105] RES
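A minimal sketch (hypothetical enrolment records, not the authors' data) of how the degree distribution of a co-enrollment network can be inspected for the heavy tail noted for [105]:

```python
# Sketch of building a co-enrollment network and inspecting its degree
# distribution, in the spirit of [105]. Enrolment data is a toy stand-in.
import itertools
from collections import Counter
import networkx as nx

# course id -> enrolled student ids (hypothetical registrar extract)
enrolments = {"MATH101": [1, 2, 3], "CHEM110": [2, 3, 4], "HIST200": [1, 4, 5]}

G = nx.Graph()
for students in enrolments.values():
    # Link every pair of students who share a course; edge weights
    # could additionally count the number of shared courses.
    G.add_edges_from(itertools.combinations(students, 2))

degree_counts = Counter(d for _, d in G.degree())
for degree, count in sorted(degree_counts.items()):
    print(degree, count)
# A power-law distribution appears roughly linear when log(count) is
# plotted against log(degree); packages such as powerlaw fit the exponent.
```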
5.3 Assumptions of independence between network and non-network features in statistical models may be strongly violated, which can reduce the performance of models that cannot model or otherwise incorporate this dependence. [105] RES 5.3 Research on academic performance typically focuses on single course outcomes, but students who are most at risk could be experiencing difficulty in more than one course during an academic term. [106] STA 5.3 This paper demonstrates how data from early warning systems can be utilized to characterize student performance throughout the term, allowing investigation of how and when students experience academic difficulty and recovery across co-enrolled courses. [106] EDU Advisors Methodology Students-at-risk / Wellbeing Students-at-risk / Wellbeing 5.3 We identify three consistent significant influences on changing odds of risk of academic difficulty: 1) whether a student is concurrently enrolled in another difficult course as defined by historical early warning system data, 2) a student's academic major, and 3) a snowball effect whereby increased risk in the focal course is linked to increased risk of academic difficulty in other coursework. [106] EDU Advisors Findings Students-at-risk / Wellbeing Students-at-risk / Wellbeing 5.3 Utilizing the methods presented in this paper on data from their own institution, practitioners could reflect on the organization of their system-wide curriculum to best support student academic planning. For example, practitioners could use these measures to identify high-risk courses for intervention or enhance the development of early warning and course recommender systems by integrating more holistic measures of student success. [106] EDU Advisors Methodology Wellbeing Wellbeing 5.3 To effectively support teaching and learning visually, the learning analytics field needs to move on from visual analytics that invite users to explore the data to visualizations that explain insights. [107] FUT 5.3 In this paper, we propose the concept of "Educational Data Storytelling" as an approach to designing visual learning analytics interfaces that explain student data by aligning educational visualizations with the learning design intended by the teacher. [107] RES 5.3 We see the potential of learning-design-driven data storytelling elements to support sensemaking by guiding students and teachers to "one learning story per visualization," given that learning is a complex task. [107] EDU LA Designers Recommendation LA Design LA Design 5.3 Mirroring tools could regulate learner behaviour depending on the contextual setup of the programming environment. [108] RES 5.3 This study demonstrated that students who improved their performance and achieved a higher level of debugging success have gaze patterns that correspond to attention shifts between the mirroring tool and other areas of interest (AOIs). [108] FIN 5.3 Visual attention strategies among novices are not as well developed as they are among experts. This is shown in the successful and unsuccessful debugging behaviour patterns calculated from the two- and three-way transition probabilities that we observed (see the sketch below). [108] EDU Instructors Findings CS Education CS Education
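A minimal sketch (hypothetical AOI labels, not the study's data) of the two-way transition probabilities referenced in the notes for [108]: estimate how often a fixation on one area of interest is followed by a fixation on another.

```python
# Sketch of first-order (two-way) AOI transition probabilities, in the
# spirit of [108]. The gaze sequence below is purely illustrative.
from collections import Counter, defaultdict

gaze = ["code", "output", "mirror", "code", "output", "code", "mirror"]

pair_counts = Counter(zip(gaze, gaze[1:]))
from_totals = defaultdict(int)
for (src, _), n in pair_counts.items():
    from_totals[src] += n

for (src, dst), n in sorted(pair_counts.items()):
    print(f"P({dst} | {src}) = {n / from_totals[src]:.2f}")
# Three-way transitions condition on the previous two AOIs instead,
# i.e., they use (src, mid) pairs as the conditioning state.
```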
5.3 IDE-based (integrated development environment) learning analytics combined with gaze data could aid researchers and educators in designing and delivering interventions to enhance student progress by guiding them to attend to the right information at the right time so as to maximize student understanding of relevant concepts. [108] RES 5.3 Students spent, on average, less time studying in the virtual learning environment (VLE) than the number of hours recommended by instructors. [109] EDU Instructors Findings Time-on-task Time-on-task 5.3 High-performing students not only studied harder (i.e., spent more time) but also smarter (i.e., spent more time studying in advance) than low-performing students. [109] EDU Instructors Findings Time-on-task Time-on-task 5.3 Linking learning design with learning analytics allows instructors to pinpoint specific study materials that were causing delays in the student learning process. [109] EDU Instructors Recommendation Learning Design Learning Design 5.3 Visualization techniques tailored specifically to the design, delivery, and analysis of assessment material remain, as yet, under-developed and under-explored. [110] STA 5.3 This paper presents a collection of Topic Dependency Models (TDMs) that use a common graph foundation to address some of the challenges stakeholders (students, instructors, administration, researchers) face relating to the design, delivery, and analysis of assessment. [110] STA 5.3 Two case studies with a focus on computer science education are presented, suggesting that instructors can benefit from the TDMs [topic dependency models]. [110] EDU Instructors Findings CS Education CS Education 5.3 TDMs [topic dependency models] may be hard to comprehend for stakeholders without formal training in algorithmic literacy. Future work will investigate the use of TDMs among different stakeholders and disciplines with less formal training in algorithmic literacy; modifications to make TDMs more comprehensible for these audiences will also be explored. [110] FUT 5.3 No notes. [111] NUL 5.3 Past studies have found that student grades are influenced by several factors, including student ability, student demographic background, and course-specific factors. [112] STA 5.3 This study finds that student prior performance in past courses is the best predictor of final grade outcome in an introductory mathematics course. [112] FIN 5.3 We recommend early interventions to target at-risk students, particularly students with low grade-point averages, part-time students, and first-semester students. [112] EDU Advisors Recommendation Students-at-risk / Wellbeing Students-at-risk / Wellbeing 5.3 We found that time spent on assignments is associated with higher grades. Therefore, to the extent possible, assignments should be given in an online format that incorporates a timestamp, thereby allowing instructors to identify in real time students who may need intervention. [112] EDU Instructors Recommendation Students-at-risk / Wellbeing Students-at-risk / Wellbeing 5.3 Few previous studies have focused on the impact of implicit video learning analytics (e.g., pause, seek, and content events) on student success within MOOCs. Those studies did, however, find a relationship between student video interactions and the visual transitions of videos.
[113] STA 5.3 However, the impact of verbal content (i.e., the lecturer's verbal discourse) on student video interactions has not yet been explored. [113] STA 5.3 This research explores the relationship between student video interactions and discourse features of video transcripts. [113] STA 5.3 Our results demonstrate that discourse features — like first- or second-person pronouns, concrete words, frequent content words, causal connectives, and narrativity — facilitate video discourse processing, while lengthy sentences, lengthy videos, and a high speaking rate hinder it. [113] EDU Learning Designers Findings Video Video 5.3 The findings of this research provide insightful implications for MOOC staff and video producers on formulating video transcripts to reduce interaction peaks. [113] EDU Learning Designers Findings Video Video 5.3 At the time of this review, we found only a single source in the literature that discusses the definition of "actionable insights." All other sources leave the reader to infer the meaning of the concept from its use. [114] STA 5.3 This paper provides a selective overview of learning analytics literature that sheds light on the use of the concept [actionable insights] and its equivalents. [114] STA 5.3 We identify a widely used perspective taken in learning analytics, which we dub "data-informed decision-making." This perspective is characterized by an insistence on the perspective of a rational actor and the use of learning analytics for the institutional goal of increasing retention. [114] RES 5.3 We contend that "actionable insights" should be interpreted as data that allows a corrective procedure, or feedback loop, to be established for a set of actions. [114] RES 5.3 We argue that the field of learning analytics would benefit from greater attention to the role of perspective and action capabilities in determining what "actionable insights" are. [114] RES 5.3 These findings challenge the perspective of data-informed decision-making and, with it, the idea that the presence of data alone provides the basis for determining insights. Instead, they charge any learning analytics researcher to map out the workflow of actions, the end goals of the actors involved, and the relevant couplings between them. [114] RES 5.3 Previous research suggests that learning is facilitated by tutorial dialogues; specifically, in the intelligent tutoring system BRCA Gist, tutorial dialogues contribute to learning. [115] STA 5.3 Coh-Metrix 3.0 is a web-based tool for the automatic evaluation of text and discourse that we applied to analyzing BRCA Gist tutorial dialogues about genetic breast cancer risk. [115] STA 5.3 In two studies, we found that the dialogues scored high on deep cohesion, referential cohesion, and the situation model variable LSA verb overlap. This indicates that participants' mental representations were coherent and conceptually well integrated at the level of actions. [115] FIN 5.3 It appears that this method of using Coh-Metrix allows one to make reliable inferences about the mental representations of people engaged in conversation with an intelligent tutoring system. [115] RES 5.3 The report draws on research findings related to the effect of personalized feedback on student satisfaction and academic performance (Pardo, Jovanović, Dawson, Gašević, & Mirriahi, 2018).
[116] STA 5.3 The main contribution is the description of the design and implementation of an open-source platform for researchers and practitioners to connect data with personalized learning support actions. [116] RES 5.3 The area of learning analytics needs tools such as the one described in this document to serve as a vehicle to exchange insights among researchers and practitioners. [116] STA 5.2 No notes. [117] NUL 5.2 Social network analysis has been one of the most commonly applied methods within learning analytics. However, many of the common constructs and tools these methodologies employ have not been subjected to robust validation. Such concerns pertain to construct validity: namely, does a metric actually measure what it purports to measure? [118] STA 5.2 In this study, we find that different social tie extraction methods influence the structural and statistical properties of the induced networks, as well as the associations between centrality measures and academic performance. [118] RES 5.2 Our results emphasize not only the importance of transparency in the choice of tie definition, but also the importance of providing a justification for that choice. Given the impact that tie definitions may have, we advise that practitioners investigate a number of options to ascertain the extent to which such methodological choices can bias their results. [118] RES 5.2 We analyzed an online adaptive practice environment for arithmetic, actively used by over 400,000 primary school children in the Netherlands. [119] STA 5.2 Adaptive practice is achieved by continuously tracking both student abilities and item difficulties, and matching students to items (see the sketch below). [119] STA 5.2 A unidimensional adaptive algorithm, separately employed within each domain (e.g., multiplication), takes care of tracking abilities and difficulties. [119] RES 5.2 We show that the obtained unidimensional ability and difficulty estimates are, to a large extent, reliable and accurate. [119] RES 5.2 Moreover, we explore the many sources of misfit, or violations of the unidimensionality assumption, including differences in response processes (fast and slow responders) and response strategies (erroneous strategies that work for clusters of items). [119] STA 5.2 To advance the field of learning analytics, we call for active analytics such as exemplified in this paper. Learning analytics must actively help direct a student towards his or her educational objective by means of embedded analytics that not only analyze the student but also shape the learning path (such as the adaptive algorithm discussed) and include experiments that ensure changes to the system have the desired effect. [119] FUT
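The notes for [119] do not name the tracking algorithm; a common choice for jointly updating ability and difficulty from streaming answers is an Elo-style update, sketched below under that assumption (the deployed system's algorithm, which also incorporates response times, differs in detail).

```python
# A minimal Elo-style sketch of tracking student ability and item
# difficulty, in the spirit of [119]. Parameters are illustrative.
import math

def expected(ability: float, difficulty: float) -> float:
    """Probability of a correct answer under a logistic (Rasch-like) model."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def update(ability: float, difficulty: float, correct: bool, k: float = 0.1):
    """Move ability and difficulty in opposite directions after each answer."""
    surprise = (1.0 if correct else 0.0) - expected(ability, difficulty)
    return ability + k * surprise, difficulty - k * surprise

ability, difficulty = 0.0, 0.0
for outcome in [True, True, False, True]:
    ability, difficulty = update(ability, difficulty, outcome)
print(round(ability, 3), round(difficulty, 3))
# Matching students to items then amounts to selecting items whose current
# difficulty yields a target success probability (e.g., around 75%).
```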
5.2 Learning Analytics, as a field, should ultimately strive to make strong causal inferences, identifying the specific interventions that optimize and improve learning. [120] FUT 5.2 The most straightforward and compelling research method for supporting causal inference is experimentation. [120] STA 5.2 In this article, we advocate for embedding experiments within pre-existing learning contexts, in order to improve the strength of causal claims in learning analytics, and also to close the research/practice loop. [120] RES 5.2 We review practical matters in the design and deployment of embedded experiments and highlight the benefits of including experimentation in the learning analytics toolkit. [120] RES 5.2 For data visualizations to meet the intended purpose of supporting complex human cognition, design choices should be based on human factors research. [121] RES 5.2 This paper reviews some of these research insights from cognitive psychology and visualization science to inform human-centred design of data visualizations in order to support complex cognitive processes. This is achieved by demonstrating that, depending on the approaches to data visualization, attention and cognition may be facilitated or impaired. [121] STA 5.2 Evidence-informed guidelines are proposed for enhancing the design of data visualization to support inference, judgment, and decision making for researchers, practitioners, and consumers of learning analytics. [121] RES 5.2 Previous research has shown that choice of metric plays a key role in training and evaluation of student models, focusing primarily on metrics intended for models that produce probabilistic predictions of student outcome variables. [122] STA 5.2 Imbalances in labelled data are quite common in student modelling tasks, and have been shown to impact metrics used for machine-learned student models. [122] STA 5.2 This paper explores the impact that predicted class proportions and data class proportions have on discrete model metrics including Cohen's kappa, precision, recall, and F1, and formulates a random-chance-level F1 measurement that is adjusted for imbalances (see the sketch below). [122] STA 5.2 Results on real-world student models and simulated models show that best practices include reporting multiple metrics for discrete student models, and comparing F1 scores to the appropriate chance level to avoid over- or under-estimating model performance. [122] STA
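One common way to formulate such a chance-adjusted F1 baseline (an assumption for illustration; the paper's exact formulation may differ): for predictions made independently of the labels, expected precision equals the positive base rate q and expected recall equals the predicted-positive rate p, giving a chance-level F1 of 2pq/(p+q).

```python
# Sketch of a chance-level F1 baseline for imbalanced data, in the
# spirit of [122]. Not necessarily the paper's exact formulation.
def chance_f1(q: float, p: float) -> float:
    """Expected F1 of label-independent predictions.
    q: proportion of positives in the data.
    p: proportion of cases the model predicts positive."""
    return 2 * p * q / (p + q) if p + q else 0.0

# With 10% positives and a model flagging 10% of cases, an observed F1
# should be judged against this baseline rather than against zero.
print(chance_f1(q=0.10, p=0.10))  # 0.10
```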
5.2 Use statistical testing for model evaluation, with tests appropriate to the prediction architecture used in the experiment. [123] STA 5.2 Do not use an uncorrected paired t-test for model evaluation when cross-validation is used. [123] STA 5.2 We recommend Bayesian hierarchical methods for model evaluation. Existing open-source software toolkits and consumer-grade hardware are adequate for even large-scale learning analytics experiments. [123] STA 5.2 Clearly describe or document methods for statistical algorithm and hyperparameter selection, prediction architectures, and model evaluation in published work. [123] STA 5.2 In the MOOC dropout prediction experiment presented here, hyperparameter tuning had little effect relative to both feature and algorithm selection. Simple activity-based features derived from the clickstream outperformed even sophisticated features derived from discussion forums and assignments. Non-parametric tree-based algorithms (CART, AdaBoost) achieved strong performance. [123] STA 5.1 No notes. [124] NUL 5.1 In jigsaw instruction, learners who have become experts in one group learning activity interact with those who study different materials in order to integrate the materials and learn from each other. [125] STA 5.1 We use two approaches to study jigsaw instruction: 1) dialogism, in which student dialogue is considered in its broad context over time; and 2) social physics, in which researchers use quantitative methods like social network analysis to describe mathematical relationships between idea flows and human behaviours. [125] STA 5.1 Identifying pivotal moments in jigsaw instruction, in which a lack of knowledge from students can shift dialogue, is important. Our computational analysis based on social physics could identify such pivotal segments. [125] RES 5.1 Further discourse analysis based on dialogism showed how productive interaction unfolded among learners. [125] STA 5.1 Results suggest that our mixed-methods approach can significantly reduce the time needed to analyze group interaction, supporting teachers in using the results to redesign their classroom practices, and highlighting the importance of dialogic and social physics approaches to understanding student learning. [125] RES 5.1 Although the frequency with which participants perform interactions within course management systems and the total amount of time spent in online courses are important indicators of engagement commonly used in analytics packages, the timing at which tasks are completed can also be an important indicator. [126] RES 5.1 The methods presented in this study provide tools for researchers to capture and visualize variables representing the timing of work by participants. [126] RES 5.1 Tools for tracking and measuring timing in courses could be useful for course managers and researchers to help further promote participants' achievements, promote course personalization, and understand common challenges and patterns of participation that affect a course. [126] FUT 5.1 Educational software automatically logs student entries, clicks, and other submitted activity. Researchers can use these detailed logs to infer information about student learning processes. However, software cannot log some important events that happen during student learning experiences. [127] STA 5.1 We collected video and audio while students used educational software in the classroom and developed a tool to combine the software data logs with the audio/video data to discover insights about student learning processes. [127] STA 5.1 The novel STREAMS tool allows researchers to extract audio and video data aligned with features extracted from software logs that reveal unique insights about students' conceptual struggles and learning trajectories. [127] RES 5.1 The sequence model for learning analytics represents temporal relationships in heterogeneous student data as a basis for predictive models. [128] RES 5.1 This paper contributes an approach to learning analytics that uses temporal models for predicting students at risk and shows how these models perform with higher accuracy than non-temporal models. [128] RES 5.1 The implications of the sequence model for student data are improved predictive accuracy and a better understanding of students at risk. [128] RES 5.1 Researchers have not determined how to analyze whether student learning processes (e.g., creating an idea or evaluating it) often occur in specific sequences and whether they are linked to learning outcomes at higher level(s) (e.g., individual test score, group final project, etc.). [129] STA 5.1 After introducing, illustrating, and contrasting two methods to address the above issues on a dataset, I created a statistical procedure using both methods: higher-level outcome analysis followed by lower-level process analysis (if needed). [129] STA 5.1 This statistical procedure can test whether learning processes form a recurrent sequence (e.g., correct evaluation -> correct, new idea); see the sketch below. [129] RES 5.1 This statistical procedure can test whether learning processes or sequences of them are linked to learning outcomes at higher level(s) (e.g., groups with more correct evaluations; correct, new ideas; or correct evaluation -> correct, new idea sequences have higher solution scores). [129] RES
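A minimal sketch (not the author's exact procedure) of testing whether a coded learning-process transition recurs more often than chance, using a simplified lag-sequential z statistic over a toy coded sequence:

```python
# Sketch of a lag-sequential test for a recurrent transition such as
# "correct evaluation -> correct, new idea", in the spirit of [129].
# The coded sequence and the simplified z statistic are illustrative.
codes = ["eval+", "idea+", "eval+", "idea+", "idea-", "eval+", "idea+"]

pairs = list(zip(codes, codes[1:]))
n = len(pairs)
obs = sum(p == ("eval+", "idea+") for p in pairs)
p_src = sum(c == "eval+" for c in codes[:-1]) / n   # P(source at t)
p_dst = sum(c == "idea+" for c in codes[1:]) / n    # P(target at t+1)
expected = n * p_src * p_dst                        # count expected by chance
variance = n * p_src * p_dst * (1 - p_src) * (1 - p_dst)
z = (obs - expected) / variance ** 0.5
print(f"observed={obs}, expected={expected:.2f}, z={z:.2f}")
# |z| > 1.96 suggests the transition occurs more (or less) often than
# chance; linking such sequences to outcomes is the procedure's next step.
```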
5.1 These results can help teachers assess student learning processes and the sequences they form, and then intervene appropriately to improve learning outcomes. [129] EDU Instructors Methodology Assessment Assessment 5.1 No notes. [130] NUL