Designing in Context: Reaching Beyond Usability in Learning Analytics Dashboard Design


  • June Ahn University of California, Irvine
  • Fabio Campos New York University
  • Maria Hays University of Washington, Seattle
  • Daniela DiGiacomo University of California, Riverside



Keywords: Human-Computer Interaction, Learning Dashboards, Design Narratives, Data Sensemaking, Improvement Science, Learning Sciences


Researchers and developers of learning analytics (LA) systems are increasingly adopting human-centred design (HCD) approaches, and there is a growing need to understand how to apply design practice in different educational settings. In this paper, we present a design narrative of our experience developing dashboards to support middle school mathematics teachers' pedagogical practices within a multi-university, multi-school-district improvement science initiative in the United States. Through documentation of our design experience, we offer ways to adapt common HCD methods (contextual design and design tensions) when developing visual analytics systems for educators. We also illuminate how adopting these design methods within the context of improvement science and research–practice partnerships fundamentally influences the design choices we make and the focal questions we undertake. The results of this design process flow naturally from the appropriation and repurposing of tools by district partners and directly inform improvement goals.






How to Cite

Ahn, J., Campos, F., Hays, M., & DiGiacomo, D. (2019). Designing in context: Reaching beyond usability in learning analytics dashboard design. Journal of Learning Analytics, 6(2), 70–85.



Special Section: Human-Centred Learning Analytics