The 2nd Annual Learning & Student Analytics Conference was held on October 22-23 at the University of Amsterdam. Some of the APOA project team members attended the conference, which brought together practitioners, lecturers and researchers to share ongoing practice and research related to learning analytics. The most interesting topics and presentations are briefly introduced below.

Timothy A. McKay from the University of Michigan gave an interesting keynote speech on using LA to probe equity in education. He gave examples of exploring performance differences in large STEM courses with LA. They had compared students’ expected performance (based on GPA) with their achieved performance in a specific physics course and observed that women performed worse than expected. Based on these investigations, McKay suggested shifting the focus from who did well or poorly to who did better or worse than expected. McKay also described the tools they use to personalize education. A tool called ECoach allows them both to learn more and to experiment with possible interventions.
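The "better-or-worse than expected" idea can be sketched in a few lines. This is our own illustration, not McKay's actual model: the linear predictor, its coefficients and the toy data are all made up for the example. The point is simply that you compare each student's achieved grade against the grade predicted from their prior GPA, rather than looking at raw grades.

```python
def expected_grade(gpa, slope=0.9, intercept=0.3):
    """Hypothetical linear model: predict a course grade from prior GPA.
    The coefficients are illustrative, not fitted to real data."""
    return slope * gpa + intercept

def grade_anomaly(gpa, actual_grade):
    """Positive = did better than expected, negative = worse than expected."""
    return actual_grade - expected_grade(gpa)

# Toy records: (student, prior GPA, achieved grade in the physics course)
students = [
    ("A", 3.8, 3.2),
    ("B", 3.0, 3.1),
    ("C", 3.5, 3.6),
]
for name, gpa, grade in students:
    print(name, round(grade_anomaly(gpa, grade), 2))
```

Averaging this anomaly over groups of students (rather than averaging raw grades) is what makes the equity gaps McKay described visible.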

Here is also an interesting article by McKay: Architecting for Learning Analytics.

Scaling grassroots projects

The “Scaling grassroots projects” session was probably the best parallel session of the conference. It included three concrete presentations on putting LA into practice. In this session we were shown the implementation of learning analytics at Nottingham Trent University and got insights from the STELA and ABLE Erasmus+ projects.

Tom Broos from KU Leuven suggested starting to build LA from the context, not from the tools. He also reminded us that building systems and tools requires the local knowledge of practitioners and end users. This point was echoed by Ed Foster from Nottingham Trent University, who highlighted that “the perfect tool needs perfect understanding of your institution and your systems. No vendor/internal IS specialist has this.”

NTU is already using a student dashboard that provides information to both students and teaching staff. It presents an overall picture of each student’s engagement with their course. More information is available on their website; it is worth checking out!

The presentation about the STELA and ABLE projects focused on implementing institutional learning analytics. The picture below shows the steps used in ABLE. You can read more about the steps and the challenges behind them here: https://oflablog.files.wordpress.com/2018/10/lsac-2018-presentation.pdf

Modelling the implementation of change

The conclusion of all this was that the implementation process takes a long time. It requires patience and countless meetings with everyone involved in developing or using LA. Even though it is not easy, it is still something worth doing. There is no turning back.

Further information available at

Fred Pope from the University of Amsterdam presented research in which they had investigated students’ study tempo. They had noticed that high-performing first-year students underperformed in their second year, leading to delayed graduation or even dropping out. They had managed to develop a simple prediction model of students’ study tempo. In their investigations, all students who did not maintain their tempo and dipped by more than one period delayed their graduation by more than a year.
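The reported rule is simple enough to sketch. The following is our own minimal illustration of that kind of rule, not the actual UvA model: the function name, the representation of tempo as completed periods per year, and the example numbers are all assumptions made for the sketch.

```python
def predicts_delay(tempo_by_year, max_dip=1):
    """Flag a student as at risk of delayed graduation if their study tempo
    (e.g. completed periods per year) ever drops by more than `max_dip`
    compared to the previous year. Illustrative rule, not the real model."""
    for prev, curr in zip(tempo_by_year, tempo_by_year[1:]):
        if prev - curr > max_dip:
            return True
    return False

print(predicts_delay([6, 6, 5]))  # dip of 1 period: not flagged
print(predicts_delay([6, 4, 5]))  # dip of 2 periods: flagged
```

Even a crude rule like this can be useful for early warning, since it only needs data the institution already has.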

Overall, it appears that while other nations are struggling with funding issues, we have quite a unique and fruitful situation here in Finland. The Ministry of Education and Culture has funded this national project, which has the potential to change practices not only locally but also at the (inter)national level.