Transparent analytics for different user groups

Analytics Intelligence user needs surveys started from the everyday observation that learning analytics and its uses are unfamiliar to students and even to staff. The respondents were, of course, aware that various registration systems exist containing information about students, studies and training, yet they were not particularly familiar with the concept of "learning analytics". For example, the University of Tampere does not have learning analytics operations in place. 

The project conducted user needs surveys in spring 2019 at six partner universities. There were a total of five questionnaires, targeted at different user groups: students, teachers, tutor-teachers/tutors, study coordinators and those responsible for education. The questionnaires were answered by 183 students and 170 staff members, and they revealed the different user groups' needs in utilising analytics intelligence in the context of learning. The open-ended questions, in particular, produced a rich body of data on how the university should use registry data and on the ethical issues involved in its use. The open questions were well received: about half of the respondents answered them. 

Both students and staff were remarkably unanimous about how the university should use study data. Over half of the respondents said in their open answers that they would like the university to use registry data to design and develop teaching. In contrast, the questions on ethical issues drew different views from different user groups. Students identified guidance, transparency, the impact and misuse of information, and the use of sensitive information as major ethical concerns. Most of the staff considered these same issues to be ethical concerns, but surprisingly many (13.8% of the respondents) felt that there were no ethical issues at all in the use of registry information. This was an interesting research result, as the question had no ready-made answer options: each respondent wrote his or her own views on the ethical issues involved in utilising registry information. 

The differences between the views of students and staff may be explained by the different angles from which they look at analytics. Students may not yet have grasped the opportunities that learning analytics offers for their own studies or for the university's general management. The staff, on the other hand, may take a very instrumental approach to utilising learning analytics: for the student, disclosing his or her own data is a personal matter, whereas staff see students partly as a faceless mass. Although there is no real contradiction here, the two views represent different aspects of analytics, such as privacy and productivity. What matters most is that the various functionalities and their purpose are carefully thought through, well founded, ethically sound and agreed by all parties. Based on the results, there is still much to be done in the domain of learning analytics. Users should be trained in analytics in general, and in the use of specific services and applications in particular. This should be accompanied by a discussion on both the ethical use of analytics and the rules governing it. 

Hanna Lindsten

Jussi Okkonen

Tampere University

Blog, featured

Learning Analytics as a Studies Guidance Tool

At the University of Oulu, the personal tutor teacher (PSP teacher, teacher-tutor) supports students' study progress and guides them along their study paths. The personal tutor teacher is an important close contact for students in their university studies, and his or her duties include assisting the student with the development of a personal study plan, tracking the student's progress, and guiding the student in career planning and career choices. Personal tutor teachers are typically lecturers, university teachers or researchers in the same discipline, and they carry out tutoring alongside their own teaching and research duties. 

As part of the Analytics AI project, the University of Oulu is developing analytics tools for personal tutor teachers, aimed at making it easier to monitor individual students' study progress in real time. The goal of the visualisations being developed is to give the tutor teacher a clear picture of the student's progress in relation to the student's own study plan. The tools can be used, for example, when preparing for a counselling meeting, during the meeting itself, and more generally in following up on studies. As we develop new tools for learning analytics, we are also researching and developing the practices through which the resulting knowledge is put to use. In addition to creating the tools, we need to understand who their users are, and for what purposes and in which situations the tools can be used.

Next, Oulu will test the functionality of a visualisation tool in the guidance of second-year students together with their tutor teachers. The aim is to understand how the tool can serve as a conveyor of knowledge and as a basis for discussion about the progress of studies and the students' own study goals. An essential part of the pilot study is to create user instructions and guidance for both user groups on how to use the new tools to support study guidance. As we collect feedback on the comprehensibility and meaningfulness of the views, we gain insight into users' different experiences with the tool. For the technology to be introduced and deployed in a sustainable fashion, it is essential to understand its operating environment.

From the student's perspective, it is important to develop tools that, as part of guidance and study practices, support students in making choices about their studies and in planning them. For joint meetings of students and tutor teachers, it is important that the visualisation tools promote high-quality student-tutor interaction rather than mere technical review or data mining. In that way, the tools can help establish genuine and meaningful instances of student-tutor interaction. 


Anni Silvola and Riku Hietaniemi, University of Oulu


Predicting the progress of studies in a changing context of learning outcomes

For whom do we predict?

Quite often we come across "prediction models" that are able to describe events dating years, even decades, back. We know precisely the number of students, the year of graduation, how long it took to complete the studies, and what the average grade or drop-out rate was for a given year. When there is a sufficiently large amount of such data, we generate averages that we believe can help predict future student progress, lead degree programs, or support study guidance. In reality, however, we have lost sight of the individual whose progress we should be looking at, and at the same time we have forgotten that a real prediction model should deal with new individuals starting their studies.

Learning prediction models are often positioned as tools for study guidance, for curriculum management, or for the university-level funding model. Surprisingly, the support that a forecasting model could provide for the customer, the individual student, is easily forgotten, and even more often the end user's perspective is forgotten: a forecasting model should be able to produce reliable information for the development of educational products, so that the industries in need of graduating students have access to experts with a suitable and up-to-date knowledge profile. As we currently anticipate the progress of studies, we may focus too much on the credits produced, the number of graduates, the duration of studies, or the cost of teaching. A good prediction model would, in addition to these, be able to highlight the changes needed in the knowledge profile, and it would also consider the end user's perspective. A new challenge is the increasing diversity of distance-learning programs, where students progress in digital environments according to their own schedules and where content can be updated in real time at any point in the calendar year. Regardless of the learning environment, the prediction model should be able to take into account that someone needs these graduating students!

Does the system support our operations, or does it control what we do?

A classic question when developing information systems is how much flexibility the system should have to suit the needs of different users. From a purely analytical and statistical point of view, a strictly rule-based and locked-down system would be the easiest, but there are at least two big variables in the prediction model that require system flexibility. First, the student's background and life situation make each student an individual who will, in any case, follow his or her own path. Secondly, the degree structures underlying the forecasting model, and the courses included in them, will, at least hopefully, evolve to meet the needs of the end user mentioned above. Therefore, the system behind the forecasting model must allow and recognize a wide range of flexibility regarding individual choices, changes and updates in degree programme structures, study modules with variable scopes, and study paths originally designed to be of varying lengths. It would be unsatisfactory if the system were so rigid that students were forced into a particular template, or if flexibility in their studies became "system controlled". Undoubtedly, allowing flexibility makes it harder to build prediction model algorithms, but only by allowing flexibility can one produce forecast information that serves the entire chain from the customer to the end user.
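One way to picture the flexibility described above is a data model in which module scopes are ranges rather than fixed values, so that individual study paths can differ and still be validated. The sketch below is a hypothetical illustration, not the AnalyticsAI project's actual data model; the module names and credit ranges are invented.

```python
# Hypothetical sketch: a degree structure with variable-scope modules,
# so individual choices can vary while remaining within the structure.
from dataclasses import dataclass

@dataclass
class Module:
    name: str
    min_credits: int   # variable scope: a range, not one fixed number
    max_credits: int

def plan_is_valid(modules, chosen_credits):
    """Check that each module's chosen credits fall within its allowed range."""
    return all(
        m.min_credits <= chosen_credits.get(m.name, 0) <= m.max_credits
        for m in modules
    )

degree = [Module("Core studies", 60, 60),
          Module("Electives", 10, 35)]

print(plan_is_valid(degree, {"Core studies": 60, "Electives": 25}))  # True
print(plan_is_valid(degree, {"Core studies": 60, "Electives": 40}))  # False: over range
```

A rigid system would store a single fixed credit value per module; representing the scope as a range is one simple way to let the validation logic, and any prediction model built on top of it, accommodate individual study paths.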

In order to meet the needs described, we have defined, for example, in the description of the AnalyticsAI project, the following: “The final phase of the project will provide a set of non-organizational and generic definitions of common ERP-critical information content and an e-PSP (electronic Personal Study Plan) prediction model”.

Let’s keep this in mind as we move forward with the project together.

The authors

Harri Eskelinen & Terho Lassila

Lappeenranta-Lahti University of Technology LUT


Learning analytics as an evaluation tool - Privacy, Legal Security and Liability

Utilising artificial intelligence applications to support student learning, student guidance, learning assessment, and knowledge management can change university teaching procedures and practices. Legally, the use of artificial intelligence in student guidance and learning assessment involves a number of problem areas for which there is no legal solution yet.

The Data Protection Regulation sets out the conditions under which personal data may be processed, as well as the restrictions on data subject profiling and automated decision-making. The processing of personal data must have a legal basis in accordance with the Regulation. For example, where processing is necessary for compliance with a university's statutory obligation, there is an appropriate legal basis for processing personal data. Universities have statutory obligations under the Universities Act: for example, they have the task of providing research-based higher education and of arranging teaching and study guidance so that students can complete their studies full-time within the stipulated target completion period. The use of learning analytics in universities to meet these obligations is therefore permitted on this legal basis.

When learning analytics is used, students must in practice be profiled on the basis of their knowledge. According to the Data Protection Regulation, "profiling" means any automated processing of personal data in which personal data are used to assess certain personal characteristics of a natural person. Analysing student learning is profiling within the meaning of the Regulation, as a student's ability to learn is a personal attribute that learning analytics assesses. Profiling is not categorically prohibited by the Data Protection Regulation, but it must satisfy the criteria for processing personal data laid down in the Regulation, which require taking into account purpose limitation, the need for and necessity of data minimization, the transparency of processing, and respect for the data subject's rights. In many cases the data subject has the right to object to profiling, but no such right exists if the processing is based on the fulfillment of the data controller's statutory obligation, such as the provision of teaching and guidance under the Universities Act.

Learning analytics can also be used to evaluate students' learning outcomes. Learning analytics may thus be used for automated decision-making within the scope of the Data Protection Regulation, which is in principle prohibited. In universities, learning analytics can be used in the assessment of students only if the teacher retains effective control over the final outcome of the evaluation, or if automated assessment is specifically provided for by law.

The automated assessment of student learning raises issues beyond the protection of personal data. For the benefit of the student, the Universities Act provides legal safeguards relating to the assessment of study attainments. These legal safeguards cannot be compromised when utilizing learning analytics. Assessment of learning is also an exercise of public authority, associated with the exercise of official duties. Under current legislation, official responsibility for automated analysis cannot be outsourced: ultimately, the teacher concerned remains responsible for the evaluation.

The technological conditions for developing learning analytics exist, but the operational and legal conditions are still finding their place in the development of artificial intelligence. Legal challenges do not necessarily present a barrier to the use of learning analytics in student guidance and assessment, but development work requires careful design of work processes to safeguard students' personal data and legal protection, without forgetting the teacher's own legal protection.

The authors

Tomi Voutilainen and Juuso Ouli

University of Eastern Finland


What is Learning Analytics?

In higher education institutions, as in other organizations, various electronic systems constantly leave electronic traces of their users, that is, data. When a student takes an electronic exam, the system records how long the student took to answer and how many words he or she wrote. Learning environments record information such as assignment returns and logins. The course register, in turn, accumulates course scores and grades.

A person's data trail can be divided into an active and a passive footprint. An active footprint is created when, for example, people write messages or leave feedback. A passive footprint, on the other hand, is left by everything the user is unaware of, such as timestamps and clicks.1
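The active/passive distinction can be illustrated with a small sketch. The event names below are hypothetical examples, not taken from any real learning environment, and the split into the two categories is an illustrative assumption.

```python
# Illustrative sketch: labelling log events as active or passive traces.
# Event type names are hypothetical, not from a real system.

ACTIVE_EVENTS = {"forum_post", "feedback_submitted", "assignment_returned"}
PASSIVE_EVENTS = {"login", "page_click", "time_on_page"}

def classify_trace(event_type: str) -> str:
    """Label an event as an active or passive digital trace."""
    if event_type in ACTIVE_EVENTS:
        return "active"      # the user deliberately produced content
    if event_type in PASSIVE_EVENTS:
        return "passive"     # recorded without the user's awareness
    return "unknown"

log = ["login", "page_click", "forum_post", "time_on_page"]
print([classify_trace(e) for e in log])
# ['passive', 'passive', 'active', 'passive']
```

In a real system the event taxonomy would of course come from the learning environment's own logging, but the point stands: most of the trail is passive, produced without any deliberate act by the user.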

By definition, learning analytics is the process of gathering, measuring, analyzing and reporting learner-centered information for the purpose of understanding and optimizing learning and learning environments.2 Learning analytics thus seeks to add value to information that was too laborious to deal with before analytics, in order to serve different user groups: students, teachers, tutors, and administration and management.

The potential for using analytics depends on what kind of applications are built around it. Digital learning platforms collect data as a matter of course, and many learning environments have built-in analytics capabilities. However, analytics can also be extended to cover library loans or even lecture attendance, by adding electronic registration to lessons, for example through a mobile application. In theory, data can be collected endlessly, so it is essential to identify what information is really useful for developing learning processes.

Learning analytics can be utilized in many different ways to serve the needs of users. The analytics can be directly descriptive: for example, a student can see real-time information about the overall status of his or her studies, and a teacher can see the performance of the students on his or her course. Descriptive information can also be used for comparison. This allows a student to see how he or she has progressed relative to other students, or a teacher to see how a course implementation compares with previous rounds of the same course. Analytics also enables foresight. Data collected over a longer period may predict that a student who meets certain criteria is at risk of dropping out of a course, so that situational support can be provided. In addition, artificial intelligence can automatically provide students with feedback or exercises appropriate to their skill level. The list of examples is endless.
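As a rough sketch of the predictive use mentioned above, a rule-based at-risk flag might look like the following. The field names and thresholds are invented for illustration; in practice the criteria would be derived from historical course data rather than hard-coded.

```python
# Hypothetical sketch of a rule-based at-risk flag for a course.
# Field names and thresholds are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class CourseActivity:
    logins_last_30_days: int
    assignments_returned: int
    assignments_due: int

def at_risk(activity: CourseActivity) -> bool:
    """Flag a student for situational support based on simple criteria."""
    missing = activity.assignments_due - activity.assignments_returned
    return activity.logins_last_30_days < 3 or missing >= 2

# Low activity and two or more missing assignments both trigger the flag.
print(at_risk(CourseActivity(logins_last_30_days=1,
                             assignments_returned=2,
                             assignments_due=5)))   # True
print(at_risk(CourseActivity(logins_last_30_days=10,
                             assignments_returned=5,
                             assignments_due=5)))   # False
```

A production system would typically replace the fixed thresholds with a model fitted to earlier cohorts, but even this simple form shows the idea: turning collected traces into a timely signal for guidance.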

Finally, how data is presented to users in the form of various results and reports is key to the successful exploitation of learning analytics.3 The goal of visualisation is to present the information and recommendations produced by learning analytics reporting as clearly as possible to users.4, 5 Two examples of learning analytics results are presented below.


Janne Mikkola,

University of Turku


1 Madden, M., Fox, S., Smith, A., & Vitak, J. (2007). Digital footprints: Online identity management and search in the age of transparency.

2 Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.

3 Auvinen, A. (2017). Oppimisanalytiikka tulee – Oletko valmis? Suomen eOppimiskeskus Ry.

4 Brown, M. (2012). Learning analytics: Moving from concept to practice. EDUCAUSE Learning Initiative, 1-5.

5 Reyes, J. A. (2015). The skinny on big data in education: Learning analytics simplified. TechTrends, 59(2), 75-80.