Blog

Transparent analytics for different user groups

The Analytics Intelligence user needs surveys started from the everyday observation that learning analytics and its use are not familiar to students or even to staff. Respondents were of course aware that various registration systems exist that contain information about students, studies and training, yet they were not particularly familiar with the concept of "learning analytics". The University of Tampere, for example, does not yet have learning analytics operations in place.

The project conducted user needs surveys in spring 2019 at six partner universities. There were five questionnaires in total, targeted at different user groups: students, teachers, tutor teachers/tutors, study coordinators and those responsible for education. The questionnaires were answered by 183 students and 170 staff members and revealed how the different user groups would make use of analytics intelligence in the context of learning. The open-ended questions, in particular, produced a rich body of data on how the university should use registry data and on the ethical issues involved in its use. The open questions were well received: about half of the respondents answered them.

Students and staff were largely unanimous about how study data should be used at the university. Over half of the respondents said in their open answers that they would like the university to use registry data to design and develop teaching. Ethical issues, by contrast, drew different views from different user groups. Students identified guidance, transparency, the impact and misuse of information, and the use of sensitive information as major ethical concerns. Most staff considered these same points to be ethical issues, but surprisingly many (13.8% of respondents) felt that there were no ethical issues in the use of registry information at all. This was an interesting research result, as the question offered no ready-made answer options: each respondent wrote down his or her own views on the ethical issues involved in utilising registry information.

The differences between the views of students and staff may be explained by their different vantage points. Students may not yet have fully grasped the opportunities that learning analytics could offer their studies or the university's overall management. Staff, on the other hand, may take a rather instrumental approach to utilising learning analytics. For a student, disclosing one's own data is a personal matter, whereas staff partly see students as a faceless mass. Although there is no real contradiction, the two views emphasise different aspects of analytics, such as privacy and productivity. What matters most is that the various functionalities and their purposes are carefully thought through, well founded, ethically sound and agreed by all parties. Based on the results, there is still much to be done in the domain of learning analytics. Users should be trained in analytics in general, and in particular in the use of the various services and applications. This should be accompanied by a discussion of both the ethical use of analytics and the rules governing it.

Hanna Lindsten

Jussi Okkonen

Tampere University

Blog, featured

Learning Analytics as a Studies Guidance Tool

At the University of Oulu, the personal tutor teacher (PSP teacher, teacher-tutor) supports students' study progress and guides them along their study paths. The personal tutor teacher is an important close contact for students during their university studies, and his or her duties include helping the student develop a personal study plan, tracking the student's progress, and guiding the student in career planning and choices. Personal tutor teachers are typically lecturers, university teachers or researchers in the same discipline who carry out tutoring alongside their own teaching and research duties.

As part of the Analytics AI project, the University of Oulu is developing analytics tools for personal tutor teachers, aimed at making it easier to monitor individual students' study progress in real time. The goal of the visualisations under development is to give the tutor teacher a clear picture of the student's progress in relation to the student's own study plan. The tools can be used, for example, when preparing for a counselling meeting, during the meeting itself, and more generally in following up on studies. As we develop new learning analytics tools, we are also researching and developing the practices through which this knowledge is put to use. Beyond building the tools, we need to understand who their users are, and for what purposes and in which situations the tools can be used.
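To make the idea concrete, the following is a minimal sketch, not the project's actual tool, of the kind of view described above: a student's accumulated credits plotted against the targets in their personal study plan. All data, semester labels and credit targets are hypothetical and purely illustrative.

```python
# Minimal sketch: cumulative completed credits vs. the personal study plan.
# The numbers and labels below are invented for illustration only.
import matplotlib.pyplot as plt

semesters = ["2018 autumn", "2019 spring", "2019 autumn", "2020 spring"]
planned_credits = [30, 60, 90, 120]   # cumulative targets from the personal study plan
earned_credits = [27, 55, 85, 118]    # cumulative credits actually registered

fig, ax = plt.subplots()
ax.plot(semesters, planned_credits, marker="o", linestyle="--", label="Personal study plan")
ax.plot(semesters, earned_credits, marker="o", label="Completed credits")
ax.set_ylabel("Cumulative ECTS credits")
ax.set_title("Study progress vs. personal study plan")
ax.legend()
plt.tight_layout()
plt.show()
```

A view of this kind could, for instance, be opened together during a counselling meeting as a shared starting point for discussion rather than as a monitoring instrument.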

Next, Oulu will test the functionality of a visualisation tool in the guidance of second-year students and their tutor teachers. The aim is to understand how the tool can serve as a conveyor of knowledge and a basis for discussion about study progress and the students' own study goals. An essential part of the pilot study is to create user instructions and guidance for both user groups on how to use the new tools to support study guidance. By collecting feedback on how comprehensible and meaningful the views are, we gain insight into different user experiences with the tool. For the technology to be introduced and deployed in a sustainable fashion, it is essential to understand its operating environment.

From the students' perspective, it is important to develop tools that, as part of their guidance and study practices, support them in making choices about their studies and planning. For joint meetings between students and tutor teachers, it is important that the visualisation tools promote high-quality student-tutor interaction rather than a technical review or data mining exercise. In this way, the tools can help establish genuine and meaningful moments of student-tutor interaction.

Authors

Anni Silvola and Riku Hietaniemi, University of Oulu

Blog, featured

Predicting the progress of studies in a changing context of learning outcomes

For whom do we predict?

Quite often we come across "prediction models" that are really only able to describe events dating years, even decades, back. We know precisely the number of students, the year of graduation, how long it took to complete the studies, and what the average grade or drop-out rate was for a given year. When there is a sufficiently large amount of such data, we generate averages that we believe can help predict future student progress, lead degree programmes, or support study guidance. In reality, however, we have lost sight of the individual whose progress we should be looking at, and at the same time we have forgotten that a real prediction model should deal with new individuals starting their studies.

Learning prediction models are often positioned as tools for study guidance, curriculum management, or university-level funding models. Surprisingly, the support a forecasting model could provide to the customer, the individual student, may be forgotten, and even more often the end-user perspective is forgotten: the forecasting model should be able to produce reliable information for developing educational products so that the industries waiting for graduating students gain access to experts with a suitable and up-to-date knowledge profile. When anticipating the progress of studies, we may currently be focusing too much on the credits produced, the number of graduates, the duration of studies, or the cost of teaching. A good prediction model would, in addition to these, be able to highlight the changes needed in the knowledge profile and would also consider the end user's perspective. A new challenge is the growing diversity of distance-learning programmes, in which students progress in digital environments according to their own schedules and whose content can be updated in real time at any point in the calendar year. Regardless of the learning environment, the prediction model should take into account that someone needs these graduating students!
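The contrast drawn above, between cohort averages that describe the past and a model applied to a newly started individual, can be illustrated with a minimal sketch. Everything here is invented for illustration; none of it is the project's actual forecasting model, and a real model would use far richer features than credit counts alone.

```python
# Minimal sketch: a backward-looking cohort average versus an individual-level
# extrapolation for a new student. All numbers are hypothetical.
import numpy as np

# Historical cohort: credits completed per semester (rows = past students).
cohort = np.array([
    [30, 28, 32, 30],
    [22, 18, 25, 20],
    [35, 33, 30, 34],
])
print(f"Cohort average: {cohort.mean():.1f} credits/semester")  # describes the past

# Individual forecast: fit a trend to one new student's own first semesters
# and extrapolate the time needed to reach a 180-credit bachelor's degree.
new_student = np.array([20, 24, 27])                  # credits in first three semesters
trend = np.polyfit(np.arange(len(new_student)), new_student, 1)
pace = max(np.polyval(trend, len(new_student)), 1.0)  # predicted pace next semester
remaining = 180 - new_student.sum()
print(f"Estimated semesters to graduation: {remaining / pace:.1f}")
```

The point of the sketch is only that the cohort average says nothing about this particular student, whereas even a crude individual-level extrapolation starts from the student's own path.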

Does the system support our operations or does it control what we do?

A classic question when developing information systems is how much flexibility the system should have to suit the needs of different users. From a purely analytical and statistical point of view, a strictly rule-based and locked-down system would be the easiest, but there are at least two big variables in the prediction model that require system flexibility. First, the student's background and life situation make each student an individual who, in any case, follows his or her own path. Secondly, the degree structures underlying the forecasting model, and the courses included in them, will, at least hopefully, evolve to meet the needs of the end user mentioned above. The system behind the forecasting model must therefore allow and recognise a wide range of flexibility: individual choices, changes and updates in degree programme structures, study modules with variable scopes, and study paths originally designed to be of varying lengths. It would be unsatisfactory if the system were so rigid that students were forced into a particular template or that flexibility in their studies became "system controlled". Allowing flexibility undoubtedly makes it harder to build prediction model algorithms, but only by allowing flexibility can one produce forecast information that serves the entire chain from the customer to the end user.
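As a rough illustration of what such flexibility might mean at the data-model level, here is a minimal sketch under our own assumptions: variable-scope study modules, versioned degree structures and individually chosen study paths. The class and field names are illustrative only and are not taken from any of the project's systems.

```python
# Minimal sketch of a flexible degree/plan data model. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class StudyModule:
    name: str
    min_credits: int            # modules may have a variable scope
    max_credits: int
    optional: bool = False      # individual choices are allowed, not forced

@dataclass
class DegreeStructure:
    programme: str
    version: str                # structures evolve; keep each curriculum version
    modules: list[StudyModule] = field(default_factory=list)

@dataclass
class PersonalStudyPlan:
    student_id: str
    structure: DegreeStructure
    selected: dict[str, int] = field(default_factory=dict)  # module name -> planned credits
    target_semesters: int = 7   # study paths may be designed to differing lengths

    def planned_total(self) -> int:
        """Total credits the student has planned across chosen modules."""
        return sum(self.selected.values())
```

A forecasting model built on top of a structure like this can follow the student's own plan and its version of the curriculum, rather than forcing everyone into a single fixed template.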

To meet the needs described above, the description of the AnalyticsAI project defines, for example, the following: “The final phase of the project will provide a set of non-organizational and generic definitions of common ERP-critical information content and an e-PSP (electronic Personal Study Plan) prediction model”.

Let’s keep this in mind as we move forward with the project together.

The authors

Harri Eskelinen & Terho Lassila

Lappeenranta-Lahti University of Technology LUT

Blog, featured

Learning analytics as an evaluation tool - Privacy, Legal Security and Liability

Utilising artificial intelligence applications to support student learning, student guidance, learning assessment, and knowledge management can change university teaching procedures and practices. Legally, the use of artificial intelligence in student guidance and learning assessment involves a number of problem areas for which there is no legal solution yet.

The Data Protection Regulation sets out the conditions under which personal data may be processed, as well as the restrictions related to the profiling of data subjects and automated decision-making. The processing of personal data must have a legal basis in accordance with the Regulation. For example, where processing is necessary to comply with a statutory obligation of the university, there is an appropriate legal basis for processing personal data. Universities have such statutory obligations under the Universities Act: they are, for example, tasked with providing research-based higher education and arranging teaching and study guidance so that students can complete their studies full-time within the stipulated target completion period. The use of learning analytics in universities to meet these obligations is permitted on this legal basis.

Using learning analytics in practice requires that students be profiled on the basis of their knowledge. Under the Data Protection Regulation, "profiling" means any automated processing of personal data in which personal data are used to assess certain personal characteristics of a natural person. Analysing student learning is therefore profiling within the meaning of the Regulation, as a student's ability to learn is a personal attribute that learning analytics assesses. Profiling is not categorically prohibited by the Data Protection Regulation, but it must meet the criteria for processing personal data laid down in the Regulation, which means respecting purpose limitation, data minimisation, transparency of processing and the data subject's rights. In many cases the data subject has the right to object to profiling, but no such right exists if the processing is based on fulfilling the data controller's statutory obligation, such as the provision of teaching and guidance under the Universities Act.

Learning analytics can also be used to evaluate students' learning outcomes. In that case, learning analytics may amount to automated decision-making within the meaning of the Data Protection Regulation, which is in principle prohibited. In universities, learning analytics can therefore be used in student assessment only if a teacher exercises effective control over the final outcome of the evaluation or if automated assessment is specifically provided for by law.

There are issues beyond personal data protection in the automated assessment of student learning. For the benefit of the student, the Universities Act provides legal safeguards related to the assessment of study attainments. These safeguards cannot be compromised when utilising learning analytics. The assessment of learning is also an exercise of public authority, to which official liability is attached. Under current legislation, official responsibility for automated analysis cannot be outsourced; ultimately, the teacher concerned remains responsible for the evaluation.

The technological preconditions for developing learning analytics exist, but the operational and legal preconditions are still finding their place in the development of artificial intelligence. Legal challenges are not necessarily a barrier to using learning analytics in student guidance and assessment, but development work requires careful design of work processes to safeguard students' personal data and legal protection, without forgetting the teacher's own legal protection.

The authors

Tomi Voutilainen and Juuso Ouli

University of Eastern Finland

Blog

Learning analytics and the practices that support it

The AnalytiikkaÄly project develops learning analytics and the practices that support it, which universities use to support smooth studying at different stages of the study path. During autumn 2018 we surveyed user needs among students, personal tutor teachers, and representatives of faculty and university administration. In spring and summer 2019 we move on to developing applications, which will be piloted from autumn 2019 onwards.

Learning analytics means using the data accumulated from learning and studying as analysed feedback for different user groups. Analytics has long been used in many fields, but it has been applied to optimising education and learning for only about ten years, and especially in the very last few years. Because learning analytics is based on the digital traces students leave behind, the use of analytics data is closely connected to the digitalisation of education, that is, to the information systems and digital learning environments that universities are currently adopting at a growing rate.

Our project focuses in particular on the perspectives of study guidance, study planning, progress monitoring and support, and management. At present, most tools that use learning analytics have been developed to support the optimisation of learning and studying within individual courses. In the AnalytiikkaÄly project we focus on supporting the overall study path over a longer time span.

An essential part of using learning analytics consists of the related legal questions, such as data protection, and the ethical perspectives, for example who at the institution can see a student's data and what we show to the students themselves. Building on the needs and experiences of different user groups, we create operating models for applying learning analytics. In the use of learning analytics, students' privacy, the responsibility and transparency of data collection and its purposes, as well as data storage and the various analysis methods, have emerged as important considerations.

Writing about learning analytics in March was

Anni Silvola,

University of Oulu