Blog

The policy guidelines of learning analytics direct and promote its meaningful use

Why and how has the policy of learning analytics been built?

Forms and opportunities for learning in universities are becoming increasingly diverse. At the same time, the amount of data collected from students' activities and from teaching is growing and diversifying. By analysing these data, one can better understand learning opportunities as well as bottlenecks. Learning analytics is a fast-growing field that refers to the collection, measurement, analysis, and reporting of information accumulated about a learner, with the purpose of understanding and optimizing learning and learning environments (Siemens 2013).

A key objective of the AnalyticsAI project is to provide tools that support learning analytics practices. One of those tools is a learning analytics policy, which contains carefully weighed, transparent and accepted principles, guidelines and decisions that ensure the meaningful use of analytics. The policy is a multifaceted tool that must consider the strategic, pedagogical, ethical and legal, as well as technical and data-related aspects of learning and teaching. In essence, the learning analytics policy supports students and staff in the comprehensible, consistent and responsible use of learning analytics. The AnalyticsAI policy work has been led by Aalto University, whose groundwork and policy document will serve all project stakeholders. The policy document, prepared at Aalto in the first phase, will in the next phase be tailored to the specific needs of the other universities.

At Aalto University, the policy assignment was validated by the Learning Steering Group, and the policy working group set up for this purpose consisted of learning services (learning and teaching services and processes, pedagogy, and responsibility for learning analytics), management information services (reporting and data maintenance, development and management), IT services (development of learning and teaching systems and services, development and management of data and university analytics), academic legal services (data protection and other legal matters), and representatives of teachers and students. This broad representation has proven to work: the policy must support the university's many service processes and operating practices, including those under development, which is why policy development must be closely linked to the university's various areas of expertise.

The policy alignment work began with specifying, on the basis of an extensive international literature review, the principles of learning analytics that we want to follow. These served as a basis for the actual themes of the policy, through which the perspectives on the use of learning analytics were refined. Key examples have been the SHEILA project network's R.O.M.A. (Rapid Outcome Mapping Approach) method, and in particular the guidelines and themes developed by the JISC and ORLA communities. We have drawn on several international policies, of which a comprehensive list can be found here.

The following guiding principles were selected for the use of learning analytics:

  1. Transparency of learning analytics objectives and practices, and the right to influence the processing of one's own personal data: the collection, use, and sharing of learning analytics data are based on transparent criteria and on decisions about the benefits and uses of learning analytics;  
  2. University values and strategy as a basis for learning analytics: The use and development of learning analytics is guided by the university's values and strategy;  
  3. Impartiality: learning analytics aims to understand the needs of diverse groups of students and provide them with support and guidance in a proactive and timely manner; 
  4. Improving quality for different stakeholders: Students can use learning analytics to streamline their studies; teaching staff to evaluate and develop teaching; leaders of degree programs as well as university management to support leadership and to improve the quality of teaching; 
  5. Furthering a positive learning experience: learning analytics provides content and pathways for the student to support his/her own personal plan and well-being; 
  6. Personal support and feedback: learning analytics can be used to identify students' learning needs and provide personal support; 
  7. Learning analytics complemented by teacher and tutor support: we understand that learning analytics provides only a partial picture of students' performance, activity, well-being, and other factors. Support measures based on the results of learning analytics are therefore the product of human decision-making. Learning analytics complements face-to-face and web-based forms of interaction; 
  8. Critical review of data and algorithms: we recognize that data and algorithms may contain errors. We work systematically to correct incomplete data and erroneous algorithms, as well as the inferences and impacts that result from them; 
  9. User-centric development of learning analytics: the development and use of learning analytics is based on the user needs perspective of different groups of actors at the university;
  10. Digital skills development: the use of learning analytics supports students' and staff's understanding and ability to function in digital environments.

The actual policy themes include more detailed guidelines and are accompanied by other existing policies of the university (for example, the Privacy Policy) and documents (for example, the Data Management Plan and the Privacy Statement).

Themes of learning analytics policy: 

  1. Areas of learning analytics and liability issues; 
  2. Data protection principles in learning analytics; 
  3. Ensuring the validity of learning analytics data and results; 
  4. Access to analytics results and data; 
  5. Justifying and enabling positive interventions; 
  6. Identifying and considering the detrimental effects of learning analytics. 

The deployment of learning analytics is under way at universities, and policy guidelines are an essential tool in steering this work. The policy is also a living document that needs to be refined as new opportunities arise and as experience and analytics accumulate.

Jiri Lallimo

Aalto University

References

JISC, UK (2015). Code of Practice for Learning Analytics. (Retrieved 18 October 2019) 
https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf

SHEILA project, Supporting Higher Education to Integrate Learning Analytics. (Retrieved 18 October 2019) https://sheilaproject.eu/

Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. 

Blog

Learning analytics: safely in the present, open-mindedly into the future and beyond

At the turn of the year, most universities and higher education institutions analyse how the previous year's performance targets were met, by degree level or by other selected criteria. During the same period, many educational institutions are launching the following year's academic degree structures and preparing the various teaching modules. In how many institutions have the results been predicted using the traditional approach, for example by assuming that 70% of all beginners will graduate on time? And should this prediction model continue to guide the implementation of teaching for the coming academic year, as usual?

It is somewhat baffling that, at the same time as the AnalyticsAI project is being implemented, migration to SISU and Peppi is under way, so that part of the prediction model development is likely to consist of learning the features of the new systems rather than developing the actual analytics tools. Why doesn't everyone have the same system in place? The recently adopted Data Protection Act adds its own challenge to the overall picture. As a result, even in the development of analytics we have had to be very cautious about developing "real" forecasting models.

Fortunately, we have jointly identified the risks described above in the project, and we are able to look at the whole picture openly by first identifying and recognizing the prediction models already in place, at least at some universities. Only then will we develop new, appropriate analytics tools. In doing so, we will avoid building something that is already available in SISU or Peppi. To support the development of analytics tools, it is thus reasonable to think in terms of a three-part timeline: the existing tools, tomorrow's tools, and the tools for the prediction model beyond tomorrow.

At the moment we can quite reliably extract from SISU, for example, student-specific PSPs (personal study plans), study load data by period, and the number of graduates by educational product. However, these do not really predict anything; they describe what has already happened. With a bit of work, though, it is relatively easy to retrieve follow-up data for each student or group of students by year of study, and to estimate the number of degrees to be achieved by educational product, anticipating developments over the next 3-4 years. These examples illustrate the attempt to present analytics results as trends that better support the choice of measures required both in the progress of students and their studies, and in the management of educational products. Using SISU's features, it is also easy to generate system-based alerts for students and management alike if, for example, study time threatens to stretch or degree goals do not appear to be met. In such cases, corrective action can be taken immediately, rather than analysing after the fact what might have gone wrong.
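To make the idea concrete, here is a minimal sketch, in Python, of the kind of trend-based degree forecast and study-time alert described above. All data, product names, and thresholds are invented placeholders; real figures would come from SISU, and this is not the project's actual tool.

```python
# A minimal, hypothetical sketch of trend-based forecasting and
# study-time alerts. The data, products, and thresholds are invented
# placeholders; real figures would come from SISU.

from statistics import mean

# Graduates per educational product over the past four academic years.
graduates = {
    "BSc Mechanical Engineering": [52, 55, 61, 58],
    "MSc Software Engineering": [30, 34, 33, 39],
}

def linear_trend_forecast(history, years_ahead):
    """Fit a least-squares line to yearly counts and extrapolate."""
    xs = list(range(len(history)))
    x_bar, y_bar = mean(xs), mean(history)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, history))
    slope /= sum((x - x_bar) ** 2 for x in xs)
    intercept = y_bar - slope * x_bar
    n = len(history)
    return [intercept + slope * (n - 1 + k) for k in range(1, years_ahead + 1)]

# Anticipate degree counts for the next three years, per product.
for product, history in graduates.items():
    forecast = linear_trend_forecast(history, years_ahead=3)
    print(product, "->", [round(f) for f in forecast])

def study_time_alert(credits_earned, years_enrolled,
                     degree_credits=300, target_years=5.0):
    """Flag a student whose projected completion time exceeds the target."""
    if credits_earned == 0:
        return True  # no progress at all: always flag
    pace = credits_earned / years_enrolled  # credits per year
    projected_years = degree_credits / pace
    return projected_years > target_years

# Example: 80 credits after two years projects to 7.5 years for a
# 300-credit degree, so the alert fires.
print(study_time_alert(credits_earned=80, years_enrolled=2))  # True
```

The point of the sketch is the shift it illustrates: the same registry data that today only describes what has happened can, with a simple extrapolation and an alert rule, support timely intervention instead of after-the-fact analysis.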

Towards the end of the project, we will probably already be prepared to present a tool to support the design of educational products featuring the "day after tomorrow's" forecasting model, which will allow employer feedback and labor market skills needs to be incorporated into the productivity analysis of an educational product. As we have a strong view of the project's future, we think it is more than justified to continue the development work even beyond "the day after tomorrow". Hopefully, the project sponsor will share our vision when the ideation of follow-up projects becomes topical.

Katriina Mielonen and Harri Eskelinen

Lappeenranta-Lahti University of Technology LUT

Blog

Past and future steps

Less than a year now remains until the conclusion of the AnalyticsAI project. At the end of January, we met with project partners in Turku to review the current issues of the project. 

To date, much has happened during the project's one and a half years of operation. Work began in the fall of 2018 with workshops, held at the project's universities across the country, to identify student and staff user needs. The workshops were followed by a survey in the spring of 2019, which further complemented the information on data analytics needs.  

Based on the data collected, the project has started to produce its own AnalyticsAI application, as well as analytical tools for the University of Oulu's in-house teacher-tutors. In addition, research has also been conducted on the legal issues of analytics, university-level policy work, risk identification, and the predictive use of analytics.

At the meeting in Turku, the focus was on the next steps, with particular emphasis on the ongoing application development and piloting. During the current year, we will focus on piloting the project application. In the spring, we will start with the student's view of the application, while the views of instructors and management will be piloted in the autumn. Pilot participants from the different universities will be recruited, and the target groups informed, as the pilots become relevant. 

We also want to disseminate the results of the project to a wider audience and, as such, we will be conducting three webinars in 2020. The topics and times are listed below; more detailed links will be published on these pages as well as on the project's other channels. 

  • Privacy Policy, Risk Assessment and Impact Assessment, 21.4.2020, 2:00–3:00 pm 
  • AnalyticsAI app, Student Dashboard, 27.5.2020, 1:00–2:00 pm 
  • Study path as a service path - Perspectives on analytics, 18.8.2020, 1:00–2:00 pm 

In addition, an open networking meeting on learning analytics will be held in conjunction with the Oulu Pedaforum on August 19, 2020. Welcome aboard! 

Janne Mikkola 

University of Turku 

Blog

Identifying the risks of learning analytics

Inherent in the concept of learning analytics is the collection and use of information generated by a student's activities, in a variety of contexts, without the student having to produce it consciously and actively at any stage. The university, as a controller of personal data, must assess the risks associated with data processing in the context of learning analytics in order to contain them and to ensure the proper handling of the data.

A study carried out as part of the AnalyticsAI project, "Learning analytics and student data processing at the university" (Ouli & Voutilainen 2019), examines a wide range of legal issues related to the use of learning analytics in university education, especially from the perspective of a university student. Also included in the study report is a simplified risk assessment model for data processing in different learning analytics use cases. Based on this model, we have prepared a separate tool for carrying out the risk assessment, which is intended to be published openly online together with guidelines for users. The format of the tool to be published is still being elaborated. Creating a new technological tool is made more challenging by the fact that the related legislation is fairly recent; as a result, there are relatively few guidelines or use cases on the subject.

When considering the risk assessment of learning analytics, different risk factors carry different weights and so pose risks of different magnitudes. For example, one major risk factor is whether the use of learning analytics enables automated decisions concerning students. According to Ouli and Voutilainen's report, there is no legislation that would allow the use of automated decision-making in learning analytics use cases. Similar issues have arisen in the public debate on automated decision-making at Kela and the tax authorities.
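The following minimal sketch, in Python, illustrates one way weighted risk factors could be combined into an overall score for a learning analytics use case. The factors, weights, and thresholds are invented for illustration; they are not the project's actual assessment model.

```python
# A minimal, hypothetical sketch of weighted risk scoring for a learning
# analytics use case. The factors, weights, and thresholds are invented
# for illustration; they are not the project's actual assessment model.

RISK_FACTORS = {
    # factor key: (weight, yes/no question for the assessor)
    "automated_decisions": (5, "Does the use case enable automated decisions about students?"),
    "sensitive_data":      (4, "Does it process special categories of personal data?"),
    "profiling":           (3, "Does it profile or score individual students?"),
    "third_party_sharing": (3, "Is data shared with parties outside the university?"),
    "large_scale":         (2, "Is processing carried out on a large scale?"),
}

def assess(answers):
    """Sum the weights of factors answered 'yes' and classify the use case."""
    score = sum(weight for key, (weight, _) in RISK_FACTORS.items()
                if answers.get(key, False))
    if score >= 8:
        return score, "high risk: escalate and consider a formal impact assessment"
    if score >= 4:
        return score, "medium risk: review with data protection experts"
    return score, "low risk: document and monitor"

# Example: a dashboard that profiles students on a large scale but makes
# no automated decisions about them.
print(assess({"profiling": True, "large_scale": True}))
# -> (5, 'medium risk: review with data protection experts')
```

Note how a single heavily weighted factor, such as enabling automated decisions, can by itself push a use case toward the high-risk category, which mirrors the point made above.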

The risk assessment tool being prepared in the project does not automatically answer the question of whether the controller must carry out an impact assessment under the Data Protection Regulation. It does, however, provide a basis for addressing specific issues in learning analytics and highlights the key risks and perspectives to consider, which in any case must be assessed under the risk-based approach of the Data Protection Regulation.

It should also be noted that even if a data processing model appears sound for learning analytics from a risk assessment perspective, it must still be considered separately whether its implementation follows ethically sustainable practices and criteria, what principles are accepted for the use of analytics, and how the responsibilities of the various stakeholders are defined. In addition, according to Ouli and Voutilainen's report, learning analytics cannot be developed in universities from the point of view of data protection law alone, because, within the framework of the national leeway, the activity is also governed by general administrative legislation, most importantly the Administrative Procedure Act, the Act on the Openness of Government Activities, the Information Management Act, and the Universities Act.

Special Designer Tommi Haapaniemi, Study Services
Project Researcher Meri Sariola, Department of Law

University of Eastern Finland

Blog

Transparent analytics for different user groups

Analytics Intelligence user needs surveys started from the everyday observation that learning analytics and its use are not familiar to students or even to staff. It was of course clear to the respondents that various registration systems exist containing information about students, studies and training, yet respondents were not particularly familiar with the concept of learning analytics. For example, the University of Tampere does not have learning analytics operations in place. 

The project conducted user needs surveys in spring 2019 at six partner universities. There were five questionnaires in total, targeted at different user groups: students, teachers, tutor-teachers/tutors, study coordinators, and those responsible for education. The questionnaires were answered by 183 students and 170 staff members, and they revealed the different user groups' needs in utilising analytics intelligence in the context of learning. The open-ended questions, in particular, yielded a rich basis of data on how the university should use registry data and on the underlying ethical issues involved in its use. The open-ended questions were well received: about half of the respondents answered them. 

Both students and staff were notably unanimous about the use of study data at the university. Over half of the respondents said in their open answers that they would like the university to use registry data to design and develop teaching. In contrast, responses on ethical issues revealed different views across user groups. Students identified guidance, transparency, the impact and misuse of information, and the use of sensitive information as major ethical concerns. Most of the staff considered these same issues to be ethical issues, but surprisingly many (13.8% of the respondents) felt that there were no ethical issues in the use of registry information. This was an interesting result, as the question offered no ready-made answers: each respondent wrote his or her own views on the ethical issues involved in utilising registry information. 

The differences between the views of students and staff may be explained by their differing vantage points. Students may not have fully grasped the potential that learning analytics offers for their studies or for the university's general management. The staff, on the other hand, may take a very instrumental approach to utilising learning analytics: a student's own data disclosure is a personal matter, whereas staff partly see students as a faceless mass. Although there is no real contradiction, these views represent different aspects of analytics, such as privacy and productivity. The most important thing is that the various functionalities and their meaning are carefully thought through, well founded, ethically sound, and agreed by all parties. Based on the results, there is still much to be done in the domain of learning analytics. Users should be trained in analytics in general, and in particular in the use of the various services and applications. This should be accompanied by a discussion of both the ethical use and the rules of analytics. 

Hanna Lindsten

Jussi Okkonen

Tampere University