Blog, featured

Predicting the progress of studies in a changing context of learning outcomes

For whom do we predict

Quite often we come across "prediction models" that are able to describe events dating years, even decades, back. We know precisely the number of students, the year of graduation, how long completing the studies took, and what the average grade or drop-out rate was for that year. When there is a sufficiently large amount of such data, we generate averages that we believe can help predict future student progress, lead degree programs, or support study guidance. In reality, however, we have lost sight of the individual whose progress we should be looking at, and at the same time forgotten that a real prediction model should deal with new individuals starting their studies.

Learning prediction models are often framed as tools for study guidance, curriculum management, or university-level funding. Surprisingly, the support the model offers to the customer, the individual student, may be forgotten, and even more often the end-user perspective is forgotten: the prediction model should be able to produce reliable information for the development of educational products, so that the industry that needs graduating students has access to experts with a suitable and up-to-date knowledge profile. As we currently anticipate the progress of studies, we may be focusing too much on the credits produced, the number of graduates, the duration of studies, or the cost of teaching. A good prediction model would be able to highlight, in addition to these, the changes needed in the knowledge profile, and would also consider the end user's perspective. A new challenge is the increasing diversity of distance-learning programs, where students progress in digital environments according to their own schedules and where content can be updated in real time at any point in the calendar year. Regardless of the learning environment, the prediction model should be able to take into account that someone needs these graduating students!

Does the system support our operations or does it control what we do

A classic question when developing information systems is how much flexibility the system should have to suit the needs of different users. From a purely analytical and statistical point of view, a strictly rule-based and locked system would be the easiest, but there are at least two big variables in the prediction model that require system flexibility. First, the student's background and life situation make each student an individual who, in any case, follows his or her own path. Second, the degree structures underlying the prediction model, and the courses included in them, will, at least hopefully, evolve to meet the needs of the end user mentioned above. Therefore, the system behind the prediction model must allow and recognize a wide range of flexibility regarding individual choices, changes and updates in degree programme structures, study modules of variable scope, and study paths originally designed to be of different lengths. It would be unsatisfactory if the system were so rigid that students were forced into a particular template or that flexibility in their studies were "system controlled". Undoubtedly, allowing flexibility makes it harder to build the prediction model's algorithms, but only by allowing flexibility can one produce forecast information that serves the entire chain from the customer to the end user.

To meet the needs described above, we have defined in the AnalyticsAI project description, for example, the following: "The final phase of the project will provide a set of non-organizational and generic definitions of common ERP-critical information content and an e-PSP (electronic Personal Study Plan) prediction model".

Let’s keep this in mind as we move forward with the project together.

The authors

Harri Eskelinen & Terho Lassila

Lappeenranta-Lahti University of Technology LUT

Blog

What is Learning Analytics?

In Higher Education Institutions, as in other organizations, users of various electronic systems constantly leave behind electronic traces, or data. When a student takes an electronic exam, the system records how long answering took and how many words the student wrote. Learning environments record information such as assignment submissions and logins. The course register, in turn, accumulates course scores and grades.

The data a person leaves behind can be divided into an active and a passive footprint. An active footprint is created when, for example, people write messages or leave feedback. The passive trace, on the other hand, consists of everything the user is unaware of leaving behind, such as timestamps and clicks.1
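As a concrete illustration, such traces could be stored as simple event records. The sketch below is a minimal, hypothetical Python example; the class and field names (TraceEvent, student_id, active, and so on) are assumptions made for illustration and do not come from any system mentioned in the text.

from dataclasses import dataclass
from datetime import datetime

# A minimal, hypothetical representation of a digital trace left in a
# learning environment. Field names are illustrative only.
@dataclass
class TraceEvent:
    student_id: str   # pseudonymised student identifier
    source: str       # e.g. "exam_system", "lms", "course_register"
    action: str       # e.g. "submit_answer", "login", "post_feedback"
    timestamp: datetime
    active: bool      # True = active footprint (content the user knowingly produced),
                      # False = passive footprint (timestamps, clicks, logins)

events = [
    TraceEvent("s001", "exam_system", "submit_answer", datetime(2019, 3, 1, 10, 15), active=True),
    TraceEvent("s001", "lms", "login", datetime(2019, 3, 1, 9, 58), active=False),
]

# Share of traces the student left without being aware of it
passive_share = sum(not e.active for e in events) / len(events)
print(f"Share of passive traces: {passive_share:.0%}")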

By definition, learning analytics is the process of gathering, measuring, analyzing and reporting learner-centered information for the purpose of understanding and optimizing learning and learning environments.2 Thus, learning analytics seeks to add value to information that was previously too laborious to process, in order to serve different user groups: students, teachers, tutors, and administration and management.

The potential for using analytics depends on what kinds of applications are built around it. Digital learning platforms collect data naturally, and many learning environments have built-in analytics capabilities. However, analytics can also be extended to cover, for example, library loans or even lecture attendance by adding electronic registration to lessons, for instance through a mobile application. In theory, data can be collected endlessly, so it is essential to identify what information is genuinely useful for developing learning processes.

Learning analytics can be utilized in many different ways to serve the needs of users. The analytics can be directly descriptive, whereby, for example, a student can see real-time information about the overall status of their studies, or a teacher can see the performance of the students on their course. Descriptive information can also be used for comparison: a student can see how they have progressed relative to other students, or a teacher can see how the current course implementation relates to previous rounds of the same course. Analytics also enables foresight. Data collected over a longer period of time may indicate that a student who meets certain criteria is at risk of dropping out of a course, so that they can be offered timely support. In addition, artificial intelligence can automatically provide students with feedback or exercises appropriate to their skill level. The list of examples is endless.
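To make the foresight example more concrete, the sketch below shows one way a dropout-risk estimate could be derived from historical course data. It is a minimal illustration only, assuming hypothetical features (logins, assignments returned, days since last login) and a hypothetical risk threshold, and using scikit-learn's logistic regression; it is not the model developed in the AnalyticsAI project.

# A minimal sketch of the "foresight" use case: estimating dropout risk
# from earlier course data. Features and threshold are hypothetical; a real
# model would be chosen and validated with the institution's own data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Historical data: [logins in first two weeks, assignments returned, days since last login]
X_history = np.array([
    [12, 3, 1],
    [2, 0, 14],
    [8, 2, 3],
    [1, 1, 20],
    [15, 4, 0],
    [3, 0, 10],
])
y_dropped_out = np.array([0, 1, 0, 1, 0, 1])  # 1 = dropped out of the course

model = LogisticRegression().fit(X_history, y_dropped_out)

# A current student whose traces match the "at risk" pattern gets flagged for support.
current_student = np.array([[2, 0, 12]])
risk = model.predict_proba(current_student)[0, 1]
if risk > 0.7:  # hypothetical threshold
    print(f"Estimated dropout risk {risk:.0%}: offer situational support")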

Finally, how data is presented to users in the form of various results and reports is key to the successful exploitation of learning analytics.3 The goal of visualisation is to present the information and recommendations produced by learning analytics as clearly as possible to the users.4, 5 Two examples of learning analytics results are presented below.

Author:

Janne Mikkola,

University of Turku

Sources

1 Madden, M., Fox, S., Smith, A. & Vitak, J. (2007). Digital Footprints: Online identity management and search in the age of transparency. https://www.pewinternet.org/2007/12/16/digital-footprints/

2 Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380-1400.

3 Auvinen, A. (2017). Oppimisanalytiikka tulee – Oletko valmis? [Learning analytics is coming – Are you ready?] Suomen eOppimiskeskus Ry. https://poluttamo.fi/2017/08/02/oppimisanalytiikka-tulee-oletko-valmis/

4 Brown, M. (2012). Learning analytics: Moving from concept to practice. EDUCAUSE Learning Initiative, 1-5.

5 Reyes, J. A. (2015). The skinny on big data in education: Learning analytics simplified. TechTrends, 59(2), 75-80.

Blog

Learning analytics and supporting practices

The AnalyticsAI project is developing learning analytics and supporting practices to help Higher Education Institutions support smooth learning at different stages of studies. During the fall of 2018, we identified user needs among students, teachers, and faculty and university administrators. In the spring and summer of 2019, we will be moving towards application development, which will be piloted starting in autumn 2019.

Learning analytics refers to the utilisation of data generated by learning and studying as feedback to different user groups. While analytics has been used in many fields for a long time, it has been applied to optimising education and learning only in the past ten years, and with increasing emphasis in the most recent years. Because learning analytics is based on the digital footprint of students, the use of analytics is closely linked to the digitalisation of education, i.e., to the information systems and digital environments increasingly used in Higher Education Institutions.

Our project focuses in particular on student guidance, study planning, progress monitoring and support, and leadership. Currently, most tools that utilise learning analytics have been developed to support learning and study optimisation during a single course. In AnalyticsAI, we focus on supporting the overall study path over the long term.

An integral part of the use of learning analytics is the related legal issues, such as data protection, and the ethical questions, such as to whom student information should be made available within the educational institution and what is shown to the student themselves. Based on the needs and experiences of different user groups, we create operating models for the application of learning analytics. Particularly important aspects of the use of learning analytics are student privacy, accountability and transparency in the collection and use of data, and the storage of data and the various methods of analysis.

Author

Anni Silvola,

University of Oulu