Blog

Service design helps to outline the context of use of learning analytics

Utilising learning analytics as part of a student's study path is a broad undertaking, and many external factors in the student's environment and study path can also affect the analytics user experience. During spring 2020, the AnalytiikkaÄly project has applied a service design perspective to outlining learning analytics as part of the student's study path.

Service design focuses on developing and designing services, systems and processes through a human-centred perspective and systemic design thinking. Service design methods help to perceive services from individual service encounters all the way to the structures of broad service ecosystems. At the centre of understanding and designing a service are people, such as users, service providers, partners and other stakeholders, as well as their needs and wishes regarding the service. User needs in particular are examined and understood in depth so that the service being developed meets those needs as comprehensively as possible.

In the service design process, the structures of a service are examined not only through the people linked to them, but also through service environments, operating models, service processes and service channels such as user interfaces. From these "touchpoints" of the service, a continuum of service encounters, i.e. a service path, can be constructed that builds as comprehensive and purposeful a service experience for the user as possible. The service path is a tool for examining service encounters critically and for assessing how parts of the service can be developed in the right direction so that users' needs are met throughout the service path.

In the AnalytiikkaÄly project, the student's study path has been outlined as a service path in order to understand the context of learning analytics through the student's eyes. The service path, which is based on the needs assessment collected in the project, connects the AnalytiikkaÄly application to the student's study path from applying for studies all the way to the time after graduation, and highlights needs and challenges to which the application's functions can provide solutions along the path.

A human-centred examination of learning analytics as part of the student's study experience can provide a better understanding of the context of learning analytics, the use of its applications in relation to the study path, and the external factors affecting it, such as schedule changes in studies caused, for example, by working alongside studies or by family reasons. Such changes are difficult to anticipate by means of analytics, but they are human factors that the project has illustrated, using service design methods, as part of the service path during studies.

The service design perspective on learning analytics will be discussed further in the AnalytiikkaÄly webinar "Study path as a service path - Perspectives on analytics" on Tuesday 18 August at 13:00-14:00. More information about the project's webinar series is available here.

Titta Jylkäs

University of Lapland

Blog

Can analytics provide support and solutions in challenging situations?

The coronavirus has forced many educational institutions, from basic education to higher education, to take quite a digital leap in recent days. On a very fast schedule we switched to distance learning and working. Some higher education institutions have reported moving to online learning in just a few weeks instead of the previously planned months. Common worries of teachers and administrators in this situation are securing the continuation of the learning process and ensuring that learning outcomes are achieved in digital environments. At the same time, learning from home in digital environments requires better self-management skills from students, as learning opportunities are more flexible and less supervised. Students who previously relied on help and guidance from learning support services may also have lost access to some of those resources.

In these challenging times, learning analytics can be used as the eyes and ears of the teacher and to support students' learning process. By tracking students' actions in learning environments and providing meaningful summaries and visualizations to teachers, learning analytics can help teachers keep a constant overview of what is happening in the digital classroom: which students are progressing with their assignments and which ones need more help and personal attention from the teacher.
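
As a rough illustration, the sketch below shows how such a teacher overview could be assembled from an activity log. The event format, field names and the seven-day inactivity threshold are hypothetical examples of ours, not features of the project's tools or of any particular learning environment.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical activity log exported from a learning environment:
# each event records a student, an action type and a timestamp.
events = [
    {"student": "s001", "action": "assignment_submitted", "time": "2020-04-01T10:15"},
    {"student": "s002", "action": "page_viewed",          "time": "2020-04-01T11:02"},
    {"student": "s001", "action": "page_viewed",          "time": "2020-04-03T09:40"},
]

def summarize_activity(events, now, inactive_days=7):
    """Build a per-student summary a teacher could scan at a glance."""
    last_seen = {}
    submissions = defaultdict(int)
    for event in events:
        t = datetime.fromisoformat(event["time"])
        student = event["student"]
        last_seen[student] = max(last_seen.get(student, t), t)
        if event["action"] == "assignment_submitted":
            submissions[student] += 1

    overview = []
    for student, seen in last_seen.items():
        overview.append({
            "student": student,
            "submissions": submissions[student],
            "days_since_last_activity": (now - seen).days,
            # Flag students who may need personal attention from the teacher.
            "needs_attention": (now - seen) > timedelta(days=inactive_days),
        })
    return overview

print(summarize_activity(events, now=datetime(2020, 4, 15)))
```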

Students, on the other hand, may benefit from more direct learning analytics support in the form of content suggestions and reminders that facilitate time management. Learning analytics may help students keep track of their study path, providing a record of completed studies and a structured overview of which courses still have to be completed. All of the above could provide the support and sense of structure needed in one's studies in challenging situations.
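
To make the idea concrete, here is a minimal sketch of a student-facing progress overview and reminder. The degree structure, course codes and credit values are invented purely for illustration; they do not come from the AnalyticsAI application or any real curriculum.

```python
# Hypothetical degree structure (course code -> credits) and a record of
# completed courses; both are invented for illustration only.
degree_structure = {"MATH101": 5, "PROG110": 5, "STAT200": 5, "THESIS300": 30}
completed = {"MATH101", "PROG110"}

def progress_overview(degree_structure, completed):
    """Summarize completed credits and remaining courses, with a reminder."""
    done = sum(credits for course, credits in degree_structure.items()
               if course in completed)
    total = sum(degree_structure.values())
    remaining = sorted(set(degree_structure) - completed)
    return {
        "completed_credits": done,
        "total_credits": total,
        "remaining_courses": remaining,
        "reminder": f"{total - done} credits still to complete: {', '.join(remaining)}",
    }

print(progress_overview(degree_structure, completed))
```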

Just like everyone else, our project has adapted to the state of emergency in the country. Project meetings now take place virtually as well, so project partners can still collaborate successfully and do their work remotely from home. The piloting sessions we had planned for this spring have been moved entirely online, and we will try to reach students online so that they can try out and evaluate first the simulation and then the student dashboard. Teacher tutor piloting has also moved online, as teacher tutors now have instructions to carry out tutoring sessions with students using digital communication tools.

With our project team, we are very happy to help create and develop tools and ways to utilize learning analytics that can benefit students and teachers both in their everyday work and in extraordinary circumstances. Although the situation is hard for all educational communities, there are resources and opportunities that teachers, students and researchers can utilize to succeed in their endeavors.

Egle Gedrimiene and Henna Määttä

University of Oulu

Blog

Learning analytics webinars coming soon!

During 2020, the AnalyticsAI project will hold three webinars on learning analytics. The purpose of the webinars is to share the results of the project with a wider audience and to bring out different perspectives related to learning analytics. The webinars will be held in Zoom, and you can find the registration links below. Come along to listen and discuss!

Registration for webinars: bit.ly/AIwebinars

More information about the webinars:

Privacy Policy, Risk Assessment and Impact Assessment

Tommi Haapaniemi (UEF), Meri Sariola (UEF), Jiri Lallimo (Aalto), Viivi Väisänen (UH)

As part of the AnalyticsAI work on the legal aspects of learning analytics, Viivi Väisänen's presentation deals with data protection principles and impact assessment through a case study at Aalto University. The presentation shows the main conclusions of the case study and, as a concrete example, an impact assessment of the Moodle drop-out rate. The case study has also informed the work on the AnalyticsAI policy design.

Under the leadership of the University of Eastern Finland, the project has developed a simplified risk assessment model for the processing of personal data that can be applied to different use cases of learning analytics. The presentation introduces the risk assessment model and the key risks and perspectives that need to be considered, on the basis of the Data Protection Regulation, when implementing learning analytics.

AnalyticsAI app, Student Dashboard

Heikki Hyyrö and Sami-Santeri Svensk (TU)

Under the leadership of the University of Tampere, the project has developed an application that utilizes learning analytics for the use of students, instructors and those responsible for education. The presentation concretely addresses the key elements of the application and in particular how the application can support students with the smooth progression of their studies.

Study path as a service path - Perspectives on analytics

Titta Jylkäs and Essi Kuure (ULapland)

What development opportunities can we identify when we look at a student's study path as a service path? The presentation opens up a people-oriented approach to learning analytics through the basics and methods of service design. By identifying the student's service path, the benefits of learning analytics can be targeted at studies in a timely manner and thus provide the student with concrete benefits for the progression of their studies. Through service design and learning analytics, we can outline the study path as a whole and offer students the value of a targeted service.

Blog

The policy guidelines of learning analytics direct and promote its meaningful use

Why and how has the policy of learning analytics been built?

Forms of and opportunities for learning in universities are becoming more diverse. At the same time, the amount of data collected from students' activities and from teaching is increasing and diversifying. By analysing the data, one can better understand learning opportunities as well as bottlenecks. Learning analytics is a fast-growing area that refers to the collection, measurement, analysis, and reporting of information accumulated about a learner with the purpose of understanding and optimizing learning and learning environments (Siemens 2013).

The key objectives of the AnalyticsAI project are to provide tools to support learning analytics practices. One of those tools is the learning analytics policy, which contains carefully weighed, transparent and accepted principles, guidelines and decisions that ensure the meaningful use of analytics. The policy is a multifaceted tool that must consider the strategic, pedagogical, ethical and legal, as well as the technical and data-related aspects of learning and teaching. In essence, the learning analytics policy supports students and staff in the comprehensible, consistent and responsible utilisation of learning analytics. The AnalyticsAI policy work has been prepared under the leadership of Aalto University, whose groundwork and policy document will serve all project stakeholders. The policy document, drafted at Aalto in the first phase, will in the next phase be tailored to the specific needs of the other universities.

At Aalto University, the policy assignment was validated by the Learning Steering Group, and the policy working group set up for this purpose consisted of learning services (learning and teaching services and processes, pedagogy and responsibility for learning analytics), management information services (reporting and data maintenance, development and management), IT services (development of learning and teaching systems and services, development and management of data and university analytics), academic legal services (data protection and other legal matters), and representatives of teachers and students. This representativeness has proven to work. The policy must support the university's many service processes and operating practices, including those under development, which is why policy development must be closely linked to the university's various areas of expertise.

The policy alignment work has progressed in such a way that we first specified, based on an extensive international literature review, the principles of learning analytics that we want to follow. These served as a basis for the actual themes of the policy, through which the perspectives on the use of learning analytics were refined. Key examples have been the SHEILA project network's R.O.M.A. method (Rapid Outcome Mapping Approach), and in particular the guidelines and themes developed by the JISC and ORLA communities. We have taken advantage of several international policies, a comprehensive list of which can be found here.

The following guiding principles were selected regarding the use of learning analytics:

  1. Transparency of learning analytics objectives and practices and the right to influence the processing of one's own personal data: the collection and use of learning analytics data, its sharing and the ethical use of data are based on transparent criteria and decisions about the benefits and uses of learning analytics;  
  2. University values and strategy as a basis for learning analytics: The use and development of learning analytics is guided by the university's values and strategy;  
  3. Impartiality: learning analytics aims to understand the needs of diverse groups of students and provide them with support and guidance in a proactive and timely manner; 
  4. Improving quality for different stakeholders: Students can use learning analytics to streamline their studies; teaching staff to evaluate and develop teaching; leaders of degree programs as well as university management to support leadership and to improve the quality of teaching; 
  5. Furthering a positive learning experience: learning analytics provides content and pathways for the student to support his/her own personal plan and well-being; 
  6. Personal support and feedback: learning analytics can be used to identify students' learning needs and provide personal support; 
  7. Learning Analytics with the help of teacher and tutor support: we understand that the use of learning analytics provides only a partial picture of students’ performance, activity, well-being, and other factors. Therefore, support measures based on the results of learning analytics are the product of human decision making. Learning analytics complements forms of face-to-face and web-based interaction; 
  8. Critical review of data and algorithms: we recognize that data and algorithms may contain errors. We work systematically to correct incomplete data, erroneous algorithms, and flawed inferences and their impacts; 
  9. User-centric development of learning analytics: the development and use of learning analytics is based on the user needs perspective of different groups of actors at the university;
  10. Digital skills development: the use of learning analytics supports students' and staff's understanding and ability to function in digital environments.

The actual policy themes include more detailed guidelines and are accompanied by other existing policies of the university (for example, the Privacy Policy) and documents (for example, the Data Management Plan and the Privacy Statement).

Themes of learning analytics policy: 

  1. Areas of learning analytics and liability issues; 
  2. Data protection principles in learning analytics; 
  3. Ensuring the validity of learning analytics data and results; 
  4. Access to analytics results and data; 
  5. Justifying and enabling positive interventions; 
  6. Identifying and considering the detrimental effects of learning analytics; 

Learning analytics deployment is under way at universities. Policy guidelines are an essential tool in guiding this work. The policy is also a living document that needs to be refined as new opportunities arise and as experience and analytics accumulate.

Jiri Lallimo

Aalto University

References

JISC, UK (2015). Code of Practice for Learning Analytics. (Retrieved 18 October 2019)
https://www.jisc.ac.uk/sites/default/files/jd0040_code_of_practice_for_learning_analytics_190515_v1.pdf

SHEILA project: Supporting Higher Education to Integrate Learning Analytics. (Retrieved 18 October 2019) https://sheilaproject.eu/

Siemens, G. (2013). Learning Analytics: The Emergence of a Discipline. American Behavioral Scientist, 57(10), 1380–1400. 

Blog

Learning analytics going safely into the present and open-mindedly into the future and beyond

At the turn of the year, most universities and higher education institutions analyse the previous year's performance targets by degree level or by other selected criteria. During that period, many educational institutions are launching the following year's academic degree structures and preparing the various teaching modules. In how many institutions have the results been predicted using the traditional approach, for example that 70% of all beginners will graduate on time? And should this prediction model continue to guide the implementation of teaching for the coming academic year, as usual?

It is somewhat baffling that, at the same time as the AnalyticsAI project is being implemented, migration to SISU and Peppi is under way, which means that part of the prediction model development is likely to consist of learning the features of the new systems rather than developing the actual analytics tool. Why doesn't everyone have the same system in place? The recently adopted Data Protection Act adds its own challenge to the overall picture. This, in turn, means that even in analytics development we have had to be very cautious about building “real” forecasting models.

Fortunately, in the project we have jointly identified the risks described above and are able to look at the whole picture openly by first identifying and recognizing the actual prediction models that are already in place and in use at least at some universities. Only then will we develop new, appropriate analytics tools. In doing so, we will avoid building something that is already available in SISU or Peppi. To support the development of analytics tools, it is thus reasonable to think in terms of a three-part timeline: the existing tools, tomorrow's tools, and the prediction models for the day after tomorrow.

At the moment we can quite reliably extract from SISU, for example, student-specific PSPs, study load data by period, the number of graduates by educational product, and so on. These, however, do not really predict anything but describe what has already happened. With a bit of work, though, it is relatively easy to retrieve the follow-up data for each student or group of students by year of study, and to estimate the number of degrees to be achieved by educational product, anticipating the development prospects of the next 3-4 years. These examples illustrate the attempt to present analytics results as trends that better support the choice of measures required both in the progress of students and their studies and in the management of educational products. Using SISU's features, it is also easy to generate system-based alerts for students and management alike if, for example, study time threatens to stretch or degree goals do not appear to be met. In such cases, corrective action can be taken immediately instead of an after-the-fact analysis of what might have gone wrong.
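
As a simple illustration of the trend-based alerts described above, the sketch below extrapolates a student's credit accumulation per period and flags a case where study time threatens to stretch. The numbers, the 300-credit target and the 20-period alert threshold are invented for the example; they are not taken from SISU or from the project's tools.

```python
def projected_periods_to_degree(credits_per_period, target_credits=300):
    """Estimate how many more periods are needed at the student's current pace."""
    completed = sum(credits_per_period)
    pace = completed / len(credits_per_period)  # average credits per period
    if pace == 0:
        return None  # no progress yet, so a pace-based projection is undefined
    remaining = max(target_credits - completed, 0)
    return remaining / pace

# Example: a student who completed 10, 12 and 8 credits in three periods.
history = [10, 12, 8]
projection = projected_periods_to_degree(history)

# Flag the student if the projected remaining time exceeds an agreed threshold.
if projection is None:
    print("No completed credits yet; a pace-based projection is not possible.")
elif projection > 20:
    print(f"Alert: about {projection:.0f} more periods needed at the current pace.")
else:
    print(f"On track: about {projection:.0f} more periods at the current pace.")
```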

Towards the end of the project, we will probably already be prepared to present a tool to support the design of educational products featuring the ”day after tomorrow's” forecasting model, which will allow employer feedback and labor market skills needs to be incorporated into the productivity analysis of an educational product. As we have a strong view of the project's future, we think it is more than justified to continue the development work even beyond ”the day after tomorrow”. Hopefully, the project's funder shares our vision when the ideation of follow-up projects becomes topical.

Katriina Mielonen and Harri Eskelinen

Lappeenranta-Lahti University of Technology LUT