In this series of articles, the Educational Technology team will be providing an insight into existing practice using technology for learning and teaching at Falmouth University and various projects being undertaken within the sector.


Learning Analytics utilises the data that passes through University systems and can be a powerful tool for learning about students and their achievements. It involves collecting data to measure learning and the contexts in which learning takes place. This data is then analysed and put to use to optimise interactions and opportunities; improving engagement, experiences and ultimately results.

It cannot be as simple as throwing data into some predictive software and asking it what it thinks, though. Analysts and experts who know what they are looking for are best placed to interpret the data and align the results with institutional, teaching and student priorities.

An example of analysing an element of learning is measuring engagement. Overall, there are three levels of engagement that can be measured.

Behavioural
Students who are behaviourally engaged would typically comply with behavioural norms, such as attendance and involvement, and would demonstrate the absence of disruptive or negative behaviour.

Behavioural engagement can be measured by recording attendance at face-to-face sessions and access to online materials. This can be relatively easy to achieve as long as the monitoring tools are in place.
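As a rough illustration of the idea above, behavioural engagement could be reduced to a single score combining attendance with online access. The function below is a minimal sketch; the field names, weighting and 0-1 scale are all assumptions for illustration, not the schema of any real monitoring tool.

```python
# Illustrative sketch: a simple behavioural engagement score combining
# attendance at taught sessions with access to online materials.
# The 60/40 weighting and all inputs are hypothetical assumptions.

def behavioural_engagement(attended, total_sessions, vle_logins,
                           expected_logins, attendance_weight=0.6):
    """Return a 0-1 score: weighted mix of attendance rate and VLE activity."""
    attendance_rate = attended / total_sessions if total_sessions else 0.0
    # Cap online activity at 1.0 so heavy VLE use cannot mask poor attendance.
    vle_rate = min(vle_logins / expected_logins, 1.0) if expected_logins else 0.0
    return attendance_weight * attendance_rate + (1 - attendance_weight) * vle_rate

# Example: a student who attended 8 of 10 sessions and logged in to the
# VLE 5 times against an expected 20.
score = behavioural_engagement(attended=8, total_sessions=10,
                               vle_logins=5, expected_logins=20)
print(round(score, 2))  # prints 0.58
```

In practice the weighting and the "expected" baseline would need calibrating against real cohort data rather than chosen up front.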

Emotional
Students who engage emotionally would experience affective reactions such as interest, enjoyment, or a sense of belonging.

Emotional engagement is more difficult to measure. Some activities that facilitate tracking include online discussion and interactive activities.

Cognitive
Cognitively engaged students would be invested in their learning, would seek to go beyond the requirements, and would relish challenge.

Cognitive engagement is even more difficult to measure, although extra-curricular activities and ‘extra credit’ work can shed some light.

Activities, systems and mechanisms need to be in place to ensure that the right level of analytics can be performed and measurements can be determined. This is why smaller pilots, with tasks designed with analytics in mind, should be carried out first, so that settings and specifications can be refined.

There are other reasons to implement Learning Analytics including (but not limited to):

  • Identify students at risk so as to provide positive interventions designed to improve retention.
  • Provide recommendations to students in relation to reading material and learning activities.
  • Detect the need for, and measure the results of, pedagogic improvements.
  • Tailor course offerings.
  • Identify teachers who are performing well, and teachers who need assistance with teaching methods.
  • Assist in the student recruitment process.

Jisc cetis Analytics Series (2012)
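The first item in the list above, identifying students at risk, can be sketched very simply: once an engagement score exists, students falling below a threshold are flagged for positive intervention. Everything in this sketch is hypothetical; the threshold, student IDs and scores are illustrative, and a real deployment would draw on the institution's own systems under an agreed, ethically reviewed policy.

```python
# Hypothetical sketch: flag students whose engagement score falls below a
# threshold so that tutors can offer a positive intervention early.
# The cut-off and the data are illustrative assumptions only.

AT_RISK_THRESHOLD = 0.4  # assumed cut-off; would be calibrated against outcomes

def flag_at_risk(scores, threshold=AT_RISK_THRESHOLD):
    """Return the IDs of students whose engagement score is below the threshold."""
    return sorted(sid for sid, score in scores.items() if score < threshold)

# Illustrative engagement scores keyed by (made-up) student ID.
engagement = {"s001": 0.82, "s002": 0.35, "s003": 0.58, "s004": 0.12}
print(flag_at_risk(engagement))  # prints ['s002', 's004']
```

The same pattern could back the other bullets, for example recommending reading material to the flagged group, but the hard part is the calibration and the ethics, not the code.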

Learning analytics has a strong link with pedagogy. An institution needs to consider how it would like to improve pedagogically, and implement its learning analytics process in a way that enhances, rather than hinders, the direction of its learning and teaching strategy.

The sector has looked at learning analytics over the last few years as a tool for achieving better experiences for students. The realities of using the huge amount of data that institutions collect are shrouded in ethical and legal issues, but luckily, the good folks over at Jisc have done a lot of the leg work and developed a Code of Practice for learning analytics as part of their ongoing Effective Learning Analytics project. This code of practice advises UK HEIs on the legal and ethical considerations that need to be included in the implementation of a learning analytics strategy.

See some of Team ET’s previous work on Learning Analytics with the Jisc Learning Analytics Network and its pre-project work.
