The day started with welcomes and introductions, followed by four speakers.
Mark Stubbs introduced a case study: Staff and student dashboards at Manchester Metropolitan University.
Did it make a difference? Yes, in terms of the NSS: more students responding and better results, especially around assessment. It laid the foundation for a university-wide data warehouse, which produced evidence-driven action (weekly snapshots since 2011, VLE surveys, attendance, submissions).
It also produced a number of challenges: data quality and ownership; agreeing how to count the number of students; BI skills (demand v supply); verifying the accuracy of novel insights (the data is showing new things, but how do we know this is correct when we don’t have any previous data?). The real challenge is embedding this data.
Improvement dashboards have been produced for Heads, Programme and Module leaders, bringing quantitative and qualitative data together. These can produce good visualisations of the data, e.g. showing that more students are working from home; how can we support that?
Personal tutors are given access to the data, so they have a dashboard of Moodle usage, attendance etc. It gives an individual breakdown per student: ‘at a glance’ evidence of how that student is doing, and the basis of a better discussion. Red: expected it to happen but it didn’t. Amber: expected it to happen and it did, a bit (also shown if they were in the bottom quartile of Moodle usage). Green: it all happened. Light green: over and above expectations.
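As a rough sketch, the traffic-light logic described above might look something like this. The thresholds, field names and function are my own illustrative assumptions, not MMU's actual rules:

```python
# Hypothetical sketch of a Red/Amber/Green tutor-dashboard status,
# based on the description above. Thresholds are invented for
# illustration and are not MMU's actual implementation.

def rag_status(expected_events: int, actual_events: int,
               bottom_quartile_vle: bool = False) -> str:
    """Classify a student's engagement for an at-a-glance tutor view."""
    if expected_events == 0:
        return "green"  # nothing was expected of the student
    ratio = actual_events / expected_events
    if ratio == 0:
        return "red"            # expected it to happen but it didn't
    if ratio < 1 or bottom_quartile_vle:
        return "amber"          # it happened, but only partially
    if ratio > 1:
        return "light green"    # over and above expectations
    return "green"              # it all happened

print(rag_status(10, 0))    # red
print(rag_status(10, 6))    # amber
print(rag_status(10, 10))   # green
print(rag_status(10, 14))   # light green
```

The point of the RAG banding is exactly what the talk described: it turns raw activity counts into an at-a-glance starting point for a tutor conversation, not a verdict.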
Improvements: money from HEFCE for an innovation project, a student app designed with students, for students, similar to the FitBit app.
In summary, it’s all about people change, not just the technology!
Next, Niall Sclater introduced a session on ‘The ethics of learning analytics’. For me this was the most interesting session of the day. It is a topic that always crops up in the JISC Learning Analytics project: why does it raise worries and genuine concerns? We started with a small-group discussion of these concerns (who owns the data, black-box concerns, transparency, the institution as ‘big brother’, students owning the data: is this a concern? If we know and don’t act, are we liable? Asking the right/wrong questions of the data).
If we don’t get this right we might end up in a situation like InBloom, where data was kept on school students; parents got very concerned and the whole project was shut down. A similar case was the Facebook ‘mood experiment’, which caused outrage.
Niall says “education is different” – we do this for the benefit of our students.
However, there are a number of issues:
The problem of flawed or inadequate data: this is the biggest concern; the data is unclean.
Predictions are not always valid or correct, and dashboards are not facts. E.g. Rio Salado College, Arizona, noted a correlation between students who succeed and those who logged on in the first day, so they sent an email to all students asking them to log in to their VLE.
Niall then asked the question:
Q. Should institutions understand the algorithms and metrics behind learning analytics? Basically the room agreed that the answer is ‘yes’, but it is a question that brings in many issues. How do we present the data to different cohorts, who then need practical data literacy to be effective? The issue is not straightforward!
Loss of autonomy. This seems to be a US issue, where students are recommended courses based on their previous data.
Manipulation of the analytics by students: Ryan Baker suggests more intelligent LA to spot this gaming behaviour.
Negative impacts of continual monitoring: is this an issue? Are we developing a culture where we are monitoring everything? Why do we need to monitor? What do we need to capture?
Q. Should we give students access to their analytics if it could potentially demotivate them? Again a difficult question to answer; one good answer was yes, but we should provide good support to the students.
Prejudicial categorisation and treatment of students: do we treat students differently based on the data? Do we ignore those at the bottom of the pile?
Reduction of the individual to the metric: do we think of people just as numbers?
Q. Should students be asked for consent to collect data on them? GDPR says not necessarily so (surprising), but you do have to for ‘special category data’, or if you are going to make interventions based on this data.
Triage: which students do we give our support to? Great students, to become excellent students?
Q. If the analytics tell us that a student is likely to drop out do we have a responsibility to do something about it?
The third speaker was Yi-Shan Tsai from the University of Edinburgh, who talked about the SHEILA project: Supporting HE to Integrate Learning Analytics. Three objectives:
- State of the art
- Engage different stakeholders
- Develop policy framework
Methodology. Surveys, focus groups, interviews etc
The three objectives were then discussed in much more detail.
- State of the art:
Six challenges of adoption: leadership, engagement with stakeholders, pedagogical approaches, training, evidence of impact, context-based policies.
LA adoption in Europe – based on interviews and surveys – most countries are using LA. Interest is very high.
LA Strategy. Most were using LA to support their L&T strategy.
Based on focus groups at Edinburgh only. Senior managers: improve student satisfaction, teaching excellence, retention, and what can LA do generally? Teaching staff: LA gives an overview of student data (performance, attendance etc.) to inform course design, manage a big class, and know why students struggle. Students: they want a personalised approach; provide better information to improve teaching support and design; support a widening-access policy; support students at all achievement levels to improve learning; assist the transition from school to HE and from HE to employment.
The final speaker of the morning was Rens van der Vorst, who introduced a case study on the Quantified Student at Fontys University. He started with the question: “Will you still be married in 10 years’ time?” It seems that asking how many times couples are having sex is the right question to ask to find the answer!
He and his university built a ‘data lake’ from a variety of data (time on campus, what students are eating, how often they were sleeping). They then asked their students to analyse their own data and find correlations, based on some design principles (the key is the learning experience of the students). Privacy is key: be totally transparent.
The most important part of the programme is to get the data back to the student (‘Big Mother’); students like to be heard!
You cannot trust your feelings; you must look at the data! So can we do the same thing with studying? His students have built apps that can do this. NOMI tracks how many hours students study. The ‘Am I a Workhorse’ app measures how long the student is on campus. There are also apps to check how many hours students spend on their smartphones in class.
The afternoon session was entitled ‘Making it happen: technical panel’.
How do you integrate different information sources and what is the right mix of technologies?
Approaches used in the Netherlands – Nynke de Boer, SURF.
SURF (Similar to JISC) covering Cloud/networking services.
Customised Education is the section that deals with LA in SURF. The LA team’s goal is to ‘make education better’.
The learning analytics experiment was SURF’s attempt to ‘go and get started’ and keep it simple. They developed an infrastructure; their focus is on the capture and record stages. First step: what do you want to know? Next, create an xAPI recipe. Thirdly, embed code into a learning system, which then reports data back to the teacher (in a visual way).
Lessons learnt: KIS, keep it simple!
Approaches used in the UK – Niall Sclater.
Niall gave a good overview of the JISC Learning Analytics project, a big project to gain economies of scale. The student information system feeds in via a standard approach; the VLE uses the ‘Experience API’ (xAPI), which takes an ‘Actor/Verb/Object’ approach (they are starting to work with Turnitin). All this data is sent to a learning records warehouse in the cloud that each institution can access, and institutions can then benchmark their data against other institutions. They then provide a ‘processor’ which feeds the data to a staff dashboard. They have also developed an app called ‘Study Goal’. 20 institutions are engaged with the project and others are thinking of coming on board.
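To make the ‘Actor/Verb/Object’ idea concrete, here is a minimal xAPI-style statement sketched in Python. The student, email and activity URL are made up for illustration; the ‘completed’ verb URI is a standard ADL one:

```python
import json

# Minimal xAPI statement following the Actor/Verb/Object pattern.
# Actor and activity details are invented for illustration;
# the verb id is a standard ADL xAPI verb URI.
statement = {
    "actor": {
        "name": "Example Student",
        "mbox": "mailto:student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "id": "https://vle.example.ac.uk/activities/week-3-quiz",
        "definition": {"name": {"en-GB": "Week 3 quiz"}},
    },
}

# In a setup like the one described, JSON statements of this shape
# would be sent to the learning records warehouse for benchmarking.
print(json.dumps(statement, indent=2))
```

Each statement reads as a sentence ("student completed quiz"), which is what makes the format easy to aggregate across a VLE, Turnitin and other sources.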
The app gives the student a ‘score’ and they can see how this changes over time compared to the course average or other individual students. It can also record attendance in lectures.
Next there followed Q&A and panel discussion on some technical issues.
After that, Ed Foster talked about a case study: embedding learning analytics across an institution.
Speaking from the perspective of the end user!
Nottingham Trent University use the SolutionPath tool, with a student dashboard available to staff and students. It focuses just on student engagement (it is only a proxy for engagement) and did NOT factor in the socio-economic background of the student.
Four goals of the project:
- Student success
- Improving staff/student working relationships – to kick start conversations
- Support students to manage their own learning
- Improving institutional data and systems
Two users of the data: staff and students.
There are a further two aspects of the dashboard: measuring students’ engagement with learning AND static information (which is hard to find if you are a tutor).
What the data at NTU show: impact on engagement (all students, 15/16 academic year). Overall, 80% of students progressed from year 1 to year 2 and 6% repeated; but partial engagement leads to 81% progression, and high engagement leads to 95% progression to year 2.
Student reactions to the dashboard: evidence showed they liked it, and it had some positive impact on engagement, although he is not saying this is causation…
The final stage is that tutors can add notes to the LA system (actions, length of conversations etc.).
- What is the mission of the LA initiative? Need to ask the questions: what are you trying to do? Why will LA change anything or make a difference?
- Problems with tapping into institutional data: who owns the data?
- Product and process development: exposing and coping with the assumptions of LA design is an issue. When does the academic year start/end?
- You can never do enough communication!!
- Implementation: what is it that will be different? Focus on change and ongoing management.
Can we intervene to spot struggling students and help them? How do we find different strategies to engage with students?
The final presentation was by Elsa Cardoso, with a case study on gaming approaches at the University Institute of Lisbon.
She is developing a gaming approach that aims to be interactive and fun and to promote self-discipline, and from the faculty perspective to provide a “right-time aggregated view” of class performance. They are on the third version of the prototype.
From the student perspective, they proposed the idea of gamification of the data. The prototype has been used at three levels: UG, PG and working students.
The Learning Scorecard produces two different views (staff and students) and has various functional requirements.
The product was developed around ‘gain creators’ and ‘pain relievers’ (based on customer (student/staff) gains and pains).
Gamification works as the students gain EPs (experience points), which are based on their assignment progression, forum participation etc.
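A minimal sketch of how EPs might be accumulated and ranked on a leaderboard; the weights, names and activity counts below are invented for illustration, not the Learning Scorecard's actual values:

```python
# Hypothetical EP (experience points) scoring, based on the
# description above. Weights are illustrative assumptions only.

def total_eps(assignments_done: int, forum_posts: int) -> int:
    """Combine activity counts into a single EP score."""
    return assignments_done * 100 + forum_posts * 10

# Invented students with (assignments completed, forum posts).
students = {"Ana": (3, 12), "Rui": (4, 2), "Ines": (2, 30)}

# Rank by EPs, highest first, to build the leaderboard.
leaderboard = sorted(
    ((total_eps(*counts), name) for name, counts in students.items()),
    reverse=True,
)
for eps, name in leaderboard:
    print(f"{name}: {eps} EP")
```

The leaderboard is what fosters the competition mentioned in the lessons learnt, which is why the reward weighting matters so much to student behaviour.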
Making it happen: people and processes – discussion of challenges and top tips for success with panel of case study presenters
Lessons learnt: registration is now mandatory (it is a course milestone). A reward system is crucial (badges are integrated into the system). Data privacy is now not an issue (apparently!): students are happy to be identified on the leaderboard, fostering competition. It takes time; it’s a cultural change.
The final question from the floor was “Where do you think we will be with all this in 10 years’ time?” A great question to end the day on!
Many thanks to all the presenters – they were all great and really interesting – the debate goes on!
You can find and download everything from this padlet https://padlet.com/gillferrell/A_Z_Learning_Analytics