12 Apps of Christmas 2017 #12AoC


Recently I have moved to a new job at London South Bank University and will not be doing a new version of #12AoC this year – like Glastonbury, it’s good to have a fallow year once in a while!

However, there are some great versions of the course being offered by other universities this year:

The University of Limerick in Ireland.


The University of British Columbia in Canada.


Glasgow Caledonian University in Scotland.


Although I think the Glasgow version will be for their staff only…

There is also one for lawyers, from the Law Society of Ireland:

Please leave a comment if you know of any other versions happening this year…and Happy Christmas!



Social Media in HE – the book! #SocMedinHE



I’ve just started putting together a new book on the use of social media in higher education, and it will be out next year. After I contributed to David Hopkins’ book Emergency Rations, I felt inspired to do something similar.

So back in June this year I emailed/DM’ed several people I have worked with, or knew used aspects of social media in different universities, asking them if they wanted to contribute to the book. I had a great response and the book is now shaping up nicely. There are 22 chapters representing 21 different universities (plus JISC) and a variety of different professions: lecturers, learning technologists, librarians, careers advisers and senior managers.

Also, I thought it would be a process worth blogging and tweeting about, which I will do over the next few months.

Here are the chapter titles so far:

Professional practice.

Developing a professional online presence and effective network.

The digital, authentic you.

Putting the Digital in the Professional for Social Work Students.

Career development online – is ‘The real world’ losing its importance?

Exploring the tensions in personal and professional identity to enable authentic debate and dialogue.

Teaching and learning.

Exploring social media use as a distraction in the HE classroom.

Social Media and Digital Identity in Formative Assessment.

A Framework for teaching social media for staff and students in HE.

Modelled use of cloud tools and social media by academic developers.

Social Media And Its Potential Application to Creative Students.

‘But I already know that’: Teaching social media beyond the front page.


Leadership and Social Media: Challenge and Opportunity.

Twitter and University Leadership – Navigating PR, academic networking and social identity.

Building Networks

Building Cohort Identity through Social Media.

WeChat, WeLearn: using social media to support the experience of students on a year abroad.

Using social media to create a sense of belonging and ‘connectedness’ for first year undergraduate student arrivals in a School of Arts and Humanities.

Bursting out of the bubble: social media, openness and HE.

Perspectives on Networked and Open Scholarship: Affordances and Barriers.


Academics’ understanding of Learning Spaces: Attitudes, practices and outcomes explored through the use of Social Media.

Learning to Twalk: an analysis of a new learning environment.

Expertise in your ears; why you should jump on the podcasting bandwagon.

The Personal Journey

A Librarian’s experience of Twitter as a tool for continuous personal development.

The ‘Healthy Academic’: social media and a personal and professional journey.

Check out the hashtag too:





Reflections on the A-Z of Learning Analytics #EUNIS



The day started with welcomes and introductions, followed by four speakers.
Mark Stubbs introduced a case study: staff and student dashboards at Manchester Metropolitan University.

Did it make a difference? Yes, in terms of the NSS: more students responding and better results, especially around assessment. It laid the foundation for a university-wide data warehouse which produced evidence-driven action (weekly snapshots since 2011, VLE surveys, attendance, submissions).

It also produced a number of challenges: data quality and ownership, agreeing how to count the number of students, BI skills (demand v supply), and verifying the accuracy of novel insights (the data is showing new things, but how do we know this is correct when we don’t have any previous data?). The real challenge is embedding this data.

Improvement dashboards have been produced for Heads, Programme and Module leaders, bringing quantitative and qualitative data together. They can produce good visualisations of the data, e.g. showing that more students are working from home – how can we support that?

Personal tutors are given access to the data, so they have a dashboard of Moodle usage, attendance etc. with an individual breakdown for each student. This gives ‘at a glance’ evidence of how that student is doing, and is the basis of a better discussion. Red: expected it to happen but it didn’t. Amber: expected it to happen but it only happened a bit (also showed they were in the bottom quartile in Moodle). Green: it all happened. Light green: over and above expectations.

Improvements – money from HEFCE for an innovation project: a student app which was designed with students, for students – similar to the FitBit app.

In summary, it’s all about people change, not just the technology!

Next, Niall Sclater introduced a session on ‘The ethics of learning analytics’. For me this was the most interesting session of the day, and always a topic that crops up in the JISC Learning Analytics project – why does it raise worries and genuine concerns? He started with a small group discussion of these concerns (who owns the data, black box concerns, transparency, the institution as ‘big brother’, students owning the data – is this a concern? If we know and don’t act, are we liable? Asking the right/wrong questions of the data).

If we don’t get this right we might end up in a situation like InBloom, where data was kept on school students, parents got very concerned and the whole project was shut down. A similar case was the Facebook ‘mood experiment’, which caused outrage.

Niall says “education is different” – we do this for the benefit of our students.

However, there are a number of issues:

The problem of flawed or inadequate data – this is the biggest concern: the data is unclean.

Predictions are not always valid or correct – dashboards are not facts. E.g. Rio Salado College, Arizona noted a correlation between students who succeed and those who logged on in the first day, so they sent an email to all students asking them to log in to their VLE.

Niall then asked the question:

Q. Should institutions understand the algorithms and metrics behind learning analytics? Basically the room agreed that the answer is ‘yes’, but it is a question that brings in many issues. How do we present the data to different cohorts, who then need practical data literacy to be effective? The issue is not straightforward!

Loss of Autonomy. This seems to be a US issue where students are recommended courses based on their previous data.

Manipulation of the analytics by students – Ryan Baker suggests more intelligent LA to spot this gaming behaviour.

Negative impacts of continual monitoring – is this an issue? Are we developing a culture where we are monitoring everything? Why do we need to monitor? What do we need to capture?

Q. Should we give students access to their analytics if it could potentially demotivate them? Again, a difficult question to answer – one good answer was yes, but we should provide good support to the students.

Prejudicial categorisation and treatment of students – do we treat students differently based on the data? Do we ignore those at the bottom of the pile?

Reduction of the individual to the metric – do we think of people just as numbers?

Q. Should students be asked for consent to collect data on them? GDPR says not necessarily so (surprising) – but you do need consent for ‘special category data’, or if you are going to make interventions based on this data.

Triage – which students do we give our support to? Do we help great students to become excellent students?

Q. If the analytics tell us that a student is likely to drop out do we have a responsibility to do something about it?


The third speaker was Yi-Shan Tsai from the University of Edinburgh, who talked about the SHEILA project (Supporting HE to Integrate Learning Analytics). It has three objectives:

  1. State of the art
  2. Engage different stakeholders
  3. Develop policy framework

Methodology: surveys, focus groups, interviews etc.

The three objectives were then discussed in much more detail.

  1. State of the art:

Six challenges of adoption: leadership, engagement with stakeholders, pedagogical approaches, training, evidence of impact, and context-based policies.

LA adoption in Europe – based on interviews and surveys – most countries are using LA. Interest is very high.

LA Strategy. Most were using LA to support their L&T strategy.

Based on focus groups at Edinburgh only: senior managers want to improve student satisfaction, teaching excellence and retention, and to know what LA can do generally. For teaching staff, LA gives an overview of student data (performance, attendance etc.) to inform course design, manage a big class and know why students struggle. Students want a personalised approach – providing better information to improve teaching support and design, supporting a widening access policy, supporting students at all achievement levels to improve learning, and assisting the transition from school to HE and from HE to employment.

The final speaker of the morning was Rens van der Vorst, who introduced a case study on the Quantified Student at Fontys University. He started with the question “will you still be married in 10 years’ time?” It seems that how often couples have sex is the right question to ask to find the answer!

He and his university built a ‘data lake’ built on a variety of data (time on campus, what students are eating, how often they were sleeping). They then asked their students to analyse their own data and find correlations, based on some design principles (the key is the learning experience of the students). Privacy is key, and they are totally transparent.

The most important part of the programme is to get the data back to the student (‘Big Mother’) – students like to be heard!

You cannot trust your feelings; you must look at the data! So can we do the same thing with studying? His students have built apps that can do this. NOMI tracks how many hours students study. The ‘Am I a Workhorse’ app measures how long the student is on campus. There are also apps to check how many hours students spend on their smartphones in class.


The afternoon session was entitled ‘Making it happen: technical panel’.
How do you integrate different information sources and what is the right mix of technologies?

Approaches used in the Netherlands – Nynke de Boer, SURF. 

SURF (similar to JISC) covers cloud/networking services.

Customised Education is the section that deals with LA in SURF. The LA team’s goal is to ‘make education better’.

The Learning Analytics Experiment was SURF’s attempt to ‘go and get started’ and keep it simple. They developed an infrastructure, focusing on the capture and record stage. First step: what do you want to know? Next, create an xAPI recipe. Thirdly, embed code into a learning system, which then reports data back to the teacher (in a visual way).

Lessons learnt: KIS – keep it simple!

Approaches used in the UK – Niall Sclater. 

Niall gave a good overview of the JISC Learning Analytics project. It is a big project designed for economies of scale. The student information system takes a standard approach, while the VLE uses the Experience API (xAPI), which takes an ‘Actor/Verb/Object’ approach (they are starting to work with Turnitin). All this data is sent to a learning warehouse in the cloud that each institution can access, so they can benchmark their data against other institutions. JISC then provide a ‘processor’ which feeds the data to a staff dashboard. They have also developed an app called ‘Study Goal’. 20 institutions are engaged with the project, with others thinking of coming on board.
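To make the ‘Actor/Verb/Object’ idea concrete, here is a minimal illustrative xAPI statement. The learner, institution and activity IDs below are invented for the example; the verb URI is one of the standard ADL example verbs.

```python
import json

# A minimal, illustrative xAPI statement in the Actor/Verb/Object style.
# The learner and activity IDs below are invented examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "A. Student",                                 # hypothetical learner
        "mbox": "mailto:a.student@example.ac.uk",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-GB": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://vle.example.ac.uk/course/101/quiz/3",  # hypothetical activity
        "definition": {"name": {"en-GB": "Week 3 quiz"}},
    },
}

# This JSON payload is what a VLE plugin would send on to a learning record store.
print(json.dumps(statement, indent=2))
```

Each statement is a small self-describing record, which is what makes it possible to pool data from different systems (VLE, Turnitin, attendance) into one warehouse.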

The app gives the student a ‘score’ and they can see how this changes over time compared to the course average or other individual students. It can also record attendance in lectures.

Next there followed Q&A and panel discussion on some technical issues.

After that, Ed Foster talked about a case study: embedding learning analytics across an institution.


Speaking from the perspective of the end user!

Nottingham Trent University use the SolutionPath tool, with a student dashboard available to staff and students. It focuses only on student engagement – the data is just a proxy for engagement. It did NOT focus on the socio-economic background of the student.

Four goals of the project:

  1. Student success
  2. Improving staff/student working relationships – to kick start conversations
  3. Support students to manage their own learning
  4. Improving institutional data and systems

Two users of the data:

  • Students
  • Staff

There are two further aspects of the dashboard – measuring students’ engagement with learning AND static information (but this is hard to find if you are a tutor).

What the data at NTU show (impact of engagement, all students, 15/16 academic year): 80% of students progressed from year 1 to year 2 and 6% repeated. But partial engagement leads to 81% progression, and high engagement leads to 95% progression onto year 2.

Student reactions to the dashboard – evidence showed they liked it and it had some positive impact on engagement, although this is not to claim causation.

The final stage is that tutors can add notes to the LA system (actions, length of conversations etc.).


  1. What is the mission of the LA initiative? Need to ask the questions: what are you trying to do, and why will LA change anything or make a difference?
  2. Problems with tapping into institutional data – who owns the data?
  3. Product and process development – exposing and coping with the assumptions of LA design is an issue. When does the academic year start/end?
  4. You can never do enough communication!!
  5. Implementation – what is it that will be different? Focus on change and ongoing management.

Further development:

Can we intervene to spot struggling students and help them? How do we find different strategies to engage with students?

The final presentation was by Elsa Cardoso on a case study: gaming approaches at the University Institute of Lisbon.


They are developing a gaming approach, which aims to be interactive and fun and to promote self-discipline, and from the faculty perspective to provide a “right-time aggregated view” of class performance. They are on the third version of the prototype.

Student perspective – the students themselves proposed the idea of gamifying the data. The prototype has been used at three levels: UG, PG and working students.

The Learning Scorecard produces two different views (staff and students) and has various functional requirements.

The product developed ‘gain creators’ and ‘pain relievers’ (based on customer – student/staff – gains and pains).

Gaming works as the students gain EPs (experience points), which are based on their assignment progression, forum participation etc.

Making it happen: people and processes – discussion of challenges and top tips for success with panel of case study presenters

Lessons learnt: registration is now mandatory and is a course milestone. A reward system is crucial (badges integrated into the system). Data privacy is now not an issue (apparently!) – students are happy to be identified on the leader board, fostering competition. It takes time – it’s a cultural change.

The final question from the floor was “where do you think we will be with all this in 10 years’ time?” A great question to end the day on!

Many thanks to all the presenters – they were all great and really interesting – the debate goes on!



Towards excellent teaching….@lsbu #lsbuCRIT

Went to an excellent talk here at LSBU today, organised by the Centre for Research Informed Teaching (CRIT) – Brent Carnell talking about how UCL are changing the culture of teaching and learning at their university. He shared a really great resource – two downloadable books, The Connected Curriculum and Developing the Higher Education Curriculum, available from UCL Press.

Critical learning analytics…


I went to the SRHE for a workshop on critical learning analytics yesterday. There were three presentations.

For now we see through a glass, darkly – critical perspectives on the rise of learning analytics in the age of teaching ‘excellence’. Sue Timmis

Sue started with a discussion of what ‘excellence’ is, based around Readings (1996) The University in Ruins – excellence is about performativity: self-interested and internally focused. Furthermore, Macfarlane (2015) talks about three further dimensions: 1. Presenteeism – as opposed to absenteeism. 2. Learnerism – learning that is public (although he goes on to make further criticisms of collaborative learning). 3. Soulcraft – emotional performance (although this is less relevant to LA).

What do we mean by LA? Siemens and Gasevic (2012): “The measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.”

LA is an increasing topic of research, e.g. SoLAR (first conference in 2011). Clow (2013) argues LA is data-driven and often very untheoretical. JISC have been very active, producing a report in 2016 and working on an LA service (which is to feed into the TEF). “71% of students questioned by JISC… said they would be happy with their data being used” – but this is questionable.

Economides (2014) did a SWOT analysis of existing research into LA. Strengths are mainly the pre-existing results. Weaknesses include information overload – teachers cannot interpret the information. Opportunities are mainly about the power of the system. Threats include ethical issues, over-analysis, trust, and the possibility of pattern misclassification.

Ethics of LA: as data proliferates, the ethical issues will grow and need to be addressed. Timmis (2016): most studies on LA have not addressed ethical issues. LA sits OUTSIDE the ethical frameworks of research in most universities… although this is starting to change more recently.

Retention and success – ‘at risk’. Retention is NOT just determined by LA… we need to heed warnings. This contrasts with the study by Liz Thomas, where the biggest factor in determining retention is the culture of belonging and peer-to-peer and peer-to-tutor interactions.

Presenteeism becomes a proxy for performance.

Learnerism draws on the work of Holmes – it’s about the public show of learning. The right to be silent is ignored – especially with the derogatory term ‘lurkers’ (some students like to learn on their own).

Wilson (2017): recommender systems actually encourage homogeneity by pointing students to the same resources, narrowing creative thinking…

Final point – LA can lead to reinforcing inequalities… algorithms are not neutral (Wilson 2017) – there are real risks of reinforcing inequalities rather than solving them. Just like the myth of the digital native, institutions and policy makers are ‘beguiled’ by performance. How can we move to a more ethical LA? A more dialogic approach – students and staff need to be brought together, and we need institution-wide ethics policies and attention to the digital rights of students. We need to move beyond the performative to supporting students to be more self-critical, increasing risk-taking and learning from mistakes.

Good questions from the floor – is the critique about the limits of what LA can do, or is it a broader critique of what LA should be measuring? Are we collecting the right data?

See also Sonia Livingstone – the digital rights of the child could be useful.

Ways of seeing through Data. Mary Loftus.

‘Psychometric moderation’ was mentioned by one person – it sounds scary. What is psychometric moderation?

There was a good discussion about who owns the data in universities – especially whether students are aware of what data is being held on them and who is doing what with it.

From prediction to participation: a vision of critical awareness with the learning analytics report card (LARC). Jeremy Knox.

Of the three talks, this is the one I got the most from…

Big data in education – there is a gap between the production of data (LA) and education (teachers and students).

Blackboxing and educational dis-empowerment (Latour 1999): looking at inputs and outputs, there is a hidden process (the results) that students and teachers are hidden from. However, there is space for critical questions: how does analytics work, and why is it being used? (assessment/management/ethics). See J. Knox (2017), “Data power in education…”.

Predilections for prediction. This is one way of opening the ‘black box’… This is happening through machine learning (decision trees, neural networks, random forests, k-nearest neighbours) – Mackenzie (2015):

So prediction is

  • Categorisation and features. (how rigid are these?).
  • What is new about machine learning is that the data is ‘open-ended…’
  • Function finding. Process of matching two or more sets of variables…
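To make the categorisation and function-finding points above concrete, here is a toy ‘decision stump’ – the single-split building block of a decision tree – trained on invented engagement data. Every feature, number and threshold below is made up for illustration; it is a sketch of the technique, not any institution’s actual model.

```python
# Illustrative sketch only: a one-split 'decision stump' of the kind a
# decision-tree learner builds, predicting 'at risk' from synthetic
# engagement data. All features and numbers here are invented.

# Each student: (VLE logins per week, attendance %), label 1 = 'at risk'
data = [
    ((12, 90), 0), ((10, 85), 0), ((8, 80), 0), ((9, 95), 0),
    ((1, 40), 1),  ((2, 35), 1),  ((0, 20), 1), ((3, 50), 1),
]

def best_threshold(data, feature):
    """Pick the threshold on one feature that best separates the labels."""
    values = sorted({x[feature] for x, _ in data})
    best = (None, -1)
    for t in values:
        # Count students correctly classified by the rule "below t => at risk"
        correct = sum((x[feature] < t) == (label == 1) for x, label in data)
        if correct > best[1]:
            best = (t, correct)
    return best[0]

# Learn a single split on logins-per-week (feature 0)
threshold = best_threshold(data, 0)

def predict(student):
    # Students below the learned login threshold are flagged 'at risk'.
    return 1 if student[0] < threshold else 0

print(threshold)          # -> 8 on this toy data
print(predict((1, 30)))   # low engagement  -> 1 (flagged)
print(predict((11, 88)))  # high engagement -> 0 (not flagged)
```

The critical point is visible even at this scale: the categories (‘at risk’ or not) and the features (logins, attendance) are rigid design choices, and the model can only ‘see’ what the designer chose to encode.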

Challenges for prediction:

Ethical: is it OK to judge present cohorts of students based on previous cohorts? Is there a ‘justice’ of forecasting, i.e. forecasting failure?

Philosophical: education is a complex system (see Beer et al. 2012). Educational ideals – what should be important, or do we just let the data determine this? Do we want to cut out failure in education? Maybe it is important to how we learn.

Next, Jeremy talked about a small project at Edinburgh, where they created a student interface on some new software – a very different type of LA, designed only for students. Students can decide their own course, timescale and style of report. They wanted to resist the data dashboard, so the students received written text, and that feedback would open up opportunities for further discussion.

Feedback so far from the students mentioned factors such as trust, autonomy, awareness of surveillance, novelty, limitations and recognition.

These issues are explored in an article by Jeremy, ‘Exploring Critical Awareness with the “Learning Analytics Report Card”’.

This was followed by some good questions about big data and ‘local data’ (course, student etc.).

Ways of seeing learning through data – Learning Analytics for Learners. Mary Loftus.

Mary started with an excellent quote from John Berger’s ‘Ways of Seeing’. AI and machine learning open new spaces for learning… but the LA story so far is set on prediction, personalisation and writing analytics… little work has been done by or for students themselves.

Can we use LA to support PBL, improve feedback and support student vulnerability? Measuring gets results, but we need to use it with care in education… because data is political (see danah boyd) – there are unintended consequences, e.g. in the criminal justice system. We need ‘white-box’ algorithms – we need to see how decisions are being made. This draws on lean manufacturing techniques.

How can students see analytics and data – how can we help students to do that, and how are big organisations shaping their lives? Probabilistic graphical models could be useful (Daphne Koller)… also Bayesian probability (Thomas Bayes), which looks at previous evidence to predict things… Bayes’ rule… Bayes wanted to prove the existence of God! See connected learning analytics – Kitto et al.

No, no, no! was my first response to using Bayesian probability for the interpretation of LA. This is the first time I’d come across Bayesian probability. Mary did a good job of explaining how it works – essentially, if we keep one variable constant, what is the probability of the final outcome changing if we change another variable? It did send a ‘shudder down my spine’, as it seemed a very similar approach to the way modern ‘positivist’ economics theorises its decision making – and look where that got us! Anyway, maybe I have misrepresented this probability technique and need to look at it in a bit more detail.
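For what it’s worth, Bayes’ rule itself is just a small piece of arithmetic for updating a prior belief with evidence. A toy sketch in the LA context (every probability below is invented purely for illustration, not drawn from any real study):

```python
# Toy illustration of Bayes' rule applied to a learning-analytics-style
# question. Every probability here is invented for the example.

# P(drop out) - prior: say 10% of students drop out
p_dropout = 0.10

# P(low VLE activity | drop out) - say 80% of dropouts showed low activity
p_low_given_dropout = 0.80

# P(low VLE activity | stays) - say 20% of continuing students also show low activity
p_low_given_stays = 0.20

# Total probability of observing low activity at all
p_low = p_low_given_dropout * p_dropout + p_low_given_stays * (1 - p_dropout)

# Bayes' rule: P(drop out | low activity)
p_dropout_given_low = p_low_given_dropout * p_dropout / p_low

print(round(p_dropout_given_low, 3))  # -> 0.308
```

Even with these made-up numbers the caution in the talk holds: a student with low VLE activity is still far more likely to stay (about 69%) than to drop out, so acting on the flag alone would mislabel most of the flagged students.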

When the presentations go online I’ll put a link to them here..

Anyway, with LA it does all feel like we are on a predicted road and the future is certain… give us time to work it out!



My Last day at @regentsuni

Today is my last day at Regent’s University ….

I have been at Regent’s for just over 7 years and been part of some big changes that have happened during this time.

I’ve really enjoyed working with the teaching staff and the Learning Technology Team – I have been able to support and develop some really exciting, creative and innovative projects. Whilst my remit has been learning technology, the systems we use mean nothing without the expertise and professionalism of the staff who use them. It truly has been a pleasure and an inspiration working with you all!

Here are a few pics of some of the projects I’ve worked on over the last few years:


and one last tune…

Storify – Crossing Boundaries #Twalk with #LSSIG


Here is the Storify from Andrew Middleton’s Twalk last week:

‘Walk this way…reflections on a #Twalk’ #SocMedHE17



Our joint proposal (Alex Spiers, Andrew Middleton, Claire Moscrop, Santanu Vasant and Jeff Walcock) for a conference workshop entitled ‘Walk this way… reflections on a #Twalk’ has been accepted for the 2017 Social Media for Learning in HE Conference at Sheffield Hallam University on 19th December. Here is the outline of the session we submitted:

By the end of the session, workshop participants will:

  • Understand the benefits of using a #twalk as a learning space
  • Know how to design their own #twalk
  • Understand how social media can integrate seamlessly with face-to-face activities to create a place for rich experiential learning
  • Know how to engage others after the #twalk through the ongoing use of social media
  • Reflect on how different spaces support diverse preferences for learning engagement
  • Identify and use at least one principle of good learning space design
  • Be able to recommend qualities informing future learning spaces

The impact of the workshop will be increased participant awareness of the application of social media in a blended learning strategy and the implications of this for learning space and curriculum design.

The workshop will begin with a round of structured stories from some of the universities who participated in the global #Twalk event on 31st May 2017.

A set of activities will show the potential of the #Twalk as a method for enhancing a higher education learning experience:

Activity 1: Using metaphor and motion – walking changes the learning dynamic and the readiness of a learning network to engage. We will explore the transferability of the #Twalk model and how spatial landmarks can be used to structure discussions, by considering how a walk and its ‘pause points’ can stimulate engagement in face-to-face and tweetchat learning conversations in any disciplinary area.

Activity 2: Participants will do a micro-Twalk, with a Christmas theme, to try the method and explore different types of learning spaces (formal/informal, ‘real’/’virtual’) by taking a 10-minute route through one floor of the conference building.

Activity 3: Using our #Twalk planning template, participants will draft a #Twalk for their own institution and discipline with the support of peers in the room.

Concluding discussion and whiteboard activity

Participants will select from the following topics to generate ideas and guidance:

  • Connected future space #1 – learning space beyond the binaries of informal-formal, digital-physical, study-work
  • Connected future space #2 – learning to be agile, networked and nomadic using social media for uncertain futures
  • Reasons to twalk – exploring the benefits of #twalking

Pre and post activities

The organisers will be ‘flipping the twalk’ by running a pre-conference virtual twalk in which ‘walkers’ will follow an online route, interacting with, grabbing, making, photographing and sharing what they find using social media.

Participants will feedback on their activities using social media and will be encouraged to join a collaborative online space to record their future Twalks.



Setting up a Digital Champions scheme



Over the last few weeks I’ve been working on a new pilot project, setting up a staff Digital Champions scheme for academics working at Regent’s University London. The scheme is to help Digital Champions share good practice across the university on a wide variety of learning technologies, from PowerPoint, social media and blogging to email and Blackboard content. Each Digital Champion will be allocated a Learning Technologist who will work alongside the lecturer to support their project over the course of the academic year.

I started out by trying to clarify what the project was actually trying to do – here is my first attempt at outlining the aims of the scheme:


The aims for the Digital Champion Scheme:

  • Help our Digital Champions share good practice across the university.
  • Improve lecturers’ digital literacy and thus improve the quality of the learning materials the students receive.
  • Make workflows more efficient for lecturers.
  • Strengthen communication and relationships between the Learning Technology Team, Academic Developers and the academic staff.

Next I went on to outline what the scheme will actually provide for the Digital Champions:

What the Digital Champions scheme provides:

  • 15 hours of dedicated support from a Learning Technologist to support your Digital Champions project.
  • The facility for each Digital Champion to record their project using different types of media.
  • Help to evaluate your Digital Champions project.
  • Verification of each Digital Champion’s personal and professional achievements through Digital Champions Certificates.
  • The opportunity to present the findings of their project at the end-of-year Digital Champions exhibition.
  • A customised and branded landing page that all of your Digital Champions will see as soon as they log onto Blackboard. You can amend this page as it suits you, to provide relevant, up-to-date information for your Digital Champions project.
  • Enrolment for all Digital Champions on the full collection of our Learning Technology Workshops to increase your knowledge and develop essential digital teaching techniques.
  • A Digital champions page on the Regent’s University London website (regents.ac.uk/digitalchampions) providing details of the Digital Champions projects and information about publications based on these projects.

Next I plan to sort out the timescale of when things need to be done by. First I’m planning to get the scheme up and running – posters up, website with application form etc. Hopefully this will involve getting staff to apply at the start of the academic year in September/October, then appointing the Digital Champions and moving forward with their projects from mid-October onwards.

It would be great to hear from anyone else who has set up a similar scheme in their college or university. What have been the pros and cons? What are the pitfalls to avoid? How best to publicise it? Please leave a comment!