In total I had 81 LSBU members of staff register for the course, which I was very pleased with. This is a quick blog post mainly aimed at the participants of the course – really just to give you a feel for what others thought about it. In terms of evaluating the #LSBU10DoT course, I have initially looked at three sources of data: end of course survey results, Twitter analytics and intranet metrics.


As of the 11th May (the survey will remain open for another week), 15 participants of the #LSBU10DoT course had completed the end of course evaluation questionnaire (a response rate of 19% – which is not too bad for an online survey). Here is a brief summary of the results.

5 out of 15 respondents completed the course. Those that didn’t finish were still catching up, were too busy with work, or couldn’t find the time as the course became ‘less intuitive’. 100% of respondents thought the course clearly demonstrated how Twitter could be used in a professional context. The vast majority (80% and above) thought that the content of the 10 days was ‘just the right amount of detail’, although there were a few comments: ‘I felt there was a little too much text – but that’s probably because I am a relatively advanced user!’, ‘I got lost at hashtags and all my previous anxieties about twitter just seemed reinforced’, and ‘I’ve entered ‘just enough detail’ because depending on your level of experience you might need more or less detail. What you provided enabled more experienced users to be selective but covered all bases for less experienced users’.

They were generally happy with the topics covered, although there were a couple of suggestions for what could have been included, such as ‘How Twitter can be used with blogging’ and ‘Could we have a bit on institutional policies, what is/isn’t appropriate for LSBU staff etc?’.

In terms of the mode of delivery, one person commented that they ‘Found it difficult to access the intranet off campus’ and another said ‘Thought it worked quite well – but nothing showed up in yammer – was that correct? & would be good to know how to access the content again if one wanted to (can you download a PDF for printing and reading for example?)’.

8 respondents said that they had changed the way they used Twitter as a result of the course: ‘Exploring use of lists (jury’s out) & have changed some settings thanks to the painstakingly clear instructions within the programme!’, ‘I will utilise lists, moments and schedule tweets from now onwards’, ‘My bio is more useful now and I have changed my profile picture’ and ‘More thoughtful about my approach to twitter’.

Finally, I had three overall responses about the course that were very positive:

‘Well done! – thought it worked well! But perhaps a face to face element over a mid point lunch time would bring it alive? (& would build more of a ‘community’ aspect?)’

‘Thank you so much for the lessons. It is a very useful resource which will be extremely useful for those who missed the LSBU10DoT. I have enjoyed the daily lessons and will continue to improve my twitter experience for myself and others’

‘Overall, glad I took part. I’m just very slow at this – perhaps for those like me there could be a follow up in a couple of weeks – for any questions that have arisen during this time’

Obviously it’s really pleasing to get such positive comments, and I like the idea of a F2F session, perhaps half way through the course, to bring people together and share experiences.



The chart above is from the @LSBU10DoT Twitter account, showing the number of ‘Impressions’ over the duration of the course. ‘Impressions’ are the number of times tweets are viewed – judging by the numbers, it wasn’t just participants who had enrolled on the course who were viewing them. In fact, the most popular tweet was about a best practice workshop:

[Image: most popular tweet]

This is interesting because it was advertising a F2F workshop in the university that didn’t have anything to do with the #LSBU10DoT course.

QUICK ANALYSIS OF THE INTRANET PAGES (thanks to Alice Saunders for these figures):

Useful metrics:

  1. Visits – a metric for volume/reach.
  2. Bounce rate – an engagement metric: the percentage of visitors who bounce off the page, an indication that the content isn’t sticky (of immediate interest).
  3. Av time on page – an engagement metric: how long people spend using the material.


Page       Visits   Bounce rate   Av time on page
Day 1      108      53%           03:55
Day 2       46      69%           02:18
Day 3       45      36%           03:42
Day 4       31      71%           01:24
Day 5       30      54%           02:34
Day 6       26      56%           02:50
Day 7       25      33%           02:25
Day 8       23      73%           01:37
Day 9       11      100%          02:22
Day 10       9      100%          01:31
Homepage    51      39%           01:37
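As a rough illustration of how these engagement figures rank (the page names and numbers are taken from the table above; the ranking logic – lowest bounce rate first, longest time on page as a tie-breaker – is my own sketch, not part of the original analysis):

```python
# Intranet metrics from the table: page -> (visits, bounce rate %, avg time on page in seconds)
pages = {
    "Day 1": (108, 53, 235), "Day 2": (46, 69, 138), "Day 3": (45, 36, 222),
    "Day 4": (31, 71, 84),   "Day 5": (30, 54, 154), "Day 6": (26, 56, 170),
    "Day 7": (25, 33, 145),  "Day 8": (23, 73, 97),  "Day 9": (11, 100, 142),
    "Day 10": (9, 100, 91),  "Homepage": (51, 39, 97),
}

# Rank pages by engagement: lower bounce rate is better; for equal bounce
# rates, a longer average time on page wins (hence the negation).
ranked = sorted(pages, key=lambda p: (pages[p][1], -pages[p][2]))

print(ranked[:3])  # the stickiest pages come first
```

Running this puts Day 7 and Day 3 at the top, which matches the conclusions below.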

The best performing days in terms of engagement were Day 3, Day 7 and Day 1. This may be because those days’ content particularly reflected what the audience wanted/needed to know.


Visits steadily decline as the course progresses. However, most days follow the pattern of an on-the-day spike followed by some additional visits collected over the next 4 days. So some of the later days will still be getting new traffic now, as people work through the content in their own time.



Remember, a low bounce rate is good. So the best performers in terms of sticky content were Day 3 and Day 7.

  • Day 3 was particularly rich in content – with a lot of tailored information specific to academia – hence the high engagement.
  • The homepage generally had a lower bounce rate than the article pages, perhaps because it is more visually appealing. Perhaps we should have included graphics at the top of the articles for more visual impact on the individual task pages.


Overall, the survey results, intranet metrics and Twitter analytics give a very positive impression of the course. Participants were generally very happy with the content: it could appeal to both experienced and new users of Twitter, they liked the way it was targeted at an academic audience, it seemed to fit well into their working day, and they became more competent and professional users of Twitter.

On the downside, I was slightly disappointed with the level of Twitter activity; having previously run the course at another, much smaller university, I noticed a much less engaged and participatory culture on this version of the course. There are a number of possible reasons for this, and the evaluation survey does suggest that some participants found the content a bit too complex and that it was harder to access the information on the intranet off campus. With a few changes I think it would be worth running the course again, or even thinking about other versions focusing on blogging or on other learning technology tools like Moodle or Mahara.