Arrived at Senate House in time for coffee. First session kicked off on time…

1100-1200     Turnitin Update & Roadmap. Will Murray (VP International) and Tom Rees (Customer Care)

Support update (Tom Rees). Challenged to meet expectations – a system-wide issue. Growing customer base which is not predictable, and growth in usage. Focus on increasing support and efficiency, and on user education. Thus Turnitin is increasing its global support team (number of staff) and increasing the hours covered, eventually to 24/7 support. Looking to add a support portal with more support material – videos, manuals, FAQs – and an expanded webinar schedule. Questions were raised about splitting up the support manual.

Engineering update. A new VP has been appointed (Christopher Minson), who introduced a new team structure and a new launch process, and expanded the integration team. Turnitin is at a big-data crossroads: 400K submissions a day, reaching the point where its systems can’t cope. The growth rate is unprecedented. Database sharding is needed to split the databases, which means dates have been put back for several projects.

Communications. Customers want different things, and one channel does not guarantee communication, so Turnitin uses Twitter, RSS feeds (which can be integrated into Blackboard…we could do this at our Regent’s Blackboard site), email and mobile phones (SMS messages).
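As a sketch of how a feed like Turnitin’s could be surfaced on a course page: a minimal, stdlib-only Python example that parses an RSS 2.0 document and lists item titles and links. The feed XML below is an invented sample for illustration, not Turnitin’s actual feed; in practice you would fetch the real feed URL with `urllib.request.urlopen()`.

```python
# Minimal sketch: parse an RSS 2.0 feed and list item titles/links,
# e.g. to surface status updates inside a VLE page.
# The sample XML is invented for illustration only.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Turnitin Updates (sample)</title>
    <item>
      <title>Scheduled maintenance</title>
      <link>http://example.com/maintenance</link>
    </item>
    <item>
      <title>New GradeMark rubrics released</title>
      <link>http://example.com/rubrics</link>
    </item>
  </channel>
</rss>"""

def feed_items(rss_xml):
    """Return (title, link) pairs for each <item> in an RSS 2.0 string."""
    root = ET.fromstring(rss_xml)
    return [
        (item.findtext("title", ""), item.findtext("link", ""))
        for item in root.iter("item")
    ]

for title, link in feed_items(SAMPLE_RSS):
    print(title, "->", link)
```

A real Blackboard building block would render these items as HTML inside a module rather than printing them, but the parsing step is the same.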

Someone proposed a Twitter hashtag for today’s event: #tiifeb13

Academic update. Revised Plagiarism.org site…highlighting its top ten resources. An academic network has been established – speak to Gill Rowell if you are interested in joining. The US Turnitin team will launch an interactive rubric.

Product changes. There is now a Turnitin product changes Twitter feed: @turnitinproduct

Turnitin also offers product webinars. New products:

  1. New tutorial in GradeMark. Useful for sending as a link to tutors.
  2. New set of GradeMark rubrics. Pre-written rubrics (mainly US-developed but will be anglicised). These are available online.
  3. ETS e-rater is now embedded in Turnitin. Gives automated feedback on spelling, grammar, usage, style etc. (aimed at teaching English as a second language); the product is an additional cost.
  4. iPad app (free) should be out in May 2013, to be launched with the Grade Anywhere campaign. A quick demo of the beta version was given…it looks great. Can leave voice comments, use GradeMark rubrics, etc. It can also download papers so lecturers can mark offline.
  5. Multiple graders. High demand for this feature, especially in the UK. This will include a grade-change audit…so it is possible to see changes. Layers of markers can be named anything, e.g. 1st or 2nd marker, moderator, etc.
  6. Admin tools. Allow multiple admin accounts, dropping students from an account, the ability to expire classes, moving classes between instructors, uploading student lists outside of a class, and creating classes for instructors.
  7. New product research. A folder-based version of Turnitin for research courses, PhDs etc. This will be an additional fee.
  8. Formative (2014). Allows students to make multiple submissions for formative feedback from tutors (so tutors can see the changes that students have made).
  9. Document viewer improvements. The DV has had some failure cases fixed and some presentation improvements. The second phase will be a rewrite of the DV, involving an intensive testing period; not expected until the end of the year.
  10. Content. 34bn web pages. iThenticate will identify the journals Turnitin checks against.
  11. Future. Unscheduled design features: flexible grading and marking (using letters and figures), and developing analytics in Turnitin (2014).

Road map:

[road map slide image]

1230-1330  Lunch

1330-1415  Evaluating the benefits of electronic assessment management, Cheryl Reynolds, University of Huddersfield (Part of the Jisc EBEAM project)

Institutions behave like ‘supertankers’…so it can be difficult to move them. But think about the context of HE at the moment: the introduction of student fees and student debt, and the introduction of the NSS, which highlighted issues with student satisfaction around assessment.

GradeMark was used in two different schools of the university. The biggest was the School of Education (teacher education), with a large number of students.

Feedback on GradeMark from students on a teacher training course, based in the north of England. The teachers are not in subject-specific groups, and their digital literacy skills were very mixed.

There was a big spread of student views. What was surprising was the emotional response from the students (and staff), especially around the pressure of meeting deadlines. The pressure on academics and their workload is already great.

GradeMark has “identified some of the pain and been able to relieve some of that pain”

Over 800 students responded to the survey. The majority said feedback was delivered on time – a major improvement compared to paper or email submission. They were also pleased they could pick up the feedback whenever they wanted.

Impact of feedback. 86% said feedback had a positive impact on their final mark (3% said it had a negative impact!). Security of submission: students got an email receipt. They were pleased (54%) that their feedback was private.

The research was based mostly on tutors and students. Further work needs to be done on institutions and administrators.

They found 3 main groups of tutors:

  1. innovators or early adopters
  2. the more cautious (‘healthy sceptics’)
  3. the reluctant, or those who have tried it and moved back to paper

The most useful lesson is to get the early innovators and adopters to develop the healthy sceptics.

‘Concomitant pressure’ was more effective than forcing staff to use it: change agency came from early adopters and from student demand. Also reward the staff, or allow them to use the time saved for extra research or extra time with their students.

Institutional interest. GradeMark helps the institution meet Ofsted requirements, especially by giving them relevant statistics, e.g. the number of QuickMarks on scripts. Summing up: it identified where trainees were weakest, allows students to identify their own development needs, and models good assessment policy, which overall will improve the NSS score. It also allows comparison of moderation across different marking groups.

Questions from the audience:

  1. Was there any special kit required? No, except some for the audio feedback.
  2. Did you ask students about audio feedback? Overwhelmingly yes, mainly because it sounds more ‘human’. They also don’t mind if it isn’t high quality – what they didn’t like was when they thought the tutors were reading from a script.
  3. How long did it take to train the staff? No time at all…all the tutors came together at one time. No face-to-face training…they just used the Turnitin videos, plus screen capture to give some specific feedback.
  4. Did you have any technical problems with GradeMark? Yes, especially ‘time-out’ issues…but it has improved over the four-year period.
  5. Were tutors printing off their feedback before they put it on GradeMark? Yes, but they were in a minority.
  6. Did the maths tutors have any problems? No, because they weren’t really using numerical data.
  7. Did the use of GradeMark increase the qualitative nature of the feedback? Yes, and it in turn shapes staff development around feedback. This is reviewed at the end of the year.

A contribution from someone at Bedfordshire University, which has two standard questions that ALL tutors must address (an attempt to tackle this issue).

1415-1530  Turnitin Integrations Roadmap

Blackboard outage. Cause: the database was not accessible prior to Bb v9.1 – an embedded database on clustered environments, plus a lack of available testing environments.

Resolution: use of the Bb database.

Legacy API (Blackboard integration). Difficult to use; lack of feature support; implementation issues.

Turnitin is building a new API, which should solve a lot of problems and lead to a more elegant design.
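For context on what an LTI-style integration involves: LTI 1.x launches are HTML form POSTs signed with OAuth 1.0a (HMAC-SHA1). Below is a minimal, stdlib-only Python sketch of building the signature base string and signing a launch. The consumer key, secret and launch URL are invented placeholders, not real Turnitin credentials, and this simplifies the full OAuth parameter-normalisation rules.

```python
# Sketch of signing an LTI 1.x launch with OAuth 1.0a HMAC-SHA1,
# using only the Python standard library. Key/secret/URL are
# hypothetical placeholders for illustration.
import base64
import hashlib
import hmac
import time
import urllib.parse
import uuid

def sign_launch(url, params, key, secret):
    """Return launch params with OAuth 1.0a fields and signature added."""
    oauth = {
        "oauth_consumer_key": key,
        "oauth_nonce": uuid.uuid4().hex,
        "oauth_signature_method": "HMAC-SHA1",
        "oauth_timestamp": str(int(time.time())),
        "oauth_version": "1.0",
    }
    all_params = {**params, **oauth}
    # Signature base string: METHOD & encoded-URL & encoded-sorted-params
    encoded = "&".join(
        urllib.parse.quote(k, safe="") + "=" + urllib.parse.quote(v, safe="")
        for k, v in sorted(all_params.items())
    )
    base = "&".join(
        urllib.parse.quote(s, safe="") for s in ("POST", url, encoded)
    )
    # Signing key is "consumer_secret&token_secret"; no token secret here.
    digest = hmac.new(
        (secret + "&").encode(), base.encode(), hashlib.sha1
    ).digest()
    all_params["oauth_signature"] = base64.b64encode(digest).decode()
    return all_params

launch = sign_launch(
    "https://example.com/lti/launch",           # hypothetical tool URL
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "assignment-42"},      # placeholder values
    key="demo-key",
    secret="demo-secret",
)
print(launch["oauth_signature"])
```

The tool provider recomputes the same signature from the posted parameters and the shared secret, which is what lets the VLE and Turnitin trust each other without a shared session.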

For more information on LTI go to: http://www.imsglobal.org/cc/allblti.cfm?show_all_nav_listrsBLTICCompliance1=1

1530-1600  Q&A (Integrations)

 1600-1615  Summary & Close
