eAssessment News

Marking and Giving Feedback with iPads

Last academic year, staff in the Spanish Section at LSE’s Language Centre were awarded an e-Assessment grant from LTI to support their project of using iPads and annotating apps to mark and give students feedback electronically.

Why use tablet marking and feedback…

…from the teacher’s point of view…

Several applications were analysed, and the three teachers who participated in the project each selected the one they found easiest to use. Some advantages were common to all teachers:

Advantages of using the iPad and apps

  • portability
  • paperless
  • online backup
  • possibility to add comments
  • rubrics to reuse common comments
  • apps suitable and adaptable to participants’ marking styles

Improvements on marking and feedback

  • encourages reflection on marking and feedback
  • opportunities to improve student understanding by extracting and analysing the data held
  • clearer and more detailed feedback

 

…from the students’ point of view…

During the first term, students were exposed to both the “traditional” and the “electronic” way of marking and giving feedback, and were then asked to answer an evaluation questionnaire to determine their preferences in terms of writing and submitting their work, and receiving feedback. Overall, many students did not show any preference with regard to traditional versus electronic assessment. Those more in favour of e-assessment and feedback gave the following reasons:

* confidence of knowing work is backed up

* e-feedback is easier to access, so students are more likely to revisit it at a later date

* improved clarity and understanding of feedback

Want to know more? The full report includes details of the apps that were used, along with a very useful comparison grid to help you choose the right tool.


 

E-Marking and e-Feedback: what else?

Using tablets to enhance marking and feedback is only one option within what is known as Electronic Management of Assessment (EMA). Following on from this project, the Language Centre is now working on a wider EMA project entitled “From e-Marking to e-Feedback: training, applying and evaluating”, which aims to raise awareness of, and compare, three distinctive e-marking and e-feedback methods, one of them being iPad marking. This project is funded by an LTI e-assessment grant and an update will be provided when it is completed.

LTI is working on EMA and its use at the LSE. Watch this space for updates on projects.

Also have a look at JISC’s guide on EMA to discover how you can use technology to support the assessment lifecycle.

 

 

August 10th, 2015|eAssessment News|Comments Off on Marking and Giving Feedback with iPads|

How effective are e-Marking and e-Feedback?

Catherine Hua Xiang

There are numerous methods and tools for marking and providing feedback using technology. Catherine Hua Xiang and Lourdes Hernandez-Martin from LSE’s Language Centre ran an LTI-funded project* aimed at ‘exploring and comparing three distinctive e-marking methods and e-feedback as a result of three marking tools’.

Lourdes Hernandez-Martin

More than 20 members of staff at the Language Centre were offered training on using Moodle, iPads with e-pens and Snagit to mark and provide feedback on students’ written work. They applied the three different marking methods to the same group of students throughout the academic year on different pieces of work (usually long essays).

They were then asked to write a reflective diary on the impact of these methods on the way they mark while students answered a survey to explore their perception of video and written feedback. The findings demonstrated a clear preference for video feedback using Snagit, which Catherine and Lourdes explained by providing the following reasons:

  1. Linguistic enhancement – Students found that being able to listen to the teacher’s corrections helps both pronunciation and overall listening skills.
  2. Personal approach – Students preferred the personal approach of video feedback, as it creates a style most similar to face-to-face interaction. The teacher’s voice helps engage the students.
  3. Quality of the feedback – Students commented on the level of detail that verbal feedback could offer compared with written feedback.
  4. Others – Students also commented on the usefulness of having both forms, and the fact that they can come back and listen to the feedback at any time they wish. It offers great material for revision purposes.

“It is very helpful having the teacher guide me through the corrections as it is more personal and can allow me to see not just what is wrong but why it is wrong”

“It’s much better to hear something directly rather than having to try and work things out from comments or notes written down”

“you can learn from home, rather than having to come in for office hours”

 


 

* From e-Marking to e-Feedback: Training, Applying and Evaluating, a project funded by a Learning Technology and Innovation Grant. You can also find information and updates on this project on the LSE Language Centre website.

 

LTI Show and tell on assessment with Technology, 11 November 2014

LTI recently held a show and tell on assessment with technology with colleagues from LSE, UCL and Westminster. The event was well attended and provided an opportunity to find out about the varying ways that technology is being used to innovate assessment.

The show and tell event is part of the work being done by LTI to promote assessment with technology at LSE. The project aims and outcomes outlined by LTI learning technologists Athina Chatzigavriil and Kris Roger can be seen here. If you are interested in being involved in the working group on e-assessment, or have examples of e-submission, e-marking or e-feedback and e-return, then please get in touch by emailing lti.support@lse.ac.uk.

A lecture capture recording of the event (slides and audio) is now available here (LSE login required) or you can read a brief summary of the presentations below.

Alternatives to examinations

Professor George Gaskell started off the event with a brief outline of the changes taking place in LSE100, the compulsory course for undergraduates at LSE. The LSE100 Director explained that the course team are currently investigating alternatives to exams. Using the learning outcomes of the course as the basis for assessment, they have been developing a portfolio of activities that will allow students to demonstrate their ability to apply social scientific methods, concepts and theories to real-world problems. Assessments will have to allow for ‘exit velocity’, letting students take risks in their first year and allowing for the progress of learners over their two years at LSE, while also preventing strategic planning by requiring all components to be completed. The process is still in the development stage, so watch this space for updates.

Peer assessment

Dr Irini Papanicolas, from Social Policy, gave the second presentation on her work with Steve Bond in LTI on peer assessment. Dr Papanicolas discussed how she changed assessment on the course SA4D4 from 100% exam to 50% exam and 50% presentation. She used ‘WebPA’ to enable students to rate their peers’ presentations using the course mark frame. Although peer assessment was an optional part of the assessment, all the groups volunteered feedback and there was a positive response to the process, with it creating discussion within the groups on the assessment criteria.
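For readers unfamiliar with how peer ratings turn into marks, WebPA-style moderation is usually described as follows: each rater’s scores are normalised into fractional shares, the shares received by each member are summed, and the result is scaled so the group average is 1.0; an individual’s mark is then the group mark multiplied by that weighting. A minimal sketch of this common algorithm (not necessarily LSE’s exact configuration, and the names and numbers are made up):

```python
def webpa_scores(ratings):
    """ratings[rater][ratee] = points awarded by rater to ratee.
    Returns a per-member weighting averaging 1.0 across the group;
    an individual mark is then group_mark * weighting."""
    members = list(next(iter(ratings.values())))
    received = {m: 0.0 for m in members}
    for given in ratings.values():
        total = sum(given.values())
        for ratee, pts in given.items():
            received[ratee] += pts / total  # this rater's fractional share
    # Scale so the weightings average 1.0 over the group.
    return {m: received[m] * len(members) / len(ratings) for m in members}

# Everyone agrees A contributed twice as much as B or C.
marks = webpa_scores({
    "A": {"A": 2, "B": 1, "C": 1},
    "B": {"A": 2, "B": 1, "C": 1},
    "C": {"A": 2, "B": 1, "C": 1},
})
print(marks)  # A: 1.5, B: 0.75, C: 0.75
```

With a group mark of 60, full weighting would give A 90 and B and C 45; tools like WebPA also let staff blend the moderated result with the raw group mark so that peer ratings adjust, rather than dominate, the final marks.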

Dr Papanicolas will be using ‘TeamMates’ this year, as it will allow students not only to rate their own group’s presentation but also individuals’ contributions within the group.

From peer assessment to peer teaching and learning….
Kevin Tang then reported on how ‘PeerWise’ has been used at UCL. Kevin has been working with Sam Green and Stefanie Anyadi in the Department of Linguistics to use the platform with 50 undergraduate and 50 postgraduate students. PeerWise allows students to create, answer and discuss questions. Students can rate feedback and are scored on their own contributions; at UCL these contributions are then worth a small percentage of the summative mark for the course.

Research into using the interface indicated that it was important to provide support for students to ‘think like an examiner’ with example questions and training on giving constructive feedback. Academic staff attitudes also played a crucial role in student engagement along with setting regular activities and deadlines.

As most examiners will know, it is quite hard to create good questions, so UCL asked students to devise questions in groups, and found that the questions improved over time, with the students in mixed-ability groups appearing to benefit the most. The platform provided a space for interaction: students provided detailed feedback for each other, which was then used to improve future questions, and students were still using the system in the run-up to the exam for revision purposes.

Games and assessment in Law

Dr Vassiliki Bouki, Principal Lecturer at the University of Westminster, talked about the use of games in assessment. Dr Bouki demonstrated the ‘law of murder game’, which was developed in ‘Articulate Storyline’ and was used as an alternative to coursework for a second-year criminal law module. The game presents a real-life scenario, assesses critical thinking, and allows students to role-play thinking like a lawyer. Students are given two hours to complete several small tasks in an open-book environment. The game is currently in use, so data and feedback from students will be available later in the year.

Word processed timed assessments and online feedback

Dr Sunil Kumar, Lecturer in Social Policy and Dean of Graduate Studies, talked about his experiences over three years on the course ‘urbanisation and social policy’. Concerned about how much students were actually learning under the traditional model of examinations, Dr Kumar introduced a two-hour online formative assessment into his course. Students typed up their answers to short-answer and long-answer questions under examination-style conditions. Dr Kumar was then able to read and mark submissions on his iPad, and then upload the anonymised assessments with annotated feedback for all students to see on Moodle. The formative assessments have had 100% attendance, with students able to learn from other students’ submissions, encouraging them to review topics they have not yet covered in preparation for the summative examination.

More information about the project can be found on our blog post.

E-assessment Scotland 2014

On 5 September I attended this one-day event at the University of Dundee, billed as the UK’s “largest conference dedicated to exploring the best examples of e-assessment in the world today”. LSE has a growing interest in e-assessment (which we might define as the use of IT to facilitate assessment processes), with various pilot projects on the go this year.

Total e-assessment

With that in mind, one presentation in Dundee proved a real eye-opener for me. Linda Morris, an academic in the University of Dundee’s College of Life Sciences, told us that by 2015 the College will have moved to the point where all assessment, across all four years and including final exams, will be done online. Furthermore, this marks the end point of a journey which started a long time ago – in fact, they were already using e-assessment for all first-year courses by 2003! I felt more than a little embarrassed, to be honest.

The drivers for this change were simple: More students, asking for more feedback, and fewer staff. The paper-based assessment regime was becoming completely unmanageable. A fully-online system means no paper, remote access for markers, progress tracking, and easy distribution of feedback. It is also popular with students, many of whom have fallen out of the habit of writing at length by hand (and whose writing may be barely legible as a result).

Dundee’s system uses a combination of Exam Online for essay questions and QuestionMark Perception for other question types. This system supports all the forms of submission they need, as well as all their marking requirements: blind marking, multiple markers, inline comments, and marking workflow.

Do people like it? Yes. Linda says “once you start down the road of e-assessment, you won’t get anyone to go back”.

Software

Various vendors were on hand to promote their wares: Surpass, Cirrus, QuestionMark and MyProgress, amongst others. However, I found it hard to see what, if anything, these tools would offer us that Moodle does not already provide. In fact, in some cases the feature set seemed much thinner than that of the Moodle quiz tool.

Keynotes

Peter Reed of the University of Liverpool started the day by identifying institutional problems with the introduction of e-assessment. Such a move is often done in a piecemeal manner, perhaps in response to NSS scores, and as a result fails to be transformational. He also pointed to a lack of flexibility in submission practices, which may assume that all submissions are documents, and prevent students from submitting other digital artefacts.

In thinking about e-assessment at the institutional level, he encouraged us to apply Brookfield’s “4 Lenses”. This theory proposes that any teaching and learning activity should be evaluated from four different perspectives: self-reflection, students, literature (i.e. theory and evidence) and peers (i.e. staff).

For example, through the student lens, we should think about the week-on-week burden of assessment. An assessment won’t be an effective measure of student achievement if that student has three other, more pressing assessments in the same week. This can be countered by spreading out the assessment load: instead of a single high-stakes assessment at the end of the module, spread lower-stakes assessments through the term. Similarly, through the peer lens, we need to think about assessment load across different programmes and different years. Where there are multiple assessments from different sources in the same week, administrative staff or markers may be unable to cope.

In the other keynote, Mark Glynn of Dublin City University spoke about “assessment analytics”, proposing that the “click data” that VLEs typically provide are of limited value, and that assessment data is what will provide the really useful analytics. Such analytics may be descriptive (what happened), diagnostic (why it happened), predictive (what will happen) or prescriptive (what should happen).

I had a problem with one of his ideas for such analytics: to show students how they had performed in relation to their peers. This would be beneficial to the student, he claimed, because they could tell whether 75% was “good” in the context of the overall marking on their assessment. I found this rather depressing; 75% should mean “good”, regardless of how the other students performed. If it does not, then it means we do not know how to mark properly: the percentage grades we assign have no inherent meaning, and assessment becomes simply a process of sorting students into order of achievement, rather than determining how well they have achieved the objectives of the course. The use of technology to patch up these failures of assessment is not exactly inspiring.
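For what it’s worth, the relative-performance analytic described above amounts to little more than a percentile-rank calculation over the cohort’s marks. A minimal sketch (all marks below are made up for illustration):

```python
def percentile_rank(mark, cohort_marks):
    """Percentage of the cohort scoring strictly below the given mark."""
    below = sum(1 for m in cohort_marks if m < mark)
    return 100.0 * below / len(cohort_marks)

# Hypothetical cohort in which most marks cluster in the 80s.
cohort = [72, 75, 78, 81, 83, 85, 86, 88, 90, 92]
print(percentile_rank(75, cohort))  # 10.0
```

The example makes the objection concrete: a “good”-sounding 75% sits near the bottom of this cohort, so its meaning depends entirely on how everyone else performed.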

Conclusion

This was a worthwhile conference, with some valuable insights into what other institutions are doing in this area. The day conference was followed by a longer online programme, which is ongoing at the time of writing.

Steve

September 15th, 2014|Conferences, eAssessment News|Comments Off on E-assessment Scotland 2014|