eAssessment Events

Going digital

Geraldine Foley, Assistant Learning Technologist, and Athina Chatzigavriil, Senior Learning Technologist, share their thoughts on the Learning from Digital Examinations conference, held on 26th April 2018.

Learning from Digital Examinations, a one-day conference organised by Brunel University, brought together practitioners from universities across the country and from abroad. It was a great opportunity to share best practice and lessons learnt, and it provided detailed examples of the complexities involved in digital examinations as well as some of the potential benefits.

Students are used to typing their work electronically and the majority have their own devices, yet when it comes to exams at LSE and elsewhere in the UK the standard expectation is still to handwrite responses in final examinations. There are multiple reasons for this, including infrastructure, regulations, spaces and facilities. However, some universities have started to shift to electronic examinations, so we went along to find out more and to present on the pilot projects we have run here at LSE (more details below).

Brunel University began researching digital examinations in 2015. They used WISEflow, a platform provided by the Danish company UNIWise, with students’ own devices (Bring Your Own Device, or BYOD), and ran one exam with 115 students in 2016. Following this successful proof of concept they moved to a pilot with 1,300 students in 2016/17, and since September 2017 the university has been rolling out the assessment platform in a staged implementation. WISEflow was the platform most prominently featured at the conference, both for digital examinations and for Electronic Management of Assessment (EMA).

Quite a few institutions at the conference have already moved wholesale to typed examinations, while others are just starting out. There also seems to be growing interest among institutions in moving towards EMA approaches to assessment more broadly, not only in replacing handwritten examinations with typed ones. Line Palle Andersen described how staff at University College Copenhagen, Denmark use WISEflow to support workflows for other forms of assessment (such as orals and MCQs) and how their staff are involved in marking and providing feedback, taking advantage of the platform’s extensive feedback features.

The full conference programme and the presentation slides can be viewed online, but some general themes and questions from the day are discussed here.

  • Bring your own device (BYOD)

    Space and facilities tend to be limited in HE, so the majority of institutions appear to be adopting the BYOD approach. In Norway and Denmark, where the move to typed exams was a nationwide project, it is mandatory for all students to have a device for their studies. UK universities using the BYOD approach provide support, such as loans and grants, for students who do not have their own devices, and keep a small number of spare devices for those who experience problems on the day of the exam.

  • Student training and support are essential… and students can help!

    Students need opportunities to test out and get used to any new system or approach. Unsurprisingly, the students who did not attend support sessions tended to be the ones who needed more support. Brunel University employed students as assistant learning technologists to run drop-in support sessions in the run-up to the examinations, so that students could install and test the software on their devices; these student assistants also worked with invigilators to offer technical support during the exams. This model has been used successfully in Denmark and Norway too. Dr Liz Masterman from the University of Oxford presented a literature review of studies from 2000 onwards on typed exams, assessing the psychological and academic equivalence of moving from handwritten to typed examinations. The various studies surveyed yielded inconsistent results; nevertheless, the findings prompt a number of questions to consider when moving essay-based examinations to typed ones.

  • Change requires strong project management

    Assessment processes involve multiple stakeholders and facilitators: professional support staff, admin staff, estates, IT, academic staff, students and invigilators all need to be involved, informed and on board in order to move successfully to digital assessment. Learning technology and educational development staff have a critical role in working with academics to ensure that they engage with the process and do not simply replicate existing practice. Moving online should present an opportunity to design assessment that is in line with the course learning outcomes, has clear links between formative and summative assessments, and is balanced across the course.

  • Electronic assessment may lead to more inclusive assessment

    In his keynote on why universities should digitise examinations, Dr Torben K Jensen raised the ‘generation argument’ in terms of fairness: handwritten exams are far removed from students’ everyday activities. Making spell checkers, screen readers, remote assessment and other assistive technology available to everyone can reduce the need for individual adjustments. More work is needed to understand the impact of moving to electronic assessment, but Brunel University reported that they received no appeals relating to the move to electronic exams. As mentioned above, changing assessment can provide an opportunity to rethink assessment and even move away from examinations. Many institutions demonstrated digital assessment in various forms, including oral presentations, video submissions, multiple choice questions, simulations and group projects.

  • Feedback can be electronic too!

    Feedback on work in HE has been similarly slow to move to electronic form, yet handwritten comments are often hard to read and slow to produce and distribute compared with typed comments. Many institutions moving to electronic assessments are shifting the entire process online. Professor Denise Whitelock from the Open University presented the final keynote on the various ways that technology can be used to train and support teachers to give useful and supportive feedback. She has been involved in creating several automated feedback tools for students and highlighted the impact feedback can have on students’ learning.

Pilot e-exams at LSE

Our presentation focused on three past LSE pilots that took place in order to:

  • Explore students’ perceptions of typing versus handwriting exams.
  • Test out online examination software.
  • Evaluate the requirements for online examinations including: security, regulations, facilities, training and support.

All three pilots were for formative assignments which provided feedback ahead of final examinations. In each case various software packages were compared, and the departments made the final selection of platform in line with their individual requirements.

Two of the pilots were in the Law Department, which ran take-home mock examinations using ExamSoft. Students accessed the examination questions and typed one essay response, from a choice of three, within two hours. They were given five working days in which to access the questions and it was up to them to find a suitable space to type their response (see full report here).

The third pilot was with the Government Department, which ran a mock on-campus invigilated examination using Exam4 (see full report here). Students brought their own devices and typed four essay questions (from a choice of 16) within three hours. Exam questions were given in hard-copy format, with extra information provided to invigilators. In both cases students were given opportunities to test out the software in advance. Both pilots were evaluated through questionnaires and focus groups with students, as well as feedback from staff.

Overall, students welcomed the typed examinations and many appreciated producing a typed script that was more legible for examiners to read. Some students, however, were concerned that examiners might expect better-quality answers from typed scripts, even though they were produced under exam conditions. Several students found editing their examination answers easier when typing, but others felt penalised by their slow typing skills. Some students believed the cognitive process of typing an exam answer differed from handwriting one, and that grammar and spelling errors were harder to spot when typing. The institutional implications identified for scaling up typed examinations include a substantial overhaul of the regulations, provision for students who cannot use their own device, and adequate student support and training. The full evaluation reports of the pilots can be found on LSE Research Online.

Next steps

The conference gave many detailed examples of the complexities involved in digital assessment as well as some of the potential benefits. Going forward at LSE, the Assessment Service Change Project (ASCP), led by Cheryl Edwardes, Deputy Head of Student Services, is collaborating with staff and students to design enhanced assessment processes and systems which incorporate best practice and expert knowledge from across the School community and the wider HE sector. If you wish to learn more and/or share your views, you can sign up to attend any of the Validation Workshops. Moreover, the Assessment Working Group, led by Dr Claire Gordon, Head of the Teaching and Learning Centre, is taking forward work on the following areas: i) assessment principles, ii) good practice in assessment design, iii) inclusive practice in assessment, and iv) quality assurance and regulatory arrangements in assessment. The Law Department is also currently trialling a small-scale proof-of-concept exam using DigiExam on iPads with keyboards, with devices provided for students.

LTI is involved in all the above initiatives, supports courses and programmes in the use of electronic assessment, and is working with several departments to move their processes online. Please contact LTI.Support@lse.ac.uk if you would like to discuss this further with us.

Learning Technology Ideas Exchange

Cultivating Innovation

Click to sign up to the Learning Technology Ideas Exchange

“Learning Technology Ideas Exchange” is an opportunity to get inspired, meet colleagues, exchange ideas and discover ways to improve teaching and learning with technology!

In a series of informal café-style presentations, LSE colleagues will share insights into the technologies they use for teaching and learning and explain the educational rationale behind their work. You will have the opportunity to ask questions and discuss.

Posters from various LTI projects will be on display during lunch, which will be provided by LTI. Learning Technologists will be available to answer any questions throughout the event.

There will be a further opportunity to ask questions in a plenary before we wrap-up.

Sign up to the event via the training system (please note this is an event for LSE staff only).

Programme: themes and presenters

11.00 - 11.10  Welcome (tea and coffee provided)
11.10 - 12.10  Café 1
  • Table 1 - e-assessment: Sara Geneletti (Statistics), Elisabeth Grieger (Mathematics)
  • Table 2 - General Innovation: Francesco Panizza (Government), Kay Inckle (Sociology)
12.10 - 12.20  LTI Update
12.20 - 13.00  Lunch (provided)
13.00 - 14.00  Café 2
  • Table 1 - Students as Producers: Jennifer Jackson-Preece (European Institute), Catherine Xiang (Language Centre)
  • Table 2 - e-assessment: Edgar Whitley (Management), Lourdes Hernandez-Martin (Language Centre)
14.00 - 14.30  Plenary: group discussion and questions

When: Monday 23rd May
Time: 11am - 3pm
Location: Lower Ground of Parish Hall (PAR.LG.03)

‘Assessment and Feedback with technology’ project 2014/15

Over the academic year 2014/15 LTI led several projects aimed at improving assessment practices with technology at LSE.

The following are the outcomes of the work carried out as part of the assessment and feedback with technology project:

Research

Read the e-Assessment Practice at Russell Group Universities report.

A survey was distributed to Russell Group universities to identify the level of engagement with e-Assessment practice and the factors conducive and critical to e-Assessment engagement.

Read the Assessment and Feedback with Technology at LSE report (please request a copy of the report).

Interviews with LSE departments were carried out to identify the level of engagement with e-Assessment practice and to understand the factors that encourage participation, as well as the barriers involved.

Pilots

A series of pilots was run with various departments to explore the pedagogical benefits of assessment and feedback with technology:

  • Government (GV100): read the GV100 e-Assessment pilot study report.
    Characteristics: timed, on-campus, invigilated and typed formative exam, followed by online self/peer review and face-to-face student-teacher feedback.
    Technologies used: Exam4, Bring Your Own Device (BYOD), Moodle-Turnitin (TII) PeerMark.
  • Law (LL205 & LL4K9): read the Law e-Assessment pilot study report.
    Characteristics: timed, take-home and typed formative mock exam.
    Technologies used: ExamSoft.
  • LSE100: read the LSE100 portfolio assessment pilot study report.
    Characteristics: e-portfolio for summative assessment.
    Technologies used: Moodle assignment.
  • Moodle-Turnitin (TII) integration: read the Moodle-TII integration pilots report. The integration was piloted in the following courses:

  • Statistics (ST327)
    Characteristics: Originality checking
    Technologies used: Moodle-TII integration
  • Philosophy (PH400 & PH201)
    Characteristics: Originality checking, TII GradeMark
    Technologies used: Moodle-TII integration, iPads
  • Media and Communications (MC425 & MC419)
    Characteristics: Originality checking, TII GradeMark
    Technologies used: Moodle-TII integration
  • Government (GV100)
    Characteristics: TII PeerMark
    Technologies used: Moodle-TII integration

Decisions made for the Moodle-TII integration: where the integration worked, the feedback was largely positive. Where it did not fully work, the issues identified were significant and cannot be ignored. In most cases workarounds provided solutions; however, because of the remaining uncertainty around the functionality of the plug-in, LTI will not scale up the Moodle-TII integration but will continue supporting it in the form of pilots. The plug-in will therefore be made available on request to those who want to use it (i.e. teachers can request the plug-in from LTI for any given Moodle course).

If you want to take part in Phase 2 of the Moodle-TII integration (i.e. use the plug-in for your Moodle course(s)), please email us at lti.support@lse.ac.uk.

Visit our Moodle site for details of Moodle-TII integration phase 2, the database of issues identified, and the participating pilot users.

LTI Grants

The following LTI Grant projects relate to e-Assessment. Find out more about the LTI Grants (e-Assessment innovation strand) and LTI Grant winners, or apply for an LTI Grant.

  • The social construction of human rights violations: e-Bricolage project,
    Pete Manning, Department of Sociology
    Use of peer assessment for an e-bricolage project, using resources produced for exam preparation and essay preparation.
  • From E-marking to E-feedback: training, applying and evaluation,
    Catherine Xiang & Lourdes Hernández-Martín, Language Centre
    Exploring new ways of marking and giving feedback (Moodle, iPads with annotation apps, Snagit).
  • Integrating offline marking and online moodle feedback using iPads,
    Ellen Helsper, Media & Communication
    Teachers using iPads and the Moodle-Turnitin integration to mark and give feedback on formative coursework (uploaded by students on Moodle).
  • Global perspectives via documentary and peer-assessment,
    Catherine Xiang, Language Centre
    Use of videos in continuous assessment, with peer review of the documentaries created fully embedded in the continuous assessment.
  • Using film in urban planning analysis,
    Nancy Holman, Geography
    Creation of short interpretative films, along with written work and a presentation, following fieldwork. The student-produced films are formatively assessed by a panel of staff in the department and form part of the presentation students make at the end of the course.
  • Moodle-based group assessment for regression analysis using the R software,
    Sara Geneletti, Statistics
    A project looking into replacing a written report with a three-part assessment: i) an R script, ii) a statistics Moodle quiz, and iii) a Moodle quiz report based on the analyses.
  • Electronic marking and feedback with iPads (Phase II),
    Lourdes Hernandez-Martin & Mercedes Coca, Language Centre
    Exploring iPad apps to improve assessment and feedback.

Guidelines

The following guidelines were produced to cover the needs of innovative practice:

Testing and evaluation of technologies and tools


LTI show and tell on assessment with technology, 11 November 2014

LTI recently held a show and tell on assessment with technology with colleagues from LSE, UCL and Westminster. The event was well attended and provided an opportunity to find out about the varying ways that technology is being used to innovate assessment.

The show and tell event is part of the work being done by LTI to promote assessment with technology at LSE. The project aims and outcomes outlined by LTI learning technologists Athina Chatzigavriil and Kris Roger can be seen here. If you are interested in being involved in the working group on e-assessment, or have examples of e-submission, e-marking, e-feedback or e-return, please get in touch by emailing lti.support@lse.ac.uk.

A lecture capture recording of the event (slides and audio) is now available here (LSE login required) or you can read a brief summary of the presentations below.

Alternatives to examinations

Professor George Gaskell started off the event with a brief outline of the changes taking place in LSE100, the compulsory course for undergraduates at LSE. The LSE100 Director explained that the course team are currently investigating alternatives to exams. Using the learning outcomes of the course as the basis for assessment, they have been developing a portfolio of activities that will allow students to demonstrate their ability to apply social scientific methods, concepts and theories to real-world problems. Assessments will have to allow for ‘exit velocity’, letting students take risks in their first year and allowing for the progress of learners over their two years at LSE, while also preventing strategic planning by requiring all components to be completed. The process is still at the development stage, so watch this space for updates.

Peer assessment

Dr Irini Papanicolas, from Social Policy, gave the second presentation on her work with Steve Bond in LTI on peer assessment. Dr Papanicolas discussed how she changed the assessment on the course SA4D4 from 100% exam to 50% exam and 50% presentation. She used ‘WebPA’ to enable students to rate their peers’ presentations using the course mark frame. Although peer assessment was an optional part of the assessment, all the groups volunteered feedback and there was a positive response to the process, which generated discussion within the groups on the assessment criteria.

Dr Papanicolas will be using ‘TeamMates’ this year, as it will allow students not only to rate their own group’s presentation but also the individual contributions within the group.

From peer assessment to peer teaching and learning…

Kevin Tang then reported on how ‘PeerWise’ has been used at UCL. Kevin has been working with Sam Green and Stefanie Anyadi in the Department of Linguistics to use the platform with 50 undergraduate and 50 postgraduate students. PeerWise allows students to create, answer and discuss questions. Students can rate feedback and are scored on their own contributions; at UCL these contributions are then worth a small percentage of the summative mark for the course.

Research into using the interface indicated that it was important to provide support for students to ‘think like an examiner’, with example questions and training on giving constructive feedback. Academic staff attitudes also played a crucial role in student engagement, along with setting regular activities and deadlines.

As most examiners will know, it is quite hard to create good questions, so UCL asked students to devise questions in groups and found that the questions improved over time, with students in mixed-ability groups appearing to benefit the most. The platform provided a space for interaction: students gave each other detailed feedback, which was then used to work on future questions, and students were still using the system in the run-up to the exam for revision purposes.

Games and assessment in Law

Dr Vassiliki Bouki, Principal Lecturer at the University of Westminster, talked about the use of games in assessment. Dr Bouki demonstrated the ‘law of murder’ game, which was developed in Articulate Storyline and used as an alternative to coursework for a second-year criminal law module. The game presents a real-life scenario, assesses critical thinking, and allows students to role-play thinking like a lawyer. Students are given two hours to complete several small tasks in an open-book environment. The game is currently in use, so data and feedback from students will be available later in the year.

Word processed timed assessments and online feedback

Dr Sunil Kumar, Lecturer in Social Policy and Dean of Graduate Studies, talked about his experiences over three years on the course ‘Urbanisation and Social Policy’. Concerned about how much students were actually learning under the traditional model of examinations, Dr Kumar introduced a two-hour online formative assessment into his course. Students typed their answers to short-answer and long-answer questions under examination-style conditions. Dr Kumar was then able to read and mark submissions on his iPad and upload the anonymised assessments with annotated feedback to Moodle for all students to see. The formative assessments have had 100% attendance, and students are able to learn from other students’ submissions, encouraging them to review topics they have not yet covered in preparation for the summative examination.

More information about the project can be found on our blog post.