ITS Capital Projects has provided funding for a one-year pilot project that will explore the software, hardware, people and infrastructure factors required to deliver online and electronically managed exams at scale and, at the end of the academic year, make recommendations to the university about how to do so.
Why are we developing capacity for digital exams and assessment?
There are a number of potential benefits that this project could deliver for students, academic staff and administrators:
- Research suggests that students value receiving feedback faster – running exams online can speed up marking and feedback.
- Students can amend their answers easily and submit work – e.g. worksheets or essays – that is more legible than handwritten equivalents.
- Printing less has a positive environmental effect.
- Most coursework is already word-processed, so handwritten summative exams now feel slightly anachronistic and break the continuity of media students work in.
- These kinds of exams offer the potential for more authentic assessment that is more closely aligned with the kind of tasks our graduates will encounter in the workplace.
- The potential to decrease workload, particularly through features such as automated marking, shorter and more easily marked formats, the legibility of typed scripts, and the ability to mark anytime and anywhere.
- A wide variety of question types is possible, and all kinds of media and third-party packages can be incorporated simply and easily, offering the potential to diversify assessment practice.
- Decreased printing costs, automation of online and second marking and integration with student record systems.
- Better quality management of exam content and individual questions via tools like item-bank analysis or randomised question generation.
- The ability to create banks of easily accessible reusable questions.
- External examiners can be granted access easily to online exams.
It is extremely difficult, though not impossible, to run a completely online exam at QMUL. Some schools are already very proficient at doing so, particularly Undergraduate Medicine (MBBS) and the Language Centre (Pre-sessionals) in SLLF. Current practice involves using specialist software that is not centrally supported, or using QMplus. Some schools don't necessarily want to run exams on computers, but would like to store and manage large banks of exam questions online and continue paper-based delivery with subsequent online processing of grades and feedback.
Our current infrastructure is not geared to support either running summative exams on computers or a hybrid of computer- and paper-based delivery. Supporting this effectively requires large modern PC labs, trained invigilators, simple exam booking systems, secure browsers, accessible power supplies, scanning equipment for answer sheets, and so on.
Given this context, funding has been made available for this pilot project, which will explore these software, hardware, people and infrastructure factors and, at the end of the academic year, make its recommendations to the university.
The project also has an infrastructure dimension: if we chose to scale up provision after the pilot, there would need to be a clear process for the creation, delivery, marking and feedback of a digital exam, with clarity about roles and responsibilities and the obvious logistical challenges taken into account. This might involve the teams and departments who work on exam policies and procedures, invigilation, room bookings, AV design for teaching spaces, external examiners, campus IT support, and so on.
The first step for the project will be purchasing a software system for the authoring, management, delivery, marking and feedback of exams, and for the management of grades at the other end of the assessment lifecycle. This applies to both paper-based and fully online exams.
The system would then be linked to QMplus in the first instance for the transfer of grades and feedback.
After that, around the end of November 2018, we'll begin working with a number of pilot modules whose teams can join the project team to evaluate the software and make recommendations about the most effective ways to support its delivery. We will run these assessments across a range of subjects in Semester B of 2018/19.
Can I get involved?
Yes you most certainly can!
We want to work closely with selected schools and modules/courses to pilot the functionality of the systems and the associated processes and infrastructure that support them. We are now looking for expressions of interest in becoming involved with the pilot in Semester B of the 2018-19 academic year. Please complete the Expression of Interest form.
If you have already tried online exams and would like to switch to the trial software, or if you are new to online exams but would like to try them out with your students for final exams, we would be very interested in hearing from you. In return, we offer a unique opportunity to join us in the selection, design, training, delivery and evaluation of a new system.
Will we have to move all our exams to online exams?
No, there are currently no plans to make online exams compulsory at QMUL. This project is exploring, and making recommendations about, how the institution can develop its processes and systems to support them efficiently.
I am not doing Multiple Choice type exams so does this initiative apply to me?
It might. We intend to support MCQs, short answers, long answers (essays), coding, chemistry, maths and other question types. Get in touch with the project team if you have a specific requirement.
I thought we already had software for this kind of thing?
Not exactly. QMplus has a quiz activity that can be used for assessment, but it is not secure enough, nor does it have the required analytics or question-bank management tools. The medical school also has a system called 'Rogo', and other set-ups exist, but none of them meet all the requirements of a university-wide solution. Hence this project.
What about paper-based exams?
Many colleagues would like to take advantage of aspects of digital exam software: for example, the ability to store questions, to co-author questions in teams simply and easily with version control, or to analyse the performance of multiple-choice questions (item-bank analysis). The software we select will enable exam papers to be printed and grades to be manually inputted or scanned via OCR sheets.
How can I find out more?