Welcome to the Dewis e-Assessment System
This web page is aimed at those who have yet to use Dewis and who wish to know more about this e-Assessment system.
The Dewis system has been designed and developed by a team of Mathematicians, Statisticians and Software Engineers
at the University of the West of England, Bristol (UWE). It is an e-Assessment system initially designed for the assessment of Mathematics and
Statistics, but it can also be used in other subject fields.
At UWE, Dewis has been used for both formative and summative e-assessments across a number of modules, delivered
to students in awards in the fields of Business, Computer Science, Nursing, Software Engineering, Engineering, and Mathematics and Statistics.
The Motivation behind Dewis
The creation of the system was a direct response to the shortcomings of other e-Assessment systems used in the Department of Mathematics and Statistics
over the past two decades. The following list of
requirements was fundamental to the creation and subsequent development of Dewis. We required a system that:
• is independent of commercial software due to previous difficulties with licences, support and version upgrades;
• is robust and efficient with performance mostly independent of the client's machine or browser;
• is flexible to support and encourage UWE-based innovation in e-assessments;
• supports a fully algorithmic approach to question generation, marking and feedback;
• allows academics to easily write and manage their own question code;
• employs a lossless data principle:
○ which, in conjunction with a complete Reporter, allows for detailed analysis of all assessment attempts. This is
particularly important for summative e-assessments where we wanted to eliminate the 'have another go' culture.
○ which supports the process of assessment evaluation for the purposes of pedagogic research.
• is easily portable to other departments, faculties and universities.
Key Features
The key features of Dewis arose both from the requirements emanating from the motivation behind Dewis and from
system requirements drawn up by academics at UWE, who have extensive experience of using various
e-Assessment systems across a broad range of subjects.
Here we supply a brief summary of what we consider to be the key features of Dewis:
- Algorithmic question generation, marking and feedback
• Question parameters are randomly generated on the fly. This includes the facility to reverse engineer question parameters.
• The marking algorithm allows for follow-on marking and/or partial display marking.
• The marking algorithm facilitates verification marking.
• A large library of marking algorithms for a wide range of different input types.
• The marking and feedback algorithm allows for detection and reporting of common student errors.
• Data analysis of the lossless data allows for detailed assessment evaluation. For example, it facilitates the detection of new
common student errors which can be fed back into future and/or previously-run assessments.
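The algorithmic approach described above can be illustrated with a minimal sketch. All function and parameter names here are hypothetical (Dewis's actual question code is not shown on this page): parameters are generated on the fly from a seed, the question is "reverse engineered" by choosing the intended answers first, and follow-on marking awards credit for a later part that is consistent with the student's own (possibly wrong) earlier answer.

```python
import random

def generate_question(seed):
    """Generate question parameters on the fly from a seed (hypothetical scheme)."""
    rng = random.Random(seed)
    # Reverse engineering: pick the intended roots first, then build the
    # quadratic from them, guaranteeing the question has 'nice' answers.
    r1, r2 = rng.randint(1, 9), rng.randint(1, 9)
    b, c = -(r1 + r2), r1 * r2          # x^2 + b*x + c has roots r1, r2
    return {"b": b, "c": c, "roots": sorted({r1, r2})}

def mark_parts(params, student_root, student_double):
    """Follow-on marking sketch: part (b) asks for twice the root found in
    part (a). Part (b) earns its mark if it is consistent with the student's
    own part (a) answer, even when part (a) itself was wrong."""
    part_a = 1 if student_root in params["roots"] else 0
    part_b = 1 if student_double == 2 * student_root else 0
    return part_a + part_b

q = generate_question(seed=2024)
# A student with a wrong root in part (a) but a consistent follow-on in part (b)
# still receives the part (b) mark:
wrong_root = max(q["roots"]) + 1
print(mark_parts(q, wrong_root, 2 * wrong_root))
```

Because the parameters are derived deterministically from a stored seed, exactly the same question instance can later be reproduced for reporting or re-marking.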
- Different Question Input Types
DEWIS supports all of the following input types, each of which includes pre-submission checks on the student's input. For example, with a free-form matrix input,
the question may check whether the student's input is a rectangular block of numbers.
• Numerical inputs - integer or floating point. The floating-point input can be either free form or constrained to specific precision.
• Matrices and vectors. Either free-form (a text-box) or constrained to specific size and/or precision.
• Algebraic expressions. Algebraic expressions can be of arity up to three with the arguments determined in the question. Dynamic checking of the validity of the student's input is performed.
• Strings.
• Multiple-choice and multiple-selection. Questions involving 'selections'. The order of the options may be fixed, randomised, or a mixture of both. For example, one may want the
'none of the above' option to be fixed as the last option but to randomise all the other options.
• Graphical input. The student may input their answer by, for example, selecting a point or a line on a graph.
• Computer programs. Students may input computer programs (e.g. written in C or R) with Dewis testing outputs against randomised inputs.
Examples of these input types are available on the 'Showcase Questions' link on the main page.
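A pre-submission check of the kind described above, for the free-form matrix input, might look like the following sketch. The function name and messages are illustrative assumptions, not Dewis's actual code; the point is that the input is validated before submission, without marking the answer.

```python
def check_matrix_input(text):
    """Pre-submission check (illustrative): is the student's free-form input
    a rectangular block of numbers? Returns (ok, message). Note that the
    answer is only validated here, not marked."""
    rows = [line.split() for line in text.strip().splitlines() if line.strip()]
    if not rows:
        return False, "No input supplied."
    if len({len(r) for r in rows}) != 1:
        return False, "Rows have different lengths - not a rectangular block."
    for row in rows:
        for entry in row:
            try:
                float(entry)
            except ValueError:
                return False, f"'{entry}' is not a number."
    return True, "Input accepted."

print(check_matrix_input("1 2 3\n4 5 6"))   # accepted: rectangular, all numeric
print(check_matrix_input("1 2\n3"))          # rejected: ragged rows
```

A check like this lets the student correct a malformed entry before the attempt is recorded, rather than losing marks to a formatting slip.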
- Academics' Management
The academics' management of the system is via a web interface. A snapshot of the front page of the management area is given below for a particular group (module) called
'ublmss_30_1'.
The sub-categories on the management page are as follows:
• Users' Manager. Allows the academic to manage the student users registered on the system. One function here is to register students
who require more time on assessments due to accessibility needs.
• Assessment Manager. Allows the academic to create and manage assessments. This section also includes the extensive assessment reporter. Each assessment
is made up of one or more questions from the public and/or private question banks.
• Public Question Bank. Allows the academic to view and try out DEWIS questions that reside in the public question bank. These questions will have
passed a moderation process for inclusion in the bank. The question bank lists questions according to their general classification or according to the keywords associated with
them. Academics may copy any public question to the private question bank in order to view/edit the question's code.
• Private Question Bank. Allows the academic to create/manage/alter questions that are bespoke to their own group (module). At the end of every academic
year we invite academics to submit their favourite questions to the public question bank.
- Lossless Data Collection
The data for every assessment attempt is recorded and stored on the DEWIS server. This means that any assessment attempt can be reproduced both in terms of the questions
that were asked to a particular student at a particular attempt and also in terms of the student's response and the feedback and marks supplied to the student.
There are significant advantages to this lossless data, which include the following:
• Students can view the complete data of their previous attempts including a reproduction of the questions they were asked, the answers they
supplied, and the feedback and marks they received. This is a very popular feature for students.
• If a student prematurely terminates an assessment for any reason, they may return to the same assessment by logging back into DEWIS, provided
that they are within the allocated time interval for that assessment. A number of students make use of this feature (as detected by the assessment reporter).
• With the academic having access to the complete assessment information, via an easy-to-use reporter interface, student queries can be addressed in a prompt and professional manner, giving students confidence that they have been treated fairly.
• Assessment evaluation can be performed on any assessment. The academic's reporter facilitates the efficient analysis of students' performance
in any assessment by means of 'student performance flags'. Instead of viewing student performance based purely on the marks awarded, the academic can easily identify which
questions have, for example, been answered incorrectly, not answered, or answered partially correctly. These 'student performance flags' also contain information regarding
the triggering of common student errors.
• If any previously unidentified common student error is identified, the academic has the option of encoding this back into an assessment's question
and inviting the students to view their own assessment data to see whether they triggered the new common student error. Alternatively, the academic may wish to code
the common student error into subsequent assessments. The reporter facilitates the analysis of common student errors by using 'student performance flags', which
enable the academic to quickly view the outcome of each assessment attempt.
• Re-marking to obtain retrospective feedback and/or marking: At the end of an assessment period, an academic may, retrospectively, decide to alter either the feedback algorithm and/or
marking algorithm. Examples of where this may occur include:
• In performing a post-assessment data analysis of the students' performance, previously unidentified common student errors may be
identified. The academic may wish the students to be informed as to whether they triggered
any of these new common student errors.
• The students' own evaluation of an assessment includes suggestions regarding the improvement of the
feedback that the academic feels are worth addressing.
• The marks awarded for a partially correct answer were deemed too harsh.
In such cases, the academic may alter the question's code and perform a 're-mark' of the whole assessment. The students would then be invited to log on to the (now closed)
assessment to view their previous attempts; any recently altered feedback and/or marks would be accessible to the student at this stage.
We believe this feature to be unique to DEWIS.
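Because every attempt is stored losslessly, a re-mark is conceptually just a replay of the stored responses through a revised marking function. The following is a minimal sketch of that idea; the data layout and function names are assumptions for illustration, not the DEWIS implementation.

```python
def original_mark(answer, correct):
    # Original all-or-nothing marking scheme.
    return 2 if answer == correct else 0

def revised_mark(answer, correct):
    # Retrospectively softened scheme: 1 mark if within 5% of the correct value.
    if answer == correct:
        return 2
    return 1 if abs(answer - correct) <= 0.05 * abs(correct) else 0

def remark(attempts, mark_fn):
    """Replay every stored attempt through a (possibly revised) marking function."""
    return {a["student"]: mark_fn(a["answer"], a["correct"]) for a in attempts}

# Stored lossless attempt data (illustrative):
attempts = [
    {"student": "s1", "answer": 100.0, "correct": 100.0},
    {"student": "s2", "answer": 98.0,  "correct": 100.0},
    {"student": "s3", "answer": 50.0,  "correct": 100.0},
]
print(remark(attempts, original_mark))  # {'s1': 2, 's2': 0, 's3': 0}
print(remark(attempts, revised_mark))   # {'s1': 2, 's2': 1, 's3': 0}
```

The stored attempts are never modified; only the marking function changes, so the original and revised outcomes can both be reported to the student.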
- Student Friendly Features
An enhanced student experience of e-Assessments was, and remains, a key target in the design and development of DEWIS.
The following is just a sample of the student-friendly features included in the DEWIS system and/or individual DEWIS questions:
• Shutdown recovery.
• Pre-processing checks on student input.
• Through-ticketing log-on from the institution's VLE without additional authentication (uses SCORM).
• Students can view their previous assessment attempts, including feedback. (Academics may disable this feature)
• The option of anonymous assessments for purely formative questions.
• Questions involving data-sets may offer the option for the student to upload the data-set.
- Independent of Commercial Software
The Dewis system is independent of commercial software. As a result:
• the sustainability of the system is strengthened;
• the system may be easily distributed to other institutions;
• bespoke requests by academics can be actioned promptly.
- Robust and Efficient
The DEWIS system is a robust and efficient system that is heavily utilised at UWE and other institutions.
• More than 25,000 summative assessment attempts in the 2015-2016 academic year.
• Hybrid Server/Client-side code supports the robustness of the system.
• Question writing is straightforward; a number of academics at UWE and elsewhere are coding their own questions.
Demonstrations
- 2020 Pedagogic Project supported by Dewis.
- View DEWIS at mathcentre - to see examples of assessments in anonymous formative mode.
- HEA STEM presentation - part of the 'E-assessment in the Mathematical Sciences' workshop (Middlesex University 2014)
- Showcase Questions - to see examples of individual questions that may be included in assessments.
- View the Public Question Bank - password protected - contact us to obtain the password.