How to create fraud-resistant unsupervised assessments
Looking to learn how you can make sure that students do not feel the need to commit fraud? Then read more below!
1. Guiding principles & background information
In order to give students grades that reliably represent how well they master the learning objectives (LOs) of a course, we must:
- Assess these learning objectives (and nothing more).
- Make sure that students deliver work that reflects their own level of LO mastery.
In other words, we must prevent and detect fraud, and make sure that students can perform optimally during the exam.
This is the TU Delft definition of fraud:
“Fraud is taken to mean any act or omission by a student that makes it fully or partially impossible to properly assess the knowledge, insight and skill of that student or another student.”
Source: the model Rules and Guidelines of the Boards of Examiners, article 7.1
Fraud prevention measures should not hinder student performance during the assessment. They should contribute to the quality requirements and values for assessment, respect the privacy of students, and the other eight requirements for remote assessment.
Fraud suspicions, decisions, and appeals
- Suspicion: Examiners must be vigilant of fraud and report any suspicion of fraud to their Board of Examiners, following the procedure of their Board of Examiners (see the Rules and Guidelines that are applicable to your course)
- Decision: The Board of Examiners (BoEx) of the student’s programme decides whether it is an actual case of fraud, and decides upon disciplinary measures for the student (i.e. it is not the examiner who decides this).
- Appeal: Students can appeal against the decisions of the BoEx to the Examination Appeals Board (EAB, Dutch: College van beroep voor de examens, CBE), and they can appeal against the decisions of the EAB to the Council of State (Dutch: Raad van State; NB, there used to be a Higher Educational Appeals Tribunal until 1 January 2023, Dutch: College van Beroep voor het hoger onderwijs, CBHO). The examiner can be heard by the Board and Council.
- Grades should represent how well individual students master the learning objectives.
- TU Delft students want their degree to reflect that they meet the final attainment levels, and they want their grades to reflect a personal performance they can be proud of, so they will not commit fraud.
- Students are more likely to commit fraud if they know that fraud goes undetected.
In unsupervised assessments, there is some level of uncertainty as to whether the delivered work is (solely) the product of the student(s) who handed it in. Examiners can take measures to ensure that this was the case. These measures are described below.
Lecturers are advised to discuss with their students whether and how they may use tools like ChatGPT and sources like Studeersnel, and how they should report the use of these tools in their deliverables. Most importantly, explain the background of your standpoint. Besides AI tools, there are many other (online) tools and sources available that students can use to come up with correct answers without mastering the learning objectives (e.g. Studeersnel).
2. Prevent fraud during the course (before the assessment)
- Make sure that students are prepared for the exam, so they do not feel the need to commit fraud. Have them practice with representative questions/assignments and give them feedback (e.g. model answers).
- Inform them of what fraud is, what the consequences are, and which measures you have taken. More information for students on fraud can be found here.
2.1 Overview of measures and their pros and cons during the assessments
Find which fraud prevention measures are available for your assessment type (column) in the ‘summary of fraud prevention measures during assessments’ table.
Choose balanced fraud prevention measures based on their effectiveness (column on the right) and ‘stress inducement’ (rightmost column) in the tables with ‘pros and cons per fraud prevention measure during assessments’.
2.2 The fraud prevention measures during the assessment are grouped as follows:
For each measure below we list its description, whether it is required or advised, the risk it reduces (effectiveness between brackets), and the stress it induces during the exam (– = a lot, 0 = none).

2a. Open-ended questions/assignments only
- Description: no multiple-choice questions etc.
- Required/advised: strongly advised
- Reduces risk of: sharing answers undetectably (+)
- Stress inducement: 0 (if prepared)

2b. No factual questions
- Required/advised: required for all open-book assessments
- Reduces risk of: copy-pasting from the book (++)
- Stress inducement: 0

2c. No recycling of existing questions
- Description: reformulate previous questions completely and use different parameter values, names and characteristics
- Required/advised: required in all cases
- Reduces risk of: googling the answer, or copy-pasting from last year (++)
- Stress inducement: 0

2d. No standard questions that can be googled
- Required/advised: required in all cases
- Reduces risk of: googling the answer (++)
- Stress inducement: 0
For each measure below we list its description, when it is required or advised, the risk it reduces (effectiveness between brackets), and the stress it induces during the exam (– = a lot, 0 = none).

3a. Parametrizing numerical assignments/questions
- Description: for numerical questions, give students different variable values within a physically relevant range.
- Required/advised: required for numerical questions; Ans accommodates this
- Reduces risk of: sharing answers (++)
- Stress inducement: 0

3b1. Different version per question: small variations of the question
3b2. Different version per question: different questions that are equivalent in difficulty and topic
- Description: to make it more difficult for students to share answers to questions (specifically answers to knowledge questions), give each student different exam questions, chosen from a question pool with equivalent questions.
- Required/advised: advised for open questions, required for closed-ended questions (MCQ, true/false, etc.)
- Reduces risk of: sharing answers, cooperating (+ for small variations, ++ for equivalent different questions)
- Stress inducement: 0

3c. Different versions on exam level (exam versions)
- Description: make sure not to show version names.
- Required/advised: advised as a low-tech alternative to the previous two measures
- Reduces risk of: sharing answers, cooperating (+)
- Stress inducement: 0

3d. Different cases or datasets
- Description: the questions can be the same, but not the answers to the assignment/questions.
- Required/advised: advised for case studies, computational assignments, etc.
- Reduces risk of: sharing answers (+)
- Stress inducement: 0
For each measure below we list its description, when it is required or advised, the risk it reduces (effectiveness between brackets), and the stress it induces during the exam (– = a lot, 0 = none).

4a. Oral authenticity checks (after the exam)
- Description: video calls during which the examiner assesses whether a randomly selected student can explain why/how they produced their solution.
- Required/advised: advised
- Reduces risk of: not being able to explain one’s own answers by the end of the exam (+)
- Stress inducement: 0

4b. Identity check before oral exam / online proctored exam
- Description: may not be recorded!
- Required/advised: required for all oral exams
- Reduces risk of: someone else doing the oral exam (++)
- Stress inducement: 0

4c. Login with netID
- Required/advised: required; advised for oral exams

Online proctoring
- Description: official surveillance via camera, screen capture, key log and microphone, by US-based company employees.
- Required/advised: advised for closed questions, if no alternative is possible and closed-book is really necessary (see decision tree). Use of Ans (or Grasple) is required. Permission by the Board of Examiners is required.
- Reduces risk of: communication, use of resources (+; not watertight, additional measures needed)
- Stress inducement: —

‘Poor man’s proctoring’ (FORBIDDEN)
- Description: asking students to turn on their camera during the exam.
- Required/advised: never! This is NOT allowed! Online proctoring is the only legally permitted form of video surveillance.
- Reduces risk of: communication, use of resources (0, easy to bypass)
- Stress inducement: —

Asking students to upload a picture, selfie, their campus ID, etc. (FORBIDDEN)
- Description: during, before or after the assessment.
- Required/advised: never! This is NOT allowed!
- Reduces risk of: identity fraud (0); in case of identity fraud, the other student could simply leave the room temporarily.
- Stress inducement: 0/– (if during the exam, students are disturbed)
For each measure below we list its description, when it is required or advised, the risk it reduces (effectiveness between brackets), and the stress it induces during the assessment (– = a lot, 0 = none).

5a. Honor’s pledge
- Description: students promise explicitly not to accept or give help.
- Required/advised: advised
- Reduces risk of: fraud in general (0/+); makes students aware of what is not allowed.
- Stress inducement: 0

5b. Random order of questions
- Description: randomize the order of the questions.
- Required/advised: only if the order of the questions still makes sense and does not have illogical jumps
- Reduces risk of: sharing answers, cooperating (0/+); students might still communicate the answers to the first question
- Stress inducement: 0 (– if the question order is illogical)

5c. Limited time slots
- Description: the exam is split into partial exams; timeslots should preferably last 1 hour or longer, with a maximum of 4 timeslots per exam.
- Required/advised: advised for larger groups in case of risk of cooperation
- Reduces risk of: sharing answers, cooperating (+)
- Stress inducement: -/– (the shorter the timeslot, the worse)

5d. Supervision during project
- Description: students are supervised during the project / larger assignment, so that the supervisor can estimate whether the level shown in the deliverable corresponds to the level shown during the process.
- Required/advised: advised
- Reduces risk of: free riding (not doing the work), unreported use of AI tools, using answers from the internet (+)
- Stress inducement: ++ (supervision makes students more confident and acknowledges their ownership of their work; it helps learning)

Preventing students from reattempting previous questions
- Description: students have to answer a specific subquestion before they can continue. If they do not know the answer, they either wait or fill in an incorrect answer.
- Required/advised: never
- Reduces risk of: sharing answers, cooperating (+)
- Stress inducement: — (and you do not test whether students mastered the LOs)

Requiring students to answer multiple times if incorrect after the first try
- Description: compensates for the lack of partial points in automatic grading. Students have to answer a specific (sub)question before they can continue. If they do not know the answer, they either wait or fill in an incorrect answer multiple times.
- Required/advised: never
- Reduces risk of: none (0)
- Stress inducement: — (students only receive negative feedback and cannot skip the question)

Asking too many questions
- Description: asking more questions than time allows, so students do not have time to cheat.
- Required/advised: never
- Reduces risk of: having time to cooperate (?; students could still divide the questions among themselves)
- Stress inducement: —
3. Prevent fraud in question construction
Change the exam into an open book exam with open-ended questions (i.e. no multiple choice, multiple select, true/false, etc.). This implies that you cannot ask remember level questions. This should be constructively aligned with the learning activities and learning objectives.
See here for more information on how to construct open-ended questions
Why can you not use multiple-choice questions in a remote exam? Answers to open-ended questions, especially those that require longer answers at higher levels of Bloom (click here for the TU Delft adaptation of Bloom’s taxonomy for engineering education), are not straightforward to share amongst peers. Furthermore, similarities in answers to open questions can be used to detect fraud.
If you want to know why closed-ended questions (multiple choice, yes/no, true/false, multiple select, etc.) are sensitive to fraud, and if there are exceptions read the FAQ below.
Transform your exam into an open-book exam instead of a closed-book exam. Questions on facts (declarative knowledge) are easy for students to look up during a remote exam, and it is difficult to prevent students from using their books or ‘cheat sheets’. Therefore, do not ask for factual knowledge, but aim for questions at higher levels of Bloom (Bloom, 1956; click here for the TU Delft adaptation for engineering education), which students can only answer if they master the learning objectives. The questions should be constructively aligned with the learning activities and learning objectives.
Authorise students to use all help available and provide them a list of sources that they are suggested to have available.
In case a learning objective requires reproduction of factual knowledge, consider whether this factual knowledge is crucial (for example in their professional lives) or not. In case a learning objective cannot be tested, discuss with the Board of Examiners and Programme Director whether this is acceptable.
In case you need to ask factual questions and an open book exam is not a possibility, you could consider oral examinations (depending on the number of students), or online proctored exams (only an option if online proctoring is allowed by your Board of Examiners).
In case of multiple choice with 3 alternatives (3 is the preferred number of alternatives for multiple choice questions), each exam should consist of ~54 questions in order for the grade to be reliable. This number of questions is considerably higher than for open questions.
Keep in mind that students can earn points by randomly guessing the correct answer. You will need to take guessing into account when calculating the grade from the score. Be transparent about this to your students and communicate this before, during (cover page) and after the test. More information can be found in the reader of UTQ module ASSESS.
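As an illustration of such a guessing correction, the sketch below maps the expected guessing score to the lowest grade and a perfect score to a 10. This linear scheme and the function name are our own example, not necessarily the formula prescribed in the ASSESS reader:

```python
def guess_corrected_grade(score, n_questions, n_alternatives):
    """Convert a raw multiple-choice score to a 1-10 grade, correcting for guessing.

    A student who guesses randomly scores n_questions / n_alternatives on
    average; that expected guessing score is mapped to the lowest grade (1),
    and a perfect score to a 10. Illustrative scheme only.
    """
    expected_guess = n_questions / n_alternatives
    if score <= expected_guess:
        return 1.0
    return round(1.0 + 9.0 * (score - expected_guess) / (n_questions - expected_guess), 1)

# With 54 three-alternative questions, pure guessing (~18 correct) yields a 1:
print(guess_corrected_grade(18, 54, 3))  # 1.0
print(guess_corrected_grade(54, 54, 3))  # 10.0
print(guess_corrected_grade(36, 54, 3))  # 5.5
```

Whatever formula you use, publish it to students before the exam, as advised above.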
4. Prevent fraud by creating unique assessments per student
Ask students the same, numerical questions, but with different numbers, so they cannot exchange the numbers.
Parametrization is used in numerical questions. For each question, all students use different numbers, chosen from a range that you determine. Therefore, the outcomes are different and it is not possible to commit fraud by sharing answers. In order to help each other, they would have to share the calculation steps, which is more cumbersome. Parametrization is possible in Ans, Brightspace quizzes (formative use only, arithmetic question types), in Möbius, and in Grasple (available for math service education only).
If you want to use parametrization in a Brightspace Assignment, you can determine the values students should work with based on a digit of their student number. You can vary which digit you use for each (larger) assignment, to prevent grouping. Example: use the 3rd digit of the student number in question 1, the last digit in question 2 and the second digit in question 3.
3rd digit of your student number, with the corresponding values for x and y:
- 0: x=1, y=6
- 1: x=3, y=6
- 2: x=2, y=4
- 3: x=4, y=6
- 4: x=3, y=4
- 5: x=4, y=3
- 6: x=2, y=3
- 7: x=1, y=5
- 8: x=2, y=4
- 9: x=3, y=5
Be aware that students might unintentionally use the incorrect values. Try to make sure that the values lead to equally difficult calculations, and have students practice with this system beforehand, to reduce the stress of seeing it for the first time during the exam.
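A sketch of how such a digit-based lookup could be automated, using the example values above (the function name and the example student number are hypothetical):

```python
# Parameter values per digit, copied from the example above.
PARAMS = {0: (1, 6), 1: (3, 6), 2: (2, 4), 3: (4, 6), 4: (3, 4),
          5: (4, 3), 6: (2, 3), 7: (1, 5), 8: (2, 4), 9: (3, 5)}

def values_for_student(student_number: str, digit_position: int) -> tuple:
    """Return the (x, y) values a student should use, based on one digit of
    their student number (digit_position is 1-based, as in 'the 3rd digit')."""
    digit = int(student_number[digit_position - 1])
    return PARAMS[digit]

# A hypothetical student 4712345 uses the 3rd digit (1) for question 1:
print(values_for_student("4712345", 3))  # (3, 6)
```

The same lookup can be handed to students as a printed table, so that both sides compute identical values.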
In order to make it more difficult for students to share answers to questions (specifically answers to knowledge questions), you can give each student different exam questions, chosen from a question pool with interchangeable questions. This can be done by using different Exam versions or creating Unique individual exams from question banks:
- Exam versions: Divide the students into groups and give each group a different version of the exam. The exam questions are different for each group, and the same within a group.
- Pro: easy to set up.
- Con: if students find out in which group they are, they can communicate within the group.
- Work-around: if the exam is divided into parts, you can change the grouping for the second part compared to the first grouping.
You can keep the first questions in each exam part the same, so that it is harder for students to find out which group they belong to.
- Question pools: Give each student a unique exam, by drawing interchangeable questions from question pools. Each question pool contains questions that are interchangeable in terms of learning objective/topic and difficulty.
- Pro: unpredictable questions, easy to set up in Brightspace quizzes
- Con: test result analysis only usable if you have large numbers of students
In both cases, you need interchangeable questions, which will take you more time to develop than in case of a traditional exam.
For examples, see the FAQ.
Multiple-choice questions are more sensitive to fraud, since it is relatively simple to communicate the answers to other students.
True/false questions are not good from an educational point of view: students tend to overthink statements they believe to be ‘true’, continuously suspecting that they have overlooked a detail that makes the statement false. This uncertainty may diminish their performance.
Changing the order of the options (answers) does not help, since students can still communicate ‘the answer that begins with …/ends on …/is the shortest’. Asking all students the same question at the same time does not help either: students can still communicate the answers.
Not allowing students to go back to a previous question will make them perform worse than on a standard exam: they will either waste too much time on that question, or skip it and feel sad about the lost point, possibly remembering the answer later and becoming terribly frustrated that they cannot go back. The grade will be lower and will not reflect how well they master the learning objectives.
Randomising the order of the questions will create illogical question orders for some students, and logical question orders for others. You should at least keep related questions together and if there is a logical order within a subject/learning objective, keep that order intact. Students can still communicate answers.
Asking students a unique set of questions from a database of questions is possible, but it is a lot of work to create such a database (‘question bank’), and it must contain good-quality questions that have proven to discriminate between well-performing and less well-performing students (the p-value and Rir-value should be known). This is normally done by analyzing (test result analysis) how well these or similar questions performed on previous, regular exams. The reason that you need questions of proven quality is that you cannot do a test result analysis afterwards to adjust the scoring for some questions (i.e. giving all students full points because something was wrong with the question).
Multi-select questions
These questions consist of a number of true/false statements and are prone to cheating. True/false statements are rather difficult to develop without giving students the feeling that they might have overlooked something, and that somewhere in the statement you put a clue that made it incorrect after all. Furthermore, if you are using Möbius, the grading is not transparent: incorrectly selecting an option is punished harder than forgetting to select an option. This is not clear to lecturers, nor communicated to students; as a result, scores are relatively low. Our experience is that scores for multi-select questions usually do not correlate with the grade of the student.
The underlying reason is that it is very difficult to write good-quality multi-select questions. For wordy questions, the main point is to ask a single question that can be answered without looking at the options.
When to use multi-select: The only type of question which is suitable for multi-select, is a question like ‘Which of the three geometric structures below is/are topologically identical to this structure?’ (insert a picture of a structure, and then 3 pictures of structures that are undeniably similar or dissimilar). In this case, each option should be indisputably correct or incorrect and students will not have the feeling that you are trying to trick them.
Be transparent about the scores: Be transparent in how students can earn points, and figure out how Möbius or Brightspace assigns/deducts points from the score for each incorrectly chosen or omitted option.
5. Prevent fraud by authenticity and identity checks
Complementary oral check for randomly sampled students
Contact a random 10% (or more) of the students immediately after the exam and ask them to explain a couple of answers to confirm that they authored the work. Make sure that you choose these students totally randomly, or by using selection criteria that are clearly unbiased towards specific groups of students. Let the students know what the timeslot is in which the oral check takes place, to prevent unnecessary waiting. Furthermore, take the Security and Privacy of Oral Examinations into account during oral authenticity checks.
In case of group projects or group assignments, let the group describe who contributed what, for example in the report. Provide them with a tool (Buddycheck) to stimulate them to give each other intermediate peer feedback on contribution, especially in larger projects. Make the group small enough so that everybody can contribute and that each contribution will be valued by their peers.
Most examiners will use the complementary post-exam oral check as an anti-fraud measure. It is important to mention that this is not a grade-determining part of the examination, it is only applied to check if the student has been honest in submitting his/her work. Only a sample of the students (e.g. 5-20%) will be selected to do the online complementary oral check.
In case you do a complementary oral check on a sample of your student population, please consider the following:
- We recommend doing the check shortly after the exam has finished and before you have graded the exams.
- Preferably pick the students totally at random (so not the first 20% in alphabetical order, but for example based on a randomly picked last digit of their student number).
- If you decide to use an algorithm to select students, make sure to make the selection criteria explicit to prevent bias.
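A truly random, reproducible selection can be made with a few lines of code; the sketch below (the function name and student IDs are made up) uses Python’s random.sample:

```python
import random

def pick_for_oral_check(student_ids, fraction=0.10, seed=None):
    """Randomly select a fraction of students for the post-exam oral check.

    Using random.sample avoids any ordering bias (alphabetical, grade-based,
    or based on in-class impressions). Record the seed if you want to be able
    to show afterwards that the selection was reproducibly random.
    """
    rng = random.Random(seed)
    k = max(1, round(len(student_ids) * fraction))
    return rng.sample(student_ids, k)

students = [f"stud{n:03d}" for n in range(1, 101)]  # 100 hypothetical students
selected = pick_for_oral_check(students, fraction=0.10, seed=42)
print(len(selected))  # 10
```

Keeping the seed explicit makes the selection criteria auditable, in line with the advice above.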
What to ask:
- Ask for an explanation of some of their answers.
The checks can be recorded but should only be stored if there is a suspicion of fraud. The recording and storing should be done in a similar way as described in Security and Privacy Guidelines for Oral Exams.
In case you come across irregular results while you are scoring the assignment/exam and suspect fraud, please follow the regular processes and report this to the Board of Examiners and the involved student(s).
- Informing students
- Timing
- Preparing for handwritten exam questions takes about 5 minutes per question due to readability issues.
- Doing an oral check takes about 10 minutes, unless you run into bad cases.
- Identity check
- Inform students to keep their student ID ready.
- Check the students’ identity before you start the oral check or oral exam.
- Do not record campus cards or other proofs of identity.
- Questioning
- Ask for an explanation of some of their answers to check whether it is plausible that their work is their own.
- Be clear about whether students will be allowed to look at their answers and/or drafts during the oral check.
- If students are allowed to use draft paper during the exam, ask them to write on clean sheets and inform them that you may ask them to show these sheets during the oral check.
- Recordings
- Do not forget to delete all recordings two months after grading, unless for students who filed complaints. Delete their recordings two months after the procedure has been finished.
- Tool and TA support
- Use a tool with a main room and break-out rooms, like Virtual Classroom (Bongo), or a tool with waiting rooms.
- Have TAs invite the (random) students, put them in the waiting room, help them with their audio and video, check their identity, and move them to your break-out room when you are available for the oral check.
- Goal: to diminish start-up time.
For oral exams and (project) presentations, you can do an identity check using the student’s campus card (do not record this!).
For exams in Brightspace Quizzes or Assignments, students need to login with their netID. It is not allowed to share the login credentials with other people.
This is the only form of online video-surveillance that is allowed, and it should comply with the TU Delft Online Proctored Examination Regulation. Online proctoring is only available for digital knowledge exams that need to be taken as closed-book exams, in case it is not possible to change the exam into an oral examination due to student numbers.
- Permission of the Board of Examiners: Your Board of Examiners needs to give explicit permission to use online proctoring for each exam. Online proctoring may only be used as a last resort, and the Board of Examiners assesses whether this is the case. Some Boards of Examiners have indicated that they will never give permission for online proctored examinations.
- Online Proctored Examination Regulation: If you use online proctoring for your exam, you need to adhere to the Online Proctored Examination Regulations.
- Online proctoring only option for video surveillance: If you need to use video surveillance, you are only allowed to use online proctoring via Digital Exams, because it ensures that recorded data will be stored, processed and destroyed according to privacy regulations.
- Availability: Online proctoring is in principle only available for knowledge question exams that are administered as digital exams. The reason why exams need to be digital is that the camera can only record the student’s face and not their handwriting, since students need to read the assignments from the screen. This implies that the camera faces their heads, not hands.
- Maximum duration: The maximum duration of a proctored exam is 90 minutes. After 90 minutes, students need a toilet break, and the occurrence of technical issues increases.
- Maximum number of students: The maximum number of students per group is increased to 150.
- Available assessment tools with proctoring: Currently, proctoring is only available in combination with digital assessment tools Ans, Möbius and Grasple using the tool RPNow. Grasple is only available for mathematics (service education).
- Practice test: Have all students do a practice exam a couple of days before the exam, to detect technical issues and procedural issues, and have students familiarize themselves with the tools.
- Costs: Online proctoring is a paid service with costs ranging between 10-15 euros per student in the exam. The costs of a proctored exam are for the faculty.
- Click to see why you should avoid using multiple-choice questions in an online proctored remote exam, and if you use them, how you should do this appropriately.
6. Prevent fraud by intervening into the course of the assessment
Make students promise that they will only hand in their own work, will not use help or unauthorized tools, and will not help other students. The promise can be made by having students type out the honor’s pledge, or by reading it aloud at the start of an oral exam. In case of a written exam, this can be done a day before the exam.
We trust the integrity of the student. During your course, ask students to read the TU Delft code of conduct and discuss that you expect them to adhere to it. Indicate that you will ask them to make an honor pledge and what it will say. Inform them whether they will be asked to make the pledge before, at the start of, and/or at the end of their assessment. Students can either copy or vocalize the pledge.
You can change the pledge to make it more applicable for your assessment. Here are two examples:
- Online exam:
“I promise that I will not use unauthorized help from people or other sources during my exam. I will create the answers on my own and I will create them only during the allocated exam time slots. I will not provide help to other students during their exam.”
- Timed take-home exam:
“I promise that I have not used unauthorized help from people or other sources for completing my exam. I created the submitted answers all by myself during the time slot that was allocated for that specific exam part. I will not provide nor have I provided help to other students during their exam.”
For oral exams, students can promise that they will not receive questions from students who took the exam earlier, nor provide questions to the students who will take the exam later.
For written remote exams, the honor’s pledge can also be administered one day before the actual examination, for example by having students type the text of the pledge into a Brightspace Quiz short answer and grading it automatically (students can have another go if they make a spelling mistake).
For oral exams, you can do it at the start of the recording (if applicable).
Can I randomly change the order of the questions?
- The order of the exam questions should be logical, in order to enable students to perform optimally. Keep questions on the same topic/learning objective together.
Split the exam into 2-4 consecutive parts (30-90 minutes per part). Each exam part is visible during a timeslot and needs to be finished before the end of that timeslot. This diminishes the extent to which answers can be exchanged. You could schedule breaks in between. Please note that students tend to become very stressed by the intermediate deadlines, which diminishes their performance and the reliability of their grade. Therefore, make sure that the timeslots are long enough for students to get into a flow of concentration: use timeslots that are as long, and as few, as possible, and preferably give students an opportunity to practice with similar timeslots in a practice exam. Make sure that the length of each timeslot is realistic for students under exam conditions, and provide students who are entitled to extra time with correspondingly elongated timeslots.
Tip: Make the examination available only during the examination time-slot, and only to the students who subscribed for the exam. If different groups have different exams, make each exam available only to the correct group. Close the exam after the time-slot (due date) plus a short, extra time window (grace period, end date): Brightspace then flags all exams that were submitted late, while students can still submit their work. This page provides more information on creating exams in Brightspace.
7. Detect fraud during grading
Use plagiarism scan software to detect similarities between students. In most faculties, the Board of Examiners has made this mandatory in the Rules & Guidelines. Similarities do not automatically imply fraudulent behaviour, so manual labour is still needed, see below.
The software does not indicate who copied whom; it only flags similarity in the work that was submitted after the first submission, so both students will be under suspicion of fraud. Studying the similarity report will help you determine whether the similarity warrants a suspicion of fraud, in which case you report your suspicion to the Board of Examiners, including the similarity report.
In case of Brightspace assignments that are written digitally, use the built-in plagiarism check in Brightspace (Turnitin) and open each similarity report to check for larger matches in the student’s text.
In case of Ans, you can use frequency of strange mistakes to look for similarities. In other cases, you will have to look for similarities or other suspicious patterns manually (see below).
Look for the following patterns:
- Large similarities in answers to open questions. For example, similar phrasing, by hand or using a plagiarism checker. Formulation of answers to (longer) open questions cannot be exactly the same for multiple students.
- Similar, strange mistakes in students’ answers. It is very unlikely that students make the exact same, uncommon mistake. In case of multiple-choice questions, statistical tools could be used to calculate the probability that similar wrong answers are purely coincidental (contact the educational advisor in your faculty, or the TU Delft learning developers of TLS via Teaching & Learning Support for more information). If you use Ans, you can keep track of strange errors during grading.
- Logical explanation of similarities: check whether the similarities can be explained by the use of the same resource (paper, website, etc.). In that case it may be plagiarism, or no fraud at all if the resource was allowed and referenced correctly.
- Use of parameters from another version of the exam, for example by students who were working together and used each other’s parameters.
- Time pattern. Especially for Ans exams, check the log file to see whether groups of students submitted answers to several questions at around the same time, consistently over a period of time. If so, check whether the answers were similar.
- Free riding in group work. This will in general simply lead to a failing grade, but could also be considered fraud. It may show when a student scores extremely low on an individual assessment in the course.
- Handwritten exams. If available, do a random check in which you compare the handwriting with a previous assessment. Furthermore, check for identical handwriting across students, especially when there are other suspicions.
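As a rough illustration of the statistical check mentioned above, the following sketch estimates how likely it is that two independently working students pick the same wrong option on many multiple-choice items. It uses a simplified binomial model that assumes incorrect students choose uniformly among the distractors, which real answer data rarely satisfies, so treat the result as a screening signal only.

```python
from math import comb

def prob_matching_wrong_answers(n_shared_wrong, n_questions, n_distractors=3):
    """Probability that two independently working students give the *same* wrong
    answer on at least `n_shared_wrong` of `n_questions` multiple-choice items
    that both answered incorrectly.

    Simplified model: each student picks uniformly among `n_distractors` wrong
    options, so the chance of an identical wrong answer per item is
    1/n_distractors. A rough screening tool, not a forensic analysis.
    """
    p = 1 / n_distractors
    return sum(
        comb(n_questions, k) * p**k * (1 - p) ** (n_questions - k)
        for k in range(n_shared_wrong, n_questions + 1)
    )

# Two students share the same wrong option on 9 of 10 commonly missed items:
print(f"{prob_matching_wrong_answers(9, 10):.2e}")
```

A very small probability does not prove fraud by itself; for a defensible analysis, consult the educational advisor or TLS contacts mentioned above.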
Be as unbiased as possible and make sure that random checks are truly random and not based on, for example, your experience with a student in class. Use a random generator, for instance, to pick the students to check.
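A minimal sketch of such an unbiased random draw (the student IDs are hypothetical):

```python
import random

def pick_students_for_check(student_ids, n_checks, seed=None):
    """Randomly select students for a handwriting or similarity spot check.

    random.sample keeps the selection free of examiner bias; recording the
    seed makes the draw reproducible if questions arise later.
    """
    rng = random.Random(seed)
    return rng.sample(sorted(student_ids), n_checks)

print(pick_students_for_check(["s1001", "s1002", "s1003", "s1004", "s1005"], 2, seed=42))
```

Sorting the IDs before sampling makes the result depend only on the seed, not on the order in which the IDs were collected.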
-
When you use, for example, Brightspace assignments, you can manually check whether a student’s handwriting matches that of a previous assignment, whether they copied handwritten notes from a peer, or whether they seem to have copied text from peers. Ask your students to keep the original handwritten papers, in case of legibility issues.
-
If you suspect fraud with your assessment, you have to email your fraud suspicion report to the Board of Examiners (BoE, Dutch: examencommissie) that is responsible for your course (according to the study guide). If you suspect fraud after the assessment, it depends on the faculty whether your BoE will ask you to inform the student of the suspected fraud or will take care of that itself. The BoE may have a checklist of all information that needs to be included in the report you send them. You can find more information on the procedure and rules in article 7a, section 4 of the Rules and Guidelines of your BoE.
In case of suspicion of large-scale fraud, the Board of Examiners can declare all announced grades invalid, if it is impossible to determine who committed fraud and who did not. Any kind of participation in large-scale fraud is considered a case of serious fraud. In case of serious fraud, ‘the Board of Examiners is entitled to propose to the Executive Board that the student’s enrolment on the degree programme be permanently terminated’. Click here for more information on fraud and consequences.
As a lecturer, you are obliged to report any individual suspicion of fraud to the Board of Examiners. The same holds for suspicions of large-scale fraud.
If the reliability of the grade is not sufficient to express the result as a grade (1–10), the Board of Examiners can allow the examiner to convert the grade into a pass/fail decision. Variations on pass/fail are not allowed.
Pass/fails do not count for the student’s average grade (GPA).
If a student passes a course with a 6.0 and wishes to retake the exam as a pass/fail exam, the grade will not be replaced by a ‘pass’: the (new) Rules & Guidelines of the Boards of Examiners state that when a student has received both a grade and a pass/fail result after a retake, any passing grade is considered to be higher than a ‘pass’ and will be kept.
-
Do a test result analysis to assess the quality of the individual questions, and use this information to adjust the scoring of individual questions. More information can be found in the TU Delft assessment manual. In Ans, Brightspace quizzes, and Möbius, all relevant information is available in the test statistics.
8. What does an exam with question pools look like?
Per learning objective or topic, you formulate a number of questions of the same level of difficulty and the same question type. Such a pool of interchangeable questions is called a question pool.
Examples of interchangeable questions in the same question pool:
- Fill in the blanks, automatically graded using regular expressions: Naming parts of a machine (if the answer can be copied from a book, this is only possible for proctored exams). The machine is different for each question.
- Short answer, automatically graded using regular expressions: Writing out the applicable formula for a situation shown in a figure. The situation is different for each question.
- Arithmetic question, automatically graded (all or nothing): Calculate the force on a beam in a construction. The construction or the beam is different for each question.
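A minimal sketch of the regex-based automatic grading used in the fill-in-the-blank and short-answer examples above. The formula and the accepted variants are hypothetical; test any pattern against real student answers before relying on it.

```python
import re

def grade_short_answer(answer, pattern):
    """Auto-grade a short answer with a regular expression (all or nothing)."""
    return 1 if re.fullmatch(pattern, answer.strip().lower()) else 0

# Hypothetical pattern accepting "F = m * a", "f=ma", "F = m·a", etc.
# (optional spaces, optional '*' or '·' between m and a):
pattern = r"f\s*=\s*m\s*[*·]?\s*a"

print(grade_short_answer("F = m * a", pattern))
print(grade_short_answer("F = a / m", pattern))
```

Because the grading is all or nothing, it pays to collect the notational variants students actually use in a pilot run and widen the pattern accordingly.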
For each student, a unique exam will be formed with randomly drawn questions from the question pools.
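The random draw per pool can be sketched as follows. Pool and question names are illustrative; seeding with, say, the student number makes each generated exam reproducible.

```python
import random

def assemble_exam(question_pools, seed):
    """Draw one question per pool to build a student's unique exam.

    `question_pools` maps a pool name to a list of interchangeable question IDs
    (same learning objective, difficulty, and question type). Seeding with the
    student number makes each exam reproducible.
    """
    rng = random.Random(seed)
    return {pool: rng.choice(questions) for pool, questions in question_pools.items()}

# Hypothetical pools with three interchangeable questions each:
pools = {
    "pool_1a": ["q1", "q2", "q3"],
    "pool_1b": ["q4", "q5", "q6"],
    "pool_1c": ["q7", "q8", "q9"],
}
print(assemble_exam(pools, seed=4912345))
```

Exam tools such as Ans and Möbius perform this draw for you; the sketch only shows the principle of one random question per pool.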
-
LO1: 3 question pools of unrelated questions
o Pool 1a: low difficulty: matching question
o Pool 1b: low difficulty: short open question
o Pool 1c: medium difficulty: arithmetic question
LO2: 3 question pools of unrelated questions
o Pool 2a: low difficulty: open question, giving an explanation
o Pool 2b: medium difficulty: arithmetic question
o Pool 2c: medium difficulty: open question, analyzing a problem
LO3: 2 question pools of unrelated questions
o Pool 3a: medium difficulty: arithmetic question, analyzing
o Pool 3b: high difficulty: analyzing data and drawing a conclusion
-
LO1: 1 question pool that comprises questions with 5 sub-questions each. The sub-questions are of increasing difficulty.
LO2: 1 question pool that comprises questions with 4 sub-questions each. The sub-questions are of increasing difficulty.
-
The order of the exam questions should be logical, in order to enable students to perform optimally. Keep questions on the same topic/learning objective together.