Nonetheless, waiting for the final revision was a bottleneck for many delegations. The long delay between the end of the discussion and the release of the revised version arose because some of the requested changes required clarifying figures, which could easily be done only by the people heavily involved in developing the exams. However, as those same people also took part in the discussion as well as in setting up the exam rooms, such changes took longer than planned. We therefore recommend that future hosts brief an extra person on all aspects of the practical exams, so that even changes to the figures can be executed during the sessions.

A well-received novelty of the IBO 2013 was that we no longer required the jury to print the practical exams themselves. Instead, each delegation uploaded their exams as PDF files and digitally indicated the correct file to be used for each student. To accommodate delegations with multiple languages, different files could be assigned to different students of a single delegation. Volunteers then printed the exams and put them into envelopes. This not only greatly reduced the waiting time for delegations that had finalized their translations, but also allowed us to verify the content of the exams carefully. In this way, we noticed, for instance, that one delegation had uploaded the original exam file rather than their translation for one practical exam. Since the jury members could not be reached by phone, the printing crew used Wikipedia to find other delegations using the same or a similar language and provided the students with such a translation instead. Just before the exam started, our volunteers finally reached a jury member of the affected delegation, only to receive confirmation that the translation provided to their students was in fact the result of a joint translation effort between the two delegations.
As a result of the careful work of the printing staff, we received not a single request regarding missing pages or a mix-up of languages or exams from any participant during the exam. Printing the exam papers ourselves further allowed us to add cover sheets specific to each student and practical, with labels identifying the student, the practical, and the matching exam session, visible through a window in the envelope. These cover sheets were printed on colored paper for easy identification of the practical they belonged to and also carried a barcode identifying the student and the practical in machine-readable form. This made handling the papers for the different exams and sessions easy for everybody involved and allowed us to attribute exam papers to students automatically. It also facilitated the attribution of practical exams during the marking session, as the scanned pages could be grouped and ordered using computer scripts.
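The report does not describe those scripts, but the grouping step can be sketched in a few lines. The example below is a hypothetical illustration only: it assumes each scanned page has already been decoded into a barcode payload of the form `student:practical` (the field names, payload format, and sample codes are invented for this sketch, not taken from the report).

```python
from collections import defaultdict

def group_scanned_pages(pages):
    """Group decoded scan entries by (student, practical), preserving scan order.

    `pages` is a list of dicts with hypothetical keys:
      'scan_index' - position of the page in the scanner output
      'barcode'    - assumed payload 'STUDENT:PRACTICAL', e.g. 'CHE-042:P1'
    Returns a dict mapping (student, practical) to the ordered scan indices.
    """
    grouped = defaultdict(list)
    for page in sorted(pages, key=lambda p: p["scan_index"]):
        student, practical = page["barcode"].split(":")
        grouped[(student, practical)].append(page["scan_index"])
    return dict(grouped)

# Example: three pages from one scanning batch (invented sample data)
scans = [
    {"scan_index": 0, "barcode": "CHE-042:P1"},
    {"scan_index": 1, "barcode": "CHE-042:P1"},
    {"scan_index": 2, "barcode": "GER-007:P1"},
]
print(group_scanned_pages(scans))
# {('CHE-042', 'P1'): [0, 1], ('GER-007', 'P1'): [2]}
```

Because every cover sheet carried such a barcode, sorting the scanner output this way makes the attribution of pages to students purely mechanical, regardless of the order in which envelopes were fed into the scanner.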
Reception by Students
Many team guides reported that their students actually enjoyed the exams – probably the biggest compliment to get from students. While this is anecdotal evidence only, a similar impression of the exams was conveyed by the students through a representative survey conducted at the end of the IBO (see chapter 7). Well above 70% of all students were either very or extremely happy with how the exams were organized and conducted, with less than 6% being unhappy about the exams. However, students rated the exams as rather difficult (average 4.12, where 1 is too easy, 3 is appropriate and 5 is too difficult). Since IBO exams discriminate best between students when the obtained points are broadly distributed and half of the students obtain more than half of the points, such a verdict is expected and probably speaks for an appropriate difficulty of the exams overall.