
The Use of Different Languages

Students could switch between languages during the exam, and the system recorded the total time spent in each language. These data show that the most commonly used language was English, accounting for 30.7% of the total time spent (time × number of students). Interestingly, this is not only due to the large number of students working exclusively with the English version (40 students, or 17%): 44 of the remaining students (18% of all students) used the English version for more than 30% of their time. These figures are likely underestimates, since several delegations provided the English text alongside their translation.

The next most frequently used language was Russian (8.2% of the total time), followed by Spanish (5.0%), Turkish, German and Greek (each ~2.9%), Arabic (2.5%) and Dutch (2.0%). For comparison, a language used by a single team of four students would be expected to account for 1.7% of the total time.
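The 1.7% baseline above can be reproduced from the figures already reported: if 40 students correspond to 17% of all students, the total number of students follows, and a four-student team's expected share of total time (assuming equal exam time per student) drops out. A minimal sketch of this back-of-the-envelope calculation:

```python
# Sketch (derived from the reported figures, not from the report's raw data):
# back out the total number of students, then compute the expected time
# share of a single four-student team under equal per-student exam time.

students_english_only = 40     # reported: 40 students worked only in English
fraction_english_only = 0.17   # reported: those 40 students are 17% of all

total_students = students_english_only / fraction_english_only  # ~235

team_size = 4
expected_share = team_size / total_students  # expected fraction of total time

print(f"total students ~ {total_students:.0f}")
print(f"expected share of one four-student team: {expected_share:.1%}")
```

Running this gives roughly 235 students in total and an expected share of 1.7%, matching the baseline quoted in the text.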

Prediction of Jury Regarding Difficulty

During the jury sessions, each delegation was asked to rate the difficulty of each question on a five-level scale, where 1 corresponded to a question that was too easy, 3 to a question of appropriate difficulty and 5 to a question that was too difficult. On average, about 22 delegations, or one third of all delegations, rated each question (range: 17 to 27). Based on these data, the jury appears to have been rather happy with the difficulty: the average rating was 2.97, and 47.8% of all questions received an average rating between 2.75 and 3.25 (FIGURE 6.19). Interestingly, the difficulty judgments spanned at least four of the five levels for every question, and for more than 52% of the questions less than 50% of the jury agreed on the same level. Nonetheless, the questions differed considerably in their average difficulty as ranked by the jury: the easiest question received an average score of 1.65 and the most difficult a score of 4.05.
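The statistics above (per-question averages, the share of questions in the "appropriate" band, and the share where a majority agreed on one level) can be computed from a simple table of ratings. The sketch below uses hypothetical votes, not the actual jury data:

```python
# Illustrative sketch (hypothetical ratings, not the actual jury votes):
# how the per-question statistics reported above can be computed from a
# table of delegation ratings on the 1-5 scale.

ratings = {                      # question -> list of delegation ratings
    "Q1": [1, 2, 2, 1, 3],
    "Q2": [3, 3, 2, 4, 3, 3],
    "Q3": [5, 4, 4, 5, 3],
}

# average rating per question
averages = {q: sum(r) / len(r) for q, r in ratings.items()}

# share of questions whose average falls in the "appropriate" band
in_band = [q for q, a in averages.items() if 2.75 <= a <= 3.25]
share_in_band = len(in_band) / len(ratings)

def has_majority(votes):
    """True if more than 50% of the jury agreed on one difficulty level."""
    return max(votes.count(level) for level in set(votes)) > len(votes) / 2

majority_share = sum(has_majority(r) for r in ratings.values()) / len(ratings)
```

With the hypothetical votes above, only Q2 (average 3.0) lands in the 2.75–3.25 band, and only Q2 has a majority level, so both shares are one third.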

[Plot: frequency (0.0–1.0) of each rating per question, with all questions sorted by difficulty rating along the x-axis; regions labelled "too easy", "good difficulty" and "too difficult"; the average distribution is shown at the left and yellow dots mark the average per question.]

FIGURE 6.19  Difficulty estimated by the jury:

During the jury sessions, each delegation was asked to rate the difficulty of each question on a scale of five levels ranging from "too easy" to "too difficult". Shown here is the distribution of obtained ratings for each question, with the questions sorted by increasing average difficulty. For comparison, the average distribution across questions is shown on the left. Yellow dots correspond to the average difficulty per question.


IBO 2013 Final Report  