
Assessment Matters
GL EDUCATION'S INSIGHTS

CLASSROOM INTERVENTIONS: What really works?
TESTWISE: What's new?
NGST: Look out for the new spelling test
CASE STUDIES: Applications in practice

May 2017


INTRODUCTION

Great schools know that reliable, well-targeted assessment is one of the most powerful tools for raising standards in the classroom and making sure every student achieves what they are capable of. International schools face a particular set of challenges: diverse educational and cultural contexts; native and non-native English speakers learning alongside each other; and growing pressure to help students win good university places. This new magazine is devoted to showing how assessment can help meet those challenges.

Assessment has the power to improve teaching and learning, but to work it has to be smart, easy to use and efficient. It must also answer the questions you need it to answer, whether by providing benchmarks that help to measure and monitor progress, supporting student wellbeing and inclusion, or offering tools for personalised teaching and learning.

Moreover, a growing number of educators recognise how students' engagement and wellbeing affect their learning. The best international schools understand that an over-reliance on grades and exams may not produce balanced and well-rounded learners. Students who understand their own strengths and weaknesses in this area are more resilient and more likely to succeed.

I hope that this magazine shows how, with our help, the best schools are putting assessment at the heart of first-class teaching and learning. Please get in touch with our international team for more information – their details are on page 13.

CONTENTS

01 INTRODUCTION
02 CLASSROOM INTERVENTIONS: What really works? Nicola Lambros, King's College Madrid
04 TESTWISE
05 NEW GROUP SPELLING TEST
06 NAGOYA INTERNATIONAL SCHOOL: Empowering lifelong learners
07 GEMS WELLINGTON PRIMARY SCHOOL: Identifying fragile learners
08 INTERNATIONAL BENCHMARKING: With CAT4 and the Progress Test Series
10 PERSONALISING LEARNING: Using assessment to identify needs and accelerate learning in international schools
11 WORKSHOPS
12 OUR EXPERTS

Welcome to the first edition of Assessment Matters

Over the last four years I have seen the way international schools use data to support teaching and learning evolve and develop beyond recognition. There were always innovators and leaders, and we hear from some of them in this magazine, but the general awareness of and hunger to use data has been transformed.

I've recently returned from Japan, where schools at the IB Conference wanted to see how data could support the development of Approaches to Learning skills. The breakout session held by Lucinda Willis of Nagoya International School was packed out, and you can read the highlights of the case study on page 6.

We believe that to get a true picture of a student, you need to consider ability, attainment and attitudes. Rosemary Elmes, of GEMS Wellington Primary School in Dubai, epitomises this approach, going beyond the standard reports to compare specific data sets from our CAT4 and PASS assessments to identify groups of fragile learners. You can see how she does this on page 7.

And of course identification is only part of the story. Giving students the support and the tools they need to grow is vital. I'd encourage you to read the thoughts of Nicola Lambros of King's College Madrid on pages 2–3.

Greg Watson, Chief Executive, GL Education

I hope that you find the magazine interesting and useful, and hope to meet you at one of the many events we're running and attending around the world in the months to come. See our website gl-education.com/our-events for more information.

James Neill, International Director, GL Education




CLASSROOM INTERVENTIONS: What really works?

Nicola Lambros, Deputy Head, King's College Madrid

An extensive range of research over the past few decades has overwhelmingly found that student self-belief is a strong predictor of academic and career achievement, as well as of the likelihood of depression, dropping out of school or truanting. This is because students who have a strong belief in their capabilities adopt higher aspirations, set more challenging goals and have more motivation, focus and perseverance, particularly when faced with obstacles. To enhance achievement, therefore, schools must build strong self-belief, or self-efficacy, in every student. Self-efficacy can be defined as 'personal judgements of one's capabilities to organise and execute courses of action to attain designated goals' (Zimmerman, 2000). Crucially, it is a student's perception of their capabilities, rather than their actual performance, that is important. If schools hope to 'close the gap' in student attainment, then identifying, challenging and altering low self-efficacy should be enshrined in every school's ethos.

How, though, can teachers intervene to make this happen? Essentially, what matters is how things are taught. Academic underachievers, for instance, typically have fewer opportunities to succeed, so teachers must create environments that allow them to experience success on a regular basis. To do this, students must be given opportunities and choice to set themselves personal short-term goals they feel are attainable. Providing students with examples of what children with similar abilities have achieved will enable them to realise that goals are achievable. Goals set by parents or teachers are pointless if a child feels they are not capable of achieving them, even if in reality they have the ability. As students achieve their goals, their self-efficacy increases, boosting in turn their commitment, motivation and use of self-regulatory strategies so that more challenging goals can be set.

Positive feedback, too, is essential. Research has found that underachievers are more likely to receive negative feedback that focuses on shortfalls and highlights personal deficiencies, reducing self-efficacy. Focussing on the positive is therefore essential.

As students work towards their goals, regular individual feedback should specify what is being done well, how much progress has been made and what could be done to improve further. Students who receive positive feedback on progress in a skill they find challenging develop higher skills and improved self-efficacy.

Receiving feedback on effort, however, should be avoided. It can reduce self-efficacy, as it reinforces a student's belief that academic mastery requires a great deal of hard work and suggests they do not have the academic capabilities required. Anxiety also undermines self-efficacy and academic performance: students with low self-efficacy interpret anxiety as a sign of incompetence, which lowers their self-efficacy further.

Hence, a well-planned support system should be in place and made readily available for all students. Tools for managing stressful situations should be integrated into classroom instruction and should play a key role in PSHCE programmes. There must also be a clear focus on developing effective independent learning skills and a 'Growth Mindset' in every student. However, teaching independent learning strategies out of context has been found to be a fruitless endeavour; students are not able to transfer the skills. What is required is explicitly teaching learning strategies and skills in class, and supporting students in their independent learning through effective mentoring and coaching that fosters a 'Growth Mindset'. Students can then practise and master these skills independently, in and out of class, thereby improving their metacognitive monitoring, study strategy selection and self-regulatory skills to increase academic success, self-efficacy and the likelihood of remaining in education.

It's a question of celebrating excellent attitudes to learning rather than grades. Schools should praise students for their increased motivation and progress, highlight success outside the classroom achieved through perseverance and espouse failure as a powerful learning experience.




We have updated TESTWISE

What is Testwise?

Testwise is GL Education's web-based assessment platform. It's the administration centre that allows you to allocate, schedule and report on our range of online student assessments. With millions of assessments taken every year by students worldwide, it is vital that we have the best and most user-friendly platform for students and teachers alike.

Why have we developed it?

The previous version of Testwise had been available for nearly eight years. In that time, the way that schools use our digital assessments has changed – for example, administering tests using iPads or Android tablets. The new platform will make it easier for us to support these devices and other operating systems in the future. Additionally, as our range of digital tests has increased significantly in those eight years, the process of allocating students to multiple assessments had become a time-consuming administrative task for schools, and we have been able to simplify that process. The new platform is easier to use and saves hours of administration time. It gives more flexibility when choosing which levels to use, allows control of the assessment start time so that all pupils start at the same time, and enables customised reports that group pupils in a way specific to the teacher. We believe schools will find it a significant improvement over the old version.

What has changed?

• Cloud based
• More flexible
• Improved search mechanism
• Simplified credit allocation
• Better data protection
• Greater reliability
• Visually stronger
• Easier to use

What does this mean for me?

Single Student Records – no need to create a student record for every assessment administered. There will be one record for each student, which can be used across multiple assessments. In addition, new custom fields give the option of adding extra school data for each student, as well as ethnicity and SEN. So when it comes to grouping students, each school has increased flexibility to work the way it wants to.

Multiple User Accounts – create multiple administrative user accounts for a single school account. This enables restricted access for different users, for example the admissions department or the SEN team.

Student Access – each student will be given their own eight-digit access code for each assessment they sit. This will reduce admin time for teachers and make it simpler for younger students to access the tests themselves.

For more information, please get in touch at international@gl-education.com. For Testwise support: support@gl-assessment.co.uk.

Frequently Asked Questions

What if my student doesn't take the test in the recommended time frame?
In the new system, credits are purchased instead of individual assessments. If a student is unable to take the test in the required time frame, the credit goes back into the account and nothing is lost.

The internet can be temperamental – what if it cuts out halfway through a test?
Each test can be downloaded, so if the connection is lost the test can be completed without losing the data. A connection is needed to upload the answers and generate the reports.
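The download-then-sync behaviour described in the FAQ can be pictured as a local answer queue: responses are persisted on the device as they are given, then uploaded in bulk once a connection returns. The sketch below is purely illustrative – the class and method names are hypothetical and are not part of Testwise's actual API.

```python
import json
from pathlib import Path

class OfflineAnswerQueue:
    """Buffer test answers locally; upload when a connection is available.

    Illustrative only -- these names are hypothetical, not Testwise's API.
    """

    def __init__(self, store: Path):
        self.store = store
        # Reload anything left over from an interrupted session.
        self.pending = json.loads(store.read_text()) if store.exists() else []

    def record(self, question_id: str, answer: str) -> None:
        # Persist every answer immediately, so a dropped connection
        # (or a closed browser) loses nothing.
        self.pending.append({"question": question_id, "answer": answer})
        self.store.write_text(json.dumps(self.pending))

    def flush(self, upload) -> int:
        """Try to upload pending answers; keep any that fail for a retry."""
        remaining, sent = [], 0
        for item in self.pending:
            try:
                upload(item)
                sent += 1
            except ConnectionError:
                remaining.append(item)
        self.pending = remaining
        self.store.write_text(json.dumps(self.pending))
        return sent
```

The key design point is that the local store, not the network, is the source of truth until the upload succeeds.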

NEW GROUP SPELLING TEST

GL Education is proud to announce the arrival of the New Group Spelling Test (NGST). The New Group Spelling Test will be published towards the end of 2017 and will comprise three termly digital adaptive tests of spelling for students aged 6 to 14+. Questions are all aural, but at the lower levels some questions are supported by pictures. Each student will see at least two sections: a single word section and a spelling in context section. There will also be a third, extended section, which will only become available to particularly able spellers.

• The single word section will be made up of 5 or 6 spelling rules, with around 5 words per rule.

• The spelling in context section will be made up of a large bank of words and will test a variety of different spelling rules using sentence completion tasks.

• The extended section will only become available to particularly able spellers. It will show whether a pupil can manage more difficult spelling rules, but also whether they can use vocabulary and grammatical constructs in a more complex way.

Because the test is adaptive, children are presented with questions appropriate to their ability – the questions get harder or easier according to their answers – removing the risk of demotivating students who would otherwise face large numbers of questions pitched at their chronological age rather than their ability.
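The adaptive behaviour described above can be illustrated with a simple staircase rule: start near the middle of the difficulty range, step up after a correct answer and down after an incorrect one. NGST's actual item-selection algorithm is not published, so this is only a toy illustration of the general principle:

```python
def next_difficulty(current: int, correct: bool,
                    lowest: int = 1, highest: int = 10) -> int:
    """Toy staircase rule for an adaptive test: one step harder after a
    correct answer, one step easier after an incorrect one, clamped to
    the available range. Illustrative only -- not NGST's real algorithm."""
    step = 1 if correct else -1
    return max(lowest, min(highest, current + step))

def run_session(responses, start: int = 5):
    """Walk through a list of right/wrong responses and return the
    difficulty level presented at each step."""
    level, path = start, []
    for correct in responses:
        path.append(level)
        level = next_difficulty(level, correct)
    return path
```

A student who answers correctly keeps climbing towards harder items; one who struggles drifts towards easier ones, so neither is stuck on questions pitched at the wrong level.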

The NGST is the first assessment to be developed for delivery on our new Testwise platform and will be standardised for use on tablets as well as computers from day 1.

NGST can be used as a standalone test, but it can also be used alongside the New Group Reading Test (NGRT), meaning schools will be able to test spelling and reading in under an hour. A combination report will be developed to automate the analysis of both sets of data together.

QUICK GUIDE
Age range: 6–14+
Suitable for: Teachers and SENCos
Test duration: Untimed
Test format: Digital





NAGOYA INTERNATIONAL SCHOOL: Empowering lifelong learners to follow their own path

Nagoya International School in Japan is a co-educational, non-selective, inclusive community school that teaches the three IB programmes of PYP, MYP and DP. It has 460 students from over 30 nationalities, with the largest numbers coming from the US, Japan and Korea.

Lucinda Willis, Director of Learning, explains: "A transient school population means children often come from extremely diverse backgrounds, with varying types of school reports and feedback available from their previous schools. Therefore, to nurture their talents, we need to get to know our students as quickly as possible.

"We used to do this via an assessment that focused on prior attainment. However, the IB is concept based, so this is not always the best predictor of achievement. We wanted to find a way to understand children's potential in the IB as well as establish a strong baseline when they joined."

Assessing ability

"The IB curriculum is interdisciplinary and transdisciplinary, as is GL Education's Cognitive Abilities Test: Fourth Edition (CAT4) – it provides unique information about children's ability, regardless of previous schooling. The IB curriculum is also an inclusive curriculum that builds on individual skill sets. I knew CAT4 could help teachers differentiate effectively and pick up issues that might otherwise be missed, such as a child who starts to show signs of underachieving."

Supporting staff understanding

"Assessing ability is not commonly done in international schools, so there was some initial confusion amongst staff as to exactly what was being tested. Some were open to the idea but felt unsure as to why it would be more useful than a straightforward attainment test."

To support staff understanding, Lucinda put together a teacher guidance pack detailing the background to CAT4 and what it assesses, questions teachers might like to ask themselves such as 'Does it match what I'd expect to see?' and an explanation of key terminology.

"The pilot test allowed everyone to see how easy CAT4 was to use, how quickly we could gain information we didn't already have and how results came in a usable format. One of the children we assessed spoke English as an additional language and could often be disengaged in class. The CAT4 results showed that his underlying ability was quite strong, and that his language skills were holding him back. We had suspected this and enrolled him on our intensive English Language programme, but the CAT4 pilot gave our Senior Leadership Team the chance to see the benefits of having confirmation.

"I also mocked up a triangulated sample of CAT4 results from our Grade 5 children with their International Schools Association (ISA) attainment data and teacher grading to show how all these elements work together to paint a picture. It struck a chord as, when I distributed the report, I could really see how the benefits were starting to click into place for many of our teachers."

Support and inclusion

The current Head of the Student Services Team, Kim Humphreys, has also been an advocate for using CAT4 beyond core teaching. "With a background in psychology and assessment, she has been making the most of the results yielded by CAT4 to identify any hidden pockets of special educational needs or underachievement.

"She had already used CAT4 as an early screener for specific children of concern and now has copies of all the reports to ensure no child is missed nor their abilities misunderstood. She also uses the data to ensure our student service resources are allocated effectively to where help and intervention is most needed."

The IB Diploma Co-ordinator has also used CAT4 when starting to think about option choices for the current Grade 10s. "As a scientist, our Diploma Co-ordinator was keen to do some number crunching with a small group of last year's IB students to compare IB scores with CAT4 results. He noted a correlation between the mean SAS CAT4 score and final DP score. However, this was based on a very small sample. He is beginning to look at Grade 10 CAT4 scores as a predictor of DP performance but would like to see more data. He has recently encouraged other teachers to look at the CAT4 reports when discussing Grade 10 subject options."

Completing the picture

Understanding that attitude to learning goes hand in hand with attainment and ability, the school has recently piloted the Pupil Attitudes to Self and School (PASS) survey.

"We have discovered our students have very positive attitudes to teachers and school. However, we did uncover evidence that some students were harbouring quite negative attitudes to themselves as learners and their work ethic.

"There is a culture of perfection that exists in high school students, which can result in the perception that if a task isn't hard, you aren't doing enough, so to some extent we had anticipated this. But with the evidence in front of our eyes from PASS, we have started to think about how we communicate how we think about success at the school.

"The IB Learner Profile is a blueprint for the character we are trying to build in each child, so PASS will be able to help us get a sense of where they see themselves and make sure we are creating an environment where they feel confident to take risks in their learning.

"We can focus more on building multiple career pathways and we're hoping to alleviate any pressure on our students and broaden their horizons – for example, one student recently did an internship at the Hilton Hotel and now has a job there."

Table of Scores

The table Lucinda shares with teaching staff, comparing data from CAT4 with information from other assessments. The next stage will be to add PASS data to the chart.

Verbal-Spatial profile | Verbal SAS (NPR) | Quantitative SAS (NPR) | Non-verbal SAS (NPR) | Spatial SAS (NPR) | Mean SAS | ISA Math Literacy | ISA Writing Narrative/Reflective | ISA Writing Exposition/Argument | ISA Reading | Addition & Subtraction¹ | Multiply & Divide¹ | Proportions & Ratio¹ | Running Records²
Average even bias | 103 (58) | 111 (77) | 112 (78) | 106 (66) | 108 | 419 | 472 | 479 | 400 | 7e | 7e | 6e | T
Extreme spatial bias | 96 (40) | 120 (91) | 109 (72) | 130 (98) | 114 | 468 | 338 | 400 | 342 | 6 | 7e | 6 | N
Average even bias | 96 (40) | 97 (42) | 112 (78) | 97 (42) | 101 | 386 | 393 | 453 | 357 | 5 | 5 | 5e | S
Moderate spatial bias | 110 (74) | 117 (87) | 112 (78) | 128 (97) | 117 | 443 | 497 | 479 | 371 | 6 | 6 | 6 | S
Average even bias | 118 (89) | 116 (86) | 105 (63) | 111 (77) | 113 | 443 | 497 | 505 | 428 | 7e | 7e | 7e | W
Mild spatial bias | 118 (89) | 123 (94) | 139 (99) | 139 (99) | 130 | 495 | 447 | 479 | 414 | 7e | 7e | 7 | T
Moderate spatial bias | 109 (72) | 117 (87) | 134 (99) | 131 (98) | 123 | 527 | 420 | 400 | 459 | 7e | 7e | 7e | Q

¹ Math  ² Reading/Fountas & Pinnell

Identifying fragile learners with Pupil Attitudes to Self and School GEMS Wellington Primary School lives up to its reputation as a varied community, with 73 different nationalities found amongst the 1175 children aged three and a half to 11 years old. “We’re a truly inclusive school,” explains Rosemary Elmes, Senior Leader for Standards. “Children join us from all over the world. Around 25% start with little or no English and approximately 24% are on the SEND register. “Due to the transitional nature of family employment, many children move on quite frequently. Gaining a holistic picture of a child as quickly as we can is important. If we want them to really develop during their time with us, we need to look beyond the academic.”

Gaining an insight The school’s guiding principle is the belief that children need to be happy in order to learn successfully. The school decided to introduce GL Education’s Pupil Attitudes to Self and School (PASS), a robust survey that measures students’ attitudes towards themselves as learners and their school. “Children suffering from poor attitudes to their learning can be very good at hiding any issues, and we didn’t want to miss anything. PASS allows us to take a forensic approach into what children are thinking.” This led to a target group of 56 children being identified, who were at risk of low attainment due to their negative attitudes towards themselves and school. Rosemary says: “The results were surprising, as the ‘at risk’ list included children we really didn’t expect. In fact, when we put together a photo gallery of these children, one senior leadership team member expressed surprise that a particular girl was on there as she seemed bubbly and confident in lessons. Yet she didn’t ever do as well in tests as we expected, so it was important to delve further and explore why that might be.”

Getting ready to learn Armed with this information, various programmes of intervention were introduced, using a combination of nationally-proven strategies from the PASS intervention tool and teachers’ professional experience.

33% improvement Teachers annotated the strategies used and the school then re-assessed these children four months later to see what had changed. They were delighted with the results. “We have achieved a 33% improvement across the target group, which placed all but three children into the top percentile

for high satisfaction with their school experience.

We have achieved a 33% improvement across the target group, which placed all but three children into the top percentile for high satisfaction with their school experience.

“We have also seen a huge change in specific children. For example, one Arabic boy in Year 3 had achieved an 80% improvement across the nine factors. His confidence in himself as a learner had been extremely low, but with targeted encouragement and partner work focusing on how other children achieve, he was able to take those risks and experience success. His teacher couldn’t believe it was the same child! “The icing on the cake has been rising to the top of the GEMS schools ‘Happiness/ Student Satisfaction’ index in a survey across our company’s schools in the UAE.”

has and hasn’t been covered well. We can look across cohorts and see that they are not yet secure with place values of hundreds, tens and units for example, or that boys are out-performing girls which we saw happening in maths in Years 2 and 3. Then we can design appropriate provision, be this beginning girls only maths classes, re-teaching certain topics or organising professional development for staff.

360 degree view The school is now well-placed to gain a holistic picture of their pupils. The Knowledge and Human Development Authority (KHDA) in UAE has recently mandated that schools must assess all students from Year 5 with GL Education’s Cognitive Abilities Test: Fourth Edition (CAT4) to help schools understand their pupils’ potential for academic attainment, and GEMS Wellington Primary additionally uses GL Education’s Progress Test Series to assess skills in English, maths and science.

“CAT4 helps us understand our pupils’ learning dispositions and adding PASS to the mix means no stone goes unturned. It has been a great experience for us. All schools should be looking at the whole child pastorally to ensure they are able to reach their potential.”

“With the Progress Test Series, we can quickly glean information regarding what

Graph of fragile learners in Year 4 – bottom right hand quadrant are the fragile learners with risk of low achievement.

Year 4 120.0 Perceived Learning Capability

NAGOYA INTERNATIONAL SCHOOL:

100.0 80.0 60.0 40.0 20.0 Fragile learners

0.0 70

80

90

100

110

120

130

140

CAT 4 Mean Score

Rosemary produced a scatter graph to explain why some unexpected names were appearing on the PASS ‘at risk’ list and to identify an additional cohort of fragile learners. She plotted pupils’ mean CAT4 scores against their perceived learning capability from PASS. Those who had achieved a high CAT4 result but had a low opinion of their capability for learning were pinpointed as fragile learners. Strategies were put in place to build confidence and reduce anxieties of this vulnerable group.

These are extracts from the case studies. Read the full versions at gl-education.com/case-studies

ASSESSMENT MATTERS

07


GEMS WELLINGTON PRIMARY SCHOOL

Empowering lifelong learners to follow their own path

The current Head of the Student Services Team, Kim Humphreys, has also been an advocate for using CAT4 beyond core teaching. “With a background in psychology and assessment, she has been making the most of the results yielded by CAT4 to identify any hidden pockets of special educational needs or underachievement.

“We have discovered our students have very positive attitudes to teachers and school. However, we did uncover evidence that some students were harbouring quite negative attitudes to themselves as learners and their work ethic.”

“She had already used CAT4 as an early screener for specific children of concern and now has copies of all the reports to ensure no child is missed nor their abilities misunderstood. She also uses the data to ensure our student service resources are allocated effectively to where help and intervention is most needed.”

“There is a culture of perfection that exists in high school students, which can result in the perception that if a task isn’t hard, you aren’t doing enough, so to some extent we had anticipated this. But with the evidence in front of our eyes from PASS, we have started to think about how we communicate how we think about success at the school.

The IB Diploma Co-ordinator has also used CAT4 when starting to think about option choices for the current Grade 10s. “As a scientist, our Diploma Co-ordinator was keen to do some number crunching with a small group of last year’s IB students to compare IB scores with CAT4 results. He noted a correlation between the mean SAS CAT4 score and final DP score. However, this was based on a very small sample.

“We can focus more on building multiple career pathways and we’re hoping to alleviate any pressure on our students and broaden their horizons – for example, one student recently did an internship at the Hilton Hotel and now has a job there.”

Table of Scores

Spatial NPR

Mean SAS

Math Literacy ISA

ISA Writing Narrative/ Reflective

ISA Writing Exposition/ Argument

ISA Reading

Addition & Subraction1

Multiply & Divide1

Proportions & Ratio1

Running Records2

The table Lucinda shares with teaching staff that compares data from CAT4 with information from other assessments. The next stage will be to add PASS data to the chart.

Spatial SAS

“The pilot test allowed everyone to see how easy CAT4 was to use, how quickly we could gain information we didn’t already have and how results came in a usable format. One of the children we assessed spoke English as an additional language and could often be disengaged in class. The CAT4 results showed that his underlying ability was quite strong, and that his language skills were holding him back. We had suspected this and enrolled him on our intensive English Language programme but the CAT4 pilot gave our Senior Leadership Team the chance to see the benefits of having confirmation.
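Rosemary's quadrant approach lends itself to a simple programmatic check. The sketch below uses invented student data and assumed cut-offs (a CAT4 mean SAS of at least 100 and a PASS 'Perceived learning capability' percentile below 40) to flag students in the 'fragile' bottom-right quadrant: strong measured ability paired with low self-belief.

```python
# Sketch of the fragile-learner quadrant check: flag students with a
# high CAT4 mean SAS but a low PASS 'Perceived learning capability'
# percentile. The cut-offs and all student data are invented examples.

CAT4_CUTOFF = 100   # at or above the SAS mean
PASS_CUTOFF = 40.0  # low self-perception (assumed threshold)

def fragile_learners(students, cat4_cutoff=CAT4_CUTOFF, pass_cutoff=PASS_CUTOFF):
    """Return the names in the bottom-right quadrant of the scatter:
    high CAT4 mean score, low perceived learning capability."""
    return [name for name, cat4_mean, plc in students
            if cat4_mean >= cat4_cutoff and plc < pass_cutoff]

# (name, CAT4 mean SAS, PASS 'Perceived learning capability' percentile)
cohort = [
    ("A", 112, 85.0),  # able and confident
    ("B", 118, 22.5),  # able but lacking self-belief -> fragile
    ("C", 88, 30.0),   # needs support, but not 'fragile' in this sense
    ("D", 104, 35.0),  # able but lacking self-belief -> fragile
]

print(fragile_learners(cohort))  # ['B', 'D']
```

In practice the self-perception threshold would come from a school's own PASS percentile bands rather than a fixed cut-off.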



INTERNATIONAL BENCHMARKING

With CAT4 and the Progress Test Series

Introduction

The Cognitive Abilities Test (CAT4) and the Progress Test Series for English, maths and science are standardised against the UK school population. Some schools have asked for an international standardisation, or for benchmarking data to sit alongside the UK version. There are two key reasons why we have not produced a full international standardisation for these assessments:

• Because of the diversity of international schools – for example in curriculum, geography and the number of EAL students – it would be very difficult to create a single, representative standardisation.

• And, more importantly, our UK standardisations, based on a diverse student population*, have proven to be relevant and reliable for international schools.

In 2016 we carried out benchmarking exercises to compare the scores achieved by thousands of international students globally against the original standardisations for CAT4 and the Progress Test Series. The sample sets were not the same, so we should not expect a direct correlation between the scores achieved for CAT4 and those for the Progress Test Series, but we can identify trends.

Standard Age Score

The Standard Age Score (SAS) is based on the student’s raw score, adjusted for age. The mean is 100. When comparing SAS scores, a difference of less than 3 points is not usually seen as statistically significant.

Comparison of UK standardisations and international scores in 2016

The distribution of scores for CAT4 in the international benchmarking sample followed a very similar profile to the UK standardisation overall. The verbal scores were typically slightly lower than the UK standardisation, with the non-verbal, quantitative and spatial scores slightly higher.

Given the higher proportion of EAL students in the typical international school, this is not surprising, but it is an important point. Low verbal reasoning scores can be indicative of students who may have difficulty in accessing the curriculum.

Table 1: Mean SAS by CAT4 level and battery, international benchmarking sample vs UK standardisation

Test         Battery        International   UK    Difference (International vs UK)
CAT level B  Overall CAT    100             100    0
             Non Verbal     102             100    2
             Quantitative   102             100    2
             Spatial        101             100    1
             Verbal          97             100   -3
CAT level D  Overall CAT    102             100    2
             Non Verbal     104             100    4
             Quantitative   102             100    2
             Spatial        103             100    3
             Verbal          99             100   -1
CAT level F  Overall CAT    103             100    3
             Non Verbal     105             100    5
             Quantitative   104             100    4
             Spatial        104             100    4
             Verbal          98             100   -2
CAT level G  Overall CAT    104             100    4
             Non Verbal     105             100    5
             Quantitative   106             100    6
             Spatial        106             100    6
             Verbal          98             100   -2

Table 1 shows mean scores for a number of CAT4 levels. The mean scores for CAT4 Levels F and G indicate that the international student benchmarking sample at these ages is more able than that of the UK standardisation; at Levels B and D, the abilities are not significantly different. Level F is taken by students aged 13–14. At this level, the non-verbal, quantitative and spatial scores are significantly higher than the UK standardisation, while the verbal score is slightly below it.

Table 2 shows the mean SAS for different levels of the Progress Test Series assessments in English (PTE), maths (PTM) and science (PTS).

Table 2: Mean SAS by Progress Test level, international sample vs UK standardisation

Test    International   UK    Difference (International vs UK)
PTE07   102.7           100     2.7
PTE08   103.7           100     3.7
PTE09   103.1           100     3.1
PTE10   100.9           100     0.9
PTE11   104.1           100     4.1
PTE12    96.2           100    -3.8
PTE13    98.6           100    -1.4
PTE14    97.7           100    -2.3
PTM07   101.0           100     1.0
PTM08   100.6           100     0.6
PTM09   102.3           100     2.3
PTM10   102.6           100     2.6
PTM11    99.4           100    -0.6
PTM12   105.1           100     5.1
PTM13   103.8           100     3.8
PTM14   108.7           100     8.7
PTS08   104.2           100     4.2
PTS09   101.4           100     1.4
PTS10   103.4           100     3.4
PTS11   100.9           100     0.9
PTS13   100.6           100     0.6
PTS14   113.5           100    13.5

Level 14 is the test taken by students aged 13–14, so we can compare these scores with those for Level F of CAT4 in Table 1. The international score for the Progress Test in English is slightly lower than the UK standardisation, which is broadly in line with the verbal ability scores achieved by international students in the CAT4 benchmarking sample. The international scores for the Progress Test in Maths and the Progress Test in Science are significantly higher, reflecting the trend of increased non-verbal, quantitative and spatial ability scores seen in the CAT4 benchmarking sample.

The increased size of the differential might be attributed to a number of factors, such as the value added by the teaching in these schools or higher aspirational levels. However, to assess the size of the difference accurately, we would need to compare ability and attainment data for exactly the same set of students. In this benchmarking exercise we see that the distribution of scores for international students is closely aligned to the standardisations, and where there are higher or lower scores in the Progress Test Series they are reflected in the pattern of scores for CAT4. We might expect to see a greater difference between any two individual schools than between these samples.

Your point of reference

The original standardisation is a reliable and valid benchmark for international students. Schools may additionally wish to compare their own data to the findings of this benchmarking exercise, to see if they reflect some of the nuances seen between the different sections of CAT4 and from year to year with the Progress Test Series.

* For example, during our last standardisation exercise, for the Baseline assessment, the EAL ratio was 18%.

ASSESSMENT MATTERS

08

ASSESSMENT MATTERS

09
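The rule of thumb that an SAS difference of fewer than 3 points is not usually significant can be applied directly to benchmarking figures like those in Table 2. A minimal sketch, using the Progress Test in English means from Table 2 (the helper function and its name are our own, for illustration):

```python
# Compare mean SAS values against the UK standardisation mean of 100,
# flagging gaps of 3 or more SAS points as potentially significant.

UK_MEAN = 100.0
SIGNIFICANT_GAP = 3.0  # SAS points, per the rule of thumb

def compare_to_uk(mean_sas, uk_mean=UK_MEAN):
    """Return (difference vs UK, True if the gap looks significant)."""
    diff = round(mean_sas - uk_mean, 1)
    return diff, abs(diff) >= SIGNIFICANT_GAP

# Progress Test in English means from Table 2
pte_means = {"PTE07": 102.7, "PTE08": 103.7, "PTE09": 103.1,
             "PTE10": 100.9, "PTE11": 104.1, "PTE12": 96.2,
             "PTE13": 98.6, "PTE14": 97.7}

for level, mean_sas in pte_means.items():
    diff, significant = compare_to_uk(mean_sas)
    note = "significant" if significant else "within noise"
    print(f"{level}: {diff:+.1f} ({note})")
```

By this measure, PTE10, PTE13 and PTE14 sit within the noise band, while the remaining English levels show a notable gap.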




PERSONALISING LEARNING

Using assessment to identify needs and accelerate learning in international schools

As you will know only too well, international schools face a unique set of challenges when it comes to assessment. Not only do you operate in incredibly diverse educational and cultural contexts, but you also cater to large numbers of children for whom English is not their first language. As much traditional assessment relies on verbal skills, this can be an issue.

We recently surveyed teachers in international schools in Europe, the Middle East, India, China, Africa and the Asia Pacific region, asking whether they thought the identification of a child's need or potential was sometimes hampered by the fact that English was not their first language. The overwhelming majority (85%) thought it was; only 13% said it was not. There was little regional variation, though respondents in Europe were marginally less likely to see it as a problem (81%), while all of those in Africa (100%) said EAL could be a barrier to accurate identification of student need.

The findings of our survey are published in our new report, 'Personalising learning: Using assessment to identify needs and accelerate learning in international schools'. The report asks how much of a problem this is for schools overseas with sizeable numbers of students with EAL, and how children with additional support or gifted and talented needs can best be identified and supported.

Whatever obstacles students with EAL face, their learning performance and needs can still be reliably identified. As Ofsted says of EAL learners in the UK, their "conceptual thinking may be in advance of their ability to speak English", but it doesn't believe the "cognitive challenge" should be reduced to take account of that – and most professionals would agree. As Sue Thompson, our Senior Publisher, points out: "With EAL students there is always the danger that teachers jump too quickly to the conclusion that language is the main barrier to learning when in fact it could be something completely different."

There are, of course, measures that schools can adopt to make assessments fairer for EAL students. Understanding generally precedes speaking and writing, so it's a good idea to include receptive as well as expressive language measures in any assessment. Embedding assessment in regular classroom practice will also help students with EAL, as will a proper understanding of prior attainment. Then there are non-verbal reasoning tests, which may be better indicators of ability and potential for students who struggle with English.

Our report includes contributions from a number of international schools that are proactively addressing this issue. They include Nicola Lambros, the Deputy Head at King's College, Madrid, who writes about how to identify gifted and talented students who have EAL, and offers her top tips on supporting EAL students in your school. Matthew Savage, the Acting Principal of the International Community School, Amman, explains how he uses the three As of student data – attainment, aptitude and attitude – to personalise learning so that every student believes their learning journey has been designed specifically for them; an approach he calls 'The Mona Lisa effect'. The report also includes contributions from Lucinda Willis, Director of Learning at Nagoya International School in Japan, Emma Dibden, the Head of Learning Support at Jumeirah English Speaking School (JESS) in Dubai, and Colin Bell, the Chief Executive of COBIS.

Thank you to everyone who participated in our survey and contributed to the report. You can download your copy at gl-education.com.

GL EDUCATION WORKSHOPS

To help schools get the most from our assessments and to help them benefit from best practice established by their peers, we continually plan and run workshops around the world. We spoke to Alison Chapman, one of GL Education's International Consultants, to find out more about the events and what delegates experience.

"It's one thing reading a user guide or attending a webinar," says Alison, "but there's nothing quite like talking face to face with an assessment specialist or a current practitioner who has years of experience of implementing the assessments that your school uses. That's why we try to keep our introductions short and informative and move on to case studies as soon as we can."

"Eye opening to discuss with other colleagues and their data situation." – A workshop attendee

"On that point, we are extremely grateful to the many teachers, specialists and school leaders who give up their time to share their knowledge and experience to make these workshops so practical and informative for their peers," Alison adds.

As we are conscious that there are different levels of experience in any given region, we try to run different workshops or separate streams to accommodate those needs. Where we can't, we make it clear what level of audience we're addressing on the day.

"It was very detailed and all information was linked to practical applications." – A workshop attendee

"I know from my experience," says Alison, "that sometimes people just want guidance in understanding the data, but more often they are there to find out how they can use the data to support great teaching and learning – how can the data be used to identify and support fragile learners? How can I foster a growth mindset in my students?

"Remember, too, that teachers' needs change over time. A teacher who is completely comfortable looking for signs of underperformance or insecurity at a classroom level can suddenly find themselves responsible for data at a whole-school level, having to provide evidence for inspections or predict additional support needs for each new cohort."

The other key component of our workshops is the engagement of the audience, Alison explains. "That's both in planned Q&A or discussion sessions and in the peer networking over coffee and lunch breaks. Thinking back to the workshops we've run recently in the Middle East, some of the main talking points included developing metacognition, using data to support parental engagement, and how to use data in classroom feedback.

"One thing we always try to include is a section for participants to discuss and plan their next steps – to come up with an action plan based on what they've learnt on the day. And when we look at the feedback forms after the events, this regularly comes up as one of the most useful elements. It proves once again that you get out what you put in."

"It helps people at many levels understand how to make this data effective." – A workshop attendee

In the first four months of 2017 we have held workshops or breakout sessions in 16 countries, covering South America, Africa, Europe, the Middle East and Asia. Unfortunately, we can't get to everyone, so we've also started running online workshops where experienced practitioners share their experience and explain how to analyse and use the data produced by the assessments. To find out more about these online programmes and to hear about events near you as they are announced, keep an eye on the events page on our website:

gl-education.com/our-events


OUR EXPERTS: Biographies

Nicola Lambros
Nicola believes wholeheartedly that a strong pastoral programme underpins academic success, and is passionate about the importance of focusing on student attitudes rather than grades if all children are to reach their full potential and become successful lifelong learners. An experienced senior leader in education, Nicola recently led the development of a secondary international school in Kuala Lumpur, establishing an innovative curriculum based on outstanding pastoral care with positive education at its core, which enabled all students to realise their full potential and resulted in outstanding IGCSE and A Level results. This experience, combined with her MA research, has reinforced her belief that focusing on how schools can positively influence student attitudes is the key to raising attainment and improving academic success. Now Deputy Head (whole school) at King's College, Soto, in Madrid, Nicola is keen to work with other educators to make student attitudes a top priority in schools.

Lucinda Willis
Lucinda Willis has been an educator for almost 20 years. She spent 15 years in the state sector in the UK, leading behaviour units and special education departments in 'failing' schools. In 2011 she moved into international IB education, leading the Personalized Learning Department at St Nicholas School, Sao Paulo, Brazil. She is currently the Director of Teaching, Learning and Inclusion at Nagoya International School in Japan. The nature of these roles means Lucinda has experience of working with all three IB programmes as a school leader, and she has a special interest in inclusive educational practice. An experienced workshop leader and an IBEN Educator, she has led workshops at SENIA 2016 and the IB 'Creating Inclusive Classrooms' workshop across the Asia Pacific region.

Alison Chapman
Alison is one of GL Education's International Consultants and has over 30 years' experience in the education sector. She began her career as a classroom teacher and has taught at both secondary and tertiary levels. She has acted as an examiner, moderator and verifier for a number of awarding bodies, and for QCA in writing GNVQ specifications and quality assurance. After leaving teaching she worked with BTEC in a quality assurance, training and advisory capacity, supporting schools with the introduction of BTECs and GNVQs. More recently she was with Edexcel and Pearson, working with schools and stakeholders to help them interpret government policy relating to the school curriculum and shape their curriculum accordingly. Her strong commitment to education led her to volunteer as a governor at her local high school for thirteen years, latterly as chair; she is now a governor again, this time at a primary school. Alison's training experience also extends to her local authority, where she has delivered training for governors on topics such as Vision and Ethos, Safeguarding, and Holding Leaders to Account, and her quality assurance experience has led to invitations to conduct external reviews of governance at schools within the LA.

Rosemary Elmes
Rosemary Elmes has been involved in education for 35 years, at primary and tertiary levels. Most recently, she has taught at primary level in Dubai for 12 years and has held a range of middle and senior leadership posts. Rosemary joined GEMS Wellington Primary School in 2009 and is currently Senior Leader for Standards, with responsibility for all areas of assessment.

Contact us
GL Education, 1st Floor, Vantage London, Great West Road, Brentford, TW8 9AG, United Kingdom
gl-education.com
+44 (0)20 8996 3369
international@gl-education.com
@gl_education


OUR EXPERTS: Biographies Nicola Lambros Nicola believes wholeheartedly that a strong pastoral programme underpins academic success and is passionate about spreading the importance of focusing on student attitudes rather than grades if all children are to reach their full potential to become successful lifelong learners. An experienced senior leader in education, Nicola recently led the development of a secondary international school in Kuala Lumpur; establishing an innovative curriculum based on outstanding pastoral care, with positive education at its core, which enabled all students to realise their full potential resulting in outstanding IGCSE and A Level results. This experience combined with her MA research has reinforced her belief that focussing on how schools can positively impact student attitudes is the key to raising attainment in schools and improving academic success. Now the Deputy Head, whole school at King’s College, Soto in Madrid, Nicola is keen to work with other educators to place student attitudes as the top priority in schools.

Contact us

Lucinda Willis Lucinda Willis has been an educator for almost 20 years. She spent 15 years working in the state sector in the UK, leading behaviour units and Special Education departments in ‘failing’ schools. In 2011 she moved into International IB Education leading the Personalized Learning Department at St Nicholas School, Sao Paulo, Brazil. Currently she is the Director of Teaching, Learning and Inclusion at Nagoya International School in Japan. The nature of these roles means Lucinda has experience working with all three IB programmes as a school leader. She has a special interest in inclusive educational practice. Lucinda is an experienced workshop leader and is an IBEN Educator. She has led workshops at SENIA 2016 and the IB ‘Creating Inclusive Classrooms’ workshop across the Asia Pacific region.

GL Education, 1st Floor, Vantage London, Great West Road, Brentford, TW8 9AG, United Kingdom

gl-education.com +44 (0)20 8996 3369

Alison Chapman Alison is one of GL Education’s International Consultants and has over 30 years’ experience within the education sector. Alison began her career as a classroom teacher and has taught at both secondary and tertiary levels. She has acted as an examiner, moderator and verifier for a number of awarding bodies, and also for QCA in writing GNVQ specifications and quality assurance. Upon leaving teaching she worked with BTEC in a Quality Assurance, training and advisory capacity, supporting schools with both the introduction of BTECs and GNVQs. More recently she was with Edexcel and Pearson working with schools and stakeholders to support them in interpreting government policy relating to the school curriculum and in shaping their curriculum accordingly.

international@gl-education.com @gl_education

Her strong commitment to education led her to volunteer as governor at her local high school for thirteen years, where latterly she became chair. She has now become a governor again, and this time at a primary school. Alison’s training experience also extends to working within her local authority where she has carried out training for governors, such as Vision and Ethos, Safeguarding and Holding Leaders to Account. Her quality assurance experience led her to be invited to conduct external reviews of governance at schools within the LA.

Rosemary Elmes Rosemary Elmes has been involved in Education for 35 years, at primary and tertiary levels. Most recently, she has taught at primary level in Dubai for 12 years and has held a range of middle and senior leader posts. Rosemary joined GEMS Wellington Primary School in 2009 and is currently Senior Leader for Standards with responsibility for all areas of assessment.

ASSESSMENT MATTERS

12

ASSESSMENT MATTERS

13



gl-education.com

Assessment Matters: May 2017  
