
360 Cycle and Progress Tracking

We want our assessment to perform two functions: to move students forward in their learning and to measure the learning that is taking place. The former is much more important to us, and to achieve it our assessment is organised around a 360 Cycle; there are three of these cycles each year.

We complete summative assessments three times a year because they ‘need to be far enough apart that pupils have the chance to improve on them meaningfully’ (Christodoulou, 2016, p. 193). Grading can be useful but is not the purpose of assessment; research by Black and Wiliam (1998) found that ‘the giving of marks and the grading function are overemphasized, while the giving of useful advice and the learning function are underemphasized’.

Phase 1: Teach (10 weeks)

Assessments identify the extent to which the Threshold Concepts have been understood and the required knowledge has been acquired.

Phase 2: Assess (1 week)

Teachers diagnose misconceptions in learning to inform Phase 3.

Phase 3: Review and Re-do (2 weeks)

Students review, re-draft and re-do work to close gaps in learning. There are also opportunities to go beyond if concepts have been fully mastered.

In our system, students receive feedback immediately without a grade or scaled score; teachers are able to use the assessments to close gaps in learning without the distraction of grades. Sherrington (2014) believes that ‘there is usually an authentic, natural, common-sense mode of assessment that teachers choose with an outcome that fits the intrinsic characteristics of the discipline’, and our system allows teachers to use this authentic assessment in classrooms while our scaled scores allow for shared meaning from those assessments.

The Department for Education produced a report in 2018 on the effective use of data in schools, with key principles to ensure that the purpose of assessment is clear, that data is interpreted sensibly and that the frequency of data collection is proportionate (DfE, 2018, p. 5). With a clear purpose and an appropriate frequency of three summative data collections per year, it is then important to ensure that interpretation supports teacher and leadership actions (Davies, 2020, p. 88).

Trust Subject Directors/Leaders use the assessment data to evaluate the impact of their curriculum and teaching and to inform refinements in curriculum design, delivery and teacher effectiveness. It can also identify areas of strength across the Trust to facilitate the sharing of best practice. This is endorsed by the OECD report, which cited ‘greater reliance on evaluation results for evidence-based decision making’ (OECD, 2013, p. 1) as an emerging theme across school systems internationally.

Davies (2020, p. 83) asserts that ‘the most useful summative assessment data … [is] each student’s network percentile ranking’, which is what our scaled scores represent at Key Stage 3 and what grades represent at Key Stages 4 and 5. Tracking a student’s scaled score in Key Stage 3 from one cycle to the next approximates their rate of progress.
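To make the idea of a percentile-rank-based score concrete, the sketch below converts raw assessment marks for a cohort into percentile ranks. The Trust’s actual scaled-score calculation is not reproduced here; the method, names and numbers are illustrative assumptions only.

```python
# Illustrative sketch only: computes each student's percentile rank within a
# cohort from raw assessment marks. This is an assumed, simplified method,
# not the Trust's actual scaled-score calculation.

def percentile_ranks(marks: dict[str, float]) -> dict[str, float]:
    """Return each student's percentile rank (0-100) within the cohort."""
    cohort_size = len(marks)
    ranks = {}
    for student, mark in marks.items():
        # Percentage of the cohort scoring strictly below this student's mark.
        below = sum(1 for other in marks.values() if other < mark)
        ranks[student] = round(100 * below / cohort_size, 1)
    return ranks


# Hypothetical Cycle 1 raw marks for four students.
cycle_1 = {"A": 42, "B": 55, "C": 61, "D": 38}
print(percentile_ranks(cycle_1))  # {'A': 25.0, 'B': 50.0, 'C': 75.0, 'D': 0.0}
```

Comparing a student’s rank from one cycle to the next, rather than their raw mark, is what allows progress to be read relative to the cohort.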

The inferences drawn from changes in scaled scores rely on the level of demand in the assessments being sufficiently high that the standard has to rise in order for a scaled score to reflect progress.

Progress is not linear and, in a system such as ours, fluctuations are to be expected; the scaled score system allows us to account for this while still identifying students who are vulnerable to underperformance and celebrating exceptional progress. There is a tolerance within which we accept that students are likely to be maintaining their rate of progress; scores beyond this tolerance signify either exceptional progress or students who need additional support. The three possible outcomes are interpreted as follows (a simple, hypothetical sketch of this classification follows the three descriptions below).

The student has increased their position in the rank, and so they are likely to be making above-expected progress.

The student has maintained their position in the rank, and so they are likely to be making expected progress.

The student has moved to a lower position in the rank, indicating they may not be making expected progress.
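As an illustration only, the sketch below classifies the change in a student’s scaled score between two cycles against a tolerance band. The tolerance value, function name and example scores are hypothetical assumptions, not the Trust’s actual thresholds.

```python
# Illustrative sketch only: classifies the change in a student's scaled score
# between two 360 Cycles against a tolerance band. The tolerance value and
# example figures are hypothetical, not the Trust's actual thresholds.

TOLERANCE = 5  # hypothetical band within which the rate of progress is treated as maintained


def classify_progress(previous_score: int, current_score: int,
                      tolerance: int = TOLERANCE) -> str:
    """Return a progress judgement based on the change in scaled score."""
    change = current_score - previous_score
    if change > tolerance:
        return "above expected progress"              # moved up the rank beyond the tolerance
    if change < -tolerance:
        return "may not be making expected progress"  # moved down the rank beyond the tolerance
    return "expected progress"                        # within the tolerance: rate maintained


# Example: a student moves from a scaled score of 104 in Cycle 1 to 112 in Cycle 2.
print(classify_progress(104, 112))  # above expected progress
```

The tolerance is what prevents ordinary cycle-to-cycle fluctuation from being over-interpreted, while still flagging movement large enough to warrant celebration or additional support.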
