## Monday, 24 February 2014

### Key Stage 2 Value Added: Fact and Fiction

Let’s start with the obvious question: what is value added? At its most basic – and at the risk of teaching people to suck eggs here – it is a comparison of pupils’ actual and expected outcomes. The ‘actual’ bit is fairly straightforward: it’s what a pupil gets in a test. Well, almost – it’s actually the test score converted to a fine grade, and that conversion is subject to change every year (more on this in a bit). But what is ‘expected’? There is a common misconception that expected progress equates to 12 points of progress from KS1 to KS2, and that value added is therefore the difference between a pupil’s actual progress and that expected rate of 12 points. Unfortunately this is very wrong, and I know of schools that have been significantly below for VA despite the cohort having made just over 12 points of progress on average. This has come as a bit of a shock.

So, start again: what is expected progress? First, it should be pointed out that ‘the DfE does not define expected progress in terms of APS’ (see p6, paragraph 10 of the Ofsted Subsidiary Guidance). However, this is not entirely borne out elsewhere in the same document: Ofsted (see p33) goes on to define attainment gaps in terms of differences in ability, stating that a 3-point gap between groups of pupils (e.g. Pupil Premium/non-Pupil Premium) represents a year’s difference. This implies that 3 points is equivalent to a year’s progress, thus reinforcing the misconception that 12 points equates to expected progress across KS2.

More pertinent to this discussion is the fact that VA involves the calculation of an APS estimate for the end of the key stage, against which the actual KS2 outcome is compared. This calculation is based on the prior attainment (APS) of the pupil at KS1, and the resulting difference (KS2 APS estimate minus KS1 APS actual) is essentially the expected progress for that pupil. In other words, pupils with different prior attainment have different expectations. In short, VA compares a pupil’s progress against that of pupils with similar prior attainment nationally, and the variation in estimates can be rather surprising.
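To make the mechanics concrete, here is a minimal sketch of the calculation described above. The estimate table is invented purely for illustration – the real KS2 estimates come from the DfE’s published VA lookup tables, which map a pupil’s KS1 APS to an estimated KS2 APS.

```python
# Hypothetical KS1 APS -> estimated KS2 APS lookup (illustrative figures
# only; the real values come from the DfE's VA tables).
ESTIMATES = {
    13.0: 25.1,
    15.0: 27.4,
    17.0: 29.8,
}

def pupil_va(ks1_aps, actual_ks2_aps):
    """VA is the actual KS2 outcome minus the estimate for pupils with
    the same prior attainment nationally - not a flat 12 points."""
    return actual_ks2_aps - ESTIMATES[ks1_aps]

def cohort_va(pupils):
    """Cohort VA is the mean of the pupil-level VA scores."""
    scores = [pupil_va(ks1, ks2) for ks1, ks2 in pupils]
    return sum(scores) / len(scores)

# A pupil starting at KS1 APS 15 who makes exactly 12 points of
# progress (reaching 27.0) still falls short of the 27.4 estimate:
print(round(pupil_va(15.0, 27.0), 1))  # -0.4
```

This is why a cohort averaging just over 12 points of progress can still come out below its estimates: the expectation depends on where each pupil started, not on a fixed points rule.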

https://www.raiseonline.org/documentlibrary/ViewDocumentLibrary.aspx

To address this, I’ve built an Excel-based VA calculator tool, which calculates KS2 estimates for reading, writing and maths for all pupils in a cohort. It will also calculate potential VA scores for the cohort, and their statistical significance, if end-of-KS2 targets/predictions are entered into the columns provided. Please get in touch if you’d like a copy of this tool.

So, what does this mean for target setting? Well, as mentioned previously, Ofsted do not overtly define expected progress in terms of APS (except when it comes to VA estimates!). Instead, they now expect schools to talk about the percentages on track to make two and three levels of progress (again, see the subsidiary guidance, p6). However, many schools continue to set APS-based targets, often at 4 points per year (16 points across the key stage), a figure commonly prescribed by consultants and others. It is therefore worth pointing out, from my own analysis of schools’ VA pupil lists, that 16 points of progress on average would place a school at the 2nd or 3rd percentile (a school at the 1st percentile had a cohort which made 17 points of progress on average). Sixteen points of progress is highly aspirational, and around 97% of schools do not reach this lofty bar.

Instead, it may be more appropriate to aim a sublevel above the estimates generated by my VA calculator tool. Alternatively, since schools ranked at or above the 20th percentile in RAISE are usually significantly above the national average for VA, schools could use FFT PA 15th–20th percentile estimates (or SE estimates if preferred, though be warned these may be somewhat lower depending on your school context). Both of these methods are VA-based and linked to the prior attainment of the pupil, which is perhaps preferable to the one-size-fits-all approach to target setting currently employed in many schools.
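The contrast between the two approaches can be sketched in a few lines. The estimate figures below are invented placeholders (the real ones would come from the VA tables or FFT); the point is simply that a flat 16-point rule ignores prior attainment, whereas a sublevel above the VA estimate does not.

```python
SUBLEVEL = 2.0  # one sublevel is worth 2 APS points

def flat_target(ks1_aps, points=16.0):
    """The one-size-fits-all rule: KS1 APS plus a fixed 16 points."""
    return ks1_aps + points

def va_based_target(va_estimate, sublevels_above=1):
    """A sublevel above the pupil's VA estimate instead."""
    return va_estimate + sublevels_above * SUBLEVEL

# Hypothetical pupils: (KS1 APS, VA estimate for KS2)
for ks1_aps, estimate in [(13.0, 25.1), (15.0, 27.4), (17.0, 29.8)]:
    print(ks1_aps, flat_target(ks1_aps), va_based_target(estimate))
```

Note how the flat rule sets every pupil a target well above their estimate by the same margin, regardless of starting point – which is exactly why it places schools at the 2nd or 3rd percentile if achieved.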

So, those are some thoughts on KS2 value-added. What the future holds for this measure is anyone’s guess, but the Primary Assessment Consultation does suggest a VA floor standard in future, with a baseline at the start of Reception as a possibility. With the removal of levels on the horizon, it may be that some sort of VA measure becomes a more useful measure of progress. Watch this space!

And what about KS4? Well, that’s a topic for another day.

## Friday, 21 February 2014

### Some thoughts on Key Stage 1 data and recent Inspections

It is evident from recent inspections that Key Stage 1 has become a major focus for Ofsted, and there are increasing demands placed on schools’ tracking systems. Schools therefore need to ensure that their systems are capable of providing answers to the questions that are likely to arise. As detailed in the latest Ofsted subsidiary guidance, there is now a greater emphasis on percentages on track rather than progress in terms of points, but at Key Stage 1 both are relevant and systems need to be flexible.

Typical data requirements include:
- Current APS
- % on track for 2B+, 2A+, L3
- % on track to make expected progress
- % on track to make better than expected progress
- Points progress this year
- Points progress since the start of the key stage (for Year 2)
- Predicted APS by the end of the key stage
- Predicted points progress by the end of the key stage
- % on track to meet targets

These data should be readily available for each subject, year group and pupil group.

What constitutes expected progress across Key Stage 1 is a hotly debated topic and opinions vary greatly: many schools take different approaches to converting EYFS outcomes to national curriculum point (NCP) scores (EYFS ‘expected’ ranging from 3 to 7 NCP), whilst others avoid the conversion altogether and establish a baseline early in Year 1. ‘Expected’ progress from EYFS to the end of KS1 consequently ranges from 8 to 12 points depending on the method used. Any definition of expected progress is therefore entirely dependent on how the school sets its baseline for Key Stage 1, so a clear understanding, and consistent application, of the school’s definition is critical.

Moreover, there is no nationally recognised definition of expected progress across Key Stage 1 beyond the broad statements in the Ofsted subsidiary guidance (1) and Handbook (2), which results in further confusion and subjectivity:

1. The early learning goals do not translate precisely to National Curriculum levels. However, as a broad rule of thumb children who reach a good level of development at the end of the Reception Year ought to be reaching at least Level 2b by end of Key Stage 1. Children exceeding the early learning goals at the end of reception ought to be exceeding Level 2b at the end of Key Stage 1 and be reaching Level 2a as a minimum, and more likely Level 3 (Ofsted Subsidiary Guidance, p11, paragraph 33)
2. Evaluation of achievement in Key Stage 1 should take account of the proportions of pupils who have made typical progress or more from their starting points. An example of typical progress is for a pupil who has met the Early Learning Goals at the end of reception to attain Level 2b at the end of Year 2. Inspectors should take into account how well pupils with a lower starting point have made up ground, and the breadth and depth of progress made by the most able. (Ofsted Inspection Handbook, p34)

Again, there is no right or wrong answer to the question of what constitutes expected progress across KS1. As stated above, your definition of expected progress is entirely dependent on how you set your baseline for Year 1. It may differ from a neighbouring school’s, but that doesn’t make it wrong. Also bear in mind that ‘the DfE does not define expected progress in terms of APS’ (Ofsted subsidiary guidance, p6, paragraph 10), so it is not appropriate for Ofsted inspectors to state that expected progress equals, for example, 5 points per year. I would suggest requesting a copy of the guidance they are referring to if they make such claims.

Having said all this, personally, I’m not keen on setting low baselines that give the impression of high rates of progress. For example, I know many schools convert EYFS ‘expected’ (or 6 scale points on the old EYFS profile) to 3 national curriculum points and then set 12 points as the expected progress rate (3+12=15 points, i.e. 2b). This is no different in outcome to a school that converts EYFS ‘expected’ pupils to 7 points and then sets an expected progress rate of 8 points (7+8=15, i.e. 2b). It just appears that the former has made more progress than the latter – a case of smoke and mirrors – and I’ve heard headteachers give this as their reason for adopting the first approach: to show as much progress as possible. Whilst I understand this and sympathise to an extent, 3 points for a pupil who is at the expected level of development does seem very low when you bear in mind that 3 points equates to P5. Be honest: are those pupils, who reached a good level of development in EYFS, really at P5?
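The smoke-and-mirrors arithmetic above reduces to a few lines: both schools expect the same endpoint (Level 2b = 15 APS), but the different baselines produce very different-looking ‘progress’ figures.

```python
LEVEL_2B = 15  # end-of-KS1 expectation in APS points

low_baseline = 3   # EYFS 'expected' converted to 3 NCP
high_baseline = 7  # EYFS 'expected' converted to 7 NCP

print(LEVEL_2B - low_baseline)   # 12 points of claimed progress
print(LEVEL_2B - high_baseline)  # 8 points of claimed progress
# Same pupils, same outcome - only the baseline differs.
```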

The thing you notice about progress data in schools that set such low baselines is that around half the progress for the entire key stage is made in the first term of Year 1. Pupils start the year on the 3-point baseline and by Christmas they’re at Level 1b, so have apparently made 6 points (a whole level) in 3–4 months. They then go on to make another 6–8 points across the remainder of KS1, which looks highly suspicious to anyone scrutinising the data closely. The progress from Christmas of Year 1 onwards is probably realistic; anything before that is unreliable.

The big risk is that an inspector becomes a little suspicious of such high rates of progress and decides to subtract the claimed progress rate from the KS1 APS in RAISE, the result being the average starting point. If the figures are very low then it might raise alarm bells.
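That quick check is just a subtraction, sketched below with hypothetical figures: take the KS1 APS published in RAISE, subtract the claimed progress, and see what baseline that implies.

```python
ks1_aps_in_raise = 15.0  # hypothetical cohort average at end of KS1 (Level 2b)
claimed_progress = 12.0  # the progress rate the school reports

implied_baseline = ks1_aps_in_raise - claimed_progress
print(implied_baseline)  # 3.0 - equivalent to P5, which rings alarm bells
```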

One more thing on this subject. As discussed above, there are various claimed rates of expected progress across KS1, ranging from 8 to 12 points, none of which has any basis in national guidance. So, in theory, you can do what you like. However, aside from the fact that the 3-point baseline is, in my opinion, too low, it also causes problems for other schools. Many schools set their baselines higher (e.g. 6–7 points for EYFS ‘expected’ pupils) and consequently have lower rates of expected progress. The issue is that those schools, which in my opinion use a more realistic method, find themselves in difficult conversations with inspectors claiming they are not being aspirational enough, even though they expect the same in real terms as other schools: that pupils with a good level of development go on to achieve at least a 2b, as stated in the Ofsted guidance.

Obviously, the simplest answer to this is to baseline early in Year 1 and measure progress from there, thus avoiding the need to convert EYFS data to national curriculum point scores at all. If you do want to make the conversion (and I often do this in schools when setting up a new tracking system), I tend to offer the following as a guide:

Emerging = 4 points
Expected = 6 points
Exceeding = 8 points

But this is only a start point and should not be seen as rigid guidance.
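That guide amounts to a simple lookup, sketched below – a starting point only, to be adjusted to the school’s own context, not rigid guidance.

```python
# Suggested EYFS outcome -> national curriculum point baseline (a guide only).
EYFS_TO_NCP = {
    "Emerging": 4,
    "Expected": 6,
    "Exceeding": 8,
}

def ks1_baseline(eyfs_outcome):
    """Convert an EYFS outcome to an NCP baseline for KS1 tracking."""
    return EYFS_TO_NCP[eyfs_outcome]

# With a 6-point baseline, 'expected' progress to Level 2b (15 points)
# works out at 9 points across the key stage:
print(15 - ks1_baseline("Expected"))  # 9
```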

#### FFT

One of the big challenges schools face with regard to Key Stage 1 data is presenting a positive picture where attainment is low (i.e. below the national average) or where there is a declining trend. In these circumstances, it is vital that schools can present compelling evidence of ‘good’ progress to counter the attainment-centred data contained in RAISE. Beyond their own tracking data, it is recommended that schools access their FFT KS1 Self Evaluation Report, which provides a detailed breakdown of progress across Key Stage 1. These data compare the progress made by pupils in a school against that of pupils nationally with the same prior attainment (VA), or with similar characteristics and prior attainment (CVA), where prior attainment comprises the EYFS profile. The report identifies areas of strength and weakness, 3-year trends, and national rankings for attainment and progress. FFT KS1 data has proved useful in recent inspections where a school with low KS1 results was able to demonstrate that pupils had made good progress when compared against similar pupils nationally. Such schools have low rankings for attainment (e.g. 90th percentile) but high rankings for VA/CVA (e.g. 10th percentile).

In addition, FFT provides end-of-Key Stage 1 estimates for individual pupils in Years 1 and 2 to help with target setting. These may be set from the 50th down to the 5th percentile and can be based either on prior attainment alone (PA) or can take account of socio-economic factors (SE). This helps schools explore the outcomes pupils are likely to achieve in schools at different percentile ranks. FFT D (the 25th percentile SE estimate), for example, indicates the progress made by pupils in the upper quartile (top 25%) of similar schools. FFT A (the 50th percentile PA estimate) shows the levels achieved on average, and is therefore perhaps our best approximation of expected progress across Key Stage 1.