Why data matters
Although data can understandably cause feelings of fear and confusion, practitioners should try to look upon data as a friend. Many teachers make effective use of data for teaching purposes. The UK Data Service includes case studies of good practice (external link). But data can also be helpful in performance management, pointing towards how learning and teaching might be improved. For example, John Hattie has developed a system of ranking the interventions that most influence learning, based on a meta-analysis of studies from around the world. He reports that the average ‘effect size’ is 0.40 (an effect size is a standardised measure of the difference between two groups, which tells us how well something works). This means that anything above this hinge point has a greater-than-average influence on learning. From analysing data spanning 15 years and millions of students, Hattie identifies effective feedback, peer-to-peer learning and self-assessment as among the key drivers for improvement.
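Hattie’s notion of an effect size can be illustrated with a short calculation. The sketch below computes Cohen’s d, one common effect-size measure, for two entirely hypothetical groups of test scores; the data, group names and the comparison against the 0.40 hinge point are illustrative assumptions, not Hattie’s own figures.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Effect size: standardised difference between two group means."""
    na, nb = len(group_a), len(group_b)
    # Pooled standard deviation across the two groups
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical scores: a class taught with peer feedback vs. one without
with_feedback = [62, 70, 68, 74, 66, 71, 69, 73]
without_feedback = [60, 64, 63, 67, 61, 65, 62, 66]

d = cohens_d(with_feedback, without_feedback)
print(f"Effect size d = {d:.2f}")  # values above 0.40 would sit above Hattie's hinge point
```

An effect size above 0.40 would, on Hattie’s reading, indicate an intervention with above-average influence on learning.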
An understanding of data also enables practitioners to pass informed comment on wider educational news. For instance, consider recent reports about the performance of 15 year-olds in Wales as measured by the Programme for International Student Assessment (PISA) results, which ranks countries against one another. Here are some headlines.
- ‘PISA ranks Wales’ education worst in the UK’ (3 December, 2013) (external link).
- ‘PISA results: Wales going backwards in all core subjects’ (3 December, 2013) (external link).
- ‘IoD: PISA results “depressing and gravely concerning”’ (3 December, 2013) (external link).
Clearly there are serious questions to be asked about the performance of students in Wales compared to their international peers. However, by probing behind the headlines it’s possible to see that PISA, as with all assessment data, has limitations. It is based on a small sample of the performance of 15-year-olds in two-hour tests, distributed by the countries themselves, in three subjects. It is claimed that in some cases teachers prepare students by ‘teaching to the tests’, so the results do not genuinely reflect students’ ability to apply skills or subject knowledge in reading, mathematics and science. So it is useful to understand the context within which data is produced, and by and for whom.
To reach a balanced view, it is a good idea to review at least two opinions on a specific topic.
Ethical research involves democratic values, for instance respecting the rights of others to publish their research even when you do not agree with what is said.
Review these three sources about PISA data and note the differences. Why should you be interested in PISA and do you think too much is made of the UK’s position on the ‘international naughty step’?
- A NASUWT document in 2014 on using international tests (external link).
- A blog in 2013 by Rebecca Wheater, NFER Research Manager (external link).
- A talk in 2012 by Andreas Schleicher, Head of PISA (external link).
Types and uses of data
Data is usually classified as:
- quantitative data, which is concerned with quantities or numbers
- qualitative data, which is concerned with qualities expressed in words, pictures, sound or film.
When evaluating the performance of schools, Estyn inspectors and school leaders often begin with the All Wales Core Data Sets (AWCDS). This data pack was introduced to enable schools to compare their performance against that of other similar schools and to share good practice.
The AWCDS should be central to a school’s self-evaluation and can support teachers, leaders and governors by:
- considering current results and trends over a number of years, compared against local and national performance and against outcomes for the ‘family’ of schools (usually, each family comprises 11 schools grouped by the level of challenge they face, their main language, size and contextual profile)
- identifying possible strengths and areas of concern in learner performance
- investigating learning and teaching strategies used by high performing but contextually similar schools.
The National School Categorisation System, introduced by the Welsh Government in September 2014, categorises primary and secondary schools based on their performance data as well as taking into account the quality of leadership, learning and teaching in our schools. A simplified description of the support categories is as follows.
- Green: these are highly effective schools.
- Yellow: these are effective schools.
- Amber: these are schools in need of improvement.
- Red: these are schools in need of greatest improvement.
The support category is discussed with the school by regional consortia and then agreed with the local authority. The outcomes are moderated by a regional moderation board to ensure consistency within and across regional consortia, generating a final support category for each school. A quality assurance group comprising regional consortia and local authority representatives oversees this process; the Welsh Government has observer status.
The system is based upon three steps.
- A judgement in relation to performance and standards in the school, using a range of performance measures provided by the Welsh Government.
- A judgement in relation to the school’s capacity to self-improve based on robust self-evaluation in relation to leadership, learning and teaching.
- A combination of the two judgements, leading to a colour categorisation, corroborated by education consortia challenge advisers and agreed by the local authority.
Once categorised, schools receive a tailored programme of support, challenge and intervention.
The model is generally based on three-year weighted averages, designed to support longer-term analysis of a school’s position rather than judgements based primarily on a single year’s cohort of learners.
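The idea of a weighted multi-year average can be sketched as follows. The weights here (the most recent years counting for more) and the results figures are illustrative assumptions only, not the Welsh Government’s actual weighting scheme.

```python
def weighted_average(results, weights):
    """Weighted mean of yearly results; later years can carry more weight."""
    assert len(results) == len(weights)
    return sum(r * w for r, w in zip(results, weights)) / sum(weights)

# Hypothetical % of learners reaching the expected level, oldest to most recent year
yearly_results = [58.0, 61.0, 67.0]
weights = [1, 2, 3]  # illustrative: more recent years weighted more heavily

print(round(weighted_average(yearly_results, weights), 1))
```

Weighting recent years more heavily smooths out one-off cohort effects while still reflecting the school’s current direction of travel.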
You can find out more information on the model (external link) at Assembly in Brief.
Other key sources used by schools
Schools make use of the following sources of data.
- Data arising from standardised tests such as the National Reading and Numeracy Tests.
- School comparative reports, covering performance in end of key stage teacher assessments and the National Reading and Numeracy Tests, benchmarked against schools with a similar percentage of learners eligible for free school meals. These reports provide a graphical and, in the case of the National Reading and Numeracy Tests, standardised view of a school’s results set against those for the local authority and Wales, together with the five-year results trend for the school.
- Summary of secondary school performance (SSSP), including data on attainment of 15- and 17-year-olds compared to local and national averages.
- The Pupil Level Annual School Census (PLASC) including data on learner characteristics such as gender, ethnicity and special educational needs.
You can find out more about education data in Wales on the following websites.
Stats Wales (external link) is the Welsh Government’s website on education statistics in Wales. You will find detailed datasets for each local authority, for example, relating to:
- absenteeism rates
- examination and assessment results for the Foundation Phase and Key Stages 2 to 4
- financial statistics such as delegated budgets.
The data is presented as aggregated totals, which limits its use as a resource for detailed analysis.
My Local School
The Welsh Government introduced My Local School (external link), which presents selected information about individual schools for parents/carers.
Schools also use a range of qualitative data for monitoring and assessment purposes including interviews with learners, diary entries, journals, portfolios, teacher observations, photographs of learners’ work and video reflections (see subtopic 2.6 ‘Research approaches’).
There are a number of important things to remember when trying to make sense of data. To get a reliable picture of school performance it is better to look at trends in results over a number of years, rather than a single year, because cohorts vary in ability. Throughout the UK questions have been raised about the reliability of national test data (Mansell et al., 2009). It is important to bear in mind that test results only provide a ‘snapshot’ of a child’s learning.
Data is only a starting point for self-evaluation. It raises questions and points to explore further. While data can be used to set targets, caution is needed because these may then act as a ceiling to learner performance.
In reality, learners with similar prior attainment and circumstances can achieve a range of outcomes. This is because they are subject to different variables including:
- the quality of care and support they receive
- the quality of learning and teaching
- the level of support received from families
- the range of learning opportunities within the curriculum
- their record of attendance
- support from school leaders and governors
- the learners’ own resilience and attitude to learning.
As Estyn points out in its guidance on self-evaluation to schools:
While the analyses of performance data may raise some questions, the answers and the journey to improvement will come from within the school.
(Estyn, 2014: 8)
In the best practice, schools use data to track the performance of individuals and groups of learners. They regularly review what the data says to set appropriate targets for improvement. Trends in data (usually over three years) are particularly important, rather than relying too much on performance in a single year. But data is not the answer to everything.
The key for teachers and leaders is to find out the stories behind the data. To do this effectively, questions should be asked about the following.
- Attainment: how well learners are doing when measured in end of key stage teacher assessments, national tests and qualifications.
- Progress: the extent to which learners have made ‘additional progress’ over and above what they would ‘normally’ be expected or estimated to achieve given prior attainment and contextual factors, sometimes referred to as ‘value added’. This can be measured mathematically in terms of changes in ‘levels’ and reflects the progress made by learners while in the school rather than the standard of the school intake.
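The notion of ‘value added’ described above can be sketched numerically. In this illustrative example, a simple expectation line predicts each learner’s outcome from prior attainment, and value added is the gap between the actual and predicted outcome; the learner data and the trend line are hypothetical assumptions, not a real value-added model.

```python
# Hypothetical (prior attainment, actual outcome) pairs for a group of learners
learners = [(3.0, 4.2), (3.5, 4.4), (4.0, 5.1), (4.5, 5.2), (5.0, 6.1)]

def expected_outcome(prior):
    """Illustrative national trend line: outcome predicted from prior attainment."""
    return 1.0 + 1.0 * prior

for prior, actual in learners:
    value_added = actual - expected_outcome(prior)
    print(f"prior={prior}  actual={actual}  value added={value_added:+.1f}")
```

A positive value added suggests a learner has progressed beyond what prior attainment alone would predict; a negative value suggests the opposite. Real value-added models also adjust for contextual factors.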
- Achievement: how well learners do in relation to their ability, often seen as a combination of attainment and progress.
It is possible for learners to attain above expectations for their age but underachieve. On the other hand, learners may not attain end-of-phase national expectations for their year, but achieve well and make good progress, given their starting points. Can you think of scenarios when this might happen?
In the first instance, perhaps a group of able learners is ‘coasting’ and not being stretched enough, while in the second example a school may have high levels of transient learners (asylum seekers, Gypsies/Travellers, looked-after children), which impacts on attainment.
Look at the most recent data available in the AWCDS for your school. Use the following questions (Governors Wales, 2012: 9), to improve your understanding of attainment, achievement and progress within your context.
- What is the overall attainment reached by the end of each key stage?
- How do the attainment and standards reached by boys and girls compare within the school and with the national average?
- How does the school’s current performance compare with its previous performance?
- What is the trend in results over the last three or more years?
- Have some subjects/year groups shown a marked improvement or decline? If so, why and what strategies are in place to sustain and share good practice and bring about improvements?
- Are some individuals and groups of learners doing better than others? If so, why and what strategies are in place to sustain and share good practice and bring about improvements?
- Are learners making better or worse than expected rates of progress by the end of their time in school and in the intervening years? If so, why and what strategies are in place to sustain and share good practice and bring about improvements?
- Are some individuals and groups of learners (e.g. looked-after children, boys, girls, ethnic groups, each group of learners with special educational needs, more able and talented learners) and some subjects making better progress than others? If so, why and what strategies are in place to sustain and share good practice and bring about improvements?
Now compare how you think your school is doing in relation to the 10 questions posed by Estyn in its PowerPoint presentation: ‘How well are the all-Wales core data sets used to inform self-evaluation and planning for improvement?’ (external link).
Use of data
One of the suggested reasons why schools in London have improved their performance over the last decade is because leaders have become ‘relentless’ and ‘forensic’ in their use of data (Baars et al., 2014: 12). They have challenged underperformance in a ‘no excuses’ manner so that, by 2013, London secondary schools were the best performing in England. Schools Challenge Cymru, which draws on the strengths of the London and Manchester Challenge models, sets out to address underperformance in secondary schools in a similarly rigorous manner (Welsh Government, 2014).
Figure 2.4.2 shows how data might typically inform planning. A school might identify a particular issue arising from analysing trends in learners’ attainment in end-of-phase test results. For instance, the percentage of more able and talented learners achieving Level 5 in the primary phase, or the Level 2 threshold in the secondary phase, may be well below national averages or below that of similar schools. Closer scrutiny may reveal that boys significantly underperform in English when compared to girls. Or, from qualitative sources such as discussions or lesson observations, the school may conclude that more able and talented learners are not being stretched enough. This may result in leaders setting a whole-school priority within the school development plan with a quantifiable target, e.g. to increase the percentage of boys attaining Level 5 from 30 per cent to 40 per cent, and introducing performance management goals for all staff, such as making regular use of challenging activities. It may also lead to targets set for specific learners. These targets might then be monitored every half term and, if necessary, refinements made. The school development plan becomes a working document and its targets subject to regular evaluation.
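A half-termly check against a quantified target of this kind might look like the following sketch; the baseline, target and tracking figures are hypothetical.

```python
def progress_toward_target(baseline, target, current):
    """Fraction of the baseline-to-target gap closed so far."""
    return (current - baseline) / (target - baseline)

# Hypothetical target: raise the % of boys attaining Level 5 from 30% to 40%
baseline, target = 30.0, 40.0
half_term_results = [31.0, 33.5, 36.0]  # successive half-termly figures

for pct in half_term_results:
    closed = progress_toward_target(baseline, target, pct)
    print(f"{pct}% of boys at Level 5 -> {closed:.0%} of the gap to target closed")
```

Monitoring the fraction of the gap closed each half term makes it easier to decide early whether a target needs refining, as the paragraph above suggests.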
Over the course of this module, you have learnt a great deal about putting research into practice. This has not only introduced you to many key educational researchers, but has also helped you think about the direction of travel of your own research, as we discussed in Topic 2. In Topic 4, you will think about these key issues in the context of colleagues in your organisation, within your region, and across the world.