
The truth about the Phonics Screening Check


By Matt Walker

Last week the Department for Education (DfE) published the final report from NFER’s three-year evaluation into the impact of the Phonics Screening Check (PSC). The evaluation explored schools’ phonics teaching practices and sought to establish whether there is any evidence that the introduction of the check has had an impact on the standard of reading and writing.

The findings from the evaluation should be of interest to schools, teachers and policymakers and raise further questions about the best approaches to raising standards in primary settings. Selected findings from the evaluation are discussed below.


What has been the impact of the PSC on the teaching of phonics in primary schools during Reception and Years 1 and 2?

There is evidence that the introduction of the PSC has had some effect on phonics teaching and classroom practice. The evidence suggests that a majority of schools made changes to ‘sharpen up’ their phonics teaching, such as teaching at a faster pace, devoting more time to phonics, and carrying out more frequent, more systematic and better ongoing assessment. Schools also ensured that pupils were familiar with the kinds of pseudo-words (made-up words that test pupils’ skills in decoding unfamiliar words) they would encounter in the PSC.

There is also evidence to suggest that, since the PSC was introduced, teachers are making more use of the results to decide whether to review or revise phonics teaching plans or to help inform decisions about the support offered to individual pupils.

Yet, despite these self-reported changes to schools’ teaching practices, there is little evidence that many schools are teaching, or have moved towards teaching, systematic synthetic phonics ‘first and fast’ to the exclusion of other word reading strategies. The impacts on pupil outcomes of a systematic synthetic phonics approach to reading instruction, as opposed to a ‘mixed methods’ approach, are not discussed here. However, for better or worse, it is clear that many schools believe that a phonics approach to teaching reading should be used alongside other methods.

Has the introduction of the PSC had an impact on the standard of reading and writing?

Over the past three years, phonics attainment, as measured by scores on the check, has improved. Specifically, 74 per cent of Year 1 pupils met the expected standard of phonic decoding in 2014, compared with 58 per cent in 2012. When those pupils who retook or took the test for the first time in Year 2 are included, the proportion of pupils meeting the expected standard of phonic decoding by the end of Year 2 was 88 per cent in 2014, an increase of three percentage points from 2013.

The evaluation also sought to explore whether the improvement in the proportion of children meeting the expected standard of phonic decoding has resulted in better subsequent attainment or improvements in literacy overall, as distinct from just in phonics. As the check was introduced as part of a policy to strengthen phonics teaching in primary schools, it might be hypothesised that phonics teaching in general would improve as a result; or, more specifically, that the learning needs of individual children might be more effectively met. Either of these developments could be expected to lead to an improvement in attainment, in phonics specifically and/or in literacy more broadly.

Overall, however, analyses of pupils’ literacy (reading and writing) scores in the national datasets over four years were inconclusive. There were no improvements in attainment or progress that could be clearly attributed to the introduction of the check, nor any identifiable impact on pupil progress in literacy for learners with different levels of prior attainment. These findings should be viewed within the context of the methodological limitations of this study, namely the absence of a control group and the presence of a number of existing phonics initiatives in national policy.

What are the implications for policy and practice?

Our findings suggest that the introduction of the PSC has catalysed schools to ‘sharpen up’ their phonics teaching and/or to improve their phonics assessment. Yet the ‘truth’ alluded to in the title is that there is, at least at present, no conclusive evidence of improvements in pupil attainment or progress that can be clearly attributed to the introduction of the PSC. This is despite improvements in phonics attainment, as measured by scores on the check. More research is required to better understand the longer-term impacts associated with the introduction of the PSC, and with different schools’ approaches to reading instruction.

The final report of the Phonics Screening Check evaluation is available on the NFER website.


5 thoughts on “The truth about the Phonics Screening Check”

  1. Great that this study has been attempted, but you can’t really claim to have the ‘truth’ about phonics when you yourselves admit you cannot compare the results of those receiving phonics with those not receiving phonics.

  2. Comment from Matt Walker:

    Hi John, thanks for your comment. The evidence presented in our report, while far from perfect (for reasons that we discuss in the report), is the best available on the impacts resulting from the PSC. As I mentioned in the post, more research is required to better understand the longer-term impacts associated with the introduction of the PSC, and with different schools’ approaches to reading instruction.

  3. Hi Matt,

    Dick Schutz has made some very interesting comments about the NFER surveys of the Year One PSC via the ‘International Foundation for Effective Reading Instruction’ forum:

    http://www.iferi.org/iferi_forum/viewtopic.php?f=2&t=416

    My field is phonics and literacy, so I get to visit schools to observe their phonics provision and to provide professional development. I see identifiable patterns of practice that are not very efficient or effective for many of the children.

    Whereas the NFER survey is about asking teachers questions about their practice and their views, I think it would have been invaluable to explore practice more deeply by observing phonics teaching in action.

    I think it is important that schools aim for virtually 100% of their Year One pupils reaching or exceeding the benchmark of the PSC, and clearly we are not there yet, but more than 600 schools were in 2014. I look forward to learning of the national results for 2015.

    Warm regards,

    Debbie Hepplewhite

  4. Comment from Matt Walker:

    Hi Debbie, many thanks for your comments. As you say, our study focused on the views and actions of teachers, which was the design agreed with the DfE. However, I agree it would have been helpful to have observed phonics teaching in action. Perhaps this could be explored as part of any follow-up research in this area.

    Thank you also for signposting us to the International Foundation for Effective Reading Instruction forum. Dick Schutz makes some interesting comments about our study, the main criticism apparently being the lack of a comparison group, and Dick makes some suggestions for how alternative ‘controls’ could have been created. The main focus of our impact analysis was to investigate what impact the introduction of the check had on pupils’ literacy: we lacked an adequate comparison group because the check was introduced to all schools at the same time. Dick’s suggestions for alternative comparisons relate to schools’ use of phonics in the classroom, rather than the check itself. We did explore some variation in different approaches to teaching early reading, and followed up the results of two cohorts of pupils in the phonics check and at Key Stage 1. These results are summarised in section 2.3 of the evaluation report.

  5. Hmm. I think we could all agree that “more research is needed.” The “main idea” I’ve been trying to convey is that the PSC constitutes the dependent variable in a Natural Educational Experiment of international import. As stated in the Framework for the PSC:

    “The Phonics Screening Check is designed to be a light touch, summative assessment of phonics ability”

    (Attempting to “evaluate the impact” of the instrument changes the status of the PSC from dependent variable to independent variable, creating methodological havoc. But that’s a whole nother story.)

    “By introducing the Check the Government hopes to identify pupils with below expected progress in phonic decoding. These pupils will receive additional intervention and then retake the Check to assess the extent to which their phonics ability has improved.”

    So how is that coming along? Well, the evaluation established that the Government’s hope has been realized. However, as of the 2014 administration of the check, 26% of Year 1 pupils did not pass the screen and 12% of Year 2 pupils still did not pass. Internationally, compared with the “failure rates” on reading tests in other English-speaking countries, the experimental results to date are interocularly significant: they hit you between the eyes.

    Nationally, within England, the analyses of the PSC data to date have been limited to the national and local-education-authority level. However, the “action” is at the school and classroom level. For example, it has been reported that more than 600 schools have registered near-perfect PSC pass rates in Year 1. This group of schools constitutes a natural “treatment group” that can be compared with the “control group” of all other English schools. We don’t yet know what the “treatment” comprises, but a good guess is that it’s “within a context where phonic work is seen not as one of a range of optional methods or strategies for teaching reading but as a body of knowledge and skills about how the alphabet works, which all children should be taught”, quoting from your report.

    I’ve proposed a number of other feasible experimental comparisons of professional interest, and research begets research.

    More “evidence”, anyone?
