Scotland’s independent think tank

Scottish Survey of Literacy and Numeracy results – Keir Bloomer

Education

Making sound education policy requires information and evidence. Unless policy makers – whether at school, local authority or national level – know what is going on, they cannot make good decisions. Are standards in primary science rising or falling? How much would it cost to introduce a second foreign language into our schools? How does the performance of the most disadvantaged children compare with twenty years ago? So far as Scottish education is concerned, these are just three of the thousands of questions we do not have the information to answer.

It is for this reason that the Scottish Government introduced the National Improvement Framework. It sets out some clear, worthwhile aims and lists a number of key factors that will generate change. It provides a framework for gathering information. Over time, it will ensure that Scottish education has better evidence at its disposal.

Unfortunately, we are moving in the opposite direction. The figures from the 2016 Scottish Survey of Literacy and Numeracy published last week are to be the last of their kind. The survey, one of the most important sources of information on vital areas of learning, has been discontinued.

In its place will be a new system that is currently untested.  The government’s original intention had been to rely on standardised national tests to be taken by all pupils at certain stages of their education.   Unfortunately, it has bowed to pressure and agreed that the information to be published will be based on teachers’ professional judgments.  The tests will merely help teachers in reaching their judgments.

Teacher judgment is usually insightful and often quite comprehensive.  If you want to learn about a pupil’s progress, development, attitudes to learning and much more besides, a view based on teacher judgment is just what you need.  If you want to know how the nation’s children are coping with a particular aspect of numeracy, a test is better.

National figures based on teacher judgment of the extent to which young people were progressing through Curriculum for Excellence were published for the first time last December. The inconsistencies were glaring. If judgment is to be the basis of national measurement of performance, significant resources will need to be put into creating nationally consistent expectations and into external moderation of individual judgments. These do not appear to be high priorities but, unless this kind of action is taken, the results of the new system will be of little value.

There is a further problem with replacing the SSLN immediately. The value of surveys like the SSLN – and PISA, come to that – is that they test in a consistent way at regular intervals. They provide not merely a snapshot of standards in 2017; they enable comparisons to be made over time, showing whether standards are rising or falling. Until the new system has been operating for some years, this will no longer be possible. The only way to overcome this difficulty would be to run the two measures in parallel for a few years so that it becomes possible to see how the results compare. The government has refused to do this.

Unfortunately, the results from the SSLN since it began in 2011 have been profoundly worrying.

Numeracy was measured in 2011, 2013 and 2015, while literacy was the focus in 2012, 2014 and 2016. The disturbing feature of the results is that overall performance in each survey has been poorer than in the previous one. Thus, numeracy standards apparently fell between 2011 and 2013 and again between 2013 and 2015. The same is true of the three literacy surveys.

The literacy survey looks at three aspects of learning: reading, writing, and listening and talking. The 2016 results published last week showed performance in reading and in listening and talking holding roughly steady, but standards in writing declining at all stages, with a very dramatic drop in secondary 2. Only 49% of young people at this stage were found to be writing ‘well or very well’, compared with 55% in 2014 and 64% in 2012. Clearly something is seriously wrong.

Surveys like the SSLN tell us what has happened. They do not say why it has happened. They cannot answer difficult questions such as “Why could almost two-thirds of second year pupils meet expected standards in writing in 2012 but fewer than half four years later?” The Education Secretary, John Swinney, offered one explanation when he suggested that teachers are struggling with what he called too much ‘overbearing guidance’. I am sure he is right. I worry too that, in the secondary sector, teachers’ time and energy have been focused on changes to the examination system, leading many schools to give scant attention to the period of ‘broad general education’ in S1 to S3.

We can all speculate, and there is certainly scope for the government to invest in relevant research. What is certain, however, is that there are serious problems in relation to both literacy and numeracy. At the same time, the survey that produced these important, if depressing, findings has been discontinued and is being replaced by an unproven and suspect successor. Equally unfortunately, our ability to track change over time is being discarded.

Keir Bloomer is the chair of the Commission on School Reform