Running Records: Results of a brief survey (spoiler: they’re unpopular if you haven’t drunk the balanced literacy Kool-Aid)

Last week I sought information from teachers about being mandated to use Running Records for assessment and reporting. This followed an article I wrote about why these assessments were not fit for purpose. When I say Running Records, I mean both formal Benchmark Assessment System tests and less formal stand-alone Running Records.

I wanted to know several things about who and what was behind their use in schools. To say it was a massive outpouring of discontent would be an understatement. Here is the breakdown:

My first question was:

  • Are you obliged to use and submit data using Running Records?

I was shocked to hear that some schools and school districts make this mandatory. Since these assessments are not aligned with what we know to be the key components of reading, does this not equate to educational malpractice?

And that’s just the tip of the iceberg. What happened to teacher autonomy? There are teachers all over the world being forced to use these tests despite knowing full well that they are not valid.

Running Records fall short on the science, even if they do score well on the feels for some. I am aware that many teachers swear by them and love using them. Running Records certainly have a reassuringly structured feel to them. Supporters often express the view that they provide helpful insights. They like the way the results can be matched to resources and ‘levels’. They like the way they can be used in goal-setting and reporting. But I think we need to open a dialogue about replacing them with assessments that do all this and more. The ‘more’ I’m referring to is alignment with the findings of cognitive science on how reading works, along with validity, reliability and objective measurement.

Next, I wanted to find out who was behind these mandates, so I asked:

  • Who do you have to submit the data to (e.g. a supervisor, a school board, a school district etc.)?

Aside from submitting to immediate superiors, many teachers answered that they had to upload the data to a common portal, such as a Google Doc, and that in many cases the data was never checked, let alone acted upon. Some respondents were teachers in public schools who are paid to spend a great deal of time administering these tests. One teacher said:

“We enter the results onto a Google doc and nobody does anything with them. Leadership don’t check them and we never get any feedback about the results. They just sit there doing nothing. It’s the number 1 biggest waste of teaching time and so frustrating to have to do that instead of the explicit teaching my class require!”

Curious about their frequency, I asked:

  • How often do you have to use these tests in a school year?

Answers varied from “once every three weeks” to “twice a year”. Imagine the useful information we could have in schools if these assessments measured what was actually being taught.

The answers to the next question concern the use of teacher time. There seems to be an eye-watering amount of time devoted to Running Records. Many of my respondents were incensed to have their precious time wasted like this, but felt powerless to complain or change it. The question was:

  • How long, on average, does it take to complete the cycle, from first test to the last word of your report?

Some teachers couldn’t quantify an exact time, but answered, “about a week” or “by the time I’ve finished all the testing, it feels like I have to start it all over again”.

For those who did give a duration, the average was 23 minutes. Per child. For a class of 25 students, that’s over nine and a half hours per testing cycle. That’s a lot of time measuring what doesn’t amount to reading, especially if it’s done once every three weeks.

Aside from questions of wasted time, I was alarmed at the laxity revealed when I asked about training. This was a side of Running Records I wasn’t aware of. I had assumed all teachers were given deliberate, approved training at the tertiary level. Bear in mind, these assessments are ubiquitous, time-consuming and very complex. I asked:

  • What training did you do to run this assessment?

A representative sample of responses follows:

“I do not have training to conduct PM assessment, although my colleague offered to informally tell me what to do…”

“I have never been trained, just shown by a colleague and we worked together to apply graded to levels with some moderation with schools close by.”

“No training has ever been provided. I’ve taught all year levels at many schools. It’s assumed teachers ‘just know’ how to do it. It’s also assumed teachers know what they’re supposed to be assessing/looking for/ analysing/ commenting on.”

“I have no formal training in administration. How to is handed down from teacher to teacher.”

“10 or so years ago a fellow teacher showed me what to do.”

“We’ve never done any training on it and I’ve noticed that everyone does the test slightly differently thus making the results void.”

There isn’t a wow emoji large enough to express my incredulity.

In all fairness, the Australian Catholic sector seemed a little more rigorous in mandating and providing formal training, which would be encouraging should they decide to adopt higher-quality assessments one day.

So if there’s this much discontent, why not rebel? I posted an optional question at the end of the survey:

  • What would happen if you did not submit that data?

Answers ranged from “disciplinary action” to “not an option”. I did not get the impression that those who issued these mandates were at all interested in teacher agency or in open and honest collaboration. Fear ran through the responses.

When parents show me their children’s ILPs (Individual Learning Plans) and they contain some kind of target that refers to a reading level or a benchmark, I ask them to ignore it or to return to the school and ask for a valid assessment. Parents have less to lose and more to gain from doing this. Teachers, on the other hand, feel threatened at worst, voiceless and ignored at best. Teacher burnout is real, and with mandates like this, it’s no wonder.

For a series of #RunningRecordsReplacements, head over to my YouTube channel:

  • Single word reading test (The CC2)

  • Letter sound test (The LeST)


1 thought on “Running Records: Results of a brief survey (spoiler: they’re unpopular if you haven’t drunk the balanced literacy Kool-Aid)”

  1. I’m in South Australia and we are required to enter RR data into the DfE database at the end of September each year. We later receive a report comparing our progress with other schools. Reception students are expected to be at L5, Yr 1 at L15 and Yr 2 at L21. We use SSP to teach reading and our students read decodable readers once they can blend words confidently. But we also have to do guided reading using levelled texts. It is actively stipulated by our partnership director. It’s beyond frustrating that the people who are supposed to lead improvement in schools in our area continue to cling to RR data as the holy grail of measuring progress.