Still loving the National Student Survey

I didn’t plan to write a post on the NSS this year, but sometimes I see something so bonkers that I can’t hold my tongue. That came along last week in the form of a Guardian blog-post: The National Student Survey should be abolished before it does any more harm. And so, while my mate Derfel has already led the defence, and although I’ve made some of the points below before, here we go again.

Let’s tackle those NSS fallacies – most of them familiar enough – as rehearsed in the Guardian blog.

1. There must be something wrong with a survey that demonstrates high levels of satisfaction

Well: why? Surely it’s possible that if students say they’re satisfied, they are satisfied. And surely it’s also possible that satisfaction rates have gravitated upwards because we’ve systematically raised standards over the years since its introduction. We’ve worked hard, we’ve got better, and our students are getting a better experience. Why wouldn’t we want to consider that as a possibility? Why wouldn’t we want to scream it from the rooftops – especially when there are plenty of people outside of universities telling us we’re not worth it? Or should we, perhaps, devise a new measure that demonstrates once and for all that we’re crap?

Over many years of studying and responding to NSS results, I’ve been involved in reviews of a couple of programmes getting really appalling satisfaction rates. (Don’t worry: not recently and not my own.) And I’ve seen what a very powerful tool for reform those results can be. I’ve seen a DVC banging a desk and saying the results are shameful. And I can assure you that we fixed those programmes. Without the NSS, can I be sure those changes would have happened?

Maybe there’s not so much fixing left to be done. Possibly. Maybe there’s also a case, now, for a review of the NSS questions. Last year I argued in favour of the proposed ‘engagement’ questions, which I still think can help us to move forwards on the agenda of quality enhancement. There’s always cause, in other words, to look for the next horizon. But why trash what has been – and remains – such a powerful tool for reform?

2. It doesn’t demonstrate differences between programmes at an institution

Honestly! The Guardian blog-post appeared the day after the institution-level results were published. But here’s the thing: there’s more to come. Really! Programme-level data, comments – and people like me will devour the lot. Really.

And we’ll do that because the real value of the NSS is not whether it leaves my department one place above or below, say, Warwick – which is really neither here nor there – but what it identifies as priorities for the year to come. Because the micro-level data can be very, very specific.

In my department, for instance, we’ve had a couple of years of very high results on ‘prompt’ feedback and ‘detailed’ feedback, but significantly lower on feedback that has helped the students ‘clarify things [they] don’t understand’. Given that I’d rather like our students to understand things, and that I’d also rather like to protect my colleagues from crushing workloads, this tells me that we’ve got a problem: we’re working ourselves into the ground, but not working as effectively as we might. If we can fix that problem – easier said than done, granted – we could improve things for everybody.

3. The NSS fosters a ‘race to the bottom’

The perception here is that we have all cut reading lists, shortened assessments, and generally made life easier for students, so that they will be more satisfied. And the evidence?

In my part of the world, actually, most of the evidence is to the contrary. Most students, in my experience, want more, not less, assessment. Funny thing, that: something to do with the fact that they want to learn. Who’d have thought it? And our students are unquestionably reading more, not less, than they were 5-10 years ago. How we facilitate that reading may have changed, due to the digitization of material. Some people may argue that this is spoon-feeding, but it would be hard to argue that it’s a product of the NSS specifically, rather than of advances in technology more generally.

One of the bigger changes in my department, meanwhile, has been in contact hours. Again, students wanted more, and we gave them more. Again, some argue that this is spoon-feeding, but again there’s plenty of evidence that they value, more than anything, quality learning hours, not hours that deliver piles of information in generic contexts.

In fact my biggest beef with this argument is that it assumes that students don’t want to be challenged and don’t know what’s good for them. While they won’t always be right, of course, I think that assumption, as an assumption, is patronizing and wrong.

4. The NSS is a waste of resources

This is a new one to me: ‘just the cost of rewarding survey-completers with vouchers would cover a lecturer’s salary at many institutions’. Really?

For a start, of course, we don’t know who’s completed it and who hasn’t. So how are we all ‘rewarding survey-completers’? At Exeter this year I think we had banks of laptops and plates of donuts here and there: the message being, ‘stop for a few minutes, complete the NSS and have a donut’. I guess the donut was a ‘reward’, but it doesn’t strike me as excessive. Maybe some universities were letting them take the laptops home if they ticked the right boxes. Who’s to know?

---

So let’s, please, defend the NSS. I don’t for a minute think students should be our only source of evidence about quality teaching. That’s another debate. But nor do I think we should ignore their views, because the NSS has proved our students, year after year, to be astute and honest commentators on what we’re doing. And it’s shown us, on the whole, to be doing a bloody good job.

4 thoughts on “Still loving the National Student Survey”

  1. How to respond to NSS results? This is good: ‘Our findings revealed differences in how institutions approach the NSS based on their positions in the NSS league tables. In particular, institutions placed in the top 25% of the league tables appear to have a relaxed view of the NSS. They appear to put particular emphasis on improving the student experience and argue that this automatically triggers a higher satisfaction rate than being ‘obsessed with the NSS’ and improving league table position’ (http://www.qaa.ac.uk/en/Publications/Documents/Subscriber-Research-Role-of-Student-Satisfaction-Data-15.pdf). In other words: once you find that reactions to NSS results are dominating your education policy, you’re in trouble. The really good places are making the weather in their strategizing, and using NSS results to tidy up around the edges.
