Much Ado about Something – Jo Frankham

The effects of the National Student Survey on Higher Education: An exploratory study

Summary

Full paper (PDF)

This research suggests that National Student Survey (NSS) results do not represent a full or accurate picture of the key issues relevant to the quality of higher education. Low scores, for example, may originate with a very small number of students, and sometimes arise as a consequence of something quite outside the remit of the survey. In addition, the superficial nature of the data does not capture the complexity of the activity it sets out to measure, and the survey cannot differentiate between a serious problem and a more superficial one. Where real problems do exist, these seem not to be addressed via the survey, in order that individual staff and students are not ‘exposed’ or vilified. This response, which may be appropriate in the short term, does nothing to address the much more deep-seated problems that can continue unaddressed.

The NSS may actually be diminishing the ‘student experience’ in respect of the educational endeavour of higher education. The data suggest that the NSS is encouraging a more instrumental attitude to education amongst the student body. This economistic register has been reinforced by the introduction of the £9,000 fee, as students increasingly consider whether they are getting ‘value for money’. In such a context, higher education may increasingly be regarded as a transaction in which students pay for something that academics ‘deliver’.

Academics reported that the NSS has also encouraged a punitive attitude amongst senior managers who oversee the survey. This is evident in the ways in which the results are distributed, the public nature of the comparisons that are made, the requirements to respond to issues raised and the combative tone of much of the discussion around the survey results.

Elements of the research suggested a diminishment of professional autonomy amongst academics. The notion of ‘continuous improvement’ leaves little room for thinking about the complexity of the educational issues that might be expressed through student feedback. For example, in educational terms, intellectual struggle would typically be regarded as essential to work at this level; in NSS terms, however, intellectual struggle may manifest itself as dissatisfaction. The responses that are required to ‘problematic’ scores are also impoverished in educational terms, given the sometimes dubious and uncertain validity of the data and the speed with which people are expected to respond.

The circularity of positivistic data in the form of numbers also leads to increasingly closed systems in respect of professional dialogue and debate. Once a programme evaluation has been reduced to a number it has huge power. Numbers are both ‘transportable and transposable’ and give things an air of certainty. This is the power of the league table, of course, and helps to explain how academics become implicated in these mechanisms.

Academics need, and want, to do well in these league tables; this is partly a consequence of their public nature and partly a result of the ways in which they are used by managers. And, as this research illustrates, ‘problematic’ scores come back onto the table “again and again and again”, concentrating the impact of those scores on the people concerned and increasing their visibility. In contrast to students’ apparent indifference to the survey, academics reported a keen awareness of, and preoccupation with, the survey and its effects.

Although the NSS has encouraged the proliferation of other surveys and opportunities for academic/student exchange, it appeared that these mechanisms also tended towards the provision of ‘solutions’ to immediate and short-term problems/dissatisfactions rather than to dialogue about the purpose, meaning or content of education. This represents an impoverished view of the complexity and potential of student ‘voice’.

The NSS has clearly changed both what academics do, and how they describe what they do. It has also contributed to a considerable increase in workload in respect of preparing for and responding to the NSS. Requirements to focus on employability, in particular, have encouraged the proliferation of a discourse associated with skills which is more akin to training than to education. Course material that is challenging, and assignments which present students with a challenge, are clear foci for student expressions of dissatisfaction and concern. Given the public nature of NSS scores, the institutional emphasis on them, and academics’ desires to respond to student feedback, these provide the focus for much extra help being given to students in order that they feel less uncomfortable with these elements of their courses. This may help to explain the massive increase in students achieving first class and upper-second class degrees.

The academics who were interviewed for this study seemed, on the whole, to have become accustomed to this climate, while also expressing concern about, and critique of, elements of the survey and its implications. A danger of this climate is that academics become increasingly accustomed to giving students what (they think) they want, and what senior managers require, in order to satisfy the demands of audit mechanisms. In the process, questions of educational quality are not being prioritized or foregrounded in British higher education.


Through a glass darkly

In the final semester of the academic year, third-year students are invited to complete the National Student Survey. On university campuses there is a plethora of visual reminders to complete the survey. Many of these prompts include information on prize draws, and other incentives, for which students become eligible by completing the questionnaire. Students are also contacted directly by Ipsos MORI via email and/or mobile phone, inviting and encouraging them to complete the survey.

Somewhat surprisingly, given the visibility of promotional material and the incentives attached to completion, academics do not believe that the process is particularly significant to students. Gill: “They’re kind of not that bothered. We don’t know what this is. We get stuff sent to us all the time, so we just do it.” Maggie: “It’s a battle to get them to participate at times.” Roger: “They don’t engage in conversations with myself about it. I don’t know how much interest they actually have.” Amanda underlines the contrast that seems apparent between students’ and academics’ responses: “Our future hangs on it but we don’t hear them talking about it.” Most students, it seemed from academics’ accounts, fell into this disengaged category. Gill: “They don’t seem to know [what it is], even though it’s held up as this thing that they should have looked at in order to make the decision about coming here.”

Liz: “I will sometimes ask if they know what it is (laughs). Explain that it feeds into some things that feed into league tables and it’s a measure of the student experience and it will come out in the newspapers and so on. So to give them . . . because I don’t think they really know, necessarily what they’re filling in or what it feeds into. (. . . ) Because they fill in feedback forms regularly anyway . . . and they just end up kind of feeling, I think, like it’s the same kind of thing, because the questions are fairly similar.”

I was also told on many occasions that students “misunderstand the questions” (Gill). Students respond in ways that “are not what the survey intended. So they talk about ‘the university’ did not do this and then we are marked down” (Joe). Where academics see the qualitative comments that some students make, they discern a contrast between what they think the survey is for and what the students think it is for. Academics also reported that results fluctuate from year to year in ways they did not find easy to understand beyond differing cohort characteristics. Dale: “It fluctuates in terms of what they view the NSS is for. I think it varies in terms of their understanding of what they’re going to be asked . . .” Maggie reported that a particular ‘bugbear’ came up year after year and then “bizarrely enough this year it hasn’t”, although nothing had changed as far as she was concerned.

Jo Frankham
Reader in Educational Research, Faculty of Education, Health and Community
Liverpool John Moores University

This research was funded by the British Academy/Leverhulme


About Sean

Principal Research Fellow, Survey of English Usage, University College London
