NUS Guide To Surveys in Higher Education



www.nusconnect.org.uk

designed/produced by arc. www.arc-cs.com


Introduction

The feedback of students on their experience of higher education is gathered in many ways throughout the year, and it is essential that students' unions and institutions both listen and respond to this valuable information. Students' unions can use the surveys described in this guide as a tool for quality enhancement: to highlight areas which could benefit from improvement, strengthen campaigns, and accurately represent the student body. We have produced this guide to offer an insight into the key surveys that currently capture the views of students across the UK. The surveys discussed are:

• National Student Survey (NSS)
• Postgraduate Research Experience Survey (PRES)
• Postgraduate Taught Experience Survey (PTES)
• International Student Barometer (ISB)
• Destination of Leavers in Higher Education Survey (DLHE)

We hope that you will use this guide, and the section on the importance of being evidence-led, to achieve some excellent gains for students on your campus. Listening to the information provided by students in these surveys, and from data collected by your own institutional-level research, is the first step to mobilising a successful and effective campaign.

In unity, Usman Ali VP Higher Education



Contents

National Student Survey (NSS)

4

NSS case studies

8

Postgraduate Surveys

10

The International Student Barometer (ISB)

16

Destination of Leavers in Higher Education (DLHE)

18

Why is research so important?

20

Evidence-based campaigning

21

How to research

21

Further Information

ibc

Useful Contacts

ibc



National Student Survey (NSS)

What is the National Student Survey (NSS)?
The NSS is an annual survey open to all full-time final-year undergraduate students, and to part-time students in their fourth year of a higher education course. Starting in 2005, the survey runs across all publicly funded higher education institutions in England, Wales and Northern Ireland, and participating institutions in Scotland. Since 2008, further education colleges in England with directly funded higher education students have also been eligible to participate.

The survey asks participants how satisfied they are with the academic quality of their student experience. Across 22 tick-box questions, students rate the following areas on a scale of 1 to 5:
• Teaching quality
• Academic support
• Assessment and feedback
• Organisation and management
• Learning resources
• Personal development
• Overall satisfaction with their course

The survey is conducted by Ipsos-MORI, an independent organisation commissioned by the Higher Education Funding Council for England (HEFCE) to carry out the survey and aggregate the results. Detailed results are passed on to institutions and students' unions, and summaries are also published online (www.unistats.com) to help prospective students decide where and what to study.


Completing the NSS
Final-year students can complete the survey at any point between January and April. Each eligible student is first invited to complete the survey online, via an email sent by Ipsos-MORI by the end of February. Students who do not complete the survey online are then sent a postal survey to their registered address. If the postal survey is not returned, Ipsos-MORI telephones students to complete the survey; only three telephone attempts are made, after which the participant is not contacted again. Participants also retain the right to opt out of the survey at any point. The survey is available in these various formats so that those with difficulty accessing computers can still take part. In order for NSS data to be published, a 50% response rate is required at both course and institutional level.

The aim of the NSS
The NSS has two predominant aims: to provide prospective students with information about their chosen courses, and to act as a key driver for improving the student experience by identifying areas for change.

NSS for prospective students: The NSS is part of the Government’s drive to widen access and participation in higher education by providing students with accurate information on available courses, allowing them to make more informed choices about their education.


The NSS data which is made publicly available is published on unistats.com, where students can browse institutions, colleges or courses and see the released NSS data. This enables prospective students to view the satisfaction of the most recent students to have taken the course. Additionally, NSS data is used by some organisations in the creation of league tables such as The Sunday Times University League Table and The Guardian University Guide.

NSS for institutions and students’ unions: Through the NSS, institutions are given access to comprehensive information about what students are thinking and feeling on an annual basis in order to inform their delivery of academic courses and the resources and opportunities offered to students. Students’ unions and associations are also given access to the results for their particular institution. This gives unions a good grasp of what students are thinking as well as a solid body of evidence to help them campaign for improvements to the academic experience.

NUS and the NSS

NUS believes that the NSS is important because:
• It provides prospective students with information about the academic experience at the institution they are thinking about applying to.
• It recognises students' ability to comment on the academic quality of their own experience, giving added weight to the learner voice.
• It is a powerful tool for students' unions and course representatives to generate change, giving unions, their reps, and institutions access to data to improve the lives of students.

NUS driving change through the NSS
Since the survey started, many achievements have been made with the backing of NSS evidence. Students' unions and NUS have used the results to campaign on issues ranging from ensuring effective feedback mechanisms and improving campus facilities to drawing attention to inequality. The NSS has highlighted that at a national level, regardless of institution or subject of study, students tend to be united in their dissatisfaction with feedback on their assessed work. The 2010 NSS results show that 33 per cent of students were dissatisfied with assessment and feedback at their institution.

                              2005   2009   2010
Teaching quality               77     83     83
Assessment and feedback        59     65     66
Academic support               67     74     75
Organisation and management    69     72     73
Learning resources             76     81     80
Personal development           75     79     79
Overall satisfaction           79     82     82

The table above shows UK-wide results from 2005, 2009 and 2010. Assessment and feedback is consistently the area with the least satisfaction, and remains so even after a 7% rise over five years. Teaching quality remains the area where students are most satisfied.



Despite some increases in satisfaction since the survey began, overall satisfaction has risen by only 3% and remains at 82%, with no increase from 2009.

"Despite fees nearly trebling there has been no increase in student satisfaction. This year's National Student Survey is a wake-up call to university vice-chancellors. They must buck up their ideas and do far more to improve the experience they offer students."

Aaron Porter, NUS President

The Great Feedback Amnesty
Based on the NSS results, NUS began to work with students' unions to campaign on assessment feedback, and drew up a set of principles allowing students to benchmark the efficacy of their feedback. Students' unions across the country used this framework to lobby their institutions to standardise turnaround times for feedback on assessed work, introduce new methods of feedback and anonymous marking practices, and review their existing policies and regulations in this area. The national campaign was also documented in Times Higher Education.


Examination feedback
In the summer of 2008, following the call in NUS' Feedback Principles for feedback on exams, NUS ran a campaign on exam feedback. In addition to a campaign toolkit for students' unions, NUS produced stickers for students to put on their exam scripts, allowing them to visually demonstrate their dissatisfaction with the lack of exam feedback at their institution. Students' unions across the country distributed over 20,000 stickers, sparking dialogue with their institutions about the changes needed for exam feedback to be consistently implemented.

Feedback and Assessment Campaign Toolkit – F.A.C.T
September 2010 saw the launch of F.A.C.T, a 200-page folder containing briefings on a range of feedback and assessment issues, from anonymous marking to e-submission. The publication also includes practical help and guidance on campaigning on these issues, with case studies, useful information and tools. If you haven't got your copy, please email nss@nus.org.uk

Promoting the NSS – the role of Students' Unions and Associations
Promoting the NSS is vital to ensure that your institution reaches the 50% response rate threshold and therefore gets its NSS data published; publicly available data is a useful tool to strengthen your argument and back campaigns.
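For unions tracking promotion progress, the 50% publication threshold is simple arithmetic. The following is a minimal illustrative sketch only; the course names and response counts are invented, and real figures would come from your institution:

```python
# Illustrative sketch: checking whether courses have reached the 50%
# response rate needed for NSS data to be published. All figures below
# are invented for the example; real counts come from your institution.

def publishable(responses, eligible, threshold=0.5):
    """True if the response rate meets or exceeds the publication threshold."""
    return eligible > 0 and responses / eligible >= threshold

# Hypothetical courses: (responses received, eligible students)
courses = {
    "History": (48, 90),
    "Physics": (30, 65),
    "Law": (110, 200),
}

for name, (responses, eligible) in courses.items():
    status = "publishable" if publishable(responses, eligible) else "needs more responses"
    print(f"{name}: {responses}/{eligible} ({100 * responses / eligible:.0f}%) - {status}")
```

A union might run a check like this weekly during the January–April survey window to target its promotion at the courses furthest from the threshold.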


OUR TOP TIP Tell students what the NSS has done in the past! Prove to students that the NSS is worth completing and it helps to change things by telling them what changes or improvements have been made at your institution! Closing the feedback loop and informing your student body of your NSS results is an important part of the process!

Students' Unions using results for change
Since the NSS began in 2005, individual students' unions and associations across the UK have made huge changes based on their NSS results. Here is a small snapshot of what the NSS has helped to achieve:
• Greater promotion of policies, practices and support available to help students.
• Creation of minimum standards or a minimum turnaround time for feedback on assessed work.
• Positive promotion of the role of students' unions as partners in delivering changes that are valued by students.
• Increased or reformed contact hours or teaching time.
• Evidence for quality assurance and enhancement processes.
• Reforms to assessment practices, such as anonymous marking.
• Increased funding and investment for teaching spaces and key facilities.
• Renewed focus on student engagement through investigation of and investment in course rep and personal tutoring systems.
• Celebrating good practice through Teaching and Personal Tutoring Awards.
• Providing inspiration for institutional and union surveys and focus groups.



Case studies

Using National Student Survey data for the Student Written Submission at Glamorgan SU
Glamorgan SU used the NSS to clarify areas for development and create an accurate Student Written Submission (SWS), fully supported by their institution. To inform the submission made on behalf of their student body, Glamorgan SU used the NSS results to identify the key themes highlighted by students, searching the textual comments and examining the statistics. From looking at other institutions' audit end reports, they noticed that NSS data was often singled out and specifically mentioned by auditors. Working together, the union and institution decided to clearly show how the SWS had been used for internal quality enhancement purposes. For the union, the NSS was "…probably one of the most powerful sources in writing the SWS", as it allowed them to identify key themes and signpost areas for further investigation. They could then undertake more research by looking through committee minutes, feedback from course reps on Facebook and internal statistics, and reflect on how well the institution had responded to problem areas highlighted by the NSS in the past. In addition, both union and institution ensured that their submissions were consistent where possible, highlighting the positive steps taken by the institution in response to NSS data.

Tips from Glamorgan SU
• Glamorgan found that of at least 20 institutions reviewed or audited since early 2006, 15 end reports mentioned the NSS data and how it had been used, making it an essential point of reference for your SWS.
• Glamorgan SU state that "the NSS data should never be used in isolation; it should be combined with other sources of information."
• Why not have a look at course reps' feedback on Facebook? Students are likely to write freely there about their course experiences.
• The NSS results site at www.ipsosmori.com/nss/results allows unions to break down data by age, gender, ethnicity, disability and more.
Contact: Course Reps Coordinator at Glamorgan University

Reading University Students' Union
As part of their preparation for their university's institutional audit, Reading University Students' Union conducted a huge feedback survey of more than 1,500 students. The union combined this with student focus groups, campaigns research, the International Student Barometer and the National Student Survey. As part of their submission to the audit, the union also produced a student-focussed video to supplement their written content, allowing auditors the chance to see Reading students' views clearly



and highlighting the student perspective in a more interesting and innovative way. The video, made by students, is available on Facebook and the union's own website.

Westminster Students' Union
After receiving poor results, Westminster Students' Union met with the school management board to discuss the NSS, highlighting both the positive and negative aspects of the results. The union examined its open comments for patterns beyond the statistical results, and compared its data with that of peers at other institutions, to establish a case for change. This eventually led to new academic models being put forward in response to students' concerns, promising less assessment and more credits for their work. Following this, the union sent students a link explaining the positive changes made on the back of the NSS data, as they felt it was crucial to demonstrate how feedback had been used and how effective students' contributions to the NSS had been. They felt that "…students didn't often make the link between their feedback and the changes made." In addition, the NSS helped to give new focus and priority to assessment and feedback, and drove the implementation of a new calendar, extended opening hours and changes to the personal tutoring system.

University of East Anglia Students' Union
By breaking down their NSS results by department, and using internal survey results, UEA students' union were able to demonstrate that students with anonymous marking were happier with their assessment and feedback mechanisms than those without. This added powerful impetus to a long-standing campaign for anonymous marking that began in 1999 and finally demonstrated a powerful case for change in 2005. Though initially it seemed the battle had all but been won when the university agreed in principle to move to an anonymous marking system, the institution later secured a year's delay in its implementation, and by 2007 students were still waiting to see the change they had been promised. Not content to let the university back down on its commitment, new officers took up the campaign and seized the opportunity to push the issue back to the fore. After a lengthy battle and much resistance from academic departments, officers finally convinced the university to implement anonymous marking in practice, stage by stage across the institution over the coming years. Officers involved in the campaign learnt that "...it is important to strike whilst the iron is hot and winning in principle is not the same as winning in practice - campaigns need to be followed through each year to ensure that change happens and secure a win for students."



Postgraduate surveys: PRES and PTES

What are the postgraduate surveys?
The Higher Education Academy, committed to enhancing learning and teaching in UK higher education, coordinates two national postgraduate surveys, PRES and PTES.

PRES stands for Postgraduate Research Experience Survey, the survey of the experience of PhD students and other postgraduate research students. It has run three times, in 2007, 2008 and 2009. In 2009 it surveyed 65,000 students in 82 HE institutions and received responses from 18,644 students.

PTES stands for Postgraduate Taught Experience Survey, the survey of the experience of postgraduates on Masters degrees and other taught programmes. It has run twice, in 2009 and 2010. In 2010 over 32,000 students in 76 institutions responded to the survey.

Students that have taken part in PRES and PTES have a similar demographic profile to postgraduate students across the UK, meaning that PRES and PTES provide an accurate reflection of UK postgraduate students' opinions. National survey results are also fairly consistent year-to-year, suggesting that the surveys are a reliable tool for analysing the postgraduate experience.

Institutions can benchmark their results against national aggregate data published on the Higher Education Academy website at: http://www.heacademy.ac.uk/ourwork/supportingresearch/postgraduatework

Institutions may also participate in benchmarking clubs, allowing them to benchmark their results against similar types of institution, e.g. research-intensive, small and/or specialist, and modern.

How are the surveys organised at institutional level?
Institutions are invited to participate up to November of the year in which the surveys are scheduled to run. The surveys are run in the spring, and results are collated in late spring/summer. Institutional results are totally anonymous; there is no public information available about any institution's PRES and PTES results.

Every participating institution has a nominated PRES and/or PTES officer, whose job it is to coordinate the survey(s) for their institution. This person is most likely to be a member of administrative staff, possibly based in the Graduate School or its equivalent.

It is not always obvious to students' unions who the PRES or PTES officer for their institution is. If you would like to know whether your institution takes part in PRES or PTES, and who your officer is, contact NUS for more information.



What is the rationale behind PRES and PTES? Unlike the National Student Survey, PRES and PTES cannot provide any public information about postgraduate course quality and standards. As such, they are useful primarily as a tool for researching and understanding the postgraduate experience, taking steps to enhance or improve that experience, and measuring the impact of enhancement activities within institutions.

What do PRES and PTES cover?
Both surveys measure students' experiences of their degree courses. As in the National Student Survey, respondents are asked to indicate the extent of their agreement with a set of statements on a five-point scale, from '1 = strongly disagree' to '5 = strongly agree'. The headline result for each statement is the percentage of respondents who agree or strongly agree with it. Questions are organised into different areas of the student teaching and learning experience.
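To make the aggregation concrete, here is a minimal illustrative sketch of how a headline "% agree" figure is derived from raw five-point responses. This mirrors the description above, not the Higher Education Academy's actual processing, and the sample answers are invented:

```python
# Illustrative sketch: deriving a headline "% agree" figure from
# five-point Likert responses (1 = strongly disagree ... 5 = strongly agree).
# Mirrors the aggregation described in the text; not the Higher Education
# Academy's actual processing pipeline.

def percent_agree(responses):
    """Return the percentage of respondents answering 4 (agree) or 5 (strongly agree)."""
    if not responses:
        return 0.0
    agreeing = sum(1 for r in responses if r >= 4)
    return 100.0 * agreeing / len(responses)

# Ten hypothetical answers to one statement on the supervision scale
sample = [5, 4, 4, 3, 5, 2, 4, 4, 1, 5]
print(f"{percent_agree(sample):.0f}% agree")  # 7 of the 10 responses are 4 or 5
```

Note that a neutral "3" counts as not agreeing under this convention, which is why headline agreement figures can understate how few students are actively dissatisfied.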

Example: the supervision scale from PRES
Respondents mark one response per statement on the 1–5 scale:
a. My supervisor/s have the skills and subject knowledge to adequately support my research
b. My supervisor/s make a real effort to understand any difficulties I face
c. I have been given good guidance in topic selection and refinement by my supervisor/s
d. I have received good guidance in my literature search from my supervisor/s
e. My supervisor/s provide helpful feedback on my progress
f. My supervisor/s are available when I need them



PTES asks respondents about:
• Motivations for study and for choice of institution
• Quality of teaching and learning
• Assessment and feedback
• Dissertation and supervision
• Organisation and management
• Learning resources
• Skills and personal development
• Career and professional development
• Overall satisfaction

PRES asks respondents about:
• Motivations for research and career aspirations
• Supervision
• Research and transferable skills development
• Infrastructure (such as library and IT facilities)
• Intellectual climate (such as a good research community)
• Goals and standards
• Thesis examination
• Professional development and career
• Roles and responsibilities of student and institution
• Teaching opportunities

Institutions are also free to add their own questions to either survey, tailoring it for their own student cohort.

What are the key issues emerging from the surveys?
Overall, postgraduates' satisfaction with their course experience is similar to that of undergraduates: 84–85% of both taught and research postgraduates agree that their course met or exceeded their expectations.

Respondents to PTES 2010 were positive about teaching and learning, e.g. 83% agree that their course is intellectually stimulating with enthusiastic staff, and 78% agree that their programme has developed their research and transferable skills. Areas of concern include assessment and feedback: only 56% agree that they receive feedback in time to improve the next assignment, and 58% agree that feedback helped them clarify things they had not understood. PTES also shows us that the primary reason for undertaking a postgraduate degree is career-related, either to improve employment prospects in general or to progress in a specific career path.

Respondents to PRES 2009 were most positive about their experience of supervision: 76% agree that their supervisor makes a real effort to understand difficulties faced, and 75% agree that their supervisor is available when needed (although this still leaves 1 in 4 who do not agree with either proposition). Areas where respondents to PRES were less positive include intellectual climate and professional and career development. 49% agree that they feel integrated into the department's community, and



58% agree that they have opportunities to become involved in the wider research culture. Only 37% agree that they are encouraged to think about the range of career opportunities available to them. Overall, both surveys demonstrate clear points of intervention that could help to improve the postgraduate student experience.

How can students’ unions engage with PRES and PTES? From 2010, following discussion with NUS, the Higher Education Academy has agreed to encourage PRES and PTES officers to engage with their students’ union in working on their postgraduate surveys. This could translate into students’ unions arguing for the adoption of the surveys, promoting PRES and PTES among postgraduate students, working with institutions to identify institution-specific questions, interpreting results, consulting with students and/or supporting the delivery of enhancement activities. It is worth thinking about how union activity around NSS could be replicated at postgraduate level. PRES and/or PTES can offer a useful way of engaging with your institution on postgraduate issues. This may mean building relationships with individuals that the union has not previously engaged with, such as postgraduate deans or administrators. PRES and PTES results can also help unions to build relationships within their postgraduate community by providing a better understanding of

postgraduate issues and a talking point on which to engage postgraduates. Involving postgraduates will help to strengthen any planned campaign for change or attempt to improve the student experience.

Think your institution should adopt PRES and/or PTES? Make the arguments! If your institution does not survey its postgraduate students… • The postgraduate student experience is very distinct from that of undergraduate students and should be addressed separately • If postgraduates are not surveyed it is too easy for their issues to fall off the agenda • Survey data provides robust information about where to intervene most effectively and with the greatest impact to improve the student experience • Information about the student experience makes it easier to understand what postgraduates want and market/tailor courses to postgraduate expectations (meaning more effective recruitment) • Postgraduate students need the opportunity to help shape their course experience, and surveys provide a focus for strengthening the student voice in the institution • Adopting PRES and/or PTES is less resource-intensive than designing an internal survey – your institution only needs to pay for a licence, and the survey and guidance on managing it are provided by the Higher Education Academy



If your institution already conducts an internal survey of its postgraduates… • Internal surveys can be very useful (as long as students’ unions have access to the results!) but do not include the ability to benchmark against similar institutions and national results • Internal surveys require a lot of resource input to be effective • You cannot share good practice with other institutions about what you have done with the results or benefit from case studies or networking opportunities provided for PRES and PTES officers • PRES and PTES have been tested for robustness at the national level • Adopting national surveys contributes to the national picture of the postgraduate experience and is a service to the academic community


Sample agenda for your first meeting with your PRES or PTES officer
The first time you meet with your PRES or PTES officer it might be unclear how you can work together effectively. The following sample meeting agenda may help to structure that initial conversation!

1 Background: has the institution run these before, what were the results and have any changes been made as a result?

2 Timing and management of the survey: how will it work, when will it be run and when will results come in? Which committees or individuals make these decisions and is there student representation on these?

3 Promoting the survey to students: how can the union be involved?

4 What is likely to happen when results come in? Can there be an agreement that the union/individuals within the union will have access to the results? What form will these come in? Does the institution have any particular concerns about sharing results and what can the union do about these?

5 In principle, how can the union be involved in interpreting results and suggesting areas for further work?


Keep in touch with NUS on your experience of postgraduate surveys
NUS will continue to work with the Higher Education Academy on PRES and PTES, and to share good practice with unions on engaging with the postgraduate surveys. If you have a case study of how your union has engaged with either PRES or PTES, let us know. You can also contact us for advice or guidance should you run into problems, if you are unsure whether your institution already runs PRES or PTES, or if you want to talk through any other issue.

Get in touch: postgraduates@nus.org.uk

You can also stay up to date with postgraduate issues via our mailing list: nus-postgrad@jiscmail.ac.uk

All postgraduate news, information and resources are posted on NUS Connect: http://www.nusconnect.org.uk/campaigns/postgrad/



International Student Barometer (ISB)

What is the International Student Barometer?
The International Student Barometer (ISB) is a privately run survey of the international student experience, run by the International Graduate Insight Group (i-graduate). The ISB is described by i-graduate as 'a global benchmarking study of expectations and experiences among international students'. Individual institutions pay to take part in the survey.

Currently 600 higher education institutions from 16 countries are signed up to the ISB. Its global reach is best seen in the range of international institutions that take part in the survey, which include, for example, Yale University, the National University of Singapore and the University of Melbourne. It is currently the largest survey of its kind in the world and collates feedback from over 150,000 students each year.

The ISB can help you as a students' union get a more holistic picture of the international student experience at your institution. As it covers a wide range of areas, from learning and teaching to accommodation satisfaction, it can help you identify areas which can be improved for international students.

Who are i-graduate?
i-graduate, the company behind the ISB, deliver a range of market research and consultancy services. They describe themselves as 'an independent benchmarking and consultancy service, working to enable positive change in the education sector'. Other surveys they provide include the Student Barometer, which collates data on the domestic student experience using the same methods as the ISB; the ICEF Agent Barometer, which monitors the opinions of education agents worldwide (who often help international students into higher education in the UK), providing valuable insight into agents' perceptions of international education markets and institutions; and the Alumni Barometer, which provides information to help institutions improve their work with their alumni.

How do institutions take part in the survey?
Institutions individually pay to take part in the survey. For the fee they receive:
• Questionnaire design, construction and management
• Data analysis (institutional analysis and comparative benchmarking)
• Presentation of results
• Provision of the raw data
• Country Insight Reports – to inform recruitment strategies
• An e-handbook – providing a comprehensive guide to the Barometers
• Marketing materials to support the survey process (downloadable posters, website buttons etc)

What does the survey measure?
The International Student Barometer gives institutions an overall picture of the international student experience, from pre-arrival information to post-study plans. The survey looks at the following areas:
• Demographics – such as nationality, funding of studies etc.
• Courses – including areas of study, course type etc.
• Decision-making – what helped students choose their country of study and college, and who helped them choose
• Application – did students use agents, what support did they receive etc.
• Visas – visa type, application and support
• Enquiry to acceptance – service, communication
• Arrival and orientation – induction experience
• The learning experience – teachers, course content, facilities, employability
• The living experience – accommodation, friends, funding, internet access
• Support services – advice on various factors including health, employment, visas
• Recommendation – would students recommend their institution to others?
• Future plans and career aspirations
It enables institutions to benchmark themselves against others.

How does the ISB measure the international student experience?
The ISB runs real-time activity bursts at two significant points in the year, entry and exit, using a semi-standardised online questionnaire format, adapted and customised for each partner institution. i-graduate manages the whole research process from planning to reporting; all survey design, hosting, data extraction and analysis is conducted for institutions.

How can students' unions engage with the survey?
The International Student Barometer enables you to take a detailed look at the international student experience. The data allows you to examine the key drivers of international student satisfaction, and allows institutions to benchmark themselves against others. For students' unions, the ISB can help you better understand your international student population specifically and gain insight into their experience pre-arrival and during induction. These areas can be unique to the international student experience, and they are areas students' unions can work to improve. The data is not public, so it is up to individual institutions whether they share the data they receive with their students' unions. You will need to ask your institution for access to the results if you do not already have this information. For more information about i-graduate you can email info@i-graduate.org or follow them on Twitter via @igraduate

17


Destination of Leavers in Higher Education (DLHE)

What is it?

Each year over 60,000 people who have recently qualified from a higher education course at a university or further education college are invited to complete a five-minute survey to establish:
• What type of employment or further study they were engaged in
• Their income on one specific day in the survey period, approximately six months after leaving their course
The DLHE is carried out by the Higher Education Statistics Agency (HESA), the central source for the collection and dissemination of statistics about publicly funded higher education in the UK. The two-stage approach to the DLHE survey was developed so that a more accurate picture of graduate destinations can be traced over time. The survey replaced the earlier First Destination Survey in autumn 2003. The results are often used in league tables compiled by newspapers.

There are two stages to the survey:

1. The first stage is a census of individuals who have completed a higher education course in the UK. It is carried out six months after the course ends (or after a longer period for a minority of eligible leavers) and is often referred to as the Early Survey. It takes the form of a questionnaire sent by post to a sample of graduates by the university; it can also be completed online or by phone.

18

2. The second stage, called the DLHE Longitudinal Survey, aims to find out what graduates have been doing over a longer period since completing their studies. It is completed around three and a half years after graduation.

All respondents are asked what they were doing on a given date after graduation, and subsequent sections follow accordingly:
• If employed: role, salary, employer, location; importance of qualification; reasons for taking the job; how they found out about the job
• If engaged in further study: course, subject, institution, funding; reasons for studying
• If a newly qualified teacher: whether employed as a teacher, and the type of school or college
All are asked whether the course they completed was part time or full time, their reasons for undertaking it, whether they were employed during the course, and whether and how their employer supported them. The survey can be completed by telephone, post or online.

The results of the DLHE can be found on the Higher Education Statistics Agency website, www.hesa.ac.uk, and on www.unistats.com. Institutions can use the information students provide to advise current students about the opportunities that might be available to them after completing a course. Each university can also analyse its own data to create profiles of each department’s graduates and gain a clearer understanding of what graduates of higher education do; students’ unions can approach their institution for access to this analysis to assist in relevant campaigns and representation.


DLHE for HE in FE

The DLHE covers not only leavers from higher education institutions, but also those who have studied HE in FE. The eligibility criteria for participation are as follows:
• All students who completed directly funded, prescribed higher education courses at FECs and at institutions that are members of a HEFCE-recognised funding consortium
• Students who completed their course between August and December are included in Tranche 1 of the survey; those who complete their course later in the academic year (between January and July) are included in Tranche 2

It is important that students’ unions remind students to complete the survey: it lets their institution know how their degree has affected them, helps future students, and shapes national developments in education policy. http://www.hesa.ac.uk/dox/datacoll/C07018/ENGLISH_HESA_Quest_4pp_April_08.pdf

How can students’ unions play a part in the DLHE?

Students’ unions can use the information to find out more about how students feel, e.g. their motivation for study and how they funded their education, by visiting the HESA website or by asking their institution for access to results for their further education college. The results of the survey can be used in students’ union campaigns to support the continuing improvement of the experiences of students at further education colleges, ensuring that the courses they study are consistent with future needs and expectations. The results also enable students’ unions to examine the support they could give students in developing skills for future employment.

19


Why is research so important?

The enhancement of the student experience relies on good quality, widespread feedback from everyday students. For students’ unions, this feedback aids quality enhancement through student written submissions, strengthens campaigns, and can inform your plans, priorities and direction.

Knowing what your student body thinks about their student experience is vital in order to represent their issues and concerns correctly. Ensuring that you make available suitable ways for them to express these views is therefore fundamental to being a responsive and representative students’ union. Conducting research and utilising survey data allows you to highlight themes of concern and areas of satisfaction, and to get a general idea of issues that require further attention. The process of collecting and analysing student feedback should not be regarded as an end in itself; the ultimate aim is delivering real, useful, valued and perceptible outcomes for students. For unions, being evidence led is increasingly important to ensure institutions and students take the union seriously.

Strengthening your position

Having representative evidence from your student body immeasurably strengthens the position of the students’ union or association when presenting arguments to the institution. The student voice cannot easily be ignored by institutions if the data is trustworthy and explicit. Basing your arguments on fact rather than presumption not only makes the students’ union appear professional but also removes many potential counter-arguments from institutions questioning the representativeness of students’ unions.

EVIDENCE TRIANGLE

Quantitative data:
• NSS
• PRES/PTES
• Module evaluation
• University internal surveys
• ISB

Policy:
• University committee minutes
• Charity briefings
• QAA/HEA work
• NUS briefings
• Government agencies
• Union committee minutes

20

Qualitative data:
• Course reps
• G.O.A.T
• Part time officers
• Survey free text comments (NSS)


Evidence-based campaigning – triangulation of data

When campaigning on an important issue and presenting to the institution, make sure you collate as much evidence as you can to strengthen your position. The four levels of evidence below, when combined, present a fully evidenced argument. Using at least three of the four will ensure that your position is covered from multiple angles, making it extremely difficult for the institution to dismiss your claims:

Level 1: National data – Make sure you use some national and comparable data, such as any of the surveys mentioned in this guide. Survey results such as the NSS are powerful because the survey is run externally to the institution, made publicly available, and easy to use to compare your institution with others.

Level 2: Regional data – There is a large amount of regional data that may also strengthen your argument. The local authority or council may hold data associated with your campaign, particularly if it concerns housing and accommodation, or if you need demographic information about the local area.

Level 3: Institutional data – Make sure you look at all the internal institutional data available to you. The majority of institutions conduct annual internal surveys which may ask questions on the things you want to campaign on. Additionally, module evaluations and other feedback mechanisms may also be useful to include.

Level 4: Students’ union data – Most students’ unions also conduct research throughout the academic year. Make sure you use existing data, or conduct your own specifically targeted research, to back specific campaign points or to investigate areas highlighted by national data.

How to research

When thinking about conducting your own research it’s important to think fully about all the stages involved. The list below is a good place to start:

1. Select a problem: The first step is to decide exactly what it is you want to research. This could come from an area of low satisfaction in NSS data, from an internal survey, or from verbal feedback from focus groups etc.

2. Review current literature and theory: Make sure that you read around the topic. If you decide to research your students’ attitudes to anonymous marking, for example, conduct a detailed literature review of what has already been written. Having the theory to back up your evidence strengthens your argument.

3. Develop hypotheses or research questions: Once you have read around the subject, it is important to decide on clear hypotheses, or a few specific research questions. This ensures that your research remains focused, and that you are fully aware of what you are trying to investigate from the very beginning.

21


4. Determine an appropriate methodology: Once your research questions have been decided, choose the methodology most appropriate to what you are trying to find out. If you are researching a more emotive subject, focus groups and one-to-one interviews may be most appropriate. If you want to establish the views of many students on quite specific topics, a survey may provide more effective evidence.

5. Collect relevant data: It is imperative that you collect relevant data in order to make your research effective. Ending up with results that slightly miss the aim of your research questions won’t do much to strengthen your argument when presenting to your institution. To get your results taken seriously, the right things must be asked of your students. Devising questions for focus groups or surveys can be difficult, but it is worth taking time at this stage to make them the best they can be. Try to avoid leading questions, or survey questions which do not give participants enough scope to express their thoughts.

6. Analyse and interpret the results: Analysing results, especially if you have a lot of data, can sometimes be daunting. However, this is often the most exciting part of the process as you start to see the final results. Make sure that you analyse the results accurately and that they realistically reflect what you found. There is plenty of help on analysing data on nusconnect.org.uk and more broadly online.

22

7. Present the results: When presenting your findings, do so in the most persuasive format for your intended audience. Try to find out how your institution likes to receive research results, and write your report in corresponding language. When feeding results back to your students, make sure you make them interesting and relevant. Thinking carefully about how you present your research will maximise its impact.

8. Replicate the study if possible: Perhaps your results have yielded more questions than answers; maybe your research has highlighted problems but given few clues as to possible solutions. Perhaps your results are more positive than you anticipated. All these outcomes are possible when conducting research, so further research may need to be conducted to clarify some results.
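As a small illustration of the analysis stage, the sketch below summarises answers to a single NSS-style question scored 1 to 5. The question wording and responses are made up for illustration; the "% agree" figure (answers of 4 or 5) mirrors how NSS results are commonly reported.

```python
from collections import Counter

def summarise_likert(responses):
    """Summarise 1-5 Likert-scale answers for one survey question.

    Returns the number of responses, the mean score, and the
    percentage of respondents who answered 4 ("agree") or
    5 ("definitely agree").
    """
    counts = Counter(responses)          # missing scores count as zero
    n = len(responses)
    mean = sum(responses) / n
    pct_agree = 100 * (counts[4] + counts[5]) / n
    return {"n": n, "mean": round(mean, 2), "pct_agree": round(pct_agree, 1)}

# Illustrative (made-up) answers to "Feedback on my work has been prompt"
answers = [5, 4, 4, 3, 2, 5, 4, 1, 4, 5]
print(summarise_likert(answers))
# prints {'n': 10, 'mean': 3.7, 'pct_agree': 70.0}
```

A real analysis would segment results by course or department before comparing them, but the same per-question summary is the building block.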

NUS has also published a research guide, available online at www.nusconnect.org.uk. The guide offers helpful advice and practical guidance on all stages of researching the student body, from planning and preparation to conducting research and analysing the data collected.


Major wins for students’ unions

Evidence-based campaigning works. The case studies below show that well organised, evidence-backed arguments can mean major wins for students’ unions!

Kingston University Students’ Union

In 2010 Kingston University Students’ Union (KUSU) will be embarking on a major campaigning project which aims to get students more involved in their learning by undertaking a series of consultations and research projects.

The campaign, “Love your degree, love your future”, aims to:
• Engage students fully in decisions affecting their learning and teaching, and help students take control of their university experience
• Consult students on their current learning and teaching needs
• Deliver change on the immediate needs of students

The union has set out its campaign in a document entitled ‘The Student Agenda’, which highlights the key research areas for the campaign. These include:
• Anonymous marking
• Feedback
• Academic misconduct
• Plagiarism
• Course representation reforms

KUSU hopes that “Love your degree, love your future” will create a coherent approach to the union’s research and generate a consistent message which will significantly improve the academic experiences of students at Kingston.

UCL Union

UCL Union used its NSS results to inform its campaign for 24 hour library access, entitled “UCL students do it 24/7 as well”. Though the institution already ran an internal library survey each year, the union used NSS results on related areas to reinforce the findings from this existing survey and push the university to act on a long-standing promise to increase opening hours, finally resulting in extended hours.

Anglia Ruskin Students’ Union’s Quality Monitoring Process

Students’ unions collect a vast quantity of information from their student body about their experiences, but this information is often hard to link together if it is not coordinated properly. Anglia Ruskin Students’ Union created a Quality Monitoring role which, for 15 hours per week, collates all the information from institutional and national surveys, university and union committees, and feedback from the weekly officer Go Out And Talk (GOAT) sessions. This information is collected and put into an issues matrix which allows staff and officers of the union to monitor issues and track trends in student satisfaction.

23


The creation of this database means that the union is able to efficiently monitor issues which have arisen and follow up on agreed actions with the university, as well as feed back to students on what the union is working on and brief course reps on the actions taken on key student issues. The Issues Matrix was created using Microsoft Access, which enables the user to run queries on certain issues. This means that if a student rep highlights an issue with the timeliness of their feedback, a quick search in the database will uncover whether this is an issue on any other course.

Laura Homan, Student Rep and Quality Monitoring Coordinator, said: “I feel that this role has helped us be more proactive in tackling issues and assists us in communicating with students on topics that are relevant to them. It also helps us inform reps about the current issues and news, which ultimately improves the quality of the student reps and strengthens the system.” For more information please contact l.holman@angliastudent.com
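The idea of an issues matrix can be sketched in a few lines. The records and field names below are hypothetical (Anglia Ruskin’s matrix is built in Microsoft Access, not Python), but the keyword search mirrors the kind of query described above: checking whether a problem raised on one course has appeared elsewhere.

```python
# Hypothetical issues-matrix records: each entry logs an issue,
# the course it concerns, where it was raised, and its status.
issues = [
    {"course": "BSc Biology", "issue": "feedback timeliness",
     "source": "course rep", "status": "open"},
    {"course": "BA History", "issue": "feedback timeliness",
     "source": "NSS free text", "status": "open"},
    {"course": "BSc Biology", "issue": "library opening hours",
     "source": "GOAT session", "status": "resolved"},
]

def find_issue(records, keyword):
    """Return every record whose issue description mentions the keyword,
    so an officer can see at a glance which courses share a problem."""
    return [r for r in records if keyword in r["issue"]]

for match in find_issue(issues, "feedback"):
    print(match["course"], "-", match["source"])
# prints:
# BSc Biology - course rep
# BA History - NSS free text
```

The value of the matrix is less in the tooling than in the discipline of logging every piece of feedback in one queryable place.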

24


Heriot-Watt University Students’ Union

Heriot-Watt University Students’ Union used the results of the NSS to highlight poor satisfaction with feedback among students, and ran a postcard campaign to find best and worst practice in terms of feedback within the institution.

There was an overwhelming response, with students requesting that exam scripts should be returned with feedback included on them. The students’ union ran a campaign where this message was attached to students’ exam scripts with stickers. The university has now agreed to return these scripts and is also using the feedback policy composed by the union.

UCLAN Students’ Union

UCLAN used the comments section of the NSS to provide evidence for its student written submission for its institutional audit. In addition, NSS results were used to isolate one campus’s dissatisfaction with library opening hours, and through NSS feedback, results from other questionnaires and hard campaigning, the union was eventually able to pressure the university into opening the library on Sundays.

Further information

www.nusconnect.org.uk/HE
NSS: www.thestudentsurvey.com
PRES/PTES: www.heacademy.ac.uk
ISB: www.i-graduate.org
DLHE: www.hesa.ac.uk

Useful Contacts

National Union of Students
4th Floor, 184-192 Drummond Street, London NW1 3HP
Tel: 0207 380 6600
E-mail: nusuk@nus.org.uk

Usman Ali, VP Higher Education – usman.ali@nus.org.uk
Christina Yan Zhang, International Students Officer – Christina.Zhang@nus.org.uk
Kate Wicklow, Student Feedback Officer – nss@nus.org.uk
Victoria Passant, Student Feedback Assistant – Victoria.passant@nus.org.uk
Debbie McVitty, Research & Policy Officer (Postgraduates) – Debbie.McVitty@nus.org.uk
Liz Williams, Higher Education Policy Officer – Liz.williams@nus.org.uk


National Union of Students 4th Floor 184-192 Drummond Street London NW1 3HP t: 0207 380 6600 f: 020 7380 6690

www.nusconnect.org.uk

