HE Focus on Assessment


Welcome to the new edition of HE Focus, kindly supported by the NASUWT. In this edition we are examining assessment practices within the sector. The National Student Survey has highlighted student dissatisfaction with feedback and assessment time and time again. This edition of HE Focus brings together articles from students’ unions and institutional staff on some of the issues surrounding assessment practices, and showcases some possible solutions.

Assessment is one of the most important aspects of student life. It not only informs a student’s final degree classification; it can also help them to expand their knowledge beyond the curriculum and learn important life skills that will help them succeed in the future. It is therefore worrying that students are currently not getting the most out of assessment practices while at university. Students commonly criticise institutions for the lack of variety in assessment tasks, and in many cases formative assessment is not set at all. Without formative assessment and a variety of tasks, our graduates lack key development skills such as self-reflection and teamwork, which are vital to their future careers.

Dissatisfaction with assessment is not confined to the undergraduate student experience. Postgraduate students are also unhappy with the lack of information given to them on completing their assessment tasks. We must ensure that all of our students receive information about their assessments in a timely fashion, so that they can fully prepare for the task, reducing the stress and anxiety that assessment can cause. As well as this issue of HE Focus, NUS has just launched a new Feedback and Assessment Charter, which brings together students’ requirements for both assessment and feedback practices. The 10-point charter is reproduced below; an expanded edition can be downloaded from www.nusconnect.org.uk/campaigns/highereducation.

What’s in here?

Bucks Education Campaign, Chris Clark
Rethinking assessment, Prof Margaret Price
Demystifying the doctoral thesis, Prof Emeritus Vernon Trafford
Good assessment, Dr Mark Russell
Student engagement, Carolyn Bew
Time’s up for exams, Sue Bloxham

We hope that this issue of HE Focus will help you to think about the assessment needs of your students, and campaign for better assessment practices within your institution. NUS has a variety of resources available on all elements of assessment, from writing out plagiarism in assessment to anonymous marking. All of these resources can be found on our NUS Connect pages.

If you would like any further assistance, please email me at usman.ali@nus.org.uk.

Usman Ali, VP Higher Education, NUS

New charter on assessment and feedback

1. Formative assessment and feedback should be used throughout the programme.
2. Students should have access to face-to-face feedback for at least the first piece of assessment each academic year.
3. Receiving feedback should not be exclusive to certain forms of assessment.
4. Feedback should be timely.
5. Students should be provided with a variety of assessment methods.
6. There should be anonymous marking for all summative assessment.
7. Students should be able to submit assessment electronically.
8. Students should be supported to critique their own work.
9. Programme induction should include information on assessment practices and understanding marking criteria.
10. Students should be given the choice of format for feedback.


Bucks Students’ Union Education Campaign

So for an anonymous system to work it must be truly blind, with no possibility of the marker identifying the student.

Chris Clark, Vice President Education, Bucks Students’ Union

Assessment and feedback

Over the past year at Bucks Students’ Union, we have been challenging many issues that our students face relating to academic quality and standards, through the Bucks Education Campaign. What follows is an outline of the points that have a focus on assessment and feedback, why they are in the campaign, the benefits involved and some lessons learned.

Anonymous marking system of assessments

Bucks has had a system of anonymous marking for some years, which on the face of it meets the universities’ criteria. So why were our students dissatisfied with it? Underneath this issue is a deep-seated conviction among students that if there is a way for an academic to crack the shield of anonymity, they will. This is linked to the other deep-seated fear that academics will respond to criticism about themselves or the course by giving lower marks.

Anonymous marking systems are often criticised either on the grounds that they prevent academics from giving the personal help that students want, or that they do not suit some assignment formats. There is some truth in both of these accusations. What is really needed is a blend of sufficiently anonymously marked assignments to give students the reassurance of merit-based marking, combined with other assignments where the individual participation and support of academics is part of the learning process. If students would rather share and discuss their assessment then they have that option. In the same way, if a student would rather remain anonymous, they have the assurance that the system is robust enough to protect their identity.

However implausible this might sound to those academics who are scrupulously fair to their students and who have neither the time nor inclination to hunt out the author behind a particular script, it is a strongly-held belief among students, and one which is supported by horror stories about a minority of teaching staff.

Electronic submission of assessments

In our universities there are more part-time students, commuting and distance learners than ever before. More of the UK’s workforce studies while also working, rather than taking extended study breaks to further their career aspirations. At the same time, communications technologies that were once difficult to use are now mainstream and intuitive. These students find it an inconvenience to travel to campus simply to submit a piece of assessment, which sometimes means having to take time off from work.

While we were formulating the Bucks Education Campaign, it became clear that electronic submission was an ever-increasing priority, not just for our ‘traditional’ students who expected a digitally-literate university experience, but to address the accessibility of assessment for all types of student. It was at this point that we encouraged Bucks to look at creating an electronic process for assessments. The result is a process that not only allows for electronic submission, but also student anonymity, electronic marking and annotation, and electronic delivery of feedback and grades. This makes life easier for many of our students but also has added benefits for administration staff by streamlining the amount of data handling. Can there be a justification at this moment in time for failing to offer students a slick, reliable and integrated electronic method of submitting work and receiving feedback and results? I don’t think so, yet many of our universities have not yet taken this opportunity.

Face-to-face feedback on first assessment

Research conducted throughout 2009/10 clearly showed that students at Bucks preferred one-to-one communication with their academics, and this was especially valued as a method of feedback on assignments. Discussions with students and academics highlighted a significant issue around students’ progression from one year to the next. Investigation revealed that students were not aware of what would be expected of them, or what the step from one year to the next would entail. As a result, we recommended that “each student shall receive face-to-face feedback on their first written piece of assessment, every year”. This early individual feedback provides an opportunity to set the context and value of face-to-face interaction between student and academics for the remainder of the year, and allows academics to coach students in the level of performance required at the next level of their course.

Challenging the issues of assessment and feedback as a package

Because assessment and feedback go hand in hand, we have found it more useful to campaign jointly for better assessment and feedback practices. This approach means that we are not overlapping with any other campaigning, and both students and the university know clearly what our goals are. Recently Bucks held a ‘Feedback Matters’ conference for the staff of the university and its partner colleges, at which Prof Alison Hanstead (Pro Vice Chancellor, Aston University) gave the keynote speech. She shared some best practice and innovative approaches, and articulated how universities’ approach to the issues around assessment and feedback needs to change. Specifically, she said that assessment and feedback are not separate issues, and that it is at the time of designing assessments that we should also consider what forms of feedback are necessary and relevant.


Rethinking assessment: standards, fairness and learning

Prof Margaret Price, Director of Assessment Standards Knowledge Exchange (ASKe), Centre for Excellence in Teaching and Learning, Oxford Brookes University

Assessment is much more complex than most people imagine and there are no quick and easy solutions to current problems. Successful and fair assessment depends upon common understandings of the standards used to make assessment judgments (Sadler, 1989). When staff use the same standards there is consistency in marking, and if students have the same notion of quality as their teacher it means that they can, at least, ‘see the target’ and can focus their efforts rather than just ‘shooting in the dark’. Therefore all staff and students need to share a common view of what constitutes high-quality work. However, this is easier said than done.

Assessment standards are at the heart of the assessment process, underpinning everything from assessment strategy to design and marking. For students, assessment standards provide guidance for their learning, allow them to monitor their progress and will ultimately be used to judge their performance. So in order to improve the assessment process ASKe has been focusing on the sharing of assessment standards. Improvement in assessment will only really come about if we take a holistic view of assessment within learning (Price et al, in press). We discussed our ideas with others (researchers, teachers and students), which led to the development and publication of the Assessment Standards Manifesto for Change (available at www.brookes.ac.uk/aske/manifesto.html). In summary, this manifesto acknowledges that, contrary to popular belief, assessment standards can rarely be fully explained through explicit descriptions. Therefore criteria rubrics, assessment grids and level descriptors alone are not sufficient to communicate assessment standards.

Establishing common understandings of standards requires the sharing of both explicit and tacit knowledge. A clear understanding of standards is developed through using them so, beyond the traditional process of students participating in well-designed assessments and receiving feedback related to standards, students must also engage in activities designed to develop their own ability to use standards, such as marking exercises and/or self and peer assessment processes. Standards are shaped and established within disciplinary and professional communities, so staff and students (and other stakeholders, eg employers) need to be active in their communities. Alongside this there need to be checks and balances to moderate the standards being used within the higher education sector. We must all be prepared to play our part in developing a common understanding of standards through an active involvement in the learning community. Only then will assessment promote deep learning, and yield greater levels of satisfaction.

ASKe is continuing its work to improve assessment and assessment processes – please see our website for ideas, resources and events. If you want to read more about the rationale for our manifesto, please see http://bejlt.brookes.ac.uk/article/assessment_standards_a_manifesto_for_change/

References

Price, M, Rust, C, O’Donovan, B and Carroll, J (in press) ‘If I was going there I wouldn’t start from here – a critical commentary on current assessment practice’. In Assessment and Evaluation in Higher Education

Price, M, O’Donovan, B, Rust, C and Carroll, J (2008) ‘Assessment Standards: a Manifesto for Change’. In Brookes e-Journal of Learning and Teaching, vol 2(3)

Sadler, DR (1989) ‘Formative Assessment and the Design of Instructional Systems’. In Instructional Science, vol 18, pp119-144

Demystifying how doctoral theses are examined

‘I enjoyed my viva. It was what I had expected.’

‘I had no idea what would happen in the viva. I coped and survived.’

Each higher education institution’s procedures for doctoral examinations appear in handbooks for candidates and supervisors. Examiners also receive explicit criteria to use in assessing the scholarly merit of doctoral theses. Despite the availability of such materials, Burnham (1998) believed that “the viva is one of the best-kept secrets in British higher education”. This view still exists. So what is the mystery?

A PhD, DBA, EdD, PrD, etc is awarded to candidates who make an original and independent contribution to knowledge, and are judged to be capable of undertaking research independent of further supervision. Theses are assessed in viva voce examinations attended by the candidate and two or three examiners – plus the non-participating supervisor(s). HEIs may appoint an independent chair as event host and to advise on procedure. Most supervisors discuss nominations for examiners with candidates, thereby avoiding potential methodological conflicts/disputes or difficulties with gender, ethnicity and religion. The mystery starts if this process is not transparent.

Vernon Trafford, Professor Emeritus, Anglia Ruskin University

Professor Trafford’s latest book is: Trafford, VN and Leshem, S (2008) Stepping stones to achieving your doctorate: by focusing on the viva from the start. Maidenhead and New York, Open University Press/McGraw-Hill

The viva process

Vivas range from 40-130 minutes in humanities and social sciences, and 90-240 minutes in natural sciences. Viva times and locations depend on the availability of a room and of the participants, and arrangements are handled by a cadre of research degree administrators. Candidates have minimal input into these administrative activities, unless they object to an unsuitable viva setting. Traditionally, only participants receive these details – which others may interpret as secrecy.

After receiving a submitted thesis, examiners produce independent reports that are exchanged before the viva. During vivas examiners ask questions that reflect their reading of the thesis, addressing points of interest or concern about the research or its defence. They expect candidates to engage with their questions, enter into scholarly discussion and defend or justify their methodological approaches, findings and conclusions. When vivas end, candidates and supervisors retire and return to receive the examiners’ consolidated report and recommendations. A written report follows.

The mystery may deepen, since candidates seldom recall in detail, or with emotional clarity, what transpired in their viva. Elated by the outcome, or dismayed by recommended amendments to their theses, candidates are often unable to explain exactly what happened in their viva. Thus, for different reasons, participants attending UK doctoral vivas seldom tell others what transpired. This confirms the mystery and, inevitably, accelerates the rumours that circulate about most vivas.

HEIs are scrupulous about how academic quality is examined. The very private world of doctoral vivas reflects HEI concerns with the confidentiality of proceedings, and with accepting liability for academic standards.

Another mystery? Very occasionally, one examiner answers the ‘key’ question asked by the other examiner while the candidate listens in astonishment, and possible relief!


Good assessment – the right of all students!

Dr Mark Russell, Deputy Director of the Blended Learning Unit, University of Hertfordshire, and project director of Effecting Sustainable Change in Assessment Practice and Experience (ESCAPE)

The ESCAPE project

Good assessment is the right of all our students – fact! Good assessment should not be the experience of a lucky few taught by academics who understand the significant influence of assessment. Proper assessment is an influence that, inter alia, shapes students’ study behaviours, stimulates an appropriate approach to learning, and arouses students’ inquisitiveness about learning and their subject discipline. Such aspirations, although not impossible, are becoming ever more difficult to meet given the increase in student numbers and the associated reduction in resources.

Our JISC-funded project, Effecting Sustainable Change in Assessment Practice and Experience (ESCAPE), seeks to bring about enhancements to assessment in a resource-efficient way. Working with two academic schools at the University of Hertfordshire (the Business School and the School of Life Sciences), we set out to explore current assessment practice, highlight good assessment and subsequently support our partners as they review and develop their assessment practice.

To help us engage with busy academics, and being mindful of the so-called magic number seven, we have drawn together the many existing principles of assessment and feedback and produced a set of six overarching ESCAPE themes.

Good assessment for learning…
- Engages students with the assessment criteria
- Supports personalised learning
- Ensures feedback leads to improvement
- Focuses on student development
- Stimulates dialogue
- Considers student and staff effort.

Although we recognise the importance of assessment as an instrument to measure learning, our primary interest is using assessment to encourage and stimulate, rather than measure, learning.

Working with our ESCAPE partner schools we are using the themes to establish a better assessment, and hence educational, experience for their students. Using these themes, along with a purposely positive exploration of current practice, we are working with staff to develop assessment that is both educative and resource-efficient. By working at a strategic level, rather than offering quick fixes, we are providing a framework for our partners to take their developing assessment expertise to other modules. Indeed, we are already seeing assessment developments being translated to other non-ESCAPE modules.

An example in practice

One module previously required students to provide an individual laboratory report. The nature of lab reports and the class size of 90 students meant that feedback was not provided immediately after their submission, and the students’ thinking and developing conceptions could not readily be seen by their lecturer. Following the module team’s engagement with the ESCAPE project, the students now work in groups and co-construct their laboratory report on a wiki. Importantly, the wiki ensures that individual contributions are seen and the evolving laboratory report is visible to the lecturer. As such the lecturer now engages with the students’ work at regular intervals and provides ongoing feedback on the students’ work, thinking and analysis. This means that feedback can now be used by the students to shape their thinking and their work. The lecturer, new to wikis at the start of the process, believes that his students now have a better educational experience. He has also saved time, so this is a win-win situation.

For more information about the ESCAPE project take a look at http://escape-uh-jisc.blogspot.com/ or email m.b.russell@herts.ac.uk

Student engagement in assessment – a case study

Carolyn Bew, Art, Design and Media Subject Centre, Higher Education Academy

Conclusions drawn from research into the national results of the National Student Survey for art, design and media students, carried out by Prof Mantz Yorke and Prof David Vaughan, highlighted the need for a more reflective view of the pedagogy of art, design and media and of student engagement. In response, the Art Design and Media Subject Centre and the Performing Arts Learning and Teaching Innovation Network (PALATINE) recently ran a two-day collaborative event on assessment, with support from the Arts Group, focusing on art and design and the performing arts. There is some interesting work being done in this area but, as highlighted by the National Student Survey, we need continuing enquiry into the ways students reflect on their own and others’ work and feed their experience back into their institutions. This is particularly important in art and design, where it is difficult to set objective assessment criteria without greater student involvement.



Formative assessment and feedback have been an integrated and established part of curriculum practice in art, design, architecture and the performing arts for more than 50 years. Student engagement in this process is crucial, and this involvement is seen as a positive and critical element in students’ learning. The creation, performance and presentation of work, portfolio reviews, self and peer assessment, tutorials, seminar presentations and – in art and design – the critique (‘crit’), together with a practice- and studio-based culture and environment, are all integral to teaching and learning in the creative arts. The subject centre will be compiling a series of papers and case studies arising from the symposium, which we hope will inspire the whole higher education sector to involve students in curriculum development. We will shortly be inviting contributions from the art, design and media sector, starting with the contributors to the symposium.

[Chart: Student Involvement in Curriculum Design – percentage of students agreeing at each level of involvement (from ‘not at all involved’ upwards) in response to the questions ‘How involved do you believe you are in shaping the content, curriculum or design of your course?’ and ‘How involved do you want to be in shaping the content, curriculum or design of your course?’ Taken from the 2009 NUS/HSBC Student Experience Report.]

Student Engagement Toolkit

The Student Engagement Toolkit will be launched in September 2010, alongside additional resources on NUS Connect. The materials will help institutions and students’ unions enhance their engagement activities through gathering feedback, supporting representatives and involving students in curriculum design. This work is part of the ongoing NUS-HE Academy project, which will also be hosting two conferences in November 2010.


Time’s up for examinations

Prof Sue Bloxham, University of Cumbria

Written examinations continue to be used very heavily in universities. Undoubtedly, examinations do have some benefits as a mode of assessment: they can test knowledge and understanding and its theoretical application (Freeman and Lewis, 1998), the revision process can help to develop depth of understanding (Entwistle and Entwistle, 1997), and examinations offer equitable treatment to all students, who have the same time to do the same task. However, their main advantage as a form of assessment is their convenience and practicability. Everyone sits them at the same time, there is little obligation on tutors to provide detailed feedback so they are quicker to mark, and they reduce the potential for cheating, as it is much easier to guarantee that work is a student’s own if assessment is carried out in controlled conditions.

But university education is not about convenience, it is about learning. The best university assessment is integrated with students’ learning and provides them and others, such as employers, with reliable information about their achievement. On this score, examinations are weak.

The drawbacks of examinations

Firstly, examinations promote inferior learning when compared with coursework, because students tend to focus on superficial approaches and memorisation. They provide limited opportunity for students to learn from their performance, as feedback is often minimal and students do not learn what marks they have gained until well after a module has finished. In addition, they encourage students to put off work until the end of the module (Gibbs, 2006), then ‘cram’ information which is quickly forgotten (Knight and Yorke, 2003).

Secondly, examinations aren’t terribly good at assessing the rich range of outcomes that we expect from modern graduates and, therefore, they have little value in predicting students’ future achievement or job success (Kvale, 1996). They are generally only good for assessing what students know rather than what they can do, and they are not suitable for assessing higher-level thinking because “it is difficult to set sufficiently complicated questions with data that is wide-ranging enough in a time-limited examination” (Freeman and Lewis, 1998).

Thirdly, examinations may not be testing what we think they are testing. Poor examination technique and writing speed can heavily influence students’ marks, and ‘exam nerves’ can affect performance. I sometimes wonder whether examinations have become a test of wrist stamina! Time constraints can be particularly difficult for international students and those with disabilities, and ‘reasonable adjustments’ such as extra time cannot always compensate.

Despite this poor track record of examinations as a form of assessment, worries about plagiarism will no doubt halt or reverse trends towards fewer examinations. However, we should be encouraging tutors to design examinations which foster higher-quality learning, by using different formats of examination (Thomas and Bain, 1984) and setting holistic questions which draw from topics across a module (Entwistle and Entwistle, 1997).


References

Entwistle, NJ and Entwistle, A (1997) ‘Revision and the experience of understanding’. In Marton, F, Hounsell, D and Entwistle, NJ (eds) The Experience of Learning: Implications for teaching and studying in higher education. Edinburgh, Scottish Academic Press, pp145-155

Freeman, R and Lewis, R (1998) Planning and Implementing Assessment. London, Kogan Page

Gibbs, G (2006) ‘How assessment frames student learning’. In Bryan, C and Clegg, K (eds) Innovative Assessment in Higher Education. Abingdon, Routledge, pp23-36

Knight, PT and Yorke, M (2003) Assessment, Learning and Employability. Maidenhead, Open University Press

Kvale, S (1996) ‘Examinations re-examined: certification of students or certification of knowledge?’. In Chaiklin, S and Lave, J (eds) Understanding Practice: Perspectives on activity and context. Cambridge, Cambridge University Press

Thomas, PR and Bain, JD (1984) ‘Contextual dependence of learning approaches: the effects of assessments’. In Human Learning, vol 3, pp227-240

