NZASE #126


Using the test results to support student learning

The tests can be used to identify both individual and class strengths and weaknesses. A class-wide pattern of incorrect answers in a particular Nature of Science sub-strand might indicate the need for further opportunities for students to develop particular skills, dispositions, or competencies. Response patterns might shed light on questions such as:
• Do the students understand the importance of evidence, and what counts as evidence?
• Do students need practice in reading or interpreting data?
• Do students carefully consider alternative explanations?

If a teacher wants to find out more about their students’ thinking, incorrect responses can be a useful basis for discussion. Are there items where many students chose the same incorrect distracter? Were there questions where responses were fairly evenly spread across the distracters? In either case, challenge students to think about why particular answers might have been chosen, and to work out what knowledge was needed to answer successfully. The teachers’ manual that accompanies the tests suggests activities and approaches that teachers may find useful as they think about adapting their teaching practices. For example, if many students have difficulty identifying evidence and its limitations, teachers could focus students’ attention on this by encouraging them, in class discussions, to ask: “How do you know that? What’s the evidence?” Alternatively, students could be given data sets and asked to brainstorm as many explanations as they can for any patterns they see in the data, then discuss as a group which explanations are the most plausible: How do you know? What additional data would you need to decide which is the best explanation?

The test items themselves could also give teachers ideas for adapting their own teacher-generated assessments: many of the items could be adapted to whichever context the class is studying. Used in these ways, we think Science: Thinking with evidence has the potential to provide useful support for teachers as they work out what curriculum policy changes might look like in their classrooms. We see it as a thinking tool for both students and teachers, and we hope that is how it will be used. Developing this test has certainly challenged our thinking.

New Zealand Science Teacher

Challenges in assessing thinking

One of the challenges in designing this resource was that we wanted to measure how well students could use knowledge, but we had no way of knowing what knowledge students had, and obviously students cannot use knowledge they do not have. Our solution to this dilemma was to provide most of the information a student needed to be able to think about the problem. This solution has limitations, though. According to cognitive science, “Successful thinking relies on four factors: information from the environment, facts in the long-term memory, procedures in the long-term memory, and the amount of space in the working memory. If any of these factors is inadequate, thinking will fail” (Willingham, 2009, p. 14). This means that even though the necessary information is given in the test question (information from the environment), a student’s capacity to think with that information will still be affected by the other factors. We would expect students to think better when the content and context are familiar.

We saw an example of this in a small research project we carried out with Year 9 classes in five different schools. Generally, the students in the higher decile schools did better on the items in Science: Thinking with evidence than the students in the low decile school. The one exception was a question about shadows: the low decile school’s students had just completed a unit of work on shadows, and they scored better than the other schools’ students on this item. This poses an interesting question: can thinking be assessed as a general skill, or is it always context specific? If a student can think through problems that involve complex relationships and multiple thinking steps in one science context but can only work with simple, direct relationships in another, how do we decide how well they can think in science?
Another question to ponder is how the thinking students do in science is the same as, or different from, the thinking they do in other learning areas. What specific contribution does thinking in science make to the development of the key competency, thinking? These are some of the questions the science education team at NZCER is currently thinking about as we continue to grapple with what future-focused science education might look like.

For further information contact: Ally.Bull@nzcer.org.nz


Future focus – supporting science education

Close analysis of the test items showed that a number of factors combine to make an item more or less difficult than others. Low-demand items usually require students to:
• read only one text type;
• work within a familiar context;
• work with familiar or provided science knowledge;
• think in ways that require only simple or direct links.

The most demanding tasks require students to:
• read multiple text types;
• synthesise factors from different pieces of evidence;
• work in unfamiliar contexts;
• use challenging or complex science knowledge;
• think through complex relationships that might involve orientating information in space or time, or involve multiple thinking steps.

Items of moderate difficulty make higher demands on students in some of these areas and lower demands in others. We found no pattern identifying that any one of these areas on its own necessarily makes an item more or less difficult. For example, presenting evidence with very little written text does not by itself make a question easy if there are other high-demand aspects. To progress (i.e. answer harder questions) students need to master a range of competencies, such as the ability to read a range of text types, or to transfer ideas from one context to another. The metaphor of ‘progress in pieces’ (Carr, 2008), where progress looks more like putting together the pieces of a jigsaw puzzle than a series of linear steps, seems useful here.

References

Bull, A., Ferral, H., Hipkins, R., Joyce, C., & Spiller, L. (2010). Science: Thinking with evidence. Wellington: NZCER.
Carr, M. (2008). Zooming in and zooming out: Challenges and choices in discussions about making progress. Presented at ‘Making progress – measuring progress’, NZCER Conference, Wellington.
Ministry of Education. (2007). The New Zealand Curriculum. Wellington: Learning Media.
Willingham, D. (2009). Why don’t students like school? San Francisco: Jossey-Bass.

New Zealand Association of Science Educators


