Assessment & Evaluation Portfolio


Universidad Mariano Galvez de Guatemala
Profesorado de Enseñanza Media en Inglés
Escuela de Idiomas
Evaluation and Assessment Techniques
Licda. Evelyn Quiroa

Student’s name: Ma. Raquel Morales Pérez Student’s ID: 07603220 Date: November 5th


Introduction The principal theories of evaluation and assessment were presented in this course. We developed our critical analysis in order to critique the different perspectives on students' learning and needs according to their level of skills.


INDEX

Introduction .......................................... 1
Evaluation and Assessment Syllabus .................... 2
Assessment and Evaluation Defined ..................... 3
What Is Assessment and Evaluation? Differences ........ 4
Assessment Vocabulary ................................. 5
Vocabulary Day 1: Assessment and Evaluation ........... 6
Evaluation and Assessment Article ..................... 7
First Presentation .................................... 8
Rubric ................................................ 9
What Is Evaluation? Part I ............................ 10
Objectives and Competences ............................ 11
Revised Bloom's Taxonomy .............................. 12
Blueprint ............................................. 13
Monthly Planner ....................................... 14
Types of Evaluation Items: Essay Questions ............ 17
Completion (Short Answers) ............................ 18
Higher Order Thinking ................................. 19
Performance Assessment ................................ 20
Matching Test Format .................................. 21
True and False Format ................................. 22
Constructing Test Items ............................... 23
Test Item Type Questions .............................. 24
Planning Class ........................................ 25
Blueprint ............................................. 26
Exam .................................................. 27
Conclusion ............................................ 28


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA FACULTAD DE HUMANIDADES ESCUELA DE IDIOMAS LICDA. EVELYN R. QUIROA

PROFESORADO EN EL IDIOMA INGLES
CURSO: EVALUATION AND ASSESSMENT TECHNIQUES

COURSE DESCRIPTION
This course is dedicated to the study of the principal theories that inform evaluation and assessment in the classroom. A critical analysis will be carried out in order to critique and put into practice the different perspectives, techniques and styles related to performance-based assessment, and summative and formative feedback methods to assess and evaluate student learning in the classroom.

COURSE GOAL
By the end of the course, students will be able to plan and create assessments and evaluations that provide their students with activities closely related to learning objectives and/or competences.

LEARNING OUTCOMES
Upon completion of the course, students will be able to:
1. Demonstrate development and use of academic standards across the curriculum and application of standards and objectives in classroom assessment and evaluation.
2. Match assessment to learning outcomes, develop rubric criteria and select appropriate assessment and evaluation choices using the tools provided by the course.
3. Apply current research tools to create authentic assessment, discourse analysis, self and peer evaluation, rubrics, surveys, tests and mini-quizzes for self-paced tutorials.
4. Evaluate and utilize appropriate tools such as grade books, calendars, spreadsheets and portfolios.

GENERAL AND SPECIFIC EXPECTATIONS OF THE COURSE
Student Assessment and Evaluation

General Expectation 1: to communicate an overview of evaluation frameworks and processes.
Specific Expectations:
1. Identify the following: a) the purposes of evaluation, b) key terms relative to evaluation, c) types of evaluation, d) links between planning and evaluation.
2. Develop student assessment and practice within a philosophical framework.
3. Understand equity issues in evaluation and assessment.

General Expectation 2: to understand the purposes of various types of evaluation strategies.
Specific Expectations:
1. Differentiate between diagnostic, formative, and summative evaluation.
2. Compare the purpose and function of different information sources for evaluation.
3. Identify a variety of evaluation and assessment procedures, their purposes, strengths, and weaknesses.
4. Discriminate between traditional and authentic assessment and appropriate application in teaching/learning.
5. Incorporate appropriate assessment and evaluation strategies into your teaching practice.


General Expectation 3: to place evaluation strategies in the context of a unit of study.
Specific Expectations:
1. Design student assessment instruments (including rubrics) for a unit of study.
2. Accommodate the needs of exceptional students within the unit and its evaluation component.
3. Enhance research in teaching to improve their own practice.
4. Be capable of doing self-assessment.
5. Share the knowledge acquired to benefit the school community to which they belong.

EXPECTATIONS:
• Students are expected to attend all classes. Class attendance will be a part of the final evaluation.
• Students are expected to arrive for class on time. Any student who arrives late will not be given additional time to complete quizzes, exams, or in-class assignments.
• Students are expected to submit all assignments on time. Late submissions will be penalized or not accepted, depending on the particular case.
• Students are expected to come to class having read and completed all assignments.
• Students are expected to participate in class discussions.
• Students are expected to complete all quizzes and examinations in class on the date specified by the teacher.
• Students are expected to word-process assignments as required; handwritten work will not be accepted unless it is a test blueprint.

CONTENTS (exam dates: 08-27-11, 10-08-11, and 11-19-11):
• The difference between evaluation and assessment
• Types of evaluation (Diagnostic, Formative & Summative)
• Establishing high quality (validity, reliability, etc.)
• Becoming aware of content, context and learners
• Curriculum and evaluation
• Visualizing your actions: planning and testing
• Objectives vs. competences
• Bloom's Taxonomy
• Designing a blueprint
• Test type items
• Test item type instructions
• Organizing test type items according to competencies and domain levels
• Analyzing tests
• Creating different core content tests
• Assessment strategies
• Self-improvement through self-assessment
• Self-assessment tools: rubrics, checklists, portfolios, etc.
• Differentiated learning
• Declarative and procedural knowledge-based assessment
• Reflective teaching and learning
• Administering and interpreting standardized tests
NOTE: Additional content may be added to the list.


MEANS TO ACHIEVE OUR GOALS:
1. A summary of the subject matter must be turned in weekly (except when there is a test).
2. Teacher and student exchange of knowledge and experiences.
3. Group discussions. Students must read the material in advance.
4. Individual research and enrichment.
5. Multimedia presentations.
6. Teaching project.
7. Portfolio.
8. Exams.

EVALUATION:
Attendance: 80% required to qualify for the final term.
TOTAL ZONE (quizzes, class activities, presentations) ........................ 10 pts
TWO MIDTERMS ................................................................. 40 pts
PORTFOLIO .................................................................... 20 pts
FINAL EXAM ................................................................... 30 pts
TOTAL ....................................................................... 100 pts

REFERENCES:
1. Lynch, Brian K. Language Program Evaluation. Cambridge Applied Linguistics. Cambridge University Press.
2. Eby, Judy W., Adrienne L. Herrell & Jim Hicks. Reflective Planning, Teaching and Evaluation. 3rd ed. Merrill Prentice Hall, London, 2002.
3. Woodward, Tessa. Planning Lessons and Courses. Cambridge University Press, Cambridge, 2001.
4. McMillan, James H. Classroom Assessment: Principles and Practices for Effective Instruction. McMillan Press, Virginia, 2001.
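The evaluation weights above add up to 100 points. As a quick, hypothetical illustration of how they combine (the component names and sample percentages below are my own, not part of the syllabus), a final grade could be computed like this:

```python
# Hypothetical illustration of the stated weighting: zone 10, two midterms 40,
# portfolio 20, final exam 30, for a total of 100 points.
WEIGHTS = {"zone": 10, "midterms": 40, "portfolio": 20, "final_exam": 30}

def final_grade(scores_pct):
    """Combine component scores (each given as 0-100%) into a grade out of 100 points."""
    assert set(scores_pct) == set(WEIGHTS), "every component needs a score"
    return sum(WEIGHTS[name] * pct / 100 for name, pct in scores_pct.items())

# Example: 90% of the zone, 75% on the midterms, 85% on the portfolio, 80% on the final.
print(final_grade({"zone": 90, "midterms": 75, "portfolio": 85, "final_exam": 80}))  # 80.0
```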


CLASS REQUIREMENTS AND GUIDELINES

Submitting Assignments: All assignments either have or will have an identified "due date". Extensions beyond the designated due date are not granted except in the most extenuating of circumstances. With the exception of an immediate and pressing "emergency", all requests for an extension must be written, signed, dated, and delivered in person to me, as your Professor, before the specified due date and in time for me to respond to your request in writing. All assignments are to include a title page that clearly identifies the assignment topic/title, course name and number, the date submitted, the teacher's name, and the student's name and I.D. number. All assignments are to be given, in person, directly to the teacher. I will take no responsibility for assignments that are given to other students or given to the personnel in the "Escuela de Idiomas" office. While I have not yet lost any student assignment, there is always a first time! Therefore, you would be well advised to back up your assignment electronically and, if feasible, in hard copy. An assignment will be considered late if it is not directly handed to me, as your Professor, by the end of class on the specified "due date". Late assignments will be penalized 5% for each day or part thereof following the specified "due date" [including Saturday(s) and Sunday(s)].

Attendance and Participation: Attendance will be taken at the beginning of each class period. Attendance in each class is mandatory; however, there is a proviso in the University regulations that students are permitted to miss the equivalent of 3 classroom contact hours. Beyond this limit, the student will be issued a warning that any more absences may result in being excluded from writing the final examination. Regular attendance, being prepared, and constructively participating in classroom activities are all seen as integral components in the growth and development of becoming a professional teacher and in the establishment of a meaningful community of learners in our class.

Tardiness: This can be extremely disruptive and disrespectful to members who strive to be on time. Naturally, we all encounter circumstances that occasionally cause us to be late, but habitual tardiness is not acceptable. If you are late for class, no material will be repeated; therefore, you need to contact your classmates to be filled in on the material covered. If you arrive after attendance has been taken and you have no excuse, you will be marked as absent.

Class Policy on Cell Phones: Cell phones must be turned off at all times. If you are expecting an emergency call, make sure to talk to me before class.

Class Policy on Laptop Computers: You may bring your laptop to class, but all work done on laptop computers must be related to the class work of that day.

Academic Dishonesty: Academic honesty is fundamental to the activities and principles of the University, and more broadly to society at large. All members of the academic community must be confident that each person's work has been responsibly and honorably acquired, developed, and presented.

References: Use APA format, 5th edition.
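To make the late-submission rule concrete, here is a minimal sketch of the stated 5%-per-day deduction, counting every calendar day including weekends; the function name and the reading of "5%" as a fraction of the earned score are assumptions for illustration only, not part of the guidelines:

```python
from datetime import date

def penalized_score(raw_score, due, submitted, penalty_per_day=0.05):
    """Deduct 5% of the earned score for each calendar day late, weekends included."""
    days_late = max((submitted - due).days, 0)
    return raw_score * max(1.0 - penalty_per_day * days_late, 0.0)

# Example: an assignment that earned 90 points, handed in three days after the due date.
print(penalized_score(90, date(2011, 8, 6), date(2011, 8, 9)))  # 76.5
```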


Universidad Mariano Galvez de Guatemala Facultad de Humanidades Escuela de Idiomas Licda. Evelyn R. Quiroa

August 6th, 2011 Evaluation

In this class we had to complete a short questionnaire about evaluation. I have to admit that, with the information you sent us, I was able to get a better idea of the evaluation concept. The problem I have is that I am not currently teaching, so it is difficult for me to apply the concept to my job. I tried to answer some of the questions with information from my job, so I think the results would be different if they were not about classes at all.

Student Name: Ma. Raquel Morales Pérez ID number: 07603220


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA FACULTAD DE HUMANIDADES ESCUELA DE IDIOMAS LICDA. EVELYN R. QUIROA

Assessment and Evaluation Defined Assessment is the act of gathering information on a daily basis in order to understand individual students' learning and needs. Evaluation is the culminating act of interpreting the information gathered for the purpose of making decisions or judgments about students' learning and needs, often at reporting time. Assessment and evaluation are integral components of the teaching‐learning cycle. The main purposes are to guide and improve learning and instruction. Effectively planned assessment and evaluation can promote learning, build confidence, and develop students' understanding of themselves as learners. Assessment data assists the teacher in planning and adapting for further instruction. As well, teachers can enhance students' understanding of their own progress by involving them in gathering their own data, and by sharing teacher‐gathered data with them. Such participation makes it possible for students to identify personal learning goals. This curriculum advocates assessment and evaluation procedures which correspond with curriculum objectives and instructional practices, and which are sensitive to the developmental characteristics of early adolescents. Observation, conferencing, oral and written product assessment, and process (or performance) assessment may be used to gather information about student progress.

Guiding Principles
The following principles are intended to assist teachers in planning for student assessment and evaluation:
• Assessment and evaluation are essential components of the teaching-learning process. They should be planned, continuous activities which are derived from curriculum objectives and consistent with the instructional and learning strategies.
• A variety of assessment and evaluation techniques should be used. Techniques should be selected for their appropriateness to students' learning styles and to the intended purposes. Students should be given opportunities to demonstrate the extent of their knowledge, abilities, and attitudes in a variety of ways.
• Teachers should communicate assessment and evaluation strategies and plans in advance, informing the students of the objectives and the assessment procedures relative to the objectives. Students should have opportunities for input into the evaluation process.
• Assessment and evaluation should be fair and equitable. They should be sensitive to family, classroom, school, and community situations and to cultural or gender requirements; they should be free of bias.
• Assessment and evaluation should help students. They should provide positive feedback and encourage students to participate actively in their own assessment in order to foster lifelong learning and enable them to transfer knowledge and abilities to their life experiences.
• Assessment and evaluation data and results should be communicated to students and parents/guardians regularly, in meaningful ways.

Using a variety of techniques and tools, the teacher collects assessment information about students' language development and their growth in speaking, listening, writing, and reading knowledge and abilities. The data gathered during assessment becomes the basis for an evaluation. Comparing assessment information to curriculum objectives allows the teacher to make a decision or judgment regarding the progress of a student's learning.

Types of Assessment and Evaluation There are three types of assessment and evaluation that occur regularly throughout the school year: diagnostic, formative, and summative. Diagnostic assessment and evaluation usually occur at the beginning of the school year and before each unit of study. The purposes are to determine students' knowledge and skills, their learning needs, and their motivational and interest levels. By examining the results of diagnostic assessment, teachers can determine where to begin instruction and what concepts or skills to emphasize. Diagnostic assessment provides information essential to teachers in selecting relevant learning objectives and in designing appropriate learning experiences for all students, individually and as group members. Keeping diagnostic instruments for comparison and further reference enables teachers and students to determine progress and future direction. Diagnostic assessment tools such as the Writing Strategies Questionnaire and the Reading Interest/Attitude Inventory in this guide can provide support for instructional decisions. Formative assessment and evaluation focus on the processes and products of learning. Formative assessment is continuous and is meant to inform the student, the parent/guardian, and the teacher of the student's progress toward the curriculum objectives. This type of assessment and evaluation provides information upon which instructional decisions and adaptations can be made and provides students with directions for future learning. Involvement in constructing their own assessment instruments or in adapting ones the teacher has made allows students to focus on what they are trying to achieve, develops their thinking skills, and helps them to become reflective learners. As well, peer assessment is a useful formative evaluation technique. For peer assessment to be successful, students must be provided with assistance and the opportunity to observe a model peer assessment session. Through peer assessment students have the opportunity to become critical and creative thinkers who can clearly communicate ideas and thoughts to others. Instruments such as checklists or learning logs, and interviews or conferences provide useful data. Summative assessment and evaluation occur most often at the end of a unit of instruction and at term or year end when students are ready to demonstrate achievement of curriculum objectives. The main purposes are to determine knowledge, skills, abilities, and attitudes that have developed over a given period of time; to summarize student progress; and to report this progress to students, parents/guardians, and teachers.


Summative judgments are based upon criteria derived from curriculum objectives. By sharing these objectives with the students and involving them in designing the evaluation instruments, teachers enable students to understand and internalize the criteria by which their progress will be determined. Often assessment and evaluation results provide both formative and summative information. For example, summative evaluation can be used formatively to make decisions about changes to instructional strategies, curriculum topics, or learning environment. Similarly, formative evaluation assists teachers in making summative judgments about student progress and determining where further instruction is necessary for individuals or groups. The suggested assessment techniques included in various sections of this guide may be used for each type of evaluation.

The Evaluation Process Teachers as decision makers strive to make a close match between curriculum objectives, instructional methods, and assessment techniques. The evaluation process carried out parallel to instruction is a cyclical one that involves four phases: preparation, assessment, evaluation, and reflection. In the preparation phase, teachers decide what is to be evaluated, the type of evaluation to be used (diagnostic, formative, or summative), the criteria upon which student learning outcomes will be judged, and the most appropriate assessment techniques for gathering information on student progress. Teachers may make these decisions in collaboration with students. During the assessment phase, teachers select appropriate tools and techniques, then collect and collate information on student progress. Teachers must determine where, when, and how assessments will be conducted, and students must be consulted and informed. During the evaluation phase, teachers interpret the assessment information and make judgments about student progress. These judgments (or evaluation) provide information upon which teachers base decisions about student learning and report progress to students and parents/guardians. Students are encouraged to monitor their own learning by evaluating their achievements on a regular basis. Encouraging students to participate in evaluation nurtures gradual acceptance of responsibility for their own progress and helps them to understand and appreciate their growth as readers and writers. The reflection phase allows teachers to consider the extent to which the previous phases in the evaluation process have been successful. Specifically, teachers evaluate the utility, equity, and appropriateness of the assessment techniques used. Such reflection assists teachers in making decisions concerning improvements or adaptations to subsequent instruction and evaluation.

Student Assessment and Evaluation
When implementing assessment and evaluation procedures, it is valuable to consider the characteristics of early adolescents. Developmentally, Middle Level students are at various cognitive, emotional, social, and physical levels. Assessment and evaluation must be sensitive to this range of transitions and address individual progress. It is unrealistic and damaging to expect students who are at various stages of development to perform at the same level. It is necessary to clarify, for Middle Level students, the individual nature of the curriculum and the assessment strategies used; students should recognize that they are not being compared to their peers, but that they are setting their own learning goals in relation to curriculum objectives. Insensitive evaluation of the early adolescent can result in the student feeling low self-worth and wanting to give up. Regular, positive feedback is a valuable part of the learning process and helps students identify how well they have achieved individual goals and curriculum objectives. As students begin to achieve success, their sense of self-esteem increases and the need for extrinsic rewards gives way to the development of intrinsic motivation. Early adolescents are vulnerable to peer approval or rejection, and they harbor a strong sense of fairness and justice. Because Middle Level students find it more satisfying to strive for immediately achievable goals rather than long-term goals, they will respond positively to a system of continuous assessment and evaluation.

Effective evaluators of Middle Level students are astute observers who use a variety of monitoring techniques to collect information about students' knowledge, skills, attitudes, values, and language competencies. Well organized, concise, and accessible records accommodate the large quantities of data likely to be collected, and assist teachers' decision making and reporting. Some effective techniques for monitoring student progress in the areas of oracy and literacy include the following:
• Make video and audio recordings of a variety of formal and informal oral language experiences, and then assess these according to pre-determined criteria which are based upon student needs and curriculum objectives.
• Use checklists as concise methods of collecting information, and rating scales or rubrics to assess student achievement.
• Record anecdotal comments to provide useful data based upon observation of students' oral activities.
• Interview students to determine what they believe they do well or areas in which they need to improve.
• Have students keep portfolios of their dated writing samples, and language abilities checklists and records.
• Keep anecdotal records of students' reading and writing activities and experiences.
• Have students write in reader response journals.
• Confer with students during the writing and reading processes, and observe them during peer conferences.

Self‐assessment promotes students' abilities to assume more responsibility for their own learning by encouraging self‐reflection and encouraging them to identify where they believe they have been successful and where they believe they require assistance. Discussing students' self‐assessments with them allows the teacher to see how they value their own work and to ask questions that encourage students to reflect upon their experiences and set goals for new learning. Peer assessment allows students to collaborate and learn from others. Through discussions with peers, Middle Level students can verbalize their concerns and ideas in a way that helps them clarify their thoughts and decide in which direction to proceed.


The instruments for peer and self‐assessment should be collaboratively constructed by teachers and students. It is important for teachers to discuss learning objectives with the students. Together, they can develop assessment and evaluation criteria relevant to the objectives, as well as to students' individual and group needs.

Assessment and Evaluation Strategies
Assessment data can be collected and recorded by both the teacher and the students in a variety of ways. Through observation of students, and in interviews or conferences with students, teachers can discover much about their students' knowledge, abilities, interests, and needs. As well, teachers can collect samples of students' work in portfolios and conduct performance assessments within the context of classroom activities. When a number of assessment tools are used in conjunction with one another, richer and more in-depth data collection results. Whatever method of data collection is used, teachers should:
• meet with students regularly to discuss their progress
• adjust rating criteria as learners change and progress.

Observation
Observation occurs during students' daily reading, writing, listening, and speaking experiences. It is an unobtrusive means by which teachers (and students) can determine their progress during learning. Observations can be recorded as anecdotal notes, and on checklists or rating scales. When teachers attach the data collection sheets to a hand-held clipboard, data can be recorded immediately and with little interruption to the student. Alternatively, adhesive note papers can be used to record data quickly and unobtrusively.

Anecdotal Records
Anecdotal records are notes written by the teacher regarding student language, behavior, or learning. They document and describe significant daily events, and relevant aspects of student activity and progress. These notes can be taken during student activities or at the end of the day. Formats for collection should be flexible and easy to use. Guidelines for use include the following:
• Record the observation and the circumstance in which the learning experience occurs. There will be time to analyze notes at another time, perhaps at the end of the day, or after several observations about one student have been accumulated.
• Make the task of daily note taking manageable by focusing on clearly defined objectives or purposes, and by identifying only a few students to observe during a designated period of time. However, learning and progress cannot be scheduled, and it is valuable to note other observations of importance as they occur.
• Record data on loose-leaf sheets and keep these in a three-ring binder with a page designated for each student and organized alphabetically by students' last names or by class. This format allows the teacher to add pages as necessary.
• Write the notes on recipe cards and then file these alphabetically.
• Use adhesive note papers that can be attached to the student's pages or recipe card files.
• Design structured forms for collection of specific data.
• Use a combination of the above suggestions.

Teachers may choose to keep running written observations for each student or they may use a more structured approach, constructing charts that focus each observation on the collection of specific data. A combination of open-ended notes and structured forms may also be used. It is important to date all observations recorded.

Checklists
Observation checklists, usually completed while students are engaged in specific activities or processes, are lists of specific criteria that teachers focus on at a particular time or during a particular process. Checklists are used to record whether students have acquired specific knowledge, skills, processes, abilities, and attitudes. Checklists inform teachers about where their instruction has been successful and where students need assistance or further instruction. Formats for checklists should be varied and easy to use. Guidelines for using checklists include the following:
• Determine the observation criteria from curriculum, unit, and lesson objectives.
• Review specific criteria with students before beginning the observation.
• Involve students in developing some or all of the criteria whenever it will be beneficial to do so.
• Choose criteria that are easily observed to prevent vagueness and increase objectivity.
• Use jargon-free language to describe criteria so that data can be used in interviews with students and parents.
• Make the observation manageable by keeping the number of criteria to less than eight and by limiting the number of students observed to a few at one time.
• Have students construct and use checklists for peer and self-assessments.
• Summarize checklist data regularly.
• Use or adapt existing checklists from other sources.
• Use yes-no checklists to identify whether a specific action has been completed or if a particular quality is present.
• Use tally checklists to note the frequency of the action observed or recorded.
• Construct all checklists with space for recording anecdotal notes and comments.

Rating Scales and Rubrics
Rating scales record the extent to which specific criteria have been achieved by the student or are present in the student's work. Rating scales also record the quality of the student's performance at a given time or within a given process. Rating scales are similar to checklists, and teachers can often convert checklists into rating scales by assigning number values to the various criteria listed (a small numeric sketch follows the guidelines below). They can be designed as number lines or as holistic scales or rubrics. Rubrics include criteria that describe each level of the rating scale and are used to determine student progress in comparison to these expectations. All formats for rating student progress should be concise and clear. Guidelines for use include the following:
• Determine specific assessment criteria from curriculum objectives, components of a particular activity, or student needs.
• Discuss or develop the specific criteria with students before beginning the assessment.
• Choose criteria that are easily observed in order to prevent vagueness and increase objectivity.
• Select criteria that students have had the opportunity to practice. These criteria may differ from student to student, depending upon their strengths and needs.
• Use jargon-free language to describe criteria so that data can be used effectively in interviews with students and parents.
• Make the assessment manageable by keeping the number of criteria to less than eight and by limiting the number of students observed to a few at one time.
• Use or adapt rating scales and rubrics from other sources.
• Use numbered continuums to measure the degree to which students are successful at accomplishing a skill or activity.
• Use rubrics when the observation calls for a holistic rating scale. Rubrics describe the attributes of student knowledge or achievements on a numbered continuum of possibilities.
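As a rough numeric sketch of the point made above about converting a checklist into a rating scale (the criteria, level labels, and point values here are invented for illustration and are not taken from this guide):

```python
# A hypothetical 4-point rating scale built from a three-criterion checklist.
LEVELS = {1: "beginning", 2: "developing", 3: "proficient", 4: "exemplary"}
CRITERIA = ["organizes ideas clearly", "supports ideas with details", "uses appropriate vocabulary"]

def rate(student_ratings):
    """student_ratings maps each criterion to a level from 1 (lowest) to 4 (highest)."""
    total = sum(student_ratings[c] for c in CRITERIA)
    return total, 100 * total / (max(LEVELS) * len(CRITERIA))  # raw points and a percentage

points, pct = rate({"organizes ideas clearly": 3,
                    "supports ideas with details": 2,
                    "uses appropriate vocabulary": 4})
print(points, pct)  # 9 75.0
```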

Portfolios
Portfolios are collections of relevant work that reflect students' individual efforts, development, and progress over a designated period of time. Portfolios provide students, teachers, parents, and administrators with a broad picture of each student's growth over time, including the student's abilities, knowledge, skills, and attitudes. Students should be involved in the selection of work to be included, goal setting for personal learning, and self-assessment. The teacher can encourage critical thinking by having students decide which of their works to include in their portfolios and explain why they chose those particular items. Instruction and assessment are integrated as students and teachers collaborate to compile relevant and individual portfolios for each student. Guidelines for use include the following:
• Brainstorm with students to discover what they already know about portfolios.
• Share samples of portfolios with students. (Teachers may need to create samples if student ones are not available; however, samples should be as authentic as possible.)
• Provide students with an overview of portfolio assessment prior to beginning their collections.
• Collaborate with students to set up guidelines for the content of portfolios and establish evaluation criteria for their portfolio collections. Consider the following:
  o What is the purpose of the portfolio? (Is it the primary focus of assessment or is it supplemental? Will it be used to determine a mark or will it simply be used to inform students, teachers, and parents about student progress?)
  o Who will be the audience(s) for the portfolio?
  o What will be included in the portfolio (e.g., writing samples only, samples of all language processes)?
  o What are the criteria for selecting a piece of work for inclusion? When should those selections be made?
  o Who will determine what items are included in the portfolio (e.g., the student, the teacher, the student and teacher in consultation)?
  o When should items be added or removed?
  o How should the contents be organized and documented? Where will the portfolios be stored?
  o What will be the criteria for evaluation of the portfolio?
  o What form will feedback to the students take (e.g., written summaries, oral interviews/conferences)?
  o How will the portfolio be assessed/evaluated (e.g., list of criteria)?
• Assemble examples of work that represent a wide range of students' developing abilities, knowledge, and attitudes, including samples of work from their speaking, listening, reading, writing, representing, and viewing experiences.
• Date all items for effective organization and reference.
• Inform parents/guardians about the use and purposes of portfolios (e.g., send letters describing portfolios home, display sample portfolios on meet-the-teacher evening to introduce parents to the concept).
• Consider the following for inclusion:
  o criteria for content selection
  o table of contents or captioned labels that briefly outline or identify the contents
  o samples of student writing (e.g., pre-writing, multiple drafts, final drafts, published pieces)
  o sample reading logs
  o samples of a variety of responses from reader response journals (originals or photocopies of originals)
  o evidence of student self-reflection (e.g., summaries, structured reflection sheets)
  o audiotapes and videotapes of student work
  o photographs
  o collaborative projects
  o computer disks.

Formats for portfolio assembly should be easily organized, stored, and accessed. Some possibilities include the following:
• Keep file folders or accordion folders in classroom filing cabinet drawers, cupboards, or boxes.
• Use three-ring binders for ease of adding and removing items as students progress.
• Store scrapbooks in boxes or crates.

Evaluating Student Portfolios
At the end of the term/semester/year, when the portfolio is submitted for summative evaluation, it is useful to review the contents as a whole and record data using the previously set criteria. One method of recording data is to prepare a grid with the criteria listed down one side and the checklist or rating scale across the top. If there is a need to assign a numerical grade, assign numbers to each set of criteria on the checklist/rating scale and convert the evaluation into a number grade, as in the brief sketch below. Some examples of portfolio assessment and recording forms follow.
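A small sketch of the grid idea described above, with criteria down one side and a numeric rating scale across the top, converted into a percentage grade; the criteria and point values are hypothetical, not prescribed by the guide:

```python
# Hypothetical summative grid: four portfolio criteria, each rated on a 1-5 scale,
# converted into a percentage grade.
MAX_PER_CRITERION = 5
RATINGS = {
    "required pieces included": 5,
    "evidence of self-reflection": 4,
    "growth shown across drafts": 3,
    "organization and documentation": 4,
}

grade = 100 * sum(RATINGS.values()) / (MAX_PER_CRITERION * len(RATINGS))
print(round(grade))  # 80
```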


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA FACULTAD DE HUMANIDADES ESCUELA DE IDIOMAS LICDA. EVELYN R. QUIROA

What is assessment and evaluation?
Assessment is defined as data-gathering strategies, analyses, and reporting processes that provide information that can be used to determine whether or not intended outcomes are being achieved. Evaluation uses assessment information to support decisions on maintaining, changing, or discarding instructional or programmatic practices. These strategies can inform:
• the nature and extent of learning,
• curricular decision making,
• the correspondence between learning and the aims and objectives of teaching, and
• the relationship between learning and the environments in which learning takes place.

Evaluation is the culminating act of interpreting the information gathered for the purpose of making decisions or judgments about students' learning and needs, often at reporting time. Assessment and evaluation are integral components of the teaching‐learning cycle. The main purposes are to guide and improve learning and instruction. Effectively planned assessment and evaluation can promote learning, build confidence, and develop students' understanding of themselves as learners. Assessment data assists the teacher in planning and adapting for further instruction. As well, teachers can enhance students' understanding of their own progress by involving them in gathering their own data, and by sharing teacher‐gathered data with them. Such participation makes it possible for students to identify personal learning goals. Types of Assessment and Evaluation There are three types of assessment and evaluation that occur regularly throughout the school year: diagnostic, formative, and summative. Diagnostic assessment and evaluation usually occur at the beginning of the school year and before each unit of study. The purposes are to determine students' knowledge and skills, their learning needs, and their motivational and interest levels. By examining the results of diagnostic assessment, teachers can determine where to begin instruction and what concepts or skills to emphasize. Diagnostic assessment provides information essential to teachers in selecting relevant learning objectives and in designing appropriate learning experiences for all students, individually and as group members. Keeping diagnostic instruments for comparison and further reference enables teachers and students to determine progress and future direction. Diagnostic assessment tools such as the Writing Strategies Questionnaire and the Reading Interest/Attitude Inventory in this guide can provide support for instructional decisions.


Formative assessment and evaluation focus on the processes and products of learning. Formative assessment is continuous and is meant to inform the student, the parent/guardian, and the teacher of the student's progress toward the curriculum objectives. This type of assessment and evaluation provides information upon which instructional decisions and adaptations can be made and provides students with directions for future learning. Involvement in constructing their own assessment instruments or in adapting ones the teacher has made allows students to focus on what they are trying to achieve, develops their thinking skills, and helps them to become reflective learners. As well, peer assessment is a useful formative evaluation technique. For peer assessment to be successful, students must be provided with assistance and the opportunity to observe a model peer assessment session. Through peer assessment students have the opportunity to become critical and creative thinkers who can clearly communicate ideas and thoughts to others. Instruments such as checklists or learning logs, and interviews or conferences provide useful data. Summative assessment and evaluation occur most often at the end of a unit of instruction and at term or year end when students are ready to demonstrate achievement of curriculum objectives. The main purposes are to determine knowledge, skills, abilities, and attitudes that have developed over a given period of time; to summarize student progress; and to report this progress to students, parents/guardians, and teachers. Summative judgements are based upon criteria derived from curriculum objectives. By sharing these objectives with the students and involving them in designing the evaluation instruments, teachers enable students to understand and internalize the criteria by which their progress will be determined. Often assessment and evaluation results provide both formative and summative information. For example, summative evaluation can be used formatively to make decisions about changes to instructional strategies, curriculum topics, or learning environment. Similarly, formative evaluation assists teachers in making summative judgements about student progress and determining where further instruction is necessary for individuals or groups. The suggested assessment techniques included in various sections of this guide may be used for each type of evaluation.


Chapter 9. Assessment Vocabulary
The definitions in this list were derived from several sources, including:
• Glossary of Useful Terms Related to Authentic and Performance Assessments. Grant Wiggins
• SCASS Arts Assessment Project Glossary of Assessment Terms
• The ERIC Review: Performance-Based Assessment. Vol. 3, Issue 1, Winter 1994.
• Assessment: How Do We Know What They Know? ASCD, 1992.
• Dissolving the Boundaries: Assessment that Enhances Learning. Dee Dickinson
• http://www.newhorizons.org/strategies/assess/terminology.htm

Accountability – The demand by a community (public officials, employers, and taxpayers) for school officials to prove that money invested in education has led to measurable learning. "Accountability testing" is an attempt to sample what students have learned, or how well teachers have taught, and/or the effectiveness of a school principal's performance as an instructional leader. School budgets and personnel promotions, compensation, and awards may be affected. Most school districts make this kind of assessment public; it can affect policy and public perception of the effectiveness of taxpayer-supported schools and be the basis for comparison among schools. It has been suggested that test scores analyzed in a disaggregated format can help identify instructional problems and point to potential solutions.

Action Plans – The statement that indicates the specific changes that a given area plans to implement in the next cycle based on assessment results. "The biology faculty will introduce one special project in the introductory class that will expose the students to the scientific method." "Career Services is implementing a software program called '1st Place'. This software will allow better tracking of job openings."

Action Research – Classroom-based research involving the systematic collection of data in order to address certain questions and issues so as to improve classroom instruction and educational effectiveness.

Affective Outcomes – Outcomes of education that reflect feelings more than understanding; likes, pleasures, ideals, dislikes, annoyances, values.

Annual Report – A report from each academic program based on its assessment plan that is submitted annually, which outlines how evidence was used to improve student learning outcomes through curricular and/or other changes or to document that no changes were needed.

Assessment – The systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development. In general terms, assessment is the determination of a value, or measurement, based on a "standard." We often refer to this standard as a "target." Standard-based measurement, or assessment, is useful in education for both the placement of students in initial course work and ascertaining the extent of students' acquisition of skills/knowledge.

Assessment Cycle – The assessment cycle in higher education is generally annual and fits within the academic year. Outcomes, targets and assessment tools are established early in the fall semester; data is collected by the end of spring semester; results are analyzed during the summer and early fall.

Assessment Tool – An instrument that has been designed to collect objective data about students' knowledge and skill acquisition. An appropriate outcomes assessment test measures students' ability to integrate a set of individual skills into a meaningful, collective demonstration. Some examples of assessment tools include standardized tests, end-of-program skills tests, student inquiries, common final exams, and comprehensive embedded test items.

Assessment Literacy – The possession of knowledge about the basic principles of sound assessment practice, including terminology, the development and use of assessment methodologies and techniques, familiarity with standards of quality in assessment, and, increasingly, familiarity with alternatives to traditional measurements of learning.

Authentic Assessment – A circumstance in which the behavior that the learning is intended to produce is evaluated and discussed in order to improve learning. The concept of model, practice, feedback, in which students know what excellent performance is and are guided to practice an entire concept rather than bits and pieces in preparation for eventual understanding. A variety of techniques can be employed in authentic assessment.

Benchmark – Student performance standards (the level(s) of student competence in a content area).

Cohort – A group whose progress is followed by means of measurements at different points in time.

Course-embedded assessment – A method in which evidence of student learning outcomes for the program is obtained from assignments in particular courses in the curriculum.

Course-level assessment – Assessment to determine the extent to which a specific course is achieving its learning goals.

Course mapping – A matrix showing the coverage of each program learning outcome in each course. It may also indicate the level of emphasis of each outcome in each course.

Criterion Referenced Tests – A test in which the results can be used to determine a student's progress toward mastery of a content area. Performance is compared to an expected level of mastery in a content area rather than to other students' scores. Such tests usually include questions based on what the student was taught and are designed to measure the student's mastery of designated objectives of an instructional program. The "criterion" is the standard of performance established as the passing score for the test. Scores have meaning in terms of what the student knows or can do, rather than how the test-taker compares to a reference or norm group.

Curriculum Map – A matrix showing where each goal and/or learning outcome is covered in each program course.

Direct Assessment – Assessment to gauge student achievement of learning outcomes directly from their work.

Educational Goals – The knowledge, skills, abilities, capacities, attitudes or dispositions students are expected to acquire as a result of completing your academic program. Goals are sometimes treated as synonymous with outcomes, though outcomes are the behavioral results of the goals, and are stated in precise operational terms.

Formative assessment – The assessment of student achievement at different stages of a course or at different stages of a student's academic career. The focus of formative assessment is on the documentation of student development over time. It can also be used to engage students in a process of reflection on their education.

General Education Assessment – Assessment that measures the campus-wide, general education competencies agreed upon by the faculty. General education assessment is more holistic in nature than program outcomes assessment because competencies are measured across disciplines, rather than just within a single discipline.

Holistic Scoring – In assessment, assigning a single score based on an overall assessment of performance rather than by scoring or analyzing dimensions or traits individually. The product is considered to be more than the sum of its parts, and so the quality of a final product or performance is evaluated rather than the process or dimension of performance. A holistic scoring rubric might combine a number of elements on a single scale. Focused holistic scoring may be used to evaluate a limited portion of a learner's performance.

Indirect Assessment – Assessment that deduces student achievement of learning outcomes through the reported perception of learning by students and other agents.

Institutional assessment – Assessment to determine the extent to which a college or university is achieving its mission.

Learning outcomes – Operational statements describing specific student behaviors that evidence the acquisition of desired goals in knowledge, skills, abilities, capacities, attitudes or dispositions. Learning outcomes can be usefully thought of as behavioral criteria for determining whether students are achieving the educational goals of a program, and, ultimately, whether overall program goals are being successfully met. Outcomes are sometimes treated as synonymous with objectives, though objectives are usually more general statements of what students are expected to achieve in an academic program.

Measurable Criteria – An intended student outcome, or administrative objective, restated in a quantifiable, or measurable, statement. "60% of biology students will complete an experiment/project using scientific methods in fall 2003;" "75% of responding MU students will indicate on a survey in fall 2003 that they have read materials about career opportunities on campus."

Metacognition – The knowledge of one's own thinking processes and strategies, and the ability to consciously reflect and act on the knowledge of cognition to modify those processes and strategies.

Norm – A distribution of scores obtained from a norm group. The norm is the midpoint (or median) of scores or performance of the students in that group. Fifty percent will score above and fifty percent below the norm.

Performance-Based Assessment – Direct, systematic observation and rating of student performance of an educational objective, often an ongoing observation over a period of time, and typically involving the creation of products. The assessment may be a continuing interaction between teacher and student and should ideally be part of the learning process. The assessment should be a real-world performance with relevance to the student and learning community. Assessment of the performance is done using a rubric, or analytic scoring guide, to aid in objectivity. Performance-based assessment is a test of the ability to apply knowledge in a real-life setting or performance of exemplary tasks in the demonstration of intellectual ability.

Portfolio – A systematic and organized collection of a student's work that exhibits to others the direct evidence of a student's efforts, achievements, and progress over a period of time. The collection should involve the student in selection of its contents, and should include information about the performance criteria, the rubric or criteria for judging merit, and evidence of student self-reflection or evaluation.

Portfolio Assessment – Portfolios may be assessed in a variety of ways. Each piece may be individually scored, or the portfolio might be assessed merely for the presence of required pieces, or a holistic scoring process might be used and an evaluation made on the basis of an overall impression of the student's collected work. It is common that assessors work together to establish consensus of standards or to ensure greater reliability in evaluation of student work. Established criteria are often used by reviewers and students involved in the process of evaluating progress and achievement of objectives.

Primary Trait Method – A type of rubric scoring constructed to assess a specific trait, skill, behavior, or format, or the evaluation of the primary impact of a learning process on a designated audience.

Process – A generalizable method of doing something, generally involving steps or operations which are usually ordered and/or interdependent. Process can be evaluated as part of an assessment, as in the example of evaluating a student's performance during prewriting exercises leading up to the final production of an essay or paper.

Program assessment – Assessment to determine the extent to which students in a departmental program can demonstrate the learning outcomes for the program.

Reliability – An assessment tool's consistency of results over time and with different samples of students.

Rubric – A set of criteria specifying the characteristics of a learning outcome and the levels of achievement in each characteristic.

Self-efficacy – Students' judgment of their own capabilities for a specific learning outcome.

Senior Project – Extensive projects planned and carried out during the senior year as the culmination of the undergraduate experience. Senior projects require higher-level thinking skills, problem-solving, and creative thinking. They are often interdisciplinary, and may require extensive research. Projects culminate in a presentation of the project to a panel of people, usually faculty and community mentors, sometimes students, who evaluate the student's work at the end of the year.

Summative assessment – The assessment of student achievement at the end point of their education or at the end of a course. The focus of summative assessment is on the documentation of student achievement by the end of a course or program. It does not reveal the pathway of development to achieve that endpoint.

Triangulation – The collection of data via multiple methods in order to determine if the results show a consistent outcome.

Validity – The degree to which an assessment measures (a) what is intended, as opposed to (b) what is not intended, or (c) what is unsystematic or unstable.


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA FACULTAD DE HUMANIDADES ESCUELA DE IDIOMAS LICDA. EVELYN R. QUIROA

ASSESSMENT AND EVALUATION VOCABULARY (DIAGNOSTIC)
1. Action Research
2. Affective Outcomes
3. Annual Report
4. Assessment
5. Assessment Cycle
6. Assessment Tool
7. Assessment Literacy
8. Authentic Assessment
9. Benchmark
10. Cohort
11. Course-embedded assessment
12. Course-level assessment
13. Course mapping
14. Criterion Referenced Tests
15. Curriculum Map
16. Diagnostic Evaluation
17. Direct Assessment
18. Educational Goals
19. Formative assessment
20. General Education Assessment
21. Holistic Scoring
22. Learning outcomes
23. Measurable Criteria
24. Metacognition
25. Norm
26. Portfolio
27. Primary Trait Method
28. Process
29. Program assessment
30. Reliability
31. Rubric
32. Self-efficacy
33. Senior Project
34. Summative assessment
35. Validity


YORK UNIVERSITY SENATE COMMITTEE ON TEACHING AND LEARNING'S GUIDE TO

TEACHING ASSESSMENT & EVALUATION

INTRODUCTION

NEED FOR THE GUIDE

The Teaching Assessment and Evaluation Guide provides instructors with starting-points for reflecting on their teaching, and with advice on how to gather feedback on their teaching practices and effectiveness as part of a systematic program of teaching development. As well, the Guide provides guidance on how teaching might be fairly and effectively evaluated, which characteristics of teaching might be considered, and which evaluation techniques are best suited for different purposes. The Teaching Assessment and Evaluation Guide is a companion to the Teaching Documentation Guide (1993), also prepared by the Senate Committee on Teaching and Learning (SCOTL). The Documentation Guide (available at the Centre for the Support of Teaching and on the SCOTL website) aims to provide instructors with advice and concrete suggestions on how to document the variety and complexity of their teaching contributions.

Teaching is a complex and personal activity that is best assessed and evaluated using multiple techniques and broadly-based criteria. Assessment for formative purposes is designed to stimulate growth, change and improvement in teaching through reflective practice. Evaluation, in contrast, is used for summative purposes to give an overview of a particular instructor’s teaching in a particular course and setting. Informed judgements on teaching effectiveness can best be made when both assessment and evaluation are conducted, using several techniques to elicit information from various perspectives on different characteristics of teaching. There is no one complete source for information on one’s teaching, and no single technique for gathering it. Moreover, the techniques need to be sensitive to the particular teaching assignment of the instructor being assessed or evaluated, as well as the context in which the teaching takes place. If multiple perspectives are represented and different techniques used, the process will be more valued, the conclusions reached will be more credible, and consequently more valuable to the individual being assessed or evaluated.

CONTENTS
• Introduction ............................................ 1
• Need for the Guide ...................................... 1
• What is Quality Teaching? ............................... 2
• Formative Assessment .................................... 2
• Summative Evaluation .................................... 2
• Overview of Assessment and Evaluation Strategies:
  1. Teaching dossiers .................................... 3
  2. Student ratings ...................................... 4
  3. Peer observations .................................... 5
  4. Letters & individual interviews ...................... 6
  5. Course portfolios .................................... 6
  6. Classroom assessment ................................. 7
• Classroom Assessment Techniques ......................... 8

Current practices at York University are varied. In most departments and units, teaching is systematically evaluated, primarily for summative purposes. Individual instructors are free, if they wish, to use the data so gathered for formative purposes, or they may contact the Centre for the Support of Teaching which provides feedback and teaching analysis aimed at growth, development and improvement. Without denying the value of summative teaching evaluation, the main purpose of this Guide is to encourage committees and individuals to engage in reflective practice through the ongoing assessment of teaching for formative purposes and for professional development. Research indicates that such practice leads to heightened enthusiasm for teaching, and improvement in teaching and learning, both of which are linked to faculty vitality.

The Teaching Assessment and Evaluation Guide© is published by the Senate Committee on Teaching and Learning (SCOTL), York University, www.yorku.ca/secretariat/senate/committees/scotl/ (revised January 2002).



WHAT IS QUALITY TEACHING?

All assessment and evaluation techniques contain implicit assumptions about the characteristics that constitute quality teaching. These assumptions should be made explicit and indeed should become part of the evaluation process itself in a manner which recognizes instructors' rights to be evaluated within the context of their own teaching philosophies and goals. First and foremost then, "teaching is not right or wrong, good or bad, effective or ineffective in any absolute, fixed or determined sense."¹ Instructors emphasize different domains of learning (affective, cognitive, psychomotor, etc.) and employ different theories of education and teaching methodologies (anti-racist, constructivist, critical, feminist, humanistic, etc.)². They encourage learning in different sites (classrooms, field locations, laboratories, seminar rooms, studios, virtual classrooms, etc.). They use different instructional strategies and formats (using case studies, coaching, demonstrating, facilitating discussions, lecturing, problem-based learning, online delivery, etc.), and they do this while recognizing that students have diverse backgrounds and levels of preparedness. In one situation, instructors may see their role as transmitting factual information, and in another as facilitating discussion and promoting critical thinking. As variable and diverse as quality teaching might be, generalizations may nevertheless be made about its basic characteristics as described in the accompanying text box.

QUALITY TEACHING
Put succinctly, quality teaching is that activity which brings about the most productive and beneficial learning experience for students and promotes their development as learners. This experience may include such aspects as:
• improved comprehension of and ability to use the ideas introduced in the course;
• change in outlook, attitude and enthusiasm towards the discipline and its place in the academic endeavour;
• intellectual growth; and
• improvement in specific skills such as critical reading and writing, oral communication, analysis, synthesis, abstraction, and generalization.

The criteria for evaluating teaching vary between disciplines and within disciplines, and should take into consideration the level of the course, the instructor's objectives and style, and the teaching methodology employed. Nonetheless, the primary criterion must be improved student learning. Research indicates that students, faculty and administrators alike agree that quality teaching:
• establishes a positive learning environment;
• motivates student engagement;
• provides appropriate challenges;
• is responsive to students' learning needs; and
• is fair in evaluating their learning.

Concretely, indicators of quality teaching can include:
• effective choice of materials;
• organization of subject matter and course;
• effective communication skills;
• knowledge of and enthusiasm for the subject matter and teaching;
• availability to students; and
• responsiveness to student concerns and opinions.

Some characteristics are more easily measured than others. Furthermore, since instructors are individuals and teaching styles are personal, it is all the more important to recognize that not everyone will display the same patterns and strengths.

ASSESSMENT OF TEACHING FOR FORMATIVE PURPOSES
Formative assessment of teaching can be carried out at many points during an instructional period, in the classroom or virtual environment, to compare the perceptions of the instructor with those of the students, and to identify gaps between what has been taught and what students have learned. The purpose of assessment is for instructors to find out what changes they might make in teaching methods or style, course organization or content, evaluation and grading procedures, etc., in order to improve student learning. Assessment is initiated by the instructor, and information and feedback can be solicited from many sources (for example, self, students, colleagues, consultants) using a variety of instruments (surveys, on-line forms, etc.; see classroom assessment below). The data gathered are seen only by the instructor and, if desired, a consultant, and form the basis for ongoing improvement and development.

SUMMATIVE EVALUATION
Summative evaluation, by contrast, is usually conducted at the end of a particular course or at specific points in an instructor's career. The purpose is to form a judgment about the effectiveness of a course and/or an instructor. The judgment may be used for tenure and promotion decisions, to reward success in the form of teaching awards or merit pay, or to enable departments to make informed decisions about changes to individual courses, the curriculum or teaching assignments.

______
1. Mary Ellen Weimer (1990). Improving College Teaching (CA: Jossey-Bass Publishers), 202.
2. Adapted from George L. Geis (1977), "Evaluation: definitions, problems and strategies," in Chris Knapper et al., Eds., Teaching is Important (Toronto: Clarke Irwin in association with CAUT).


At most universities, summative evaluation includes the results of teaching evaluations regularly scheduled at the end of academic terms. However, to ensure that summative evaluation is both comprehensive and representative, it should include a variety of evaluation strategies, among them:

• letters from individual students commenting on the effectiveness of the instructor's teaching, the quality of the learning experience, and the impact of both on their academic progress;
• assessments by peers based on classroom visits;
• samples and critical reviews of contributions to course and curriculum development, as well as of contributions to scholarship on teaching; and
• evidence of exceptional achievements and contributions to teaching in the form of awards and committee work.

One's teaching dossier (see below) is an ideal format for presenting these types of evaluation as a cumulative and longitudinal record of one's teaching.

Important note: It is crucial that the two processes – summative evaluation and formative assessment – be kept strictly apart if the formative assessment of teaching is to be effective and achieve its purpose. This means that the information gathered in a program of formative assessment should not be used in summative evaluation unless volunteered by instructors themselves. It also means that persons who are or have been involved in assisting instructors to improve their teaching should not be asked to provide information for summative evaluation purposes.

OVERVIEW OF STRATEGIES FOR ASSESSING AND EVALUATING QUALITY TEACHING AND STUDENT LEARNING This section describes six strategies that teachers may use to assess and evaluate the quality of their teaching and its impact on student learning: 1) teaching dossiers; 2) student ratings; 3) peer observations; 4) letters and individual interviews; 5) course portfolios; and 6) classroom assessment. These descriptions draw on current research in the field (available at the Centre for the Support of Teaching, 111 Central Square, www.yorku.ca/cst) and on practices and procedures at other universities in Canada and abroad. All evaluation and assessment efforts should use a combination of strategies to take advantage of their inherent strengths as well as to offset their individual limitations.

1. TEACHING DOSSIERS

A teaching dossier or portfolio is a factual description of an instructor's teaching achievements and contains documentation that collectively suggests the scope and quality of his or her teaching. Dossiers can be used to present evidence about teaching quality for evaluative purposes such as T&P submissions, teaching award nominations, etc., as they can provide a useful context for analyzing other forms of teaching evaluation. Alternatively, dossiers can provide the framework for a systematic program of reflective analysis and peer collaboration leading to improvement of teaching and student learning. For further information on how to prepare a teaching dossier, please consult SCOTL's Teaching Documentation Guide (available at the Centre for the Support of Teaching and from the SCOTL website).

To focus on:
§ Appraisal of instructor's teaching and learning context
§ Soundness of instructor's approach to teaching and learning
§ Coherence of teaching objectives and strategies
§ Vigour of professional development, contributions and accomplishments in the area of teaching.

Benefits: Dossiers provide an opportunity for instructors to articulate their teaching philosophy, review their teaching goals and objectives, assess the effectiveness of their classroom practice and the strategies they use to animate their pedagogical values, and identify areas of strength and opportunities for improvement. They also highlight an instructor's range of responsibilities, accomplishments, and contributions to teaching and learning more generally within the department, university and/or scholarly community.

Limitations: It is important to note that dossiers are not meant to be an exhaustive compilation of all the documents and materials that bear on an instructor's teaching performance; rather, they should present a selection of information organized in a way that gives a comprehensive and accurate summary of teaching activities and effectiveness.

_______
For further information on teaching dossiers see:
Teaching Documentation Guide (1993, Senate Committee on Teaching and Learning).
Peter Seldin, "Self-Evaluation: What Works? What Doesn't?" and John Zubizarreta, "Evaluating Teaching through Portfolios," in Seldin and Associates (1999). Changing Practices in Evaluating Teaching: A Practical Guide to Improved Faculty Performance and Promotion/Tenure Decisions (MA: Anker Press).


2. STUDENT RATINGS OF TEACHING

Student ratings of teaching, or student evaluations, are the most commonly used source of data for both summative and formative information. In many academic units they are mandatory, and in several units they are also standardized. For purposes such as tenure and promotion, data should be obtained over time and across courses using a limited number of global or summary type questions. Such data will provide a cumulative record and enable the detection of patterns of teaching development¹. Information obtained by means of student ratings can also be used by individual instructors to improve the course in future years, and to identify areas of strength and weakness in their teaching by comparison with those teaching similar courses. Longer and more focussed questionnaires are also useful in a program of formative evaluation when designed and administered by an instructor during a course.

To focus on:
§ Effectiveness of instructor
§ Impact of instruction on student learning
§ Perceived value of the course to the student
§ Preparation and organization
§ Knowledge of subject matter and ability to stimulate interest in the course
§ Clarity and understandability
§ Ability to establish rapport and encourage discussion within the classroom
§ Sensitivity to and concern with students' level of understanding and progress

Benefits: The use of a mandatory, standardized questionnaire puts all teaching evaluations on a common footing, and facilitates comparisons between teachers, courses and academic units. The data gathered also serve the purpose of assessing whether the educational goals of the unit are being met. Structured questionnaires are particularly appropriate where there are relatively large numbers of students involved, and where there are either several sections of a single course, or several courses with similar teaching objectives using similar teaching approaches.

Questionnaires are relatively economical to administer, summarize and interpret. Provided that students are asked to comment only on items with which they have direct experience, student responses to questionnaires have been found to be valid. While questionnaire forms with open-ended questions are more expensive to administer, they often provide more reliable and useful sources of information in small classes and for the tenure and promotion process. Also, open-ended questions provide insight into the numerical ratings, and provide pertinent information for course revision.

Limitations: While students' perceptions provide valuable feedback to instructors, recent research has identified specific areas of teaching quality on which students are not able to make informed judgments. These include the appropriateness of course goals, content, design, materials, and evaluation of student work³. Thus, the use of a variety of techniques as described elsewhere in this document can help to address the gaps and shortcomings in the student rating data. Further, recent research indicates that care should be taken to control for possible biases based on gender, race, discipline, and teaching approach, particularly for those using non-traditional teaching methods and curriculum. Likewise, ratings can be affected by factors for which it is difficult to control, such as student motivation, complexity of material, level of course, and class size. Care should be taken, therefore, to create an appropriate context for interpreting the data in light of other sources of data and in comparison with other courses. One way to ensure fairness and equity is to ask students to identify the strengths of the instructor's approach as well as weaknesses, and to ask for specific suggestions for improvement.

Teachers have such different perspectives, approaches, and objectives that a standardized questionnaire may not adequately or fairly compare their performance. For example, the implicit assumption behind the design of many evaluation forms is that the primary mode of instruction is the lecture method. Such a form will be inadequate in evaluating the performance of instructors who use different teaching methods, for example collaborative learning. One way to overcome this limitation and to tailor the questionnaire to the objectives and approaches of a specific course or instructor is to design an evaluation form with a mandatory core set of questions and additional space for inserting questions chosen by the instructor.

Note: The Centre for the Support of Teaching has sample teaching evaluation forms from numerous Faculties and departments, as well as books and articles which are helpful resources for individuals and committees interested in developing questionnaires. In addition, web resources are posted on the SCOTL website.
_____

For further information on student ratings of teaching see: 1. Cashin, William (1995), “Student ratings of teaching: The research revisited.” Idea Paper, Number 32 (Kansas State University, Centre for Faculty Development) 2. See, for example, The Teaching Professor, Vol. 8, No. 4, 3-4 3. See also Theall, Michael and Franklin, Jennifer, Eds.(1990). Student Ratings of Instruction: Issues for Improving Practice, New Directions in Teaching and Learning, No. 43 (CA: Jossey-Bass Inc.).
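To illustrate the kind of cumulative record described above, the short sketch below averages responses to a few global rating questions across terms so that patterns of teaching development can be seen at a glance. It is only an illustration: the question labels, the 1-5 scale, and the numbers are invented for the example and are not taken from any particular evaluation form.

# Illustrative only: average 1-5 student ratings per question across terms
# so an instructor can watch for patterns over time (hypothetical data).
from statistics import mean

ratings = {
    "2010 Fall":   {"Overall effectiveness": [4, 5, 3, 4], "Clarity": [3, 4, 4, 3]},
    "2011 Spring": {"Overall effectiveness": [4, 5, 5, 4], "Clarity": [4, 4, 5, 4]},
}

for term, questions in ratings.items():
    summary = ", ".join(f"{q}: {mean(scores):.2f}" for q, scores in questions.items())
    print(f"{term} -> {summary}")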


3. PEER OBSERVATIONS

Peer observations offer critical insights into an instructor's performance, complementing student ratings and other forms of evaluation to contribute to a fuller and more accurate representation of overall teaching quality. Research indicates that colleagues are in the best position to judge specific dimensions of teaching quality, including the goals, content, design and organization of the course, the methods and materials used in delivery, and evaluation of student work.

To focus on:
§ Quality of the learning environment (labs, lecture halls, online discussion groups, seminars, studios, etc.)
§ Level of student engagement
§ Clarity of presentation, and ability to convey course content in a variety of ways
§ Range of instructional methods and how they support student understanding
§ Student-instructor rapport
§ Overall effectiveness

Peer observation may be carried out for both summative and formative purposes. For summative evaluation, it is recommended that prior consensus be reached about what constitutes quality teaching within the discipline, what the observers will be looking for, and the process for carrying out and recording the observations. To ensure that a full picture of an instructor's strengths and weaknesses is obtained, some observers find checklists useful, and some departments may choose to designate the responsibility of making classroom observations to a committee. Given the range of activities in a class, some observers find it helpful to focus on specific aspects of the teaching and learning that takes place. It is also advisable that more than one colleague be involved, and that more than one observation take place by each colleague. This will counteract observer bias towards a particular teaching approach and the possibility that an observation takes place on an unusually bad day. These precautions also provide for greater objectivity and reliability of the results.

Before an observation, it is important that the observer and instructor meet to discuss the instructor's teaching philosophy, the specific objectives and strategies that will be employed during the session to be observed, and the materials relevant to the course: syllabus, assignments, online course components, etc. Likewise, discussions of the criteria for evaluation and how the observations will take place can help to clarify expectations and procedures. A post-observation meeting allows an opportunity for constructive feedback and assistance in the development of a plan for improvement.

Peer observation is especially useful for formative evaluation. In this case, it is important that the results of the observations remain confidential and not be used for summative evaluation. The process of observation in this case should take place over time, allowing the instructor to implement changes, practice improvements and obtain feedback on whether progress has been made. It may also include video-taping the instructor's class. This process is particularly helpful to faculty who are experimenting with new teaching methods.

A particularly valuable form of observation for formative purposes is peer-pairing. With this technique, two instructors provide each other with feedback on their teaching on a rotating basis, each evaluating the other for a period of time (anywhere between 2 weeks and a full year). Each learns from the other and may learn as much in the observing role as when being observed. Full guidelines for using this technique, as well as advice and assistance in establishing a peer-pairing relationship, are available from the Centre for the Support of Teaching.

Benefits: Peer observations can complete the picture of an instructor's teaching obtained through other methods of evaluation. As well, observations are an important supplement to contextualize variations in student ratings in situations, for example, where an instructor's teaching is controversial because experimental or non-traditional teaching methods are being used, or where other unique situations exist within the learning environment. Colleagues are better able than students to comment upon the level of difficulty of the material, knowledge of subject matter and integration of topics, and they can place the teaching within a wider context and suggest alternative teaching formats and ways of communicating the material.

Limitations: There are several limitations to using peer observations for summative purposes. First, unless safeguards are put in place to control for sources of bias, conflicting definitions of teaching quality, and idiosyncrasies in practice, inequities can result in how classroom observations are done¹. For example, instructors tend to find observations threatening, and they and their students may behave differently when there is an observer present. Also, there is evidence to suggest that peers may be relatively generous evaluators in some instances. A second limitation is that it is costly in terms of faculty time, since a number of observations are necessary to ensure the reliability and validity of findings. Since observers vary in their definitions of quality teaching, and some tact is required in providing feedback on observations, it is desirable that observers receive training before becoming involved in providing formative evaluation. The approaches described above can help to minimize these inequities and improve the effectiveness of peer observation. Finally, to protect the integrity of this technique for both formative and summative purposes, it is critical that observations for personnel decisions be kept strictly separate from evaluations for teaching improvement.

______
For further information on colleague evaluation of teaching see:
1. DeZure, Deborah. "Evaluating teaching through peer classroom observation," in Peter Seldin and Associates (1999). Changing Practices in Evaluating Teaching: A Practical Guide to Improved Faculty Performance and Promotion/Tenure Decisions (MA: Anker Press).


4. LETTERS AND INDIVIDUAL INTERVIEWS

Letters and/or individual interviews may be used in teaching award nominations, tenure and promotion files, etc., to obtain greater depth of information for the purpose of improving teaching, or for providing details and examples of an instructor's impact on students.

To focus on:
§ Effectiveness of instructor through detailed reflection
§ Impact of instruction on student learning and motivation over the longer term
§ Preparation and organization
§ Clarity and understandability
§ Ability to establish rapport and encourage discussion
§ Sensitivity to and concern with students' level of understanding and progress

Benefits: Interviews and letters elicit information not readily available through student ratings or other forms of evaluation. Insights, success stories, and thoughtful analyses are often the outcomes of an interview or a request for written impressions of an instructor's teaching. Students who are reluctant to give information on a rating scale or in written form often respond well to a skilled, probing interviewer.

Limitations: The disadvantage of letters is that the response rate can be low. The major disadvantage of interviews is time. Interviews can take approximately one hour to conduct, about 30 minutes to arrange, and another block of time for coding and interpretation. A structured interview schedule should be used to eliminate the bias that may result when an untrained interviewer asks questions randomly of different students.

5. COURSE PORTFOLIOS

A course portfolio is a variant on the teaching dossier and is the product of focussed inquiry into the learning by students in a particular course. It represents the specific aims and work of the instructor and is structured to explain what, how and why students learn in a class. It generally comprises four main components: 1) a statement of the aims and pedagogical strategies of the course and the relationship between the method and learning outcomes; 2) an analysis of student learning based on key assignments and learning activities to advance course goals; 3) an analysis of student feedback based on classroom assessment techniques; and 4) a summary of the strengths of the course in terms of students' learning, and critical reflection on how the course goals were realised, changed or unmet. The final analysis leads to ideas about what to change in order to enhance student learning, thinking and development the next time the course is taught.¹

To focus on:
§ Appropriateness of course goals and objectives
§ Quality of instructional materials and assignments
§ Coherence of course organization, teaching strategies and modes of delivery
§ Comprehensiveness of methods for appraising student achievement
§ Level of student learning and contribution of teaching to students' progress
§ Innovations in teaching and learning

Course portfolios have been described as being closely analogous to a scholarly project, in that: "a course, like a project, begins with significant goals and intentions, which are enacted in appropriate ways and lead to relevant results in the form of student learning. Teaching, like a research project, is expected to shed light on the question at hand and the issues that shape it; the methods used to complete the project should be congruent with the outcomes sought. The course portfolio has the distinct advantage of representing – by encompassing and connecting planning, implementation and results – the intellectual integrity of teaching as reflected in a single course."²

Benefits: The focus on a specific course allows the portfolio to demonstrate student understanding as an index of successful teaching. For instructors, course portfolios provide a framework for critical reflection and continuous improvement of teaching, and deep insight into how their teaching contributes to students' knowledge and skills.


For departments, they can highlight cohesion and gaps within the curriculum and enable continuity within the course over time and as different instructional technologies are incorporated. As well, course portfolios can collectively promote course articulation and provide means of assessing the quality of a curriculum and pedagogical approaches in relation to the overall goals and outcomes of a program of study.

Limitations: Because course portfolios focus on one course, they do not reflect the full range of an instructor's accomplishments, responsibilities, and contributions (such as curriculum development and work with graduate students) that would be documented in a teaching dossier. Also, course portfolios take time to prepare and evaluate, and instructors should not be expected to build a portfolio for every course taught; rather, they should concentrate on those courses for which they have the strongest interest or in which they invest the majority of their energy, imagination and time.³

______
For further information on course portfolios see:
1. Cerbin, William (1994), "The course portfolio as a tool for continuous improvement of teaching and learning." Journal on Excellence in College Teaching, 5(1), 95-105.
2. Cambridge, Barbara. "The Teaching Initiative: The course portfolio and the teaching portfolio." American Association for Higher Education.
3. Cutler, William (1997). The history course portfolio. Perspectives 35 (8): 17-20.

6. CLASSROOM ASSESSMENT*

Classroom assessment is a method of inquiry into the effects of teaching on learning. It involves the use of techniques and instruments designed to give instructors ongoing feedback about the effect their teaching is having on the level and quality of student learning; this feedback then informs their subsequent instructional decisions. Unlike tests and quizzes, classroom assessment can be used in a timely way to help instructors identify gaps between what they teach and what students learn, and it enables them to adjust their teaching to make learning more efficient and effective. The information should always be shared with students to help them improve their own learning strategies and become more successful self-directed learners.

To focus on:
§ Effectiveness of teaching on learning
§ Constructive feedback on teaching strategies and classroom/online practices
§ Information on what students are learning and level of understanding of material
§ Quality of student learning and engagement
§ Feedback on course design

There are a variety of instruments for classroom assessment, used either in class or electronically, such as one-minute papers, one-sentence summaries, critical incident questionnaires, focus groups, and mid-year mini surveys (see page 8). Generally, the instruments are created and administered, and the results analysed, by the instructor, in order to focus on specific aspects of teaching and student learning. Although the instructor is not obligated to share the results of classroom assessment beyond the course, the results may usefully inform other strategies for evaluating teaching quality.

Classroom assessment can be integrated into an instructor's teaching in a graduated way, starting out with a simple assessment technique in one class involving five to ten minutes of class time, less than an hour for analysis of the results, and a few minutes during a subsequent class to let students know what was learned from the assessment and how the instructor and students can use that information to improve learning. After conducting one or two quick assessments, the instructor can decide whether this approach is worth further investment of time and energy.

Benefits: Classroom assessment encourages instructors to become monitors of their own performance and promotes reflective practice. In addition, its use can prompt discussion among colleagues about their effectiveness, and lead to new and better techniques for eliciting constructive feedback from students on teaching and learning.

Limitations: As with student ratings, the act of soliciting frank, in-the-moment feedback may elicit critical comments on the instructor and his/her approach to teaching. However, it is important to balance the positive and negative comments and to try to link negative commentary to issues of student learning. New users of classroom assessment techniques might find it helpful to discuss the critical comments with an experienced colleague.

______
Adapted from Core: York's newsletter on university teaching (2000), Vol. 9, No. 3.

* “Classroom Assessment” is a term used widely by scholars in higher education; it is meant to include all learning environments. For examples, see references on page 8.


A SAMPLING OF CLASSROOM ASSESSMENT TECHNIQUES

ONE-MINUTE PAPER

The One-Minute Paper, or brief reflection, is a technique that is used to provide instructors with feedback on what students are learning in a particular class. It may be introduced in small seminars or in large lectures, in first year courses or upper year courses, or electronically using software that ensures student anonymity. The One-Minute Paper asks students to respond anonymously to the following questions:

One-Minute Paper
1. What is the most important thing you learned today?
2. What question remains uppermost in your mind?

Depending upon the structure and format of the learning environment, the One-Minute Paper may be used in a variety of ways:
• During a lecture, to break up the period into smaller segments, enabling students to reflect on the material just covered.
• At the end of a class, to inform your planning for the next session.
• In a course comprising lectures and tutorials, the information gleaned can be passed along to tutorial leaders, giving them advance notice of issues that they may wish to explore with students.

THE MUDDIEST POINT

An adaptation of the One-Minute Paper, the Muddiest Point is particularly useful in gauging how well students understand the course material. The Muddiest Point asks students:

What was the 'muddiest point' for you today?

Like the One-Minute Paper, use of the Muddiest Point can helpfully inform your planning for the next session, and signal issues that it may be useful to explore.

ONE SENTENCE SUMMARIES

One Sentence Summaries can be used to find out how concisely, completely and creatively students can summarize a given topic within the grammatical constraints of a single sentence. It is also effective for helping students break down material into smaller units that are more easily recalled. This strategy is most effective for material that can be represented in declarative form: historical events, story lines, chemical reactions and mechanical processes.

The One Sentence Summary technique involves asking students to consider the topic you are discussing in terms of Who Does/Did What to Whom, How, When, Where and Why, and then to synthesize those answers into a single informative, grammatical sentence. These sentences can then be analyzed to determine strengths and weaknesses in the students' understanding of the topic, or to pinpoint specific elements of the topic that require further elaboration. Before using this strategy it is important to make sure the topic can be summarized coherently. It is best to impose the technique on oneself first to determine its appropriateness or feasibility for given material.

CRITICAL INCIDENT QUESTIONNAIRES

The Critical Incident Questionnaire is a simple assessment technique that can be used to find out what and how students are learning, and to identify areas where adjustments are necessary (e.g., the pace of the course, or confusion with respect to assignments or expectations). On a single sheet of paper, students are asked five questions which focus on critical moments for learning in a course. The questionnaire is handed out about ten minutes before the final session of the week.

Critical Incident Questionnaire
1. At what moment this week were you most engaged as a learner?
2. At what moment this week were you most distanced as a learner?
3. What action or contribution taken this week by anyone in the course did you find most affirming or helpful?
4. What action or contribution taken this week by anyone in the course did you find most puzzling or confusing?
5. What surprised you most about the course this week?

Critical Incident Questionnaires provide substantive feedback on student engagement and may also reveal power dynamics in the classroom that may not initially be evident to the instructor. For further information on Critical Incident Questionnaires see Brookfield, S. J. and Preskill, S. (1999) Discussion as a Way of Teaching: Tools and Techniques for a Democratic Classroom (CA: Jossey-Bass), page 49.

For further information on these and other classroom assessment strategies see:
Cross, K. P. and Angelo, T. A., Eds. (1988) Classroom Assessment Techniques: A Handbook for Faculty (MI: National Center for Research to Improve Post-Secondary Teaching and Learning).
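Tallying the anonymous responses is the time-consuming part of these techniques. As a rough sketch (the topic keywords and sample answers below are made up for the example), a few lines of Python can count how often each topic shows up in Muddiest Point slips so that the next class can begin with the most frequent one.

# Rough sketch: count which course topics appear most often in
# "muddiest point" responses. Keywords and answers are hypothetical.
from collections import Counter

responses = [
    "the difference between formative and summative evaluation",
    "how to write a rubric",
    "summative evaluation again",
    "rubric criteria",
]
keywords = ["formative", "summative", "rubric", "blueprint"]

tally = Counter()
for answer in responses:
    for word in keywords:
        if word in answer.lower():
            tally[word] += 1

for topic, count in tally.most_common():
    print(f"{topic}: mentioned {count} time(s)")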



UNIVERSIDAD MARIANO GALVEZ
FACULTAD DE HUMANIDADES
ESCUELA DE IDIOMAS
FORMAL ASSESSMENT

INSTRUCTOR: Rosa Tobar, Ingrid Quill y Susana Vásquez
DATE: July 30th, 2011
COURSE TITLE: Science
UNIT: 4
LESSON NUMBER:
SPECIFIC TOPIC: Recycle

INSTRUCTIONAL GOAL
Cognitive: Knows about the importance of taking care of the environment.
Affective: Differentiates with self-reliance the concept of the three "Rs".
Psychomotor: Makes an action plan for his/her school.

PERFORMANCE OBJECTIVE
Learns the meaning of Reduce, Reuse, and Recycle.

RATIONALE
Students need to know how to take care of our environment through campaigns and action plans at school.

LESSON CONTENT
Recycle. Ecology vocabulary: litter, throw, garbage, recycle, save, waste, keep, organic, non-organic, pollute.

INSTRUCTIONAL PROCEDURES (TIME)
a. Focusing event: Write the words metal, glass, and paper on the board. In 3 minutes, students should write other words using the same letters. (5 min.)
b. Teaching procedures: Scramble the title "Recycle". Give a brief explanation of the topic. List some words related to recycling so that students increase their vocabulary. Explain the meaning of Reduce, Reuse, and Recycle. (10 min.)
c. Formative check: Students cut out objects from magazines and place them in the correct recycle bin. (5 min.)
d. Student participation (activity 1): Students work in groups; with the colored paper strips provided, they write suggestions for recycling different materials. (10 min.)
e. Closure: Write the word Styrofoam on the board and explain that many things are made of it and that it is very dangerous when it gets into the ocean. Elicit ideas about how we can reduce the amount of this material. (5 min.)

EVALUATION PROCEDURES
Activity 2: In groups, make an Eco-Action project for the school; plan a recycling campaign and explain the plan to the rest of the class.

MATERIALS AND AIDS
Magazines, boxes (recycle bins), colored strips, markers, construction paper.


UNIVERSIDAD MARIANO GALVEZ
FACULTAD DE HUMANIDADES
ESCUELA DE IDIOMAS
INFORMAL ASSESSMENT

INSTRUCTOR: Rosa Tobar, Ingrid Quill y Susana Vásquez
DATE: July 30th, 2011
COURSE TITLE: Science
UNIT: 4
LESSON NUMBER:
SPECIFIC TOPIC: Recycle

INSTRUCTIONAL GOAL
Cognitive: Knows about the importance of taking care of the environment.
Affective: Differentiates with self-reliance the concept of the three "Rs".
Psychomotor: Makes different functional objects with recycled material.

PERFORMANCE OBJECTIVE
Learns the meaning of Reduce, Reuse, and Recycle.

RATIONALE
Students take advantage of different recycled materials to make useful objects with them.

LESSON CONTENT
Recycle. Ecology vocabulary: litter, throw, garbage, recycle, save, waste, keep, organic, non-organic, pollute.

INSTRUCTIONAL PROCEDURES (TIME)
a. Focusing event: Show different items made of recycled materials. (5 min.)
b. Teaching procedures: After showing items made of recycled materials, explain how to make some useful objects. (10 min.)
c. Formative check: Students make a list of materials that can be reused.
d. Student participation (activity 1): With an egg carton, students make an "Eco-Action carton": in each compartment of the carton they paste circles of paper with actions that can help them recycle at home or at school, e.g. "Reuse plastic bags when you go shopping." (Six or more compartments can be used.) (15 min.)
e. Closure: Using a poster, students give advice on stopping the bad habits that humans have which contribute to the excessive contamination of the world. (5 min.)

EVALUATION PROCEDURES
Activity 2: Make a pencil holder with toilet paper tubes and decorate it creatively.

MATERIALS AND AIDS
Egg carton, paint, paper, markers, glue, toilet paper tubes.


Have you seen this symbol before?


It's called a Möbius loop and is the universal symbol for recycling around the world.


What is Recycling? Recycling is the process by which items that would otherwise be solid waste are collected, separated, and returned to the economic mainstream to be reused in the form of raw materials and finished goods.


What items are recyclable?

• Paper
• Plastic
• Glass
• Aluminum


Paper
What paper can be recycled?
• Office paper
• Cardboard
• Boxes


Aluminum
What aluminum can be recycled?
• Soda cans
• Soup cans


Glass
What glass can be recycled?
• Clear glass bottles
• Green glass bottles
• Amber glass bottles


Plastics
What plastic materials can be recycled?
• Soft drink bottles
• Milk jugs
• Water bottles
• Soap containers


Recycling saves energy, reduces pollution, saves natural resources, and reduces greenhouse gas emissions to help our planet's future.


Universidad Mariano Gálvez de Guatemala Profesorado en Enseñanza Media en el Idioma Inglés Science I Eighth Grade Miss Susana, Ingrid, Raquel, Jenny and Rosa

Name: ______________________ ______________________ ______________________ ______________________ ______________________

Date: ___________________

Recycle Bins

Objective: Reinforce and practice what you know about Reduce, Reuse and Recycle. By working on this project we are also creating new habits in other students and in ourselves.

Materials (per group):
• 4 boxes (about the same size)
• Papers of these colors
• Markers

Procedure:
1. Each group will be in charge of elaborating 4 boxes; each one will be wrapped and named.
2. You will be given pictures of different kinds of materials, like cans, plastic bottles and so on; you are going to use them in your show and tell.
3. Each group will be assigned a different class; in this class you will explain the use of the boxes and share some information with the students.
4. Each group only has 5 minutes to present and to explain the material.
5. The teacher that is in charge of the class will evaluate the group that is presenting.


Universidad Mariano Gálvez de Guatemala Profesorado en Enseñanza Media en el Idioma Inglés Science I Eighth Grade Miss Susana, Ingrid, Raquel, Jenny and Rosa

Name: ______________________ ______________________ ______________________ ______________________ ______________________

Date: _____________________

Recycling Campaign

Objective: Create new habits in other students and in ourselves.

Procedure:
1. In groups, you will gather the information that you think is the most important about Reduce, Reuse and Recycle. You have to use the information that your teacher already gave you; you cannot add more information.
2. Prepare a poster with some important facts about recycling.
3. Prepare a 5-minute oral presentation.
4. Include your boxes and the example materials.
5. In class you will be evaluated individually, according to your contributions to your group.
6. In the class where you present your information, the teacher in charge will evaluate you by group.
7. All the group members must participate.
8. You cannot spend more than five minutes in the class; the teacher in charge is allowed to cut your presentation if needed.
9. You must present to your teacher a written version of the information you chose for your presentation; you also have to leave a copy in the class where you work.


Rubric
Universidad Mariano Galvez de Guatemala
Science I - Eighth Grade

Recycling Campaign

Name: _____________________________________  Key: ____________

Scoring key: 5 excellent | 4 good | 3 needs to improve | 2 poor | 1 bad

Description (score each criterion from 1 to 5):
1. Originality
2. All the information was clearly explained
3. Good management of visual aids and other material
4. Tone of voice, fluency and grammar structures
5. Use of time

Total:

Comments:
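Since each of the five descriptors above is scored on the same 1-5 key, the total is simply a sum out of 25. The sketch below, with invented scores, shows how the total and a percentage could be computed, for instance when transferring the rubric to a spreadsheet or grade book.

# Minimal sketch: sum the five rubric criteria (each scored 1-5) and
# convert the total to a percentage. The scores below are invented.
criteria = {
    "Originality": 4,
    "Information clearly explained": 5,
    "Management of visual aids and material": 3,
    "Tone of voice, fluency and grammar": 4,
    "Use of time": 5,
}

total = sum(criteria.values())
maximum = 5 * len(criteria)
print(f"Total: {total}/{maximum}  ({100 * total / maximum:.0f}%)")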


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA
TECNICAS DE EVALUACION 2011

1. Reflects on evidence of student learning
INEFFECTIVE: Teacher does not examine and/or analyze formal and informal evidence of student learning to inform professional growth.
DEVELOPING: Teacher occasionally examines and/or analyzes formal and informal evidence of student learning; professional growth is only loosely aligned with the needs of students.
EFFECTIVE: Teacher regularly examines and analyzes formal and informal evidence of student learning; professional growth is aligned with the needs of students.
HIGHLY EFFECTIVE: Teacher engages in an ongoing examination and analysis of formal and informal evidence of student learning; professional growth is aligned with the needs of students. The teacher reviews the impact of professional learning on student achievement.

2. Communicates purposes and criteria
INEFFECTIVE: Teacher does not communicate purposes of assessments, the assessment criteria or the parameters for success to students.
DEVELOPING: Teacher communicates purposes of assessments, the assessment criteria or the parameters for success to students, but for some students the explanation is unclear.
EFFECTIVE: Teacher communicates purposes of assessments, the assessment criteria, and the parameters for success, and the explanation is clear to most students.
HIGHLY EFFECTIVE: Teacher communicates purposes of assessments, the assessment criteria, and the parameters for success clearly to all students. Students are able to explain purposes and criteria to others.

3. Provides preparation and practice
INEFFECTIVE: Teacher does not prepare students for assessment formats using authentic curriculum and/or does not appropriately modify assessments or testing conditions for students with exceptional learning needs. Teacher rarely seeks out specialists to ensure modifications meet individual student needs.
DEVELOPING: With limited success, teacher prepares students for assessment formats using authentic curriculum and modifies assessments and/or testing conditions for students with exceptional learning needs. Teacher occasionally seeks out specialists to ensure modifications meet individual student needs.
EFFECTIVE: Teacher prepares students for assessment formats using authentic curriculum and appropriately modifies assessments and/or testing conditions for students with exceptional learning needs. Teacher frequently seeks out specialists to ensure modifications meet individual student needs.
HIGHLY EFFECTIVE: Teacher prepares students for assessment formats using authentic curriculum and appropriately modifies assessments and/or testing conditions for students with exceptional learning needs. Teacher consistently seeks out specialists/resources to ensure modifications meet individual student needs.

4. Provides assessment skills and strategies
INEFFECTIVE: Teacher does not equip students with assessment skills and/or strategies.
DEVELOPING: Teacher equips students with some assessment skills and/or strategies. Some students apply the skills and/or strategies when coached by the teacher.
EFFECTIVE: Teacher equips students with several assessment skills and strategies. Students apply the skills and strategies when coached by the teacher.
HIGHLY EFFECTIVE: Teacher equips students with multiple assessment skills and strategies. Students independently apply the skills and strategies.

5. Designs instruction using current levels of student understanding
INEFFECTIVE: Teacher does not use students' responses to questions, discussion or other work, nor considers possible misconceptions when planning instruction.
DEVELOPING: Teacher uses students' responses to questions, discussion or other work, and may or may not consider common misconceptions when planning instruction.
EFFECTIVE: Teacher uses students' responses to questions, discussion, and other work, and considers common misconceptions when planning instruction.
HIGHLY EFFECTIVE: Teacher uses individual students' responses to questions, discussion, and other work, and routinely considers common misconceptions when planning instruction.

6. Designs learning experiences using prior knowledge
INEFFECTIVE: Teacher does not design learning experiences that connect students' prior content knowledge to new learning.
DEVELOPING: Teacher designs some learning experiences that connect prior content knowledge to new learning.
EFFECTIVE: Teacher designs learning experiences that connect prior content knowledge to new learning within and across disciplines.
HIGHLY EFFECTIVE: Teacher designs learning experiences that connect prior content knowledge to new learning. Teacher plans opportunities for students themselves to make connections to prior learning within and across disciplines.

7. Designs instruction to meet diverse learning needs of students
INEFFECTIVE: Teacher does not use a range of instructional strategies to design learning experiences that reflect the experiences, strengths, and learning needs of students.
DEVELOPING: Teacher uses few differentiated instructional strategies to design learning experiences that reflect the experiences, strengths, and learning needs of students. Teacher plans an alternate strategy to adapt instruction if needed.
EFFECTIVE: Teacher uses several differentiated instructional strategies to design learning experiences that reflect the experiences, strengths, and learning needs of students, with some differentiation for different groups of students and awareness of 21st Century Skills. Teacher plans several alternate strategies to adapt instruction as needed.
HIGHLY EFFECTIVE: Teacher uses several differentiated instructional strategies to design learning experiences that reflect the experiences, strengths, and learning needs of all students. Instruction is differentiated, as appropriate, for individual learners and incorporates 21st Century Skills. Teacher plans alternate strategies to adapt instruction in anticipation of various levels of student understanding.

8. Plans for student strengths, interests, and experiences
INEFFECTIVE: Teacher does not plan instruction to address the strengths, interests, and experiences of students.
DEVELOPING: Teacher plans instruction to address the strengths, interests, and experiences of some students.
EFFECTIVE: Teacher plans instruction to address the strengths, interests, and experiences of most students.
HIGHLY EFFECTIVE: Teacher plans instruction to address the strengths, interests, and experiences of each student and is able to adapt the lesson as needed.

9. Gives and receives constructive feedback
INEFFECTIVE: Teacher does not give or receive constructive feedback to improve professional practice.
DEVELOPING: Teacher inconsistently gives or receives constructive feedback to improve professional practice.
EFFECTIVE: Teacher regularly gives, receives and acts upon constructive feedback to improve professional practice. Feedback to colleagues is conveyed in a professional and supportive manner.
HIGHLY EFFECTIVE: Teacher regularly gives, receives, and reflects upon constructive feedback to improve professional practice. Feedback to colleagues is conveyed in a professional and supportive manner. Teacher encourages and engages in peer assessment to improve professional practice.

----------------------------------------------------------------------
GROUP ASSESSING: ______________    GROUP BEING ASSESSED: _________

COMMENTS:

SCORE: ___________________________
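The form above records a single SCORE for the group being assessed. One way to arrive at it, which is an assumption made for this sketch rather than part of the original instrument, is to convert each indicator's level to points and average them:

# Hedged sketch: convert the level chosen for each of the nine indicators
# to points and average them. The 1-4 point mapping is an assumption made
# here for illustration; the original form does not prescribe one.
LEVEL_POINTS = {"INEFFECTIVE": 1, "DEVELOPING": 2, "EFFECTIVE": 3, "HIGHLY EFFECTIVE": 4}

observed_levels = {
    "Reflects on evidence of student learning": "EFFECTIVE",
    "Communicates purposes and criteria": "HIGHLY EFFECTIVE",
    "Provides preparation and practice": "DEVELOPING",
    # ... the remaining indicators would be filled in the same way
}

points = [LEVEL_POINTS[level] for level in observed_levels.values()]
print(f"Average score: {sum(points) / len(points):.2f} out of 4")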


OBJECTIVES vs. COMPETENCES (Objetivos vs. Competencias)



















Classwork
• Create a Venn diagram establishing the differences and similarities between objectives and competences.


Revised Bloom's Taxonomy

Revised Bloom's Taxonomy (RBT) employs 25 verbs that create collegial understanding of student behavior and learning outcomes. Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm

Bloom's Revised Taxonomy
• Taxonomy of Cognitive Objectives
• 1950s: developed by Benjamin Bloom
• Means of expressing qualitatively different kinds of thinking
• Has been adapted for classroom use as a planning tool
• Continues to be one of the most universally applied models
• Provides a way to organise thinking skills into six levels, from the most basic to the more complex levels of thinking
• 1990s: Lorin Anderson (a former student of Bloom) revisited the taxonomy
• As a result, a number of changes were made (Pohl, 2000, Learning to Think, Thinking to Learn, pp. 7-8)


Original Terms        New Terms
Evaluation            Creating
Synthesis             Evaluating
Analysis              Analysing
Application           Applying
Comprehension         Understanding
Knowledge             Remembering

(Based on Pohl, 2000, Learning to Think, Thinking to Learn, p. 8)

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm

Change in Terms
• The names of the six major categories were changed from noun to verb forms.
• As the taxonomy reflects different forms of thinking, and thinking is an active process, verbs were used rather than nouns.
• The subcategories of the six major categories were also replaced by verbs, and some subcategories were reorganised.
• The knowledge category was renamed. Knowledge is an outcome or product of thinking, not a form of thinking per se. Consequently, the word knowledge was inappropriate to describe a category of thinking and was replaced with the word remembering instead.
• Comprehension and synthesis were retitled to understanding and creating respectively, in order to better reflect the nature of the thinking defined in each category.
http://rite.ed.qut.edu.au/oz-teachernet/training/bloom.html


BLOOM'S REVISED TAXONOMY
Creating: Generating new ideas, products, or ways of viewing things (designing, constructing, planning, producing, inventing).
Evaluating: Justifying a decision or course of action (checking, hypothesising, critiquing, experimenting, judging).
Analysing: Breaking information into parts to explore understandings and relationships (comparing, organizing, deconstructing, interrogating, finding).
Applying: Using information in another familiar situation (implementing, carrying out, using, executing).
Understanding: Explaining ideas or concepts (interpreting, summarising, paraphrasing, classifying, explaining).
Remembering: Recalling information (recognizing, listing, describing, retrieving, naming, finding).

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm
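When writing objectives or test items, it can help to keep the six levels and their typical verbs in a small lookup structure. The sketch below mirrors the verbs in the table above; the tagging function itself is only an illustration, not part of the taxonomy.

# Sketch: map each Revised Bloom's Taxonomy level to sample verbs (taken
# from the table above) and guess the level of an objective from its verb.
RBT_VERBS = {
    "Remembering": ["recognize", "list", "describe", "retrieve", "name", "find"],
    "Understanding": ["interpret", "summarize", "paraphrase", "classify", "explain"],
    "Applying": ["implement", "carry out", "use", "execute"],
    "Analysing": ["compare", "organize", "deconstruct", "interrogate"],
    "Evaluating": ["check", "hypothesize", "critique", "experiment", "judge"],
    "Creating": ["design", "construct", "plan", "produce", "invent"],
}

def guess_level(objective: str) -> str:
    """Return the first RBT level whose sample verbs appear in the objective."""
    text = objective.lower()
    for level, verbs in RBT_VERBS.items():
        if any(verb in text for verb in verbs):
            return level
    return "Unclassified"

print(guess_level("Design a recycling campaign poster"))  # -> Creating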


The Cognitive Process Dimension

Level 1 - C1: Remember
Definition: Retrieve knowledge from long-term memory.
• Recognizing (identifying): Locating knowledge in long-term memory that is consistent with presented material.
• Recalling (retrieving): Retrieving relevant knowledge from long-term memory.

Level 2 - C2: Understand
Definition: Construct meaning from instructional messages, including oral, written, and graphic communication.
• Interpreting (clarifying, paraphrasing, representing, translating): Changing from one form of representation to another.
• Exemplifying (illustrating, instantiating): Finding a specific example or illustration of a concept or principle.
• Classifying (categorizing, subsuming): Determining that something belongs to a category.
• Summarizing (abstracting, generalizing): Abstracting a general theme or major point(s).
• Inferring (concluding, extrapolating, interpolating, predicting): Drawing a logical conclusion from presented information.
• Comparing (contrasting, mapping, matching): Detecting correspondences between two ideas, objects, and the like.
• Explaining (constructing models): Constructing a cause-and-effect model of a system.

Anderson, Lorin W. & Krathwohl, David R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy. New York: Longman Publishing.


Level 3 - C3: Apply
Definition: Carry out or use a procedure in a given situation.
• Executing (carrying out): Applying a procedure to a familiar task.
• Implementing (using): Applying a procedure to an unfamiliar task.

Level 4 - C4: Analyze
Definition: Break material into its constituent parts and determine how the parts relate to one another and to an overall structure or purpose.
• Differentiating (discriminating, distinguishing, focusing, selecting): Distinguishing relevant from irrelevant parts, or important from unimportant parts, of presented material.
• Organizing (finding coherence, integrating, outlining, parsing, structuring): Determining how elements fit or function within a structure.
• Attributing (deconstructing): Determining a point of view, bias, values, or intent underlying presented material.

Level 5 - C5: Evaluate
Definition: Make judgments based on criteria and standards.
• Checking (coordinating, detecting, monitoring, testing): Detecting inconsistencies or fallacies within a process or product; determining whether a process or product has internal consistency; detecting the effectiveness of a procedure as it is being implemented.
• Critiquing (judging): Detecting inconsistencies between a product and external criteria; determining whether a product has external consistency; detecting the appropriateness of a procedure for a given problem.

Anderson, Lorin W. & Krathwohl, David R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy. New York: Longman Publishing.


Level 6 - C6: Create
Definition: Put elements together to form a coherent or functional whole; reorganize elements into a new pattern or structure.
• Generating (hypothesizing): Coming up with alternative hypotheses based on criteria.
• Planning (designing): Devising a procedure for accomplishing some task.
• Producing (constructing): Inventing a product.

The Knowledge Dimension
• Factual Knowledge: The basic elements students must know to be acquainted with a discipline or solve problems in it.
• Conceptual Knowledge: The interrelationships among the basic elements within a larger structure that enable them to function together.
• Procedural Knowledge: How to do something; methods of inquiry; and criteria for using skills, algorithms, techniques, and methods.
• Metacognitive Knowledge: Knowledge of cognition in general, as well as awareness and knowledge of one's own cognition.

Anderson, Lorin W. & Krathwohl, David R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy. New York: Longman Publishing.
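Crossing the six cognitive levels with the four knowledge dimensions gives the two-way grid that a test blueprint is usually built on. The sketch below tallies planned test items into that grid; the items and their classifications are invented for the example.

# Sketch of a test blueprint grid: count planned test items by
# (knowledge dimension, cognitive level). The items are hypothetical.
COGNITIVE = ["C1 Remember", "C2 Understand", "C3 Apply", "C4 Analyze", "C5 Evaluate", "C6 Create"]
KNOWLEDGE = ["Factual", "Conceptual", "Procedural", "Metacognitive"]

items = [
    ("Factual", "C1 Remember"),
    ("Conceptual", "C2 Understand"),
    ("Procedural", "C3 Apply"),
    ("Conceptual", "C5 Evaluate"),
]

grid = {k: {c: 0 for c in COGNITIVE} for k in KNOWLEDGE}
for knowledge, level in items:
    grid[knowledge][level] += 1

print("Columns: " + " | ".join(COGNITIVE))
for knowledge in KNOWLEDGE:
    row = "  ".join(str(grid[knowledge][c]) for c in COGNITIVE)
    print(f"{knowledge:<13} {row}")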


Potential Activities and Products

Remembering: Potential Activities and Products
• Make a list of the main events of the story.
• Make a time line of events.
• Make a facts chart.
• Write a list of any pieces of information you can remember.
• What animals were in the story?
• Make a chart showing…
• Make an acrostic.
• Recite a poem.

Understanding: Potential Activities and Products
• Cut out, or draw pictures to show a particular event.
• Illustrate what you think the main idea was.
• Make a cartoon strip showing the sequence of events.
• Write and perform a play based on the story.
• Retell the story in your own words.
• Write a summary report of the event.
• Prepare a flow chart to illustrate the sequence of events.
• Make a coloring book.

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Applying: Potential Activities and Products
• Construct a model to demonstrate how it works.
• Make a diorama to illustrate an event.
• Make a scrapbook about the areas of study.
• Make a papier-mache map / clay model to include relevant information about an event.
• Take a collection of photographs to demonstrate a particular point.
• Make up a puzzle or a game about the topic.
• Write a textbook about this topic for others.
• Dress a doll in national costume.
• Make a clay model.
• Paint a mural using the same materials.
• Design a marketing strategy for your product using a known strategy as a model.

Analyzing: Potential Activities and Products
• Design a questionnaire to gather information.
• Write a commercial to sell a new product.
• Make a flow chart to show the critical stages.
• Construct a graph to illustrate selected information.
• Make a family tree showing relationships.
• Devise a play about the study area.
• Write a biography of a person studied.
• Prepare a report about the area of study.
• Conduct an investigation to produce information to support a view.
• Review a work of art in terms of form, color and texture.

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Evaluating: Potential Activities and Products
• Prepare a list of criteria to judge…
• Conduct a debate about an issue of special interest.
• Make a booklet about five rules you see as important. Convince others.
• Form a panel to discuss views.
• Write a letter to ... advising on changes needed.
• Write a half-yearly report.
• Prepare a case to present your view about...

Creating: Potential Activities and Products
• Invent a machine to do a specific task.
• Design a building to house your study.
• Create a new product. Give it a name and plan a marketing campaign.
• Write about your feelings in relation to...
• Write a TV show, play, puppet show, role play, song or pantomime about...
• Design a record, book or magazine cover for...
• Sell an idea.
• Devise a way to...
• Make up a new language and use it in an example.

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Assessment

Questions for Remembering
• What happened after...?
• How many...?
• What is...?
• Who was it that...?
• Can you name...?
• Find the meaning of…
• Describe what happened after…
• Who spoke to...?
• Which is true or false...?
(Pohl, Learning to Think, Thinking to Learn, p. 12)

Questions for Understanding
• Can you write in your own words?
• How would you explain…?
• Can you write a brief outline...?
• What do you think could have happened next...?
• Who do you think...?
• What was the main idea...?
• Can you clarify…?
• Can you illustrate…?
• Does everyone act in the way that …….. does?
(Pohl, Learning to Think, Thinking to Learn, p. 12)

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Questions for Applying
• Do you know of another instance where…?
• Can you group by characteristics such as…?
• Which factors would you change if…?
• What questions would you ask of…?
• From the information given, can you develop a set of instructions about…?
(Pohl, Learning to Think, Thinking to Learn, p. 13)

Questions for Analysing
• Which events could not have happened?
• If ... happened, what might the ending have been?
• How is ... similar to ...?
• What do you see as other possible outcomes?
• Why did ... changes occur?
• Can you explain what must have happened when...?
• What are some of the problems of...?
• Can you distinguish between...?
• What were some of the motives behind...?
• What was the turning point?
• What was the problem with...?
(Pohl, Learning to Think, Thinking to Learn, p. 13)

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Questions for Evaluating
• Is there a better solution to...?
• Judge the value of...
• What do you think about...?
• Can you defend your position about...?
• Do you think ... is a good or bad thing?
• How would you have handled...?
• What changes to ... would you recommend?
• Do you believe...?
• How would you feel if...?
• How effective are...?
• What are the consequences of...?
• What influence will ... have on our lives?
• What are the pros and cons of...?
• Why is ... of value?
• What are the alternatives?
• Who will gain & who will lose?
(Pohl, Learning to Think, Thinking to Learn, p. 14)

Questions for Creating
• Can you design a ... to ...?
• Can you see a possible solution to...?
• If you had access to all resources, how would you deal with...?
• Why don't you devise your own way to...?
• What would happen if...?
• How many ways can you...?
• Can you create new and unusual uses for...?
• Can you develop a proposal which would...?
(Pohl, Learning to Think, Thinking to Learn, p. 14)

Retrieved from: http://www.kurwongbss.qld.edu.au/thinking/Bloom/blooms.htm


Blue Print (test blueprint template)

Header fields: SCHOOL NAME, Name, GRADE, SECTION, LEVEL, SUBJECT, TEACHER.

For each content listed under CONTENTS, the blueprint records, at every cognitive level (REMEMBERING, UNDERSTANDING, APPLYING, ANALYSING, EVALUATING, CREATING), the number of items, the value per item, and the series they belong to. The sheet then totals the items per level (LEVEL ITEMS), the points per series (SERIES POINTS), and the totals per content, with a grand TOTAL of 100 points.


Universidad Mariano Galvez de Guatemala Facultad de Humanidades Escuela de Idiomas Lda. Evelyn R. Quiroa

September 3rd 2011 Blue Print

In this class we learned about the blueprint and how it makes the evaluation process of a test more effective. It lets us distribute the series in a better way, depending on the areas we taught in each lesson or unit. I really liked it, but we have to be careful with the formula we use to calculate the values, because in the last class I made a little mistake.

Student Name: Ma. Raquel Morales Pérez ID number: 07603220
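To make the blueprint arithmetic concrete, here is a minimal Python sketch. The level names follow the blueprint template above, but the item counts and the equal points-per-item rule are my own illustrative assumptions, not values from the course.

# Minimal sketch (assumed numbers): distribute a 100-point exam across Bloom levels
# in proportion to how many items each level contributes, as a test blueprint does.

TOTAL_POINTS = 100

# Hypothetical item counts per level; replace with the counts from your own blueprint.
items_per_level = {
    "Remembering": 10,
    "Understanding": 10,
    "Applying": 8,
    "Analysing": 6,
    "Evaluating": 4,
    "Creating": 2,
}

total_items = sum(items_per_level.values())
value_per_item = TOTAL_POINTS / total_items  # every item worth the same number of points

for level, items in items_per_level.items():
    level_points = items * value_per_item
    print(f"{level:13s} {items:2d} items x {value_per_item:.2f} = {level_points:.2f} points")

print(f"{'TOTAL':13s} {total_items:2d} items = {TOTAL_POINTS} points")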


September 2011 Monthly Planner (template)

Each weekday of the month (September 1-2, 5-9, 12-16, 19-23, and 26-30) provides blank fields for Topic, Objective, and Assessment Activity; Saturdays and Sundays are left empty.


October 2011 Monthly Planner (template)

Each weekday of the month (October 3-7, 10-14, 17-21, and 24-28) provides blank fields for Topic, Objective, and Assessment Activity; Saturdays and Sundays are left empty.




What is an essay question?
It is a test item which requires a response composed by the examinee, usually in the form of one or more sentences or paragraphs, about a presented situation or required questions.


Contained elements
• Requires examinees to compose rather than select their response.
• Elicits student responses that must consist of more than one sentence.
• Allows different or original responses or patterns of responses.
• Requires subjective judgment by a competent specialist to judge the accuracy and quality of responses.


Essay question sample
Education comes not from books but from practical experience. Write a unified essay in which you perform the following tasks. Explain what you think the above statement means. Describe a specific situation in which books might educate students better than practical experience. Discuss what you think determines when practical experience provides a better education than books do.


SAMPLE ESSAY
The statement means that at times practical experience can be a better method of education than pure classroom work. The use of books to present abstract ideas is one kind of education that is better to learn from books than practical experience. Take math, for instance. Math concepts are best learned from books rather than practical experience. Also, history is best learned in the classroom since a person can't physically go back in time and watch a war. Certain professions are learned on the job, like carpentry and plumbing. Practical experience is the primary method of education. It is the same way for surgery. You wouldn't want someone to take out your appendix unless they had practiced this procedure many times on someone else.


Essays are essential for the development and evaluation of students' skills: writing and reading skills, analytical and critical thinking skills, research skills, and the ability to write under time pressure. All these skills are assessed in the entire essay writing process.


Essay writing is a good way to internalize the facts that have been heard or read. The writing activity stimulates the intellect, leads to intellectual development, and is also a healthy way to improve writing skills. Essays are used to judge the mastery and comprehension of material in fields such as the humanities and social sciences, and also in non-literary areas such as visual arts, music, film, and photography. Essay writing, though, is not very easy or simple; it needs practice, and students have to learn the rules of essay writing and how to collect material.


A student must have a critical and analytical mind that is keen to research any specified essay topic. Benefits include: knowledge internalization, intellectual development, feedback generation, and good practice.



Reaction: How the learners react to the learning process.
Learning: The extent to which the learners gain knowledge and skills.
Behavior: Capability to perform the learned skills while on the job.
Results: Includes such items as monetary results, efficiency, morale, etc.


Elements of an essay that can be assessed through an evaluation
Content: Evidence of the use of appropriate material. There should be transition sentences linking the paragraphs.
Idea: It directly answers the question of the essay.
Organization: Of the material into a coherent structure: introduction, argument and evidence, conclusion.
Form: Clear style, including accurate spelling, clear sentence construction and punctuation.
Language: Avoidance of inappropriate slang, racist or sexist language.
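As a rough illustration of how these five elements could be combined into a single grade, the sketch below applies hypothetical weights and a 0-4 rubric scale; neither the weights nor the scale come from the course material.

# Minimal sketch (assumed weights and scale): combine rubric ratings for the five
# essay elements into one weighted score out of 100.

# Hypothetical weights; the course material lists the elements but not their weights.
weights = {"content": 0.30, "idea": 0.25, "organization": 0.20, "form": 0.15, "language": 0.10}

def essay_score(ratings, scale_max=4):
    """ratings: dict mapping each element to a rating on a 0..scale_max rubric scale."""
    weighted = sum(weights[element] * ratings[element] / scale_max for element in weights)
    return round(100 * weighted, 1)

# Example: a strong essay with weak punctuation/spelling (form).
print(essay_score({"content": 4, "idea": 4, "organization": 3, "form": 2, "language": 4}))  # 87.5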


Type of essay to test a specific topic and what it consists of.
Extended response: These answers have three parts:
a) Beginning: The first paragraph introduces your main idea or position. Often it simply restates the question.
b) Middle: The second paragraph provides information, examples and details to support your main idea or position.
c) Ending: The final paragraph sums up your main idea or position. It restates your topic sentence, this time with more feeling.
Restricted response: Relates more directly to a specific objective and generally assesses lower-complexity outcomes.


Process Essay: A writing project in which we describe how to do something or tell how something happens.
Classification Essay: It is supposed to organize or sort the arguments of the essay into categories. An important thing to mention is that the categories have to fully resemble the essence of the essay's topic.
Comparison Essay: This type is focused on the comparison of things, people, facts or events. Its basic purpose is to find the most significant similarities or differences between specific things, facts, events, etc. The less significant points are described first, and the most significant point is revealed toward the conclusion.
Problem Solving Essay: It is a kind of academic paper that describes one or more problems and provides possible variants for their solutions. It is crucial to remember that a problem solving paper cannot be written without one definite solution. You cannot just dwell on the problems for pages without providing your ideas for solving them.


Advantages
• Assess complex learning outcomes and thought processes
• Test writing skills
• Promote original thinking
• Stimulate increased studying
• Simulate realistic tasks
• Relatively simple to construct
• Encourage the organization of knowledge, integration of theories and expression of opinions

Disadvantages
• Time consuming to answer
• Scoring is more subjective and time consuming
• Grading can be influenced by context effects, expectations, or grading fatigue
• Grading may be influenced by factors extraneous to the content
• Students can submit an organized, coherent essay of irrelevant material


Evaluation tools used to avoid bias or subjective results
When evaluating an essay with professional rubrics, common assessment areas include a focused thesis statement, satisfactory grammar, organization, and conclusive sections that fully answer the question. A timeline can aid in the correction and evaluation of professional writing to meet standardized objectives. Description uses tools such as denotative language, connotative language, figurative language, metaphor, and simile to arrive at a dominant impression.


Tips for using essay questions
• Focus each question so that students know exactly what is expected.
• Provide specific guidelines about time limits, the amount of information expected, the weighting of questions, and the criteria for evaluating.
• Limit the use of essay questions to objectives that benefit from them.
• Avoid making essay questions optional, so that all students take essentially the same test.
• Don't give the questions to students ahead of time, but you can go through sample questions to prepare students.


HANDS ON!!
Politicians too often base their decisions on what will please the voters, not on what is best for the country. Write a unified essay in which you perform the following tasks. Explain what you think the above statement means. Describe a specific situation in which a politician might make an unpopular decision for the good of the country. Discuss the principles you think should determine whether political decisions should be made to please the voters or to serve the nation.


Short-answer questions are similar to objective items in that a clearly-defined answer is required, but differ from the latter in that the answer has to be generated and supplied by the learner rather than chosen from a number of options provided.


• They can have extremely high reliability, thus minimizing possible marker subjectivity.
• While short answer items often target knowledge or comprehension, effectively developed completion items can also be utilized to assess application, synthesis, analysis, and evaluation levels. One means of measuring this type of higher-order understanding is to utilize combinations of short answer statements within a given paragraph. When implementing the paragraph format, be sure that desired knowledge is clearly specified.


Advantages:
• Easy to construct.
• Minimizes guessing.
• Encourages more intensive study: the student must know the answer rather than merely recognize it.
• Short-answer tests are also fairly simple to administer and mark.
• Effective as either a written or oral assessment.
• Effective for assessing who, what, where, and when information.

Disadvantages:
• May overemphasize memorization of facts.
• Take care: questions may have more than one correct answer.
• Scoring is laborious.
• They are not particularly well suited for testing some types of higher-cognitive and non-cognitive outcomes.
• Not suitable for item analysis.
• Often criticized for encouraging rote memorization.


• When using with definitions: supply the term, not the definition, for a better judge of student knowledge.
• For numbers, indicate the degree of precision/units expected.
• Use direct questions, not an incomplete statement.
• If you do use incomplete statements, don't use more than 2 blanks within an item.
• Arrange blanks to make scoring easy.
• Try to phrase the question so there is only one answer possible.
• Do instructions clearly specify the desired knowledge and specificity of response?
• Is there only one clearly correct answer?


Completion items
In their simplest form, these consist of incomplete statements, the learner having to supply the missing words, terms, symbols, etc. Four typical examples are shown below.
Example 1 (a simple completion item that only requires a single answer to be provided)
1. What do you call a person who studies space? (answer: astronomer)


Completion items can also be built round things like tables, maps, diagrams, drawings and photographs, with the learner again having to supply missing pieces of information.


These take the form of actual questions (or instructions that imply questions), with the learner having to supply the answer(s). Such items can themselves take a wide range of forms, some of the possibilities being shown below.

Example 2 (a similar question that requires more than one answer)
'Name the three basic branches of government.'
________ (Executive)    ________ (Legislature)    ________ (Judiciary)


These are similar to unique-answer questions except that they allow for some variation in the nature of the answer, either in terms of its intrinsic content or in terms of the way in which it is expressed.
Example 1 (a question that has several acceptable answers)
Two planets that have rings are ___________ and ________________. (Jupiter, Saturn, Uranus, or Neptune)


Example 2 (a similar question that requires slightly longer answers)
'Outline four fundamental differences between the systems of government of the United States of America and the United Kingdom'. (Possible answers might include: The USA's head of state is a president while that of the UK is a constitutional monarch; the USA has a federal structure while the UK has not; in the USA, the executive and legislative arms of government are separate, while in the UK they are not; the upper legislative house in the USA is elected whereas that in the UK is not; the USA has a written constitution whereas the UK has not; the USA has a supreme court whereas the UK has not)


Like objective questions, they are of limited use in testing non-cognitive skills such as communication skills, interpersonal skills and psychomotor skills. Thus, the first thing that anyone thinking of making use of short-answer questions should do is check that the learning outcomes that it is wished to assess are in fact suited to this form of assessment; if they are not, some other assessment technique should be employed.


The most common method of evaluating a short-answer question (or, more usually, a test composed of such questions) is to have it checked by a colleague or validation panel, so that the evaluation can be carried out in a meaningful and systematic way.


• Is a short answer item an appropriate assessment of the learning objective?
• Does the content of the short answer question measure knowledge appropriate to the desired learning goal?
• Is the item clearly worded and stated in language appropriate to the student population?
• Does the positioning of the item blank promote efficient scoring?


Higher-order of thinking
Higher-order thinking essentially means thinking that takes place in the higher levels of the hierarchy of cognitive processing. Bloom's Taxonomy is the most widely accepted hierarchical arrangement of this sort in education, and it can be viewed as a continuum of thinking skills starting with knowledge-level thinking and moving eventually to evaluation-level thinking. A common example, used by Dr. Chuck Wiederhold, of the application of the major categories in Bloom's Taxonomy is shown below, applying the taxonomy to the Pledge of Allegiance:

Knowledge statements ask the student to recite the pledge. Example: "Say the pledge."
Comprehension statements ask the student to explain the meaning of words contained in the pledge. Example: "Explain what indivisible, liberty, and justice mean."
Application statements ask the student to apply understandings. Example: "Create your own pledge to something you believe in."
Analysis statements ask the student to interpret word meanings in relation to context. Example: "Discuss the meaning of 'and to the Republic for which it stands' in terms of its importance to the pledge."
Synthesis statements ask the student to apply concepts in a new setting. Example: "Write a contract between yourself and a friend that includes an allegiance to a symbol that stands for something you both believe in."
Evaluation statements ask the student to judge the relative merits of the content and concepts contained in the subject. Example: "Describe the purpose of the pledge and assess how well it achieves that purpose. Suggest improvements."
(Wiederhold, C. (1997). The Q-Matrix/Cooperative Learning & Higher-Level Thinking. San Clemente, CA: Kagan Cooperative Learning.)

When we promote higher-order thinking, then, we are simply promoting thinking, along with the teaching methodologies that promote such thinking, that takes place at the higher levels of the hierarchy just provided, notably application, analysis, synthesis, and evaluation. Critical/creative/constructive thinking is closely related to higher-order thinking; they are actually inseparable. Critical/creative/constructive thinking simply means thinking processes that progress upward in the given direction. First one critically analyzes the knowledge, information, or situation. Then they creatively consider possible next-step options, and then finally, they construct a new product, decision, direction, or value. The evaluation step listed above with the Pledge of Allegiance would require this sort of thinking.


Reading Beyond the Lines
Another way to look at higher-order thinking is to look at the reading process in typical terms and then extend the terms one step to reach higher-order thinking. That is, being able to read, being literate, typically means having the ability to decode words and understand their meanings individually and collectively. Being able to read and to comprehend the reading is generally considered thinking and involves "reading the lines" and "reading between the lines." Higher-order thinking or literacy, though, is the next crucial step, often not even thought of in the reading process, that being "reading beyond the lines." This is so crucial because it is in reading beyond the lines that reading the lines and reading between the lines have their real value.

Instructional Elements for Fostering Higher-Order Thinking in the Classroom (Synthesized from Teaching Children to Be Literate: A Reflective Approach, by Anthony and Ula Manzo, 1995)
1. Remember to ask for it; that is, for discovery, invention, and artistic/literary creation.
2. Greet curiosity and new ideas with enthusiasm; these can often lead to the most valuable "teachable moments."
3. Expose learners to new twists on old patterns and invite looking at old patterns from new angles.
4. Constructively critique new ideas because they almost always require some fine-tuning.
5. Reset our expectations to the fact that there will be many more "misses" than "hits" when reaching for workable new ideas.
6. Learn to invite contrary, or opposing, positions; new possibilities are often discovered in this way and existing thoughts, patterns, and beliefs can be tested and strengthened.

Questions that Invite Higher-Order Thinking (Synthesized from Teaching Children to Be Literate: A Reflective Approach, by Anthony and Ula Manzo, 1995) · How is this study like another you/we have read? This question encourages students to make connections and see analogies.


· Does this story/information make you aware of any problems that need attention? This amounts to asking students to see themselves as active participants in problem identification as well as problem solving. · What does this mean to you and how might it affect others? This pair of questions gives students a chance to express their own interests but also to empathetically consider and understand the views of, and possible consequences to, others. · Is there anything wrong with this solution, and how else might this problem be solved? These questions are the heart of successful critical analysis. · What more needs to be known or done to understand or do this better? This is a pointed request for creative problem solving that invites thinking “beyond the lines.” · What is a contrary way of seeing this? Being able to examine issues from multiple points of view helps the students to clarify their thoughts.

Questioning for Quality Thinking at Each Level of Bloom's Taxonomy

Knowledge: Identification and recall of information
Who, what, when, where, how?
Describe ___________________.

Comprehension: Organization and selection of facts and ideas
Retell ___________ in your own words.
What is the main idea of ___________________?

Application: Use of facts, rules, principles
How is __________ an example of _______________?
How is __________ related to _________________?
Why is _________________ significant?

Analysis: Separation of the whole into component parts
What are the parts or features of ________________?
Classify _______________ according to ________________.
Outline/diagram/web ____________________.
How does ______________ compare/contrast with __________________?
What evidence can you list for _____________________?

Synthesis: Combination of ideas to form a new whole
What would you predict/infer from __________________?
What ideas can you add to __________________?
How would you create/design a new __________________?
What might happen if you combine _______________ with ________________?
What solutions would you suggest for __________________?

Evaluation: Development of opinions, judgments, or decisions
Do you agree with _________________?
What do you think about _______________?
What is the most important _____________?
Prioritize ________________.
How would you decide about ________________?
What criteria would you use to assess ______________________?

Head-on Approaches to Teaching Higher-Order Thinking
"Thinking Thursdays"
o Consider setting aside a given amount of time on a regular basis to try some of these direct approaches to teaching critical and creative thinking.


Word Creation: o Define the word “squallizmotex” and explain how your definition fits the word. o If dried grapes are called raisins, and dried beef is called beef jerky, what would you call these items if they were dried: lemons, pineapple, watermelon, chicken. Unusual Uses: o Have students try to think of as many unusual uses as they can for common objects such as bricks, used toys, old tennis balls, soda bottles, and 8-track cassette tapes. Circumstances and Consequences: What would happen if . . . o school was on weekends and not during the week? o water stuck like glue? o gravity took a day off? o there were no colors? o everyone in the country could vote on every issue that is now decided by government representatives? Product Improvements: o How could school desks be improved? o How could living room furniture be improved to provide better storage and even exercise while watching television? o How can we better equip book-carrying bags to handle lunches and other needs that you can think of? Systems and Social Improvements: o A sample question that could lead into plenty of higher-level discussion and a good give-and-take of views and needs could be: “How can schools be made more fun without hurting learning?”

Higher-Order Thinking & REAP Read-Encode-Annotate-Ponder (REAP) is a teaching method developed by M.G. Eanet & A.V. Manzo at University of Missouri- Kansas City. It is a strategy developed for students to use to improve writing, thinking, and reading. As a teaching method, it is intended to teach students a variety of ways to respond to any text. The responses are brief and poignant ways to critique or annotate what they have read. There are different types of annotations which range from simple summary (reconstructive) to highly challenging critical-creative responses (constructive). Value of Annotating In writing annotations the readers discriminate and synthesize ideas presented by the author, then translate it into their own language. Writing and annotations enrich reflective thinking and reading. The readers analyze the author's purpose and explore their own feelings about the


written material. Students who write about what they have learned gain from the reading process. Consequently, writing should be an integral part (a vital component) in the classroom setting. Writing serves as a catalyst in improving one's reading, thinking and comprehension abilities. Learning the routine to write after reading ignites ACTIVE THINKING before, during and after a reading selection. Annotations ensure meaningful reading and encourage clear and concise thinking and writing. Annotations enhance reader's knowledge base as well as improve thinking and writing skills. Steps in REAP: R: Read to discern the writer's message. E: Encode the message by translating it into your own words. A: Annotate by cogently writing the message in notes for yourself, or in a thought book or on an electronic response system. P: Ponder, or further reflect on what you have read and written, through discussion and by reviewing others' response to the same materials and/or your own annotation. Using REAP as a Rubric for Monitoring Progress Toward Higher-Order Thinking REAP may be used as a way to monitor a student's progress toward higher-order thinking. By using examples of the various types of annotations, a teacher may compare and appraise the characteristic way in which the student responds to text. The annotation types listed above are roughly in order of difficulty. Lower numbers indicate more concrete thinking (or literalness) and higher numbers more personal and abstract patterns of response. Annotation Types Reconstructive... requires literal-level response to a text. Constructive... requires reading and thinking between and beyond the lines. -----------------------------------------------------------------------Reconstructive Responses 1. Summary response. States the basic message of the selection in brief form. In fiction, it is the basic story line; in nonfiction, it is a simple statement of the main ideas. 2. Precise response. Briefly states the author's basic idea or theme, with all unnecessary words removed. The result is a crisp, telegram like message. 3. Attention-getting or heuristic response. Restates a snappy portion of the selection that makes the reader want to respond. It is best to use the author's own words.


4. Question response. Turns the main point of the story or information into an organizing question that the selection answers.

Constructive Responses
5. Personal view or transactional response. Answers the question "How do your views and feelings compare with what you perceive the author to have said?"
6. Critical response. Supports, rejects, or questions the main idea, and tells why. The first sentence of this type of response should restate the author's position. The next sentence should state the writer's position. Additional sentences should explain how the two differ.
7. Contrary response. Attempts to state a logical alternative position, even if it is not one that the student necessarily supports.
8. Intention response. States and briefly explains what the responder thinks is the author's intention, plan, and purpose in writing the selection. This is a special version of the critical response that causes the reader/responder to try to think like the author or from the author's perspective.
9. Motivation response. States what may have caused the author to create or write the story or selection. This is another special version of critical responding. It is an attempt to discover the author's personal agenda and hence areas of witting or unwitting biases.
10. Discovery response. States one or more practical questions that need to be answered before the story or facts can be judged for accuracy or worth. This type of response to text is the mode of thinking that leads to more reading and research and occasionally to a reformulated position or view.
11. Creative response. Suggests different and perhaps better solutions or views and/or connections and applications to prior learning and experiences. Students usually need some guidance and/or examples to produce this type of response. Once they begin thinking in this way, the results can be remarkably constructive.

For more information about REAP, especially if you are interested in being involved with a current on-line REAP pilot study, please visit REAP Central Today.

Writing to Promote Higher-Order Thinking (Synthesized from Teaching Children to Be Literate: A Reflective Approach, by Anthony and Ula Manzo, 1995) Advantages


• Writing activates the reader's background knowledge before reading/thinking.
• Writing builds anticipation of upcoming learning events.
• Writing raises the reader's level of intellectual activity.
• Writing encourages meaningful comparisons of the student's perspective with that of the writer (in reading situations).
• Writing helps students better formulate their world view.
• Writing allows students to examine their perspectives on key issues.
• Writing builds metacognitive as well as cognitive abilities because writing forces deeper levels of introspection, analysis, and synthesis than any other mediational process.

Suggestions Related to Using Writing to Promote Higher-Order Thinking
• Write daily or frequently rather than sporadically.
• Write for real audiences and purposes.
• Allot sufficient time for stages of thought and editing to occur.
• Encourage peer review.
• Write with an initial emphasis on thinking rather than on proofreading and editing.

Contributed by Barbara Fowler, Longview Community College. Bloom's Revised Taxonomy divides the way people learn into three domains. One of these is the cognitive domain which emphasizes intellectual outcomes. This domain is further divided into categories or levels. The key words used and the type of questions asked may aid in the establishment and encouragement of critical thinking, especially in the higher levels.

Level 1: Remembering - exhibits previously learned material by recalling facts, terms, basic concepts and answers. Key words: who, what, why, when, omit, where, which, choose, find, how, define, label, show, spell, list, match, name, relate, tell, recall, select Questions: What is . . . ? How is . . . ? Where is . . . ? When did _______ happen? How did ______ happen? How would you explain . . . ? Why did . . . ? How would you describe . . . ? When did . . . ? Can you recall . . . ?


How would you show . . . ? Can you select . . . ? Who were the main . . . ? Can you list three . . . ? Which one . . . ? Who was . . . ?

Level 2: Understanding - demonstrating understanding of facts and ideas by organizing, comparing, translating, interpreting, giving descriptions and stating main ideas. Key words: compare, contrast, demonstrate, interpret, explain, extend, illustrate, infer, outline, relate, rephrase, translate, summarize, show, classify Questions: How would you classify the type of . . . ? How would you compare . . . ? contrast . . . ? Will you state or interpret in your own words . . . ? How would you rephrase the meaning . . . ? What facts or ideas show . . . ? What is the main idea of . . . ? Which statements support . . . ? Can you explain what is happening . . . what is meant . . .? What can you say about . . . ? Which is the best answer . . . ? How would you summarize . . . ?

Level 3: Applying - solving problems by applying acquired knowledge, facts, techniques and rules in a different way. Key words: apply, build, choose, construct, develop, interview, make use of, organize, experiment with, plan, select, solve, utilize, model, identify


Questions: How would you use . . . ? What examples can you find to . . . ? How would you solve _______ using what you have learned . . . ? How would you organize _______ to show . . . ? How would you show your understanding of . . . ? What approach would you use to . . . ? How would you apply what you learned to develop . . . ? What other way would you plan to . . . ? What would result if . . . ? Can you make use of the facts to . . . ? What elements would you choose to change . . . ? What facts would you select to show . . . ? What questions would you ask in an interview with . . . ?

Level 4: Analyzing - examining and breaking information into parts by identifying motives or causes; making inferences and finding evidence to support generalizations. Key words: analyze, categorize, classify, compare, contrast, discover, dissect, divide, examine, inspect, simplify, survey, take part in, test for, distinguish, list, distinction, theme, relationships, function, motive, inference, assumption, conclusion Questions: What are the parts or features of . . . ? How is _______ related to . . . ? Why do you think . . . ? What is the theme . . . ?


What motive is there . . . ? Can you list the parts . . . ? What inference can you make . . . ? What conclusions can you draw . . . ? How would you classify . . . ? How would you categorize . . . ? Can you identify the different parts . . . ? What evidence can you find . . . ? What is the relationship between . . . ? Can you make a distinction between . . . ? What is the function of . . . ? What ideas justify . . . ?

Level 5: Evaluating - presenting and defending opinions by making judgments about information, validity of ideas or quality of work based on a set of criteria. Key Words: award, choose, conclude, criticize, decide, defend, determine, dispute, evaluate, judge, justify, measure, compare, mark, rate, recommend, rule on, select, agree, interpret, explain, appraise, prioritize, opinion, support, importance, criteria, prove, disprove, assess, influence, perceive, value, estimate, influence, deduct Questions:


Why did they (the character) choose . . . ? What would you recommend . . . ? How would you rate the . . . ? What would you cite to defend the actions . . . ? How would you evaluate . . . ? How could you determine . . . ? What choice would you have made . . . ? What would you select . . . ? How would you prioritize . . . ? What judgment would you make about . . . ? Based on what you know, how would you explain . . . ? What information would you use to support the view . . . ? How would you justify . . . ? What data was used to make the conclusion . . . ? Why was it better that . . . ? How would you prioritize the facts . . . ? How would you compare the ideas . . . ? people . . . ?

Level 6: Creating - compiling information together in a different way by combining elements in a new pattern or proposing alternative solutions. Key Words: build, choose, combine, compile, compose, construct, create, design, develop, estimate, formulate, imagine, invent, make up, originate, plan, predict, propose, solve, solution, suppose, discuss, modify, change, original, improve, adapt, minimize, maximize, delete, theorize, elaborate, test, improve, happen, change Questions:


What changes would you make to solve . . . ? How would you improve . . . ? What would happen if . . . ? Can you elaborate on the reason . . . ? Can you propose an alternative . . . ? Can you invent . . . ? How would you adapt ________ to create a different . . . ? How could you change (modify) the plot (plan) . . . ? What could be done to minimize (maximize) . . . ? What way would you design . . . ? What could be combined to improve (change) . . . ? Suppose you could _______ what would you do . . . ? How would you test . . . ? Can you formulate a theory for . . . ? Can you predict the outcome if . . . ? How would you estimate the results for . . . ? What facts can you compile . . . ? Can you construct a model that would change . . . ? Can you think of an original way for the . . . ?



Performance Assessment WHAT IS IT? Performance assessment, also known as alternative or authentic assessment, is a form of testing that requires students to perform a task rather than select an answer from a ready-made list. For example, a student may be asked to explain historical events, solve math problems, converse in a foreign language, or conduct research on an assigned topic.


The performance assessment could consist of a single task and a scoring method, or it could consist of multiple tasks and one or multiple scoring methods.

Performance assessments can be divided into two rough categories:
• Task-centered performance assessments, which are primarily intended to tap into and evaluate specific skills and competencies.
• Construct-centered performance assessments, which are intended to tap into and sample from a domain of skills and competencies.


Performance assessment measures students' skills based on authentic tasks such as activities, exercises, or problems that require students to show what they can do. In some cases performance tasks are used to have students demonstrate their understanding of a concept or topic by applying their knowledge to a particular situation.


We can't use performance assessment with small children in the same way, because it relies on higher-level skills, so we must take great care with the kinds of assessments we choose. Small children are used to short activities, and we as teachers check our students' knowledge while they are working. Small children don't notice when you are checking their work. They are always imitating what the teacher or other students do, so we can't easily check what they really think.


An example of a performance test is a driving test: the person must be able to perform the functions of a competent driver of an automobile. Another example is a rubric, which shows us a way of testing different skills.



• Let students have an introduction to the topic they are going to work on.
• Give specific instructions and roles when they work in groups.
• Be specific and let them know what you are going to evaluate or score.
• Avoid scoring creativity, decoration, written work and other aspects that may distract students from the main objective, which is to perform.
• Make sure to give different assignments to each student, so all students can present something different.
• Make sure to ask the audience to listen to each classmate when presenting.


• One of the major limitations when testing by performance is losing sight of the object of evaluation.
• Another one might be the level of difficulty of the assignment.
• And the last one is that, if the teacher does not take the time to explain the topic, it may not be clear for all the students.


Performance assessments use grading strategies that are commonly used in the performing arts, fine arts, and Olympic competitions. In the context of the science laboratory, students are graded on the performance of manipulating variables, using scientific apparatus, identifying hypotheses, making measurements and calculations, organizing and managing data, and the communication of results. Graded laboratory performances go far beyond grading a final field report - this strategy considers the processes that become the laboratory report as well. In the evaluation of a performance task, the process of performing the task is emphasized more than the final product itself.


• Clearly define the knowledge and skills students need to apply or demonstrate in solving a problem.
• Determine the criteria (standards) against which students will be judged and define indicators of "levels" of competence.
• Inform students of your expectations, so that they have every opportunity to clearly demonstrate that the course learning objectives have been mastered.
• Design an authentic task that is somewhat undefined, complex, and has multiple entry and exit points.


Holistic Scoring Example: The Telescope Task
Your task is to set up and align the 8" telescope, find three different sky objects, and accurately describe some aspects of these objects that astronomers consider to be important.
Level 3: Student completes all aspects of the task quickly and efficiently and is able to answer questions about the equipment used and objects observed beyond what is obvious. The tasks are: 1. align telescope mount with the north celestial pole; 2. align finder telescope with primary telescope; 3. center on target object; 4. select and focus appropriate eyepiece; 5. provide information about the target beyond the literal descriptive level; and 6. answer questions about the target correctly.
Level 2: Student completes all aspects of the task and provides descriptive information about the equipment and objects observed.
Level 1: Student is not able to complete all aspects of the task or is not able to provide sufficient information about the equipment used or objects observed.
Level 0: No attempt or meaningful effort obvious.
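The level descriptions above can be read as a simple decision rule. The sketch below is one possible reading of that rule in Python; the function name, flags, and thresholds are invented for illustration and are not part of the original task.

# Minimal sketch (assumed logic): assign a holistic level (0-3) for the telescope task
# from how many of the six task aspects were completed and the depth of the answers.
# The thresholds paraphrase the level descriptions above; they are an interpretation,
# not an official scoring rule.

def telescope_level(aspects_completed, answered_beyond_obvious, gave_descriptive_info):
    """aspects_completed: 0-6 task aspects done; the two flags describe answer depth."""
    if aspects_completed == 0:
        return 0  # no attempt or meaningful effort
    if aspects_completed == 6 and answered_beyond_obvious:
        return 3  # all aspects done, answers go beyond the obvious
    if aspects_completed == 6 and gave_descriptive_info:
        return 2  # all aspects done, descriptive information only
    return 1      # incomplete task or insufficient information

print(telescope_level(6, True, True))    # 3
print(telescope_level(6, False, True))   # 2
print(telescope_level(4, False, False))  # 1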


TRUE/FALSE TESTS GROUP #3 October 22/2011


INTRODUCTION
In the most basic format, true-false questions are those in which a statement is presented and the student indicates in some manner whether the statement is true or false.


Skill levels evaluated by True/False tests True-false questions are well suited for testing student recall or comprehension. Students can generally respond to many questions, covering a lot of content, in a fairly short amount of time. From the teacher's perspective, these questions can be written quickly and are easy to score.


While true-false and other forced choice questions are generally used to measure knowledge and understanding, they could also be used at higher levels. The student:
• Analyzes a statement
• Assesses whether it is true or false
• Marks an answer


Advantages
• Appropriate for all levels of cognitive ability
• Objective
• Efficient in testing recall and comprehension of a broader content area relative to other testing strategies
• Well suited to test recall and comprehension of simple logic or understanding, as with "if-then" and "causal/because" statements
• Useful for automated scoring
• Useful for item analysis, internal and over time


LIMITATIONS
• Scores tend to be high, since guessing yields a 50-50 chance (half right, half wrong) as a base. For example, if there are 100 items and the student knows the correct answer to 50 and guesses on the other half, the expected score is 75, even though the student knows only half the material.
• Since the stem can cue a correct answer, guessing is enhanced without real understanding of the question.
• The format does not provide diagnostic information on why a student got an item wrong.
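The guessing effect described in the first limitation is easy to compute. The short Python sketch below reproduces the 100-item example (50 known, 50 guessed) under the usual assumption that blind guessing is right half the time on average.

# Minimal sketch: expected score on a true/false test when a student knows some
# answers and guesses the rest (guessing is right half the time on average).
# Reproduces the example above: 100 items, 50 known, 50 guessed -> 75 expected.

def expected_true_false_score(total_items, items_known):
    guessed = total_items - items_known
    return items_known + 0.5 * guessed

print(expected_true_false_score(100, 50))  # 75.0
print(expected_true_false_score(100, 0))   # 50.0, i.e. chance level with no knowledge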



Why do students tend to answer True or False?
Because random guessing will produce the correct answer half the time, true-false tests are less reliable than other types of exams. However, these items are appropriate for occasional use. Some faculty who use true-false questions add an "explain" column in which students write one or two sentences justifying their response.

Words like "sometimes, often, frequently, ordinarily, generally" open up the possibilities of making accurate statements. They make more modest claims, are more likely to reflect reality, and usually indicate "true" answers.


Every part of a true sentence must be "true." If any part of the sentence is false, the whole sentence is false despite many other true statements. Absolute words restrict possibilities: "no, never, none, always, every, entirely, only" imply the statement must be true 100% of the time and usually indicate "false" answers.



What should the ratio of True to False questions be?
Research indicates that students tend to mark "true" when guessing blindly, thus false items discriminate better between high and low ability students. At the same time, students tend to quickly pick up on patterns of responding. To prevent response bias and effectively assess understanding, instructors should include an equivalent number of true and false items within the assessment.


To prevent rote memorization of trivial facts or general knowledge, avoid using exact wording from the textbook. A well-designed true-false item is very effective for assessing the accuracy of statements, understanding of definitions, and novel applications of theories or principles.


Some tips in reference to True or False tests:
• Statements should be relatively short and simple.
• True statements should be about the same length as false statements. (There is a tendency to add details in true statements to make them more precise.)
• The answers should not be obvious to students who don't know the material.


• Be sure to include directions that tell students how and where to mark their responses.
• Finally, arrange the statements so that there is no discernible pattern of answers (such as T, F, T, F, T, F and T, T, F, F, T, T, F, F) for True and False statements.
• Avoid: unfamiliar vocabulary and concepts, long strings of statements, and ambiguous statements and generalizations.


EXAMPLES T F Poor: "The Raven" was written by Edgar Allen Poe. T F Better: "The Raven" was written by Edgar Allan Poe. T F Poor: The square of the hypotenuse of a right triangle equals the sum of the squares of the other two sides. T F Better: If the hypotenuse of an isosceles right triangle is 7 inches, each of the two equal sides must be more than 5 inches.


CONCLUSION

True/False tests can be used for different skill levels and they measure knowledge and understanding as well as students recalling information.



THANK YOU FOR YOUR ATTENTION GOD BLESS YOU!


THE MATCHING TEST FORMAT


The matching test item format provides a way for learners to connect a word, sentence or phrase in one column to a corresponding word, sentence or phrase in a second column. The items in the first column are called premises and the answers in the second column are the responses.


The convention is for learners to match the premise on the left with a given response on the right. By convention, the items in Column A are numbered and the items in Column B are labeled with capital letters.


EXAMPLE

COLUMN A (PREMISES)                                             COLUMN B (RESPONSES)
___1. Person who performs mysterious tasks no one understands   A. Facilitator
___2. Person who provides schooling for children                B. Trainer
___3. Person who enables a group to find solutions              C. Instructional Designer
___4. Person who instructs adults in a classroom                D. Meeting Organizer
                                                                E. Teacher

Many authoring tools come with a pre-built matching test item template, which may involve dragging responses to the premise or typing the letters from Column B into Column A. The authoring tool templates may vary from the conventions of the written format.


WHEN TO USE MATCHING The matching test item format provides a change of pace, particularly for self-check and review activities. Many instructional designers employ them in quizzes and tests too. They are effective when you need to measure the learner’s ability to identify the relationship or association between similar items.


THEY WORK BEST WHEN THE COURSE CONTENT HAS MANY PARALLEL CONCEPTS, FOR EXAMPLE:

• Terms and Definitions
• Objects or Pictures and Labels
• Symbols and Proper Names
• Causes and Effects
• Scenarios and Responses
• Principles and Scenarios to which they apply


CONSTRUCTION GUIDELINES If you decide to use a matching format, take the time to construct items that are valid and reliable. Here are some guidelines for this.


1. Two-part directions. Your clear directions at the start of each question need two parts: 1) how to make the match and 2) the basis for matching the response with the premise. You can also include whether items can be re-used, but often pre-built templates don't allow for this.
Example for the exercise above: Drag each career name in Column B to the best definition in Column A. No items may be used more than once.
2. Parallel content. Within one matching test item, use a common approach, such as all terms and definitions or all principles and the scenarios to which they apply.


3. Plausible answers. All responses in Column B should be plausible answers to the premises in Column A. Otherwise, the test loses some of its reliability because some answers will be "give-aways."
4. Clueless. Ensure your premises don't include hints through grammar (like implying the answer must be plural) or hints from word choice (like using the term itself in a definition).
5. Unequal responses. In an ideal world, you should present more responses than premises, so the remaining responses don't work as hints to the correct answer. This is not often possible when using a template.


6. Limited premises. Due to the capacity limitations of working memory, avoid a long list of premises in the first column. A number that I've come across is to keep the list down to six items. Even fewer might be better, depending on the characteristics of your audience.
7. One correct answer. Every premise should have only one correct response. Obvious, but triple-check to make sure each response can only work for one premise.


REARRANGEMENT ITEMS
• Rearrangement items: rearrange and skip certain items in order to better estimate the examinees' abilities, without allowing them to cheat on the test.
• The rearrangement procedure is effective in reducing the standard error of the Bayesian ability estimates and in increasing the reliability of the same estimates.


RANKING ITEMS

A ranking is a relationship between a set of items such that, for any two items, the first is either 'ranked higher than', 'ranked lower than' or 'ranked equal to' the second. It is not necessarily a total order of objects because two different objects can have the same ranking. The rankings themselves are totally ordered. For example, materials are totally preordered by hardness, while degrees of hardness are totally ordered.


By reducing detailed measures to a sequence of ordinal numbers, rankings make it possible to evaluate complex information according to certain criteria. Analysis of data obtained by ranking commonly requires non-parametric statistics.
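As one concrete example of the non-parametric statistics mentioned above, the sketch below computes Spearman's rank correlation between two rankings of the same items, say two teachers ranking the same five essays. The data are invented for illustration, and the simple formula used assumes no tied ranks.

# Minimal sketch (invented data): Spearman's rank correlation between two rankings
# of the same five items, e.g. two teachers ranking the same five essays.
# rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), where d_i is the rank difference
# for item i (this simple form assumes no tied ranks).

def spearman_rho(ranks_a, ranks_b):
    n = len(ranks_a)
    d_squared = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

teacher_1 = [1, 2, 3, 4, 5]   # hypothetical ranks given by teacher 1
teacher_2 = [2, 1, 3, 5, 4]   # hypothetical ranks given by teacher 2

print(spearman_rho(teacher_1, teacher_2))  # 0.8, substantial agreement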


ADVANTAGES AND LIMITATIONS THAT THIS TYPE OF TEST MIGHT HAVE
Advantages:
• Relatively easy to construct
• Easy to score
Disadvantages:
• Time consuming for students
• Not good for higher levels of learning


TIPS FOR WRITING GOOD MATCHING ITEMS:
• Need 15 items or less.
• Give good directions on the basis for matching.
• Use items in the response column more than once (reduces the effects of guessing).
• Use homogenous material in each exercise.
• Make all responses plausible.
• Put all items on a single page.
• Put responses in some logical order (chronological, alphabetical, etc.).
• Responses should be short.


SKILL LEVELS THAT CAN BE REINFORCED THROUGH THIS TYPE OF TEST ITEM

Good for:
• Knowledge level
• Some comprehension level, if appropriately constructed


Types:
• Terms with definitions
• Phrases with other phrases
• Causes with effects
• Parts with larger units
• Problems with solutions


MAJOR LIMITATIONS FOR THIS TYPE OF TEST

• They are time consuming for students.
• They are not good for higher levels of learning.
• They have difficulty measuring learning objectives requiring more than simple recall of information.
• They are difficult to construct due to the problem of selecting a common set of stimuli and responses.
• They place a high degree of dependence on the student's reading ability and the instructor's writing ability.


WHAT ARE THE NAMES OF THE COLUMNS?



The items in the first column are called premises and the answers in the second column are the responses.


PREMISE: A previous statement or proposition from which another is inferred or follows as a conclusion. In tests: the words or phrases in the first column.
RESPONSE: The written answer in the second column that is matched to a premise.


When there are exactly as many premises as there are responses and when each response is used once and only once in the matching process, the test item is said to have perfect matching. When some of the responses are used more than once or not at all, the item is said to have imperfect matching. Imperfect matching makes guessing more difficult.
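A quick worked example (added here, not part of the original text) shows how much the extra response matters. With perfect matching of 5 premises to 5 responses, each used exactly once, a student who knows 4 pairs gets the fifth automatically, and pure guessing produces a fully correct set with probability 1/5! = 1/120. With one extra, unused response (imperfect matching), a student who knows 4 pairs must still choose between 2 remaining responses (a 1/2 chance on the last pair), and the probability of guessing all five pairs correctly drops to 1/(6 × 5 × 4 × 3 × 2) = 1/720.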


PERFECT MATCHING

COLUMN A                      COLUMN B
______ 1. James Michener      A. History
______ 2. Stephen King        B. Horror
______ 3. Erma Bombeck        C. Humor
______ 4. Agatha Christie     D. Mystery
______ 5. Walt Whitman        E. Poetry
______ 6. Danielle Steele     F. Romance
______ 7. Isaac Asimov        G. Science Fiction


IMPERFECT MATCHING

COLUMN A                      COLUMN B
______ 1. James Michener      A. History
______ 2. Stephen King        B. Horror
______ 3. Erma Bombeck        C. Humor
______ 4. Agatha Christie     D. Mystery
______ 5. Walt Whitman        E. Poetry
______ 6. Danielle Steele     F. Romance
______ 7. Isaac Asimov        G. Science Fiction
                              H. Tragedy




CONSTRUCTING TEST ITEMS


TEST ITEM TYPES
• Multiple choice
• True or False
• Completion/Short Answers
• Matching
• Essay Questions
• Performance Assessment


MULTIPLE CHOICE
• Multiple-choice items can be used to measure knowledge outcomes and various types of learning outcomes.
• They are most widely used for measuring knowledge, comprehension, and application outcomes.
• The multiple-choice item provides the most useful format for measuring achievement at various levels of learning.
• When selection-type items are to be used (multiple-choice, true-false, matching, check all that apply), an effective procedure is to start each item as a multiple-choice item and switch to another item type only when the learning outcome and content make it desirable to do so. For example: (1) when there are only two possible alternatives, a shift can be made to a true-false item; and (2) when there are a number of similar factors to be related, a shift can be made to a matching item.


STRENGTHS
• Learning outcomes from simple to complex can be measured.
• Highly structured and clear tasks are provided.
• A broad sample of achievement can be measured.
• Incorrect alternatives provide diagnostic information.
• Scores are less influenced by guessing than true-false items.
• Scores are more reliable than subjectively scored items (e.g., essays).
• Scoring is easy, objective, and reliable.
• Item analysis can reveal how difficult each item was and how well it discriminated between the strong and weaker students in the class (see the sketch after this list).
• Performance can be compared from class to class and year to year.
• Can cover a lot of material very efficiently (about one item per minute of testing time).
• Items can be written so that students must discriminate among options that vary in degree of correctness.
• Avoids the absolute judgments found in True-False tests.
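The sketch below is a minimal, illustrative item analysis in Python, added to make the "difficulty and discrimination" point concrete; the 0/1 score matrix is invented sample data, and the item-rest correlation used here is just one common way to estimate discrimination.

# Minimal item-analysis sketch: difficulty (proportion correct) and
# discrimination (correlation of each item with the rest of the test).
# The score matrix is invented sample data: rows = students, columns = items.
import numpy as np

scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
])

totals = scores.sum(axis=1)
for item in range(scores.shape[1]):
    difficulty = scores[:, item].mean()      # share of students answering correctly
    rest_score = totals - scores[:, item]    # total score excluding this item
    discrimination = np.corrcoef(scores[:, item], rest_score)[0, 1]
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}")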


LIMITATIONS
• Constructing good items is time consuming.
• It is frequently difficult to find plausible distracters.
• This item type is ineffective for measuring some types of problem solving and the ability to organize and express ideas.
• Real-world problem solving differs: a different process is involved in proposing a solution versus selecting a solution from a set of alternatives.
• Scores can be influenced by reading ability.
• There is a lack of feedback on individual thought processes: it is difficult to determine why individual students selected incorrect responses.
• Students can sometimes read more into the question than was intended.
• Items often focus on testing factual information and fail to test higher levels of cognitive thinking.
• Sometimes there is more than one defensible "correct" answer.
• They place a high degree of dependence on the student's reading ability and the instructor's writing ability.
• Does not provide a measure of writing ability.
• May encourage guessing.


Helpful Hints
• Base each item on an educational or instructional objective of the course, not trivial information.
• Try to write items in which there is one and only one correct or clearly best answer.
• The phrase that introduces the item (stem) should clearly state the problem.
• Test only a single idea in each item.
• Be sure wrong answer choices (distracters) are at least plausible.
• Incorporate common errors of students in distracters.
• The position of the correct answer should vary randomly from item to item.
• Include from three to five options for each item.
• Avoid overlapping alternatives (see Example 3 following).
• The length of the response options should be about the same within each item (preferably short).
• There should be no grammatical clues to the correct answer.
• Format the items vertically, not horizontally (i.e., list the choices vertically).
• The response options should be indented and in column form.


• Word the stem positively; avoid negative phrasing such as "not" or "except." If this cannot be avoided, the negative words should always be highlighted by underlining or capitalization: Which of the following is NOT an example ...
• Avoid excessive use of negatives and/or double negatives.
• Avoid the excessive use of "All of the above" and "None of the above" in the response alternatives.
• In the case of "All of the above," students only need to have partial information in order to answer the question. Students need to know that only two of the options are correct (in a four-or-more-option question) to determine that "All of the above" is the correct answer choice. Conversely, students only need to eliminate one answer choice as implausible in order to eliminate "All of the above" as an answer choice.
• Similarly, with "None of the above," when used as the correct answer choice, information is gained about students' ability to detect incorrect answers. However, the item does not reveal if students know the correct answer to the question.


Multiple-Choice Item Writing Guidelines

Multiple-choice questions typically have 3 parts: STEM, KEY & DISTRACTERS.


Procedural Rules:
• Use either the best-answer or the correct-answer format.
• Best-answer format refers to a list of options that can all be correct in the sense that each has an advantage, but one of them is the best.
• Correct-answer format refers to one and only one right answer.
• Format the items vertically, not horizontally (i.e., list the choices vertically).
• Allow time for editing and other types of item revisions.
• Use good grammar, punctuation, and spelling consistently.
• Minimize the time required to read each item.
• Avoid trick items.
• Use the active voice.
• The ideal question will be answered correctly by 60-65% of the tested population.
• Have your questions peer-reviewed.
• Avoid giving unintended cues, such as making the correct answer longer than the distracters.


Content-related Rules:
• Base each item on an educational or instructional objective of the course, not trivial information.
• Test for important or significant information.
• Focus on a single problem or idea for each test item.
• Keep the vocabulary consistent with the examinees' level of understanding.
• Avoid cueing one item with another; keep items independent of one another.
• Use the author's examples as a basis for developing your items.
• Avoid overly specific knowledge when developing items.
• Avoid textbook, verbatim phrasing when developing the items.
• Avoid items based on opinions.
• Use multiple choice to measure higher-level thinking.
• Be sensitive to cultural and gender issues.
• Use case-based questions that use a common text to which a set of questions refers.


Stem Construction Rules:
• State the stem in either question form or completion form.
• When using a completion form, don't leave a blank for completion at the beginning or in the middle of the stem.
• Ensure that the directions in the stem are clear, and that wording lets the examinee know exactly what is being asked.
• Avoid window dressing (excessive verbiage) in the stem.
• Word the stem positively; avoid negative phrasing such as "not" or "except." If this cannot be avoided, the negative words should always be highlighted by underlining or capitalization: Which of the following is NOT an example ...
• Include the central idea and most of the phrasing in the stem.
• Avoid giving clues, such as linking the stem to the answer ("... is an example of an:"; test-wise students will know the correct answer must start with a vowel).


General Option Development Rules:
• Place options in logical or numerical order.
• Use letters in front of options rather than numbers; numerical answers in numbered items may be confusing to students.
• Keep options independent; options should not be overlapping.
• Keep all options homogeneous in content.
• Keep the length of options fairly consistent.
• Avoid, or use sparingly, the phrase "all of the above."
• Avoid, or use sparingly, the phrase "none of the above."
• Avoid the use of the phrase "I don't know."
• Phrase options positively, not negatively.
• Avoid distracters that can clue test-wise examinees; for example, absurd options, formal prompts, or semantic (overly specific or overly general) clues.
• Avoid giving clues through the use of faulty grammatical construction.
• Avoid specific determiners, such as "never" and "always."
• Position the correct option so that it appears about the same number of times in each possible position for a set of items.
• Make sure that there is one and only one correct option.


Distracter (Incorrect Options) Development Rules:
• Use plausible distracters.
• Incorporate common errors of students in distracters.
• Avoid technically phrased distracters.
• Use familiar yet incorrect phrases as distracters.
• Use true statements that do not correctly answer the item.
• Avoid the use of humor when developing options.
• Distracters that are not chosen by any examinees should be replaced.

Suggestions for Writing Good Multiple Choice Items:
• Present practical or real-world situations to the students.
• Present the student with a diagram of equipment and ask for application, analysis or evaluation.
• Present actual quotations taken from newspapers or other published sources and ask for the interpretation or evaluation of these quotations.
• Use pictorial materials that require students to apply principles and concepts.
• Use charts, tables or figures that require interpretation.


General Guidelines for Writing Test Items
• Begin writing items well ahead of the time when they will be used; allow time for revision.
• Match items to intended outcomes at the proper difficulty level to provide a valid measure of the instructional objectives.
• Be sure each item deals with an important aspect of the content area and not with trivia.
• Be sure that the problem posed is clear and unambiguous.
• Be sure that each item is independent of all other items (i.e., a hint to an answer should not be unintentionally embedded in another item).
• Be sure the item has one correct or best answer on which experts would agree.
• Prevent unintended clues to the answer in the statement or question (e.g., grammatical inconsistencies such as 'a' or 'an' give clues).
• Avoid duplication of the textbook in writing test items; don't lift quotes directly from any textual materials.
• Avoid trick or catch questions in an achievement test (don't waste time testing how well the student can interpret your intentions).
• On a test with different question formats (e.g., multiple choice and True-False), group all items of the same format together.
• Questions should follow an easy-to-difficult progression.
• Space the items to eliminate overcrowding.
• Place diagrams and tables above the item that uses the information, not below it.


Examples & Tips

Below are some strategies to reduce the cognitive load of your test items.

1. Keep the stem simple, including only relevant information.

Example: Change
[Stem] The purchase of the Louisiana Territory, completed in 1803 and considered one of Thomas Jefferson's greatest accomplishments as president, primarily grew out of our need for
a. the port of New Orleans*
b. helping Haitians against Napoleon
c. the friendship of Great Britain
d. control over the Indians

To
[Stem] The purchase of the Louisiana Territory primarily grew out of our need for
a. the port of New Orleans*
b. helping Haitians against Napoleon
c. the friendship of Great Britain
d. control over the Indians

*An asterisk indicates the correct answer. Any additional information that is irrelevant to the question, such as the phrase "completed in 1803," can distract or confuse the student, thus providing an alternative explanation for why the item was missed. Keep it simple.


2. Keep the alternatives simple by adding any common words to the stem rather than including them in each alternative.

Example: Change
When your body adapts to your exercise load,
a. you should decrease the load slightly.
b. you should increase the load slightly.*
c. you should change the kind of exercise you are doing.
d. you should stop exercising.

To
When your body adapts to your exercise load, you should
a. decrease the load slightly.
b. increase the load slightly.*
c. change the kind of exercise you are doing.
d. stop exercising.

Instead of repeating the phrase "you should" at the beginning of each alternative, add that phrase to the end of the stem. The less reading the student has to do, the less chance there is for confusion.

3. Put alternatives in a logical order.

Example: Change
According to the 1991 census, approximately what percent of the United States population is of Spanish or Hispanic descent?
a. 25%
b. 39%
c. 2%
d. 9%*

To
a. 2%
b. 9%*
c. 25%
d. 39%

The more mental effort (or cognitive load) that students have to use to make sense of an item, the more likely a comprehension error can occur that would provide another rival explanation. By placing the alternatives in a logical order, the reader can focus on the content of the question rather than having to reorder the items mentally. Although such reordering might require a limited amount of cognitive load, such load is finite, and it does not take much additional processing to reach the point where concentration is negatively impacted. Thus, this guideline is consistently recommended (Haladyna, Downing, & Rodriguez, 2002).


4. Limit the use of negatives (e.g., NOT, EXCEPT).

Example: Change
Which of the following is NOT true of the Constitution?
a. The Constitution sets limits on how a government can operate
b. The Constitution is open to different interpretations
c. The Constitution has not been amended in 50 years*

To
Which of the following is true of the Constitution?
a. The Constitution has not been amended in 50 years
b. The Constitution sets limits on how a government can operate*
c. The Constitution permits only one possible interpretation

Once again, trying to determine which answer is NOT consistent with the stem requires more cognitive load from the students and promotes the likelihood of more confusion. If that additional load or confusion is unnecessary, it should be avoided (Haladyna, Downing, & Rodriguez, 2002). If you are going to use NOT or EXCEPT, the word should be highlighted in some manner so that students recognize a negative is being used.

5. Include the same number of alternatives for each item.

The more consistent and predictable a test is, the less cognitive load is required of the student to process it. Consequently, the student can focus on the questions themselves without distractions. Additionally, if students must transpose their answers onto a score sheet of some kind, there is less likelihood of error in the transposition if the number of alternatives for each item is always the same.


Reducing the Chance of Guessing Correctly
• It is easy to inadvertently include clues in your test items that point to the correct answer, help rule out incorrect alternatives, or narrow the choices.
• Any such clue decreases your ability to distinguish students who know the material from those who do not, thus providing rival explanations.


Keep the grammar consistent between stem and alternatives.

Example: Change
What is the dietary substance that is often associated with heart disease when found in high levels in the blood?
a. glucose
b. cholesterol*
c. beta carotene
d. proteins

To
a. glucose
b. cholesterol*
c. beta carotene
d. protein

Obviously, "proteins" is inconsistent with the stem, since it is plural while the stem and the other options are singular. However, it can be easy for the test writer to miss such inconsistencies. As a result, students may more easily guess the correct answer without understanding the concept, a rival explanation.


Avoid including an alternative that is significantly longer than the rest.

Example: Change
What is the best reason for listing information sources in your research assignment?
a. It is required
b. It is unfair and illegal to use someone's ideas without giving proper credit*
c. To get a better grade
d. To make it longer

To
a. It is required by most teachers
b. It is unfair and illegal to use someone's ideas without giving proper credit*
c. To get a better grade on the project
d. So the reader knows from where you got your information

Students often recognize that a significantly longer, more complex alternative is commonly the correct answer. Even if the longer alternative is not the correct answer, some students who might otherwise answer the question correctly could be misled by this common clue and select the wrong answer. So, to be safe and avoid a rival explanation, keep the alternatives similar in length.


Make all distracters plausible.

Example: Change
Lincoln was assassinated by
a. Lee Harvey Oswald
b. John Wilkes Booth*
c. Oswald Garrison Villard
d. Ozzie Osbourne

To
Lincoln was assassinated by
a. Lee Harvey Oswald
b. John Wilkes Booth*
c. Oswald Garrison Villard
d. Charles Guiteau

If students can easily discount one or more distracters (obviously Ozzie Osbourne does not belong), then the chance of guessing is increased, reducing the discriminability of that item. There is some limited evidence that including humor on a test can have certain benefits, such as reducing the anxiety of the test-takers (Berk, 2000; McMorris, Boothroyd, & Pietrangelo, 1997). But humor can be included in a manner that does not reduce the discriminability of the item. For example, the nature of the question in the stem may be humorous but still address the material in a meaningful way.


Avoid giving too many clues in your alternatives.

Example: Change
"Yellow Journalism" is associated with what two publishers?
a. Adolph Ochs and Martha Graham
b. William Randolph Hearst and Joseph Pulitzer*
c. Col. Robert McCormick and Marshall Field III
d. Michael Royko and Walter Cronkite

To
a. Adolph Ochs and Martha Graham
b. William Randolph Hearst and Joseph Pulitzer*
c. Joseph Pulitzer and Adolph Ochs
d. Martha Graham and William Randolph Hearst

Since both of the publishers in choice "b" are associated with yellow journalism and none of the other people mentioned is, the student only has to know of one such publisher to identify that "b" is the correct answer. That makes the item easier than if just one name is listed for each alternative. To make the question more challenging, at least some of the distracters could mention one of the correct publishers but not the other, as in the second example (e.g., in distracter "c" Pulitzer is correct but Ochs is not). As a result, the student must recognize both publishers associated with yellow journalism to be certain of the correct answer.


UNIVERSIDAD MARIANO GALVEZ DE GUATEMALA ESCUELA DE IDIOMAS PROFESORADO EN INGLES LICDA. EVELYN R.QUIROA 2011

TEST ITEM PRESENTATION QUESTIONS

GROUP 1: PERFORMANCE ASSESSMENT
PRESENTATION SCORE: 10/10
QUESTIONS:
1. WHAT DOES PERFORMANCE ASSESSMENT CONSIST OF? DEMONSTRATIONS, HANDS-ON ACTIVITIES
2. WHY ISN'T IT POSSIBLE TO USE PERFORMANCE TESTS FOR SMALL CHILDREN?
3. GIVE EXAMPLES OF ACTIVITIES PERTAINING TO THIS TYPE OF TEST.
4. MENTION A FEW TIPS ON HOW TO WRITE A PERFORMANCE TEST.
5. MENTION ONE MAJOR LIMITATION OF THIS TYPE OF TEST.
6. WHAT FIELDS OF STUDY ARE PERFORMANCE TESTS GOOD FOR?
7. PERFORMANCE ASSESSMENT ACTIVITIES TEND TO BE DESCRIPTIVE. WHAT TYPE OF EVALUATION TOOL CAN BE USED TO AVOID BIASED RESULTS?

GROUP 2: MATCHING
PRESENTATION SCORE: 10/10
QUESTIONS:
1. WHAT ARE MATCHING TESTS USED FOR?
2. DESCRIBE THE BASIC PHYSICAL STRUCTURE OF A MATCHING-ITEMS TEST.
3. MENTION AND EXPLAIN WHAT TYPES OF ASSESSMENT ITEMS WE CAN USE. REARRANGEMENT, RANKING
4. MENTION 2 ADVANTAGES AND 2 LIMITATIONS THAT THIS TYPE OF TEST MIGHT HAVE.
5. MENTION A FEW TIPS ON HOW TO WRITE A MATCHING TEST.
6. WHAT SKILL LEVELS CAN BE REINFORCED THROUGH THIS TYPE OF TEST ITEM?
7. MENTION ONE MAJOR LIMITATION OF THIS TYPE OF TEST.
8. WHAT ARE THE NAMES OF THE COLUMNS? A: PREMISES, B: RESPONSES

GROUP 3: TRUE OR FALSE
PRESENTATION SCORE: 10/10
QUESTIONS:
1. WHAT SKILL LEVELS DO TRUE OR FALSE TESTS EVALUATE?
2. MENTION THE ADVANTAGES AND LIMITATIONS OF THIS TYPE OF TEST.
3. WHY DO STUDENTS TEND TO ANSWER TRUE ON A TRUE OR FALSE TEST?
4. WHAT SHOULD THE TRUE-TO-FALSE QUESTION RATIO BE? 6:4
5. MENTION SOME TIPS IN REFERENCE TO TRUE OR FALSE TESTS.



GROUP 4: ESSAYS
PRESENTATION SCORE: 10/10
QUESTIONS:
1. WHAT SKILL LEVELS ARE ESSAYS GOOD FOR?
2. WHAT ARE THE 4 EVALUATION LEVELS AND WHAT DO THEY CONSIST OF?
3. WHAT ARE ESSAY EVALUATIONS GOOD FOR?
4. WHAT SUBJECT AREAS CAN BE TESTED BY USING ESSAYS?
5. WHAT ARE THE ELEMENTS OF AN ESSAY THAT CAN BE ASSESSED THROUGH AN ESSAY EVALUATION? CONTENT, IDEAS, ORGANIZATION, FORM, LANGUAGE
6. WHAT TYPES OF ESSAYS CAN WE USE TO TEST A SPECIFIC TOPIC AND WHAT DO THEY CONSIST OF? EXTENDED RESPONSE, RESTRICTED RESPONSE, PROCESS ESSAY, CLASSIFICATION ESSAY, CAUSE AND EFFECT ESSAY, COMPARISON ESSAY, PROBLEM-SOLVING ESSAY
7. MENTION SOME ADVANTAGES AND LIMITATIONS THAT THIS TYPE OF TEST ITEM MIGHT HAVE.
8. ESSAY TEST ITEMS TEND TO BE DESCRIPTIVE. WHAT TYPE OF EVALUATION TOOL CAN BE USED TO AVOID BIASED OR SUBJECTIVE RESULTS?
9. MENTION SOME TIPS FOR ESSAY TEST ITEMS.

GROUP 5: COMPLETION / SHORT ANSWERS
PRESENTATION SCORE: 10/10
QUESTIONS:
1. MENTION A FEW CHARACTERISTICS OF COMPLETION ITEMS.
2. MENTION A FEW CHARACTERISTICS OF SHORT ANSWERS.
3. MENTION ADVANTAGES AND LIMITATIONS OF SHORT ANSWER TEST ITEMS.
4. GIVE TIPS ON WRITING SHORT ANSWER TEST ITEMS.


September 2011

September 1
Topic: The Nature of Science. Objective: Classify the divisions of nature. Assessment: In groups, explain the concept of a given topic about a division of nature.

September 2
Topic: Science All Around. Objective: Implement new ideas to improve the environment they live in. Assessment: Visualizing the History of Earth Science Technology (video).

September 5
Topic: Scientific Enterprise. Objective: Compare the different enterprises that exist and find the best one. Assessment: Essay on the understanding of a science article.

September 6
Topic: Science and Language Arts. Objective: Determine the differences between science and language arts. Assessment: Design a microscope in class using the material given.

September 7
Topic: Summary. Objective: Generate and develop their own thoughts about nature in general. Assessment: Quiz.

September 8
Topic: Matter. Objective: Name all the types of matter. Assessment: Worksheet: list the types of matter.

September 10
Topic: Atoms. Objective: Explain the meaning of the atom. Assessment: Search for information about the atom and write a report.

September 12
Topic: Combinations of Atoms. Objective: Practice the combination of atoms using the measurement table. Assessment: Scales of measurement.

September 13
Topic: Properties of Matter. Objective: Experiment with and probe the properties of matter. Assessment: Visualizing states of matter.

September 14
Topic: Properties of Matter. Objective: Demonstrate use of knowledge and invent new designs. Assessment: Design your own experiment: determining density.

September 15
Topic: Minerals. Objective: List and describe each type of mineral. Assessment: Watch a video and make a report about it.

September 16
Topic: Crystal Formation. Objective: Recall the process of crystal formation. Assessment: Illustrate and name the crystal formation process.

September 19
Topic: Mineral Identification. Objective: Identify common minerals and name their characteristics. Assessment: Create an album with as many minerals as possible.

September 20
Topic: Metallic and Non-metallic Minerals. Objective: Classify minerals according to their properties. Assessment: Elaborate a mural classifying minerals according to their properties.

September 21
Topic: Uses of Minerals. Objective: Compare each kind of mineral and its uses. Assessment: Solve a comparative chart about the use of each mineral.

September 22
Topic: Rocks. Objective: List and describe each type of rock. Assessment: Play "Name that rock!", in which the students have to name and describe each type of rock.

September 23
Topic: The Rock Cycle. Objective: Explain and summarize the rock cycle. Assessment: Do the rock cycle activity to observe the changes that rocks undergo during the cycle and make a report about it.

September 26
Topic: Igneous Rocks. Objective: Understand the relation between igneous rocks and a volcanic eruption. Assessment: Construct a volcano, simulate a volcanic eruption, and describe how igneous rocks are made.

September 27
Topic: Metamorphic Rocks. Objective: Understand that metamorphic rocks are changed by heat and pressure. Assessment: Do the metamorphic rock activity to observe the changes the rock undergoes from external agents.

September 28
Topic: Energy. Objective: Learn about forms of energy. Assessment: Create and solve riddles about energy.

September 29
Topic: Sedimentary Rocks. Objective: Using previous information, create their own sedimentary rock. Assessment: Cook a sedimentary rock snack to observe the different layers contained in the snack, as in a real sedimentary rock.

September 30
Review and Assessment.


October 2011

October 3
Topic: Renewable Resources. Objective: Know about the origin of renewable resources. Assessment: Make a conceptual map of renewable resources.

October 4
Topic: Nonrenewable Resources. Objective: Discuss how nonrenewable resources differ from renewable ones. Assessment: Make a presentation and give a speech on nonrenewable resources.

October 5
Topic: Mineral Resources. Objective: Construct a diagram showing the differences between natural resources and mineral resources. Assessment: Make a wall poster of different mineral resources.

October 6
Topic: Landforms. Objective: Debate the conclusions about different landforms. Assessment: Make flashcards of different landforms.

October 7
Review and assessment.

October 10
Topic: Viewpoints. Objective: Describe some viewpoints in detail. Assessment: Make a summary of different viewpoints.

October 11
Topic: Maps. Objective: Display and explain different kinds of maps. Assessment: Design a topographic map and read it.

October 12
Topic: Weathering. Objective: Know the different kinds of weathering and the effects they have. Assessment: Show the difference between mechanical weathering and chemical weathering in a diagram; explain the effects of climate on weathering.

October 13
Topic: The Nature of Soil. Objective: Understand the formation of soil and the factors that affect its development. Assessment: Analyze how soil develops from rock; describe soil by comparing soil horizons; describe factors that affect the development of soil.

October 14
Review and assessment.

October 24
Topic: Visualizing Soil Formation. Objective: Apply what they know about soil and investigate further. Assessment: Investigate the process of soil formation and discuss this process with classmates.

October 25
Topic: Soil Characteristics. Objective: Apply the knowledge learned so far. Assessment: Categorize soil characteristics in a diagram.

October 26
Topic: Soil Erosion. Objective: Learn about how humans affect soil. Assessment: Explain why soil is important; identify human activities that lead to soil loss; describe ways to reduce soil loss.

October 27
Topic: Erosion by Gravity. Objective: Learn the difference between the concepts. Assessment: Explain the difference between erosion and deposition; compare and contrast slumps, creep, rockfalls, rock slides, and mudflows.

October 28
Topic: Glaciers. Objective: Understand the nature of glaciers. Assessment: Build a project on glacial erosion and deposition; compare and contrast till and outwash.

October 31
Topic: Wind. Objective: Understand the nature of wind. Assessment: Recognize how loess and dunes form; explain how wind causes deflation and abrasion.


BLUE PRINT (TABLE OF SPECIFICATIONS)

SCHOOL NAME
LEVEL: Intermediate     SUBJECT: Science     GRADE: 9th     SECTION:
TEACHER: María Raquel Morales

Contents: The Nature of Science, Science All Around, Weathering, Maps, Wind
Cognitive levels (Revised Bloom's Taxonomy): Remembering, Understanding, Applying, Analysing, Evaluating, Creating

Totals: 5 items and 20 points per content area; 25 items and 100 points in all.
Items per cognitive level: Remembering 8, Understanding 6, Applying 5, Analysing 3, Evaluating 2, Creating 1.

(The cell-by-cell grid assigning item values and series to each content and cognitive level did not survive the file conversion.)
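As a supplement (not part of the original blueprint), the short Python sketch below re-derives the totals summarized above; the per-level item counts are taken from that summary and should be treated as a reconstruction rather than the original grid.

# Sketch: checking the blueprint totals. The per-level counts below are
# the ones recovered from the blueprint summary above (a reconstruction).
items_per_level = {
    "Remembering": 8,
    "Understanding": 6,
    "Applying": 5,
    "Analysing": 3,
    "Evaluating": 2,
    "Creating": 1,
}
contents = ["The Nature of Science", "Science All Around",
            "Weathering", "Maps", "Wind"]

total_items = sum(items_per_level.values())          # 25 items
total_points = 100
points_per_item = total_points / total_items         # 4.0 points per item
points_per_content = total_points / len(contents)    # 20.0 points per content

print(f"{total_items} items, {points_per_item} points each, "
      f"{points_per_content} points per content area")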


Colegio La Parroquía
Science 9th Grade
First Midterm
Name: ___________________________________________ Date: ________________________________

Series I. Match the given words with the correct descriptions.

Column A
1) Biology
2) Physics
3) Atmospheric Science
4) Astronomy
5) Chemistry

Column B
a) It is the science of celestial objects and phenomena that originate outside the Earth's atmosphere.
b) This field encompasses a set of disciplines that examine phenomena related to living organisms.
c) Constituting the scientific study of matter at the atomic and molecular scale, chemistry deals primarily with collections of atoms, such as gases, molecules, crystals, and metals.
d) This embodies the study of the fundamental constituents of the universe, the forces and interactions they exert on one another, and the results produced by these interactions.
e) Though sometimes considered in conjunction with the earth sciences, it has developed its own concepts, techniques, and practices and has a wide range of sub-disciplines under its wing.

Series II. Read the description of each natural disaster. Write the correct natural disaster.

a flood / a tsunami / a volcanic eruption / a forest fire / an earthquake

1. An underwater earthquake causes this very large wave. _____________________

2. The earth moves. It destroys buildings and roads. __________________________

3. A mountain erupts with fire and smoke. __________________________________

4. A fire burns thousands of trees.__________________________________________

5. Water fills the streets and houses. _______________________________________


Series III. Choose the right option in the following descriptions. Circle the type of pollution each can produce.

1. Cars:                   a. water pollution      b. radiation    c. automobile exhaust
2. People on the street:   a. hazardous waste      b. litter       c. radiation
3. Factories:              a. air pollution        b. oil spill    c. water pollution
4. Household cleaners:     a. hazardous waste      b. radiation    c. litter
5. Nuclear power plants:   a. pesticide poisoning  b. radiation    c. air pollution

Series IV. Complete the definitions.
1. A ____________________ is a large body of fresh water.
2. An ____________________ is a large body of salt water.
3. A ____________________ is an area with many trees.
4. A ____________________ is a very dry area.
5. A ____________________ is a small river.

Series V. Look at the map and listen to each sentence. Circle T if the statement is true. Circle F if the statement is false. Then write the correct name for each number on the map.

1. T   F        5. T   F
2. T   F        6. T   F
3. T   F        7. T   F
4. T   F        8. T   F


Conclusion

All of this information gave us the knowledge we need to plan our classes better. We will also be able to create evaluations that give our students the results they need to know how their skills are improving during each class, unit, or lesson.

We can apply the different types of evaluation from the beginning of the class, during it, and at the end as well, according to the level of skills our students have.

