Learning Assessment


PEDAGOGY | 6th SEMESTER

Learning Assessment Portfolio: learning assessment techniques | LUIS MEDINA GUAL

Nancy Martha Vanegas Peña | ID: 0187762 | May 23, 2018


TABLE OF CONTENTS

- Introduction: a brief description of the subject and the structure of the evidence
- What Is Evaluation
- Evaluation Roots
- What Is Quality
- Validity and Reliability
- Assessment for Learning
- Rubrics
- Final Thoughts: conclusion
- Glossary: definitions
- Annexes: homework
- References


Introduction
This work presents the final evidence from the whole course on learning assessment, in order to understand the topics of evaluation, among others. First, it starts with what evaluation is. Secondly, it shows the evaluation roots (use, methods, and valuing) and the different authors with their different theories. Then it presents the definition of quality and its types of evaluation as a point of assessment. Next, it mentions the importance of validity and reliability. Finally, it shows how I can assess my own students with rubrics. And the most important part of this portfolio is my own reflections and final thoughts.


WHAT IS EVALUATION?
First, we need to know what we are going to assess.

- Observation and grading: measure.
- Assessment and feedback: judgment.
- Evaluation: improve; decision.


Second, we need to know what we are going to use to assess:
- Specifications of reference criteria or standards.
- What you are expecting.
- Create evaluation criteria.


Then, once the uses of evaluation are clear, we can decide on the procedure.


The evolution of the idea of evaluation


Accountability and control: evaluation was born because people wanted to see whether their expectations were met.

Social inquiry: it was also born with the question: why do people in social groups act as they do?

Reflection: In this theme, I clarified the differences between assessment, evaluation, and grading, depending on what you want to assess.
- Assessment is when you describe what happened: you make a judgment of the student's performance and give feedback.
- Evaluation is when you give feedback and a number, but you also make a decision and give an opportunity to redo the work.
- Grading is when you attribute numbers to the performance.

- The first evaluators used the same methods as researchers (psychology): it started with quantitative methods, then qualitative methods.


EVALUATION ROOTS: USE, METHODS, AND VALUING

METHODS
1. Ralph Tyler: evaluation as hypothesis testing; norm-referenced evaluation (how a group behaves or performs).
2. Campbell:
   - Pure experiment: you select all of your sample by chance. In education you cannot do a pure experiment as in the natural sciences.
   - Quasi-experiments: you don't select your sample randomly; you compare two groups (experimental vs. control) and you can manipulate variables.
   - Pre-experiment: you have neither the control nor the experimental group, but you can manipulate variables.
3. Suchman: utility for administrators.
4. Cook: context is important.
5. Rossi and Chen: comprehensive evaluation, asking why and how it works (the "black box"); you evaluate the process with qualitative approaches.
6. Cronbach: "enlightenment" (examine the whole context and then choose the method you are going to use).

THE EVOLUTION FROM QUANTITATIVE (TYLER) TO QUALITATIVE (CRONBACH)


EVALUATION ROOTS: VALUING AND USE

VALUING
From judging in good and bad to describing on a scale of greys; from evaluator as god to evaluator as coach.
1. Scriven: formative and summative assessment. "There is a science of valuing and that is evaluation." "Modus operandi" (MO) = characteristic causal chain: why did the student learn or not learn? Black or white, good or bad.
2. Eisner: arts and pedagogy. Connoisseurship and criticism. Things that matter cannot be measured. Evaluators should not only judge outcomes; they should also be able to understand and describe them.
3. Owens and Wolf: adversary evaluation, with two opposing sides. Students judge whether they learned or not.
4. Stake: the evaluator decides; case studies.
5. MacDonald: voices and values of stakeholders, not only students.
6. House: evaluation is not only good or bad (a scale of greys). Ethical fallacies: the evaluator should tend toward right, just, and fair evaluations.

USE
1. Stufflebeam (plus Guba): evaluation should also be evaluated (meta-evaluation). CIPP model (Context, Input, Process, Product). Professional standards: utility, feasibility, propriety, accuracy.
2. Wholey: emphasis on managers and policymakers; evaluation should be understandable to decision makers.
3. Patton: the evaluation also needs to be understandable to all the stakeholders. The different users should be identified, so that evaluations are made in such a way that all of them can interpret them.
4. Cousins: the people who are going to be evaluated should have a say in it; "buy-in" from program personnel.
5. Preskill: transformational learning, identifying the information needed to meet the goal; clinical approach.
6. Owen: evaluators (descriptive and judgment) are consultants (prescriptive and recommendations). The people who are being evaluated should decide how they should be evaluated. Blurred distinction: the one who is evaluated becomes the evaluator; the student decides how he or she can be evaluated.
7. Fetterman: empowerment evaluation, with the evaluator as a coach for designing and conducting the evaluation. As evaluators, we should be dispensable: coaches for designing and conducting evaluation so that, in the end, the students evaluate themselves. The report should be done by the students.

Reflection: What I learned about the evaluation roots is their evolution and how they came to take the human being into account in a personal way, not only with numbers, in order to improve the capabilities of the student. From this I assume that the student has an active role in his or her own evaluation, which is more constructive.


WHAT IS QUALITY
In order to understand quality, we need to use a model, a system (a representation of reality; homeostasis).
1. Natural systems have "control mechanisms".
2. Artificial systems do not have controls of this kind; they need "evaluation".

1. Context: where the system is, what kind of people are involved, etc. (ethnic, socioeconomic).
2. Input: time, ideas, previous knowledge, students, teachers, materials, knowledge, etc.
3. Process: the way we use the inputs; classes, evaluation, human resources, conferences, email, documents, etc.
4. Products: money, satisfaction, learning, and social mobility.
5. Objective: the curriculum; why we are educating, and under what conditions.
Quality is defined as the achievement of coherence among the different parts of the system.


TYPES OF QUALITY
- Equity: all the students achieve the minimum of the objectives and products.
- Efficacy: coherence between product and objectives.
- Efficiency: improving the use of inputs and processes relative to the objectives.
- Pertinence and relevance (product with context): it depends on the people we are teaching; pertinence looks at the closer context, relevance at the broader one.
- Functionality: achievement of efficacy, efficiency, pertinence, relevance, and equity.

REFLECTION
What I learned in this class was to create a curriculum with the types of quality, the didactic transposition, and the didactic moments. For example:
- Objective → efficacy → planning, execution, evaluation.
- Competence → relevance and pertinence → planning, execution, evaluation.
- Standards-based → equity (context) → planning, execution, evaluation.
- Purpose → efficiency → evaluation, planning, execution.


CLASSIC ASSESSMENT TECHNIQUES
What is an instrument? • It is a methodological tool that enables us to measure or obtain data.

• What is measuring? A procedure in which we assign a value to something using a scale.


1. WHAT CAN WE MEASURE IN PSYCHOLOGY AND EDUCATION?
- Latent variables: they manifest in behavior (they are not directly measurable); we never capture 100% of the learning.
- We don't measure the IQ; we try to measure the IQ.
- We operationalize variables: we say what we understand by the variable and how we are going to represent what we want to measure.

1. Define the variable = What?
2. Conceptual definition = Objective
3. Operational definition (items) = Measure
Example: the study of love, the study of friendship. To measure concentration (what would you measure: selective attention; how would you represent it): the d2 test, a test that assesses the capacity to focus on a certain task within a time limit (how you would define it).

Kinds of measures
- Nominal measures: labels (color of socks).
- Ordinal measures: Likert scale.
- Scalar measures: test results.


2. HOW CAN I KNOW IF MY INSTRUMENT IS "GOOD"?

- Validity: whether the instrument really measures what it claims to measure. Whenever you try to assess something and represent it with another measure, validity is the coherence between the operational and the conceptual definition of the variable. Example: learning, measured with memorizing tests.
  1. Content: is the sample representative of what I want to assess?
  2. Criterion: if I design an IQ test, will the results on my test be related to the WISC test?
  3. Construct: does what I am trying to assess have a relation with reality? Is a test of aggressiveness related to the behavior of the subject?

- Reliability: to be consistent.

METHODS TO ASSESS RELIABILITY
1. Split-half (divide the test in two and correlate the halves).
2. Test-retest correlation (apply again and compare).
3. Cronbach's alpha (average correlation of each item with the total score).
4. Difficulty of the item vs. complexity of the item: a more complex test is going to be harder.

- Classical test theory: how are all the items related to each other? Reliability; items should correlate positively with the average.
- Item response theory: how is each item behaving? Analyze each item individually.

RELEVANT FOR ME
What I learned in this class was the concepts of validity and reliability, through a game with coins. I also liked when we started to play with correlation; I think this is a way to make learning significant. I learned that a good item is answered correctly by the people who know, while with a bad item a person who gives the correct answer does not actually know, and vice versa.
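The reliability indices above can be computed by hand. As a minimal sketch (the score matrix below is invented for illustration, not data from the course), Cronbach's alpha for a students-by-items matrix of item scores:

```python
# Sketch: Cronbach's alpha for a small item-score matrix.
# Rows = students, columns = items; the data are hypothetical.

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(scores[0])  # number of items

    def variance(xs):
        # Sample variance (divides by n - 1).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    total_var = variance([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(round(cronbach_alpha(scores), 3))  # prints 0.392
```

With real data one would normally use a statistics package, but the formula itself only needs the item variances and the variance of the total scores, which makes the "correlation of each item with the total" idea concrete.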


Assessment for learning vs. classical assessments
1. Assessment purposes: certification vs. feedback and growth.
2. Number of subjects: classroom-based assessments have fewer than 30.
3. Instrument types: one standardized instrument vs. several non-standardized instruments.
4. Instrument analysis: quantitative vs. quantitative and qualitative.

Contemporary alternatives
• Books:
- Hacking Assessment: 10 Ways to Go Gradeless in a Traditional Grades School.
- Fair Isn't Always Equal.
- A Repair Kit for Grading.
- Assessment 3.0.

Grading effort, according to the literature:
• You should not assess effort.
• Giving extra work is not the answer.
• The object of assessment is learning, not effort.
The SE2R model (Summarize, Explain, Redirect, Resubmit).

• We don’t want grades. We want better grades.

What did I learn? In this class I learned how I can assess my own students according to different alternatives, whether classical or contemporary. I am sure that whatever theory I choose, I should take into account that they are different and unique persons, and give feedback in order to develop their abilities.



SINGLE-POINT RUBRICS

You need to specify what the person did and give a justification: what he or she did to stand out, or what the opportunity area is.

Benefits for student and teacher
• It specifies, for every single criterion, where the student stood out.
• It gives space to reflect on both strengths and weaknesses in student work.
• It doesn't place boundaries on student performance.
• It works against students' tendency to rank themselves and to compare themselves to or compete with one another.
• It helps take student attention off the grade.
• It creates more flexibility without sacrificing clarity.
• It's simple!

HOLISTIC RUBRICS: THE MARZANO SCALE

1. Level 0: nothing achieved.
2. Level 1: achieved with help from someone else.
3. Level 2:
4. Level 3: satisfied the standard.
5. Level 4: exceeded expectations.

If we give more performance levels, we give the student more information about his or her performance. If we use only 3 or 4 performance levels, we can be more confident in the scoring, but we give the student less information.


ANALYTIC RUBRICS
• The criteria describe the performance level.
• In which level does the student have the most elements?
• Easier to grade.

WHAT CAN WE ASSESS WITH RUBRICS?

• Knowledge
• Reasoning
• Performance skill
• Product
*To assess competence: knowledge and performance skill.

FEATURES OF A GOOD-QUALITY RUBRIC
- Understandable.
- Based on standards.
- Illustrated with samples of student work.
- Concise.
- Stated in a way students can understand.
- Easy to use.
- Worded in a positive manner.
- Matches the assignment/tasks.
- Defines various levels of performance.
- Includes the same features across the various levels of performance.

HOW TO DEVELOP A RUBRIC
1. Choose a learning target worth the time.
2. Search out existing relevant scoring guides.
3. Gather samples of student work.
4. Sort student work.
5. Group like indicators together.
6. Identify student work that illustrates each level on each criterion.
7. Test and revise.
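An analytic rubric with Marzano-style levels can be sketched as a small data structure plus a scoring helper. This is only an illustration: the criterion names are hypothetical, and the level-2 wording is an assumption, since the scale notes leave that level blank.

```python
# Minimal sketch of an analytic rubric on a Marzano-style 0-4 scale.
# Criterion names are hypothetical examples; level 2's description is assumed.

LEVELS = {
    0: "nothing achieved",
    1: "achieved with help from someone else",
    2: "partially satisfied the standard",  # assumed wording
    3: "satisfied the standard",
    4: "exceeded expectations",
}

def score_report(scores):
    """scores: dict mapping criterion -> level (0-4).
    Returns per-criterion feedback plus the mean level, so the number
    stays tied to a description of the performance."""
    lines = [f"{c}: level {lvl} ({LEVELS[lvl]})" for c, lvl in scores.items()]
    mean = sum(scores.values()) / len(scores)
    lines.append(f"overall: {mean:.1f} / 4")
    return "\n".join(lines)

print(score_report({"knowledge": 3, "reasoning": 2, "product": 4}))
```

Keeping the descriptors next to the numbers reflects the point made above: more levels give the student more information, but every level must still describe a recognizable performance.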

"Avoid products or performances that don't relate to the content of what is being assessed, even though they may seem like good activities on their own. Sometimes students get so caught up in the product that they lose sight of what they are actually intending to show with the product." (J. McTighe)

Conclusion: I learned from this topic how to create a rubric, and that it depends on what you are going to assess in order to distinguish whether to use the holistic rubric or the analytic one, and why. I liked creating a rubric and assessing it with the rubric of rubrics.


Final Thoughts
Throughout the semester I learned many things that I had no idea about regarding evaluation. Now I know its importance throughout the process of teaching and learning. The basis of everything is to know what evaluation is, what it is for, when we can use it, and what its process is, but above all to know the theory of the evaluation roots, because it breaks down the history of evaluation, its methods, and its use. This puts us into context so that later we can identify what we want to evaluate and how to do it; above all, it gives us a cultural background in evaluation.

Subsequently, we have to know what quality and the types of quality are, in order to offer students a quality education, and therefore quality assessment, so that they develop optimally. In the same way, we identified the classic techniques of evaluation, so that we know whether our instrument is good or bad according to its validity and reliability. On the other hand, we learned how to design a general culture test; we applied it and analyzed it to know whether each item had validity and reliability. We also saw the contemporary alternatives and classical assessment in order to assess learning. Likewise, we saw how to create, design, and even assess a rubric.

I really liked the course, because we learned the theories and we also practiced. I liked when we watched the movie The Emperor's Club in order to relate it to the evaluation roots; I also liked the exercise with the coins for understanding reliability and validity. I liked going to the INEE, because we saw many things about how learning is assessed in our country. I also liked making the general culture test and all its implications, and working in pairs most of the time, because I think it is easier to understand if we share our knowledge. I even liked making this portfolio, because it is a summarized recount of everything we saw. On the other hand, making a rubric was a challenge for me.

What I think was missing is knowing how PISA is structured, and also visiting CENEVAL in order to see how they design their tests. Thank you for everything!


Glossary

Learning assessment
1. Assessment: the systematic collection, review, and use of information about educational programs, undertaken for the purpose of improving student learning and development.
2. Accreditation: the designation that an institution functions appropriately in higher education with respect to its purpose and resources.
3. Criteria: the specific observable characteristics of a behavior or skill.
4. Feedback: written or oral comments made to students on their achievements to improve their performance.
5. Rubric: a set of scoring guidelines that can be used to evaluate students' work.



6. Reliability: assessment methods which are fair, consistent, and as objective as possible can be considered reliable.
7. Self-assessment: may involve the student in evaluating the strengths and weaknesses of what he or she has achieved, or in attributing marks to the work.
8. Summative assessment: produces a measure which sums up someone's achievement and has no other real use except as a description of what has been achieved.
9. Test: usually means a short exam.
10. Validity: assessment methods which genuinely assess what you believe to be the most important things your students are learning can be considered valid.



11. Portfolio: a representative collection of a student's work, including some evidence that the student has evaluated the quality of his or her own work.
12. Formative assessment: undertaken to assess achievement and identify shortcomings; its purpose is to help learners improve their performance.
13. Effectiveness: the degree to which something is successful in producing a desired result.
14. Certification: an official document attesting to a status or level of achievement.
15. Instrument: a methodological tool that enables us to measure or obtain data.



16. Holistic scoring: a type of grading in which an assignment is given an overall score; possible scores are described in a rating scale.
17. Outcomes: statements that describe qualitative or quantitative measurable expectations of what students should be able to think, know, or do when they have completed a given educational program.
18. Standard: a knowledge, capacity, or ability that everyone within the context in which the evaluation is being carried out must be able to achieve.
19. Grading: the procedure of assigning numbers to the performance of a student or the phenomenon being analyzed.
20. Evaluation: when the grading and assessment processes are carried out and, from the results, an action is taken.



21. Equity: making sure that the standards are elements that everyone within the context in which evaluation is carried out is able to achieve.
22. Quality: the achievement of coherence among the different parts of a system.
23. Judgment: the ability to form an opinion.
24. Useful: having practical value.
25. Measure: assign a value to something using a scale.
26. Pertinence: achievement in the closest context.
27. Relevance: achievement in the broadest context.
28. System: a model; a representation of reality.
29. Polysemic term: a term with many definitions.
30. Efficacy: achievement of the goals.
31. Efficiency: improving the use of inputs and processes.


Annexes
How would you define "learning"?
Categories:
- Process: it takes repetitive actions and activities to really learn something.
- Acquisition: the capability of apprehending information, skills, and attitudes to use them daily.
- Use in life: it is not just about memorizing something, but about being able to use it in our lives.
Similarities: most of the definitions mention that to learn something, we have to practice and apply the skills and knowledge we have acquired.
Differences: some definitions mention that learning requires intentionality, and that when we learn something we have the capability to teach what we know.
Own definition: learning is the acquisition of knowledge, abilities, and attitudes through experience or study, in order to apply them in our daily life in any context, to become a better person and to help other people.


Annexes
How would you define "assessment"?
Categories:
- Process: it takes time and a lot of work; it is necessary to consider all of the elements involved in the teaching-learning process to make a complete and correct assessment.
- Tool: it is used to evaluate, for example, an exam.
- Measure/evaluate: it has the purpose of giving a result, based on the measurement of knowledge.
- Methodology/technique: it has to follow a series of steps or stages to get results.
Similarities: most of the definitions consider assessment a way to "measure" learning, and also a quantitative way to evaluate it.
Differences: some of the definitions talk about assessment as a process, while others mention it is a tool or a technique that must be used in education.
Own definition: assessment is a very important process in education; it evaluates how much a student has learned, but also what is working in the teaching-learning process and what should be changed or improved.

Synthesis of the expectations
To sum up, the expectations we have as a group for this subject are: to assess the teacher, the learner, and the learning; to know how to create, modify, and apply learning assessment; to know how different countries assess in their educational systems, and how it is done here in Mexico; theories and techniques for assessment; the best options to evaluate; how to know if an evaluation is effective; and also to improve our English.


Annexes
Techniques in order to evaluate learning:
1. Rubrics
2. Exams
3. Written feedback
4. Oral feedback
5. Quizzes
6. Observation
7. Combination of reviews
8. Case analyses
9. Portfolio
10. Checklist

From the most to the least useful:
1. Oral feedback
2. Written feedback
3. Combination of reviews
4. Case analyses
5. Observation
6. Portfolio
7. Checklist
8. Rubrics
9. Exams
10. Quizzes

• Search for definitions of evaluation and order them from best to worst.
1. Evaluation is a broader term than measurement. It is more comprehensive and inclusive than the term measurement. It goes beyond measurement, which simply indicates the numerical value; it gives a value judgment to the numerical value. It includes both tangible and intangible qualities. (Thankur, n.d.)
2. Evaluation is an essential tool for making development programmes successful. (Toole, D., 2008)
3. Evaluation is the collection, analysis, and interpretation of information about any aspect of a programme of education or training, as part of a recognized process of judging its effectiveness, its efficiency, and any other outcomes it may have. (Ellington & Percival, 1988)
4. "To help both teachers and pupils to clarify their purposes and to see more concretely the directions in which they are moving." (Formative evaluation; see Scriven, 1967)


Annexes
1. Read carefully and reflect in two or three paragraphs on:
a. Why did the authors propose an "evaluation theory tree"? It is a deliberate simplification of very complex relationships: the authors' influences, the evaluation field, and the perspectives, organized so that we can know the roots of evaluation theory.
b. Which are the "roots" of the evaluation theory tree? The roots of the evaluation theory tree are: methods, valuing, and use.
2. Create a graphic organizer (any of your choice) and organize the different ideas presented in the reading (try to uncover hidden relations/patterns).
3. Choose one author you would like to review in depth and write one paragraph justifying why you chose him/her. I chose Peter Rossi because he moved beyond the randomized experimental process to broaden the vision of evaluation; that is, he based his work on a theory-driven evaluation that focuses on foundational thinking and emphasizes whether the student learns or not. I think it is a more personal method.





References
- Barnes, M. (2015). Assessment 3.0. United Kingdom: SAGE Publications.
- Jamal, H., & Shannah, A. (2011). The role of learning management systems in educational environments: An exploratory case study. Linnaeus University. Retrieved from http://lnu.divaportal.org/smash/get/diva2:435519/FULLTEXT01.pdf
- Penn State University. (2017). Differences between testing, assessment, and evaluation. Retrieved from http://tutorials.istudy.psu.edu/testing/testing2.html
- Twersky, F., & Lindblom, K. (2012). Evaluation principles and practices: An integral working paper. The William and Flora Hewlett Foundation. Retrieved from https://www.hewlett.org/wp-content/uploads/2016/08/EvaluationPrinciples-FINAL.pdf



WHAT IS IMPORTANT IS TO KEEP LEARNING, TO ENJOY A CHALLENGE, AND TO TOLERATE AMBIGUITY. IN THE END, THERE ARE NO CERTAIN ANSWERS.

CREATED BY: NANCY MARTHA VANEGAS PEÑA

