Evaluating and Assessing Tools in the Digital Swamp


Evaluating and Assessing Tools in the Digital Swamp by Michael Fullan and Katelyn Donnelly provides K–12 decision makers with a powerful tool for navigating the expanding digital swamp in order to accomplish deep learning outcomes. While the use of digital tools is widespread, many students lack sufficient skills to succeed in the rising learning revolution. As a result, educators need a tool like the authors’ Digital Swamp Index (DSI) to wade through the swamp of digital innovations and identify those that can cultivate achievement in education.

Educators will:
• Ascertain the main components and subcomponents of the DSI
• Consider what it means to get a green, amber green, amber red, or red rating on the DSI’s scale and read about examples for each of these color ratings across the spectrum
• Distinguish the main forces of the stratosphere agenda
• Discover six strategies for staying alive in the swamp
• Gain guidance on using the DSI as a tool in conducting a workshop

Visit go.solution-tree.com/technology to download the reproducibles in this book.

Solutions Series: Solutions for Digital Learner–Centered Classrooms offers K–12 educators easy-to-implement recommendations on digital classrooms. In a short, reader-friendly format, these how-to guides equip practitioners with the digital tools they need to engage students and transport their district, school, or classroom into the 21st century.

Michael Fullan | Katelyn Donnelly

solution-tree.com


Copyright © 2015 by Solution Tree Press

Materials appearing here are copyrighted. With one exception, all rights are reserved. Readers may reproduce only those pages marked “Reproducible.” Otherwise, no part of this book may be reproduced or transmitted in any form or by any means (electronic, photocopying, recording, or otherwise) without prior written permission of the publisher.

555 North Morton Street
Bloomington, IN 47404
800.733.6786 (toll free) / 812.336.7700
FAX: 812.336.7790
email: info@solution-tree.com
solution-tree.com

Visit go.solution-tree.com/technology to download the reproducibles in this book.

Printed in the United States of America
19 18 17 16 15    1 2 3 4 5

Library of Congress Cataloging-in-Publication Data

Fullan, Michael.
Evaluating and assessing tools in the digital swamp / by Michael Fullan and Katelyn Donnelly.
pages cm
Includes bibliographical references.
ISBN 978-1-936763-66-5 (perfect bound)
1. Education--Computer network resources. 2. Teaching--Computer network resources. 3. Educational technology. I. Donnelly, Katelyn. II. Title.
LB1044.87.F85 2015
371.33--dc23
2015009167

Solution Tree
Jeffrey C. Jones, CEO
Edmund M. Ackerman, President

Solution Tree Press
President: Douglas M. Rife
Associate Acquisitions Editor: Kari Gillesse
Editorial Director: Lesley Bolton
Managing Production Editor: Caroline Weiss
Copy Editor: Amy Rubenstein
Text and Cover Designer: Rian Anderson
Compositor: Rachel Smith


Table of Contents

About the Authors . . . vii

Chapter 1: Alive in the Swamp . . . 1
  The Push and Pull of Change . . . 2
  The Stratosphere Agenda . . . 3
  Hidden Dangers . . . 6
  What’s the Point of All This? . . . 16

Chapter 2: The Digital Swamp Index . . . 19
  Pedagogy . . . 21
    Clarity and Quality of Intended Outcome . . . 22
    Quality of Pedagogy . . . 24
    Quality of Assessment Platform . . . 28
  System Change . . . 31
    Implementation Support . . . 31
    Value for Money . . . 35
    Whole System Change Potential . . . 37
  Technology . . . 40
    Quality of User Experience and Model Design . . . 40
    Ease of Adaptation . . . 42
    Comprehensiveness and Integration . . . 43

Chapter 3: Using the Index . . . 47
  The Index . . . 47
  Application . . . 48
  Conducting a Workshop . . . 51

Chapter 4: The Learning Revolution in Education . . . 57

References . . . 65


About the Authors

Michael Fullan, PhD, is professor emeritus at the Ontario Institute for Studies in Education, University of Toronto. He has served as the special adviser to Dalton McGuinty, the premier of Ontario, and is now one of the senior advisers to Premier Kathleen Wynne and the minister of education in Ontario. He works with systems around the world to accomplish whole-system reform in education in provinces, states, and countries. He has received five honorary doctoral degrees. His award-winning books have been published in many languages. His most recent publications include Stratosphere: Integrating Technology, Pedagogy, and Change Knowledge; Professional Capital: Transforming Teaching in Every School (with Andy Hargreaves); Motion Leadership in Action; The Principal: Three Keys to Maximizing Impact; Coherence: The Right Drivers in Action for Schools, Districts, and Systems (with Joanne Quinn); and Freedom to Change. Professional Capital was awarded the 2015 Grawemeyer prize for best education book of the year. To learn more about Michael’s work, visit his website at www.michaelfullan.ca and follow him on Twitter @michaelfullan1.


Katelyn Donnelly is a managing director at Pearson, where she leads the Pearson Affordable Learning Fund, an emerging market venture fund that invests in early stage education companies. Katelyn is also an active adviser on Pearson’s global strategy, research, and innovation agenda as well as a consultant to governments on system transformation and delivery. She is a frequent speaker on innovation in education and was recently named to the Forbes 30 Under 30 list. Katelyn is the co-author of two recent education and innovation publications: Oceans of Innovation and An Avalanche Is Coming. She serves as a nonexecutive director and strategic adviser for several start-up companies across Europe, Asia, and Africa. Previously Katelyn was a consultant at McKinsey and Company. You can follow Katelyn on Twitter @krdonnelly. To book Michael Fullan or Katelyn Donnelly for professional development, contact pd@solution-tree.com.


Chapter 2

The Digital Swamp Index

For all the reasons discussed in chapter 1, there is a great need for some guidance in how to navigate the swamp. This guidance must help sort out, improve, and integrate the three big forces of Stratosphere: technology, pedagogy, and system change. To this end we have developed an index—the Digital Swamp Index (DSI), or simply the Index (see figure 2.1, page 20). We have made the case that there is a rapidly growing need for such an index as systems waste millions and millions of dollars, time, and energy on dead ends. The good news is that there is a growing and urgent desire to improve learning and deepen its outcomes. Accompanying this is a greater willingness to see digital as an accelerator. By and large there is no longer resistance to digital (if anything, there is too much willingness to jump into the deep end), but more a series of questions about how to approach the matter. People are desperate for help on what to do and how to approach the myriad of possibilities. In this chapter, we describe the Index itself. Chapter 3 focuses on its use.

We developed the Index in response to the changes in digital technology and a renewed focus from entrepreneurs and educators on improving learning outcomes. The first objective of the Index is to enable a nonexpert to systematically evaluate new companies, products, and school models and determine their strengths and weaknesses in the ability to improve outcomes for learners. The second objective is to provide a tool for educational system leaders to implement technology into classrooms and create effective plans and combinations of innovations to achieve results.


Figure 2.1: Digital Swamp Index.

The Index is a matrix with four columns (Criteria Area, Innovation Index, Rating, and Rationale Summary) in which each of nine subcomponents receives a color rating and a brief rationale.

Criteria Area: Pedagogy
• Clarity and quality of intended outcome
• Quality of pedagogy and relationship between teacher and learner
• Quality of assessment platform and functioning

Criteria Area: System Change
• Implementation support
• Value for money
• Whole system change potential

Criteria Area: Technology
• Quality of user experience/model design
• Ease of adaptation
• Comprehensiveness and integration

Rating scale:
• GREEN: Good—Likely to succeed and produce transformative outcomes
• AMBER GREEN: Mixed—Some aspects are solid; a few aspects are lacking full potential
• AMBER RED: Problematic—Requires substantial attention; some portions are gaps and need improvement
• RED: Off track—Unlikely to succeed

Visit go.solution-tree.com/technology to download a color version of this figure.
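For readers who want to record DSI appraisals across several tools (for example, when comparing candidates before a purchase decision), the following minimal Python sketch mirrors the structure of figure 2.1. It is illustrative only; the class, field, and function names are our own assumptions, not terminology from the book, and a spreadsheet would serve just as well.

from dataclasses import dataclass, field
from enum import Enum


class Rating(Enum):
    # The four-level color scale from figure 2.1.
    GREEN = "Good: likely to succeed and produce transformative outcomes"
    AMBER_GREEN = "Mixed: some aspects are solid; a few lack full potential"
    AMBER_RED = "Problematic: requires substantial attention; some gaps"
    RED = "Off track: unlikely to succeed"


# The nine Innovation Index subcomponents, grouped by criteria area.
DSI_CRITERIA = {
    "Pedagogy": [
        "Clarity and quality of intended outcome",
        "Quality of pedagogy and relationship between teacher and learner",
        "Quality of assessment platform and functioning",
    ],
    "System Change": [
        "Implementation support",
        "Value for money",
        "Whole system change potential",
    ],
    "Technology": [
        "Quality of user experience/model design",
        "Ease of adaptation",
        "Comprehensiveness and integration",
    ],
}


@dataclass
class Appraisal:
    """One tool's DSI appraisal: a rating and a short rationale per subcomponent."""
    tool_name: str
    ratings: dict = field(default_factory=dict)      # subcomponent -> Rating
    rationales: dict = field(default_factory=dict)   # subcomponent -> str

    def add(self, subcomponent: str, rating: Rating, rationale: str = "") -> None:
        # Guard against typos: only the nine named subcomponents are valid.
        valid = [s for items in DSI_CRITERIA.values() for s in items]
        if subcomponent not in valid:
            raise ValueError(f"Unknown DSI subcomponent: {subcomponent}")
        self.ratings[subcomponent] = rating
        self.rationales[subcomponent] = rationale

The point of a structure like this is simply that every tool is judged against the same nine subcomponents and the same four-level scale, so appraisals remain comparable across products and over time.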



The Index is designed to be used with innovations that focus on learners between the ages of four and twenty and have a school- or system-based application. We have tested the Index on many innovations spanning technology-enabled learning tools such as Khan Academy, Prodigy, and LearnZillion and school-based models such as School of One and Rocketship Education.

We believe that student-centered learning requires a blended combination of critical drivers. Pedagogy, change knowledge related to system change, and technology will need to interact. Technology can improve pedagogy by supporting effective feedback for students and teachers. Technology can support system change by helping educators monitor achievement scores and enabling communication between schools and school clusters. The training of system users will need to focus not just on the technical use of the tool but also on how the digital innovation can support collaboration and effective interaction and drive learning gains. Each of the three components (pedagogy, system change, and technology) should be leveraged, blended, and interconnected in a way that produces results for learners and reverberates throughout the system.

Pedagogy

We take up pedagogy first because it should be the driver of any change effort. Without defined outcomes and a theory of learning, learning and achievement cannot take place. Pedagogy is the interaction that the technology and the teacher have with the learner, and thus it is primary.



Clarity and Quality of Intended Outcome

The first subcomponent area under pedagogy is clarity and quality of intended outcome. The outcomes of the innovation should link to the goals of the learner, the school, or the system. Knowing what the learning goals are and what the learner needs to achieve is critical to measuring success and ensuring the learner is on track to be successful.

In this component, we ask several questions:
• How clearly are the learning outcomes of the innovation defined?
• Are the outcomes quantified and time-bound?
• Are the learning outcomes explicit and defined for the school, the student, the parents, and the school system?
• Are the students contributing to framing questions and defining their own goals?
• Is the clarity of the outcome shared by students, teachers, parents, school, and school system?
• Are the learning outcomes comprehensive of the body of content or the skills that the learner is expected to know or be able to perform?

This is an important category because to make transformative system improvements we need to know, with precision and clarity, what the learning goals are and how they are being measured. Digital content and tools that do not align with what is to be learned will likely not translate into increased attainment. We put a high value on precision and specificity of pedagogy, not to be prescriptive, but to be clear. Too many digital products are vague about pedagogy, leaving the user without support, or perhaps engaged in their use but without learning anything of value.

To achieve a green rating on the Index, every activity, lesson, and course of study must have clear, quantifiable outcomes that are recorded and presented to the teacher and the learner. These learning outcomes and goals should be shared with students, teachers, and parents, as well as communicated within the classroom, school, and district. The students should feel ownership over the goals and have an intrinsic understanding of why the goals are important to them personally. The innovation should be able to demonstrate strong benefits for educators and students in the short and long term.

In the very best situations, trajectories are created for key outcomes so that progress toward goals can be tracked and measured in real time. This enables the learner and the instructor to always know where the learner is in terms of his or her learning goals. Where possible, the innovation should be able to quantify the impact it is having on the student and the class achieving key national indicators of student attainment. These achievements should be benchmarked internationally when possible so that, for example, a second-grade student in the United States can compare his or her performance to that of a similar student in Singapore. In systems such as Ontario’s, we have seen examples of the student and teacher together being clear about learning goals and corresponding criteria for success, with tremendous results (Fullan & Rincón-Gallardo, in press).

In red-rated innovations, outcomes for exercises and modules have not been clearly identified. Often the learning goals are general or implied within the product. When outcomes are identified, they lack specificity and clarity, and they are not linked to standards. It is problematic when there is insufficient linkage to key leading indicators and a lack of progression trajectories. Confusion abounds in learning environments where teachers and administrators are unaware of the impact of the innovation and the benefits to students. Often, in these environments, teachers will stop using the product, as they fail to see its benefits. We have found that when there is no tracking or monitoring system in place to ensure that students are learning and adaptations are being made, it is unlikely that the digital innovation will succeed in producing outcomes.


An example of a red-rated innovation might be a digital reading program where learners are asked to read a passage or book online. However, the program lacks clear learning goals. Parents, teachers, and learners are unsure of the impact of the digital reading activity on literacy attainment. Progress is measured at the end of the term on an exam that is not linked back to the digital innovation, so the value that the innovation adds is unclear.

Quality of Pedagogy

The second subcomponent in pedagogy is the quality of the pedagogy itself, defined as the underlying theory and practice of how best to deliver learning. This element focuses on what we would call the learning partnership between and among teachers and students. While this new pedagogy, particularly in a digital context, is still being defined, we think that it is important that the digital innovations contain pedagogy that reflects the most advanced evidence-based learning approaches and techniques to date.

We recognize that there is no one single “right” pedagogy. There are theories of how people learn that are context-specific and adaptable to goal setting. Our emphasis here is that the innovation must have an explicit model of how learning happens. This model should be embedded in the technology, model design, and training of teachers. It is important to note that many teachers struggle to define and use an evidence-based pedagogy regardless of whether it sits in a digital innovation, so pedagogy is a difficult challenge. In any case, the model should include a view that the teacher is a change agent and the student is an active partner in the learning setting. Regardless of the exact pedagogical theory, there are certain elements of what is being taught that should not change: problems and questions are placed in a real-world context, the emphasis is on intellectual risk taking and trial-and-error problem solving, and there is a healthy partnership between the student and teacher that is built on inquiry and data.



The Index poses the following questions for guidance under the pedagogy subcomponent:
• How refined is the pedagogical underpinning?
• Does the pedagogy reflect the latest global research, including the emphasis on inquiry, constructivism, and real-world examples?
• Are students at least in part encouraged to learn through inquiry?
• How is the learner engaged?
• How is the teacher’s role defined? Is the role reflective of the “teacher as activator” relationship?
• Is there a student-teacher partnership?
• Is the pedagogy consistent across the innovation and the system?
• Is there a shared understanding among all the teachers involved?
• Does the model include an emphasis on the necessary psychological and intellectual processes?
• Is there a mechanism in place to ensure that the pedagogy is updated?
• Can teachers and students provide defensible evidence of positive links to learning?

Innovations that aspire to achieve a green rating should take note of Hattie’s (2012) meta-analysis work on teaching—that is, we should be able to see signs that the teacher’s role is defined as an activator of learning and his or her job is centered on servicing and pushing deeper the thoughts from the learner. Teachers should be viewed as being in partnership with students. Teachers should exhibit a constant behavior of seeking evidence, openness to alternatives, and adaptability and flexibility when new evidence is raised.



Further, in green-rated innovations, students are engaged through inquiry, and learning is personalized with the goal of unlocking the passion of the learner. Students feel psychologically supported by teachers who are trained to focus on the personal experience of the individual student and help uncover values and motivations.

There are many new developments, with still more work needed, in defining the role of the student in the new learning partnerships. We think of major new developments related to “my learning” (what my goals are and how I can best define and pursue them), “my belonging” (how connected I am to my school and to others—other students, teachers—with whom I might be learning), and “my aspirations” (where I am headed and where I want to head).

Red-rated innovations may employ a pedagogy that relies on the traditional rote method of learning embodied frequently in “tell and test” or “experience and evoke.” In other cases of poor pedagogy, we have found that a theory of learning or consistent standard of teaching is absent, difficult to detect, and sometimes totally incoherent. In simple education technologies, the pedagogy is assumed as intuition, and instead of thinking through pedagogy, designers are consumed by simple engagement or consumption rather than a focus on learning. Such a situation may indeed consume the student but with little yield. In cases where pedagogy is weak, it is unclear how technology can assist and accelerate learning. There is a tension between the student and teacher based on misaligned incentives. Learners are not engaged in the studies or activities at hand and get frustrated with the process of learning. Teachers, or the implementers, are teaching using a method that works for them and is without an evidence base. Students are taught to, curriculum and school design are imposed, and students feel unsupported by the school, the teacher, or both.

One example of a company with a green rating on pedagogy is one of Katelyn’s portfolio companies, Avanti Learning Centers, based in Mumbai, India. Avanti is a test prep company that uses a blended approach, combining technology—to beam lectures into the centers and students’ homes—with peer-to-peer work in the classroom. Avanti’s pedagogy closely follows Eric Mazur’s method of Peer Instruction (Harvard School of Engineering and Applied Sciences, 2014). Mazur is a physics professor at Harvard who pioneered the Peer Instruction learning method, in which students read or watch videos outside of class to become familiar with the content, and the teacher uses class time for question-based learning as opposed to traditional lecture styles. His method has been proven worldwide for increasing outcomes, and it is the driver behind Avanti Centers’ approach.

Another powerful digital innovation with which we are connected is the online game Prodigy, which was developed by SMARTeacher, a new Ontario-based lean start-up that focuses on math. It is an engaging, free math game aligned to the math curriculum for grades 1–8, a subject that has not fared well at the elementary level. From January 2013 to November 2014, Prodigy grew from three thousand users to over one million. Its owners and developers, Rohan Mahimker and Alex Peters, deliberately collaborated with us as they developed the game, as they wanted to build in strong pedagogy, implementation, and links to learning outcomes. They recently applied the Digital Swamp Index to Prodigy to check on how they were doing. They found that the game fares well on a number of the dimensions. With respect to technology, they rated Prodigy as amber green on “quality of user experience” and “comprehensiveness and integration,” and green on “ease of adaptation.” For system change, the developers rated “implementation support” and “value for money” as green, and “whole system potential” as amber green. We questioned the latter rating as being too generous, asking whether the product was not only being embraced by large numbers of teachers, but was also being built into administrative priorities and support structures. (Note that the developers do not need to be responsible for system change, but they do need to address the matter with adopters.)
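Purely as an illustration, the technology and system change self-ratings just described could be recorded with the hypothetical Appraisal sketch shown after figure 2.1; the pedagogy ratings follow in the next paragraph. The variable names and the summary step are our own assumptions, not part of the developers' process.

from collections import Counter

# Hypothetical use of the Appraisal and Rating sketch from earlier in this chapter.
prodigy = Appraisal(tool_name="Prodigy")

# Technology ratings reported by the developers.
prodigy.add("Quality of user experience/model design", Rating.AMBER_GREEN)
prodigy.add("Comprehensiveness and integration", Rating.AMBER_GREEN)
prodigy.add("Ease of adaptation", Rating.GREEN)

# System change ratings reported by the developers.
prodigy.add("Implementation support", Rating.GREEN)
prodigy.add("Value for money", Rating.GREEN)
prodigy.add("Whole system change potential", Rating.AMBER_GREEN,
            rationale="Questioned as too generous; needs evidence of administrative priorities and support structures.")

# Quick summary: how many subcomponents sit at each rating level so far.
print(Counter(r.name for r in prodigy.ratings.values()))
# -> Counter({'AMBER_GREEN': 3, 'GREEN': 3})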


When it came to pedagogy, the developers indicated that this was the biggest area of improvement that they needed to address. They did have a strong and effective “assessment platform” (amber green) but found that they needed to sharpen “clarity and quality of intended outcome” (amber red) and “quality of pedagogy and relationship between teacher and student” (amber red). Relative to the latter, they found that they were perhaps confusing “engagement” (which was very high) with understanding how students learn and how to best support them.

Mahimker and Peters are clearly proactive developers. They know that the lessons learned from using the Index improve their product and its impact and, more importantly, the learning experiences that lead to impact. They have tested impact using the standards from the very high Ontario math assessment and its agency, Education Quality and Accountability Office (EQAO). Within one district with twenty-five thousand users, grade 6 math results improved over one year by 2.1 percent, while the district as a whole declined by 6 percent. In another comparison in the same district, eight “high usage” schools showed an increase of 12.7 percent at the grade 3 level, while the district as a whole had a 2 percent increase. These developments are occurring at the very early stages of development as new improvements are constantly being added, arising from what we would call “Index thinking a la the Swamp.”

In sum, the teaching and learning subcomponent is clearly one of the most important areas of the Index and the one in which innovations are either successes or failures. Often the pedagogy simply isn’t focused on or researched: too many evaluations focus on the technology, its platform, and its surface use. Going deep on pedagogy represents the richest vein of development in the Index.

Quality of Assessment Platform

The third pedagogy subcomponent is the quality of the assessment platform. Both summative and formative assessments are vital

