Do Our Students Measure Up? How to Define and Assess Student Technology Proficiency
Introduction

Educators are under enormous pressure today to equip students with the technology skills they need for the 21st century. There is also pressure to meet federal and state mandates to demonstrate students' technology proficiency. Before we can begin to address these challenges, however, we must first define what it means to be technology literate, and then look for the best ways to assess how our students measure up to the standards. In this eBook we will take a look at some of the research behind technology literacy assessment, including defining the skill set, identifying the elements of a good assessment, measuring progress, and gathering and interpreting data. We will look at an exciting online assessment tool developed expressly to meet these needs, and at how school districts are using technology assessment to systematically measure and improve student performance. We hope this eBook will help you as you make decisions about how to gather meaningful data about your students' technology literacy.

— Jo-Ann McDevitt, Publisher
Defining and Assessing Student Technology Proficiency: Implementation and measurement strategies

More than ever, school administrators are feeling the pressure to demonstrate with data how well their students are progressing toward proficiency in technology literacy. With increasing pressure from the federal government to meet the requirement for students' demonstrated technology literacy under the No Child Left Behind Act, district leaders are focusing on how best to provide that data and, just as important, how best to define what technology literacy means. Educators are also under mounting pressure to equip students with the necessary technology skills for the 21st century. A growing body of evidence suggests that the United States is falling behind on the worldwide playing field, particularly in the STEM fields of science, technology, engineering and math. Employers decry the quality of workers being sent to them without the necessary skills to do their jobs. And America is slipping in the number of graduates ready for the challenges of today's workplace.

Getting the Data

To address these parallel needs, meeting the NCLB technology literacy reporting requirement while supporting students to be prepared graduates, educators need solid data. They must understand how well students are grasping critical technology skills, and they must also determine how best to measure the effectiveness of their technology programs.
“Without benchmarks and standards by which to measure student technology literacy, it is very difficult to know where students need additional support, instruction, or experiences. Technology literacy is one key skill set that helps facilitate learning in all subject areas.”
— Mary Ann Wolf, Executive Director, State Educational Technology Directors Association (SETDA)

A key challenge educators face is how to assess students' technology skills. Using paper-and-pencil testing to demonstrate technology skills is, on its face, inadequate at best. Portfolio assessments, while valuable, can be challenging because of their time-consuming and subjective nature, and because they are difficult to scale across districts and states. So how do educators best use technology to authentically and accurately measure students' technology proficiency? And what is the definition of “technology proficiency”? The federal government has said it will leave it to states and schools to determine how to define and measure technology proficiency. States must determine their own proficiency levels, and they are defining that proficiency benchmark in vastly differing ways. “States typically have unique definitions of technology literacy, although the components of the definition are very similar across the nation,” says Mary Ann Wolf, Executive Director, State Educational Technology Directors Association (SETDA). She adds that states vary widely in how they assess or plan to assess technology literacy, including formal assessments at the state level, surveys or formal assessments at the district level, and other performance indicators. “Eight states formally assess technology literacy at the state level, and eight additional states are in the process of using a formal assessment to measure technology literacy. Many others rely on districts to determine how to assess technology literacy and report the statistics to the state,” Wolf says.
Setting the Standard

How do we come up with a standard that meets these multiple needs for educators: an authentic assessment that provides accurate data and that defines appropriate skills to assess? Moreover, how do we do so in a manner that doesn't bring educators to the breaking point with one more testing requirement? Underlying all of this is the need for any assessment to be valid and to provide the criterion-referenced data that allow districts to accurately compare student performance to a standard.
Learning.com, a leading provider of Web-based curriculum, responded directly to educators' expressed need for a practical, classroom-based tool to measure the technology proficiency of elementary and middle school students. Delivered online, this criterion-referenced assessment, called TechLiteracy Assessment, is designed to serve districts as an accountability tool: assessment score data help districts demonstrate that national technology standards (ISTE NETS-S) and prevailing state technology standards are being addressed. It also provides districts with the equally important data to inform instruction and support data-driven decision making. But before districts can assess for technology literacy, there must first be a clear and accurate definition of what it means for students to be “technology literate.”

Defining the Skill Set

Learning.com's team developed a skill set that articulates technology proficiency for elementary and middle school students – in other words, the skills these students must possess in order to succeed in the contemporary classroom and the larger world. With a carefully assembled panel of national experts in K-8 technology instruction and assessment, including an independent psychometrics group, Learning.com set separate proficiency standards for elementary and middle school students. The group based its conclusions on each member's extensive classroom and administrative expertise, research studies, results from Learning.com's national TechLiteracy Assessment pilots, and an extensive survey and comparison of state and national standards. The team worked with data from more than 8,000 students across 68 schools.
The outcome is a rigorous measure of student readiness that educators can use to track the effectiveness of their own instructional programs and to get the data they need for accountability purposes. This group determined where the bar for technology proficiency should lie for both the elementary and the middle school national student populations; that bar became the Proficiency Standard used in TechLiteracy Assessment.
TechLiteracy Assessment engages students with visually interesting questions that test knowledge and skills.

To meet the Proficiency Standard, an eighth grade student must have the following skills and knowledge:

• Word processor
• Spreadsheets (create and label charts, apply functions, use formulas)
• Web browser
• Email
• Online/database search by keyword and category
• Online/database search by Boolean operators (and, or, not)
• Presentation software – create and save
• Audiovisual presentations
• Meta-skills (selecting data format and tools appropriate to the required task and target recipient/audience)
• Database – sort entries (first-order sort, both numerically and alphabetically)
• Information storage and retrieval (create and store files, distinguish among drives and devices)
• Knows basic computer parts
• Recognizes computer operating problems and can troubleshoot
• Social and ethical concept awareness and knowledge (e.g., plagiarism, copyright)
• Technology symbol familiarity
• File types – recognize and distinguish
• Historical and contemporary impact of technology on society, social issues (privacy, online dangers)
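To make the Boolean-search skill in the list above concrete, here is a brief illustrative sketch in Python showing how "and," "or," and "not" narrow or broaden a keyword search. The sample documents and the search function are invented for illustration; they are not drawn from TechLiteracy Assessment itself.

```python
# Illustrative sketch of Boolean keyword search (AND, OR, NOT)
# over a small set of made-up document titles.

documents = [
    "planets of the solar system",
    "the history of the telescope",
    "solar energy and wind energy",
    "telescope maintenance guide",
]

def search(docs, all_of=(), any_of=(), none_of=()):
    """Return documents containing every AND term, at least one
    OR term (if any are given), and none of the NOT terms."""
    results = []
    for doc in docs:
        words = doc.split()
        if not all(term in words for term in all_of):
            continue  # AND: every required term must appear
        if any_of and not any(term in words for term in any_of):
            continue  # OR: at least one alternative must appear
        if any(term in words for term in none_of):
            continue  # NOT: excluded terms disqualify the document
        results.append(doc)
    return results

# "solar AND energy NOT wind" returns no matches here, because the
# only title containing both "solar" and "energy" also contains "wind".
print(search(documents, all_of=("solar", "energy"), none_of=("wind",)))
```

A student who understands these operators can predict exactly this behavior: adding a NOT term can eliminate results that an AND query alone would have returned.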
With TechLiteracy Assessment, educators have the data to show that their students possess these skills.

Elements of a Good Assessment

A good assessment – and each item on the assessment – must answer these questions:

• What does this measure?
• What information does the student's response tell me about his or her ability?
Measuring Progress

Students are not measured against one another on TechLiteracy Assessment. Instead, they are measured against the Proficiency Standard for computer technology competency required for success in the modern classroom. Although TechLiteracy Assessment is a summative, criterion-referenced assessment, it also serves as a formative assessment; that is, it is meant to measure student progress, but also to provide administrators and classroom instructors with meaningful data for targeted goal-setting and intervention. For this reason, TechLiteracy Assessment reports provide raw data on student performance in seven key areas, or “skills modules,” of computer technology:

• Systems and fundamentals
• Social and ethical issues (in regard to technology)
• Word processing
• Spreadsheets
• Multimedia and presentations
• Telecommunications and Internet
• Databases
These module scores provide educators with specific information about the skills with which each student requires the most help, so that the teacher can implement the most effective and appropriate response. They also provide district and school administrators with data to make sound resource allocation and instructional decisions across the district.

Implementing a Technology Literacy Assessment

When regular assessments are part of an ongoing process of teaching and evaluating, the assessments themselves should not be grueling, multi-hour or special events that raise student anxiety about testing or that disrupt regular school activities. TechLiteracy Assessment is designed with ease and flexibility of implementation in mind. Students can take the assessment during a typical classroom period or during regularly scheduled computer lab sessions. The average completion time for TechLiteracy Assessment is less than 30 minutes, with 95 percent of students finishing within 40 minutes. Additionally, there is no time limit built into the assessment, so students who need more time may easily be accommodated.
TechLiteracy Assessment provides detailed information at the student level to help inform instruction.

Students view TechLiteracy Assessment testing as a positive experience – and one from which they learn. Post-assessment student interviews confirm that students enjoy taking TechLiteracy Assessment and come away from their assessment with an expanded curiosity about computer technology.
Educators can get immediate insight into how well classes are grasping critical technology skills with the class reports.

Proving It Works

The work of creating and delivering assessments is itself a process in which the goals and content are constantly re-examined to ensure that all items and assessments continue to accurately measure student progress toward relevant goals. Psychometric (scientific and statistical) evaluation and an active program of item-writing and testing are essential to any assessment program that seeks to provide more than hunches about student performance. TechLiteracy Assessment tests are examined anew after each testing window, and independent, highly experienced psychometricians measure each item to ensure that students' answers and abilities are measured accurately against the stated benchmarks. Learning.com has prepared case studies to provide insight into how TechLiteracy Assessment works in the real world. The state of Arizona is using TechLiteracy Assessment as its statewide technology literacy assessment tool. And in Gaston County, North Carolina, school district educators are finding a direct correlation between students' passing the state technology assessment and the use of TechLiteracy Assessment to inform instruction. See the full case studies that follow in this eBook.
TechLiteracy Assessment's performance-based questions allow students to demonstrate their skills with relevant technology tools.

Summary

Educators face enormous pressure to ensure students are well prepared for this technology age and have the skills to be successful in the 21st century. There is also pressure to meet federal and state mandates to demonstrate students' technology literacy. Doing so entails first understanding how best to define technology proficiency, and then creating a mechanism to get the data necessary to inform instruction and meet accountability requirements. Learning.com's TechLiteracy Assessment provides educators with an efficient, accurate and authentic assessment to meet these goals. Relying on an expert team, Learning.com created a criterion-referenced assessment through a rigorous process that provides data on critical technology skills and knowledge. Just as important, TechLiteracy Assessment was created in a manner that is sensitive to the student test-taker, providing an instructional experience that is engaging and relevant. Educators who choose TechLiteracy Assessment find it invaluable in providing the data they need to help their students be ready for the challenges of a technologically rich future.
Case Study: Arizona Pioneers Statewide Measurement of Students' Technology Literacy Skills

Breaking new ground, Arizona is the first state in the U.S. to formally measure its students' proficiency with technology using TechLiteracy Assessment by Learning.com. A spring 2006 pilot program administered Learning.com's online authentic assessment to more than 24,000 fifth and eighth graders statewide. Similar numbers of Arizona students used TechLiteracy Assessment in the 2006-07 school year. “The data that came from our (TechLiteracy Assessment) pilot has just been phenomenal,” says Cathy Poplin, Deputy Associate Superintendent, Educational Technology, Arizona Department of Education, School Effectiveness Division, who theorizes that students' technology skills correlate with overall academic achievement. “It was fairly easy for us to make the decision to continue it.”

Data to Support Funding Effectiveness

Rather than simply react, Poplin and the department were anticipating what type of data the federal Department of Education (DoE) would next want collected for No Child Left Behind purposes. They were also aware of a clause (Title IID) mandating that all students be technology literate by grade eight by 2006. Poplin saw the TechLiteracy Assessment pilot as a “golden opportunity” to begin to proactively collect data. “With the data, we'd be able to show the federal DoE—very concretely—how Arizona was making use of federal Title IID funds to improve student achievement.” At the same time, the same data would help inform districts and the state on how best to deploy funds and resources for technology literacy.

Performance-based Assessment Engages Students

Arizona schools administered TechLiteracy Assessment over a span of about seven weeks. Students used their school's computer labs to take the online test during a single class period. Teachers who proctored the assessment noted that students seemed “very engaged” by the online test.
One responded that she would have loved to have had a video camera with her to document the phenomenon. Another mentioned how amazingly quiet the computer lab was during the assessment. When taking the online test, students interact with assessment content in ways that allow them to demonstrate their proficiencies. Often, they must perform actions via simulations, rather than pick answers from among multiple choices. Thus, students must be able to format a paragraph, apply a spreadsheet formula, or conduct a database search. And they must demonstrate durable skills via generic menus and commands, not through brand-specific memorized shortcuts. Students' internal motivations must be factored into their engagement as well. Students place great “personal value” in technology, observes Mary Knight, who proctored the assessment in her district, Flagstaff Unified. “Technology is such a big
part of their everyday lives,” she explains, that “our students were excited at the opportunity to be assessed in this area.” In other words, students wanted to be tested on their proficiency with technology applications. They also wanted to know their scores immediately. Even though it was a pilot, Learning.com provided individual student reports for teachers to review with students before the end of the school year. All 50 participating districts also received aggregated reports to share with administrators, teachers, technology coordinators, and the community.

Results of TechLiteracy Assessment Lead to Changes

Results from the TechLiteracy Assessment pilot were eye-opening. Some districts that expected their students to perform well, for example, were surprised to learn that they did not score all that well. Other districts, such as one that draws a majority of its students from Navajo reservations where most homes do not even have electricity, scored higher than the district expected. Statewide, 63 percent of the nearly 12,000 Arizona eighth graders tested by TechLiteracy Assessment did not meet proficiency in technology skills. Of the fifth graders tested, 73 percent were not proficient. “It told us we have work to do, certainly,” says Poplin. “More importantly, it showed us some weak spots, which will help us better target our efforts. That's the key.” And districts are doing just that. Washington Elementary School District will be adjusting its junior high curriculum based on data from the TechLiteracy Assessment pilot. Mike Cannon, technology training coordinator for the district, explains that due to very tight schedules, a lot of technology instruction occurs in grade 7. After that, the focus tends to fade. This pacing may be problematic, according to the pilot's results from the district's 450 tested students. “We learned that our kids aren't retaining as much of their technology skills as we would like,” says Cannon.
Thus, he says, discussions are now underway with the junior high schools’ computer teachers on how to reconfigure the technology curriculum in grade 8 to better maintain mastery levels. Holbrook Unified School District #3, on the other hand, performed better than expected in the TechLiteracy Assessment pilot. While remote and rural—many of its students come from Navajo reservations—the district is also “progressive and proactive” with technology, says Ann Gardner, its technology coordinator. All Holbrook schools have computer labs, most classrooms have extra computers, and digital whiteboards are prevalent too. Additionally, Holbrook has a technology curriculum in place. But the pivotal clue to its students’ relatively high scores on TechLiteracy Assessment, the ADE’s Poplin feels, lies in the heavy emphasis Holbrook Unified places on professional development for its teachers. “These scores are testimonials for a district in which a whole lot of things are being done right,” says Poplin. “All of our professional development for teachers focuses on technology integration,” agrees Gardner. “Actually, it focuses on good lesson design because technology is nothing without
effective practices to support technology as a tool.”

Assessment Continued in 2006-2007

Encouraged by the results of the Arizona TechLiteracy Assessment pilot, most districts are participating again in 2006-07, during which both a pre-test and a post-test assessment are being administered to fifth and eighth grade students statewide. If the whispers prove true, and technology skills are the next curricular area to be formally tracked by federal accountability measures, Arizona will be more ready than most to meet the new challenge.
Case Study: North Carolina Students Pass State Technology Test in Greater Numbers with Technology Assessment Solution

North Carolina educators are no strangers to supporting students to pass their state-mandated computer skills test. And Gaston County Schools educators have found the tools they need to help boost the number of students who successfully pass that test. The state's new Online Test of Computer Skills, first administered in fall 2005, requires that districts test students for computer competency. Eighth grade students must pass the test as part of their graduation requirements. Gaston wanted to understand how ready its students would be for that state test, and so implemented TechLiteracy Assessment by Learning.com. The district first administered the assessment in November 2005 to fifth and seventh graders. The results were used to guide instruction for the spring 2006 semester and to prepare students for the state test.
Gaston staff was looking for an online assessment because the state test is delivered online, “and we wanted to match format as well as content,” says Debbie Core, Chief Technology Officer for the district.

Putting What They Learned to Use

Because TechLiteracy Assessment is Web-delivered, Learning.com returns results in days. Matching well to North Carolina's technology curriculum and high-stakes test means TechLiteracy Assessment results also convert easily into information that is critical for Gaston County teachers and administrators. “The reports are very good,” says Roxie Miller, Assistant Chief Technology Officer. Bar graphs make it easy, she says, to see where each school in the district ranks according to proficiency for each skills module. “That's key, of course. We can also tell how we compare with other schools and other districts that have used TechLiteracy Assessment. These will all be very helpful for principals and for all of us at the district level.” TechLiteracy Assessment's reports are organized by district, school, class and student. Scores note whether students meet or fall below proficiency, defined as “what an average student should be expected to do, with technology, for his or her grade level,” says Miller. The assessment's content is arranged in seven modules: database, spreadsheet, word processing, multimedia and presentations, telecommunication and Internet, systems and fundamentals, and social and ethical issues. “TechLiteracy Assessment's individual student reports are fabulous – really valuable,” says Miller. “They report on where a child is weak and strong in each skill area, so you can really do individualized remediation or intervention. They could also help a teacher with planning, overall, how to better integrate technology use into class work.” Gaston leaders found the data invaluable in helping students become more proficient technology users and be ready for the state test. The technology team can already see a correlation between the students who passed TechLiteracy Assessment in seventh grade and those who then went on to pass the state test as eighth graders. Data from TechLiteracy Assessment comparing students' technology proficiency on a start-of-the-year test to an end-of-year test shows an 18 percent increase in the number of seventh graders at or above proficiency on the assessment.
Middle schools that do best on TechLiteracy Assessment are also the top-scoring middle schools on the state test. In one cluster of schools in the district, close to 95 percent of seventh graders who passed TechLiteracy Assessment also passed the eighth-grade state test.
“The more we use TechLiteracy Assessment, the better results we are getting in eighth grade. Schools with the highest TechLiteracy Assessment scores in seventh grade have the highest scores on the state test as well.”
— Assistant Chief Operating Officer, Gaston County Schools, North Carolina
The data from TechLiteracy Assessment gives the district the compass it needs to support students to improve in specific areas in which they are weak. For example, Miller noted a curious symmetry in Gaston County's results for its elementary school students. The students scored lowest in social and ethical issues, but high in multimedia and presentation skills. In its middle schools, however, the opposite was true. That led her team to alert middle school principals and teachers that their students needed more focused time learning presentation skills. Teachers are making an effort to use presentation technology in their core instruction.
Resources

Elements of a Good Assessment
This three-page document expands on some of the material in this eBook, providing more depth into how TechLiteracy Assessment meets the criteria for a valid and authentic technology literacy assessment.

See Sample Tests
Take a quick eight-question sample of TechLiteracy Assessment to experience its student-friendly format and to try out the knowledge-based and performance-based questions that authentically measure student proficiency.

Setting Proficiency Standards for TechLiteracy Assessment
Learn more about how Learning.com developed and set the standards for its Technology Proficiency Standard in this brief white paper.

How to Evaluate a Technology Literacy Assessment
This worksheet will help you compare technology literacy assessments on a feature-by-feature basis, including content, implementation, reporting capabilities and cost.

Technology Literacy Solutions from Learning.com
Learn the full features and benefits of TechLiteracy Assessment and of EasyTech, the K-8 technology literacy curriculum that integrates technology into core curriculum.