
Research, Policy and Practice Conference
School District of Philadelphia
440 North Broad Street
Philadelphia, PA 19130
April 8, 2014




WIFI LOGIN

Wireless network: R2P 2014 Wireless ID: R2P Password: 2014


Table of Contents

Tab 1 Agenda

Tab 2 Floor Plan

Tab 3 School District of Philadelphia Action Plan 2.0 Financial Supplement

Tab 4 Connecting Research to Policy and Practice Panel
Location: 2nd Floor Auditorium
Chair and Discussant: Andrew Porter, University of Pennsylvania
Panel:
Christopher Lubienski – The Role of Intermediary Organizations in Education Policymaking: Preliminary Findings
Margaret Goertz – SEA Acquisition and Use of Research in School Improvement
Carrie Conaway – Research in Policy Making: The View from the Other Side

Tab 5 Instructions for Breakout Group Discussion


Tab 6 Concurrent Session A: Improving Student Learning – College & Career Readiness
Location: 2nd Floor Auditorium
Chair: Shelly Engelman, Office of Research and Evaluation, SDP
Panel:
1. Evaluation of the Philadelphia GEAR-UP Partnership Initiative – Results and Implications, Julia Alemany and Manuel Gutierrez, Metis Associates
2. Non-Cognitive Predictors of College Enrollment, Angela Duckworth and David Meketon, University of Pennsylvania
3. Post-Secondary Workforce Attainment and School Outcomes, Paul Harrington, Drexel University
Discussant: Dave Kipphut, Office of Career and Technical Education, SDP

Tab 7 Concurrent Session B: Becoming a Parent- and Family-Centered Organization
Location: Room 1080
Chair: David Kowalski, Office of Research and Evaluation, SDP
Panel:
1. Recent Developments in Customer Service at the School District, Karen James, Office of Parent and Family Services, SDP
2. The New District-wide Parent and Guardian Survey Program, David Kowalski, Office of Research and Evaluation, SDP
3. Nudging Attendance with Social Comparison, Todd Rogers, Kennedy School, Harvard University
Discussant: Allison Still, Office of Multilingual Curriculum and Programs, SDP



Tab 8 Concurrent Session C: Developing a System of Excellent Schools
Location: Room 1075
Chair: Kati Stratos, Office of Research and Evaluation, SDP
Panel:
1. Evaluation of the Renaissance Schools Initiative, Kati Stratos and Adrienne Reitano, Office of Research and Evaluation, SDP
2. Developing a New School in Philadelphia, Laura Shubila, Building 21
3. Evidence for Developing Systems of Excellent Schools: Turnarounds, Charters, and School Closures, Matt Steinberg, University of Pennsylvania
Discussant: Grace Cannon, Office of New School Models, SDP

Tab 9 Concurrent Session D: Becoming an Innovative and Accountable Organization, and Achieving and Sustaining Financial Balance
Location: Room 1072
Chair: Jon Supovitz, Co-Director, Consortium for Policy Research in Education, University of Pennsylvania
Panel:
1. Understanding School-Based Partnerships in the School District of Philadelphia, Sarah Costello, Swarthmore
2. Developing a New School District Accountability System, Jura Chung, Office of Strategic Analytics, SDP
3. Fair Financing for Schools in Philadelphia, Rand Quinn, Graduate School of Education, University of Pennsylvania
Discussant: Jon Supovitz, Co-Director, Consortium for Policy Research in Education, University of Pennsylvania


Tab 10 Concurrent Session E: Improving Student Learning – Dropout Prevention
Location: 2nd Floor Auditorium
Speaker: Russell Rumberger, California Dropout Research Project, University of California, Santa Barbara

Tab 11 Concurrent Session F: Identifying and Developing Exceptional, Committed People
Location: Room 1075
Chair: James Jack, Office of Research and Evaluation, SDP
Panel:
1. Implementing the New Educator Effectiveness System in Philadelphia, Brett Shiel, Office of Teacher Effectiveness, SDP
2. Evaluating the New Educator Effectiveness System in Philadelphia, Kati Stratos and James Jack, Office of Research and Evaluation, SDP
3. Improving Teachers' Professional Development: Insights from Recent Randomized Control Trials, Laura Desimone, Graduate School of Education, University of Pennsylvania
4. The Relationship between Teacher Preparation Experiences and Early Teacher Effectiveness and Implications for Human Capital Management Strategies, Peter Witham, Education Analytics
Discussant: Matt Steinberg, Graduate School of Education, University of Pennsylvania


Tab 12 Concurrent Session G: Improving Student Learning – Literacy
Location: Room 1080
Chair: Adrienne Reitano, Office of Research and Evaluation, SDP
Panel:
1. K-3 Literacy in the School District, Doria Mitchell, Office of Early Childhood, SDP
2. Children's Literacy Initiative in Philadelphia, Jill Vallunas
3. Translanguaging as a Resource for Literacy Development, Nelson Flores, Graduate School of Education, University of Pennsylvania
Discussants: Barbara Wasik, Temple University
Cheryl Micheau, Office of Multilingual Curriculum and Programs, SDP

Tab 13 Readings
Coburn, C.E., Penuel, W.R., & Geil, K.E. (2012). Research-Practice Partnerships at the District Level: A New Strategy for Leveraging Research for Educational Improvement. A White Paper prepared for the William T. Grant Foundation.
Conaway, C.L. (2013). The problem with briefs, in brief. Education Finance and Policy, 8(3), 287-299.
Cunningham, D.H., & Wyckoff, J. (2013). Policy makers and researchers schooling each other: Lessons in educational policy from New York. Education Finance and Policy, 8(3), 275-286.
Fixsen, D., Blase, K., Horner, R., Sims, B., & Sugai, G. (2013). Scaling-Up Brief. University of North Carolina: State Implementation & Scaling-Up of Evidence-Based Practices.
Goertz, M.E., Barnes, C., & Massell, D. (2013). How state education agencies acquire and use research in school improvement strategies (Consortium for Policy Research in Education Policy Brief RB 55). Philadelphia, PA: University of Pennsylvania.


Malin, J., & Lubienski, C. (2013). Whose Opinion Counts in Educational Policymaking? Current Issues in Education, 16(2).
Penuel, W.R., Fishman, B.J., Cheng, B.H., & Sabelli, N. (2011). Organizing Research and Development at the Intersection of Learning, Implementation, and Design. Educational Researcher.

Tab 14 Posters Related to Action Plan Strategy 1 – Improve Student Learning
1.01 Barghaus & Fantuzzo – Validation of the Preschool Child Observation Record
1.02 Brawner – Project GOLD: "We are Kings and Queens" HIV/STI Prevention for Black Adolescents
1.03 Carlton, Sorhagen, Stull, & Weinraub – Whose Head Hit the Pillow First: A Review of Sports and Sleep in Preadolescence
1.04 Chein, Rosenbaum, & Botdorf – Working Memory Training Improves Performance on an Emotional Cognitive Control Task in Adolescents
1.05 Dahl – SNAP-Ed EAT.RIGHT.NOW. Nutrition Education Program: Long-Term Impacts of 4th Grade Vegetable Core Intervention
1.06 Dahl – EAT.RIGHT.NOW. Nutrition Education Program: 2012-2013 School Year Reach and Scope
1.07 Daly & Timony – Teaching Children to Succeed: A University-Elementary Schools Partnership to Promote Social, Emotional, and Behavioral Health in Students
1.08 Darken – Experiences and Perceptions of Bilingual Students in Two Settings
1.09 Dickard, Magee, Agosto, & Forte – Studying Teen Social Media Use: What Do You Think is the #1 Topic Teens Ask About on Social Media?
1.10 Ellis – Out-of-School Time and Student Outcomes
1.11 Friedman – PA SNAP-Ed Funded Nutrition Education Assemblies, School District of Philadelphia
1.12 Glunt – Blueprints LifeSkills Training Outcomes Evaluation
1.13 Gunderson – Praise for Effort Increases Motivation and Learning
1.14 Gunderson & Ramirez – Math Anxiety in Early Elementary School


1.15 Harbison, Sack, Campbell, & Ciner – Vision in Preschoolers: Hyperopia in Preschoolers (VIP-HIP)
1.16 Hirsh-Pasek, Pace, & Yust – Developing a Computerized Language Assessment for Preschoolers
1.17 Hirsh-Pasek, Newcombe, Golinkoff, & Verdine – Do Early Spatial Skills Lead to Later Spatial and Math Gains?
1.18 Hirsh-Pasek, Spiewak-Toub, & Hassinger-Das – Playing to Learn: Vocabulary Development in Early Childhood
1.19 Holmes – Proportional Reasoning: An Intervention to Support Quantitative Development
1.20 Jirout – Learning from Play: The Impact of Spatial Game Play on Spatial Ability
1.21 Jubilee & Engelman – City Year Program Evaluation
1.22 Kupersmidt & Scull – An Evaluation of Media Aware, A Media Literacy Education Substance Abuse Prevention Program for High School Students
1.23 Lawman – Trends in Relative Weight Over One Year in U.S. Urban Youth
1.24 Lent – Purchasing Patterns of Adults, Adolescents, and Children in Urban Corner Stores: Quantity, Spending, and Nutritional Characteristics
1.25 McClung – Educating Children and Youth Experiencing Homelessness: Transportation as a Barrier to Attendance
1.26 Merlino – Summary of a Matched-Sample Study Comparing the Interactive Math Program and Traditionally-Taught High School Students in Philadelphia
1.27 Mojica – An Examination of English Language Proficiency and Achievement Test Outcomes
1.28 Lawrence, Krier, & Posner – Engaging H.S. Math Students and Teachers through a Proficiency-Based Assessment and Reassessment of Learning Outcomes (PARLO) System
1.29 Nocito – SNAP-Ed Garden-Based Nutrition Education: A New Strategy in the Fight against Obesity
1.30 Norton & Reumann-Moore – 2012-13 Evaluation of City Year Greater Philadelphia: A Five-Year Study by Research for Action


1.31 Peters – Cardiovascular Health among Philadelphia Adolescents: Analysis of Youth Risk Behavior Data, 2011
1.32 Polonsky – Breakfast Patterns among Low-Income, Ethnically Diverse Elementary School Children
1.33 Reed & Hirsh-Pasek – Arts and Early Education: A Promising Duet to Support School Readiness Skills
1.34 Reisinger & Mandell – Philadelphia Autism Instructional Methods Study: Philly AIMS
1.35 Schaeffer & Leach – Arts Link: Building Mathematics and Science Competencies through an Arts Integration Model
1.36 Shakow – Eating the Alphabet
1.37 Shakow – F.U.N. (Families Understanding Nutrition) and Fit through P.L.A.Y. (Playful Learning for Adults and Youth)
1.38 Simon & Robbins – "Be the HYPE": Engaging Young People as School Wellness Leaders
1.39 Stull, Weinraub, & Harmon – Early Childcare Careers: What are the Facts?
1.40 Valshtein, Foley, Stull, & Weinraub – What a Nightmare! Not All Sleep Measures are Created Equal
1.41 Vetter – Cardiopulmonary Resuscitation/Automated External Defibrillator Olympic Competition Enhances Resuscitation in High School Students
1.42 Wasik, Hindman, & Snell – Exceptional Coaching for Early Language and Literacy (ExCELL-e)

Tab 15 Posters Related to Action Plan Strategy 2 – Develop a System of Excellent Schools
2.01 Dowdall – Shuttered Schools: Bringing New Life to Old Buildings
2.02 Hill – Situating Success: A Mixed-Methods Study of School Turnarounds in Philadelphia
2.03 Jack – The Effects of Philadelphia's School Choice Policies on School Utilization Rates
2.04 Jones – Measuring Bullying Prevention Implementation Readiness and Progress: A Pilot Study
2.05 Max & Glazerman – Do Disadvantaged Students Get Less Effective Teaching?
2.06 Norton – Philadelphia's Accelerated High Schools: An Analysis of Strategies and Outcomes


2.07 Peters – Youth Risk Behavior Survey, Philadelphia
2.08 Schwartz & Eiraldi – School-Wide Positive Behavior Interventions and Supports (SWPBIS) in SDP Schools: An Exploratory Study
2.09 Stratos & Reitano – Student Achievement in Renaissance Charter Schools
2.10 Supovitz – The Common Core in New York City Schools

Tab 16 Posters Related to Action Plan Strategy 3 – Identify and Develop Exceptional, Committed People
3.01 Ebby – Teacher Analysis of Student Knowledge (TASK)
3.02 Herzog – Teacher Networks in Philadelphia: The Current Landscape
3.03 Jack & Stratos – Teacher Perception of Evaluation Focus and Reported Outcomes
3.04 Merlino – National Center for Cognition and Science Instruction – Findings from Philadelphia

Tab 17 Posters Related to Action Plan Strategy 4 – Become a Parent- and Family-Centered Organization
4.01 Kowalski – District-Wide Parent and Student Surveys
4.02 Rogers & Subramanyam – Nudging Attendance with Social Comparison: A Randomized Controlled Experiment
4.03 Spier – Investing in Family Engagement

Tab 18 Posters Related to Action Plan Strategy 5 – Become an Innovative and Accountable Organization
5.01 Gutierrez & Alemany – Evaluation of the Philadelphia GEAR UP Partnership
5.02 Hartmann, Gao, & Hallar – Aggregated Analysis of Six 21st Century Community Learning Center Evaluations, 2011-12
5.03 Kerstetter – The Consortium for Policy Research in Education (CPRE)


Tab 19 Poster Related to Action Plan Strategy 6 – Achieve and Sustain Financial Balance
6.01 Steinberg & Quinn – Assessing Adequacy and Equity in Education Spending

Tab 20 Contact Information



Tab 1 Agenda




Agenda

11:00 AM – 12:30 PM   Check-in

11:30 AM – 12:30 PM   Lunch/Networking

12:30 PM – 12:35 PM   Welcome – 2nd Floor Auditorium

12:35 PM – 1:00 PM   The Landscape of Research and Evaluation in the District: Who, What, When, and How? – 2nd Floor Auditorium

1:00 PM – 2:00 PM   Invited Panel: Connecting Research to Policy and Practice – 2nd Floor Auditorium

2:00 PM – 3:00 PM   Breakout Groups: Connecting Research to Policy and Practice – 2nd Floor Auditorium

3:00 PM – 3:30 PM   Intermission

4:00 PM – 5:30 PM   Putting the Action Plan into Action: Concurrent Sessions I
Session A: Improving Student Learning – College & Career Readiness – 2nd Floor Auditorium
Session B: Becoming a Parent- and Family-Centered Organization – Room 1080
Session C: Developing a System of Excellent Schools – Room 1075
Session D: Becoming an Innovative and Accountable Organization, and Achieving and Sustaining Financial Balance – Room 1072

5:30 PM – 7:00 PM   Putting the Action Plan into Action: Concurrent Sessions II
Session E: Improving Student Learning – Dropout Prevention – 2nd Floor Auditorium
Session F: Identifying and Developing Exceptional, Committed People – Room 1075
Session G: Improving Student Learning – Literacy – Room 1080

From 3:00 PM to 7:00 PM there will be interactive poster displays, food, and refreshments




Tab 2 Floor Plan





Tab 3 School District of Philadelphia Action Plan 2.0 Financial Supplement




Action Plan v2.0
School District of Philadelphia
February 17, 2014

"We are making our schools great. Join us."

OVERVIEW

This plan is a description of the School District of Philadelphia's current and planned priority work. Its primary objective is to align the work of all employees to the Goals, Strategies, and Actions described here. It is also intended to communicate a comprehensive overview of the District's plan to parents, families, students, partners and stakeholders. Building off v1.0, and developed after an additional year of work and reflection, review, and research, it is a "living document" subject to change as new facts are gathered and new evidence comes to light.

Feedback on this Action Plan and new ideas should be provided to: actionplan@philasd.org

This Action Plan can be accessed online at: www.philasd.org/actionplan

 



Table of Contents
Executive Summary: Goals, Strategies and Actions in Brief ..... 1
Preface ..... 4
Introduction: We Are Making Our Schools Great. Join Us. ..... 5
Our Vision ..... 5
The Values That Drive Our Work ..... 5
What We Have Accomplished: v1.0 to v2.0 ..... 6
The Case for Investment ..... 7
Part I: Anchor Goals ..... 8
How We Will Track Our Progress and How We Will Hold Ourselves Accountable ..... 10
Part II: Strategies and Actions ..... 11
Strategy 1: Improve Student Learning ..... 11
Strategy 2: Develop a System of Excellent Schools ..... 15
Strategy 3: Identify and Develop Exceptional, Committed People ..... 19
Strategy 4: Become a Parent- and Family-Centered Organization ..... 21
Strategy 5: Become an Innovative and Accountable Organization ..... 23
Strategy 6: Achieve and Sustain Financial Balance ..... 25
PART III: Where We Go From Here ..... 27
Exhibit 1 – Inputs to Action Plan v2.0 ..... 28
Exhibit 2 – How Did We Do? A Scorecard Against Action Plan v1.0 ..... 30
Exhibit 3 – SDP High Performing Schools Practices Based on the Research ..... 32
Exhibit 4 – SDP Highly Effective Instructional Practices ..... 33
Exhibit 5 – SDP Functional Organization Chart Aligned with Action Plan v2.0 ..... 34
Endnotes ..... 35

 



Executive Summary:  Goals,  Strategies  and  Actions  in  Brief  

Specific Actions

STRATEGY 1: IMPROVE STUDENT LEARNING
A. Fully adopt and integrate the PA Core standards in all of our teaching and learning activities
B. Define college and career readiness based on student mastery of content, and align graduation standards
C. Identify and implement a rigorous, flexible PreK-12 curriculum
D. Implement a literacy-rich early childhood continuum of services, including recuperative practices
E. Develop and implement a coherent assessment system
F. Promote effective instructional practices in every classroom
G. Accelerate progress towards personalized learning
H. Provide high quality Special Education services in the least restrictive learning environment
I. Support rigorous and linguistically appropriate learning experiences for English Language Learners (ELLs)
J. Integrate a focus on "academic tenacity" throughout the curriculum
K. Improve student nutrition and meal experience



STRATEGY 2: DEVELOP A SYSTEM OF EXCELLENT SCHOOLS
A. Make all District schools great by implementing high performing school practices
B. Provide students with an environment conducive to learning by implementing and maintaining safety and climate plans that incorporate evidence-based programs
C. Ensure all schools are porous – connected to community resources and partnerships to meet student needs
D. Empower school leaders and their leadership teams with the authority to make important decisions
E. Make poor performing schools better through the Renaissance turnaround program, including evidence-based revisions to the Promise Academy model
F. Promote compelling, successful programs including Career and Technical Education and project-based learning
G. Review and improve the provision of schooling across all our alternative settings
H. Strengthen neighborhood schools
I. Create and launch new, evidence-based school models, and scale the ones that work
J. Be a great charter school authorizer to ensure all charters are good school options, and promote the sharing of successful practices across all schools
K. Develop and implement a school progress measure
L. Provide a clean and comfortable building environment in all schools
M. Continuously update and refine the system-of-schools plan, including school expansions and closure assessments of chronically under-enrolled and under-performing schools

STRATEGY 3: IDENTIFY AND DEVELOP EXCEPTIONAL, COMMITTED PEOPLE
A. Improve recruitment and hiring practices to attract the highest quality candidates
B. Strengthen the principal and teacher pipelines
C. Celebrate, retain and promote high performing staff, particularly great teachers and principals
D. Support the continuous development of all personnel – tailored to individuals – including an emphasis on school-based coaching for principals and teachers
E. Create meaningful opportunities for teacher collaboration and for principal collaboration
F. Collaborate with city and other partners to make Philadelphia a premier place for principals and teachers to work
G. Set clear expectations for teachers, principals and support staff and implement regular performance evaluations
H. Engage teachers, principals, professional networks, labor unions and other partners to identify, explore, develop, and scale great ideas related to talent

STRATEGY 4: BECOME A PARENT- AND FAMILY-CENTERED ORGANIZATION
A. Actively reach out to parents to involve them in their children's schools, including the launch of a School Advisory Council in every school
B. Establish clear processes for parent and family input and ideas
C. Provide parents with information about their students' progress and how to support that progress
D. Provide parents and families with excellent customer service
E. Provide parents with ample information on schools, and increase the equity and transparency of the school selection, transfer, and placement processes



STRATEGY 5: BECOME AN INNOVATIVE AND ACCOUNTABLE ORGANIZATION
A. Cultivate and sustain partnerships at the system and school levels
B. Transform the organization by instituting strategic management processes at all levels and building a culture of excellence
C. Improve data accuracy, application, and accessibility
D. Implement effective, aligned business processes
E. Improve communication throughout the organization and to the public
F. Actively promote innovation and cross-functional design thinking
G. Implement core student- and teacher-facing systems for schools, including a Learning Management System and a Student Information System
H. Improve the quality and lower the cost of transportation services

STRATEGY 6: ACHIEVE AND SUSTAIN FINANCIAL BALANCE
A. Seek additional revenues
B. Continuously identify savings opportunities and capture identified cost savings
C. Meet the immediate financial challenges of FY14 and FY15
D. Continuously analyze the impact of spending and deploy resources to achieve priorities, including the activities, schools and programs that need them the most
E. Develop a comprehensive, outcomes-focused budgeting strategy, including five-year planning
F. Institute financial controls
G. Align the capital and grants programs in support of the anchor goals




Preface

In working to make all schools great, this much is clear: All means all.

We cannot maintain a school system where too few students are adequately prepared for higher education and the workforce, where too many lack opportunities for academic or professional growth. All of our schools and students need bold expectations, ambitious goals and unapologetic solutions.

The citizens of Philadelphia deserve great schools. Parents deserve great schools for their children. Students deserve high-quality education that prepares them for life. Residents deserve an outstanding next generation of civic, business and social leaders – people committed to the collective effort of building and sustaining a system of exceptional schools for all children.

In introducing Action Plan v2.0, the four goals that anchor our work are both aspirational in scope and urgent in nature. All students must graduate, ready for college and career. All 8-year-olds must be able to read on grade level in preparing for future academic success. All schools must have great principals and teachers at all grade levels. And we must spend all funds wisely. That is the challenge before us, and we need all Philadelphians to join us in achieving these goals.

Action Plan v2.0 is an evolution of our blueprint for making schools great. It explains what we mean by great, our goals and strategies for achieving our targets, and the actions inside our strategies that comprise our core, priority work. It also identifies how we will know our schools are becoming great.

My team and I understand what it will take to make our schools great. In many cases, this will not be a return to past practices or staffing patterns that did not produce better results. Making schools great requires investments in evidence-based strategies that have worked here and in similar urban settings. These investments will require commitments from our legislators, local leaders, businesses and taxpayers. Our schools are operating this year under circumstances that none of us would wish for; a year-to-year funding mindset cannot continue to be the norm. A stable statewide funding structure that meets students' needs is paramount in order for The School District of Philadelphia to meet its obligation to all students and families.

As a school district – as a city – we should aspire to have all children exposed to rigorous academics, surrounded by caring adults with high expectations for them. Our goals are solidly intertwined; we cannot graduate 100 percent of students who are both college- and career-ready if we do not have 100 percent of 8-year-olds reading on grade level. We cannot invest in making all schools great without 100 percent of the funding needed to educate all children. We cannot have 100 percent of our students meeting our high expectations without 100 percent of our schools having great principals and teachers. And we cannot enhance our workforce and regional economy without 100 percent of students becoming productive citizens.

This work will ultimately determine the future of our great city and the opportunities for our youngest citizens to access a rich, rigorous, high-quality education. Every child can learn. Every school can be great. All of us can help. That is our foundation moving forward. We ask you and all Philadelphians to join us.

Superintendent William Hite
February 2014




Introduction: We Are Making Our Schools Great. Join Us.

Every morning throughout the city, students wake up ready to learn. Every day, teachers, principals, and support staff arrive at our schools ready to teach, to lead, and to support student learning. The countless meaningful interactions between students and adults that unfold each day in our schools are the heart of what we do. These are the building blocks of opportunity. This is the way that we, collectively, deliver on the civil right of every child to a quality public education. This is how we develop the next generation of civic and social leaders. This is what it takes to grow the city. And we can do much, much better. We all have a stake in this work – and we all can contribute to its success.

We are making our schools great. Period. And we say to everyone in Philadelphia and beyond: join us.

This Action Plan v2.0 is an evolution of our blueprint for making our schools great. In the pages that follow, we provide the details of what we need to do, our collective work and obligations. We are at a fortunate moment in the history of education in this country; we know what to do, based on the mountains of practice and research and evidence that have been accumulating over decades. Much of our collective challenge is about doing it well. Therefore, many actions here are about excellence of implementation, and about building the strong, elaborate support systems to enable the crucible of our classrooms – the fine points of interaction between our teachers, other educators, and our students – to become places of consistent joy and success.

We are making our schools great. Join us.

Our Vision  

The  goals,  strategies  and  actions  detailed  in  this  plan  all  promote  our  profound  vision:     The  School  District  of  Philadelphia  will  deliver  on  the  right  of  every  child  in  Philadelphia   to  an  excellent  public  school  education  and  ensure  all  children  graduate  from  high   school  ready  to  succeed.     The  key  word  in  this  vision  is  “right.”    The  District  exists  to  deliver  on  the  civil  right  of  every  child  to  a   strong,  lifelong  foundation.  

The Values  That  Drive  Our  Work  

A set of core values undergird this vision, and inform all of our strategies and actions. We believe that:

1) All students can and will learn – We care deeply about each student, and we believe that every student has the potential to learn at high levels. We believe the culture, language, and background that each child brings to school are strengths to build upon, and that we have a responsibility to meet each student's educational needs and goals and provide a safe and engaging environment.

2) High quality instruction is at the core of our work – We believe in the persistent pursuit of excellence in teaching and expertise in content. We strive to deliver instruction that reflects high expectations for learning, that inspires students to meet high standards, and that sparks passionate and joyful interest in learning. We believe in the power of teachers and the principals who support them to provide transformative instructional experiences for all children.

3) Schools are learning organizations – We believe in cultivating respectful and productive relationships amongst all stakeholders that promote critical reflection, shared accountability, and continuous improvement. We are committed to constantly improving the performance of each person and each system within the organization.

4) Parents and families are our partners – Parents and families are the primary custodians of their child's learning. We believe that our role is to work in partnership with parents and families to provide students with the education they need and deserve.

5) We are trusted stewards of public resources – We believe that all District staff are responsible stewards of existing resources whereby all expenditure decisions – no matter how large or small – are aligned with and help to advance the District's strategic priorities. It is equally important that we operate in a manner that ensures fiscal and financial stability.

What We  Have  Accomplished:  v1.0  to  v2.0  

By taking many of the Actions outlined in Action Plan v1.0, published in January 2013, we have managed significant changes to both the structure and operations of the District and achieved much during this past school year. As a result, we have a stronger foundation on which to expand our system of excellent schools and to continue to ensure more students are in great schools.

Throughout this past year, we have increased our focus on teaching and learning by launching the alignment of our curriculum with the Pennsylvania Core Standards and the implementation of these standards within our schools. Toward that end, the District developed and is working with our educators to implement high-performing school practices as well as highly effective instructional practices across all District schools (see Exhibits 3 and 4).

We have also worked to improve school climate and safety. The School District has decreased the number of persistently dangerous schools from six (6) schools in SY 2012-2013 to two (2) schools this current year. Just as importantly, we were able to provide research-based behavioral and intervention programs – Positive Behavioral Intervention and Supports (PBIS) and International Institute for Restorative Practices (IIRP) – to 26 District schools to improve students' learning environment.

Our staff, along with our principals and partners, worked tirelessly throughout the summer to provide a welcoming environment for our students and their families. More than fifty District schools received new paint jobs, lighting upgrades, tile replacements or other improvements, and every school that welcomed our students hosted an open house. Our partners also provided over $3 million of additional funds and services to support school opening.

While 23 schools were closed and two schools relocated, improving the utilization of our schools from 67% to 74%, the District along with our partners continued to concentrate on improving our overall system of schools and providing our students with quality options. The District continued to turn around our lowest performing schools by investing resources in four (4) new Promise Academies, two (2) new pre-Promise Academies, and three (3) Renaissance charter schools, positively affecting over 5,000 students. We improved Career and Technical Education (CTE) programming with the support of the Middleton Family, implemented a career academy model at two high schools, and piloted a proficiency-based pathways program in several of our high schools. Furthermore, over $5.9 million was secured to support the expansion or creation of high performing schools in the city for SY13-14; we opened an extension campus for SLA at Beeber, established a new project-based learning school, The Workshop School, and expanded Hill-Freedman, a high performing middle school. Additionally, $3.3 million was secured to support the development of three new, evidence-based high schools for SY14-15.
As this school year saw charter school enrollment increase to over 60,000, we have also seen significant improvements in our charter authorizing function. This year's charter renewal process is more effective and efficient, and less burdensome on charter schools. We have drafted a comprehensive revision to the SRC's authorizing policies, including clearer standards and a focus on student outcomes. We have made tough non-renewal recommendations on three low-performing charter schools.

None of this work could be achieved without talented staff. We launched the PhillyPlus+ pilot program for principal training residencies in collaboration with the Great Schools Compact and have sought to provide increased and improved professional opportunities for teachers and principals through the implementation of evidence-based "high performing schools practices." We have also worked with higher education partners to implement change management programs for school leaders and administrators. Our teachers and principals have also all been trained on the Danielson Framework for Teaching. Teachers are now implementing PA Core aligned instructional practices, and we are planning city-wide teacher conferences to share expertise and develop collaborative solutions to instructional challenges.

Our organization has also begun a profound transformation. We have shifted towards greater transparency and accountability through the launch of our open data initiative, and the revamping of our town halls and leadership meetings. We have improved our operational efficiency and effectiveness by, amongst other things, reducing principals' approval burden on grant-funded extra-curricular activities as well as starting the process of revising our position control systems. We have continued to support innovation through our support of EduCon and the first-ever District-sponsored hackathon to promote technology solutions for business challenges.

Finally, in our role as responsible financial stewards of public resources, we consistently advocated for additional school funding to ensure our schools could open; this resulted in $112 million of additional revenue for the District in FY 2014. In FY 2012 a series of actions were taken to reduce expenditures by $662 million, consisting of $526 million in recurring cuts and $136 million of non-recurring cuts. However, a structural gap remained for FY 2013; as a result, the District borrowed $300 million in FY 2013 in order to maintain FY 2012 service levels in schools, while simultaneously generating approximately $17 million in savings. For FY 2014, owing to the loss of approximately $119 million in federal and state grant funds and a projected budget gap of approximately $304 million, the District was forced to cut expenditures and open schools with minimal staffing and a reduced central office; this loss was mitigated by the fact that we identified and are tracking an additional $96 million in savings. We have also implemented a strong system of grant compliance and have begun planning for the implementation of a weighted student funding formula.

The District's continued development and progress since the release of Action Plan v1.0 in January 2013 merits an update to the Plan. Consistent with v1.0, Action Plan v2.0 is intended to outline the priority work that we will pursue to achieve our Anchor Goals. As with Action Plan v1.0, Action Plan v2.0 is a "living document," subject to change as progress is made, circumstances change, and additional evidence comes to light.

The Case  for  Investment  

This Action Plan describes what we need to do. We can do some portion of each listed action within our existing resources, but this will not make our schools great. While we feel tremendous urgency to make our schools great, we are constrained by our financial condition. More investment is required, and the case for this investment is clear:

• We have a detailed, evidence-based plan to make all our schools great and to ensure all students have access to a great school.
• For students, increased educational attainment has been consistently linked to increased wages. Earning a high school diploma "increases average lifetime earnings by $200,000, a bachelor's degree increases such earnings by $600,000."1 Furthermore, increased education improves life outcomes and results in higher levels of civic engagement.
• For communities, a well-maintained, quality neighborhood school can help attract and retain residents in an existing neighborhood, generate support for local businesses, and revitalize the community.
• For the local economy, an educated workforce increases productivity and, by extension, regional wealth; successful graduates also reduce the burden on the public of future costs.




To facilitate these positive outcomes, it is imperative that the educational options provided to our students and families are of high quality.

In the near-term, an improved education system provides a rationale for our families to continue to select our schools and for our millennial workforce to remain in the region as they raise their own families. According to a recently released study, half of the young adults living in Philadelphia indicated that they "definitely or probably would not be living in Philadelphia five to 10 years from now [… because of] job and career reasons, school and child-rearing concerns, and crime and public safety." Of those who indicated they might move from Philadelphia, 29% stated school and child-upbringing concerns – the second most cited reason.

A high quality education system can help draw new families and jobs to the region, contributing to a virtuous cycle of economic development. In the long-term, an improved education system will mean more of our students graduate ready for college and career, thereby contributing to and benefitting from this economic growth. The imperative is clear: we must work to improve our schools to ensure that Philadelphia is a city of opportunity for all our children.

Part I:  Anchor  Goals  

Anchor Goal 1: 100% of students will graduate, ready for college and career.

More than ever, our students' graduation and future success depend on their ability to demonstrate mastery of high standards at all levels. Specifically, our entering 9th grade class is now required to score at proficient on the state's Algebra I, Literature and Biology exams in order to graduate. Based on the most recent performance results, only 39.8% of our 11th graders scored proficient or advanced in Algebra I, 20.3% scored proficient or advanced in Biology, and 53.4% scored proficient or advanced in Literature.

We believe that all students can succeed academically. This is demonstrated by the fact that the District is home to some of the best schools in the city and state. It is our collective responsibility to work relentlessly to improve academic outcomes and opportunities for our students who are progressing through our system. The truth is that while we have graduated more students over time and our college matriculation rate has also risen, we are still far below the national average for graduation and college matriculation rates. Our 4-year cohort graduation rate of 64% is 14 percentage points lower than the national average of 78%.

[Chart: Student Progression From 9th Grade Through to College Matriculation – 9th Grade Cohort (2008-2009): 100%; Graduation (2011-2012): National Average 78.2%, SDP 64%; First Fall Matriculation (2012-2013): National Average 53.3%*, SDP 32.3%. Source: NCES, Digest of Education Statistics 2012 Figures and Office of Strategic Analytics, School District of Philadelphia. *Computed using NCES graduation and college-matriculation rates; not a single cohort.]

[Chart: Percentage of 11th Grade Students Scoring Proficient/Advanced on the Keystone Exam, SY12-13 – Algebra I: 39.8%; Biology: 20.3%; Literature: 53.4%.]




Anchor Goal 2: 100% of 8-year-olds will read on grade level.

The foundation of all student learning begins in early childhood and is built on the ability to read on grade level. Research has shown that students' third grade reading levels are highly predictive of their 8th and 9th grade reading performance, high school graduation and college attendance.2 However, in 2012-2013, only 45% of District students scored proficient or advanced on the PSSA-Reading assessment, the lowest it has been since 2005-2006. Recognizing this fact, and the importance of 3rd grade literacy in students' future success,3 it is imperative that we work to improve early literacy performance across the District.

[Chart: Percentage of 3rd Grade Students Scoring Proficient/Advanced on PSSA-Reading (SY12-13) – Advanced: 7.0%; Proficient: 37.5%.]

Anchor Goal  3:  100%  of  schools  will  have  great  principals  and  teachers.      

Tennessee: Teachers  and  Student  Performance  

Students placed  with  high-­‐performing  teachers  for  three  years   versus  students  placed  with  low-­‐performing  teachers  for  three   years   Percencle  Performance  

Teachers and  principals  have  a  tremendous   impact  on  student  learning.    In  a  longitudinal   study  of  student  performance  in  Tennessee   (see  chart),  students  who  were  placed  with   high-­‐performing  teachers  for  three   consecutive  years  performed  52  to  54   percentile  rank  higher  than  students  who  were   placed  with  low-­‐performing  teachers  for  three   consecutive  years.4    Therefore,  in  order  to   provide  students  with  a  school  climate  and   culture,  and  dynamic,  excellent  classroom   experiences  which  facilitate  and  advance   learning,  we  are  committed  to  ensuring  that   all  of  our  schools  have  great  principals  and   teachers.    

100% 50%  

96%

83% 44%  

29%

0% Percensle  Performance:     System  A   High-­‐Performing  Teachers  

Percensle Performance:     System  B   Low-­‐Performing  Teachers  

Anchor Goal  4:  SDP  will  have  100%  of  the  funding  we  need  for  great  schools,  and  zero  deficit.   We  are  currently  faced  with  significant  financial  challenges.  Although  we  have  made  significant  cuts  to  our   operating  costs,  non-­‐discretionary  costs  including  pensions,  benefits  and  debt-­‐service  continue  to  increase  and   impose  enormous  financial  responsibilities.       As  detailed  above,  additional  resources  are  needed  to  provide  our  students  with  a  high  quality  education.    The   District  will  work  tirelessly  to  seek  additional  resources  for  our  students  and  schools,  and  to  ensure  these   resources  are  deployed  in  a  manner  that  will  help  yield  the  best  student  outcomes.     Simultaneously,  we  must  continue  to  be  good  stewards  of  public  resources  and  maintain  financial  stability  in  order   to  better  serve  our  students’  academic,  social  and  emotional  needs.      Through  sound  fiscal  management,  we  will   be  able  to  provide  our  students  and  staff  with  a  stable  learning  and  working  environment  while  ensuring  the   financial  sustainability  of  the  District.    




How We  Will  Track  Our  Progress  and  How  We  Will  Hold  Ourselves  Accountable  

Anchor Goal 1: 100% of students will graduate, college- and career-ready
• The percentage of students who score advanced on the PSSA exams
• The percentage of students who score proficient or advanced on the PSSA exams
• Keystone exam pass rate (first time and repeat)
• The percentage of students who score advanced on the Keystone exams
• The percentage of students reading below grade level who demonstrate improvement
• Growth (AGI) on PSSA exams
• Growth (AGI) on Keystone exams
• On-Track Metric - % of First-Time 9th-, 10th-, and 11th-Graders who Earned the Minimum Number of Credits Required for Promotion
• Back-on-Track Metric - % of Under-Credited Students who Earned the Minimum Number of Credits Required (or More) for Promotion
• AP - % of 12th Graders Ever Scored 3 or Higher (on at least one exam)
• SAT/ACT - % of 12th Graders Ever Scored 1550 on SAT / 22 on ACT
• NOCTI/NIMS - % Competent or Advanced
• 4-Year Cohort Graduation Rate (New Local)
• First-Fall College Matriculation Rate
• % of Students Attending 95% or More of Instructional Days
• Student Retention Rate
• % of Students with Zero Out-of-School Suspensions
• Teacher Attendance Rate
• ELL – ACCESS performance and growth
• ELL – exit / re-entry

Anchor Goal 2: 100% of 8-year-olds will read on grade level
• The percentage of students in grades kindergarten through 3rd grade reading on grade level
• The percentage of students in 3rd grade scoring proficient or advanced on the PSSA-Reading Assessment

Anchor Goal 3: 100% of schools will have great principals and teachers
• The percentage of schools with principals who have improved school climate and student growth
• The percentage of schools where the principal is identified as "proficient" or "distinguished"
• The percentage of schools where the majority of teachers are identified as "proficient" or "distinguished"

Anchor Goal 4: SDP will have 100% of the funding we need for great schools, and zero deficit
• (Revenues – Expenditures Required for Great Schools) > 0
• % increase in revenues

The Actions that follow have been proven to help promote and improve student achievement; it is our job to take these Actions and implement them with fidelity.




Part II: Strategies and Actions

KEY DEFINITIONS:
STRATEGY: Represents a District priority around which District work is organized that directly contributes to the achievement of our Anchor Goals.
ACTION: Represents a specific and measurable body of work that directly and significantly contributes to the achievement of a Strategy.

The Strategies and Actions that follow are connected. The decision to place any particular Action under any specific Strategy is an attempt to provide order and clarity to the work; it is by no means an indication that a particular Action does not influence or advance another Strategy.

Strategy 1:  Improve  Student  Learning  

Providing our students with opportunities and choices upon graduation requires that we improve the content we deliver and empower our teachers with the tools necessary to furnish our students with the best quality instruction. Through this Strategy, we will implement Actions that will clarify our expectations for teaching and learning and provide the tools our educators need to differentiate their high quality instruction.

A. Fully adopt and integrate the PA Core Standards in all of our teaching and learning activities. The PA Core Standards are a consistent set of standards developed in reference to best educational practices from across the globe, aimed at ensuring students are prepared to succeed in their college and career goals.5 The PA State Board of Education adopted the Common Core standards in 2010 and subsequently tailored them to meet the specific needs of students in Pennsylvania, creating the PA Core Standards. These new standards will require PA Core aligned assessments for graduation starting with the class of 2016-2017. We are working to ensure our students have the instruction and opportunities to master the new rigorous academic content and will modify the curriculum to match the focus on literacy and increased reading time.

The Pennsylvania State Board of Education adopted the Common Core Standards in 2010,6 an important step towards ensuring that our students graduate with the knowledge and skills critical for success. The state has released new Keystone assessments and changed its graduation requirements to reflect these new standards.7 Research suggests that District proficiency rates could decline by as much as 36 percentage points during the shift to assessment under these higher standards.8

"The shift to Common Core Standards was a good step to take. We all need to understand how to be 'results oriented' regarding academics and curriculum."
- Elementary School Principal

B. Define college and career readiness based on student mastery of content, and align graduation standards. Preparing students for college and career begins the moment our students enter our schools. By providing our students with high quality programs, instruction, and learning experiences, as well as with diverse opportunities to demonstrate content mastery, District students will have multiple means by which they can reach their college and career aspirations.

Students who read on grade level by third grade, graduate within four years, score a 3 or better on an AP exam, score a 1550 on the SAT,9 and matriculate within one year of high school graduation are more likely



to persist and graduate from college. Graduation standards were last revised in 2005 and are in need of updating.

C. Identify and implement a rigorous, flexible PreK-12 curriculum. The District will work to implement a PA Core aligned curriculum which is coherent enough to provide principals and teachers with much-sought guidance, yet flexible enough to facilitate individual school missions and approaches. A rigorous, flexible curriculum will foster opportunities for personalized instruction while ensuring that students have the opportunity to achieve and exceed state standards.

Coherent, comprehensive curricula aligned with clear standards have been shown to be critical components of systemic, sustained improvements in student outcomes.10

D. Implement a literacy-rich early childhood continuum, including recuperative practices. We will equip our young learners with the cognitive, social and emotional skills they need to start and stay on track in school by implementing a comprehensive literacy framework, working with our partners to increase the number of high quality pre-K seats across the city, strengthening our students and families’ kindergarten transition experience, and bolstering literacy in kindergarten through 3rd grade. We will do all of this as an active collaborator with Philadelphia’s campaign for grade-level reading.

High-quality early education programs are critical for the success of children.11 Research demonstrates that children who take part in high-quality pre-kindergarten programs become better, higher performing students. Such success translates into higher graduation rates, better jobs, more fiscally responsible citizens, less dependency on social supports or involvement with law enforcement, and a significant improvement of their state’s economy.12 However, based on performance on the Developmental Reading Assessment (DRA), only 53% of District kindergarten students were reading on grade level at the end of the 2012-2013 school year, and by the end of 3rd grade, approximately 45% of District students score proficient or advanced on the state’s PSSA Reading assessment.

“Begin to stress literacy and writing from elementary school onward so that by the time students arrive in high school they will have a working vocabulary to discuss literature and the ability to write on a proficient level.”
- High School Teacher

E. Develop and implement a coherent assessment system. To ensure that students are learning throughout the year and that teachers and parents have the information necessary to support student learning, the District will implement a comprehensive assessment system that helps educators monitor student progress against PA Core aligned standards, informs instructional practices, and assists in the identification of student interventions. To advance our early literacy goal, the District will prioritize common assessments and universal screeners for early learners.

A coherent assessment system – one which uses a combination of formative and summative assessments13 – makes it possible for educators to track and advance student learning throughout the year as well as determine whether students have learned the necessary content by the end of the year. High performing systems have used individual student data to identify strengths and weaknesses in instruction, establish a sense of shared accountability, and focus on results.14

F. Promote effective instructional practices in every classroom. Quality instruction is core to student success; therefore, we will provide professional development opportunities and support teachers in the implementation of highly effective instructional practices (see Exhibit 4), a research-based set of effective



English Language Arts, technical subject and math practices aligned to PA Core instruction. In addition, we will continue to promote the District’s Backward By Design instructional framework, an outcomes-focused instructional model aimed at ensuring effective delivery of the PA Core Standards and enhanced by the Response to Instruction and Intervention (RtII) process.

Student achievement is highly correlated to the quality of instruction students receive; it is critical that we define quality instruction and support our teachers in its delivery.15

G. Accelerate progress towards personalized learning. Our students have different backgrounds, experiences, skills, interests, and learning styles. We can best support our diverse student population by implementing a variety of instructional strategies and offering learning opportunities that keep our students engaged.16 Personalization involves both the creation of deep relationships so students are well known to adults, as well as the customization of how content is delivered to better support a more diverse array of learners. Promising strategies and opportunities include competency-based and individualized learning pathways, project-based experiences, and opportunities for self-paced acceleration and recuperation that include online learning, anywhere/anytime learning, and internships as well as other real world learning experiences. To help facilitate this work, our educators, families and students will work collaboratively to develop personalized learning plans to guide our students’ learning. We will also provide educators and administrators with disaggregated data to inform instructional and intervention practices, helping to promote personalized educational experiences for our students.17 The District will develop and promote these practices, and identify and invest in technology tools and systems that will give students and educators the ability to personalize pathways and pace, with opportunities for accelerating and recuperating learning and access to "just in time" feedback and instructional content. By deeply personalizing the learning experience for its students, the District will support all students to master rigorous standards.

Personalized, or competency-based, learning allows students to progress as they demonstrate content mastery. Such instructional approaches to personalizing learning are relatively new, but hold great promise.18 Schools that are advancing models of personalized learning are achieving impressive results. For example, Summit Schools in California are highly personalized, rigorous and focused on helping students to direct their own learning. Summit graduates are on track to graduate college in six years at double the national average.19 Similarly, The New Tech Network, a nonprofit organization that transforms schools into innovative learning environments through a more personalized, blended-learning and project-based approach, is showing promising results – 74% of New Tech students enroll in 2- or 4-year colleges.20
H. Provide high quality Special Education services in the least restrictive learning environment. The needs of our students with disabilities are diverse; accordingly, we will align our organization, systems, policies, and investments to provide excellent instruction and effectively meet their unique needs.21 Among other activities, we will review and utilize our students’ data during IEP meetings to inform the programs and services we provide; train and share practices and instructional strategies that have proven effective in improving the outcomes for our students with disabilities with our administrators, teachers and staff; and work with schools and staff to ensure that students with disabilities are being educated in the least restrictive environment.

Most recent District data indicate that only 16% of District students who have an IEP scored proficient or advanced on the PSSA in Reading and only 18% scored proficient or advanced on the PSSA in Mathematics. To offer a national comparison, in Maryland, 28% of 4th grade students with an IEP scored proficient or advanced on the National Assessment of Educational Progress (NAEP) Reading Assessment, considered a “harder” assessment than the PSSAs. Similarly, in Massachusetts, 21% of 4th grade students with an IEP scored proficient or advanced on the NAEP Math Assessment.22



I. Support rigorous and linguistically appropriate learning experiences for English Language Learners (ELLs). With appropriate capacity and fidelity of implementation, English Language Learner programs improve the outcomes of the students they serve.23 The District will work to improve and expand bilingual programs and New Learning Academies to serve additional students and will track ELL students’ progress to ensure they are succeeding and have received sufficient support.

In 2012-2013, only 14% of English Language Learner students scored proficient or advanced on the PSSA in Reading and 27% scored proficient or advanced on the PSSA in Mathematics. To offer a national comparison, in Ohio, 18% of 4th grade ELL students scored proficient or advanced on the National Assessment of Educational Progress (NAEP) Reading Assessment, considered a “harder” assessment than the PSSAs. Similarly, in Indiana and Ohio, 30% of 4th grade ELL students scored proficient or advanced on the NAEP Math Assessment.24

J. Integrate a focus on “academic tenacity” throughout the curriculum. We will actively cultivate students’ academic tenacity by integrating the language and skills of resilience, perseverance, self-control, and problem-solving into schools’ curriculum and culture.25

Students can be taught techniques to strengthen their academic tenacity,26 defined as the “mindsets and skills that allow students to look beyond short-term concerns to longer-term or higher-order goals and to withstand challenges and setbacks to persevere toward these goals”.27 These “non-cognitive” skills are more reliable than traditional academic indicators in predicting whether students will graduate, their final Grade Point Average, and their income levels, savings behavior, and mental and physical health as adults.28

K. Improve student nutrition and meal experience. We will continually improve student nutrition, student participation in meals, and customer satisfaction by transitioning schools, where possible, to full-service meals and actively engaging students in making healthy dietary decisions. In addition, we will utilize the 2014 satellite meal RFP to improve nutritional value and options, move to the USDA Community Eligibility Option, and work to retain our student meal “best practices” designation from the USDA and PDE. We have promising evidence of declining obesity rates among Philadelphia students,29 and we will continue to actively invest in our students' health through high quality school meals.

Improved nutrition and participation in meals – especially breakfast – have been associated with increased academic performance, improved attendance, and decreased tardiness among school-age children.30




Strategy 2:  Develop  a  System  of  Excellent  Schools  

Regardless of whether our schools are being managed by District staff, charter operators, or contractors, we are committed to providing high quality options for all Philadelphia students. As this Action Plan makes clear, while we aspire to being a great charter school authorizer and contract-school manager, the vast majority of our efforts are focused on making District-managed schools great. The District has four levers to help ensure that Philadelphia students have high quality school options: (1) transforming poor-performing schools; (2) expanding existing good schools; (3) creating new good schools; and (4) closing poor-performing schools.

The Actions that follow reflect the fact that no single lever offers a “solution.” Rather, we must engage all levers as parts of a necessary system of change.

A. Make all District schools great by implementing high performing school practices (see Exhibit 3). Based on extensive research regarding high performing school practices, the District has defined the expectations for all schools, and therefore, for school leadership. By promoting these practices, incorporating them into principal development, and holding all school leaders accountable for the implementation of these practices, the District will create the best possible in-school conditions for student success.

Schools that have strong essential supports analogous to many of the District’s high performing school practices were up to ten times more likely to improve students' reading and mathematics learning than those where one or more of these indicators were weak. Moreover, a low score on even just one indicator can reduce the likelihood of improvement to less than ten percent.31

B. Provide students with an environment conducive to learning by implementing and maintaining age-appropriate, school-wide safety and climate plans that incorporate evidence-based programs. It is imperative that we improve school safety and climate as they are essential elements of an environment that is conducive to learning.32 The District will implement evidence-based school-wide climate and culture



programs33 and train school administrators on the creation of safe and constructive climates; provide professional development and support regarding our attendance, truancy, and disciplinary policies and practices; and implement Response to Instruction and Intervention (RtII) strategies that address students’ behavioral needs. The District will prioritize the identification of age-appropriate behavioral health and disciplinary alternatives for young students.

Safety and climate have been linked to improved student achievement; reduced vandalism, absenteeism and disciplinary incidents; and higher teacher retention and satisfaction.34 However, in the 2012-2013 school year, there were over 9,500 serious incidents in District schools and more than 17,000 students received at least one in-school or out-of-school suspension. This meant that 58 of our schools had 10 or more serious incidents per 100 students and 12% of District students were suspended at least once during the school year. Just as importantly, only 8 out of 52 high schools had 50% or more of their students attending 95% or more of enrolled days.

“Safety is one of my most important concerns. Students need to know that their school is a safe, welcoming learning community. Bullying awareness, prevention should be emphasized.”
- Elementary School Teacher

C. Ensure all schools are porous – connected to community resources and partnerships to meet student needs. Philadelphia schools are surrounded by a rich array of resources that support the development and learning of students. These resources are currently under-utilized. District and school leadership need to ensure that all schools are “porous” – meaning open to partnerships that aid and support students to thrive – and that such resources are best aligned with the needs of students.

In a review of effective high school design principles, establishing and maintaining effective partnerships with organizations that enrich student learning is one of the 10 core principles found in high-performing secondary schools.35

D. Empower school leaders and their leadership teams with the authority to make important decisions. Based on the practices of high performing schools globally, we know that student performance can be improved when principals have autonomy over academic content, personnel and budgets. Accordingly, we will clarify and communicate a consistent set of autonomies, decision-making processes and expectations to help our principals manage their schools. We will also ensure that our schools receive appropriate and differentiated support, guidance and oversight.

Research suggests that nearly 25 percent of the variation in a school’s achievement can be attributed to a principal because he/she is in a unique position to bring together the multiple in-school factors that are necessary to significantly improve student achievement on a large scale.36 This fact, in combination with a study by OECD which suggests that “in countries where schools have greater autonomy over what is taught and how students are assessed, students tend to perform better,”37 makes it clear that principal autonomy is critical to student success.

E. Make poor performing schools better through the Renaissance turnaround program, including evidence-based revisions to the Promise Academy model. Through the Renaissance Schools Initiative and other proven interventions, we will turn around our lowest-performing schools that have failed to make and sustain significant improvements in student academic performance, attendance, and school climate. It is imperative that the District utilize evidence-based models and strategies associated with aggressive turnarounds.

Initial reviews of the Renaissance Initiative indicate that K-8 schools were making statistically significant gains (or mitigating declines) over comparison schools in both reading and math during the first two years




of the program. In projecting whether schools participating in the program are on track for dramatic improvements over a 5-6 year window, Renaissance charter schools operated by Mastery Charter are on track for dramatic improvements in both reading and math, while the results for other Renaissance charters indicate that they may fall short. Mastery, Aspira, Young Scholars and Universal have successfully turned around former District schools. In particular, four Mastery schools and one Aspira school are projected to have reading and math proficiency scores that exceed 60% by 2015-2016. Promise Academies, on the other hand, while exhibiting promising early results, are not currently on track to achieve similarly significant gains over a 5-6 year window; however, this may be a result of poor implementation fidelity.38

F. Promote compelling, successful programs including Career and Technical Education and project based learning. The District will increase the number of students that are enrolled in and have access to relevant high-quality programs that support student learning by promoting existing programs that still have capacity and expanding programs that have proven successful.

Students who participated in project-based learning classes tended to perform better on assessments of content knowledge, had high levels of engagement, and benefitted from improved critical thinking, problem-solving, and collaborative skills.39

In 2012-13, the graduation rate for students in comprehensive CTE high schools was 89.2%, which is higher than the District’s average graduation rate of 64%. In partnership with the business community, continued investment in CTE programs that reflect Philadelphia's high-priority, growing occupations supports the Mayor's initiatives in these areas and will improve academic outcomes for participating students.

G. Review and improve the provision of schooling across all our alternative settings. The District will work to improve and monitor the quality of non-traditional programming currently being provided to Philadelphia’s students, including alternative education students, foster students placed by external agencies, and students placed in outside educational institutions.

Nearly 10,000 Philadelphia students either participate in alternative education programs or attend outside education institutions at a cost of approximately $129 million annually. For students in our accelerated program, across every 9th grade cohort from 2003 to 2006, “accelerated high school students had significantly higher five- and six-year graduation rates than similar students enrolled in neighborhood high schools;” however, the majority of our accelerated students graduate below high school reading and numeracy levels, only 27% go on to postsecondary education, and only 11% persist.40

H. Strengthen neighborhood schools. Nearly 90% of the students who attend a District school are enrolled in a neighborhood school. Therefore, in order to improve opportunities and outcomes for the most students living in this city, it is imperative that the District focus on strengthening neighborhood schools. We will do this by prioritizing neighborhood schools as we implement many of the instructional and climate actions outlined in this Plan.

In 2012-2013, excluding charter and closure schools, less than half of the students attending neighborhood schools tested proficient or advanced on the PSSA Math and PSSA Reading, compared to 80% of their peers who attended special admission schools.

I. Create and launch new, evidence-based school models, and scale the ones that work.
We will adjust our school models and classroom structures to reflect the needs of our students by expanding and replicating high-performing schools and programs, investing in new models, and encouraging flexibility and innovation in educational delivery.



New York City implemented a reform effort to establish “small schools of choice” (SSC) – small, academically nonselective four-year public high schools. With additional start-up resources and assistance, as well as policy protections to facilitate leadership development, hiring and implementation, there have been promising results. For example, after the first year of high school, 58.5% of SSC students were on track to graduate compared to 48.5% of non-SSC students. Furthermore, after four years, SSCs increased their overall graduation rates by 6.8 percentage points.41

J. Be a great charter school authorizer to ensure all charters are good school options, and promote the sharing of successful practices across all schools. The District will support the School Reform Commission (SRC) in becoming a top-quality charter school authorizer by improving the quality, clarity, transparency, and consistency of the SRC’s charter school authorizing practices through the Authorizing Quality Initiative (AQI).42 In becoming a great charter authorizer, the SRC will both promote and expand high-quality public school options and actively seek the non-renewal and revocation of the lowest-performing charter schools. In addition, as a part of this work, the District will continue working on behalf of the SRC to ensure all charter schools have signed charter agreements and manage their enrollment to ensure that they stay within their enrollment limits, to facilitate the equitable and successful distribution of resources across families of students in District and charter schools and within the charter sector.

The 86 charter schools in operation in Philadelphia enroll approximately 60,000 students. The number of Philadelphia students enrolled at charters has expanded by over 45,000 since the 2003-2004 school year. While there are many examples of charters driving transformational change, charter school performance, like that of District schools, is variable. According to the State School Performance Profile, 37 out of 86 charter schools achieved a “good school” state rating, i.e., a score of 70 points or better.

K. Develop and implement a school progress measure. We will develop a School Progress Report to measure and communicate the performance of both District and charter managed schools across key indicators that reflect the feedback of parents and educators and are also aligned with District priorities. This new measure will also help the District hold District and charter schools to the same high standards of academic performance, equity and safety.

School report cards can help increase transparency, establish a basis for accountability and provide tools for effective management, ultimately helping parents, teachers and school officials assess school performance and status, and develop the most effective interventions and supports.43

L. Provide a clean and comfortable building environment in all schools. We will improve the physical and environmental condition of buildings and transform buildings into welcoming and inviting spaces. Specific work will include executing our collaborative labor plan with 32BJ / SEIU 1201, implementing a work order management system, executing a facility condition assessment study, undertaking customer surveys and executing a more aggressive preventative maintenance plan.

School building design and building conditions have a measurable impact on student achievement. Researchers have found a “5-17 percentile point difference between students in poor buildings and those in standard buildings.”44

“Make schools a place where people WANT to go. Where do you work? Is it somewhere bright, clean and cheerful? Why shouldn't schools be the same?”
–Elementary School Teacher

M. Continuously update and refine the system-of-schools plan, including expansions and replications of good schools, and transformation or closure of chronically under-enrolled and under-performing schools. We continue to develop a comprehensive, evidence-based, transparent decision-making system for all our schools



to ensure that good schools are supported and promoted, good schooling ideas flourish, and poor-performing schools are not left to languish. To this end, we are implementing a revised way of making decisions about our system of schools, including transparent processes, clear expectations, and follow-up actions.

The School District of Philadelphia has some of the best performing schools in the state; we have selective admission schools; we have career academies and neighborhood schools; we also have schools struggling to get more than 50% of their students reading and completing math on grade level. Therefore, in order to support all of our students, we must review schools both on an individual basis and as a collective.

Strategy 3:  Identify  and  Develop  Exceptional,  Committed  People  

Our ability to achieve our Anchor Goals depends on the ability of everyone in the District – our full team – to execute the Actions identified in this Plan. We are therefore committed to supporting our staff’s continuous professional growth and development, as well as ensuring exceptional people continue to choose the District, and choose to stay.

A. Improve recruitment and hiring practices to attract the highest quality candidates. It is crucial that we continue to recruit, hire, and retain quality teachers, principals and central office staff to successfully execute the Actions in this Plan. Practices that would help advance this action include: recruiting talented teachers and hiring them in a timely manner;45 strengthening the principal pipeline and identifying principals with strong leadership capacity; and improving placement practices to better match employees’ skills with position requirements.46 To help advance our goal of improving early literacy, the District will prioritize the hiring and placement of high quality teachers and principals in elementary schools.

With aggressive recruitment, districts receive more applicants than they need to fill existing vacancies; however, delays in job offers result in the withdrawal of between 31 and 60 percent of applicants, with the majority (50 to 70 percent) of applicants indicating the hiring timeline as a major reason for taking another position. Consequently, districts are often forced to fill their vacancies with less qualified candidates.47

“…[p]eople, not programs, impact student performance. Recruit and retain highly qualified teachers.”
- Middle School Teacher

B. Strengthen the principal and teacher pipelines. The District will work to actively prepare new leaders from our cadre of high-performing educators. We will also promote residency years for aspiring leaders and will build stronger, more actively-managed relationships with our core relevant partners, including higher education institutions.

School leadership ranks second only to teacher quality in its impact on student achievement,48 and cultivating a strong pipeline of school leaders by investing in leadership pathways has proven to produce highly effective school leaders. According to over 10 years of research by the Wallace Foundation, building a strong pipeline of school leaders requires that districts clearly detail the rigorous requirements for school leadership positions, provide high-quality training for aspiring leaders, engage in selective hiring, and offer solid on-the-job support and performance evaluations.49

C. Celebrate, reward, retain and promote high performing staff, particularly great teachers and principals. Given the District’s investment in recruiting and training our high-performing employees and, more importantly, their contribution to improving student achievement, it is crucial that we retain high performing



staff by  providing  them  with  opportunities  for  on-­‐going  growth  in  their  current  roles,  leadership   development,  and/or  other  advancement  possibilities.    

Retaining our highest performing teachers improves student outcomes in both the short- and long-term.50 According to research, “when a high value-added teacher enters a school, end-of-school year test scores in the grade he/she teaches immediately rise and students assigned to such high value-added teachers are more likely to go to college, earn higher incomes, and are less likely to be teenage mothers.”51

D. Support the continuous development of all personnel – tailored to individuals – including an emphasis on school-based coaching for principals and teachers. Teachers, principals and central office staff must be equipped to succeed in their roles. As our system evolves and improves, we will support our personnel in their efforts to adapt and enhance their skills to meet the changing demands of their jobs. Specifically, we are committed to providing effective and intensive development opportunities to support principals in the implementation of “high performing school practices” (see Exhibit 3), by focusing on improving principals’ abilities to serve as instructional leaders and the operational managers of their schools. We will also support teachers in the implementation of “highly effective instructional practices” (see Exhibit 4).

Effective professional development matters. When principals are given the support, feedback, and resources to be effective, teacher performance, student achievement, and school quality improve.52 From a teacher perspective, job-embedded professional development – such as literacy coaching – has demonstrated improvements in student literacy learning that persisted and grew over time. For example, on average, students in 17 schools that participated in a literacy coaching program made 16% larger gains in the first year than students whose teacher did not participate in the program; these gains increased to 28% in the second year and 32% by the third year.53

E. Create meaningful opportunities for teacher collaboration and for principal collaboration. We will promote teacher collaboration by supporting or creating meaningful, supportive, knowledge-sharing opportunities for teachers in formal and informal groups, cohorts, and networks both within and between schools.54 To advance our goal of improving early literacy, the District will provide professional development focused on differentiated reading and literacy strategies and interventions.

A recent study of New York City public school teachers found that “social capital” among teachers—defined as the quality and frequency of interaction and collaboration—“was a significant predictor of student achievement gains above and beyond teacher experience or ability in the classroom”.55

“Teachers need time to work as a team.”
- Elementary School Teacher

F. Collaborate with the City and other partners to make Philadelphia a premier place for principals and teachers to work.
We believe that Philadelphia is a special place to work and live, and that our schools are special places within the city. We will collaborate with teachers, principals, community organizations, city agencies and private enterprise to get the word out: we are making our schools great, and we need exceptional people to join us in this endeavor. Specific steps here include support for reduced-price housing for teachers, the promotion of Philadelphia through local and national networks, multi-city job fairs, the promotion of school successes, and collaboration with the City of Philadelphia.

To address the need for high-quality teachers and principals, districts are partnering with postsecondary institutions and non-profits to establish teacher pipelines,56 and cities and states are providing tax credits to developers who create affordable, supportive housing complexes for teachers and non-profit educational organizations.57



G. Set clear expectations for teachers, principals, and support staff and implement regular performance evaluations. Defining roles and expectations allows leaders to communicate the activities and roles that they value in their employees. We will implement the Pennsylvania Educator Effectiveness System, as outlined in Pennsylvania Act 82,58 for teachers, principals, and school-based specialists, as well as a performance management system for all staff. In short, we will be setting and adhering to high expectations for all staff – our teachers, principals, specialists, and central office employees.59

The Pennsylvania Department of Education (PDE) Educator Effectiveness Project60 is currently piloting new educator performance evaluation systems, which will be implemented statewide for school-based professionals. Quality classroom-observation-based evaluations have been linked to improved teacher performance both during the evaluation period and in subsequent years, even for experienced teachers.61 Well-structured principal evaluations that provide timely, actionable feedback and District oversight of schools can strengthen leadership practices and have meaningful impact on student achievement.62

“All educators must know what our expected outcomes are and provide clear directions to both parents and students alike.”

-­‐ High  School  Assistant  Principal  

H. Engage teachers, professional networks, labor unions, and other partners to identify, explore, develop, and scale great ideas related to talent. The majority of the District’s labor force is represented by one of five union organizations. Therefore, in order to further develop and strengthen the District’s workforce, it is important that we work collaboratively with both our staff and the organizations that represent them to develop and scale practices that work. Furthermore, the District will continue to promote and support the many formal and informal professional educator networks in Philadelphia.

The District is made up of 17,024 employees, 16,592 of whom are represented by our labor partners, including our 8,910 teachers, counselors, assistant principals, and principals.

Strategy 4:  Become  a  Parent-­‐  and  Family-­‐Centered  Organization  

Parents and families are vital assets to our schools, whose active engagement will help improve the achievement of our students as well as our overall system performance.

A. Actively reach out to parents to involve them in the achievement of the goals of their children’s schools, including the launch of an SAC in every school. We will support our parents’ efforts to be more actively and meaningfully engaged in supporting their children’s schools. We have School Advisory Councils (SACs) in 152 schools and will continue to expand the establishment of SACs to include every school in the District and will work to increase the number of parents participating in school-based family and parent groups.

Research has shown that increased engagement of parents leads to improved academic outcomes,63 including improved student language use and reading and writing skills.64

“We need to include parents in the process of creating plans for student achievement, invite parents into school to see their students, and then continue to support parents with whatever they need for their child to succeed.”
–Elementary School Teacher



B. Establish clear processes for parent and family input and ideas. The District will establish, publicize, and monitor multiple clear and consistent avenues for parents and families to express and resolve concerns. We will also seek parental feedback, whether through surveys, focus groups, or town halls, on the effectiveness of our schools and our key departments.

Meaningful family engagement in schools enables our principals and educators to capitalize on our families’ knowledge of students and communities,65 which has been associated with improved academic, behavioral and social outcomes for students.66

C. Provide parents with information about their students’ progress and how to support that progress. We will clearly articulate our expectations for learners at all ages and parents’ role in their children’s learning by providing parents with training, information, and necessary tools to support student learning. In short, we will empower parents and families with information, insight, and ideas on how to support students in their learning and how to hold schools and the District accountable for the delivery of high-quality educational opportunities.

Parents and families are an incredible source of support for students; a number of innovative practices around the country have illustrated the powerful impact of parents and teachers collaborating on student learning.67 In addition, parents and families throughout Philadelphia and elsewhere have demonstrated an ability to advocate for and support better outcomes for their children.

D. Provide  parents  and  families  with  excellent  customer  service.      Parents  and  families  are  critical  partners  in   our  work  to  educate  their  children.  As  such,  it  is  essential  that  we  provide  them  with  a  welcoming   environment  in  our  schools  and  at  the  central  office,  including  improving  the  quality  and  accessibility  of  the   District’s  call  center  by  establishing  a  call-­‐ticketing  system  and  launching  the  Knowledge  Base,  a  parent-­‐ developed  initiative  serving  as  a  one-­‐stop-­‐shop  for  parents  to  obtain  immediate  answers  to  frequently  asked   questions.         We  work  on  behalf  of  the  public,  for  the  public.    At  parent  meetings  around  the  city  and  in  recent  SRC   Strategy,  Policy  and  Priority  meetings  we  have  heard  from  parents  that  they  expect  better  experiences   with  schools  and  with  central  administration.     “Parents  should  not  have  to  work  through  a  tangled  system  to  make  sure  they  are  getting  all  they  can   out  of  the  school  system.”    –Elementary  School  Teacher    

E. Provide parents with ample information on schools, and increase the equity and transparency of the school selection, transfer, and placement process. Parents are better able to support the academic needs of their children when equipped with the necessary information.68 Toward that end, the District is committed to providing parents with frequent and transparent information about their children, our schools, and our performance as a system through ParentNet. We will integrate information requested by parents into our School Progress Report framework, leverage and utilize the multiple means of communication currently at our disposal, and clearly articulate and streamline processes associated with student enrollment, transfer and placement.

During the 2012-2013 school year, 63,316 District students were placed or transferred between and among schools. Of these, 34,962 participated in the high school selection process.



Strategy  5:  Become  an  Innovative  and  Accountable  Organization  

In addition to having talented individuals to help execute this Plan, to do this work most efficiently and effectively it is important that we coordinate and align our efforts and hold ourselves and each other accountable for our progress. We also need to create essential space for innovation and build a performance-oriented culture focused squarely on students.

A. Cultivate and sustain partnerships at the system and school levels. We will form new partnerships and cultivate existing ones in an effort to supplement the progress schools have made in order to improve children’s academic, social, and emotional development and better support families.69 The District will continue to develop and maintain partnerships with philanthropic, business, non-profit, higher education and community organizations and others, and collaboratively determine where and how our partners can support our goals. The District will also maintain and expand collaboration opportunities with current City and institutional partners to provide and prioritize academic and behavioral supports, ensure student safety, and offer extra-curricular opportunities for our students in the early grades. The District will work with our partners to promote kindergarten attendance and will work with citywide pre-K providers to create a common understanding and expectation associated with Kindergarten readiness.

During a period of significant financial challenges and transitions, City agencies, philanthropic and community organizations, and families have been extraordinarily supportive of the District and its schools. Over $10 million was secured in SY12-13 to support the transition of District students, to enable the expansion of high quality schools, and to help sustain important student-focused programming.

B. Transform the organization by instituting strategic management processes at all levels and building a culture of excellence. Strategic management provides a rigorous approach for communicating organizational goals, prioritizing and pursuing highest impact strategies, tracking progress against targets, evaluating and adjusting strategies based on data, holding the organization and team members accountable for progress toward collective goals, and identifying and celebrating individual and team contributions to overall efforts. We will implement strategic management tools to expand our capacity to more effectively, reliably, and efficiently achieve our Anchor Goals, which are focused on student and school success.

We do not currently have strategic management processes in place across the District. High performing organizations incorporate a focus on results into their cultures, ensuring all people understand how to contribute to organizational success.

C. Improve data accuracy, application, and accessibility. In order to implement data-driven goals, it is critical that we ensure the accuracy of our data and the appropriateness of its application across the system – in classrooms, schools and the central administration. To facilitate this work, we will update the District’s core systems, implement a performance framework, and establish internal processes that incorporate the use of data.

Currently, many management decisions are made in the absence of data. Strong organizations understand their goals and track progress in order to justify decisions and make necessary course corrections. At the school level, effective and timely data can help drive instructional and curricular changes, student interventions, and resource-allocation decisions.

D. Implement effective, aligned business processes. Efficiency and alignment of our organizational structures are critical to the effective implementation of reform efforts.70 This work includes, but is not limited to, the re-engineering of our position control system, our automated routing system, and our facility work order system; improving the cost and quality competitiveness of procurement services; automating additional



business processes where possible; and providing training on core systems to ensure that staff understand and can utilize systems relevant to their work.

In order to effectively execute the District’s Anchor Goals and strategies, the District must take the initiative to realign its business practices to reflect a more efficient working environment.

E. Improve communication throughout the organization and to the public. The District can build and maintain momentum for the excellent work performed by our staff through frequent, transparent and consistent communication of expectations and progress across all levels of the organization. We will start by redesigning our internal meeting structure to include regularly scheduled meetings dedicated to data-focused, collaborative problem solving, and by establishing teacher and principal advisory groups to improve communication with school-based staff. Furthermore, much of the great work underway at, and many successful programs within, the District remain under the public radar. The District has made and will continue to make a concerted effort to call attention to and highlight the innovative, productive, and excellent work that our staff and students do on a more frequent, consistent and transparent basis.

Communication is a core competency that, when properly executed, can help ensure successful project implementation by connecting the team to a common set of strategies, goals and actions.71 Clear communication can help foster organizational coherence, which ultimately improves the effectiveness and sustainability of reform efforts.72

F. Actively promote innovation and cross-functional design thinking. As a system designed to promote excellent educational experiences for children, we expect our own organization, from central administration to all our schools, to be a “learning organization.” This means we expect all elements of our organization – from classrooms to transportation depots, parent engagement centers to staff development functions – to be flexible enough to respond to information as it comes in, to solve problems quickly and efficiently, and to collaborate with a variety of colleagues (and not live in silos). This mindset and organizational philosophy will lead to better outcomes for students and families.

At the school level, working in teams can improve school performance and sustain a continuous improvement process through empowering staff, creating a sense of ownership and fostering collaboration.73

G. Implement core student- and teacher-facing systems for schools, including a Learning Management System and a Student Information System. In order to meet the growing need to use data to drive accountability in both financial practice and instructional change, the District must position itself to modernize its core systems. The District currently operates on a 13-year-old legacy Enterprise Resource Planning (ERP) system and a 25-year-old Student Information System (SIS).

Accurate and accessible data is of utmost importance for school districts that use data to inform their policies and for educators who use this information to inform their practice.74

H. Improve quality and lower cost of transportation services. We remain committed to getting all students to school safely, on time and with less than an hour of travel time. This means we will continue to roll out GPS on all our buses and work to optimize all of our route times. Additionally, we will actively review bell times and look into consolidating pick-up locations in the interest of providing much better transportation services to the majority of our student riders.

A recent research study on long bus rides indicated that students with “large average times on a bus report lower grades and poorer level of fitness, fewer social activities and poor study habits.”75 Furthermore, given the importance of a healthy breakfast on student achievement, it is imperative that school buses arrive safely and on time to enable students to participate in the school breakfast program.



Strategy  6:  Achieve  and  Sustain  Financial  Balance  

The District's future depends on our ability to efficiently and effectively manage our resources and obtain additional revenues. Through this Strategy, we will put into place Actions that ensure our expenditures do not exceed our revenues while continuing to aggressively seek additional revenues to meet the District's strategic priorities.

A. Seek additional revenues. Attainment of additional revenues, both private and public, will supplement our limited resources to allow us to better serve our students. The District will continue to seek additional, recurring revenues from the State and the City to help ensure financial sustainability; leverage the Office of Strategic Partnerships to establish partnerships to supplement the work of schools, with a focus on early literacy, until recurring revenues are secure; and apply for and manage grants that meet the needs of our students and systems.

As a result of the end of stimulus funding, historically low levels of state education funding, and stagnating local tax revenues, revenues received by the District have significantly diminished. After over $300 million in budget cuts in 2012, the School District is a "bare bones" operation. We must actively pursue additional recurring sources of revenue, including better collection efforts with the city, in support of our commitment to improve student academic outcomes.

B. Continuously identify savings opportunities and capture identified cost savings. In order to help meet ongoing fiscal challenges, the District has identified and is implementing a series of cost savings measures. Over the course of the next five years the District is expected to save over $120M through a series of cost savings initiatives.

C. Meet the immediate financial challenges of Fiscal Year 2014 and Fiscal Year 2015. As we prepare for Fiscal Year 2015, it is critical that we do not exceed our expected expenditures in Fiscal Year 2014. Therefore, we must implement strict financial controls, be strategic in our resource allocation process and continuously track progress to ensure fiscal stability in Fiscal Year 2014. Moreover, to begin to address the challenges of Fiscal Year 2015, the District must seek additional recurring revenues to replace the various one-time revenues it received in Fiscal Year 2014.

In Fiscal Year 2014, the District requested $304 million in order to provide students with the same level of educational services as was received in Fiscal Year 2013; however, the District received $112 million in revenues, of which approximately $95 million was non-recurring. These revenues enabled the District to open schools with the minimal number of staff and resources required to do so, a staffing level that was far below the allocations schools received in Fiscal Year 2013.

D. Continuously analyze the impact of spending, and deploy resources to achieve priorities, including the activities, schools and programs that need them the most. The District must continuously review our investments and the effects they have on student achievement. We will work to identify our desired student and system level outcomes and will allocate resources toward strategies that have proven effective in achieving those identified outcomes. If and when we determine that programs are ineffective, we will stop investing in them. We will also pilot a weighted student funding formula by allocating resources to schools based upon the number and needs of the students they serve, as illustrated in the sketch below.

The District invests over $1.4 billion in the education of approximately 131,000 District students.
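The weighted student funding pilot ties each school's allocation to both its enrollment and the needs of the students it serves. The following is a minimal, hypothetical sketch of how such an allocation could be computed; the base amount, need categories, and weights are illustrative assumptions and are not figures specified in this Plan.

    # Hypothetical illustration only: the Action Plan does not specify base
    # amounts, need categories, or weights; the figures below are assumptions.
    BASE_ALLOCATION = 5_000  # assumed base dollars per enrolled student

    # Assumed supplemental weights, expressed as fractions of the base allocation.
    WEIGHTS = {
        "economically_disadvantaged": 0.25,
        "english_language_learner": 0.40,
        "special_education": 0.60,
    }

    def school_allocation(enrollment: int, need_counts: dict) -> float:
        """Base dollars for every enrolled student, plus a supplement for
        each student counted in a weighted need category."""
        total = float(enrollment * BASE_ALLOCATION)
        for category, count in need_counts.items():
            total += count * BASE_ALLOCATION * WEIGHTS.get(category, 0.0)
        return total

    # Example: a 600-student school with 450 economically disadvantaged
    # students, 90 English language learners, and 75 students receiving
    # special education gets 600 * 5,000 = 3,000,000 in base dollars plus
    # 562,500 + 180,000 + 225,000 in weighted supplements = 3,967,500.
    print(school_allocation(600, {
        "economically_disadvantaged": 450,
        "english_language_learner": 90,
        "special_education": 75,
    }))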

E. Develop a comprehensive, outcomes-focused budgeting strategy, including five-year planning. Many school districts tend to budget based on prior expenditures and long-established formulas. However, to facilitate our ability to reach our goals, we will work to establish an outcomes-focused budgeting strategy which takes into account both the short- and long-term implications of our decisions. We will start by implementing an annual, standard, data-driven budgeting process across operating, capital and grant budgets to improve financial sustainability as well as organizational efficiency, transparency and innovation.76

Disciplined resource allocation is fundamental to achieving sustainable results in public education. Budgeting is an essential vehicle for prioritizing and planning, gathering stakeholder input, communicating with funders, and ensuring organizational alignment and accountability to our most important work—educating students.77

F. Institute financial controls. Incorporating fiscal discipline and control helps us manage and better understand our spending.78 The District will institute financial controls at all levels of the organization, including incorporating good financial stewardship as part of principal and program manager training, supports and evaluations.

With the limited financial controls currently in place, our organization is at tremendous risk for fiscal mismanagement and inefficiencies. We must ensure our principals and program managers, the day-to-day financial managers of our District, receive the training and support necessary to be effective, responsible financial stewards.

G. Align capital and grants programs in support of Anchor Goals. Our organization has developed certain "habits" of spending both capital dollars as well as Federal grant dollars. While these allocations from capital and grants have served the organization well historically, it is critical in this moment of serious fiscal austerity that we ensure all spending from all sources conforms with this Action Plan, in support of our Anchor Goals. Toward this end, we will implement a new "capital call" planning cycle, review all grant spending, and incorporate both capital and grant budgets alongside our operating budget within the five-year planning process.

The School District of Philadelphia's Capital Budget for FY14 is $134 million and its grant budget is $336 million.


PART III: Where We Go From Here

The plan described above is part of a process. Much of what is described here has already started, or is ongoing. Some Actions will be new, done differently, or done better. And all Actions are subject to review and revision. As with Action Plan v1.0, this document will evolve with time. That said, there is a clear path forward. Our specific next steps are:

1. Begin implementation. We will define timing, metrics and targets, identify owners, and develop implementation plans for all Actions included in the Plan, with a specific focus on activities that will advance the District's early literacy goal.

2. Set up systems and routines to drive progress. There will be weekly reviews of progress toward our targets. Collectively, we will engage in solution-oriented, data-driven problem solving sessions in support of the realization of our commitments and to resolve implementation challenges.

3. Drive the FY 2015 budgeting process. This Action Plan will serve as a major input into the District budgeting process. All central office departments and schools will be asked to align their work and budgets to the priorities set forth in the Action Plan.

4. Refine through stakeholder input. We will incorporate input from staff and stakeholders to refine and further evolve the Plan.

Principals will be invited to utilize their Learning Networks (LNs) as a forum for discussion of and input into this plan. In addition, the principal advisory board will continue to be a source of school-level input into District policy and practices. We will also use the annual District-Wide Principal Survey to refine our understanding of principals' needs and opinions.

Teachers and other school-based professionals will be invited to participate in building-level discussions and focus groups. We will continue to collaborate with teacher networks, the Philadelphia Federation of Teachers and other labor partners. We will also use the annual District-Wide Teacher Survey to refine our understanding of teachers' needs and opinions.

Parents and families will be engaged through the Actions identified above in Strategy 4, as well as through the District parent advisory council and a parent survey.

Students will be invited to participate in "youth-friendly spaces" that will encourage their participation and engagement, such as the District-wide student government initiative and existing youth organizations. Student voices and input are also provided via the Superintendent's student advisory board and an annual student survey.

Central administrative staff will engage in quarterly "town halls" and will have ongoing opportunities to share input within teams, and executive staff will engage in monthly "executive team" meetings with District leadership.

External partners will be directly engaged through our Office of Strategic Partnerships.
Furthermore,  stakeholders  will  be  invited  to  provide  input  through  a  comprehensive  planning  process  throughout   2014,  which  will  ultimately  inform  the  next  iteration  of  this  Action  Plan.    


Exhibit 1 – Inputs to Action Plan v2.0

Action Plan v2.0 expanded and improved upon Action Plan v1.0, which was developed inclusive of input from our employees, parents and families, and the broader community. All amendments in v2.0 were made in reference to input from our principals, assistant principals, teachers, school-based support staff and central office staff. More specifically, in addition to a high reliance on the content of Action Plan v1.0, Action Plan v2.0 reflects input from the following sources:

• "Invitation for Input on District-wide Strategic Priorities," a December 2012 survey through which 295 school-based staff provided input
• 2011-2012 District-Wide Public School Principal Survey
• 2011-2012 District-Wide Public School Teacher Survey
• Interviews with more than 30 district leaders, Assistant Superintendents, and program managers
• Inputs and support from additional central office staff who offered their thoughts and feedback throughout the development of the Plan
• Focus groups and parent conversations during over 50 school visits and meetings conducted by Dr. Hite from September through December 2012
• Community meetings and interviews with over 35 groups across Philadelphia
• Feedback received at Parent School Progress Report sessions
• Feedback received at Stakeholder School Progress Report meetings
• Input from the SRC's Strategy, Policy and Priorities meetings from parents, families, students, educators, community members and advocates
• Public testimony at monthly SRC public meetings

In addition, several strategic documents shaped the content of this Plan. These documents include:

• Action Plan v1.0
• Professional Development Handbook for School Leaders
• The Office of Career and Technical Education's Five Year Strategic Plan for Quality, Access and Equity: Action Plan
• The Office of Curriculum, Instruction and Assessment's proposed RtII Model Plan
• Charter School Office Authorizing Quality Initiative documents
• The Five Year Financial Plan, Fiscal Years 2013-2017
• Renaissance Schools Initiative Progress Report: 2010-2011 through 2012-2013
• The School District of Philadelphia draft Academic Priorities 2012-2013
• Meeting Milestones: The Third Annual Report to Mayor Nutter From the Philadelphia Council for College and Career Success
• A Blueprint for Action: Blue Ribbon Commission on Safe Schools, January 2012
• The Philadelphia Great Schools Compact, December 20, 2011
• College Board, Advanced Placement Data, 2009-2010
• Analysis and Findings of the SDP College-Going Working Group
• Ongoing work of the SDP Early Literacy Working Group
• Embracing the Challenge: A Five Year Blueprint For Increasing Achievement in Secondary Grades in The School District Of Philadelphia, 2008-2013
• The African American and Latino Male Dropout Taskforce Report, September 2, 2010
• Harvard University Strategic Data Project Human Capital Analyses, June 2012, and findings of the SDP Human Capital Working Group
• The Office of Charter Schools Strategic Plan Draft
• Strategic Planning for the School District of Philadelphia: Lessons Learned from Improved Districts, December 2012
• Financial Systems and Operations Working Group Report, July 2011
• Convectus Solutions, April 2011 Report
• A Blueprint for Transforming Philadelphia's Public Schools: Safe, high-quality schools. Fiscal Sustainability.
• School District of Philadelphia, 2010-2011 High School Exit Survey
• SY2013-2014 Principal Handbook
• Student Code of Conduct
• Draft early childhood strategy


Exhibit 2 – How Did We Do? A Scorecard Against Action Plan v1.0

Anchor Goal 1: Improve academic outcomes for students in all the schools we manage and in the charter schools we authorize
Anchor Goal 2: Ensure the financial stability and sustainability of the District

Rating key (shown as symbols in the original scorecard): Significant Progress / Some Progress / Little Progress

Strategy 1: ACHIEVE AND SUSTAIN FINANCIAL BALANCE
A. Capture cost savings and track progress against the Five Year Financial Plan
B. Meet the immediate financial challenge of Fiscal Year 2014
C. Implement a data-driven budgeting process
D. Institute financial controls
E. Seek additional revenues
F. Effectively manage grants

Strategy 2: IMPROVE STUDENT OUTCOMES
A. Utilize data to assess student needs
B. Sustain high academic standards and expectations
C. Advance the implementation of Response to Instruction and Intervention
D. Prioritize early literacy
E. Cultivate academic tenacity
F. Clarify the profiles of college and career ready graduates
G. Track students' progress to graduation, college and career
H. Develop a high school improvement strategy
I. Increase access to Career and Technical Education
J. Meet the needs of students in Special Education
K. Meet the needs of English Language Learners
L. Improve alternative education
M. Review outside educational institutions
N. Improve student nutrition

Strategy 3: DEVELOP A SYSTEM OF EXCELLENT SCHOOLS
A. Improve school safety and climate
B. Implement the Facilities Master Plan
C. Enhance the physical environments of schools
D. Clarify school autonomy
E. Develop innovative school models
F. Turn around low performing schools
G. Become a top-quality charter school authorizer
H. Collaborate with other school operators
I. Develop school performance measures

Strategy 4: IDENTIFY AND DEVELOP COMMITTED, CAPABLE PEOPLE
A. Enhance teacher recruitment and hiring practices
B. Implement teacher, principal and specialist evaluations
C. Strengthen teacher development
D. Create meaningful opportunities for teacher collaboration
E. Provide effective principal support
F. Increase capacity of principals and leadership teams
G. Strengthen the principal pipeline
H. Clearly define administrative staff roles and performance evaluations
I. Launch the Transformation Corps

Strategy 5: BECOME A PARENT AND FAMILY-CENTERED ORGANIZATION
A. Improve customer service
B. Launch additional, effective School Advisory Councils
C. Establish clear processes for parent/family input
D. Support the unique needs of parents
E. Empower parents with information

Strategy 6: BECOME AN ALIGNED, ACCOUNTABLE ORGANIZATION
A. Institute a change management program
B. Institute performance management processes
C. Improve data accuracy and application
D. Implement effective, aligned business processes
E. Invest in core systems
F. Ensure testing integrity
G. Cultivate and sustain partnerships
H. Update the organization structure



Exhibit 3 – SDP High Performing Schools Practices Based on the Research

The following High Performing School Practices for the School District of Philadelphia were developed as a collaborative effort between District teachers, principals, and central office administrators. Based on the most current research on effective schools, the following practices articulate the District's baseline expectations for performance in every school. When you walk into a SDP school, you should expect to see evidence of the practices listed under each category.

Vision for Learning
• School leaders developing, articulating, stewarding, and implementing a clear vision for learning for all students and a strategic plan to accomplish that vision
• All school stakeholders able to articulate a clear and shared vision for learning

School Safety
• A safe, secure and orderly environment for all

High Quality Instruction
• Principals who are experts in high quality instructional practices that consistently promote excellent instruction school-wide
• Principals who are visible in classrooms and teachers regularly receiving timely and constructive feedback on classroom instruction from school administrators and colleagues
• School leaders and teachers clearly communicating learning and development objectives that reflect high expectations for learning and growth, a belief that all students can learn, and a commitment to meet each student's educational needs

Positive Environment
• Collegial and professional relationships among staff and students that promote critical reflection, shared accountability, and continuous improvement
• Constructive management of conflict at all levels
• Teachers regularly collaborating on practice and providing each other with support and constructive feedback

Talent Development
• Careful staff selection and effective assignment of staff
• Plans to support the professional growth of staff members that are differentiated based on identified needs and individual goals
• A deliberate approach to building leadership capacity among staff

Data
• The frequent collection, analysis, and use of multiple sources of data to guide continuous improvement in student achievement and well-being and professional development for staff

Family and Community Relationships
• Positive and collaborative relationships with families and communities

For alignment with the principal leadership framework, please see the SY2013-2014 Principal Handbook.



Exhibit 4  –  SDP  Highly  Effective  Instructional  Practices   Ensuring  excellent  instruction  in  every  classroom  is  at  the  core  of  the  School  District’s  work.    Principal  leadership   and  support  of  the  District’s  teachers  to  be  able  to  implement  high  quality,  standards-­‐based  instructional  practices   is  of  critical  importance.    To  drive  this  process,  a  working  group  comprised  of  teachers,  assistant  superintendents,   and  curriculum  administrators  developed  a  set  of  highly  effective  instructional  practices  to  serve  as  a  core  set  of   District-­‐wide  expectations  for  teaching.    The  practices  for  English  language  arts,  social  studies,  science,  and  the   technical  subjects,  as  well  as  those  for  math,  are  listed  below.    Philadelphia  school  leaders  are  charged  with   developing  expertise  in  the  instructional  practices  both  for  themselves,  and  also  among  the  staff.    Through   consistent  observation  and  monitoring,  principals  will  promote  the  implementation  of  the  highly  effective   instructional  practices  using  timely,  constructive,  and  evidence-­‐based  feedback  to  teachers  on  their  planning,   preparation,  and  instruction  in  accordance  with  these  practices.    More  information  on  these  practices  can  be   found  on  SchoolNet  in  the  Outreach  Collaboration  Section  by  clicking  on  the  “Instructional  Practices”  link.      

Elements of High Quality Instruction (as defined in the High Performing School Practices)
• Principals who are experts in high quality instructional practices that consistently promote excellent instruction school-wide
• Principals who are visible in classrooms and teachers regularly receiving timely and constructive feedback on classroom instruction from school administrators and colleagues
• School leaders and teachers clearly communicating learning and development objectives that reflect high expectations for learning and growth, a belief that all students can learn, and a commitment to meet each student's educational needs

Instructional Practices for ELA, Social Studies, Science, Technical Subjects
Practice 1: An instructional objective (accessible to students, teachers and observers) linked to the content and a literacy standard
Practice 2: Curriculum-driven opportunities to determine the meaning of general and domain specific words and phrases (pre-reading and during-reading)
Practice 3: Lessons characterized by gradual release of responsibility (from teacher dependence to student independence)
Practice 4: Curriculum-driven reading opportunities characterized by a balance of informational (on a variety of topics, perspectives, and eras) and/or literary texts (from a variety of authors, topics, genres, eras, and traditions)
Practice 5: Curriculum-driven reading opportunities characterized by discipline-specific approaches to text. Students should regularly be taught, assessed, and re-taught (if necessary) the discipline-specific lens through which members of specific disciplines read, analyze, and make use of text
Practice 6: Curriculum-driven reading opportunities characterized by careful, sustained interpretation of a variety of texts with an emphasis on: the quantitative measure; the qualitative measure; and the reader and task measure
Practice 7: Curriculum-driven opportunities to engage in evidence-based conversations about the text in whole group and small group settings
Practice 8: Multiple opportunities to use evidence from multiple sources on the same topic to compose an original text or to evaluate composition

Instructional Practices for Math
Practice 1: An instructional objective (accessible to students, teachers and observers) linked to the content and a worthwhile mathematical task
Practice 2: Curriculum-driven opportunities to determine the meaning of general and domain specific words and symbols
Practice 3: Lessons characterized by knowledge of student ability and the gradual release of responsibility (from teacher dependence to student independence) toward mastery
Practice 4: The consistent use of manipulatives to teach abstract mathematical concepts
Practice 5: A scope and sequence driven by the connection of new concepts as a logical extension of previously taught/mastered concepts (Coherent Instruction)
Practice 6: Lessons characterized by a balance of procedural fluency and conceptual understanding (Dual Intensity)
Practice 7: Homework aligned with the requisite concept necessary to demonstrate mastery of the content, concept, or process under study
Practice 8: Multiple opportunities for students to demonstrate behaviors associated with the 8 Standards for Mathematical Practice
Practice 9: Teachers' consistent use of the language of proficient mathematicians (the 8 Standards for Mathematical Practice) throughout the gradual release of responsibility



Exhibit 5  –  SDP  Functional  Organization  Chart  Aligned  with  Action  Plan  v2.0


Endnotes

1 Weiss, Jonathan D. (2004). "Public Schools and Economic Development: What the Research Shows," KnowledgeWorks Foundation.

2 Lesnick, J., Goerge, R. M., Smithgall, C. & Gwynne, J. (2010). Reading on Grade Level in Third Grade: How Is It Related to High School Performance and College Enrollment? Chicago: Chapin Hall at the University of Chicago. Retrieved from http://www.chapinhall.org/research/report/reading-grade-level-third-grade-how-it-related-high-school-performance-and-college-e

3 Lesnick, J., Goerge, R. M., Smithgall, C. & Gwynne, J. (2010). Reading on Grade Level in Third Grade: How Is It Related to High School Performance and College Enrollment? Chicago: Chapin Hall at the University of Chicago. Retrieved from http://www.chapinhall.org/research/report/reading-grade-level-third-grade-how-it-related-high-school-performance-and-college-e

4 Sanders, W. L., & Rivers, J. C. (1996). Cumulative and residual effects of teachers on future student academic achievement (Research Progress Report). Knoxville, TN: University of Tennessee Value-Added Research and Assessment Center.

5 Common Core standards define the knowledge and skills students should have within their K-12 education careers so that they will graduate high school able to succeed in entry-level, credit-bearing academic college courses and in workforce training programs. The standards are generally:
• Aligned with college and work expectations;
• Clear, understandable and consistent;
• Evidence-based; and
• Informed by other top performing countries, so that all students are prepared to succeed in our global economy and society.
http://www.corestandards.org/about-the-standards
Additional information can also be accessed through: PA Core. (2012). Frequently Asked Questions. Retrieved from http://www.pa-commoncorestandards.com/faq/

6 Common Core standards define the knowledge and skills students should have within their K-12 education careers so that they will graduate high school able to succeed in entry-level, credit-bearing academic college courses and in workforce training programs. The standards are generally:
• Aligned with college and work expectations;
• Clear, understandable and consistent;
• Evidence-based; and
• Informed by other top performing countries, so that all students are prepared to succeed in our global economy and society.
http://www.corestandards.org/about-the-standards

7 Pennsylvania Academic Code, Chapter 4. Academic Standards and Assessments, retrieved at: http://www.pacode.com/secure/data/022/chapter4/chap4toc.html


8 The state of Kentucky experienced a 28 percentage point decline in reading and a 33 percentage point drop in mathematics for its elementary-aged students. Similarly, after its initial administration of its Common Core-aligned assessment, proficiency rates for English language arts and math declined by 24 percentage points and 34 percentage points, respectively. http://www.edweek.org/ew/articles/2012/11/02/11standards.h32.html and http://blogs.edweek.org/edweek/state_edwatch/2013/08/_one_interesting_aspect_of.html

9 Harvard Strategic Data Project (2013). SDP College-Going Diagnostic: The School District of Philadelphia. Retrieved from http://www.gse.harvard.edu/cepr-resources/files/news-events/sdp-sdp-cg.pdf

10 McKinsey & Company (2007). "How the World's Best-Performing School Systems Come Out on Top". Retrieved from http://mckinseyonsociety.com/downloads/reports/Education/Worlds_School_Systems_Final.pdf. Levin, B. (2008). How To Change 5000 Schools: A Practical and Positive Approach for Leading Change at Every Level. Harvard Education Press.

11 According to the Ounce of Prevention Fund, at-risk children who don't receive a high-quality early childhood education are:
• 25% more likely to drop out of school
• 40% more likely to become a teen parent
• 50% more likely to be placed in special education
• 60% more likely to never attend college
• 70% more likely to be arrested for a violent crime
http://www.ounceofprevention.org/about/why-early-childhood-investments-work.php
Also see: Fight Crime: Invest in Kids. (2012). High Quality Early Care and Education: A Key to Reducing Future Crime in Pennsylvania. Retrieved from http://www.fightcrime.org/wp-content/uploads/PA-ECE-quality-report-SEPT-2012.pdf

12 Barnett, S. (2013). "The True Value of ECE," Washington, DC. Retrieved from: http://nieer.org/sites/nieer/files/The%20True%20Value%20of%20ECE.pdf; Tennessee Department of Education. "The Tennessee STAR-Quality Child Care Program: What It Means for Pre-K." Retrieved from: http://www.state.tn.us/education/earlylearning/prek/documents/PreK_UnderstandingCollab.pdf; and The Pew Charitable Trusts, "Why All Children Benefit from Pre-K." Retrieved from: http://www.pewtrusts.org/news_room_detail.aspx?id=19434#sthash.wRQrq5Kf.dpuf

13 Center for Educational Research and Innovation (2008). "Assessment for Learning: Formative Assessment." OECD/CERI International Conference "Learning in the 21st Century: Research, Innovation and Policy." Retrieved from http://www.oecd.org/site/educeri21st/40600533.pdf. And Black, Paul & Wiliam, Dylan, Assessment in Education: Principles, Policy & Practice, Mar 1998, Vol. 5. Retrieved from: http://area.fc.ul.pt/en/artigos%20publicados%20internacionais/Assessment%20and%20classroom%20learning.doc


14 Center for Policy Research in Education (2007). "How the world's best performing school systems come out on top". Retrieved from: http://www.smhc-cpre.org/wp-content/uploads/2008/07/how-the-worlds-best-performing-school-systems-come-out-on-top-sept-072.pdf

15 Daniel Weisberg, Susan Sexton, Jennifer Mulhern and David Keeling. (2009). The Widget Effect: Our National Failure to Acknowledge and Act on Differences in Teacher Effectiveness. The New Teacher Project. Retrieved from http://tntp.org/ideas-and-innovations/view/the-widget-effect

16 Yonezawa, S., McClure, L. & Jones, M. (2012). Personalization in Schools. Students at the Center: Teaching and Learning in the Era of the Common Core, A Jobs for the Future Project. Retrieved from http://www.studentsatthecenter.org/sites/scl.dl-dev.com/files/Personalization%20in%20Schools.pdf

17 Fryer, Jr., R. G. (2012). Learning from the Successes and Failures of Charter Schools. Discussion paper 2012-06. The Hamilton Project. Retrieved from http://www.brookings.edu/~/media/research/files/papers/2012/9/27%20charter%20schools/thp_fryer_charters_discpaper.pdf

18 Yonezawa, S., McClure, L., and Jones, M. (2012). "Personalization in Schools." Students at the Center: Teaching and Learning in the Era of the Common Core – A Jobs For the Future Project. Retrieved from http://www.studentsatthecenter.org/sites/scl.dl-dev.com/files/Personalization%20in%20Schools.pdf

19 Summit Public Schools. http://www.summitps.org/our_results/

20 New Tech Network (2013). "Student Outcomes Report 2013: Re-Imagining Teaching and Learning." Retrieved from http://www.newtechnetwork.org/sites/default/files/news/2013_annual_data_v14-01.pdf

21 See Hattie, J. (2009). Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement. Abingdon, Oxon: Routledge; and Hanushek, E. A., Kain, J. F., and Rivkin, S. G. (1998). Does Special Education Raise Academic Achievement for Students with Disabilities? National Bureau of Economic Research. NBER Working Paper Series. http://www.nber.org/papers/w6690

22 NAEP, "What proportions of student groups are reaching Proficient". Retrieved from: http://nationsreportcard.gov/reading_math_2013/#/student-groups

23 Horwitz, A. R., Uro, G., Price-Baugh, R., Simon, C., Uzzell, R., Lewis, S. and Casserly, M. (2009). Succeeding with English Language Learners: Lessons Learned from the Great City Schools. The Council on Great City Schools. Retrieved from http://cgcs.schoolwires.net/cms/lib/DC00001581/Centricity/Domain/35/Publication%20Docs/ELL_Report09.pdf

24 NAEP, "What proportions of student groups are reaching Proficient". Retrieved from: http://nationsreportcard.gov/reading_math_2013/#/student-groups

25 See Duckworth, A. L., Grant, H., Loew, B., Oettingen, G. & Gollwitzer, P. M. (2011). Self-regulation strategies improve self-discipline in adolescents: Benefits of mental contrasting and implementation intentions. Educational Psychology: An International Journal of Experimental Educational Psychology, 31(1), 17-26. Retrieved from http://www.sas.upenn.edu/~duckwort/images/publications/DuckworthGrantLoewOettingenGollwitzer_2011_Self-regulationStrategiesImproveSelf-DisciplineinAdolescents.pdf; Dweck, C. S., Walton, G. M., Cohen, G. L., Paunesku, D., & Yeager, D. (2011). Academic tenacity: Mindsets and skills that promote long-term learning. Paper prepared for the Bill & Melinda Gates Foundation. Stanford University; and Duckworth, A. L. (2011). The significance of self-control. Proceedings of the National Academy of Sciences, 108(7), 2639-40. Retrieved from http://www.sas.upenn.edu/~duckwort/images/publications/Duckworth_2011_TheSignificanceofSelf-Control.pdf

26 Duckworth, A. L., Grant, H., Loew, B., Oettingen, G. & Gollwitzer, P. M. (2011). Self-regulation strategies improve self-discipline in adolescents: Benefits of mental contrasting and implementation intentions. Educational Psychology: An International Journal of Experimental Educational Psychology, 31(1), 17-26. Retrieved from http://www.sas.upenn.edu/~duckwort/images/publications/DuckworthGrantLoewOettingenGollwitzer_2011_Self-regulationStrategiesImproveSelf-DisciplineinAdolescents.pdf

27 Dweck, C. S., Walton, G. M., Cohen, G. L., Paunesku, D., & Yeager, D. (2011). Academic tenacity: Mindsets and skills that promote long-term learning. Paper prepared for the Bill & Melinda Gates Foundation. Stanford University.

28 Duckworth, A. L. (2011). The significance of self-control. Proceedings of the National Academy of Sciences, 108(7), 2639-40. Retrieved from http://www.sas.upenn.edu/~duckwort/images/publications/Duckworth_2011_TheSignificanceofSelf-Control.pdf

29 Tavernise, S. (2012). Obesity in Young Is Seen as Falling in Several Cities. The New York Times. Retrieved from http://www.nytimes.com/2012/12/11/health/childhood-obesity-drops-in-new-york-and-philadelphia.html?pagewanted=all&_r=0

30 Taras, H. (2005). Nutrition and Student Performance at School. Journal of School Health, 75(6), 199-213.

31 Bryk, A., Sebring, P., Allensworth, E., Luppescu, S., & Easton, J. (2010). Organizing Schools for Improvement: Lessons from Chicago. University of Chicago Press.

32 See The Pew Charitable Trust. (2010). Philadelphia's Changing Schools and What Parents Want from Them. Retrieved from http://www.pewtrusts.org/uploadedFiles/wwwpewtrustsorg/Reports/Philadelphia_Research_Initiative/PRI_education_report.pdf and Thapa, A., Cohen, J., Higgins-D'Alessandro, A. & Guffey, S. (2012). School Climate Research Summary: August 2012. National School Climate Center. Retrieved from http://www.schoolclimate.org/climate/documents/policy/sc-brief-v3.pdf

33 Improving School Climate: Findings from Schools Implementing Restorative Practices, A Report from the International Institute of Restorative Practices Graduate School. (2009). Retrieved from http://www.iirp.edu/pdf/IIRP-Improving-School-Climate.pdf

34 Uline, C. & Tschannen-Moran, M. (2008). "The Walls Speak: The Interplay of Quality Facilities, School Climate and Student Achievement." Journal of Educational Administration, 46(1), 55-73. Retrieved from http://edweb.sdsu.edu/schoolhouse/documents/wallsspeak.pdf and The Victorian Institute of Teaching, "The Effect of the Physical Learning Environment on Teaching and Learning." Retrieved from http://www.vit.vic.edu.au/SiteCollectionDocuments/PDF/1137_The-Effect-of-the-Physical-Learning-Environment-on-Teaching-and-Learning.pdf

35 Carnegie Corporation of New York (2013). "Opportunity by Design: New High School Models for Student Success." New York, New York. Retrieved from http://carnegie.org/fileadmin/Media/Programs/Opportunity_by_design/Opportunity_By_Design_FINAL.pdf

36 Center for Public Education, "The Principal Perspective," Retrieved at: http://www.centerforpubliceducation.org/principal-perspective

37 OECD (2011), "School Autonomy and Accountability: Are They Related to Student Performance?", PISA in Focus, No. 9, OECD Publishing. doi: 10.1787/5k9h362kcx9w-en

38 The School District of Philadelphia, "Renaissance Schools Initiative Progress Report: 2010-2011 through 2012-2013," December 2013. Retrieved from http://webgui.phila.k12.pa.us/uploads/TM/8_/TM8_61FkJvh05xIyLNSS9w/Renaissance_Report_Dec_2013.pdf

39 Center of Excellence in Leadership of Learning, "Summary of Research on Project Based Learning," University of Indianapolis, June 2009. Retrieved from: http://cell.uindy.edu/docs/PBL%20research%20summary.pdf

40 Gold, E., Norton, M., Peralta, R. (2013). "Philadelphia's Accelerated High Schools: An Analysis of Strategies and Outcomes – Final Report."

41 MDRC, "Transforming the High School Experience: How New York City's Small Schools are Boosting Student Achievement and Graduation Rates," June 2010. Retrieved from: http://www.mdrc.org/sites/default/files/full_589.pdf

42 AQI Strategic Plan

43 USAID (2006). "School Report Cards: Some Recent Experiences." Retrieved from http://files.eric.ed.gov/fulltext/ED524467.pdf. And Jacobson, R., Saultz, A., and Snyder, J. (2013). "Grading School Report Cards." Phi Delta Kappan. Retrieved from http://web.a.ebscohost.com/ehost/pdfviewer/pdfviewer?sid=22eb7da0-6a48-4caa-ac5b-c9270bac5b1b%40sessionmgr4004&vid=38&hid=4206

44 Earthman, G. I. (2002). "School Facility Conditions and Student Academic Achievement." Williams Watch Series: Investigating the Claims of Williams v. State of California. UCLA Institute for Democracy, Education & Access. Retrieved from: http://escholarship.org/uc/item/5sw56439#page-1 and Tanner, C. K. (2008). Explaining Relationships Among Student Outcomes and the School's Physical Environment. Journal of Advanced Academics, 19(3), 444-471. doi: 10.4219/jaa-2008-812

45 Engel (2009). Time-out on Timing: The Relationship Between the Timing of Teacher Hires and Teacher Quality.

46 Liu, E. (2005). Hiring, Job Satisfaction, and the Fit Between New Teachers and their Schools, paper presented at the annual meeting of the American Educational Research Association, Montreal, April 2005. Retrieved from http://isites.harvard.edu/fs/docs/icb.topic1239487.files/Liu_AERA_2005_Hiring_and_Job_Satisfaction.pdf

47 Levin, J. and Quinn, M. (2003). "Missed Opportunities: How We Keep High-Quality Teachers Out of Urban Classrooms," The New Teacher Project, 2003. Retrieved at: http://tntp.org/assets/documents/MissedOpportunities.pdf and Jacob, B. (2007). "The Challenges of Staffing Urban Schools with Effective Teachers." Future of Children, v17 n1, p129-153. Retrieved from http://files.eric.ed.gov/fulltext/EJ795883.pdf

48 Branch, G.F., Hanushek, E.A., & Rivkin, S.G. (2013). School Leaders Matter: Measuring the impact of effective principals. Education Next, 13(1). Retrieved from http://educationnext.org/school-leaders-matter/

49 Mendels, P. (2012). Principals in the Pipeline: Districts Construct a Framework To Develop School Leadership. The Learning Forward Journal, 33(3). Retrieved from http://www.wallacefoundation.org/knowledge-center/school-leadership/district-policy-and-practice/Documents/Principals-in-the-Pipeline.pdf

50 TNTP. (2012). The Irreplaceables.

51 Chetty, Raj, John N. Friedman and Jonah E. Rockoff, "The Long-term Impacts of Teachers: Teacher Value-Added and Student Outcomes in Adulthood," Retrieved from: http://obs.rc.fas.harvard.edu/chetty/value_added.html

52 Clifford, M. & Ross, S. (2011). Designing Principal Evaluation Systems: Research To Guide Decision-Making, An Executive Summary of Current Research. American Institutes for Research. Retrieved from https://www.naesp.org/sites/default/files/PrincipalEvaluation_ExecutiveSummary.pdf

53 Biancarosa, Gina, Anthony S. Bryk and Emily Dexter, "Assessing the Value-Added Effects of Literacy Collaborative Professional Development on Student Learning," The Elementary School Journal, vol. 111, no. 1 (Sept 2010), pp. 7-34.

54 See Johnson, S. M., Kraft, M. A. & Papay, J. P. (2012). How Context Matters in High-Needs Schools: The Effects of Teachers' Working Conditions on Their Professional Satisfaction and Their Students' Achievement. Teachers College Record, 114(10), 1-39. Retrieved from http://www.tcrecord.org.revproxy.brown.edu/library/Issue.asp?volyear=2012&number=10&volume=114 and Jackson, C. K. & Bruegmann, E. (2009). Teaching Students and Teaching Each Other: The Importance of Peer Learning for Teachers. American Economic Journal: Applied Economics, 1(4), 85-108. doi: 10.1257/app.1.4.85

55 Leana, C.R. (2011). The Missing Link in School Reform. Stanford Social Innovation Review, 9(4), 30-35. Retrieved from http://www.ssireview.org/articles/entry/the_missing_link_in_school_reform

56 Jacob, B. (2007). "The Challenges of Staffing Urban Schools with Effective Teachers." Future of Children, v17 n1, p129-153. Retrieved from http://files.eric.ed.gov/fulltext/EJ795883.pdf

57 Hurdle, J. (2013). "With an Old Factory, Philadelphia Is Hoping to Draw New Teachers." The New York Times. Retrieved from: http://www.nytimes.com/2013/05/05/education/philadelphia-renovating-apartments-to-lure-teachers.html

58 The Pennsylvania Department of Education. (2012). Measuring Principal Effectiveness Presentation. Retrieved from http://www.portal.state.pa.us/portal/server.pt/community/educator_effectiveness_project/20903/page/1173846

59 The Irreplaceables: Understanding the Real Retention Crisis in America's Urban Schools. The New Teacher Project. Retrieved from http://tntp.org/irreplaceables

60 Pennsylvania Department of Education Educator Effectiveness Project. Retrieved from http://www.portal.state.pa.us/portal/server.pt/community/educator_effectiveness_project/20903

61 Taylor, E.S. & Tyler, J.H. (2011). The Effect of Evaluation on Performance: Evidence from Longitudinal Student Achievement Data of Mid-Career Teachers. National Bureau of Economic Research. Retrieved from http://4teachingexcellence.org/uploads/media_items/the-effect-of-evaluation-on-performance.original.pdf

62 Clifford, M. & Ross, S. (2011). Designing Principal Evaluation Systems: Research To Guide Decision-Making, An Executive Summary of Current Research. American Institutes for Research. Retrieved from https://www.naesp.org/sites/default/files/PrincipalEvaluation_ExecutiveSummary.pdf

63 See Behnke, A. O. & Kelly, C. (2011). Creating Programs to Help Latino Youth Thrive in School: The Influence of Latino Parent Involvement Programs. Journal of Extension, 41(1), 1-11. Retrieved from http://www.joe.org/joe/2011february/a7.php and Sheridan, S. M., Knoche, L. L., Kupzyk, K. A., Edwards, C. P. & Marvin, C. A. (2011). A Randomized Trial Examining the Effects of Parent Engagement on Early Language and Literacy: the Getting Ready Intervention. Journal of School Psychology, 49(3), 361-383. doi: 10.1016/j.jsp.2011.03.001

64 Sheridan, S. M., Knoche, L. L., Kupzyk, K. A., Edwards, C. P. & Marvin, C. A. (2011). A Randomized Trial Examining the Effects of Parent Engagement on Early Language and Literacy: the Getting Ready Intervention. Journal of School Psychology, 49(3), 361-383. doi: 10.1016/j.jsp.2011.03.001

65 Patton, C., & Wang, J. (2012). Ready for Success: Creating Collaborative and Thoughtful Transitions into Kindergarten. FINE Newsletter, 4(3), 1-22.

66 Harris, A., Andrew-Power, K., Goodall, J. (2009). Do Parents Know They Matter?: Raising Achievement Through Parental Engagement. New York: Continuum Publishing Group.

67 Rosenzweig, C. (2001). "A Meta-analysis of Parenting and School Success: The Role of Parents in Promoting Students' Academic Performance." Presentation at the American Educational Research Association 2001 Conference. Retrieved from http://files.eric.ed.gov/fulltext/ED452232.pdf

68 Ouellett, P.M., & Wilkerson, D. (2008). "They Won't Come": Increasing Parent Involvement in Parent Management Training Programs for At-Risk Youths in Schools. School Social Work Journal, 32(2), 39-53.

69 Rothman, R. (2005). Expanding Opportunity: Partners for Learning. Annenberg Institute for School Reform: Voices in Urban Education, Spring 2005, 2-4. Retrieved from http://annenberginstitute.org/sites/default/files/product/216/files/VUE7.pdf

70 Childress, S., Elmore, R. F., Grossman, A. S., & Johnson, S. M. (Eds.). (2007). Managing School Districts for High Performance: Cases in Public Education Leadership. Cambridge, MA: Harvard University Press.

71 The Project Management Institute (2013). "The High Cost of Low Performance: The Essential Role of Communications". Retrieved from http://www.pmi.org/Knowledge-Center/Pulse/~/media/PDF/Business-Solutions/The-High-Cost-Low-Performance-The-Essential-Role-of-Communications.ashx

72 Hartman, W. T. (2003). School District Budgeting. Lanham, MD: ScarecrowEducation.

73 Smialek, M.A. (2002). "Inside Teams in Education". Phi Delta Kappan, 2002, Issue 496.

74 Marsh, J. A., Pane, J. F. & Hamilton, L. S. (2006). Making Sense of Data-Driven Decision Making in Education: Evidence from Recent RAND Research. The RAND Corporation. Retrieved from http://www.rand.org/content/dam/rand/pubs/occasional_papers/2006/RAND_OP170.pdf and Gottfried, M. A., Ikemoto, G. S., Orr, N. & Lemke, Cheryl. (2011). What four states are doing to support local data-driven decisionmaking: policies, practices and programs. REL Mid-Atlantic: Regional Education Laboratory at Pennsylvania State University. REL 2012, 118. Retrieved from http://files.eric.ed.gov/fulltext/ED526134.pdf

75 Zars, B. (1998). "Long Rides, Tough Hides: Enduring Long School Bus Rides," Rural Challenge Policy Program. Retrieved from http://files.eric.ed.gov/fulltext/ED432419.pdf. And Spence, B. (2000). "Long School Bus Rides: Their Effect on School Budgets, Family Life, and Student Achievement." Rural Education Issue Digest. Retrieved from http://files.eric.ed.gov/fulltext/ED448955.pdf

76 Mucha, M. J. (2012). Budgeting for outcomes: Improving on a best practice. Government Finance Review, 28(6), 45-46. Retrieved from http://search.proquest.com/docview/1231354920?accountid=9758

77 Hartman, W. T. (2003). School District Budgeting. Lanham, MD: ScarecrowEducation.

78 Mikesell, J. (2006). Fiscal Administration: Analysis and Applications for the Public Sector (7th ed.). Belmont, CA: Thomson Higher Education.

42 |  P a g e    


Action Plan v2.0 Financial Supplement: Getting to Great
February 20, 2014

This brief supplement to the Action Plan v2.0 asks our funders to provide the sustainable level of investment needed for us all to be successful as we work together to make our schools great. As laid out in Action Plan v2.0, the case for investing in great schools is clear.

First, we know what works and what we need to do. We are making our schools great by investing in school leadership and teacher development, in neighborhood schools, in new ways of reaching all students according to their needs, in the quality instruction necessary to achieve high standards, in school safety and high-quality service for parents and families, and in high-quality charter options. In short, we are investing our precious resources in what works. Over the past year, in spite of our budget constraints, the District has been able to invest in specific programs that work, including: expanding Career and Technical Education programs in high-priority occupations; turning a program – the Sustainability Workshop – into a new school; starting a career academies model at Roxborough and Lincoln; replicating and expanding high-performing schools such as the Science Leadership Academy and Hill-Freedman; and turning around several low-performing schools by establishing three new Renaissance charters and six new Promise Academies. While these programs moved more than 5,000 students into programs and school models that have been demonstrated to improve student achievement, this number remains far too low.

Second, a quality education benefits our students, families, communities, and the city. 1 Our funders, and thus taxpayers, should support the necessary investment because it is the right thing to do, to make great schools for all young people; and because strong schools are the heart of any vibrant city's civic and economic infrastructure. Simply put, strong schools will strengthen the foundation of Philadelphia's economy in meaningful ways; weak schools will erode the progress that the city has recently experienced.

More specifically, five facts support the case for investing in great schools.

1. Children benefit tremendously from great schooling, as great schools improve learning, decrease the number of dropouts, and increase the likelihood of children going on to further education and work. High school graduates earn 39% more over their lifetimes than high school dropouts, while people who complete college earn 129% more. Annually, high school dropouts earn $10,300 less than high school graduates and $31,400 less than college graduates. 2 Conversely, high school dropouts are 1.5 times more likely to be unemployed than high school graduates, and more than 63 times more likely to enter the criminal justice system than those with at least a bachelor's degree. 3

2. Families deserve equitable investments across all schools, particularly given regressive funding policies. According to the Education Law Center's review of all fifty states' education funding policies, Pennsylvania is considered a state with a regressive policy, i.e., it does not provide additional resources to schools based on their poverty concentration. 4 In essence, the state's funding policy does not provide differentiated levels of support to students based on their level of need.

3. Great schools support and sustain neighborhoods, providing essential "social capital" that counters the adverse effects of blight and poverty. Schools and education can build social capital by providing forums for community activity. 5

4. Great schools will contribute meaningfully to Philadelphia's and Pennsylvania's economy. Investments in great schools both increase future public revenue and decrease current and future public costs. Educated workers raise regional income because of increased productivity. Furthermore, dollars invested in quality schools can reduce other areas of public spending such as unemployment, the criminal justice system, and public aid. 6 For example, every $1 spent on quality pre-school for low-income families is estimated to generate $4 to $11 of economic benefits over a child's lifetime. 7

5. Great schools contribute to the recruitment and retention of a high-quality workforce. According to a recent study by the Pew Charitable Trusts, 56% of young adults said they would not recommend Philadelphia as a place to live, as the condition of the School District of Philadelphia "weighs heavily on millennials;" 81% of them have a negative impression of the job that schools are doing. 8 In addition, half of those surveyed indicated that they definitely or probably will not be living in Philadelphia in the next five to ten years, with 29% citing schools and child-rearing as their primary source of concern. 9

Our Current Ability to Invest in Students and Schools

Improving our schools is our work; it is clearly reflected as our highest priority in the way we spend our limited resources. After paying for mandatory expenditures, approximately $1.41 billion, or 56% of our operating budget, is available for District expenses. Of the $1.41 billion available to cover District expenses, $1.35 billion goes to paying for our school buildings and our students' instruction. This constitutes 95% of our available funding.

[Figure: Operating Fund Expenditures, FY 2014 – District Schools & Admin, 56%; Charters, 29%; Debt Service, 11%; Non-District Operated Schools, 4%]

In spite of the District's commitment and our actions to protect as much school funding as possible, we have had to scale back on school-based personnel and many activities that support our students, including decreasing some instructional programming, shrinking our extracurricular programming and counseling support, decreasing the number of librarians, scaling back on our school transformation efforts, and reducing the activities that support student retention and gifted programs.


These reductions have occurred over several years, and we have endeavored to minimize the impact on students and the school district. However, over the past three years, The School District of Philadelphia has made massive budget reductions to close shortfalls. In FY 2012 the District closed a budget gap of over $700 million, which included $315 million in school-based reductions and a 50% reduction in central office FTEs. To allow schools to open, the District borrowed $300 million – an option that was no longer available to us in FY 2014. Therefore, by FY 2014, the District was facing a $304 million budget gap. In an effort to spend only what we have, the adopted budget for FY 2014 included over $250 million in additional expenditure reductions, leaving many of our schools with a principal, teachers at the contractual class-size limits, and very little else. The District had to reduce nearly 5,000 positions (25% of total positions), resulting in roughly 3,800 layoffs. Central office spending was further reduced by 30%, leaving it at a little over 2% of the total operating budget. Since budget adoption, additional revenue ($112 million) has been identified, which has partially restored some services to schools. However, as the majority of the $112 million in new revenues for FY 2014 is non-recurring, these resources are no longer available for FY 2015. Therefore, it is critical that the District receive the full $120 million in recurring revenues from the 1% sales tax continuance.

As a result, we are in the same situation we were in last year. The District cannot afford what works; we can only afford some of what works. This means that we cannot afford to replicate and scale programs that work at the rate our students deserve. We cannot afford to provide all of our schools with the opportunities necessary to ensure a high-quality education that prepares our students for college and career. We cannot afford to provide our teachers, principals, and other educators with the time and support necessary to help strengthen their instructional practice and continue their own professional growth in aid of our students. We cannot afford these things because we do not have sufficient funding.

If we compare our estimated per-pupil spend to that of our nine top-performing neighboring districts, the District spends between $1,890 and $12,204 less on each student, despite having a 150%-1,600% higher proportion of students who qualify for free and reduced-price meals (see Table 1). 10 Therefore, to have a similar per-pupil spend as neighboring districts, the SDP's operating budget would have to increase by ~$250 million, to ~$1.6 billion annually.

Table 1: 2012 Per Pupil Spend for SDP, Nine Neighboring School Districts, and Pittsburgh 11

School District         Per Pupil Estimate* (2012)    PSSA Proficiency (Gr 3-5)** (2012)    Economically Disadvantaged*** (2012-2013)    Funding required to provide SDP with similar resources
Lower Merion            $25,370                       90%                                   8.17%                                        $1,603 million
Pittsburgh              $21,000                       52.8%                                 69.46%                                       $1,029 million
Cheltenham              $20,941                       80%                                   22.40%                                       $1,021 million
Colonial                $19,132                       90%                                   18.55%                                       $784 million
Lower Moreland          $18,718                       86%                                   5.14%                                        $729 million
Neshaminy               $17,230                       81%                                   19.68%                                       $534 million
Bensalem                $16,976                       67%                                   46%                                          $500 million
Abington                $15,543                       84%                                   18.42%                                       $312 million
Haverford               $15,398                       89%                                   12.95%                                       $293 million
Springfield             $15,056                       89%                                   13%                                          $248 million
Philadelphia            $13,167                       41%                                   83.93%                                       $0

*Source: Pennsylvania Department of Education Statewide AFR Expenditures
**Source: PA AYP
***Source: Pennsylvania Department of Education PA School Performance Profile
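For readers who want to verify the right-hand column of Table 1, it follows the simple method described in endnote 11: the gap in per-pupil spending multiplied by the number of students SDP serves. The sketch below is a rough check of that arithmetic, not the District's own model; the enrollment figure of 131,362 K-12 students is an assumption borrowed from endnote 13, and the per-pupil values are taken from Table 1.

```python
# Rough check of Table 1's "funding required" column (endnote 11 method):
# (neighbor per-pupil spend - SDP per-pupil spend) x SDP enrollment.
SDP_PER_PUPIL = 13_167
SDP_ENROLLMENT = 131_362  # K-12 students served by SDP (figure from endnote 13)

neighbors = {
    "Lower Merion": 25_370,   # highest-spending district in Table 1
    "Springfield": 15_056,    # lowest-spending neighboring district in Table 1
}

for district, per_pupil in neighbors.items():
    gap_per_pupil = per_pupil - SDP_PER_PUPIL
    total_gap = gap_per_pupil * SDP_ENROLLMENT
    print(f"{district}: ${gap_per_pupil:,} per pupil -> ${total_gap / 1e6:,.0f} million")

# Prints roughly $1,603 million for Lower Merion and $248 million for Springfield,
# matching the corresponding rows of Table 1.
```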



Similarly, a recent report prepared for Philadelphia's City Council by scholars at the University of Pennsylvania about education spending in Pennsylvania (based on 2009-2010 budgets) estimates that Philadelphia spends $5,478 per student less than it should to provide an adequate education. 12 In 2013 dollars, this means that the District is operating a budget that is approximately $770 million less than what is required for adequacy. 13 Furthermore, in comparison to neighboring states, The School District of Philadelphia spends between $2,800 and $5,900 less per pupil. 14 Therefore, when accounting for the number of students we serve, this difference equates to approximately $368 million less than what New Jersey would otherwise spend on its students and about $776 million less than what New York would spend (see Table 2).

Table 2: Per Pupil Spend in Neighboring States (FY 2011)

Comparison State          Per Pupil Amounts    Equivalency Gap
New York                  $19,076              $776 million
District of Columbia      $18,475              $697 million
New Jersey                $15,968              $368 million

Source: US Census Bureau
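The adequacy and state-comparison figures above can likewise be reproduced from the methods described in endnotes 13 and 14. The sketch below is only an informal check, using the inputs stated in the text (the $5,478 adequacy gap, the ~6.8% 2010-2013 inflation factor, the 131,362-student enrollment, and the per-pupil amounts in Table 2); it is not an official calculation.

```python
# Informal check of the $770 million adequacy gap (endnote 13) and the
# state equivalency gaps in Table 2 (endnote 14 method).
SDP_ENROLLMENT = 131_362
SDP_PER_PUPIL_2012 = 13_167

# Adequacy gap: $5,478 per student in 2010 dollars, inflated by ~6.8% to 2013 dollars.
adequacy_gap_2013 = 5_478 * SDP_ENROLLMENT * 1.068
print(f"Adequacy gap: ${adequacy_gap_2013 / 1e6:,.0f} million")  # ~ $769 million (≈ $770M)

# State comparisons: state per-pupil spend minus SDP's, times enrollment.
states_2011_per_pupil = {
    "New York": 19_076,
    "District of Columbia": 18_475,
    "New Jersey": 15_968,
}
for state, per_pupil in states_2011_per_pupil.items():
    gap = (per_pupil - SDP_PER_PUPIL_2012) * SDP_ENROLLMENT
    print(f"{state}: ${gap / 1e6:,.0f} million")  # ~ $776M, $697M, $368M
```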

We agree that funding is not the sole solution to the many challenges facing our schools. However, money does matter, and we do not have enough. For example, we currently do not have resources to:

• Significantly improve our early literacy and kindergarten readiness program
• Provide our students with an adequate number of counselors 15
• Offer a range of extracurricular options for our students 16
• Support students who demonstrate advanced academic potential 17
• Provide our teachers with opportunities to be observed and receive feedback to help strengthen their instructional practices
• Incubate or replicate our high performing schools
• Dramatically improve the physical environments of all of our schools

In short, we do not have sufficient funds to fully implement the evidence-based actions identified in Action Plan v2.0.

A Stark Choice

In addition to the $120 million of recurring revenues from the 1% sales tax continuance, the District will require $320 million in recurring revenues to provide a minimum amount of improved and sustained educational opportunities for our students and families. Approximately $80 million of this recurring funding would go to closing a new anticipated budget gap (due to increased expenses for pensions and charter school growth), to ensure that students are provided the same level of service. The difference of approximately $240 million will allow the District to provide additional supports to our schools and students (see Table 3). These services will provide the incremental support our schools and the system need to reverse the tide of underinvestment and underperformance.



Table 3: Additional Resources for Schools with $320 Million Additional Recurring Revenues

Strategy 1: Improve Student Learning
• Robust early literacy program
• School climate programs
• Limited credit recovery program for high school students
• Additional supports to English Language learners and students with IEPs
• Additional counseling, mental, and behavioral health support to schools and students
• Additional support to students for Keystone exams
• Updated curricular materials
• College and career readiness assessments and programs for most schools (e.g., PSAT, SAT, AP, IB, and dual enrollment)
• Modest increase in discretionary spend for schools

Strategy 2: Develop a System of Excellent Schools
• More expansions, replications, and creation of new schools
• Strengthened and expanded career academies operating in the District
• Expanded CTE programming
• Improved safety and physical infrastructure for a select number of schools
• 1-5 Renaissance Charter conversions
• Expansion of top charter performers

Strategy 3: Identify and Develop Exceptional, Committed People
• Additional instructional activities
• Strengthened teacher and principal PD
• Principal residency program
• Development of internal leadership pipeline
• Additional non-instructional supports for schools

Strategy 4: Become a Parent- and Family-Centered Organization
• Improved student enrollment process
• Improved customer service and training for parents

Strategy 5: Become an Innovative and Accountable Organization
• Upgraded student information system
• Improved data processing and reporting

Strategy 6: Achieve and Sustain Financial Balance
• Increased capacity to leverage partnership support

However, to be clear, the additional $320 million in new recurring revenues will not provide the District, our schools, our students, or the charter sector with sufficient resources to fully implement the activities identified in Action Plan v2.0. It does not allow us to do all of the hard work necessary to turn around each school and get to great. Getting to great requires more.

Further Considerations

In addition to the operating budget, additional considerations must be made for other sources of revenues received by the District; the charter funding formula; and supports received from our communities and partners.

Labor

We are currently engaged in contract negotiations with four of our five labor unions. Last year, we included $133 million in cost reductions from our labor partners in our funding requests. We remain committed to the belief that all must share in the sacrifice. In order for the District to implement the various initiatives in a sustainable and cost-effective manner, cost savings will be paramount to further enhance the program improvements outlined in Action Plan v2.0.



Furthermore, we need more than economic concessions from our labor partners. The following work rule reforms are absolutely necessary to implement our initiatives so they achieve the desired outcomes. Those reforms include:

• Getting the right people in schools, ensuring that principals and school leadership teams are able to assemble school teams that best meet the needs of the students and school community through:
  o 100% inbound site selection (i.e., all open positions in schools may be filled through a thoughtful process involving interviews)
  o Providing principals with the authority to determine who exits a building owing to either enrollment or funding reductions, based on appropriate, student-focused criteria
  o Enabling flexibility on recall of laid-off employees
• Providing principals with the ability to structure the use of preparation periods to facilitate collaborative planning among teachers in a school
• Increasing the length of instructional time during the school day
• Flexibility around a school's roster so that class schedules can be created in an efficient manner that meets the needs of students in the school

State funding commission

We are enthusiastic about the state's interest in establishing a commission to review the distribution of school funding and will fully support the commission's work should House Bill 1738 be voted into law. 18 At the same time, our students and families should not have to wait another year for better-resourced schools. We ask our funders to invest in making our schools better now.

Additional funding streams

In terms of other revenues, the District also receives resources in the form of state and federal grants, capital funds, and small enterprise funds. Therefore, we are working to ensure that all of our resources are well managed and that our expenditures are allocated in a manner that is aligned with our strategies.

• Grant Funds. Grants comprise approximately 11% of the total District consolidated budget, 19 or $336 million. The District has spent two years ensuring we are completely compliant with federal regulations, and we were recently cited for our exceptional approach to the use of grant funds. We are now reviewing all allowable uses to make sure that our federal grants are being allocated to our most important priorities.
• Capital Funds. Our Fiscal Year 2014 capital budget is $134 million. 20 We have instituted strong controls on capital budget decision-making to ensure complete harmony with our overall budget priorities.
• Enterprise Funds. These funds are used to account for the operations of the Food Services Division within the School District of Philadelphia. These fund budgets are not adopted; however, formal budgets are prepared and approved by management. These funds amount to approximately $81.8 million, which is 2.8% of the District's budget.

Charter School Funding

The District, as authorizer, supports high-performing charter schools as important and real options for families in Philadelphia. The District also suffers from an unreasonable state funding formula that penalizes District schools for every child that leaves to attend a charter school. Therefore, as we consider our authorizing work, we are committed to expanding high-performing charter schools, and doing so in ways that are cost-neutral to the District. Toward this end, we plan to continue to work with the state to stop payments to schools, especially low-performing schools, that are over-enrolled and get payment directly from the state; and we plan to aggressively seek to close the lowest-performing charter schools that are under-serving children and families.

Partnership Goals

It is our intention to work collaboratively with the philanthropic and corporate communities to secure both financial and in-kind services to support our priorities. We have established the following financial and service targets for our fledgling Office of Strategic Partnerships over the next year:

• $2 million from corporations
• $7 million from local and regional foundations
• $5 million from national foundations
• $25 million of in-kind services

The total estimated value of partnerships in FY15 is $39 million. These resources are intended to provide our schools with the complement of supports and services necessary to accelerate their progress towards "great"; they are not intended to provide merely the minimum level of services described above.

Conclusion

As the numbers starkly indicate, for too long there has been a disinvestment in the School District of Philadelphia's students. This is a policy with real and damaging consequences for the lives of our students, the future of our city, and the social and economic health of our state. As a District, we are committed to realizing a system of excellent schools capable of providing all our students with the quality education they deserve. Such a system, however, is not possible with the kind of chronic underfunding that is starving our schools and shortchanging our students. Together we have an opportunity to take action. Yes, commitment is necessary. And yes, we have an evidence-based and implementable plan. But good intentions, good will, and good planning can only take us so far. Real improvement requires adequate, fair, and stable funding.



End Notes

1. Students without a high school diploma are three times more likely to be unemployed compared to individuals with a bachelor's degree; see the Bureau of Labor Statistics Employment Projections for 2012 (http://www.bls.gov/emp/ep_chart_001.htm). In terms of lifetime earnings, an individual with a bachelor's degree could expect to earn approximately $900,000 more than a high school dropout and about $700,000 more than a high school graduate; see the US Census American Community Survey report "Education and Synthetic Work-Life Earning Estimates" (http://www.census.gov/prod/2011pubs/acs14.pdf). Additional information about the impact of public schools can be found in Weiss, Jonathan D. (2004), "Public Schools and Economic Development: What the Research Shows," KnowledgeWorks Foundation. Also see the Action Plan v2.0 section on "The Case for Investment," www.philasd.org/actionplan/.
2. Baum, S., Ma, J., and Payea, K. (2013). "Education Pays 2013: The Benefits of Higher Education for Individuals and Society." The College Board. Retrieved from http://trends.collegeboard.org/sites/default/files/education-pays-2013-full-report.pdf
3. "The Consequences of Dropping Out of High School: Joblessness and Jailing for High School Dropouts and the High Cost for Taxpayers" (2009). http://www.northeastern.edu/clms/wpcontent/uploads/The_Consequences_of_Dropping_Out_of_High_School.pdf
4. Education Law Center (2014). "Is School Funding Fair? A National Report Card." Retrieved from http://www.schoolfundingfairness.org/ExecutiveSummary_2014.htm
5. The World Bank. "Social Capital and Education." Retrieved from http://web.worldbank.org/WBSITE/EXTERNAL/TOPICS/EXTSOCIALDEVELOPMENT/EXTTSOCIALCAPITAL/0,,contentMDK:20186584~isCURL:Y~menuPK:418214~pagePK:148956~piPK:216618~theSitePK:401015,00.html
6. Alliance for Excellent Education (2013). "Saving Futures, Saving Dollars: The Impact of Education on Crime Reduction and Earnings." Retrieved from http://all4ed.org/wp-content/uploads/2013/09/SavingFutures.pdf
7. National Institutes of Health (2011). "High-quality preschool program produces long-term economic payoff." Retrieved from http://www.nih.gov/news/health/feb2011/nichd-04.htm
8. The Pew Charitable Trusts (2014). "Millennials in Philadelphia: A Promising but Fragile Boom." Retrieved from http://www.pewtrusts.org/uploadedFiles/wwwpewtrustsorg/Reports/Philadelphia_Research_Initiative/Philly_Millennials_Report_012214.pdf
9. Ibid.
10. The statewide per-pupil funding table can be accessed at http://www.portal.state.pa.us/portal/server.pt/community/summaries_of_annual_financial_report_data/7673/afr_excel_data_files/509047
11. The funding required to provide SDP with similar resources was calculated by taking the difference in per-pupil expenditures and multiplying it by the number of students currently served by SDP.
12. Steinberg, M. and Quinn, R. (2013). "Assessing Adequacy in Education Spending: A Summary of Key Findings from Pennsylvania and Philadelphia." Retrieved from http://www.gse.upenn.edu/pdf/school_funding_summary_findings_steinberg_quinn.pdf
13. The $770 million estimate was derived by multiplying the adequacy funding gap ($5,478) by the number of K-12 students currently served by SDP (131,362) and converting the 2010 dollar value to a 2013 dollar value; the cumulative rate of inflation between 2010 and 2013 is approximately 6.8%. It is important to note that between 2010 and 2013 the District cut more than $200 million in expenditures; if this were added back to our operating budget, the gap would be closer to $970 million. This calculation also excludes Pre-K students, alternative education students, and students who are placed in alternative settings – all of whom are paid for by SDP.
14. These estimates were derived by subtracting SDP's 2012 per-pupil spend, as provided by PDE, from each state's 2011 per-pupil spend, as estimated by the US Census.
15. Decreasing the student-to-counselor ratio to 250:1 has been shown to decrease the probability of disciplinary infractions or their recurrence; see Carrell, S. (2006). "Do Lower Student-to-Counselor Ratios Reduce School Disciplinary Problems?" Contributions to Economic Analysis & Policy, vol. 5, iss. 1. http://www.umass.edu/schoolcounseling/uploads/breif5.3.pdf. Furthermore, lower student-to-counselor ratios have been shown to improve graduation and school attendance rates; see Lapan, R., Gysbers, N., Stanley, B., and Pierce, M. (2012). "Missouri Professional School Counselors: Ratios Matter, Especially in High-Poverty Schools." Professional School Counseling, v16, n2, pp. 108-116.
16. Hallow, J. (2002). "Research Link / Extracurricular Activities and Student Motivation." Educational Leadership, v60, no. 1. Retrieved from http://www.ascd.org/publications/educational-leadership/sept02/vol60/num01/Extracurricular-Activities-and-Student-Motivation.aspx, and Scott, T. (2010). "The Benefit of Extracurricular Activities in High School: Involvement Enhances Academic Achievement and the Way Forward." Academic Leadership, vol. 8, issue 3, pp. 239-244.
17. While there is research indicating that students who achieve a certain score on the PSAT are likely to score a 3 or 4 on the AP exam, the District is currently unable to fund PSAT exams for all our students. As a result, we are unable to identify and support students who demonstrate the potential to succeed at higher levels. See Ewing, M., Camara, W., and Millsap, R. (2006). "The Relationship between PSAT/NMSQT Scores and AP Examination Grades: A Follow-Up Study." College Board Research Report No. 2006-1.
18. Harris, M. (2013). "House OKs commission to examine school spending." TRIBLive. Accessed on 2/14/14 at http://triblive.com/news/education/5422379-74/funding-formula-state#axzz2teDa8p00. See also The General Assembly of Pennsylvania, House Bill 1738. Retrieved from http://www.legis.state.pa.us/CFDOCS/Legis/PN/Public/btCheck.cfm?txtType=HTM&sessYr=2013&sessInd=0&billBody=H&billTyp=B&billNbr=1738&pn=2878
19. The District's consolidated budget includes four sources of funding: operating, categorical (grants), capital, and food services.
20. The Capital budget is not based on a specific source of revenue; rather, it is based on a bond issuance. Therefore, the Capital budget will vary based on District need as well as the District's ability to absorb debt service fees in its operating costs.
Page Left Blank Intentionally


Tab 4 Connecting Research to Policy and Practice Panel
Location: 2nd Floor Auditorium
Chair and Discussant: Andrew Porter, University of Pennsylvania
Panel:
Christopher Lubienski – The Role of Intermediary Organizations in Education Policymaking: Preliminary Findings
Margaret Goertz – SEA Acquisition and Use of Research in School Improvement
Carrie Conaway – Research in Policy Making: The View from the Other Side


Page Left Blank Intentionally


The Role of Intermediary Organizations in Education Policymaking: Preliminary Findings Co-PIs: Christopher Lubienski, University of Illinois Janelle Scott, UC Berkeley Elizabeth DeBray, University of Georgia


Policy Issue of Interest The demand for evidence on effective educational policies and interventions has further invigorated an already vibrant sector of intermediary organizations that seek to package and promote research on education policies and programs for policymakers, typically around specific market-oriented policies.


What Are Intermediary Organizations? • Groups that work between districts, school boards, state and federal government, and schools. • Can provide start-up funding and sundry curricular and administrative support. • Examples: ALEC, Urban League, New Schools for New Orleans, National Association of Charter School Authorizers, The Broad Foundation


Typology of Organizations with Intermediary Functions in Educational Policy Making


Preliminary Findings: Three Analyses • The role of foundations in the intermediary sector • Advocacy coalitions and research production in New Orleans • Bibliometric examination of research citations


Foundations in Knowledge Production • What have been the hallmarks of foundations’ efforts to advance the study of or the adoption of incentivist reforms in education policy? • Is there evidence of coalitional or networked behavior across foundations and other intermediaries? An echo chamber? • Is there evidence of research production, promotion, and utilization?


Foundations and IOs • Constructing data systems such as those necessary for measuring value-added and growth, grants for evaluations and research on these models, and developing state and district assessments. • Funding for education journalism, media, and blogs to translate data and research for the public, and to promote particular researchers/research. • Engaging policymakers, state and district officials, and reformers in conversations via the funding meetings and conferences where participants may exchange ideas about data use. • Funding for IOs: particular concentrations in charter school management networks, advocacy organizations.


Colorado Legislator: Research use in teacher merit pay So I would say that research was the foundation of the argument that we built. The other two pieces of research that I used a lot, and this is from some of the Gates research and some of the others…So we’ve had a year or so on process on this. And the dominant strain of every conversation has been, “show us a structure that you think works and show us a place that you know it works.” So it’s let’s see what Toledo, Ohio did and let’s see what Houston did with their evaluation system. Let’s look at DC IMPACT, let’s look at what Louisiana’s proposal is, what does TNTP have to say is the best structure? So it’s not as much let’s look at the Phi Delta Kappan for this month and see what’s in it, but it’s all about if you have an idea, you have to show us where that idea’s been tried before and they’ve had results on it. Colorado Legislator, Commenting about ProComp


Small and Big Foundations So we were probably the second funder in for ProComp, so when the Rose Community Foundation started it, they started to search around for local sources of revenue. We came in and gave them a grant-we were quickly overshadowed by the national foundations, Broad and those folks. So, we had a real interest in it and initially started working on it. It’s been an interesting journey because my take is that it’s one of the weakest levers around student achievement out there. And so, is merit pay or pay for performance a good idea? Eh, it’s alright. And I think it has some value, but not a lot of value. Denver-based family foundation


Building Institutional Infrastructure So whether they’re donors to our new school development effort or whether they support our advocacy or whatever. We have various, we certainly have at any given time, financial support from 6 to 8, maybe, local or national foundations. Sometimes that’s big on one and small here and it’s not...however nice it would be that they would all just write a big check and say, “use this however you choose”, they don’t and it tends to be around specific projects or initiatives and efforts, so there’s some fluctuation in terms of who is doing what and when. But generally speaking, at any given time, we do have a sizable chunk of revenue and relationships with a bunch of different funders. Colorado charter school organization


Advocacy Coalitions and IOs in New Orleans • Groups in “education reformer” coalition include o Local and State Level: New Schools for New Orleans, Pelican Institute, teachNOLA, Louisiana Association for Public Charters, Educate Now! o National: Stand for Children, Black Alliance for Educational Options, Teach For America, The New Teacher Project, CEE Trust, Center on Reinventing Public Education • Groups in opposing coalition include o Local and State Level: Research on Reforms, New Orleans Education Equity Roundtable, Louisiana School Boards Association o National: Parents Across America


Preliminary map of coalitions


Preliminary Findings (supply side) • Two opposing coalitions that interpret research results differently o Agreement that there is a lack of a credible, independent research voice • National level (NSNO report promoted by Sen. Mary Landrieu) • Some indication that the advent of vouchers in Louisiana may cause political divisions within local charter coalition


Preliminary Findings (demand side) • Opposing coalition organizations report BESE and state department not interested in their analyses • Cowen Institute at Tulane – relied on widely, but concern that mainly descriptive data • Many organizations and RSD report reliance on blogs, and contact with well-known researchers • Little use of peer-reviewed research (many not sure how to incorporate it)


Preliminary Findings, continued • With a couple of exceptions (New Schools for New Orleans and Louisiana Association of Public Charter Schools), relatively little research capacity in local IO networks • Culture of mutual mistrust • Implementation has not slowed despite research critiques; calls by Senator Landrieu for scaling up at national level • Opposing coalition, while not well-funded, has gotten some play in national media (e.g. Washington Post)


Culture of Mutual Mistrust Well, I have no problem saying this, but I think that we are really plagued by the same two narratives. It's either everything is going great or things are going to hell. There's a middle ground. I do believe that we need to have a more honest conversation about the current state of affairs in New Orleans public education. What's missing is honesty. Meaning it just would really be helpful if the folks who were in the game didn't have a particular stake in the game, so that it would be more neutral. But everyone comes to the table as coming with an angle. Even us. We're not immune to that, but I think we really do try to be honest about what is happening, but oftentimes I think people are overly cautious or not cautious enough because of who they serve.

--Staff Member at Orleans Parish Education Network


Bibliometric and Twitter analysis • Measuring the impact/influence of reports on broader fields (and broader fields on a given report) • Considering the role of different approaches to research, and of IOs in advancing specific forms • Forward and backward mapping • Testing the “echo-chamber” hypothesis


Citation Patterns (backwards) of two sets


Map of Citation Networks with Peer Review Status (backwards)


Type of user mentioned by intermediary organizations tweeting about "charter schools"

["Charter School" User Mentions – pie chart; categories: User In Organization, User In Coalition, User Outside Coalition, User Media Organization, User Policy Maker; values shown: 54, 39, 10, 2, 2]


Number of users mentioned by intermediary organizations and type of user mentioned in tweets about "charter schools"


Page Left Blank Intentionally


SEA Acquisition and Use of Research in School Improvement School District of Philadelphia Research, Policy and Practice Conference April 8, 2014

Margaret E. Goertz University of Pennsylvania Carol Barnes & Diane Massell University of Michigan Consortium for Policy Research in Education Graduate School of Education University of Pennsylvania


2


Study Overview

Research Questions:
1. How are SEAs organized for managing and using research and other knowledge in their improvement strategies for low-performing schools and districts?
2. How do SEAs search for, obtain and incorporate research-based information into policy and practices to support improvement for low-performing schools and districts?
3. What role does research and other forms of evidence or practitioner advice play in states' strategies to improve districts and schools?

Sample: Three State Education Agencies (SEAs) Data Collection: Interviews, Surveys, Document Review Conceptual Frame: Organizational theory and social network theory


Types and Forms of Knowledge

• Research-based Knowledge (RBK)
  – Published original research, research syntheses or summaries, meta-analyses, and published or un-published program evaluations.
  – RBK Designed for Use: models, programs, protocols or other tools that embed research-based practices in guides for action.
• Other Evidence-based Knowledge (EBK)
  – Data, facts and other information relevant to the problem at hand (e.g., student outcome data)
• Practitioner Knowledge (PK)
  – Information, beliefs and understanding of context that practitioners acquire through experience

4


Finding #1

• SEA staff actively sought and were receptive to research ideas.
• Their search was broad, reaching across the SEA and outside to external sources of research.

5


Finding #2

• 70% of SEA staff searched internally when seeking research on school improvement.
• This search reached across SEA offices, reflecting the cross-divisional nature of school improvement work.

6


Structure & Strength of Research Network: State C Yellow - Superintendent Lt Green - School Improvement Dk Green - Instruction Aqua - Program Monitoring Brown - Teacher Prep Blue - Career and Tech Pink - Special Ed Gray - Assessment Black - External

7


Finding #3

• Some SEA staff reached out to a broad array of external organizations for research information on school improvement.
• External organizations:
  – Provided expertise and human capital.
  – Translated, synthesized and packaged research.
  – Provided access to ready-made or adaptable tools.
  – Validated that actions were in step with research and/or practices of other states.

8


External Sources for Research Knowledge

[Bar chart: share of external research sources by type for States A, B, and C – Govt: Federal; Govt: State/regional/local; Professional Membership; Higher Education; Research/Provider (NGO); Education Journals; Other/Unknown. Scale: 0% to 45%.]

9


Finding #4

• Broad research networks, with many singular connections to knowledge sources, facilitated the flow of new ideas.
• But a set of influential SEA staff (and offices) incorporated research into school improvement strategies.

10


Structure & Strength of Research Network: State C Yellow - Superintendent Lt Green - School Improvement Dk Green - Instruction Aqua - Program Monitoring Brown - Teacher Prep Blue - Career and Tech Pink - Special Ed Gray - Assessment Black - External

11


Finding #5

• RBK informed the design of SI frameworks, indicators, strategies, tools and other forms of SI assistance.
• SEA staff valued "RBK designed for use."
• RBK was usually coupled with Practitioner Knowledge, as well as EBK, in the design/refinement of policies and practices.

12


Resources Margaret E. Goertz, Carol Barnes and Diane Massell (2013, August). How State Education Agencies Acquire and Use Research in School Improvement Strategies. CPRE Policy Brief (RB-55). www.cpre.org/policy-briefs Full Research Report: Margaret E. Goertz, Carol Barnes, Diane Massell, Ryan Fink, and Anthony Tuf Francis (2013, September). State Education Agencies’ Acquisition and Use of Research Knowledge in School Improvement Strategies. CPRE Research Report (RR-76). www.cpre.org/research-reports Kara S. Finnigan and Alan J. Daly (Eds.) (2014) Using Research Evidence in Education: From the Schoolhouse Door to Capitol Hill. Springer Publishers.

13


Page Left Blank Intentionally


Research in policy-making: The view from the other side

Carrie Conaway The SDP Research, Policy, and Practice Conference April 8, 2014


Educator evaluation in MA • Promote growth and development of leaders and teachers • Place student learning at the center, using multiple measures of student learning, growth, and achievement • Recognize excellence in teaching and leading • Set a high bar for professional teaching status • Shorten timelines for improvement


Where research mattered • Defining evaluation as a problem that required a solution • Determining how to include educator impact on student learning in the evaluation system • Incorporating student feedback in the system • Informing ongoing implementation


Why research mattered • Inherent nerdiness of Massachusetts • Commitment to learn from previous and ongoing work • A question whose answer was not preordained • Sufficient time


What types of research mattered • Literature reviews and reports from research intermediaries • Large-scale national studies • Studies with results specific to Massachusetts • Studies we commissioned • (Implementation studies, if they had existed)


Some suggestions for researchers

• Know the policy and reform agenda
• Focus on implementation, not just outcomes
• Facilitate comparisons
• Replicate analyses with local data
• Draw actionable conclusions
• Ditch the tome
• Embrace imperfect information


Embrace imperfect information

“Correlation doesn’t imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing ‘look over there’.” XKCD: http://xkcd.com/552/


Thank you!

Carrie Conaway carrie@virtualmax.com


Page Left Blank Intentionally



Tab 5 Instructions for Breakout Group Discussion


Page Left Blank Intentionally


BREAKOUT GROUP DISCUSSIONS

How can we enhance the use and communication of research within SDP?

Format
• Discussions with tablemates (20 minutes)
  o Nominate one person from each table as a reporter
  o A note taker will be assigned to each table (SDP or REL MA staff)
• Reporting out by all tables (20 minutes)
• Vote on highest priority research-to-practice strategies (10 minutes)
• Wrap up (by Tonya Wolford; 5 minutes)

Questions to Consider
• What strategies do you use to translate research to policy and practice?
  o Which strategies are general?
  o Which strategies are specific to a content area or methodology?
• What are the barriers to translating research to policy and practice?
• What strategies should be used to improve the translation of research to policy and practice?
• How can we enhance the use of research being conducted in the District?
  o What is feasible?
  o What can we do to overcome barriers?


Page Left Blank Intentionally


Tab 6 Concurrent Session A: Improving Student Learning – College & Career Readiness
Location: 2nd Floor Auditorium
Chair: Shelly Engelman, Office of Research and Evaluation, SDP
Panel:
1. Evaluation of the Philadelphia GEAR-UP Partnership Initiative – Results and Implications, Julia Alemany and Manuel Gutierrez, Metis Associates
2. Non-Cognitive Predictors of College Enrollment, Angela Duckworth and David Meketon, University of Pennsylvania
3. Post-Secondary Workforce Attainment and School Outcomes, Paul Harrington, Drexel University
Discussant: Dave Kipphut, Office of Career and Technical Education, SDP


Page Left Blank Intentionally


Non-Cognitive Predictors of College Enrollment Angela Duckworth David Meketon -University of Pennsylvania1


The Penn Study of Young Adults • Total of 2,160 high school seniors from Boston (36%) and Philadelphia (64%) • 58% of students qualified for free or reduced lunch • 50% of the students were female • 33% African American; 31% Caucasian; 18% Asian; 14% Hispanic; 4% Other • 58% of students enrolled in college in fall 2013 2


Motivation for Attending College

[Pie chart of students' reported motivations: Interdependent, Prosocial Purpose, Independent, Instrumental – shares of 28%, 27%, 25%, and 20%]

Interdependent: "I want to help my family out after I'm done with college"
Prosocial Purpose: "I want to become an educated citizen who can contribute to society"
Independent: "I want to become an independent thinker"
Instrumental: "I want to get a good job"


Grittier Students have Higher GPAs*

[Bar chart: average GPA (scale shown roughly 2.50 to 2.90) for students in the bottom, middle, and top thirds of Grit]

1) "I finish whatever I begin"
2) "I work very hard. I keep working when others stop to take a break"
3) "I stay interested in my goals even if they take a long time (months or years) to complete"
4) "I am diligent. I never give up"

*The difference in GPA is significant.


Students With Higher Non-Cognitive Skills are More Likely to Enroll Full Time in College

[Bar chart comparing students not enrolled vs. enrolled in college on five self-report scales: Internal Locus of Control*, Grit*, Self-Control*, Sense of Belonging*, Trust*]

Locus of Control: "Getting good grades is a matter of luck" (reverse-coded)
Grit: "I work very hard. I keep working when others stop to take a break"
Self-Control: "I pay attention and resist distractions in class"
Sense of Belonging: "I feel confident that I will belong in college"
Trust: "I am treated fairly by teachers and other adults at my high school"

*The difference between enrolled and not enrolled is significant.


The Diligence Task

6


Diligence Task Performance Predicts Full Time College Enrollment*

[Bar chart: number of correct math problems on the Diligence Task for students not enrolled vs. enrolled in college]

*The difference between enrolled and not enrolled is significant.

7


Page Left Blank Intentionally


April 8, 2014

Evaluation of the Philadelphia GEAR UP Partnership Initiative – Results and Implications Presented at the School District of Philadelphia R2P Conference

Presented By:

Julia Alemany Manuel Gutiérrez


The Philadelphia GEAR UP Partnership Initiative

• Need for GEAR UP – only 45% of the students in the original target high schools graduate within four years, and less than half of these graduates enroll in college.
• Six-year federal grant that began in 2009-10 with 6th- and 7th-grade cohorts. In 2013-14, the program serves over 4,200 10th-grade and 11th-grade students in seven high schools.
• Program components
  – Academic enrichment and supports
  – Teacher institutes and PD
  – College and career preparation activities
  – Family events and workshops

1


Evaluation Design

2


Key Implementation Findings From Program Participation Data Analyses • GEAR UP has a strong presence at the participating schools – 95% of students in the cohort were served – 87% participated in one or more college readiness activities

• Students with low or no participation were more likely to be male, ELLs, special education students and white. • For the most part, students participating in each GEAR UP activity were representative of the cohort as a whole.

3


Key Implementation Findings From Branch Qualitative Study

• Successes – Increased school engagement and buy-in – Comprehensive range of program offerings, increased availability of academic supports, and expanded academic advisement – Growth and stabilization of AVID – Sustainability of key elements of GEAR UP in many middle schools

• Challenges: – Low participation in services for at-risk students – Poor school climate at some schools – Larger schools experienced more challenges in meeting participation targets

4


Key Outcome Findings • 61% of students earned 6+ credits; 54% of students met attendance target of 85% • Students with greater GEAR UP dosage exhibited more positive outcomes, including: – Number of credits earned – Average daily attendance – Knowledge and attitudes towards postsecondary education

• Large variation in outcomes across schools

5


Relationship between College Programming and Students’ Self-Reported Learning

6


Key Impact Findings From Rigorous QED Study • Rigorous quasi-experimental design using Propensity Score Matching for developing a comparison group • Three sets of analyses to assess: 1. Overall impact of GEAR UP initiative 2. Impact of AVID 3. Impact of prolonged exposure to GEAR UP • Outcome variables: credits earned, average daily school attendance, Algebra I completion • Very promising results, including significant impact of AVID on credits earned and school attendance.

7
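The preceding slide notes that the impact analyses used propensity score matching to build a comparison group. As a generic illustration only (not the evaluators' actual code, data, or model), the sketch below shows one common way such matching is done; the covariate matrix X, the treatment indicator, and the outcome array are hypothetical placeholders.

```python
# Illustrative sketch of nearest-neighbor propensity score matching (with replacement).
# This is a generic example of the technique named on the slide, not Metis Associates' method.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_and_compare(X, treated, outcome):
    """X: covariate matrix (n x k); treated: 0/1 numpy array; outcome: numpy array."""
    # 1. Estimate propensity scores: P(treatment | covariates).
    scores = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    # 2. For each treated student, find the comparison student with the closest score.
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(scores[c_idx].reshape(-1, 1))
    _, matches = nn.kneighbors(scores[t_idx].reshape(-1, 1))
    matched_controls = c_idx[matches.ravel()]

    # 3. Compare mean outcomes for treated students vs. their matched comparisons.
    return outcome[t_idx].mean() - outcome[matched_controls].mean()
```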


Key Impact Findings From Rigorous QED Study

[Figure 2: Impact of AVID Participation on Student Outcomes. An asterisk denotes a statistically significant difference at the .05 level.]

8


Connecting Research to Practice Uses of Evaluation Data Program Participation Data

• To facilitate program management and oversight • To monitor implementation of each program component and inform programmatic decisions • To assess fidelity to the program model • To link GEAR UP dosage to student outcomes

Student, Parent, Principal, and Teacher Survey Data

• To identify needs and interests of students and parents • To assess changes over time in knowledge of college options, selection and application process • To obtain formative feedback on specific program activities • To assess sustainability of GEAR UP at the middle schools • To assess AVID implementation and self-reported outcomes

Outcome/Impact Data

• To identify student and schools’ academic needs and plan for activities • To identify effective and ineffective interventions • To demonstrate impact and promote buy-in from stakeholders • To inform advocacy and public policy efforts and secure resources

9


Connecting Research to Practice: Sharing Results

GEAR UP stakeholder – Method for reporting/disseminating findings

• Program staff: ongoing communication; monthly program participation reports; quarterly evaluation meetings and report; annual comprehensive reports
• Grant partners: partner meetings
• US Department of Education: APRs, biennial report, final report
• School community (school personnel, students and families): school-based survey reports; factsheets (e.g., AVID); newsletter

10


Questions

11


12


Page Left Blank Intentionally



Tab 7 Concurrent Session B: Becoming a Parent- and Family-Centered Organization
Location: Room 1080
Chair: David Kowalski, Office of Research and Evaluation, SDP
Panel:
1. Recent Developments in Customer Service at the School District, Karen James, Office of Parent and Family Services, SDP
2. The New District-wide Parent and Guardian Survey Program, David Kowalski, Office of Research and Evaluation, SDP
3. Nudging Attendance with Social Comparison, Todd Rogers, Kennedy School, Harvard University
Discussant: Allison Still, Office of Multilingual Curriculum and Programs, SDP


Page Left Blank Intentionally


Office of Research and Evaluation

THE NEW DISTRICT-WIDE PARENT AND GUARDIAN SURVEY PROGRAM David Kowalski, ORE Statistician


Survey Goals

• The survey is designed to gain insight into parents' & guardians' perceptions of:
  – School Safety
  – The Parent/Guardian-School Partnership
  – Their child's academic experience
  – District Performance


Stakeholder Participation

• In designing the survey the following sources were consulted:
  – Parents & Guardians
    o Focus Groups
    o Item Testing
  – Principals, Teachers, Counselors & Administrators
  – Researchers & Central Office Staff
  – Current Research


Platform & Distribution

• Online distribution will:
  – Reduce costs
  – Allow for real-time feedback on response rates
  – Reduce turnaround time on analysis


Proposed Timeline

• The survey window will extend from April 28th until June 6th
• A window of this duration will allow:
  – Parents time to take the survey
  – Community partners time to connect with parents
  – The survey team time to provide targeted outreach
  – Schools to utilize report card conferences to support parents in taking the survey


Moving Forward

• Gathered data will be used to:
  – Develop Individual School Reports
  – Provide insight into the level of Parent-School Partnership
  – Construct a Parent Score for the School Progress Report
  – Inform school & District level decisions
  – Direct future research efforts


Takeaways

• The new Parent/Guardian survey:
  – Was constructed in partnership with parents
  – Will provide a clear conduit for parent feedback
  – Is more user friendly than previous surveys
  – Will cost less to administer than previous surveys
  – Will allow for greater transparency & accountability
• Prior to the next administration, further feedback will be gathered from all stakeholders


Office of Research and Evaluation

THANK YOU FOR YOUR TIME!

Name: David Kowalski Email: dkowalski@philasd.org Phone: 215-400-6136



Tab 8 Concurrent Session C: Developing a System of Excellent Schools
Location: Room 1075
Chair: Kati Stratos, Office of Research and Evaluation, SDP
Panel:
1. Evaluation of the Renaissance Schools Initiative, Kati Stratos and Adrienne Reitano, Office of Research and Evaluation, SDP
2. Developing a New School in Philadelphia, Laura Shubila, Building 21
3. Evidence for Developing Systems of Excellent Schools: Turnarounds, Charters, and School Closures, Matt Steinberg, University of Pennsylvania
Discussant: Grace Cannon, Office of New School Models, SDP


Page Left Blank Intentionally


Office of Research and Evaluation

SCHOOL TURNAROUND IN PHILADELPHIA Kati Stratos, Senior Research Associate Adrienne Reitano, Research Assistant


Overview 

• Background and Context
  – School Turnaround
  – The Renaissance Initiative
• ORE's Role and Evaluation Plan
  – Quantitative Research – Findings
  – Qualitative Research – In Progress
• Research to Practice


Background and Context

Turnaround in the Literature
• Generally defined as "a dramatic increase" in student achievement over a short period of time
• There is no single, replicable school turnaround model that has proven successful
• Other large urban districts undertaking large-scale turnaround: Chicago, New York, Denver, Los Angeles

Turnaround in Philadelphia
• The Renaissance Initiative
  – Renaissance Charters
  – Promise Academies

Current Landscape
• 20 Renaissance Charters across 7 Charter Management Organizations (CMOs)
• 12 Promise Academies (3 previously closed)


Renaissance Charters vs. Promise Academies

Renaissance Charters
• Run by CMOs but remain neighborhood schools
• Relative autonomy from the District, but greater oversight than traditional charters
• Required School Advisory Councils (SACs)

Promise Academies
• Year 1
  – Most principals replaced
  – At least half of the staff replaced
  – Extended school day
  – Saturday and summer school
  – Additional funding and staff
  – Uniforms
  – SACs
• Now (Year 4)
  – Additional funding
  – Uniforms
  – SACs


Quantitative Findings – PSSAs – Renaissance Charters

[Figure 1: Average percentage point change since turnaround in students scoring Advanced/Proficient on PSSA Reading, by Renaissance Charter school (Cohorts 1-3)]

[Figure 2: Average percentage point change since turnaround in students scoring Advanced/Proficient on PSSA Math, by Renaissance Charter school (Cohorts 1-3)]

Key Findings:
• 15 of 17 schools increased in reading
• 13 of 17 schools increased in math
• More than half met the criteria for "rapid growth" – a minimum increase of 4 points each year since turnaround


Quantitative Findings – PSSAs – Promise Academies

Figure 1. Average Percentage Point Change since Turnaround in Students Scoring Adv./Prof. on PSSA Reading for Promise Academy Schools (bar chart by school, grouped into Cohort 1 and Cohort 2 schools)

Figure 2. Average Percentage Point Change since Turnaround in Students Scoring Adv./Prof. on PSSA Math for Promise Academy Schools (bar chart by school, grouped into Cohort 1 and Cohort 2 schools)

Key Findings:
• 3 of 6 schools increased in reading
• 1 of 6 schools increased in math
• Only 1 school met the criteria for rapid growth


Quantitative Findings – Retention

Figure 1. Within-year Retention Rate from Year Preceding Turnaround to 1 Year into Turnaround in Renaissance Charter Schools and Promise Academies

                        Pre-turnaround   Post-turnaround   Difference
Renaissance Charters        83.0%            89.1%            6.1*
Promise Academies           76.3%            81.6%            5.3*

*Indicates that difference is significant at p<.05.

Figure 2. Within-year Student Retention Rate by School Type in 2012-2013 School Year (bar chart comparing District Schools, Charter Schools, Renaissance Charters, and Promise Academies at grades 3, 7, and 9; rates range from 80% to 97%)

Key Findings:
• Within-year student retention improved in the year after turnaround in both Renaissance Charters and Promise Academies; the improvement was greater in Renaissance Charters.
• These gains are significant across grades 3, 5 and 9 in Renaissance Charters, but only for grade 9 in Promise Academies.
• Increases in student retention experienced in the year after turnaround were maintained through 2012-2013.


Quantitative Findings – Climate

Figure 1. Change in Serious Incident Rate since Turnaround in Renaissance Charter Schools (bar chart of the change in serious incident rate for each Renaissance Charter school, grouped by Cohorts 1, 2, and 3)

Figure 2. Change in Serious Incident Rate since Turnaround in Promise Academy Schools (Cohorts 1 and 2): Dunbar -12.66, Ethel Allen 0.42, Clemente 0.19, Potter-Thomas 0.69, West Phil. HS 4.42, ML King HS 2.94

Key Findings:
• Nearly all Renaissance Charters have shown a decrease in the serious incident and offender rates since turnaround.
• 5 of 6 Promise Academies are at pre-turnaround levels or higher. The offender rate has declined at Promise Academy elementary and middle schools, and increased at Promise Academy high schools.
• Changes in school climate appear to be correlated with changes in student achievement.


Qualitative Evaluation Practices

Site Visit Protocol
• Pre-visit survey
• Classroom observations
• Interviews
• Walkthrough
Topics addressed across the protocol include: Climate, Safety, Environment; School Safety; Principals and School Leadership; Teachers; Students; Vision for Learning; High Quality Instruction; Positive Environment; Talent Development; Data; Family and Community Relationships; Structure, Procedures and Policies; Instruction and classroom management.

High Performing School Practices Rubric: "When you walk into a SDP school, you should expect to see evidence of…" Each of the following is worth 10 possible points (maximum total score: 70):
1. School leaders developing, articulating, stewarding, and implementing a clear vision for learning for all students and a strategic plan to accomplish that vision; all school stakeholders able to articulate a clear and shared vision for learning (1a)
2. A safe, secure and orderly environment for all (2b)
3. Principals who are experts in high quality instructional practices that consistently promote excellent instruction school-wide; principals who are visible in classrooms, with teachers regularly receiving timely and constructive feedback on classroom instruction from school administrators and colleagues (3c); school leaders and teachers clearly communicating learning and development objectives that reflect high expectations for learning and growth, a belief that all students can learn, and a commitment to meet each student's educational needs (3d)
4. Collegial and professional relationships among staff and students that promote critical reflection, shared accountability, and continuous improvement (1c); constructive management of conflict at all levels (2f)
5. Teachers regularly collaborating on practice and providing each other with support and constructive feedback (1c); careful staff selection and effective assignment of staff (2g); plans to support the professional growth of staff members that are differentiated based on identified needs and individual goals (4c); a deliberate approach to building leadership capacity among staff (4c)
6. The frequent collection, analysis, and use of multiple sources of data to guide continuous improvement in student achievement and wellbeing and professional development for staff (1b)
7. Positive and collaborative relationships with families and communities (4a)


Research to Practice
• What patterns emerge from the quantitative data, and how can they be informed by the qualitative findings?
• Which models and providers are more effective in achieving and maintaining rapid growth?
• What systems and policies could have contributed to these outcomes?
• What barriers prevent other models and providers from reaching similar goals?


Office of Research and Evaluation

THANK YOU FOR YOUR TIME!

Kati Stratos
kstratos@philasd.org
215-400-5396

Adrienne Reitano
areitano@philasd.org
215-400-6536


Page Left Blank Intentionally



Tab 9 Concurrent Session D: Becoming an Innovative and Accountable Organization, and Achieving and Sustaining Financial Balance
Location: Room 1072
Chair: Jon Supovitz, Co-Director, Center for Policy Research in Education, University of Pennsylvania
Panel:
1. Understanding School-Based Partnerships in the School District of Philadelphia, Sarah Costello, Swarthmore
2. Developing a New School District Accountability System, Jura Chung, Office of Strategic Analytics, SDP
3. Fair Financing for Schools in Philadelphia, Rand Quinn, Graduate School of Education, University of Pennsylvania
Discussant: Jon Supovitz, Co-Director, Center for Policy Research in Education, University of Pennsylvania


Page Left Blank Intentionally


Assessing Adequacy in Education Spending: A Summary of Key Findings

Matthew P. Steinberg

Rand Quinn

Research, Policy and Practice Conference April 8, 2014


Our Investigation. A comprehensive (and ongoing) assessment of:

1. The legislative, legal and policy context shaping education funding, both locally and nationally.
2. Funding trends (the sources of education funding and expenditures per pupil), both locally and nationally.
3. Issues related to equitable and adequate funding, on a per-pupil basis, throughout the Commonwealth, with a focus on the School District of Philadelphia.


Our Methods

We calculated an adequate per-pupil amount for each school district in Pennsylvania using the methodology of the legislatively commissioned Costing Out Study. We then compared SDP's adequacy gap to districts with similar achievement scores and populations.


Our Findings

1. PA required an additional $3.21 billion statewide in education funding to account for the difference between current per-pupil spending and an educationally adequate level of spending for the 2009-10 school year.
2. There are significant differences in the adequacy gap across PA districts. These differences vary by district-level student poverty, academic performance and geographic location. The poorest and lowest achieving districts had the largest adequacy gaps.
3. Even in the presence of a funding formula, SDP did not receive sufficient resources. However, SDP more efficiently utilized its resources than the average peer district in terms of student poverty and achievement.


Expenditures in SDP and Peer Districts, by District Poverty (Per-Pupil Spending, 2010$)

                              Per-Pupil Spending   Adequacy Amount   Adequacy Gap
Highest Poverty Districts          $13,308             $15,916          $2,608
SDP                                $11,417             $16,895          $5,478
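For reference, the adequacy gap shown in these figures is the difference between the calculated adequacy amount and actual per-pupil spending (the same difference described in the findings above); using SDP's figures:

\[
\text{Adequacy gap}_{\text{SDP}} = \$16{,}895 - \$11{,}417 = \$5{,}478 \ \text{per pupil (2010\$)}
\]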


Expenditures in SDP and Peer Districts, by Math Achievement (Per-Pupil Spending, 2010$)

                                      Per-Pupil Spending   Adequacy Amount   Adequacy Gap
Lowest Achieving (Math) Districts          $13,837             $15,996          $2,159
SDP                                        $11,417             $16,895          $5,478


Expenditures in SDP and Peer Districts, by ELA Achievement (Per-Pupil Spending, 2010$)

                                         Per-Pupil Spending   Adequacy Amount   Adequacy Gap
Lowest Achieving (Reading) Districts          $13,660             $16,004          $2,344
SDP                                           $11,417             $16,895          $5,478


Research to Practice Strategies

1. The study's findings support claims that Pennsylvania's education funding system unfairly burdens SDP.
2. They also refute claims that SDP spends its resources inefficiently.
3. While many students in SDP are still not academically proficient, the study indicates that an investment of resources could potentially catalyze large gains in student success.


Rand Quinn Assistant Professor Graduate School of Education University of Pennsylvania raq@gse.upenn.edu


Page Left Blank Intentionally



Tab 10 Concurrent Session E: Improving Student Learning – Dropout Prevention
Location: 2nd Floor Auditorium
Speaker: Russell Rumberger, California Dropout Research Project, University of California, Santa Barbara


Page Left Blank Intentionally


Dropout Prevention
Russell W. Rumberger
University of California, Santa Barbara
russ@education.ucsb.edu

1


Urgency for America So this is a problem we cannot afford to accept and we cannot afford to ignore. The stakes are too high―for our children, for our economy, and for our country. It's time for all of us to come together―parents, students, principals and teachers, business leaders and elected officials from across the political spectrum―to end America's dropout crisis. ―Barack Obama, February 24, 2009

2


Improving College Graduation Rates Requires Improving High School Graduation Rates

To produce 8.2 million new college graduates by 2020 requires raising the nation's high school graduation rate by 17.5 percentage points.
―Opportunity to Learn Campaign, 2020 Vision Roadmap (2011)

3


U.S. ranks 21st worldwide in high school graduation rate

Japan 96, Finland 96, Korea 93, United Kingdom 93, Netherlands 92, Germany 92, Norway 90, Denmark 90, Ireland 89, Portugal 89, Iceland 88, Spain 88, Hungary 86, Canada 85, Slovak Republic 85, Israel 85, Poland 84, OECD Average 83, Chile 83, Italy 79, Czech Republic 78, United States 77, Sweden 75, Luxembourg 70, Greece 68, Austria 67, Turkey 56

SOURCE: OECD (2013)

4


My Background on Dropouts
Research
• “Dropping Out of High School” (AERJ, 1983)
• Engaging Schools: Fostering High School Students’ Motivation to Learn (NRC, 2005)
• Improving Measures of High School Dropout, Graduation, and Completion Rates: Better Data, Better Measures, Better Decisions (NRC, 2010)
• Dropping Out: Why Students Drop Out of High School and What Can be Done About It (Harvard University Press, 2011)
Practice
• Collaborator on proven dropout prevention program, ALAS (1990-95)
• Dropout Prevention: A Practice Guide (IES, 2008)
Policy
• Started California Dropout Research Project (2006)
• Solving California’s Dropout Crisis (CDRP, 2008)
5


cdrp.ucsb.edu

6


Dimensions of the Dropout Crisis
1. Magnitude and trends
2. Consequences
3. Causes
4. Solutions


1. The problem is severe


What is a Dropout?

• Dropout as a status
• Dropout as an event
• Dropout as a process: Enroll → Attend → Progress → Graduate → Drop Out


What is a Graduate?

• Graduate earns a high school diploma • Completer earns diploma or equivalency (GED)


Dropout Factories • In US, 18% (2,007) of regular and vocational high schools account for 50% of the dropouts (“dropout factories”) • In California, 1% (25) of all high schools account for 21% of dropouts


2. The social and economic costs are staggering


Consequences of Dropping Out

INDIVIDUAL CONSEQUENCES
• Lower wages
• Higher unemployment
• Increased crime
• Poorer health
• Reduced political participation
• Reduced intergenerational mobility

SOCIAL COSTS
• Reduced national and state income
• Reduced tax revenues
• Increased social services
• Increased crime
• Poorer health
• Reduced political participation
• Reduced intergenerational mobility


Consequences of Dropping Out (Compared to High School Graduates)

• Lifetime earnings half a million dollars lower
• 6 times more likely to be incarcerated
• Life expectancy nine years less
• 2-3 times more likely to receive Medicaid
• More likely to be poor—poor children 2-3 times more likely to become poor adults


Lifetime Economic Losses per Cohort of 20-year-old Dropouts, 2004

             Losses per dropout   Losses per cohort (Billions)
Taxes             $139,000                  $98
Crime              $26,000                  $19
Welfare             $3,000                   $2
Health             $40,500                  $29
TOTAL             $209,200                 $148

SOURCE: Belfield and Levin (2007).


3. The causes are complex: related to students, families, schools, and communities


Understanding Causes
• Causes vs. reasons vs. predictors
• Individual
  – Demographic (unalterable)
  – Attitudes and behaviors (alterable)
• Institutional: Family, School, Community
  – Resources
  – Practices
• Proximal (high school) vs. distal (before high school)
• Dropout vs. achievement


Reasons for Dropping Out (percent citing each reason)

ANY SCHOOL REASON: 82
• Missed too many days of school: 44
• Thought it would be easier to get GED: 41
• Failing in school: 38
• Did not like school: 37
• Could not keep up with schoolwork: 32
ANY FAMILY REASON: 34
• Pregnant: 28
ANY JOB REASON: 35
• Got a job: 28

SOURCE: CDRP Statistical Brief 2 (2007).


Individual Predictors
• Mobility
• Academic achievement (failed classes)
• Poor attendance
• Misbehavior
• Low educational aspirations
• Retention


Risk Indicators

SOURCE: CDRP Research Report 14 (2008).


Student and School Predictors

Figure: Predicted 10th grade graduation rates by student and school SES, 2002 (bar chart comparing predicted graduation rates across low, middle, and high individual SES, broken out by low, middle, and high school SES; predicted rates range from 74 to 93 percent)

SOURCE: Preliminary analysis of data from Education Longitudinal Study: 2002


The Dropout Process
(diagram: Environment → Beliefs and attitudes → Engagement → Achievement / Dropout)
SOURCE: National Research Council, Engaging Schools (2005).


The Dropout Process
(diagram: Environment → Beliefs about competence and control, Values and goals, Sense of belonging → Cognitive, Behavioral, and Emotional Engagement → Achievement / Dropout)
SOURCE: National Research Council, Engaging Schools (2005).


Noncognitive Skills and Academic Performance

Farrington, et al. (2012)

24


Implications of Research Findings for Policy and Practice

• Address both academic and social needs of students • Start before high school—more effective and less costly • Focus on individual students and institutions that support them (families, schools, communities)


4. There is a range of possible solutions


Intervention Strategies
1. Programmatic—focus on students
   • Support programs
   • Alternative programs and schools
2. Comprehensive—focus on schools
   • Comprehensive school reform
   • School/community partnerships
3. Systemic—focus on system
   • School/district capacity building
   • State policy (e.g., compulsory schooling age; graduation requirements)
27


1. Programmatic Solutions

• Advantages
  – Easier to design, fund, implement, evaluate
• Disadvantages
  – Limited impact—only appropriate where dropout problem is small
  – Adds to programmatic “overload” at local level
  – Few proven programs—What Works Clearinghouse has identified five proven programs

28


2. Comprehensive Solutions

• Advantages
  – Potential to impact more students—more appropriate in “dropout factories”
  – Potential to impact multiple educational outcomes (test scores and dropout rates)
• Disadvantages
  – More difficult to alter families, schools, and communities
  – Few proven comprehensive school reform models—Comprehensive School Reform Quality Center identified 3 out of 18 models that significantly improved graduation rates
  – Unclear what incentives, resources, and support are needed to improve school and district capacity

29


3. Systemic Solutions

• Advantages
  – Potential to impact more students
  – Potential to impact multiple educational outcomes (test scores and dropout rates)
• Disadvantages
  – More difficult to alter families, schools, and communities
  – Unclear what incentives, resources, and support are needed to improve school, district, and state capacity

30


What Works Clearinghouse US Department of Education

• Reviewed 108 studies of 19 dropout interventions
• 26 studies had rigorous evaluations
• 13 interventions had positive or potentially positive effects
  – 6 Staying in school
  – 6 Progressing in school (toward graduation)
  – 5 Completing school (GED)
  – 0 Graduating with a diploma

31


Other Reviews
• Washington State Institute for Public Policy (2009)
  ‒ Strict criteria for considering evidence
  ‒ Reduces estimated effects for weaker studies
  ‒ Found only 6 programs that improved graduation rates, and the only significant effects were for alternative education programs (Career Academies)
• Campbell Collaboration (2011)
  ‒ More inclusive criteria for considering evidence
  ‒ Found many more studies that had positive impacts on dropout, enrollment, graduation, and completion
  ‒ Average program effect reduced dropout rates by 8 percentage points
32


Proven Dropout Interventions

                                        Costs per    Benefits per   Benefit-Cost
                                        Graduate     Graduate       Ratio
Perry Preschool Program (pre-K)         $90,700      $209,100       2.31
Chicago Parent Child Centers (pre-K)    $67,000      $209,100       3.09
Class size reduction (15 to 1) (K-3)    $143,600     $209,100       1.46
Teacher salary increase (K-12)          $82,000      $209,100       2.55
First Things First (9-12)               $59,100      $209,100       3.54

SOURCE: Belfield and Levin (2007)


(released September 2008)

34


Overview of Practice Guides

• Provide guidance to practitioners on best practices • Based on wide array of evidence from rigorous evaluations to expert opinion • Level of evidence determined for each recommendation

35


Practice Guides: Levels of Evidence

Strong
• Consistent body of research that recommendation improves student outcomes
• Targets outcomes of interest
• Strong causal evidence AND strong generalizable evidence

Moderate
• Moderate body of research
• Either strong causal evidence OR strong generalizable evidence
• Related to outcomes of interest

Minimal
• Minimal body of research to support recommendation
• Practice would be difficult to study
• Practice has not been extensively studied
• Conflicting results
• Supported by expert opinion


Measures of Effectiveness

• Staying in school • Progressing in school • Completing school

37


Recommendation 1

• Utilize data systems that support a realistic diagnosis of the number of students who drop out and that help identify individual students at high risk of dropping out (diagnostic).
• Level of evidence: Low

38


1. How to carry out this recommendation
1. Use longitudinal, student-level data to get an accurate read of graduation and dropout rates.
2. Use data to identify incoming students with histories of academic problems, truancy, behavioral problems, and retentions (an illustrative sketch of steps 1 and 2 appears below).
3. Monitor the academic and social performance of all students continually.
4. Review student-level data to identify students at risk of dropping out before key academic transitions.
5. Monitor students’ sense of engagement and belonging in school.
6. Collect and document accurate information on student withdrawals.
39
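The practice guide states these steps in general terms. As a purely illustrative sketch (not part of the guide, and with hypothetical field names and thresholds), the snippet below shows how steps 1 and 2 might look against longitudinal, student-level records: computing a four-year cohort graduation rate and flagging incoming students with histories of academic problems, truancy, behavior problems, or retention.

```python
# Illustrative only: field names and thresholds are hypothetical, not any district's actual data model.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    cohort_year: int            # year the student entered grade 9
    graduated_in_4_years: bool
    attendance_rate: float      # share of enrolled days attended, 0.0-1.0
    courses_failed: int         # core course failures in the most recent year
    suspensions: int
    ever_retained: bool         # retained in grade at least once

def cohort_graduation_rate(records, cohort_year):
    """Four-year cohort graduation rate from longitudinal, student-level data (step 1)."""
    cohort = [r for r in records if r.cohort_year == cohort_year]
    if not cohort:
        return None
    return sum(r.graduated_in_4_years for r in cohort) / len(cohort)

def flag_high_risk(records):
    """Flag incoming students with histories of academic problems, truancy,
    behavior problems, or retention (step 2). Thresholds are illustrative."""
    flagged = []
    for r in records:
        risk_factors = [
            r.attendance_rate < 0.90,   # chronic absence
            r.courses_failed >= 1,      # course failure
            r.suspensions >= 1,         # behavior
            r.ever_retained,            # prior retention
        ]
        if any(risk_factors):
            flagged.append((r.student_id, sum(risk_factors)))
    # Highest number of risk factors first, so advocates can prioritize caseloads
    return sorted(flagged, key=lambda x: x[1], reverse=True)
```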


Recommendation 2

• Assign adult advocates to students at risk of dropping out (targeted intervention).
• Level of evidence: Moderate

40


2. How to carry out this recommendation
1. Choose adults who are committed to investing in the student’s personal and academic success, keep caseloads low, and purposefully match students with adult advocates. Use data to identify incoming students with histories of academic problems, truancy, behavioral problems, and retentions.
2. Establish a regular time in the school day or week for students to meet with the adult.
3. Communicate with adult advocates about the various obstacles students may encounter, and provide adult advocates with guidance and training about how to work with students, parents, or school staff to address the problems.

41


Recommendation 3

• Provide academic support and enrichment to improve academic performance (targeted intervention).
• Level of evidence: Moderate

42


3. How to carry out this recommendation
1. Provide individual or small group support in test-taking skills, study skills, or targeted subject areas such as reading, writing, or math.
2. Provide extra study time and opportunities for credit recovery and accumulation through after school, Saturday school, or summer enrichment programs.

43


Recommendation 4

• Implement programs to improve students’ classroom behavior and social skills (targeted intervention). • Level of evidence: Low

44


4. How to carry out this recommendation
1. Use adult advocates or other engaged adults to help students establish attainable academic and behavioral goals with specific benchmarks.
2. Recognize student accomplishments.
3. Teach strategies to strengthen problem-solving and decision-making skills.
4. Establish partnerships with community-based program providers and other agencies such as social services, welfare, mental health, and law enforcement.
45


Recommendation 5

• Personalize the learning environment and instructional process (schoolwide intervention).
• Level of evidence: Moderate

46


5. How to carry out this recommendation
1. Establish small learning communities.
2. Establish team teaching.
3. Create smaller classes.
4. Create extended time in classroom through changes to the school schedule.
5. Encourage student participation in extracurricular activities.

47


Recommendation 6

• Provide rigorous and relevant instruction to better engage students in learning and provide the skills needed to graduate and to serve them after they leave school (schoolwide intervention).
• Level of evidence: Moderate

48


6. How to carry out this recommendation
1. Provide teachers with ongoing ways to expand their knowledge and improve their skills.
2. Integrate academic content with career and skill-based themes through career academies or multiple pathways models.
3. Host career days and offer opportunities for work-related experiences and visits to postsecondary campuses.
4. Provide students with extra assistance and information about the demands of college.
5. Partner with local businesses to provide opportunities for work-related experience such as internships, simulated job interviews, or long-term employment.

49


Implementing Recommendations

• Choosing between strategies, targeted programs, schoolwide programs • Selecting strategies and programs that are both effective and cost effective • Matching programs and strategies with local needs, capacity, and context • Evaluating outcomes of locally implemented programs

50


What Else Is Needed?

1. Redefine high school success
2. Provide incentives to educate all students
3. Build the capacity of the educational system
4. Desegregate schools
5. Strengthen families and communities

51


The Importance of Noncognitive Skills

Both types of skill [cognitive and noncognitive (motivation, tenacity, trustworthiness, perseverance)] are valued in the market and affect school choices… Our finding… demonstrates the folly of a psychometrically-oriented educational policy that assumes cognitive skills to be all that matter. A more comprehensive evaluation of educational systems would account for their effects on producing the noncognitive traits that are also valued in the market.
―James Heckman, Nobel Laureate, Economics (2001)


21st Century Competencies • Cognitive Competencies • Cognitive processes and strategies • Knowledge • Creativity

• Intra-Personal Competencies • Intellectual openness • Work ethic and conscientiousness • Positive core self-evaluation

• Inter-Personal Competencies • Teamwork and collaboration • Leadership

SOURCE: National Research Council. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century.

53


The Challenge

In some part, the difficulties and complexity of any solution derive from the premise that our society is committed to overcoming, not merely inequalities in the distribution of educational resources (classroom teachers, libraries, etc.), but inequalities in the opportunity for educational achievement. This is a task far more ambitious than has ever been attempted by any society: not just to offer, in a passive way, equal access to educational resources, but to provide an educational environment that will free a child’s potentialities for learning from the inequalities imposed upon him by the accident of birth into one or another home and social environment.
―James Coleman (1967)


REFERENCES

Farrington, C. A., Roderick, M., Allensworth, E., Nagaoka, J., Keyes, T. S., Johnson, D. W., & Beechum, N. O. (2012). Teaching adolescents to become learners: The role of noncognitive factors in shaping school performance. Chicago: Consortium on Chicago School Research, University of Chicago.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74, 59-109.
National Research Council, Committee on Increasing High School Students’ Engagement and Motivation to Learn. (2005). Engaging Schools: Fostering High School Students’ Motivation to Learn. Washington, D.C.: The National Academies Press.
National Research Council, Committee on Defining Deeper Learning and 21st Century Skills. (2012). Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century. Washington, D.C.: The National Academies Press.
Rotermund, S. (2010). The Role of Psychological Precursors and Student Engagement in a Process Model of High School Dropout. Ph.D. dissertation, University of California, Santa Barbara.
Rumberger, R. W. (2011). Dropping Out: Why Students Drop Out of High School and What Can be Done About It. Cambridge, MA: Harvard University Press.
55


Page Left Blank Intentionally


Tab 11 Concurrent Session F: Identifying and Developing Exceptional, Committed People
Location: Room 1075
Chair: James Jack, Office of Research and Evaluation, SDP
Panel:
1. Implementing the New Educator Effectiveness System in Philadelphia, Brett Shiel, Office of Teacher Effectiveness, SDP
2. Evaluating the New Educator Effectiveness System in Philadelphia, Kati Stratos and James Jack, Office of Research and Evaluation, SDP
3. Improving Teachers’ Professional Development: Insights from Recent Randomized Control Trials, Laura Desimone, Graduate School of Education, University of Pennsylvania
4. The Relationship between Teacher Preparation Experiences and Early Teacher Effectiveness and Implications for Human Capital Management Strategies, Peter Witham, Education Analytics
Discussant: Matt Steinberg, Graduate School of Education, University of Pennsylvania


Page Left Blank Intentionally


Improving Teachers’ Professional Development: Insights from Recent Randomized Control Trials
Laura M. Desimone
Associate Professor, Education Policy
University of Pennsylvania Graduate School of Education


What Have We Learned from Recent Randomized Control Trials?
• Many “high-quality” PD interventions have been successful in improving teaching and student learning.
  – High-quality PD has five core features:
    1. Sustained
    2. Content-focused
    3. Active learning
    4. Collective participation
    5. Coherent with other reforms and activities
• But, sometimes even high-quality PD doesn’t work.
• Why do some high-quality PDs work, and others fail?


The Duration Necessary for Effects Varies with the Target of the PD
• Three common targets:
  1. Content Knowledge
  2. Pedagogy
  3. Decision-making
• Change is easier when the target is a simple, specific pedagogy (e.g., print referencing) than when it attempts to improve content knowledge or foster complex pedagogical changes (e.g., inquiry-oriented instruction).


Insights About Collective Participation • Participation in groups builds community, and provides opportunity for feedback and reflection. • Teachers differ in what they learn and do as a result of PD, based on their own knowledge, beliefs, experience, classroom conditions, etc. • PD needs to be adaptive to individual teacher knowledge, learning needs, and classroom circumstances.


Monitoring and Improving Implementation • To what extent does implementation quality matter? • Is teacher adaptation a form of constructive flexibility, or a detrimental deviation? • How much student exposure is enough to elicit gains? • Are there threshold effects?


Insights About Coherence
• PD is more effective when it:
  – Clearly aligns with curriculum/standards
  – Is explicitly integrated into daily lessons
  – Reflects the realities of urban schools and classrooms
  – Has the support of school and district leaders
  – Is embedded in the evaluation/accountability system


Conclusions
• We know how to make PD more effective. Start with the five core features and:
  1. Calibrate duration to the goals of the PD
  2. Adapt to teacher needs
  3. Monitor, evaluate and support implementation
  4. Align with curriculum, lessons, and school
  5. Integrate the active support of the school and district leaders


Thank you. Please direct questions or other follow up to Laura M. Desimone, Graduate School of Education, University of Pennsylvania, lauramd@gse.upenn.edu


Tab 12 Concurrent Session G: Improving Student Learning – Literacy
Location: Room 1080
Chair: Adrienne Reitano, Office of Research and Evaluation, SDP
Panel:
1. K-3 Literacy in the School District, Doria Mitchell, Office of Early Childhood, SDP
2. Children’s Literacy Initiative in Philadelphia, Jill Vallunas
3. Translanguaging as a Resource for Literacy Development, Nelson Flores, Graduate School of Education, University of Pennsylvania
Discussants: Barbara Wasik, Temple University; Cheryl Micheau, Office of Multilingual Curriculum and Programs, SDP


Page Left Blank Intentionally


Translanguaging as a Resource for Literacy Development
Nelson Flores
University of Pennsylvania


Monoglossic Language Ideologies • Monolingualism as the norm • Bilingualism as mastery over two discrete languages – Double monolingualism

• Languages should be strictly separated in instruction


Subtractive Bilingualism: Language as Problem
(diagram depicting the relationship between L1 and L2)


Additive Bilingualism: Balanced Bilingualism
(diagram: L1 + L2 = L1 + L2)


Monoglossic Literacy Instruction • An ESL program at School A has a strict English-Only policy for literacy instruction • A Transitional Bilingual Education program at School B uses Spanish as a “crutch” to develop English literacy and then insists on English-Only instruction. • A dual language program keeps both languages strictly separate in literacy instruction and requires students to only use English during English time and Spanish during Spanish time.


Heteroglossic Language Ideologies • Bi/multilingualism as the norm • Bilingualism is conceptualized as one linguistic repertoire that language users strategically choose from during social interactions – Dynamic Bilingualism – Translanguaging

• Languages should be strategically used in instruction through use of translanguaging


Dynamic Bilingualism: Language as a Flexible Tool


What is translanguaging? • Fluid communicative norm of multilingual communities and families. • Dynamic process by which bilinguals “make sense” of communicative situations by performing bilingually and drawing on their entire linguistic repertoire.


Translanguaging and Literacy Instruction • New literacy practices emerge in interrelationship with literacy practices students currently engage in • Home literacy practices should be incorporated strategically in ways that facilitate this interrelationship • Goal is development of bilingual voices and identities not L1 and L2 identities


Beyond Monoglossic School Structures • Macro-Allocation Policy – Lessons have a language goal that is either: (1) Home language, (2) English, or (3) Bilingual • Micro-Allocation Policy – Encourage students to use their entire linguistic repertoire as they work toward the content and language goals of the lesson


Translanguaging in Micro-Allocation Policy in Literacy • Reading – Assign bilingual reading partners for mutual assistance – Provide bilingual books/ translations of books where appropriate to aid comprehension – Provide/encourage reading material for research project in both languages

• Writing – Allow students to brainstorm ideas first using both languages, then transfer to writing – Assign students bilingual writing partners – Students pre-write in both languages, then publish in one


Translanguaging in Micro-Allocation Policy in Oracy • Speaking – Assign newcomers a buddy to show them around school, answer questions inside and outside class, etc. – Group students so they can use both languages in small group work, then present in target language – Allow students to discuss lesson/ideas with partner in both languages

• Listening – Create a multilingual listening center comprised of fiction and non-fiction texts in the classroom, narratives of community members, and books recorded by students (a favorite book or their own writing) – Allow students to explain things to each other using both languages


Developing Macro-Spaces for Translanguaging • Bilingual projects • dual language books • bilingual brochures

• Translation projects • oral translation • written translation • receptive in one language, productive in the other.

• Translanguaging projects • reading bilingual authors who engage in translanguaging • develop writing modeling similar translanguaging strategies


Translanguaging as US Bilingual Languaging


Recommendations • Expand number of dual language programs that support emergent bilinguals in developing literacy in both English and their home language • Acknowledge and respect the translanguaging practices of emergent bilingual students • Treat translanguaging as a springboard for academic literacy development • Set clear language and literacy goals while also encouraging students to engage in translanguaging to achieve these goals • Engage emergent bilinguals in translanguaging rhetorical models


Thank you!



Tab 13 Readings

Coburn, C.E., Penuel, W.R., & Geil, K.E. (2012). Research-Practice Partnerships at the District Level: A New Strategy for Leveraging Research for Educational Improvement. A White Paper prepared for the William T. Grant Foundation.
Conaway, C.L. (2013). The problem with briefs, in brief. Education Finance and Policy, 8(3), 287-299.
Cunningham, D.H., & Wyckoff, J. (2013). Policy makers and researchers schooling each other: Lessons in educational policy from New York. Education Finance and Policy, 8(3), 275-286.
Fixsen, D., Blase, K., Horner, R., Sims, B., & Sugai, G. (2013). Scaling-Up Brief. University of North Carolina: State Implementation & Scaling-Up of Evidence-Based Practices.
Goertz, M.E., Barnes, C., & Massell, D. (2013). How state education agencies acquire and use research in school improvement strategies (Consortium for Policy Research in Education Policy Brief RB 55). Philadelphia, PA: University of Pennsylvania.
Malin, J., & Lubienski, C. (2013). Whose Opinion Counts in Educational Policymaking? Current Issues in Education, 16(2).
Penuel, W. R., Fishman, B. J., Cheng, B. H., & Sabelli, N. (2011). Organizing Research and Development at the Intersection of Learning, Implementation, and Design. Educational Researcher, 40(7), 331-337.


Page Left Blank Intentionally


Research-Practice Partnerships
A Strategy for Leveraging Research for Educational Improvement in School Districts

A white paper prepared for the William T. Grant Foundation January 2013

Cynthia E. Coburn, Northwestern University William R. Penuel, University of Colorado, Boulder Kimberly E. Geil, Independent Researcher


ABOUT THE AUTHORS Cynthia E. Coburn, Ph.D., is a professor in the School of Education and Social Policy at Northwestern University. Dr. Coburn received the Early Career Award from the American Education Research Association (AERA) in 2011. She is on the editorial board of American Educational Research Journal, Educational Evaluation and Policy Analysis, and American Journal of Education. William Penuel, Ph.D., is a professor of educational psychology in the School of Education at the University of Colorado at Boulder. He also serves as an associate editor of the Social and Institutional Analysis section at the American Educational Research Journal and is on the editorial board for Teachers College Record, American Journal of Evaluation, and Cognition and Instruction. Kimberly E. Geil, Ph.D., is an independent researcher whose work has focused on teacher engagement and the benefits of professional development. She received her doctoral degree from the University of Colorado at Boulder.

Coburn, C.E., Penuel, W.R., & Geil, K.E. (January 2013). Research-Practice Partnerships: A Strategy for Leveraging Research for Educational Improvement in School Districts. William T. Grant Foundation, New York, NY.

© copyright 2013 William T. Grant Foundation

William T. Grant Foundation 570 Lexington Avenue, 18th floor New York, NY 10022-6837 212.752.0071 www.wtgrantfoundation.org

i


ABOUT THE FOUNDATION Since 1936, the William T. Grant Foundation has been committed to furthering the understanding of human behavior through research. Today, the Foundation supports research to understand and improve the settings of youth ages 8 to 25 in the United States. We are interested in studies that strengthen our understanding of how settings work, how they affect youth development, and how they can be improved. We also fund studies that strengthen our understanding of how and under what conditions research is used to influence policies and practices that affect youth. Important settings include schools, youth-serving organizations, neighborhoods, families, and peer groups.

FOREWORD In keeping with our interest in the use of research, the Foundation has developed an interest in learning more about the burgeoning community of research-practice partnerships. These partnerships shift the predominant producer-push dynamic of research to practice. Instead, they foster reciprocal interaction in which practice informs research and vice versa. At their best, these partnerships facilitate the development of more relevant, actionable research and its use within the practice community. We want to follow these partnerships to see where they go and what implications or lessons they have for connecting research and practice in order to improve the lives of young people. The goal of this paper is to assess the predominant types of research-practice partnerships in education (and the benefits and challenges of each) so those seeking to form or fund such a group can make informed decisions about how best to do so.

ii


TABLE OF CONTENTS
Introduction ......................................................................................... 1
What is a Research-Practice Partnership at the District Level? ............ 2
Three Types of Research-Practice Partnerships .................................... 4
Challenges to Developing and Maintaining Research-Practice Partnerships ... 14
Implications ........................................................................................ 20
Concluding Thoughts .......................................................................... 25
End Notes ............................................................................................ I
References ........................................................................................... IX

iii


INTRODUCTION Pressures are increasing on educational policy and practice to use research to guide improvement. In recent years, federal programs such as No Child Left Behind, Reading First, and Race to the Top have all provided strong incentives for the use of research in decision-making. Educators, however, may not have the skills or the time to produce, gather, and apply research to meet their improvement goals. The available research may not be useful or credible because researchers are not always focused on answering questions relevant to school districts’ most pressing needs. And, too often, research findings aren’t accessible to educators or arrive too late to make a difference.

Recently, though, there have been concerted efforts to forge new and different kinds of relationships between researchers and practitioners. School districts across the country are developing a new kind of partnership with researchers. These research-practice partnerships are long-term collaborations, which are organized to investigate problems of practice and generate solutions for improving district outcomes. Advocates argue that educators will better understand the research and its implications because they help develop it and have ready access to the researchers. Partnerships may also produce research and innovations that are more useful to practice because they are rooted in districts’ needs. District leaders are likely to see the research that partnerships produce as more credible because studies are done with local students and take into account local conditions. All these factors may increase the likelihood that districts will use the research findings and tools produced in the partnerships to support their efforts to improve outcomes for children and youth.

Evidence is beginning to accumulate in support of these claims. When research-practice partnerships develop new educational innovations, districts adopt these new innovations in ways that can result in changes in teacher and administrator practice and increased student learning.1 Research-practice partnerships have also developed promising track records in fields as diverse as health care, social services, urban planning, and community policing.2 There remains much to learn, though. For example, there is mixed evidence that districts involved in research-practice partnerships use research more consistently in their decision-making.3 We also know little about the different forms partnerships can take, particularly what distinguishes one kind of partnership from another and how that matters for the work. In this white paper, we will:

1. Define research-practice partnerships.

2. Identify the major types of partnerships that operate at the district level.

3. Describe challenges partnerships face and strategies for addressing these challenges.

To do so, we draw on a review of existing research and interviews with participants in research-practice partnerships across the country.4 Throughout, we illustrate the work of research-practice partnerships with portraits of partnerships in action. Research-practice partnerships are bold new initiatives. We are beginning to learn about the challenges they encounter and strategies they use to ensure productivity and success. We want to provide insight into the strategic trade-offs partnerships face and the resources they need to be successful. We hope that this paper can be a guide for those seeking to develop or maintain research-practice partnerships as well as for funders of such partnerships.

1


WHAT IS A RESEARCH-PRACTICE PARTNERSHIP AT THE DISTRICT LEVEL? People use the term “partnership” to refer to many different things: consulting relationships; university-school partnerships in which local schools send prospective teachers to a university to be trained and the university, in turn, places student-teachers in local schools; traditional research projects in which studies or interventions take place in districts with limited participation by district personnel; etc. The term is so widely used, in fact, that it has come to have little meaning. We use the term “research-practice partnerships” to denote something very specific. We define research-practice partnerships at the district level as: “Long-term, mutualistic collaborations between practitioners and researchers that are intentionally organized to investigate problems of practice and solutions for improving district outcomes.” Research-practice partnerships differ from the conventional ways researchers and district leaders work together in five significant ways.

Research-Practice Partnerships:
1. Are long-term,
2. Focus on problems of practice,
3. Are committed to mutualism,
4. Use intentional strategies to foster partnership, and
5. Produce original analyses.

Long-Term: In research-practice partnerships, researchers and district leaders commit to form and maintain a long-term working collaboration. These open-ended commitments involve more than a single consulting agreement or grant. The work can span a few years, or, as is true for some partnerships, more than a decade, shifting focus as the work develops over time.5 The long-term nature of partnerships contributes to several key elements of their success. First, time invested in a partnership can help develop trust.6 Trust developed over time can help mitigate the inevitable bumps in the road. Working together over time also enables partners to take on larger questions and explore issues in depth.7 One district leader said: “the partnership … is a long-term partnership, [so] I think we can go deep on some really meaty questions with them.” Long-term partnerships also enable research organizations to serve as a repository for institutional memory, which can be important in districts with high rates of leadership turnover. They can be a source of stability and continuity in districts characterized by frequent change. One district leader in a partnership with the University of Chicago Consortium for Chicago School Research (CCSR) explained:

“I think an organization like the Consortium can be the keeper of a lot of that history. Especially because of their longevity and stability in the district, they bring a point of view that is really unique and incredibly helpful.”

2


Focused on Problems of Practice: Research-practice partnerships start with a focus on problems relevant to practice.8 These are issues and questions that districts find pressing and important. They can involve student learning, classroom instruction, or how to organize a district for improvement. By starting with a problem of practice, research priorities are set in response to district needs, rather than to address gaps in existing theory or research. Starting with district needs increases the likelihood that district leaders will find research useful and apply it to their ongoing work.9

Committed to Mutualism: Research-practice partnerships are characterized by a commitment to mutualism—sustained interaction that benefits both researchers and practitioners. A researcher described this:

“ ”

Everything from whose questions we pursue, how we define those questions, what methodologies, the authorities and control over the activity are much, much, much more shared … [There has to be] a common goal, common aims, shared values, equal authority, and real work to do.

Typically, when researchers and practitioners work together, one group or the other holds the authority for setting the agenda. The researcher may develop a policy or an intervention and then attempt to persuade schools and school districts to adopt and implement it. Or, the district may hire researchers as consultants to do a specific piece of work or evaluation, which the district will then use or not as it sees fit. In research-practice partnerships, by contrast, the focus is jointly negotiated and responsibility for how the work unfolds is shared.10 Mutualism is important because it helps ensure that different perspectives—practitioners’ and researchers’—contribute to defining the focus of the work that research-practice partnerships do. All parties share ownership and are able to learn from one another. By working closely with researchers, district leaders can clarify their goals and gain insights into the implementation of district policies and programs. By working closely with practitioners, researchers can gain a deeper understanding of classrooms, schools, and districts and just what it might take to make change. Use Intentional Strategies to Foster Partnership: Research-practice partnerships use intentional strategies to organize their work with one another.11 For example, the Research Alliance for New York City Schools and the New York City Department of Education have a formal data-sharing agreement detailing how they conduct research together. Partnerships have developed strategies for supporting mutualism in other aspects of the work as well—negotiating the focus of joint work, uncovering key drivers for improvement, structuring co-design processes, and sharing and interpreting findings from research studies. Some partnerships have specific structures that bring together a broad range of stakeholders to review and make sense of research results. One of the leaders of the University of Chicago Consortium on Chicago School Research (CCSR) has developed a network of principals focused on the use of data to inform school improvement. Initially the network examined “on-track indicators,” which serve as an early-warning system to identify students at risk of dropping out. CCSR staff members hold breakfast meetings with principals in this network to help them interpret the ontrack indicators and, later on, other data. A CCSR researcher described the process:

3


“The way to get to the solution is to have the evidence and the opportunities for the people working on the problem to really talk about it, internalize it, understand it ... Out of that process, you start to develop ideas about what to do.”

Produce Original Analyses: Research-practice partnerships go beyond the focus of many current organizations on making data accessible to district leaders.12 The partnerships instead produce original analyses of data to answer research questions posed by the district. The Baltimore Education Research Consortium, for instance, analyzed the relationship between early-elementary achievement and attendance in that city’s pre-kindergarten and kindergarten programs. The motivation for the study was to understand the effects of early chronic absence on later outcomes. Other partnerships collect their own data as part of studies of programs, interventions, or reform strategies the district is pursuing.13 For example, the John W. Gardner Center for Youth and Their Communities collected its own data to help the Redwood City 2020 partnership study the association between after-school programming and youth development outcomes important to the partnership. The Strategic Education Research Partnership (SERP) has done extensive data collection and analysis on the development, impact, and scale-up of Word Generation, a middle school program co-designed by researchers and practitioners in their Boston site, which builds academic language necessary to comprehend subject area texts.

THREE TYPES OF RESEARCH-PRACTICE PARTNERSHIPS While research-practice partnerships share a long-term focus on problems of practice, use intentional strategies to support their commitment to mutualism, and prioritize original analysis, they differ in the ways they go about their work. We have identified three distinct kinds of research-practice partnerships that are currently active in school districts.

Three Types of Research-Practice Partnerships
1. Research Alliances
   a. Cross-sector research alliance
   b. District-focused research alliance
2. Design Research
3. Networked Improvement Communities (NICs)

Research Alliances A research alliance is a long-term partnership between a district and an independent research organization focused on investigating questions of policy and practice that are central to the district. These alliances negotiate research questions with districts and other youth serving organizations, conduct the research, and funnel findings back to the district, the community, and other stakeholders with the goal of informing policy and improving practice in the district. Perhaps the best-known research alliance is the Consortium on Chicago School Research, which was formed in 1990 as a partnership between researchers from the University of Chicago, Chicago Public Schools, and other local organizations. In recent years, the perceived success of the Consortium has spawned the development of research alliances in cities across the country.

4


THE JOHN W. GARDNER CENTER FOR YOUTH AND THEIR COMMUNITIES AND REDWOOD CITY 2020 The John W. Gardner Center for Youth and Their Communities (JGC) at Stanford University is an example of a cross-sector research alliance: one that creates longterm partnerships between researchers and organizations in multiple sectors of a community. Its goal is to improve the lives of youth by conducting research, developing leadership, and effecting change. Its work is rooted in the principles of community youth development—young people prosper when their community prospers, and vice versa. JGC works in several communities and, in each, they work with school districts, city and county public agencies, and community-based organizations that serve youth. One of JGC’s longest partnerships is with Redwood City, California. It has spanned more than a decade and supported a major, community-wide effort focused on youth development called Redwood City 2020 (RWC 2020). RWC 2020’s mission is to help local children and families be safe, healthy, and nurtured in a stable, caring environment. The partnership aims to improve outcomes by, as one partner put it, “build[ing] the community’s capacity to better meet the needs of the youth.” JGC’s role in the partnership is to investigate the relationship between different kinds of youth experiences (e.g., participation in school and community programs) and the outcomes that Redwood City 2020 targets. This research then informs partners’ ongoing work. JGC also developed and maintains the Youth Data Archive (YDA), a comprehensive resource that links data from youth service providers, government agencies, communitybased organizations, and schools in several counties in California. JGC staff use this archive to follow youth’s experiences across a wide variety of settings, including afterschool programs, juvenile justice, parks and recreation, and health and social services. A core governing principle of this partnership is that research questions must be agreed upon by the partners. This results in intensive collaboration at the beginning and end of each study. Each year, the coordinating committee addresses the questions: What is our focus? What questions should we pursue?

Do we have the data or not? Then, JGC researchers work with different agencies and their colleagues internally to find data sources to answer the questions. There is generally less interaction between JGC and its partners as data collection and analysis proceeds. Near the end of the process, however, JGC staff work with partners to make sure that the findings ring true. As one partner explained:

“When there’s a draft of a report, [the JGC researcher] sends this draft to us so we can look at it and ask questions, look back at the data, if there’s something that really sounds off. As a matter of fact, we do that even before the data is all analyzed. Once [he] is able to put the data all together, he would shoot me an email with a chart and say, “Look at this and see if there’s any major discrepancy”… Then we work on the draft. We have discussions about the draft, get to ask questions, get to question some of the assumptions that are being made.”

Community partners view JGC’s analyses as important and valuable resources for bringing to light issues that no single agency could investigate on its own and for evaluating community programs. One district leader involved in Redwood City 2020 described work with the JGC in this way:

“To me, it’s a pretty amazing relationship. I was testifying before a Senate subcommittee last year in Sacramento. It was on community schools and a comment was made that ‘Well, you really can’t measure community schools.’ I was able to share some of the research that, through the Gardner Center, shows the efficacy of community schools.”

For more information about Redwood City 2020 and the John W. Gardner Center, see Case Study I.



There are two types of research alliances: (1) those that construct partnerships with youth-serving organizations across multiple sectors (education, health and human services, youth development) in a given region and (2) those that work with local school districts. Both share certain characteristics.

Place-based: Research alliances form partnerships with specific school districts or regions and focus research on issues relevant to local policy and practice. Focusing on a particular place over time enables researchers to develop a thorough understanding of the district and the community. A director of one research alliance explained:

I think there is a very high premium placed on what’s relevant and meaningful in the specific context of a [district] as big and eclectic and diverse as [the district we partner with] is. The place-based nature there really makes a big difference. A national organization … has to make sure that it’s building relationships across the country, and [is going to] have multiple districts or multiple states or multiple cities participating in a study. [As a result] it can’t be as deeply committed to understanding and working within a specific context.

When research takes place within a single district, political pressure on districts and the researchers is often high.14 This may lead to the production of more timely research. In addition, policymakers may pay greater attention to the findings, since the research is relevant to local education stakeholders' concerns about which investments are working and which are not.

Focus on local policy and practice first: Most research alliances' primary goal is to produce research that informs local policy and practice. Research alliances have investigated district policies related to a range of pressing issues. For example, the Consortium on Chicago School Research studied college readiness in local high schools. The Research Alliance for New York City Schools investigated NYC's turnaround schools. The San Diego Education Research Alliance studied the effects of introducing diagnostic testing on student mathematics learning in San Diego. And, the Baltimore Education Research Consortium investigated the quality of classroom instruction in district elementary schools.15 Though secondary to informing local policy and practice, contributing to the national debate on educational policy issues is also a goal for many research alliances. They circulate findings beyond their local settings, creating policy briefs to inform the national conversation and publishing their findings in academic books and journals. To meet both of these goals, most research alliances maintain academic standards of quality, particularly with strong study design and rigorous methods. They argue that this lends external credibility to the work, which then adds to researchers' internal credibility with district stakeholders.

Develop and maintain data archives: Most research alliances develop and maintain longitudinal data archives. By developing data-sharing agreements with local districts and other youth-serving agencies, research alliances are able to assemble large datasets that they use to do longitudinal analysis of issues facing districts or regions.16 The John W. Gardner Center for Youth and Their Communities has linked data from a local elementary school district, high school district, and human services agency to investigate who is chronically absent and truant, the extent of the problem, and the consequences of missing school for a range of in-school and out-of-school outcomes.



RESEARCH ALLIANCE FOR NEW YORK CITY SCHOOLS

The Research Alliance for New York City Schools is an example of a research alliance that works primarily with its local school district. Since 2008, the Research Alliance has worked with the New York City Department of Education (DOE) and other key stakeholders in New York City to "advance equity and excellence in education by providing non-partisan evidence about policies and practices that promote students' development and academic success." A leader of the Research Alliance elaborated: "Our goal is really to conduct rigorous studies of questions that matter to policymakers, practitioners, and other stakeholders in New York City schools."

Housed in New York University's (NYU) Steinhardt School of Culture, Education, and Human Development, the Research Alliance's work is guided by a governance board that includes representatives of the DOE, NYU, key stakeholder groups in the district, and community-based organizations. This governing board sets the research agenda, which recently has centered on four issues: (1) high school achievement, attainment, and post-secondary preparation; (2) achievement and development in the middle grades; (3) contexts that support effective teaching; and (4) data use for practice and policy.

Since its inception, researchers at the Research Alliance have worked with the district in three main ways. First, they have conducted evaluation studies. Recent studies have investigated how teachers use the city's Achievement Reporting and Innovation System (ARIS), students' transitions in and through middle school, and the impact of various organizational conditions in small schools of choice on student outcomes. Second, they have developed a longitudinal data archive. A district administrator explained:

"We have a very broad and deep data-sharing relationship ... We've arranged to give them access to just about everything that we are functionally and legally allowed to give them, with the purpose of allowing the Research Alliance to do independent research on New York City schools."

The extensive data archive enables the Research Alliance to draw on a broad range of data to address its research questions and be flexible and responsive to the district's requests for analysis on short notice. Third is descriptive and formative work, including a collaboration between researchers and DOE administrators to improve the measures of school environment on the DOE's annual survey. This work is consequential since individuals at the school and district level use this survey to inform their improvement work. It is also important because measures from the survey, along with attendance figures, comprise up to 10–15 percent of the grade that schools get on their annual school progress report.

As is typical for research-practice alliances, the Research Alliance works most intensively with the local district at the early and late stages of a study. Both the Research Alliance and the DOE can initiate a study. Researchers might identify a funding opportunity and approach administrators at the DOE to discuss questions about the focus of the initiative. The DOE may also approach the Research Alliance to evaluate a new initiative. After collecting and analyzing the data, the researchers return to the DOE to discuss the findings before their release to the public. This advance notice allows the Research Alliance to solicit feedback and confirm accuracy, and allows district administrators to prepare a response or reaction to the release. This is especially important when the research involves issues that are politically charged.

While the primary goal of the Research Alliance is to inform policy and practice in New York City, it also seeks to inform policy debates nationwide. So, it addresses questions that have been studied in other districts in order to extend research findings to a new and different context. The Alliance also disseminates its findings widely in white papers and policy briefs. The district feels that it benefits from having an outside party providing feedback on its work:

"I think the way we see it in the [DOE is that] although the work that they're doing is not on our behalf in a 'contractor' sense, it is on our behalf in that it's really valuable to us as policymakers to find out the answers to the questions that they're posing. We have a real interest in ensuring that they have the [ability to] do their projects and produce really solid research about our schools."

For more information on the Research Alliance for New York City Schools, see Case Study II.



Distinct roles for researchers and practitioners: Research alliances maintain distinct and fairly conventional roles for researchers and practitioners. They rarely involve district personnel in data collection or analysis. Deliberations around policy responses to the research rarely involve researchers. Some research alliances argue that practitioners, not researchers, should come up with solutions. The researchers' role is contributing information to inform districts' problem-solving efforts or "creating conditions so people on the ground are given the incentives, resources, and feedback they need to search for solutions."17

Collaborate primarily at the beginning and end of the research process: Given these distinct roles, collaboration happens most intensively at the start and end of a given study. At the start, researchers and district administrators work together to negotiate the focus of the research. While conducting studies or analyses, however, alliance researchers maintain independence from district staff so as not to compromise the objectivity of research findings. Alliance researchers argue that this independence is also appropriate because researchers are in the best position to make decisions about research methods. The two parties come together again before researchers release their findings to the public so that district leaders have the opportunity to respond and react to the substance of the report, and have time to prepare for the public's response.

Design Research

Design research is a form of educational research that is similar to engineering research. In design research, the aim is to build and study solutions at the same time in real-world contexts. It usually focuses on developing and testing instructional activities and curriculum materials, while investigating how they can best support student learning.18 In recent years, some design researchers have begun to focus on designs for improving the implementation of instructional activities and curricula at scale. To do so, they have forged partnerships with school district leaders to design and test strategies for helping school districts implement these new innovations.19 For example, the Strategic Education Research Partnership (SERP) has developed infrastructures to support collaborations between researchers and school districts to design, study, and scale innovations in teaching and learning. SERP focuses on designing innovations for the classroom, large-scale impact studies in districts, and working with central offices to create conditions that foster scale-up. Design-research partnerships that work at the district level typically share the following characteristics.

Place-based: As with research alliances, design-research partnerships are usually focused on long-term, in-depth work with a single district. When researchers are involved with multiple districts, each district's priorities shape the work in significant ways, resulting in different designs and partnership trajectories in each district. For example, in the Middle-School Mathematics in the Institutional Setting of Teaching (MIST) project, researchers at Vanderbilt University partnered with four different school districts. However, each partnership followed a different trajectory, depending upon the unique needs and context of each district.

Focus on informing practice and research: Design-research partnerships typically have two goals of equal importance. They aim to develop materials and instructional approaches that can be implemented in classrooms, schools, and districts.20 At the same time, they want to advance research and theory. For example, researchers in the MIST program are developing an "actionable theory of change" for bringing about instructional improvement at scale within middle school mathematics.21 MIST uses the theory to guide selection of strategies for improving instruction at scale, and partners refine the theory in light of evidence related to the success of those strategies, helping guide changes to practice. At the same time, the MIST researchers see the project as an opportunity for developing more sophisticated and practical theories of learning



BELLEVUE SCHOOL DISTRICT AND UNIVERSITY OF WASHINGTON

This design-research partnership involves two complementary research groups from the University of Washington, both of which have been working in the Bellevue School District to redesign elementary science units. The partnership is currently funded by the National Science Foundation, and there are three co-principal investigators—one from each research team and a third representing the school district. The three have joined to redesign, deliver, and evaluate elementary science units that incorporate both student choice and culturally relevant teaching strategies.

As with most design-research partnerships, this one is long-term. The partners have maintained the work since the mid-2000s across multiple grants and gaps in funding. The partnership is also place-based, with a focus on the Bellevue School District. At present, the partnership has two goals: (1) to develop curriculum units and professional development that are attuned to local issues and contexts and (2) to contribute to research knowledge by developing theories of and evidence related to individual, group, and organizational learning.

The partnership has systems in place to make sure that the design process incorporates diverse perspectives and expertise. A weekly steering committee consisting of researchers and district staff in curriculum and science works through all issues related to the grant. A subset of this team is charged with redesigning each unit. Teachers are actively involved in redesign efforts, and researchers believe the teachers have much expertise to contribute. According to one member of the university's research team:

“I certainly respect what my partners know because ... I can’t do that. I’m not a practitioner. I don’t have that expertise and ... I’m lousy at writing lesson plans. I’ve tried, but I can’t make it right for teachers, and we have to have that input. And ... I think our partners respect what we know about ... learning and research.”

Although only a few teachers are involved with the design process and writing curriculum materials, all teachers who use the curriculum provide feedback—through the professional development process—which the design team then uses to improve the unit. Professional development was created and is conducted primarily by the partnership's district staff.

The partnership also does research, which it integrates into the design process. For example, the initial research focused on students' inquiry skills and content knowledge, using a combination of district- and researcher-developed assessments. Researchers are also studying the degree to which students identify more with science as a result of participating in the units. And, the team is studying the roles of professional development and curriculum in supporting more student-centered teaching in science.

Most recently, the partnership has focused on redesigning science units and collecting data related to teaching and learning. Researchers have collected many hours of video, student and teacher interviews, student assessments, and survey data. Data analysis has begun in earnest, and early evidence suggests that students are indeed participating more actively in the learning process in activities that are personally relevant to them. One district staff person elaborated:

"The level of excitement and the ease at which students use evidence to support thinking in those Go Publics [final presentations of learning] is one indicator to me, as a former fifth grade teacher and a science curriculum developer, that there is some real buy-in. There is something to be said for students being able to really become excited about and at ease talking about supporting their claim with evidence and reasoning. That is pretty powerful, [and it is something] students don't really have an opportunity to do in the FOSS units as they're written."

For more information about the partnership between the Bellevue School District and the University of Washington, see Case Study III.



along with the organizational structures necessary to support these theories. They publish their findings in publications that target researchers as well as those more useful to practitioners.

Emphasize co-design: Central to design research is the principle of co-design. Co-design is a highly facilitated process that engages people with diverse expertise (e.g., research, curriculum, professional development, teaching) in designing, developing, and testing innovations. These innovations can include curriculum materials, professional development to better equip school and district leaders for supporting classroom-based reforms, and new approaches for system-wide change.22

Collaborate at every stage in the process: District leaders and researchers work together to define the challenge or problem to be addressed. They also work together to develop design parameters or requirements for instructional activities and curriculum, test them in classrooms, and assist with revisions. Close collaboration in the context of co-design represents one way to bring diverse kinds of expertise to bear on persistent problems of educational practice. Participation by teachers and educational leaders is especially important, because they bring insight into the needs and interests of students and the fit of an innovation with current practices and curriculum materials.

Networked Improvement Communities

The software engineer and inventor Douglas Engelbart coined the term "networked improvement communities" (NICs) to apply to groups engaged in collective pursuits to improve a capability, such as that of schools to provide effective teaching and learning opportunities to students.23 Research-practice partnerships using the NIC structure draw on Engelbart's ideas and improvement research from the health care field. NICs are networks of districts that seek to leverage diverse experiences in multiple settings to advance understandings about what works where, when, and under what conditions. They draw on research techniques developed from improvement efforts in health care to engage researchers and practitioners in rapid cycles of design and redesign.24 NICs use these cycles to develop new approaches that address well-defined problems of practice or adapt existing research-based practices to local conditions. Networked improvement communities in education have the following characteristics:

Involve networks of schools, districts, or universities: A core feature of NICs is that they are formed as networks that are not tied to a single district or community. Though districts and researchers from the local area may be partners, a NIC forms to address a problem that is common to many different communities. The University of Washington's Developing Networked Improvement Communities program, for example, is using the NIC model to foster high-quality mathematics and science teaching, connect schools with each other, and spread best practices around the state. In this and other NICs, project leaders are interested in the contrasting ways that different sites implement solutions to common problems. They use this information strategically to improve the scalability of solutions and reliability of implementation. Tony Bryk from the Carnegie Foundation for the Advancement of Teaching explains:

"In any human resource intensive enterprise, such as schooling, variations in performance are the natural state of things. We have ample testimony to this from decades of educational innovations. That a practice, program, or service can work is of little value unless we discern how to make it work at scale in the hands of many different individuals working under diverse circumstances."25



CARNEGIE FOUNDATION FOR THE ADVANCEMENT OF TEACHING AND THE BTEN PROJECT

The Carnegie Foundation for the Advancement of Teaching has a long tradition of developing and studying ways to improve teaching practice. The current president has put organizing NICs at the center of Carnegie's work. At present, the Foundation is cultivating three major NICs, two focused on community college developmental mathematics, and a third on new teacher effectiveness and retention. We will discuss the third NIC—Building a Teacher Effectiveness Network (BTEN). All three initiatives are in their early stages and are far less mature than the other partnerships featured in this paper.

BTEN focuses on developing and retaining teachers in their first three years. It is a network of different institutional partners, including the Carnegie Foundation, the Institute for Healthcare Improvement (IHI), the American Federation of Teachers, New Visions for Public Schools, the Austin Independent School District, and the Baltimore City Schools. Carnegie staff act as the primary facilitators of the work, guiding the overall improvement process.

The NIC has adapted a model of improvement research that IHI developed in previous work to define problems and identify root causes. One tool is a "driver diagram" that describes the objectives and a set of "primary drivers" or levers for change that, in the case of BTEN, is hypothesized to improve the effectiveness and retention of new teachers. Each district has selected a primary driver on which to focus initially that reflects its local constraints and opportunities. One district, for example, is focusing on improving the quality and coherence of feedback from principals, administrators, and coaches to new teachers. The NIC provides opportunities for districts to share ideas with one another in order to facilitate cross-district learning.

Each district also engages in PDSA (Plan, Do, Study, Act) cycles that are a hallmark of IHI improvement research. The district that focused on feedback to new teachers, for example, used a series of PDSA cycles to develop a protocol for a feedback interaction involving a principal and teacher. Each PDSA focused on testing a small change to the protocol; discoveries from one informed testing done in subsequent PDSAs.


Partners collect data on BTEN at two levels—for each district and for the network as a whole. At the district level, the improvement teams collect data as part of the Do phases of PDSA cycles. For example, in the feedback PDSA, the team collected information on how many steps of the protocol were executed reliably, which steps were modified, and in what ways. The purpose of these data was to help the team refine their change strategies and test their potential before spreading them more widely across the system. They also routinely collected data to determine whether changes that were tested and refined led to the outcomes the team sought to achieve. The team also collected data to make sure that desired improvements did not cause any unexpected harm.

Partners also plan to collect data in the network as a whole; specifically, they will collect "on-track" or leading indicators, such as job satisfaction and commitment to the profession, to determine if work on the primary drivers is having the anticipated impacts. These data are also intended to serve as an early-warning system to identify teachers who may be at risk of leaving. In addition, the partners will collect lagging indicators to measure progress toward the ultimate aims, such as teacher retention.

While it is too early to judge the success of this work, participants in districts said they see value in the approach. Though they find the work time-consuming, one district leader described BTEN as "hugely valuable because we're doing so much work around measurement and thinking about assessing teacher performance, what that looks like, and how to use it." This leader also said he appreciated Carnegie staff, who "are taking this construct of research and practice and really focusing on creating venues where practitioners are coming forward with ... technical experts to create something very practical and to make sure the research community is responding to real-world implementation issues—questions from practitioners, designers, people like me who are building these systems and want tools."

For more information about the BTEN project, see Case Study IV.



Use systematic methods for continuous improvement: NICs in health care have pioneered an approach to research called “improvement science.” Improvement science is a method of research and development focused on translating research findings into practice in real-world settings.26 In health care, improvement science focuses on helping medical professionals improve their day-to-day work by drawing on available evidence about what works and learning from adaptations of research-based practice. Some NICs, including those supported by the Carnegie Foundation for the Advancement of Teaching, use a technique common to improvement science that emphasizes cycles of Plan, Do, Study, Act (PDSA).27 In these cycles, the NICs decide on a small change to be tested, define the steps needed to test it, and determine the measures that will be used to gauge its success (Plan). Next, they carry out the plan (Do), analyze data collected to see if their predictions were borne out (Study), and determine what changes need to be made for the next cycle (Act). PDSA cycles are meant to happen rapidly—as short as two weeks—to facilitate rapid improvements to designs that are better timed to the needs of practice settings than typical research.28 In principle, rapid cycles make it possible to test and refine changes and then expand to multiple sites within a single school year. Rapid cycles allow teams to uncover major problems with a planned change quickly, as well as to get things “mostly right” before taking a change to scale. According to one leader in a NIC, the idea is to:

"Try it with one person in one place, a couple of times. Then, take it up in five places to study and learn the impact of context. Once you know that, and have tailored it and differentiated into a collection, expand to 25 places, study how to make it a permanent change. Then take it to 'spread.'"

Other NICs follow slightly different methods for continuous improvement. For example, the Strive Network was formed to improve regional outcomes for young people from "cradle to career." It is composed of foundations, employers, district superintendents, and universities in select cities. Strive uses data to "improve, rather than prove." Local communities select priority areas for their work, develop strategies together in those areas, and monitor outcomes regularly to gauge progress and refocus efforts when needed. Network staff also provide consultation to help community collaboratives identify, adopt, and scale practices. For example, they help sites create detailed action plans for adopting and adapting effective practices and for supporting implementation.29

Put researchers and district staff in non-traditional roles: In NICs, researchers and practitioners assume roles that depart significantly from their usual work. Practitioners do the primary data collection and analysis related to small tests of change. Researchers act principally as facilitators, guiding the network members through the improvement process. Both researchers and practitioners are likely to need time to adjust to their new responsibilities, which may conflict with those they have at their respective organizations. One district leader described an issue faced by a principal involved in the work of a networked improvement community:

"To invest the time, when he's pulled in so many directions, to stop and do data gathering is proving to be really challenging. Even when he schedules it, things change, a teacher may be out—things routinely happen to throw it off."



Primary focus on developing local capacity: NICs aim to improve schools’ and districts’ capacities to engage in a sustained, disciplined effort at improvement. For example, staff at the Carnegie Foundation for the Advancement of Teaching use the PDSA cycle to shift business-as-usual in districts. Districts often implement new programs district-wide before they have evidence that they are effective. By starting small and systematically testing and refining a small change in several different settings, Carnegie hopes to foster changes that, in the words of one staff member, become “permanent” and “embedded in the system.” The Strive Network refers to its efforts to build capacity as promoting “civic infrastructure.” According to the Network:

"'Civic infrastructure' is not a program, but a way in which a community comes together around a common vision and organizes itself to identify what gets results, improves and builds upon those efforts over time, and invests the community's resources differently in order to increase impact."30

Some NICs, like those related to the Carnegie Foundation, do not place a priority on contributing to the research literature or theories of learning or change. In fact, although the approach is highly disciplined, improvement science would not in many instances meet the standards of research required for publication in research-oriented journals. Ultimately, these NICs are more focused on developing an approach to improvement that fosters the capacity of network participants to improve their own outcomes over time than on conducting original research.

Emerging Types of Partnerships

We have identified the three main types of research-practice partnerships that are active in education in the United States. Partnerships, however, can and do evolve, and it is likely that new forms will develop over time. For example, in 2011, the Institute of Education Sciences created new guidelines for the Regional Educational Laboratories (RELs), which it funds to provide research and support to school districts and states. The RELs help states and districts improve student outcomes by better using data and analysis to guide decisions on policy and practice. The new guidelines required the RELs to adopt several qualities of research-practice partnerships, including forming long-term relationships with school districts and focusing on a few core problems identified by district staff in order to make sound, evidence-based decisions to improve outcomes. The hope is that these alliances will also build the capacity of states and districts to conduct research on innovations.31

It is not yet clear what form these newer partnerships will take. It is possible that, over time, the RELs may evolve into conventional research alliances, or another of the current types of research-practice partnerships. Alternatively, the RELs and other organizations that are currently designing partnerships may develop into something completely new. It will be important to look to these organizations for fresh ways of configuring the roles and relationships of researchers and practitioners in support of district improvement.

[Diagram: the three types of research-practice partnerships—Research Alliances, Design-Research Partnerships, and Networked Improvement Communities]


CHALLENGES TO DEVELOPING AND MAINTAINING RESEARCH-PRACTICE PARTNERSHIPS

Building and maintaining research-practice partnerships can be challenging. Working across institutional boundaries to produce high-quality research and innovation that is useful to schools and districts is not easy, particularly during times of increased policy pressure and reduced resources. Here, we tackle common challenges that research alliances, design-research partnerships, and networked improvement communities face and describe the strategies experienced partnerships have taken to address them. Understanding how and when difficulties arise is important to inform efforts to design, fund, and nurture research-practice partnerships over the long term.

Bridging the Different Cultural Worlds of Researchers and Practitioners

Building and maintaining successful research-practice partnerships can be challenging because researchers and practitioners come from different cultural worlds. They have very different ways of working and incentive systems.32 For example, district leaders feel a strong sense of urgency; they want solutions quickly so that they can put innovations or policies in place to meet students' needs now. By contrast, research proceeds slowly, and researchers are often uncomfortable recommending action in the absence of a strong research base. Researchers and practitioners may also have different priorities and agendas.33 This can lead to divergent ideas about the steps a district should pursue. For example, one district leader involved in a research-practice partnership lamented the fact that researchers often promoted solutions that were not usable:

"Some of the recommendations are just not as realistic as I would hope. The one challenge would be the conflict, or potential conflict, between the best [approach from researchers] and the most realistic [approach from us]."

This is further complicated by the status accorded to researchers in the broader society, which can foster resentment by practitioners and hubris among researchers. Some partnerships have developed strategies for finding common ground. For example, NICs rely on formal processes for coming to agreement on the key causes of problems and drivers of potential solutions. These processes help to level the playing field, ensuring that all voices are heard and all viewpoints considered. Co-design teams working in design-research partnerships have sought to repurpose district-adopted materials, rather than create completely new materials.34 The results build upon materials that are familiar to teachers and valued by districts, creating common ground.

Developing and Maintaining Trust

Developing and maintaining trust is a persistent challenge faced by research-practice partnerships. Research organizations have not always enjoyed positive relationships with schools. Research often seems "evaluative" to practitioners, which can lead to defensiveness and apprehension about collaborating with researchers and letting them into their schools. Many district leaders have experienced instances in which researchers make promises about the benefits of their research but leave before following through on those promises or even sharing their findings.35



Yet, trust is essential to the compromises required for partnerships to function effectively; negotiating problems of practice and developing solutions always involves a give and take. Trust is also needed if partners are using data collected by districts, as the data frequently relate to outcomes for which schools and districts are held accountable.36 And, a negative finding about a program or policy's impact can appear in the local media, creating a difficult situation for the district.

Some research-practice partnerships have developed trust by asking participants to follow through on simple commitments before tackling more complex challenges.37 Research alliances try to moderate the risk that districts experience by committing to a "no surprises" policy. Researchers agree not to release any report to the public before giving key district stakeholders an opportunity to review it. Sharing reports with district personnel ahead of their public release provides districts with time to respond and interact with research staff concerning the substance of the report. For many research alliances, the aim is not to permit district staff to rewrite reports, but to have time to develop a thoughtful—as opposed to reactive—response.38

Trust can also be developed through long-term engagement. District personnel are used to researchers coming and going, so the sustained presence of researchers builds trust in their commitment to the community and to the work. As one partner explained: "the researchers have to be hanging out in the community. Then, there's a real opportunity for true partnership."

BUILDING TRUST

The John W. Gardner Center for Youth and Their Communities (JGC) invests much time and effort in building trust with its partners. The communities that partner with JGC entrust it with sensitive administrative data for its analyses. At the beginning of each research effort, a coordinating committee decides which questions will be addressed, and JGC researchers address only those questions. Near the end of the process, after the data has been collected and the analysis begun, partners work together with JGC researchers on drafts of findings before they are released more broadly. The director of the Community Schools Initiative describes the process:

"We have discussions about the draft, get to ask questions, get to question some of the assumptions that are being made, and then we always have [a] presentation.… We bring together a group of coordinators and extract what we believe would be the most interesting part for them or something that we would like to get feedback on, and we have a conversation about that. 'Okay, what do you see here? Do you think this is accurate? Do we need to work on improving the data collection?' ... Then we have conversations about the meaning of the data."

The JGC takes an unusual additional step to ensure that its partners feel comfortable trusting the Center with potentially sensitive data—the JGC will not publish anything unless the partners have approved it. A JGC researcher explains:


“We’re responsive to our partners. They read everything that we produce before it goes public. I think we’ve had a chance to prove to them that their best interest is our best interest, and we don’t have a separate hidden agenda. ...We really needed to show that we honor their ownership of the data.”


Maintaining Mutualism

It can also be challenging to maintain mutualism in research-practice partnerships. Differences in status mean that not all voices have equal weight in discussions. Practitioners, especially teachers, can fall silent in rooms filled with researchers. At the same time, district leaders have ultimate authority for the direction that the district takes. So, while they may listen politely to what researchers have to say, they may not consider the information or advice in their decision-making.39

Funding can impact mutualism, too. Funding is typically awarded to either the researchers or the district, so one party has authority over how the grant funds are used. They are also responsible for ensuring that funders' requirements and expectations are met.40 Those who receive the grant funding often have greater voice in decisions related to the focus of the work, the joint products developed, and the nature and timelines of those products. As one research leader in a partnership observed, "maintaining a level playing field" between research and practice is always a challenge. This person observed that when the work was funded by research grants, "in all the cases I know, [researchers] began to dominate."

Partnerships have devised a number of strategies to maintain mutualism. Some have governing boards composed of stakeholders that help set the research agenda. For example, both the Consortium on Chicago School Research and the Research Alliance for New York City Schools have boards that periodically help revise the focus and agenda of these organizations. The Strategic Education Research Partnership (SERP) uses this approach, but it has also designed the SERP organization to act as a neutral intermediary that brokers relationships between researchers and practitioners in ways that maintain mutualism.41 Still other partnerships maintain mutualism by ensuring adequate time for face-to-face meetings so that district leaders and researchers can go back and forth with each other, proposing ideas and then challenging and refining them. According to one district leader, having sufficient funding for meetings of this type is crucial:

"The people from the districts and researchers get to know each other. They begin to support each other, as well as feel like they can hold their own in a conversation with researchers. Say, 'Oh, yes, this will work; no, that won't and here's why.' Then work through a solution or, 'No, that's not really the problem. You're defining it in a way that doesn't match. Here, let me give you the experiences.'"




Balancing Local Relevance with Scalability

Because many research-practice partnerships are committed to doing research or developing new innovations that meet the needs of the districts with which they are working, they try to be deeply responsive to local contexts. But, a tool or practice that is effective in one context cannot always be adopted successfully elsewhere. The more tightly linked an innovation is with the contextual conditions of a specific place, the more challenging it may be to scale it up to other sites.42

Some research-practice partnerships—especially NICs—address this issue by involving multiple sites in the early stages of development. NICs assume that each site will need to adapt the proposed change to its local context and that, by studying the local adaptations, the network can refine the change so that it becomes more reliable across different contexts. This strategy also presents challenges. Other research-practice partnerships that have tried this approach have found that, as they work with larger numbers of schools or districts, they no longer have the capacity to truly be responsive to local issues and multiple entities with diverse needs.43

BALANCING LOCAL RELEVANCE AND SCALABILITY

In the Center for Learning Technologies in Urban Schools (LeTUS), a partnership between learning scientists and two school districts, the main focus of the work was developing and testing curriculum units in middle school science. Over several years, a small group of teachers from Detroit Public Schools worked with researchers from the University of Michigan to co-design the units and tested early versions in their classrooms.

When the team wanted to spread the units to other schools, they faced the challenge of how to do so. The units fit the needs of students in the co-design teachers' classrooms, but the team did not know how well they would meet the needs of students in other classrooms in the district. And, the units did not have the look and feel of the polished curriculum that many teachers were accustomed to using.

To increase the scalability of the units, the LeTUS team took several steps. First, they worked closely with the district director of science to ensure that the curriculum units fit with the district standards in science. Second, they trained co-design teachers to become professional developers capable of preparing other teachers to implement units. This step enabled the team to reach many more teachers, and it gave the teachers access to expert colleagues who understood and could navigate the challenges of implementation. The team then embedded additional supports into the materials to help teachers with implementation, based on lessons that co-design teachers had learned about student difficulties with the activities in the units. These supports included background knowledge about the content, as well as student ideas that teachers might encounter in implementing the materials. Finally, the team sought publishers for the curriculum materials. Today, the units are available widely in revised form through two publishers.

Even after the curricula were published, they retained a local flavor. In one set of materials (Project-based Inquiry Science), a unit on water quality features an investigation of the Rouge River, a local river that students in the region can visit and use as a site for science investigations.



Other organizations have addressed the challenge of scaling up by hiring staff or developing partnerships with outside organizations that have skills specifically tailored to issues of scale-up.44 A study of the Boston Public Schools (BPS) and its work with the evaluation firm Education Matters found that BPS was more likely to use results from evaluation studies when the two organizations partnered with a third—the Boston Plan for Excellence—that brought additional capacity that enabled the district to respond to the findings in the study at scale.45

Meeting District Timelines While Maintaining Depth and Quality of Research

Timelines for research are typically much longer than those for district decisions, making it difficult for district leaders to use research, even when it is relevant.46 One researcher remarked:

"When you engage in partnerships in an effort to be relevant, you want to do work that's a very high priority for the district, which means they probably need the information yesterday. They need data or an analysis that can be done on fairly short order, and the more nuance you add into that, the less relevant it may feel."

At the same time, high-quality research can take a long time to unfold, especially when research questions require depth of analysis, longitudinal designs, or repeated cycles of design and redesign. Partnerships struggle to find ways to meet districts' need for timely research without sacrificing thorough, systematic analyses of policies and programs. They have taken different strategies to address this issue. Some partnerships maintain research studies that are long-term and in-depth, but incorporate interim products and reports for the district. Others do short-term analyses that respond to districts' immediate needs in addition to their long-term research projects. For instance, the Baltimore Education Research Consortium (BERC) balances long-term research projects with what it calls Rapid Response work—requests for data analysis made by the Baltimore City Schools that can be completed in a month or less. The directors of BERC explain:

"These shorter projects advance the district's understanding of key issues and buy BERC goodwill through our willingness to provide support in instances where the university researchers know the work will not result in publications or products for dissemination beyond City Schools briefings."47

Still other partnerships, especially NICs, address the problem by having multiple layers of research. Small tests of change yield data intended to provide quick feedback on the potential and problems of reforms. Researchers then use a different set of measures to track long-term outcomes.

Aligning Partnership Work with Academic Norms and Incentives

Researchers working at universities can find it difficult to participate as active contributors to partnerships. Faculty—especially untenured faculty—are not rewarded for many of the central activities related to partnership work. Design and development do not "count" toward research productivity, steering researchers away from this sort of work. Similarly, writing for publications more accessible to practitioners is rarely valued in promotion decisions.48 Researchers face other issues as well. Some universities do not value research on a single location because of concerns about generalizability.49 In addition, universities do not always support or appreciate the time it takes to do collaborative work with practitioners, reward collaboration, or engage in efforts to provide information to the public.50



Not all universities devalue partnership service, and not all researchers who partner with districts work in university settings. Many research-practice partnerships are located in centers outside of traditional academic departments and involve researchers who are not on the faculty (e.g., the John W. Gardner Center for Youth and Their Communities, the Research Alliance for New York City Schools, and the Consortium on Chicago School Research).

Challenging School and District Contexts

School and district partners can find participating in long-term research-practice partnerships difficult. They may not have the time to devote to intense design and development efforts. Districts may also lack capacity to use the research in decision-making.51 That is, they do not have the infrastructure and expertise to interpret findings and implement solutions consistent with evidence. The policy context can also work against adoption of innovations developed in partnerships.52 A partnership focused on improving instruction through professional development and coaching may find itself at odds with a policy environment that is focused on improving instructional quality through better selection of teachers. Also, policies regarding the purchase of materials and technologies can inhibit scale-up efforts of innovations that require teachers to use new curricula or digital tools.53

Turnover in districts is a persistent challenge. The average tenure of a superintendent in urban districts remains short. When superintendents leave, there can also be turnover in district offices with the closest ties to instruction (e.g., curriculum offices). Because new relationships must be formed, trust rebuilt, and focus maintained in the face of significant change, this turnover can be difficult for partnerships.54

MULTI-LEVEL PARTNERSHIPS TO SUPPORT CONTINUITY IN THE FACE OF CONTINUAL CHANGE

The partnership between the Bellevue School District and the University of Washington (UW) has faced more than its share of turnover in district leadership. The superintendent who initiated the partnership passed away soon after it began. As the work progressed, leadership in science curriculum changed twice. Yet, the partnership survived through thoughtful organizing and luck.

One of the strategies the partnership used was to have researchers develop relationships with multiple leaders in the schools. This included a cadre of instructional coaches charged with supporting teachers. When the first science curriculum leader left her position, the replacement came from within that group. He knew the researchers and valued their contributions to district improvement efforts.

The partnership also benefited from the many ties that a key researcher had with teachers in the district. Early on, she was involved in the project as a teacher in the district. She then left her teaching position to become a graduate student at the University of Washington, where she began working on the project as a researcher. Her connection to the district and her perspective as a former practitioner were valuable additions to the work.



IMPLICATIONS

Research-practice partnerships are bold new initiatives that seek to create institutional relationships to support the development of timely, relevant, and useful research for educational improvement. In the last two decades, research-practice partnerships have been initiated across the country, garnering interest from policymakers, funders, and communities interested in leveraging research to improve educational opportunities for children and youth. The work is not easy, but we are beginning to accumulate evidence from existing partnerships about strategies to make that work productive. Here, we outline the implications for those seeking to develop, maintain, or fund research-practice partnerships.

Implications for Partnership Members

Research on research-practice partnerships points to steps that researchers and practitioners can take to develop productive partnerships. The research does not indicate that these strategies will eliminate the challenges. Rather, it points to key things to consider when making strategic decisions about partnership work.

Anticipate challenges associated with new roles and responsibilities: Many research-practice partnerships, especially design-research partnerships and NICs, require that researchers and practitioners take on new roles and responsibilities. In the design-research model, researchers and practitioners co-design innovations together, something that stretches the boundaries of roles for all. Some design-research partnerships also involve practitioners in data collection and analysis, which is typically outside their normal job responsibilities. NICs take this even further. They have the practitioners take the lead in developing measures and collecting data while researchers mainly act as facilitators of the research process.

Engaging practitioners centrally in the research and development process in this way may foster greater mutualism. Practitioners can develop a sense of ownership with respect to reforms, and they sometimes use their enthusiasm to convince other practitioners of the value of new solutions to difficult problems.55 This approach may also serve a capacity-building mission, enabling researchers to benefit more from practitioner knowledge and fostering greater understanding of and respect for the research process among practitioners. However, learning these new and non-traditional roles may involve a steep learning curve, requiring extensive lead time before individuals have enough capacity to perform tasks well. Furthermore, new roles and responsibilities are often layered on top of old ones, intensifying work and, at times, creating conflicts. Partnerships should anticipate the learning curve that new roles and responsibilities require and provide opportunities and support for it. Partnerships should also strategize about ways to enable school and district personnel to integrate these new responsibilities into their existing work routines. Ultimately, though, partnerships need to be realistic about what is possible for district personnel and others to take on, given how stretched they already are.

Devote resources and staffing to maintaining the partnership: Maintaining the partnership takes ongoing work and attention.56 Building and maintaining trust are crucial for successful partnerships. Negotiation and communication are central to maintaining mutualism; they ensure that all parties are on the same page. Such efforts are intensified when high rates of turnover require that roles, relationships, trust, and directions for the work be reestablished over and over again. One commentator observed: "Coordination takes time, and none of the participating organizations have any to spare. The expectation that collaboration can occur without a supporting infrastructure is one of the most frequent reasons why it fails."57



Partnerships should anticipate the central importance and the time-consuming nature of this work and devote adequate resources and staff time—adequate infrastructure—to maintain the collaboration. Existing research-practice partnerships vary greatly in the resources they devote to this aspect of the work. At one end of the spectrum are organizations like the Strategic Education Research Partnership (SERP) and the John W. Gardner Center for Youth and Their Communities, which have staff members whose explicit responsibility is to coordinate and manage the work of the partnership. SERP, for example, has full-time staff members on site in several of the districts with which it works. These individuals are charged with bringing researchers and practitioners together, facilitating collaboration, and maintaining open channels of communication. Similarly, the Strive Network requires its initiatives to have an "anchor organization" that devotes staff time to program implementation and takes primary responsibility for any initiative. On the other end of the spectrum are partnerships, typically in a university setting, in which the researchers do this work in addition to their other responsibilities. In these examples, there are often few resources in the budget to support maintenance work. Since successful research-practice partnerships hinge on their ability to maintain ongoing communication, trust, and collaboration, they must be intentional in the way they devote staffing and other resources to this important, but frequently overlooked, work.

Weigh pros and cons of starting small or big: Most research-practice partnerships, regardless of type, want to bring about improvements at scale. However, they go about it in different ways. Some partnerships, especially NICs, emphasize starting small. While the problems that the Carnegie Foundation for the Advancement of Teaching addresses in its work are big in scope—teacher quality, community college success—it starts by investigating relatively minor changes in a small number of sites. The idea is to learn from successes and failures and gradually build out or scale up solutions to problems. By contrast, research alliances study policies that implicate entire districts, and design-research projects often focus on designing for district-wide implementation. This approach is based on the premise that improvements to teaching and learning require the coordination of supports at multiple levels of districts, which is more readily accomplished by working at scale from the start.

At present, we know little about which approach is more effective, though both have potential advantages. Starting small could allow researchers and practitioners to work without much exposure to those in the district and community who might not agree with the direction of a policy or program being pursued. It can also allow researchers to gather data on the potential of a program, which can later be used as evidence to support scaling up. By contrast, starting big might enable district staff to build political support for large-scale changes early on and obtain the resources needed to make big changes through the process. Over time, the relative value of one approach to scaling may become apparent through experience or research. In the meantime, however, partnerships should weigh these trade-offs carefully, thinking through how different approaches may be more or less applicable given the problem of practice they are addressing and the local conditions.
Acknowledge the tension between research independence and joint work: Research-practice partnerships take varying stances on the importance of maintaining independence for researchers. Research alliances typically embrace independence, while design-research partnerships and NICs are more concerned with collaboration and joint accountability.



Positioning researchers as independent may help preserve the integrity of the research process. When researchers are independent, they are not as invested in the results of a study of a policy or program, since they are not responsible for designing or implementing it. This stance may also create credibility in the eyes of a broader range of stakeholders, including critics of policies and programs. As one district leader explained:

“Their independence is very important. It lends credibility to their work. It allows them to take on projects that are important generally, even if they’re not as much of a focal point for us. I think it’s valuable for us to have this outside group that … exercise[s] some independent judgment in what they do and how they do it.”

Independence can also create a measure of distance, making it more difficult to maintain high levels of trust. It can also limit the degree to which researchers and the research process can help build capacity. Research in the absence of capacity to interpret it or implement its recommendations only goes so far. Furthermore, independence can be difficult to maintain. Many organizations become more intertwined and interdependent over time as they come to rely on one another for access to data and external funding.58

In contrast, maximizing collaboration and joint accountability may foster increased trust as practitioners and researchers see each other as critical to the success of the endeavor. Close collaboration may also foster greater learning and capacity-building on the part of both researchers and practitioners. It can also enable all parties to gain insight into one another’s motivations and needs, increasing buy-in and use. One district leader involved in a design-research project explained:

“There is a collaborative give-and-take on what the district needs and what the researchers need … I feel like I understand better what research questions are being addressed [and] what the overall outcome or agenda is.”

But, such close involvement challenges traditional standards of research. Design researchers have faced skepticism from the broader research community, including questions about whether the approach is adequate for testing claims about interventions.59 And, when partners are heavily invested in a strategy, it takes tremendous discipline to look at what is working and what is not.

Whether researchers seek independence or not, partnership work takes place in a political environment in which the questions that researchers study can be hotly debated. By studying problems that matter to districts, researchers place themselves firmly within that political context. Researchers’ desire for independence is born of a real concern that research integrity can be compromised by working in partnership. At the same time, the desire for joint accountability is a reminder that research in the context of partnerships is in the service of helping solve district problems. As research-practice partnerships mature and spread, we need to develop new norms and practices for maintaining high-quality research in settings in which researchers—even those involved in research alliances—are linked to the districts and projects that they are studying.



Implications for Funders Supporting Research-Practice Partnerships

Funders play an important role in developing and sustaining research-practice partnerships. Private foundations have provided seed money to establish research-practice partnerships, and they, along with federal agencies, have supported a large percentage of the research projects the partnerships undertake. A small number of research-practice partnerships have endowments (e.g., the Carnegie Foundation for the Advancement of Teaching, the John W. Gardner Center for Youth and Their Communities), but this is relatively rare. For this reason, funders and the decisions they make about how to support partnerships play a key role in their success. Here, we outline implications of our analysis for funders that support research-practice partnerships.

Consider providing general operating support: Research-practice partnerships typically find it challenging to attract funding for general operating support or project infrastructure.60 Some funders are hesitant to fund core operations because it falls outside their missions. Others are hesitant because it can be difficult to measure the impact of an investment in core support or infrastructure. As one funder told us: “We haven’t demonstrated that [stable funding for infrastructure] is important” for the long-term success of partnerships.

However, we know that general operating support enables partnerships to devote staff time to maintaining the relationships that are essential to getting the work done. It also helps partnerships maintain and keep focused on a longer-term agenda. And, it allows projects to be more flexible and responsive to district needs, rather than to the requirements of individual grants. All of these tasks are very difficult if participants must knit together a series of short-term grants to fund the partnership over time.61

Of course, funding specific projects is also essential. Projects are a key way partnerships get off the ground; they organize activity around concrete goals. They also create accountability for achieving goals for external funders. But, funding individual research and development projects in the absence of adequate general support makes it difficult for partnerships to reach their goals.

Require potential grantees to provide details about both the products they will produce and the processes for producing them: Most funding for research and development, especially with design-research partnerships, is organized around a product development cycle. The idea is that education is similar to industries in which up-front investment in creating a quality product is expected to pay off once the product is ready to be sold. It is difficult for research-practice partnerships to find funding for development and design work unless it is tied to a specific product or innovation.

Most funders expect details about the features of products that partnerships will produce in the grant proposal. They expect far less when it comes to descriptions of how partnerships will organize their work to develop these products. For example, partners may need only to provide letters of commitment as evidence of their involvement, while funders require project narratives describing the details of the programs that teams will develop. If partnerships are working collaboratively and are responsive to data on implementation, though, it is just as important that proposals describe how the researchers and practitioners will go about working with one another.
This includes details about how partnerships will negotiate which problems to focus on (or how they already have), how they will go about the design process, and what forms of evidence will be used to inform ongoing design.62



Identify innovative ways to co-fund researchers and practitioners: At present, most funding for research-practice partnerships is awarded to the researcher or intermediary organization. Many funders prefer to give grants for specific studies to researchers, because they perceive them as the experts in research activities. Some funders are also concerned about school districts’ capacities to use the money well, given challenging bureaucracies and the difficulty of ensuring that spending is used to support the partnership work rather than districts’ other financial needs.

Yet, the decision about whom to fund has implications for the dynamics of the partnership and how the work unfolds. Funding only one party makes it challenging to maintain mutualism in partnerships. The group that wins funding controls decisions about how monies are allocated and is also ultimately accountable to the funder for results. Some grant programs have attempted to address this issue by requiring that practitioner partners be co-principal investigators on the grant. However, this may not go very far if that title does not come with a subcontract of funds to support the work. Given that the work is fundamentally rooted in the notion of “partnership,” funders should consider how their funding strategies can empower various stakeholders.

Consider supporting capacity-building activities: Working in research-practice partnerships often requires the development of new roles and capacities. Researchers must develop the ability to negotiate problems of practice with individuals in local districts. They must learn new skills related to partnership development and maintenance. In design-research partnerships and NICs, they must also develop skills in new forms of design, such as techniques that support practitioners’ participation in design and facilitation of Plan, Do, Study, Act cycles.

Districts are also challenged to develop new capacities. They must learn how to find researchers who are capable of helping them address their needs. In design-research partnerships and NICs, district staff may need to learn how to develop measures, collect data, and participate in structured approaches to design and development. Many partnerships may also benefit from identifying and training key staff who can serve as go-betweens in the partnership, “translating” the work of partners to other practitioners and mobilizing support for partnership activities within districts.



At present, there are virtually no organized opportunities to learn these skills. Most universities are not set up to train researchers to work with school districts in this way.63 And, most school districts do not have the resources or infrastructure to support their teachers, principals, and district leaders in learning new roles. To the degree that capacity emerges, it is an unacknowledged by-product of individual grants and projects. Funders may want to build into funding requirements explicit attention to building the capacity of researchers and practitioners to do this boundary-spanning work. They should also consider providing funds specifically for capacity-building activities.

CONCLUDING THOUGHTS

In the last two decades, researchers and practitioners have worked to forge new ways to bring research and practice together to improve schools and districts. District leaders and researchers have crafted new models for developing research-based innovations and creating conditions in schools and districts that are more conducive to the ongoing use of research in policymaking and practice.

Research-practice partnerships move away from the conventional ways of doing business. Researchers engage with district leaders in long-term partnerships, focused on doing research and development that meets districts’ pressing needs. There is a new commitment to mutualism to ensure that original analysis is informed by the unique knowledge and perspective of both researchers and practitioners and benefits both. And, care is taken to structure the partnerships, using intentional strategies to foster the collaboration needed to bring different perspectives to the table in productive ways.

Research-practice partnerships are a promising strategy for improving schools and districts. But, it is often difficult for researchers and district administrators involved in partnerships to learn from one another. It can also be challenging for those interested in developing new partnerships to learn about different ways they might organize their work or anticipate and address the issues they may face. What is needed is a more robust dialogue in which district leaders, researchers, policymakers, and funders speak candidly about the strategic trade-offs partnerships face and the resources that are required for success. This white paper is an important part of moving this conversation forward.



END NOTES 1. B  arry J. Fishman, Ronald W. Marx, S. Best, and R. Tal, “Linking Teacher and Student Learning to Improve Professional Development in Systemic Reform,” Teaching and Teacher Education 19, no. 6 (2003); Louise Yarnall, Nicole Shechtman, and William R. Penuel, “Using Handheld Computers to Support Improved Classroom Assessment in Science: Results from a Field Trial” (paper presented at the Annual Meeting of the American Educational Research Association, Montreal, Quebec, Canada, April 2005); Elaine Allensworth and John Q. Easton, “The on-Track Indicator as a Predictor of High School Graduation,” (Chicago, IL: Consortium on Chicago School Research, 2005); Elaine Allensworth and John Q. Easton, “What Matters for Staying on-Track and Graduating in Chicago Public High Schools: A Close Look at Course Grades, Failures, and Attendance in the Freshman Year. Research Report,” (Chicago, IL: Consortium on Chicago School Research, 2007); Patricia F. Campbell and Nathaniel N. Malkus, “The Impact of Elementary Mathematics Coaches on Student Achievement,” Elementary School Journal 111, no. 3 (2011); Robert Geier, Phyllis Blumenfeld, Ronald W. Marx, Joseph Krajcik, Barry J. Fishman, and Elliot Soloway, “Standardized Test Outcomes for Students Engaged in Inquiry-Based Science Curricula in the Context of Urban Reform,” Journal of Research in Science Teaching 45, no. 8 (2008); Kari Shutt, Rachel S. Phillips, Nancy Vye, Katie Van Horne, and John D. Bransford, “Developing Science Inquiry Skills with Challenge-Based, Student-Directed Learning” (paper presented at the Annual Meeting of the American Educational Research Association, Denver, CO, April 2010); Catherine E. Snow, Joshua F. Lawrence, and Claire White, “Generating Knowledge of Academic Language among Urban Middle School Students,” Journal of Research on Educational Effectiveness 2, no. 4 (2009). 2. S  ybil M. Madison, Mary M. McKay, Roberta Paikoff, and Carl C. Bell, “Basic Research and Community Collaboration: Necessary Ingredients for the Development of a Family-Based HIV Prevention Program,” AIDS Education and Prevention 12, no. 4 (2000); Mary M. McKay, Geetha Gopalan, Lydia M. Franco, Kosta Kalogerogiannis, Mari Umpierre, Orly Olshtain-Mann, William Bannon, Laura Elwyn, and Leah Goldstein, “It Takes a Village to Deliver and Test Child and Family-Focused Services,” Research on Social Work Practice 20, no. 5 (2010); Lawrence A. Palinkas and Haluk Soydan, Translation and Implementation of Evidence-Based Practice, ed. Joan Levy Zlotnik, Building Social Work Research Capacity (New York: Oxford University Press, 2012); Jeff Rojek, Geoffrey P. Alpert, and Hayden P. Smith, “The Utilization of Research by the Police,” Police Practice and Research: An International Journal (2012); Jeff Rojek, Hayden P. Smith, and Geoffrey P. Alpert, “The Prevalence and Characteristics of Police Practitioner-Researcher Partnerships,” Police Quarterly (2012). 3. M  elissa Roderick and John Q. Easton, “Developing New Roles for Research in New Policy Environments: The Consortium on Chicago School Research,” (Chicago, IL: The Consortium on Chicago School Research, 2007); Lea Hubbard, “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence,” in Research and Practice in Education: Building Alliances, Bridging the Divide, ed. Cynthia E. Coburn and Mary Kay Stein (Lanham, MD: Rowman & Littlefield, 2010); Cynthia E. 
Coburn and Mary Kay Stein, eds., Research and Practice in Education: Building Alliances, Bridging the Divide (Lanham, MD: Rowman & Littlefield, 2010); Cynthia E. Coburn, Soung Bae, and Erica O. Turner, “Authority, Status, and the Dynamics of Insider-Outsider Partnerships at the District Level,” Peabody Journal of Education 83(2008). 4. T  he paper draws on three sources of data. First, we interviewed 17 national leaders of research-practice partnerships. We drew on these interviews to derive our definition and common features of research-practice partnerships. Second, we reviewed existing research on research-practice partnerships inside and outside education. This literature review was used to identify challenges of research-practice partnerships and support the definition and typology. Third, we conducted case studies of four researchpractice partnerships. Each partnership was selected because it was a good example of a given type of partnership. We selected two partnerships that exemplified research alliances: one that was a district-focused alliance and the other that was a crosssectoral alliance. For each case study, we did three to six interviews with district leaders and researchers, and collected and analyzed project artifacts, project reports, and publications. We used these case studies to illustrate partnerships of each type. 4. L  aura D’Amico, “The Center for Learning Technologies in Urban Schools: Evolving Relationships in Design-Based Research,” in Research and Practice in Education: Building Alliances, Bridging the Divide, ed. Cynthia E. Coburn and Mary Kay Stein (Lanham, MD: Rowan & Littlefield, 2010). 5. M  elissa Roderick, John Q. Easton, and Penny Bender Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform,” (Chicago, IL: Consortium on Chicago School Research, 2009); for similar arguments outside education, see Madison, McKay, Paikoff, and Bell. “Basic Research and Community Collaboration: Necessary Ingredients for the Development of a Family-Based HIV Prevention Program”; Suzanne Christopher, Vanessa Watts, Alma Knows His Gun McCormick, and Sara Young, “Building and Maintaining Trust in a Community-Based Participatory Research Partnership,” American Journal of Public Health 98, no. 8 (2008). 6. H  ugh Burkhardt and Alan H. Schoenfeld, “Improving Educational Research: Toward a More Useful, More Influential, and BetterFunded Enterprise,” Educational Researcher 32, no. 9 (2003); Barry J. Fishman, Ronald W. Marx, Phyllis Blumenfeld, Joseph Krajcik, and Elliot Soloway, “Creating a Framework for Research on Systemic Technology Innovations,” The Journal of the Learning Sciences 13, no. 1 (2004). 7. S  uzanne Donovan, Alexandra K. Wigdor, and Catherine E. Snow, Strategic Education Research Partnership (Washington, DC: National Research Council, 2003); Anne E. Kazak, Kimberly Hoagwood, John R. Weisz, Korey Hood, Thomas R. Kratochwill, Luis A. Vargas, and Gerard A. Banez. “A Meta-Systems Approach to Evidence-Based Practice for Children and Adolescents.” American Psychologist 65, no. 2 (2010). 8. R  oderick, Easton, and Penny Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 9. A  nthony S. Bryk, Louis M. Gomez, and Alicia Grunow, “Getting Ideas into Action: Building Networked Improvement Communities in Education,” in Frontiers in Sociology of Education, ed. 
Maureen Hallinan (Dordrecht, the Netherlands: Verlag, 2011); Donovan, Wigdor, and Snow, Strategic Education Research Partnership; Anthony S. Bryk, Sharon G. Rollow, and Gay Su Pinnell. “Urban School Development: Literacy as a Lever for Change.” Educational Policy 10, no. 2 (1996). 10. W  illiam R. Penuel and Louise Yarnall, “Designing Handheld Software to Support Classroom Assessment: An Analysis of Conditions for Teacher Adoption,” Journal of Technology, Learning, and Assessment 3, no. 5 (2005); for a similar argument outside education, see Palinkas and Soydan, Translation and Implementation of Evidence-Based Practice; Brian J. Reiser, James P. Spillane, Franci Steinmuler, Don Sorsa, Karen Carney, and Eleni Kyza, “Investigating the Mutual Adaptation Process in Teachers’


Design of Technology-Infused Curricula,” in Fourth International Conference of the Learning Sciences., ed. Barry Fishman and S. O’Connor-Divelbiss (Mahwah, NJ: Erlbaum, 2000); Nina Wallerstein and Bonnie Duran, “Community-Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity,” American Journal of Public Health 100, no. S1 (2010). 11. e  .g., the Grow Network; Cornelia Brunner, Chat Fasca, Juliette Heinze, Margaret Honey, Daniel Light, Ellen B. Mandinach, and Data Wexler, “Linking Data and Learning: The Grow Network Study,” Journal of Education for Students Placed at Risk 10, no. 3 (2005). 12. W  illiam R. Penuel and Barbara Means, “Using Large-Scale Databases in Evaluation: Advances, Opportunities, and Challenges,” American Journal of Evaluation 32, no. 1 (2011); Faith Connolly and Linda S. Olson, “Early Elementary Performance and Attendance in Baltimore City Schools’ Pre-Kindergarten and Kindergarten,” (Baltimore, MD: Baltimore Education Research Consortium, 2012). 13. R  oderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 14. V  anessa Coco, David Johnson, Thomas Kelley-Kemple, Melissa Roderick, Eliza Moeller, Nicole Williams, and Kafi Moragne, “Working to My Potential: The Postsecondary Experiences of CPS Students in the International Baccalaureate Diploma Programme,” (Chicago, IL: Consortium on Chicago Schools Research, 2012); Rebecca A. London and Oded Gurantz, “State and Local Data Infrastructure for Tracking Secondary to Postsecondary Educational Outcomes,” (Stanford, CA: John W. Gardner Center for Youth and Their Communities, 2010); Adriana Villavicencio, and Justina K. Grayman, “Learning From “Turnaround” Middle Schools: Strategies for Success,” (New York: Research Alliance for New York City Schools, 2012); Julian R. Betts, Youjin Hahn, and Andrew C. Zau, “Does Diagnostic Math Testing Improve Student Learning,” (San Francisco, CA: Public Policy Institute of California, 2011); Stephen B. Plank and Barbara Condliffe, “Pressures of the Season: A Descriptive Look at Classroom Quality in Second and Third Grade Classrooms,” (Baltimore, MD: Baltimore Education Research Consortium, 2011). 15. Donovan, Wigdor, and Snow, Strategic Education Research Partnership. 16. R  oderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform”: 22. 17. D  aniel C. Edelson, “Design Research: What We Learn When We Engage in Design,” The Journal of the Learning Sciences 11, no. 1 (2002); Allan Collins, Diana Joseph, and Katerine Bielaczyc, “Design Research: Theoretical and Methodological Issues,” The Journal of the Learning Sciences 13, no. 1 (2004); Paul A. Cobb, Jere Confrey, Andrea A. diSessa, Richard Lehrer, and Leona Schauble, “Design Experiments in Educational Research,” Educational Researcher 32, no. 1 (2003). 18. e  .g., Paul A. Cobb, Kay McClain, Teruni Salva Laumberg, and Chrystal Dean, “Situating Teachers’ Instructional Practices in the Institutional Setting of the School and District,” Educational Researcher 32, no. 6 (2003); Lauren B. Resnick and James P. Spillane, “From Individual Learning to Organizational Designs for Learning,” in Instructional Psychology: Past, Present and Future Trends. Sixteen Essays in Honor of Erik De Corte, ed. 
Lieven Verschaffel, Filip Dochy, Monique Boekaerts, and Stella Vosinadou, Advances in Learning and Instruction (Oxford: Pergamon, 2006); Snow, Lawrence, and White “Generating Knowledge of Academic Language among Urban Middle School Students.” 19. S  asha Barab and April L. Luehmann, “Building Sustainable Science Curriculum: Acknowledging and Accommodating Local Adaptation.” Science Education 87, no. 4 (2003); Barry J. Fishman and Joe Krajcik, “What Does It Mean to Create Sustainable Science Curriculum Innovations? A Commentary,” Science Education 87, no. 4 (2003); Fishman, Marx, Blumenfeld, Krajcik, and Soloway, “Creating a Framework for Research on Systemic Technology Innovations”; William A. Sandoval and Philip Bell, “DesignBased Research Methods for Studying Learning in Context: Introduction,” Educational Psychologist 39, no. 4 (2004). 20. P  aul A. Cobb and Kara Jackson, “Analyzing Educational Policies: A Learning Design Perspective,” Journal of the Learning Sciences 21, no. 4 (2012). 21. e  .g., William R. Penuel, Jeremy Roschelle, and Nicole Shechtman, “The Whirl Co-Design Process: Participant Experiences,” Research and Practice in Technology Enhanced Learning 2, no. 1 (2007); Elizabeth A. Davis and Joseph Krajcik, “Designing Educative Curriculum Materials to Promote Teacher Learning,” Educational Researcher 34, no. 3 (2005); Jane Bowyer, Libby Gerard, and Ronald W. Marx, “Building Leadership for Scaling Science Curriculum Reform,” in Designing Coherent Science Education, ed. Yael Kali, Marcia C. Linn, and Jo Ellen Roseman (New York: Teachers College Press, 2008); Resnick and Spillane, “From Individual Learning to Organizational Designs for Learning.” 22. D  ouglas C. Engelbart, “Toward High-Performance Organizations: A Strategic Role for Groupware” (paper presented at the GroupWare ‘92 Conference, San Jose, CA, August 1992). 23. Donald M. Berwick, “The Science of Improvement,” The Journal of the American Medical Association 299, no. 10 (2008). 24. Anthony S. Bryk, “Support a Science of Performance Improvement,” Phi Delta Kappan 90, no. 8 (2009): 598. 25. B  erwick, “The Science of Improvement”; Kaveh G. Shojania and Jeremy M. Grimshaw, “Evidence-Based Quality Improvement: The State of the Science,” Health Affairs 24, no. 1 (2005). 26. William E. Deming, Out of the Crisis (Cambridge, MA: MIT Press, 1986). 27. J ennifer Zoltners Sherer and Alicia Grunow, “90-Day Cycle: Exploration of Math Intensives as a Strategy to Move More Community College Students out of Developmental Math Courses,” (Palo Alto, CA: The Carnegie Foundation for the Advancement of Teaching, 2010). 28. “The Strive Network,” www.strivenetwork.org. 29. “Strive,” http://knowledgeworks.org/impacting-schools-communities/strive. 30. Institute of Education Sciences, “Regional Educational Laboratories 2012 - 2017, Solicitation Ed-Ies-11-R-0036,” (Washington, DC: Department of Education, 2012). 31. W  illiam E. Bickel and Rosemary A. Hattrup, “Teachers and Researchers in Collaboration: Reflections on the Process,” American Educational Research Journal 32, no. 1 (1995); Susan M. Brookhart and William E. Loadman, “School-University Collaboration and Perceived Professional Rewards,” Journal of Research in Education 2, no. 1 (1992); Bernard R. Gifford, “The Evolution of the


School-University Partnership for Educational Renewal,” Education and Urban Society 19, no. 1 (1986); Pamela J. Keating and Richard W. Clark, “Accent on Leadership: The Puget Sound Educational Consortium,” in School-University Partnerships in Action: Concepts, Cases and Concerns, ed. Kenneth A. Sirotnik and John I. Goodlad (New York: Teachers College Press, 1988); Philip C. Schlecty and Betty Lou Whitford, “Shared Problems and Shared Vision: Organic Collaboration,” in School-University Partnerships in Action: Concepts, Cases and Concerns, ed. Kenneth A. Sirotnik and John I. Goodlad (New York: Teachers College Press, 1988); for similar findings outside education, see Lawrence A. Palinkas, Gregory A. Aarons, Bruce F. Chorpita, Kimberly Hoagwood, John Landsverk, and John R. Weisz, “Cultural Exchange and the Implementation of Evidence-Based Practices: Two Case Studies,” Research on Social Work Practice 19, no. 5 (2009). 32. W  illiam A. Firestone and Jennifer L. Fisler, “Politics, Community, and Leadership in a School-University Partnership,” Educational Administration Quarterly 38, no. 4 (2002); Paul E. Heckman, “The Southern California Partnership: A Retrospective Analysis,” in School-University Partnerships in Action: Concepts, Cases and Concerns, ed. Kenneth A. Sirotnik and John I. Goodlad (New York: Teachers College Press, 1988); Les Vozzo and Jack MacFadden, “A School-University Partnership: A Commitment to Collaboration and Professional Renewal,” in The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, ed. Ruth Ravid and Marianne G. Handler (Englewood, CO: Teachers Ideas Press, 2001). 33. e  .g., Nancy Vye, Philip Bell, Carrie T. Tzou, and John D. Bransford, “Instructional Design Principles for Blending and Bridging Science Learning across Formal and Informal Environments” (paper presented at the National Association for Research in Science Teaching Annual International Conference, Philadelphia, PA, March 2010). 34. M  cKay, Gopalan, Franco, Kalogerogiannis, Umpierre, Olshtain-Mann, Bannon, Elwyn, and Goldstein, “It Takes a Village to Deliver and Test Child and Family-Focused Services”; Palinkas and Soydan, Translation and Implementation of Evidence-Based Practice; Pamela Gates-Duffield and Carol Stark, “Recipes for Professional Development Schools: A Metaphor for Collaboration,” in The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, ed. Ruth Ravid and Marianne G. Handler (Englewood, CO: Teachers Ideas Press, 2001); Robin Hasslen, Nancy Bacharach, Judy Rotto, and Jonathan Fribley. “Learning Connections: Working toward a Shared Vision.” In The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, edited by Ruth Ravid and Marianne G. Handler, 60-69. Englewood, CO: Teachers Ideas Press, 2001); Ann Lieberman, “The Metropolitan School Study Council: A Living History,” in School-University Partnerships in Action: Concepts, Cases and Concerns, ed. Kenneth A. Sirotnik and John I. Goodlad (New York: Teachers College Press, 1988); Lisa Rosen, “Deconstructing the Rhetoric of Rigor: Evidence-Based Policy and the Politics of Representation,” (Chicago: Center for School Improvement, in preparation); David Dwayne Williams, “The Brigham Young University-Public School Partnership,” in School-University Partnerships in Action: Concepts, Cases and Concerns, ed. Kenneth A. Sirotnik and John I. Goodlad (New York: Teachers College Press, 1988). 35. P  hyllis Blumenfeld, Barry J. 
Fishman, Joseph Krajcik, Ronald W. Marx, and Elliot Soloway, “Creating Usable Innovations in Systemic Reform: Scaling up Technology-Embedded Project-Based Science in Urban Schools,” Educational Psychologist 35, no. 3 (2000); Suzanne Donovan and James W. Pellegrino, “Learning and Instruction: A SERP Research Agenda,” (Washington, DC: National Research Council, 2003); Roderick and Easton, “Developing New Roles for Research in New Policy Environments: The Consortium on Chicago School Research”; Roderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 36. Julie Reed Kochanek, Building Trust for Better Schools: Research-Based Practices (Thousand Oaks, CA: Corwin Press, 2005). 37. R  oderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 38. Coburn, Bae, and Turner, “Authority, Status, and the Dynamics of Insider-Outsider Partnerships at the District Level.” 39. Penuel, Roschelle, and Shechtman, “The Whirl Co-Design Process: Participant Experiences.” 40. Donovan, Wigdor, and Snow, Strategic Education Research Partnership. 41. B  arbara Means and William R. Penuel, “Research to Support Scaling up Technology-Based Educational Innovations,” in Scaling up Success: Lessons from Technology-Based Educational Improvement, ed. Christopher Dede, James P. Honan, and Lawrence C. Peters (San Francisco, CA: Jossey-Bass, 2005); D’Amico, “The Center for Learning Technologies in Urban Schools: Evolving Relationships in Design-Based Research”; for a similar finding outside education, see Palinkas and Soydan, Translation and Implementation of Evidence-Based Practice. 42. D  onovan, Wigdor, and Snow, Strategic Education Research Partnership; Gina Schuyler Ikemoto and Meredith I. Honig, “Tools to Deepen Practitioners’ Engagement with Research: The Case of the Institute for Learning,” in Research and Practice in Education: Building Alliances, Bridging the Divide, ed. Cynthia E. Coburn and Mary Kay Stein (Lanham, MD: Rowman & Littlefield, 2010). 43. D  ’Amico, “The Center for Learning Technologies in Urban Schools: Evolving Relationships in Design-Based Research”; Hubbard, “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence”; Ikemoto and Honig, “Tools to Deepen Practitioners’ Engagement with Research: The Case of the Institute for Learning”; Lisa Rosen, “Examining a Novel Partnership for Educational Innovation: Promises and Complexities of Cross-Institutional Collaboration,” in Research and Practice in Education: Building Alliances, Bridging the Divide, ed. Cynthia E. Coburn and Mary Kay Stein (New York: Rowman & Littlefield Publishing Group, 2010). 44. Hubbard, “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence.” 45. F  or a review, see Cynthia E. Coburn, Meredith I. Honig, and Mary Kay Stein, “What’s the Evidence on Districts’ Use of Evidence?” in The Role of Research in Educational Improvement, ed. John D. Bransford, Deborah J. Stipek, Nancy J. Vye, Louis M. Gomez, and Diana Lam (Cambridge, MA: Harvard Education Press, 2009). 46. F  aith Connolly, Stephen Plank, and Tracy Rone, “Baltimore Education Research Consortium: A Consideration of Past, Present, and Future,” (Baltimore, MD: Baltimore Education Research Consortium, 2012): 4. 47. 
Ikemoto and Honig, “Tools to Deepen Practitioners’ Engagement with Research: The Case of the Institute for Learning”; Roderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 48. Roderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting


Urban School Reform”; for a similar finding outside education, see Susan Thering and Victoria Chanse, “The Scholarship of Transdisciplinary Action Research: Toward a New Paradigm for the Planning and Design Traditions,” Landscape Journal 30, no. 1 (2011). 49. P  alinkas and Soydan, Translation and Implementation of Evidence-Based Practice; Burkhardt and Schoenfeld, “Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise”; Roderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 50. B  lumenfeld, Fishman, Krajcik, Marx, and Soloway, “Creating Usable Innovations in Systemic Reform: Scaling up TechnologyEmbedded Project-Based Science in Urban Schools”; Cynthia E. Coburn, “Partnership for District Reform: The Challenges of Evidence Use in a Major Urban District,” in Research and Practice in Education: Building Alliances, Bridging the Divide, ed. Cynthia E. Coburn and Mary Kay Stein (New York: Rowman & Littlefield Publishing Group, 2010); Fishman, Marx, Blumenfeld, Krajcik, and Soloway, “Creating a Framework for Research on Systemic Technology Innovations”; Hubbard, “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence”; Roderick, Easton, and Sebring, “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” 51. J . Curtis McMillen, Shannon L. Lenze, Kristin M. Hawley, and Victoria A. Osborne, “Revisiting Practice-Based Research Networks as a Platform for Mental Health Services Research,” Administration and Policy in Mental Health and Mental Health Services Research 36(2009). 52. W  illiam R. Penuel, Barry J. Fishman, Ryoko Yamaguchi, and Lawrence P. Gallagher, “What Makes Professional Development Effective? Strategies That Foster Curriculum Implementation,” American Educational Research Journal 44, no. 4 (2007). 53. T  homas E. Glass and Louis A. Franceschini, The State of the American School Superintendency: A Mid-Decade Study (Lanham, MD: Rowman & Littlefield, 2007); Hannele Kerosuo, “’Boundary Encounters’ as a Place for Learning and Development at Work,” Outlines: Critical Social Studies 6, no. 1 (2001). 54. e  .g., Penuel and Yarnall, “Designing Handheld Software to Support Classroom Assessment: An Analysis of Conditions for Teacher Adoption.” 55. J ohn Kania and Mark Kramer, “Collective Impact,” Stanford Social Innovation Review 9, no. 1 (2011); Robert E. Kraut, Jolene Galegher, and Carmen Egido, “Relationships and Tasks in Scientific Research Collaboration,” Human-Computer Interaction 3, no. 1 (1987). 56. Kania and Kramer, “Collective Impact”: 40. 57. M  ark A. Smylie and Thomas B. Corcoran, “Nonprofit Organizations and the Promotion of Evidence-Based Practice,” in The Role of Research in Educational Improvement, ed. John D. Bransford, Deborah J. Stipek, Nancy Vye, L. Gomez, and Diana Lam (Massachusetts: Harvard Educational Press, 2009). 58. R  ichard J. Shavelson, Dennis C. Phillips, Lisa Towne, and Michael J. Feuer, “On the Science of Education Design Studies,” Educational Researcher 32, no. 1 (2003). 59. 
B  lumenfeld, Fishman, Krajcik, Marx, and Soloway, “Creating Usable Innovations in Systemic Reform: Scaling up TechnologyEmbedded Project-Based Science in Urban Schools”; Burkhardt and Schoenfeld, “Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise”; Coburn and Stein, eds., Research and Practice in Education: Building Alliances, Bridging the Divide; Donovan, Wigdor, and Snow, Strategic Education Research Partnership; for discussion of similar issues outside education see McMillen, Lenze, Hawley, and Osborne; Rojek, Smith, and Alpert, “The Prevalence and Characteristics of Police Practitioner-Researcher Partnerships.” 60. B  lumenfeld, Fishman, Krajcik, Marx, and Soloway, “Creating Usable Innovations in Systemic Reform: Scaling up TechnologyEmbedded Project-Based Science in Urban Schools”; Burkhardt and Schoenfeld, “Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise”; Coburn and Stein, eds., Research and Practice in Education: Building Alliances, Bridging the Divide; Donovan, Wigdor, and Snow, Strategic Education Research Partnership; Hubbard, “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence”; Kazak, Hoagwood, Weisz, Hood, Kratochwill, Vargas, and Banez, “A Meta-Systems Approach to Evidence-Based Practice for Children and Adolescents.” 61. B  arbara Means and Christopher J. Harris, “Towards an Evidence Framework for Design-Based Implementation Research,” in Design-Based Implementation Research, ed. Barry J. Fishman, William R. Penuel, Anna-Ruth Allen, and Britte Haugan Cheng (National Society for the Study of Education Yearbook, 112(1): in press). 62. B  urkhardt and Schoenfeld, “Improving Educational Research: Toward a More Useful, More Influential, and Better-Funded Enterprise.”


REFERENCES Allensworth, Elaine, and John Q. Easton. “The On-Track Indicator as a Predictor of High School Graduation.” Chicago, IL: Consortium on Chicago School Research, 2005. ———. “What Matters for Staying on-Track and Graduating in Chicago Public High Schools: A Close Look at Course Grades, Failures, and Attendance in the Freshman Year. Research Report.” Chicago, IL: Consortium on Chicago School Research, 2007. Barab, Sasha, and April L. Luehmann. “Building Sustainable Science Curriculum: Acknowledging and Accommodating Local Adaptation.” Science Education 87, no. 4 (2003): 454-67. Berwick, Donald M. “The Science of Improvement.” The Journal of the American Medical Association 299, no. 10 (2008): 1182-84. Betts, Julian R., Youjin Hahn, and Andrew C. Zau. “Does Diagnostic Math Testing Improve Student Learning.” San Francisco, CA: Public Policy Institute of California, 2011. Bickel, William E., and Rosemary A. Hattrup. “Teachers and Researchers in Collaboration: Reflections on the Process.” American Educational Research Journal 32, no. 1 (1995): 35-62. Blumenfeld, Phyllis, Barry J. Fishman, Joseph Krajcik, Ronald W. Marx, and Elliot Soloway. “Creating Usable Innovations in Systemic Reform: Scaling up Technology-Embedded Project-Based Science in Urban Schools.” Educational Psychologist 35, no. 3 (2000): 149-64. Bowyer, Jane, Libby Gerard, and Ronald W. Marx. “Building Leadership for Scaling Science Curriculum Reform.” In Designing Coherent Science Education, edited by Yael Kali, Marcia C. Linn and Jo Ellen Roseman. New York: Teachers College Press, 2008:123-152. Brookhart, Susan M., and William E. Loadman. “School-University Collaboration and Perceived Professional Rewards.” Journal of Research in Education 2, no. 1 (1992): 68-76. Brunner, Cornelia, Chat Fasca, Juliette Heinze, Margaret Honey, Daniel Light, Ellen B. Mandinach, and Data Wexler. “Linking Data and Learning: The Grow Network Study.” Journal of Education for Students Placed at Risk 10, no. 3 (2005): 241-67. Bryk, Anthony S. “Support a Science of Performance Improvement.” Phi Delta Kappan 90, no. 8 (2009): 597-600. Bryk, Anthony S., Louis M. Gomez, and Alicia Grunow. “Getting Ideas into Action: Building Networked Improvement Communities in Education.” In Frontiers in Sociology of Education, edited by Maureen Hallinan, 127-62. Dordrecht, the Netherlands: Verlag, 2011. Bryk, Anthony S., Sharon G. Rollow, and Gay Su Pinnell. “Urban School Development: Literacy as a Lever for Change.” Educational Policy 10, no. 2 (1996): 172-201. Burkhardt, Hugh, and Alan H. Schoenfeld. “Improving Educational Research: Toward a More Useful, More Influential, and BetterFunded Enterprise.” Educational Researcher 32, no. 9 (2003): 3-14. Campbell, Patricia F., and Nathaniel N. Malkus. “The Impact of Elementary Mathematics Coaches on Student Achievement.” Elementary School Journal 111, no. 3 (2011): 430-54. Christopher, Suzanne, Vanessa Watts, McCormick, and Sara Young. “Building and Maintaining Trust in a Community-Based Participatory Research Partnership.” American Journal of Public Health 98, no. 8 (2008): 1390-1406. Cobb, Paul A., Jere Confrey, Andrea A. diSessa, Richard Lehrer, and Leona Schauble. “Design Experiments in Educational Research.” Educational Researcher 32, no. 1 (2003): 9-13. Cobb, Paul A., and Kara Jackson. “Analyzing Educational Policies: A Learning Design Perspective.” Journal of the Learning Sciences 21, no. 4 (2012): 487-521. Cobb, Paul A., Kay McClain, Teruni Salva Laumberg, and Chrystal Dean. 
“Situating Teachers’ Instructional Practices in the Institutional Setting of the School and District.” Educational Researcher 32, no. 6 (2003): 13-24. Coburn, Cynthia E. “Partnership for District Reform: The Challenges of Evidence Use in a Major Urban District.” In Research and Practice in Education: Building Alliances, Bridging the Divide, edited by Cynthia E. Coburn and Mary Kay Stein, 167-82. New York: Rowman & Littlefield Publishing Group, 2010. Coburn, Cynthia E., Soung Bae, and Erica O. Turner. “Authority, Status, and the Dynamics of Insider-Outsider Partnerships at the District Level.” Peabody Journal of Education 83 (2008): 364-99. Coburn, Cynthia E., Meredith I. Honig, and Mary Kay Stein. “What’s the Evidence on Districts’ Use of Evidence?” In The Role of Research in Educational Improvement, edited by John D. Bransford, Deborah J. Stipek, Nancy J. Vye, Louis M. Gomez and Diana Lam, 67-87. Cambridge, MA: Harvard Education Press, 2009. Coburn, Cynthia E., and Mary Kay Stein, eds. Research and Practice in Education: Building Alliances, Bridging the Divide. Lanham, MD: Rowman & Littlefield, 2010. Coca, Vanessa, David Johnson, Thomas Kelley-Kemple, Melissa Roderick, Eliza Moeller, Nicole Williams, and Kafi Moragne. “Working to My Potential: The Postsecondary Experiences of CPS Students in the International Baccalaureate Diploma Programme.” Chicago, IL: Consortium on Chicago Schools Research, 2012. Collins, Allan, Diana Joseph, and Katerine Bielaczyc. “Design Research: Theoretical and Methodological Issues.” The Journal of the Learning Sciences 13, no. 1 (2004): 15-42. Connolly, Faith, and Linda S. Olson. “Early Elementary Performance and Attendance in Baltimore City Schools’ Pre-Kindergarten and Kindergarten.” Baltimore, MD: Baltimore Education Research Consortium, 2012. Connolly, Faith, Stephen Plank, and Tracy Rone. “Baltimore Education Research Consortium: A Consideration of Past, Present, and Future.” Baltimore, MD: Baltimore Education Research Consortium, 2012. D’Amico, Laura. “The Center for Learning Technologies in Urban Schools: Evolving Relationships in Design-Based Research.” In Research and Practice in Education: Building Alliances, Bridging the Divide, edited by Cynthia E. Coburn and Mary Kay Stein, 37-53.


Lanham, MD: Rowan & Littlefield, 2010. Davis, Elizabeth A., and Joseph Krajcik. “Designing Educative Curriculum Materials to Promote Teacher Learning.” Educational Researcher 34, no. 3 (2005): 3-14. Deming, William Edwards. Out of the Crisis Cambridge, MA: MIT Press, 1986. Donovan, Suzanne, and James W. Pellegrino. “Learning and Instruction: A SERP Research Agenda.” Washington, DC: National Research Council, 2003. Donovan, Suzanne, Alexandra K. Wigdor, and Catherine E. Snow. Strategic Education Research Partnership. Washington, DC: National Research Council, 2003. Edelson, Daniel C. “Design Research: What We Learn When We Engage in Design.” The Journal of the Learning Sciences 11, no. 1 (2002): 105-21. Engelbart, Douglas C. “Toward High-Performance Organizations: A Strategic Role for Groupware.” Paper presented at the GroupWare ‘92 Conference, San Jose, CA, August 1992. Firestone, William A., and Jennifer L. Fisler. “Politics, Community, and Leadership in a School-University Partnership.” Educational Administration Quarterly 38, no. 4 (2002): 449-93. Fishman, Barry J., and Joe Krajcik. “What Does It Mean to Create Sustainable Science Curriculum Innovations? A Commentary.” Science Education 87, no. 4 (2003): 564-73. Fishman, Barry J., Ronald W. Marx, Stephen Best, and Revital Tal. “Linking Teacher and Student Learning to Improve Professional Development in Systemic Reform.” Teaching and Teacher Education 19, no. 6 (2003): 643-58. Fishman, Barry J., Ronald W. Marx, Phyllis Blumenfeld, Joseph Krajcik, and Elliot Soloway. “Creating a Framework for Research on Systemic Technology Innovations.” The Journal of the Learning Sciences 13, no. 1 (2004): 43-76. Gates-Duffield, Pamela, and Carol Stark. “Recipes for Professional Development Schools: A Metaphor for Collaboration.” In The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, edited by Ruth Ravid and Marianne G. Handler, 45-57. Englewood, CO: Teachers Ideas Press, 2001. Gawande, Atul. The Checklist Manifesto: How to Get Things Right. New York: Metropolitan Books, 2010. Geier, Robert, Phyllis Blumenfeld, Ronald W. Marx, Joseph Krajcik, Barry J. Fishman, and Elliot Soloway. “Standardized Test Outcomes for Students Engaged in Inquiry-Based Science Curricula in the Context of Urban Reform.” Journal of Research in Science Teaching 45, no. 8 (2008): 922-39. Gifford, Bernard R. “The Evolution of the School-University Partnership for Educational Renewal.” Education and Urban Society 19, no. 1 (1986): 77-106. Glass, Thomas E., and Louis A. Franceschini. The State of the American School Superintendency: A Mid-Decade Study. Lanham, MD: Rowman & Littlefield, 2007. Hasslen, Robin, Nancy Bacharach, Judy Rotto, and Jonathan Fribley. “Learning Connections: Working toward a Shared Vision.” In The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, edited by Ruth Ravid and Marianne G. Handler, 60-69. Englewood, CO: Teachers Ideas Press, 2001. Heckman, Paul E. “The Southern California Partnership: A Retrospective Analysis.” In School-University Partnerships in Action: Concepts, Cases and Concerns, edited by Kenneth A. Sirotnik and John I. Goodlad, 106-23. New York: Teachers College Press, 1988. Hubbard, Lea. “Research to Practice: The Case of Boston Public Schools, Education Matters and the Boston Plan for Excellence.” In Research and Practice in Education: Building Alliances, Bridging the Divide, edited by Cynthia E. Coburn and Mary Kay Stein, 55-72. 
Lanham, MD: Rowman & Littlefield, 2010. Ikemoto, Gina Schuyler, and Meredith I. Honig. “Tools to Deepen Practitioners’ Engagement with Research: The Case of the Institute for Learning.” In Research and Practice in Education: Building Alliances, Bridging the Divide, edited by Cynthia E. Coburn and Meredith I. Honig, 93-108. Lanham, MD: Rowman & Littlefield, 2010. Institute of Education Sciences. “Regional Educational Laboratories 2012 - 2017, Solicitation ED-IES-11-R-0036.” Washington, DC: Department of Education, 2012. Kania, John, and Mark Kramer. “Collective Impact.” Stanford Social Innovation Review 9, no. 1 (2011): 36-41. Kazak, Anne E., Kimberly Hoagwood, John R. Weisz, Korey Hood, Thomas R. Kratochwill, Luis A. Vargas, and Gerard A. Banez. “A Meta-Systems Approach to Evidence-Based Practice for Children and Adolescents.” American Psychologist 65, no. 2 (2010): 85-97. Keating, Pamela J., and Richard W. Clark. “Accent on Leadership: The Puget Sound Educational Consortium.” In School-University Partnerships in Action: Concepts, Cases and Concerns, edited by Kenneth A. Sirotnik and John I. Goodlad, 148-66. New York: Teachers College Press, 1988. Kerosuo, Hannele. “’Boundary Encounters as a Place for Learning and Development at Work.” Outlines: Critical Social Studies 6, no. 1 (2001): 53-65. KnowledgeWorks. “Strive.” (http://knowledgeworks.org/impacting-schools-communities/strive). Kochanek, Julie Reed. Building Trust for Better Schools: Research-Based Practices. Thousand Oaks, CA: Corwin Press, 2005. Kraut, Robert E., Jolene Galegher, and Carmen Egido. “Relationships and Tasks in Scientific Research Collaboration.” HumanComputer Interaction 3, no. 1 (1987): 31-58. Lieberman, Ann. “The Metropolitan School Study Council: A Living History.” In School-University Partnerships in Action: Concepts, Cases and Concerns, edited by Kenneth A. Sirotnik and John I. Goodlad, 69-86. New York: Teachers College Press, 1988. London, Rebecca A., and Oded Gurantz. “State and Local Data Infrastructure for Tracking Secondary to Postsecondary Educational Outcomes.” Stanford, CA: John W. Gardner Center for Youth and Their Communities, 2010.


Madison, Sybil M., Mary M. McKay, Roberta Paikoff, and Carl C. Bell. “Basic Research and Community Collaboration: Necessary Ingredients for the Development of a Family-Based HIV Prevention Program.” AIDS Education and Prevention 12, no. 4 (2000): 281-98.
McKay, Mary M., Geetha Gopalan, Lydia M. Franco, Kosta Kalogerogiannis, Mari Umpierre, Orly Olshtain-Mann, William Bannon, Laura Elwyn, and Leah Goldstein. “It Takes a Village to Deliver and Test Child and Family-Focused Services.” Research on Social Work Practice 20, no. 5 (2010): 476-82.
McMillen, J. Curtis, Shannon L. Lenze, Kristin M. Hawley, and Victoria A. Osborne. “Revisiting Practice-Based Research Networks as a Platform for Mental Health Services Research.” Administration and Policy in Mental Health and Mental Health Services Research 36 (2009): 308-21.
Means, Barbara, and Christopher J. Harris. “Towards an Evidence Framework for Design-Based Implementation Research.” In Design-Based Implementation Research, edited by Barry J. Fishman, William R. Penuel, Anna Ruth Allen and Britte Cheng. National Society for the Study of Education Yearbook, 112(1), in press.
Means, Barbara, and William R. Penuel. “Research to Support Scaling up Technology-Based Educational Innovations.” In Scaling up Success: Lessons from Technology-Based Educational Improvement, edited by Christopher Dede, James P. Honan and Lawrence C. Peters, 176-97. San Francisco, CA: Jossey-Bass, 2005.
NYU Steinhardt School of Culture. “Research Alliance for New York City Schools.” http://steinhardt.nyu.edu/research_alliance/.
Palinkas, Lawrence A., Gregory A. Aarons, Bruce F. Chorpita, Kimberly Hoagwood, John Landsverk, and John R. Weisz. “Cultural Exchange and the Implementation of Evidence-Based Practices: Two Case Studies.” Research on Social Work Practice 19, no. 5 (2009): 602-12.
Palinkas, Lawrence A., and Haluk Soydan. Translation and Implementation of Evidence-Based Practice. Building Social Work Research Capacity, edited by Joan Levy Zlotnik. New York: Oxford University Press, 2012.
Penuel, William R., Cynthia E. Coburn, and Dan Gallagher. “Negotiating Problems of Practice in Research-Practice Partnerships Focused on Design.” In Design-Based Implementation Research: Theories, Methods, and Exemplars, edited by Barry J. Fishman, William R. Penuel, Anna Ruth Allen and Britte Cheng. National Society for the Study of Education Yearbook, 112(1), in press.
Penuel, William R., Barry J. Fishman, Ryoko Yamaguchi, and Lawrence P. Gallagher. “What Makes Professional Development Effective? Strategies That Foster Curriculum Implementation.” American Educational Research Journal 44, no. 4 (2007): 921-58.
Penuel, William R., and Barbara Means. “Using Large-Scale Databases in Evaluation: Advances, Opportunities, and Challenges.” American Journal of Evaluation 32, no. 1 (2011): 118-33.
Penuel, William R., Jeremy Roschelle, and Nicole Shechtman. “The Whirl Co-Design Process: Participant Experiences.” Research and Practice in Technology Enhanced Learning 2, no. 1 (2007): 51-74.
Penuel, William R., and Louise Yarnall. “Designing Handheld Software to Support Classroom Assessment: An Analysis of Conditions for Teacher Adoption.” Journal of Technology, Learning, and Assessment 3, no. 5 (2005): Available from http://www.jtla.org.
Plank, Stephen B., and Barbara Condliffe. “Pressures of the Season: A Descriptive Look at Classroom Quality in Second and Third Grade Classrooms.” Baltimore, MD: Baltimore Education Research Consortium, 2011.
Reiser, Brian J., James P. Spillane, Franci Steinmuller, Don Sorsa, Karen Carney, and Eleni Kyza. “Investigating the Mutual Adaptation Process in Teachers’ Design of Technology-Infused Curricula.” In Fourth International Conference of the Learning Sciences, edited by Barry Fishman and Sam O’Connor-Divelbiss, 342-49. Mahwah, NJ: Erlbaum, 2000.
Resnick, Lauren B., and James P. Spillane. “From Individual Learning to Organizational Designs for Learning.” In Instructional Psychology: Past, Present and Future Trends. Sixteen Essays in Honor of Erik De Corte, edited by Lieven Verschaffel, Filip Dochy, Monique Boekaerts and Stella Vosniadou. Advances in Learning and Instruction, 257-74. Oxford: Pergamon, 2006.
Roderick, Melissa, and John Q. Easton. “Developing New Roles for Research in New Policy Environments: The Consortium on Chicago School Research.” Chicago, IL: The Consortium on Chicago School Research, 2007.
Roderick, Melissa, John Q. Easton, and Penny Bender Sebring. “The Consortium on Chicago School Research: A New Model for the Role of Research in Supporting Urban School Reform.” Chicago, IL: Consortium on Chicago School Research, 2009.
Rojek, Jeff, Geoffrey P. Alpert, and Hayden P. Smith. “The Utilization of Research by the Police.” Police Practice and Research: An International Journal (2012): First published online on March 30, 2012 as DOI:10.1080/15614263.2012.671599.
Rojek, Jeff, Hayden P. Smith, and Geoffrey P. Alpert. “The Prevalence and Characteristics of Police Practitioner-Researcher Partnerships.” Police Quarterly (2012): First published online on April 11, 2012 as doi:10.1177/1098611112440698.
Rosen, Lisa. “Deconstructing the Rhetoric of Rigor: Evidence-Based Policy and the Politics of Representation.” Chicago: Center for School Improvement, in preparation.
Rosen, Lisa. “Examining a Novel Partnership for Educational Innovation: Promises and Complexities of Cross-Institutional Collaboration.” In Research and Practice in Education: Building Alliances, Bridging the Divide, edited by Cynthia E. Coburn and Mary Kay Stein, 71-93. New York: Rowman & Littlefield Publishing Group, 2010.
Sandoval, William A., and Philip Bell. “Design-Based Research Methods for Studying Learning in Context: Introduction.” Educational Psychologist 39, no. 4 (2004): 199-201.
Schlechty, Phillip C., and Betty Lou Whitford. “Shared Problems and Shared Vision: Organic Collaboration.” In School-University Partnerships in Action: Concepts, Cases and Concerns, edited by Kenneth A. Sirotnik and John I. Goodlad. New York: Teachers College Press, 1988.
Schneider, Rebecca M., and Joseph Krajcik. “Supporting Science Teacher Learning: The Role of Educative Curriculum Materials.” Journal of Science Teacher Education 13, no. 3 (2002): 221-45.
Shavelson, Richard J., Dennis C. Phillips, Lisa Towne, and Michael J. Feuer. “On the Science of Education Design Studies.” Educational Researcher 32, no. 1 (2003): 25-28.
Sherer, Jennifer Zoltners, and Alicia Grunow. “90-Day Cycle: Exploration of Math Intensives as a Strategy to Move More Community College Students out of Developmental Math Courses.” Palo Alto, CA: The Carnegie Foundation for the Advancement of Teaching, 2010.
Shojania, Kaveh G., and Jeremy M. Grimshaw. “Evidence-Based Quality Improvement: The State of the Science.” Health Affairs 24, no. 1 (2005): 138-50.
Shutt, Kari, Rachel S. Phillips, Nancy Vye, Katie Van Horne, and John D. Bransford. “Developing Science Inquiry Skills with Challenge-Based, Student-Directed Learning.” Paper presented at the Annual Meeting of the American Educational Research Association, Denver, CO, April 2010.
Smylie, Mark A., and Thomas B. Corcoran. “Nonprofit Organizations and the Promotion of Evidence-Based Practice.” In The Role of Research in Educational Improvement, edited by John D. Bransford, Deborah J. Stipek, Nancy Vye, Louis Gomez and Diana Lam, 111-36. Massachusetts: Harvard Educational Press, 2009.
Snow, Catherine E., Joshua Lawrence, and Claire White. “Generating Knowledge of Academic Language among Urban Middle School Students.” Journal of Research on Educational Effectiveness 2, no. 4 (2009): 325-44.
Strive Network. “The Strive Network.” www.strivenetwork.org.
Thering, Susan, and Victoria Chanse. “The Scholarship of Transdisciplinary Action Research: Toward a New Paradigm for the Planning and Design Traditions.” Landscape Journal 30, no. 1 (2011): 6-18.
Villavicencio, Adriana, and Justina K. Grayman. “Learning From ‘Turnaround’ Middle Schools: Strategies for Success.” New York: Research Alliance for New York City Schools, 2012.
Vozzo, Les, and Jack MacFadden. “A School-University Partnership: A Commitment to Collaboration and Professional Renewal.” In The Many Faces of School-University Collaboration: Characteristics of Successful Partnerships, edited by Ruth Ravid and Marianne G. Handler, 223-33. Englewood, CO: Teachers Ideas Press, 2001.
Vye, Nancy, Philip Bell, Carrie T. Tzou, and John D. Bransford. “Instructional Design Principles for Blending and Bridging Science Learning across Formal and Informal Environments.” Paper presented at the National Association for Research in Science Teaching Annual International Conference, Philadelphia, PA, March 2010.
Wallerstein, Nina, and Bonnie Duran. “Community-Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity.” American Journal of Public Health 100, no. S1 (2010): S40-S46.
Williams, David Dwayne. “The Brigham Young University-Public School Partnership.” In School-University Partnerships in Action: Concepts, Cases and Concerns, edited by Kenneth A. Sirotnik and John I. Goodlad, 124-47. New York: Teachers College Press, 1988.
Yarnall, Louise, Nicole Shechtman, and William R. Penuel. “Using Handheld Computers to Support Improved Classroom Assessment in Science: Results from a Field Trial.” Paper presented at the Annual Meeting of the American Educational Research Association, Montreal, Quebec, Canada, April 2005.


Page Left Blank Intentionally


THE PROBLEM WITH BRIEFS, IN BRIEF

Carrie L. Conaway
Associate Commissioner of Planning, Research, and Delivery Systems
Massachusetts Department of Elementary and Secondary Education
Malden, MA 02148
cconaway@doe.mass.edu

Abstract Policy briefs written by academics—the kind typically published in Education Finance and Policy—should be a crucial source of information for policy makers. Yet too frequently these briefs fail to garner the consideration they deserve. Their authors are too focused on the potential objections of their fellow academics, who are concerned with rigor and internal validity, instead of the objections of policy makers, who are concerned with generalizability, understandability, and utility. And researchers too often believe that simply publishing a brief is sufficient to communicate its results. By focusing briefs on topics on the policy agenda, helping policy makers see their constituents in the results, writing clearly, studying implementation and not just outcomes, weighing evidence and drawing conclusions, and reaching out to policy makers beyond publication, researchers have the greatest potential to see their work influence public policy.

doi:10.1162/EDFP_a_00096 © 2013 Association for Education Finance and Policy


THE STATE OF THE BRIEF Let’s be honest: Major education policy decisions are rarely based solely on research findings. But academic research can and does influence the policy debate. Indeed, as Kingdon (2003) noted in his seminal work on the policymaking process, ideas from academic literature are regularly discussed by federal staffers, bureaucrats, and lobbyists, and often find their way into legislation and policy. Policy briefs play a critical role by framing the relevant questions and distilling key findings. Briefs written by academics—the kind typically published in Education Finance and Policy (EFP)—should be a particularly crucial source of information, for in theory they offer just what policy makers want: the latest, most relevant findings and implications for policy, compiled by a party separate from the usual stakeholder groups. For policy makers barraged with information from competing, invested organizations, academic policy briefs should offer respite. Yet too frequently these briefs fail to garner the consideration they deserve. The problem is two-fold. First, the researchers who write the briefs are often responding to the wrong audience. They are worried about the “yeah, but. . .” responses of their fellow academics, who are concerned with rigor and internal validity, when they should focus on the “yeah, but. . .” responses of policy makers, who are concerned about generalizability, understandability, and utility. This state of affairs is no surprise. Most academics have little exposure to policy making, whereas they have all suffered through the indignity of a seminar in which they have been lambasted for neglecting to properly cluster their standard errors. But the unfortunate consequence is briefs that are less useful to the audience for whom they are intended. Second, too often academics suffer from the Field of Dreams delusion regarding research dissemination: “If we publish it, they will come.” Publishing a policy brief in EFP or elsewhere is a good start. But if the academic community aspires to influence public policy, it is obligated to do more. As the associate commissioner with responsibility for research and evaluation at the Massachusetts Department of Elementary and Secondary Education, I straddle the two worlds of research and policy. In this essay I provide my perspective on the most common “yeah, but. . .” comments I hear from my colleagues as they review research. For each concern, I also offer strategies researchers can use to counteract policy makers’ complaints and examples of policy briefs that have done so effectively. I conclude with suggestions for how researchers can extend the reach of policy briefs beyond journal publication.


“YEAH, BUT THIS POLICY ISN’T ON THE AGENDA RIGHT NOW” In the public sphere, nothing dooms research faster than irrelevance. If a state isn’t currently debating how to restructure its teacher retirement system, its lawmakers won’t be interested in the trade-offs between years of service and retirement payouts. If a superintendent isn’t considering implementing an extended school day, she won’t care whether the evidence illustrates a clear link to improved student outcomes. As Kingdon put it, “policy makers in government listen to academics most when their analyses and proposals are directly related to problems that are already occupying the officials’ attention” (2003, p. 56). Herein lies a major disconnect between researchers and practitioners. Social science researchers—particularly those whose methodological preference is for causal analysis—tend to focus on answerable questions, but what is answerable is at best imperfectly correlated with what is on the policy agenda. For instance, 1.6 million students are enrolled in charter schools nationwide, representing 3.3 percent of student enrollment (NCES 2012a). Yet an ERIC search reveals 60 journal articles referencing charter schools published in 2011 alone. Compare this to 232 articles on No Child Left Behind (NCLB) or 603 articles on accountability, both policies that affect all students. That’s about a 1:4 ratio for articles on charters to articles on NCLB and a 1:10 ratio for accountability, versus a 1:30 ratio for actual student enrollment. Why so much academic work on charters? Although some researchers may claim to be analyzing charter schools to identify innovative practices that could potentially be applied elsewhere, or to study the influence of education markets on student outcomes, in the end I believe it comes down to answerability. Charter school enrollment is done on the basis of a lottery—that is, random assignment. Much as the Sirens of Greek mythology lured sailors with their enchanting voices, researchers are lured to the possibility of a convincing causal analysis based on random assignment to treatment and control conditions. Charter schools are certainly an important part of the education policy landscape and a topic worthy of analysis—indeed, my own office has sponsored several charter-related studies (see, e.g., Abdulkadiroglu et al. 2009; Angrist et al. 2012). But are they worth this disproportionate attention? I would prefer to see equivalent attention given to other priorities, such as strategies for school turnaround. In Massachusetts, we currently have 43 turnaround schools, enrolling nearly 21,000 students statewide in 2011–12, as compared with about 31,000 students in charter schools. Yet whereas we have a good idea about whether and where charter schools are working in Massachusetts, it’s difficult for us to determine whether our state turnaround strategies are working. For ethical, legal, and practical reasons we cannot randomly assign students to

the schools, nor can we randomly assign some of these schools to implement turnaround strategies while others do not. More research in this area would be useful, but alas, the question of which turnaround strategies are most effective is not easily answerable with traditional causal analysis techniques. The Sirens do not lure researchers here. The implication of this argument is not that researchers should work on unanswerable questions just because they are of interest to policy makers. Instead, I would like to see a broader notion of what constitutes an answer. In its search for causal perfection, the research community too often overlooks the opportunity to report other types of evidence—descriptive, correlational, qualitative, and so forth—that is relevant to the current policy agenda and could directly benefit policy making. For instance, in the area of school turnaround, it would be helpful to know if the rate of improvement in turnaround schools is similar to that in other low-performing schools not officially designated as turnarounds, or if any educational practices distinguish rapidly improving turnaround schools from those changing more slowly. Answering these questions may not definitively demonstrate that a turnaround designation improved outcomes, but these analyses are still quite valuable to policy makers wrestling with this challenging educational problem. As Randall Munroe said, “Correlation may not imply causation, but it does waggle its eyebrows suggestively and gesture furtively while mouthing, ‘look over there’” (Munroe, Undated). The best guarantee that a particular piece of research is actually needed, of course, is to design a research agenda in collaboration with policy makers themselves to ensure that it is well aligned to the anticipated policy agenda. Organizations like the University of Chicago Consortium on School Research—where Chicago Public Schools staff work alongside academics devising and executing relevant research to identify what helps students succeed—and the Education Finance Research Consortium in New York state, described elsewhere in this volume by Deborah Cunningham and Jim Wyckoff, rarely worry that they are producing information that will not be used, because the researchers are producing materials on topics and on a schedule tied directly to a local policy-making agenda. These types of collaborations, however, are rare. And even if they were common, there would still be a need for the policy briefs published in EFP, which aim to cover a policy question from a national perspective and to inform a broader policy agenda. But then the relevance challenge for these broad-based briefs is even greater, for they must cover a sufficiently general topic to be pertinent to many policy makers, yet still be focused and timely enough to influence their thinking. An example of an EFP brief that meets this mark is “An Introduction to ‘Early College,’” by Brewer, Stern, and Ahn (2007). Six years after its publication, many states and districts are still considering or

implementing early college opportunities as a way of increasing the rigor of the high school curriculum and to better prepare their students for college. Although additional evidence is now available on the impact of early college policies, I would still recommend this brief today as a starting point for policy makers familiarizing themselves with this topic. There is one exception to the relevance rule. On occasion, it is research itself that makes a particular issue relevant for the policy agenda. For instance, a report sponsored by my agency and the Boston Foundation (and later published in the Quarterly Journal of Economics) showed that Boston students who had won the lottery to enter a charter school were performing in some cases close to half a standard deviation higher on state assessments than their peers who had not won the lottery (Abdulkadiroglu et al. 2009, 2011). The rigorous analytical techniques used in the study, the fact that the researchers found similar results across several methodologies, and the consonance of the results with those of other recent studies of urban charter schools left little room for doubt that, on average, charter schools in Boston were producing stronger student outcomes than Boston public schools. This drew attention to the fact that the cap on the number of charter school seats in Boston had nearly been reached, which meant opportunities for students to enroll in these more successful schools were limited. Ultimately, a charter school enrollment cap increase became one of the education reform strategies featured in our state’s 2010 Act Relative to the Achievement Gap. Although our study was not the only reason for this policy change, having rigorous evidence available about the impact of Boston’s charters certainly played a role. There will always be room for new research that resets the public debate and pushes an issue onto the policy agenda. And policy makers themselves need to stretch occasionally beyond their immediate policy concerns to explore research that can help them understand the broader context within which specific issues arise and can spark ideas for future policy making. But if the goal is increasing the influence of research on policy making, for nearly all research work, greater attention to its relevance in the current policy debate will increase the likelihood of it actually being used. “YEAH, BUT THAT CONTEXT IS DIFFERENT FROM MINE” Focusing on a relevant topic is a good start, but it’s not sufficient to convince policy makers of the value of a piece of research, for context also matters. Policy makers will draw inferences from national studies, to be sure. But most convincing to them are studies with a local focus. Only local studies can fully account for the context of the community they serve. Policy makers wouldn’t state it this way, but this is essentially a question of external validity—the degree to which the findings in a particular study are

generalizable to other contexts. Most recent advances in research methodology have focused on improving internal validity, that is, the degree to which a measure measures what we intend it to measure—for instance, in the case of causal analysis, the degree to which a measure captures a causal relationship and not just a correlation. The use of local, focal control groups or sophisticated statistical methods such as instrumental variables or regression discontinuity serve to increase confidence that asserted causal relationships are real. But they also work to undermine external validity because they make results less generalizable. An effective strategy to defuse this “yeah, but. . .” is to disaggregate data. As the Sirens attract researchers to data supporting causal analysis, they also attract policy makers to data about their own communities. Why? Disaggregation facilitates comparisons, which put data in context and stir competitive spirits. State education policy makers eagerly await the release of the National Assessment of Education Progress results because the data allow them to compare their performance and progress against other states on an even playing field. More broadly, breakdowns of results by geography, urbanicity, or district or school, even if purely descriptive, grab policy makers’ attention by getting them engaged in examining the study’s findings for their communities of interest. Maps are particularly effective at this, as they provide a quick visual comparison across an entire geographic area. Researchers can further contribute in this area by using their expertise in analytical techniques to help policy makers choose appropriate reference groups. Policy makers often have a general sense of which states, cities, or schools are similar to theirs, but input from researchers in this area can identify referents of which the policy maker might have otherwise been unaware. For instance, my office created a District Analysis and Review Tool that published simple-to-use, comparative data for every school district and school in Massachusetts (Massachusetts Department of Elementary and Secondary Education 2012). As part of this project, we developed a method to identify the ten districts or schools most similar to the one of interest in terms of student demographics. This helped districts look beyond their geographic neighbors to districts or schools across the state that served similar students, some of which were achieving better results. National and international journals like EFP may encounter trade-offs with disaggregation because of their broad scope. Too much focal detail may increase relevance for a small slice of the audience but decrease it for many others, so authors must tread cautiously. Nonetheless, some EFP briefs have used this strategy to good effect. Two nice examples are the pieces on the state role in teacher professional development and education and in teacher compensation by Loeb, Miller, and Strunk (2009a, 2009b). In both briefs,

most of the tables and charts present data by state to facilitate comparisons and provide context for the types of policies pursued in each state. The first thing I did when I read these papers was to look for Massachusetts in each table. Sirens, I hear your call. “YEAH, BUT I DON’T UNDERSTAND THIS $@%* THING” Perhaps the most common complaint leveled at the academic community is that it values a style of writing that leaves other human beings baffled. It is perfectly rational, though perhaps not desirable, for researchers to write this way for their fellow academics, who speak the same language. But it is inexcusable for them to use this style with policy makers, for it breaks a cardinal rule of writing: Know Your Audience. Much of what there is to say about high quality writing in the social sciences has been said elsewhere, in my view most notably in economist and rhetorician Deirdre McCloskey’s wonderful little book, Economical Writing (McCloskey 2000). But writers of policy briefs who wish for their work to have influence with policy makers would do well to emphasize the following elements of style. Most critical are a brief’s introductory and concluding materials. Timothy Taylor, long-standing editor of the Journal of Economic Perspectives, writes, “Introductions of papers are worth four times as much effort as they usually receive. The opening paragraph of each main section of a paper is worth three times as much effort as it usually receives. Conclusions are worth twice as much effort as they usually receive” (Taylor 2012, p. 32). Why? Readers tend to stop reading if they aren’t immediately drawn in by the content. If the summary materials are poorly written, no one will read anything further. Summary materials should highlight the one or two facts—specific pieces of evidence or data—the author most wants a policy maker to remember. For example, a 2008 study by my agency found that 37 percent of Massachusetts public high school graduates who went on to Massachusetts public colleges or universities took at least one remedial course in their first semester (Massachusetts Board of Higher Education and Massachusetts Department of Elementary and Secondary Education 2008). This fact was not previously known and was featured throughout our report. It has since been cited frequently by the state’s secretary of education and the governor when speaking about the need for a stronger focus on college readiness, and it also contributed to a decision by our state’s Board of Higher Education to increase the entrance requirements to the state’s public four-year colleges to create a greater incentive for students to take a rigorous course of study in high school. By making this critical information available and easy to understand, our study influenced the public discourse and helped improve outcomes for our state’s students.


Next, avoid jargon. In briefs published in EFP, I have seen terms such as “counterfactual,” “value-added measures,” and “adequacy” (in the school finance sense) used with little or no explanation—often in situations where understanding that concept was critical to understanding the article itself. Even “standard deviation” is a concept that you cannot assume policy makers will understand. Further, including a sentence to define a technical term does not give you free license to then repeat that jargon throughout the brief. Describe the idea with simple, straightforward language that would be familiar to nonspecialists, and use those words when you need to refer to the concept again. If you cannot resist the temptation to include technical details, put them in a footnote or an appendix. Finally, no Greek letters. Enough said. “YEAH, BUT THIS STUDY DOESN’T HELP ME FIGURE OUT WHAT TO DO” One of policy makers’ greatest frustrations with academic research is that it tells them whether something worked, but not why. In Commissioner Mitchell Chester’s introductory letter to the Massachusetts charter school report mentioned earlier, which also examined the impact of another type of school (pilot schools) that shares similar autonomies, he praised the research but highlighted the multitude of questions it raised: “The results of this study are both statistically significant and educationally important. But they also open many further questions. What is causing the differences in performance we see between charters and pilots? What is it about charter schools that allows them to achieve such strong results, and how can their effective practices be more widely disseminated? How can pilot schools take better advantage of the autonomy they already have to produce improved outcomes? When is more autonomy a good solution for improving student performance, and when might other strategies make more sense?” (Abdulkadiroglu et al. 2009, p. 2). These are the types of questions for which policy makers crave answers, and ones that the research community is often not well positioned to answer. More studies on implementation would help. In my office, we have deliberately shifted our focus to put a greater emphasis on frequent, early feedback from evaluators to program managers about how to improve and sustain their work. This ensures that our analyses stay relevant to their agenda. We combine this with causal designs when we can; for instance, our five-year study of our state Expanded Learning Time program, done by Abt Associates, included both implementation research and an interrupted time series design with matched

comparison schools for analyzing student outcomes (Checkoway et al. 2012). We’ve come to realize that it is more important to get useful data, even if imperfect, to our program offices quickly than to implement a strong research design that doesn’t generate timely, relevant information. Equally as importantly, it would help if researchers would be more willing to use their judgment and experience to draw clearer conclusions about the policy implications of their work, particularly in the context of a policy brief. It is reasonable to be cautious about drawing too definitive a conclusion from a single study, especially one small in scale. But policy briefs by their nature typically review evidence from multiple studies, or from a single study with broad relevance. Briefs thus can push further with conclusions than might be legitimate on the basis of an individual study. They can emphasize points of consensus in a field, identify areas where the evidence is less solid, and put results in their proper context. Where results from multiple studies differ in their implications, policy briefs can help readers to appropriately weight the quality of evidence from each source. Thus, where the evidence merits, authors of policy briefs should not be afraid to draw conclusions; where the evidence does not support a conclusion, they should say so. A recent study that drew out policy implications particularly well was the Measures of Effective Teaching (MET) study from the Bill and Melinda Gates Foundation. MET set out to determine how to measure effective teaching, to better provide teachers with information and support about their teaching practice. The research team conducted a large-scale study examining alternative practices for educator evaluation that could provide useful feedback to teachers. They produced technical papers documenting the research methodology and results—but that wasn’t the end of their work. They also produced a policy and practice brief specifically designed for practitioners and included actionable recommendations, such as, “Observers should be expected to demonstrate their ability to generate accurate observations and should be recertified periodically” and “When high-stakes decisions are being made, multiple observations are necessary” (Kane and Staiger 2012, pp. 2–3). These recommendations were drawn directly from the evidence generated by the study—and practitioners listened. Our approach to educator evaluation in Massachusetts has been influenced in substantial part by the results of this work. For instance, our state regulations include student feedback as one source of evidence on teachers’ performance, largely because the MET study convinced my commissioner of the importance of including student feedback as a measure of teacher practice. THE FIELD OF DREAMS DELUSION One last “yeah, but. . .” cripples the effectiveness of policy briefs, but this one is invoked by researchers themselves: “Yeah, but I published my findings, so

people must know about them.” While the “If we build it, they will come” strategy may have worked for Kevin Costner’s character in Field of Dreams, it is not effective for getting research findings in the hands of policy makers. Once a brief has been published, the work has just begun. Policy briefs published in academic journals are particularly vulnerable to going unnoticed by policy makers, as most state agencies and school districts do not have easy access to those publications. If journals aim to have relevance for readers outside of universities, they should seek ways to expand their availability to those audiences. I commend EFP for distributing copies of this special issue to policy makers, as this is a useful first step in broadening access to this important work. But the research community needs to think beyond simply mailing copies of reports. A cautionary tale: Recently some researchers published a piece using Massachusetts education data in a major refereed academic journal. They asked the publisher to send my commissioner a copy of the issue—which he promptly forwarded to me without having read it. In states without research directors (the vast majority), it probably would have ended up in the recycle bin. When researchers make themselves available as a resource to policy makers, it increases the likelihood that their research evidence will actually be used. Engaging directly with policy makers also has the advantage of addressing the previously cited concern that many studies don’t help policy makers figure out what to do. Rather than researchers merely guessing what a policy maker might conclude from their work, they can instead collaborate to determine the policy implications of the findings and what additional research might be needed. Sustained collaborations like the University of Chicago Consortium on School Research and the Education Finance Research Consortium mentioned earlier are one of the most effective ways to engage with policy makers, but other approaches can still reap benefits. Working with policy makers could take the form of in-person briefings, special analyses conducted for an agency or legislator, or participation as an expert in events where policy makers will be present. Sometimes this is as easily arranged as calling a commissioner or superintendent to request an opportunity to share a relevant study. Even better, offer a briefing ahead of public release to provide an opportunity to ask questions and prepare a response. This is particularly effective when the study is directly pertinent to his or her jurisdiction or when the findings are likely to counter commonly held beliefs. Other times it may be easier to gain policy makers’ attention by speaking at public events put on by organizations that serve educators, for example, associations of superintendents or mathematics curriculum directors, or the Council of Chief State School Officers. If you’ve obtained data from a state or district, use the connections you have made with their staff—whether

in program offices or in research, assessment, and data reporting—as potential conduits of results back into the organization. Once you establish yourself as someone who has deep expertise and can communicate your work effectively, you will quickly find yourself in high demand. Not to be overlooked, however, is the role of independent research intermediaries, which compile secondary reports and briefs on findings published elsewhere and therefore give research findings a second chance at reaching a policy-making audience. Intermediaries often have the benefit of close ties to policy makers, as they typically have communications staff dedicated to disseminating findings through publications, events, briefings, social media, and other venues—a luxury for most academic journals. As a result, policy makers are more likely to pay attention to the work intermediaries publish. When our state convened a task force in 2010 to make recommendations to the state Board of Elementary and Secondary Education on a new framework for educator evaluation, we relied heavily on materials from intermediaries when we developed an annotated bibliography of the most pertinent recent reports on the subject for task force members. Materials from the Center on Great Teachers & Leaders (GTL Center, formerly National Comprehensive Center on Teacher Quality) alone composed ten percent of the reports provided to the task force, making it the single most common source of information we provided. Although the GTL Center’s work is strongly evidence-based, it does relatively little original research, rather reviewing and analyzing others’ findings to identify proven and promising policies, strategies, models, and practices (GTL Center 2013). It is incumbent on policy makers, of course, to distinguish independent intermediaries such as the GTL Center from others that are backed by organizations advancing a particular policy agenda. Thankfully, making these distinctions is exactly what policy makers do for a living. Academic journals that aspire to reach policy-making audiences might consider collaborating on content and dissemination with a relevant intermediary, where one exists. The two could work together to agree on a research and publication agenda. Briefs could be concurrently published in both the academic journal and the appropriate venues from the intermediary, perhaps in different formats for various audiences. The researchers who produced the briefs could be available for events and briefings, giving them greater access to influence policy makers. And the intermediaries would gain the benefit of the credibility the experts bring to their communications efforts. A CALL FROM POLICY MAKERS Federal and state governments and local school districts have spent millions of dollars investing in data systems to “help states, districts, schools, educators,

and other stakeholders to make data-informed decisions to improve student learning and outcomes, as well as to facilitate research to increase student achievement and close achievement gaps” (NCES 2012b). In return for their investment, policy makers are increasingly demanding that the data they have invested so heavily in collecting be turned into evidence. Policy makers want to know whether their policies are working, why, and what they should do differently. They are looking to the research community to use data to answer these important questions and to report the results in an accessible, relevant way. And they want help from researchers in using their evidence, understanding the caveats, and drawing appropriate conclusions and inferences. Well-crafted policy briefs are likely to be the research community’s most effective response to this call from policy makers. By focusing briefs on topics on the policy agenda, helping policy makers see their community in the results, writing clearly, studying implementation and not just outcomes, weighing evidence and drawing conclusions, and reaching out to policy makers beyond publication, researchers have the greatest potential to see their work influence public policy. I thank Tom Downes, Heidi Guarino, Kieran Killeen, Lori Likis, Antoniya Owens, and the two anonymous reviewers for helpful comments on previous drafts. All opinions are my own and do not necessarily reflect the position of the Massachusetts Department of Elementary and Secondary Education.

REFERENCES
Abdulkadiroglu, Atila, Josh Angrist, Sarah Cohodes, Susan Dynarski, Jon Fullerton, Thomas Kane, and Parag Pathak. 2009. Informing the debate: Comparing Boston’s charter, pilot and traditional schools. Boston, MA: The Boston Foundation and the Massachusetts Department of Elementary and Secondary Education.
Abdulkadiroglu, Atila, Josh Angrist, Susan Dynarski, Thomas Kane, and Parag Pathak. 2011. Accountability and flexibility in public schools: Evidence from Boston’s charters and pilots. Quarterly Journal of Economics 126(2): 699–748. doi:10.1093/qje/qjr017
Angrist, Josh, Susan Dynarski, Thomas Kane, Parag Pathak, and Christopher Walters. 2012. Who benefits from KIPP? Journal of Policy Analysis and Management 31(4): 837–60. doi:10.1002/pam.21647
Brewer, Dominic, Stefanie Stern, and June Ahn. 2007. An introduction to ‘early college’. Education Finance and Policy 2(2): 175–87. doi:10.1162/edfp.2007.2.2.175
Center on Great Teachers & Leaders (GTL Center). 2013. About us. Available www.Tqsource.org/about.php. Accessed 5 February 2013.
Checkoway, Amy, Beth Gamse, Melissa Velez, Meghan Caven, Rodolfo de la Cruz, Nathanial Donoghue, Kristina Kliorys, et al. 2012. Evaluation of the Massachusetts expanded learning time (ELT) initiative year five final report: 2010–2011. Vol. 1. Cambridge, MA: Abt Associates.
Kane, Thomas J., and Douglas O. Staiger. 2012. Gathering feedback for teaching: Combining high quality observations with student surveys and achievement gains. Seattle, WA: Bill and Melinda Gates Foundation.
Kingdon, John W. 2003. Agendas, alternatives, and public policies, 2nd ed. London, UK: Longman Classics in Political Science.
Loeb, Susanna, Luke C. Miller, and Katharine O. Strunk. 2009a. The state role in teacher compensation. Education Finance and Policy 4(1): 89–114. doi:10.1162/edfp.2009.4.1.89
Loeb, Susanna, Luke C. Miller, and Katharine O. Strunk. 2009b. The state role in teacher professional development and education throughout teachers’ careers. Education Finance and Policy 4(2): 212–28. doi:10.1162/edfp.2009.4.2.212
Massachusetts Board of Higher Education and Massachusetts Department of Elementary and Secondary Education. 2008. Massachusetts school to college report: High school class of 2005. Boston, MA: Massachusetts Board of Higher Education and Massachusetts Department of Elementary and Secondary Education.
Massachusetts Department of Elementary and Secondary Education. 2012. District analysis, review & assistance tools. Available www.doe.mass.edu/apa/dart/. Accessed 11 January 2013.
McCloskey, Deirdre. 2000. Economical writing, 2nd ed. Long Grove, IL: Waveland Press, Inc.
Munroe, Randall. Undated. Correlation. Available http://xkcd.com/552/. Accessed 11 January 2013.
National Center for Education Statistics (NCES). 2012a. Digest of education statistics. Table 101: Number and enrollment of public elementary and secondary schools, by school type, level, and charter and magnet status: Selected years, 1990–91 through 2009–10. Available http://nces.ed.gov/programs/digest/d11/tables/dt11_101.asp. Accessed 20 October 2012.
National Center for Education Statistics (NCES). 2012b. State longitudinal data systems grant program. Available http://nces.ed.gov/programs/slds/listserv.asp. Accessed 5 February 2013.
Taylor, Timothy. 2012. From the desk of the managing editor. Journal of Economic Perspectives 26(2): 27–40. doi:10.1257/jep.26.2.27



Page Left Blank Intentionally


POLICY MAKERS AND RESEARCHERS SCHOOLING EACH OTHER: LESSONS IN EDUCATIONAL POLICY FROM NEW YORK

Deborah H. Cunningham
Office of Business and Support Services
Nevada Department of Education
Carson City, NV 89706
dcunningham@doe.nv.gov

Jim Wyckoff (corresponding author)
Curry School of Education
University of Virginia
Charlottesville, VA 22904-4277
Wyckoff@virginia.edu

Abstract Policy makers and researchers are intrigued with but also frequently frustrated by each other. Although these differences are understandable and predictable, it is clear that research on a variety of educational issues has been both influential and valuable in the development of policy and practice. There is much to suggest that researchers and policy makers should be collaborating to improve student outcomes. There are important instances of sustained collaborations between educational researchers and educational policy makers. We summarize some of these efforts but describe in more detail the Education Finance Research Consortium, a long-standing collaboration between university-based researchers and the New York State Education Department. Given the current intense focus on the role of evidence in the development of education policy, some of the lessons from this collaboration may be useful to those seeking to expand the use of evidence in policies intended to improve student learning.

doi:10.1162/EDFP_a_00095 © 2013 Association for Education Finance and Policy


POLICY MAKERS AND RESEARCHERS Policy makers and researchers are intrigued with but also frequently frustrated by each other. The reasons for this approach–avoidance behavior are many and varied, but what follows is not uncommon. Some policy makers find the evidence researchers produce appealing in policy development for a variety of reasons, not least of which is the potential that policies grounded in evidence are more likely to succeed. They often find, however, that much of the research is apparently contradictory, with researchers seemingly more interested in what policy makers see as tedious, third-order statistical details than in providing guidance on how to navigate conflicting findings. From their perspective, many researchers want their work to have relevance to social issues and to make a difference in the lives of others, but they become frustrated by the need (their inability?) to present their work so that it is accessible to policy makers, and with policy makers’ lack of appreciation for the difference between correlation and cause. These differences notwithstanding, it is clear that research on a variety of educational issues has been influential in the development of policy and practice. This usually occurs as policy makers draw on the existing research literature to inform policy, rather than through sustained collaborations between policy makers and researchers. This should not be too surprising as the incentives of policy makers/practitioners and researchers are typically not well aligned. Educational policy makers must respond to the press of current issues quickly in an understaffed environment. They rarely have the time or financial luxury of pilots and frequently don’t recognize the benefit of doing so in a way that is amenable to rigorous evaluation. Researchers often respond to the incentives of peer-reviewed academic publishing, which privilege novel findings and methodological rigor. Academics find it useful to selectively use the emerging wealth of educational data as fits this academic reward structure, which typically does not include informing the policy process. No wonder successful, long-standing collaborations of policy makers and researchers are rare. Thus, there is little or no give and take between policy makers and researchers, substantially compromising the immediate use of research findings in policy and preventing a shared understanding that would substantially enrich future work. There is much to suggest that researchers and policy makers should be collaborating to improve student outcomes. The U.S. Department of Education has rolled out an unprecedented set of policies and programs highlighting the development and use of data to inform education policy and practice. No Child Left Behind, State Longitudinal Data System grants, Teacher Incentive Fund grants, Institute of Education Sciences contracts and research grants, and the various flavors of Race to the Top all create incentives for the development or

use of data in state or district educational policy. Policy makers and researchers each have much to gain in collaborating on the design and availability of these data. State and local education agencies rightfully privilege administrative uses of data, often without considering how modest changes in the nature of data collection or its availability may substantially alter research opportunities. What are the models for how researchers and policy makers interact to improve policy? There are important instances of sustained collaborations between educational researchers and school district policy makers. The University of Chicago Consortium on Chicago School Research (CCSR), now located under the umbrella of the University’s Urban Education Institute, is probably the best example. Formed in 1990 to study the decentralization of public school governance and management resulting from the Chicago School Reform Act, CCSR has developed an active research agenda to inform Chicago Public Schools (CPS) policy. Two aspects of CCSR confound the typical policy maker–researcher interactions described earlier. Its design promotes a close working relationship with CPS. The theory of action is that policy-relevant research is facilitated by meaningful collaboration among policy makers, practitioners, and researchers. To wit, “the role of the researcher must shift from outside expert to interactive participant” (Roderick, Easton, and Sebring 2009, p. 3). In addition, the CCSR Steering Committee includes the school district CEO and representatives of important educational stakeholder groups in Chicago. CCSR has also developed a robust data archive that includes a twenty-year longitudinal database on CPS students and teachers, as well as biannual surveys of students, teachers, and principals begun in 1997 (Roderick, Easton, and Sebring 2009). This is a substantial investment by researchers, allowing them to expeditiously explore a variety of research topics in response to and in anticipation of policy discussions. Two other features of CCSR may account for its longevity and success. CCSR does not rely on the CPS for funding. Rather, it receives support from a long list of foundations, federal government agencies, and individual contributors.1 This broad base of support shields CCSR from the fluctuations of any single organization, especially public agencies such as school districts and state departments of education. Second, CCSR has a large staff of full-time analysts and researchers who are not tethered to the incentive structure typically confronting university faculty. CCSR does involve university faculty in its work but does not rely on these individuals to develop and maintain relationships with the district. Typically, affiliated faculty engage in research that is both useful to the district and appropriate for peer-reviewed publications, but this research does not address all of the research issues of interest to the district. 1. See http://ccsr.uchicago.edu/about/funders-supporters for a list of current supporters.


The Research Alliance for New York City Schools, created in 2008, modeled after CCSR, has produced some useful research but is still developing its research capacity. Other examples of collaborations between researchers and school district policy makers exist but are typically more recent and/or less structured as collaborations with policy makers.2 Collaborations between researchers and state educational policy makers are even more rare. We have been fortunate to be affiliated with one such collaboration, the Education Finance Research Consortium, and in what follows describe its basic structure and some lessons we and others learned through that process. We believe that given the current intense focus on the role of evidence in the development of policy some of these lessons may be useful to those exploring such collaborations.

EDUCATION FINANCE RESEARCH CONSORTIUM The Education Finance Research Consortium (EFRC) had its roots in symposia convened between 1992 and 1996. On three different occasions, senior policy makers in the New York State Education Department (NYSED) invited education finance researchers to work to develop a research symposium that addressed policy questions of interest to the New York State Board of Regents (the governing body for education in New York State) that were fertile for sound research.3 These symposia (Berne 1993; Monk 1995; Wyckoff and Naples 1997) brought together panels of educational finance researchers from across the country to write short policy briefs on various aspects of the symposium topic. Soon after the completion of the third of these symposia, James Kadamus, then Deputy Commissioner of Education, led an effort at NYSED to formalize the interactions of policy makers and researchers through the creation of the EFRC.

The EFRC was instantiated by a five-year budget from NYSED to a Governing Board composed of Robert Berne (New York University), David Monk (Cornell University), Jim Wyckoff (SUNY Albany), and Jim Kadamus (NYSED).4 In addition to continuing the biannual symposia, the initial agreement provided for annual condition reports and quick-response studies. EFRC would contract with individual researchers best suited to produce condition reports that were analyses of specific aspects of education finance. The quick-response studies provided a vehicle for EFRC to compensate researchers to work on short-term consultations with NYSED on a variety of tasks, such as advice on the use of cost indices in the development of state aid formulae. The topics for each of these products were determined by the Governing Board and thus provided a regular opportunity for researchers and policy makers and their staff to meet. This basic structure endured through a second contract and one-year extensions that allowed the collaboration to continue to 2011.5 Over this period, EFRC sponsored five symposia that contained a total of 42 policy briefs and 27 condition reports.6

2. Collaborations exist in Baltimore, MD (http://baltimore-berc.org/), Richmond, VA (http://merc.soe.vcu.edu/), Kansas City, KS (http://www.kcaerc.org/), Los Angeles, CA (http://www.laeri.org), Newark, NJ (http://nsrc.newark.rutgers.edu/), and San Diego, CA (http://sandera.ucsd.edu/).
3. The first symposium was largely attributable to Associate Commissioner J. Francis O’Connor; the next two were initiated by then–Associate Commissioner James A. Kadamus.
4. During this initial contract Leanna Stiefel succeeded Robert Berne on the Governing Board.

Design Principles

From the beginning, EFRC embodied a few design principles that we believed served us well:

(1) Collaborative research agenda. The choice of topics resulted from consensus from the Governing Board, and thus agreement among researchers and policy makers. This allowed for the confluence of important policy questions and issues for which there were credible research designs and data.

(2) Contextualized research. Researchers were provided with background information on the educational and political context of their work, so that these factors might influence their research designs.

(3) High-quality research. Individuals were chosen for research projects based on their expertise and research experience on similar topic areas. Within this group of researchers, the Governing Board actively sought diverse perspectives.

(4) Independent research. Once topics were agreed upon, researchers were free to pursue their work without oversight from policy makers or staff.

(5) Advance notice. All research was submitted in draft form to policy makers for their feedback and so that they would not be surprised by findings during public dissemination.

(6) Public dissemination. All symposia and condition reports were presented in public forums open to policy makers, legislators and staff, stakeholder groups, and the public, as well as by Web site and limited mailings. In addition, EFRC members would regularly brief the New York State Board of Regents on research findings.

These principles informed the way we structured EFRC interactions and research.

5. The Governing Board for most of the second contract included Jim Kadamus (NYSED), Bill Duncombe (Syracuse University), David Monk (Pennsylvania State University), Leanna Stiefel (New York University), and Jim Wyckoff (University of Virginia). Deborah Cunningham succeeded Kadamus upon his retirement from NYSED.
6. See http://www.albany.edu/edfin/ for a summary of the EFRC.


Policy Impact

Collaborations between policy makers and researchers will be more likely to be sustained when they are mutually beneficial. To explore the rewards of the EFRC collaborations we surveyed individuals who were actively engaged in the design of New York education policy over an extended period. We asked policy makers to comment on the influence of the Consortium on New York State education policy and their responses follow. Respondents included the former Chancellor of the Board of Regents; the Chair of the Regents Subcommittee on State Aid; the former Deputy Commissioner for Elementary, Middle, Secondary, and Continuing Education; and the former Deputy Commissioner for Higher Education.

James A. Kadamus, Former Deputy Commissioner for Elementary, Middle, Secondary, and Continuing Education, New York State Education Department:

The 1990s were a turbulent time for K–12 education in New York State. New State standards were being created. State assessments were being revised to match the standards. The teacher certification and evaluation process was changing. Public expectations for improving education across all populations of students and closing the achievement gap between wealthy and poor schools were at an all-time high. While this reform movement was underway, a major challenge of the school finance formula was being waged by the Campaign for Fiscal Equity (CFE). It became clear to senior officials in the New York State Education Department that independent research was needed to provide guidance on the best ways to restructure school funding to create greater equity and to ensure we had adequate funding for schools to implement the new reforms and improve student achievement.

The Consortium produced evidence on the impact of the school funding formulas on the inequities in school resources and the gap in student achievement. Much of this evidence was introduced during the CFE court case. When the CFE case was settled, Consortium research provided the basis for NYSED requests for new State funding increases of over $1 billion per year for multiple years. With new funding came new expectations for evaluating the impact of standards, assessments, and teacher policy reforms on student achievement and school effectiveness. Again, NYSED turned to Consortium researchers to provide the evidence.

The Consortium worked for a number of reasons:

(1) It was clearly a collaborative activity—both NYSED and the researchers had a stake in the outcomes. Having co-chairs helped define the agenda and set timelines for research studies that conformed to the State’s policy agenda.



(2) Commissioner Richard Mills made it clear in public space that the research produced would not be censored by NYSED. Whatever the evidence, it would be publicly shared. This decision created confidence that the work would be independent and respected and helped draw top talent to participate in the Consortium.

(3) The Consortium continuously scanned the national finance and education research environment and identified the best researchers on key agenda topics and brought them into the Consortium to conduct studies.

(4) Researchers agreed to work on short timelines, usually nine-month projects. The research brief format enabled researchers to produce studies that were timely and had an impact on policy decisions before the State Board of Regents and the Legislature.

(5) All research was vetted through an annual public Consortium symposium, widely attended by research, legislative and executive branch staff, education advocacy groups, Board of Regents members, and the media. This strategy brought attention to the research and raised expectations that the research findings would ultimately impact education policy.

Carl T. Hayden, Former Chancellor, Board of Regents:

From my perspective as Chancellor of the Board of Regents at the inception of the Education Finance Research Consortium, I can say that its work proved to be transformative. It provided the intellectual context and unimpeachable data that buttressed the Regents’ efforts to convince the Governor and the Legislature to invest in higher standards and in a more equitable allocation of state aid. I truly believe that the marked expansion of financial support for schools serving large numbers of at-risk students would not have occurred but for its efforts.

James R. Tallon, Jr., Member, Board of Regents, and Chair, Regents Subcommittee on State Aid:

As a member of the New York State Board of Regents, and as Chair of its Subcommittee on State Aid, I found special value in the work of the Education Finance Research Consortium. Three themes support that conclusion. The topics chosen by consortium researchers were developed in concert with analytic needs outlined in collaboration with senior leaders of the State Education Department. The range of subjects was broad, especially viewed over multiple years. School finance, labor markets, teacher preparation, and a range of timely topics addressed emerging issues in a timely way. Finally, the work was conducted with academic independence and rigor, but also with an eye toward presentation to public decision makers. Although ideology and anecdote will always play a role in public debate, continuing targeted and substantive research activity is essential to support responsible decision making.



Joseph P. Frey, Former Deputy Commissioner for Higher Education, New York State Education Department: From 1999 to 2011, I served in leadership positions within the New York State Education Department's Office of Higher Education (Assistant Commissioner, Associate Commissioner, and Deputy Commissioner). A critical responsibility that I had was to advise the Commissioner of Education and the New York State Board of Regents on issues relating to teaching policy. During this time period, all states were struggling with the balance between ensuring an adequate supply of teachers for all classroom assignments and the need to raise the standards for teachers entering the profession, recognizing the importance of the role of the teacher in improving student learning. In working to better understand the complex system of teacher quality and teacher supply and demand, I had the benefit of the research of the Education Finance Research Consortium. The work of Dr. Wyckoff and his colleagues helped shape policy discussions by the Board of Regents on critical elements of New York's teaching policy. The work of the Education Finance Research Consortium contributed to the Board of Regents' efforts to change regulations regarding teacher certification, teacher pre-service programs, and professional development toward the goal of strengthening teacher practice and improving student learning. The Education Finance Research Consortium was a tremendous resource for us.

Researchers received a variety of benefits from their participation. Many appreciated the opportunity to engage in a process that by its construction was likely to influence policy. Recall that the general areas of research resulted from a collaborative process that involved identification by policy makers of the importance of these topics to their policy agenda. Many also appreciated the opportunity to connect to policy makers directly through meetings and presentations at the public forums. The stipends associated with the work were modest, but they frequently afforded faculty the opportunity to hire graduate students to contribute to and learn from the process. There is some satisfaction that several of these graduate students became EFRC collaborators once they assumed faculty roles.

LESSONS LEARNED

Over the eleven-year period in which this model actively engaged educational policy makers and researchers in New York, we believe we learned some valuable lessons.

(1) Collaboration became a two-way learning process. An important, and unanticipated, benefit of our collaboration was the ways in which




policy makers and researchers schooled each other. Researchers developed an appreciation for the nuances and context of policy. Policy makers came to appreciate the benefits of strong research designs and the time necessary for good research. Both helped to strengthen the use of research in policy in the short and long term. Without the personal interactions at several points in the process that turned a transaction into a collaboration, these benefits may not have existed. This requires that both researchers and policy makers make room for what may appear to be an inefficient use of time, and it is likely a process that appeals to only a subset of researchers and policy makers. Our collaborations extended well beyond those typical of connecting researchers to policy makers, but they could have gone much further, especially had the Consortium continued during this period of developing teacher evaluation models, accountability assessments of the value teachers and schools add to student learning, and longitudinal data systems. The CCSR model provides some sense of these possibilities.

(2) Timing and respect are crucial. On occasion, researchers brought topics to the Governing Board for funding that were very appealing to researchers but not so to policy makers. At times this resulted in deferring study of a certain topic to a later time. On other occasions, the study proceeded as proposed. Working through these differences relied on the mutual respect and good will of the partners.

(3) Less is more. Policy makers have little time or patience for lengthy reports focused on theory, methods, and data that may be only indirectly relevant to their work. A one-page executive summary that highlights key policy-relevant findings will likely motivate them to read on. Even then, the longer policy brief is not an academic paper and should be written with the policy audience in mind. Policy makers want to learn more about their worlds and issues through data and can benefit from the conceptual framework that researchers often bring to their work. Using data in creative ways with strong research methods can help to create consensus about the nature and extent of problems. Keep discussions of methods in an appendix for the more technically minded policy reader. The art is in telling the story that is consistent with rigorous research findings. A picture is worth a thousand words, provided it can be understood readily.

(4) Good research takes time. We found that anticipating research questions and initiating the research collaboration early provided researchers with reasonable time for research. Over time, policy makers improved their ability to define the long-term policy agenda in ways that both benefited stakeholders and allowed more time for research. Likewise, researchers became more sensitive to the policy windows for some issues and improved their responsiveness.




(5) Data are crucial. We learned two lessons on the use of data—one we were able to implement, the other was not fully realized.

(i) There's no place like home. We found policy makers to be distrustful of data from other places and naturally concerned about the uniqueness of local and state circumstances. The issue of external validity is both perceived and real. Some differences are irrelevant to the research question at hand; others are not; and some perceived differences merely reflect researchers' limited knowledge of the local context. This is an instance where the collaborative process outlined here will school both policy makers and researchers on the relevance of policy context.

(ii) Data limit research. There were occasions where the lack of appropriate data either limited analysis or took some policy questions off the table. EFRC relied primarily on the extant data available from NYSED. Although these data were very good by the standards of most states during this period, New York State, for example, never had a student-level database to use in outcome-based research. Likewise, administrative data were inadequate to address some questions. EFRC researchers collected additional data, but these data were never compiled or made available by EFRC in any systematic way. The current focus of states on longitudinal data systems that connect the outcomes of individual students to their teachers provides a large potential for researcher–policy maker collaboration to address a variety of policy issues.

(6) Collaborations are fragile. NYSED did not re-issue the contract for the EFRC when it expired in 2009, but it did extend the collaboration two additional years under a contract to explore various aspects of New York charter schools. The factors that led to the dissolution of EFRC are also instructive and highlight the more general challenges of sustaining policy maker–researcher collaborations.

(i) Dependence on key stakeholders. As described earlier, EFRC developed as a result of policy makers and researchers who worked outside of the prevailing incentive structures in which each operated. In his position as Deputy Commissioner, James Kadamus linked EFRC to school finance policy and operations from 1995 to 2006, when he retired. He collaborated with a group of researchers who shared his vision and involved key members of his staff. As a result, EFRC represented a robust connection of research to policy during this period. Although EFRC was sustained beyond 2006, it never had the same connection to key policy makers. Commissioner Mills, who had been




very supportive of EFRC, retired in 2009, and Joseph Frey, Deputy Commissioner for Higher Education, retired in 2010. It is also true that some of the researchers who had been active EFRC participants made career transitions during this period as well.

(ii) Changing fiscal context. The environment for education policy was very different in 2009 than it had been in the mid-1990s. The financial crisis made it very difficult for NYSED to continue to fund the Consortium as the agency sustained substantial cuts to its budget. Although some projects received financial support from other sources, many depended on Consortium funding from the NYSED. This dependence, combined with the leadership changes described here, made EFRC vulnerable to shifting priorities. In short, the Consortium had not become an independent part of the culture of education policy in New York and was a casualty of changing personalities and priorities.

LOOKING FORWARD

Although EFRC did not persist, we are optimistic about policy maker–researcher collaborations. The incentives for policy makers and researchers to collaborate have never been greater. Policy makers are under substantial pressure from the public to improve student outcomes, and rigorous research has the potential to provide insights into how to do so efficiently. Researchers are gaining access to data that provide them with exciting opportunities to explore interventions to improve student outcomes. Our experience suggests that true collaboration between policy makers and researchers offers the best opportunities to improve policy and practice. As we have suggested, that implies a model that characterizes few existing policy maker–researcher relationships. Because of the current context, however, there is good reason to believe that such partnerships will develop at a much quicker pace. State and district panel data on students and teachers are increasingly available, and policy makers are turning to researchers to provide the analytic wherewithal to assist them in leveraging these data for policy. There appears to be renewed interest in education finance, especially as it may be designed to create incentives for improved student achievement. Researchers are increasingly aware that an important measure of success is the extent to which policy makers consider and use research findings to inform policies. The Institute of Education Sciences recently initiated a grant program that supports collaborations between researchers and practitioners aimed precisely at the type of collaborations discussed here. The Request for Applications highlights not only the factors that initially brought policy makers




and researchers together, but also how these parties will sustain the culture of collaboration. Whether the incentives in the current context are sufficient for researchers and policy makers to alter past behaviors is unclear. We believe that sustained collaborations focused on important issues of policy have the potential to meaningfully improve the quality of work by both researchers and policy makers in ways that may result in reduced costs to taxpayers and improved outcomes for students.

We appreciate comments on an earlier draft from Bill Duncombe, David Monk, and Leanna Stiefel.

REFERENCES

Berne, Robert. 1993. New York State Study Group on outcome equity. Albany, NY: New York State Department of Education.

Monk, David. 1995. Study on the generation of revenues for education, final report. Albany, NY: New York State Board of Regents.

Roderick, Melissa, John Q. Easton, and Penny Bender Sebring. 2009. The Consortium on Chicago School Research: A new model for the role of research in supporting urban school reform. Available http://ccsr.uchicago.edu/sites/default/files/publications/CCSR%20Model%20Report-final.pdf. Accessed 15 January 2013.

Wyckoff, Jim, and Michelle Naples. 1997. Educational finance to support high learning standards: A synthesis. Economics of Education Review 19 (4): 305–18. doi:10.1016/S0272-7757(00)00005-4.



Scaling-up Brief

September 2013, Number 3

Dean Fixsen, Karen Blase, Rob Horner, Barbara Sims, & George Sugai 

Students cannot benefit from education practices they do not experience. While this seems obvious (and it is), education systems have yet to develop the capacity to help all teachers learn to make good use of evidence-based practices that enhance the quality of education for all students. The purpose of this Brief is to provide a framework that state leadership teams and others can use to develop the capacity to make effective, statewide, and sustained use of evidence-based practices and other innovations.

Scaling Up

The significant investment in attempts to improve education will be "worth it" if it helps further the education of students and benefit their families and communities. As a benchmark, "scaling up" innovations in education means that at least 60% of the students who could benefit from an innovation are experiencing that innovation in their education setting. For example, 60% of all K-3 teachers in schools in a district are using an effective approach to teaching reading. To purposefully achieve educationally and socially significant outcomes for at least 60% of the millions of students in the USA requires changes in education practices and the development of implementation capacity to support those practices in education systems in every state.

Scaling relies on the knowledge base for implementation science, a field that has grown exponentially in recent decades. Implementation science helps to explain why only some education improvement efforts succeed and why only some improvements are sustained. The Formula for Success reflects the growing science of implementation.

[Figure: The Formula for Success]
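The formula graphic itself does not reproduce in this copy. Based on the three components discussed in the next section, it can be sketched roughly as the multiplicative relationship below; the component names follow the Brief's own discussion, and the exact wording in the original graphic may differ.

\[
\text{Effective Instruction} \times \text{Effective Implementation} \times \text{Enabling Contexts} = \text{Educationally Significant Outcomes}
\]

Because the relationship is multiplicative rather than additive, a weak or missing component cannot be compensated for by strength in the others; this is the point elaborated in the discussion that follows.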



The Formula points to three components that interact over time to produce intended outcomes. In an extreme case where Effective Implementation is zero, the Educationally Significant Outcomes will be zero. Similarly, if instruction methods are ineffective, the outcomes will be zero no matter how well implemented they are. In addition, well-implemented effective instruction methods are not sustained in the absence of engaged leadership in schools, districts, and the state. Thus, attention to any one or two components is insufficient. A substantial literature on student learning has accumulated over the decades to inform Effective Instruction. Efforts to create Enabling Contexts have been the focus of federal and state legislation for decades, especially since the advent of No Child Left Behind and the dramatic increases in federal funding for education.

Understanding of the implementation component has increased dramatically. Since the 1960s, implementation specialists and researchers have produced a deeper and more complete understanding of what it takes to purposefully produce significant outcomes on a useful scale. Implementation Teams bring expertise to support teachers' use of Effective Instruction, support administrators' efforts to establish hospitable environments for teacher instruction and student learning, and support leaders who engage in organization and system change specifically designed to create adaptive learning organizations.

To reach all schools and teachers, Implementation Teams help create readiness to change among staff, leaders, and administrators. Implementation Teams help establish data and communication links to bring about greater alignment and coherence among policies and practices. Eventually, every region in each state will have implementation specialists to augment the current work of curriculum and instruction specialists and administrative specialists to purposefully and proactively support effective instruction in each classroom, school, and district.

The capacity for scaling up innovations statewide is created by capitalizing on every opportunity to develop and institutionalize the implementation infrastructure needed to support the full and effective use of innovations. This brief outlines two key concepts, Transformation Zones and Implementation Teams, and the relationship of these structures and their attendant functions to successful scaling-up endeavors.

Transformation Zones

States currently dabble in the use of evidence-based practices and other innovations, often by funding pilots and demonstration projects. While pilot and demonstration projects are a necessary part of system change efforts, they unfortunately rarely lead to widespread or sustainable use. Part of the reason for these unfortunate outcomes is that most demonstration projects are focused only on interventions. They do not include making system changes (e.g., policy, funding, regulatory) or establishing implementation capacity to allow innovations and demonstrations to be deployed effectively. Better outcomes can be achieved by establishing innovations in designated "transformation zones" that focus on both innovations and infrastructure development.

A transformation zone can be thought of as a "vertical slice" of the education system. The "slice" is small enough to be manageable but large enough to include all aspects of the system. A transformation zone includes teachers and staff, important stakeholders and partners, key policy makers at the state level, and all components of the bureaucracy in between. The figure below provides a visual representation of the continual feedback loop that exists between policy and practice in a transformation zone. Transformation zones are used to establish simultaneously new ways of work (the intervention) and the capacity to support the new ways of work (the implementation infrastructure to assure effective use of the intervention). One without the other is not sufficient.

In a transformation zone, the intention from the beginning is to rapidly establish the operational value of the innovation and determine the infrastructure supports necessary for widespread use. Several conditions characterize this work:



The dual intention (innovation and  infrastructure development) is fully understood  and agreed upon at all levels (LEA, parent  groups, schools, district administrators, state  leaders). 

From the beginning, issues related to  sustainability, quality improvement, and  scalability are considered and decisions are  made with the future in mind (i.e., capacity  development is part of every decision and part  of every solution). 

Exceptions to current policy, funding, and  regulatory requirements are anticipated,  welcomed, and tested at the practice level with  respect to enhancing capacity building. 

Practice‐level feedback loops at each policy  level (e.g., school, district, state) are formalized  and built into communication protocols. Formal  assessment instruments are used frequently  and repeatedly to assess the fidelity of the  practices at the school level, the fidelity of the  implementation supports at the district level,  and the fidelity of the policy and continuous  improvement systems at the state level.  

Assessment data are used immediately to  assess progress and to inform action planning at  each level. 

Changes in the areas outlined above begin in  the first month or two (not a few years later  when a “demonstration” or “pilot” has  concluded) and continue until critical problems  have been solved, capacity has been built, and  system alignment within the transformation  zone has been achieved. 

As the work in a transformation zone becomes  successful, the zone is broadened to include a  larger “slice” of the overall system.  Within four  or five years the entire system is in the  transformation zone, and the innovation and  the implementation infrastructure are  embedded as standard practice in education. 

Capacity Development

As the value of an innovation is demonstrated in a transformation zone, the State actively supports capacity expansion and aligns current policies, structures, roles, and functions. As the transformation zone expands, the infrastructure expands to better support the effective use of the innovation in schools and districts in larger portions of the state. In a simultaneous bottom-up and top-down manner, every new policy sets the occasion for creating new capacity to effectively implement the policy with demonstrable benefits to students, families, and communities. New practices that are implemented set the occasion for discovering and creating the infrastructure supports, policy revisions, and funding streams needed to further develop and expand capacity. This leads to a never-ending cycle to sustain and improve both the innovation and the infrastructure supports for the innovation for years to come.

Successful scaling-up of evidence-based practices and effective innovations requires keeping the entire system in mind; directing capacity development efforts to appropriate levels; and connecting communication and data systems across these levels so a transformed system can emerge.

Capacity development for sustainable, quality implementation is the goal of the State Implementation and Scaling-up of Evidence-based Practices (SISEP) Center, funded by the U.S. Department of Education, Office of Special Education Programs (OSEP). In the SISEP active scaling states, innovations already are in use to further literacy and social and emotional well-being. The scale-up efforts are focused on these well-established innovations, which were initiated by the states based on their needs and desires for their students. SISEP's role is to help the states develop the capacity to make full and effective use of those innovations in classrooms across the entire state. Thus, the purpose of "scaling up" is to build on the good work that already has been initiated in each state in order to establish a general capacity for implementing a variety of evidence-based programs and other innovations with fidelity and good outcomes for students, families, and communities. While the work is funded by OSEP, capacity development is focused on the entire education system (general and special education).

Implementation Teams

The SISEP approach begins with a clear understanding that teachers and education staff members who interact with students are the key agents of quality education. This is where "education happens." Teacher and staff competency to "make education happen" relies upon initial and ongoing teacher preparation and professional development (e.g., selection, training, coaching, performance assessments) and organizational supports (e.g., decision support data systems, facilitative administration, system interventions) that are focused on making effective use of innovations and creating schools as learning organizations.

How can the capacity for professional development and practice improvement be developed, sustained, and improved over time? The SISEP vision for developing state capacity is focused, in part, on creating Implementation Teams that each concentrate on about 100 schools within a manageable geographic region to assure high-quality supports for teacher preparation and professional development and supportive administrative practices in every school. The goals of Implementation Teams are to provide the infrastructure needed to use best practices in implementation and systems change in order to support the widespread use of effective educational interventions selected by districts, schools, and communities. The intent is to establish a core infrastructure that can help integrate practice improvement initiatives and that can both take advantage of local and district strengths and anticipate and react appropriately to the multiple challenges faced by any scale-up effort. About 10 to 15 Implementation Teams will be needed to establish an adequate implementation infrastructure in the education systems of typical states.

The daily, weekly, and monthly communication and practice-based feedback systems among the various partners and stakeholders (e.g., teachers, building administrators, district superintendents and staff, unions, parents, advocacy groups, and State leaders) help to create an ongoing capacity for surfacing local, district, and system issues, and for solving problems by re-aligning resources in the education system as a whole. These feedback systems help to assure the continuing functional components of the Implementation Teams over generations of staff members providing education in the midst of continual changes in society.

Conclusion

Organized transformation zones and implementation teams currently do not exist in States. Thus, the capacity for making full and effective use of evidence-based programs and other innovations does not exist in State systems of education or other human services. The science of implementation, organization change, and system transformation is growing, and applied "best practices" have been identified.

Given the recent advances in knowledge, it is now possible for States to deliberately and systematically develop and make effective use of an implementation infrastructure to accomplish educationally and socially significant outcomes for children statewide.

About SISEP

Effective implementation capacity is essential to improving education. The State Implementation & Scaling-up of Evidence-based Practices Center supports education systems in creating implementation capacity for evidence-based practices benefitting students, especially those with disabilities. For more information visit us on the web at: http://www.scalingup.org

This content is licensed under Creative Commons license CC  BY‐NC‐ND, Attribution‐NonCommercial‐NoDerivs . You are  free to share, copy, distribute and transmit the work under  the following conditions: Attribution — You must attribute  the work in the manner specified by the author or licensor  (but not in any way that suggests that they endorse you or  your use of the work); Noncommercial — You may not use  this work for commercial purposes; No Derivative Works —  You may not alter, transform, or build upon this work.  Any of  the above conditions can be waived if you get permission  from the copyright holder, sisep@unc.edu . 



Policy Brief

CONSORTIUM FOR POLICY RESEARCH IN EDUCATION

AUGUST 2013 RB-55

Consortium for Policy Research in Education: University of Pennsylvania; Teachers College, Columbia University; Harvard University; Stanford University

How State Education Agencies Acquire and Use Research in School Improvement Strategies By Margaret E. Goertz, Carol Barnes and Diane Massell

Over the last two decades, state and federal laws and grant programs, such as state accountability policies, the No Child Left Behind Act (NCLB), Race to the Top, Title I School Improvement Grants, and State Longitudinal Data System Grants, have given state education agencies (SEAs) considerably more responsibilities for directing and guiding the improvement of low-performing schools. At the same time, they have pressed SEAs and school districts to incorporate research-based school improvement policies and practices in their statewide systems of support for low-performing schools, technical assistance for districts, professional development for teachers, and school improvement programs. Policymakers have urged SEAs to engage with organizations external to their own agencies to extend their strained capacity, and to help them collect and use research or other evidence (see, for example, Rennie Center, 2004).

A variety of organizations involved in this enterprise have emerged over the last two decades (Rowan, 2002). For example, the 2002 authorization of the Elementary and Secondary Education Act's (ESEA) comprehensive assistance centers was specifically designed to provide and encourage SEAs' use of research. Although studies of districts' and schools' use of research exist (see, for example, Coburn, Honig, & Stein, 2009; Daly & Finnigan, 2011; Farley-Ripple, 2012), we know little about how SEAs search for, select, and use research and other kinds of evidence in their school improvement strategies. While one might assume similarities in research use behaviors, both the organizational structures of SEAs and the population of external organizations with which they interact are quite different from those of schools and districts, and the most recent in-depth study of SEAs was conducted nearly 20 years ago (Lusi, 1997).

The exploratory study on which this brief is based was designed to fill that gap by examining: 1) where SEA staff search for research, evidence-based, and practitioner knowledge related to school improvement; 2) whether and how SEA staff use research and these other types of knowledge to design, implement, and refine state school improvement policies, programs, and practices; and 3) how SEAs are organized to manage and use such knowledge (Goertz, Barnes, Massell, Fink, & Francis, 2013).

To clarify the nature of evidence that SEAs sought and used, we distinguished among three types of knowledge in our data collection and analysis. We define research-based knowledge as research findings that have

University of Michigan; University of Wisconsin-Madison; Northwestern University

This Policy Brief was derived from the research report, State Education Agencies’ Acquisition and Use of Research Knowledge in School Improvement Strategies. Visit www.cpre.org/SEA to download a free copy.



been to varying degrees "collated, summarized, and synthesized," and then are presented in ways that provide empirical or theoretical insights or make them otherwise informative (Davies & Nutley, 2008). We include in this category published original research, research syntheses, summaries or meta-analyses, and evaluation reports. We also consider forms of research-based knowledge that are designed for use in practice; that is, models, programs, protocols, or other tools that embed research or research-based practices in guides to action. Older knowledge utilization models assumed that simply transmitting such knowledge to policymakers or practitioners would be sufficient to create change. But new models show that research-based knowledge alone is not sufficient to meet the needs of professionals using it. Integrating contextual, local, and practitioner knowledge with research knowledge is critical to developing "useable" knowledge to guide action (Honig & Coburn, 2008; Hood, 2002; Huberman, 1990; Lindblom & Cohen, 1979; Weiss, Murphy-Graham, Petrosino, & Gandhi, 2008). Therefore, we considered how the SEAs incorporated, along with research, other evidence-based knowledge, which we define as data, facts, and other information relevant to the problem of school improvement, such as formative feedback loops on implementation, and practitioner knowledge, which is the information, beliefs, and understanding of context that practitioners acquire through experience, in their decision-making processes.

The formal organizational structure of most SEAs has long been criticized for its hierarchical and segmented nature, and for its focus on compliance instead of on guidance and support for meaningful improvements in schools or districts. In her research study of two SEAs, Lusi (1997) argued that flatter, less segmented management structures could help build internal and external connections and produce the kind of

The research reported here was supported by a grant from the William T. Grant Foundation. This brief has been internally and externally reviewed to meet CPRE’s quality assurance standards. The opinions expressed are those of the authors and do not necessarily reflect the views of the William T. Grant Foundation, CPRE, or its institutional partners.

adaptive organization that would be more conducive to coherent improvement policies and the flow of knowledge. More recent studies in other settings confirm that flexible professional connections across traditional organizational boundaries improve problem-solving by using varied but relevant expertise (Dutton & Heaphy, 2003; Weick & Sutcliffe, 2001; Wenger, McDermott, & Snyder, 2002). Sociologists have long studied these kinds of connections, known as "social networks," to understand the diffusion of knowledge and innovation within and across organizations, including in more recent studies of schools and districts (Daly, 2010). A few researchers have used social network theory and methods to study state education policy networks (Miskel & Song, 2004; Song & Miskel, 2005). We applied social network perspectives and methods here to examine SEAs' communication structures, search and incorporation networks, and network properties, such as social capital, and to identify the most central "knowledge brokers" or influential knowledge sources in the research and other knowledge networks.

Our study included three SEAs that are located in different regions of the country and vary in size (from 250 to 500 staff), organizational structure, and school improvement strategies. Data for the study were collected between 2010 and 2012. We conducted in-depth interviews with high-level SEA staff involved directly in school improvement and in related programs (e.g., curriculum and instruction, accountability, special programs, teacher policy) and with a small number of leaders of external organizations that were central to research use in the SEAs. We also sent a web-based survey to all professional staff in the two smaller SEAs, and to all staff working in school improvement and related departments and a representative sample of other professional staff in the third SEA. Our analyses are based on a total of 62 interviews and 300 surveys in the three SEAs,1 as well as documents describing SEA school improvement policies and tools designed for district and school use. Respondents identified the offices, organizations, and individuals they turned to both within and outside their SEA when communicating about work, and

1 All survey respondents were asked whether their work related “in any way to improving low-performing schools and school districts” in their state. We used results from all respondents who answered “yes” to this question; that is, staff who self-identified as being involved in school improvement work regardless of the SEA office in which they worked.




more specifically, when seeking research, data, and practitioner advice on programs and practices targeted at improving low-performing schools and school districts.2 We used the survey data to identify internal and external sources of research and other types of knowledge, to analyze the size, strength, and configuration of the four networks, including patterns of cross-office or within-office communication, and to identify the most highly connected knowledge brokers and influential individuals, offices, or organizations in the networks. We drew on the direct, interpersonal networks of these influential individuals along with interview data to corroborate and interpret the broader network analyses, and to provide more specific information on the internal and external sources of research or other knowledge. Interview data and document reviews also provided more detail on the types and qualities of research or other forms of knowledge that SEA staff sought and used in school improvement decision-making, the research incorporation process, and institutional, political, or other factors that influenced research search, incorporation, and use.

The findings presented in this brief focus on how SEAs acquired and used research knowledge in their school improvement policies and programs. We did not examine the effects of their use of research or other evidence on improving school or student outcomes. Because we focused on research use for school improvement in a small number of states, our findings may not generalize to other SEAs or other education policy areas. Our study, however, is the first to systematically map information networks within SEAs and between SEA staff and external sources of support, and provides important insights into how SEA staff search for and incorporate research in their work.

The Structure and Strength of SEA Research Networks

Multiple SEA staff in the three study states actively searched for and were receptive to research ideas and related information from both within and outside their agencies. About 75% of the staff in each agency asked their SEA colleagues for research advice, while a little less than one-third turned to external organizations or individuals for similar information. In each SEA, some, but not all, of these staff named multiple colleagues, offices, and external organizations as sources of research information.

Contrary to the usual image of SEAs as "siloed" organizations, our network analyses showed considerably more cross-office connections than we anticipated given the literature on SEA structures. Respondents attributed cross-department communications to multiple factors, including state and federal accountability demands, competition for federal grants that required integrated proposals, reduced SEA staffing, and SEA leaders committed to more collaborative organizational cultures. For example, special education staff in State C perceived that NCLB accountability requirements had brought them into school improvement meetings in an unprecedented way. Other federal initiatives, such as the Race to the Top grants, stimulated cross-office search and exchange of research-based ideas in State B. Leadership in all three SEAs also facilitated cross-agency collaboration by creating cross-office teams to share information or work on common tasks or problems. States B and C were active participants in an Academy of Pacesetting States hosted by the Center on Innovation and Improvement (CII), a national comprehensive assistance center that convened cross-office state teams on a regular basis to share and discuss research or other topics related to statewide systems of school improvement. State A established cross-departmental task forces to design its system of tiered intervention and to manage and monitor performance on the SEA's major goals.

These broad, more informal cross-department and external connections facilitated the flow of research information and new ideas, but internal cross-department connections were weaker than those within departments. This suggests that colleagues within the same department or office engaged in more frequent interactions, and these had a greater impact on individuals' work than did the

2 We were unable to collect the names of individuals from whom the respondents sought research or other types of advice and information in State A and use office or department names instead.





cross-department connections. Thus, formal organizational structures still delineate many functional responsibilities and lines of communication.

While the school improvement research networks in each state included an array of staff from across the SEAs and numerous external organizations, only a small number of individuals, offices, and external partners were central actors in these networks. We identified the key actors or offices by mapping each participant's location in the networks (e.g., central or more peripheral), as well as the strength3 and direction4 of their connections to each other. Influential and well-connected individuals and offices tended to be clustered more centrally in the core of the network maps.5 "Influential" participants are those who were highly sought after for research information; that is, they were mentioned as a source of advice by many SEA staff. "Well-connected" participants are those who both sought research ideas and information from a range of sources, and, at the same time, provided information to a range of SEA colleagues, thus serving as knowledge brokers in the research networks. These two sets of individuals or offices tended to overlap considerably. Figures 1-3 below show the structure of the three SEA research networks, and the staff, offices, and external organizations involved in, and their location within, each network.

The directors of school improvement were the most influential and well-connected SEA staff in the research networks in States B and C (Figures 1 and 2). These directors and some of their staff (green nodes) were connected with several other salient offices related to school improvement. In State B, these key participants included leadership from the Assessment and Accountability Department (grey nodes), including staff in the Research and Evaluation Unit (red nodes), the Commissioner's and Deputy Commissioner's Offices related to academic matters (yellow nodes), ESEA program monitoring (aqua nodes), and a consultant from special education (pink node). We also found strong connections between the Curriculum and Instruction Office (dark green nodes) and the School Improvement Office, both of which were within the School Improvement Department. Two external organizations—a statewide professional membership association and the state's regional comprehensive assistance center (black shapes)—were centrally located and had stronger connections to the School Improvement Department than most other organizations in State B's research network, although the CII and a state university were also quite influential.

The directors of school improvement were the most influential and well-connected SEA staff in the research networks in States B and C (Figures 1 and 2). These directors and some of their staff (green nodes) were connected with several other salient offices related to school improvement. In State B, these key participants included leadership from the Assessment and Accountability Department (grey nodes), including staff in the Research and Evaluation Unit (red nodes), the Commissioner’s and Deputy Commissioner’s Offices related to academic matters (yellow nodes), ESEA program

In contrast to the other states, the Research Department was the most influential in State A (Figure 3). It resided at the center of State A’s research network, with multiple connections to most other departments in its SEA. This relatively large office plays multiple roles in its SEA: conducting research reviews for offices throughout the agency; helping program offices design, procure, and manage program evaluations; preparing data reports and briefings for accountability review teams; and, developing analytical tools. Some of this work is done in-house, and some, particularly

3 Strength of network connections was measured through a combination of the reported frequency of communication about research or other kinds of information, and the influence respondents perceived the resulting information to have on their work. Degree of strength was considered using a matrix ranging from a cell defined by highly influential/daily contact (200) to a cell defined by not influential/a few times per year contact (0.5).

4 An individual or office can be the seeker of research information (considered an "out-tie") or the named source of research information (considered an "in-tie"). To examine ties between SEA members and external members of networks, we used "out-ties" from our SEA respondents forming in-ties to external organizations, because we interviewed a subsample of external organizations but did not survey them.

5 We also identified the most central staff or offices using rank-ordered standardized centrality measures showing the percentage of all possible ties directed into sources of research (in-ties) as well as out from those seeking information (out-ties).




program evaluations and literature reviews, is contracted out. But the most well-connected research knowledge broker was the Accountability Department that conducts formal reviews and monitors the performance of low-performing school districts. The School Improvement Department and its offices, along with the Curriculum and Instruction Department, were also highly influential hubs of activity. Similar to State C, curriculum and instruction and school improvement are housed in separate departments, but though influential, the Curriculum and Instruction Department did not have many direct research connections with school

improvement in State A. Finally, perhaps due to the role of the Research Office, external organizations remained at the periphery of State A's research network.
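The brief does not include the analysis code behind these network maps. As a rough illustration of the kind of computation described in notes 3-5, the sketch below builds a small weighted, directed advice network and ranks actors by weighted in-degree (often-named sources, i.e., "influential") and weighted out-degree (active seekers, i.e., "well-connected"). The office names, the specific frequency and influence scores, and the use of Python with the networkx library are illustrative assumptions, not the authors' actual data or tooling.

```python
# Illustrative sketch only: rank actors in a weighted, directed advice network,
# loosely following the tie-strength idea in note 3 (frequency x influence,
# with products spanning roughly 0.5 to 200).
import networkx as nx

# Hypothetical scoring tables; only the two extreme products (200 and 0.5)
# come from note 3, the intermediate values are invented for illustration.
FREQUENCY = {"daily": 10, "weekly": 5, "monthly": 2, "few_times_per_year": 1}
INFLUENCE = {"high": 20, "moderate": 5, "low": 0.5}

def tie_strength(frequency, influence):
    """Combine reported contact frequency and perceived influence into one edge weight."""
    return FREQUENCY[frequency] * INFLUENCE[influence]

# Invented survey responses: (seeker, named source, frequency, influence).
responses = [
    ("School Improvement Office", "Research Office", "weekly", "high"),
    ("School Improvement Office", "Regional Comprehensive Center", "monthly", "high"),
    ("Curriculum and Instruction", "School Improvement Office", "weekly", "moderate"),
    ("Special Education Office", "School Improvement Office", "monthly", "moderate"),
    ("Research Office", "Regional Education Lab", "few_times_per_year", "low"),
]

G = nx.DiGraph()
for seeker, source, freq, infl in responses:
    # The edge points from the seeker (out-tie) to the named source (in-tie).
    G.add_edge(seeker, source, weight=tie_strength(freq, infl))

# Weighted in-degree approximates "influential" actors; weighted out-degree
# approximates "well-connected" seekers and brokers (note 5 describes standardized
# centrality measures; simple weighted degree sums are shown here for brevity).
influential = sorted(G.in_degree(weight="weight"), key=lambda kv: kv[1], reverse=True)
well_connected = sorted(G.out_degree(weight="weight"), key=lambda kv: kv[1], reverse=True)

print("Most sought-after sources:", influential[:3])
print("Most active seekers:", well_connected[:3])
```

A knowledge broker in the sense used above would score relatively high on both lists at once; on real survey data, the same computation could be run separately for the research, data, and practitioner-advice networks.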

Figure 1 State B, Research Network, Strength of Ties





Figure 2 State C, Research Network, Strength of Ties




Figure 3 State A, Research Network, Strength of Ties




External Sources for Research Knowledge


Although fewer SEA actors turned to external sources of research on school improvement, these organizations played a key role in the research that policymakers accessed and used. The SEA staff named a large and diverse array of external organizations—between 37 in State A and 42 in State B. These organizations were invited to present new ideas about school improvement, provided or collaboratively developed research-based tools and strategies, conducted evaluations, synthesized research on relevant topics, and more. As we discussed above, a subset of these external organizations played central roles in SEA research networks in States B and C.


The most commonly mentioned source of external research information was the federal government. (See Figure 4.) SEA staff turned to federal agencies, including several offices within the U.S. Department of Education, and federally funded centers, such as the ESEA comprehensive assistance centers, regional education labs, and other technical assistance centers that support the implementation of more targeted educational programs (e.g., special education or vocational education). The second largest external source of research advice was professional membership associations. These included: 1) associations focused on specific subject matter, or teaching and learning more generally, such as state and national reading, mathematics, and technology organizations, and the ASCD; 2) occupationally focused associations, such as the Council of Chief State School Officers or state-level associations representing superintendents, principals, and federal program administrators; and 3) regional organizations, such as the Southern Regional Education Board. Staff in the three SEAs, however, reached out primarily to national, not state, professional membership associations for research advice. Fewer SEA staff sought research from institutions of higher education or research organizations, although one to two universities did play an integral role in school improvement designs and delivery in States B and C, and SEA staff in State A often turned to university faculty for help on specific projects. SEA staff also turned to a different mix of federal organizations across the three states. For example, State A sought research advice more frequently from various regional education labs than did staff in States B and C, who relied more heavily on the regional and content centers within the comprehensive assistance center network.

Figure 4 External Sources for Research Knowledge

This variation in the states' external research networks reflected differences in stages of policy development, internal capacity, the structure of external environments, and prior partnership histories. For example, the basic components of State A's school improvement system had been in place for many years; the overarching design was not a target for major review. As a result, State A was more likely to pull in a range of external partners, on an as-needed basis, to co-develop very discrete and specific tools and resources. This process was facilitated by the SEA's robust research office.

State C, in contrast, was bracing for more and more schools to come within its purview for not meeting state or federal accountability standards in the midst of a very spare and declining SEA workforce. With limited research capacity and expertise in school improvement, SEA staff turned to the CII, as well as their own regional comprehensive assistance center and a state professional membership association, to help them redesign their supports and create a research-based infrastructure of tools to monitor and assist schools. Their engagement with these organizations was frequent and extended over many years. Similarly, having very limited capacity for


high school improvement and guidance, States B and C sought assistance from the National High School Center. State C, for example, used this Center's research and the work of a consulting group to develop an "early warning system" to identify students at risk of dropping out of school.

Much of the external search was driven by personal connections and prior work histories. While a few of the SEA staff whom we interviewed suggested they used a range of Internet and academic resources, many turned to their existing network of academics and/or memberships and affiliations with education organizations to access research and new ideas. SEA staff also turned to organizations with which the SEA had a history of doing work. For example, regional area education agencies and their respective membership association have long partnered with the SEA in State B in delivering services to schools. These regional government agencies pioneered strategies for improvement that State B later adopted as part of its statewide systems of support, such as a design for working with low-performing schools to model an improvement process. State C, which did not have an equivalent set of regional government partners, turned instead to a state-level professional membership association with which it had long worked in school improvement design and delivery.

Organizations that have a previous work history with SEAs are attractive sources of research-based knowledge because they often know the local context in which the SEA is working, are familiar with the strengths of SEA staff, and are viewed as credible sources of information. Some external sources of research were also viewed as more neutral purveyors of knowledge than the SEA, an asset if targeted schools were wary of the state reform agenda. Similarly, the expertise carried by external organizations or individuals could provide an outsider's perspective on whether state efforts were within the bounds of best practice—an important metric for state agencies and policymakers who are engaged in often uncertain work. For example, when State A undertook a review of its district and school improvement standards, it asked its regional education lab to identify the research underlying the proposed standards and their impact on schools. The SEA then compiled this research into a guide and posted it on its website so that, as

one staff person noted, “people understand they’re being held accountable to things that research tells us are important. But also so that…we can tie the assistance around some of what research is telling us.” State C sought out the advice and engagement of a university faculty member who had been a well-respected former superintendent, recognizing that his support would open the doors, and the minds, of other superintendents. Finally, external organizations frequently played an important role in synthesizing and packaging research to make it useable, and useful, to SEA work. Many of the SEA staff we interviewed expressed a desire to know more about research, but it was difficult for them to find time to stay current with the literature and to incorporate research directly into their school improvement work. An administrator in State A explained, “Because our work [in my office] is so huge, I’ve relied on consultants and organizations that can capture and summarize…research so that we can figure out, focus on how we’re going to use it to inform our work.” As we describe in the following sections, the more influential external organizations in SEAs’ networks brokered, jointly developed with states, or helped states adapt research-based, but useable tools that translated research into more specified guides for action.

Incorporating Research Within Core Networks

While the broad but relatively weak cross-department and external research networks facilitated the search process and the flow of diverse ideas in the SEAs, a set of well-connected, influential SEA staff brought research and other kinds of information from these different sources into stronger, smaller working groups that collectively addressed problems of school improvement. We conceptualized these groups as "core networks."6 These groups, which generally included leaders and other staff within school improvement offices, a few key external organizations and, in a very few cases, colleagues from other departments in the agency, enabled key SEA staff, who were typically central in the practitioner as well as research networks, to actually

6 Here we draw on work of Wenger, McDermott, and Snyder (2002) who studied “communities of practice” in other settings.




incorporate research into strategies that were workable in the context of their respective states.


As research has shown in other settings (see, for example, Barnes, Camburn, Sanders, & Sebastian, 2010; Honig & Coburn, 2008; Spillane, Reiser, & Reimer, 2002), SEA staff incorporated research into their school improvement strategies through a distinctly social process in which the network members interpreted, challenged, and otherwise made sense of research over time. During this incorporation process, the core network groups used local practitioners' feedback, state professionals' experience, and external partners' knowledge of relevant research to contextualize various research findings in light of their states' school improvement needs. In contrast to models of research dissemination in which generalized, primarily decontextualized findings advanced by researchers are transmitted to users, in these core networks users and a few providers worked collectively to adapt research to address particular problems and, in some cases, to co-construct new useable knowledge for guiding action.

One characteristic of the core networks was that a group of people came together to address common problems and goals. Key school improvement staff in States B and C who had developed strong ties to key liaisons in the federal comprehensive assistance center system were able to learn about relevant research and how to apply it from developers of school improvement models or promising practices rooted in that research, and from states that were puzzling over problems similar to their own. Not only could the SEA staff in these states receive research tailored to the particular school improvement policy needs they faced, but they could then see how it might be put into action by other SEAs who were early implementers.

Members of the core networks also worked together over time to develop, refine, and use a set of ideas, protocols, tools, and frameworks. In these instances, the research-based blueprints or models were adapted through core groups for use in the context of a particular state. State C, for example, turned to the CII's Handbook on Restructuring and Substantial School Improvement (Walberg, 2007) when redesigning its school improvement plans. The handbook contains a consolidated checklist of indicators for schools, districts, and teachers to use to



identify areas for improvement, a research-based guide to action that they found readily usable, and useful. The SEA, however, took a proactive stance with CII to help transform these materials into what SEA staff considered a more focused and useful format. Working with their own school improvement coaches (who are retired educators), CII staff, and their regional comprehensive assistance center, State C created a more streamlined set of indicators, cutting the number from the Handbook by half.

As this example suggests, the core networks included strong ties to networks of practitioners "on the ground" and in professional associations, as well as to the research networks. Practitioner networks provided feedback on how improvement strategies were working in the field, what needed clarifying, or what could be changed. A core group member in State B described how they used meetings with a group of regional school district school improvement staff as a sounding board for their strategies: "because they're in the schools doing school improvement with the local districts. And a lot of times we can say, 'Okay, here's what we're thinking….Is that too much? Is it not enough?'" Staff in State B turned to professional membership organizations as well for practitioner input. Professional membership organizations were central to State C's practitioner network, as were the comprehensive assistance centers. In contrast, the SEA in State A sought much of its practitioner advice from its districts and district networks.

SEA staff also perceived information, decision-making, evolving improvement strategies, and ideas to be more trustworthy and efficacious within the context of these core networks and collective work. For example, when asked if the people in his core work network have the expertise to find, and then use, evidence to successfully improve low-performing schools, an office director reported:

There is no one individual that holds all the information, which is why we have a group. . . all of those different people hold enough pieces that we can have conversations and share information across the table that can … push us along … to that ideal goal at the end.


Resources such as “communal memory” created a collective sense of efficacy and supported core network members such that no one person had to know everything. When asked if and why she trusted research and other school improvement information exchanged within a core network, one influential office director in State B said: “Because we digest it together. And people challenge each other.” She continued with an example of a similar process: “We solve problems. What are we going to do about this? . . . And people bring in research and we’ll table things and [then] come back to them with the research, and then we’ll challenge the research.” An influential leader in State C noted: “I want to validate what [researchers are] saying. When you have those strong networks, you build upon that professional knowledge and practice.”

Research Use

The SEAs in our study valued and used research to inform the design of their school improvement frameworks, processes, tools, and other forms of school improvement assistance. As our earlier examples show, however, SEA staff gravitated to research that they perceived as relevant to their context, actionable, feasible, and helpful in addressing pressing problems of policy and practice, sometimes called "research designed for use." In addition to searching for this kind of specific and context-relevant research, they also generated it by coupling research with practitioner knowledge.

The states confronted similar problems and challenges in their work, even though many features of their systems of supports for low-performing schools varied. For instance, at different points in time all three searched for research to create school improvement frameworks and/or planning processes that would be more effective in leveraging school or district change. All three sought research to help districts be more successful in managing the problems of their low-performing schools. And each used research to develop tools and processes that would aid the growing number of schools coming under the purview of accountability mandates.

While foundational documents, such as the school improvement frameworks, could contain extensive references to the literature—State B's framework contained 92 different citations, for example—SEA staff most often relied on research syntheses or research-based tools or strategies to provide very specific and concrete guides to action. For instance, State C drew on this type of research to overhaul its district and school improvement planning processes. The SEA viewed the plans submitted by low-performing sites as lacking a coherent theory of action and as premised on very localized notions of good practice rather than solid research. When they heard a presentation about the CII's Handbook, they seized upon it to solve these problems. As described in the preceding section, the SEA created a more streamlined set of indicators that they believed would be more doable for sites. They also encouraged the CII to create a web-based platform using the indicators to help them more efficiently and cost-effectively monitor progress and interact with local educators. Building on this positive experience, the CII and the regional comprehensive assistance center continued to be a major source of research used by State C to develop new tools and processes, such as "change maps," a process the state could use to differentiate its technical assistance to sites. It was developed by the regional center and built on research from Banathy (1996), among others.

State A brought together research with its own knowledge and local experience to develop a wide array of web-based school improvement tools and supports for its schools and districts. The SEA collaborated with its urban school district network to identify common problems and develop guides for addressing these problems. They created a Professional Learning Community Guidance Document in conjunction with the National Institute for School Leadership (NISL) and a professional working group from districts and schools. The document, which provides guidelines for developing and strengthening instructional teams at the school level, includes references to research-based curriculum units from NISL and related research and SEA resources for each stage of the process. Similarly, the SEA's school improvement, data, technology, and curriculum and instruction offices worked with a national consulting firm and five urban school districts to create a District Data Team Toolkit.




Drawing on a research-based data-driven inquiry and action cycle, the toolkit provides detailed modules and rubrics to help districts engage in inquiry and use data to inform district-level decisions.


The states similarly pulled in research or evidence that had rich, descriptive details of practice that they could pass on to schools or delivery providers. One SEA staff member in State B noted that John Hattie's book, Visible Learning: A Synthesis of Over 800 Meta-Analyses Relating to Achievement (Hattie, 2009), had "taken off like wildfire" with their school coaches, in part because it presents a research-based dashboard for comparing innovations, a tool these providers could use to help their schools evaluate needs and select effective practices. SEA staff in both States A and B used the research-based Instructional Rounds in Education (City, Elmore, Fiarman, & Teitel, 2009) to focus their own and their districts' school improvement strategies more specifically on classroom observations of instruction, student learning, and academic content.

State B was also trying to actively involve practitioners in recommending research-based resources. Educators could submit instructional resources aligned to the teaching and learning strand of the state's framework on its website. But they also had to submit their assessment of the scholarship underlying the proposed resource, such as a judgment about the quality of evidence, how "seminal" the research was, and whether it was confirmed by other studies or experts. The state includes a link to this research evidence next to the recommended practice. Such attention underscores how deeply reference to research has been embedded in the norms of these SEAs.

States also created their own original research by undertaking both formal and informal evaluations of their school improvement policies and programs. The extent to which the three SEAs conducted formal evaluations varied, however, depending on resources and internal capacity to select and guide external evaluations. SEA staff sometimes designed and conducted evaluations on their own, and other times they contracted with external partners. Under the direction of its Research Office, State A was the most involved in undertaking formal evaluations. State B contracted with national research organizations to evaluate its statewide system of support and its Title I School Improvement Grant program.



In contrast, State C was more likely to use measures of academic improvement in identified schools and informal educator feedback to assess program success. But all of the SEAs drew on external evaluations of instructional programs or practices to create lists of acceptable or recommended programs or vendors to assist schools and districts.

Summary and Implications

In summary, SEA staff in our three study states actively sought and were receptive to research. Contrary to a uni-dimensional model of knowledge utilization, where research users are viewed as passive recipients of published research, research use was a multi-dimensional process in our sites. Multiple SEA staff reached out to multiple internal and external sources of research. Incorporating research into policy and practice was often a social process, where SEA staff worked with each other, practitioners, and external partners to make sense of research and adapt it to their local context. And key brokers of research inside and outside the SEAs facilitated the research search and incorporation process. Decision-makers were more likely to seek and use research designed for use than published academic studies to guide their actions. They also understood that research, particularly from recognizable and trusted sources, lent credibility to their efforts and motivated practitioners. Although the findings reported here come from an exploratory study of only three SEAs in one policy area, they shed light on ways that SEAs and policymakers can strengthen research-based knowledge use in their organizations.


1. SEAs should draw on the infrastructure outside their boundaries, such as technical assistance centers, state and national professional membership organizations, other professional networks, and universities, to access research and research designed for use. This action will, however, require SEAs to develop a culture of research use, to build the capacity to broker research search and incorporation, and to assess the underlying quality of the research and research designed for use produced by these organizations. By cultivating multiple knowledge brokers within the SEA to access and circulate a diverse array of research, SEA knowledge networks will have access to a broader range of expertise and could also be less vulnerable to staff turnover.

2. SEAs should also nurture, identify, and connect knowledge brokers in their agencies and in external organizations who work on common problems. SEA staff could develop and lead ongoing networks involving research organizations, practitioners, and their own staff to solve specific problems and advance state policy. The New York State Education Department's Education Finance Research Consortium provides one example of this approach. Fostering working groups composed of influential SEA brokers, key research sources, and practitioners to adapt generalized findings into more usable information in the context of particular state problems can facilitate the incorporation of research into policy and practice.

3. Policymakers should encourage and support SEA evaluations of their own programs. These evaluations, particularly of the implementation of school improvement programs, provide critical feedback to agency staff. But SEAs often lack the human resources to design these studies and the fiscal resources to conduct them.

Our study also raised more questions than we could answer, given the limited scope of our inquiry. First, researchers should study the use of research and other types of evidence in additional SEAs and other education policy areas to see whether the findings reported here generalize to other SEA contexts. Second, research should examine connections among patterns of internal or external information flow within an SEA, the number and type of information sources, the types of evidence people access and use in decision-making, and consequences for policy and practice. Third, there is a need to assess the quality of research acquired by SEA staff and of the underlying research designed for use products. While many of the products identified in our study were written by or cited national experts, sometimes research was added in a fairly superficial manner. Finally, there is a major need to strengthen the knowledge base, which includes supporting more varied types of research on policy implementation and effects in understudied areas of education policy.




References

Banathy, B. H. (1996). Designing social systems in a changing world. New York: Plenum Press.
Barnes, C., Camburn, E., Sanders, B., & Sebastian, J. (2010). Developing instructional leaders: Using mixed methods to explore the black box of planned change in principals' professional practice. Educational Administration Quarterly, 46(2), 241-279.
City, E. A., Elmore, R. F., Fiarman, S. E., & Teitel, L. (2009). Instructional rounds in education: A network approach to improving teaching and learning. Cambridge, MA: Harvard Education Press.
Coburn, C. E., Honig, M. I., & Stein, M. K. (2009). What's the evidence on districts' use of evidence? In J. Bransford, D. J. Stipek, N. J. Vye, L. Gomez, & D. Lam (Eds.), Educational improvement: What makes it happen and why? (pp. 67-86). Cambridge, MA: Harvard Education Press.
Daly, A. J. (Ed.). (2010). Social network theory and educational change. Cambridge, MA: Harvard Education Press.
Daly, A. J., & Finnigan, K. S. (2011). The ebb and flow of social network ties between district leaders under high-stakes accountability. American Educational Research Journal, 48(1), 39-79.
Davies, H. T. O., & Nutley, S. M. (2008, September). Learning more about how research-based knowledge gets used: Guidance in the development of new empirical research. New York, NY: William T. Grant Foundation.
Dutton, J., & Heaphy, E. (2003). Coming to life: The power of high quality connections at work. In K. Cameron, J. Dutton, & R. Quinn (Eds.), Positive organizational scholarship (pp. 263-278). Thousand Oaks, CA: Berrett-Koehler.
Farley-Ripple, E. N. (2012). Research use in central office decision-making: A case study. Educational Management, Administration and Leadership, 40(6), 786-806.
Goertz, M. E., Barnes, C., Massell, D., Fink, R. E., & Francis, A. (2013). State education agencies' acquisition and use of research knowledge in school improvement strategies. Philadelphia, PA: Consortium for Policy Research in Education.
Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. London & New York: Routledge, Taylor & Francis Group.
Honig, M., & Coburn, C. (2008). Evidence-based decision making in school district central offices: Toward a policy and research agenda. Educational Policy, 22(4), 578-608.
Hood, P. (2002). Perspectives on knowledge utilization in education. WestEd.
Huberman, M. (1990). Linkages between researchers and practitioners: A qualitative study. American Educational Research Journal, 27(2), 363-391.
Lindblom, C. E., & Cohen, D. K. (1979). Usable knowledge. New Haven, CT: Yale University Press.
Lusi, S. F. (1997). The role of state departments of education in complex school reform. New York: Teachers College Press.
Miskel, C. G., & Song, M. (2004). Passing Reading First: Prominence and processes in an elite policy network. Educational Evaluation and Policy Analysis, 26(2), 89-109.
Rennie Center for Education Research & Policy. (2004). Examining state intervention capacity: How can the state better support low performing schools & districts? Boston, MA: Rennie Center for Education Research & Policy.
Rowan, B. (2002). The ecology of school improvement: Notes on the school improvement industry in the United States. Journal of Educational Change, 3(3-4), 283-314.
Song, M., & Miskel, C. G. (2005). Who are the influentials? A cross-state social network analysis of the reading policy domain. Educational Administration Quarterly, 41(1), 7-44.
Spillane, J., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition: Reframing and refocusing implementation research. Review of Educational Research, 72(3), 387-431.
Walberg, H. (Ed.). (2007). Handbook on restructuring and substantial school improvement. Greenwich, CT: Information Age Publishing.
Weick, K. E., & Sutcliffe, K. M. (2001). Managing the unexpected: Assuring high performance in an age of complexity. San Francisco, CA: Jossey-Bass.
Weiss, C. H., Murphy-Graham, E., Petrosino, A., & Gandhi, A. G. (2008). The fairy godmother and her warts: Making the dream of evidence-based policy come true. American Journal of Evaluation, 29(1), 29-47.
Wenger, E., McDermott, R., & Snyder, W. M. (2002). Cultivating communities of practice: A guide to managing knowledge. Boston, MA: Harvard Business School Press.


About the Authors:

Carol A. Barnes is a senior researcher at the Consortium for Policy Research in Education (CPRE) at the University of Michigan. Her research includes the macro and micro influences on school improvement in high-poverty settings, especially the relationship between policy, research, and instructional or leadership practice. Barnes' research on planned improvement ranges from close examinations of teachers and principals incorporating research or policy principles into their practice to evaluations of federal or state policies aimed at school improvement. She is currently studying how state education agencies are organized to manage and use evidence in their policies or practices to improve low-performing schools.

Margaret E. Goertz is a professor emerita of education and a senior researcher at CPRE in the University of Pennsylvania's Graduate School of Education. She specializes in the study of education finance and governance policy, and has conducted research on state and local implementation of Title I and the No Child Left Behind Act; the design and implementation of standards-based reform by states, school districts, and schools; federal and state accountability policies; and state and local district resource allocation. She is currently studying how state education agencies search for and use research in their school improvement strategies.

Diane Massell is a senior researcher at CPRE at the University of Michigan. Her research focuses on K-12 education policy, particularly the design and implementation of standards-based reform, federal supports for state education agencies, and policies and programs for school and district improvement. In addition to this research on how state education agencies are using research and other evidence in school improvement policies, she is reviewing state practices using federal School Improvement Grant resources to turn around low-performing schools, and consulting with a state education agency on reviewing and revising its statewide system of support.









About the Consortium for Policy Research in Education (CPRE)

Established in 1985, CPRE unites researchers from seven of the nation's leading research institutions in efforts to improve elementary and secondary education through practical research on policy, finance, school reform, and school governance. CPRE studies alternative approaches to education reform to determine how state and local policies can promote student learning. The Consortium's member institutions are the University of Pennsylvania, Teachers College-Columbia University, Harvard University, Stanford University, the University of Michigan, the University of Wisconsin-Madison, and Northwestern University.

The University of Pennsylvania values diversity and seeks talented students, faculty, and staff from diverse backgrounds. The University of Pennsylvania does not discriminate on the basis of race, sex, sexual orientation, religion, color, national or ethnic origin, age, disability, or status as a Vietnam Era Veteran or disabled veteran in the administration of educational policies, programs or activities; admissions policies, scholarships or loan awards; athletic, or University administered programs or employment. Questions or complaints regarding this policy should be directed to Executive Director, Office of Affirmative Action, 1133 Blockley Hall, Philadelphia, PA 19104-6021 or (215) 898-6993 (Voice) or (215) 898-7803 (TDD).



Volume 16, Number 2

August 13, 2013

ISSN 1099-839X

Whose Opinions Count in Educational Policymaking?
Joel R. Malin and Christopher Lubienski, University of Illinois at Urbana-Champaign

The success of some advocacy organizations in advancing their preferred policies despite questionable evidence of the effectiveness of those policies raises questions about what contributes to successful policy promotion. We hypothesize that some education-focused organizations are advancing their agendas by engaging media with individuals who may not possess traditionally defined educational expertise. Using two distinct expert lists, we examined relationships between measures of expertise and educational impact. We found non-significant positive relationships between these measures for a list of experts compiled by a conservative think tank, while a second list from a university-based center showed a significant positive relationship. We conclude that media impact is at best loosely coupled to expertise. This issue should be explored in greater depth, because deleterious outcomes are more likely if individuals are more successful in shaping policy discussion based on criteria outside of expertise.

Keywords: agenda setting, decision making, educational policy, expertise, information dissemination, political influences, politics

In recent years, American public education has been experiencing a period of notable pressure to adopt far-reaching reforms; buttressed by a familiar narrative of public school failure, specific and significant reforms have been advanced as necessary to improve student learning (Ravitch, 2010; Brill, 2011). For example, Race to the Top, a four-plus billion dollar federal competition sponsored by the U.S. Department of Education, has been designed to advance major, specific policies across the states ("Race to the Top Fund," 2012). A significant concern centers on the degree to which reforms are advanced on the basis of evidence versus, for instance, popularly held "common sense" notions, or viewpoints that are assumed to be true in elite policy circles. In this paper, we examine whose facts and opinions are most frequently cited in media accounts and by policymakers, and how this relates to these individuals' backgrounds and actual educational expertise. We hypothesize that many education-focused organizations in the U.S. are successfully advancing their agendas by using individual actors who may have more acumen in terms of media engagement than in terms of research expertise.

In the current environment of educational reform, we have seen the rise of many entities characterized by varying levels of expertise or reliance upon research. Debray-Pelot, Lubienski, and Scott (2007) provide depth and historical context in their analysis of the U.S. institutional landscape, focused on interest group politics around vouchers and other forms of school choice. In so doing, they present compelling evidence that a confluence of factors, including the No Child Left Behind Act of 2001 (NCLB) and the landmark 2002 U.S. Supreme Court ruling on the constitutionality of the Cleveland voucher program, has given rise to an increase in the number and activities of interest groups and to an increased complexity of interest group politics (Debray-Pelot et al., 2007). Still, subsequent local, state, and national political events underscore that issues surrounding school choice remain hotly contested.


These authors provide evidence that new opportunities, and efforts to counter them, ultimately have enhanced the importance and complexity of advocacy coalition efforts; coalitions, for instance, have formed between groups who share certain core beliefs but do not necessarily agree on secondary aspects (Sabatier & Jenkins-Smith, 1999). Meanwhile, unprecedented philanthropic dollars are being applied to these lobbying and advocacy efforts, in an effort to gain leverage over the direction of public spending and policy (Confessore, 2011). Thus, the current educational landscape is complex, and many individuals and groups have organized to make a policy impact. To what extent are these individuals and groups advocating reforms that are empirically supported? To what extent are they rigorously questioning their premises?

This is a critically important topic. For example, major reform efforts, such as charter schools, vouchers, and No Child Left Behind, are based significantly on the premise that private organizations are more effective at improving student outcomes (Lubienski, 2008). However, recent findings challenge this premise; for instance, Lubienski and Lubienski (in press) found public school achievement gains to be greater than those of demographically comparable Catholic school students. Likewise, Betts and Tang (2008) synthesized findings across 14 studies of schools' impact on student achievement. They limited their analysis to studies that utilized student-level data, and the median effect size (0.005) across these studies was barely distinguishable from zero (Ravitch, 2009; Raymond & Center for Research on Education Outcomes, 2009). Many others have pointed to examples of major reform efforts that appeared to be based more on perceptions and politics than on empirical evidence, including the effort to create comprehensive high schools, the small schools movement, the progressive education movement, and so forth (Bestor, 1953; Chubb & Moe, 1990; Conant, 1967).

When there is a substantial divide between the empirical record and policymaking, problems can result; for instance, regulation of carbon emissions is a current issue in which expert-generated knowledge is often trumped by political operatives (Oreskes & Conway, 2010; Specter, 2010). It is essential that we policy researchers consider both the empirical basis for such grand reforms and the political agendas advocating for these policies. One way to do this is to consider the degree to which advocates for such reforms have discernible expertise around the policies that they promote. An instrumentalist perspective on policymaking would suggest that individuals with true expertise around these policy issues are more likely to advocate for policies that are empirically substantiated; therefore, they are preferable as policy influencers to individuals who do not possess high levels of expertise (Davies & Nutley, 2008). Particularly when policies and related beliefs hold consequences for others' welfare, beliefs arguably should be substantiated and firmly grounded (Bridges & Watts, 2008).

The Role of Expertise in Education Policy

Kingdon (2003), in his classic work, explored policy and agenda setting. Although Kingdon does not focus on educational policy, his conclusions can be extrapolated to that context. The complexities of these topics are made eminently clear: multiple forces interact to determine which issues reach the agenda and which new laws subsequently survive the policymaking process. Scientific research is but one of many potential influences on the policy process, and may be a relatively less important factor among many. For instance, researchers/academicians were rated as "very important" by only 15% of insider respondents, whereas lobbyists were rated as "very important" by fully one-third of these respondents. Meanwhile, various uncontrollable events and conditions influence agenda setting. Thus, more powerful actors or events often overshadow scientists and their research (Gormley, 2011). Indeed, this is one of the commonly cited reasons why the relations between research, policy, and practice are tenuous (Granger, Tseng, & Wilcox, n.d.). Compounding this, at times policy windows open, presenting an opportunity for advocates of various proposals to push their favored solutions (Kingdon, 2003).

Currently, just such a policy window appears to be open in educational reform in the United States, owing largely to a prevalent viewpoint that traditional approaches are unacceptably ineffective for today's students. The federal government is incentivizing this process: Race to the Top has been designed to advance major, specific policies across the states ("Race to the Top Fund," 2012). Against this reform-happy backdrop, forces aside from research may be particularly formidable in the effort to advance favored solutions that may or may not be supported by robust research. Meanwhile, as noted previously, interest groups have increased in number, scope, and complexity, and philanthropic dollars have increased to unprecedented levels (Debray et al., 2007). Thus, as a result of several factors in the current educational climate, a policy window appears to be open; as such, a forceful and increasingly intricate effort to advance policy ideas ensues.

As policy advocates form a greater consensus around the notion that the "status quo" is ineffective, they may promote innovations (even untested ones) with increased vigor and persuasiveness. Further, research often presents rather nuanced viewpoints on topics, and circumstances like these arguably favor more black-and-white, confident "solutions." Following the Advocacy Coalition Framework (ACF; see Sabatier & Jenkins-Smith, 1999), it appears that for many it may be more important to secure changes aligned with core beliefs in any way possible than, for example, to simply produce well-researched but easily ignored policy briefs.




It may be more incumbent on would-be policy influencers to form partnerships and pursue strategic approaches than to carefully and painstakingly consult the literature on a topic, some of which may conflict with a more coherent and confident narrative that begs for change. Within such a context, "proofiness" techniques, in which manipulative statistical sleight-of-hand devices are employed in deceptive support of half-truths, are likely to be particularly prevalent (Seife, 2010). Educational think tank reports labeled "research," for instance, have been critiqued on several grounds, including that they present oversimplified viewpoints, lack firm research grounding, and/or are authored by individuals lacking sufficient expertise (Welner, 2011). Mucciaroni and Quirk (2006) found that, while evidence is often misrepresented in congressional debates, accuracy tends to be greater when issues are of low or moderate public importance or visibility. Also, even when research is utilized, it is often used simply to justify predetermined policy preferences (Whiteman, 1985). Therefore, research-policy relations are likely to be particularly tenuous at these times; an opportunity is present for some to capitalize upon certain reforms provided that sufficient force and persuasiveness are applied. Potentially, this could involve neglect of substantive empirical support.

Presently, a familiar narrative about schools is prevalent, and appears to be consistent with policy: American public schools, as presently structured, are failing (in spite of ample and increasing funding); teaching deficiencies are largely responsible for this, as powerful unions protect many poor teachers; and alternatives to public schools (i.e., charter schools) are necessary to save many students, particularly poor, often minority students in urban areas (Ravitch, 2010; see, e.g., Brill, 2011). The documentary film Waiting for "Superman" (Chilcott & Guggenheim, 2010) effectively advanced viewpoints such as these more deeply into the public consciousness; however, other groups vigorously counter these positions (e.g., see Ravitch, 2010). Interestingly, this discourse serves only to further fuel calls and support for changes, many of which are quite dramatic from a historical perspective, yet which appear to be rational once the presupposition that the present system is failing students is accepted; that is, not only must the system change within such a narrative, it must change quickly. Passion, in other words, may supersede reason when an urgent need for reform is popularly perceived.

In any case, the strength of the narrative suggests that some level of bias or power imbalance is in operation. If, for instance, a diversity of viewpoints regarding U.S. education were commonly presented, it would seem more likely that a more nuanced or wide-ranging set of views would be evidenced within the public (Bushaw & Lopez, 2012). We would expect similar outcomes if media regularly sought out education scholars on policy-related educational topics. Therefore, in this analysis we examine the extent to which those who are most frequently cited on these topics have training or acumen in research on the topics for which they are (self-)identified as experts.

Is there a discernible content bias (see McQuail, 1992), such that media disproportionately represent one side or position? We expect to be able to directly address the former question, while touching indirectly upon the latter one. To do so, we rely upon two recently constructed expert lists (Welner, Mathis, & Molnar, 2012; Hess, 2011) for names, while drawing upon several of the criteria of a third, educational impact list (Hess, 2012).

In an effort to offer a more ideologically diverse pool of sources for education reporters, Rick Hess (2011), of the conservative American Enterprise Institute, created a list of "Republican and/or Conservative (and/or libertarian) edu-thinkers" that could be solicited by journalists for expertise when writing about GOP proposals or candidates. Subsequently, Welner et al. (2012) created a list on behalf of the purportedly liberal-leaning National Education Policy Center (NEPC) with the intention of pointing out to reporters the names of individuals who can speak to "the overall knowledge base" in given areas of policy. This was likely predicated on a desire to enhance the quality of reporting regarding educational issues. The intents and purposes of these lists are substantially different; however, we expect that in combination the two lists yield a useful number and variety of educational experts and thinkers for the purposes of the current study, which aims to assess the current relationship between educational expertise and media impact. We rely on several variables described and utilized by Hess (2012) in the publication of his 2012 RHSU Edu-Scholar Public Presence Rankings, published annually in Education Week. Each is described in detail in the methods section.

The research question is significant from theoretical and pragmatic perspectives. A loose relationship between expertise and solicited media or policy input, for instance, would be suggestive of content bias, and would cast increased doubt upon the use of evidence to inform and influence these debates. This, in turn, would raise the question of what is, in fact, driving the solicitation and selection of experts or pundits by the media. Are media and/or policymakers selecting would-be commentators on the basis of "justification" of a particular viewpoint, or "balance" between apparently opposing viewpoints, above any desire to present an objective view? If, alternatively, a strong relationship is discovered, confidence in present media coverage and policymaking surrounding these issues might justifiably be increased or maintained. In either case, the vast and far-ranging implications of current national educational policy in the United States are clear. It is therefore important that we enhance our understanding of the strength of the current relationship between research evidence and both media coverage and policymakers' considerations.



Based upon the literature we have discussed above, we expect generally to find weak and/or negative relationships between our measures of expertise (independent variables) and our measures of media impact (dependent variables). The current climate surrounding education reform is more conducive to the solicitation of individuals better characterized as pundits than as individuals with bona fide expertise in relevant areas. Also, we expect to find distinctions in expertise and media impact between the two lists that we used to derive names; specifically, that Hess's (2011) list will show a pattern of individuals with higher impact and lower expertise scores, and that Welner et al.'s (2012) NEPC list, by contrast, will show the reverse. Such patterns are predicted based upon Welner's (2011) observation that conservative groups appear to be more willing to fund activities that directly engage with the political process. The individuals on Hess's (2011) list were explicitly chosen based upon conservative ideology, and therefore we might expect to find relatively more individuals whose primary concerns are oriented toward direct policy engagement or impact; by contrast, the NEPC list (Welner et al., 2012) is not explicitly weighted toward a particular political ideology but appears to be more heavily weighted toward college/university researchers and academics, whose primary concerns may be oriented more toward demonstrating expertise (e.g., scholarly publications), although they might be accused of representing a more liberal bias. The NEPC list is organized by policy topic and lists, for each topic, several individuals who could speak as experts regarding the quality of the research evidence related to a particular policy (Welner et al., 2012). Altogether, we expect that these comparisons may yield a clearer view of the type and quality of contributions to public opinion and policy within various forums.

Methods

In this study, we examined the extent to which media, including blogs, are soliciting or otherwise citing individuals with expertise in their coverage or examination of educational policy. In order to do so, we relied on two recently constructed expert lists and drew upon the criteria used within a third, while adding one criterion of our own: educational attainment, represented by highest degree earned. This process led us to identify ninety-seven individuals, none of whom were named on both lists; all individuals but one (95) were included in the final analysis, as we were unable to identify the excluded individual's educational background. We treated three criteria (education press mentions, blog mentions, and newspaper mentions) in combination as a dependent variable representing "media impact." We included blog mentions because we view blogs as a form of (social) media, and as one important means of making a media impact.

Hess (2011) also used each of these variables as measures of "impact." We treated four criteria (educational attainment, Google Scholar-listed publications, book points, and highest Amazon ranking) in combination as an independent variable representing "expertise." Our independent variable represents our attempt to quantify displays of expertise: educational attainment, scholarly articles, and books (both number and popularity). The book metrics may be limited in that not all books are meant to be scholarly, nor do they necessarily display expertise, since scholars in some fields tend to prefer other publication formats. To address this, we repeated our regression analyses using a narrower independent variable composed only of educational attainment and scholarly articles.

Our dependent variables represent our attempt to quantify media impact. Initially, we had intended to include citations in the Congressional Record, as had Hess (2011). However, we abandoned this variable because we reasoned that Congressional Record mentions are not clearly measures of media impact. Also, upon review we discovered that only a small percentage of individuals on our lists were mentioned in the Record during the period of this analysis. Education press mentions, blog mentions, and newspaper mentions in combination represented our effort to capture three of the most influential forms of media. We acknowledge the potential limitation of excluding other forms of media, such as television and radio. By examining the correlations between our independent and dependent variables, we aimed to ascertain the relationship between expertise and opportunities to weigh in on current educational policy debates.

Measures

Aside from the first measure to be described, educational attainment, we modeled all other measures after the approach outlined by Hess (2012). We departed from Hess only in that we modified the date ranges to be consistent with the timeframe of our study. Hess's study examined these measures up to December 21 or 22, 2011; in the present study, we used March 1, 2012 as our end date for all measures, except where otherwise noted. For all measures, as a final precaution we examined our obtained values in relation to the values that Hess (2011) reported. Although we used different date parameters and therefore expected to see somewhat different results, we reasoned that significant departures would raise the possibility of a flawed search. Also, for all measures we used middle initials in secondary searches for some individuals with relatively common names, in an effort to cull out same or similarly named individuals. Below, we provide substantial detail about each measure; the reader is advised to refer to Hess (2012) for more detail.

4



Independent Variables

Educational Attainment. Via Google searches conducted in early March 2012, we located and recorded the highest level of educational attainment of each individual. We treated Juris Doctor, Doctor of Philosophy, and Doctor of Education degrees as equivalent, and as the highest level of attainment possible. Next, we treated all Master's degrees as equivalent, followed by all Bachelor's degrees. These were coded as twelve, eight, and four points, respectively. We did not award points for coursework that did not eventuate in a degree. We assigned point values in this manner so that this metric would carry moderate weight in the combined "expertise" variable. Initially, we had planned to account for specific area of study, but abandoned this when we concluded that all members of the combined list had completed an area of study that was at least tangentially related to educational policy. The majority of individuals (83.15%) on the combined list had attained the highest level of educational attainment.

Google Scholar Metric. Up to March 1, 2012, we examined articles, books, or papers each individual had authored or co-authored, utilizing the following technique: First, the individual's name was entered under the "author" filter in an advanced Google Scholar search, limited to "Business Administration, Finance, and Economics" and "Social Sciences, Arts, and Humanities." Like Hess (2012), we took care not to count works by similarly named individuals; we inspected each record to ensure that the author listed was the same individual we were seeking, and we conducted secondary searches using the individual's middle initial in many cases. Descending each individual's works according to the number of times each was cited, we counted the number of works up to the point at which this number exceeded the cite count. For instance, an author who had three works that were each cited at least three times, but whose fourth work was cited three or fewer times, would score a three. This measure was intended to capture the breadth and impact of a scholar's work (Hess, 2012). On this measure, individuals' scores ranged from zero to 76 points (M = 17.39, SD = 16.58).
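The Google Scholar counting rule described above works like an h-index. A minimal sketch of that rule, assuming a plain list of per-work citation counts is already in hand (the function name and sample numbers are illustrative, not part of the authors' procedure):

```python
def google_scholar_metric(citation_counts):
    """Count works until a work's rank exceeds its citation count (h-index-style)."""
    score = 0
    # Rank works from most- to least-cited, mirroring the descending sort described in the text.
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            score = rank
        else:
            break
    return score

# Example matching the text: three works cited at least three times, a fourth cited three or fewer times.
print(google_scholar_metric([12, 7, 3, 3]))  # -> 3
```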

Book Points Metric. Up to March 1, 2012, we recorded the number of books each individual had authored, co-authored, or edited, based on an Amazon author search. Similar to Hess (2012), we awarded each person two points for a single-authored book, one point for a co-authored book in which he or she was the lead author (and one-half point if not the lead author), and one-half point for an edited volume. Also, like Hess (2011), we used an "Advanced Books Search" for the scholar's name, with the format restricted to "printed books" so as to avoid double counting. In a few instances, we also used middle initials as a method of avoiding duplication with same- or similarly-named authors. Only published and available books were included. On this measure, individuals' scores ranged from zero to 37.5 points (M = 5.44, SD = 7.00).

Highest Amazon Ranking Metric. As of March 15, 2012, we recorded the author's highest-ranked book on Amazon. Similar to Hess (2012), we subtracted the highest-ranked book from 400,000 and divided the resulting number by 20,000. In this way, each individual achieved a score between zero and twenty. We completed this measure from the Amazon site, searching for books written by each of the individuals and (if applicable) identifying the individual's top-ranked book. We included co-authored or co-edited books by the individual. We found that it was easiest to carry out the task by going to the individual's "Amazon author page," if it existed. This measure, Hess notes, is substantially volatile and biased in favor of recent works; however, we agree with his position that it nonetheless conveys useful information. On this measure, individuals' scores ranged from zero to 19.9 points (M = 2.48, SD = 5.17). The majority of individuals on the combined list (75.79%) achieved zero points on this measure.

Dependent Variables

Education Press Mentions. Like Hess (2012), we recorded the total number of times each individual was quoted or mentioned in either Education Week or the Chronicle of Higher Education. We counted quotes or mentions from the period between December 1, 2011 and March 1, 2012. Similar to Hess (2012), we divided the total number of appearances by two to yield a final measure. We searched by using each scholar's first and last name, using the search tool available at each site. On this measure, individuals' scores ranged from zero to 14.5 points (M = 1.72, SD = 2.96), with 24.2% of individuals earning zero points on this measure.

Blog Mentions. We followed Hess (2012) by recording the number of times a scholar was referenced in a blog between December 1, 2011 and March 1, 2012. Departing from Hess, we searched with a combination of the individual's name and several words linked to education, separated by "OR" connectors (Name AND education OR school OR schools OR learning OR reform OR charter OR vouchers). This was done in an effort to cull out references to similarly named individuals. Like Hess (2012), we divided the total number of references by four in arriving at a final figure, which we capped at fifty points. On this measure, individuals' scores ranged from zero to fifty points (M = 23.06, SD = 20.53). Twenty-six individuals (27.4%) earned fifty points on this measure.

Newspaper Mentions. Like Hess (2012), we used a LexisNexis search to record the number of times each individual was quoted or mentioned in U.S. newspapers. We used the date range of January 1, 2011 to March 1, 2012. Similar to Hess (2011), we divided the resulting number of mentions by four to yield a final measure per individual. On this measure, individuals' scores ranged from zero to 35 points (M = 3.34, SD = 5.64); 10.53% of individuals earned scores of zero points.
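Because every metric above is a simple arithmetic transform that is then summed into a composite score, the scoring can be expressed compactly. A minimal sketch under the stated rules; the function and field names are illustrative, and clamping the Amazon score to the 0-20 range is our assumption (the text only says scores fall between zero and twenty):

```python
def amazon_score(best_rank):
    # (400,000 - highest Amazon rank) / 20,000, kept within the stated 0-20 range (our assumption).
    return max(0.0, min(20.0, (400_000 - best_rank) / 20_000))

def book_points(single, lead_coauthored, other_coauthored, edited):
    # 2 per single-authored book, 1 per lead-authored, 0.5 otherwise, 0.5 per edited volume.
    return 2 * single + 1 * lead_coauthored + 0.5 * other_coauthored + 0.5 * edited

def media_impact(press_mentions, blog_mentions, newspaper_mentions):
    # Education press mentions / 2, blog mentions / 4 capped at 50, newspaper mentions / 4, summed.
    return press_mentions / 2 + min(50.0, blog_mentions / 4) + newspaper_mentions / 4

def expertise(attainment_points, scholar_metric, book_pts, amazon_pts, broad=True):
    # Broad expertise sums all four metrics; the narrower variant keeps only degree and articles.
    base = attainment_points + scholar_metric
    return base + book_pts + amazon_pts if broad else base

# Illustrative individual (made-up numbers, not study data):
print(expertise(12, 17, book_points(2, 1, 0, 1), amazon_score(150_000)))
print(media_impact(press_mentions=6, blog_mentions=120, newspaper_mentions=20))
```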



Results

Values of each of the four previously described independent measures were added for each individual to create a single independent measure of "expertise." Likewise, values of each of the three previously described dependent measures were added for each individual to create a single dependent measure of "media impact."

As noted previously, we hypothesized that we would find weak and/or negative relationships between our measures of expertise (independent measures) and our measures of media impact (dependent measures). In order to test our hypotheses, we performed several linear regression analyses using PASW® Statistics 18.0 software.
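For readers who wish to reproduce an analysis of this form, a single-predictor regression like the ones reported below can be run with a short script. The sketch uses Python and SciPy rather than PASW, and the data it generates are placeholders, not the study's scores:

```python
import numpy as np
from scipy.stats import linregress

def regress_impact_on_expertise(expertise, impact):
    """Simple linear regression; returns B, SE(B), R^2, F (df = 1, n - 2), and p."""
    res = linregress(expertise, impact)
    n = len(expertise)
    r2 = res.rvalue ** 2
    f = (r2 / (1 - r2)) * (n - 2)  # F statistic for a one-predictor model
    return {"B": res.slope, "SE(B)": res.stderr, "R2": r2, "F": f, "p": res.pvalue}

# Placeholder data standing in for the 95 combined-list individuals.
rng = np.random.default_rng(0)
fake_expertise = rng.uniform(0, 80, size=95)
fake_impact = 5 + 0.2 * fake_expertise + rng.normal(0, 20, size=95)
print(regress_impact_on_expertise(fake_expertise, fake_impact))
```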

Figure 1. The relationship between “expertise” and “media impact” for individuals identified on either list (combined list).

Figure 2. The relationship between “expertise” and “media impact” for individuals identified on the Welner et al. (2012) list.




First, we examined the overall relationship between our broad expertise variable and our impact variable. In partial support of our hypothesis, we found a non-significant positive relationship, such that increases in expertise were loosely and non-significantly associated with increases in media impact among these individuals; expertise explained just 2% of the variance in media impact (R2 = 0.02, F(1, 93) = 1.86, p = 0.18), as shown in Figure 1. Second, we examined the overall relationship between our more constrained expertise variable (encompassing only the educational attainment and scholarly articles measures) and media impact. In this case as well, we found a non-significant positive relationship, such that increases in this measure of expertise were loosely and non-significantly associated with increases in media impact among these individuals (R2 = 0.01, F(1, 93) = 1.08, p = 0.30). Third, we examined the relationship between expertise, broadly defined, and media impact for individuals named on the Welner et al. (2012) NEPC list. In this case, we found a significant positive relationship between our measure of expertise and our measure of media impact, R2 = 0.42, F(1, 60) = 12.87, p = .001.

Individuals on this list tended to score higher on the measures of expertise (M = 46.09, SD = 24.51) than on the measures of media impact (M = 23.08, SD = 23.53), as shown in Figure 2. Finally, we examined the relationship between expertise, broadly defined, and media impact for individuals named on the Hess (2011) list. In this case, we found a non-significant positive relationship, such that increases in expertise were loosely and non-significantly associated with increases in media impact, R2 = 0.28, F(1, 31) = 0.88, p = 0.36. Also, we found a non-significant positive relationship between the narrow measure of expertise and media impact, R2 = 0.06, F(1, 31) = 1.90, p = 0.18, as shown in Figure 3. In contrast to the prior list, individuals on the Hess list tended to score higher on the media impact measure (M = 37.57, SD = 26.94) and lower on the broad (M = 18.55, SD = 17.44) and narrow (M = 15.85, SD = 13.18) expertise measures. Figure 4 provides a graphical representation of the differences in mean values across the expertise and broad media impact measures, by list. Table 1 includes a summary of statistics for each of the separate regression analyses we performed using broadly defined expertise.

Figure 3. The relationship between “expertise” and “media impact” for individuals identified on the Hess (2011) list.




Figure 4. Mean values of “expertise” and “media impact” (broad measure), by expert list.

Table 1
Summary of Separate Regression Analyses of Expertise (By Expert List: Individual and Combined) and Media Impact

Variable                                        B      SE(B)   ß      t      Sig. (p)
Expertise – Hess (2011) List (N = 33)           0.26   0.27    0.17   0.94   .36
Expertise – Welner (2012) List (N = 62)         0.40   0.11    0.42   3.59   .00*
Expertise – Combined List (N = 95)              0.14   0.10    0.14   1.36   .18

Note. *p = .001

Discussion

We were interested in reviewing the current relationship between indicators of educational expertise and measures of media impact. We suspected that we would find generally weak and negative relationships, basing this expectation on our review of the literature and our understanding of the contemporary pressure to reform education, a pressure that is in keeping with a dominant narrative that current educational models in the U.S. are unacceptably ineffective. Results were partially consistent with our main hypotheses. While our variables were not negatively associated as predicted, they were in nearly all cases weakly associated and/or statistically non-significant.

This is consistent with a general perspective within the literature that we reviewed, which suggests that media impact (e.g., opportunities for citation) would not be tightly related to educational expertise. We find this troubling, particularly within a broader context which is conducive to educational reform. It would be better if media (and, potentially by extension, policy) influencers possessed true expertise, or were connected with experts (Willingham, 2012). Interestingly, expertise significantly predicted media impact when we constrained our analysis to the NEPC list. This list is distinct in that it exclusively includes academics, whereas the Hess (2011) list includes some academics and many who are outside of this sphere. We reason that many academics tend to be primarily concerned with scholarly output and related endeavors, and thus many do not necessarily seek media exposure. Within such circumstances, perhaps their output holds some predictive power over whether they will be "sought out" in the media.




Yet some academics actively avoid this type of engagement. On the other hand, the Hess (2011) list contains a substantial number of individuals who are likely to be primarily focused on making a public impact (e.g., through the media or otherwise), often irrespective of expertise as traditionally measured. If true, this would explain the distinction in results when the lists were taken separately.

Also, it is interesting to note basic distinctions across the two lists in terms of expertise and media influence. Specifically, individuals on the Welner et al. (2012) list tended to score higher on the expertise variable and lower on the media impact variable. By contrast, individuals on the Hess (2011) list tended to score higher on the media impact variable and lower on the expertise variable. This pattern of results is consistent with our expectations and is, we believe, worthy of future study. It is consistent with Welner's (2011) observation that conservative groups tend to be more willing to directly engage with politics on educational questions. Conservative individuals are more heavily represented on the Hess (2011) list. If there is a tendency toward interest in policy engagement (and, we expect, media impact) among this sample, there may be a tendency among academics (who appear to be more heavily represented on the Welner et al., 2012 list) toward greater interest in establishing expertise. Also, "media impact" is likely quite variable as a function of what is important at a given time; at one time, for example, school funding may be a popular topic, whereas at another teacher quality may be an area of emphasis. Presumably, different individuals would be tapped depending on the topic.

Limitations

One important limitation of the study is that it draws from somewhat constrained lists of educational experts and pundits. It is also biased somewhat in favor of academics, who are exclusively listed on the Welner et al. (2012) list, and conservatives, who are exclusively listed on the Hess (2011) list. This sets up a questionable dichotomy of two sets of experts, where academics are presumed to be liberal and think tank types are of a more conservative bent; yet we know that the actual universe of educational expertise is much more diverse. Indeed, many of the individuals who are frequently quoted in the media on educational topics (e.g., those from the Fordham Institute or the American Enterprise Institute) are not included in our analyses. Had the lists been constructed differently and included such individuals, perhaps our results would be different. Finally, our measures of expertise focused on academic preparation and did not take into account other factors, such as experience, that could contribute to expertise.

Directions for Future Study

Still, this study suggests several potential directions for future research.

vet experts before citing them or their work, an issue that should be explored in greater depth. Meanwhile, the suggestion from the ACF that media impact may differ somewhat as a function of the strategies used by groups of different ideologies and backgrounds is interesting and is tentatively supported in this study. Future research should aim to better understand the contours of this situation. Lastly, we would like to join the growing chorus of individuals who seek to re-establish tighter relations between research, policy, and practice. A high quality education is immensely beneficial for individuals and for states and nations (Alexander, 1976), and policy changes should be carefully discussed and weighed prior to implementation. This is most likely to occur when individuals with true expertise, who may be more grounded in empirical findings related to particular reforms, are positioned to inform the process.

References
Alexander, K. (1976). The value of an education. Journal of Education Finance, 1(4), 429-467.
Bestor, A. (1953). Educational wastelands: The retreat from learning in our public schools. Chicago & Urbana, IL: University of Illinois Press.
Betts, J. R., & Tang, Y. E. (2008). Value-added and experimental estimates of the effects of charter schools on student achievement. Seattle: Center on Reinventing Public Education.
Bridges, D., & Watts, M. (2008). Educational research and policy: Epistemological considerations. Journal of Philosophy of Education, 42(S1), 41-62.
Brill, S. (2011). Class warfare: Inside the fight to fix America's schools. New York: Simon & Schuster.
Bushaw, W. J., & Lopez, S. J. (2012). Public education in the United States: A nation divided. Phi Delta Kappan, 94(1), 9-25.
Chilcott, L. (Producer), & Guggenheim, D. (Director). (2010). Waiting for "Superman" [Motion picture]. Hollywood, California: Paramount Home Entertainment.
Chubb, J. E., & Moe, T. M. (1990). Politics, markets, and America's schools. Washington, DC: Brookings Institution.
Conant, J. B. (1967). The comprehensive high school: A second report to interested citizens. New York: McGraw-Hill.
Confessore, N. (2011, November 26). Policy-making billionaires. Retrieved from http://www.nytimes.com/2011/11/27/sundayreview/policy-makingbillionaires.html?pagewanted=all
Debray-Pelot, E. H., Lubienski, C. A., & Scott, J. T. (2007). The institutional landscape of interest group politics and school choice. Peabody Journal of Education, 82(2-3), 204-230.
Gormley, W. T. (2011). From science to policy in early childhood education. Science, 333, 978-981.
Granger, R. C., Tseng, V., & Wilcox, B. (Undated). Connecting research and practice (submission draft).
Hess, R. (2011, October 13). A handy rolodex supplement for edu-reporters. Retrieved from http://blogs.edweek.org/edweek/rick_hess_straight_up/2011/10/a_handy_2012_rolodex_supplement_for_edu-reporters.html
Hess, R. (2012, January 4). The 2012 RHSU edu-scholar public presence rankings. Retrieved from http://blogs.edweek.org/edweek/rick_hess_straight_up/2012/01/the_2012_rhsu_eduscholar_public_presence_rankings.html
Kingdon, J. W. (2003). Agendas, alternatives, and public policies. New York: Longman.
Lubienski, C. (2008). School choice research in the United States and why it doesn't matter: The evolving economy of knowledge production in a contested policy domain. In M. Forsey, G. Walford, & S. Davies (Eds.), The globalization of school choice? (pp. 27-54). Oxford, UK: Symposium Books.
Lubienski, C., & Lubienski, S. T. (in press, 2013). The public school advantage: Why public schools outperform private schools. Chicago: University of Chicago Press.
McQuail, D. (1992). Media performance: Mass communication and the public interest. London: Sage.
Mucciarioni, G., & Quirk, P. (2006). Deliberate choices: Debating public policy in Congress. Chicago: University of Chicago Press.
Oreskes, N., & Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York: Bloomsbury Press.
Ravitch, D. (2009). The death and life of the great American school system: How testing and choice are undermining education. New York: Basic Books.
Ravitch, D. (2010, November 11). The myth of charter schools. Retrieved from http://www.nybooks.com/articles/archives/2010/nov/11/myth-charter-schools/?pagination=false
Raymond, M. E., & Center for Research on Education Outcomes. (2009). Multiple choice: Charter school performance in 16 states. Stanford, CA: Stanford University.
Sabatier, P., & Jenkins-Smith, H. C. (1999). The advocacy coalition framework: An assessment. In P. A. Sabatier (Ed.), Theories of the policy process (pp. 117-166). Boulder: Westview Press.
Seife, C. (2010). Proofiness: The dark arts of mathematical deception. New York: Viking.
Specter, M. (2010). Denialism: How irrational thinking hinders scientific progress, harms the planet, and threatens our lives. New York: Penguin Press.
Tan, Y., & Weaver, D. H. (2010). Media bias, public opinion, and policy liberalism from 1956 to 2004: A second-level agenda-setting study. Mass Communication and Society, 13, 412-434.
U.S. Department of Education. (2012, January 24). Race to the Top Fund. Retrieved from http://www2.ed.gov/programs/racetothetop/index.html
Welner, K. G. (2011). Free-market think tanks and the marketing of education policy. Dissent, 58(2), 39-43.
Welner, K. G., Mathis, W. J., & Molnar, A. (2012, January 23). Researchers as resources: A list of experts who can speak to the overall knowledge base on important educational issues. Retrieved from http://nepc.colorado.edu/publication/researchers-as-resources
Whiteman, D. (1985). The fate of policy analysis in congressional decision making: Three types of use in committees. Western Political Quarterly, 38, 294-311.
Willingham, D. T. (2012). When can you trust the experts? How to tell good science from bad in education. San Francisco: Jossey-Bass.


Whose Opinions Count in Educational Policymaking?

Article Citation
Malin, J., & Lubienski, C. (2013). Whose opinions count in educational policymaking? Current Issues in Education, 16(2). Retrieved from http://cie.asu.edu/ojs/index.php/cieatasu/article/view/1086

Author Notes
Joel Malin
University of Illinois
338 Education Building, 1310 S. 6th St., Champaign, IL 61820
jrmalin2@illinois.edu
Joel Malin is a doctoral student at the University of Illinois. He is also a school district administrator for Lake Forest School District 67 (Illinois). His research focuses on educational policy analysis and educational assessment.

Christopher Lubienski
University of Illinois
338 Education Building, 1310 S. 6th St., Champaign, IL 61820
club@illinois.edu
Christopher Lubienski, PhD, is Associate Professor of Education Policy and Director of the Forum on the Future of Public Education at the University of Illinois. His research focuses on the political economy of education reform, with a particular interest in the equity effects of markets in education.

Manuscript received: 09/23/2012 Revisions received: 02/24/2013 Accepted: 05/17/2013



Current Issues in Education Vol. 16 No. 2

Volume 16, Number 2

August 13, 2013

ISSN 1099-839X

Authors hold the copyright to articles published in Current Issues in Education. Requests to reprint CIE articles in other journals should be addressed to the author. Reprints should credit CIE as the original publisher and include the URL of the CIE publication. Permission is hereby granted to copy any article, provided CIE is credited and copies are not sold.

Editorial Team Executive Editors Melinda A. Hollis Rory Schmitt Assistant Executive Editors Laura Busby Elizabeth Reyes Layout Editors Bonnie Mazza Elizabeth Reyes

Recruitment Editor Hillary Andrelchik

Copy Editor/Proofreader Lucinda Watson

Authentications Editor Lisa Lacy Technical Consultant Andrew J. Thomas Ayfer Gokalp David Isaac Hernandez-Saca

Section Editors Linda S. Krecker Carol Masser Faculty Advisors Dr. Gustavo E. Fischman Dr. Jeanne M. Powers


Bonnie Mazza Constantin Schreiber


Reviews/Essays

Organizing Research and Development at the Intersection of Learning, Implementation, and Design
William R. Penuel, Barry J. Fishman, Britte Haugan Cheng, and Nora Sabelli
Educational Researcher, Vol. 40, No. 7, pp. 331–337. DOI: 10.3102/0013189X11421826. © 2011 AERA. http://er.aera.net

This article describes elements of an approach to research and development called design-based implementation research. The approach represents an expansion of design research, which typically focuses on classrooms, to include development and testing of innovations that foster alignment and coordination of supports for improving teaching and learning. As in policy research, implementation is a key focus of theoretical development and analysis. What distinguishes this approach from both traditional design research and policy research is the presence of four key elements: (a) a focus on persistent problems of practice from multiple stakeholders’ perspectives; (b) a commitment to iterative, collaborative design; (c) a concern with developing theory related to both classroom learning and implementation through systematic inquiry; and (d) a concern with developing capacity for sustaining change in systems.

Keywords: educational reform; learning processes/strategies; mixed methods; organization theory/change

An enduring goal of research in education has been to identify programs that can reliably work in a wide variety of settings so that such programs can be scaled up to improve system-level outcomes. But the observed treatment effects of nearly all programs vary significantly from setting to setting, and even the most promising programs have proved difficult to scale up. Improving educational systems, moreover, requires more than the adoption of effective programs; it demands alignment and coordination of the actions of people, teams, and organizational units within a complex institutional ecology (Rowan, 2002). For decades, policy researchers have observed that strategies for producing alignment and coordination only from the top down rarely work (e.g., Cohen, Moffitt, & Goldin, 2007; Elmore, 1980; Rowan, 2002). Berman and McLaughlin (1975) observed that teachers’ adaptations of programs at the classroom level, not policy makers’ plans, largely determine programs’ effectiveness. Implementation problems evolve, moreover, as programs go to scale, as a consequence both of the adaptations

teachers make and of changes and variations in environments (McLaughlin, 1987). Successful scaling, most policy researchers agree, depends on local actors—especially district administrators, school leaders, and teachers—who need to make continual, coherent adjustments to programs as they work their way through educational systems (Weinbaum & Supovitz, 2010). In this article, we argue for the potential of an emerging form of design research as a strategy for supporting the productive adaptation of programs as they go to scale. Because design research is an iterative approach to developing innovations, it is particularly well suited to informing decision making about needed adjustments to programs (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003). The potential utility of design research for supporting implementation also derives from its focus on developing practical theory and tools that can be used to support local innovation and to solve practical problems (Reinking & Bradley, 2008). Further, the collaborative nature of much design research positions practitioners as codesigners of solutions to problems, which can facilitate the development of usable tools that educators are willing to adopt (Penuel, Roschelle, & Shechtman, 2007). Our perspective in this article has benefited from recent policy research that makes a turn toward design and makes use of theories of implementation to develop and study implementation supports for leadership practices (Honig, 2008; Spillane, 2006), instructional coaching (Gallucci, Van Lare, Yoon, & Boatright, 2010), and the coordination of school-linked services to improve student outcomes (McLaughlin & O’Brien-Strain, 2008). Efforts by a number of interdisciplinary teams over the past decade to conduct design research at the level of educational systems have shown the significant promise of the strategy of engaging learning scientists, policy researchers, and practitioners in a model of collaborative, iterative, and systematic research and development. This represents a significant expansion of design research, which typically focuses on classrooms, to include development and testing of innovations that foster alignment and coordination of supports for improving what takes place in classrooms. We call this approach to collaborative research and development design-based implementation research, because design thinking figures prominently in it—as illustrated by the projects and initiatives we describe in this article—and because research on the implementation of reforms drives iterative improvements




to designs. Researchers involved in these projects span multiple disciplines, including the learning sciences (where design-based research methods are common) and policy research (where policy and program implementation are often foci of theoretical development and analysis). In this article, we elaborate on some elements that define design-based implementation research as it has been practiced, illuminating these elements by describing partnerships that exemplify each element and analyzing how the elements contributed to the partnerships’ success. Next, we consider challenges to the approach and propose ways to organize a community of scholars whose aim is to develop, test, and improve this model of design research. Finally, we suggest ways to leverage existing federal investments to cultivate norms to guide and improve this model of research and development.

Elements of Design-Based Implementation Research

Since the 1990s, several interdisciplinary research and development teams have worked collaboratively with practitioners to develop and test designs for improving teaching and learning. These projects share four common elements:
· a focus on persistent problems of practice from multiple stakeholders’ perspectives;
· a commitment to iterative, collaborative design;
· a concern with developing theory related to both classroom learning and implementation through systematic inquiry; and
· a concern with developing capacity for sustaining change in systems.
Below, we elaborate on these elements, presenting projects that exemplify each. Although all of the projects we describe employed all four elements, we highlight particular elements with individual cases that we judged to be especially indicative of those elements’ potential.

Teams form around a focus on persistent problems of practice from multiple stakeholders’ perspectives. Design-based implementation research, as a descendent of the pragmatic tradition in American educational philosophy (e.g., Dillon, O’Brien, & Heilman, 2000), shares a commitment with other forms of design research (e.g., formative experiments, design experimentation) to using research to solve practical problems. What distinguishes it from most forms of design research, however, is how “practice centered” the problem definitions are. The SERP Institute, an organization that supports coordinated design research in large districts, exemplifies this approach (Donovan, 2011). The institute holds fast to the principles that (a) research and development should be a collaborative endeavor between researchers and practitioners, (b) partnerships should be based on addressing important problems of practice, and (c) practitioners should have a say in defining those problems. Many whole-school reform models, such as Success for All, already require teachers to have a say in adopting the programs in order to build ownership in the reform process (Datnow & Castellano, 2000). SERP goes further, requiring researchers to take up what educators perceive as local obstacles to improving


teaching and learning. Initially, it is unlikely that teachers and leaders of schools and districts will share a definition of these obstacles, and their formulations are likely to differ widely from those of researchers. Thus a key early task in forming a partnership between researchers and an educational system is to develop a shared understanding of the problem or problems that will be taken up among people representing different stakeholder groups. A core group of researchers and district and school leaders in a SERP site is tasked with defining an immediate problem of practice and agreeing on a plan of action for instructional improvement. In Boston, one of the four SERP sites established thus far, the core group initially collaborated around a middle school literacy challenge: improving students’ knowledge of specialized academic vocabulary. To identify strategies for improvement, the core group chartered a design team with expertise on instructional materials, pedagogy, assessments, and data management and use by school staff, all focused on the selected problem. The design team drew on the expertise of two researchers, Catherine Snow and Richard Elmore: Snow to develop an intervention, Word Generation, to implement in schools, and Elmore to analyze the coherence of the instructional program at implementing schools. Snow and her colleagues conducted research on the impact of Word Generation on student learning (Snow, Lawrence, & White, 2009), and Elmore’s research contributed to understanding the conditions under which its implementation was effective in improving student outcomes (Elmore & Forman, 2010). In reflecting on what made the Word Generation experience successful, SERP researchers credit listening to and responding to educators’ needs (Snow, 2011). By focusing on academic vocabulary—a persistent problem of practice that was affecting performance on accountability tests (a system-level outcome)—and by adhering to teachers’ constraints on how much time they could allocate for the intervention, SERP researchers solidified the commitment of district leaders to the collaboration and built trust and buy-in among teachers (Donovan, 2010). In addition, bringing together learning researchers with policy researchers enabled the research on Word Generation to provide insight both into whether the program worked and into the conditions for success. That focus provided the district with the feedback it needed to justify further investment in implementation, as well as clues as to where to intervene to improve outcomes in future years.

To improve practice, teams commit to iterative, collaborative design. Collaborative design research often focuses on the development and testing of usable tools for improving teaching and learning in specific subject matter domains and settings. A hallmark of design-based research in the learning sciences has been its focus on improving the learning environments of classrooms; however, learning scientists have, for the most part, had limited success in bringing their classroom innovations to scale at the level of educational systems (Pea & Collins, 2008). As a consequence, some learning scientists have begun to focus more explicit attention on designing learning opportunities for teachers. For example, some researchers have helped teachers learn to enact inquiry-oriented curricula by designing educative curriculum materials for them


(e.g., Davis & Krajcik, 2005); some have designed professional development programs to better equip school and district leaders for supporting classroom-based reform (e.g., Bowyer, Gerard, & Marx, 2008). Some learning scientists have collaborated with policy researchers, who engage theories of organizational and institutional change in designing new approaches for bringing about systemic improvements (e.g., Bryk, Gomez, & Grunow, in press; Resnick & Spillane, 2006). A good example of a design-focused multiyear collaborative research and development effort is the Middle School Mathematics and the Institutional Setting of Teaching (MIST) project at Vanderbilt University (Cobb, Henrick, & Munter, 2011). MIST is a five-year project in which a team of learning scientists, policy researchers, and educational anthropologists work in collaboration with four school districts to analyze and inform policies for improvement in mathematics instruction as part of a participatory, collaborative approach to research. In a departure from traditional design-based research where researchers establish the learning goals, in MIST the district makes decisions and drives the designs for helping improve how teachers enact ambitious instructional practices in mathematics. The research team helps facilitate the design process, first by eliciting a policy-based theory of action from different actors in the system and then by conducting research on the implementation, informing future cycles of design and implementation. Like many large urban districts with high levels of socioeconomic and cultural diversity, “District B,” a MIST partner for four years, is under pressure to improve student outcomes in mathematics to satisfy the mandates of No Child Left Behind. Its theory of action is to improve the quality of instruction by creating new positions (coaches in mathematics to provide instructional supports to teachers) and implementing new routines to support principals’ instructional leadership in mathematics (e.g., learning walks, or “walkthroughs,” co-led with coaches, whereby the principals observe mathematics instruction in their schools). The MIST research team conjectured that the coaches’ subject matter expertise and new roles could plausibly lead to achieving the district’s goal, but the team expressed concern that the principals lacked sufficient expertise in mathematics to make effective use of the learning walks and that the learning events organized for principals were too isolated to make up for their lack of expertise. Over the first year of data collection, the team discovered that principals were not in agreement about the policy goal. Many saw their job as requiring a sharper focus on instructional management (ensuring compliance to teaching to standards) rather than on the instructional improvement goals established by district leaders. What happened next makes MIST a good example of how design-based implementation research can contribute to system improvement. The Vanderbilt team shared their conjectures and research findings with district leaders at the end of the year in a highly facilitated meeting that left plenty of time for discussion. Researchers were able to make evidence-based recommendations to district leaders about the importance of making the vision more explicit for all principals and increasing the coherence of principals’ learning supports in the context of their district. 
These recommendations were partly reflected in the revised theory of action that the district implemented the next school year to

support instructional improvement. The district’s leadership focused on making sure that supervisors of principals had expectations consistent with the district’s theory of action; the leadership also placed increased emphasis on how to communicate those expectations to principals. The Vanderbilt team also organized more sustained professional development for principals. Although the team did not see all of its recommendations fully implemented, its presentation and collaborative approach paid off in terms of influence on policy at the district level.

As a strategy for promoting quality in the research and development process, teams develop theory related to both classroom learning and implementation through systematic inquiry. In its focus on the persistent problems of practice and the collaborative approach, design-based implementation research shares some features of participatory approaches to educational evaluation (see especially Fetterman, 2001). Also, like rigorous, well-designed program evaluation studies, the research is informed by theories of how people learn in particular contexts (e.g., Donaldson, 2007). However, in contrast to most evaluation studies—which are motivated by practical and policy questions—design-based implementation research aims to develop and refine theory through systematic inquiry (Cobb et al., 2011). The objects of this theory development include explicit ideas about how to support classroom learning, about how to prepare teachers and administrators to implement programs, and about how to coordinate the implementation of programs within and across organizations (Confrey, Castro-Filho, & Wilhelm, 2000). Design-based implementation research can also contribute to theories of organizations and institutions that guide much contemporary policy research in education, particularly by pointing out how the deployment of new tools (e.g., curricula, technologies) can bring to light new needs for coordination across different system levels and for capacity building (e.g., Stein & Coburn, 2008). In design research, it is through the analysis of what happens when researchers engage in design and help support implementation that theory develops (Edelson, 2002). At the Center for Learning Technologies in Urban Schools (LeTUS), a research center funded by the National Science Foundation, a key focus was to develop a theory of the conditions under which a technology-supported innovation in science could be usable to a wide range of teachers (Blumenfeld, Fishman, Krajcik, Marx, & Soloway, 2000). Usability is a concept that describes an ideal of software design, and the LeTUS team sought to explore its applicability to education. Immediately, and as a direct consequence of the team’s efforts to deploy technology in the Detroit Public Schools and the Chicago Public Schools (which were partners in the Center), the team saw that the value of technology lay not in what it could do on its own but in its integration with successful curriculum materials. In addition, LeTUS researchers confronted the different layers of the system that affected student access to computers and realized that actors in different departments and schools would need to coordinate their activities to implement technology successfully in the classroom. In urban schools, if technology is to be usable, there must be a “fit” of the innovation to school culture, technical capability, and policies (Blumenfeld et al., 2000). If the gap between the



capacity of a system and the requirements of an innovation along any of those three dimensions is large, a technology-supported innovation will be less usable. Where gaps are large, designers have a choice to scale down the demands along one or more dimensions or to intervene to enhance system capacity, improving chances that an innovation will be usable. The LeTUS example illustrates what can be learned about policy and program implementation from engaging in the activity of design. By having to plan for implementation in classrooms and then adapt the plans on the basis of what the team discovered through its research activities, the team refined its definition of the problem at hand, shifting from the idea of usability as a relationship between a technology capability and the task at hand (the definition from software design) to the idea of usability as entailing the capability of systems to make good uses of technology. This shift led to theory building about the systemwide conditions of scalability and sustainability up front in design, a primary goal of design-based implementation research.

Design-based implementation research is concerned with developing capacity for sustaining change in systems. One strategy for promoting the sustainability of designs is to develop capacity through intentional efforts to develop organizational routines and processes that help innovations travel through a system. The LeTUS team was not the first to claim or discover that the capacity of systems acts as both a resource and a constraint for change: This idea has been fundamental in policy analyses for some time (e.g., Darling-Hammond, 1993). However, the predominant approaches to building capacity in education have long been focused on improving human capital—whether through professional development or by selecting and rewarding teachers on the basis of their students’ test scores—and on developing and supplying improved material capital (e.g., curriculum). Often overlooked as a potential target for design efforts is improving social capital, that is, the resources and expertise that individuals can access to accomplish purposive action. A teacher’s colleagues, for example, can be an important resource for implementing new reforms (Frank, Zhao, & Borman, 2004). Routines in schools, such as those that teachers use to structure conversations about teaching and learning in teams, serve as important resources for teachers’ own learning and growth (Horn & Little, 2010). School administrators’ processes for coordinating reform activities likewise can be resources for leading change efforts (Spillane, 2006). Design-based implementation research can help develop local capacity by fostering cohesion among networks of local actors tasked with implementing change, and by creating designs for routines and coordination mechanisms that can help innovations travel readily along those networks and that themselves can travel to new contexts. A good example of a research and development effort focused on creating routines for innovation and coordination that travel well is the Fifth Dimension, part of the UC Links program in the University of California system (Cole & Distributed Literacy Consortium, 2006). Developed initially by Michael Cole and colleagues at the Laboratory for Comparative Human Cognition at the University of California, San Diego, the Fifth Dimension links university students and faculty to local communities through the joint activity of planning and running an


after-school program. For children, participation is voluntary. For the undergraduates who help staff and research the program, attendance is part of a for-credit class. Program activities include academic, school-like tasks, but they are assigned within a broader context of learning through play. The Fifth Dimension program is a good example of capacity building because its routines and organizational processes have traveled well to support the formation of more than three dozen university–community partnerships nationally and because it has proved successful in promoting a range of traditional academic and other types of outcomes. The necessary student labor is counted in the cost of running a course for the university, making the program attractive to communities and sustainable as a program that offers something to children in community settings and that helps undergraduates learn about human development. Designing for sustainable improvements in teaching and learning is the ultimate goal of design-based research, and as part of that, identifying and putting into place routines and processes that build or leverage existing capacity to support programmatic scalability is a crucial step. Furthermore, the program’s developers consider the adaptation required when implementing the program in a new site as a core object of research study. Notably, the primary interest of Cole and his colleagues in studying the Fifth Dimension has been in “tracing implementations in widely disparate conditions” to better understand the conditions under which the design principles are appropriated and transformed as the program is integrated with the values, norms, and practices of host institutions (Cole & Engeström, 2006, p. 500). These findings help the team to refine theories of how the sociocultural setting shapes adoption of particular program elements.

Threats and Challenges to the Success of Design-Based Implementation Research

There are significant practical challenges to engaging in design-based implementation research (Donovan, Wigdor, & Snow, 2003). The demand of the public and of schools for quick success from reforms often makes researchers uneasy. On their own, researchers may not have the capacity to function successfully as reform intermediaries at the same time that they are conducting research; effective partnerships are likely to require an intermediary organization whose primary focus is capacity building, not research. In addition, much policy discourse on—and funding of—large-scale research on innovations leans toward the need to promote implementation fidelity rather than toward productive, mutual adaptation of programs. Funding cycles of both federal agencies and private foundations typically are short, and ongoing funding of collaborations for the time required for building capacity is rare, making the multidisciplinary projects that serve as examples in this article the exception rather than the norm for organizing research and development efforts. Just as design-based researchers often seek out classrooms that are suitable contexts for innovations (e.g., Reinking & Bradley, 2008), the teams discussed above have worked largely within education systems that were “ready for change.” In other words, the researchers have focused their efforts on places where they shared a broad vision for improvement with district officials. This may be a necessary condition for partnerships, and even in the above


examples where that condition is met, coordinating change has often proved a greater challenge than the research and development teams are equipped to solve. An ongoing challenge, which we hope can begin to be addressed in future projects, is the development of theories and models that can be used to initiate change where capacity is more limited and to develop designs for coordination that are better tailored to different contexts, including those that convene different kinds of institutional and individual actors in and out of school to study and bring about improvements to children’s learning. An example is the Youth Data Archive (McLaughlin & O’Brien-Strain, 2008), where researchers facilitate the assembly of data sets across different kinds of organizations (e.g., schools, social service agencies). Researchers involved with the Youth Data Archive help collaborative groups that provide the data to pose questions of these data and use the answers to inform the design of strategies to improve and coordinate services for youth. In addition, teams conducting design-based implementation research will need to identify ways to involve young people themselves in designing and studying the educational systems of which they are part, both as a way to expand collaborations and as a strategy for promoting youth development (Kirshner, O’Donoghue, & McLaughlin, 2005). Finally, an enduring challenge of design-based implementation research is to coordinate the activities of research and development. The projects described above involved multiple teams whose work had to be coordinated. In many instances, the practical work could have easily overshadowed the need for rigorous research. It was only because the research teams were diverse, with specialized roles for their members, that they were able to conduct the kinds of experimental, quasi-experimental, and mixed-methods longitudinal studies that they did. Even so, in their accounts of the design process, researchers did not always detail alternative approaches to redesigning innovations, explain how empirical evidence informed changes, or describe alternative approaches to design and evidence that might have supported those approaches. These challenges suggest an important role for research on the processes of research and development, to make visible the tensions and challenges in the work, as do the case studies in a recent edited volume on bridging the research–practice divide (Coburn & Stein, 2010).

improve?” Answering these questions and the many subquestions needed to develop and validate theory-based innovations will require a wide range of research methods. Longitudinal, historical, ethnographic, and case analyses of changing contexts are likely to be necessary to understand how reforms’ trajectories across time and settings shape implementation. A second activity around which a community of scholars might organize is to develop standards regarding the use of evidence to guide refinements to design. If research and researchers are to mediate improvements, the community must articulate norms and practices regarding how to incorporate multiple points of view on problems and conflicting interpretations of data. One approach could be to encourage the use of design rationales, that is, accounts of the decisions teams make and the reasons for their decisions (Moran & Carroll, 1996). Professionals in the fields of architecture, urban planning, and software engineering articulate design rationales to clarify the purposes of designs, record the history of the design process, and reflect on and modify designs. In design-based implementation research in education, design rationales might serve as a means to make visible (and public for external review) the ways that teams employ evidence to resolve conflicts, weigh competing approaches to improvement, and identify new areas of focus for their work. Finally, to develop a community, scholars will need resources for conducting their work and venues for communicating their findings. Already, the National Science Foundation encourages scholars to propose implementation research studies for its Research and Evaluation on Education in Science and Engineering Program competition. But at the Institute of Education Sciences, design-based implementation research would currently need to take place in the context of intervention development. One way for both the institute and the National Science Foundation to support design-based implementation research would be for requests for proposals to give priority to existing research– practice partnerships that have been successful, so that partnerships can develop new projects from groups with a shared history of collaboration. Finally, this area of research needs new publication outlets that appreciate the interdisciplinary, iterative nature of the research. Many design-based implementation research studies are “fugitive documents”; that is, they appear as book chapters, online reports, or manuscripts. Peer-reviewed journals that publish articles at the intersection of policy and learning sciences in particular are needed, where the work is currently most concentrated. We are optimistic about the potential of design-based implementation research and believe that resources for it can be identified, because this kind of research directly addresses important and timely policy concerns, namely, scaling up and sustaining change, and because it builds on prior work in both policy and learning sciences. The examples presented in this article illustrate how collaborative design that focuses on problems of practice can produce effective programs, help school districts augment the supports they provide to teachers for improving their instruction, contribute to advances in theory about what makes an innovation usable, and develop system capacity. The task ahead is to enable a broader community to undertake design-based implementation research systematically. 
Doing so will involve expanding the model to address present and emerging challenges so that it can have significant impact on the field of education.



References

Berman, P., & McLaughlin, M. W. (1975). Federal programs supporting educational change: Vol. 4. The findings in review. Santa Monica, CA: RAND. Blumenfeld, P., Fishman, B. J., Krajcik, J., Marx, R. W., & Soloway, E. (2000). Creating usable innovations in systemic reform: Scaling up technology-embedded project-based science in urban schools. Educational Psychologist, 35(3), 149–164. Bowyer, J., Gerard, L., & Marx, R. W. (2008). Building leadership for scaling science curriculum reform. In Y. Kali, M. C. Linn, & J. E. Roseman (Eds.), Designing coherent science education (pp. 123–152). New York: Teachers College Press. Bryk, A. S., Gomez, L. M., & Grunow, A. (in press). Getting ideas into action: Building networked improvement communities in education. In M. Hallinan (Ed.), Frontiers in sociology of education. Dordrecht, the Netherlands: Verlag. Cobb, P. A., Confrey, J., diSessa, A. A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. Cobb, P. A., Henrick, E. C., & Munter, C. (2011, April). Conducting design research at the district level. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. Coburn, C. E., & Stein, M. K. (Eds.). (2010). Research and practice in education: Building alliances, bridging the divide. Lanham, MD: Rowman & Littlefield. Cohen, D. K., Moffitt, S. L., & Goldin, S. (2007). Policy and practice: The dilemma. American Journal of Education, 113(4), 515–548. Cole, M., & Distributed Literacy Consortium. (2006). The Fifth Dimension: An after-school program built on diversity. New York: Russell Sage. Cole, M., & Engeström, Y. (2006). Cultural-historical approaches to designing for development. In J. Valsiner & A. Rosa (Eds.), The Cambridge handbook on sociocultural psychology (pp. 484–507). New York: Cambridge University Press. Confrey, J., Castro-Filho, J., & Wilhelm, J. (2000). Implementation research as a means to link systemic reform and applied psychology in mathematics education. Educational Psychologist, 35(3), 179–191. Darling-Hammond, L. (1993). Reframing the school reform agenda: Developing capacity for school transformation. Phi Delta Kappan, 74(10), 752–761. Datnow, A., & Castellano, M. (2000). Teachers’ response to Success for All: How beliefs, experiences, and adaptations shape implementation. American Educational Research Journal, 37(3), 775–799. Davis, E. A., & Krajcik, J. (2005). Designing educative curriculum materials to promote teacher learning. Educational Researcher, 34(3), 3–14. Dillon, D. R., O’Brien, D. G., & Heilman, E. E. (2000). Literacy research in the next milennium: From paradigms to pragmatism and practicality. Reading Research Quarterly, 35(1), 10–26. Donaldson, S. I. (2007). Program theory-driven evaluation science: Strategies and applications. Mahwah, NJ: Lawrence Erlbaum. Donovan, S. (2010, May). Use-inspired research and development: Addressing middle school challenges in the SERP-Boston field site. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO. Donovan, S. (2011, April). The SERP approach to research, design, and development: A different role for research and researchers. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. Donovan, S., Wigdor, A. K., & Snow, C. E. (2003). Strategic education research partnership. Washington, DC: National Research Council. 336


Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121. Elmore, R. F. (1980). Backward mapping: Implementation research and policy decisions. Political Science Quarterly, 94(4), 601–616. Elmore, R. F., & Forman, M. (2010, May). Internal coherence: Building organizational capacity for instructional improvement. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO. Fetterman, D. M. (2001). Foundations of empowerment evaluation. Thousand Oaks, CA: Sage. Frank, K. A., Zhao, Y., & Borman, K. (2004). Social capital and the diffusion of innovations within organizations: Application to the implementation of computer technology in schools. Sociology of Education, 77(2), 148–171. Gallucci, C., Van Lare, M. D., Yoon, I. H., & Boatright, B. (2010). Instructional coaching: Building theory about the role and organizational support for professional learning. American Educational Research Journal, 47(4), 919–963. Honig, M. I. (2008). District central offices as learning organizations: How sociocultural and organizational learning theories elaborate district central office administrators’ participation in teaching and learning improvement efforts. American Journal of Education, 114(4), 627–664. Horn, I. S., & Little, J. W. (2010). Attending to problems of practice: Routines and resources for professional learning in teachers’ workplace interactions. American Educational Research Journal, 47(1), 181–217. Kelly, A. E. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences, 13(1), 115–128. Kirshner, B. R., O’Donoghue, J., & McLaughlin, M. W. (2005). Youth– adult research collaborations: Bringing youth voice to the research process. In J. L. Mahoney, R. W. Larson, & J. S. Eccles (Eds.), Organized activities as contexts of development: Extracurricular activities, after-school, and community programs (pp. 131–156). Mahwah, NJ: Lawrence Erlbaum. McLaughlin, M. W. (1987). Learning from experience: Lessons from policy implementation. Educational Evaluation and Policy Analysis, 9, 171–178. McLaughlin, M. W., & O’Brien-Strain, M. (2008). The Youth Data Archive: Integrating data to assess social settings in a societal sector framework. In M. Shinn & H. Yoshikawa (Eds.), Toward positive youth development: Transforming schools and community programs (pp. 313–332). New York: Oxford University Press. Moran, T. P., & Carroll, J. M. (1996). Overview of design rationale. In T. P. Moran & J. M. Carroll (Eds.), Design rationale: Concepts, techniques, and use (pp. 1–20). Mahwah, NJ: Lawrence Erlbaum. Pea, R. D., & Collins, A. (2008). Learning how to do science education: Four waves of reform. In Y. Kali, M. C. Linn, & J. E. Roseman (Eds.), Designing coherent science education (pp. 3–12). New York: Teachers College Press. Penuel, W. R., Roschelle, J., & Shechtman, N. (2007). The WHIRL co-design process: Participant experiences. Research and Practice in Technology Enhanced Learning, 2(1), 51–74. Reinking, D., & Bradley, B. A. (2008). Formative and design experiments: Approaches to language and literacy research. New York: Teachers College Press. Resnick, L. B., & Spillane, J. P. (2006). From individual learning to organizational designs for learning. In L. Verschaffel, F. Dochy, M. Boekaerts, & S. Vosinadou (Eds.), Instructional psychology: Past, present and future trends. Sixteen essays in honor of Erik de Corte (pp. 257–274). Oxford, UK: Pergamon. Rowan, B. 
(2002). The ecology of school improvement: Notes on the school improvement industry in the United States. Journal of Educational Change, 3(3–4), 283–314.


Snow, C. E. (2011, April). The Strategic Education Research Partnership approach to supporting middle schools in promoting academic development. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. Snow, C. E., Lawrence, J., & White, C. (2009). Generating knowledge of academic language among urban middle school students. Journal of Research on Educational Effectiveness, 2(4), 325–344. Spillane, J. P. (2006). Distributed leadership. San Francisco: Jossey-Bass. Stein, M. K., & Coburn, C. E. (2008). Architectures for learning: A comparative analysis of two urban school districts. American Journal of Education, 114(4), 583–626. Weinbaum, E. H., & Supovitz, J. A. (2010). Planning ahead: Make program implementation more predictable. Phi Delta Kappan, 91(7), 68–71.

AUTHORS

WILLIAM R. PENUEL is a professor of educational psychology and learning sciences at the University of Colorado, School of Education, UCB 249, Boulder, CO 80309; william.penuel@colorado.edu. His research focuses on teacher learning and organizational processes that shape the implementation of educational policies, school curricula, and after-school programs.

BARRY J. FISHMAN is an associate professor of educational studies at the University of Michigan, School of Education, 610 E. University, Room 4121, Ann Arbor, MI 48109; fishman@umich.edu. His research focuses on technology and teacher learning and on video games as models for transformative learning environments. BRITTE HAUGAN CHENG is an education researcher at SRI International, Center for Technology in Learning, 333 Ravenswood Avenue, Menlo Park, CA 94025; britte.cheng@sri.com. Her research examines STEM learning in and across formal and informal learning environments, including the design of technology-based instruction and assessment. NORA SABELLI is a senior science advisor at SRI International, Center for Technology in Learning, 333 Ravenswood Avenue, Menlo Park, CA 94025; nora.sabelli@sri.com. Her research focuses on policies aimed at systemic improvements in STEM education.

Manuscript received February 9, 2011 Revisions received April 25, 2011; August 2, 2011; and August 5, 2011 Accepted August 6, 2011




Page Left Blank Intentionally


Tab 14 Posters Related to Action Plan Strategy 1 – Improve Student Learning

1.01 Barghaus & Fantuzzo – Validation of the Preschool Child Observation Record
1.02 Brawner – Project GOLD: "We are Kings and Queens" HIV/STI Prevention for Black Adolescents
1.03 Carlton, Sorhagen, Stull, & Weinraub – Whose Head Hit the Pillow First: A Review of Sports and Sleep in Preadolescence
1.04 Chein, Rosenbaum, & Botdorf – Working Memory Training Improves Performance on an Emotional Cognitive Control Task in Adolescents
1.05 Dahl – SNAP-Ed EAT.RIGHT.NOW. Nutrition Education Program: Long-Term Impacts of 4th Grade Vegetable Core Intervention
1.06 Dahl – EAT.RIGHT.NOW. Nutrition Education Program: 2012-2013 School Year Reach and Scope
1.07 Daly & Timony – Teaching Children to Succeed: A University-Elementary Schools Partnership to Promote Social, Emotional, and Behavioral Health in Students
1.08 Darken – Experiences and Perceptions of Bilingual Students in Two Settings
1.09 Dickard, Magee, Agosto, & Forte – Studying Teen Social Media Use: What Do You Think is the #1 Topic Teens Ask About on Social Media?
1.10 Ellis – Out-of-School Time and Student Outcomes
1.11 Friedman – PA SNAP-Ed Funded Nutrition Education Assemblies, School District of Philadelphia
1.12 Glunt – Blueprints LifeSkills Training Outcomes Evaluation
1.13 Gunderson – Praise for Effort Increases Motivation and Learning
1.14 Gunderson & Ramirez – Math Anxiety in Early Elementary School
1.15 Harbison, Sack, Campbell, & Ciner – Vision in Preschoolers: Hyperopia in Preschoolers (VIP-HIP)
1.16 Hirsh-Pasek, Pace, & Yust – Developing a Computerized Language Assessment for Preschoolers
1.17 Hirsh-Pasek, Newcombe, Golinkoff, & Verdine – Do Early Spatial Skills Lead to Later Spatial and Math Gains?
1.18 Hirsh-Pasek, Spiewak-Toub, & Hassinger-Das – Playing to Learn: Vocabulary Development in Early Childhood
1.19 Holmes – Proportional Reasoning: An Intervention to Support Quantitative Development
1.20 Jirout – Learning from Play: The Impact of Spatial Game Play on Spatial Ability
1.21 Jubilee & Engelman – City Year Program Evaluation
1.22 Kupersmidt & Scull – An Evaluation of Media Aware, a Media Literacy Education Substance Abuse Prevention Program for High School Students
1.23 Lawman – Trends in Relative Weight Over One Year in U.S. Urban Youth
1.24 Lent – Purchasing Patterns of Adults, Adolescents, and Children in Urban Corner Stores: Quantity, Spending, and Nutritional Characteristics
1.25 McClung – Educating Children and Youth Experiencing Homelessness: Transportation as a Barrier to Attendance
1.26 Merlino – Summary of a Matched-Sample Study Comparing the Interactive Math Program and Traditionally-Taught High School Students in Philadelphia
1.27 Mojica – An Examination of English Language Proficiency and Achievement Test Outcomes
1.28 Lawrence, Krier, & Posner – Engaging H.S. Math Students and Teachers through a Proficiency-Based Assessment and Reassessment of Learning Outcomes (PARLO) System
1.29 Nocito – SNAP-Ed Garden-Based Nutrition Education: A New Strategy in the Fight against Obesity
1.30 Norton & Reumann-Moore – 2012-13 Evaluation of City Year Greater Philadelphia: A Five-Year Study by Research for Action
1.31 Peters – Cardiovascular Health among Philadelphia Adolescents: Analysis of Youth Risk Behavior Data, 2011
1.32 Polonsky – Breakfast Patterns among Low-Income, Ethnically Diverse Elementary School Children
1.33 Reed & Hirsh-Pasek – Arts and Early Education: A Promising Duet to Support School Readiness Skills
1.34 Reisinger & Mandell – Philadelphia Autism Instructional Methods Study: Philly AIMS
1.35 Schaeffer & Leach – Arts Link: Building Mathematics and Science Competencies through an Arts Integration Model
1.36 Shakow – Eating the Alphabet
1.37 Shakow – F.U.N. (Families Understanding Nutrition) and Fit through P.L.A.Y. (Playful Learning for Adults and Youth)
1.38 Simon & Robbins – "Be the HYPE": Engaging Young People as School Wellness Leaders
1.39 Stull, Weinraub, & Harmon – Early Childcare Careers: What are the Facts?
1.40 Valshtein, Foley, Stull, & Weinraub – What a Nightmare! Not All Sleep Measures are Created Equal
1.41 Vetter – Cardiopulmonary Resuscitation/Automated External Defibrillator Olympic Competition Enhances Resuscitation in High School Students
1.42 Wasik, Hindman, & Snell – Exceptional Coaching for Early Language and Literacy ExCELL-e


Page Left Blank Intentionally


Validation of the Preschool Child Observation Record

Background
Scientifically based measures are key to monitoring and advancing students' learning. The use of such assessments has been called for in national legislation1 and locally in the District's Action Plan. To help meet these calls, this study investigated the validity of one of the most widely used early childhood measures: the Preschool Child Observation Record, 2nd ed. (COR-2).2 The COR-2 has 32 items organized into six categories: Initiative, Social Relations, Creative Representation, Movement and Music, Language and Literacy, and Math and Science. Each item has five skill points that aim to capture a developmental sequence.

Project Overview
Despite its widespread use, there is no independent research on the COR-2. This study helps to bridge this knowledge gap by investigating the following research questions:
RQ 1. Does the COR-2 capture six factors corresponding to its six categories?
RQ 2. What is the empirically derived factor structure of the COR-2?
RQ 3. Do the five skill points of each item represent a developmental sequence?

Methods
Data came from administrative records on all children enrolled in the District's Head Start program in 2006-07.
RQ 1. The six categories were evaluated with confirmatory factor analysis. This analysis was replicated with data from all non-Head Start programs to test the results' generalizability.
RQ 2. Exploratory and confirmatory factor analyses were conducted to empirically uncover the factor structure.
RQ 3. Item response theory was used to examine the extent to which each item's skill points represented a developmental sequence.
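To make the factor-analytic step concrete, the sketch below shows how an exploratory factor analysis of item-level scores might be run in Python. It is a minimal illustration only: the file name, column layout, and four-factor solution are hypothetical placeholders, not the authors' actual analysis, which also included confirmatory models and item response theory.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical input: one row per child, one column per COR-2 item (32 items),
# each scored at one of the five skill points (1-5). The file name is a placeholder.
items = pd.read_csv("cor2_item_scores.csv")

# Exploratory factor analysis with an oblique (oblimin) rotation,
# which allows the extracted factors to correlate with one another.
efa = FactorAnalyzer(n_factors=4, rotation="oblimin")
efa.fit(items)

# Item loadings on each factor, and the share of variance each factor explains.
loadings = pd.DataFrame(
    efa.loadings_,
    index=items.columns,
    columns=[f"Factor{i + 1}" for i in range(4)],
)
ss_loadings, proportion_var, cumulative_var = efa.get_factor_variance()

print(loadings.round(2))
print("Proportion of variance explained:", proportion_var.round(3))
```

A confirmatory model, which fixes in advance which items load on which of the six COR-2 categories, would instead be fit with a structural equation modeling package and judged against standard fit indices.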

Results/Conclusions
RQ 1: The six factors did not fit the data well and were highly correlated (.85–.97), calling into question their distinctiveness. This result was replicated with non-Head Start programs.
RQ 2: Four factors fit the data better than six, but the factors were still highly correlated and explained only a small amount of the variability.
RQ 3: Problems were found with the functioning of a third of the items' skill points.

[Figure 1. Example of Theoretical and Actual Item Functioning: two panels, "Theoretical" and "Actual," plotting Ability against Item Skill Points.]

Conclusion: Further development of the COR-2's constructs and items is needed.

Research to Practice Strategies
Key points for the Action Plan's call for a rigorous assessment system:
1. The validation process used here can be used to evaluate potential measures.
2. Adequate assessment training is needed to ensure validity and reliability.

1 Head Start Act as amended 2007, 42 USC 9801 et seq.
2 HighScope Educational Research Foundation. (2003). Preschool Child Observation Record, 2nd ed. Ypsilanti, MI: HighScope Press.
* This research was supported by IES and a Child Care Research Scholars Grant from OPRE, ACF, DHHS. This work is the authors' sole responsibility and does not necessarily represent views of the supporting agencies.

Contact Information
Name: Katie Barghaus & John Fantuzzo
Organization: University of Pennsylvania
Phone Number: 215-746-5437
Email Address: barghaus@upenn.edu


Project GOLD: “We are Kings and Queens” HIV/STI Prevention for Black Adolescents Background

Results/Conclusions

Black youth are at the epicenter of the national HIV/AIDS epidemic, and those with mental illness are at heightened risk (Brawner, Gomes, Jemmott, Deatrick & Coleman, 2012). In Philadelphia, PA, Black adolescents account for the overwhelming majority of all HIV/AIDS cases reported in this demographic (Philadelphia Department of Public Health AIDS Activities Coordinating Office Surveillance Unit [AACO], 2013). “Project GOLD: We are Kings and Queens” is a comprehensive intervention designed to address the effects of psychiatric symptoms on the sexual decisionmaking process, through a social determinants approach. The aim is to facilitate knowledge and skill acquisition to prevent HIV/STIs in a context that is relevant to the target demographic.

Our preliminary work confirms that there are unmet sexual health education and intervention needs among Black adolescents with mental illnesses.

Research to Practice Strategies Given the devastating impact of HIV/AIDS among Black youth, it is imperative that we make every effort to address their sexual health needs. In one focus group discussion a teen stated “Some people just want to do it because the more sex partners they got the cooler they are.” (Speaker FG3) Through the integration of physical and mental health, we can develop programs to meet them where they are. Partnership with the District will help to ensure that we can reach the broadest group of youth possible.

Project Overview • Purpose: To develop and test a comprehensive behavioral intervention to reduce the risk of HIV/STIs among heterosexually-active Black adolescents (aged 14 to 17) with mental illnesses • Location: Community behavioral health (CBH) programs, approved District schools, and the University of Pennsylvania School of Nursing • Population: Heterosexually-active Black adolescents aged 14 to 17, currently receiving outpatient mental health treatment from a community behavioral health provider (N = 128) • Timeline: This is a four year study, ending in December 2016

Methods_______________________ Participants will be recruited from CBH programs and high schools through flyers, face-to-face encounters and social media advertisements. Those who are eligible will be randomized to either the HIV/STI risk reduction (n = 64) or general health promotion (n = 64) curriculum. The curricula content (taught in two 3-hour sessions) consists of fun, skills-based activities to teach participants how to maintain their physical, mental, sexual and emotional health. To evaluate attitude and behavior change, participants will self-administer an electronic questionnaire at baseline (immediately before the first session begins), immediate post (immediately after the second session ends), and at 3-, 6-, and 12-month follow-ups. HIV, Chlamydia and Gonorrhea testing will be provided at each assessment time point, with immediate linkage to care for positive test results. Participants will receive a total of $200 for completing all study activities over the 12-month enrollment period.
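The 1:1 assignment to the two curricula could look roughly like the sketch below. It is illustrative only; the participant IDs and the use of Python's random module are assumptions, not the study's actual allocation procedure.

```python
# Illustrative 1:1 randomization of 128 enrolled participants to the two study arms.
import random

random.seed(2014)                                       # fixed seed for a reproducible allocation
participant_ids = [f"P{i:03d}" for i in range(1, 129)]  # hypothetical participant IDs

shuffled = random.sample(participant_ids, k=len(participant_ids))
arms = {pid: ("HIV/STI risk reduction" if i < 64 else "general health promotion")
        for i, pid in enumerate(shuffled)}

print(sum(arm == "HIV/STI risk reduction" for arm in arms.values()), "assigned to each arm")  # 64
```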

References: Brawner, B. M., Gomes, M. M., Jemmott, L. S., Deatrick, J. A., & Coleman, C. L. (2012). Clinical depression and HIV risk-related sexual behaviors among African-American adolescent females: Unmasking the numbers. AIDS Care, 24(5), 618-625 Philadelphia Department of Public Health AIDS Activities Coordinating Office Surveillance Unit [AACO]. (2013). HIV and AIDS in the City of Philadelphia: Annual Surveillance Report. Philadelphia, PA: Philadelphia Department of Public Health.

Contact Information Bridgette Brawner, PhD, APRN University of Pennsylvania School of Nursing (215) 898-0715 brawnerb@nursing.upenn.edu


Whose Head Hit the Pillow First: A Review of Sports and Sleep in Preadolescence Background

Results/Conclusions

Studies have demonstrated a slight but persistent link between exercise and sleep.1 However, contradictory results have emerged as to the nature of the relation between exercise and sleep before puberty.2,3 There has been relatively little research on the interplay between light exposure and exercise, despite the similar physiological effects both have on the body.4 Virtually none of this research has been focused on children. There is a gap in the literature concerning the additive or detractive relationship between light and exercise.

• Participating in sports decreased trouble falling asleep compared to nonparticipation. • Light had no significant effects on sleep at the .05 level.

Project Overview • Clarification of the relation between sports and sleep. • Exploration of the relations between light, sports, and sleep.

Methods Data came from the NICHD Study of Early Child Care and Youth Development. N = 1,012 sixth-grade children completed surveys about their sleep habits while mothers completed surveys about afterschool sport participation.
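One way to examine these relations is a simple regression of sleep trouble on sports participation and light exposure, sketched below with assumed variable names; the study's actual models, covariates, and software are not described in this abstract.

```python
# Sketch: regress child-reported trouble falling asleep on mother-reported weekly
# sports participation and a light-exposure measure. File and column names are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("nichd_sleep_sports.csv")   # hypothetical extract of the study data

model = smf.ols("trouble_sleep ~ sport_days + light_exposure", data=df).fit()
print(model.summary())                        # coefficients and p-values for each predictor
```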

Research to Practice
Trouble falling asleep was lowest when sports participation was at 1 day. While sports participation should be encouraged, practicing intensely every day may be problematic.
References
1 Driver, H., & Taylor, S. (2000). Exercise and sleep. Sleep Medicine Reviews, 4(4), 387-402.
2 Dworak, M., Wiater, A., Alfer, D., Stephan, E., Hollmann, W., & Strueder, H. (2008). Increased slow wave sleep and reduced stage 2 sleep in children depending on exercise intensity. Sleep Medicine, 9, 266-272.
3 Pesonen, A., Sjöstén, N., Matthews, K., Heinonen, K., Martikainen, S., Kajantie, E., Tammelin, T., Eriksson, J., Strandberg, T., & Räikkönen, K. (2011). Temporal associations between daytime physical activity and sleep in children. PLoS ONE, 6(8), 1-6.
4 Youngstedt, S., Kripke, D., & Elliot, J. (1999). Is sleep disturbed by vigorous late-night exercise? Medicine & Science in Sports & Exercise, 31(6), 864-869.
Contact Information
Richmond Carlton, Nicole Sorhagen, Judith Stull, & Marsha Weinraub (609) 202-1741 richmond.carlton@gmail.com


Working Memory Training Improves Performance on an Emotional Cognitive Control Task in Adolescents Background

• Working memory training can enhance cognitive skills in adult populations (Morrison & Chein, 2011)
• Teens have lower cognitive control than adults, especially in emotional situations (Steinberg, 2008)
• The effect of working memory training on adolescents’ cognitive control is unknown

Project Overview
• Can working memory training enhance adolescents’ performance in an emotional cognitive control task?

Methods
22 adolescents, ages 13-17. Pre-training assessment, then training (20 sessions across 4 weeks), then post-training assessment.
Training tasks: Complex Span Task (20 minutes/day) and N-Back Task (20 minutes/day)
Go-Nogo Task (Assessment) • Three conditions: respond to the happy face, respond to the angry face, respond to the neutral face (shown …)

Results/Conclusions
Figure: False alarm rates on the emotional go-nogo task at pre-training and post-training for the working memory training (WMT) and active control groups; *p < .05
• Adolescents in the training group performed better at post assessment and exhibited larger improvements on the emotional go-nogo task than those in the control group
• Provides evidence of far transfer from training tasks to an untrained cognitive control task using emotional stimuli

Research to Practice Strategies
• Cognitive training may help attenuate teens’ increased propensity for risk-taking through increasing cognitive control

References
Morrison, A., & Chein, J. (2011). Does working memory training work? The promise and challenges of enhancing cognition by training working memory. Psychonomic Bulletin & Review, 18, 46-60.
Steinberg, L. (2008). A social neuroscience perspective on adolescent risk-taking. Developmental Review, 28, 78–106.

Contact Information

Jason Chein, Gail Rosenbaum, Morgan Botdorf Temple University (763) 229-3782 grosenbaum@temple.edu


SNAP-Ed Eat.Right.Now. Nutrition Education Program: Long-term Impacts of 4th Grade Vegetable Core Intervention
Background
EAT.RIGHT.NOW. (ERN) is a nutrition education outreach program, available to eligible students enrolled in The School District of Philadelphia (SDP). ERN is funded by the Pennsylvania (PA) Department of Public Welfare through PA Nutrition Education Tracks, a part of the U.S. Department of Agriculture’s (USDA) Supplemental Nutrition Assistance Program (SNAP). Research shows that vegetable consumption has positive effects on children’s health and weight status.1,2 Low vegetable intake can be attributed to a number of factors that influence behavior, including knowledge, attitude, self-efficacy, and preference.3,4

Project Overview Vegetable Core (VC) is a fourth grade classroom-based standardized intervention that aims to increase vegetable consumption in children. VC was implemented and evaluated in select SDP schools during the 2011-12 school year (SY). The School District of Philadelphia (SDP) Office of Research and Evaluation (ORE) aimed to investigate changes in student content knowledge and other key VC measures over an extended period of time.

Table 3. Vegetable Core Pre- and Follow-Up Scores
VC Measure      Max. Score   4th Pre Mean   5th Follow-Up Mean   t-value   p-value
Knowledge       4            2.73           3.12                 -5.025    <0.001
Attitude        10           7.87           7.60                 2.381     0.018
Self-efficacy   10           7.73           7.65                 0.608     0.544
Preference      50           35.91          35.43                1.254     0.211
Figure 1. Estimated Marginal Means of VC Knowledge Scores from Pre (1), to Post (2), to Follow-Up (3) Surveys (estimated marginal means: 2.727 at pre, 3.935 at post, 3.196 at follow-up)


Methods During the 2011-12 SY, 588 4th grade students participated in the VC evaluation. They completed pre- and post-surveys which included questions regarding VC knowledge, self-efficacy, attitude, and preference measures. In the 2012-13 SY follow-up evaluation, the final sample was reduced to 365 students.
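The pre/post and follow-up comparisons reported in Tables 1-3 are paired t-tests. A minimal sketch for one measure is shown below; the file and column names are assumptions, not ORE's actual code.

```python
# Paired t-test sketch for the VC knowledge measure, in the style of Tables 1-3.
# Assumes one row per matched student with 'knowledge_pre' and 'knowledge_post' columns.
import pandas as pd
from scipy import stats

df = pd.read_csv("vc_survey_scores.csv")                      # hypothetical matched file
paired = df.dropna(subset=["knowledge_pre", "knowledge_post"])

t, p = stats.ttest_rel(paired["knowledge_pre"], paired["knowledge_post"])
print(f"pre mean = {paired['knowledge_pre'].mean():.2f}, "
      f"post mean = {paired['knowledge_post'].mean():.2f}, t = {t:.2f}, p = {p:.4f}")
```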

Results/Conclusions
Table 1. Vegetable Core Pre- and Post-Survey Scores
VC Measure      Max. Score   4th Pre Mean   4th Post Mean   Mean Change   p-value
Knowledge       4            2.70           3.81            1.11          <0.001
Attitude        10           7.83           8.01            0.18          0.038
Self-efficacy   10           7.67           7.89            0.22          0.016
Preference      50           36.24          37.53           1.29          <0.001
Table 2. Vegetable Core Post- and Follow-Up Scores
VC Measure      Max. Score   4th Post Mean   5th Follow-Up Mean   t-value   p-value
Knowledge       4            3.92            3.16                 10.19     <0.001
Attitude        10           7.90            7.63                 2.19      0.029
Self-efficacy   10           7.85            7.61                 1.91      0.57
Preference      50           37.48           35.59                4.87      <0.001
1. Centers for Disease Control and Prevention. (2013). State Indicator Report on Fruits and Vegetables 2013. National Center for Chronic Disease Prevention and Health Promotion: Division of Nutrition, Physical Activity, and Obesity. Retrieved from http://www.cdc.gov/nutrition/downloads/State-Indicator-Report-Fruits-Vegetables-2013.pdf
2. Tohill, B. C. (2005). Dietary intake of fruit and vegetables and management of body weight. World Health Organization. Retrieved from http://www.who.int/dietphysicalactivity/publications/f&v_weight_management.pdf?ua=1
3. Reinaerts, E., de Nooijer, J., Candel, M., & de Vries, N. (2007). Explaining school children’s fruit and vegetable consumption: The contributions of availability, accessibility, exposure, parental consumption and habit in addition to psychosocial factors. Appetite, 48(2), 248-258. http://dx.doi.org/10.1016/j.appet.2006.09.007
4. Wall, D. E., Least, C., Gromis, J., & Lohse, B. (2011). Nutrition education intervention improves vegetable-related attitude, self-efficacy, preference, and knowledge of fourth-grade students. Journal of School Health, 82(1), 37-43.


• Results from the 2011-12 SY pre- and post-surveys indicated VC significantly improved students’ knowledge, attitude, self-efficacy, and preference regarding vegetables (see Table 1). • Although there was a significant decrease in VC knowledge scores from 4th grade post-survey to 5th grade follow-up (see Table 2), the mean score at 5th grade follow-up was significantly higher than 4th grade pre-survey scores (see Table 3), indicating that VC knowledge was retained to some extent (see Figure 1).

Research to Practice Strategies The VC intervention did not generate persistent long-term positive results. In the absence of the VC lessons, there were significant declines in students’ vegetable attitudes and preferences during the 2012-13 SY. ORE recommends that nutrition education be delivered over longer periods of time with reinforcement lessons or activities in order to help sustain knowledge, self-efficacy, attitude and preference toward vegetable consumption and other healthy eating behaviors.

Contact Information Alicia A. Dahl, M.S. SDP Office of Research and Evaluation (215) 400-6491 adahl@philasd.org


EAT.RIGHT.NOW. Nutrition Education Program: 2012-2013 School Year Reach and Scope Background


EAT.RIGHT.NOW. (ERN) is a nutrition education outreach program, available to Supplemental Nutrition Assistance Program (SNAP) eligible students enrolled in The School District of Philadelphia (SDP). ERN is a component of the PA TRACKS initiative, funded by the U.S. Department of Agriculture (USDA) Food and Nutrition Service (FNS), with matching state and local support. The ERN program is implemented by six community partners: SDP, Drexel University (DU), Health Promotion Council (HPC), Albert Einstein Medical Center (AE), The Food Trust (TFT), and the Urban Nutrition Initiative (UNI).

Project Overview ERN programming is available to any Philadelphia District school with 50% or more of its student population eligible to receive free or reduced school lunch. In the 2012-2013 school year (SY), the program reached grades K-12 in over 250 schools.

Table 2. Number of Nutrition Education Events by Type
Type                  DU       AE       TFT      HPC      SDP      UNI      Total by Activity
Single class          1,132    4,205    10,869   1,480    24       58       17,768
Series classes        28,032   60,666   35,917   17,839   17,626   12,266   172,346
1-on-1                1,570    380      0        1,344    0        0        3,294
Afterschool classes   394      2        4        16       35       296      747
Assembly              0        0        0        0        6,975    0        6,975
Assembly follow-up    0        0        0        0        27,281   0        27,281
Other                 0        0        0        2        1        0        3
Total by Partner      31,128   65,253   46,790   20,681   51,942   12,620   228,414

Methods
All ERN activities were recorded in the STARtracks information system, which details the type and extent of programming carried out by each community partner. The Office of Research and Evaluation (ORE) used STARtracks data to conduct an ERN process evaluation to identify the reach and scope of the program. Data presented here refer to all direct education activities that occurred during the first three quarters of the 2012-13 SY, or October 2012 through June 2013. The fourth quarter was excluded because students were not in school.
Figure 1. Nutrition Education Objectives, 2012-13 SY (distribution of direct education events across objectives: MyPlate/MyPyramid, Whole Grains, Calcium, Snacks, Breakfast, Food Safety, Physical Activity, Water/Beverages, Fruit, Vegetable, Calories In:Out, and Fiber)

Results/Conclusions
Table 1. Percentage of Nutrition Education Events by Community Partner
Community Partner   Events 2012-13   % of All Events 2012-13   Events 2011-12   % of All Events 2011-12   % Change, 2011-12 to 2012-13
DU                  31,128           13.6                      34,663           15.0                      -10.2
AE                  65,253           28.6                      64,647           28.5                      +0.9
TFT                 46,790           20.5                      46,392           19.5                      +0.9
HPC                 20,681           9.1                       21,455           9.0                       -3.6
SDP                 51,942*          22.7                      54,226*          22.8                      -4.2
UNI                 12,620           5.5                       12,160           5.1                       +3.8
Total               228,414          100.0                     237,543          100.0                     -3.8
*Includes assemblies

From the 2011-12 SY to the 2012-13 SY, the number of ERN activities provided to students in Philadelphia decreased by 3.8% (see Table 1).
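The percentage-change column in Table 1 follows directly from the two yearly totals; the short sketch below recomputes it from the counts shown above.

```python
# Recompute Table 1's percentage-change column from the yearly event counts.
import pandas as pd

events = pd.DataFrame(
    {"sy_2012_13": [31128, 65253, 46790, 20681, 51942, 12620],
     "sy_2011_12": [34663, 64647, 46392, 21455, 54226, 12160]},
    index=["DU", "AE", "TFT", "HPC", "SDP", "UNI"])

events["pct_change"] = ((events["sy_2012_13"] - events["sy_2011_12"])
                        / events["sy_2011_12"] * 100).round(1)
print(events)

district_change = (events["sy_2012_13"].sum() / events["sy_2011_12"].sum() - 1) * 100
print(f"district-wide change: {district_change:.1f}%")   # about -3.8%
```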

Research to Practice Strategies
ORE and ERN community partners discussed limitations with using the STARtracks reporting system for tracking program reach and scope. From these discussions, a practice strategy was implemented this SY: pre-assigning objective codes to curricula so data will be consistent and reliable across the partners when entered into the STARtracks objectives report. ERN partners should set measurable reach and scope goals prior to program implementation to provide a meaningful interpretation of the data regarding ERN activities within Philadelphia schools.

Contact Information Alicia A. Dahl, M.S. The School District of Philadelphia Office of Research and Evaluation (215) 400-6491 adahl@philasd.org


Teaching Children to Succeed: A University-Elementary Schools Partnership to Promote Social, Emotional, and Behavioral Health in Students Background

Results/Conclusions

• Children who demonstrate poor social skills and/or behavior problems at school are at risk for continuing behavioral challenges, poor academic achievement, and adverse economic outcomes.

• Findings reveal that those children demonstrating the most frequent and severe behavior problems (lower quartile – 25%) derive the greatest benefit from the program. Scores for most of these children improve from the “clinically significant” to the “at-risk” range, or from the “at-risk” to the “normal” range.

• Social, emotional, and behavioral problems typically worsen as children age, and children face numerous barriers in receiving services after entering school.

Project Overview • Teaching Children to Succeed utilizes session content from the Incredible Years Series, an evidence-based prevention program, to improve children’s social skills while reducing instances of behavior problems. • Teaching Children to Succeed utilizes an innovative model for delivery whereby psychology graduate students from Drexel University and Temple University train and then collaborate with classroom teachers to deliver the program to all children in the classroom. • This yearlong program is currently being delivered to select classrooms of 1st – 3rd grade students in Morton McMichael School in West Philadelphia, and Tanner Duckery School and Paul Dunbar School in North Philadelphia.

• Similarly, those children rated lowest in social competence demonstrate the largest gains in social skills at the conclusion of the program. • Teachers report high satisfaction with the program.

Research to Practice Strategies • Collaborative university-school partnerships can help effectively address student behavioral and socioemotional challenges, especially with those children struggling the most in school. • One novel model is to utilize trained psychology graduate students from area universities to co-lead the intervention sessions with teachers. Partnering with teachers has the potential to help with sustainability. *Funding for this project provided by the Pew Charitable Trusts Foundation

Methods_______________________ • Classrooms and teachers are selected based on principal and teacher input. Children earn small prizes for participating in the program. Pre- and post-test teacher report data is collected for measures of social competence, social skills, and behavior.

Contact Information Brian P. Daly, Ph.D., & David Timony, Ph.D. Drexel University / Delaware Valley College 215-571-4252 brian.daly@drexel.edu


Experiences and Perceptions of Bilingual Students in Two Settings Background The transitional bilingual education model that the School District of Philadelphia offers for the majority of students in bilingual classrooms is but one of several models. In its Action Plan, the SDP cites its intentions to improve and expand bilingual programs. Program model is a key variable to consider in implementing these goals. I seek to examine the messages that program models convey to students.

Project Overview The purpose of this study is to compare the experiences and perceptions of students enrolled in bilingual education programs in two distinct public elementary school settings.

Student attitudes towards languages The El Paso students focused on personal relationships, especially family heritage. Christina said that her family is “100% Mexican” and she would like to learn more Spanish in order to speak with more family members. Several students referred to the importance of knowing languages just in case they were needed during travel or other communication opportunity. Melissa said, “It’s good to know extra languages” to be ready to translate. The Philadelphia students’ ideas about the importance of language learning were far from unified; a larger sample might have helped. A few ideas were: academic success, career goals, family connection, secrecy, and avoiding embarrassment.

Methods I compared district literature on the two programs, observed bilingual 4th grade students in their classrooms, and interviewed 8 students in El Paso, and 4 in Philadelphia.

Results/Conclusions Comparison of the Programs Using Janks’1 Framework on Critical Literacy Education

Research to Practice Strategies Philadelphia’s bilingual programs should provide a model and a curriculum that encourage students to examine and value bilingualism. 1. Janks, H. (2000). Domination, Access, Diversity and Design: A synthesis for critical literacy education. Educational Review, 52 (2), 175-186. 2. Bartlett, L. & García, O. (2011). Additive schooling in subtractive times: Bilingual education and Dominican immigrant youth in the Heights (1-28, 115-150). Nashville: Vanderbilt University Press.

Contact Information Erica Darken School District of Philadelphia 215-227-4435 ebdarken@philasd.org


Studying Teen Social Media Use What do you think is the #1 topic teens ask about on social media? Background

Research to Practice Strategies

Teens are the most active users of social media, yet little is known about how they use these tools for school and personal use. Understanding how students are exchanging information online can help us to create best practices for social media use in schools and school libraries.

Our research can help inform the design of school and library services. We must continue to work with teens to understand their social media use and how best to support their learning via social media tools and services.

Project Overview
The aim of this project is to understand how high school students use social media. To do so, we are examining school and public library policies and practices around social technologies, the types of questions high school students ask and answer online, and how they use social media for school and personal purposes.

Methods
Data was collected in two high schools from December 2012 through May 2013 using online surveys of 158 high school students, 30 semi-structured in-person interviews, and six focus groups with a total of 50 students.
IMLS Grant LG-06-11-0261
NSF GRFP Grant #2011121873

Results/Conclusions
77% of students have asked questions online, and 61% have answered questions online (Table 1). The most common types of questions were related to schoolwork (Table 2). Through our interviews, we found that schools with open social media policies fostered online collaboration, whereas schools with highly restrictive social media policies limited students’ abilities to communicate and collaborate online.

Table 1. Prevalence of Q&A
                           Total (158)   School 1 (98)   School 2 (60)
Ever asked a question      77% (122)     74% (72)        83% (50)
Ever answered a question   61% (97)      60% (59)        63% (38)

Table 2. Question topics, by whether students have asked about them on social media
Question Topic      YES, has asked   NO, but would   NO, and would not
School              76.6%            17.7%           5.7%
Social Activities   56.1%            27.4%           16.6%
Entertainment       50.6%            36.7%           12.7%
College             41.7%            46.2%           12.2%
Shopping            40.5%            35.4%           24.1%
Club or Org         35.3%            41.7%           23.1%
Daily Life          34.2%            41.1%           24.7%
Work                30.4%            47.5%           22.2%
Fashion             28.2%            32.7%           39.1%
Self Appearance     27.8%            22.8%           49.4%
Current Event       26.5%            42.6%           31.0%
Pop Culture         26.1%            43.3%           30.6%
Career              24.4%            48.1%           27.6%
Creative            22.8%            51.9%           25.3%
School Culture      21.8%            41.7%           36.5%
Romantic            19.6%            22.2%           58.2%
Family              19.6%            39.9%           40.5%
Norms               16.6%            46.5%           36.9%
Laws or Rules       16.6%            48.4%           35.0%
Philosophy          15.8%            33.5%           50.6%
Health              14.7%            44.9%           40.4%
Religion            12.0%            28.5%           59.5%
Public Safety       9.6%             40.1%           50.3%
Culture             8.9%             46.5%           44.6%
Sexual Identity     7.6%             21.0%           71.3%
Safe Sex            7.0%             20.4%           72.6%

Contact Information: Michael Dickard, Rachel Magee, Denise Agosto, Andrea Forte Drexel University, CCI 215-895-1930 youthonline@drexel.edu


Out-of-School Time and Student Outcomes Background

Methods_______________________

Multiple studies nationwide suggest that regular participation in out-of-school time programming is associated with increased attendance in school, better grades, increased graduation rates and less negative behaviors such as being suspended from school.

The School District of Philadelphia matched the City of Philadelphia’s Out of School Time (OST) program participants with District student identification numbers to compare program participation to academic and behavioral outcomes at school. The first task of the analysis was to compare OST attendance for all participants to attendance for OST participants who were matched with a District student identification number. This analysis determined the degree to which there was a difference between the groups, with regard to program attendance.
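The matching step described above amounts to joining the OST rosters to District records. Below is a minimal sketch with hypothetical file and column names; the District's actual matching procedure is not detailed in this abstract.

```python
# Sketch: match OST participants to District student records, then compare program
# attendance for matched participants vs. all participants.
import pandas as pd

ost = pd.read_csv("ost_participants.csv")        # hypothetical: name, dob, ost_days_attended
district = pd.read_csv("district_students.csv")  # hypothetical: student_id, name, dob

matched = ost.merge(district, on=["name", "dob"], how="inner")

print("all OST participants, mean days attended:",
      round(ost["ost_days_attended"].mean(), 1))
print("matched participants, mean days attended:",
      round(matched["ost_days_attended"].mean(), 1))
```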

Project Overview For many years, the City of Philadelphia’s Department of Human Services has funded out-of-school time programming, also known as “expanded learning opportunities,” especially in our most distressed neighborhoods. These opportunities have been viewed as an approach to increasing protective factors for youth, such as keeping them safe, constructively engaged, and on the right track toward graduation. Broadly speaking, out-of-school time programming supports the District’s anchor goals 1 and 2.

Results/Conclusions A consistent positive relationship between desirable academic and behavioral outcomes and attendance in OST programs was found. A greater number of OST days attended predicted better outcomes across variables. These relationships were statistically significant and mostly linear, with the exception of the “21-29 days attended” category, for which outcomes deviated in a positive direction. Based on the data available, these deviations seem to represent students who attended summer programs only and may be mostly comprised of students who are in grades lower than 9-12.

Research to Practice Strategies These results are consistent with other research on out-of-school time. Strategies to ensure consistent student participation through high-quality, engaging programming are key. Follow-up studies are in the works.

Contact Information Vicki Ellis School District of Philadelphia and PhillyBOOST 215-400-5872 vellis@philasd.org


PA SNAP-ED funded Nutrition Education Assemblies School District of Philadelphia Background ___________

Results/Conclusions


The School District of Philadelphia’s (SDP) EAT.RIGHT.NOW. (ERN) Nutrition Education Program is a federally funded program through the United States Department of Agriculture’s (USDA) Supplemental Nutrition Assistance Program – Education (SNAP-ED) with a mission to provide programs that increase the likelihood that SNAP recipients will make healthier food choices consistent with the most recent dietary advice. ERN has been offering nutrition and physical activity-themed assembly programs to SDP schools since 2003. These assemblies utilize a variety of entertainment techniques to engage students while educating them on an array of nutrition and health topics. Eligibility for ERN programs is determined on a school-by-school basis from school meal participation: schools with 50% or greater free and reduced meal participation qualify for SNAP-Ed programming.

A total of 188 schools received assembly programs during the 2012-2013 school year. Of those schools, 127 schools received 2-4 different assemblies. As shown in Figure 1, Stages of Imagination was the least performed assembly program in 2012-2013.

In the 2012-2013 school year, a total of 6,975 classrooms attended assembly programs. Figure 2 provides a breakdown by age group of those classrooms receiving assemblies.

Project Overview Five nutrition-themed assemblies toured eligible SDP schools from November 2012 to May 2013. Assemblies were provided by the following companies: • Stages of Imagination, K-3rd grade • Foodplay Productions, K-5th grade • Jump with Jill, K-5th grade • Rapping about Prevention, K-12th grade • Magic by Taddo, K-12th grade Participating classrooms received grade-appropriate workbooks for each student, including activities in accordance with Pennsylvania Common Core Standards that reinforce nutrition concepts presented in the assembly. During the 2012-2013 school year, 307 schools were eligible for the ERN program. Of those 307 schools, 280 participated in the ERN program through various strategies, such as classroom lessons, assemblies, health fairs and taste-testings.

Methods_______________________ Schools participating in the ERN Program were provided with background and contact information for each assembly program by the program office and nutrition educators. Schools were responsible for booking shows with the assembly companies directly and were able to book any or all assemblies available for their grade level. Each booking included approximately 2 performances per school, and follow-up workbooks and teacher guides were provided to all students and teachers attending an assembly.

Research to Practice Strategies______ The SDP’s Office of Research and Evaluation (ORE) conducted an evaluation of assembly programs offered by the ERN Program during the 2012-2013 academic year and will continue the evaluation this spring 2014. The evaluation assessed the impact of assembly performances on students’ knowledge of key nutrition and physical activity messages, students’ reactions to the assemblies and teachers’ perceptions of their students’ reactions, student and teacher satisfaction with the assemblies, and teachers’ interest in, and use of, the assembly workbooks. The ORE evaluates ERN assemblies to provide the program office with feedback regarding the educational quality and entertainment value of the assemblies provided to Philadelphia students. Funded by the Pennsylvania (PA) Department of Public Welfare through PA Nutrition Education Tracks, a part of USDA’s Supplemental Nutrition Assistance Program (SNAP).

Contact Information Name: Muffin Friedman, Director, ERN Program Organization: School District of Philadelphia Phone Number: (215) 727-5765 Email Address: jfriedman@philasd.org


Blueprints LifeSkills Training Outcomes Evaluation Background

Results/Conclusions

The Penn State University EPISCenter is working collaboratively with the University of Colorado to enhance the participation of PA schools in the Blueprints LifeSkills Training grants initiative by providing technical assistance surrounding the collection and assessment of outcomes data.

The goal of this effort is to provide schools with an outcomes report which will serve to identify the impact the LifeSkills Training program had on students’ knowledge, attitudes, and behaviors as they relate to substance use and abuse. The report will assist the District both in assessing if the desired effects of implementation are being obtained and determining the value of providing future support for the program.

Project Overview The Center for the Study and Prevention of Violence (CSPV) at the University of Colorado was awarded funding to disseminate the LifeSkills Training (LST) drug and alcohol prevention program in several states, including Pennsylvania. Fifty-one school districts across PA are participating, and 36 school districts are collecting outcomes data. It is anticipated this project will last three years.

Methods_______________________

Research to Practice Strategies Placing an emphasis on outcomes data demonstrates strong leadership and vision for future sustainability of evidence-based programming within the school district. Having data will inform decision making and assist in obtaining buy in from key stakeholders.

Map of Participating School Districts

Outcomes data will be collected via administration of a pre/post survey instrument. In year one a pre/post will be given to students receiving level one of the curriculum, and a follow-up post-test can be administered in years two and three following delivery of the booster lessons. The surveys are provided by the Penn State EPISCenter and processed in collaboration with the Penn State Survey Research Center.
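Once pre/post surveys are returned, an outcome summary could be computed along the lines sketched below. The file and column names are assumptions for illustration; the actual scoring and processing are handled in collaboration with the Penn State Survey Research Center.

```python
# Sketch: mean pre/post change and a standardized effect size for one LST outcome scale.
import pandas as pd

df = pd.read_csv("lst_prepost.csv")                # hypothetical: one row per student with
complete = df.dropna(subset=["attitude_pre",       # pre and post scale scores
                             "attitude_post"])

change = complete["attitude_post"] - complete["attitude_pre"]
cohens_d = change.mean() / change.std(ddof=1)      # standardized within-student change

print(f"n = {len(complete)}, mean change = {change.mean():.2f}, d = {cohens_d:.2f}")
```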

Contact Information Kris Glunt Penn State EPISCenter (814) 863-2568 kglunt@episcenter.org


Praise for Effort Increases Motivation and Learning

Background

Results/Conclusions

Laboratory-based experiments show that praising children’s effort, actions, and strategies (process praise: “you worked hard”, “good job”) leads to a growth mindset, while praising ability (“you’re smart”, “you’re good at that”) leads to a fixed mindset1.

• Children who heard more process praise at age 1-3 years: • Believed more strongly in a growth mindset in 2nd grade.5 • Had higher 4th grade math achievement.

Growth mindset: Belief that intelligence can improve with effort. • Leads to increased effort in response to challenges2 • Higher course grades and improved test scores3,4 Fixed mindset: Belief that intelligence is a fixed trait that cannot change. • Leads to helpless response to challenges2

Project Overview The present study asks whether these labbased studies translate into the real world. Does parents’ process praise to young children lead to a growth mindset and improved academic achievement in elementary school?

Methods Longitudinal study of 53 parents & children. • Selected to reflect demographics of Chicago area in terms of income and race/ethnicity. Age 1-3 years • Observed parent-child interactions at home for 4.5 hours; recorded amount of parents’ process praise (e.g., “Good job”). Age 7-8 years (2nd grade) • Child growth mindset questionnaire • Math achievement tests (Woodcock-Johnson III Tests of Achievement, Applied Problems and Calculation subtests). Age 9-10 years (4th grade) • Math achievement tests.

• Children’s growth mindset in 2nd grade accounted for the relation between parents’ praise and 4th grade math scores.
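A simple way to probe this mediation pattern is to compare the praise coefficient with and without the mindset measure in the model, as sketched below. Variable names are assumptions; this is not the authors' actual analysis.

```python
# Sketch of a basic mediation check: early process praise -> 2nd-grade mindset -> 4th-grade math.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("praise_longitudinal.csv")   # hypothetical: process_praise, mindset_g2, math_g4

total = smf.ols("math_g4 ~ process_praise", data=df).fit()                 # total effect
direct = smf.ols("math_g4 ~ process_praise + mindset_g2", data=df).fit()   # adjusting for mediator

print("praise coefficient, total effect:  ", round(total.params["process_praise"], 3))
print("praise coefficient, with mediator: ", round(direct.params["process_praise"], 3))
# A coefficient that shrinks once mindset is added is consistent with mediation;
# a formal test would bootstrap the indirect effect.
```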

Research to Practice Strategies • Findings suggest that praising children for their effort, actions, and strategies can improve motivation and learning. • Although this study was with parents, it is likely that teachers’ praise has a similar influence on students’ motivation. Action points: • Increase use of effort-oriented praise such as “You’re working hard”, “I like how you tried it different ways”, and “Good job!” • Avoid use of ability-oriented praise such as “You’re so smart”, “You’re good at that”, and “You’re a good reader.” References 1. Dweck, C.S. (2006). Mindset: The New Psychology of Success. New York, NY: Random House. 2. Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can undermine children's motivation and performance. Journal of Personality and Social Psychology, 75(1), 33-52. doi: 10.1037/0022-3514.75.1.33 3. Blackwell, L.S., Trzesniewski, K.H., & Dweck, C.S. (2007). Implicit theories of intelligence predict achievement across an adolescent transition: A longitudinal study and an intervention. Child Development, 78(1), 246-263. 4. Good, C., Aronson, J., & Inzlicht, M. (2003). Improving adolescents’ standardized test performance: An intervention to reduce the effects of stereotype threat. Journal of Applied Developmental Psychology, 24, 645 – 662. 5. Gunderson, E. A., Gripshover, S. J., Romero, C., Dweck, C. S., Goldin-Meadow, S., & Levine, S. C. (2013). Parent praise to 1- to 3-year-olds predicts children's motivational frameworks 5 years later. Child Development, 84(5), 1526-1541.

Contact Information Elizabeth A. Gunderson Temple University Dept. of Psychology Phone: (215) 204-1258 liz.gunderson@temple.edu


Math Anxiety in Early Elementary School

Background

Results/Conclusions

• Math anxiety is linked to low math achievement in adolescents and adults1.

• Some children in 1st and 2nd grades already reported math anxiety. • Children’s math anxiety was correlated with their math achievement, especially among those high in working memory (WM)4.

• Math anxiety leads to worries that take up working memory (WM) (a limited cognitive resource), which in turn disrupts math performance2. • Adults with high WM capacity are more affected by math anxiety than those with low WM, because they use WM-intensive strategies that are disrupted by anxiety3.

Project Overview The present study examines: • Whether math anxiety is already present in 1st and 2nd grades, • Whether math anxiety is related to math achievement at this age, and • Whether working memory (WM) capacity moderates the relation between math anxiety and achievement.

Math anxiety can develop and affect math achievement as early as 1st and 2nd grades. It is important for teachers to be aware of emotional barriers to math learning.

Methods
Participants: 154 1st and 2nd grade children from a large urban school district.
Measures:
Child Math Anxiety Questionnaire (CMAQ) (8 items, Cronbach’s α = 0.55). Sample item: “How do you feel when getting your math book and seeing all the numbers in it?” Responses range from Very Nervous to Very Calm.
Working Memory: Digit span (WISC-III): Repeat numbers in forward and backward orders.
Math Achievement: Woodcock-Johnson III Tests of Achievement, Applied Problems subtest.

Research to Practice Strategies
Further research is needed to identify strategies to effectively relieve and/or prevent children’s math anxiety.

References
1. Hembree, R. (1990). The nature, effects, and relief of mathematics anxiety. Journal for Research in Mathematics Education, 21(1), 33-46.
2. Beilock, S. L. (2008). Math performance in stressful situations. Current Directions in Psychological Science, 17(5), 339-343.
3. Beilock, S. L., & DeCaro, M. S. (2007). From poor performance to success under stress: Working memory, strategy selection, and mathematical problem solving under pressure. Journal of Experimental Psychology: Learning, Memory, & Cognition, 33, 983-998.
4. Ramirez, G., Gunderson, E. A., Levine, S. C., & Beilock, S. L. (2013). Math anxiety, working memory, and math achievement in early elementary school. Journal of Cognition and Development, 14(2), 187-202.
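The moderation finding (anxiety mattering most for children high in working memory) corresponds to an interaction term in a regression of math achievement on anxiety and working memory. The sketch below uses assumed variable names; see reference 4 for the published analysis.

```python
# Sketch: does working memory span moderate the math anxiety - math achievement relation?
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("math_anxiety_g1g2.csv")   # hypothetical: cmaq_score, digit_span, wj_applied

model = smf.ols("wj_applied ~ cmaq_score * digit_span", data=df).fit()
print(model.summary())
# A significant cmaq_score:digit_span term would indicate that the anxiety-achievement
# link differs for children with higher vs. lower working memory.
```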

Contact Information Elizabeth A. Gunderson & Gerardo Ramirez Temple University & UCLA Dept. of Psychology Phone: (215) 204-1258 liz.gunderson@temple.edu


Vision in Preschoolers – Hyperopia in Preschoolers (VIP-HIP) Background

Results/Conclusions

Refractive errors such as astigmatism or nearsightedness are easily corrected with consensus among vision care providers as to when and how to prescribe. In contrast, there are no evidence-based practice guidelines for managing moderate amounts of farsightedness (hyperopia) in children who do not have an eye turn or lazy eye. Parents may currently receive diverse recommendations including full time, part time or no eyeglass wear depending on which eye care provider they consult with.

Parents are provided: • Written feedback of the results of the eye exam and educational assessment • Free eyeglasses for their child if they do not have insurance Recruitment ends August 2014. Results will be analyzed, written, published and disseminated during 2015.

Yet, moderate farsightedness has been shown to be associated with decreased reading ability, attention and visual-motor integration skills in young children.

The possible benefit to society of the VIP-HIP Study will be a better understanding of the impact of uncorrected farsightedness on reading readiness and early educational performance. This information will help eye care providers and vision researchers learn how and when to prescribe for moderate farsightedness and determine how early correction of this type of refractive error might benefit children in the future.

Today’s preschooler faces increased visual demands along with greater educational expectations. It is important to understand any effects of uncorrected farsightedness on early educational performance.

Project Overview VIP-HIP is a 4 year study in 3 cities whose purpose is to compare reading readiness, attention, development and eye-hand coordination skills among preschool children (age 4-5 years) who are farsighted vs. children without any refractive error.

Methods_______________________ Children identified through vision screenings at their school are invited to participate. • Parents ask questions and complete written informed consent. • An enrolled child receives a comprehensive eye exam to determine his/her: • refractive error (farsighted or no eyeglasses required) • clarity of vision far away and close • eye alignment or teaming and • eye health Children found to be eligible return for an educational assessment that includes tests of: • Literacy, Attention, Development and Eye-hand coordination Each study visit: • Takes up to two hours • Occurs at The Eye Institute or on a Mobile Vision Unit at the child’s school • Compensates parents for their time .

Research to Practice Strategies

1.Vision in Preschoolers SG. Preschool vision screening tests administered by nurse screeners compared with lay screeners in the vision in preschoolers study. Invest Ophthalmol Vis Sci. 2005;46(8):2639-48. 2.Cotter SA. Management of childhood hyperopia: a pediatric optometrist's perspective. Optom Vis Sci. 2007;84(2):103-9. 3.Moore B, Lyons SA, Walline J. A clinical review of hyperopia in young children. The Hyperopic Infants' Study Group. J Am Optom Assoc. 1999;70(4):215-24. 4.AOA. Optometric Clinical Practice Guideline: Care of the Patient with Hyperopia. St. Louis: American Optometric Association; 1997 Contract No.: Document Number|. 5.Miller J, Harvey M. Spectacle prescribing recommendations of AAPOS members. AAPOS. 1998;35:51-2. 6.Lyons SA, Jones LA, Walline JJ, Bartolone AG, Carlson NB, Kattouf V, et al. A survey of clinical prescribing philosophies for hyperopia. Optom Vis Sci. 2004;81(4):233-7. 7. Williams C, Bell J, Harrad R. Preschool vision screening and reading ability at age 7. Invest Ophthalmol Vis Sci. 2004;45(s):s94. 8.Williams WR, Latif AH, Hannington L, Watkins DR. Hyperopia and educational attainment in a primary school cohort. Arch Dis Child. 2005;90(2):150-3. PMCID: 1720267. 9. Cornelissen P, Bradley L, Fowler S, Stein J. What children see affects how they read. Dev Med Child Neurol. 1991;33(9):755-62. 10. Shankar S, Evans MA, Bobier WR. Hyperopia and emergent literacy of young children: pilot study. Optom Vis Sci. 2007;84(11):1031-8. 11.Roch-Levecq AC, Brody BL, Thomas RG, Brown SI. Ametropia, preschoolers' cognitive abilities, and effects of spectacle correction. Arch Ophthalmol. 2008;126(2):252-8; quiz 161. 12.Rosner JR, J. The relationship between moderate hyperopia and academic achievement: how much plus is enough? J Am Optom Assoc. 1997;68:648-50. Supported by Grant # RO1–EY021141 The National Eye Institute – National Institutes of Health

Contact Information Names: Whitley Harbison, Leah Sack, Jasmine Campbell, Dr. Elise Ciner Organization: Salus University Phone Number: 215-780-1400 ext. 1489 Email Address: Eciner@salus.edu


Developing A Computerized Language Assessment for Preschoolers
Background


• Early language ability is critical for later reading success.1,2 • We need an assessment for early language that is reliable, valid, culturally-neutral and easy for teachers to use. • Supports Strategy 1 of the Action Plan 2.0—promotes effective assessments to inform language instruction, particularly of ELL students.


Project Overview Key Components of the Assessment: 1. Tests vocabulary and grammar 2. Assesses both products (what children know) and processes (how children learn) of language development 3. Designed for both English- and Spanish-speaking children

Methods 300 English monolingual and 76 SpanishEnglish bilingual students from preschools and Head Starts in Philadelphia, Delaware and Massachusetts were recruited.

Figure: Performance on Product and Process Measures (percent correct for Total Product and Total Process scores at ages 3, 4, and 5). ** Data shown are for 300 monolingual children

Research to Practice Strategies • Reveals what children can learn as well as the products of what they have learned. • Identifies children who need further screening and language interventions. • The Spanish-English version allows for assessment in both languages and provides information about distributed language ability in bilingual children. • The assessment is still in the development stage.

Sample Item from the Known Verb section (image). Funded by the Institute of Education Sciences Grant R305A110284 to U of Delaware, Temple, Smith College & Laureate Learning, PIs: R. Golinkoff, H. Hirsh-Pasek, A. Iglesias, J. de Villiers, & M. Wilson.

Results/Conclusions • Older children scored better than younger children. • Children scored slightly better on process measures than product measures indicating that their ability to learn (process) exceeds how much they have learned to that point (product).

1 Catts, H. W., Fey, M. E., Tomblin, J. B., & Zhang, Z. (2002). A longitudinal investigation of reading outcomes in children with language impairments. Journal of Speech, Language, and Hearing Research, 45, 1142-1157. 2 National Early Literacy Panel. (2009). Developing Early Literacy: Report of the National Early Literacy Panel, Executive Summary.

Kathy Hirsh-Pasek, Amy Pace, & Paula Yust Temple University 267-468-6810 infantlab@temple.edu


Do Early Spatial Skills Lead to Later Spatial and Math Gains? Background

• Childhood spatial activities (block building, etc.) influence spatial abilities.1
• Spatial skills are essential for success in many STEM careers.2,3
• Informs Strategy 1 of the Action Plan 2.0—indicates the importance of spatial activities in the classroom.

Results/Conclusions
• Early spatial skills predict later spatial and math gains.
• Spatial skills at the age of 3 and 4 predict scores on standardized tests of space and math at age 4, even after controlling for vocabulary scores.
• Spatial and math skills at age 3 predict kindergarten math skills.
• The TOSA can identify children who need additional spatial training.

Project Overview
Main question: Do early spatial skills predict future spatial and math abilities?

Methods
102 3-year-olds completed the Test of Spatial Abilities (TOSA; involves manipulating 2D and 3D shapes) and math & vocabulary assessments. Then 81 of these children at age 4 and 60 at age 5 completed more math and spatial assessments.

Research to Practice Strategies
• Shows the importance of early spatial skills in later spatial and math performance.
• Early spatial learning may have long-term impacts on STEM outcomes.
• Increasing spatial activities in the classroom (blocks, etc.) could help improve math and spatial abilities.

Sample 2D and 3D Items on the TOSA

1 Levine, S. C., Ratliff, K. R., Huttenlocher, J., & Cannon, J. (2012). Early puzzle play: A predictor of preschoolers' spatial transformation skill. Developmental Psychology, 48(2), 530. 2 Newcombe, N. S. (2010). Picture This: Increasing Math and Science Learning by Improving Spatial Thinking. American Educator, 34(2), 29. 3 Wai, J., Lubinski, D., Benbow, C. P., & Steiger, J. H. (2010). Accomplishment in science, technology, engineering, and mathematics (STEM) and its relation to STEM educational dose: A 25-year longitudinal study. Journal of Educational Psychology, 102(4), 860.

Kathy Hirsh-Pasek & Nora Newcombe Temple University Roberta Golinkoff & Brian Verdine University of Delaware infantlab@temple.edu 267-468-8610


Proportional Reasoning: An Intervention to Support Quantitative Development Background

Methods

Children often have difficulty with proportional reasoning. Performance is especially low when proportions are presented in discrete, countable units1. Discrete units tempt children to use erroneous counting-based strategies, sacrificing advantageous heuristics, such as using the halfmark boundary2,3.

PARTICIPANTS: • ~ 60 2nd grade students • ~ 30 per condition

Unit-to-Unit Match

CONDITIONS: • Experimental Condition • 24 trials • 8 continuous trials, presented FIRST • 8 mixed trials, (continuous & discrete proportions) presented SECOND; no unitto-unit matches (see Fig. 1) • 8 discrete trials, presented LAST; unit-tounit matches presented in each trial (see Fig. 2) • Control Condition • Same as Ex. Condition, except that continuous, mixed, and discrete trials presented in reverse order Fig. 1: Mixed Trial

Non-Integer Misconceptions

1. Transfer of whole number properties to non-integers (e.g., a fraction’s magnitude is not equivalent to the magnitude of its numerator or denominator).
2. Non-integers understood ONLY in terms of part-to-whole relations (e.g., 4/3 does not imply 4 parts of a 3-part entity).
3. Application of one arithmetic rule to ALL non-integer operations (e.g., addition & subtraction: denominator maintained; multiplication & division: denominator NOT maintained).
Fig. 2: Discrete Trial

The proposed intervention aims to address the first misconception by: 1. Presenting continuous proportions BEFORE discrete proportions. • Will heuristics override erroneous counting strategies? 2. Prompting children to consider part-to-whole relations when estimating magnitudes of non-integers. • Will children infer that the referents’ green juice is always above or below the container’s half-mark? • If so, will they adopt a “half-mark” strategy?

References
1 Boyer, Levine, & Huttenlocher, 2008
2 Boyer et al., 2008
3 Spinillo & Bryant, 1999

Contact Information Name: Corinne A. Holmes Organization: Temple University Phone Number: 609.468.4277 Email: corinne.holmes@temple.edu


Playing to Learn: Vocabulary Development in Early Childhood Background • Many early childhood education programs have shifted their focus from play toward more academic curricula. • Yet, evidence suggests that play—if it is adult-supported—supports strong learning1. • The Read-Play-Learn (RPL) project investigates the effectiveness of different types of play on vocabulary development. • RPL aligns with the SDP Action Plan 2.0: Strategy 1: Improve student learning, Part D: Implement a literacy-rich early childhood continuum.

• These graphs illustrate post-test scores for the receptive and expressive vocabulary measures, after controlling for pre-test performance, attendance, age, and book theme. Average proportion of receptive items correct: Free Play 53.8%, Directed Play 59.8%, Guided Play 60.7%. Average score per expressive item: Free Play 0.47, Directed Play 0.61, Guided Play 0.58.

Project Overview • Purpose: Compare the effectiveness of three types of play, combined with book-reading, for increasing vocabulary. • Population: 4- to 5-year-olds in Head Start (Eastern PA) and PreK (Tennessee) programs; April – May 2012.
Methods • Four books were selected, each with 10 target vocabulary words. • Language Specialists (LS) read each book to small mixed-gender groups for four days. • After all readings, each group had one type of play session: Free – play without LS guidance; Directed – LS directed children to retell the story, defined words, asked closed-ended questions; Guided – children led the play; LS joined and incorporated definitions, closed- and open-ended questions. • Receptive and expressive vocabulary knowledge was assessed before/after the intervention.
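The condition comparison, adjusting for pre-test and the other covariates listed above, is essentially an analysis of covariance. Below is a rough sketch with assumed variable names, not the project's actual code.

```python
# Sketch: post-test receptive vocabulary by play condition, adjusting for pre-test,
# attendance, age, and book theme. File and column names are illustrative only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("rpl_vocabulary.csv")   # hypothetical: condition, pre/post scores, covariates

ancova = smf.ols(
    "post_receptive ~ C(condition) + pre_receptive + attendance + age + C(book_theme)",
    data=df).fit()
print(ancova.summary())                   # C(condition) terms give adjusted group differences
```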


Results/Conclusions • Directed and Guided Play showed stronger target vocabulary gains than Free Play. There were no differences between the two types of adult-supported play.
Research to Practice Strategies • Play can be a valuable tool in early childhood education, allowing children to have fun while gaining important knowledge. • Not all play is equal. Children learn vocabulary better from play that is supported by an adult who incorporates definitions and discussions about the words.
1 Christie & Roskos, 2006; Han, Moore, Vukelich, & Buell, 2010; Lillard & Else-Quest, 2006
Contact

Kathy Hirsh-Pasek, Ph.D., Tamara Spiewak Toub, Ph.D., Brenna Hassinger-Das, Ph.D Temple University’s Read-Play-Learn Team khirshpa@temple.edu 267-468-8610, tamara.spiewak.toub@temple.edu, hassinger.das@temple.edu


Learning from play: The impact of spatial game play on spatial ability Background

• Spatial reasoning is crucial for learning in science, technology, engineering, and math (STEM) domains.1 • Included in both Common Core and NGSS. • Research has found gender and SES differences in children’s spatial ability, spatial game play, and spatial learning. • Males and higher SES tend to have an advantage.2 • Spatial ability is malleable, and spatial game play is related to better spatial ability.3 • Skills typically more malleable earlier, and provide an early foundation for later learning.

Project Overview
• We investigate the influence of spatial game play (i.e., playing with blocks, puzzles, and board games) on children’s spatial ability, addressing the research questions: 1. Is spatial game play related to spatial ability? 2. Are there group differences in spatial game play, and do these differences impact the effect of spatial game play on spatial ability?

Results/Conclusions
1. Frequency of playing with puzzles, blocks, and board games related to spatial score.
Figure 1. Spatial ability score for different levels of reported frequency of spatial game play. Standardized WPPSI Block Design score by reported frequency (rarely, sometimes, often) for three toy categories: puzzles, blocks, and board games; dolls, balls, cars, trucks; and bicycle, skateboard, scooter, swing set. Annotated contrasts: p < .001, d = .35; p = .003, d = .26; NS.
• F(2, 890) = 8.433, p < .001, controlling for verbal ability and family income
2. No group differences in spatial game play, and no interactions between gender or SES and spatial game play on the influence on spatial ability.

Methods
Participants: N = 1,267 (51% female)
• Ages 2.50 – 7.60 years; mean age = 4.67
• Reported race/ethnicity: 52.5% White, 24.8% Hispanic, 15.2% Black, 3.1% Asian, 4.4% Other
• SES (household income): 35% low, 28% medium, 37% high (<$30,000, $30-70,000, >$70,000)
Measures:
• Spatial and verbal ability: Wechsler Preschool and Primary Scale of Intelligence (WPPSI).4
• Spatial game play: Home-Environment Questionnaire (HEQ). “How many times a week does the child play with the following types of toys and games?” (never/rarely, sometimes, often) a. Puzzles, blocks, and board games b. Dolls, balls, cars, trucks c. Bicycle, skateboard, scooter, swing set

Research to Practice Strategies
Spatial games and activities improve spatial ability, which can help children learn in STEM domains. Our current research is exploring how spatial ability develops through these activities.

The Spatial Intelligence & Learning Center is funded by an NSF Science of Learning Centers grant (SBE 1041707).
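The analysis behind Figure 1, an ANCOVA of Block Design scores on reported play frequency controlling for verbal ability and family income, could be sketched as follows. Variable names are assumptions, not the project's actual code.

```python
# Sketch: spatial ability by reported frequency of spatial game play,
# controlling for verbal ability and household income.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("heq_wppsi.csv")   # hypothetical: block_design, play_freq, verbal_score, income

model = smf.ols("block_design ~ C(play_freq) + verbal_score + income", data=df).fit()
print(model.summary())               # C(play_freq) contrasts correspond to the group differences
```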

1 Uttal, D. & Cohen, C. (2012). Spatial thinking and STEM education: When, why and how? In Psychology of Learning and Motivation (B. Ross, Ed.). 2 Levine, S. C., Vasilyeva, M., Lourenco, S. F., Newcombe, N. S., & Huttenlocher, J. (2005). Socioeconomic status modifies the sex difference in spatial skill. Psychological Science, 16(11). 3 Levine, S. C., Ratliff, K. R., Huttenlocher, J., & Cannon, J. (2012). Early puzzle play: A predictor of preschoolers' spatial transformation skill. Developmental Psychology, 48(2), 530. Uttal, D. H., Meadow, N. G., Hand, L. L., Alden, A., Warren, C., & Newcombe, N. S. (2012). The Malleability of spatial skills: A meta-analysis of training studies. Psychological Bulletin. Casey, B. M., et al. (2008). The development of spatial skills through interventions involving block-building activities. Cognition and Instruction, 26(3), 269-309. 4 Wechsler, D. (2002). Wechsler Preschool and Primary Scale of Intelligence – Third Edition (WPPSI-III). San Antonio, TX: Pearson »PsychCorp.

Contact Information Jamie Jirout, Nora Newcombe Spatial Intelligence and Learning Center (SILC) Temple Infant & Child Lab, Temple University, Psychology 215-468-8610 jamie@temple.edu


City Year Program Evaluation Background

Results/Conclusions

City Year is an education-focused nonprofit organization that partners with high-need public schools to enhance the quality of the school learning environment. City Year’s ABC model supports students in the areas of Attendance, Behavior, and Course Performance. This is achieved by deploying teams of City Year Corps members to the schools. With funding from the William Penn Foundation, City Year is being implemented in 11 schools in the School District of Philadelphia (SDP) over two years (2013-14 and 2014-15). The Office of Research and Evaluation (ORE) is evaluating City Year’s effectiveness in meeting the above goals. This poster presents preliminary results from the 2013-14 mid-year analysis (October-March).

1) 100% of principals and 82% of teachers surveyed indicate that they feel well-informed about City Year's mission, with 90% of principals agreeing that City Year’s mission directly aligns with the school’s action plan. • This finding was confirmed by interview data, with all 11 principals stating they have a clear understanding of City Year’s mission and that this mission aligns with the school’s action plan. • “When I [spoke with City Year] at the table, [City Year] matched perfectly” (Principal, School A). 2) 90% of principals and 83% of teachers surveyed indicate that they are satisfied or very satisfied with their overall City Year experience. • This finding was corroborated by interview data, with all principals stating positive perceptions of City Year. • “It’s been a very strong educational relationship…I am very happy with [City Year]” (Principal, School K). 3) 81% of teachers surveyed indicate that they are satisfied or very satisfied with the overall impact of City Year on their class/students. However, 59% of principals express concerns regarding the quality of instruction among City Year staff. • These findings were confirmed by interview data, with 6 of the 11 principals expressing the need for more training in the areas of academic development and interventions. • “For the appropriate student that needs a little more help, I think City Year is great. I am not sure that [City Year] is the answer to the student that is below basic to get them up to proficient.” (Principal, School G)

Project Overview_______________ ORE is employing a mixed-methods, quasi-experimental design to evaluate City Year's programming. The Logic Model in Figure 1 presents the complete scope of the evaluation. Figure 1. City Year Logic Model

Methods_______________________ Quantitative Data Collection Principals and other school administrators (n=20) and teachers (n=71) completed the mid-year feedback survey in December 2013. The purpose of the survey was to obtain their feedback on City Year in their schools, in the areas of: 1) the City Year team’s performance, 2) their relationship with City Year; 3) their understanding of City Year’s model; and 4) City Year‘s programmatic impact on enhancing school climate. Data was analyzed using descriptive statistics. Qualitative Data Collection All 11 participating principals were interviewed in February and March 2014 using a semi-structured interview. The interview protocol was developed around the survey areas described above. Interviews were designed to help the evaluation team gain an in-depth understanding of City Year’s impact in the schools as well as provide a forum to identify areas of improvement.

Research to Practice Strategies The evaluation plan focuses on providing City Year with on-going formative feedback. As such, below are a set of mid-year recommendations the organization is working to address: 1. Enhancing Professional Development opportunities between teachers and City Year staff. 2. Creating opportunities for earlier and regular planning meetings with school administrators and City Year staff. Contact Information Sabriya K. Jubilee, PhD, Shelly Engelman, PhD School District of Philadelphia Office of Research and Evaluation sjubilee@philasd.org (215) 400-5336 sengelman@philasd.org (215) 400-5510


An evaluation of Media Aware, a media literacy education substance abuse prevention program for high school students Background

Today's youth spend a large portion of every day using some form of media1 and are repeatedly exposed to positive messaging about alcohol, tobacco, and other drugs.2 Media literacy education (MLE) has been used effectively with children and younger adolescents to counteract the influence of unhealthy media messages on risky decision-making.3 However, no evidence-based MLE substance abuse prevention program had been developed and evaluated for use with high school students.

Results/Conclusions
• Participation in Media Aware was found to impact several substance use-related outcomes for high school students (i.e., lower intentions to use alcohol for current substance users and lower intentions to use tobacco for males; lower positive expectancies about substance use for females; and lower normative beliefs about alcohol and other illegal drug use).
• Both teachers and students reported satisfaction with participating in Media Aware.
• This was the first study to demonstrate that when students are taught both critical analysis of media messages and accurate information about the consequences and prevalence of drug use, it reduces the belief that substance use is normative, which in turn reduces intent to use.

Project Overview The current study evaluated the Media Aware program (previously called Media World), a new media literacy education (MLE) substance abuse prevention program for high school students, in a randomized control trial design. Media Aware is an MLE program that has the goal of reducing the use of alcohol, tobacco, and other drugs in high school students. The program consists of 12 multimedia, interactive, 50-minute lessons.

Methods_______________________ An RCT using a pretest-posttest design was used to evaluate the effectiveness of the Media Aware program for substance abuse prevention. Teachers also reported on the usability and their fidelity of implementation of the program. Student outcomes were analyzed using SAS PROC MIXED with Teacher(Class) serving as the repeated variable. Pretest scores for each outcome were included as predictor variables; therefore, outcome variable means are reported as adjusted posttest mean scores. Participants: • 18 high school teachers (38 classes) • 624 high school students • ages 12-19 (M=14.76; SD=.94) • 50% male • 64% white; 16% black; 20% other • 11% Hispanic/Latino
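For readers who want to see the shape of this analysis, a minimal illustrative sketch follows. It is not the authors' SAS PROC MIXED code: it approximates a pretest-adjusted, class-clustered comparison in Python, and the file and column names (posttest, pretest, condition, class_id) are assumptions.

```python
# Illustrative sketch only, not the study's actual SAS code.
# Approximates a pretest-adjusted comparison of conditions with students
# clustered in classes; file and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("media_aware_outcomes.csv")  # hypothetical analysis file

# Random intercept for class; pretest score as covariate, so condition effects
# are reported on adjusted posttest means (as in the poster).
model = smf.mixedlm("posttest ~ pretest + C(condition)",
                    data=df, groups=df["class_id"])
print(model.fit().summary())
```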

Research to Practice Strategies This information is useful because schools can adopt this easy-to-teach, engaging, and effective substance abuse prevention program in their high school health classes. Footnotes 1 Rideout, Foehr, & Roberts, 2010. 2 King & Siegel, 2001; Austin & Hust, 2005; Sargent, Wills, Stoolmiller, Gibson, & Gibbons, 2006; Primack, Dalton, Carroll, Agarwal, & Fine, 2008. 3 Kupersmidt, Scull, & Austin, 2010; Austin, Pinkleton, Hust, & Cohen, 2005; Kupersmidt, Scull, & Benson, 2012. Acknowledgements: This project was supported by grant R44DA018495 from the National Institute on Drug Abuse to Dr. Kupersmidt (PI) and Dr. Scull (Co-I). The content in this poster is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. We would like to thank school administrators, health professionals, and the teachers and students in the school study sites for assistance with this project.

Contact Information Janis Kupersmidt, Ph.D. jkupersmidt@irtinc.us Tracy Scull, Ph.D. tscull@irtinc.us Innovation Research & Training 919-493-7700


Trends in Relative Weight Over One Year in US Urban Youth Background_____________________________

Figure 1. Longitudinal tracking of students’ weight categories

Few large-scale longitudinal datasets exist, particularly with low-income and minority youth who are at higher risk for obesity and obesity-related chronic diseases. Longitudinal datasets allow for the examination of individual trajectories in addition to describing prevalence, which will aid in understanding current childhood obesity trends.

Project Overview_________________________ This study seeks to: 1) Describe recent data on baseline estimates and one-year changes in measures of relative weight (weight status, z-BMI, BMI percentile, BMI) in a large sample of diverse youth 2) Examine relative weight trend variation by sex, race, grade, and obesity severity 3) Investigate how cross-sectional and longitudinal trends may differ

Methods______________________________ Youth were sampled from 55 schools in high-risk areas in Philadelphia that were identified based on health, crime, and income data. Approximately 94% of eligible students were enrolled at baseline, and 86% of eligible students were enrolled at follow-up. Measured heights and weights were collected in 5-14 year-old youth over two assessments approximately 1 year apart to form 3 samples:
• Baseline cross-sectional sample, n=17,727
• Follow-up cross-sectional sample, n=18,476
• A subset of longitudinal data, n=13,305

Table 1. Longitudinal sample demographics at post (n=13,305)
                     N        %      Healthy Weight   Overweight   Obese   Severely Obese
Whole Sample         13,305   100    59.8             17.1         14.4    8.7
Sex
  Males              6,890    51.8   61.0             16.5         13.8    8.7
  Females            6,415    48.2   58.6             17.7         15.1    8.6
Race
  African American   8,465    63.6   60.7             16.7         13.6    9.0
  Hispanic           2,351    17.7   53.6             18.8         17.8    9.7
  White              961      7.2    58.9             17.0         15.2    8.9
  Asian              1,017    7.6    67.6             15.8         12.1    4.4
  Other              511      3.8    59.7             17.4         16.0    6.8
Grade
  First              74       0.6    59.5             21.6         9.5     9.5
  Second             2,335    17.6   64.2             16.1         12.5    7.3
  Third              2,388    17.9   61.9             16.5         13.7    7.9
  Fourth             2,184    16.4   62.1             15.7         14.2    8.0
  Fifth              2,230    16.8   57.4             17.0         16.1    9.6
  Sixth              2,050    15.4   56.1             19.0         14.6    10.2
  Seventh            2,044    15.4   56.5             18.3         15.8    9.4
Table 2. Weight category prevalence, stability, incidence, and remission at follow-up
Weight category    Prevalence   Stability   Worsened   Improved
Healthy            59.8         93.8        6.2        0.0
Overweight         17.1         67.9        13.7       18.4
Obese              14.4         76.0        8.0        16.0
Severely Obese     8.7          84.1        0.0        15.9
Overall            100          86.0        7.2        6.8
Note: Stability = same weight category; Worsened = moved to any worsened weight category; Improved = moved to any improved weight category.
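A minimal sketch of how stability, worsening, and improvement percentages like those in Table 2 can be derived from paired baseline and follow-up categories is shown below; the file and column names are assumptions, not the study's code.

```python
# Illustrative sketch: row-normalized transition matrix between baseline and
# follow-up weight categories. File and column names are assumed.
import pandas as pd

df = pd.read_csv("weight_categories.csv")  # one row per child in the longitudinal sample
order = ["Healthy Weight", "Overweight", "Obese", "Severely Obese"]
df["base"] = pd.Categorical(df["baseline_category"], categories=order, ordered=True)
df["follow"] = pd.Categorical(df["followup_category"], categories=order, ordered=True)

# Each row sums to 100; the diagonal is "stability", and off-diagonal cells split
# into worsened (heavier category) and improved (lighter category).
transitions = pd.crosstab(df["base"], df["follow"], normalize="index") * 100
print(transitions.round(1))
```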

Results/Conclusions Cumulative link mixed models examining the difference in youths' probability of being in any overweight category versus a healthy weight category showed no significant differences after 1 year in the longitudinal sample. Linear multilevel models examining the effect of time on BMI z-score and BMI percentile showed a small but statistically significant decrease in BMI z-score (B= -0.02, SE=0.003, p<.01) and percentile (B= -0.48, SE= 0.08, p<.01) after 1 year. Multinomial logistic regressions and multilevel moderation analyses showed that at baseline, females and Hispanics showed a significantly higher prevalence of obesity, and longitudinally, Asians and boys showed significantly greater decreases in relative weight compared to other racial groups and girls, respectively.
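A hedged sketch of the linear multilevel model for the one-year change in BMI z-score is shown below; it assumes a long-format file with one row per student per assessment and is not the study's actual code.

```python
# Illustrative sketch of a linear multilevel (mixed-effects) model for the effect
# of time on BMI z-score, with repeated measures nested within students.
# File and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

long_df = pd.read_csv("bmi_longitudinal.csv")  # one row per student per time point
# time coded 0 = baseline, 1 = one-year follow-up; random intercept per student
model = smf.mixedlm("bmi_zscore ~ time", data=long_df, groups=long_df["student_id"])
print(model.fit().summary())  # the 'time' coefficient corresponds to the reported B
```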

Research to Practice Strategies

This research was support by a Cooperative Agreement #3U58DP002626-01S1 from the Centers for Disease Control and Prevention and Get Healthy Philly, an initiative of the Philadelphia Department of Public Health.

It remains alarming that so many youth are overweight or obese and particularly concerning that low-income and minority youth exhibit such high and stable rates of severe obesity. Despite some initial positive signs, childhood obesity requires continued, creative, public health efforts. Contact Information Name: Hannah Lawman Organization: CORE – Temple University Phone Number: 215-707-8670 Email Address: hlawman@temple.edu


Purchasing Patterns of Adults, Adolescents and Children in Urban Corner Stores: Quantity, Spending and Nutritional Characteristics Background_____________________________ Corner stores are prevalent in low-income urban areas and primarily stock high-calorie foods and beverages. Little is known about individual-level purchases in these locations.

Project Overview_________________________
The purpose of this study is to assess corner store purchases of children, adolescents and adults in a low-income urban environment (items, nutritional characteristics, and amount spent).

Methods
Corner stores were defined as businesses that primarily sold food, had 1-4 aisles, had only 1 cash register, were located in high poverty areas and agreed to participate in the Healthy Corner Store Initiative, a healthy food intervention program.
9,238 intercept surveys examined food and beverage purchases at 192 corner stores in 2011. Staff obtained verbal consent and permission to record shoppers' purchases (number of items, product category, brand, and weight/size). Shoppers self-reported their purchase costs. Shoppers identified their age category. Nutrition information was obtained for all items purchased using labels or by weighing and measuring identically prepared items. Purchases were analyzed by intercept (n=9,238) and by total items (n=20,244).

Table 1. Distribution of product categories as a percentage of intercepts (N=9,238) and items (N=20,244)
(values are % of intercepts(a) / % of items)
Category                          Overall        Adults         Adolescents    Children
Beverage                          65.9 / 39.2    69.5 / 42.7    62.4 / 35.7    44.3 / 21.7
Candy                             8.4 / 7.9      5.4 / 7.6      12 / 13.4      25.6 / 26.5
Chips                             24 / 17.9      20.3 / 15.2    30.7 / 23.4    42.4 / 27.8
Dried Family Food Product(c)      6 / 4.3        6.6 / 4.9      5.5 / 3.3      2.5 / 1.5
Fruits and Vegetables             2.4 / 2.3      2.9 / 2.9      1.3 / 1.1      1 / 0.6
Ice Cream                         4.9 / 3        4.1 / 2.6      5.6 / 3.2      10.7 / 5.3
Nuts, Seeds and Granola           1.9 / 1.2      1.5 / 0.9      2.6 / 1.5      3.6 / 1.9
Other(e)                          1.1 / 1        0.9 / 0.8      1.1 / 0.9      3.4 / 2.3
Pastry                            12.5 / 10.1    12.2 / 10.7    12.9 / 8.7     14.1 / 7.6
Prepared Food Item(b)             13.7 / 10.7    14.6 / 8.7     13.7 / 7.5     6.9 / 3.4
Refrigerated Family Product(d)    3.7 / 2.5      4.2 / 3        2.2 / 1.3      1.8 / 1.4

Table 2. Corner store purchase characteristics by age (N=9,238)(a), Mean ± SD
Purchase                  Total (N=9,238)    Adults, 19+ years (n=6,857)    Adolescents, 13-18 years (n=1,430)    Children, 5-12 years (n=915)
Total amount spent, $(b)  2.74 ± 3.52        2.96 ± 3.67                    2.38 ± 3.37                           1.61 ± 1.83
Total No. of Items        2.2 ± 2.1          2.1 ± 2.1                      2.3 ± 2.1                             2.5 ± 2.3
Food Items                1.3 ± 2.0          1.2 ± 2.0                      1.5 ± 2.0                             2.0 ± 2.2
Beverage Items            0.9 ± 0.9          0.9 ± 0.9                      0.8 ± 0.8                             0.5 ± 0.8
Energy, kcal              666.0 ± 1064.6     693.5 ± 1086.5                 650.2 ± 1102.0                        475.7 ± 690.9
Fat, g                    23.0 ± 63.9        23.2 ± 56.7                    25.0 ± 95.0                           17.5 ± 31.6
Protein, g                15.4 ± 34.6        16.5 ± 35.3                    13.9 ± 32.3                           9.6 ± 32.8
Carbs, g                  102.9 ± 160.2      108.8 ± 172.7                  95.5 ± 120.3                          71.2 ± 105.0
Sugars, g                 66.2 ± 113.0       70.2 ± 122.3                   61.9 ± 85.9                           43.1 ± 67.7
Dietary fiber, g          2.5 ± 7.2          2.7 ± 7.8                      2.3 ± 5.1                             1.7 ± 5.1
Sodium, mg                921.1 ± 4368.3     996.4 ± 5028.0                 786.3 ± 1451.4                        589.3 ± 1078.2

Results/Conclusions Urban corner store shoppers spent < $3.00 for almost 700 calories of mostly nutrient-sparse snack foods and sugar-sweetened beverages. When examined as a percentage of total items or intercepts, the five most frequently purchased items from corner stores were beverages, chips, prepared food items, pastries and candy. Almost two-thirds (61.4%) of beverages purchased were sugar-sweetened drinks, with an even higher percentage among children (78.1%).

Research to Practice Strategies Obesity prevention efforts may benefit from including interventions aimed at changing corner store food environments in low-income, urban areas.

Contact Information Name: Michelle Lent Organization: CORE – Temple University Phone Number: 215-707-8637 Email Address: tue41017@temple.edu

This project was funded by Cooperative Agreement #3U58DP002626-01S1 from the Centers for Disease Control and Prevention, Get Healthy Philly, and NIH grants #1F32DK096756 and #DK20541.


Educating Children and Youth Experiencing Homelessness: Transportation as a Barrier to Attendance Background Research suggests that a lack of transportation is a major factor in excessive absenteeism of homeless students,1 thus negatively impacting their ability to succeed academically.2 Truancy is defined by the School District of Philadelphia (SDP) as ten or more unexcused absences incurred by a student.

Project Overview Educating Children and Youth Experiencing Homelessness (ECYEH) is a federal program that provides services to homeless students, and informs schools and community shelters about homeless students' rights to an equal education.3 A main goal of ECYEH is to remove barriers and disruptions to homeless students’ education.4 The Office of Research and Evaluation (ORE) at SDP conducts an annual evaluation of the ECYEH program.

Methods_______________________ Homeless students are identified through school referrals, shelter referrals, and applying for services through the ECYEH Office. The ECYEH Office provides ORE with a monthly data file comprising student identification, ECYEH services received, and housing placement. As part of the evaluation, ORE investigated the impact of transportation assistance services, in the form of SEPTA transit passes, on homeless students. ORE collected truancy information for K-12 students in SDP from the District's database.5

Results/Conclusions During the 2013-14 school year, the rate of truancy for homeless students was twice as high as that of other District students. This trend was consistent across all grade levels. See Figure 1 and Table 1.

Figure 1. Truancy Rates as of January 2014

Table 1. Homeless Students' Rate of Truancy*
Population      K-8    HS (9-12)   Total
Homeless        26%    51%         29%
All District    10%    27%         14%
*All reported group differences were significant at the p<.001 level.

Patterns of truancy varied greatly depending on whether homeless students accessed transit passes. A Pearson's Chi-Square test found a significant association between transit pass access and whether or not students were truant. Specifically, students were significantly less likely to be truant if they received a transit pass. Of the homeless students who received SEPTA transit passes through January 2014, 73% were not truant. Refer to Figure 2.
Figure 2. Transit Pass Access and Truancy (bars show percent truant and not truant for students who did and did not receive a transit pass; values shown: 73% and 27% for pass received, 69% and 31% for pass not received).
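A minimal sketch of the chi-square test of independence described above is given below. The counts are placeholders scaled to the reported percentages, not the evaluation's actual cell counts.

```python
# Illustrative sketch of a Pearson chi-square test of transit pass access vs. truancy.
# The 2x2 counts below are placeholders, not the actual data.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: transit pass received / not received; columns: not truant / truant
table = np.array([[730, 270],
                  [690, 310]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```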

Research to Practice Strategies The ECYEH Office has been notified of this variability and can target transit passes at those most in need. Further research is needed to isolate those students who have more complex reasons for their patterns of absenteeism so that they can be addressed, when possible, by the program.

1 Tobin, K. (2011, August). Identifying best practices for homeless students. Retrieved from http://search.proquest.com.ezproxy2.library.drexel.edu/docview/907106135.
2 Buckner, J., Bassuk, E., & Weinreb, L. (2001). Predictors of academic achievement among homeless and low-income housed children. Retrieved from http://www.sciencedirect.com.ezproxy2.library.drexel.edu/science/article/pii/S0022440500000595#.
3 Stipulated by the McKinney-Vento Homeless Assistance Act.
4 Homelessness Narrative 2013-14.
5 Charter students excluded from all analyses.

Contact Information Kirstie McClung Office of Research and Evaluation (ORE) (215) 400-4263 kmcclung@philasd.org Katherine Stratos Office of Research and Evaluation (ORE) (215) 400-5396 kstratos@philasd.org


Summary of a Matched-Sample Study Comparing The Interactive Math Program and Traditionally-Taught High School Students in Philadelphia Background SDP was one of a few sites around the country to be a test bed for the implementation of The Interactive Mathematics Program (IMP). IMP is an innovative problem-based, full-replacement high school math curriculum originally aligned to the 1989 and 2000 NCTM Standards and more recently to the 2013 Common Core Math Standards. It was developed with $9 million in funding from the National Science Foundation. Several matched-pair research studies were conducted comparing IMP students versus non-IMP students on various achievement measures.

Project Overview The timeline for implementation was 1993 to 2003. Implementation funding was from the National Science Foundation via the Philadelphia Urban Systemic Initiative.

Methods Math teachers from 19 high schools volunteered to implement the IMP curriculum. Teachers either received a one-course load reduction or $4,000 to participate in the professional development. Professional development consisted of 60 hours/year (10 days) for 4 years, for a total of 240 hours of PD. In a carefully controlled matched-sample study, the performance of 83 junior-year students in Central High School IMP classes and 83 traditionally-taught algebra students at the same school was compared using the Stanford Achievement Test (SAT-9), which was administered in March 1996.

The 83 traditional students were chosen so that they matched the IMP students with regard to: • Gender, • Number of students attending public and private schools in eighth grade, • Percentile scores on national, standardized tests taken by the students in eighth grade. A second matched-pair study of 55 IMP and 55 non-IMP junior-level students at Girls High was also conducted.
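A minimal sketch of this style of matching (exact on gender and eighth-grade school type, nearest available neighbor on eighth-grade percentile, without replacement) is shown below; the data file, column names, and procedure details are assumptions rather than the original study's method.

```python
# Illustrative matched-sampling sketch; file and column names are assumed.
import pandas as pd

students = pd.read_csv("central_hs_students.csv")
imp = students[students["curriculum"] == "IMP"]
traditional = students[students["curriculum"] == "traditional"].copy()

matches = []
for _, s in imp.iterrows():
    # exact match on gender and eighth-grade school type (public/private)
    pool = traditional[(traditional["gender"] == s["gender"]) &
                       (traditional["grade8_school_type"] == s["grade8_school_type"])]
    if pool.empty:
        continue
    # nearest available neighbor on eighth-grade national percentile score
    best = (pool["grade8_percentile"] - s["grade8_percentile"]).abs().idxmin()
    matches.append((s["student_id"], traditional.loc[best, "student_id"]))
    traditional = traditional.drop(best)  # match without replacement

print(f"{len(matches)} matched pairs formed")
```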

Results/Conclusions In the Central High study, IMP students outperformed traditional students in 21 out of 28 reported SAT-9 multiple choice categories. Of these, the Probability and Functions gain scores are statistically significant. There were 3 categories where IMP and traditional students tied. There were 4 categories where the traditional scores were higher than IMP. Of these, none are statistically significant. In the Girls High study, of the 17 math-related scores of the SAT-9, IMP students outperformed their traditionally taught counterparts on 12, tied on two and scored slightly lower on three. IMP students did better on all the cumulative scores and the open-ended assessments.

Research to Practice Strategies IMP has been demonstrated to produce significantly better student outcomes aligned to the Standards for Mathematical Practice of the Common Core.

Contact Information Name: F. Joseph Merlino Organization: The 21st Century Partnership for STEM Education Phone Number: 610-368-7942 Email Address: jmerlino@21pstem.org


AN EXAMINATION OF ENGLISH LANGUAGE PROFICIENCY AND ACHIEVEMENT TEST OUTCOMES Background

The study is needed to explain how the assessments used to determine language proficiency relate to academic content assessed during the administration of the statewide test.

Results/Conclusions
The study showed strong evidence that there is a significant relationship between the PSSA and language background as measured by the ACCESS assessment. The implications of these data for the testing and assessment of ELL learners are discussed.

Project Overview The purpose of the quantitative study was to examine the relationship between grade eight English language proficiency, as measured by the ACCESS for ELLs assessment (Assessing Comprehension and Communication in English State to State for English Language Learners), and achievement test outcomes on the Pennsylvania System of School Assessment (PSSA), a state-mandated test. The PSSA data and ELL assessment data of eighth grade students at eight traditional middle schools in the School District of Philadelphia are examined.
Pearson Correlations Between PSSA and ACCESS Scores

Methods_______________________ The data for the study were gathered from an analysis of 8th grade ELL students' scores on the 2011 PSSA standardized assessment administered in the Philadelphia, Pennsylvania public school district. Data were also gathered from the analysis of 8th grade ELA assessments for the 2010-2011 school year.
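A minimal sketch of the correlation analysis described here follows; the file and column names are assumptions, and it is illustrative only.

```python
# Illustrative sketch: Pearson correlations between ACCESS scores and PSSA scores.
# File and column names are assumed.
import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("ell_grade8_scores.csv")  # one row per student
for pssa_col in ["pssa_reading", "pssa_math"]:
    r, p = pearsonr(df["access_composite"], df[pssa_col])
    print(f"ACCESS composite vs {pssa_col}: r={r:.2f}, p={p:.4f}")
```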

Research to Practice Strategies Practitioners and policymakers must examine the validity, authenticity, and usefulness of language proficiency tests given the significant relationships existing between the PSSA and ACCESS assessments. The results of this study provided strong evidence that there is a significant relationship between the PSSA and language background, as measured by the ACCESS assessment.

Contact Information Tammy C. Mojica, Temple University, 215-850-8437, tmhill@philasd.org
Note: All correlations are significant at the .001 level

*Dissertation completed at Temple University


Engaging H.S. Math Students & Teachers Through a Proficiency-based Assessment and Reassessment of Learning Outcomes (PARLO) System Background

Research indicates that success in 9th grade math is critical in keeping students engaged, in school, and on track to graduate.1 Consistent with the national priority of college and career readiness and the District's Action Plan, PARLO is an instructional and assessment system designed to make students more responsible for their own learning.

Project Overview
NSF-funded study to determine the effects of providing 9th grade students with additional time and opportunities to show evidence of mathematical understanding of specific learning outcomes. Learning outcomes are defined by teachers and strongly aligned to state standards. Emphasis is on accumulating knowledge rather than points. Students have multiple opportunities to reassess and receive High Performance (HP), Proficient (P), or Not Yet Proficient (NYP) rather than traditional grades.
Duration: Fall 2010 – Spring 2015
Participants: 16 districts in Pennsylvania, 32 schools (including 10 SDP schools), 85 teachers (including 16 SDP teachers), over 4,000 students

Methods
• Design: RCT – mixed methods
• Data Collection: Algebra/Geometry tests, attitude/belief surveys, classroom & PLC observations, teacher & administrator interviews
• Teachers attended 54 hours of PD, monthly PLCs, and received compensation.

Results/Conclusions
Fieldwork ended in June 2013; analyses are ongoing. Emergent findings: Most teachers agreed that their participation in PARLO influenced their thinking about how students learn math. Teachers felt more informed about math standards. The majority of teachers reported that, overall, their students took more responsibility for their own learning and also showed responsibility for their peers' learning. At the same time, most teachers acknowledged a "contentment" problem (i.e., procrastination/lack of motivation). 80% of teachers said that their students were more likely to participate in discussions about math content with both their teacher and fellow students. 80% of teachers said that their students were more likely to ask questions when concepts were not well understood. The majority of teachers said they intend to continue with PARLO in some manner after their participation in the study ends.

Research to Practice Strategies
• PARLO was not without challenges. Success required teachers to embrace assessment and instructional philosophies that took many of them outside of their comfort zones. As well, it took time for students and parents/guardians to understand before many fully participated in this new dynamic.
• Ideally, the next step is to test PARLO with younger grades and across subjects.

1 Allensworth & Easton, 2007

Contact Information Nancy Lawrence & Kathleen Krier, 21st Century Partnership for STEM Education Michael Posner, Villanova University 610-825-5644 nlawrence@21pstem, kkrier@21pstem.org, michael.posner@villanova.edu


SNAP-Ed Garden-based Nutrition Education: A New Strategy in the Fight Against Obesity Background

Results/Conclusions

The United States Department of Agriculture's (USDA) MyPlate recommends a healthy diet that includes fruits and vegetables occupying half of your plate at daily meals to reduce risk of chronic health conditions and to promote a healthy weight. Many youth and adults are not consuming nearly this recommended daily amount of fruits and vegetables. For example, only 25% of youth and 30% of adults consume 1 serving daily. Across Philadelphia, almost half (41%) of youth between the ages of 6 and 17 years are overweight or obese, but this rate is much higher in North Philadelphia, where 70% of African American and Hispanic youth are overweight or obese (CDC, Communities Putting Prevention to Work, 2013). SNAP-Ed practitioners need multiple tools and strategies to combat this health crisis. Garden-based nutrition education is one strategy to diversify traditional nutrition education, open experiential learning opportunities and promote physical activity.

Students who participate in HPC's garden-based education program represent groups that statistically have the highest rates of youth overweight/obesity in Philadelphia. HPC plans to evaluate the success of garden-based education in improving students' knowledge about fruits and vegetables and attitudes towards trying and/or regularly consuming fresh produce. Garden-based nutrition education demonstrates the principles of nutrition and physical activity, enabling students to shape behaviors that can lead to a healthier adulthood.

Project Overview The purpose of garden-based nutrition education is to increase nutrition knowledge, encourage intake of fruits and vegetables and to promote physical activity. Students in the garden dig, compost, pull weeds, plant seeds, water plants, and harvest produce. Physical activities in the garden promote a healthy weight and teach healthy behaviors, important components to reducing rates of obesity among Philadelphia's youth.

Methods Health Promotion Council (HPC) and other Eat.Right.Now. partners provide nutrition education for students in 80% of Philadelphia's public schools. HPC nutrition educators reach approximately 14,000 students across 20 schools, with 5 public schools receiving both classroom and garden-based programming. Garden-based nutrition education is funded through the USDA Supplemental Nutrition Assistance Program Education (SNAP-Ed). SNAP-Ed provides nutrition education programs and services to schools with >50% of students qualifying for free or reduced lunch. Schools were initially selected to participate in the garden initiative based on outdoor growing space and level of demonstrated and potential interest among school staff. HPC began teaching outdoors during fall and spring and expanded to year-round growing using grow labs over the winter months. In both settings, students typically cultivate leafy greens, sprouts and culinary herbs. Students prepare salads, smoothies and other recipes using the harvested fresh produce. They are consistently delighted by eating the plants they have grown.

Research to Practice Strategies Before starting your garden, ask the following questions: Who should be involved in the planning process? • Administration • Teachers • Parents and the community • Last, but not least, the students Aspects to consider when starting a garden: • How many children/classrooms will participate? • Do you want indoor growing labs or an outdoor garden? • Where is the best location to build a garden? • Who will be responsible for maintenance? • Who is your Garden Team? • What is the role of parent and community volunteers?

Contact Information Name: Lauren Nocito Organization: Health Promotion Council Phone Number: 215-985-2669 Email Address: lauren@phmc.org Funded by the Pennsylvania (PA) Department of Public Welfare (DPW) through PA Nutrition Education Tracks, a part of USDA’s Supplemental Nutrition Assistance Program (SNAP). To find out how SNAP can help you buy healthy foods, contact the DPW’s tollfree Helpline at 800-692-7462. This institution is an equal opportunity provider and employer.


2012-13 Evaluation of City Year Greater Philadelphia A Five-Year Study by Research for Action Background In 2012-13, City Year Greater Philadelphia (CYGP) provided academic and behavioral supports for students in grades 6-9. Supports include one-on-one and group tutoring in math and English, attendance and behavior coaching. CYGP’s target population is students who have at least one Early Warning Indicator (EWI) for high school dropout related to attendance (attendance below 80%), behavior (1 or more out of school suspensions) or course grades (D or F in a math or English course).

Project Overview Research assessed: (1) the degree to which CYGP reached its target student population in 16 partner schools; (2) whether support met dosage thresholds; (3) implementation of CYGP math and literacy assessments; (4) student academic and behavioral outcomes; and (5) CYGP supports' effect on students' attendance, behavior, and course performance.

Methods Data Sources • SDP Administrative Records -- attendance, behavior, course performance – for all participating students (1,931 CYGP students) • CYGP math and literacy assessment data for all participating students • Interviews with CYGP program managers, team leaders, corps members, principals and teachers at five schools (60 total)
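A hedged sketch of flagging Early Warning Indicators from administrative records, using the thresholds named in the Background section above, is shown below; the file and column names are assumptions, not RFA's actual procedures.

```python
# Illustrative sketch of EWI flags from district administrative records
# (attendance below 80%, 1+ out-of-school suspension, D or F in math or English).
import pandas as pd

records = pd.read_csv("sdp_admin_records.csv")  # hypothetical extract, one row per student
records["ewi_attendance"] = records["attendance_rate"] < 0.80
records["ewi_behavior"] = records["out_of_school_suspensions"] >= 1
records["ewi_course"] = records[["math_grade", "english_grade"]].isin(["D", "F"]).any(axis=1)
records["any_ewi"] = records[["ewi_attendance", "ewi_behavior", "ewi_course"]].any(axis=1)

print(records["any_ewi"].mean())  # share of students with at least one EWI
```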

Results/Conclusions Serving the Target Population • The majority of students served by corps members had at least one early warning indicator (EWI) of dropout at the start of the 2012-13 school year. • A majority of all CYGP-supported students had an EWI corresponding to the support they received. o This represented a considerable increase from 2011-12.1 • A considerable number of at-risk students at partner schools did not receive CYGP supports. • Roughly half of students who received academic tutoring reached CYGP dosage thresholds. o This represented a considerable increase from 2011-12.2 • Administration of the new CYGP math and English assessments was inconsistent across the CYGP partner schools.

CYGP Supports and Student Performance • The majority of students who received attendance coaching finished the school year with either ‘sliding’ (80-90%) or ‘off-track’ (below 80%) attendance rates. • Roughly half of students receiving behavior coaching finished the school year with no suspensions. • Interviews reveal that a variety of school and nonschool factors can limit the effectiveness of attendance and behavior coaching. • Over half of middle grades students, and less than half of 9th grade students receiving tutoring earned a C or better in their math and English course. • Across each support area, students who entered the school year at-risk across all support areas continued to struggle with attendance, behavior and academics. • CYGP academic supports exerted no detectable effect on at-risk students’ academic performance for middle grades and 9th grade students. • Interviews suggest that, across schools, the context, methods and content of CYGP tutoring varied, complicating measuring impact. Also, CYGP corps members sometimes focused tutoring to address basic skills gaps, which may not impact course grades. • Attendance was a strong, driving force behind academic performance in math and English for atrisk 9th grade students. • Higher levels of CYGP math tutoring had a positive and significant effect on CYGP students’ likelihood of earning a C or better in math.

Research to Practice Strategies Continue to work with school administrators and staff to accurately identify at-risk students across all support areas: • Prioritize attendance and behavioral supports for atrisk students. • Intentionally coordinate attendance and behavioral supports with academic tutoring to support at-risk students’ academic progress. • Raise the dosage targets for math and English tutoring.

Contact Information: Michael Norton & Rebecca Reumann-Moore Research for Action 267.295.7760

1 Norton, M. Year Four Evaluation of City Year Greater Philadelphia. Research for Action: March 2013.
2 Ibid.


Cardiovascular Health Among Philadelphia Adolescents: Analysis of Youth Risk Behavior Data, 2011 Background

Results/Conclusions

Cardiovascular disease (CVD) is a leading cause of morbidity and mortality in American adults. Behavioral risk factors such as sedentary behaviors, poor diet and tobacco use are typically initiated in youth and sustained through adulthood. Improvement of cardiovascular health in adults is a national public health priority.

The majority of participants were female (59%) and African American (51%). Data on the four health metrics showed that most participants (89%) were in the ideal category for smoking and BMI (66%), and in the intermediate category for physical activity (70%) and for healthy diet (72%). Less than one percent (0.6%) of the sample had ideal heart health; 49.2% had intermediate heart health, and 50.2% had poor heart health. See Figure 1.

Project Overview Guided by the American Heart Association’s (AHA) metrics for ideal heart health- the “heart score”- we sought to examine the status of behavioral heart health in a representative sample of urban adolescents (N=805 high school students).

Methods_______________________ We used data from the 2011 Philadelphia Youth Risk Behavior Survey (YRBS) to calculate a behavioral heart health score for each student, reflecting risk associated with smoking, body mass index (BMI), physical activity, and dietary intake. Risk level was assigned using the 3-tiered system of ideal, intermediate or poor risk.
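The sketch below illustrates one way per-metric risk tiers could be rolled up into an overall classification. The AHA cut points and the study's exact scoring rule are not reproduced here; the tier assignments, summary bins, and column names are all assumptions.

```python
# Illustrative roll-up of per-metric risk tiers into an overall heart health class.
# This is NOT the AHA scoring rule; bins, tiers, and column names are assumptions.
import pandas as pd

df = pd.read_csv("yrbs_2011_tiers.csv")  # assumed columns hold "ideal"/"intermediate"/"poor"
points = {"ideal": 2, "intermediate": 1, "poor": 0}
metrics = ["smoking", "bmi", "physical_activity", "diet"]

score = df[metrics].replace(points).sum(axis=1)  # 0-8 summary score
# Example rule: "ideal" overall only if ideal on all four metrics (score = 8)
df["overall"] = pd.cut(score, bins=[-1, 4, 7, 8],
                       labels=["poor", "intermediate", "ideal"])
print(df["overall"].value_counts(normalize=True))
```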

Research to Practice Strategies The low percentage of ideal behavioral heart health in Philadelphian adolescents is alarming. Since heart health declines with age, these data suggest that greater attention to improving heart health in even younger populations is a public health necessity. Thus, primary prevention approaches to address poor cardiovascular health prior to 9th grade are needed.

Figure 1. Prevalence of behavioral heart risk score by outcome measure (smoking, BMI, physical activity, healthy diet): counts of students classified as ideal, intermediate, or poor on each metric.

Contact Information
Judith Peters, MBA, HHSA
The School District of Philadelphia
215-400-6803
jrpeters@philasd.org


Breakfast patterns among low-income, ethnically diverse elementary school children

Background_________________________
Many school districts and federal efforts have sought to increase breakfast participation given its documented benefits on children's concentration, academic performance, and behavior.
To encourage more participation in school breakfast, cities have adopted programs that offer breakfast in the classroom. These programs are seen as a way to combat stigma associated with school breakfast participation, address logistical challenges with breakfast served before school, and fight food insecurity.
Providing breakfast at school for children who would not otherwise have one is valuable. At the same time, there is a concern that among children who are already consuming one or more breakfasts, school breakfast could unintentionally increase energy intake and undermine obesity prevention efforts.

Project Overview
The purpose of the study is to assess breakfast patterns among 4th-6th graders in an urban public school district that implements a Universal Free School Breakfast Program and to assess the relationship between those patterns and measured relative weight. The feasibility pilot study included 4th-6th graders from 3 schools and occurred during the 2012-2013 academic year.

Methods
During the pilot study, students were recruited and baseline measures were collected in Fall 2012; approximately 65% of eligible students were enrolled. Measured heights and weights and questionnaires were collected from 651 students. Students reported whether they ate or drank anything (yes/no) from 1 of 4 locations: home, corner store, school cafeteria, or classroom breakfast. Within each location, students bubbled in the items they ate/drank from a list of food and drink categories. The survey also assessed reasons for not eating/drinking, the consumption of dinner the evening before, and students' method of transportation to school.

Table 1. Sample characteristics (n=651)
Percentages: Gender: Female 52.3, Male 47.7; Race/Ethnicity: Black 61.4, Hispanic 14.4, Asian 13.2, White 6.9, Other 4.1; Ate dinner 96.1; Weight Status: Underweight 2.9, Healthy Weight 56.6, Overweight 15.8, Obese 15.2, Severely obese 9.5; Transportation: Walked 66.0, Driven 24.9, Other 9.1.
Mean ± SD: Age (yrs) 10.7 ± 1.0; Weight (kg) 44.8 ± 15.1; Height (cm) 146.4 ± 9.2; BMI (kg/m2) 20.5 ± 5.1; BMI z-score 0.7 ± 1.2; BMI percentile 66.6 ± 30.5.

Results/Conclusions
A large number of youth (37.8%) reported eating multiple breakfasts (25.5% consumed 2, and 12.3% consumed 3 or 4). These findings suggest that eating breakfast at multiple locations may contribute to excess energy intake and that efforts to promote school breakfast consumption may have unintended effects on childhood obesity. Over 75% of youth reported eating breakfast at home and most (58%) also ate at school (23.5%), a corner store (23.5%), or both (11.2%). These findings suggest that interventions designed to increase school breakfast participation should consider that the school breakfast may be in addition to what was already consumed prior to school at home and corner stores. Of concern, 12.4% of youth reported not eating breakfast despite the universal free breakfast policy. This suggests a small but significant proportion of youth may be experiencing food insecurity, which has previously been associated with skipping breakfast and increased risk of obesity.
Number of breakfasts and BMI percentile showed a significant curvilinear relation, with higher mean BMI percentiles observed among children who did not consume any breakfast and among those who consumed 3 or more breakfasts.
Figure 3. Model-estimated curvilinear relation between BMI percentile and number of breakfast locations (0, 1, 2, 3 or more); estimated BMI percentiles shown in the figure range from 68.1 to 82.2.
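A minimal sketch of the curvilinear model described above (BMI percentile regressed on the number of breakfast locations and its square) is shown below, with assumed column names; it is not the study's actual code.

```python
# Illustrative sketch of the curvilinear (quadratic) relation reported above.
# File and column names are assumed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("breakfast_pilot.csv")  # one row per student
model = smf.ols("bmi_percentile ~ n_breakfast_locations + I(n_breakfast_locations**2)",
                data=df).fit()
print(model.params)  # a positive squared term produces the U-shaped pattern in Figure 3
```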

Research to Practice Strategies Overall, this study supports the need for careful study of breakfast policies, such as breakfast in the classroom, that offer all youth additional opportunities to eat breakfast. Next steps include the main randomized control trial. The main trial will take place in 16 Philadelphia public schools from 2013-2016, and will follow 4th-6th graders until they are 6th-8th graders. Contact Information Name: Heather Polonsky Organization: CORE – Temple University Phone Number: 215-707-5320 E-mail Address: Heather.polonsky@temple.edu


Arts and early education: A promising duet to support school readiness skills Background The arts are increasingly being removed from schools. Yet, the arts are linked to learning to learn skills like attention, self-regulation, and persistence (executive function).

Project Overview
How do the arts support young children's school readiness skills? Partnered with a local Head Start program that utilizes an arts-enriched (AE) pedagogy: preschoolers learn about AND through music, visual art, and dance classes. Adapted Hetland and colleagues' Habits of the Mind coding scheme:3
• Originally for high school visual arts classes.
• We modified it for preschoolers.
Research Question: Will these Studio Habits of Mind emerge in our AE classes? If so, how?

Methods
Data Collection:
• 14 observations (4 music classes across 4 months).
• Audio recordings were transcribed, then reviewed for accuracy by a second transcriber.
Qualitative Coding Scheme:
• Following convention, utterances were the unit of analysis.
• Coded independently by 2 researchers.

Conclusions
1. Do preschool music classes foster learning? Yes. Music stimulates Habits of the Mind for preschool children.
Links to specific positive outcomes:
• Musical theatre & empathy.1
• Creative movement & self-regulation.2
2. What skills are reinforced during preschool music classes?
Self-regulation & cooperation: critical school readiness skills. Technique & studio skills: music-specific content.
3. How do the arts foster these school readiness skills? During two contexts:
• As primary focus: waiting one's turn.
• During technique & studio habit moments: musical activities like "walk & stop"; striking a gong on the beat to accompany a song.

Research to Practice Strategies
Musical instruction reinforces school readiness skills within a naturally motivating context. Participating in the arts engages the whole child (body and mind) for active and meaningful learning. There is educational merit to keeping the arts in schools.

1 Goldstein, T.R., & Winner, E. (2012). Enhancing empathy and theory of mind. Journal of Cognition and Development, 13, 19-37.
2 Winsler, A., Ducenne, L., & Koury, A. (2011). Singing one's way to self-regulation: The role of early music and movement curricula and private speech. Early Education and Development, 22, 274-304.
3 Hetland, L., Winner, E., Veenema, S., & Sheridan, K. (2007). Studio Thinking 2.0: The Real Benefits of Visual Arts Education (2nd Ed.). New York: Teachers College.

Contact Information
Jessa Reed & Kathy Hirsh-Pasek, Ph.D.
Temple University
(267) 468-8610
jreed@temple.edu


Philadelphia Autism Instructional Methods Study: Philly AIMS Background Districts are under increased pressure to incorporate EBP for growing numbers of students with autism. There is little research on the effectiveness of autism interventions in public schools, and even less on the best ways to help educators implement these interventions with fidelity.

Project Overview This randomized field trial compared two interventions for 383 students with ASD in 50 K-through-5 autism support classrooms, and examined factors moderating outcomes.

Figure. Interaction of program fidelity and implementation climate: adjusted change in DAS score plotted against program fidelity, separately for low-climate and high-climate schools.

Methods______________ _________ Over the course of the academic year: • Student’s cognitive ability was assessed in September and June using the Differential Abilities Scale. • Potential moderating factors, including child symptoms, teacher characteristics, and schools’ implementation climate (the extent to which use of the intervention was expected, supported and rewarded) were measured in September.

Results Despite generally low fidelity, group differences in outcome were moderated by program fidelity, such that in high fidelity classrooms, STAR outcomes were better than Structured Teaching outcomes. Innovation climate was associated with fidelity (r = .3; p = .02). There was an interaction between fidelity and the school’s implementation climate. The best student outcomes were achieved in high fidelity, high climate classrooms, followed by low fidelity, low climate classrooms. Outcomes were poorest when fidelity and climate were discordant.
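A hedged sketch of the fidelity-by-climate moderation analysis follows; the variable names are assumptions, and the study's actual models likely included additional covariates and clustering.

```python
# Illustrative sketch of a moderation (interaction) model for the pattern described
# above; variable names are assumed and covariates/clustering are omitted.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("philly_aims_outcomes.csv")  # hypothetical analysis file
# das_change: June minus September DAS score
model = smf.ols("das_change ~ fidelity * implementation_climate", data=df).fit()
print(model.summary())  # the fidelity:implementation_climate term is the interaction
```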


Conclusions Program fidelity is critical to maximizing outcomes. A school's implementation climate plays a critical role in program fidelity and in outcomes. Maximizing outcomes requires: • Ongoing teacher training & support over multiple years • Principal buy-in to the program, and clear expectations about its use • Rewards for teachers who implement the program successfully and achieve good outcomes Contact Information Erica Reisinger/ David Mandell University of Pennsylvania 215-573-8472 ereis@upenn.edu


Arts Link: Building Mathematics and Science Competencies through an Arts Integration Model Background The School District of Philadelphia (SDP) is one of the largest public school systems in the country, with many of its students coming from low-income and historically underserved racial minority backgrounds. Further, reduced levels of student motivation, engagement, and sustained focus are barriers seen not only in low test scores in mathematics and reading but also in unacceptably low levels of pro-social behavior and daily attendance at school. Acknowledging this challenge, the Philadelphia Arts in Education Partnership (PAEP) developed the Arts Link program to address this need.

Project Overview The Philadelphia Arts in Education Partnership (PAEP) partnered with the School District of Philadelphia (SDP) to implement Arts Link, a comprehensive arts-integrated program addressing the mathematics and science skills of the Grade 2, 3, 4, and 5 populations in four (4) "School Improvement" schools not making adequate yearly progress (AYP). The overarching goal of this program is to improve student academic achievement through rich, experiential arts engagements and the design and implementation of arts-integrated curricula. This project presents the program evaluation of Arts Link. However, this poster will focus on the teacher pedagogical development and student academic achievement components of this evaluation.

Methods The design utilized for the Arts Link evaluation is generally known as a matched randomized pretest-posttest control group design. The matched randomization occurred at the school level, matching on these criteria: a) neighborhood elementary schools in the School District of Philadelphia with at least two grades 2, grade 3, grade 4 and grade 5 classrooms, b) employ an art specialist, and

c) are school-improvement status schools. Participating schools were informed that 10 schools would be selected by random lottery into the intervention group (n=4) and the control group (n=6) by the program evaluator. Although this evaluation utilized a variety of performance and behavioral data, the data presented here are limited to teacher pedagogical development and PSSA scores in Math and Science.

Results/Conclusions There was an increase of at least 10% in all but one of the Teacher Skill Inventory items for Arts Link teachers and the mean scores for each of the teacher skill criteria were significantly higher (p<.05) for the Arts Link group as compared to the control group. The mean scores on the mathematics PSSA test were significantly higher (p<.01) for the Arts Link schools as compared to the control schools. The mean scores on the science PSSA test were substantially higher (p=.10) for the Arts Link schools as compared to the control schools

Research to Practice Strategies The Arts Link evaluation provides support for co-curricular partnerships as a means to develop teacher skills, as well as the utility of an arts integration approach to address student needs in core academic achievement areas.

Contact Information Pearl Schaeffer and Evan Leach Philadelphia Arts in Education Partnership 215-717-6096 paep@uarts.edu www.paep.net


EATING THE ALPHABET Background

Eating the Alphabet is a curriculum/program designed to teach kindergarten through second grade students the importance of making healthy food and activity choices. In the United States 32% of children are overweight or obese.1 In North Philadelphia close to 70% of children are overweight or obese.2 Being overweight contributes to high levels of diabetes and heart disease in this country. This program combines nutrition lessons with literacy and math to not only help students make healthy choices but to also increase literacy and math skills.

Methods
The number of children tasting each food and the number who say they will eat it again are recorded. Pre- and post-tests have been administered to students receiving the program and to students not receiving the program to determine whether children who receive the program show a significant increase in knowledge.

Project Overview • Nutrition educators teach weekly classes that include a healthy food tasting that begins with each letter of the alphabet. Lessons are grade level appropriate to meet the needs of each student. The lessons include: • Reading a nutrition or food related book. • Games and songs that reinforce eating healthy foods and physical activity. • A food tasting of a healthy food (mainly fruits and vegetables with some dairy and whole grains). • A craft project or game that reinforces the nutritional value of the food. • Follow up materials in both math and literacy for the teachers to use. • Fact sheets about the foods tasted for parents. Additionally, registered dietitians provide parent and teacher workshops and one on one family counseling upon request.


Results/Conclusions Approximately 8,000 students receive lessons every week. Over 95% of students taste the food offered each week and 80% of school age students state they would eat it again. Analyses of data will take place this spring.

Research to Practice Strategies Students are exposed to many foods they or their parents might never have had before. The weekly instruction of the importance of eating fruits, vegetables, dairy and whole grains and being physically active helps students to make healthy choices now and in the future. These choices will help to reduce the rates of developing diabetes and heart disease. 1 Ogden CA et al. Prevalence of High Body Mass Index in US Children and Adolescents, 2007-2008. JAMA . 2010;303(3):242-249. 2 Public Health Management Corporation, Household Health Survey. Downloaded from: http://www.phila.gov/health/pdfs/Obesity_in_Philadelphia_3.10.10.pdf

Contact Information Name: Kineret Shakow Organization: Einstein Healthcare Network Phone Number: 215-456-4926 Email Address: shakowk@einstein.edu



F.U.N. (Families Understanding Nutrition) and Fit through P.L.A.Y. (Playful Learning for Adults and Youth) Background

Methods_______________________

The F.U.N. and Fit through P.L.A.Y. program was a comprehensive program combining nutrition education and activity, developed to prevent childhood obesity among Latinos (ages 5-12).
• 32% of U.S. children are overweight or obese.1
• In North Philadelphia close to 70% of children are overweight or obese.2
• Latino children are more likely to be overweight or obese than non-Hispanic white children of the same age.1,3
A partnership for monitoring and evaluation of the intervention combined: "A Better Start", Einstein Healthcare Network; the ASPIRA after-school program and summer camp; and the Johns Hopkins/Johnson & Johnson Community Healthcare Scholars Program.

• Age appropriate student pre/post surveys testing knowledge, attitudes, and behavior. • Heights and weights measured at the beginning and end of each school year. • A food log for families to complete over the weekend. • Parent pre/post surveys for workshops.

Project Overview Students: Six 90 minute interventions – 45 minutes of nutrition education and 45 minutes physical activity lessons. Families: Four 3 hour workshop sessions. This included: • 1 hour of activity for entire family. • 1 hour of nutrition education for parents with a RD while students enjoyed 1 hour of nutrition craft activities with nutrition educators. • 1 hour cooking demonstration of healthy alternatives to traditional foods by a chef. • Free sessions with a RD. Reinforcements for home use: • Bag of groceries with ingredients of recipe to replicate at home. • Games and equipment to replicate physical activities at home.

Results/Conclusions
• Children's BMIs were similar before and after the six-week program.
• Most improvement was seen among children ages 5-8.
• ~35 families attended parent-child workshops.
• Parents were more confident in providing healthy snacks after the workshops (Very Confident: 43% to 78%).
• Results were marginally better among children whose parents attended workshops.

Research to Practice Strategies • The intervention succeeded in increasing the nutrition knowledge and attitudes of students and parents. Unfortunately, this did not translate to substantial behavior change. • Challenges included: • Many adults (caregivers and staff) did not perceive a problem and did not take full advantage of program. • Institutional policies did not always coincide with nutrition lessons • Programs required more time than many families had available. 1 Ogden CA et al. Prevalence of High Body Mass Index in US Children and Adolescents, 2007-2008. JAMA 2010;303(3):242-249. 2 Public Health Management Corporation, Household Health Survey. Downloaded from: http://www.phila.gov/health/pdfs/Obesity_in_Philadelphia_3.10.10.pdf 3 NHCSL Hispanic Obesity Initiative. “Hispanic Obesity: An American Crisis” 2010. Downloaded from: http://nhcsl.org/Hispanic-Obesity-An-American-Crisis.pdf

Contact Information Name: Kineret Shakow Organization: Einstein Healthcare Network Phone Number: 215-456-4926 Email Address: shakowk@einstein.edu


“Be the HYPE”: Engaging Young People as School Wellness Leaders Background

Results/Conclusions

20% of Philadelphia children are obese.1

To date, HYPE has: • Hosted a total of 6 Youth Leadership Summits; • United over 1,400 student leaders; • Facilitated approximately 600 youth-led initiatives.

HYPE is a collaboration between The Food Trust, the School District of Philadelphia, and the Philadelphia Department of Public Health’s Get Healthy Philly initiative that enables local youth to take action and become leaders for healthy change in their school and community.

Project Overview Our goal is to make being healthy cool, fresh, relevant, and fun. HYPE is a youth-created and -led social marketing campaign that currently works with over 30,000 young people in over 75 middle and high schools in Philadelphia. In each school, HYPE supports a wellness council of eight to ten student leaders as they start dance clubs, plant school gardens, start fitness clubs, record healthy PSAs, coordinate health fairs, lead cooking classes, and coordinate HYPE assemblies.

Methods HYPE wellness councils facilitate youth leadership development through: • Goal setting • Dedication/Follow Through • Public Speaking • Professionalism • Action Planning • Art/Media/Technology The annual middle and high school Youth Leadership Summits draw more than 500 students in the fall of each school year. Youth participate in leadership development activities, health workshops, and educational sessions.

At the last Middle and High School Youth Summits (Fall 2014), HYPE leaders left feeling: 1. Engaged: 97% reported they are proud to be healthy. 2. Excited: 95% said they were excited about making healthy changes in their school and community. 3. Empowered: 84% believe they can make their school and community a healthier place. 4. Connected: 80% left feeling that they are part of a community of youth leaders. HYPE supports the education and development of talented and health conscious leaders of tomorrow.

Research to Practice Strategies HYPE, created by students for students, taps into the unmatched energy and creativity of the youth voice and empowers students to be change agents. HYPE embodies a comprehensive approach to health behavior change by activating and motivating youth to become healthy role models for their peers and within their schools. HYPE leaders pledge to make healthy changes and are proud to represent healthy habits in their schools. The work of HYPE leaders does not end in the schools– they are taking the HYPE message into their communities through advocacy and efforts in corner stores.

1. Robbins JM, Mallya G, Polansky M, Schwarz DF. Prevalence, Disparities, and Trends in Obesity and Severe Obesity Among Students in the Philadelphia, Pennsylvania, School District, 2006–2010. Prev Chronic Dis 2012;9:120118. DOI:http://dx.doi.org/10.5888/pcd9.120118

Contact Information Alyssa Simon, The Food Trust (215) 575-0444 ext. 176 asimon@thefoodtrust.org Rodney Robbins, Philadelphia Dept. of Public Health (215) 685-5678 rodney.robbins@phila.gov


Early Childcare Careers: What are the Facts? Background

Results/Conclusions

"Considerable evidence links child-care quality and cognitive development among preschoolers. Child-care quality has been positively related, albeit modestly, to preschool-age children's cognitive development and social competence in a wide variety of studies that controlled for family background characteristics such as socioeconomic status, maternal education, or family."1 Higher levels of child care quality were modestly associated with improvements in children's socioemotional development, and extensive hours in child care were linked to increases in children's quantitative skills and decreases in behavior problems.2

• The number of days a child was in subsidized childcare during this 5-year period ranged from 2 days to 1,811 days, with a mean of 599 days.
• Over the course of a child's PKE:
• The number of child care centers ranged from 0 to 9, with a mean of 1.6 centers;
• Family care experiences ranged from 0 to 5, with a mean of .15 experiences;
• Group care experiences ranged from 0 to 3, with a mean of .04; and
• Residential/neighborhood care ranged from 1 to 11 experiences, with a mean of 1.

Project Overview


Project Goal: To understand the effects of subsidized care on children's school readiness and subsequent development through an analysis of the "childcare careers" (pre-kindergarten experiences, or PKEs) of children with childcare subsidies. Significance: Although researchers and policymakers can access cross-sectional data that show the kind and quality of care being "bought" at any one time with subsidies, this project provides a longitudinal analysis of children's pre-kindergarten experiences. Perhaps the modest or inconsistent findings in the literature are due to the variability of these experiences.

[Figure: Number of children by number of pre-kindergarten experiences (1 PKE through 11 PKEs); y-axis: number of children.]

On the whole, the children spent the greatest percentage of time in center care (58.4%), followed somewhat distantly by residential/neighborhood (R/N) care (35.7%), then family care (4.7%), and group care (1.2%).

The initial assumptions made were: • Child development was a function of the quality of the child care experienced – the higher the quality, the greater the positive effect • Center-based care was superior to other forms of care in terms of child development • Pennsylvania had a quality rating system (Keystone Stars) for center-based care so quality measures were available.

Methods
In this secondary data analysis project, we analyzed data from the Commonwealth of Pennsylvania's Office of Child Development and Early Learning (OCDEL) and the Office of Income Management. Detailed subsidy data from the state were available by length of time and by type of care:
a. Center-based
b. Group home care
c. Residential/neighborhood
d. Family care
The analysis focused on the greater Philadelphia area, using a sample from an on-going project with N = 735 children.
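A minimal sketch of how such subsidy records might be summarized into "childcare careers," assuming a hypothetical long-format file with one row per child-placement spell; the file name and the columns child_id, care_type, and days are illustrative, not the actual OCDEL field names.

import pandas as pd

# Hypothetical long-format subsidy records: one row per child-placement spell
spells = pd.read_csv("subsidy_spells.csv")  # columns: child_id, care_type, days

# Total subsidized days per child over the five-year window
days_per_child = spells.groupby("child_id")["days"].sum()
print(days_per_child.describe())

# Count of distinct pre-kindergarten experiences (PKEs) per child, by care type
pke_counts = (
    spells.groupby(["child_id", "care_type"])
    .size()
    .unstack(fill_value=0)
)
print(pke_counts.mean())  # average number of center, family, group, and R/N experiences

# Share of all subsidized time spent in each type of care
time_shares = spells.groupby("care_type")["days"].sum() / spells["days"].sum()
print(time_shares.round(3))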

Research to Practice Strategies 1. The impact of center care may have been underestimated. 2. Addressing multidimensionality may yield a richer understanding. 3. To capture the actual effect of center-based childcare on school readiness, the following need to be factored into any analyses: a. length of time by quality characteristics, b. number and length of "gaps," and c. number of centers and other childcare arrangements.
1 Burchinal, M. R., Roberts, J. E., Riggins Jr., R., Zeisel, S. A., Neebe, E., & Bryant, D. (2000). Relating quality of center-based child care to early cognitive and language development longitudinally. Child Development, 71(2), 339-357.
2 Votruba-Drzal, E., & Lindsay Chase-Lansdale, P. (2004). Child Care and Low-Income Children's Development: Direct and Moderated Effects. Child Development, 75(1), 296-312.

Contact Information Names: Judith Stull, Marsha Weinraub*, Michelle Harmon Organization: Family & Children’s Research Collaborative, Temple University Phone Number: 215-204-7360 Email Address: Marsha.weinraub@temple.edu



WHAT A NIGHTMARE! NOT ALL SLEEP MEASURES ARE CREATED EQUAL Background • Sleep disturbances have profound effects on such key outcomes as: • emotional reactivity • behavioral problems • social adjustment • academic performance • physical health1 • How do discrete measures of sleep disturbances differentially affect outcomes in middle childhood? • Research has investigated nightmares within the context of trauma and PTSD, but less is generally understood about nightmares in healthy populations.2 • Nightmares are linked to anxiety and depression, but limited research has addressed nightmares as a unique sleep disturbance in middle childhood.4

Project Overview We examined unique links between children’s non-traumatic nightmares and emotional functioning during the preadolescent period. We hypothesized that: 1. A variety of sleep disturbances in third grade would be related to anxious/depressive symptoms and emotional reactivity according to multiple informants at grade six. 2. Only nightmares in third grade would predict anxious/depressive symptoms in sixth grade, even when we included all other sleep disturbances in our analysis.

Methods NICHD SECCYD dataset with n=1026 measured at two time points: Grade 3 and Grade 6. Multiple Informants: Mothers, Teachers, and Children. Sleep Disturbances in Third Grade (G3): • Nightmares • Daytime sleepiness • Trouble sleeping • Sleeping more or less than most kids • Talks or walks in sleep • Amount of sleep Socioaffective Outcomes in Sixth Grade (G6): • Maternal report of child anxious/ depressive symptoms • Child self-report of depressive symptoms • Teacher report of child’s anxious/depressive symptoms • Maternal report of child emotional reactivity • Teacher report of child emotional reactivity Controlled for Maternal Depressive Symptoms and Child’s Depressive Symptoms. Structural equation modeling used to conduct a path analysis. 1. Mindell, 2011; Gregory et al., 2008; Dahl, 1996; Curcio, Ferrara, and Gennaro, 2000 2. van der Kolk et al., 1984; Spoormaker, Schredl, and van den Bout, 2006; Inman, Silver, & Doghramji, 1989; Mindell & Barrett, 2002. 3. Levin and Nielsen, 2007 4. Nielsen et al., 2000; Simard et al., 2008; Levin & Nielsen, 2007; Gregory et al., 2011
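The poster reports a path analysis fit with structural equation modeling. As a rough illustration of the core idea (each sixth-grade outcome regressed on all third-grade sleep disturbances plus the controls), the sketch below fits one regression per outcome with statsmodels; this is a simplification of a jointly estimated SEM, and all column names are hypothetical stand-ins for the SECCYD variables.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("seccyd_g3_g6.csv")  # hypothetical extract of the NICHD SECCYD data

sleep_g3 = ["nightmares", "daytime_sleepiness", "trouble_sleeping",
            "sleep_more_less", "talks_walks_in_sleep", "sleep_amount"]
controls = ["maternal_dep_g3", "child_anxdep_g3"]
outcomes_g6 = ["mother_anxdep", "child_dep", "teacher_anxdep",
               "mother_reactivity", "teacher_reactivity"]

# One equation per G6 outcome: all G3 sleep measures entered together with controls,
# mimicking the simultaneous paths of the reported model.
for outcome in outcomes_g6:
    formula = f"{outcome} ~ " + " + ".join(sleep_g3 + controls)
    fit = smf.ols(formula, data=df).fit()
    print(outcome, fit.params[sleep_g3].round(3).to_dict())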

Results/Conclusions After controlling for maternal depressive symptoms and child anxious/depressive symptoms at G3, we found that: • Maternal report of child nightmares at G3 significantly predicted maternal report of child anxious/depressive symptoms at G6 (β=0.062). • Maternal report of child trouble sleeping at G3 predicted child self-report of depression symptoms at G6 (β=0.081). • Maternal report of child amount of sleep at G3 was negatively associated with teacher report of child emotional reactivity at G6 (β= -0.088). Discussion • Nightmares were the only measure of sleep that predicted maternal report of child anxious/depressive symptoms. • Trouble sleeping was the only measure of sleep that predicted child self-report of anxious/depressive symptoms. • Less sleep duration in third grade predicted less effective emotional strategies and behaviors when interacting with peers in grade six.

Research to Practice Strategies • Distinct sleep disturbances may differentially affect specific aspects of children's behavior. • Nightmares are a unique and separate phenomenon from other sleep disturbances. • Future research may benefit from using discrete sleep disturbance measures as opposed to collapsing them into a general sleep problem score.

Contact Information Timothy Valshtein, Joan Foley, Judith Stull, Marsha Weinraub Temple University (973) 670-9722 timothy.valshtein@temple.edu


Cardiopulmonary Resuscitation/Automated External Defibrillator Olympic Competition Enhances Resuscitation in High School Students Results

Background • Sudden cardiac arrest (SCA) is the leading cause of death in the US. • Survival rates decrease by 10% per minute after collapse until effective defibrillation. • 5-10% overall survival from out-of-hospital arrest. • 41-74% survival with immediate use of an automated external defibrillator (AED). • Critical steps for effective CPR & AED use: team effort, recognition of an event, cognitive knowledge of resuscitation steps and AED use, and individual psychomotor skills.

Project Overview • Study Purpose/Location: The study aimed to develop innovative Resuscitation Training Programs in high schools in the Philadelphia School District, to assess the efficacy of the education and training program, and to assess the motivation or willingness to implement resuscitation. • Population: Health education classes in 15 Philadelphia School District high schools were selected, with one control and one study intervention group per school. • Timeline: Pre-tests were conducted starting in February 2011; Post-tests were conducted in April/May 2011, and Retention testing was conducted in April/May 2012.

Methods
Inclusion criteria:
• PSD high school health class student
• Consent/assent to participate in the study
Exclusion criteria:
• Refusal to participate
Study class: CPR/AED pre- and post-tests of cognitive knowledge (CK) and psychomotor (PM) skills; usual CPR/AED instruction.

Conclusions • Development of novel resuscitation programs translated into outstanding application of these skills in a Mock Code. • Engaging students in their own learning process can empower a new generation of effectively trained CPR bystanders.

Research to Practice Strategies • To replicate high retention rates of CPR/AED skills by using these innovative teaching methods and collaborative learning, we recommend implementation and dissemination of this program throughout the PSD high school health education curriculum.

Control class: CPR/AED pre- and post-tests of CK and PM skills; usual CPR/AED instruction.
Study class (in addition): students develop novel CPR/AED programs to learn, teach, and retain skills, and participate in a CPR/AED Olympic Competition.

• One year post instruction and participation in the SPORTS study, students showed an 88% retention rate in the performance of CPR/AED skills. • This rate is highly significant, as retention rates for CPR/AED skills post instruction thus far have been around 50%, with limited studies conducted on adolescents.

Contact Information Victoria L. Vetter, M.D., MPH The Children’s Hospital of Philadelphia 215-590-1962 vetter@email.chop.edu


Exceptional Coaching for Early Language and Literacy ExCELL-e Background

Results/Conclusions

The vocabulary and language gap between children in poverty and their more affluent peers (Fernald, 2013; Heckman, 2013) remains one of the greatest challenges facing the American education system. Our research focuses on the development and implementation of language and literacy interventions in high-poverty early childhood classrooms.

Preliminary evidence of ExCELL-e’s effectiveness comes from research on our previous PD intervention in preschools:

Project Overview Our current project, ExCELL-e, is a web-based professional development (PD) program designed for pre-K, kindergarten, and 1st grade teachers to guide their learning and implementation of effective language and literacy strategies.

• Teachers trained in our strategies significantly increased the quality of their instruction (effect size = 1.00) • Children made significant gains in vocabulary (effect size = .80) (Wasik & Hindman, 2011). • Children learned the words targeted in the intervention and also made gains on standardized measures (e.g., PPVT).

Research to Practice Strategies ExCELL-e supports vocabulary learning among both native speakers and dual language learners.

Findings from our research show that: • ExCELL-e can provide feasible, effective research-based PD to teachers.

Methods We are piloting the PD with 15 teachers. • Teachers read a module, which includes a discussion of a strategy, video exemplars showing how the strategies are implemented in classrooms, and a Check Your Understanding assessment of teachers' knowledge of the information. • Teachers also upload videos of themselves practicing the strategies in classrooms and receive detailed feedback from a coach. • Teachers monitor the progress of children's vocabulary learning 3 times per year. • Teachers receive books and Reading Guides to support their use of the intervention techniques.

• Our materials -- such as online modules, trade books and reading guides -- can be easily integrated into many curricula. • ExCELL-e increases the quality of instruction in early childhood classrooms, specifically around vocabulary and language development. • ExCELL-e significantly impacts children’s success in learning to read and, as a result, fosters their success in school.

Contact Information Barbara A. Wasik, Annemarie H. Hindman, & Emily Snell Temple University College of Education bwasik@temple.edu, ahindman@temple.edu, emily.snell@temple.edu


Tab 15 Posters Related to Action Plan Strategy 2 – Develop a System of Excellent Schools
2.01 Dowdall – Shuttered Schools: Bringing New Life to Old Buildings
2.02 Hill – Situating Success: A Mixed-Methods Study of School Turnarounds in Philadelphia
2.03 Jack – The Effects of Philadelphia's School Choice Policies on School Utilization Rates
2.04 Jones – Measuring Bullying Prevention Implementation Readiness and Progress: A Pilot Study
2.05 Max & Glazerman – Do Disadvantaged Students Get Less Effective Teaching?
2.06 Norton – Philadelphia's Accelerated High Schools: An Analysis of Strategies and Outcomes
2.07 Peters – Youth Risk Behavior Survey, Philadelphia
2.08 Schwartz & Eiraldi – School-Wide Positive Behavior Interventions and Supports (SWPBIS) in SDP Schools: An Exploratory Study
2.09 Stratos & Reitano – Student Achievement in Renaissance Charter Schools
2.10 Supovitz – The Common Core in New York City Schools


Page Left Blank Intentionally


Shuttered Schools: Bringing New Life to Old Buildings Background

Results/Conclusions

Large-scale public school closures have become a fact of life in many American cities. Pew’s 2011 report, Closing Public Schools in Philadelphia: Lessons from Six Urban Districts, provided a broad overview of such closures. In 2012, Superintendent Hite announced plans to shutter several dozen of Philadelphia’s district-run schools, and Pew launched a follow-up study focused on what happens to the buildings after they close.

• Districts sold, leased or reused 267 properties between 2005 and 2012; 301 were still on the market. • Obstacles to reuse included building condition, quirky layouts, and locations in declining residential areas. • Sale prices were often well below initial projections. • The most common reuse by far was as a charter school. • Districts got the best results when they moved aggressively to sell or lease facilities soon after they became empty, made information readily accessible to prospective buyers and the public, took steps to ensure that purchasers followed through on plans, and got outside help in determining appropriate uses of the properties.

Project Overview Shuttered Public Schools: The Struggle to Bring Old Buildings New Life examined repurposing efforts in 12 cities that had large inventories of closed schools: Atlanta, Chicago, Cincinnati, Cleveland, Detroit, Kansas City (MO), Milwaukee, Philadelphia, Pittsburgh, St. Louis, Tulsa and Washington. The project looked at the obstacles to reuse, the costs and consequences associated with properties that remain empty, the policies that guide disposition processes, and strategies that have proved most effective.

Research to Practice Strategies Pew researchers shared findings with School District of Philadelphia staff and city officials. The current repurposing effort, a joint undertaking of the district and the Mayor's Office, draws directly from lessons in the report. Contact Information Name: Emily Dowdall Organization: The Pew Charitable Trusts Phone Number: (215) 599-8297 Email Address: edowdall@pewtrusts.org


Situating Success: A Mixed-Methods Study of School Turnarounds in Philadelphia Background

Methods

Promise Academies play a major role in the District's plan to "Develop a System of Excellent Schools" (The School District of Philadelphia, 2014). These schools have been identified as "chronically underperforming" and selected by the District to undergo District-managed turnaround (SDP, 2010). Despite the popularity of turning around the lowest-performing schools in Philadelphia and nationally, research on the effectiveness of turnarounds is limited (e.g., Calkins, Guenther, Belfiore, & Lash, 2007; de la Torre, Allensworth, Jagesic, Sebastian, Salmonowicz, Meyers, & Gerdeman, 2012).

Using a regression discontinuity design, I will analyze the impact of the Promise Academy turnaround model on student achievement and instruction and explore how the key mechanisms through which this CSR model works relate to the successes and failures of the five Essential Supports. Qualitative methods will be used to contextualize these findings and offer hypotheses to explain variation in both achievement outcomes and Essential Supports.
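A minimal sketch of the sharp regression discontinuity idea described above, assuming a hypothetical school-level running variable (a prior performance index) with a known cutoff below which schools were selected for turnaround; the file name, variable names, cutoff, and bandwidth are illustrative, not the District's actual selection rule or this study's specification.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("school_outcomes.csv")  # hypothetical: one row per school
CUTOFF = 0.0        # illustrative cutoff on the performance index
BANDWIDTH = 10.0    # illustrative local bandwidth around the cutoff

df["running"] = df["performance_index"] - CUTOFF   # center the running variable
df["treated"] = (df["running"] < 0).astype(int)    # below cutoff -> turnaround

# Local linear regression with separate slopes on each side of the cutoff;
# the coefficient on `treated` is the estimated discontinuity in achievement.
local = df[df["running"].abs() <= BANDWIDTH]
rd = smf.ols("achievement ~ treated + running + treated:running", data=local).fit()
print(rd.params["treated"], rd.bse["treated"])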

Project Overview This study's conceptual framework integrates Policy Attribute Theory (Desimone, 2002), drawn from the Comprehensive School Reform (CSR) literature, and the Essential Supports, developed by Bryk et al. (2010) from the literature on effective schools, in order to shed light on the possible successes, challenges, and failures of turnarounds and how they arise. To what extent do varying levels of Policy Attributes mediate and moderate the levels and success of the Essential Supports?

Research to Practice Strategies This study will use mixed-methods research to directly compare, contrast, and link two facets of success (academic achievement on the one hand, and determinants of success represented by both the five Policy Attributes and the five Essential Supports on the other) as they apply to Promise Academies. In doing so, this study will aid the District in making "evidence-based revisions to the Promise Academy Model" (The School District of Philadelphia, 2014), make an important contribution to the growing literature on effective schools, and build on Rowan et al.'s (2009) work that underscores the tradeoffs in achieving academic success through CSR models (e.g., improving student achievement at the cost of professional communities and innovation).
Bryk, A. S., Sebring, P. B., Allensworth, E., Luppescu, S., & Easton, J. Q. (2010). Organizing Schools for Improvement: Lessons from Chicago. Chicago, IL: The University of Chicago Press.
Calkins, A., Guenther, W., Belfiore, G., & Lash, D. (2007). The Turnaround Challenge: Why America's Best Opportunity to Dramatically Improve Student Achievement Lies in Our Worst-Performing Schools. Executive Summary. Mass Insight Education. Retrieved from http://www.eric.ed.gov/ERICWebPortal/recordDetail?accno=ED538309
de la Torre, M., Allensworth, E., Jagesic, S., Sebastian, J., Salmonowicz, M., Meyers, C., & Gerdeman, R. D. (2012). Turning around low-performing schools in Chicago. Chicago, IL: Consortium for Chicago School Research, University of Chicago.
Desimone, L. (2002). How can comprehensive school reform models be successfully implemented? Review of Educational Research, 72(3), 433-479.
Rowan, B. P., Correnti, R. J., & Camburn, E. M. (2009). School improvement by design: Lessons from a study of comprehensive school reform programs. In G. Sykes, B. Schneider, & D. N. Plank (Eds.), Handbook of Educational Policy (pp. 637-651). New York, NY: Routledge.
The School District of Philadelphia. (2014). Action Plan v2.0. Retrieved from http://webgui.phila.k12.pa.us/offices/a/action-plan

Contact Information Kirsten Hill Penn GSE 630.699.0915 kihi@gse.upenn.edu


The Effects of Philadelphia's School Choice Policies on School Utilization Rates Background

Results/Conclusions

Research has shown that enrollment shifts due to charter policies can have negative impacts on school districts that are experiencing persistent enrollment declines.1 This can lead to declining revenue and, in the case of two New York districts, closures.2 The School District of Philadelphia (SDP) has embraced the expansion of school choice amid a decline in traditional public school enrollment over the past decade. As with other districts, the School District of Philadelphia faces significant financial pressures due in part to underutilized buildings and growing charter payments that increase costs during a time of austerity.3

Project Overview There has been a growing debate around school choice in the SDP as underutilization led to the closure of 24 schools at the end of the 2012-2013 school year. We tried to isolate the effects of school choice (charters and intra-district options) versus outward migration, a historically documented problem for urban centers, on the utilization problem facing many SDP schools.

Methods We compared utilization rates based on two scenarios using 2012-13 school catchment data for 197 of 204 neighborhood schools (96.6%). The district's utilization goal of 85%, a best-practice standard, was used as an indicator of adequate school capacity.4 Scenario 1 followed the district's formula, dividing each school's 2012-2013 enrollment by total school capacity. Scenario 2 assumed that all school-aged children within the catchment area attended their neighborhood school while capacity remained the same.
1. Ni, Y., & Arsen, D. (2011). School choice participation rates: Which districts are pressured? Education Policy Analysis Archives, 19(29). Retrieved from: http://epaa.asu.edu/ojs/article/view/777
2. Bifulco, R., & Reback, R. (2011). Fiscal impacts of charter schools: Lessons from New York. Effect of Charter Schools on School District Finance, Final Report. Retrieved from: http://www.columbia.edu/~rr2165/pdfs/nycharterfiscal.pdf
3. Jack, J. & Sludden, J. (2013). School closings in Philadelphia. Perspective on Public Education. Retrieved from: http://www.urbanedjournal.org/archive/volume-10-issue-1-summer-2013/schoolsclosings-philadelphia
4. Jablow, P. (2011). Where are the 70,000 seats? The Philadelphia Public School Notebook. Retrieved from: http://thenotebook.org/february-2011/113295/where-are-70000-seats
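A minimal sketch of the two utilization calculations, assuming a hypothetical file with one row per neighborhood school; the file name and the columns enrollment_2012_13, capacity, and catchment_school_age_children are illustrative, not the district's actual data fields.

import pandas as pd

UTILIZATION_GOAL = 0.85  # district's best-practice benchmark

schools = pd.read_csv("catchment_schools.csv")  # hypothetical: 197 neighborhood schools

# Scenario 1: the district's formula -- actual 2012-13 enrollment over total capacity
schools["util_scenario1"] = schools["enrollment_2012_13"] / schools["capacity"]

# Scenario 2: assume every school-aged child in the catchment attends the
# neighborhood school, with capacity unchanged
schools["util_scenario2"] = schools["catchment_school_age_children"] / schools["capacity"]

# Underutilized even if all catchment children returned -> outward migration;
# underutilized only under Scenario 1 -> losses to charters / intra-district choice
migration_driven = (schools["util_scenario2"] < UTILIZATION_GOAL).sum()
choice_driven = ((schools["util_scenario1"] < UTILIZATION_GOAL)
                 & (schools["util_scenario2"] >= UTILIZATION_GOAL)).sum()
print(migration_driven, choice_driven)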

Scenario 1 Despite potential enrollment loss to school choice options, 80 schools (40.4%) exceeded the 85% utilization rate benchmark. Scenario 2 For 42 schools (21.2%), the loss of students to alternative schooling options led to underutilization. The remaining 76 neighborhood schools (38.4%) were underutilized due to outward migration.

Research to Practice Strategies District leadership must understand the district- and school-level effects of expanding choice. A balance must be struck between expanding choice offerings to try to retain families and the negative financial and enrollment effects of increasing schooling options, which may exacerbate enrollment shifts, increase financial strain, and ultimately lead more people to seek residence outside the city limits. Contact Information James Jack School District of Philadelphia 215-400-6403 jjack@philasd.org


Measuring Bullying Prevention Implementation Readiness and Progress: A Pilot Survey Background While a number of bullying prevention programs exist, implementation with fidelity and sustainability can be problematic, particularly for highly-stressed schools. Research is increasing our understanding of the importance of school readiness in implementation success;1 however, few such measures have been developed.

Project Overview The current study provides instrument development data on a survey designed to measure bullying prevention implementation readiness and progress. The two participating SDP schools are involved in a pilot implementation of a combined social emotional learning (SEL), bullying prevention program: Second Step + Bullying Prevention Unit.

Methods Principals were asked to select around 10 formal or informal leaders among school staff to complete the voluntary online survey. Twenty-two leadership staff at the participating elementary schools responded, including 18 teachers and 4 other staff members. Figure 1. Mean and Range of Scores on 6 Readiness Dimensions (N=22)

Respondents were asked to rate the school community on six dimensions of readiness. Averaged respondent scores for each school ranged from 40 to 60, which accurately reflects a first-year initiation of a bullying prevention program. However, the score distributions across respondents suggest low consensus on the dimensions. Although almost all respondents identified that a bullying prevention program had been initiated during the current year, knowledge about program components was mixed, as was knowledge about the school's written policies and protocols for responding to bullying. Findings support the potential value of the measurement tool, but revisions will include clearer definitions and the collection of readiness information from specific school community subgroups.

Research to Practice Strategies Two follow-up strategies are being planned: First, efforts are underway to expand bullying prevention implementation and data collection at an additional 5-10 SDP elementary schools next year. Second, we are using the pilot data to improve the readiness measure. We hope to administer the revised survey with the SDP during the next school year and broaden it to capture readiness for Tier 1 safety and climate program implementation in addition to bullying prevention. Readiness data can assist the district in better targeting prevention and climate improvement efforts and holds great potential for improving implementation success. 1 Fixsen, D.L., Naoom, S.F., Blasé, K.A., Friedman, R.M. & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: The National Implementation Research Network.

Contact Information: Lisa M. Jones, Ph.D. CCRC, University of New Hampshire Stoneleigh Foundation Fellow 603-862-2515 lisa.jones@unh.edu


Do Disadvantaged Students Get Less Effective Teaching? Background

Results/Conclusions

As the School District of Philadelphia partners with teachers to improve practice and meet higher standards in English/ Language Arts and Mathematics, there is a need to better understand how effective teachers are distributed across schools.

• On average, economically disadvantaged students received less effective teaching than other students. • The difference was equivalent to about 4 weeks of learning for reading and 2 weeks for math, or about 2 to 4 percent of the student achievement gap between these groups. • Access to effective teaching for disadvantaged students varied across districts.

Project Overview This study synthesized findings from three peer-reviewed studies for the U.S. Department of Education that collectively spanned school districts in 17 states.

Figure 2. Prevalence of Highest-Performing Teachers in the Highest- and Lowest-Poverty Schools

Methods The three studies measured effectiveness using value-added scores based on one to four years of value-added data. Students were in grades 3 through 8 and were classified as eligible for free or reduced-price lunch (FRL) or not. Two studies compared average teacher effectiveness for FRL students and non-FRL students. One study compared the percentage of highest-performing teachers in schools with the highest and lowest FRL rates. Figure 1. Disadvantaged Students Receive Less Effective Teaching
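A minimal sketch of the FRL vs. non-FRL comparison made in two of the studies, assuming a hypothetical student-level file that links each student's FRL status (coded 0/1) to the value-added score of that student's teacher; the file name and column names are illustrative, not those of the underlying studies.

import pandas as pd

# Hypothetical student-teacher links: one row per student
links = pd.read_csv("student_teacher_links.csv")  # columns: frl_eligible (0/1), teacher_value_added

# Average effectiveness of the teaching received by non-FRL vs. FRL students
by_group = links.groupby("frl_eligible")["teacher_value_added"].mean()
gap = by_group.loc[0] - by_group.loc[1]  # non-FRL minus FRL, in value-added units
print(by_group)
print("gap:", round(gap, 3))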

Source (Figure 2): Glazerman and Max (2011). Note: Schools were divided into five equal-sized categories based on the percentage of FRL students, and the percentage of highest-performing teachers was compared across the categories. This graph shows the percentage of highest-performing teachers in the lowest-poverty category (the 20 percent of schools in each district with the lowest percentage of FRL students) and the highest-poverty category (the 20 percent of schools in each district with the highest percentage of FRL students). The analysis of elementary schools combined math and reading because teachers were responsible for both subjects. *Difference in the prevalence of highest-performing teachers between the highest- and lowest-poverty schools is statistically significant at the 0.05 level.

Research to Practice Strategies • Access to effective teaching varies by district. • Districts can examine whether effective teachers are evenly spread across schools with different poverty rates. • Districts can also measure access to effective teaching across schools and within schools. Contact Information

Source (Figure 1): Isenberg et al. (2013) and calculations based on Sass et al. (2012). See Section C in the appendix for details about how we translated the study results into the "weeks of learning" measure. Note: Isenberg et al. (2013) includes grades 4-8; Sass et al. (2012) includes grades 4 and 5. *Differences between teachers of non-FRL students and FRL students are statistically significant at the 0.05 level.

Jeffrey Max and Steve Glazerman Mathematica Policy Research (202) 484-9220 jmax@mathematica-mpr.com


Philadelphia’s Accelerated High Schools: An Analysis of Strategies and Outcomes Background Since 2004, the School District of Philadelphia’s Office of Multiple Pathways to Graduation and Project U-Turn have guided development of Philadelphia’s accelerated high schools. In 2012-13, 10 accelerated high schools were operated by seven private providers, all of whom offer over-age, under-credited youth an opportunity to accelerate credit accumulation and graduate from high school.

Students who remain enrolled in accelerated schools for at least 12 months outperform similar students in neighborhood high schools: Students Earning Six or More Credits: Accelerated Students vs. Similar Students Enrolled in Neighborhood High Schools: 2009-10 through 2011-12

Project Overview Research for Action (RFA) conducted a two-year, mixedmethod study guided by research questions addressing: • Challenges facing students when they enroll in accelerated schools. • Strategies and practices providers employ to support students’ attendance, retention, and completion; • Overall retention and graduation outcomes; • Credit accumulation rates; and • Graduation rates.

Data Collection & Analyses

Retention Remains a Challenge for Accelerated Schools: Accelerated Students’ Enrollment Status after the 2011-12 School Year

Data collection: • SDP administrative records • Interviews with SDP administrators & principals • Case studies of three accelerated schools Analyses on: • Student population • Student retention & graduation patterns • Matched comparisons of credit accumulation and graduation rates • Cross-site analysis to identify strategies to promote student attendance, retention, and completion at accelerated schools

Results/Conclusions Accelerated schools serve a vulnerable student population: • In-school challenges: over-age, behind academically, exposure to the juvenile justice system, at elevated risk for dropout • Out-of-school challenges: highly mobile, parenting, working, caring for family Accelerated schools' strategies to meet students' needs: • Building a relational culture between adults and students • Creating safe environments • Actively monitoring attendance • Engaging students in learning • Promoting visions for the future

[Pie chart: accelerated students' enrollment status after the 2011-12 school year (Still Enrolled, Graduate, Out of District Transfer, Dropout), with segments of 58%, 28%, 9%, and 5%; n = 10,876 students.]

Research to Practice Strategies Implications for practice: • Stay focused on attendance and retention: most dropout occurs in the first year of enrollment; after one year, students' long-term outcomes were consistently better than those of similar students in neighborhood high schools.

Implications for Policy • Reductions in funding for accelerated schools pose significant risks to an already vulnerable population of students. • Multiple metrics of success, and supports aligned to them, are required to rigorously assess the performance of accelerated school providers.

Contact Information: Michael Norton Research for Action 267-295-7760 *Final Report by Research for Action


Youth Risk Behavior Survey, Philadelphia Background The Youth Risk Behavior Surveillance System (YRBSS) monitors health-risk behaviors that contribute to the leading causes of death and disability among youth and adults.

Project Overview • YRBSS has been completed in the Philadelphia School District since 1991. • Every two years, a scientifically selected sample of 9th-12th grade students completes a 99-item survey that asks them to report on their health risk behaviors. • The sampling process ensures that, with 70% completion, the sample is representative of all 9th-12th grade students in the district.

Methods • In spring 2013, the 99-item multiple choice YRBS was administered to 1300 high school students from 32 randomly selected public schools in Philadelphia. • The school response rate was 100%, the student response rate was 71%.

Results/Conclusions Demographics of Sample (N=1300)

Research to Practice Strategies Tracking trends in youth health behaviors helps us identify which behaviors need to be addressed in our schools and communities.

Contact Information Name: Judith Peters, MBA, HHSA Organization: SDP Phone Number: 215-400-6893 Email Address: jrpeters@philasd.org


School-Wide Positive Behavior Interventions and Supports (SWPBIS) in SDP Schools: An Exploratory Study Background Ethnically diverse children from urban schools have higher rates of office discipline referrals (ODRs) and disruptive behavior disorders, indicating greater mental health needs (Chitiyo, May, & Chitiyo, 2012; Gregory & Weinstein, 2008; Vincent et al., 2011). SWPBIS is a well-documented approach to help improve school climate and decrease behavioral problems school-wide (Chitiyo et al., 2012; Solomon et al., 2012; Vincent et al., 2011). SWPBIS aims to create a positive school environment for all students through the use of clear rules, behavioral expectations, and procedures to encourage and discourage certain behaviors (Taylor-Greene et al., 1997), to reduce behavior problems (Bradshaw et al., 2010), and to improve academic achievement (Horner et al., 2009).

Project Overview Project ACCESS is an initiative developed by The Children's Hospital of Philadelphia and the Devereux Center for Effective Schools in partnership with The School District of Philadelphia, with funding from the Centers for Disease Control and Prevention (CDC), to address health disparities in under-resourced schools.

Figure 1: Average Number of ODRs Per Student from 2008-2011

Effects of universal interventions on children's behavior were measured via the number of ODRs per student at the beginning of Year 1, end of Year 2, and end of Year 3.

Results/Conclusions The average number of ODRs per student decreased for both schools. All interventions were acceptable to children, parents, and teachers. SWPBIS can be successfully implemented in urban schools to reduce ODRs.

Research to Practice Strategies Identify strategies for scaling up the use of SWPBIS throughout the District. • Currently conducting a 5-year randomized controlled clinical trial of SWPBIS in 6 SDP schools.

Methods Project ACCESS was developed in two schools in Philadelphia using the Public Health approach (Kutash et al., 2006). School staff implemented SWPBIS with support from CHOP and Devereux. Study Aims: 1. To identify components of SWPBIS interventions that need adaptation. 2. To implement interventions and reduce office discipline referrals (ODRs). 3. To assess the acceptability of interventions.

Refer to handout for references.

Contact Information Billie Schwartz, MA Ricardo Eiraldi, Ph.D. The Children’s Hospital of Philadelphia 267-427-7408/215-590-7759 schwartzb@email.chop.edu eiraldi@email.chop.edu


Student Achievement in Renaissance Charter Schools Background

Results/Conclusions

The School District of Philadelphia's Renaissance Schools Initiative was first implemented in 2010-2011. The initiative intended to spur dramatic improvements in student performance over a short period of time by providing additional resources, changes in staffing, increased attention, and other strategies designed to improve persistently low-performing schools. The initiative consists of two models:

Overall, fifteen of the seventeen Renaissance Charter Schools in Cohorts 1-3 experienced an increase in the percentage of students scoring advanced or proficient on the PSSA reading test since turnaround. Thirteen experienced an increase in the percentage of students scoring advanced or proficient in math on the PSSA (see Figures 1 and 2).

Research to Practice Strategies Through the Renaissance Schools Initiative, schools were tasked with achieving dramatic improvements in student outcomes in a short period of time, and there is evidence of positive improvements at many Renaissance Charter schools. Programmatic and other qualitative data should be collected at these schools from administrators, teachers, students, and parents in order to capitalize on the strengths of these programs. ORE is currently in the process of collecting qualitative data.

Promise Academies, which remain District-managed but undergo changes in leadership and teaching staff, as well as receive additional funding and support; and Renaissance Charter Schools, which remain neighborhood schools but are managed by charter providers, with relative autonomy from the District.

Project Overview This analysis explores changes in buildinglevel student achievement after school turnaround at School District of Philadelphia (SDP) Renaissance Charter schools.

[Figure 1. Average Percentage Point Change since Turnaround in Students Scoring Adv./Prof. on PSSA Reading for Renaissance Charter Schools. Bar chart showing each Cohort 1, 2, and 3 school; y-axis: average percentage point change since turnaround.]
[Figure 2. Average Percentage Point Change since Turnaround in Students Scoring Adv./Prof. on PSSA Math for Renaissance Charter Schools. Bar chart showing each Cohort 1, 2, and 3 school; y-axis: average percentage point change since turnaround.]

Methods
Since 2010-2011, twenty schools have been converted to Renaissance Charters, and fifteen schools have become Promise Academies (although three of the fifteen have since closed).
Based on the research related to successful school turnarounds, a reasonable baseline by which to judge academic gains was determined to be a 4 to 8 percentage point increase in students scoring advanced or proficient on the PSSA each year after turnaround. Therefore, Cohort 1 schools would be considered to have met the expectation for adequate progress with a minimum increase of 12 percentage points (4 points for each of the three years since turnaround), and Cohort 3 schools with a minimum increase of 4 percentage points (one year since turnaround).


Contact Information Kati Stratos; Adrienne Reitano School District of Philadelphia 215-400-5396 kstratos@philasd.org; areitano@philasd.org


The Common Core in New York City Schools Background

Results/Conclusions

School districts across the country are designing strategies to support and guide teachers in understanding and implementing the Common Core State Standards (CCSS). District policies shape how new initiatives such as the CCSS are interpreted by practitioners and have implications for how policies are implemented at the school and classroom levels.

There was wide variation in how schools responded to the district's expectations for implementing the CCSS, ranging from schools that transformed their approach to teaching and learning to schools that made only minor changes to what they were already doing. Those schools that transformed their instructional approaches developed a deeper understanding of the CCSS.

Project Overview

Furthermore, researchers found that teachers relied mostly on coaches and school administrators to gain knowledge about the CCSS, and that knowledge was transferred mostly during formal, within-school learning opportunities such as team meetings and school-wide professional development.

Over the course of two school years, researchers from the Consortium for Policy Research in Education (CPRE) spent time in NYC schools learning about the city's expectations for teachers regarding the CCSS and how teachers and school administrators interpreted and engaged with those expectations.

Methods Researchers collected data through interviews with stakeholders at all levels of the NYC school system, including teachers, principals, network leaders, and city-level administrators. Additionally, a survey was administered to a sample of schools regarding the sources that school staff used to help understand and implement the CCSS.

Research to Practice Strategies District policy was carefully crafted to encourage schools to engage with the new CCSS. Results from this research suggest that the more thoughtfully schools engage with CCSS implementation, the better their understanding of the CCSS will be. Also, coaches and administrators play a key role in disseminating knowledge within schools and they access external resources more often than other school staff. District and school leaders should prioritize providing teachers with opportunities to collaborate and access knowledge which resides within schools.

Visit cpre.org/CommonCoreNYC for more information. Contact Information Jonathan Supovitz Consortium for Policy Research in Education (215) 573-0700x230 jons@gse.upenn.edu


Tab 16 Posters Related to Action Plan Strategy 3 – Identify and Develop Exceptional, Committed People
3.01 Ebby – Teacher Analysis of Student Knowledge (TASK)
3.02 Herzog – Teacher Networks in Philadelphia: The Current Landscape
3.03 Jack & Stratos – Teacher Perception of Evaluation Focus and Reported Outcomes
3.04 Merlino – National Center for Cognition and Science Instruction – Findings from Philadelphia


Page Left Blank Intentionally


Teacher Analysis of Student Knowledge (TASK) Background

Results/Conclusions

A Teacher Analysis of Student Knowledge, or TASK, is a grade-specific, online assessment for mathematics teachers which measures important components of the instructional knowledge necessary to teach to the high expectations of the Common Core State Standards in Mathematics. TASK can be used for program evaluation, professional development, and more.

Findings from the TASK nation field trial paint a picture of the current state of teachers’ learning trajectory-oriented formative assessment capabilities in grades K-10 in five urban and urban fringe districts. They indicate: • Across all grades, teachers focused more on what students do (procedural) than what they understand (conceptual). • The majority of teachers suggested teaching the student particular strategies rather than developing mathematical understanding. • Overall, researchers see significant room for growth in teacher capacity to identify, interpret, and respond to students’ conceptual understanding.

Project Overview In Spring 2012, the Consortium for Policy Research in Education (CPRE) conducted a large field trial of the TASK instrument in partnership with five public school districts in five northeastern and southern states. The districts varied in size, student demographics, and programs of mathematics instruction. Teachers completing the TASK instrument were evaluated in Content Knowledge, as well as six domains: Concept Knowledge, Mathematical Validity, Analysis of Student Thinking, Learning Trajectory Orientation in both Ranking and Rationale, and Instructional Decision-Making.

Research to Practice Strategies Given the emphasis in the Common Core State Standards in Mathematics on rigor as a balance between procedural and conceptual understanding, policymakers and practitioners must prioritize capacity building in both district policy and practice. Access the TASK field test results in the interactive electronic report at cpre.org/task-report.

Methods 1,261 teachers of mathematics in grades K-10 were given a TASK, which provides a grade-appropriate problem and a set of student responses, and then asks teachers to complete seven steps, each of which measures formative assessment practices pertaining to the domains described above.

Contact Information Caroline Ebby Consortium for Policy Research in Education (215) 573-0700x222 cbe@gse.upenn.edu


Teacher Networks in Philadelphia: The Current Landscape Background

Results/Conclusions

Teachers often feel isolated within their classrooms, schools, districts, and states (Meyers, Paul, Kirkland, and Dana, 2009).1

On average, teachers identified 10 resources (people, groups, events) that they rely on:
• nearly half of these resources run through formal structures/channels, half informal
• 20% of resources are found outside school
• just 8% of resources are engaged online

Teacher networks • Connect teachers in order to enhance expertise, instruction, and classroom/ school environment • Influence teacher expertise, teacher leadership, student achievement (Leana 2011; Baker-Doyle 2010; Anderson 2010, Daly et al 2010, Spillane & Louis 2002) 2

Project Overview The Philadelphia Education Fund supports teacher networks (e.g., Philadelphia Teacher Residency, Math + Science Coalition, Early Warning Systems, Philadelphia Postsecondary Success Program), and is interested in better understanding the landscape of teacher networks in Philadelphia and how networks serve to enhance teacher knowledge, retention, and perceived value. SDP Action Plan 2.0 Strategy 3, Goals D, E, & F focus on supporting teacher development and collaboration.

Methods
• 180+ interviews of teachers, support staff, and educators across District and charter schools, with a follow-up survey around teacher networks
• Qualitative and social network (ego network) analysis of interview data; descriptive analysis of survey results (a minimal sketch of the ego-network approach follows)
Example of teacher network differences
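A minimal sketch of the ego-network descriptives referenced above, assuming each interview is coded as a list of (teacher, resource) ties with attributes for whether the tie runs through a formal channel and whether the resource sits outside the school; the node names and attribute fields are illustrative, not the study's coding scheme.

import networkx as nx

# Hypothetical coded ties from one interview: (teacher, resource, attributes)
ties = [
    ("teacher_A", "grade_team_meeting", {"formal": True,  "outside_school": False}),
    ("teacher_A", "math_coach",         {"formal": True,  "outside_school": False}),
    ("teacher_A", "college_roommate",   {"formal": False, "outside_school": True}),
    ("teacher_A", "twitter_chat",       {"formal": False, "outside_school": True}),
]

G = nx.Graph()
for teacher, resource, attrs in ties:
    G.add_edge(teacher, resource, **attrs)

# Ego network for one teacher: the resources they rely on and the ties among them
ego = nx.ego_graph(G, "teacher_A")
n_resources = ego.number_of_nodes() - 1
formal_share = sum(d["formal"] for _, _, d in ego.edges(data=True)) / ego.number_of_edges()
outside_share = sum(d["outside_school"] for _, _, d in ego.edges(data=True)) / ego.number_of_edges()
print(n_resources, formal_share, outside_share)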

• Network value and teacher persistence tied to school networking environment
• Wide range of teacher networks (see Figures A and B); network behavior statistically different between schools
• Teacher autonomy and shared leadership promote productive networking practice
• Desired networking topics include: 1) using and analyzing student data, 2) classroom management, 3) technology in the classroom, 4) teaching content, 5) differentiated instruction, and 6) student interventions and supports

Research to Practice Strategies
• Networks as a strategy to build knowledge, expertise, and professional support
• Formal network time benefits school culture
• District and partners can engage teachers in taking ownership of networks
• District and partners can help facilitate and promote opportunities and content
1 Meyers, E., Paul, P. A., Kirkland, D. E., & Dana, N. F. (Eds.). (2009). The power of teacher networks. SAGE.
2 Leana, C.R. (2011). The missing link in school reform. Stanford Social Innovation Review, 30-35; Baker-Doyle, K. (2010). Beyond the labor market paradigm: A social network perspective on teacher recruitment and retention. Education Policy Analysis Archives, 18(26); Anderson, L. (2010). Embedded, emboldened and (net)working for change: Support seeking and teacher agency in urban, high needs schools. Harvard Education Review, 80(4), 541-572; Daly, et al. (2010). Relationships in reform: The role of teachers' social networks. Journal of Educational Administration, 48(3), 359-91; Spillane, J. P., & Louis, K. S. (2002). School improvement processes and practices: Professional learning for building instructional capacity. Yearbook of the National Society for the Study of Education, 101, 83-104.

Contact Information Liza Herzog Philadelphia Education Fund 215-665-1400 ext. 3323 lherzog@philaedfund.org


Teacher Perception of Evaluation Focus and Reported Outcomes Background Starting with the 2013-2014 school year and rolled out over the subsequent two years, the School District of Philadelphia (SDP) will implement the Pennsylvania Department of Education’s new educator effectiveness system, mandated by Act 82 of 2012. When fully implemented, teachers will be evaluated on a composite measure that incorporates data from a new classroom observation tool, building level scores as measured by the School Performance Profile, and individual impact on student achievement.

Project Overview In November 2013, the Office of Research and Evaluation (ORE) surveyed approximately 6,000 educators to gain feedback on the SDP 2012-2013 educator evaluation process. The findings from this evaluation can serve as a baseline to observe changes in perception and reported evaluation outcomes as the SDP implements Pennsylvania’s new educator effectiveness system.

Methods
Educators were surveyed about what they perceived to be the focus of the formal feedback received during their 2012-2013 evaluation. Respondents were also asked questions about the process of being evaluated in 2012-2013. These results are shown in Figures 1 and 2. ORE used an analysis of variance (ANOVA) to test whether differences exist between respondents in the four perceived-focus categories across five evaluation outcomes.
Figure 1. Overall Perception of the Focus of 2012-2013 Evaluation Feedback. [Bar chart of "The feedback I received from my evaluator focused..." (n=5,340): more on helping me improve my practice (n=2,325); more on making a judgment about my performance (n=737); equally on helping me improve my practice and making a judgment about my performance (n=1,564); I did not receive formal feedback from my evaluator last year (n=1,220). Y-axis: percent of respondents.]
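A minimal sketch of the ANOVA described above, assuming a hypothetical survey export with one row per respondent, a column for the perceived feedback focus (four categories), and one column per evaluation outcome; the file name and column names are illustrative, not ORE's actual variables.

import pandas as pd
from scipy import stats

responses = pd.read_csv("evaluation_survey.csv")  # hypothetical survey export

outcomes = ["improved_practice", "improved_achievement", "influenced_pd",
            "defined_expectations", "helped_me_improve"]

# One-way ANOVA per outcome: do mean outcome ratings differ across the four
# perceived-focus categories (improve / judge / both / no formal feedback)?
for outcome in outcomes:
    groups = [grp[outcome].dropna() for _, grp in responses.groupby("feedback_focus")]
    f_stat, p_value = stats.f_oneway(*groups)
    print(f"{outcome}: F={f_stat:.2f}, p={p_value:.4f}")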

Educators who perceived feedback to be more focused on improving their practice were more likely to report positive outcomes across all five evaluation goals (e.g., it helped me improve as a professional). Educators who perceived feedback to be more focused on judging their performance were more likely to report negative outcomes across all five evaluation goals, and outcomes similar to those of educators who reported no formal feedback. Educators who perceived feedback to be equally focused on improving and judging their performance were more likely to report positive outcomes as compared to those who felt they were judged, but less likely than those who felt the focus was on improvement.

Research to Practice Strategies The most beneficial formal feedback may be that which is focused on improving a teacher’s practice, followed by feedback that is equally attentive to improving practice and making a judgment about performance. Feedback perceived as primarily judging is the least useful, and no more helpful than not having received formal feedback at all. As reforms aim to retain and grow the best teachers, these findings and those of previous research stress the importance of creating an environment that supports and promotes teacher development and growth rather than one that is perceived as punitive or judgment focused.

Figure 2. In 2012-13, the processes used to conduct my evaluation... [Stacked bar chart: percent of respondents answering Strongly Agree, Agree, Disagree, or Strongly Disagree for each statement: Improved my students' achievement (n=5,514); Improved my practice (n=5,538); Influenced my professional development activities (n=5,531); Clearly defined what was expected of me (n=5,538); Helped me improve as a professional (n=5,570).]



Contact Information


James Jack & Katherine Stratos School District of Philadelphia 215.400.6403 jjack@philasd.org


National Center for Cognition and Science Instruction Findings from Philadelphia Background

Results/Conclusions

Can laboratory-based research findings in cognitive science be applied to existing science curriculum units to improve student achievement?

1. The Cognitive Science group of schools performed statistically significantly better than the Control or Content Only groups on some units, but the effect size was small.

Project Overview

2. The Content Only schools performed worse than the Cog Sci or Control schools in almost all units and cohorts.

The Institute of Education Sciences awarded a grant to IHE and nonprofit organizations in Philadelphia and Pittsburgh to form a National Center to conduct a large randomized controlled study of the effects of cognitive science curricular modifications to three units of the SDP's Holt middle school science texts. (A second twin study in Arizona and Pittsburgh used the FOSS curriculum.) A second intervention focused on providing teachers concentrated science content PD only (28 hrs. per unit). 90 SDP schools and over 270 teachers were involved from 2009-2013. The schools were randomly divided into 3 groups within each SDP region. • Cog Science group: PD (28 hrs./unit) + science text modifications • Content PD only group: PD (28 hrs./unit) • Control group: business as usual

Methods School districts and principals were recruited and volunteered. Teachers were recruited prior to school randomization and were paid a generous stipend for participation. Two cohorts of teachers were used. End-of-unit tests were constructed using an innovative algorithm that closely aligned each end-of-unit test item to the content of each unit. The tests were given to all three groups. Detailed teacher- and student-level data were collected. PSSA scores were also collected. Three-tiered HLM was used for analysis.
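A minimal sketch in the spirit of the three-tiered HLM reported above, assuming hypothetical column names (unit_score, condition, prior_pssa, teacher, school); it approximates the third level with a school random intercept plus a teacher-within-school variance component in statsmodels, which is a simplification of the Center's actual models.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("end_of_unit_scores.csv")  # hypothetical student-level test file

# Fixed effect of condition (Cog Science, Content PD only, Control), adjusting for
# prior PSSA; random intercepts for schools and a teacher-within-school component.
model = smf.mixedlm(
    "unit_score ~ C(condition) + prior_pssa",
    data=df,
    groups="school",
    re_formula="1",
    vc_formula={"teacher": "0 + C(teacher)"},
)
result = model.fit()
print(result.summary())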

3. Students of teachers who taught units twice did better on some of the units versus control. 4. There was large variance within classrooms and across schools in all groups. 5. Teacher mobility and teaching reassignments from one year to the next varied from 40-50% a year, rendering the multi-year intervention less effective.

Research to Practice Strategies 1. The Cog Sci modifications and materials for the Holt science units are available to the SDP for use in all middle schools free of charge, apart from the costs of reproduction. 2. Cog Science PD and Content PD materials, as well as PD services, are available from 21PSTEM. 3. Content-only PD needs a higher dosage but may still be ineffective in raising student scores. 4. High teacher and principal assignment instability is a serious and even fatal threat to reform efforts, but 70% of that instability is potentially controllable. Partners for this research were the University of Pennsylvania, Temple University, the University of Pittsburgh, Research for Better Schools, and The 21st Century Partnership for STEM Education.

Contact Information F. Joseph Merlino The 21st Century Partnership for STEM Education 610-368-7942 jmerlino@21pstem.org


Tab 17 Posters Related to Action Plan Strategy 4 – Become a Parent- and Family-Centered Organization
4.01 Kowalski – District-Wide Parent and Student Surveys
4.02 Rogers & Subramanyam – Nudging Attendance with Social Comparison: A Randomized Controlled Experiment
4.03 Spier – Investing in Family Engagement


Page Left Blank Intentionally


District-Wide Parent & Student Surveys

Background Action Plan 2.0 calls for the School District of Philadelphia to become a parent- and family-centered organization, to develop a system of excellent schools, and to improve student learning, data accuracy, data application, and data accessibility. To further these objectives, the School District will release redesigned District-Wide Surveys to students in grades 3-12 and all parents in the latter part of the 2013-2014 school year.

Project Overview The redesigned District-Wide Parent Survey (DWPS) and District-Wide Student Survey (DWSS) will request feedback on multiple aspects of the educational experience within the School District of Philadelphia. All specific areas of inquiry included in the redesigned surveys will align with goals and strategies set forth in Action Plan 2.0, so that the data collected may be utilized in implementing and evaluating school and district initiatives.

Methods_______________________ Redesigning the DWPS and DWSS to align with the needs of parents, students, schools, teachers, administrators, and the District required numerous sources of input. In working to construct useful survey tools the following steps were taken: 1. Identification of survey goals aligned with Action Plan 2.0 2. Convening parent and student focus groups to identify constructs important to those parties 3. Review of the extant literature relevant to the constructs to be included in the survey and survey design 4. Item creation 5. Expert review of survey items and format 6. Item pretesting with parents and students 7. Survey review by students, parents, teachers, counselors, administrators, and central office staff.

Results/Conclusions The survey redesign process outlined above has resulted in surveys which, based on the available data, include constructs and items important to a diverse group of stakeholders within the School District of Philadelphia.

Research to Practice Strategies The data collected via the redesigned DWPS and DWSS will be used in a myriad of ways, including the following: 1. To construct scores on the School Progress Report 2. To identify areas of need and excellence within District schools 3. To inform the implementation and evaluation of District initiatives 4. To evaluate the psychometric validity and reliability of the survey tools Contact Information David Kowalski School District of Philadelphia 215-400-6136 dkowalski@philasd.org


Nudging Attendance with Social Comparison: A Randomized Controlled Experiment

Background___________________________ Recent work suggests that the majority of parents of highly absent students believe their student’s attendance is average or above average compared to peers in the same grade and school.1,2 In this randomized experiment, we will correct these miscalibrated parental beliefs and study the downstream impact on student attendance. This work leverages psychological research showing that people are motivated to perform behaviors when they believe most others do them as well. For example, people are more motivated to vote when they are informed that many others will vote,3 and people reduce their energy use when they are informed that others use less energy than they do.4

Research Questions 1. Does student absenteeism fall when parents receive information about their student’s school attendance? 2. Is this treatment more effective when social comparison information is included?

Study Design
[Diagram: students in the sample universe are randomly assigned to Control, Treatment 1, or Treatment 2.]

Methods________________________________ • 10 SDP K-12 schools were recruited to participate • Informed consent was sent to all guardians of students in participating schools • Sample universe: all students in the bottom 50% of attendance in each school and grade during the 1st semester of the 2013-14 SY • We are interested in student attendance, student performance, and sibling attendance • Students are randomly assigned to 1 of 3 conditions (a sketch of this stratified assignment follows the references below)

Preliminary Survey Results______________ (Experiment still in progress) • Phone surveys (N=840) suggest that over 80% of guardians of highly absent students (mistakenly) believe their students’ attendance is average or better compared to their classmates. 89% are very likely to open mail from the School District, and 92% of the mailing addresses were correct.

References 1. Rogers, T. (in progress). Parental Miscalibration of Their Children’s Relative Attendance and Performance. 2. Svenson, O. (1981). Are we all less risky and more skillful than our fellow drivers? Acta Psychologica, 47(2), 143-148. 3. Gerber, A.S. & Rogers, T. (2009). Descriptive Social Norms and Motivation to Vote: Everyone’s Voting and So Should You. The Journal of Politics, 71(1), 178-191. 4. Allcott, H., & Rogers, T. (2014). The short-run and long-run effects of behavioral interventions: Experimental evidence from energy conservation (NBER Working Paper No. 18492). National Bureau of Economic Research.
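As a minimal sketch of the stratified assignment described in the Methods bullets (students randomized to one of three conditions within each school and grade), the snippet below shuffles each school-by-grade stratum and deals students evenly across arms. The file and column names are hypothetical, and this is not the study team’s actual procedure.

```python
# Hypothetical stratified random assignment: within each school-by-grade stratum,
# shuffle students and deal them evenly into Control, Treatment 1, or Treatment 2.
# roster.csv and its columns (school, grade, student_id) are invented placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2014)
roster = pd.read_csv("roster.csv")  # e.g., students in the bottom 50% of attendance

ARMS = ["Control", "Treatment 1", "Treatment 2"]

def assign_within_stratum(stratum: pd.DataFrame) -> pd.DataFrame:
    shuffled = stratum.sample(frac=1, random_state=int(rng.integers(1_000_000)))
    shuffled["condition"] = [ARMS[i % len(ARMS)] for i in range(len(shuffled))]
    return shuffled

assigned = (
    roster.groupby(["school", "grade"], group_keys=False)
          .apply(assign_within_stratum)
)
print(assigned["condition"].value_counts())
```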

Project Timeline_______________________ • February 2014: Phone surveys conducted • March 2014: First round of mailings sent • April 2014: Second round of mailings sent

Contact Information Todd Rogers, Harvard Kennedy School 617-496-3621 Todd_Rogers@hks.harvard.edu Shruthi Subramanyam, Harvard Kennedy School 617-384-8185 Shruthi_Subramanyam@hks.harvard.edu


Investing in Family Engagement Background This study is being carried out through a U.S. Department of Education i3 validation grant. The purpose of this grant is to assess the effectiveness of the Families and Schools Together (FAST) initiative in turning around persistently low-performing schools. This approach aligns with Strategy 4 of the District’s Action Plan 2.0 – Become a parent- and family-centered organization.

Project Overview The project logic model states that improving family functioning and engagement in children’s schooling early on will contribute to school turnaround through increased attendance, improved student social and behavioral skills, and improved academic achievement. The FAST program consists of 8 weekly after-school sessions where families of kindergarteners meet with school staff, other students’ families, and staff from other local agencies to enjoy a meal and participate in activities. Through FAST, families build positive connections with other families and school staff, learn positive techniques for interacting with their child, and learn about available community resources.

Results/Conclusions This study began in 2013, so impact information is not yet available. Considerable effort has been required to encourage program attendance and to obtain study participation consent forms from families. Research to Practice Strategies The District’s Action Plan 2.0 states that active [parent and family] engagement will help improve the achievement of our students as well as our overall system performance. This study assesses the extent to which the FAST intervention improves family engagement and student academic performance in the District’s lowest-performing primary schools. The results of this study will help the District and other, similar districts to make evidence-based decisions about family engagement strategies for improving student outcomes.

Methods_______________________ This evaluation is a randomized controlled trial with 60 low-performing elementary schools in the District. Outcomes assessed include family functioning, family-school engagement, children’s social and behavioral skills, and children’s academic skills.

Contact Information Elizabeth Spier, Ph.D. American Institutes for Research 650-843-8226 espier@air.org


Page Left Blank Intentionally


Tab 18 Posters Related to Action Plan Strategy 5 – Become an Innovative and Accountable Organization 5.01 Gutierrez & Alemany – Evaluation of the Philadelphia GEAR UP Partnership 5.02 Hartmann, Gao, & Hallar – Aggregated Analysis of Six 21st Century Community Learning Center Evaluations, 2011-12 5.03 Kerstetter – The Consortium for Policy Research in Education (CPRE)


Page Left Blank Intentionally


Evaluation of the Philadelphia GEAR UP Partnership Background The Philadelphia GEAR UP Partnership initiative is a six-year (2009-2015), federally funded program that brings together a strong network of partners committed to improving the educational outcomes and graduation rates of Philadelphia Public School students.

Project Overview The program began in 2009-10 with 6th-grade and 7th-grade cohorts in 32 middle schools. In 2014-15, the program serves over 4,200 tenth-grade and eleventh-grade students in seven high schools. The Philadelphia GEAR UP project provides a wide array of services to students, teachers, schools and parents.

Methods_______________________ [Chart: Impact of GEAR UP on Credits Earned (2012-13 QED Study); an asterisk denotes a statistically significant difference at the .05 level.]

Results/Conclusions Positive results: • GEAR UP has a strong presence and is being implemented with high fidelity in most schools • Students with higher GEAR UP dosage exhibited better outcomes than their peers (school attendance, credits earned, attitudes and knowledge about college) • Very promising results from the rigorous impact study, including a significant impact of AVID on credits earned and attendance Areas for growth: • Low participation in services for at-risk students; over one-third of GEAR UP students have limited exposure to GEAR UP (i.e., one or two activities only) • Large variation across schools in implementation/dosage and outcomes • Scheduling and implementation difficulties at the large schools; school climate issues

Research to Practice Strategies • Use impact findings and the AVID factsheet to promote buy-in among stakeholders; if possible, expand AVID (given its proven impact) • Use participation data to closely monitor implementation across schools and activities and make mid-course corrections • Increase resources at large schools • Strengthen efforts to reach underserved GEAR UP students, including new recruitment and retention strategies • Future research area: assess the cost-effectiveness of individual GEAR UP components

Contact Information Manuel Gutiérrez, Vice President mgutierrez@metisassoc.com Julia Alemany, Senior Associate jalemany@metisassoc.com Metis Associates, 212-425-8833


Aggregated Analysis of Six 21st Century Community Learning Center Evaluations, 2011-12 Background and Project Overview 21st Century Community Learning Center (21st CCLC) grants fund out-of-school time (OST) programs to provide academic support for youth attending high-poverty, under-performing schools. Of the 18 Philadelphia organizations awarded 21st CCLC grants in 2011-12, six organizations representing seven providers selected RFA to conduct a mixed-methods evaluation that examined student demographics, program quality, and student outcomes.

Methods Data Sources • Student outcome data • SDP (13 schools) • Catholic schools (8) • Charter schools (2) • OST Program participation data • Interviews with program directors, site coordinators Analyses Mixed-methods study that includes an analysis of student outcomes data, program participation data, and qualitative data from the local evaluations

Results/Conclusions Student Participation • The majority of elementary school students participated in 21st CCLC programs at a meaningful level (90+ days). • The majority of middle and high school students did not attend at a meaningful level (90+ days). [Chart: Enrollment by OST Participation and Grade Level, for Elementary (K-5), Middle (6-8), and High School (9-12) students, by participation band: OST (90+ days), OST (30-89 days), and OST (<30 days).]

Well-Prepared Staff • Staffing was a challenge for all programs, as most staff positions were part-time and low-paying. • Programs struggled to attract highly qualified staff.

Robust School Partnerships • All but one of the 21st CCLC providers experienced supportive relationships with principals (who provided access to space and aided in recruitment) but had less communication with classroom teachers. • Providers were requiring and offering additional professional development. • Providers were seeking to employ staff with some postsecondary training.

Program Quality Alignment of Program to Student Academic Outcomes • Programs sought to impact academic achievement through homework help and project-based learning. • Programs were largely unable to provide a significant amount of one-on-one or small-group academic support or to align academic support with school-day activities.

Student Outcomes After taking into account pre-existing differences, there is no consistent evidence that ES and MS OST participants are doing better than nonparticipants academically (grades, PSSA scores) or in their behavior (attendance, suspensions). However, among students who participated in programming, higher levels of participation are associated with better outcomes. (A sketch of one such covariate-adjusted comparison follows the Research to Practice strategies below.)

Research to Practice Strategies For program providers: • Increase student participation (MS and HS). • Offer more centralized planning for program staff. • Develop relationships with school personnel. For Philadelphia’s OST system: • Support professional development for OST staff. • Help SDP support OST partnerships. • Support development of the OST workforce.
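The Student Outcomes paragraph above refers to comparisons that “take into account pre-existing differences.” One generic way to make such an adjustment, offered only as an illustration and not as RFA’s actual evaluation model, is an OLS regression of the outcome on participation plus a prior-year score, as sketched below with invented variable names.

```python
# Hypothetical covariate-adjusted comparison: regress a current-year outcome on an
# OST participation indicator while controlling for the prior year's score.
# outcomes.csv, pssa_2012, pssa_2011, and ost_participant are invented names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("outcomes.csv")

adjusted = smf.ols("pssa_2012 ~ ost_participant + pssa_2011", data=df).fit()
# The coefficient on ost_participant is the participant/nonparticipant gap after
# adjusting for the baseline score.
print(adjusted.params["ost_participant"])
```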

Tracey Hartmann, Jian Gao, & Brittan Hallar Research for Action 267.295.7760



The Consortium for Policy Research in Education (CPRE) Background

CPRE brings together education experts from renowned research institutions to contribute new knowledge that informs K-16 education policy & practice. Our work is peer-reviewed & open-access for education policymakers, practitioners, and researchers at cpre.org.

Research to Practice Strategies Not only does CPRE engage in quality research, but we also work to make our research available to policymakers and practitioners. Access CPRE’s work in the following ways: • Visit cpre.org to access our researcher, educator, or policymaker pages. • Policy Briefs present research reports in more concise and digestible language. • The Insights e-newsletter provides monthly updates on CPRE’s latest findings, projects, and more. • Follow us on Twitter and like us on Facebook using the QR codes below:

Projects CPRE researchers study myriad topics pertaining to education policy & practice, including: • Common Core State Standards • Instruction & Learning • School & District Leadership • Professional Development • Teacher Quality • School Finance • And more!

Methods CPRE employs quantitative, qualitative, and mixed-methods approaches to research. Our researchers conduct experimental studies, program evaluations, and instrument development.

Sign up for CPRE’s Insights e-newsletter. Follow us on Twitter @CPREresearch.

Like us on Facebook.

Contact Information Jackie Kerstetter Communications Manager, CPRE University of Pennsylvania (215) 573-0700x231 jji@gse.upenn.edu


Page Left Blank Intentionally


Tab 19 Poster Related to Action Plan Strategy 6 – Achieve and Sustain Financial Balance 6.01 Steinberg & Quinn – Assessing Adequacy and Equity in Education Spending


Page Left Blank Intentionally


Assessing Adequacy and Equity in Education Spending Background

We offer empirical evidence on the extent to which education spending in districts across Pennsylvania is both inadequate for all students to achieve academically and distributed in such a way that disadvantages some districts while benefitting others. In addition, we offer an empirical rejoinder to the oft-told story that large urban districts, like the School District of Philadelphia (SDP), are inefficient. We situate our study during the very short period in Pennsylvania’s recent history when efforts were dedicated to addressing the inequitable distribution of resources through a fair funding formula and to increasing the amount of resources available for education spending through a gradual increase in the state appropriation.

Results/Conclusions • PA required an additional $3.21 billion statewide in education funding to account for the difference between current per-pupil spending and an educationally adequate level of spending for the 2009-10 school year. • There are significant differences in the adequacy gap across Pennsylvania districts. These differences vary by district-level student poverty, academic performance and geographic location. The poorest and lowest-achieving districts had the largest adequacy gaps. • Even in the presence of a funding formula, SDP did not receive sufficient resources. However, SDP utilized its resources more efficiently than the average peer district in terms of student poverty and achievement.

Project Overview A comprehensive (and ongoing) assessment of: • The legislative, legal and policy context shaping education funding both locally and nationally. • Funding trends—the sources of education funding and expenditures per-pupil—both locally and nationally. • Issues related to equitable and adequate funding, on a per-pupil basis, throughout the Commonwealth, with a focus on the School District of Philadelphia.

Research to Practice Strategies • The study’s findings support claims that Pennsylvania’s education funding system unfairly burdens SDP. • They also refute claims that SDP spends its resources inefficiently. While many students in SDP are still not academically proficient, the study indicates that an investment of resources could potentially catalyze large gains in student success. [Table 2. Expenditures in SDP and peer districts, by district poverty]

Methods_______________________ We calculated an adequate per-pupil amount for each school district in Pennsylvania using the methodology of the legislatively commissioned Costing Out Study. We then compared SDP’s adequacy gap to the gaps of districts with similar achievement scores and populations (a back-of-the-envelope sketch of the adequacy-gap arithmetic appears below). [Table 1. Expenditures in SDP and peer districts, by math achievement]
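As a rough sketch of the arithmetic described in the Methods, and not the authors’ actual costing-out model, each district’s per-pupil gap can be taken as its costed-out adequate amount minus its actual spending, with the statewide shortfall equal to the enrollment-weighted sum of the positive gaps. The file and column names below are hypothetical.

```python
# Hypothetical adequacy-gap arithmetic: per-pupil gap = adequate amount - actual
# spending; the statewide shortfall is the enrollment-weighted sum of positive gaps.
# districts.csv and its columns are invented placeholders.
import pandas as pd

districts = pd.read_csv("districts.csv")  # one row per Pennsylvania district

districts["gap_per_pupil"] = (
    districts["adequate_per_pupil"] - districts["actual_per_pupil"]
)
# Districts already spending at or above the adequate level contribute nothing.
statewide_shortfall = (
    districts["gap_per_pupil"].clip(lower=0) * districts["enrollment"]
).sum()
print(f"Statewide adequacy shortfall: ${statewide_shortfall:,.0f}")
```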

Contact Information: Matthew P. Steinberg, Rand Quinn Graduate School of Education University of Pennsylvania steima@gse.upenn.edu


Page Left Blank Intentionally



Tab 20 Contact Information


Page Left Blank Intentionally


Julia Alemany jalemany@metisassoc.com Metis Associates Researcher

Beau Bradeu william.h.bradley@phila.gov City of Philadelphia Researcher

Akeem Anderson akeemanderson@gmail.com University of Pennsylvania - Fels Institute of Government Graduate student

Bridgette Brawner brawnerb@nursing.upenn.edu University of Pennsylvania School of Nursing University faculty

Katherine Barghaus barghaus@upenn.edu University of Pennsylvania Researcher

Tonya Broussard tbroussard@philasd.org School District of Philadelphia SDP Central Staff

Tamika Barrow tamika.barrow@gmail.com University of Pennsylvania Graduate student

Shirley Brown shirl.brown@comcast.net Philadelphia Writing Project Professional Development Facilitator

Carlos Bates cbates@philasd.org School District of Philadelphia SDP Central Staff, Higher Education Transition Coordinator

Benjamin Brumley benjamin.brumley@gmail.com University of Pennsylvania Graduate student

Toscha Blalock tblalock@gse.upenn.edu Consortium for Policy Research in Education Researcher

Sophie Bryan sbryan@philasd.org School District of Philadelphia SDP Central Staff

Morgan Botdorf mbotdorf2@gmail.com Temple University Research Assistant

Darryl Bundrige dbundrige@cityyear.org City Year Greater Philadelphia Service/Practice Provider


Nancy Burns nburns@21pstem.org The 21st Century Partnership for STEM Education Project Coordinator

Jura Chung jchung@philasd.org School District of Philadelphia SDP Central Staff

Elizabeth Cain ecain@pyninc.org Philadelphia Youth Network Intermediary staff

Elise Ciner eciner@salus.edu Salus University Researcher

Grace Cannon gcannon@philasd.org School District of Philadelphia SDP Central Staff

Shauna Clarke shauna.clarke@icfi.com REL Mid-Atlantic at ICF International Researcher

Gina Cappelletti gcapp@gse.upenn.edu Penn GSE Graduate student

Donna Cleland dcleland@21pstem.org The 21st Century Partnership for STEM Education Director of Professional Development

Richmond Carlton richmond.carlton@gmail.com Temple University Undergraduate Student

Carrie Conaway carrie@virtualmax.com

Sarah Costello scostel1@swarthmore.edu Swarthmore College Visiting Assistant Professor

Dennis Creedon dwcreedon@philasd.org School District of Philadelphia SDP Central Staff, Assistant Superintendent for Learning Network 3

Rachel Chu rachel.chew1990@gmail.com University of Pennsylvania Graduate student

Elizabeth Crossen ecrossen@pyninc.org Philadelphia Youth Network Foundation staff


Alicia Dahl adahl@philasd.org School District of Philadelphia SDP Central Staff

Michael Dickard mad389@drexel.edu Drexel University Graduate student

Joe D'Alessandro jvdalessandro@philasd.org School District of Philadelphia SDP Central Staff

Patricia DiLella pdilella@philasd.org School District of Philadelphia SDP Central Staff, Director of student systems - Info Technology for SDP

Brian Daly brian.daly@drexel.edu Drexel University University faculty

Erica Darken ebdarken@philasd.org Taylor Elementary, School District of Philadelphia Bilingual Teacher

Jean Dauphinee jean.dauphinee@icfi.com REL Mid-Atlantic at ICF International Researcher

Laura Desimone lauramd@gse.upenn.edu University of Pennsylvania Professor

Guy Diamond gd342@drexel.edu Drexel University University faculty

Emily Dowdall edowdall@pewtrusts.org Pew Charitable Trusts, Philadelphia research initiative Researcher

Angela Duckworth angela.duckworth@gmail.com University of Pennsylvania University faculty

Ayana Dudley adudley@phmc.org Health Promotion Council Nutrition Educator

Jennifer Duffy jeduffy@philasd.org School District of Philadelphia SDP Central Staff

Teresa Duncan teresa.duncan@icfi.com REL Mid-Atlantic at ICF International Director


Kimberly Edmunds kedmunds@researchforaction.org Research for Action Researcher

Ryan Fink ryanfi@gse.upenn.edu CPRE/University of Pennsylvania Researcher

Vicki Ellis vellis@philasd.org School District of Philadelphia Out of School Time, Summer Programs

Nelson Flores nflores@gse.upenn.edu University of Pennsylvania Assistant Professor

Shelly Engelman sengelman@philasd.org School District of Philadelphia SDP Central Staff

Michael Gallagher migallagher@philasd.org School District of Philadelphia SDP Central Staff

Katie Englander katen@gse.upenn.edu University of Pennsylvania Graduate student

Megan Getz megan.getz@phila.gov DHS Social Service Program Analyst

Linda Faber lfaber@philasd.org Office of Strategic Analytics SDP Central Staff

Kris Glunt kglunt@episcenter.org Penn State University EPISCenter Prevention Coordinator

Elizabeth Farley-Ripple enfr@udel.edu University of Delaware Researcher

Margaret Goertz pegg@gse.upenn.edu University of Pennsylvania University faculty

Charity Fesler cfesler@gse.upenn.edu University of Pennsylvania Graduate School of Education Graduate student

Debra Green debgreen@philasd.org School District of Philadelphia SDP Central Staff


Jody Greenblatt jgreenblatt@philasd.org School District of Philadelphia/Stoneleigh Foundation Stoneleigh Fellow

Dale Hamby dhamby@pa.gov PA Dept of Education Special Assistant to Secretary

Michelle Grimley megrimley@philasd.org School District of Philadelphia SDP Central Staff

Whitley Harbison wharbison@salus.edu The Pennsylvania College of Optometry at Salus University Clinical Research Coordinator

Elizabeth Gunderson liz.gunderson@temple.edu Temple University University faculty

Paul Harrington peh32@drexel.edu Drexel University's Center for Labor Markets and Policy Researcher

Manuel Gutierrez mgutierrez@metisassoc.com Metis Associates Researcher

Melanie Harris mharris@philasd.org School District of Philadelphia SDP Central Staff

Trevor Hadley thadley@upenn.edu University of Pennsylvania University faculty

Michelle Harris mimuller@philasd.org School District of Philadelphia SDP Central Staff

Danielle Haley maind@email.chop.edu Youth Heart Watch at The Children's Hospital of Philadelphia Researcher

Ray Hart rhart@cgcs.org Council of the Great City Schools Researcher

Kathleen Hall kdhall@gse.upenn.edu University of Pennsylvania University faculty

Brenna Hassinger-Das hassinger.das@temple.edu Temple University Infant and Child Laboratory Researcher


Liza Herzog lherzog@philaedfund.org Philadelphia Education Fund Researcher

Sophia Hwang hwangs2@email.chop.edu PolicyLab-Children's Hospital of Philadelphia Researcher

Kirsten Hill kihi@gse.upenn.edu University of Pennsylvania Graduate School of Education Graduate student

James Jack jjack@philasd.org School District of Philadelphia SDP Central Staff

William R. Hite, Jr. School District of Philadelphia Superintendent

Keysha Jackson kjackson3@philasd.org School District of Philadelphia SDP Central Staff

Stacy Holland sholland@philasd.org School District of Philadelphia SDP Central Staff

Karen James kljames@philasd.org School District of Philadelphia SDP Central Staff

Corinne Holmes corinne.holmes@temple.edu Temple University Graduate student

Dr. Robert Jarvis rljarvis@gse.upenn.edu University of Pennsylvania Director K-12 Outreach and Equity Leadership Initiatives

Rachel Holzman, Esquire rholzman@philasd.org School District of Philadelphia SDP Central Staff

Uma Jayaraman ujayaraman@philasd.org School District of Philadelphia District Assessment Coordinator

Yijing Huang yhuang@philasd.org School District of Philadelphia SDP Central Staff

Stephanie Jerome sejerome@philasd.org School District of Philadelphia SDP Central Staff


Jamie Jirout jamie@temple.edu Temple University, Spatial Intelligence and Learning Center (SILC) Researcher

Lee Johnson, Jr. ljohnson4@philasd.org School District of Philadelphia SDP Central Staff

Paul Kihn pkihn@philasd.org School District of Philadelphia SDP Central Staff

Diane Kim dkim@philasd.org School District of Philadelphia SDP Central Staff, Bilingual SPA

Jeff Jones jeffrey.jones@icfi.com REL Mid-Atlantic @ ICF International Meeting Planner

Kate Kinney Grossman kinneym@gse.upenn.edu Teacher Education Program, Graduate School of Education, University of Pennsylvania Associate Director, Teacher Education Program

Lisa Jones lisa.jones@unh.edu University of New Hampshire Researcher

David Kipphut dkipphut@philasd.org School District of Philadelphia SDP Central Staff, Deputy, Career and Technical Education

Sabriya Jubilee sjubilee@philasd.org School District of Philadelphia SDP Central Staff

Hannah Klein hannah.klein@phila.gov City of Philadelphia, Department of Human Services Research Coordinator

Christina Kang-Yi ckangyi@upenn.edu University of Pennsylvania University faculty

Daniel Knapp daniel.knapp@phila.gov DHS Researcher

Jacqueline Kerstetter jji@gse.upenn.edu Consortium for Policy Research in Education (CPRE) Communications manager

David Kowalski dkowalski@philasd.org School District of Philadelphia SDP Central Staff


Kathleen Krier kkrier@21pstem.org The 21st Century Partnership for STEM Education Researcher

Lisa Luo lisa.luo@icfi.com REL Mid-Atlantic at ICF International Researcher

Chau Wing Lam clam@philasd.org School District of Philadelphia SDP Central Staff

Christina Malik cmalik@irtinc.us innovation Research & Training Researcher

Hannah Lawman hlawman@temple.edu Temple University Researcher

Bates Mandel bmandel1@verizon.net 21st Century Partnership for STEM Education Researcher, Professional Development

Nancy Lawrence nlawrence@21pstem.org The 21st Century Partnership for STEM Education Researcher

Kirstie McClung kmcclung@philasd.org School District of Philadelphia SDP Central Staff

Evan Leach eleach64@aol.com Philadelphia Arts and Education Partnership Researcher

Lorraine McGirt lorraine.mcgirt@phila.gov City of Philadelphia Department of Human Services Manager, DHS OST Project

Betsy Lipschutz bdlipschutz@philasd.org School District of Philadelphia Researcher

Julia McWilliams juliemcwilliams5@gmail.com University of Pennsylvania Graduate student/researcher

Christopher Lubienski club@illinois.edu University of Illinois University faculty

David Meketon dmeketon@gmail.com University of Pennsylvania Researcher


Joseph Menta jmenta@philasd.org School District of Philadelphia SDP Central Staff

F. Joseph Merlino jmerlino@21pstem.org The 21st Century Partnership for STEM Education Researcher

Cheri Micheau cmicheau@philasd.org School District of Philadelphia ESOL Manager

Doria Mitchell dnmitchell@philasd.org School District of Philadelphia SDP Central Staff

Tammy Mojica tmhill@philasd.org School District of Philadelphia Teacher/Researcher

Glenn Moore glmoore@philasd.org School District of Philadelphia SDP Central Staff

James Moore james.moore@phila.gov City of Philadelphia Executive Director, Philadelphia Policy & Analysis Center; Director of Policy & Evaluation for the Office of the Deputy Mayor for Health & Opportunity

Angela Morris amorris19023@gmail.com School District of Philadelphia ICH teacher

Leslie Morris lmorris@philasd.org School District of Philadelphia SDP Central Staff

Tara Lynne Murphy murphyt@freelibrary.org Free Library of Philadelphia Administrator

Gulal Nakati gulal.nakati@phila.gov City of Philadelphia Project Lead

Fran Newberg fnewberg@philasd.org School District of Philadelphia SDP Central Staff, Administrator


Bobbi Newman bnewma@gse.upenn.edu CPRE Researcher

Rachel Pereira rachpereira@yahoo.com Office of the Philadelphia District Attorney RELMA Governing Board

Michelle Nguyen mnguyen2@philasd.org School District of Philadelphia SDP Central Staff

Judith Peters jrpeters@philasd.org School District of Philadelphia SDP Central Staff

Patricia Norwood pnorwood@philasd.org School District of Philadelphia Consulting Teacher

Alan Phillips aphillips2@philasd.org School District of Philadelphia SDP Central Staff, Data Analyst

Amy Pace amy.pace@temple.edu Temple University Postdoctoral Fellow

Heather Polonsky heather.polonsky@gmail.com Temple University - Center for Obesity Research and Education Researcher

Freda Patterson fredap@temple.edu Temple University Researcher

Brian Pavlovic bpavlovic@philasd.org School District of Philadelphia SDP Central Staff

Tennille Peeler tdtodd@philasd.org School District of Philadelphia SDP Central Staff

Michael Posner michael.posner@villanova.edu Villanova University Researcher

Rand Quinn raq@gse.upenn.edu University of Pennsylvania University faculty

Kenneth Ramos kramos@philasd.org School District of Philadelphia SDP Central Staff


Danielle Raucheisen draucheisen@philasd.org School District of Philadelphia Office of Research and Evaluation SDP Central Staff

Nicole Ray nray@philasd.org School District of Philadelphia Graduate student

Janaiya Reason jreason@nursing.upenn.edu University of Pennsylvania School of Nursing Researcher

Jessa Reed jreed@temple.edu Temple University Graduate student

Adrienne Reitano areitano@philasd.org School District of Philadelphia SDP Central Staff

Claire Robertson-Kraft claire.rk@gmail.com OPE, University of Pennsylvania Researcher

Todd Rogers Todd_Rogers@hks.harvard.edu Kennedy School Harvard Univ. Assistant Professor

Gail Rosenbaum grosenbaum@temple.edu Temple University Graduate student

Christine Ross cross@mathematica-mpr.com REL Mid-Atlantic at ICF International Researcher

Brooke Rothman brothman@branchassoc.com Branch Associates Researcher

Russell Rumberger russ@education.ucsb.edu University of California - Santa Barbara University faculty

Leah Sack lsack@salus.edu The Pennsylvania College of Optometry Research Coordinator

Dianne Santiago dhernandezsantiago@philasd.org Head Start ECE SDP Central Staff, Social Worker

Pearl Schaeffer pschaeffer@uarts.edu Philadelphia Arts in Education Partnership Researcher


Daniel Schiff dschiff@philaedfund.org Philadelphia Education Fund Researcher

Lori Shorr lori.shorr@phila.gov Mayor's Office of Education Other

Katie Schlesinger katie.schlesinger@gmail.com University of Pennsylvania Project Manager

Laura Shubila laura@b-21.org Building 21 Executive Director

Stan Schneider sschneider@metisassoc.com Metis Associates Evaluator

Bethany Silva silvab@gse.upenn.edu Philadelphia Writing Project at the University of Pennsylvania Graduate student

Billie Schwartz schwartzb@email.chop.edu The Children's Hospital of Philadelphia Graduate student

Majeedah Scott mscott2@philasd.org School District of Philadelphia SDP Central Staff

Kineret Shakow shakowk@einstein.edu Einstein Healthcare Network Partner Director

Brett Shiel bshiel@philasd.org School District of Philadelphia SDP Central Staff

Jasmine Silvestri tub48904@temple.edu Temple University Researcher

Alyssa Simon asimon@thefoodtrust.org The Food Trust Youth Wellness Coordinator

Kelly Smith ksmith2@philasd.org College and Career SDP Central Staff

Emily Snell emily.snell@temple.edu Temple University Researcher


Sharon Sniffen sharon.sniffen@icfi.com REL Mid-Atlantic at ICF International Logistics Coordinator

Judith Stull stullj@temple.edu Temple University University faculty

Harris Sokoloff harriss@gse.upenn.edu University of Pennsylvania University faculty

Shruthi Subramanyam shruthi.subramanyam.hks@gmail.com Harvard Kennedy School Researcher

Nicole Sorhagen nicole.sorhagen@temple.edu Temple University Graduate student

Kelsey Suloman ksuloman@philasd.org School District of Philadelphia Researcher

Elizabeth Spier espier@air.org American Institutes for Research Researcher

Jonathan Supovitz jons@gse.upenn.edu Consortium for Policy Research in Education, University of Pennsylvania Researcher

Matt Steinberg steima@gse.upenn.edu University of Pennsylvania Assistant Professor

Allison Still awstill@philasd.org School District of Philadelphia Acting Director

Kati Stratos kstratos@philasd.org School District of Philadelphia SDP Central Staff

Tom Szczesny tszcz@gse.upenn.edu University of Pennsylvania GSE Graduate student

Jennifer Thompson jthompson@branchassoc.com Branch Associates, Inc. Researcher

Tamara Toub tamara.spiewak.toub@temple.edu Temple University Infant and Child Laboratory Researcher


Timothy Valshtein timothy.valshtein@temple.edu Temple University Student Researcher

Barbara Wasik bwasik@temple.edu Temple University University faculty

Jill Valunas JValunas@cli.org Children's Literacy Initiative Regional Manager

Elliot Weinbaum eweinbaum@williampennfoundation.org William Penn Foundation Foundation staff

Stephanie Vander Veur svanderv@temple.edu Temple University, Center for Obesity Research and Education Researcher

Dave Weinstein dweinstein@cityyear.org City Year Internal non-profit evaluator

Victoria Vetter vetter@email.chop.edu The Children's Hospital of Philadelphia, Perelman School of Medicine at the University of Pennsylvania University faculty/Researcher

Darcelle Void-Boston ddvoidboston@philasd.org School District of Philadelphia SDP Central Staff, Instructional Coach

Stephanie Waller wallers@sp2.upenn.edu University of Pennsylvania School of Social Policy and Practice Graduate student

Terri White twhite5@philasd.org Mayor's Office of Education Higher Education Advisor to Mayor

Peter Witham pwitham@edanalytics.org Ed Analytics Researcher

Tonya Wolford twolford@philasd.org School District of Philadelphia Researcher

Naomi Wyatt nwyatt@philasd.org School District of Philadelphia SDP Central Staff


Paula Yust paula.yust@temple.edu Temple University Researcher

Mariam Zakhary mzakhary@philasd.org Office of Strategic Analytics SDP Central Staff

Sarah Zlotnik zlotniks@email.chop.edu PolicyLab Researcher


Page Left Blank Intentionally


Page Left Blank Intentionally