
Peer Learning Community Series for SCA Statewide Recidivism Reduction Grantees

Data Collection & Evaluation: Building Capacity, Measuring Performance, and Developing Strategies to Monitor Public Safety and Program Outcomes

Mike Eisenberg, Senior Research Manager
Shenique S. Thomas, Policy Analyst

October 9, 2013


http://csgjusticecenter.org/nrrc/

• The resource center is continually updating its website with materials relevant to the reentry field.

• Please register for the monthly NRRC newsletter at http://csgjusticecenter.org/subscribe/, and please share this link with others in your networks who are interested in reentry.


Peer Learning Community

• Purpose: Cultivate learning communities between Statewide Recidivism Reduction sites to facilitate discussions centered on implementing realistic research strategies, enhancing data collection and evaluation efforts, and sharing successes, challenges, and lessons learned related to program implementation and evaluation.


BJA’s Expectations around Evaluation
• “In applying for these grants, lead grantees and their subgrantees agree to cooperate in any and all related research efforts and program evaluations by collecting and providing enrollment and participation data during all years of the project.”
• “Applicants further agree to implement random or other modes of participant assignment, required by the evaluation design; cooperate with all aspects of the evaluation project; and provide comparable individual-level data for comparison group members.”
• “Applicants are encouraged to consider a partnership with a local research organization that can assist with data collection, performance measurement, and local evaluations.”


Research Strategy
• A realistic research strategy should increase the organization’s:
– Understanding of the mechanism through which the program accomplishes change: What factors influence the change?
– Understanding of the contextual conditions necessary to accomplish change: What circumstances/conditions are needed to ensure the change takes place?
– Specificity of outcome-pattern predictions according to context and mechanism triggered: What outcomes and patterns result from this change?

(Pawson & Tilley, 1997)


Presenters

Presenter and Moderator:
Mike Eisenberg, Senior Research Manager, Council of State Governments Justice Center

Grantee Presenters:
Kansas Department of Corrections: Margie Phelps, Jeremy Barclay, Kristin Bechtel
Ohio Department of Rehabilitation and Correction: Ed Rhine, Sharon Schnelle


Presentation Overview Measuring Recidivism and Other Key Data Indicators

Evaluating the Impact of Specific Interventions

Case Example: Kansas Department of Corrections
Case Example: Ohio Department of Rehabilitation and Correction


Measuring Recidivism and Other Key Performance Measures
• Why it matters: Knowing your statewide recidivism rate, and having a rich set of data to break down and understand its drivers, is key to developing effective reentry policies.
• What to think about:
– Defining recidivism
– Effectively tracking and measuring statewide recidivism rates on a regular basis
– Collecting other performance indicators and break-outs of the recidivism rate to give more context


Defining and Tracking Recidivism Measures
1. No national standard exists for defining recidivism.
2. Agencies use a variety of definitions: arrest, conviction, return to incarceration.
3. Standard follow-up periods are necessary to calculate recidivism rates.

Follow-up matters: a one-year rate will be lower than a three-year rate.

[Chart: percent returning to prison for a new offense or revocation of supervision, one-year vs. three-year tracking periods.]
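The effect of the tracking window can be sketched in a few lines of Python; the three-record cohort below is invented purely to show the mechanics.

```python
from datetime import date

# Hypothetical release cohort: (release date, return-to-prison date or None).
cohort = [
    (date(2019, 1, 15), date(2019, 9, 1)),   # returned within one year
    (date(2019, 2, 1),  date(2021, 6, 10)),  # returned in year three
    (date(2019, 3, 20), None),               # no return observed
]

def recidivism_rate(cohort, follow_up_years):
    """Share of releases with a return inside the follow-up window."""
    returned = 0
    for released_on, returned_on in cohort:
        # Approximate the window in days; a production measure would use
        # exact anniversary dates and a consistent data cutoff.
        if returned_on and (returned_on - released_on).days <= follow_up_years * 365:
            returned += 1
    return returned / len(cohort)

print(f"1-year rate: {recidivism_rate(cohort, 1):.0%}")  # 33%
print(f"3-year rate: {recidivism_rate(cohort, 3):.0%}")  # 67%
```

By construction, the one-year rate can never exceed the three-year rate for the same cohort, which is why a standard window must be fixed before rates can be compared across programs or years.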


Examples of Key Recidivism Indicators to Inform Policy

Recidivism rates by risk level for the released population (1-year reincarceration), 2009–2012:

[Chart: grouped bars by year. High-risk rates run roughly 59–67%, moderate-risk rates roughly 33–42%, and low-risk rates roughly 14–16% across 2009–2012.]

✓ To determine if supervision strategies targeting risk are effective
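Producing a break-out like the one above is a simple group-by over individual release records. Everything in this sketch (records, field layout) is invented for illustration.

```python
from collections import defaultdict

# Hypothetical release records: (risk level, returned within 1 year?).
releases = [
    ("High", True), ("High", True), ("High", False),
    ("Moderate", True), ("Moderate", False), ("Moderate", False),
    ("Low", True), ("Low", False), ("Low", False), ("Low", False),
]

def rates_by_group(records):
    """One-year return rate within each group."""
    totals, returns = defaultdict(int), defaultdict(int)
    for group, recidivated in records:
        totals[group] += 1
        returns[group] += recidivated  # True counts as 1
    return {g: returns[g] / totals[g] for g in totals}

# prints: High: 67%, Low: 25%, Moderate: 33%
for level, rate in sorted(rates_by_group(releases).items()):
    print(f"{level}: {rate:.0%}")
```

The same function works for any break-out field (gender, offense type, supervision district), which is how a single recidivism rate becomes a set of policy-relevant indicators.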


Examples of Key Recidivism Indicators to Inform Policy

Distinguish supervision revocations by technical violations and new crimes:

             FY11 Q4   FY12 Q1   FY12 Q2   FY12 Q3   FY12 Q4
Technical      76%       75%       74%       75%       69%
New Crimes     24%       25%       26%       25%       31%

✓ To determine if supervision strategies are impacting “behavior” (a reduction in new crimes) and effectively utilizing progressive sanctions for violations


Presentation Overview Measuring Recidivism and Other Key Data Indicators

Evaluating the Impact of Specific Interventions or Programs

Case Example: Kansas Department of Corrections
Case Example: Ohio Department of Rehabilitation and Correction


Program Evaluation Overview

Issue 1: Identifying the target population and comparison group
Issue 2: Defining and tracking recidivism outcomes
Issue 3: Conducting a process evaluation
Issue 4: Defining successful completion


Issue 1: Identifying a Target Population and Comparison Group

Target Population Discussion:
• High/moderate-high risk
• Identified needs
• Sample size for evaluation
– What is an adequate sample size for evaluation?
– Time needed to develop the sample and conduct the evaluation

Relevant information from the solicitation: “target population should be based on documented groups of offenders that significantly contribute to increased recidivism rates”


Issue 1: Identifying a Target Population and Comparison Group

Comparison Group Discussion: tiers of research design quality

1. Random Assignment: individuals who qualify for the program are randomly assigned to participate or to a control group.
2. Quasi-Experimental: with adequate controls, may include:
• Comparison to a similar population in a different jurisdiction
• Comparison to the overall population in the jurisdiction not receiving services
• Pre-program/post-program comparison of outcomes for the target population
3. Qualitative Evidence
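One common quasi-experimental approach is to match each participant to a similar untreated person on a few covariates before comparing outcomes. The sketch below does naive exact matching; all field names and records are hypothetical, and a real evaluation would use a richer method (e.g., propensity scores).

```python
# Pair each program participant with an untreated person who shares the
# same risk level, age band, and prior-felony count.

def match_key(person):
    return (person["risk"], person["age_band"], person["priors"])

def exact_match(participants, pool):
    """Greedy one-to-one exact matching; unmatched participants are skipped."""
    available = list(pool)
    pairs = []
    for p in participants:
        for i, candidate in enumerate(available):
            if match_key(candidate) == match_key(p):
                pairs.append((p, available.pop(i)))
                break
    return pairs

participants = [
    {"risk": "High", "age_band": "18-30", "priors": 2, "recidivated": False},
    {"risk": "High", "age_band": "18-30", "priors": 3, "recidivated": True},
]
pool = [
    {"risk": "High", "age_band": "18-30", "priors": 2, "recidivated": True},
    {"risk": "High", "age_band": "18-30", "priors": 3, "recidivated": True},
    {"risk": "Low", "age_band": "45+", "priors": 0, "recidivated": False},
]

pairs = exact_match(participants, pool)
treated = sum(p["recidivated"] for p, _ in pairs) / len(pairs)
control = sum(c["recidivated"] for _, c in pairs) / len(pairs)
print(f"matched participants: {treated:.0%}, comparison: {control:.0%}")
```

Matching only on observed covariates is weaker than random assignment (unobserved differences can remain), which is why random assignment sits at the top of the tiers above.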


Issue 1: Identifying a Target Population and Comparison Group

Which is better – the “real good reentry” program or the “so-so reentry” program?

                      “Real Good Reentry”       “So-so Reentry”
Recidivism Rate       20%                       40%
Risk                  Low Risk                  Medium-High and High Risk
Age                   45+                       18 to 30
Gender                Female                    Male and Female
Criminal History      First-time offenders      2 or more prior felonies
Recidivism Measure    1-year Reincarceration    2-year Rearrest

ASK: Are the populations served in each program comparable?

These are not comparable programs.


Issue 1: Identifying a Target Population and Comparison Group

Relevant information from the solicitation: “Applicants agree to provide comparable individual-level data for comparison group members”

Discussion: In order to compare outcomes, you must control for factors such as the following:

                      Comparison Group            “So-so Reentry” Program
                      (did not participate)
Recidivism Rate       65%                         40%

Both groups: Medium-High and High Risk; age 18 to 30; male and female; 2-year follow-up; any rearrest; 2 or more prior felonies.
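The comparability requirement can be made mechanical: before comparing rates, check the two groups' profiles factor by factor. The field names and values in this sketch are illustrative.

```python
# Factors on which the program and comparison groups must line up
# before a difference in recidivism rates is meaningful.
FACTORS = ["risk", "age", "gender", "follow_up", "measure", "priors"]

program_group = {"risk": "Medium-High and High", "age": "18 to 30",
                 "gender": "Male and Female", "follow_up": "2 years",
                 "measure": "any rearrest", "priors": "2+ prior felonies"}
comparison_group = dict(program_group)  # same profile, received no services

mismatches = [f for f in FACTORS
              if program_group[f] != comparison_group[f]]
print("comparable" if not mismatches else f"not comparable on: {mismatches}")
```

Run on the two programs from the previous slide, this check would flag every factor; run on the program and its matched comparison group here, it flags none, so the 65% vs. 40% gap is more plausibly attributable to the program.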


Issue 2: Defining and Tracking Recidivism Measures

Recommended definition of recidivism for evaluation:
• One-year follow-up
• Rearrest and reincarceration
– Misdemeanor or felony arrest
– Reincarceration for a new offense or revocation for supervision violations

Relevant information from the solicitation: “For purposes of this solicitation, ‘recidivism’ is defined in accordance with the current definition utilized by the applicant agency”


Issue 2: Defining and Tracking Recidivism Measures

Recidivism data and follow-up period:
• What data sources are available for tracking recidivism?

  Measure                            Data Source
  Rearrest                           SID #
  Reconviction                       SID #
  Reincarceration (jail or prison)   SID #
  Reincarceration (prison only)      DOC data

• Are you tracking recidivism rates of participants over a standard follow-up period?
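Linking across these sources typically means joining records on the state ID number (SID). A minimal sketch, with invented records:

```python
# Flag rearrests by joining DOC releases to statewide arrest records on
# the SID. Real rap-sheet data would need deduplication, charge-level
# filtering, and date handling beyond this.

release_dates = {"SID001": "2019-01-15", "SID002": "2019-02-01"}  # SID -> release date
arrests = [("SID001", "2019-08-03"), ("SID999", "2019-05-02")]    # (SID, arrest date)

# ISO-formatted date strings compare correctly as plain strings.
rearrested = {sid for sid, arrested_on in arrests
              if sid in release_dates and arrested_on > release_dates[sid]}
print(rearrested)  # {'SID001'}
```

The join is only as good as the identifier: SIDs missing from either source (like SID999 above, an arrest with no matching release) silently drop out, which is a common source of undercounted recidivism.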


Issue 3: Conducting a Process Evaluation

Goals of a process evaluation:
1. Monitor program implementation and changes to the program or service-system interactions over time
2. Determine whether a program is implemented in a way consistent with proven successful interventions
3. Fully describe the program and its components for purposes of replication


Issue 3: Conducting a Process Evaluation

A process evaluation should examine:
√ Is the program utilizing a design that has previously demonstrated an ability to reduce recidivism?
√ Is the program being implemented as designed?
√ Are staff training and experience adequate to deliver the program as designed?
√ Are risk/needs assessed, and are services delivered consistent with risk and needs?
√ Is the delivery of these services consistent over time?


Issue 4: Defining successful program completion

Successful Completion
• Program participant successfully completed the previously defined necessary components of the program.

Dropout/Mortality
• Program participant’s disposition is unknown; they should not necessarily be counted as either a successful completer or a failure.

Program Failure
• Program participant has not made advancements in the program, but instead has regressed.


Issue 4: Defining successful program completion
• Process-based definitions of successful completion:
– Program participant has completed 70% of program requirements or case plan within one year.
– Program participant at a moderate risk level has completed approximately 100 hours of services, programming, or treatment within one year; a program participant at a high risk of reoffending has completed approximately 200 hours of services, programming, or treatment within one year.
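The two process-based rules above can be written directly as predicates; the function names and inputs are illustrative, while the cutoffs (70%, 100/200 hours, one year) come from the definitions in the text.

```python
def met_case_plan_rule(pct_complete: float, months_enrolled: int) -> bool:
    """70% of program requirements or case plan within one year."""
    return pct_complete >= 70 and months_enrolled <= 12

def met_dosage_rule(hours: float, risk: str, months_enrolled: int) -> bool:
    """~100 service hours for moderate risk, ~200 for high risk, within one year."""
    required = {"moderate": 100, "high": 200}[risk]
    return hours >= required and months_enrolled <= 12

print(met_case_plan_rule(85, 10))        # True
print(met_dosage_rule(120, "high", 11))  # False: below the 200-hour bar
```

Encoding the definition up front, before the evaluation starts, prevents the common problem of "successful completion" being decided case by case after outcomes are known.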


Issue 4: Defining successful program completion
• Outcome-based definitions of successful completion:
– Program participant has shown improvement on a post-test assessment of a behavior related to their risk of recidivism or the goals of the program (such as, but not limited to, substance abuse, risk and need level, behaviors and attitudes, mental health, etc.) within one year.
– Program participant has achieved core benchmark goals of the program that are not necessarily related to behaviors (such as, but not limited to, attaining stable housing, attaining employment, achieving a GED, etc.) within one year.




Case Example: Kansas Department of Corrections
1. Building the DOC’s capacity and expertise around data collection and evaluation
2. Evaluation of Thinking for a Change (T4C)
3. Tracking recidivism and other key performance measures to monitor progress and inform policy


Evaluation of Thinking for a Change
• Project purpose: evaluate the effectiveness of T4C, as measured by reincarceration, for T4C participants compared with similarly situated inmates
• Initial stage of project development and research questions
• Methodology
– Sample and matching criteria
– Implementation and program fidelity
• Preliminary findings and next steps


Case Example: Kansas Department of Corrections

Offenders released from KDOC facilities who return within 3 years, by type of return and length of follow-up period*:

                        CY 2005   CY 2006   CY 2007   CY 2008   CY 2009
RECIDIVISM              38.62%    34.18%    32.90%    33.64%    33.13%
NO RETURN               61.38%    65.82%    67.10%    66.36%    66.87%
NEW CONVICTIONS         12.23%    13.21%    12.62%    13.99%    15.14%
CONDITIONAL VIOLATION   26.38%    20.97%    20.28%    19.65%    17.99%

- The KDOC’s recidivism rate has declined 23.57 percentage points since FY 2000, when the recidivism rate was 56.7%.
- For every 1% reduction in recidivism, the number of crime victims inherently drops and the need for prison beds drops by 34.


Case Example: Kansas Department of Corrections

                              CY 2006   CY 2007   CY 2008   CY 2009
SEX OFFENDERS
  Overall                     40.47%    42.66%    38.57%    33.33%
  Conditional Violators       36.52%    37.86%    33.71%    27.08%
  Convicted (New Offenses)     3.95%     4.80%     4.86%     6.25%
GENDER**
  Male                        32.46%    34.17%    33.12%    33.81%
  Female                      15.79%    18.55%    17.27%    17.88%
RISK LEVELS
  High Risk                   38.85%    42.30%    41.09%    42.86%
  Moderate Risk               32.97%    31.42%    31.95%    28.77%
  Low Risk                    26.08%    19.69%    18.04%    15.96%




Case Example: Ohio’s Reentry Efforts
• Passage of HB 130 had direct implications for reentry and the Second Chance Act
• Ohio’s HB 130 became effective April 1, 2009
• Two critical components of HB 130 support sustainability efforts for the state:
– Established the statewide Ohio Ex-Offender Reentry Coalition (OERC) to facilitate collaboration with state and local stakeholders and aid in reentry improvement efforts
– Served as a catalyst for the development of a 5-year comprehensive strategic plan that contains performance goals and outcomes, a commitment to reduce recidivism by 50%, and updates on progress in an annual report issued by the OERC


Ohio’s Scope of Data Collection

[Diagram: the Ohio Ex-Offender Reentry Coalition (OERC) at the center, linked to member agencies and the indicators each reports. Coalition focus areas: education, employment, recovery services, mental health, housing/placement, mentoring, family engagement.]

Member agencies and example indicators (as laid out in the original diagram):
• Dept of Aging: # of trainings of DRC staff; # of trainings of reentry professionals; # of inmate trainings (chronic disease); # of inmate trainings (diabetes)
• Supreme Court of Ohio: # of courts applying for certification; # of courts granted certification; # of SD courts starting up; # of SD courts receiving technical assistance
• Dept of Rehabilitation & Correction: reentry, education, employment, veterans, victims, recovery services, mental health, housing
• Veteran’s Services: # of veterans referred; # of veterans connected to services; # of veterans released from DRC who connected to services
• Employment indicators: # of One Stops; # of trainings; # of job fairs; # of prison mailings
• Dept of Alcohol & Drug Addiction Services*: # of drug courts funded; # juvenile TASC; # adult TASC; # of clients served; # of clients successfully completing; # of clients readmitted to DRC
• Dept of Mental Health*
• Dept of Education: # of inmates earning a GED; # of inmates in ABLE; # of inmates earning career certificates; # of inmates in an apprenticeship program
• Dept of Developmental Disabilities: # of DD determinations; # of DD ineligible redeterminations; # of MOUs
• Other members: Development Services Agency; Governor’s Office of Faith-based & Community Initiatives; Dept of Youth Services; Board of Regents; Dept of Job and Family Services; Dept of Commerce; Rehabilitative Services Commission; Dept of Public Safety


OERC Annual Data Collection Tool
• The Ohio Office of Criminal Justice Services maintains the web-based data collection tool.
• Collaborative effort with stakeholders: data instruments already in use were identified first, and the collection tools were developed from there to minimize double data entry and avoid burdening agencies by further taxing limited resources. Where essential information needed to demonstrate movement on the strategic plan’s outcomes was missing, additional measures were developed and included.
• Effective communication of the purpose: meeting with stakeholders to acknowledge the goal of gathering information that can be used to further leverage scarce state resources.
• Data collection tool located at https://www.surveymonkey.com/s/OERCdatacollectiontool


Barriers / Lessons Learned
• Logistical: not enough staff resources
• Fiscal and budgetary crises at both the state and local levels
• Data-sharing limitations and issues
• Inefficient system of collecting information
• Diversity of data (i.e., collection periods, methods, measures, etc.)
• Balancing the need to meet statutory obligations with the utility of the information to inform the strategic planning process


Practical Application
• The preliminary infrastructure created an opportunity to expand reentry efforts to rural communities.


Questions?


Thank You! Questions?

Mike Eisenberg
Senior Research Manager
meisenberg@csg.org

Shenique S. Thomas
Policy Analyst
sthomas@csg.org

http://csgjusticecenter.org/nrrc/

The presentation was developed by members of the Council of State Governments Justice Center staff. Because presentations are not subject to the same rigorous review process as other printed materials, the statements made reflect the views of the authors, and should not be considered the official position of the Justice Center, the members of the Council of State Governments, or the funding agency supporting the work.
