
The Program Effectiveness Plan
Accrediting Bureau of Health Education Schools (ABHES)
Paul Price, MBA, ABHES Education Advisor

The Purpose of the PEP

The PEP offers a program the chance to evaluate its overall effectiveness by:
• Systematically collecting data on each of the outcomes indicators
• Analyzing the data and comparing it with previous findings
• Identifying changes to be made (based on the findings)
• Using at least three years' historical outcomes

How Does the PEP Assess Programs?

The PEP evaluates each individual program within an educational institution by:
• Establishing goals
• Collecting outcomes data relevant to these goals
• Defining criteria for measurement of goals
• Analyzing outcomes
• Setting strategies to improve program performance

Outcome Assessment

Outcomes are generally, though not exclusively, defined in terms of the following indicators:
• Retention
• Job placement
• External validation (e.g., program assessment exam, certification/licensing exam)
• Student, graduate, extern affiliate, and employer satisfaction (through surveys)

What Purposes Does the PEP Fulfill?

Developing and using the PEP should fulfill several purposes:
• Defining criteria for measurement of goals
• Unifying administrative and educational activities
• Assessing progress and the need for change
• Communicating to the public a process for quality improvement
• Demonstrating regulatory compliance for approving or accrediting organizations

Such a document is a primary focus of most accrediting agencies. Program success is based on student achievement in relation to the program's mission.

Purpose

The PEP requires each program within an institution to look at its past, present, and future, and to continuously ask:
• Where have we been? Establishes baselines.
• Where are we now? Compares with baselines for needed change.
• Where do we want to go? Sets goals.
• How do we get there? Defines the process used to achieve the new direction.

Data Collection, Baselines, and Outcomes

Data Collection

Data should clearly evidence the level of educational outcomes for retention, placement, and satisfaction, along with information relevant to improving overall effectiveness:
• In-service training programs
• Professional growth opportunities for faculty

Collect data on each of the educational outcomes areas, and on achievement of occupational objectives, for each of the programs offered by the institution:
• Include data relevant to improving the program's overall effectiveness
• Clearly evidence the level of educational outcomes and satisfaction experienced by current students, graduates, and employers

What Kind of Data Is Collected?

• Annual reports
• IEPs
• Objectives
• Retention rate
• Job placement rates
• Credentialing exam (participation)
• Credentialing exam (pass rates)
• Program assessment exams
• Satisfaction surveys
• Faculty professional growth activities
• In-services


• Analyze data and compare with previous findings
• Identify necessary changes in operations or activities
• Set baseline rates

Baselines (Thresholds)

How should minimum thresholds be set?
• Analyze previous data and average
• Set, then tweak as results come in
• Survey constituents
• Should minimums be mandated, or should schools set them and regulators/accreditors approve?

Developing Objectives

Program objectives are consistent with the field of study and the credential offered, and include as an objective the comprehensive preparation of program graduates for work in the career field.
• Objectives of the program are to:
• At the completion of the program, the student will be able to:

For each program, document:
• Where are the program objectives published?
• How does the program determine that graduates have achieved the objectives (e.g., surveys, credentialing exam)?
• Who reviews the data? What process is utilized? (e.g., semiannually by advisory committee)
• What changes have resulted from data review?
• Date of most recent program review

Program Retention Rate

The method of calculation, using the reporting period July 1 through June 30, is as follows:

(EE + G) / (BE + NS + RE) = R%

• EE = Ending enrollment (as of June 30 of the reporting period)
• G = Graduates
• BE = Beginning enrollment (as of July 1 of the reporting period)
• NS = New starts
• RE = Re-entries
• R% = Retention percentage
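To make the arithmetic concrete, here is a minimal Python sketch of the retention calculation; the function name and sample figures are invented for illustration, with the parameters mirroring the ABHES abbreviations above.

```python
def retention_rate(ee: int, g: int, be: int, ns: int, re_entries: int) -> float:
    """Retention percentage: (EE + G) / (BE + NS + RE), expressed as a percent."""
    return 100 * (ee + g) / (be + ns + re_entries)

# Hypothetical reporting period: 60 enrolled on June 30, 20 graduates,
# 70 enrolled on July 1, 15 new starts, 5 re-entries.
print(f"R% = {retention_rate(ee=60, g=20, be=70, ns=15, re_entries=5):.1f}%")  # 88.9%
```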

Job Placement Rate in the Field

The method of calculation, using the reporting period July 1 through June 30, is as follows:

(F + R) / (G - U) = P%

• F = Graduates placed in their field of training
• R = Graduates placed in a related field of training
• G = Total graduates
• U* = Graduates unavailable for placement
• P% = Placement percentage

*Unavailable is defined only as documented: health-related issues, military obligations, incarceration, death, or continuing education status.

See pages 14-15 for examples.
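A matching sketch for the placement formula, again with invented figures; note that the denominator excludes graduates documented as unavailable.

```python
def placement_rate(f: int, r: int, g: int, u: int) -> float:
    """Placement percentage: (F + R) / (G - U), expressed as a percent."""
    return 100 * (f + r) / (g - u)

# Hypothetical year: 30 placed in field, 5 in a related field,
# 45 total graduates, 3 documented as unavailable for placement.
print(f"P% = {placement_rate(f=30, r=5, g=45, u=3):.1f}%")  # 83.3%
```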

Credentialing Examination Participation Rate

Standard: Participation of program graduates in credentialing or licensure examinations required for employment in the field, in the geographic area(s) where graduates are likely to seek employment.

The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

Examination participation rate = T / G

• T = Total graduates taking the examination
• G = Total graduates eligible to sit for the examination

Credentialing Examination Participation Rate

• Include results of periodic reviews of exam results, along with goals for the upcoming year
• Devise alternate methods for collecting results that are not easily accessible without student consent
• Include the three most recent years of data collection (by program or by class)

Credentialing Examination Pass Rate

The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:

F / G = L%

• F = Graduates passing examination (any attempt)
• G = Total graduates taking examination
• L% = Percentage of students passing examination
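The two examination rates can be sketched together; the cohort numbers below are hypothetical, and the function names are mine.

```python
def participation_rate(taking: int, eligible: int) -> float:
    """Share of eligible graduates who sat for the exam, as a percent."""
    return 100 * taking / eligible

def pass_rate(passing: int, taking: int) -> float:
    """F / G = L%: share of exam takers who passed (any attempt)."""
    return 100 * passing / taking

# Hypothetical cohort: 40 eligible graduates, 35 sat for the exam, 28 passed.
print(f"Participation rate = {participation_rate(35, 40):.1f}%")  # 87.5%
print(f"Pass rate (L%)     = {pass_rate(28, 35):.1f}%")           # 80.0%
```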

Program Assessment

• Program assessment exams (PAEs) pinpoint curricular deficiencies
• Should be designed to incorporate all major elements of the curriculum
• A well-designed PAE will point directly to the segment of the curriculum that needs remedy
• May be scored with ranges, rather than pass/fail

Program Assessment Ranges

Program assessment exam percentage ranges map to one of four development categories:
• Significant development necessary
• Additional development warranted
• Satisfactory development demonstrated
• Significant development demonstrated

Student, Clinical Extern Affiliate, Graduate, and Employer Satisfaction

The purpose of the surveys is to collect data regarding perceptions of a program's strengths and weaknesses for:
• Students
• Clinical extern affiliates
• Graduates
• Employers

For graduates and employers only, the survey is standardized.

Student, Clinical Extern Affiliate, Graduate, and Employer Satisfaction

Two parts for the PEP:
1. Participation: percentage of returns, compared to the threshold set
2. Satisfaction: typically measured on a Likert scale. Benchmark example: 70% of respondents will rate satisfaction at 3 or higher on the Likert scale.
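A short sketch of checking that benchmark against a batch of Likert responses; the response list is invented sample data.

```python
# One Likert rating (1-5) per returned survey -- invented sample data.
responses = [5, 4, 3, 2, 4, 5, 3, 3, 1, 4]

satisfied = 100 * sum(r >= 3 for r in responses) / len(responses)
benchmark = 70  # e.g., 70% of respondents rating 3 or higher

status = "met" if satisfied >= benchmark else "missed"
print(f"{satisfied:.0f}% rated 3 or higher -> benchmark {status}")  # 80% -> met
```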

Survey Participation

Survey participation rate:

SP / NS = TP

• SP = Survey participation (those who actually filled out the survey)
• NS = Number surveyed (total number of surveys sent out)
• TP = Total participation, by program, by group
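The participation-rate formula as a one-function sketch, with invented return counts:

```python
def survey_participation(sp: int, ns: int) -> float:
    """SP / NS = TP: share of distributed surveys actually returned, as a percent."""
    return 100 * sp / ns

# Hypothetical graduate survey: 42 completed out of 60 sent.
print(f"TP = {survey_participation(sp=42, ns=60):.1f}%")  # 70.0%
```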

Survey Participation

For each group surveyed, programs must identify and describe the following:
• The rationale for the type of data collected
• How the data was collected
• Goals
• A summary and analysis of the survey results
• How data was used to improve the learning process

Survey Participation

The report table format should look like this:

Rationale for Data | Collection Procedures | Collection Goals | Summary/Analysis | Improvement Strategies

Survey Reporting EXAMPLE

Rationale for Data:
• Secure feedback from students on importance of, and satisfaction with, customer service and overall attitudes related to the institution's administration. Data used to reflect on what worked or didn't work.
• End-of-term student evaluations used as a composite of student views relating to course importance and satisfaction, and overall class attitudes about the classroom environment. Faculty use the data to determine effective/ineffective activities and compare this information with other classes.

Survey Reporting EXAMPLE

Collection Procedures:
• Student satisfaction surveys are collected semiannually

Survey Reporting EXAMPLE

Collection Goals (Benchmarks):
Using student satisfaction surveys (orientation through graduation), the benchmarks are:

Tutoring                  80%
Academic Advising         80%
Admissions Support        75%
Financial Aid             75%
Career Services           75%
Library                   80%
Spirited/Fun Environment  50%
Orientation Sessions      75%
Recognition               65%
Mission Statement         50%
Admin Accessibility       80%
Facility                  70%
Social Activities         50%

Plus a benchmark for the survey return percentage.

Survey Reporting EXAMPLE

Summary/Analysis:
• Feedback obtained from completed surveys is tallied for each category.

Survey Reporting EXAMPLE

Improvement Strategies:
• The data is collected, and benchmarks are set and analyzed for improvement strategies when measures fall below established baselines.
• Failure to achieve a baseline goal will be addressed at faculty and in-service meetings.

Student Surveys

Evaluations exhibit student views relating to:
• Course importance
• Satisfaction
• Administration
• Faculty
• Training (including externship)
• Attitudes about the classroom environment

Clinical Affiliate

• Students on externship should evaluate this experience just as they did the classroom
• Evaluations should reflect how well the students are trained to perform their required tasks
  • Include an assessment of strengths and weaknesses
  • Include proposed changes
• The sites should also evaluate the school representative's responsiveness and support
• See pages 20-21 for an example

Clinical Affiliate

Two parts:
• Student satisfaction with the clinical experience
  • Can be secured from student surveys
• Clinical affiliate satisfaction
  • Students' entry-level knowledge, from didactic and lab work, applied to clinicals
  • Responsiveness and support of the school representative

Graduate

• Standardized surveys have been developed by ABHES for graduates
• The items must be provided in the order presented
• The graduate survey is to be provided to graduates no sooner than 10 days following graduation

Graduate

Scale: 5 = Strongly Agree, 4 = Agree, 3 = Acceptable, 2 = Disagree, 1 = Strongly Disagree

1. I was informed of any credentialing required to work in the field.
2. The classroom/laboratory portions of the program adequately prepared me for my present position.
3. The clinical portion of the program adequately prepared me for my present position.
4. My instructors were knowledgeable in the subject matter and relayed this knowledge to the class clearly.
5. Upon completion of my classroom training, an externship site was available to me, if applicable.
6. I would recommend this program/institution to friends or family members.

Graduate

• The program may use the provided survey only, or may include additional items for internal assessment.
  • Only those items provided by ABHES for graduate satisfaction assessment are to be included in the PEP.
• Additional items may include:
  • Relevance and currency of curricula
  • Quality of advising
  • Administrative and placement services provided
• Information should be current, representative of the student population, and comprehensive.

Employer

• A major part of determining program effectiveness
• Reflects how well employees (graduates) are trained to perform their required tasks
  • Includes an assessment of strengths and weaknesses
  • Includes proposed changes

Employer

• The employer survey is to be provided to the employer no fewer than 30 days following employment.

Scale: 5 = Strongly Agree, 4 = Agree, 3 = Acceptable, 2 = Disagree, 1 = Strongly Disagree

Employer survey satisfaction items are as follows:
1. The employee demonstrates acceptable training in the area for which he/she is employed.
2. The employee has the skill level necessary for the job.
3. I would hire other graduates of this program. (Yes/No)

Faculty Professional Growth and In-Service Activities

• Evidence faculty participation in professional growth activities and in-service sessions
• Include the schedule, attendance roster, and topics discussed
• Show that sessions promote continuous evaluation of the:
  • Program of study
  • Training in instructional procedures
  • Review of other aspects of the educational programs
• Include the past two years, and professional activities outside the institution, for each faculty member

Reporting for All Categories

• How data was collected
• Baseline threshold
• How initial baseline rates were determined
• Analysis
• Goal for next year
• Activities that will be undertaken to meet the goals set for the next year

Goals, Analysis, and Improvement Strategies

Categories

Goals:
• Programs establish specific goals for benchmarks to measure improvement
• Goals can be set as an annual incremental increase or as a static goal (e.g., 85 percent for retention and placement)
• Annually monitor the activities conducted

Program Retention Rate: Establishing Goals

A program may elect to establish its goal as an increase of a given percentage each year (for example, 5%), or by determining the average year-to-year percent increase over the three previous years.

Example: Pharmacy Technician Program. Retention rates for the past three years, taken from the Annual Report:

2005-2006: 80%
2006-2007: 81%
2007-2008: 85%

The year-to-year increases are 1 point (80% to 81%) and 4 points (81% to 85%), so the average increase is (1 + 4) / 2 = 2.5%.
Projection for 2009: 85 + 2.5 = 87.5%
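The incremental-goal method reduces to averaging the year-over-year increases; here is a minimal sketch (function name mine), checked against the Pharmacy Technician figures above.

```python
def project_goal(rates: list[float]) -> float:
    """Next-year goal: latest rate plus the average year-over-year increase."""
    increases = [later - earlier for earlier, later in zip(rates, rates[1:])]
    return rates[-1] + sum(increases) / len(increases)

# Pharmacy Technician retention rates, 2005-2008.
print(project_goal([80, 81, 85]))  # (1 + 4) / 2 = 2.5 -> 85 + 2.5 = 87.5
```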

Program Retention Rate: Establishing Goals

• Some programs may address retention by assigning quarterly grade-distribution goals in the percentage of As and Bs for selected courses/classes.
• A quarterly intervention plan might be developed for those struggling students who are not achieving the higher scores.
• Such an intervention plan might enhance retention.

Credentialing Examination Participation Rate

EXAMPLE: Nursing Assistant Program

Participation rates for 2005, 2006, and 2007 were 75%, 77%, and 88%:

[2% (percent increase between 2005 and 2006) + 11% (percent increase between 2006 and 2007)] ÷ 2 = 6.5%

So the goal for the participation rate on the nursing assistant exam in 2008 would be 94.5% (88% + 6.5% = 94.5%).

• Use trends or averages

Credentialing Examination Pass Rate

EXAMPLE: Nursing Assistant Program

Pass rates for 2005, 2006, and 2007 were 80%, 75%, and 76%. Using the three-year average as the goal:

(80 + 75 + 76) / 3 = 77%







Reporting Table Format

The reporting table tracks, for each outcome category:
• Data Collection: types used for assessment, how collected, who is responsible, timetable for collection, parties responsible for collection
• Rationale for Use
• Summary/Analysis: problems/deficiencies, review dates, specific activities
• Improvement Strategies: strategy adjustment

Analysis

Summary and Analysis:
• Schools provide a summary and analysis of data collected and state how continuous improvement is made to enhance expected outcomes
• Provide an overview of the data collected
• Summarize the findings that indicate the program's strong and weak areas, with plans for improvements
• Use results to develop the basis for the next annual review
• Present new ideas for changes

Analysis

An example of how a program may evaluate the PEP is by completing the following activities:
• Measuring the degree to which educational goals have been achieved
• Conducting a comprehensive evaluation of the core indicators
• Summarizing the programmatic changes that have been developed
• Documenting changes in programmatic processes:
  • Revised goals
  • Planning documents
  • Program goals and activities

Improvement Strategies

• The outcome indicator reveals the deficiency
• Trends are monitored (3 years minimum; 5 is best)
• An action plan determines the improvement strategy
• The remedy is applied, and the strategy is monitored for success


Improvement Strategies

Strategies to Improve:
• Example of changes that can enhance a program:
  • PAE trends by class reveal dropping scores in A&P
  • Textbooks may need to be reevaluated
  • Instructors may need to be exchanged
  • Prerequisites may work better than a core course
• The strategy is outlined and monitored for success
• If it doesn't work, then another strategy is applied

Improvement Strategies

Examples of changes to a program that can enhance it:
• Retention: If the analysis of the data indicates that large numbers of students are dropping or failing a course when taught by a particular instructor, the instructor may need additional training, or a different instructor may need to be assigned to teach that course.

Improvement Strategies

Strategies to Improve:
• Examples of changes to a process that can enhance a program:
  • If a course requires a certain amount of outside laboratory or practice time, and an analysis of the students' actual laboratory or practice time demonstrates that the students are not completing the required hours, formally scheduling those hours or adding additional laboratory time may dramatically increase the effectiveness of that course.

Improvement Strategies

Strategies to Improve:
• Examples of changes to a process that can enhance a program:
  • If an analysis of the data demonstrates that a large number of students are failing a specific course, or are withdrawing in excessive numbers, the program may change the prerequisites for that course, or offer extra lab hours or tutoring, to see if the failure or withdrawal rates are positively affected.

Improvement Strategies

• Action plans take time to implement and monitor, and students can suffer in the interim
• Analysis of trends will eventually reveal consistent patterns
• Over time, proven remedies, even computer-generated algorithmic ones, can be developed for each negative outcomes trend and applied

Conclusion

• The results of a PEP are never final; it is a working document.
• An effective PEP is regularly reviewed by key personnel and used in evaluating the effectiveness of the program.







The Program Effectiveness Plan