THE PROGRAM EFFECTIVENESS PLAN
ACCREDITING BUREAU OF HEALTH EDUCATION SCHOOLS
Paul Price, MBA, ABHES Education Advisor

What Purposes Does the PEP Fulfill?
• Developing and using the PEP should fulfill several purposes:
– Defining criteria for measurement of goals
• Used to unify administrative and educational activities
– Assessing progress and the need for change
– Communicating to the public a process for quality improvement
– Demonstrating regulatory compliance for approving or accrediting organizations
• Such a document is a primary focus of most accrediting agencies
• Program success is based on student achievement in relation to its mission
WHAT KIND OF DATA IS COLLECTED?
• The PEP requires each program within an institution to look at its past, present, and future, and to continuously ask:
– Where have we been? • Establishes baselines
– Where are we now? • Compares with baselines for needed change
– Where do we want to go? • Setting goals
– How do we get there? • Process used to achieve new direction
Baselines (Thresholds)
• How are minimum thresholds set?
– Analyze previous data and average (see the sketch below)
– Set and tweak as results come in
– Survey constituents
• Should minimums be mandated, or should schools set them and regulators/accreditors approve?
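One way to put "analyze previous data and average" into practice is to seed the threshold from the mean of the last few reporting periods. Below is a minimal Python sketch under that assumption; the three-year window, the 0.02 margin, and the sample rates are illustrative, not ABHES requirements.

from statistics import mean

def baseline_threshold(prior_year_rates, margin=0.02):
    # Average the prior reporting periods, then back off a small margin
    # (assumed local policy) so the benchmark starts attainable and can
    # be tightened as new results come in.
    return round(mean(prior_year_rates) - margin, 3)

# Hypothetical retention rates for the last three reporting periods.
retention_history = [0.82, 0.78, 0.85]
print(baseline_threshold(retention_history))  # 0.797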
• Collect data on each of the educational outcomes areas and achievement of the occupational objectives for each of the programs offered by the institution
– Include data relevant to improving the program's overall effectiveness
– Clearly evidence the level of educational outcomes and satisfaction experienced by current students, graduates, and employers
DEVELOPING OBJECTIVES
• Program objectives are consistent with the field of study and the credential offered and include as an objective the comprehensive preparation of program graduates for work in the career field.
• Objectives of the program are to:
• At the completion of the program, the student will be able to:
Program Retention Rate
• The method of calculation, using the reporting period July 1 through June 30, is as follows:
(EE + G) / (BE + NS + RE) = R%
• EE = Ending enrollment (as of June 30 of the reporting period)
• G = Graduates
• BE = Beginning enrollment (as of July 1 of the new reporting period)
• NS = New starts
• RE = Re-entries
• R% = Retention percentage

Job Placement Rate in the Field
• The method of calculation, using the reporting period July 1 through June 30, is as follows:
(F + R) / (G - U) = P%
• F = Graduates placed in their field of training
• R = Graduates placed in a related field of training
• G = Total graduates
• U* = Graduates unavailable for placement
• P% = Placement percentage
• *Unavailable is defined only as documented: health-related issues, military obligations, incarceration, death, or continuing education status.
Program Assessment • PAEs pinpoint curricular deficiencies – Should be designed to incorporate all major elements of the curriculum
• A well‐designed PAE will point directly to that segment of the curriculum that needs remedy • May score with ranges, rather than pass/fail.
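As a hedged illustration of range scoring that points to the curriculum segment needing remedy, the sketch below assumes each PAE item is tagged with the curriculum area it tests (a local design choice, not something these slides prescribe) and averages scores per area so the weak segment surfaces first.

from collections import defaultdict

def scores_by_segment(item_results):
    # item_results: list of (curriculum segment, score) pairs, score 0-100.
    buckets = defaultdict(list)
    for segment, score in item_results:
        buckets[segment].append(score)
    return {seg: sum(vals) / len(vals) for seg, vals in buckets.items()}

# Hypothetical PAE results for one graduating class.
results = [("A&P", 62), ("A&P", 58), ("Medical Terminology", 88), ("Clinical Procedures", 91)]
for segment, avg in sorted(scores_by_segment(results).items(), key=lambda kv: kv[1]):
    print(f"{segment}: {avg:.1f}")  # A&P lists first as the weak area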
Credentialing Examination Participation Rate
• Standard: Participation of program graduates in credentialing or licensure examinations required for employment in the field in the geographic area(s) where graduates are likely to seek employment.
• The method of calculation, using ABHES' reporting period July 1 through June 30, is as follows:
Examination participation rate = T / G
• T = Total graduates taking the examination
• G = Total graduates eligible to sit for the examination
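A minimal Python sketch of the retention, placement, and examination participation formulas above, applied to hypothetical counts for a single July 1 through June 30 reporting period.

def retention_rate(ee, g, be, ns, re):
    # (EE + G) / (BE + NS + RE): ending enrollment plus graduates, over
    # beginning enrollment plus new starts plus re-entries.
    return (ee + g) / (be + ns + re)

def placement_rate(f, r, g, u):
    # (F + R) / (G - U): placed in field plus placed in a related field,
    # over total graduates minus graduates documented as unavailable.
    return (f + r) / (g - u)

def participation_rate(t, g):
    # T / G: graduates taking the credentialing exam over graduates eligible to sit.
    return t / g

# Hypothetical counts for one program.
print(f"Retention:     {retention_rate(ee=95, g=30, be=100, ns=40, re=5):.1%}")  # 86.2%
print(f"Placement:     {placement_rate(f=20, r=4, g=30, u=2):.1%}")              # 85.7%
print(f"Participation: {participation_rate(t=18, g=25):.1%}")                    # 72.0%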
Student, Clinical Extern Affiliate, Graduate, and Employer Satisfaction
• The purpose of the surveys is to collect data regarding perceptions of a program's strengths and weaknesses for:
– Student
– Extern clinical affiliate
– Graduate
– Employer
• For graduates and employers only, the survey is standardized
Student, Clinical Extern Affiliate, Graduate, and Employer Satisfaction
• Two parts for the PEP:
1. Participation – Percentage of returns compared to the threshold set
2. Satisfaction – Typically a Likert scale
– Benchmark example: 70% of respondents will rate satisfaction at 3 or higher on the Likert scale.
Survey Participation
• Survey participation rate: SP / NS = TP
• SP = Survey Participation (those who actually filled out the survey)
• NS = Number Surveyed (total number of surveys sent out)
• TP = Total Participation by program, by group
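A small sketch of the two survey measures just described: the participation rate (SP / NS = TP) and a benchmark check like the "70% of respondents rate 3 or higher" example. The tallies and ratings are hypothetical.

def survey_participation(sp, ns):
    # SP / NS = TP: completed surveys over surveys sent, per program, per group.
    return sp / ns

def satisfaction_benchmark_met(ratings, threshold=3, required_share=0.70):
    # True if at least `required_share` of respondents rate `threshold` or
    # higher on the Likert scale (values follow the slide's example).
    at_or_above = sum(1 for r in ratings if r >= threshold)
    return at_or_above / len(ratings) >= required_share

ratings = [5, 4, 3, 2, 4, 5, 3, 1, 4, 5]  # ten returned graduate surveys (made up)
print(f"Participation: {survey_participation(sp=10, ns=14):.1%}")  # 71.4%
print("Benchmark met:", satisfaction_benchmark_met(ratings))       # True (8 of 10)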
• Evaluations exhibit student views relating to: – Course importance – Satisfaction • Administration • Faculty • Training (including externship)
– Attitudes about the classroom environment
Graduate
• Standardized surveys have been developed by ABHES for graduates
• The items must be provided in the order presented.
• The graduate survey is to be provided to graduates no sooner than 10 days following graduation.
Survey Participation • For each group surveyed, programs must identify and describe the following: – The rationale for the type of data collected – How the data was collected – Goals – A summary and analysis of the survey results – How data was used to improve the learning process.
Clinical Affiliate
• Students on externship should evaluate this experience just as they did in the classroom
• Should reflect how well the students are trained to perform their required tasks
– Include assessment of the strengths and weaknesses
– Proposed changes
• The sites should also evaluate the school representative's responsiveness and support
Graduate
• Survey scale: 5 = Strongly Agree, 4 = Agree, 3 = Acceptable, 2 = Disagree, 1 = Strongly Disagree
1. I was informed of any credentialing required to work in the field.
2. The classroom/laboratory portions of the program adequately prepared me for my present position.
3. The clinical portion of the program adequately prepared me for my present position.
4. My instructors were knowledgeable in the subject matter and relayed this knowledge to the class clearly.
5. Upon completion of my classroom training, an externship site was available to me, if applicable.
6. I would recommend this program/institution to friends or family members.
Employer
• A major part of determining program effectiveness
• Reflects how well employees (graduates) are trained to perform their required tasks
– Includes an assessment of the strengths and weaknesses
– Proposed changes
Employer
• The employer survey is to be provided to the employer no fewer than 30 days following employment.
• Survey scale: 5 = Strongly Agree, 4 = Agree, 3 = Acceptable, 2 = Disagree, 1 = Strongly Disagree
• Employer survey satisfaction items are as follows:
1. The employee demonstrates acceptable training in the area for which he/she is employed.
2. The employee has the skill level necessary for the job.
3. I would hire other graduates of this program (Yes / No)
Goals
• Programs establish specific goals for benchmarks to measure improvement
• Goals can be set as an annual incremental increase or as a static goal (e.g., 85 percent for retention and placement).
• Annually monitor activities conducted

Summary and Analysis
• Schools provide a summary and analysis of data collected and state how continuous improvement is made to enhance expected outcomes.
• Provide an overview of the data collected
• Summarize the findings that indicate the program's strong and weak areas, with plans for improvements
• Use results to develop the basis for the next annual review
• Present new ideas for changes
Improvement Strategies • The outcome indicator reveals the deficiency • Trends are monitored – 3 Years minimum, 5 is best
• Action plan determines the improvement strategy • Remedy is applied and strategy is monitored for success
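To illustrate trend monitoring against a threshold over a three-to-five-year window, here is a hedged sketch; the data and the "two consecutive years below threshold" trigger are assumptions, not an ABHES rule.

def flag_deficiency(yearly_rates, threshold, consecutive=2):
    # Flag when the outcome falls below its threshold for the most recent
    # `consecutive` reporting periods.
    recent = yearly_rates[-consecutive:]
    return all(rate < threshold for rate in recent)

# Hypothetical five-year retention trend.
retention_trend = [0.86, 0.84, 0.81, 0.74, 0.72]
if flag_deficiency(retention_trend, threshold=0.80):
    print("Retention below threshold two years running: develop an action plan")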
Improvement Strategies
• Strategies to improve: an example of changes that can enhance a program:
– PAE trends by class reveal dropping scores in A&P
– Textbooks may need to be reevaluated
– Instructors may need to be exchanged
– Prerequisites may work better than a core course
• Strategy is outlined and monitored for success • If it doesn’t work, then another strategy is applied
"The Program Effectiveness Plan: An Outcomes Assessment Tool for Institutional Quality Improvement," a presentation from the 2010 DETC Fall...