
Community College of Beaver County Institutional Effectiveness Handbook

2015-2016



Institutional Effectiveness

Table of Contents

What is Institutional Effectiveness (IE)?
Institutional Effectiveness and Assessment
Institutional Effectiveness, Assessment, and Resource Allocation
Institutional Effectiveness and CCBC’s Vision, Mission, Values, and Goals
Institutional Effectiveness and Accreditation
Institutional Effectiveness at CCBC
FIGURE 1: Institutional Effectiveness Cycle and Process Narrative
Creating a Culture of Institutional Effectiveness
Review and Revision of the Institutional Effectiveness Handbook
Support for Institutional Effectiveness
Resources

Strategic Planning
Strategic Planning vs. Institutional Effectiveness
Strategic Planning at CCBC
FIGURE 2: Strategic Planning Cycle and Process Narrative
FIGURE 3: 2014-2019 Key Performance Indicators
Key Performance Indicators (KPI)
Resources


Student Learning Outcomes Assessment
What are Student Learning Outcomes (SLOs)?
Student Learning Outcomes Assessment at CCBC
The Role of Faculty in the Assessment of SLOs
FIGURE 4: SLO Assessment Cycle and Process Narrative
More Information on Collecting and Aggregating SLO Data
Resources



General Education Outcomes Assessment
What is General Education?
General Education at CCBC
Assessing General Education
FIGURE 5: General Education Outcomes and Mastery Matrices
FIGURE 6: General Education Assessment Cycle and Process Narrative
Resources

College Prep Outcomes Assessment
What is College Prep?
The College Prep Program at CCBC
Assessing College Prep
Resources

Service Department Outcomes Assessment
What are Service Departments?
FIGURE 7: Service Departments
Assessing Service Departments at CCBC
FIGURE 8: Service Department Outcomes Assessment Cycle and Process Narrative
Resources

List of Figures
Glossary of Assessment Terms


Appendices
A: Institutional Effectiveness Council Charter
B: Faculty Fellowship Description/Application
C: Institutional Effectiveness Council Application Form
D: SLO/SDO Guidelines and Rubrics for Creating Assessment Plans
E: Assessment Plan Template
F: SLO Data Collection Forms
G: TracDat Data Entry Screenshots
H: Example TracDat 4-Column Report
I: SLO/SDO Checklists for Data Discussions
J: General Education Student Learning Outcomes: Program and Assessment Overview
K: Using the Blackboard Outcomes Assessment Site
L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up
M: 5-Year Key Performance Indicators and KPI Report Template
N: Initiatives and Events Assessment Template

Works Cited


COMMUNITY COLLEGE OF BEAVER COUNTY

Institutional Effectiveness



Institutional Effectiveness

What is Institutional Effectiveness (IE)?
Institutional effectiveness (IE) is the measure of how well an institution is achieving its overall goals. It answers the question “how well are we collectively doing what we say we are doing?” (Characteristics 54). At CCBC, institutional effectiveness ensures the continuous improvement of academic and non-academic units through outcomes assessment.

Institutional Effectiveness and Assessment
To determine an institution’s level of effectiveness, assessment is necessary. Assessment answers the question “Are our efforts bringing forth the desired results?” (Addison 4). A successful assessment process facilitates institutional improvement and increases an institution’s overall effectiveness in both academic and non-academic areas.

Institutional Effectiveness, Assessment, and Resource Allocation
Resource allocation supports “the development and change necessary to improve and to maintain institutional quality” (Characteristics 4) and assessment is the ongoing process that ensures the “effective and efficient use of the institution’s resources” (Characteristics 9). At CCBC, resource allocation is an important component of all planning and assessment cycles: strategic, student learning, and service department.

Institutional Effectiveness and CCBC’s Vision, Mission, Values, and Goals
The vision, mission, and values of the college are the foundation for CCBC’s institutional goals, which are the fundamental criteria used to assess the college’s effectiveness.

Our Goal - Student Success: Create a learning community by supporting student success through educational programs provided in diverse and accessible formats.

Our Goal - Community and Economic Development: Partner with businesses, organizations, educational institutions, and governmental agencies to enhance economic opportunities for the region.

Our Goal - Organizational Development: Create a culture that expects openness, collaboration, and mutual respect and embraces innovation and professional development.

Our Goal - Resources: Develop and allocate resources which sustain the institution and encourage its growth and development.

To access CCBC’s complete Vision, Mission, Values, and Goals (VMVG) statement, visit http://www.ccbc.edu/VisionAndMission.


Institutional Effectiveness and Accreditation The ability to demonstrate an institution’s effectiveness is critically important to the Middle States Commission on Higher Education’s (MSCHE) accreditation process. Standards 1-7 of Characteristics of Excellence in Higher Education are framed within the context of institutional effectiveness and Standards 8-14 gauge institutional effectiveness in the academic arena. Standard 7: Institutional Assessment emphasizes MSCHE expectations regarding institutional effectiveness:

The institution has developed and implemented an assessment process that evaluates its overall effectiveness in achieving its mission and goals and its compliance with accreditation standards. (Characteristics 25)

Institutional Effectiveness at CCBC The Institutional Effectiveness process at CCBC is designed to permeate every facet of the college. CCBC’s current Institutional Effectiveness process includes three key elements: strategic planning, student learning outcomes assessment, and service-department outcomes assessment (see Figure 1).

FIGURE 1: INSTITUTIONAL EFFECTIVENESS PROCESS

College Level Strategic Planning and Assessment (C-Oval):
C-1 Set/Review Goals (Board, President, PEC, Supervisors) → C-2 Implement Goals (President, PEC, Supervisors) → C-3 Report Results (President, PEC, Supervisors) → C-4 Use of Results

Program and Department Level Planning and Assessment (P/D-Oval):
P/D-1 Set/Review Outcomes (Outcome Leads) → P/D-2 Implement Outcomes (Outcome Leads) → P/D-3 Report Results (Outcome Leads/Supervisors) → P/D-4 Use of Results


INSTITUTIONAL EFFECTIVENESS PROCESS NARRATIVE

IE PROCESS OVALS:
C-Oval (College Oval): The top oval, or C-Oval, represents college-level strategic planning and assessment.
P/D-Oval (Program/Department Oval): The bottom oval, or P/D-Oval, represents program and department level planning and outcomes assessment.

IE PROCESS: C-OVAL:
C-1 (Review and Set Goals): The college president will review and update his/her presidential goals following the establishment of the Board of Trustees’ annual priorities. The college’s Board of Trustees will review presidential goals in July. The college’s administrators will then review and update their strategic objectives to ensure they align with the president’s goals. The president will review administrators’ goals in September.
C-2 (Implement Goals): In September, action plans attached to all goals will be implemented for the current academic year.
C-3 (Report Results): In May/June, the president and administrators will report the results of their action plans via TracDat.

C-4 and P/D-4 (Use of Results): In June, college administrators will review the results, actions, and needs determined by their own action plans as well as assessment processes at the program and/or department levels to inform goal setting for the upcoming academic year. The cycle recommences (C-1).

IE PROCESS: P/D-OVAL:
P/D-1 (Review and Set Outcomes): Program and service department outcome leads develop, revise, and/or review the outcomes and measures associated with their program(s) or area(s) in August/September.
P/D-2 (Implement Outcomes): Student learning and service department outcomes assessment plans are implemented for the current academic year.
P/D-3 (Report Results): Outcomes assessment data should be gathered and reported via TracDat by May, with some programs reporting in August/September.
P/D-4 and C-4 (Use of Results): Outcome leads review the results, actions, and needs determined by their assessment plans with their immediate supervisor. Outcome leads are invited to review administrative action plans during a fall data summit. Both reviews should inform goal setting for the upcoming academic year. The cycle recommences (P/D-1).



Creating a Culture of Institutional Effectiveness
To ensure assessment at CCBC occurs continually and permeates all activities and areas of the college, initiatives and events at all levels of the institution and for all purposes are to be assessed. Assessment planning should occur in conjunction with event/initiative planning and be established before an event/initiative is rolled out to campus. The Initiatives and Events Assessment Template (see APPENDIX N), which can be accessed from the Outcomes Assessment course on Blackboard, provides a useful template for the assessment of such activities. Final templates should be sent to the Faculty Fellow to be entered into TracDat. For further guidance regarding the Initiatives and Events Assessment Template or TracDat data entry, contact the Faculty Fellow for Planning, Assessment, and Improvement.

Support for Institutional Effectiveness
To ensure CCBC’s institutional effectiveness process is sustainable, the college has identified and/or created the following groups and positions to facilitate planning, assessment, and improvement.

Institutional Effectiveness Council
The Institutional Effectiveness Council (IEC) is responsible for the oversight of CCBC’s planning and assessment activities (see APPENDIX A: Institutional Effectiveness Council Charter).

Executive Director of Institutional Research and Engagement

The Executive Director of Institutional Research and Engagement helps to inform decision making at CCBC through the collection, analysis, and reporting of data and through facilitating strategic planning.

Data Mining Coordinator

The college’s data mining coordinator is responsible for CCBC’s data warehouse and reporting environments. The coordinator supports institutional effectiveness measures involving key performance indicators (KPI), benchmarking, and mandatory reporting requirements.

Faculty Fellow for Planning, Assessment, and Improvement
The Faculty Fellow program recognizes the importance of faculty involvement in and knowledge of outcomes assessment, planning, and institutional effectiveness. The program creates an opportunity for faculty members, on a rotating two-year basis, to dedicate time to the college’s planning, assessment, and improvement activities. The overall purpose of the Faculty Fellow program is to advance and sustain institutional effectiveness at Community College of Beaver County (see APPENDIX B: Faculty Fellowship Description/Application).



Review and Revision of the Institutional Effectiveness Handbook To ensure CCBC’s Institutional Effectiveness Handbook remains useful and accurate, it is to be carefully reviewed and updated yearly by the Faculty Fellow for Planning, Assessment, and Improvement with assistance from the Institutional Effectiveness Council.

Resources
For further information regarding institutional effectiveness, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
5. Internet Resources for Higher Education Outcomes Assessment. Ephraim Schechter, Office of Institutional Research and Planning, NCSU. http://www2.acs.ncsu.edu/UPA/archives/assmt/resource.htm
6. Student Learning Assessment: Options and Resources. Middle States Commission on Higher Education. http://www.msche.org/publications/SLA_Book_0808080728085320.pdf
7. “Suggested Readings on Assessing Institutional Effectiveness.” Linda Suskie, Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography-Institutional-Effectiveness.pdf



COMMUNITY COLLEGE OF BEAVER COUNTY

Strategic Planning



STRATEGIC PLANNING

Strategic Planning vs. Institutional Effectiveness
Strategic planning focuses on the actions taken to implement the institutional mission and meet institutional goals, while institutional effectiveness focuses on the results of those actions to determine how well the institution’s mission and goals are being fulfilled (Addison 7).

Strategic Planning at CCBC
CCBC’s strategic plan establishes the college’s overall direction and serves as the foundation for annual goal planning at all levels of the institution. The college’s current strategic plan can be accessed from CCBC’s website: http://www.ccbc.edu/strategicplan. At CCBC, strategic planning functions according to an established cycle that emphasizes a cascading process of goal-setting and embedded assessment to ensure the college is focused on institutional priorities and that performance is measured and evaluated on the basis of concrete outcomes. FIGURE 2 illustrates CCBC’s strategic planning cycle. The narrative following the figure explains the cycle.



FIGURE 2: STRATEGIC PLANNING CYCLE
1. College-Wide Strategic Plan/KPI Developed (BoT, President, IEC)
2. KPI Report Prepared (IR/Data Miner; June)
3. KPI Report Reviewed and Budget Linked (BoT; July)
4. Presidential Goals Developed (President; July)
5. Strategic Objectives Developed (Senior Administrators; August)
6. Action Plans Developed (Department/School Supervisors; August/September)
7. Action Plan Assessments Collected/Annual Goals Report Prepared (IEC/PEC; May/June)


STRATEGIC PLANNING PROCESS NARRATIVE

STRATEGIC PLANNING PROCESS BOX 1: College-Wide Strategic Plan/KPI Developed
Development of the 5-Year Strategic Plan: The Board of Trustees (BoT) undertakes a comprehensive review of the college’s current strategic plan every five years to revise and update the college’s Vision, Mission, Values, and Goals as appropriate. In compliance with MSCHE Standard 1: Mission and Goals, this process includes “the involvement of the institution’s community” as well as the broadest segments of its internal and external constituencies. The exact form the 5-year review takes within these parameters is determined by the Institutional Effectiveness Council (IEC).
To prepare for and conduct the 5-year review, the IEC must 1) develop a year-long timeframe (running from January to January) for the review process, 2) determine the process that will be used to gather the information necessary to appropriately revise and update the college’s strategic plan (i.e., compression planning, process management, etc.), and 3) select individuals to facilitate the 5-year review process. Based on the information gathered during the five-year review process, the IEC composes a draft 5-Year Strategic Plan by October of the review year. The draft is then ready for review and approval by the president and Board of Trustees at their annual January retreat.
Development of College-Wide KPIs: Upon approval of the 5-Year Strategic Plan, the IEC will create a list of Key Performance Indicators (KPI) consistent with the institutional goals as stated in the plan. The president, in consultation with a select group of faculty, staff, and administrators, will review CCBC, statewide, and national trends to establish benchmarks for each KPI. Once these benchmarks have been set by the president, the Board of Trustees will review and approve them.

STRATEGIC PLANNING PROCESS BOX 2: KPI Report Prepared
In June of each year, the Executive Director of Institutional Research and Engagement and the Data Mining Coordinator will prepare a KPI Report by reviewing the college’s data for the previous year and establishing whether each key performance indicator was “achieved.” Past years’ trends for each KPI will also be included if they are available.

STRATEGIC PLANNING PROCESS BOX 3: KPI Report Reviewed and Budget Linked
The KPI Report will be submitted to the Board of Trustees (BoT) for review, discussion, and action in July. Using information from the report, the BoT will create a list of priorities directly aligned with the college’s institutional goals. If resources need to be allocated or reallocated to support and sustain the BoT’s priorities, they are then linked to the college’s budgetary process. Following the Board’s review of the KPI Report, it will be distributed to the campus community and made available as a public document.

STRATEGIC PLANNING PROCESS BOX 4: Presidential Goals Developed
Following the creation of the Board’s priorities, in July the president will create a list of presidential goals that are directly informed by the Board’s priorities. These goals will serve as presidential performance goals.

STRATEGIC PLANNING PROCESS BOX 5: Strategic Objectives Developed
After the president creates his/her goals, s/he will work with senior administrators to establish strategic objectives for their areas. All strategic objectives are to be directly informed by presidential goals. Each senior administrator’s strategic objectives will serve as the administrator’s performance goals.


STRATEGIC PLANNING PROCESS BOX 6: Action Plans Developed
After developing their own strategic objectives, senior administrators will work with appropriate department/school supervisors to establish action items for their areas. All action items are to be directly informed by the senior administrator’s strategic objectives. Action plans will be developed for each action item. All action plans will establish appropriate assessments, identify necessary budgetary needs, and address the timeframe and logistics associated with the action items. Each department/school supervisor’s action items will serve as his/her performance goals.

STRATEGIC PLANNING PROCESS BOX 7: Action Plan Assessments Collected/Annual Goals Report Prepared
In May/June, the PEC will submit action plan assessment results to TracDat and use the data to create an Annual Goals Report to be reviewed and acted on by the college’s administration, Board of Trustees, and IEC. The cycle recommences.



FIGURE 3: 2014-2019 KEY PERFORMANCE INDICATORS
Community College of Beaver County

Goal: Student Success
1. Successfully Complete Developmental Education
2. Successfully Complete Initial Gateway English Course
3. Successfully Complete Courses with a Grade of “C” or Better
4. Fall-to-Spring Retention Rates
5. Fall-to-Fall Retention Rates
6. Attain Credential
7. AAS/AS Graduate Employment and Continuing Education Rate
8. License Examination Pass Rate
9. Academic Challenge
10. Student Effort
11. Career Counseling Services Use
12. Career Goal Clarification
13. Support for Students

Goal: Community & Economic Development
14. Enrollment in Noncredit Workforce Development Courses
15. Contract Training Clients
16. Contract Training Student Headcount
17. Lifelong Learning Course Enrollments
18. College-Sponsored Community Events
19. Community Groups Using College Facilities
20. College Position in Community

Goal: Organizational Development
21. Cultural Awareness
22. Employee Perception of College Commitment to Diversity
23. Employee Job Satisfaction
24. Active and Collaborative Learning
25. Student-Faculty Interaction

Goal: Resources
26. Annual Unduplicated Headcount
27. FTE Enrollment
28. Annual Credit Count
29. Tuition and Fees
30. High School Graduate Enrollment Rate
31. Met Need of Pell Recipients
32. Met Need of Non-Pell Recipients
33. Student Perception of Institutional Financial Support
34. Support for Learners
35. Enrollment per Section
36. Teaching by Full-Time Faculty
37. Expenditures per FTE Student
38. Expenditure on Instruction and Academic Support
39. Employee Turnover Rate
40. College Support for Innovation



Key Performance Indicators (KPI)
CCBC utilizes 40 key performance indicators, or KPIs, to assess the college’s overall institutional effectiveness in relation to its institutional goals of Student Success, Organizational Development, Community and Economic Development, and Resources. Both the creation/revision of KPIs every five years and the annual production of KPI reports are part of the college’s strategic planning process, as illustrated in FIGURE 2. CCBC’s current key performance indicators are listed in FIGURE 3.

Resources
For further information regarding strategic planning, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
5. Strategic Planning: A Guide for Higher Education Institutions. Centre for Higher Education Transformation, Fred M. Hayward, Daniel J. Ncayiyana, and Jacqueline E. Johnson. http://chet.org.za/books/strategic-planning
6. Strategic Planning in Higher Education: A Guide for Leaders. Center for Organizational Development and Leadership, Rutgers. http://ddq.nu.edu.sa/files/35.pdf
7. “Suggested Readings on Assessing Institutional Effectiveness.” Linda Suskie, Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography-Institutional-Effectiveness.pdf





COMMUNITY COLLEGE OF BEAVER COUNTY

Student Learning Outcomes Assessment



STUDENT LEARNING OUTCOMES ASSESSMENT

What are Student Learning Outcomes (SLOs)?
The Middle States Commission on Higher Education (MSCHE) defines student learning outcomes, or SLOs, as “the knowledge, skills, and competencies that students are expected to exhibit upon successful completion of a course, academic program, co-curricular program, general education requirement, or other specific set of experiences” (Characteristics 63).

Student Learning Outcomes Assessment at CCBC
The use of SLO assessment results ensures institutional programs and resources at CCBC are organized and coordinated to achieve institutional and program-level goals, which contributes to the college’s overall level of institutional effectiveness.

The Role of Faculty in the Assessment of SLOs
Faculty play an indispensable role in the assessment of student learning outcomes. Faculty are responsible for the creation of clearly articulated, observable outcomes and resulting action plans, as well as the careful collection and aggregation of data associated with those plans. In addition, faculty are responsible for sharing assessment plan results with appropriate stakeholders and ensuring actions informed by assessment results are created and followed. Because of the central role faculty play in SLO assessment, a clear understanding of CCBC’s SLO assessment process is essential; the Faculty Fellow program was designed to help guide faculty through the SLO assessment process and should be consulted whenever clarification is needed regarding student learning outcomes assessment. FIGURE 4 illustrates CCBC’s student learning outcomes assessment cycle. The narrative following the figure explains the cycle.



FIGURE 4: SLO ASSESSMENT CYCLE
SLO Assessment Process: to be completed annually for ALL institutional programs, EXCEPT General Education and College Prep.

KEY: A = Activity; D = Document; R = Responsible Party; T = Timeline; SLO = Student Learning Outcome; OL = Outcome Lead

BOX A. A: SLOs Revised/Created; D: SLO/SDO Guidelines and Rubrics; R: Outcome Lead (OL); T: August/September
BOX B. A: SLOs Formally Reviewed; D: Assessment Plan Template; R: Outcome Lead/Faculty Fellow; T: August-September
BOX C. A: SLOs Communicated/Implemented; D: Data Collection Memo; R: Outcome Lead/Faculty Fellow; T: September-October
BOX D. A: Data Collected & Aggregated; D: Data Collection Form/Bb Collection Site/Other; R: Outcome Lead; T: Nov-Dec (fall)/April-May (spring)
BOX E. A: Data Reported/Actions Created/Budget Linked; D: TracDat; R: Outcome Lead; T: May
BOX F. A: Data Discussed/Budget Linked; D: TracDat 4-Column Report/SLO Checklist/Budgetary Forms; R: All Program OLs/Immediate Supervisor; T: April-May/August-September
BOX G. A: Follow-Up Documented; D: TracDat; R: Outcome Lead (OL); T: Throughout Assessment Cycle (Mid-Year/End-of-Year)


SLO ASSESSMENT PROCESS NARRATIVE

SLO ASSESSMENT PROCESS: BOX A
Activity (SLOs Revised/Created): Lead faculty members for individual program outcomes, or outcome leads (OLs), will receive an email reminding them to review, revise, and/or create new outcomes for the upcoming academic year as appropriate. NOTE: SLOs should only be modified at the five-year program review point unless substantial change has occurred in the program area necessitating the addition and/or revision of SLOs before the five-year program review.
Document (SLO/SDO Guidelines and Rubrics): Student learning outcomes created by outcome leads (OLs) should adhere to the guidelines for successful SLOs established in the SLO/SDO Guidelines and Rubrics document (see APPENDIX D), which is available from the Outcomes Assessment course on Blackboard.
Responsible Party (Outcome Lead): Outcome leads, or OLs, are faculty members responsible for reviewing/revising/creating specific program outcomes within programs. Generally, CCBC faculty are not assigned to assess entire programs. Rather, faculty or OLs are assigned to certain outcomes within a program or programs.
Timeline (August/September): SLOs will be reviewed, revised, and/or created no later than September to ensure appropriate outcomes are in place for the academic year.

SLO ASSESSMENT PROCESS: BOX B
Activity (SLOs Formally Reviewed): If an OL decides to revise and/or create new outcomes for the upcoming academic year, s/he must submit revised and/or new outcomes for formal review and approval to the Faculty Fellow.
Document (Assessment Plan Template): To submit outcomes for formal review, the OL must complete an assessment plan template, which is available from the Outcomes Assessment course on Blackboard (see also APPENDIX E).
Responsible Party (Outcome Lead/Faculty Fellow): The outcome lead is responsible for submitting appropriate documentation regarding revised and/or newly created student learning outcomes to the Faculty Fellow.
Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created outcomes must be submitted to the Faculty Fellow no later than the end of September.



SLO ASSESSMENT PROCESS: BOX C
Activity (SLOs Communicated/Implemented): If necessary, OLs should contact all employees from whom they will need to collect data to appropriately assess established outcomes.
Document (Data Collection Memo): APPENDIX F: SLO Data Collection Forms contains a standardized memo that may be adapted by faculty OLs to inform other faculty members (both full- and part-time) about the method/s, criteria, and means of collection that will be used to assess established outcomes.
Responsible Party (Outcome Lead/Faculty Fellow): Faculty OLs should complete the data collection memo, if necessary, and send it to appropriate full- and part-time faculty members. The Faculty Fellow can assist with this process when necessary.
Timeline (September-October): Faculty OLs should contact other faculty members as soon as possible and no later than October.

SLO ASSESSMENT PROCESS: BOX D
Activity (Data Collected & Aggregated): The OL should collect assessment information from appropriate courses according to his/her established assessment plan.
Document (Data Collection Form/Blackboard Collection Site/Other): The OL should aggregate all data in an electronic format so that it can later be uploaded to TracDat. The college-created data collection form may be used if an OL so chooses, or the OL may request that a Blackboard collection site be created and added to the Outcomes Assessment Site through Blackboard. Bb collection site requests should be directed to the Faculty Fellow for Planning, Assessment, and Improvement. However, any method of collection is acceptable as long as it is accurate, thorough, and electronic. The Data Collection Form can be found in the Outcomes Assessment course on Blackboard (see also APPENDIX F). The form also contains a memo to alert other instructors, if need be, about the data the OL will be collecting.
Responsible Party (Outcome Lead): The outcome lead is responsible for collecting and aggregating data appropriate to the assessment of his/her assigned student learning outcomes according to his/her established assessment plan. NOTE: Please review “More Information on Collecting and Aggregating SLO Data,” which follows below, for more information regarding how faculty may choose to collect assessment data.
Timeline (Nov-Dec (fall)/April-May (spring)): Assessment data should be collected during both the fall and spring semesters. During the fall semester, OLs should collect data no later than November/December; during the spring semester, data should be collected no later than April/May.



SLO ASSESSMENT PROCESS: BOX E
Activity (Data Reported/Actions Created/Budget Linked): After collecting appropriate assessment data using the data collection form or another electronic means, OLs should enter the data into TracDat. Actions and budgetary links should also be established at this time. NOTE: While data collection occurs during both the fall and spring semesters, TracDat data entry takes place during the spring semester only but accounts for data from the entire academic year.
Document (TracDat): During this phase of the assessment cycle, an email reminder will be sent to all OLs alerting them that assessment data should be entered into TracDat for each outcome they are responsible for assessing. NOTE: See APPENDIX G or the Blackboard Outcomes Assessment Site for a series of screenshots detailing how to enter assessment data into TracDat.
Responsible Party (Outcome Lead): The outcome lead is responsible for entering appropriate assessment data into TracDat, creating actions, and establishing budgetary links during the spring semester each academic year.
Timeline (May): All assessment data should be entered into TracDat by the end of May. It is recommended that assessments be completed as early as possible to allow time for data discussions with immediate supervisors before the end of the semester.

SLO ASSESSMENT PROCESS: BOX F
Activity (Data Discussed): Following the submission of assessment data, immediate supervisors should schedule data discussions with all outcome leads for each program in their area. During these discussions, established actions will be discussed and resource needs will be addressed.
Document (TracDat 4-Column Report/SLO Checklist/Budgetary Forms): Following the completion of data entry through TracDat, each program OL should create a TracDat 4-column report (see APPENDIX H). APPENDIX G explains how to create a 4-column report. A copy of the report should be provided to the immediate supervisor and used to discuss the program with the supervisor and other program OLs. In addition to the TracDat 4-column report, the SLO Checklist (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by OLs and the immediate supervisor during their data discussion. If necessary, the supervisor should also complete appropriate budgetary forms (New Initiative form, for example) to address specific program needs.
Responsible Party (All Program Outcome Leads/Immediate Supervisor): Each program OL is responsible for providing a TracDat 4-column report to his/her immediate supervisor and completing, in conjunction with the supervisor and other program OLs, the SLO Checklist. OLs should retain copies of the completed SLO Checklist and TracDat 4-column reports for all outcomes they are responsible for assessing. These documents should be referenced as follow-up is added to TracDat throughout the assessment cycle. The immediate supervisor is responsible for scheduling data discussions with all program OLs, retaining copies of SLO Checklists and TracDat 4-column reports for each CCBC program under his/her direction, and forwarding completed SLO Checklists to the Faculty Fellow. The supervisor is also responsible for moving forward any budgetary needs deemed necessary following data discussions.
Timeline (April-May/August-September): Ideally, data discussions should take place between April and May; however, in certain cases this may be impossible because of the timing of assessment measures (final paper, final exam, etc.). In such cases, program OLs should wait until the beginning of the fall semester (August/September) to meet as a group with the immediate supervisor.


SLO ASSESSMENT PROCESS: BOX G
Activity (Follow-Up Documented): This step of the SLO assessment process is necessary and required, but occurs periodically throughout the assessment process. At a minimum, follow-up should be entered into TracDat at the mid-year (December) and end-of-year (May) points.
Document (TracDat): At mid- and end-of-year points, the OL should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up).
Responsible Party (Outcome Lead): The OL is responsible for entering follow-up in relation to established actions.
Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the SLO assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.

More Information on Collecting and Aggregating SLO Data
As outcome leads, faculty are responsible for collecting and aggregating the data needed to accurately assess the student learning outcomes they have established according to the assessment plans they have likewise developed. Depending upon the student learning outcomes and assessment plan a faculty member or outcome lead (OL) has developed, it may be necessary to collect assessment data from other faculty members, both full- and part-time. If that is the case, faculty members may pursue one of two options: 1) identify and contact appropriate faculty for assessment information on their own, or 2) request a Blackboard collection site for the automated and anonymous collection and aggregation of appropriate assessment data. Requests for a Blackboard collection site should be submitted to the Faculty Fellow for Planning, Assessment, and Improvement.



Resources
For more information regarding student learning outcomes assessment, please consult the following resources.
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations. Middle States Commission on Higher Education. http://www.msche.org/publications/Assessment_Expectations051222081842.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. “Examples of Evidence of Student Learning.” Assessing Student Learning: A Commonsense Guide, Linda Suskie. http://www.msche.org/publications/examples-of-evidence-of-student-learning.pdf
5. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
6. National Institute for Learning Outcomes Assessment. http://www.learningoutcomeassessment.org/publications.html
7. “Regional Accreditation and Student Learning: Principles for Good Practice.” Council of Regional Accrediting Commissions. http://www.msche.org/publications/Regnlsl050208135331.pdf
8. “Seminal Readings on Assessing Student Learning.” Middle States Commission on Higher Education. http://www.msche.org/publications/Bibliography---seminal.pdf
9. Student Learning Assessment: Options and Resources. Middle States Commission on Higher Education. http://www.msche.org/publications/SLA_Book_0808080728085320.pdf



COMMUNITY COLLEGE OF BEAVER COUNTY General Education Learning Outcomes Assessment



GENERAL EDUCATION LEARNING OUTCOMES ASSESSMENT
What is General Education?
General education is an important component of undergraduate study that contributes to students’ “essential knowledge, cognitive abilities, understanding of values and ethics…and draws students into new areas of intellectual experience, expanding cultural and global awareness and sensitivity, and preparing them to make enlightened judgments outside as well as within their academic specialty” (Characteristics of Excellence 47).
General Education at CCBC
Community College of Beaver County has identified five general education competencies: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Cultural Literacy, and Technology Literacy. The college’s general education requirements are publicly defined in the academic catalog: http://www.ccbc.edu/CourseSchedulesAndSearch.

Assessing General Education
CCBC’s general education competencies are assessed at both the gen. ed. and major-specific levels through embedded course assignments referred to as General Education Competency (GEC) assignments, which are established in master syllabi. Mastery matrices attached to GEC assignments are used to directly assess established student learning outcomes associated with each of CCBC’s general education competencies, as illustrated in FIGURE 5: General Education Outcomes and Mastery Matrices. APPENDIX J provides further information about CCBC’s general education program, including how mastery matrices should be used and timelines for college-wide general education assessment. General education outcomes are assessed according to an established cycle as illustrated in FIGURE 6. The narrative following FIGURE 6 explains the General Education assessment cycle.



FIGURE 5
GENERAL EDUCATION OUTCOMES AND MASTERY MATRICES

REQUIREMENT 1: COMMUNICATION PROFICIENCY
OUTCOME #1: Demonstrate clear and skillful communication methods appropriate to different occasions, audiences, and purposes.
MASTERY MATRIX #1
Mastery: Student consistently demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Progressing: Student generally demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Low/No Mastery: Student does not consistently or generally demonstrate clear and skillful communication methods appropriate to occasion, audience, and purpose.

REQUIREMENT 2: INFORMATION LITERACY
OUTCOME #2: Access, evaluate, and appropriately utilize information from credible sources.
MASTERY MATRIX #2
Mastery: Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing: Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery: Student does not access, evaluate, and appropriately utilize information from credible sources.

REQUIREMENT 3: SCIENTIFIC AND QUANTITATIVE REASONING
OUTCOME #3: Select and apply appropriate problem-solving techniques to reach a conclusion (hypothesis, decision, interpretation, etc.).
MASTERY MATRIX #3
Mastery: Student consistently selects and applies appropriate problem-solving techniques to reach a conclusion.
Progressing: Student generally selects and applies appropriate problem-solving techniques to reach a conclusion.
Low/No Mastery: Student does not consistently or generally select and apply appropriate problem-solving techniques to reach a conclusion.

REQUIREMENT 4: CULTURAL LITERACY
OUTCOME #4: Demonstrate an understanding and appreciation of the broad diversity of the human experience.
MASTERY MATRIX #4
Mastery: Student consistently demonstrates an understanding and appreciation of the broad diversity of the human experience.
Progressing: Student generally demonstrates an understanding and appreciation of the broad diversity of the human experience.
Low/No Mastery: Student does not consistently or generally demonstrate an understanding and appreciation of the broad diversity of the human experience.

REQUIREMENT 5: TECHNOLOGY LITERACY
OUTCOME #5: Utilize appropriate technology to access, build, and share knowledge in an effective manner.
MASTERY MATRIX #5
Mastery: Student consistently utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Progressing: Student generally utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Low/No Mastery: Student does not consistently or generally utilize appropriate technology to access, build, and share knowledge in an effective manner.


FIGURE 6
GENERAL EDUCATION ASSESSMENT CYCLE & PROCESS NARRATIVE
General Education Assessment Process: To be completed annually for the General Education Program.

Box A. A: SLOs Revised/Created | D: SLO/SDO Guidelines and Rubric | R: Faculty Fellow | T: August/September
Box B. A: SLOs Formally Reviewed | D: Assessment Plan Template | R: Faculty Fellow/IEC | T: August-September
Box C. A: SLOs Communicated/Implemented | D: Data Collection Memo | R: Faculty Fellow | T: October (fall)/February (spring)
Box D. A: Data Collected & Aggregated | D: Data Collection Form/Bb | R: Faculty Fellow/Teaching Faculty | T: Nov-Dec (fall)/Apr-May (spring)
Box E. A: Data Reported/Actions Created/Budget Linked | D: TracDat | R: Faculty Fellow | T: May
Box F. A: Data Discussed | D: TracDat 4-Column Report/SLO Checklist/Budget Forms | R: Fac. Fellow/BAST Dean | T: Apr-May/Aug-Sept
Box G. A: Follow-up Documented | D: TracDat | R: Faculty Fellow | T: Mid-Year/End-of-Year

KEY: IEC = Institutional Effectiveness Council; SLO = Student Learning Outcome; A = Activity; D = Document; R = Responsible Party; T = Timeline


GENERAL EDUCATION OUTCOMES ASSESSMENT PROCESS NARRATIVE
GEN. ED. ASSESSMENT PROCESS: BOX A
Activity (SLOs Revised/Created): Student learning outcomes for CCBC’s General Education Program will be reviewed, revised, and/or created as appropriate each academic year. NOTE: Outcomes should only be revised at the five-year program review point, unless significant change in the program area necessitates an immediate revision.
Document (SLO/SDO Guidelines and Rubrics): Student learning outcomes for the General Education Program should adhere to the guidelines for successful SLOs as established in the SLO/SDO Guidelines and Rubrics documents, which are available from the Outcomes Assessment course on Blackboard (see also APPENDIX D).
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement serves as the Outcome Lead (OL) for the General Education Program.
Timeline (August/September): SLOs will be reviewed, revised, and/or created no later than September to ensure appropriate outcomes are in place for the upcoming academic year.

GEN. ED. ASSESSMENT PROCESS: BOX B
Activity (SLOs Formally Reviewed): If the Faculty Fellow decides to revise and/or create new outcomes for the upcoming academic year, the outcomes must be formally reviewed and approved by the Institutional Effectiveness Council (IEC).
Document (Assessment Plan Template): To submit outcomes for formal review to the IEC, the Faculty Fellow must complete the Assessment Plan Template, which is available from the Outcomes Assessment course on Blackboard (see also APPENDIX E).

Responsible Party (Faculty Fellow): The Faculty Fellow will be responsible for submitting appropriate documentation regarding revised and/or newly created student learning outcomes to the IEC for approval.
Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created outcomes must be submitted to the IEC no later than the Council’s last meeting in September.

GEN. ED. ASSESSMENT PROCESS: BOX C
Activity (SLOs Communicated/Implemented): Following formal approval by the IEC, in cases of revised and/or newly created SLOs, or following review of SLOs with no changes by the Faculty Fellow, the Fellow should formally contact all instructors teaching courses from which assessment data will be collected using the Data Collection Memo, which can be accessed from the Outcomes Assessment course on Blackboard (see also APPENDIX F: SLO Data Collection Forms).
Document (Data Collection Memo): APPENDIX F: SLO Data Collection Forms includes a standardized memo that may be used by the Faculty Fellow to inform other faculty (both full- and part-time) about the method/s, criteria, and means of collection that will be used to assess the student learning outcomes attached to CCBC’s General Education Program. If the data collection memo is not used, the Faculty Fellow must develop his/her own method of communicating assessment logistics to other faculty members teaching within the General Education Program.
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement should prepare and send data collection emails to appropriate full- and part-time faculty members.



Timeline (October (fall)/February (spring)): Since assessment data should be collected during both fall and spring semesters, data collection memos should be sent near the middle of those semesters, or October and February, respectively. NOTE: Data gathered during the academic year is only required to be entered into TracDat during the spring semester.

GEN. ED. ASSESSMENT PROCESS: BOX D
Activity (Data Collected & Aggregated): Those faculty, both full- and part-time, teaching within CCBC’s General Education Program and previously contacted by the Faculty Fellow should upload appropriate assessment data to the Outcomes Assessment Site on Blackboard.
Document (Data Collection Form/Blackboard Collection Site/Other): All general education data is to be collected through the Outcomes Assessment Site on Blackboard; however, the Faculty Fellow may choose to collect data using alternative means. The college-created data collection form may be used if the Fellow so chooses. Any method of collection is acceptable as long as it is accurate, thorough, and electronic. The Data Collection Form can be found in the Outcomes Assessment course on Blackboard.
Responsible Party (Faculty Fellow/Teaching Faculty): The Faculty Fellow is responsible for ensuring an appropriate data collection method is put into place. All faculty teaching within CCBC’s General Education Program will be responsible for uploading appropriate assessment data to the Outcomes Assessment Site on Blackboard or reporting data as requested.
Timeline (Nov-Dec (fall)/April-May (spring)): Assessment data should be collected during both the fall and spring semesters. During the fall semester, data should be uploaded to the Outcomes Assessment Site from November-December. During the spring semester, data should be uploaded from April-May.


GEN. ED. ASSESSMENT PROCESS: BOX E
Activity (Data Reported/Actions Created/Budget Linked): After receiving assessment data from all course sections via the Outcomes Assessment course on Blackboard, the Faculty Fellow should enter the data into TracDat.
Document (TracDat): During this phase of the assessment cycle, the Faculty Fellow will enter data in TracDat, create actions, and establish budgetary links.
Responsible Party (Faculty Fellow): The Faculty Fellow for Planning, Assessment, and Improvement is responsible for entering assessment data into TracDat and creating actions and budgetary links.
Timeline (May): All assessment data should be entered into TracDat by the end of May. It is recommended that data be entered as early as possible to allow ample time for data discussions with the immediate supervisor.

GEN. ED. ASSESSMENT PROCESS: BOX F
Activity (Data Discussed): The Faculty Fellow should meet and discuss outcomes assessment data and any budgetary needs with the Dean of Business, Arts, Sciences, and Technologies (BAST).
Document (TracDat 4-Column Report/SLO Checklist/Budgetary Forms): Following TracDat data entry, the Faculty Fellow should create a TracDat 4-column report (see APPENDIX H). APPENDIX G explains how to create a 4-column report. A copy of the report should be given to the BAST Dean. The SLO Checklist (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by the Faculty Fellow and the BAST Dean during their data discussion. If necessary, appropriate budgetary forms (New Initiative form, for example) should also be completed.


Responsible Party (Faculty Fellow/BAST Dean): The Faculty Fellow is responsible for providing a TracDat 4-column report to the BAST Dean and completing, in conjunction with the Dean, the SLO Checklist. The Faculty Fellow should retain copies of the completed SLO Checklist and TracDat 4-column report for all general education outcomes. These documents should be referenced as follow-up is added to TracDat throughout the assessment cycle. The Dean is responsible for scheduling a data discussion with the Faculty Fellow and retaining copies of SLO Checklists and TracDat 4-column reports for CCBC’s General Education Program. The Dean is also responsible for moving forward any budgetary needs deemed necessary following data discussions.
Timeline (April-May/August-September): Ideally, data discussions should take place between April and May; however, in certain cases this may be impossible because of the timing of assessment measures (final paper, final exam, etc.). In such cases, the Faculty Fellow and Dean should wait to meet until the beginning of the fall semester (August/September).

GEN. ED. ASSESSMENT PROCESS: BOX G
Activity (Follow-Up Documented): This step of the general education assessment process is necessary and required, but occurs periodically throughout the assessment cycle. At a minimum, follow-up should be entered into TracDat at the mid-year and end-of-year points.
Document (TracDat): At mid- and end-of-year points, the Faculty Fellow should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up).
Responsible Party (Faculty Fellow): The Faculty Fellow is responsible for entering follow-up in relation to established actions.


Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.


Resources
For more information regarding the assessment of general education, please consult the following resources:
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
3. Developing Research and Communication Skills: Guidelines for Information Literacy in the Curriculum. Middle States Commission on Higher Education. https://www.msche.org/publications/DevelopingSkills080111151714.pdf
4. “General Education Competency Grid.” Middle States Commission on Higher Education. http://www.msche.org/publications_view.



COMMUNITY COLLEGE OF BEAVER COUNTY College Prep Learning Outcomes Assessment
What is College Prep?
Traditionally called “developmental” or “remedial” courses, CCBC’s college prep courses prepare students for college-level study. However, not all college prep courses at CCBC are remedial in nature; some are instead required to better acclimate first-time students to the post-secondary environment.
The College Prep Program at CCBC
Community College of Beaver County identifies four areas of study often deemed necessary to prepare students for college-level work: college success, reading, writing, and math. To better serve students taking college prep courses at CCBC, in 2015 the institution began assessing all college prep areas as one program, much like the assessment of its General Education program.
Assessing College Prep
The College Prep program is assessed according to the Student Learning Outcomes Assessment Cycle (see FIGURE 4).



Resources
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Assessing Developmental Assessment in Community Colleges: A Review of the Literature. Katherine L. Hughes and Judith Scott-Clayton. Community College Research Center, Teachers College: Columbia University. http://www.postsecondaryresearch.org/conference/PDF/NCPR_Panel%202_HughesClaytonPaper.pdf
3. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
4. Connecting Curriculum, Assessment, and Treatment in Developmental Education. Community College Virtual Symposium, U.S. Department of Education: Office of Vocational and Adult Education. https://www2.ed.gov/about/offices/list/ovae/pi/cclo/brief-3-reinventing-dev-ed.pdf
5. “Culture of Evidence and Inquiry.” Achieving the Dream. http://achievingthedream.org/focus-areas/culture-of-evidence-inquiry
6. National Center for Developmental Education. Appalachian State University. http://ncde.appstate.edu/publications/what-works-research-based-best-practices-developmentaleducation
7. NADE: National Association for Developmental Education. http://www.nade.net/index.html





COMMUNITY COLLEGE OF BEAVER COUNTY Service Department Outcomes Assessment



Service Department Outcomes Assessment
What are Service Departments?
CCBC defines service departments as any non-academic area of the college that provides services to the institution’s employees and/or students. Currently, CCBC recognizes 15 such departments.

FIGURE 7
SERVICE DEPARTMENTS, COMMUNITY COLLEGE OF BEAVER COUNTY

STUDENT SERVICES & ENROLLMENT: Enrollment Management; Counseling; Career Center; Library; Tutorial/Learning Lab; Supportive Services; Student Activities & Athletics; Financial Aid
FINANCE & OPERATIONS: Finance; Physical Plant
COMMUNITY RELATIONSHIPS & DEVELOPMENT: Community Relationships & Development
HUMAN RESOURCES: Human Resources
INFORMATION TECHNOLOGIES: Information Technologies
WORKFORCE DEVELOPMENT & CONTINUING EDUCATION: Workforce; Continuing Education

FIGURE 8
SDO ASSESSMENT CYCLE AND PROCESS NARRATIVE
COMMUNITY COLLEGE OF BEAVER COUNTY
SDO assessments must be completed annually for all service departments.

Box A. A: SDOs Revised/Created | D: SLO/SDO Guidelines & Rubric | R: Outcome Lead (OL) | T: August/September
Box B. A: SDOs Formally Reviewed | D: Assessment Plan Template | R: Outcome Lead (OL)/Faculty Fellow | T: August/September
Box C. A: SDOs Communicated/Implemented | D: Data Collection Memo | R: Outcome Lead (OL) | T: September-October
Box D. A: Data Collected/Aggregated | D: Data Collection Form | R: Outcome Lead (OL) | T: March-April
Box E. A: Data Reported/Actions Created/Budget Linked | D: TracDat | R: Outcome Lead (OL) | T: May
Box F. A: Data Discussed | D: TracDat 4-Column Report/SDO Checklist/Budgetary Forms | R: Outcome Lead/Supervisor | T: May
Box G. A: Follow-Up Documented | D: TracDat | R: Outcome Lead (OL) | T: Mid-year/end-of-year

KEY: A = Activity; D = Document; R = Responsible Party; T = Timeline

Assessing Service Departments at CCBC
Service departments at CCBC have established service department outcomes (SDOs) that are assessed annually according to an established cycle. Like student learning outcomes or SLOs, SDOs are tracked using employee-created assessment methods and are assessed against employee-generated benchmarks. All SDO assessment data is stored in TracDat. The SDO assessment cycle is illustrated in FIGURE 8: SDO Assessment Cycle and explained in the accompanying narrative.



SDO ASSESSMENT PROCESS NARRATIVE
SDO ASSESSMENT PROCESS: BOX A
Activity (SDOs Revised/Created): The lead staff member for each department outcome, or outcome lead (OL), will receive an email prompting him/her to review, revise, and/or create outcomes for the upcoming academic year as appropriate. NOTE: SDOs should only be modified at the five-year point unless substantial changes have occurred in the area necessitating the addition and/or revision of SDOs before the five-year mark.
Document (SLO/SDO Guidelines and Rubrics): Service department outcomes created and/or revised by OLs should adhere to the guidelines for successful SDOs established in the SLO/SDO Guidelines and Rubrics document, which is available from the Outcomes Assessment course on Blackboard (see also APPENDIX D).
Responsible Party (Outcome Lead): OLs are employees responsible for specific department outcomes.
Timeline (August/September): SDOs should be reviewed, revised, and/or created no later than the end of September to ensure appropriate outcomes are in place for the academic year.

SDO ASSESSMENT PROCESS: BOX B
Activity (SDOs Formally Reviewed): If the lead staff member decides to revise and/or create new outcomes for the upcoming academic year, the outcomes must be formally reviewed and approved by the Faculty Fellow.
Document (Assessment Plan Template): To submit outcomes for formal review, the lead staff member must complete the Assessment Plan Template, which is available from the Outcomes Assessment course on Blackboard.
Responsible Party (Outcome Lead/Faculty Fellow): The lead staff member will be responsible for submitting appropriate documentation regarding revised and/or newly created service department outcomes to the Faculty Fellow.
Timeline (August-September): To be approved and active for the current academic year, revised and/or newly created SDOs must be submitted to the Faculty Fellow no later than the end of September.

SDO ASSESSMENT PROCESS: BOX C
Activity (SDOs Communicated/Implemented): If necessary, OLs should contact all employees from whom they will need to collect data to appropriately assess established outcomes.
Document (Data Collection Memo): APPENDIX F: SLO Data Collection Forms contains a standardized memo that may be adapted by staff OLs to inform other staff members (both full- and part-time) about the method/s, criteria, and means of collection that will be used to assess established outcomes.
Responsible Party (Outcome Lead): Staff OLs should complete the data collection memo, if necessary, and send it to appropriate full- and part-time staff members.
Timeline (September-October): OLs should contact other staff members no later than October.


SDO ASSESSMENT PROCESS: BOX D
Activity (Data Collected & Aggregated): Staff OLs should collect and aggregate appropriate assessment data.
Document (Data Collection Form): The data collection form (see APPENDIX F), or another electronic, staff-created form, should be used to capture assessment data.
Responsible Party (Outcome Lead): The OL should ensure appropriate data is collected.
Timeline (March-April): All assessment data should be collected no later than April.

SDO ASSESSMENT PROCESS: BOX E
Activity (Data Reported/Actions Created/Budget Linked): After the appropriate assessment data has been received, OLs should enter it into TracDat. Actions should be created and budgetary links established.
Document (TracDat): During this phase of the assessment cycle, staff OLs will receive an email reminder regarding TracDat data entry.
Responsible Party (Outcome Lead): The OL is responsible for entering data into TracDat, creating actions, and establishing budgetary links.
Timeline (April-May): All assessment data should be aggregated and entered into TracDat by the end of May.

SDO ASSESSMENT PROCESS: BOX F
Activity (Data Discussed): The OLs for each service department should meet as a group with their immediate supervisor to discuss actions and needs based on SDO assessment results, and to determine and gain approval for new/updated actions and/or budgetary requests.
Document (TracDat 4-Column Report/SDO Checklist/Budgetary Forms): Following the completion of data entry through TracDat, a TracDat 4-column report (see APPENDIX H) should be created and provided to the supervisor by each staff OL. APPENDIX G explains how a 4-column report is created in TracDat. The 4-column report should be used to discuss the program with the supervisor and other staff OLs. In addition to the TracDat report, the SLO/SDO Checklist for Data Discussions (see APPENDIX I), which can be accessed from the Outcomes Assessment course on Blackboard, should be discussed and completed by OLs and the immediate supervisor during their data discussion. If necessary, the supervisor should also complete appropriate budgetary forms (New Initiative form, for example) to address specific department needs.
Responsible Party (Outcome Lead/Immediate Supervisor): Each program OL is responsible for providing a TracDat 4-column report to his/her immediate supervisor and completing, in conjunction with the supervisor and other program OLs, the Data Discussion Checklist. OLs should retain copies of the completed SDO Checklist and TracDat 4-column reports for all outcomes for reference and to ensure a more streamlined and effective SDO assessment process as follow-up is added to TracDat throughout the assessment cycle. The immediate supervisor is responsible for scheduling data discussions with all department OLs, retaining copies of SDO Checklists and TracDat 4-column reports for each CCBC department under his/her direction, and forwarding completed SDO Checklists to the Faculty Fellow. The supervisor is also responsible for moving forward any budgetary needs deemed necessary following data discussions.
Timeline (May): Data discussions should take place by the end of May.

SDO ASSESSMENT PROCESS: BOX G
Activity (Follow-Up Documented): This step of the SDO assessment process is necessary and required, but occurs periodically throughout the assessment cycle. At a minimum, follow-up should be entered into TracDat at the mid-year and end-of-year points.
Document (TracDat): At mid- and end-of-year points, OLs should access TracDat to document follow-up regarding established actions (see APPENDIX G: TracDat Data Entry Screenshots and APPENDIX L: SLO/SDO Guidelines for Reporting Results, Actions, and Follow-Up).
Responsible Party (Outcome Lead): The OL is responsible for entering follow-up in relation to established actions.
Timeline (Mid-Year/End-of-Year): Follow-up information is a required part of the assessment process and should be documented as appropriate or, at minimum, at the mid-year (December) and end-of-year (May) points.



Resources
For further information regarding the assessment of service departments, please consult the following resources:
1. A Road Map for Improvement of Student Learning and Support Services Through Assessment. James O. Nichols and Karen W. Nichols. CCBC Library Resource Center, Institutional Effectiveness Collection.
2. Administrative Assessment: A Guide to Planning, Implementing, and Reporting Division-Wide Evaluation and Assessment. Office of Institutional Effectiveness, Marymount University. http://www.marymount.edu/Media/Website%20Resources/documents/offices/ie/AdminAssessHandbook.pdf
3. “Assessment and Evaluation.” NASPA: Student Affairs Administrators in Higher Education. https://www.naspa.org/focus-areas/assessment-and-evaluation
4. CAS Professional Standards for Higher Education. Council for the Advancement of Standards in Higher Education. http://standards.cas.edu/getpdf.cfm?PDF=E868395C-F784-2293-129ED7842334B22A
5. Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. Middle States Commission on Higher Education. https://www.msche.org/publications/CHX-2011-WEB.pdf
6. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. Linda Suskie. CCBC Library Resource Center, Institutional Effectiveness Collection.
7. Internet Resources for Higher Education Outcomes Assessment. Office of Institutional Research and Planning, North Carolina State University. http://www2.acs.ncsu.edu/UPA/archives/assmt/resource.htm#hbooks



LIST OF FIGURES
FIGURE 1: Institutional Effectiveness Cycle and Process Narrative
FIGURE 2: Strategic Planning Cycle and Process Narrative
FIGURE 3: 2014-2019 Key Performance Indicators
FIGURE 4: SLO Assessment Cycle and Process Narrative
FIGURE 5: General Education Outcomes and Mastery Matrices
FIGURE 6: General Education Assessment Cycle and Process Narrative
FIGURE 7: Service Departments
FIGURE 8: Service Department Outcomes Assessment Cycle and Process Narrative



GLOSSARY OF ASSESSMENT TERMS




A

ACTION ITEMS Strategic actions developed by immediate supervisors in concert with senior administrators that are informed by the senior administrator’s strategic objectives. Action items serve as each immediate supervisor’s performance goals.

ASSESSMENT PLAN The plan created by all outcome leads (faculty and staff) that establishes outcomes, assessment methods, and benchmarks for specific programs/departments.

ASSESSMENT The continuous and cyclical process through which ongoing institutional improvement—at all levels—is achieved.

ASSESSMENT METHOD The measure attached to a stated student learning or service department outcome that indicates how an outcome will be assessed (e.g., exams, essays, licensure exams, certifications, gap analyses, turn-around times). See also DIRECT and INDIRECT MEASURES.

B

BOARD PRIORITIES The set of goals CCBC’s Board of Trustees establishes each year in response to the college’s KPI and Annual Goals Report.

BUDGET FORMS Forms used to establish, discuss, and determine budgetary needs and allocation. Budget forms can be obtained from the office of the controller.



C

CRITERIA FOR SUCCESS The benchmark attached to a learning/department outcome.

D

DATA COLLECTION MEMO A form letter used by outcome leads (OLs) to communicate program/department outcomes. The data collection memo can be found in the Outcomes Assessment course on Blackboard (see also DATA COLLECTION FORM and APPENDIX F: SLO Data Collection Forms).

DATA DISCUSSIONS Discussions held between outcome leads (OLs) and immediate supervisors following the reporting of outcome data through TracDat.

DIRECT MEASURE A typically quantitative measure of outcome achievement (e.g., exams, rubrics, audits, financial ratios).

F

FACULTY FELLOW FOR PLANNING, ASSESSMENT, AND IMPROVEMENT Faculty member appointed by the President to serve a two-year term as lead faculty for planning, assessment, and improvement (see also APPENDIX B: Faculty Fellow Description/Application).

FOLLOW-UP Information entered into TracDat at the mid- and end-of-year points of the assessment cycle by outcome leads (OLs) regarding established actions.

G


GENERAL EDUCATION The knowledge, skills, and abilities that are of use in all programs and prepare students for immediate academic, personal, and professional endeavors as well as for a life of learning. CCBC recognizes five general education competencies: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy.


GENERAL EDUCATION COMPETENCY ASSIGNMENT Also known as GEC assignments (or portfolio assignments), General Education Competency assignments are common assignments established in the master syllabus for all CCBC courses that include a general education competency. GEC assignments are graded using a common rubric and assessed using a mastery matrix or matrices.

GENERAL EDUCATION COMPETENCY ASSIGNMENT RUBRIC The rubric included in the master syllabus for all courses that meet a general education requirement. The GEC rubric should be used to grade General Education Competency assignments.

GENERAL EDUCATION MASTERY MATRICES Rubrics used to measure student achievement of the general education outcomes attached to CCBC’s five general education requirements: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy. General education mastery matrices are included in the master syllabus for all courses that meet a general education requirement. They are not used to grade student assignments, but rather to assess students’ level of mastery in regards to specific general education requirements and outcomes (see also APPENDIX J: General Education Student Learning Outcomes: Program and Assessment Overview).

GENERAL EDUCATION OUTCOMES The learning outcomes attached to CCBC’s General Education program. There is one learning outcome for each general education requirement: Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy (see also APPENDIX J: General Education Student Learning Outcomes: Program and Assessment Overview).

GENERAL EDUCATION REQUIREMENTS The five requirements that define CCBC’s General Education program and define the knowledge, skills, and abilities the institution views as vital to all majors and a life of learning. CCBC’s general education requirements are Communication Proficiency, Information Literacy, Scientific and Quantitative Reasoning, Technology Literacy, and Cultural Literacy.

I

IEC Acronym for the Institutional Effectiveness Council.



INDIRECT MEASURE A typically qualitative measure that should be used to supplement direct measures associated with student learning or service department outcomes. Indirect measures may take such forms as surveys, focus group data, and interviews.

INSTITUTIONAL EFFECTIVENESS The measure of how well an institution is achieving its overall goals. It answers the question “[H]ow well are we collectively doing what we say we are doing?” (Characteristics of Excellence 54). At CCBC, institutional effectiveness ensures the continuous improvement of academic and non-academic units through outcomes assessment.

INSTITUTIONAL EFFECTIVENESS COUNCIL (IEC) The institutional council charged with oversight of the college’s planning, assessment, and improvement efforts (see also APPENDIX A: Institutional Effectiveness Council Charter).

INSTITUTIONAL EFFECTIVENESS PLAN The college’s plan for continuous improvement through planning and outcomes assessment as outlined in the Institutional Effectiveness Handbook.

INSTITUTIONAL GOALS The overarching goals established in the college’s strategic plan that identify the institution’s overall aim and aspirations.

K

KEY PERFORMANCE INDICATORS (KPI) Established measures designed to assess elements vital to the institution’s overall success, such as enrollment, retention, and graduation rates (see also APPENDIX M: 5-Year Key Performance Indicators and KPI Report Template).

KPI The acronym used to refer to the college’s key performance indicators.

O

OBJECTIVE The term typically used to refer to learning goals at the course level that are established in master syllabi. Objectives are course-specific and should not be confused with outcomes, which are program-specific.


OL The acronym used to refer to outcome leads.

OUTCOME The term typically used to refer to goals at the program/department level that are established and tracked by faculty and staff and reported in TracDat. Student learning outcomes are program specific and represent broader learning goals than course-level objectives.

OUTCOME LEAD (OL) The faculty or staff member assigned to a student learning or service department outcome.

P

PRESIDENTIAL GOALS Goals created by the college president that are informed by the Board’s Priorities and serve as the president’s performance goals.

R

RUBRIC A tool used to quantitatively and directly assess student learning. A rubric often takes the form of a matrix, with established criteria, scores, and descriptions in various rows and columns.

S

SERVICE DEPARTMENT OUTCOME An outcome developed by a non-academic service department to gauge quality of service.

SLO/SDO GUIDELINES AND RUBRICS Guidelines used by faculty and staff to revise and/or create outcomes for programs and departments, respectively.



STRATEGIC OBJECTIVES The objectives developed by senior administrators in concert with the president that are directly informed by presidential goals and serve as each senior administrator’s performance goals.

STRATEGIC PLAN The overarching, long-term plan that directs goal-setting and resource allocation at all levels of the institution.

STRATEGIC PLANNING The process utilized to create the institution’s strategic plan every five years as well as the annual processes used to sustain the strategic plan (see also FIGURE 2: Strategic Planning Cycle and Process Narrative).

STUDENT LEARNING OUTCOME (SLO) A program-level outcome developed by faculty members to gauge student learning.

T

TRACDAT CCBC’s electronic data management system.

TRACDAT 4-COLUMN REPORT A report generated using TracDat that organizes all current outcomes assessment data in a four-column matrix. The 4-column report is used during data discussions with immediate supervisors.



APPENDICES



Appendix A: INSTITUTIONAL EFFECTIVENESS COUNCIL



INSTITUTIONAL EFFECTIVENESS COUNCIL CHARTER

PURPOSE: The Institutional Effectiveness Council (IEC) is an advisory body that facilitates and guides planning and assessment at Community College of Beaver County. The IEC works with the college president to develop, monitor, and report on five-year and annual strategic plans as well as on other areas related to institutional effectiveness, focused on CCBC’s institutional goals and informed by the college’s Key Performance Indicators.

RESPONSIBILITIES AND FUNCTIONS: To fulfill its overall purpose, members of the Institutional Effectiveness Council (IEC) will uphold the following responsibilities and functions through service on at least one of the following subcommittees chaired by standing members/officers of the IEC:

STRATEGIC PLANNING (Chaired by the Executive Director of Institutional Research & Engagement)

Facilitate the college’s five-year strategic planning process. To fulfill this role, the IEC will convene a core group of faculty, staff, and administrators every five years to create a strategic planning timeline and design, which must be approved by the President and Board of Trustees. This core group will also be responsible for 1) surveying college constituents using an established method of analysis (Compression Planning, SWOT, etc.) to identify strategic assets and challenges of the college, 2) analyzing long-term trends related to the college’s Key Performance Indicators (KPI), 3) preparing a report utilizing survey/KPI data regarding the college’s opportunities, challenges, and trends to be shared with CCBC’s Board of Trustees, President, and campus community that includes recommendations for new initiatives and improvement strategies, and 4) based on report feedback, composing a final draft of the college’s five-year strategic plan and key performance indicators to be approved by the President and Board of Trustees.

Facilitate the development of the college’s annual strategic plan. To fulfill this role, the IEC will 1) review and adopt an annual strategic plan update, which will become effective following presidential approval and includes Board priorities, Presidential goals, senior administrators’ strategic objectives, and school/department action plans; 2) review the results of all assessment plans attached to the annual strategic plan, including the Accreditation & Assessment Subcommittee’s quality evaluations; 3) recommend additions/deletions of actions from the plan, and 4) prepare a yearly report for the Board of Trustees, President, and college community to reflect the institution’s progress toward achieving its strategic goals and to inform the annual strategic plan update for the upcoming year.

ACHIEVING THE DREAM (Chaired by the Achieving the Dream Team Chairs) 

Monitor progress on Achieving the Dream measures. To fulfill this role the IEC will establish and implement an assessment plan—in conjunction with Achieving the Dream (AtD) core and data team members—to monitor and drive the college’s progress with regard to each AtD measure (completion of developmental courses, completion of gateway courses, completion of courses attempted, persistence from term to term and year to year, and completion of credentials).

ACCREDITATION & ASSESSMENT (Chaired by Faculty Fellow for Planning, Assessment, and Improvement) 

Regularly assess and review institutional planning and assessment processes. To fulfill this role, the IEC will 1) establish and implement assessment plans for all planning and assessment processes (five-year strategic planning/KPI development, annual strategic planning, student learning/service department outcomes assessment, etc.), 2) use data gathered from said assessments to drive improvements regarding planning and assessment processes at the college, 3) oversee the annual review



and revision of CCBC’s Institutional Effectiveness Handbook and ancillary documents related to institutional effectiveness through the Office of Planning, Assessment, and Improvement, and 4) ensure the campus is aware of resulting changes to college assessment/planning processes and documents. 

Review master plans. To fulfill this role, the IEC will 1) biennially review master plans and attached assessment plans from the following areas: academic affairs, finance and facilities, information technology, community relations and marketing, academic support and enrollment management, workforce and continuing education, and planning, assessment, and improvement, and 2) recommend changes to the President’s Executive Council.

Monitor Middle States compliance efforts. To fulfill this role, the IEC will oversee all accreditation activities through an annual report prepared by the Office of Planning, Assessment, and Improvement. The report will include, but is not limited to, the following information: 1) a five-year timeline of required accreditation activities (self-study, periodic review report, etc.), 2) a table establishing responsibility for and progress regarding all Middle States’ recommendations and requirements per Commission findings and Middle States’ team reports, and 3) organizational efforts regarding accreditation activities such as the self-study process, periodic review report preparations, and team visits.

COUNCILS, COMMITTEES, & GROUPS (Co-chaired by IEC Chair and IEC Vice Chair) 

Ensure the efficacy of institutional councils, committees, and groups. To fulfill this role, the IEC will 1) require semiannual (fall/spring) reports in the form of a public forum from all currently functioning college councils, committees, and groups, which include but are not limited to Academic Council, Curriculum Committee, Faculty Development Committee, Promotion and Tenure Committee, and the Enrollment Management Group, and 2) request, as necessary, further information and/or action from these groups to address issues/areas of interest arising from these reports.

MEMBERSHIP: The Institutional Effectiveness Council (IEC) shall be comprised of at least 16 members chosen by the President through an application process:
• Three (3) 9-month faculty: One (1) from each academic school (Aviation; Business, Arts, Sciences, & Technology; and Nursing & Allied Health)
• Two (2) staff members.
• One (1) 12-month faculty member.
• One (1) part-time faculty member.
• One (1) academic dean.
• One (1) representative from the President’s Executive Council (PEC)
• One (1) representative from Student Services and Enrollment
• One (1) representative from Workforce and Continuing Education
• One (1) representative from Curriculum Committee
• One (1) representative from Academic Council
In addition, the following positions will have permanent membership on the IEC, but may not hold any office:
• Data Miner
• Executive Director of Institutional Research and Engagement
• Faculty Fellow for Planning, Assessment, and Improvement
Members of CCBC’s Achieving the Dream (AtD) Team will also serve on the Institutional Effectiveness Council. AtD membership will be determined through a separate application process as determined by the College President. There are no IEC term limits for AtD members.



OFFICERS: At the first meeting of each year, the following officers shall be chosen to serve for one (1) year:
• A chair whose responsibilities include the scheduling and convening of meetings, setting agendas, and establishing ad hoc committees as needed.
• A vice chair who will assist the chair and serve in that capacity in the absence of the chair.

STAFF SUPPORT: A staff support person will be appointed by the President to schedule meetings, compose agendas, keep minutes, and distribute materials to membership as needed.

MEETINGS: IEC membership will set a year-long meeting schedule at its first meeting of the academic year, but must meet a minimum of one (1) time during both the fall and spring semesters. Subcommittees must meet at least once per month according to the specific work of the committee. IEC Subcommittees are not required to meet in December, May, June, July, or August.

TERMS: Terms shall begin the first day of the academic year. All members are eligible to serve one (1) three (3) year term; members may reapply after a lapse of one year. Upon the convening of the first meeting, lots will be drawn to set an equal number of initial one (1), two (2), and three (3) year terms that will then convert to a standard three (3) year term upon expiration. In the case of a member’s resignation or removal, a replacement will be appointed by the President to serve the remainder of that term.

REMOVAL: If a member’s employee status changes such that he/she no longer represents his/her current group, that member will cease to be a member of the IEC. Any member who misses two (2) consecutive meetings without cause will be removed from the IEC.

RECORDS: The IEC staff support person will archive all Council documents, including minutes, on the Institutional Effectiveness web page (http://www.ccbc.edu/IE) or a successor site.

9.1.2015




APPENDIX B: FACULTY FELLOWSHIP DESCRIPTION/APPLICATION



FELLOWSHIP DESCRIPTION
Faculty Fellow for Planning, Assessment, and Improvement

GENERAL PURPOSE OF FELLOWSHIP
The Faculty Fellow for Planning, Assessment, and Improvement will advance and sustain institutional effectiveness at Community College of Beaver County. The Fellow reports directly to the President.

ESSENTIAL DUTIES AND RESPONSIBILITIES
The Faculty Fellow for Planning, Assessment, and Improvement will be expected to:
• Advance a culture of evidence at CCBC through planning, assessment, and improvement;
• Cultivate leadership to create, maintain, and support planning, assessment, and improvement at the program, department, and institutional levels;
• Support the planning, collection, dissemination, and use of assessment and assessment results for institutional improvement;
• Serve on the Institutional Effectiveness Council;
• Meet with the President to discuss successes, challenges, and goals;
• Attend workshops and training sessions designed to increase knowledge of institutional effectiveness.

KNOWLEDGE, SKILLS, AND ABILITIES
The Faculty Fellow for Planning, Assessment, and Improvement should possess the following knowledge, skills, and abilities:
• An understanding of CCBC’s culture and current planning, assessment, and improvement efforts;
• An ability to effectively work with faculty, staff, and administrators to advance the institution’s culture of evidence;
• A knowledge of planning, assessment, and improvement processes;
• An aptitude for the organization, composition, and presentation of information;
• A capacity to learn new skills essential to advancing planning, assessment, and improvement at CCBC;
• A familiarity with accreditation standards and expectations.

APPOINTMENT OF FACULTY FELLOW
The Faculty Fellow for Planning, Assessment, and Improvement is appointed by the college President and serves a two-year term. Every two years, the President will invite tenured, full-time faculty members to apply for the fellowship.
Applications will be reviewed by the President with input from the Director of Institutional Research, the current Faculty Fellow, the chair of the Institutional Effectiveness Council, and an additional faculty representative. The President will appoint the new fellow at least one semester before the current fellow’s term ends to ensure adequate training prior to the new fellow’s term of appointment. To ensure a culture of assessment is cultivated amongst CCBC faculty through the faculty fellow program, a faculty member may not serve consecutive terms as Fellow for Planning, Assessment, and Improvement unless there are no other qualified applicants. Previous fellows may reapply for appointment after at least one intervening year of service by another faculty member.

TERMS OF SERVICE
The Faculty Fellow for Planning, Assessment, and Improvement will be appointed at least one semester prior to assuming his/her role. During that semester, the incoming fellow will receive release time equivalent to 3 credit hours to train with the outgoing fellow. Once the faculty member assumes the role of Faculty Fellow for Planning, Assessment, and Improvement, he/she will receive full release from his/her required course load and will also be reimbursed for his/her standard overload during the academic year. Additionally, the fellow will be expected to hold summer hours equivalent to 12 hours per week. At least 50% of required summer hours must be held on campus. The fellow will be reimbursed for summer hours according to current overload rates equivalent to 12 credits. With appropriate approval and agreement, the fellow may teach in lieu of release and/or overload hours.



Faculty Fellow for Planning, Assessment, and Improvement Application Form

NAME:_______________________________________________________________________

TITLE:________________________________________________________________________

TENURED:

YES_____

NO_____

SUPERVISOR’S SIGNATURE:____________________________________________________
By signing this document, the immediate supervisor acknowledges the applicant is approved to be released from his/her contracted duties for the term of appointment.

APPLICATION DIRECTIONS:
PART I: Compose a letter of intent that addresses the following three areas:
• REASON FOR APPLYING: Please describe why you are interested in becoming CCBC’s Faculty Fellow for Planning, Assessment, and Improvement.
• PREVIOUS EXPERIENCE: Please describe prior work and/or educational experiences that have prepared you for appointment to the fellowship program.
• CONTRIBUTION TO THE COLLEGE: Please describe how you believe you will contribute to the ongoing work of the college in advancing a culture of planning, assessment, and overall improvement.
PART II: Submit a current curriculum vitae.
PART III: Submit two letters of recommendation from the following individuals:
• Immediate supervisor
• Faculty colleague
APPLICATION SUBMISSION: The completed application form and all other required materials should be submitted to the Office of the President by _____________ for appointment consideration. The Faculty Fellow for Planning, Assessment, and Improvement will be announced by ________________.
Thank you for your interest in advancing CCBC’s commitment to planning, assessment, and overall institutional improvement!



APPENDIX C: INSTITUTIONAL EFFECTIVENESS COUNCIL APPLICATION FORM



Institutional Effectiveness Council Application Form NAME:

TITLE:

CHECK ONE:

□ FULL-TIME FACULTY
□ ADJUNCT FACULTY
□ STAFF
□ ADMINISTRATOR/CONFIDENTIAL

REASON FOR APPLYING: Please describe your reason for applying to the Institutional Effectiveness Council in 2-4 sentences.

PREVIOUS EXPERIENCE: Please describe any previous work or educational experience in regards to institutional effectiveness, strategic planning, and/or assessment in 2-4 sentences.

CONTRIBUTION TO THE COUNCIL: Please describe how you believe you will contribute to the work of the Council in 2-4 sentences.

I have read and understand the IEC charter and if selected as a member will uphold the responsibilities and functions of the Council as well as my personal responsibilities as a Council member.

SIGNATURE:

DATE:



APPENDIX D: SLO/SDO GUIDELINES & RUBRICS FOR CREATING ASSESSMENT PLANS



SLO/SDO GUIDELINES AND RUBRICS
For Creating/Revising Student Learning and Service Department Outcomes Assessment Plans

PART ONE: CREATE OUTCOME

Characteristics of Successful Outcomes

1. Observable: Can you identify evidence that the outcome was or was not achieved?
2. Measurable: Can you measure the degree to which the outcome was met or not met?
3. Aligned: Is the outcome aligned with other institutional goal areas such as general education, the College’s strategic plan, and/or the vision, mission, values, and goals of the College?
4. Number: Do 3-5 outcomes clearly identify the most important foci of the program or department to be assessed?

Anatomy of Successful Outcomes

1. W/WB/WBAT Phrase: Outcomes generally begin with a “will” (W), “will be” (WB), or “will be able to” (WBAT) phrase. For instance, outcomes created for academic programs may begin with a phrase like “Students will be able to,” and outcomes for service departments may begin with a phrase like “Help desk requests will be…”.
2. Measurable, Action Verb: Choose an action verb that results in an overt behavior that can be measured. Using Bloom’s Taxonomy of Learning helps ensure you create outcomes that range from lower- to higher-order skills.

BLOOM’S LEVELS AND ACTION VERBS

1. Knowledge: arrange, define, describe, duplicate, label, list, memorize, name, order, recognize, reproduce
2. Comprehension: classify, describe, discuss, explain, express, identify, indicate, locate, recognize, report, restate, review, select, translate
3. Application: apply, choose, demonstrate, dramatize, employ, illustrate, interpret, operate, practice, schedule, sketch, solve, use
4. Analysis: analyze, appraise, calculate, categorize, compare, contrast, criticize, differentiate, discriminate, distinguish, examine, experiment, question, test
5. Synthesis: arrange, assemble, collect, compose, construct, create, design, develop, formulate, manage, organize, plan, prepare, propose
6. Evaluation: appraise, argue, assess, attach, choose, compare, defend, estimate, judge, predict, rate, score, select, support, value, evaluate


3. Stated Outcome: Limit the stated outcome to one area of content. Avoid “ands” and “ors.” Stated outcomes for student learning should reflect knowledge or skills gained by the student over the scope of the entire academic program, not just a specific course. Outcomes for service departments should reflect overarching goals aimed at overall area improvement.

Examples of Successful Student Learning and Service Department Outcomes

Student Learning Outcome Examples:
Students in the Social Work program will be able to identify the basic psychological concepts, theories, and developments related to human behavior.
Students in the Criminal Justice program will be able to employ effective written and verbal communication skills required for police operations, investigations, and public safety.
Students in the Early Childhood Education program will be able to create curriculum using a variety of instructional strategies which provide for meaningful and challenging learning experiences and alternative models and methodologies.

Service Department Outcome Examples:
Financial aid personnel will monitor the 3-year aggregate student loan default rate for CCBC students.
Financial aid personnel will improve financial aid notification time.
Help desk requests will be processed more quickly.
Enrollment Management staff will increase the number of admissions interviews for incoming freshmen.

PART TWO: IDENTIFY ASSESSMENT METHODS

Establish Measures

1.1 .

Direct Measures: At least one direct measure must be identified per outcome. Two or more is considered best practice.

Description of Direct Measures

Tangible, observable, direct examination of student learning, gauge of effectiveness of a service, typically quantitative

Example of Direct Measures Student Learning: Exams (standardized, certification, licensure, pre-/post), rubrics, field experience/supervisor instructor ratings*, portfolios*, presentations*, projects*, performances*, research papers*, analyses of discussion threads* *If scored with rubrics



Service Departments: Turn-around times, completion of service, productivity/service/ incident counts, compliance (government), gap analyses, financial ratios, audits, inspections, tracking (e.g. website hits), lack of service (e.g. computer server downtime)


2. Indirect Measures: Indirect measures should be used to supplement direct methods of assessing the established outcome.

Description of Indirect Measures
Opinions, feelings, satisfaction, attitudes, habits of mind, reports of learning, inputs (faculty qualifications/library holdings), outputs (graduation rates, employment placement rates); typically qualitative

Examples of Indirect Measures
Student Learning: Surveys (CCSSE, SENSE), interviews, focus groups, course evaluations, course grades, student "rates" (graduation, employment, transfer), honors, awards, scholarships
Service Departments: Satisfaction surveys, interviews, focus groups, opinions, feelings, attitudes

Characteristics of Successful Assessment Methods
1. Valid: Does the method measure the stated outcome? Does it hit the target?
2. Reliable: How consistent is the measure? Are you confident of the results? Are the results useful and reasonably accurate?
3. Embedded/Authentic: Is the measure already part of the course or "normal course of business"? Use authentic methods when possible instead of "add-ons."

Data Collection Logistics
College-wide processes for the collection of outcomes assessment data are established in the Institutional Effectiveness Handbook; however, embedded in those processes are areas that require individual faculty, staff, and administrators to plan the logistics of data collection, aggregation, and follow-up reporting for their assigned outcomes.

Periodically Review Established Assessment Methods
At a minimum, assessment methods should be reviewed at the five-year mark to determine if currently used methods are sufficient, need to be revised, or if new methods must be developed.

Examples of Appropriate Outcome Assessment Methods

Student Learning Outcome Example

PROGRAM: Culinary Arts

OUTCOME: Create and serve a variety of cuisines typically found in a food service facility in a team environment.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Technique and Teamwork)

OUTCOME: Identify and utilize proper food and beverage preparation and service practices to meet industry standards for safety, cleanliness, and sanitation.
ASSESSMENT METHODS: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Sanitation, Mise en Place, and Equipment); ServSafe Exam



Service Department Outcome Example

DEPARTMENT: Financial Aid

OUTCOME: Monitor 3-year aggregate student loan default rate.
ASSESSMENT METHOD: Federal 3-Year Cohort Default Rate

OUTCOME: Improve financial aid notification response time.
ASSESSMENT METHOD: Average number of days until the award letter is mailed for completed financial aid applications.

PART THREE: SET BENCHMARKS

Characteristics of Successful Benchmarks
1. Defines Criteria for Success: Benchmarks establish how good is good enough, define success, and determine rigor.
2. Justifiable: Benchmarks should be realistic and attainable, but also "a stretch."
3. Defensible: Benchmarks should be based on something (a pre-established measure, trends, etc.).
4. Specific: Benchmarks should clearly identify which outcome(s) will be measured and directly identify how the chosen assessment method will be used for benchmarking purposes.

Anatomy of a Benchmark
1. Benchmark: Statement establishing the successful attainment of a specific outcome. EXAMPLE: 80% of students will earn a 3 out of 5 or better for all rubric criteria.
2. Target: A percentage included with the benchmark to demonstrate a wider attainment of the benchmark. EXAMPLE: 80% of students will earn a 3 out of 5 or better for all rubric criteria (here, 80% is the target).

Types of Benchmarks
1. Standards-Based: Benchmarks based on established competencies/criteria (i.e., a rubric defining criteria for a writing assignment or an exam testing learned competencies).
2. Peer/Norm-Based: Benchmarks based on published instruments, national averages, or peer institutions (i.e., IPEDS, certification/licensure requirements, professional organizations).
3. Value-Added: Benchmarks that measure growth, change, or improvement (i.e., pre-/post-test).
4. Longitudinal: Benchmarks that measure change over time and/or compare groups at different points in time (i.e., measures of credit enrollment over a 2-year period, freshman scores vs. graduate scores).
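A benchmark-plus-target statement such as "80% of students will earn a 3 out of 5 or better" can be checked mechanically once rubric scores are collected. The Python sketch below is purely illustrative; the function name, cutoff, and sample scores are hypothetical and not part of CCBC's process.

```python
# Hypothetical benchmark check: "80% of students will earn a 3 out of 5
# or better" becomes a score cutoff (3) and a target proportion (0.80).
def benchmark_met(scores, cutoff=3, target=0.80):
    """Return True when the share of students scoring at or above
    `cutoff` meets or exceeds the `target` proportion."""
    if not scores:
        return False  # no data collected; treat as not met
    passing = sum(1 for s in scores if s >= cutoff)
    return passing / len(scores) >= target

# One rubric score per student; 8 of 10 students score 3 or better.
rubric_scores = [5, 4, 3, 2, 4, 5, 3, 3, 4, 2]
print(benchmark_met(rubric_scores))  # True (80% meets the 80% target)
```

The same helper could serve a stricter benchmark, for example `cutoff=4, target=0.85`, without changing the logic.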



Examples of Successful Benchmarks

Student Learning Outcome Example

PROGRAM: Culinary Arts

OUTCOME: Create and serve a variety of cuisines typically found in a food service facility in a team environment.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Technique and Teamwork)
BENCHMARK: 80% of students will score a 3 out of 5 or better for Technique and Teamwork on the Instructor Evaluation.

OUTCOME: Identify and utilize proper food and beverage preparation and service practices to meet industry standards for safety, cleanliness, and sanitation.
ASSESSMENT METHOD: HMR 103-112 Culinary Lab Final Projects (Instructor Evaluation for Sanitation, Mise en Place, and Equipment)
BENCHMARK: 85% of students will score a 4 out of 5 or better for Sanitation, Mise en Place, and Equipment on the Instructor Evaluation.

Service Department Outcome Example

DEPARTMENT: Financial Aid

OUTCOME: Monitor 3-year aggregate student loan default rate.
ASSESSMENT METHOD: Federal 3-Year Cohort Default Rate
BENCHMARK: CCBC students will maintain a loan default rate 3% lower than the national average.

OUTCOME: Improve financial aid notification response time.
ASSESSMENT METHOD: Average number of days until the award letter is mailed for completed financial aid applications.
BENCHMARK: Financial aid notification letters will be processed 2 days sooner than the previous average.

*Adapted from Dr. Janine Vienna’s “Assessment Shop Work” presentation. Faculty Convocation. August 2014



Review Rubric for Academic Programs

OUTCOMES
Criterion: 3-5 (or more) program outcomes identified which adequately address the most important focus of the program
Highly Developed: 5 (or more) program outcomes identified which adequately address the most important focus of the program
Developed: 3-4 program outcomes identified which adequately address the most important focus of the program
Developing: 1-2 program outcomes identified which adequately address the most important focus of the program

Criterion: Outcomes stated in measurable and observable terms of student achievement
Highly Developed: All outcomes stated in measurable and observable terms
Developed: Some outcomes stated in measurable and observable terms
Developing: No outcomes stated in measurable and observable terms

Criterion: Outcomes aligned with general education competencies as applicable
Highly Developed: All outcomes aligned with general education competencies as applicable
Developed: Some outcomes aligned with general education competencies as applicable
Developing: No outcomes aligned with general education competencies as applicable

ASSESSMENT METHODS
Criterion: Assessment methods identified are valid and reliable to assess program outcomes
Highly Developed: All methods identified are valid and reliable to assess program outcomes
Developed: Some methods identified are valid and reliable to assess program outcomes
Developing: No methods identified are valid and reliable to assess program outcomes

Criterion: Assessment methods use direct measures
Highly Developed: All assessment methods use at least one direct measure
Developed: Some assessment methods use at least one direct measure
Developing: No assessment methods use at least one direct measure

Criterion: Multiple assessment methods strengthen findings (secondary measures may be indirect)
Highly Developed: More than one assessment method is used for all program outcomes
Developed: More than one assessment method is used for most program outcomes
Developing: Only one assessment method is used for all program outcomes

BENCHMARKS (SUCCESS CRITERIA)
Criterion: Benchmarks identify specific performance criteria
Highly Developed: All benchmarks identify specific performance criteria
Developed: Some benchmarks identify specific performance criteria
Developing: No benchmarks identify specific performance criteria

Criterion: Benchmarks appear to be sufficiently rigorous and defensible
Highly Developed: All benchmarks appear to be sufficiently rigorous and defensible
Developed: Some benchmarks appear to be sufficiently rigorous and defensible
Developing: No benchmarks appear to be sufficiently rigorous and defensible

LOGISTICS
Criterion: Logistics appear to be well-planned and organized
Highly Developed: All logistics plans identify responsible person(s), when and how data will be collected, and data storage
Developed: Most logistics plans identify responsible person(s), when and how data will be collected, and data storage
Developing: No logistics plans identify responsible person(s), when and how data will be collected, and data storage

Feedback:


Review Rubric for Service Departments

DEPARTMENT OUTCOMES
Criterion: Outcomes appropriate to the unit's services, processes, and functions
Highly Developed: All outcomes appropriate to the unit's services, processes, and functions
Developed: One or two outcomes appropriate to the unit's services, processes, and functions
Developing: No outcomes appropriate to the unit's services, processes, and functions

Criterion: Outcomes stated in measurable language
Highly Developed: All outcomes stated in measurable language
Developed: Some outcomes stated in measurable language, some in non-measurable language
Developing: No outcomes stated in measurable language

Criterion: Outcome(s) appropriately linked to strategic plan
Highly Developed: All outcomes appropriately linked to the strategic plan
Developed: Some outcomes appropriately linked to the strategic plan
Developing: No outcomes appropriately linked to the strategic plan

ASSESSMENT METHODS
Criterion: Assessment methods use direct measures
Highly Developed: All assessment methods use direct measures
Developed: Some assessment methods use direct measures
Developing: No assessment methods use direct measures

Criterion: Multiple assessment methods strengthen findings
Highly Developed: All outcomes have more than one assessment method
Developed: Some outcomes have more than one assessment method
Developing: No outcomes have more than one assessment method

Criterion: Assessment methods appear to be valid and reliable to assess the outcomes
Highly Developed: All assessment methods appear to be valid and reliable to accurately measure the outcomes
Developed: Some assessment methods appear to be valid and reliable to accurately measure the outcomes
Developing: No assessment methods appear to be valid and reliable to accurately measure the outcomes

BENCHMARKS (SUCCESS CRITERIA)
Criterion: Benchmarks identify specific performance criteria
Highly Developed: All benchmarks identify specific performance criteria
Developed: Some benchmarks identify specific performance criteria
Developing: No benchmarks identify specific performance criteria

Criterion: Benchmarks appear to be sufficiently rigorous and defensible
Highly Developed: All benchmarks appear to be sufficiently rigorous and defensible
Developed: Some benchmarks appear to be sufficiently rigorous and defensible
Developing: No benchmarks appear to be sufficiently rigorous and defensible

LOGISTICS
Criterion: Logistics appear to be well-planned and organized
Highly Developed: All logistics plans identify responsible person(s), when and how data will be collected, and data storage
Developed: Most logistics plans identify responsible person(s), when and how data will be collected, and data storage
Developing: No logistics plans identify responsible person(s), when and how data will be collected, and data storage, or no logistics plans exist

Comments:



APPENDIX E: ASSESSMENT PLAN TEMPLATE



Outcomes:
1.
2.

Assessment Methods:
1.

Related to General Education Outcome? Y / N    If yes, which one(s)?

Related to Strategic Plan Goal/Objective? Y / N    If yes, which one(s)?

Benchmarks (Criteria for Success):

Logistics:




APPENDIX F: SLO DATA COLLECTION FORMS



TO: Full- and Part-Time Faculty Teaching _______________ (name of course/s here)
FROM:
SUBJECT: Student Learning Outcomes Assessment
DATE:

This memo regards the collection of assessment data for the _______________ program.

(Description of data to be collected here, i.e. specific test questions, assignment grade, rubric criteria, etc.)

Please submit the appropriate data to _______________ no later than _______________ using _______________ (describe means of submission here). If you have any questions, please feel free to contact me at _______________. You may also stop by my office at your convenience. Thank you in advance for your cooperation with this effort. As always, your continued contribution to CCBC is invaluable and much appreciated.




DATA COLLECTION FORM

COURSE NAME AND SECTION NUMBER | NUMBER OF STUDENTS ENROLLED IN CLASS AT TIME OF ASSIGNMENT | (TITLE OF APPROPRIATE DATA) i.e. final exam scores, competency assignment grades, etc.

Example:
COURSE NAME AND SECTION NUMBER: WRIT 101-03
NUMBER OF STUDENTS ENROLLED IN CLASS AT TIME OF ASSIGNMENT: 23
(TITLE OF APPROPRIATE DATA): 70, 65, 99, 97, 88, 89, 87, 56, 97, 96, 77, 79, 83, 82, 59, 79, 89, 99, 98, 66, 69, 100, 76
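Score data like the sample above can be summarized with simple arithmetic before the results are entered into TracDat. The following is a hypothetical Python sketch; the 70% passing threshold is illustrative only, not a CCBC standard.

```python
# Sample final exam scores from the example form (WRIT 101-03, 23 students).
scores = [70, 65, 99, 97, 88, 89, 87, 56, 97, 96, 77, 79,
          83, 82, 59, 79, 89, 99, 98, 66, 69, 100, 76]

count = len(scores)                          # students assessed
mean = sum(scores) / count                   # average score
passing = sum(1 for s in scores if s >= 70)  # illustrative 70% cutoff
print(count, round(mean, 1), f"{passing / count:.0%}")  # 23 82.6 78%
```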


APPENDIX G: TRACDAT DATA ENTRY SCREENSHOTS



EXAMPLE: John Smith, Outcome #1: Critical Thinking

USING TRACDAT DIRECTIONS AND SCREENSHOTS


STEP 1:

Access TracDat from tracdat.ccbc.edu

STEP 2:

Log in to TracDat


STEP 3: Select the Appropriate Program/Department

STEP 4: Click Plan




STEP 5:

Edit/Add New Outcome

STEP 6:

Edit/Add Means of Assessment


STEP 7: Access Results Area

STEP 8: Select Appropriate Outcome for Results Entry



STEP 9: Enter Results and Link to Actions


STEP 10: Add Actions and Resources

STEP 11: Add Follow-Up



STEP 12: Create a TracDat Report


APPENDIX H: EXAMPLE TRACDAT 4-COLUMN REPORT





APPENDIX I: SLO/SDO CHECKLISTS FOR DATA DISCUSSIONS



SLO/SDO CHECKLIST
For Data Discussions

DIRECTIONS: The SLO/SDO Checklist should be completed during data discussions with the immediate supervisor. Data discussions should be scheduled after all SLO/SDO data for a specific program or department have been collected and reported through TracDat by all outcome leads (OLs) associated with the program/department. Outcome leads should bring a TracDat 4-column report to the data discussion meeting for all the outcomes they are responsible for within the program/department. Complete and original SLO/SDO Checklists should be sent to the Faculty Fellow. A copy of the checklist and required attachments should be kept by all outcome leads as well as the immediate supervisor.

PROGRAM/DEPARTMENT NAME:

OUTCOME LEADS & ASSIGNED OUTCOME:

1.

2.

3.

4.

5.

DATE OF DATA DISCUSSION:

TRACDAT 4-COLUMN REPORTS PROVIDED: Yes _____ No _____
Please attach copies of each outcome lead's (OL's) 4-column TracDat report.

NOTES:

RESOURCES NEEDED: Yes _____ No _____
If "yes," please check one:
_____ Funds Available from Current Budget Line
_____ Additional Funds Needed/New Initiative

NOTES:



DISCUSSION SUMMARY: Please provide a brief narrative regarding the highlights of the data discussion.

IMMEDIATE SUPERVISOR'S SIGNATURE:

I acknowledge all outcome leads responsible for reporting assessment data for the _______________ program/department have met with me to discuss program needs and assessment results for the 20___/20___ academic year.

OUTCOME LEADS' SIGNATURES:
1.
2.
3.
4.
5.

We/I acknowledge that student learning/service department assessment data was thoroughly and appropriately discussed, including necessary resource needs, with the immediate supervisor for the _______________ program/department for the 20___/20___ academic year.


APPENDIX J: GENERAL EDUCATION STUDENT LEARNING OUTCOMES: PROGRAM AND ASSESSMENT OVERVIEW



COMMUNITY COLLEGE OF BEAVER COUNTY

GENERAL EDUCATION STUDENT LEARNING OUTCOMES PROGRAM AND ASSESSMENT OVERVIEW

REQUIREMENT 1: COMMUNICATION PROFICIENCY

Definition: Communication Proficiency requires the skilled presentation of ideas through appropriate media and in a manner suitable to the audience. In addition to WRIT 101: English Composition, courses with the following prefixes are among those that require demonstration of Communication Proficiency: COMM, FINE, FILM, FREN, MUSI, SPAN, THEA, WRIT 201.

Outcomes and Mastery Matrix:
OUTCOME #1: Demonstrate clear and skillful communication methods appropriate to different occasions, audiences, and purposes.

MASTERY MATRIX #1
Mastery: Student consistently demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Progressing: Student generally demonstrates clear and skillful communication methods appropriate to occasion, audience, and purpose.
Low/No Mastery: Student does not consistently or generally demonstrate clear and skillful communication methods appropriate to occasion, audience, and purpose.

Assessment Timeline:
CORE COURSE/S: WRIT 101 (all sections assessed every academic year)
PROGRAM SPECIFIC COURSES (Group 1-Nursing and Allied Health; Group 2-Business and Technologies; Group 3-Aviation Sciences; Group 4-Liberal Arts and Sciences): All courses within area requiring a General Education Competency Assignment with a Communication Proficiency focus assessed every fourth academic year on a rotating basis according to the General Education Program-Level Matrix.



REQUIREMENT 2: INFORMATION LITERACY

Definition: Information Literacy recognizes the need to integrate authoritative resources with an existing knowledge base. In addition to LITR 210: Concepts of Literature or WRIT 103: Writing for Business and Technology (depending on your major), courses with the following prefixes are among those that require demonstration of Information Literacy: JOUR, LITR, PHIL.

Outcomes and Matrix:
OUTCOME #2: Access, evaluate, and appropriately utilize information from credible sources.

MASTERY MATRIX #2
Mastery: Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing: Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery: Student does not access, evaluate, and appropriately utilize information from credible sources.

Assessment Timeline:
CORE COURSE/S: LITR 210/WRIT 103 (all sections assessed each academic year)
PROGRAM SPECIFIC COURSES (Group 1-Nursing and Allied Health; Group 2-Business and Technologies; Group 3-Aviation Sciences; Group 4-Liberal Arts and Sciences): All courses within area requiring a General Education Competency Assignment with an Information Literacy focus assessed every fourth academic year on a rotating basis according to the General Education Program-Level Matrix.



REQUIREMENT 3: SCIENTIFIC AND QUANTITATIVE REASONING

Definition: Scientific and Quantitative Reasoning employs empirical and mathematical processes and scientific methods in order to arrive at conclusions and make decisions. Courses with the following prefixes are among those that require demonstration of Scientific and Quantitative Reasoning: BIOL, CHEM, MATH, NANO, PHYS.

Outcomes and Matrix:
OUTCOME #3: Select and apply appropriate problem-solving techniques to reach a conclusion (hypothesis, decision, interpretation, etc.).

MASTERY MATRIX #3
Mastery: Student consistently selects and applies appropriate problem-solving techniques to reach a conclusion.
Progressing: Student generally selects and applies appropriate problem-solving techniques to reach a conclusion.
Low/No Mastery: Student does not consistently or generally select and apply appropriate problem-solving techniques to reach a conclusion.

Assessment Timeline:
CORE COURSE/S: BIOL 100, BIOL 101, BIOL 201, MATH 130, PHYS 101, PHYS 105 (all sections assessed each academic year)
PROGRAM SPECIFIC COURSES (Group 1-Nursing and Allied Health; Group 2-Business and Technologies; Group 3-Aviation Sciences; Group 4-Liberal Arts and Sciences): All courses within area requiring a General Education Competency Assignment with a Scientific and Quantitative Reasoning focus assessed every fourth academic year on a rotating basis according to the General Education Program-Level Matrix.


REQUIREMENT 4: CULTURAL LITERACY

Definition: Cultural Literacy delineates the patterns of individual and group dynamics that provide structure to society on both individual and global levels. Courses with the following prefixes are among those that require demonstration of Cultural Literacy: ANTH, GEO, HIST, POLS, PSYC, SOCI, BUSM 255 or ECON 255.

Outcomes and Matrix:
OUTCOME #4: Demonstrate an understanding and appreciation of the broad diversity of the human experience.

MASTERY MATRIX #4
Mastery: Student consistently demonstrates an understanding and appreciation of the broad diversity of the human experience.
Progressing: Student generally demonstrates an understanding and appreciation of the broad diversity of the human experience.
Low/No Mastery: Student does not consistently or generally demonstrate an understanding and appreciation of the broad diversity of the human experience.

Assessment Timeline:
CORE COURSE/S: PSYC 101, PSYC 106 (all sections assessed each academic year)
PROGRAM SPECIFIC COURSES (Group 1-Nursing and Allied Health; Group 2-Business and Technologies; Group 3-Aviation Sciences; Group 4-Liberal Arts and Sciences): All courses within area requiring a General Education Competency Assignment with a Cultural Literacy focus assessed every fourth academic year on a rotating basis according to the General Education Program-Level Matrix.



REQUIREMENT 5: TECHNOLOGY LITERACY

Definition: Technology Literacy enhances the acquisition of knowledge, the ability to communicate, and productivity. In addition to CIST 100: Introduction to Information Technology, courses with the following prefixes are among those that require demonstration of Technology Literacy: CISF, CISN, CIST, CISW, OFFT, VISC.

Outcomes and Matrix:
OUTCOME #5: Utilize appropriate technology to access, build, and share knowledge in an effective manner.

MASTERY MATRIX #5
Mastery: Student consistently utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Progressing: Student generally utilizes appropriate technology to access, build, and share knowledge in an effective manner.
Low/No Mastery: Student does not consistently or generally utilize appropriate technology to access, build, and share knowledge in an effective manner.

Assessment Timeline:
CORE COURSE/S: CIST 100 (all sections assessed each academic year)
PROGRAM SPECIFIC COURSES (Group 1-Nursing and Allied Health; Group 2-Business and Technologies; Group 3-Aviation Sciences; Group 4-Liberal Arts and Sciences): All courses within area requiring a General Education Competency Assignment with a Technology Literacy focus assessed every fourth academic year on a rotating basis according to the General Education Program-Level Matrix.



USING THE MASTERY MATRICES

Purpose and Scope of the Matrices:
Each General Education Mastery Matrix directly assesses a specific student learning outcome (SLO) defined in CCBC's General Education Program and is intended to be used for the assessment of general education only. The matrices in no way infringe upon a faculty member's right to evaluate course assignments in a way s/he deems appropriate; rather, each matrix, while a part of all General Education Competency (GEC) Assignment rubrics, should be recognized as a measure distinct from those criteria used to determine a student's score on the GEC Assignment as a course-specific requirement.

Uniformity in Scoring Using the Matrices

General Education Requirement #2: Information Literacy
Outcome #2: Access, evaluate, and appropriately utilize information from credible sources.

Mastery: Student consistently accesses, evaluates, and appropriately utilizes information from credible sources.
Progressing: Student generally accesses, evaluates, and appropriately utilizes information from credible sources.
Low/No Mastery: Student does not access, evaluate, and appropriately utilize information from credible sources.

To ensure a reliable level of uniformity in scoring using the General Education Mastery Matrices, faculty members should refer to the following guidelines. It is important to remember that the percentages established in the guidelines do not correspond with a student's overall grade on the GEC assignment; rather, they are general guidelines to help faculty more uniformly identify levels of mastery, progression, and low/no mastery across course sections that may be taught by various faculty members.

MASTERY: A student who displays mastery of a certain outcome/general education competency could be said to demonstrate the skill and/or ability identified by the outcome at the 80-100% level.

PROGRESSING: A student who demonstrates s/he is progressing in his/her mastery of a certain outcome/general education competency could be said to display the skill and/or ability identified by the outcome at the 70-79% level.

LOW/NO MASTERY: A student who displays low/no mastery of a certain outcome/general education competency could be said to demonstrate the skill and/or ability identified by the outcome at the 69% or below level.

Mastery Matrices and General Education Competency Assignments
At CCBC, General Education Competency (GEC) Assignments are identified and included in the master syllabi for all courses that include a general education component. Each GEC assignment should be defined and used across sections. GEC assignments should also include a faculty-approved rubric, which all instructors should use to determine a student's course grade for the assignment. Beginning in the fall of 2014, all GEC Assignment rubrics should be amended to include the General Education Mastery Matrix or Matrices appropriate to the GEC Assignment category identified in the master syllabus.
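The percentage bands in the scoring guidelines map directly onto the three matrix categories. A hypothetical helper function (an illustration only, not part of CCBC's official procedure) makes the band boundaries explicit:

```python
# Map a student's percentage-level demonstration of an outcome to a
# General Education Mastery Matrix category using the guideline bands.
def mastery_level(percent):
    if percent >= 80:          # 80-100%
        return "Mastery"
    if percent >= 70:          # 70-79%
        return "Progressing"
    return "Low/No Mastery"    # 69% or below

print(mastery_level(85))  # Mastery
print(mastery_level(72))  # Progressing
print(mastery_level(64))  # Low/No Mastery
```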



APPENDIX K: USING THE BLACKBOARD OUTCOMES ASSESSMENT SITE



USING THE OUTCOMES ASSESSMENT SITE
DIRECTIONS AND SCREENSHOTS

STEP 1: Access Blackboard



STEP 2: Log in to Blackboard

STEP 3: Access the Outcomes Assessment Course


STEP 4: Watch the “How To” Video Announcement

STEP 5:

Find Documents OR Upload Assessment Data



APPENDIX L: SLO/SDO GUIDELINES FOR REPORTING RESULTS, ACTIONS, AND FOLLOW-UP



SLO/SDO GUIDELINES
For Reporting Results, Actions, and Follow-Up

PART ONE: RESULTS

Characteristics of Successful Results and Result Statements
1. Clear: Successful result statements establish what the numbers mean.
2. Concrete: Successful result statements are supported by data evidence.

Anatomy of Successful Results and Result Statements
1. Data Evidence: The evidence that is used to establish whether a benchmark is met, not met, or inconclusive should always be included when reporting outcomes assessment results. Since CCBC uses an electronic data storage system, TracDat, data evidence must be in an electronic form (i.e., Excel sheet, Microsoft Word table, etc.).
2. Data Description: A brief statement of statistics related to the benchmark. It is often helpful to begin data descriptions with a phrase such as "The results 1) indicate, 2) demonstrate, 3) show, 4) reveal, etc.," "The results exceed/meet/do not meet the benchmark for…," or "Based on the results of the [assessment method]…".
3. Data Analysis: A brief analysis of the assessment results. Data analyses can focus on long-range trends, year-over-year information, highs/lows, strengths/weaknesses, external occurrences (snow days, change of instructor), etc.
4. Data Discussion/Dissemination: Discussion and dissemination of assessment data and results ensure stakeholders are informed of and have input regarding results. Depending on the scope of the established outcomes, it may be important to discuss and/or disseminate assessment data/results to faculty teaching in the program, members of an advisory committee, supervisors, clinical site supervisors, internship supervisors, primary employers, etc. A brief statement of how data was discussed/disseminated should be included in the result statement.
5. Result Conclusion: Establish whether the outcome was met, not met, or inconclusive.
Examples of Successful Student Learning and Service Department Result Statements

Student Learning Result Statement Example:
Description: The 2012-2013 first attempt NCLEX-RN pass rate for CCBC was 88.89% compared to 86.22% and 84.29% for the PA and national pass rates; the benchmark was met.
Analysis: This is the third consecutive year for which the CCBC first-time pass rate exceeded both the PA and national pass rates. CCBC experienced a slight decline in the pass rate from 2011-2012; however, the decline was also reflected at the PA and national levels.
Discussion: Nursing faculty discussed the results and believe the quality of instruction, clinicals, and course assignments continue to contribute to positive NCLEX-RN results.

Service Department Result Statement Example:
Description: The results indicated that for all three rubric criteria (education, experience, and professional presentation), 80% or more of the students created resumes that met quality standards; the benchmark was met.
Analysis: The highest criterion meeting quality standards was experience (92%) and the lowest was professional presentation (81%). For education, 85% of the students met the quality standard.
Discussion: The results were discussed with faculty in the CTE programs to inform them of student strengths/weaknesses with the development of their students' resumes.



PART TWO: ACTIONS

What is an Action Plan?
Action plans are 1) the plans put in place to further improve a program or service based on positive assessment results (meeting/exceeding the established outcome benchmark), or 2) the plans put in place to improve a program or service because established benchmarks were not met.

Anatomy of an Action Plan
1. Action Statement: An action plan statement states the action to be taken in a specific, concise manner. There should only be one action per action plan.
2. Resource Identification: When planning what action should be taken to improve a program/service, consider whether resources are needed to accomplish the action. Linking planning, budgeting, and assessment is a critical component of the assessment cycle.
3. Implementation/Timeline/Responsible Party: Considering and planning the logistics of the action plan are also necessary. Who will do what, and when, are important questions to ask as the action plan is established.

Examples of Successful/Unsuccessful Student Learning and Service Department Action Plans

SUCCESSFUL Student Learning Action Plan Example: Implement a peer-reviewed role play assignment in NURS 130 for students to practice therapeutic communication techniques and create evaluation criteria to recognize successful techniques. The role play assignment will be developed during Summer 2014 and added to the NURS 130 syllabus and grading criteria. In the Fall of 2014 the role play activity will be implemented in the course. All instructors teaching NURS 130 will be responsible for understanding and implementing the assignment. No additional resources are needed for this action.

UNSUCCESSFUL Student Learning Action Plan Example: Review Fisdap results with the program advisory committee.

SUCCESSFUL Service Department Action Plan Example: Send a direct mail postcard to all credit students to notify them of the day/time of the job fair and the employers who will be in attendance to increase student participation at the fair. The postcards will be developed by the office of enrollment management and the College's marketing department at least three weeks before the job fair. The postcards will be mailed two weeks prior to the fair. Resources for this project will come from the enrollment management budget and will total approximately $1000.

UNSUCCESSFUL Service Department Action Plan Example: Get more people to help with processing transcripts and determining transfer equivalencies.



Action Plan Considerations
When creating an action plan, consider many possible avenues for improvement to ensure a valuable action is pursued.

Student Learning Action Plan Considerations: Teaching Methods/Tasks; Outcomes; Curriculum; Assessment Strategies; Learner/Student; Support/Resources

Service Department Action Plan Considerations: Services; Process Management; Outcomes; Assessment Strategies; Benchmarks; Support/Resources

PART THREE: FOLLOW-UP

Why is Follow-Up Important? Follow-up is important because it 1) demonstrates that action plans are being implemented, 2) describes the progress being made toward improvement, and 3) ensures assessment cycles are complete and functional.

Characteristics of Successful Follow-Up 1. Describes Progress: The primary function of follow-up reporting is to describe the progress being made toward improvement through implementation of the established action plan. When describing the progress of an action plan, discuss the effectiveness of the plan and examine any new data associated with it. 2. Occurs Regularly: Follow-up reporting should occur at both the mid-year and end-of-year points to ensure the effectiveness of the action plan is thoroughly examined and appropriate information is available as new action plans are established for the following assessment cycle.

*Adapted from Dr. Janine Vienna's "Assessment Next Steps" presentation, Professional Development Day, October 2014.



APPENDIX M: 5-YEAR KEY PERFORMANCE INDICATORS AND KPI REPORT TEMPLATE



GOAL 1: STUDENT SUCCESS Create a learning community by supporting student success through educational programs provided in diverse and accessible formats. 1. Successfully complete developmental education Percent of enrollees in PREP English, Reading, and Mathematics earning grades of C or above, reported separately for each discipline, fall terms, as reported for Achieving the Dream

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
English:
Reading:
Math:
√ = Achieved

2. Successfully complete the initial gateway English course Percent of enrollees in gateway English earning grades of C or above, fall terms, as reported for Achieving the Dream

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√ = Achieved

3. Successfully complete courses with a grade of “C” or better Percent of enrollees in all courses earning grades of C or above, fall terms, as reported for Achieving the Dream

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√ = Achieved

4. Fall-to-Fall Retention Rates Percent of FTIC fall cohort enrolling in the subsequent fall term, reported separately for full-time and part-time students, as reported for Achieving the Dream

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
Full-time:
Part-time:
√ = Achieved

5. Attain a Credential Percent of FTIC fall cohort who attain a degree, diploma, or certificate within four years, as reported for Achieving the Dream

Benchmark | Fall 2008 | Fall 2009 | Fall 2010 | Fall 2011 | Fall 2012
√ = Achieved


6. AAS/AS Graduate Employment and Continuing Education Rate Percent of AAS and AS graduates reporting employment in a job related to their CCBC program or transfer to a four-year institution within a year of graduation, as reported by Career Services

Benchmark | Class of 2012 | Class of 2013 | Class of 2014 | Class of 2015 | Class of 2016
√ = Achieved

7. License Examination Pass Rates Percent of first-time testers meeting or exceeding national mean as reported by the National Council of State Boards of Nursing

Benchmark | 2012 | 2013 | 2014 | 2015 | 2016
LPN:
RN:
RadTech:
√ = Achieved

8. Academic Challenge CCSSE composite benchmark score of survey items; standardized national sample mean = 50

Benchmark: 50.0 | 2012 | 2013 | 2014
√ = Achieved

9. Student Effort CCSSE composite benchmark score of survey items; standardized national sample mean = 50

Benchmark: 50.0 | 2012 | 2013 | 2014
√ = Achieved

10. Career Counseling Services Use Percent of students saying they used the college’s career counseling services as reported by CCSSE

Benchmark: CCSSE national | 2012 | 2013 | 2014 | 2015 | 2016
√ = Achieved

11. Career Goal Clarification Percent of students saying college helped to clarify career goals as reported by CCSSE

Benchmark: CCSSE national | 2012 | 2013 | 2014 | 2015 | 2016
√ = Achieved

12. Supports For Students Rate for question “This institution treats students as its top priority” in annual Noel-Levitz survey

Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√ = Achieved


GOAL 2: COMMUNITY AND ECONOMIC DEVELOPMENT Partner with businesses, organizations, educational institutions, and government agencies to enhance economic opportunities for the region. 13. Enrollment in Noncredit Workforce Development Courses Course enrollments in Workforce and Continuing Education noncredit open-enrollment workforce development and contract training courses for the fiscal year, as reported by the Dean of Workforce and Continuing Education

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

14. Contract Training Clients Number of business, government, and non-profit organizational units contracting with the college for customized training and services for the fiscal year, as reported by the Dean of Workforce and Continuing Education

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

15. Contract Training Student Headcount Annual unduplicated headcount of workforce development students served through contract training arrangements for the fiscal year, as reported by the Dean of Workforce and Continuing Education

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

16. Lifelong Learning Course Enrollments Number of course enrollments in lifelong learning courses offered by Workforce and Continuing Education for the fiscal year, as reported by the Dean of Workforce and Continuing Education

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved



17. College-sponsored Community Events Number of college-sponsored cultural and educational events open to the public held at the college each year

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

18. Community Groups Using College Facilities Unduplicated count of Beaver County organizations using college facilities each year

Benchmark | FY 2012 | FY 2013 | FY 2014
√ = Achieved

19. College position in the community Rate for question “This institution is well-respected in the community” in annual Noel-Levitz survey

Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√ = Achieved


GOAL 3: ORGANIZATION DEVELOPMENT Create a culture that expects openness, collaboration, and mutual respect and embraces innovation and professional development. 20. Cultural Awareness Percent of students saying college contributed to understanding of people of other racial/ethnic backgrounds as reported by CCSSE

Benchmark: CCSSE national | 2012 | 2013 | 2014 | 2015 | 2016
√ = Achieved

21. Employee Perception of College Commitment to Diversity Total votes made by employees to the question “Increase the diversity of racial and ethnic groups represented among the student body” in the annual Noel-Levitz survey

Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√ = Achieved

22. Employee Job Satisfaction Rate of overall satisfaction with employment as reported by the Noel-Levitz survey

Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√ = Achieved

23. Active and Collaborative Learning CCSSE composite benchmark score of survey items; standardized national sample mean = 50

Benchmark: 50.0 | 2012 | 2013 | 2014
√ = Achieved

24. Student-Faculty Interaction CCSSE composite benchmark score of survey items; standardized national sample mean = 50

Benchmark: 50.0 | 2012 | 2013 | 2014
√ = Achieved



GOAL 4: RESOURCES Develop and allocate resources which sustain the institution and encourage its growth and development. 25. Annual Unduplicated Headcount Annual total unduplicated headcount for students enrolled in credit and noncredit courses, reported by fiscal year

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

26. FTE Enrollment Annual total FTE student enrollment in credit courses, reported by fiscal year

Benchmark | FY 2012 | FY 2013 | FY 2014
√ = Achieved

27. Annual Credit Count Annual total count of credits taken by degree-seeking students, reported by fiscal year

Benchmark | FY 2012 | FY 2013
√ = Achieved

28. Tuition and Fees Change in tuition and fees for credit students from sponsoring districts

Benchmark | FY 2012 | FY 2013
√ = Achieved

29. High School Graduate Enrollment Rate Percent of Beaver County public high school graduates attending CCBC in the fall semester following their high school graduation

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√ = Achieved


30. Met Need of Pell Recipients Percent of Pell Grant recipients who have 50 percent or more of their total need met by grants and scholarships

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
Pell Recipients:
√ = Achieved

31. Met Need of Non-Pell Recipients Percent of students with financial need, who did not receive Pell Grants, who have 50 percent or more of their total need met by grants and scholarships

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
Students:
√ = Achieved

32. Student Perception of Institutional Financial Support Percent of students saying college provides financial support needed as reported by CCSSE

Benchmark | 2012 | 2013 | 2014
√ = Achieved

33. Support for Learners CCSSE composite benchmark score of survey items; standardized national sample mean = 50

Benchmark: 50.0 | 2012 | 2013 | 2014
√ = Achieved

34. Enrollment per Section Mean credit course class size in the Divisions of Liberal Arts & Sciences and Business & Technologies, excluding self-paced, internship, and independent study courses

Benchmark | Fall 2012 | Fall 2013 | Fall 2014 | Fall 2015 | Fall 2016
√ = Achieved



35. Teaching by Full-Time Faculty Percent of total teaching load hours taught by full-time faculty

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

36. Expenditures per FTE student Unrestricted current fund operating expenditures per FTE student

Benchmark | FY 2012 | FY 2013
√ = Achieved

37. Expenditure on Instruction and Academic Support Percent of total educational and general operating expenditures expended on instruction and academic support as reported by IPEDS categories

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

38. Employee Turnover Rate Percent of employees leaving the institution annually, excluding retirees, compared with the National Community College Benchmark Project national median

Benchmark | FY 2012 | FY 2013 | FY 2014 | FY 2015 | FY 2016
√ = Achieved

39. College Support for Innovation Rate for question “This institution makes sufficient budgetary resources available to achieve important objectives” in annual Noel-Levitz survey

Benchmark | 2015 | 2016 | 2017 | 2018 | 2019
√ = Achieved


APPENDIX N: INITIATIVES & EVENTS ASSESSMENT TEMPLATE



INITIATIVES AND EVENTS ASSESSMENT TEMPLATE
COMMUNITY COLLEGE OF BEAVER COUNTY

Program or Event:
Responsible Person:

Outcomes
Outcome:

Means of Assessment/Criteria
Assessment Methods:
Criteria:

Results

Action Taken and Follow-Up


WORKS CITED

Addison, Jennifer D. Institutional Effectiveness: A Practical Guide for Planning and Assessing Effectiveness. 5th ed. Virginia Highlands Community College, 2013-2014. Web.

Characteristics of Excellence in Higher Education: Requirements of Affiliation and Standards for Accreditation. 12th ed. Middle States Commission on Higher Education, 2006. Print.

Clagett, Craig. Institutional Effectiveness Assessment Report. Carroll Community College, December 2011. Web.

General Education Assessment Plan. Johnson County Community College. Web. http://www.jccc.edu/files/pdf/assessment/general%20-ed-assessment-plan.pdf

"Understanding Middle States Expectations for Assessing Student Learning and Institutional Effectiveness." Student Learning Assessment: Options and Resources. 2nd ed. MSCHE, 2007. Print.

Vienna, Janine. "Assessment Next Steps for Academic Programs: Professional Development Workshop." Professional Development Day, Community College of Beaver County. Learning Resources Center, Monaca, PA. 7 October 2014. Presentation.

Vienna, Janine. "Assessment Next Steps for Administrative Units: Professional Development Workshop." Professional Development Day, Community College of Beaver County. Learning Resources Center, Monaca, PA. 7 October 2014. Presentation.

Vienna, Janine. "How to Set Benchmarks." Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.

Vienna, Janine. "Identifying Assessment Methods." Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.

Vienna, Janine. "Writing Measurable Academic Outcomes." Faculty Convocation, Community College of Beaver County. Learning Resources Center, Monaca, PA. 22 August 2014. Presentation.

