Grambling State University Standard Five Compendium 2 R5.1 and R5.2 Representativeness of Data
Alignment to National Standard: Standard Five The provider maintains a quality assurance system that consists of valid data from multiple measures and supports continuous improvement that is sustained and evidence-based. The system is developed and maintained with input from internal and external stakeholders. The provider uses the results of inquiry and data collection to establish priorities, enhance program elements, and highlight innovations.
CAEP R5.1 Quality Assurance System: The provider has developed, implemented, and modified, as needed, a functioning quality assurance system that ensures a sustainable process to document operational effectiveness. The provider documents how data enter the system, how data are reported and used in decision making, and how the outcomes of those decisions inform programmatic improvement.
CAEP R5.2 Data Quality: The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative, and actionable measures to ensure interpretations of data are valid and consistent.
How Alignment is Assured: The Assessment Coordinator, in consultation with Program/Discipline Chairs, aligns the evaluation measures and assessment tasks with CAEP, InTASC, and appropriate Technology Standards. The Assessment Coordinator also maintains alignment with, and adherence to, multiple Louisiana state laws and policy regulations. All standards are maintained in Watermark Taskstream. This standards database is maintained by the Assessment Coordinator so that alignments can accommodate updates to standards, program competencies, courses, or assessments.
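For illustration only, such an alignment crosswalk can be thought of as a simple mapping from each key assessment to the standard indicators it addresses. The sketch below is a minimal, hypothetical example: the assessment names and standard indicators are placeholders, and the EPP's actual alignments live in Watermark Taskstream, not in code.

```python
# Minimal sketch of an assessment-to-standards alignment crosswalk.
# Assessment names and standard indicators are hypothetical placeholders;
# the EPP's actual alignments are maintained in Watermark Taskstream.

alignment_crosswalk = {
    "Key Assessment: Lesson Plan": {
        "CAEP": ["R1.3"],
        "InTASC": ["7", "8"],
        "Technology": ["2.5"],
    },
    "Key Assessment: Impact on Student Learning": {
        "CAEP": ["R1.1", "R1.4"],
        "InTASC": ["6", "9"],
        "Technology": ["2.7"],
    },
}

def assessments_aligned_to(framework: str, indicator: str) -> list[str]:
    """Return the key assessments aligned to a given standard indicator."""
    return [
        name
        for name, standards in alignment_crosswalk.items()
        if indicator in standards.get(framework, [])
    ]

# When a standard or competency is updated, only the crosswalk entries change;
# the reporting logic that consumes them stays the same.
print(assessments_aligned_to("InTASC", "7"))  # ['Key Assessment: Lesson Plan']
```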
Evidence
Key Assessments
As depicted in Appendix C, QAS Transition Points and Measures Crosswalk, each program has key assessments identified on the basis of state and national competencies (Appendix D Table of Current Program Key Assessments). Key assessment data are used internally to determine candidate proficiencies, which affect candidate matriculation, to measure program quality, and to improve EPP operations and programs (Appendix E Data Collection, Analysis, and Review Plan). At each portal and for each program, candidates submit key assessments in TaskStream to the faculty member designated as the course instructor of record. Faculty evaluate and grade candidate work before submitting the candidate’s final grade. Candidates can see their scores and faculty/supervisor comments as soon as the evaluation is released after grading. At each decision point, candidates are informed of the decision regarding their matriculation through the program.
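The sketch below illustrates, in hypothetical form, the kind of record that enters the quality assurance system when a faculty member releases an evaluation. The field names and values are illustrative assumptions, not the EPP's actual TaskStream schema.

```python
# Minimal sketch of a key assessment evaluation entering the QA system.
# Field names and example values are illustrative assumptions only.

from dataclasses import dataclass
from datetime import date

@dataclass
class KeyAssessmentRecord:
    candidate_id: str      # institutional ID of the candidate
    program: str           # licensure program
    portal: int            # transition point at which the assessment is required
    assessment: str        # name of the key assessment
    rubric_score: int      # 1 (Novice) through 4 (Highly Effective), per Table 1
    evaluator: str         # course instructor of record who scored the submission
    date_evaluated: date   # date scores and comments are released to the candidate

# Hypothetical record: a Portal 2 submission scored at the target level.
record = KeyAssessmentRecord(
    candidate_id="G00123456",
    program="Elementary Education",
    portal=2,
    assessment="Impact on Student Learning",
    rubric_score=3,
    evaluator="Course Instructor of Record",
    date_evaluated=date(2023, 11, 15),
)
```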
Meeting CAEP Sufficiency
The key assessments are aligned with the conceptual framework. This alignment ensures that the assessment system collects information on candidate proficiencies as articulated in the conceptual framework and in state, national, and professional standards for both initial and advanced programs. The alignment also enables us to be efficient and focused about the data we collect, which maximizes our ability to grow a culture of data-driven decision making in the EPP.
Faculty meet regularly to review key assessments. Subject-matter experts design the key assessments and rubrics. The rubrics are reviewed by external stakeholders to ensure that they meet content validity standards and by program leads and program committee members to ensure that they meet CAEP sufficiency standards. Programmatic key assessments are also reviewed internally by the department at large during data review days and in response to changes in state licensure requirements.
Proprietary Assessments
The EPP utilizes proprietary assessments where appropriate. Content validity and reliability have already been established for these assessments. Faculty training is conducted on the proper use of proprietary assessments, and norming activities provide the foundation for ensuring inter-rater reliability.
Scoring Criteria
The EPP has established scoring criteria for key assessments (Table 1 below) that define the level of proficiency candidates must attain to progress. All candidates must score a 3 or higher on the key assessment rubrics, achieving a level of Effective: Proficient. If a candidate fails to meet the proficiency standard, the course instructor provides actionable feedback and remediation or interventions, and the candidate is allowed to resubmit the assessment prior to the end of the course (see the sketch following Table 1).
Table 1: Rubric Level Definitions

Performance Indicators and Descriptions

Novice
This rating is equivalent to having emerging performance skills/content knowledge that can be enriched with additional coursework.

Effective: Emerging
This rating is equivalent to having the performance skills/content knowledge needed to move forward into student teaching; however, additional remediation might be needed to hone the candidate's performance.

Effective: Proficient (Target)
This rating is equivalent to having the performance skills/content knowledge needed to be an effective student teacher where additional skills will be practiced.

Highly Effective
This rating is equivalent to having the performance skills/content knowledge needed as a highly effective first-year teacher.
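The following minimal sketch illustrates the progression rule described above, assuming the four rubric levels in Table 1 correspond to scores 1 through 4. It is an illustration of the rule, not the EPP's actual implementation.

```python
# Minimal sketch of the proficiency/progression rule, assuming the four
# rubric levels in Table 1 map to scores 1-4. Names are illustrative only.

RUBRIC_LEVELS = {
    1: "Novice",
    2: "Effective: Emerging",
    3: "Effective: Proficient (Target)",
    4: "Highly Effective",
}

PROFICIENCY_THRESHOLD = 3  # candidates must score 3 or higher to progress

def progression_decision(rubric_score: int) -> str:
    """Return the matriculation decision for a single key assessment score."""
    level = RUBRIC_LEVELS[rubric_score]
    if rubric_score >= PROFICIENCY_THRESHOLD:
        return f"{level}: candidate may progress to the next portal."
    return (
        f"{level}: instructor provides actionable feedback and remediation; "
        "candidate resubmits the assessment before the end of the course."
    )

print(progression_decision(2))
print(progression_decision(3))
```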
Major transition points, identified as “Portals” in initial programs, are used to determine whether candidates are allowed to move to the next level of the certification pathway within the matriculation process. The EPP has incorporated a systematic monitoring process at each portal/transition point to monitor candidates from admission to completion. During advising sessions, advisors review candidates' progress and guide their matriculation through the program by completing the Program Progression Update forms and the Advising Contract. These documents are saved and made available to candidates for self-monitoring. In addition, the Directed Response Folio in Taskstream allows candidates to monitor their progress. CAEP reviewers are encouraged to click on the link below for a compilation of these transitions, along with directions and evaluation methods. TRANSITION POINTS DIRECTED RESPONSE FOLIO
Candidates are also advised of the many resources available on campus to provide academic support. The links below illustrate this support.
Office of Professional Laboratory Experiences
Care Center
Praxis Laboratory
GSU Residency Mentor and Supervisor Handbook
Representativeness of Data
Throughout the Self-Study, Grambling documented that data are regularly and systematically reviewed, that patterns across preparation programs are identified, and that data/evidence are used for continuous improvement (Assessment Planning and Program Review Forms and Samples). Assessments submitted as evidence for Standards 1-4 (links below) were disaggregated for all candidates across licensure programs and included three cycles of data (where available).
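The minimal sketch below shows, under assumed column names and hypothetical values, how key assessment data might be disaggregated by licensure program and data cycle. The use of pandas and the example figures are assumptions for illustration; the actual disaggregation is performed within the EPP's quality assurance system.

```python
# Minimal sketch of disaggregating key assessment data by program and cycle.
# Column names, values, and the use of pandas are illustrative assumptions.

import pandas as pd

# Hypothetical export of key assessment scores from the QA system.
scores = pd.DataFrame(
    {
        "program": ["Elementary Education", "Elementary Education",
                    "Secondary Mathematics", "Secondary Mathematics"],
        "cycle": ["2020-2021", "2021-2022", "2020-2021", "2021-2022"],
        "rubric_score": [3, 4, 2, 3],
    }
)

# Disaggregate: mean rubric score and share meeting proficiency (3 or higher)
# for each licensure program within each data cycle.
summary = (
    scores
    .assign(proficient=scores["rubric_score"] >= 3)
    .groupby(["program", "cycle"])
    .agg(mean_score=("rubric_score", "mean"),
         pct_proficient=("proficient", "mean"))
    .reset_index()
)

print(summary)
```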
Standard One - Initial
GSU Standard One Compendium 1 - R1.1_Learner_Growth and Development
GSU Standard One Compendium 2 - R1.2 Application of Content
GSU Standard One Compendium 3 – R1.3 Instructional Practice
GSU Standard One Compendium 4 – R1.4 Professional Responsibility
Standard Two – Initial
GSU Standard Two Compendium 1 - R2.1 Partnerships for Clinical Preparation
GSU Standard Two Compendium 2 - R2.2 Clinical Expectations and Training
GSU Standard Two Compendium 3 - R2.3 Field Experiences: Depth and Breadth
Standard Three – Initial
GSU Standard Three Compendium 1 - R3.1 Access and Equity
GSU Standard Three Compendium 2 - R3.2 Academic Achievement and Support
GSU Standard Three Compendium 3 - R3.3 Monitoring Academic and Non-Academic Factors
Standard Four – Initial
GSU Standard Four Compendium 1 - R4.1 Completer Effectiveness
GSU Standard Four Compendium 2 - R4.2 Satisfaction of Employers
GSU Standard Four Compendium 3 - R4.3 Satisfaction of Completers
Continuous Improvement
Focus Area(s):
To ensure that candidates have met the proficiency criteria for key course-based assessments at each transition point prior to beginning clinical practice, portal committee members should be given permissions to review the Taskstream folios of Residency I applicants. The Office of Professional Laboratory Experiences (OPLE) director will inform any candidate who has scored lower than the required 3 on any assessment to resubmit. Resubmissions will be evaluated by the candidate’s program chair prior to admittance to Residency I.
Transition Points - Edit the Transition Points DRFs in Taskstream to make them more user-friendly for candidates and evaluators and to standardize expectations across initial programs.
Template for the Presentation of Evidence by Dr. Michele Brewer and Dr. Amber Vraim is licensed under Attribution 4.0 International. "College of Education Office of Technology, Assessment, and Compliance: Template for the Presentation of Evidence." Copyright 2020 by Wilmington University.