Computer Systems Validation


Insert Company Logo here

Document ID: CSV001

Version No.: Status

Validation Plan: Title

Prepared by (author signs to confirm technical content):

Job title:

Signature:

Date:

Reviewed by (System Owner signs to confirm technical content):

Job title:

Signature:

Date:

Authorised by (quality representative signs to confirm the document complies with the quality management system):

Job title:

Signature:

Date:

Document review date:

This template was prepared by PharmOut Pty Ltd, www.pharmout.net. © 2011 PharmOut Pty Ltd. All rights reserved.


Validation Plan: Title

Company

Document ID: CSV001

Version No.: Status

Contents

1. Introduction
2. Scope
2.1. Included In Scope
2.2. Excluded From Scope
3. System Overview
3.1. System Description and Project Context
4. Roles and Responsibilities
5. Standards, Regulations and Guidelines
6. Validation Strategy
6.1. Overview
6.2. Assumptions, Exclusions and Limitations
6.3. Validation Activities
7. Quality Risk Management
7.1. Initial Risk Assessment
7.2. Detailed Risk Assessment
7.3. Project Risk Management
8. Deliverables
8.1. [Computer System Name] Validation Documents
8.2. Supplier Assessment and Report
8.3. System Manuals
8.4. User Requirements Specification
8.5. Functional / Configuration Specification
8.6. Requirements Traceability
8.7. Qualification Protocols
8.8. Data Migration Plan / Report
8.9. Validation Summary Report
8.10. Source Code Review
9. Testing Approach
9.1. Roles and Responsibilities
9.2. Test Strategy
9.3. Installation Qualification
9.4. Operational Qualification
9.5. Performance Qualification
9.6. Test Execution
10. Acceptance Criteria
11. Validation Reporting
12. Supporting Processes
12.1. Change Control
12.2. Incident Reporting
12.3. Documentation and Procedures
12.4. Project Training
13. Periodic Review
14. Decommissioning
15. Glossary
16. Referenced Documents

DOCUMENT REQUIREMENTS (Delete this information prior to issue)

• All text provided is either instructional (red text) or example content (black text) and should be modified or deleted as appropriate.
• This document should be modified to match the company style guide, where applicable.
• The document contains several editable fields, denoted by a grey background when the cursor is placed over the field. Company information for these fields can be added within the document properties tabs (Summary and Custom); the fields are updated automatically when the file is opened, or manually by highlighting the field and pressing F9.
• This template has been written assuming that the company has a copy of ISPE's GAMP 5 – A Risk-Based Approach to Compliant GxP Computerized Systems available. It is recommended that the GAMP guide be consulted for more specific information when using this template.
• All controlled validation documents must have an identifying number consistent with the site documentation procedure.
• The correct site QMS template (styles, fonts, etc.) should be used where one is provided and applicable.
• The Change History section of this document must be completed and must have a valid reference to a change control raised according to site procedures.
• The document should be internally consistent: the name of a piece of equipment, a solution, etc., once defined, should not be varied.



1. Introduction

The purpose of this Validation Plan (VP) is to establish the process required to provide documented evidence at Company's facility in [site, location, country] that all components of the [Computer System Name] system are installed and operate according to the user requirements, site quality standards and current pharmaceutical industry regulations. It has been written in accordance with the Site Validation Master Plan / Computer Systems Validation Policy (VMP) [edit / insert references]. This plan is intended to guide the validation personnel and project team members in performing the required validation activities. The results from this validation effort will be summarised in the Validation Summary Report (VSR).

Execution of this plan shall:
• provide documented evidence that the [Computer System Name] has been implemented and performs according to approved specifications,
• ensure appropriate controls are in place to provide assurance that the system will continue to operate in a controlled, maintained environment, and
• demonstrate compliance with Company quality system standards associated with the operation, use and support of the [Computer System Name].

This document will be reviewed in line with project changes and updated as required before completion of the project.

2. Scope

This VP applies specifically to the validation of the [Computer System Name] application software and the supporting hardware components identified in Section x of this document. It does not include [exclusions, e.g. related systems, related hardware, non-GxP modules].

2.1. Included In Scope

List all functions that are to be included in the validation. Consider such things as hardware, software, interfaces, printers and other output devices, network(s), related systems, xxx…

2.2. Excluded From Scope

List all functions that are not covered in this validation.

3. System Overview

3.1. System Description and Project Context

A detailed description of the system and what is required from the project should be recorded here and may include a diagram, if appropriate. Discuss the areas of the business that are impacted by the project and any project dependencies. This section may take the place of a separate system description document (a PIC/S regulatory requirement) if sufficient detail is provided here. For a software upgrade, for example, the description may include:



The [Computer System Name] system currently in use at Company is version x.x.x, and will be updated to version x.x.y as part of the project. The existing version has been in use at Company for approximately XX years. This project directly impacts the xxxx department and will also indirectly affect the [yyy and zzz] departments.

3.1.1.1. System Design

The [Computer System Name] system comprises the following main components, which are resident on the site local area network (LAN):
• 1st component
• 2nd component
• etc.

A diagram is recommended here to show the overall system architecture and how it interfaces with other systems and / or hardware.

3.1.1.2. Hardware

Describe what hardware is to be installed or exists already, and whether the components are new or existing. Include reference to hard disk drives, disk failure protection, where the database will reside (if applicable) and whether some data will reside elsewhere, and the clients and server(s). It may be useful to include user access details if relevant.

3.1.1.3. Software and Operating System

Describe the operating environment, including the platform and any other software installed on the system. Include the version numbers of each. For example:
• Windows XP
• Symantec 'Backup Exec' V12.5 back-up agent
• BitDefender Antivirus Plus v10 – Build 247

4. Roles and Responsibilities

Specify who (by role or title, not name) is carrying out each key stage or activity of the project, including writing, reviewing, approval and execution of each of the documents, provision of hardware, infrastructure, resources, etc. Edit the table below accordingly.

Project Manager
• Establish and maintain the project schedule and budget
• Ensure sufficient resources are allocated to complete the tasks on time

SME(s)
• Provide expert knowledge on the system as required to write and review validation documentation and carry out user testing

IT
• Develop system configuration
• System administrative support and data security
• Approval of key documentation as defined in Section x below

QA
• Implementation of quality standards and procedures for the testing, installation and operation of the system
• Approval of key documentation as defined in Section x below
• Monitoring and support of project phase activities
• Ensuring adequate training has been completed for end users, including administrators
• Ensuring that standard operating procedures (SOPs) for using the system exist, are followed and are reviewed periodically
• Ensuring system changes are approved and managed
• Support of the project life cycle processes, incident management, document management and change control

Process Owner (may be the System Owner)
• Ultimate responsibility for the regulatory compliance of the system
• Approval of key documentation
• Provision of adequate resources (SMEs) for requirements definition, user testing, development support and system operation
• Management of changes
• End user training

System Owner
• Availability, support and maintenance of the system
• Security of the data on the system
• Ensuring the system is supported and maintained by trained staff in accordance with applicable SOPs
• Approval of key documentation
• Management of changes
• Ensuring availability of information for system inventory and configuration management

5. Standards, Regulations and Guidelines

Insert all standards applicable to the company and edit the list below as required.

The content of this document complies with the principles of the following international standards and guidelines:
• AS ISO 13485 Medical devices — Quality management systems — Requirements for regulatory purposes
• ISO 14971 Medical devices — Application of risk management to medical devices
• GAMP 5® — A Risk-Based Approach to Compliant GxP Computerized Systems, ISPE, 2008
• PE 009-8, PIC/S Guide to Good Manufacturing Practice for Medicinal Products (in particular Annex 11, Computerised Systems), 2009
• US FDA GMP Regulations, 21 CFR Parts 210 and 211
• US FDA Guidance for Industry: Part 11, Electronic Records; Electronic Signatures – Scope and Application

6. Validation Strategy

This section should discuss the life cycle model used, the application of hardware and software categories, the inputs and outputs required for each stage of the project, the acceptance criteria for each stage, the approach to traceability and the approach to design review.

6.1. Overview

The validation strategy used for this project is based on the validation life cycle approach described in GAMP 5®, A Risk-Based Approach to Compliant GxP Computerized Systems. It takes into consideration the outcome of the risk assessment, the assessment of the system, its components and architecture, and the supplier assessment. As the system has been assessed as GAMP Category [insert GAMP category], the validation approach taken will be as detailed in the sections below.

6.2. Assumptions, Exclusions and Limitations

Include such aspects as handling of records, market penetration of the software in the pharmaceutical industry, qualification status of the LAN, and any limitations on the use of the application functionality. The following are suggestions of what might be required for this section; they are not intended to be all-inclusive and may not be relevant to the software application being validated.

• It is assumed that the software is... [complete/amend as required]
• It is intended that the upgraded software will be used to… [complete/amend as required]
• It is assumed that the local area network has been qualified and is maintained in a qualified state.
• Some functionality offered by the [Computer System Name] will not be used and will not be validated. This functionality is indicated in Section x below.

6.3. Validation Activities

Where appropriate, amend the bullets as required. Add detail to each of the bullet points below to describe how, when, where or why the tasks will (or will not) be carried out. The list should be expanded as required and, depending on the size, nature and complexity of the system, may include activities such as factory acceptance testing, user acceptance testing and site acceptance testing.

Validation activities for the system implementation will follow the GAMP 5 V-Model life cycle approach and will include:
• Risk assessment
• Supplier assessment
• System component assessment and definition of architecture
• Specification
• Planning
• Review and approval of specifications
• Development of test strategy
• Design review
• Configuration management
• Testing / Qualification
• Review and approval of qualification
• Reporting
• System release / approval for use

In addition, system maintenance and retirement should be discussed in this validation plan, but are carried out after the initial validation is completed. Qualification should include ensuring that the appropriate documents are written and approved prior to the system moving into the operational phase of its life cycle.

7. Quality Risk Management

Describe the risk assessment process used, or refer to the site SOP. Record the outcome of the risk assessment here and reference the detailed risk assessment. List any high-level assumptions, implications or actions required as a result of assigning the relevant category, e.g. some aspects of validation may have been addressed by the vendor.

7.1. Initial Risk Assessment

This section should describe the outcome of the initial risk assessment, which should have determined whether the system is GxP regulated, whether it requires validation, its high-level impact on product quality, safety and data integrity, and whether there is a need for a further, more detailed risk assessment.

The initial risk assessment has already been completed as a precursor to the project. The assessment identified that the [Computer System Name] system requires validation. The [Computer System Name] has been assessed for its GAMP computerised system categorisation. The conclusion is that the [Computer System Name] system is GAMP Category x and is a GxP system. Refer to the [Risk Assessment SOP] and to the initial risk assessment carried out in compliance with that SOP at the beginning of this project.

Some details of the initial risk assessment may be included in the table below, if useful. The table lists the activities that are supported by [Computer System Name]. The second column records where the activities are regulated activities, as determined by [the relevant guide], and gives the associated regulation. A brief description of the requirement is given in the third column, and a description of how the [Computer System Name] supports the regulated activity is given in the last column.

Activity | Associated Regulation | Requirements | How [Computer System Name] Supports the Activity


7.2. Detailed Risk Assessment

A detailed risk assessment that looks at the risks associated with specific functions of the system will be required in most cases. Refer to the site SOP for risk management, which should detail the method to be used, e.g. FMEA or FMECA. Refer to the detailed risk assessment and discuss the outcome of the assessment.

7.3. Project Risk Management

This section should focus on the project management risks; it is not intended to cover risk assessment of the system being installed, which should be discussed in the risk assessment sections above. Project risk management should assess all project risks, such as availability of resources, late delivery of hardware, system failure or incompatibility, additional testing requirements, roll-back strategy and changes in scope. The following project risks have been identified and are intended to be mitigated using the strategy listed in the Risk Mitigation column.

Risk Description | Risk Mitigation

8. Deliverables

Include references to all of the validation documentation that has been or will be written, such as the project quality plan, user requirements specification, functional specification, configuration specification, requirements traceability matrix, installation qualification, operational qualification, performance qualification and validation summary report. Wherever possible and appropriate, leverage the vendor's work by using their documentation.

As part of the validation activities, the following documents will be produced and will form the validation package that will be maintained for the life of the system. This document set will be available for audit by regulators or customers and will demonstrate that the [Computer System Name] has been implemented in accordance with current regulatory expectations. The table below provides a list of deliverable project documents and responsibilities for writing them. Modify the table below as required to list the documents that will be or have been prepared.



8.1. [Computer System Name] Validation Documents

Document Deliverable | Description | Author | Reviewer(s) | Approver(s)
Initial Risk Assessment
Supplier Assessment and Report
Supplier validation documentation (reviewed)
Detailed Risk Assessment
User Requirements Specification
Validation Plan (this document)
Functional / Configuration Specification(s)
Requirements Traceability Matrix
Installation Qualification Test Protocol(s)
Operational Qualification Protocol(s)
Performance Qualification Test Protocol
Data Migration Plan / Report
Validation Summary Report

8.2. Supplier Assessment and Report

The supplier of the [Computer System Name] system will be sent a quality assessment for their quality assurance representative to complete. The result of the assessment and any conclusions will be added to a report, created as one of the project deliverables. Alternatively, it may be decided that an on-site audit is required.

8.3. System Manuals

Itemise and cross-reference any manuals that are provided with the system by the supplier. Information relating to the functional description and configuration may be sourced from these documents, where needed, and added to formal specification and configuration documents. Manuals include the online help files and vendor installation documents.

8.4. User Requirements Specification

Amend as required. It may be necessary to specify testing of more functionality if, for example, the vendor assessment was not satisfactory.

The User Requirements Specification (URS) will list the attributes that the system is required to deliver. The attributes may be in the form of functional or performance requirements, and only those requirements that are relevant to the critical functions or safety of the system (refer to the risk assessment in Section x above) will be tested.

8.5. Functional / Configuration Specification

Amend as needed. For small systems, the configuration may be documented in the Installation Qualification protocol. For larger or more complex systems, it may be desirable to document the configuration in a separate document.

When the functional and performance requirements of the [Computer System Name] system have been identified, the configuration of the system that will deliver those requirements will be identified. The configuration details will be added to the Functional / Configuration Specification. This will provide a basis for the system to be configured and then tested against.

8.6. Requirements Traceability

This section is intended to prompt the writer to document how traceability will be handled in the project: whether by means of a separate requirements traceability matrix, or built into the specification documents and protocols. Small projects may build the traceability into the specification documents and protocols but, for large projects, a separate Requirements Traceability Matrix will typically be required. In either case, explain in this section how this will be managed for the project. Traceability may also need to be established for any vendor documents that have been used in order to reduce the validation effort.

Provide a traceability matrix linking the test cases to be developed to a location within the appropriate pre-qualification document (System or Design Specifications) where the testing is performed. For instance, Installation Qualification testing is executed to verify and document that system components are combined and installed in accordance with specifications, supplier documentation and site requirements.

The test cases described in the protocols are linked back to the system requirements specification / functional description / user requirements specification (document reference xxx) via the requirements traceability matrix (document reference yyy). The requirements traceability matrix will remain a 'live' document throughout the validation project. It will list the user, functional and design requirements and provide a link to the tests in the protocols [test documents].

8.7. Qualification Protocols

Describe what documents will be produced to carry out testing of the system. Include reference to vendor documentation and how information from it will be used, if applicable.

IQ (Installation Qualification), OQ (Operational Qualification) and PQ (Performance Qualification) protocols will be written for this project, as detailed in Sections x, y and z below.

8.8. Data Migration Plan / Report

If applicable. This is required when existing data is to be transferred to the 'new' system, such as in the case of a system upgrade.

A data migration plan will be written for this project to document the approach required to assure the quality and content of data transferred from the old system to the new system. Refer to Section x below.

8.9. Validation Summary Report

A validation summary report will be written at the end of the project to summarise the outcome of the validation and the system implementation. This will include a statement of the system's suitability for intended use and any limitations found during or resulting from the system implementation. It will also state whether the system has been formally released for routine use. Refer to Section x below.

8.10. Source Code Review

In most cases this section will not be applicable, as a source code review is typically only required for the validation of GAMP Category 5 systems. It is an important part of the SDLC but should be carried out by the supplier as part of their QMS. The supplier assessment should indicate whether this has been carried out satisfactorily by the supplier, and hence determine whether additional testing or review is required during validation.

9. Testing Approach

9.1. Roles and Responsibilities

The following roles and responsibilities apply specifically to testing. Amend the table content as required.

Validation Specialist
• Write test protocols
• Execute IQ test scripts
• Oversee OQ and PQ testing

User or SME
• Execute OQ and PQ test scripts
• Raise test incidents during OQ and PQ testing

IT Manager or delegate
• Approve all validation documentation
• Perform IT-related tests during OQ and PQ, e.g. data back-up and restoration
• Complete the data migration and associated data integrity testing

Quality Manager or delegate
• Approve all validation documentation
• Review and approve all completed tests
• Approve the completed incident investigations



9.2. Test Strategy

This section should describe how testing is intended to be carried out. This may include whether functional, configuration or user testing is required, and any impact on testing as a result of a phased or a 'big bang' implementation. Consider any issues arising from the supplier assessment report: a marginal supplier assessment may require additional testing to be carried out. For large or complex system implementations a separate test plan document is typically expected and is recommended. Where possible, leverage any testing carried out by the supplier and support that, where appropriate, with the results of the supplier assessment.

The high-level strategy is to perform three types of testing on the system: IQ, OQ and PQ, in that order. The initial assessment given in Section x above has identified the basic areas of [Computer System Name] system operation which are regulated by the regulatory authority. The functions of the [Computer System Name] software that relate to those regulated areas of system operation will be tested thoroughly within the qualification of the system.

9.2.1.1. Test Environment

Discuss the need for a test environment. Consider the testing approach and what will be required for installing the new software. For instance, Production may continue to use the existing version of the system while the new / upgraded system components are tested at IQ and OQ. Similarly, a test environment might be created using a selection of the upgraded / new components, which will be installed on the Company network. For the purposes of testing, a test database instance might be created on a new system database server. The PQ stage of testing should not be performed in the test environment, but rather on the updated and configured components in their final locations. Detail the information in a test plan and reference it here if a separate document is being prepared.

9.2.1.2. Supplier Test Activities

This section should provide more detail about the supplier testing than that already discussed above.

The software vendor is expected to have performed adequate testing of the functions of the system. This will be verified in the supplier assessment report, which should include verification that the supplier or the company producing the software package operates under an appropriate quality management system. Any deficiencies identified in the supplier testing may require more comprehensive system and user testing. Conversely, an excellent supplier assessment may enable some testing to be omitted, especially if there is satisfactory evidence that the supplier has carried out the testing and it is transferable to the site implementing the software.

9.2.1.3. Data Migration

Describe the process and approach that will be required for any data migration activities. Consider how and when the data will be transferred, how it will be verified, and whether any data clean-up activities will be carried out at the time. Write and refer to a data migration plan if more detail is required than suits this document (i.e. for large or complex implementations).
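To illustrate one common verification technique (a sketch, not a requirement of this template), migrated records can be compared field by field between extracts from the old and new systems, keyed by record identifier; the record layout and IDs below are hypothetical:

```python
# Hypothetical sketch: compare a source-system extract against the
# target-system extract after migration. Record layouts are illustrative.

def verify_migration(source, target):
    """source/target map record IDs to dicts of field values.
    Returns a list of (record ID, problem) discrepancies."""
    issues = []
    for rec_id, fields in source.items():
        if rec_id not in target:
            issues.append((rec_id, "missing in target"))
        elif target[rec_id] != fields:
            issues.append((rec_id, "field mismatch"))
    # Records present only in the target are also suspect.
    for rec_id in target:
        if rec_id not in source:
            issues.append((rec_id, "unexpected in target"))
    return issues

old = {"B001": {"name": "Batch 1", "status": "Released"}}
new = {"B001": {"name": "Batch 1", "status": "Released"}}
print(verify_migration(old, new))  # → [] (no discrepancies)
```

Any automated comparison of this kind would itself need to be documented and approved under the data migration plan before its results could support the migration report.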



9.3. Installation Qualification

Edit as required.

IQ will be performed for all new or upgraded computer hardware and software associated with the system. The hardware components will be tested to record their details and to verify that they meet the minimum performance requirements of the [Computer System Name] application. IQ testing will verify that the [Computer System Name] and other related or important applications, such as anti-virus and back-up software, have been installed, are the correct versions and have been configured as detailed in the pre-approved specification(s) [cross-reference the applicable documents]. The IQ protocol will also identify what training and operating procedures are required for the [Computer System Name] system to allow it to enter live production. IQ testing cannot start until the URS and the configuration specification have been approved, and the IQ protocol has been pre-approved.

9.4. Operational Qualification

Edit as required.

The [Computer System Name] system will be tested at OQ to verify that it operates as specified in the pre-approved specification(s) throughout all specified operating ranges. The testing will focus on the following critical areas:
• Security
• Operator data entry
• Functionality – correct configuration of calculations
• Critical alarms
• Reporting
• Data back-up and restoration

OQ testing cannot start until IQ testing has been completed and the OQ protocol has been pre-approved.

9.5. Performance Qualification

Edit as required. Describe how PQ will be carried out. It may or may not be possible to duplicate the live environment, so PQ may need to be carried out after the system is in use. Also discuss any post go-live monitoring that may be required and how the results of such monitoring will be reported, e.g. an interim summary report prior to PQ testing followed by an additional VSR.

The [Computer System Name] system will be tested at PQ to verify that it operates as specified in the end environment, using the approved procedures. The PQ will incorporate stress testing, which will simulate the higher levels of data entry and processing that are likely to be encountered during production. The PQ will include a test to confirm that all system-related procedures have been approved for use in live production. The PQ protocol will also be used to verify that adequate user training has been performed to allow the system to enter live production. PQ testing cannot start until OQ testing has been completed and the PQ protocol has been pre-approved.



9.6. Test Execution

Test execution will only be conducted using test scripts which have been pre-approved by the groups identified in the table in Section x above. The original approved test script will be photocopied and the working copy uniquely identified as iteration ##. The working copy will be handed to the tester(s) for execution and recording of results. Screen shots and other evidence will be taken for the direct impact tests to support the test outcome.

9.6.1. Test Acceptance Criteria

A test will be considered complete when the actual results meet the expected results, or when all related incidents have been closed out.

10. Acceptance Criteria

Describe how the system will be formally released for use. Include how any significant deviations or incidents will be handled.

e.g. The system will be considered ready for release into live production use when:
• all tests have been successfully completed
• all related incidents have been closed or otherwise justified as acceptable
• the system has been reviewed and assessed as fit for its intended purpose
• the [Company Pty. Ltd. QA manager] (or delegate) has approved the system for live production
• the system has been formally released by the authorisation of a system release form.

11. Validation Reporting

After all validation testing has been completed, a validation summary report (VSR) will be written and approved. The outcome of the testing activities will be recorded in the validation report, together with a list of any incidents that occurred. The validation report will state whether the system is ready to enter live production. The [Computer System Name] system cannot enter live production until the validation report has been approved. However, an interim report and / or a release certificate may be used in some cases to document the approval for the system to be used before the final summary report has been signed off, e.g. where performance monitoring is to be carried out for some time after the system has gone live.

12. Supporting Processes

12.1. Change Control

Changes to the configuration of any of the software components or hardware associated with the [Computer System Name] will be controlled by [e.g. the site change control procedure]. The change control reports will reference the validation documents associated with the change. For large projects, a separate project change control procedure is recommended.


12.2. Incident Reporting

Use the site incident or computer systems incident management reporting system, if appropriate, or create a separate system for the project.

When the actual results of a test do not match the expected results, or an unexpected result or occurrence is encountered, an incident report will be raised in accordance with the [project / site] incident reporting procedure. The details of the incident must be added to an ‘Incident Report Form’. Refer to [Section X below for an example of an incident report form or to the site procedure]. The incident will be investigated and resolved. On resolution of the incident, the incident report will be closed out with a final approval from the QA representative. If changes are required to the [Computer System Name] system as a result of a test incident investigation, they will be applied using the [site or project] change control procedure.

Incidents that occur during testing will be categorised according to the following definitions.

Incident category    Definition
Minor                Cosmetic / formatting problem or protocol error
Major                System functionality incorrect
Critical             System crash, data loss, or security or safety of the system affected

12.3. Documentation and Procedures

All formal validation documents identified in Section X Deliverables will be written and circulated for approval in accordance with the site procedure, SOPXXX.

The following information could be replaced by a reference to an SOP on good documentation practice. A site SOP may be referenced or a separate project SOP may be appropriate.

12.3.1. Test Protocols

The following good documentation and testing practices must be applied in the execution of tests:
• instructions must be followed as written
• to obtain a ‘Pass’ result, the actual results must be sufficient to prove the expected results have been met
• actual results must correspond to the expected results
• any failed test results changed to ‘Pass’ must be adequately explained, approved by QA, and supported by the appropriate incident records, if applicable
• screen shots and printouts must be annotated with the relevant test step
• result cells in the protocol must be filled out with the appropriate information or crossed out as ‘N/A’, not left blank
• test results must be written in indelible ink
• errors must be crossed out with a single line, the correction made, and signed and dated; the original information must remain legible after the correction is made; an explanation detailing the reason for the strike-out must be included at the point of the amendment


• typographical protocol errors can be amended by hand, and initialled and dated, but must be initialled and dated by the same people who approved this VP.

Printouts and any other documented evidence to support the testing must be appended to the corresponding test protocol. The evidence must be annotated with:
• the protocol document number [or other unique ID],
• an attachment number, e.g. ‘Appendix 1’,
• a cross reference to the relevant test(s),
• page numbering of the full set of attached pages, i.e. ‘page x of y’, and
• the initials of the person appending the evidence and the date the attachment is added.

Each complete test requires a set of initials and date from the tester, and a second set from a reviewer. The reviewer will verify that the test result meets the acceptance criteria.

12.3.2. Operating Procedures

Identify the standard operating procedures that will be required for the system, including those that exist but require updating. Also identify responsibilities for their production, review and authorisation.

The required procedures for the [Computer System Name] system are:
• SOPXXX
• SOPXXX

The identified procedures are expected to include operation and administration of the system, maintenance and support, security and access, data backup, data archiving and disaster recovery.

12.4. Project Training

Refer to any site training plans or policies. Staff should be trained in validation procedures where applicable to their role.

All project personnel involved in the validation activities of this project will be trained in this validation plan and the relevant site validation SOPs and policies. Refer to [site SOP / VMP].

13. Periodic Review

Periodic review of the system is required and should include an assessment of the need for any additional validation.

The [Computer System Name] system will be assessed for the need for revalidation during a periodic review carried out at least every x years / months or whenever a significant change occurs. Risk introduced by any changes should be assessed and mitigated at the time of the change, and considered again during the review. The system risk assessment should also be reviewed at this time and updated as required.

14. Decommissioning

List any systems to be decommissioned as a result of this system being implemented. Discuss how they will be decommissioned, e.g. access to and storage of old data, and refer to any decommissioning plans that are required.


15. Glossary

Edit the following table to include all company-specific acronyms and any technical terms that may not be understood by the intended audience.

CS: Configuration Specification

DS: Design Specification

FDA: Food and Drug Administration (USA)

FS: Functional Specification

GAMP: Good Automated Manufacturing Practice. GAMP guidance aims to achieve validated and compliant automated systems meeting all current healthcare regulatory expectations, by building upon existing industry good practice in an efficient and effective manner.

GMP: Good Manufacturing Practice

GxP: GxP encompasses a number of best practice principles applicable to the pharmaceutical industry, including GMP (Good Manufacturing Practice), GLP (Good Laboratory Practice), GCP (Good Clinical Practice) and GDP (Good Distribution Practice).

IQ: Installation Qualification. The IQ is a development lifecycle system testing stage, linked to the testing of the design / configuration specification document. The IQ provides documented evidence that the system and its components have been installed according to written and pre-approved specifications.

OQ: Operational Qualification. The OQ is a development lifecycle system testing stage, linked to the testing of the FS and DS documents. The OQ provides documented evidence that the system operates according to pre-approved specifications throughout all operating ranges including, where appropriate, realistic stress conditions and alarm and error handling.

PIC/S: Pharmaceutical Inspection Co-operation Scheme

PQ: Performance Qualification. The PQ is a development lifecycle system testing stage, linked to the testing of the FS document. The PQ provides documented evidence that a system is capable of performing or controlling the activities of the processes it is required to perform or control, according to written and pre-approved specifications, while operating in its specific operating environment.

QMS: Quality Management System

RTM: Requirements Traceability Matrix

SDLC: Software Development Lifecycle


SME: Subject Matter Expert

SOP: Standard Operating Procedure

TGA: Therapeutic Goods Administration (Australia)

UPS: Uninterruptible Power Supply

URS: User Requirements Specification. The URS describes what the system is supposed to do, not how to do it, and is typically written by the users / system owner. The first version of the URS can be used as an invitation-to-tender document for potential suppliers. After the system / supplier is selected, the document can be revised so that user acceptance documentation can be more readily developed. It should include all essential requirements, and prioritise desirable requirements. The URS relates to the user acceptance testing phase or PQ phase of the validation process.

VMP: Validation Master Plan

VP: Validation Plan



16. Referenced Documents

Document number    Title

DOCUMENT END




Change History

Version number    Date      Change control number    Description of change
01                MMM-YY                             First issue.

Insert change history information for each new version of the document.


