
SOP Computer System Validation

Use this SOP Computer System Validation template to guide the systematic validation of software and computer systems that impact your medical device quality management system or critical production processes. This document is essential for ensuring regulatory compliance (such as ISO 13485, GAMP5, 21 CFR Part 11, MDR/IVDR) and should be completed whenever new software is implemented, updated, or decommissioned to document risk-based validation planning, testing, deployment, maintenance, and retirement activities. Proper use of this SOP helps maintain data integrity, product safety, and continuous regulatory compliance throughout your software lifecycle.

SOP Computer System Validation

ID: lorem ipsum dolor

1. Purpose

The purpose of this SOP is to describe the process for validation of computer and/or software systems to be used for product or organizational processes with a potential impact on the organization's quality management system or performance and safety of medical products produced by the organization.

2. Scope

This SOP covers software used for the quality management system and for production and service provision as defined by ISO 13485:2016 clauses 4.1.6, 7.5.6, and 7.6, together with ISO/TR 80002-2:2017, the GAMP5 methodology, 21 CFR Part 11 compliance requirements, MDR 2017/745, and IVDR 2017/746.

3. Qualification of Software for Validation

3.1 Quality Management Software

To determine whether a software system is in scope for the quality management system, first consider a high-level definition of the process and of the intended use of the software. The following two criteria should be considered when qualifying software for validation related to the quality management system:

  • Could a failure or latent flaw in the software affect the safety or quality of the medical device(s)?
  • Does the software automate or execute an activity required by regulatory requirements (in particular, the requirements for the quality management system)? Examples include capturing electronic signatures and/or records, maintaining product traceability, performing and capturing test results, and maintaining data logs such as CAPA, non-conformances, complaints, and calibrations.

If either criterion is true for the software, it shall be validated within the scope of this SOP.

Some software may contain only limited functionalities that fall within the medical device regulatory requirements. In this case, an analysis shall be performed to determine which parts of the software are in scope and which are not. Only the in-scope parts of the software shall be validated.
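
For illustration only, the following sketch (in Python, using hypothetical names not defined by this SOP) shows how the two qualification criteria above could be applied as a simple decision rule:

    # Illustrative sketch of the two-criteria qualification decision in section 3.1.
    # All names are hypothetical; this SOP does not prescribe any implementation.
    from dataclasses import dataclass

    @dataclass
    class SoftwareAssessment:
        name: str
        failure_affects_device_safety_or_quality: bool  # criterion 1
        automates_regulatory_activity: bool             # criterion 2 (e.g. e-signatures, traceability, CAPA logs)

    def requires_validation(assessment: SoftwareAssessment) -> bool:
        """The software is in scope of this SOP if either criterion is true."""
        return (assessment.failure_affects_device_safety_or_quality
                or assessment.automates_regulatory_activity)

    # Example: a hypothetical eQMS tool that captures electronic records
    print(requires_validation(SoftwareAssessment("Hypothetical eQMS tool", False, True)))  # True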

3.2 Software for Production or Service Provision

The organization shall validate software used in processes for production and service provision where the resulting output cannot be or is not verified by subsequent monitoring or measurement and where, as a consequence, deficiencies become apparent only after the product is in use or the service has been delivered. If the software does not meet these criteria, it is not subject to validation.

3.3 GAMP5 Categorization of Software

The software may be categorized according to GAMP5 guidelines to determine the appropriate level of validation effort. The categories are as follows:

  • Category 1: Infrastructure Software - Operating systems, databases, and other infrastructure software.
  • Category 3: Non-configurable Software - Commercial off-the-shelf (COTS) software that is used as-is without any configuration.
  • Category 4: Configurable Software - COTS software that is configured to meet specific user requirements.
  • Category 5: Bespoke Software - Custom-developed software specifically designed to meet user requirements.

The categorization will help in determining the computer system validation (CSV) approach and the extent of documentation required.
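
As a non-normative illustration, the mapping below sketches how the GAMP5 categories might translate into a validation approach; the exact deliverables remain a risk-based decision of the organization, and the wording here is an assumption of this example rather than a GAMP5 requirement:

    # Hypothetical mapping from GAMP5 category to a suggested validation approach.
    GAMP5_VALIDATION_APPROACH = {
        1: "Infrastructure software: record the version and rely on supplier controls",
        3: "Non-configurable COTS: verify installation and key functions against requirements",
        4: "Configurable COTS: additionally verify and document each configuration setting",
        5: "Bespoke software: full life-cycle validation with design review and traceability",
    }

    def suggested_approach(category: int) -> str:
        # Unlisted categories fall back to an individual assessment.
        return GAMP5_VALIDATION_APPROACH.get(category, "Unlisted category: assess individually")

    print(suggested_approach(4))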

3.4 Risk Assessment for CSV Non-Medical Device Software

Before proceeding with the validation of non-medical device software, a risk assessment should be conducted to determine the level of validation required. The risk assessment should consider the following factors:

  • Impact on Quality Management System: Evaluate how the software impacts the organization's quality management system. Determine if the software supports critical processes or data that could affect product quality or compliance.
  • Potential for Failure: Assess the likelihood and potential consequences of software failure. Consider historical data, known issues, and the complexity of the software.
  • Regulatory Requirements: Identify any regulatory requirements that apply to the software. Determine if the software needs to comply with specific standards or guidelines.
  • User Impact: Evaluate how the software affects end-users, including employees and customers. Consider the potential for user errors and the impact on productivity and satisfaction.

Based on the risk assessment, categorize the software into one of the following validation levels:

  • Low Risk: Minimal impact on quality management system and low potential for failure. Basic validation activities such as installation qualification and basic functional testing may be sufficient.
  • Medium Risk: Moderate impact on quality management system or moderate potential for failure. More extensive validation activities, including detailed functional testing and performance testing, may be required.
  • High Risk: Significant impact on quality management system or high potential for failure. Comprehensive validation activities, including thorough functional testing, performance testing, and risk mitigation measures, are required.

The results of the risk assessment should be documented in the Software Validation Form and used to guide the validation planning and execution.
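
For illustration, a minimal sketch of the risk categorization described above is shown below; the 1-to-3 rating scale and the thresholds are assumptions of this example, not requirements of this SOP:

    # Illustrative risk categorization for non-medical-device software (section 3.4).
    from enum import Enum

    class RiskLevel(Enum):
        LOW = "Low"
        MEDIUM = "Medium"
        HIGH = "High"

    def categorize(qms_impact: int, failure_potential: int) -> RiskLevel:
        """Both inputs are rated 1 (negligible) to 3 (significant) by the quality team."""
        score = max(qms_impact, failure_potential)  # the worst factor drives the level
        if score >= 3:
            return RiskLevel.HIGH
        if score == 2:
            return RiskLevel.MEDIUM
        return RiskLevel.LOW

    print(categorize(qms_impact=2, failure_potential=1))  # RiskLevel.MEDIUM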

3.5 21 CFR Part 11 Compliance Requirements for Non-Medical Device Software Undergoing Computer System Validation (US only)

  • Validation: Ensure that the software is validated to demonstrate accuracy, reliability, consistent intended performance, and the ability to discern invalid or altered records.
  • Audit Trail: Implement secure, computer-generated, time-stamped audit trails to independently record the date and time of operator entries and actions that create, modify, or delete electronic records (an example record structure is sketched after this list).
  • Record Retention: Ensure that electronic records are retained for the required duration and are accessible for review and copying by the FDA.
  • Security Controls: Implement measures to ensure that only authorized individuals can access the system and perform operations.
  • Electronic Signatures: Ensure that electronic signatures are unique to one individual and cannot be reused or reassigned. They must also be linked to their respective electronic records to prevent tampering.
  • Operational System Checks: Use operational system checks to enforce permitted sequencing of steps and events.
  • Authority Checks: Implement authority checks to ensure that only authorized individuals can use the system, electronically sign a record, access the operation or computer system input or output device, alter a record, or perform the operation at hand.
  • Device Checks: Use device checks to determine the validity of the source of data input or operational instruction.
  • Training: Ensure that individuals who develop, maintain, or use the system have the education, training, and experience to perform their assigned tasks.
  • Documentation: Maintain documentation of the system's validation, including test plans, test results, and any changes made to the system.
  • Policies and Procedures: Establish and follow written policies and procedures to ensure compliance with Part 11 requirements.
  • Backup and Recovery: Implement procedures for data backup and recovery to protect against data loss.
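
As referenced in the Audit Trail item above, the sketch below shows one possible structure for a secure, time-stamped audit-trail entry; the field names and the hash-chaining scheme are illustrative assumptions of this example, not Part 11 requirements:

    # Hypothetical audit-trail entry: time-stamped, attributable, and chained so
    # that alteration of earlier entries becomes detectable.
    import hashlib
    import json
    from dataclasses import dataclass, asdict
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class AuditTrailEntry:
        user_id: str        # operator performing the action
        action: str         # "create", "modify" or "delete"
        record_id: str      # electronic record affected
        timestamp: str      # computer-generated UTC timestamp
        previous_hash: str  # digest of the preceding entry

        def digest(self) -> str:
            payload = json.dumps(asdict(self), sort_keys=True).encode()
            return hashlib.sha256(payload).hexdigest()

    entry = AuditTrailEntry(
        user_id="j.doe",
        action="modify",
        record_id="CAPA-0042",
        timestamp=datetime.now(timezone.utc).isoformat(),
        previous_hash="0" * 64,  # first entry in the chain
    )
    print(entry.digest())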

4. Validation Planning Processes

The Software Validation Form shall be filled out for each software system that is subject to validation. The content and rigor of the validation should be determined by the risk associated with the software.

The stages of the validation process are Define, Implement, Test, Deploy, and Maintain. The Software Validation Form shall be updated at each stage of the validation process. The validation phases are best performed sequentially, but some steps may be performed in parallel if needed.

4.1 Development Phase - Define

The Define phase of software validation begins with a description of the software and its intended use. The following information should be considered in the Software Validation Form, as needed:

  • Identification of process requirements, or the activity / process that the software will partially or fully automate
  • Process failure risk analysis
  • Process risk controls to be implemented (if required)
  • Plan for validation of the software
  • Software requirements or development life cycle (as needed)

4.2 Development Phase - Implement

The Implement phase of software validation considers how the software will be implemented into the existing infrastructure. The following information should be considered in the Software Validation Form, as needed:

  • Analysis of software risk failures and software risk controls
  • Software development steps including specification, design review, architecture development, and traceability (if required)

4.3 Development Phase - Test

The Test phase of software validation considers how the software will be tested to ensure performance and safety. The following information should be considered in the Software Validation Form, as needed:

  • Test planning and execution, including unit testing, implementation testing, regression testing, use case testing, interface testing, system testing, vendor-supplied test suite testing, robustness testing, normal case testing, output forcing testing, beta testing, performance testing, or combination-of-inputs testing (as needed)
  • Validation testing according to plan

Testing should be appropriate to the software being validated, the risk associated with the software, and the processes or quality management system aspects it is intended to automate.
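
For illustration, the sketch below shows how a normal case test and a robustness test from the list above might be written; the system under test (a traceability lookup) and its expected behaviour are assumptions used only to show how tests map back to requirements:

    # Illustrative validation test cases (pytest style) for a hypothetical
    # traceability lookup; not part of this SOP's requirements.
    import pytest

    DEVICE_HISTORY = {"SN-001": "DHR-2024-001", "SN-002": "DHR-2024-002"}

    def traceability_lookup(serial_number: str) -> str:
        """Return the device history record for a serial number; raise KeyError if unknown."""
        return DEVICE_HISTORY[serial_number]

    def test_known_serial_returns_history_record():  # normal case testing
        assert traceability_lookup("SN-001") == "DHR-2024-001"

    def test_unknown_serial_is_rejected():  # robustness testing
        with pytest.raises(KeyError):
            traceability_lookup("SN-999")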

4.4 Development Phase - Deploy

The Deploy phase of software validation considers how the software will be deployed within the organization. The following information should be considered in the Software Validation Form, as needed:

  • Procedure review (as needed)
  • Internal training and operator certification (as needed)
  • Installation qualification and acceptance testing (as needed; an example check is sketched after this section)
  • Operational and performance qualification (as needed)

Deployment should be appropriate to the software being validated, the risk associated with the software, and the processes or quality management system aspects it is intended to automate.
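
As referenced in the installation qualification item above, the sketch below shows one possible automated IQ check: confirming that the deployed package version matches the validated baseline. The package name and expected version are placeholders, not values defined by this SOP:

    # Hypothetical installation-qualification check: the installed version must
    # match the version recorded in the validation documentation.
    from importlib.metadata import version, PackageNotFoundError

    VALIDATED_BASELINE = {"example-qms-package": "2.4.1"}  # placeholder values

    def installation_qualification() -> dict:
        results = {}
        for package, expected in VALIDATED_BASELINE.items():
            try:
                results[package] = version(package) == expected
            except PackageNotFoundError:
                results[package] = False  # not installed, so the IQ check fails
        return results

    print(installation_qualification())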

4.5 Maintain Phase

The Maintain phase of software validation considers how the software will be maintained and updated. The following information should be considered in the Software Validation Form, as needed:

  • Maintenance planning (as needed)
  • List of known anomalies or issues (as needed)
  • Compatibility and infrastructure testing (as needed)
  • System monitoring (as needed)
  • Backup and recovery processes (as needed; an example check is sketched after this section)
  • Operational controls and security (as needed)
  • Regression analysis (as needed)

Maintenance should be appropriate to the software being validated, the risk associated with the software, and the processes or quality management system aspects it is intended to automate. For some well-established technologies, maintenance may be minimal or handled externally by the software vendor. For some software, simple user feedback and an occasional review of error logs is sufficient.
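
As referenced in the backup and recovery item above, the following sketch illustrates one routine maintenance check: verifying that a recent, non-empty backup exists. The backup location, file pattern, and 24-hour freshness window are assumptions of this example:

    # Illustrative maintenance check: is there a fresh, non-empty backup?
    import time
    from pathlib import Path

    BACKUP_DIR = Path("/var/backups/eqms")   # placeholder location
    MAX_AGE_SECONDS = 24 * 60 * 60           # assumed freshness window

    def latest_backup_ok() -> bool:
        backups = sorted(BACKUP_DIR.glob("*.dump"), key=lambda p: p.stat().st_mtime)
        if not backups:
            return False                      # no backups found at all
        newest = backups[-1]
        is_fresh = (time.time() - newest.stat().st_mtime) < MAX_AGE_SECONDS
        return is_fresh and newest.stat().st_size > 0

    print(latest_backup_ok())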

5. Validation Results

If the validation process is successful and the software is determined to perform as intended, the software is considered validated and can be added to the List of Validated Software. The Software Validation Form should be saved as a record within the organization's quality management system.

If the software is found unable to meet the validation requirements, it should not be integrated into the organization's processes or quality management system. The software can be re-evaluated at a later time, if needed. The Software Validation Form identifying the validation failure and the reasons for it should be saved as a record within the organization's quality management system.

6. Software Updates

If new versions of the software are released, the quality team will need to review those versions and their new features to determine whether the validation needs to be repeated. If the intended use of the software changes, the validation shall be repeated to account for the new scope of the intended use. Because the validation is risk-based, features of the software that are unchanged do not need to be revalidated.

For minor changes to the software that do not affect the intended use of the software or introduce new risks, the validation does not need to be repeated. Instead, the version of the software should be updated in the List of Validated Software prior to deployment.
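
For illustration, the update decision described in this section can be summarised as the sketch below; the inputs are simplified yes/no judgements made by the quality team, and the function name is hypothetical:

    # Illustrative summary of the software-update decision in section 6.
    def revalidation_needed(intended_use_changed: bool,
                            introduces_new_risks: bool,
                            changed_features_in_scope: bool) -> bool:
        """Return True if the update requires (partial) revalidation before deployment."""
        if intended_use_changed:
            return True   # new scope of intended use: repeat the validation
        if introduces_new_risks or changed_features_in_scope:
            return True   # risk-based revalidation of the changed features
        return False      # minor change: update the List of Validated Software only

    print(revalidation_needed(False, False, False))  # False: just update the version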

7. Software Decommission

If software that is validated and used in the organization is planned to be removed from the system, an evaluation should be performed by the quality team to determine the effects of the removal of the software on the organizational processes and the quality management system. Prior to decommissioning the software, appropriate measures should be implemented in order to ensure that the software is removed from the organization in a controlled manner with minimal risk.

New software intended to replace the decommissioned software can be implemented in parallel, although it must undergo the validation procedures outlined in this SOP before being deployed. Decommissioned software should be removed from the List of Validated Software once it has been fully removed from the organization's systems.

Mapping of Requirements