Software qualification and validation starts with a User Requirements Specification (URS). When either management (by policy) or an employee (with cause) requests that a controlled process be instigated, or an existing controlled process requires a significant modification (a minor modification would be handled under change control), a URS document is raised. Over the course of several meetings the URS is fleshed out to document all aspects of the requirement.
It is important that
the document is properly scoped in order that the procurement, installation,
commissioning, validation, user training, maintenance, calibration and cleaning
tasks are all investigated and defined adequately.
To scope and define an adequate validation procedure, the URS must be sufficiently detailed for various assessments to be made. The main assessment that concerns us in software qualification and validation is the risk assessment. This assessment is solely concerned with ensuring that the degree of validation proposed is compliant with the regulatory requirements.
So at this early stage a risk assessment must be executed against the URS to determine which category the software falls into.
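The categorisation step can be illustrated with a minimal sketch. The question names (`bespoke`, `gxp_critical_records`) and the decision rule below are assumptions chosen for illustration, not the actual logic of any published VRA:

```python
# Minimal sketch of a VRA categorisation step (illustrative only).
# The inputs and the decision rule are assumptions, not the real
# VRA document's assessment criteria.

def categorise_software(bespoke: bool, gxp_critical_records: bool) -> str:
    """Return a proposed validation category for a software item."""
    if bespoke and gxp_critical_records:
        return "FLCV"      # Full Life Cycle Validation
    if gxp_critical_records:
        return "Standard"  # conventional IQ/OQ/PQ verification only
    return "None"          # outside validation scope

print(categorise_software(bespoke=True, gxp_critical_records=True))  # FLCV
```

In a real assessment the answers, and the resulting category, would be recorded in the VRA document as the justification for the validation depth chosen.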
To learn more about the contents of this Validation Risk Assessment, click here: VRA
To purchase a ready-to-use Validation Risk Assessment document, click here: VRA
It is a mandatory requirement that certain aspects of this assessment are documented and held as a regulatory record. These are:
Does the software require Full Life Cycle validation?
If the data is required to be Part 11 compliant, how is that to be achieved?
The answers to these questions are required to enable the Validation Plan (VP) to provide sufficient information to the protocol writers, enabling them to define the correct content for the Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ) protocols.
The VP is then used by the protocol writers as the official mandate for protocol content.
The outcome of the VRA drives a split in software validation scope. If the VRA categorises the software as requiring Full Life Cycle Validation (FLCV), then a considerable amount of the validation effort is put into establishing how the software originated and was designed and developed, in order to establish that its basic concept and development can be considered robust, sound, and in accordance with best practices.
The original development plans, code reviews, methods reviews, and testing plans must be available to enable this software qualification to be executed successfully. Once this proof of quality build is established, validation then follows the more conventional path of code, procedural, and security verification, together with functional inspections and verification.
Software that is not classified as requiring FLCV treatment does not require this depth of verification into its quality build history, and is validated mainly by that more conventional path of code, procedural, and security verification, together with functional inspections and verification.
Dynamic testing verifies the execution flow of software, including decision paths, inputs, and outputs. It involves creating test cases, test vectors, and oracles, then executing the software against these tests. The results are compared with the expected or known correct behavior of the software. Because the number of execution paths and conditions increases exponentially with the number of lines of code, testing every possible execution trace and condition of the software is impossible.
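The test-vector-and-oracle pattern can be sketched briefly. Here `dose_per_kg` is a hypothetical function standing in for the software under test, and the vectors pair each input with the oracle's expected output:

```python
# Illustrative sketch of dynamic testing: execute the software under
# test against test vectors and compare results with an oracle.
# `dose_per_kg` and its 0.5 dosing rule are invented for this example.

def dose_per_kg(weight_kg: float) -> float:
    """Software under test: compute a dose from body weight."""
    return round(weight_kg * 0.5, 2)

# Each vector pairs an input with the expected (oracle) output.
test_vectors = [(10.0, 5.0), (72.4, 36.2), (0.0, 0.0)]

results = [(w, dose_per_kg(w), expected) for w, expected in test_vectors]
failures = [r for r in results if r[1] != r[2]]
print("PASS" if not failures else f"FAIL: {failures}")  # prints PASS
```

Real qualification testing records each executed vector, its actual result, and the pass/fail decision as documented evidence, rather than a single summary line.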
Code inspections and testing can reduce coding errors; however, experience has shown that the process needs to be complemented with other methods. One such method is static analysis. This somewhat new method largely automates the software verification process. The technique attempts to identify errors in the code, but does not necessarily prove their absence. Static analysis is used to identify potential and actual defects in source code.
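As a toy illustration of static analysis, the sketch below uses Python's standard `ast` module to flag bare `except:` clauses in source code without executing it. Production static analysers and abstract interpreters are far more sophisticated; this only shows the principle of inspecting code structure rather than running it:

```python
# Toy static-analysis pass: parse source code into a syntax tree and
# flag bare `except:` clauses, without ever executing the code.
import ast

SOURCE = """
try:
    risky()
except:
    pass
"""

def find_bare_excepts(source: str) -> list:
    """Return the line numbers of bare except clauses in the source."""
    tree = ast.parse(source)
    return [node.lineno for node in ast.walk(tree)
            if isinstance(node, ast.ExceptHandler) and node.type is None]

print(find_bare_excepts(SOURCE))  # [4]
```

Note that `risky()` is never called: the defect is found purely by inspecting the parsed structure, which is what lets static analysis scale to code paths that dynamic testing cannot reach.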
A code verification solution that includes abstract interpretation can be instrumental in assuring software safety and a good quality process. It is a sound verification process that enables the achievement of high integrity in embedded devices. Regulatory bodies such as the FDA and some segments of industry recognize the value of sound verification principles and are using tools based on these principles.
It now appears that the current FDA guidance pertaining to 21 CFR Part 11 may be with us for longer than was originally anticipated. We have therefore incorporated the suggestions from the FDA's latest guidance document into this Validation Risk Assessment (now Issue 10). The VRA document now comes with a downloadable matrix for registering the justification for all your Part 11 assessments; documenting these justifications is now a mandatory requirement.
"Part 11 will be interpreted narrowly; we are now clarifying that fewer records will be considered subject to part 11." (extract from FDA document)
This would appear to mean that the original all-embracing approach to all electronic records is to be dropped, and the rigors of Part 11 applied only to the data that directly affects product quality and safety, and the integrity of product records.
"For those records that are to remain the subject of part 11, we will apply discretion in the validation of audit trails, record retention, and record copying." (extract from FDA document)
This would appear to mean that if your in-house document control is secure and robust, then it will be acceptable in the matter of audit trails, record retention, and record copying.
"We will enforce all predicate rule requirements, including predicate rule record and record keeping requirements." (extract from FDA document)
This is a statement to the effect that in all other ways 21 CFR Part 11 will be enforced.
"We recommend that, for each record required to be maintained under predicate rules, you determine in advance whether you plan to rely on the electronic record or paper record to perform regulated activities. We recommend that you document this decision (e.g., in a Standard Operating Procedure (SOP), or specification document)." (extract from FDA document)
This is a very popular 64-page document that can be swiftly and simply edited down for smaller systems or scaled up for larger ones.
The IQ section establishes documented verification that key aspects of the computer system adhere to approved design intentions and that the recommendations of the regulators have been suitably considered. The OQ section establishes documented verification that the installed system functions as specified, and that there is sufficient documentary evidence to demonstrate this. The PQ section gives documented verification that the computer's performance in its normal operating environment is consistently as specified in the URS.
Computer Vendor Audit (Issue 3.) -- $105.00
This Computer Vendor Audit document should be customized using the built-in tools; the document can then be targeted to reflect your project priorities. The fifteen chapters each contain 10 questions, and the total score is weighted to reflect your priorities. By assessing the importance of each chapter's subject to your project, the weighting is altered, taking points from one chapter and adding them to others. This enables your assessment to be expressed simply and clearly as a percentage, allowing clear, unambiguous comparisons to be presented between competing companies.
CSV OQ (Issue 6.) -- $115.00
Operational Qualification (OQ) is an important step in the overall validation and qualification process for software and computer systems. Our protocol leads you through the detailed requirements, progressively and simply.
Computer Performance Qualification (Issue 7.) -- $87.00
The Computer Performance Qualification is the culmination of the validation process. The protocol is used in conjunction with the system operating SOP to verify that the system process is consistent and correct. The results of the testing must be recorded and reviewed to ensure that any deviations (within permitted tolerances) are random, and not a trend that will lead to out-of-specification operation during production use.