Five points you need to understand about software validation

Validation of calibration software, as required by ISO 17025 for instance, is a topic that people do not like to talk about. Often there is uncertainty concerning the following: Which software actually needs to be validated? If it does, who should take care of it? Which requirements must the validation fulfil? How can it be carried out efficiently, and how should it be documented? The following post explains the background and gives a recommendation for implementation in five steps.
In a calibration laboratory, software is used for many purposes, ranging from support of the evaluation process to fully automated calibration. Whatever the degree of automation, validation always refers to the entire process into which the program is integrated. Behind validation, therefore, lies the fundamental question of whether the calibration procedure fulfils its purpose and achieves all its intended goals, that is to say: does it provide the required functionality with sufficient accuracy?
Before starting validation tests, you should be aware of two basic principles of software testing:
Full testing isn’t possible.
Testing is always dependent on the environment.
The first principle states that testing all possible inputs and configurations of a program is not feasible, simply because of the enormous number of possible combinations. Depending on the application, the user must therefore decide which functions, configurations and quality characteristics are to be prioritised and which are not relevant for them.
Which decision is made often depends on the second point: the operating environment of the software. In practice, each application brings different requirements and priorities for how the software is used. There are also customer-specific adaptations of the program, for example regarding the contents of the certificate. The individual conditions in the laboratory environment, with its particular set of instruments, add further variance. This wide range of requirement perspectives and the practically endless number of program configurations in customer-specific applications make it impossible for a manufacturer to test for every need of a particular customer.
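One way to make this prioritisation tangible is to write down, explicitly, which calibration configurations actually occur in one's own laboratory and which of them must be covered by validation. The following is a minimal sketch only; the instrument types, measuring ranges and priorities are hypothetical examples and must be replaced by those of the specific laboratory.

```python
# Minimal sketch: making the validation scope explicit for one laboratory.
# All instrument types, ranges and priorities below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class CalibrationConfiguration:
    instrument: str       # e.g. pressure transmitter, data logger
    measuring_range: str  # range actually used in the laboratory
    priority: str         # "high" = must be validated, "low" = out of scope

# Only configurations that occur in this laboratory's daily work are listed.
scope = [
    CalibrationConfiguration("pressure transmitter", "0 ... 10 bar", "high"),
    CalibrationConfiguration("pressure transmitter", "0 ... 400 bar", "high"),
    CalibrationConfiguration("digital pressure gauge", "0 ... 25 bar", "low"),
]

# The validation test sets are then derived from the high-priority entries.
to_validate = [c for c in scope if c.priority == "high"]
for c in to_validate:
    print(f"validate: {c.instrument}, {c.measuring_range}")
```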
Taking the above points into account, validation is therefore the responsibility of the individual user. To make this task as efficient as possible, a procedure based on the following five points is recommended (a minimal code sketch of the workflow follows the list):
The data for typical calibration configurations should be defined as "test sets".
At regular intervals, typically once a year, and in any case after every software update, these test sets should be entered into the software.
The resulting certificates can then be compared with those from the previous version.
For a first validation, a cross-check, e.g. using MS Excel, can be performed.
The validation evidence should be documented and archived.
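As an illustration of these steps, the sketch below shows how a defined test set could be entered into the software, how the resulting certificate values could be compared against reference values from the previous, already validated version, and how the outcome could be documented. The function run_calibration_software is a placeholder for the actual program; all names, tolerances and values are hypothetical.

```python
# Minimal sketch of the recommended validation workflow (hypothetical values):
# enter a test set into the software under test, compare the reported results
# against reference values from the previous version, and archive the evidence.

import csv
import datetime

# Step 1: a "test set" of typical calibration data (hypothetical values).
test_set = {
    "instrument": "pressure transmitter 0 ... 10 bar",
    "applied_pressures_bar": [0.0, 2.5, 5.0, 7.5, 10.0],
}

# Reference results produced by the previous, already validated version.
reference_errors_bar = [0.000, 0.002, 0.003, 0.002, 0.001]

def run_calibration_software(test_set):
    """Placeholder: enter the test set into the software under test and
    return the measurement errors it reports on the certificate."""
    return [0.000, 0.002, 0.003, 0.002, 0.001]  # hypothetical output

# Steps 2 and 3: enter the test set and compare against the previous version.
tolerance_bar = 0.001
new_errors = run_calibration_software(test_set)
deviations = [abs(n - r) for n, r in zip(new_errors, reference_errors_bar)]
passed = all(d <= tolerance_bar for d in deviations)

# Step 5: document and archive the validation evidence.
with open("validation_record.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "test set", "max deviation / bar", "result"])
    writer.writerow([datetime.date.today().isoformat(),
                     test_set["instrument"],
                     max(deviations),
                     "PASS" if passed else "FAIL"])

print("validation", "passed" if passed else "failed")
```

For a first validation, where no certificates from a previous version exist yet, the reference values would instead come from an independent cross-check, for example a manual calculation in MS Excel.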
WIKA provides PDF documentation of the calculations performed in the software.
Note
For further information on our calibration software and calibration laboratories, visit the WIKA website.
