Sources of Variability
If someone forced me this afternoon to pick one concept/hot topic based on recent 483s (and warning letters), in both the clinical and commercial GXP space, I would have to settle on “understanding sources of variability” in your end-to-end process, whatever GXP process that may be. Sources of variability can be defined as steps along a data collection -> review -> reporting process where hazards (see the revised ICH Q9(R1)) exist that could undermine the scientific justification for the ultimate decision-making step. Generally, these fall into one of two categories:
Technical Hazards (e.g., system-system interface)
Procedural Hazards (e.g., access to edit the contents of a report template)
In the PIC/S Guide for Data Integrity, we find an entire section dedicated to “hybrid systems”: those processes involving computerized systems that may not be fully compliant with regulatory expectations (e.g., simple laboratory instrumentation). These are obvious sources of variability within our manufacturing/testing processes because they rely on procedural/human controls that need to be well understood and monitored appropriately. This is covered in the PIC/S Guide and should be well understood in industry.
In reality, however, in 2023 and for the foreseeable future, nearly all of our processes are in some way “hybrid”, even if the computerized system is “compliant” with the requirements found in Part 11/Annex 11. I can think of few GXP processes (collection -> review -> reporting) that are truly automated. Most of the data and metadata that eventually reach the decision-making step pass through several hazards downstream of the “original” record. Even the mostly automated processes involve at least some human intervention along the way, like the printing of a batch report following completion of an aseptic fill.
In pharma, we find it hard to be transparent about all procedural hazards. In my opinion, this is due to the serious nature of the trials we conduct and the products we manufacture. Our processes have direct health and safety implications for potentially millions of our fellow humans. To admit that a hazard in our process exists because an employee failed to print and attach all filter integrity results (despite significant training and education) is difficult. This is the universal human condition called “cognitive dissonance”: the discomfort one experiences when dealing with a conflict between our beliefs and reality. To avoid that discomfort (extreme in our case), we sometimes choose to ignore reality. Let’s break it down:
Belief: Employees working in a GXP environment understand the serious nature of their roles and responsibilities, and will act with integrity while performing their day-to-day functions.
Reality: Employees do understand the serious nature of their roles and responsibilities; however, they can only work with the tools available to them within the GXP workplace. Ultimately, the need for personal wellbeing (financial, mental, etc.) will overcome the requirements of the GXP workplace.
This second point is extremely uncomfortable – that’s the cognitive dissonance kicking in.
Hence: “understanding sources of variability”. The preamble to 21 CFR Part 11, published on March 20, 1997 (nearly 26 years ago exactly!), states that “the agency’s experience with various types of records and signature falsification demonstrates that some people do falsify information under certain circumstances”. The Agency is not disparaging the GXP workforce. On the contrary! The Agency is pushing management to acknowledge Reality and the universal human condition. It’s OK to have processes that are not perfect; sometimes, under pressure, humans act in ways they would not otherwise consider. If we are aiming for true process control, acknowledging the human condition in our data governance strategies gets us as close to perfection as possible.
Let’s do it.
Pete