Lab Manager | Run Your Lab Like a Business


Managing the Integrity of Data

Lab managers must refocus their efforts on managing data due to increased regulatory agency activity

Dan Zuccarello

Dan Zuccarello currently serves as principal consultant for RBF Consulting Group, LLC, located in Hightstown, New Jersey. During his 40+ year career in industry, Dan has established and managed pharmaceutical,...


Managing data integrity has always been difficult. Technology evolves, more complex data management systems appear, and workers interface with systems in new ways. It has taken decades to develop and finalize regulations to ensure data integrity under constantly changing conditions. Today, regulations require conformance to the principles of ALCOA (Attributable, Legible, Contemporaneous, Original, and Accurate). But even ALCOA continues to evolve: additional concepts have created ALCOA+, and more are likely to come, making the process even more confusing and challenging. Perhaps a new acronym can provide additional clarity to this difficult topic: TRUST (Tangible, Reliable, Unique, Sustainable, and Tested). In this article, we’ll explain how TRUST, along with ALCOA, can help ensure that your data systems meet integrity criteria.


Tangible

“If it’s not documented, it didn’t happen.” As a lab manager, you are routinely asked to provide tangible, documented evidence whenever data integrity is questioned. Regulatory agencies now focus on electronic dynamic data and audit trail review, which is where many current observations of non-compliance are being cited. A real-time review of electronic data, with review or approval signatures placed within the data record, is required. Think of this as the “Witnessed by:” line in hardcover notebooks. A critical point for lab managers is that older equipment may not meet current regulatory requirements. Previously, gaps in compliance with Part 11 and data integrity could be fixed procedurally. With stronger guidance on dynamic data and audit trail review, however, these workarounds are no longer permitted and are being cited during regulatory audits.
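To picture what makes an electronic record reviewable after the fact, consider this minimal sketch of an append-only, hash-chained audit trail. It is an illustration only, not any vendor's actual audit trail API; the class, field names, and signature placement are all assumptions. Each entry is attributable to a user and time-stamped contemporaneously, and chaining each entry to the hash of the previous one means any later edit or deletion breaks the chain and is detectable on review.

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical sketch of an append-only audit trail; field names and
# structure are illustrative assumptions, not a real vendor interface.
class AuditTrail:
    def __init__(self):
        self.entries = []

    def record(self, user, action, detail):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "user": user,                                         # Attributable
            "timestamp": datetime.now(timezone.utc).isoformat(),  # Contemporaneous
            "action": action,                                     # e.g. acquire, review, approve
            "detail": detail,
            "prev_hash": prev_hash,  # chains this entry to the one before it
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute every hash; any edit or deletion breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev_hash"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            payload = json.dumps(body, sort_keys=True).encode()
            if hashlib.sha256(payload).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Recording an acquisition by an analyst and an approval by a reviewer, then calling `verify()`, plays the role of the "Witnessed by:" line: the reviewer's signature entry sits inside the record itself, and tampering with any earlier entry is immediately visible.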

This change may force you to remediate observations by evaluating your current capabilities. Older software may have hidden or unused audit and data-review functions that were not tested in the original computer validation but could now be enabled. If not, consider software upgrades, or new software or hardware, that provide the functionality needed. Keep in mind that new equipment or software purchases may significantly impact your budgets.


Reliable

Data integrity is all about reliability. The computer systems, instrumentation, wiring, and connections must work together and be dependable; otherwise, your laboratory may end up reviewing nothing but blank files. For systems to be reliable, it is important to generate a user requirements specification (URS) that suits your lab’s infrastructure and workflows and informs the decision of which system to buy. Often, however, the URS is written after the purchase, and it becomes more about making the equipment work than about the users’ needs. Developing the URS before the purchase builds robustness and reliability into your system and vendor selection process.


Unique

Generating, storing, and protecting original data is a basic tenet of electronic records. Most systems can time- and date-stamp files that are created or copied from the original. To further protect the uniqueness of data and eliminate the potential to copy, alter, or delete raw data, removable storage functionality such as optical disk drives or USB ports can be disabled so that external drives cannot be used to copy data from servers. Regulators look for closed systems that cannot copy data or access the internet, and inspectors review computer systems, directories, and histories to determine whether users are moving or copying data.
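The kind of check an inspector performs, looking for data moved or copied between directories, can be approximated with a content-hash manifest. This is a hedged sketch, not a regulatory tool: hash every raw data file and flag any two paths with identical content, which may indicate a copy of an original record.

```python
import hashlib
from pathlib import Path

# Illustrative sketch: build a manifest mapping each file's SHA-256 digest
# to the paths holding that content, then flag duplicated content.
def build_manifest(root):
    manifest = {}
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest.setdefault(digest, []).append(str(path))
    return manifest

def find_duplicates(manifest):
    """Return only the digests that appear under more than one path."""
    return {d: paths for d, paths in manifest.items() if len(paths) > 1}
```

Run periodically against the raw-data share, such a manifest also doubles as evidence that original files have not been altered, since any edit changes the digest.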

The unintentional creation of unsecured networks is a growing concern within a post-COVID, remote workforce. Remote workers wanting access to network applications through virtual private networks (VPN) may accidentally create open networks capable of scheduling and starting analyses, review, processing, and even printing of data. Open systems must meet different electronic signature requirements that likely were not considered during the closed system’s validation. Furthermore, personal computers may render critical systems vulnerable if used to transfer data between unregulated storage devices and your organization’s hardware.


One suggestion for lab managers is to work with your IT department to determine boundaries for VPN use. It may become necessary to restrict access to critical validated software programs, or to confirm through testing that remote users cannot gain access, especially if your systems are to remain closed. However, if your operations are moving toward open systems and remote access, review your current validation requirements and your organization’s SOPs on computer system validation and risk assessment relative to open systems. Simple test scripts can be generated to confirm passwords, electronic signatures, and the functionality of software applications. Finally, contact the software or hardware vendor to determine whether open-system functions exist, whether they can be applied easily, and what security features they offer to protect critical aspects of the program.
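A simple test script of the kind described above can be as plain as codifying the closed system's access policy and asserting that no remote role reaches any function. The role names and policy table below are invented for illustration; a real script would exercise the actual application.

```python
# Hypothetical closed-system access policy; roles and actions are
# assumptions for illustration, not taken from any real system.
CLOSED_SYSTEM_POLICY = {
    "onsite_analyst": {"acquire", "process", "review"},
    "onsite_reviewer": {"review", "approve"},
    "vpn_remote": set(),  # closed system: no remote functions permitted
}

def is_permitted(role, action, policy=CLOSED_SYSTEM_POLICY):
    return action in policy.get(role, set())

def run_access_checks(policy=CLOSED_SYSTEM_POLICY):
    """Test script: confirm a remote VPN role can reach no function.
    An empty return list means the closed-system boundary holds."""
    return [action
            for action in ("acquire", "process", "review", "approve")
            if is_permitted("vpn_remote", action, policy)]
```

The value of keeping even this trivial policy in executable form is that it can be re-run, and the passing result documented, each time the network or VPN configuration changes.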


Sustainable

Demonstrating that you can store, back up, and recover data from different servers or locations is critical to data integrity. Media aging should also be considered, as backups are often made to removable media and then stored at off-site locations. Aging studies may be required because some locations, such as a bank vault, may not be environmentally controlled, which could affect the integrity of the data or the media itself.
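Demonstrating recovery means more than seeing files reappear: the restored copies should be byte-identical to the originals. A minimal sketch, assuming backups are plain file copies, is to compare checksums file by file after a restore exercise; the function names here are illustrative.

```python
import hashlib
from pathlib import Path

def checksum(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def verify_restore(original_dir, restored_dir):
    """Return the relative paths whose restored copy is missing or altered.
    An empty list is documented evidence of a successful recovery test."""
    problems = []
    original = Path(original_dir)
    for src in sorted(original.rglob("*")):
        if not src.is_file():
            continue
        rel = src.relative_to(original)
        dst = Path(restored_dir) / rel
        if not dst.is_file() or checksum(src) != checksum(dst):
            problems.append(str(rel))
    return problems
```

Running such a comparison on media retrieved from the off-site location at intervals is one way to turn a media-aging study into concrete, reviewable results.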

The concept of sustainability also includes traceability of the validation data relative to URS and functional and design specifications of the application. Creation of legacy systems and proper long-term storage of the workstation, applications, and data must be considered once equipment ages out or is replaced with updated versions.


Tested

Test scripts are vital to maintaining lab data integrity. Their primary purpose is to find errors before they happen, not to test your systems into compliance. Stress testing pushes every aspect of the electronic system beyond the point where it is expected to operate normally. If a system is expected to be used at 40°C but is designed or tested only at room temperature, it may fail once placed in the harsher environment. What if all of the equipment in your lab were scheduled to collect data at the same time at a high collection rate? The amount of data being generated and sent to the network may place unanticipated stress on the system. Evaluate the potential load based on the number of systems, then test whether you can actually collect that level of information without error.
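Evaluating the potential load can start as a back-of-the-envelope calculation before any testing. The instrument count, acquisition rate, and sample size below are invented figures for illustration, not benchmarks from the article.

```python
# Worst-case network load if every instrument acquires at once.
# All figures here are assumptions chosen for illustration.
def peak_load_mb_per_s(n_instruments, samples_per_s, bytes_per_sample):
    """Sustained data rate in megabytes per second."""
    return n_instruments * samples_per_s * bytes_per_sample / 1e6

# e.g. 20 instruments, each acquiring 10,000 points/s at 8 bytes/point:
load = peak_load_mb_per_s(20, 10_000, 8)  # 1.6 MB/s sustained
```

Comparing that figure against what the network and data system have actually been tested to sustain tells you whether the simultaneous-acquisition scenario needs a dedicated stress test.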

Test your computer’s CPU, memory, graphics, and storage for potential stress-induced errors in normal operations and data collection. This is particularly valuable if equipment has extremely fast collection rates. Commercial programs exist that will perform millions of parallel operations concurrently to stress-test a computer and report the results.
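Commercial stress-test suites are far more thorough, but the underlying idea can be sketched crudely: run a CPU-heavy deterministic workload on several workers at once and confirm the results stay consistent under load. Everything below is a do-it-yourself illustration, not a substitute for a validated tool.

```python
import concurrent.futures
import hashlib
import time

def hash_burn(n_rounds):
    """Deterministic CPU-heavy workload: repeated SHA-256 hashing."""
    digest = b"seed"
    for _ in range(n_rounds):
        digest = hashlib.sha256(digest).digest()
    return digest.hex()

def stress_test(workers=4, rounds=100_000):
    """Run the same workload on several workers concurrently.
    Because the workload is deterministic, any divergence among the
    results indicates a computation error under stress."""
    start = time.perf_counter()
    with concurrent.futures.ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(hash_burn, [rounds] * workers))
    elapsed = time.perf_counter() - start
    consistent = len(set(results)) == 1
    return elapsed, consistent
```

Scaling up the worker count and round count, and watching both the elapsed time and the consistency flag, gives a rough picture of how the workstation behaves when every core is saturated at once.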

With the advent of an ever-increasing remote workforce and heightened regulatory scrutiny related to electronic records, laboratory managers must re-evaluate their data integrity programs. Risk assessments may find that day-to-day operations or procedural fixes must be revised or eliminated to become compliant.

Evaluating and managing your programs based on the concepts of TRUST should provide an additional framework that, along with ALCOA, can help effectively manage the integrity of your laboratory’s data.