February 2021

The data lifecycle in pharmaceuticals revolves around one equation: authentic, validated data from the source feeds into a validated, custom-made or OEM prebuilt algorithm, and the desired output data is received. The output data may be processed further, then locked to avoid further processing, later backed up, and subsequently preserved/archived. Organizations shall put equal emphasis on all phases of the data lifecycle while implementing IT surveillance tools and processes. In my view, having stringent processes to close all possible leak points in the data lifecycle will be a unique capability that differentiates the leaders of tomorrow in pharma.

Real-Time Availability of Data Integrity Status

Apart from implementing conventional approaches such as access protection and manual, paper-based review of audit trails, having automated inspection software that detects each change made to the data in real time can play a key role in maintaining 100% data integrity at any point in time. All changes made to access permissions, shared logins, data at rest, data in motion, audit trails, old archives, etc. should be detected and recorded in real time. This change data should be quick to retrieve in report format whenever it is needed, which is only possible with a mix of off-the-shelf IT tools and custom scripts. It also helps the pharmaceutical company stay audit-ready at any time. Under normal circumstances, changes made per approved change controls can be verified by performing checklist-based tests. In the absence of automated inspection software, changes made to the data after completion of a change control are discovered only at later stages.
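The real-time change detection described above can be sketched with a small script. The following is a minimal, file-based illustration using only the Python standard library; the log file name, polling interval, and function names are hypothetical, and a production tool would add tamper-proof log storage and OS-level file events rather than polling.

```python
import hashlib
import json
import time
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical append-only log of detected changes, one JSON record per line.
CHANGE_LOG = Path("change_log.jsonl")

def digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def snapshot(root: Path) -> dict:
    """Map every file under root to its current digest."""
    return {str(p): digest(p) for p in root.rglob("*") if p.is_file()}

def record(event: str, path: str) -> None:
    """Append one timestamped change event to the log for later reporting."""
    entry = {"time": datetime.now(timezone.utc).isoformat(),
             "event": event, "file": path}
    with CHANGE_LOG.open("a") as log:
        log.write(json.dumps(entry) + "\n")

def detect_changes(before: dict, after: dict) -> None:
    """Compare two snapshots and record added, removed, and modified files."""
    for path in after.keys() - before.keys():
        record("added", path)
    for path in before.keys() - after.keys():
        record("removed", path)
    for path in before.keys() & after.keys():
        if before[path] != after[path]:
            record("modified", path)

def watch(root: Path, interval: float, cycles: int) -> None:
    """Run a fixed number of polling cycles, logging changes between scans."""
    baseline = snapshot(root)
    for _ in range(cycles):
        time.sleep(interval)
        current = snapshot(root)
        detect_changes(baseline, current)
        baseline = current
```

The JSON-lines log can then be rendered into the on-demand change report the text calls for, and compared against approved change controls during an audit.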
Manual proofreading and inspection have proven inefficient and often cannot assure that the data is free from unauthorized changes or other errors.

Design Technical Tools and Scripts to Ensure 100% Data Integrity

Though there are opportunities at multiple phases of the data lifecycle, I would like to highlight a few in which the industry faces challenges in ensuring 100% data integrity.

To Detect Alteration in Backup Data

Design a procedure to check the integrity of data in SQL Server and Oracle databases with the help of prebuilt commands such as RMAN's BACKUP VALIDATE, RESTORE VALIDATE, and VALIDATE, checksum options (e.g., WITH CHECKSUM in SQL Server), and database verification utilities such as Oracle's DBVERIFY. Both platforms are also capable of detecting interblock and intrablock corruption, logical block corruption, fractured blocks, etc. Set a maximum limit for the total number of corruptions permitted in a file via an RMAN backup. Generate a checksum while taking a backup. Use the verify-only option (e.g., RESTORE VERIFYONLY in SQL Server) to validate a backup before restoring the actual copy.

To Ensure no Alteration in Data While it is in Motion

You might move GMP data from one location to another: physical to virtual and vice versa, Tier-1 to Tier-2 storage, from disk to DVD, etc. A sophisticated approach is required to ensure the data is not altered anywhere in motion; otherwise, it becomes extremely difficult to detect the alteration at later stages. Use a validated utility that performs bit-by-bit verification and produces audit trails. An MD5 checksum is an extremely effective way to detect any type of alteration: create a validation hash when you burn optical disks (or store additional backups on a hard drive), and you can later check the integrity of all the files by running a hash-checking utility.

Summary

With increasingly stringent reviews of data integrity in the pharma industry by worldwide regulatory agencies, there is pressure on organizations to build a robust and flexible framework to prevent and catch data integrity issues.
The IT department shall play an essential role in building the technical framework with the help of readily available tools and custom software/scripts to prevent integrity issues throughout the whole life cycle of data.
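As a concrete sketch of the checksum verification discussed above, the following Python standard-library snippet computes an MD5 digest of the source file before a transfer and re-checks the copy afterwards. The function names and paths are hypothetical illustrations, not the tooling the article has in mind, and many organizations today would substitute SHA-256 for MD5.

```python
import hashlib
import shutil
from pathlib import Path

def md5sum(path: Path) -> str:
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_with_verification(src: Path, dst: Path) -> str:
    """Copy src to dst and verify the copy's checksum matches the source.

    Raises IOError if the digests differ, i.e. the data was altered in motion.
    Returns the digest so it can be stored alongside the archived copy.
    """
    expected = md5sum(src)      # fingerprint taken before the move
    shutil.copy2(src, dst)      # the "motion": a disk-to-disk copy in this sketch
    actual = md5sum(dst)
    if actual != expected:
        raise IOError(f"Integrity failure {src} -> {dst}: "
                      f"{expected} != {actual}")
    return expected
```

Storing the returned digest with the archived copy lets a later hash check re-verify integrity, in the same way a validation hash written when burning an optical disk can be re-checked by a hash-checking utility.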