Description
This presentation covers chromatographic data systems (CDS), laboratory information management systems (LIMS), and LIMS hardware and architectures, along with their evolution, types, and development.
Computers as data analysis & data management tools in preclinical development
INTRODUCTION
• Scientists from many different disciplines participate in pharmaceutical development.
• Their research areas may be very different, but they all generate scientific data (and text documents), which are the products of development laboratories.
• Truckloads of data and documents are submitted to the regulatory authorities in support of investigational and marketing authorization filings.
• For example, even a typical Investigational New Drug (IND) application requires around 50,000 pages of supporting documents.
• One way or another, every single data point has to go through the acquiring, analyzing, managing, reporting, auditing, and archiving process according to a set of specific rules and regulations.
• The wide use of computers has tremendously increased efficiency and productivity in pharmaceutical development.
• On the other hand, it has also created unique problems and challenges for the industry.
Special emphasis is placed on three widely used computer systems:
• CDS—Chromatographic Data Systems
• LIMS—Laboratory Information Management Systems
• TIMS—Text Information Management Systems
The following are examples of the development activities that generate the majority of the data:
• Drug substance/drug product purity, potency, and other testing
• Drug substance/drug product stability testing
• Method development, validation, and transfer
• Drug product formulation development
• Drug substance/drug product manufacturing process development, validation, and transfer
• Master production and control record keeping
• Batch production and control record keeping
• Equipment cleaning testing
CHROMATOGRAPHIC DATA SYSTEMS (CDS)
• The importance of CDS is directly related to the roles that chromatography, particularly high-performance liquid chromatography (HPLC) and gas chromatography (GC), play in pharmaceutical analysis.
• HPLC and GC are the main workhorses in pharmaceutical analysis.
• In today’s pharmaceutical companies, development work cannot be done without HPLC and GC.
• CDS are also used for several other instrumental analysis technologies, such as ion (exchange) chromatography (IC), capillary electrophoresis (CE), and supercritical fluid chromatography (SFC).
The Days Before CDS
• In the 1960s and early 1970s, chromatographs were relatively primitive and inefficient.
• Chromatographers had to use microsyringes for sample injection and stopwatches for measurement of retention times.
• The chromatograms were collected with a strip chart recorder.
• Data analysis was also performed manually.
• At that time, the management of chromatographic data was essentially paper based and very inefficient.
• However, compared with the traditional analytical methods, the adoption of chromatographic methods represented a significant improvement in pharmaceutical analysis.
• This was because chromatographic methods offered specificity and the ability to separate and detect low-level impurities.
• Specificity is especially important for methods intended for early-phase drug development, when the chemical and physical properties of the active pharmaceutical ingredient (API) are not fully understood and the synthetic processes are not fully developed.
• Therefore, the assurance of safety in clinical trials of an API relies heavily on the ability of analytical methods to detect and quantitate unknown impurities that may pose safety concerns.
• This task was not easily performed, or simply could not be carried out, by classic wet chemistry methods.
• Therefore, slowly, HPLC and GC established their places as the mainstream analytical methods in pharmaceutical analysis.
• These practical needs prompted instrument vendors to come up with more efficient ways of collecting and processing chromatographic data.
• In the mid-1970s, the integrator was introduced.
• At first, the integrator worked similarly to a strip chart recorder, with the added capabilities of automatically calculating peak area and peak height.
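• As an aside, the core computation the integrator automated is easy to illustrate. The following minimal Python sketch computes peak height and peak area (by trapezoidal integration) from a digitized detector trace; the signal values and sampling interval are invented for illustration and do not come from any particular instrument.

# Minimal sketch of what an early integrator automated: peak height and
# peak area from a digitized detector signal. All numbers are invented.
sampling_interval = 0.5  # seconds between data points (assumed)
signal = [0.0, 0.2, 1.5, 6.0, 12.0, 9.0, 3.5, 0.8, 0.1, 0.0]  # detector response

# Peak height: maximum response above the (here, zero) baseline.
peak_height = max(signal)

# Peak area: trapezoidal integration of the response over time.
peak_area = sum(
    (signal[i] + signal[i + 1]) / 2 * sampling_interval
    for i in range(len(signal) - 1)
)

print(f"peak height = {peak_height:.1f}, peak area = {peak_area:.2f}")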
The Emergence and Evolution of CDS
• The first generation of CDS were based on a working model of multiuser, time-sharing minicomputers.
• The minicomputers were connected to terminals in the laboratory that the analysts would use.
• The detector channels of the chromatographs were connected to the data system through a device called the analog-to-digital (A/D) converter, which would convert the analog signals from the detectors into digital signals.
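• To make the A/D converter's role concrete, here is a small hypothetical sketch of the quantization step: an analog detector voltage is mapped to the integer counts a data system can store. The 0-1 V range and 12-bit resolution are assumptions chosen purely for illustration.

# Hypothetical sketch of analog-to-digital (A/D) conversion: an analog
# detector voltage is quantized into integer counts. A 12-bit converter
# over a 0-1 V input range is assumed purely for illustration.
FULL_SCALE_VOLTS = 1.0
LEVELS = 2 ** 12  # 4096 discrete output codes for a 12-bit converter

def to_counts(volts: float) -> int:
    """Clamp the input to the converter's range, then quantize it."""
    volts = min(max(volts, 0.0), FULL_SCALE_VOLTS)
    return round(volts / FULL_SCALE_VOLTS * (LEVELS - 1))

# Analog detector readings (volts) become digital counts for the CDS.
for v in [0.0, 0.1037, 0.5, 0.9999]:
    print(f"{v:.4f} V -> {to_counts(v)} counts")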
• In the late 1970s, Hewlett-Packard introduced the HP 3300 series data-acquisition system.
• Through its A/D converters, the HP system was able to collect chromatographic data from up to 60 detector channels.
• This represented the beginning of computerized chromatographic data analysis and management.
• Because the CDS used dedicated hardware and wiring, it was relatively expensive to install.
• Another drawback was that system performance would degrade as the number of users increased.
• The next generation of CDS did not appear until the start of the personal computer (PC) revolution in the 1980s.
• The early PCs commercialized by Apple and IBM were not very reliable or powerful compared with today’s PCs. The operating systems were text based and difficult to use.
The Client/Server Model
• Taking advantage of the PC revolution, a new generation of CDS appeared on the market that utilized a client/server model.
• In the new CDS, the client provided the graphical user interface through a PC and was responsible for some or most of the application processing.
• The server typically maintained the database and processed requests from the clients to extract data from or update the database.
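• This division of labor can be mimicked in miniature, as in the hypothetical Python sketch below: the "server" owns the data store and answers requests, while the "client" performs the presentation-side processing. All class names and records are invented.

# Hypothetical miniature of the client/server split: the server owns the
# database and services requests; the client formats and displays the
# results. Names and records are invented for illustration.
class Server:
    def __init__(self):
        self._db = {"LOT-001": 99.2, "LOT-002": 98.7}  # assay results, %

    def query(self, sample_id):
        """Process a client request: extract a record from the database."""
        return self._db.get(sample_id)

class Client:
    def __init__(self, server):
        self.server = server

    def show_result(self, sample_id):
        # Client-side application processing: fetch, then format for display.
        assay = self.server.query(sample_id)
        print(f"{sample_id}: assay = {assay}%" if assay is not None
              else f"{sample_id}: not found")

Client(Server()).show_result("LOT-001")  # -> LOT-001: assay = 99.2%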
• This model was adopted widely in the industry for almost a decade because of its scalability.
• During this period, in parallel with the progress in CDS, chromatography itself was developing rapidly.
• Instrumentation had adopted modular design, so that each functional part became more reliable and serviceable.
• Progress in microelectronics and machinery made solvent delivery pumps more accurate and reproducible.
• The accuracy and precision of autosamplers also improved significantly.
• Compared with the days of chart recorders and integrators, a fully automated HPLC could now be programmed to run nonstop for days.
• Results could also be accessed and processed remotely.
• With the help of sophisticated CDS, chromatography finally established its dominance in pharmaceutical analysis.
• With respect to the FDA’s expectations, however, CDS operating under the client/server model had a significant drawback.
• In the client/server model, the client must retain part of the application.
• To fulfill the requirements of system qualification, performance verification, and validation, one must validate not only the server but also each client PC.
• This was a burden for customers, and it resulted in the adoption of a new operating model: server-based computing.
Server-Based Computing
• With server-based computing, the applications are deployed, managed, supported, and executed on a dedicated application server.
• There are no software components installed on the client PC.
• The client PC simply acts as the application server’s display.
• CDS using this model significantly reduced the total cost of implementation and maintenance and significantly improved compliance with regulatory guidelines.
The Modern CDS
• Beyond server-based computing, the two other important features of the modern CDS are the use of an embedded data structure and direct instrument control.
• The earlier generations of CDS used a directory file structure, meaning that the raw data and other files, such as the instrument method and data processing method, were stored at separate locations.
• There would be either no connections or only partial connections between these files.
• The most significant drawback of this type of file management was the potential for methods and raw data to be accidentally overwritten.
• To prevent this from happening, the raw data and result files had to be locked.
• If the locked data needed to be reprocessed, the system administrator had to unlock the files first.
• The embedded data structure can be used to manage not only chromatographic data but also all aspects of the CDS, including system security and user privileges.
• The embedded data structure maintains all information and changes by date- and time-stamping them, preventing accidental overwriting of raw data and method files.
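• One way to picture this protection is as an append-only, timestamped store: a "save" never replaces an existing record but adds a new dated version. The sketch below is a hypothetical illustration of that idea only, not any vendor's actual data structure.

# Hypothetical sketch of date- and time-stamped version keeping: saving
# never overwrites an earlier version; it appends a new one. This
# illustrates the idea only, not any vendor's implementation.
from datetime import datetime, timezone

class VersionedStore:
    def __init__(self):
        self._versions = {}  # name -> list of (timestamp, content)

    def save(self, name, content):
        stamp = datetime.now(timezone.utc).isoformat()
        # Append-only: earlier versions of `name` remain untouched.
        self._versions.setdefault(name, []).append((stamp, content))

    def latest(self, name):
        return self._versions[name][-1][1]

    def history(self, name):
        return [stamp for stamp, _ in self._versions[name]]

store = VersionedStore()
store.save("method_A.met", {"flow_mL_min": 1.0})
store.save("method_A.met", {"flow_mL_min": 1.2})  # a change, not an overwrite
print(store.latest("method_A.met"))   # newest version of the method
print(store.history("method_A.met"))  # full trail of timestamps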
Direct Instrument Control
• Direct instrument control (or the lack of it) was an important issue for earlier versions of CDS.
• The scheme of connecting the detector channels through A/D converters to the CDS worked well in analytical laboratories across the pharmaceutical industry.
• The scheme provided enough flexibility that the CDS could collect data from a variety of instruments, including GC, HPLC, IC, SFC, and CE.
• It was equally important that the CDS could be connected to instruments that were manufactured by different vendors.
• The disadvantage of this scheme was that the instrument metadata could not be linked to the result file of each sample analyzed.
• It therefore could not be guaranteed that the proper instrument parameters were used in sample analysis.
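• The improvement that direct instrument control enables can be sketched as a result record that carries its instrument metadata with it, so that the parameters actually used are verifiable for every sample. This is a hypothetical illustration; all field names are invented.

# Hypothetical sketch of linking instrument metadata to each result:
# with direct instrument control, the parameters actually used can be
# captured into the result record itself. Field names are invented.
from dataclasses import dataclass

@dataclass
class InstrumentMetadata:
    instrument_id: str
    pump_flow_mL_min: float
    column_temp_C: float
    detector_wavelength_nm: int

@dataclass
class SampleResult:
    sample_id: str
    peak_area: float
    # The metadata travels with the result, so a reviewer can confirm
    # that the proper instrument parameters were used for this sample.
    acquired_with: InstrumentMetadata

result = SampleResult(
    sample_id="LOT-001-T0",
    peak_area=12543.7,
    acquired_with=InstrumentMetadata("HPLC-07", 1.0, 30.0, 254),
)
print(result.acquired_with.instrument_id, result.peak_area)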
Summary
• Through continuous improvement, CDS have certainly served the pharmaceutical industry well.
• CDS have helped the pharmaceutical industry increase efficiency and productivity by automating a large part of pharmaceutical analysis.
• But CDS still have room for improvement. So far, the main focus of CDS has been on providing accurate and reliable data.
• The current regulatory trend in the pharmaceutical industry is to shift from data-based filings to information-based filings, meaning that the data must be analyzed and converted into information.
LABORATORY INFORMATION MANAGEMENT SYSTEMS (LIMS)
• Laboratory information management systems, or LIMS, represent an integral part of the data management systems used in preclinical development.
• LIMS are needed partly because CDS cannot provide enough data management capability. For example, CDS cannot handle data from nonchromatographic tests.
• Another important use of LIMS is for sample management in preclinical development, more specifically in drug substance and drug product stability studies.
• Stability studies are very labor intensive, and the results have an important impact on regulatory filings.
• LIMS are designed to automate a large part of these stability studies, including sample tracking, sample distribution, work assignment, results capturing, data processing, data review and approval, report generation, and data archiving, retrieving, and sharing.
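• As a toy illustration of the sample-tracking portion of that workflow, the hypothetical sketch below walks one stability sample through a simplified status sequence. The states and identifiers are invented and are not those of any commercial LIMS.

# Hypothetical, highly simplified sketch of LIMS-style sample tracking
# in a stability study: each sample advances through a fixed status
# sequence. States and identifiers are invented for illustration.
WORKFLOW = ["logged", "distributed", "tested", "reviewed", "approved", "reported"]

class StabilitySample:
    def __init__(self, sample_id, timepoint):
        self.sample_id = sample_id
        self.timepoint = timepoint  # e.g., "3-month, 25C/60%RH"
        self.status = WORKFLOW[0]

    def advance(self):
        """Move the sample to the next workflow state, in order."""
        i = WORKFLOW.index(self.status)
        if i < len(WORKFLOW) - 1:
            self.status = WORKFLOW[i + 1]

sample = StabilitySample("LOT-002-3M", "3-month, 25C/60%RH")
for _ in range(3):
    sample.advance()
print(sample.sample_id, "->", sample.status)  # LOT-002-3M -> reviewed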
LIMS Hardware and Architectures
• Commercial LIMS appeared on the market in the early 1980s. These operated on then state-of-the-art minicomputers such as the 16-bit Hewlett-Packard 1000 and the 32-bit Digital VAX system.
• By the late 1980s, several DOS-based PC LIMS operating on primitive PC networks were available.
• By the early 1990s, most LIMS started using commercial relational database technology and client/server systems, which operated on UNIX or the new Windows NT platform.
• The most advanced LIMS utilize server-based architecture to ensure system security and control.
• There are four main architectural options when implementing LIMS:
• The first is the LAN (local area network) installation. In a multiple-site situation and through the standard client/server setup, the application would be hosted separately on a server at each site, connected to PC clients.
• In this setup, the LIMS are installed on both the clients and the server.
• System administration is required at each facility.
• The second type is the WAN (wide area network) installation.
• In this setup, the LIMS take advantage of telecommunication technology to cover a great distance.
• The setup can also be used to connect disparate LANs together.
• In this configuration, the LIMS are installed on both the clients and a central server.
• The third type is the so-called “centrally hosted thin client installation.”
• For this setup, system administration is managed at a corporate center, where the LIMS are hosted and distributed via a WAN or the Internet with a virtual private network (VPN).
• The last and also the newest type is the ASP (application service provider)-hosted installation.
• In this setup, the LIMS are hosted on a centrally managed server farm and maintained by third-party specialists. Users access the LIMS from any Internet-connected PC with a standard Web browser.
Different Types of LIMS
Customer-tailored LIMS
• Customer-tailored LIMS—In an implementation of this type of LIMS, the customer purchases a generic product from the vendor. The vendor and customer then work together over a period of time to configure the software to meet end-user needs. This usually involves extensive programming, which can be performed by trained end users or dedicated support personnel on the customer side.
Preconfigured LIMS
• Preconfigured LIMS—This type of LIMS does not require extensive customer programming.
• To meet the specific needs of end users, the vendors provide a comprehensive suite of configuration tools. These tools allow end users to add new screens, menus, functions, and reports in a rapid and intuitive manner.
• The tools also allow the LIMS to be more easily integrated with other business applications such as document processing, spreadsheets, and manufacturing systems.
Specialized LIMS
• Specialized LIMS—This type of LIMS is based on the fact that certain laboratories have a range of well-defined processes (e.g., stability testing) that are performed according to a specific set of regulations, using well-established tests. The tests are done according to industry-wide accepted protocols.
• Specialized LIMS are tailor-made for certain types of laboratories; therefore, performance can be optimized for clearly defined work processes.
Implementation of LIMS
• Because of their complexity, implementing LIMS is usually a traumatic process.
• Good communication and planning can reduce the level of turmoil caused by a LIMS implementation.
• Planning (defining expectations) is the first step in the lengthy process of acquiring a LIMS.
• The LIMS vendor and customer have to work very closely at this stage. A series of meetings must be held between the LIMS vendor and potential end users and laboratory supervisors.
• The business processes and sample flows need to be mapped and documented to prepare for future system configuration.
• The LIMS for GMP/GLP use must be validated. Validation includes:
• Design qualification
• Installation qualification
• Operational qualification
• Performance qualification
• Final documentation
• Each of these steps needs good planning and documentation.
• During validation, the system is tested against normal, boundary-value, and invalid data sets.
• Invalid data should be identified and flagged by the software.
• Dynamic “stress” tests should also be done with large data sets to verify whether the hardware is adequate.
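• To make this concrete, the hypothetical sketch below exercises a result-entry check against normal, boundary-value, and invalid inputs and asserts that invalid data are flagged rather than silently accepted. The 0-110% acceptance range is an invented example.

# Hypothetical sketch of validation-style testing: a result-entry check
# is exercised with normal, boundary-value, and invalid data, and the
# invalid entries must be flagged. The 0-110% assay range is invented.
def check_assay_entry(value):
    """Return (accepted, message); flag anything outside 0-110%."""
    if not isinstance(value, (int, float)):
        return False, f"FLAGGED: non-numeric entry {value!r}"
    if not 0.0 <= value <= 110.0:
        return False, f"FLAGGED: {value} is outside the 0-110% range"
    return True, f"accepted: {value}"

test_cases = [
    ("normal", 99.3),
    ("boundary", 0.0),
    ("boundary", 110.0),
    ("invalid", -5.0),
    ("invalid", "N/A"),
]
for kind, value in test_cases:
    accepted, message = check_assay_entry(value)
    print(f"{kind:8s} {message}")
    assert accepted == (kind != "invalid")  # invalid data must be flagged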
• One of the major undertakings during LIMS implementation is user training, which should cover not only the LIMS itself but also the standard operating procedures (SOPs) that govern use, administration, training, and other aspects of the LIMS.
Thanks