Drug discovery is an inherently collaborative venture that necessitates the interaction and integration of people, processes, laboratories, and technology. This is easier said than done, however, and the ability to achieve optimal communication and coordination of effort; exploit potential synergies; and maximize the quality, reliability, and reproducibility of results begins with a foundation built on transparency and trust.1 In a drug discovery organization where there is transparency and trust, researchers are better able to share and compare data, rely on or openly question results, and support each other’s efforts in a truly collaborative environment. The willingness to embrace transparency is a significant statement by individuals and organizations, saying, “I want to do it right.”1,2 In an open environment where data are readily shared, time otherwise spent debating the accuracy of the data can instead be applied to moving projects forward.2
Building transparency into the research setting depends on several key factors: high-quality raw data; shared analytical tools; curated datasets; and clear documentation, audit trails, and reports. The strategic integration of informatics tools across a drug discovery workflow can foster transparency and improve the quality and standardization of data reporting and interpretation. Although a move toward greater openness may encounter some resistance in an organization, the potential gains far outweigh the challenges.2
Part I of this two-part article presents the principles and foundations for implementing informatics as an integrative tool for achieving these organizational goals. It provides recommendations based on real-world experience and the results of a project carried out in the drug discovery analytical group at Lundbeck Research. The project was designed to gain efficiencies through the integration of electronic laboratory notebook (ELN) systems. Part II, to be published in the September issue, will provide a detailed description of the case study in which Lundbeck integrated multiple informatics tools and software packages to improve data quality and transparency across analytical methods used to determine the composition, structure, and purity of drug compounds in development; make physicochemical/ADME measurements; perform complex bioanalyses; and evaluate the solubility, stability, and other critical characteristics of experimental drug compounds in various formulations.
Building transparency and trust
A willingness to invest the time, effort, and resources to develop and implement an organizationwide informatics strategy can yield significant gains in productivity and accelerate the path to a marketable product. Several key factors can contribute to the ultimate success and timetable for accomplishing this type of ambitious project. One is the need for a clear understanding at the outset of what you want to achieve. It is critical to define your vision and outline a path forward, recognizing that there will undoubtedly be bumps and detours along the way and maybe even some backtracking and rethinking. Set realistic expectations. Then identify the resources and people—scientists, information technology (IT) specialists, and project managers—needed to move forward, drawing both from within the organization and on expertise from outside as needed, including vendors and independent consultants.
Identify and anticipate the challenges and potential obstacles, both generic and specific, to your organization. Recognize that every work environment is different, and a vendor cannot know how you want an instrument, analytical device, or algorithm to work in your laboratory. Some degree of customization will always be needed, and this should be viewed as an opportunity to optimize your workflow. It is unrealistic to assume that any analytical instrument will arrive ready to use. Unpacking a device and plugging it in is only the first step. Maximizing the usefulness of a new tool or software program and integrating it into your workflow requires customization, training, and the necessary links to other platforms, including ELN systems and the organization's data collection and storage system (to ensure open access to data), as well as to external networks, interfaces, and data stores as needed.
No single informatics approach will be sufficient or optimal for every process, research group, or workflow. Furthermore, each approach is unlikely to be a simple, works-right-out-of-the-box software solution. While every device and analytical instrument may have its own dedicated software platform, the key to encouraging and enabling transparency lies in the customization of the software to optimize its functions for a particular application or workflow and the integration of distinct software systems into a unified informatics network. This strategy will allow for organizationwide access to and use of critical resources, experimental results, and proprietary knowledge.
Informatics cannot change the quality of the science, which must always be a top priority, but it can add significant value to the scientific output. It can make it easier for scientists to view, mine, and transfer data; utilize the data in multiple distinct operations or analyses; and extract valuable information and conclusions from the data. The result will be streamlined, more efficient processes. Informatics can even make it possible for the data to find the scientist.
Several missteps can cause informatics initiatives to fail. First and foremost, informatics needs to be clearly enabling; it should not create more work or increase the chances of generating false metrics. Too often, the implementation of informatics tools introduces barriers instead of removing them. For example, the need to input large amounts of information manually can create a data input barrier. To the extent possible, data should load automatically from an instrument to the software platform and download to visualization and data reporting tools as well as ELNs and servers. Another common mistake is expecting informatics to substitute for face-to-face communication. Overestimating or underutilizing your in-house informatics capabilities and expertise, and failing to take advantage of outside guidance and advice, can also introduce barriers to successful implementation.2
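The automatic data loading advocated above can be sketched in code. The following Python is a minimal, hypothetical illustration, not the system described in this article: the folder layout, column names, and function names are assumptions made for the example. A polling routine picks up new instrument export files from an "inbox" folder and writes each result to a shared archive as a self-describing record, removing the manual data input barrier.

```python
import csv
import json
from datetime import datetime, timezone
from pathlib import Path

def ingest_result_file(csv_path, archive_dir):
    """Parse one instrument export (CSV) and write a JSON record per sample.

    Each record is stamped with its source file and ingestion time so the
    audit trail survives the transfer to the shared archive.
    """
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    records = []
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            record = {
                **row,
                "source_file": Path(csv_path).name,
                "ingested_at": datetime.now(timezone.utc).isoformat(),
            }
            # One JSON file per sample in the shared archive
            (archive / f"{row['sample_id']}.json").write_text(
                json.dumps(record, indent=2)
            )
            records.append(record)
    return records

def sweep_inbox(inbox_dir, archive_dir, seen):
    """Poll an instrument 'inbox' folder and ingest any file not yet seen.

    'seen' is a set of already-processed file names, so repeated sweeps
    (e.g., from a scheduled task) never ingest the same export twice.
    """
    new_records = []
    for path in sorted(Path(inbox_dir).glob("*.csv")):
        if path.name not in seen:
            new_records.extend(ingest_result_file(path, archive_dir))
            seen.add(path.name)
    return new_records
```

In practice the same pattern would be driven by a file-system watcher or scheduler, and the archive step would be replaced by a push into the ELN or data management system; the point is that no scientist re-keys a number by hand.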
Overall, it is important to build and implement an informatics system with foresight and flexibility as the cornerstones of the design. At first, it is crucial to focus on delivering the core needs of the organization. However, the design of an informatics network must also take into account the possibility and implications of adding instruments or new software platforms a year or two down the road, and it must plan for such eventualities. Realistically, it is not possible to anticipate all future needs.
Furthermore, there is not likely to be sufficient time to implement every idea envisioned as a single package. From an organizational perspective, the design of an informatics system and integration of individual instruments and software platforms should be based on a vision driven by overall organizational goals. Meeting these goals, while also building in the flexibility and capacity for reorganization and expansion, should be primary considerations for core informatics technology selection. Once these goals have been delivered, it is often surprisingly easy to identify opportunities for further productivity gains in the work environment. These can be further enhanced by applying Lean Six Sigma approaches to internal workflows, perhaps best performed as separate small projects. Implementation of an informatics strategy as several smaller projects, with some pursued in parallel and others sequentially, is a prudent choice with the advantage of more predictable timelines. This approach can help engender trust among senior management: progress is more readily evident, there is less financial exposure, and the informatics can be better optimized to match the actual workflow for even greater productivity gains.
Real-world implementation
The concepts and principles described above were the basis for a real-world project implemented at Lundbeck, in which informatics integration resulted in improved data quality and transparency and, ultimately, a higher level of trust and enhanced productivity across a highly collaborative drug discovery organization. The analytical workflow in place at Lundbeck incorporated three main software platforms—Empower, MassLynx (including OpenLynx and FractionLynx), and NuGenesis® Scientific Data Management System—as well as independent detectors and instruments from other vendors, each with its own data management and reporting software. The goal was to utilize ELNs to develop an integrated informatics network that would capitalize on the strengths of each platform, generate high-quality data, and establish a high level of trust in the accuracy, reliability, and interpretation of the results. This vision included built-in, data-driven checks and balances that ensured ongoing monitoring and quality control and incorporated mechanisms for making the information available and understandable to a diverse group of scientists working on common projects.
A key consideration in designing an informatics workflow is how the data are going to get from the data collection mechanism built into a detector or analytical system to the higher-level data storage, analysis, and reporting system maintained by the analytical group and, ultimately, into the corporate IT network, thereby making it accessible to the entire community. One solution is to establish a virtual machine (VM) system. A VM is similar in concept to cloud computing, but differs in that the "cloud" remains within the organization and the data and functions are not distributed among computers in the public domain.
Lundbeck installed a bank of CPUs to manage a variety of centralized computational needs, including database access, communications, and remote access in the VM space. This VM capacity was also employed to process, store, and distribute analytical data and to provide remote access for data viewing and instrument operation during nights and weekends, as depicted in Figure 1. All the data have a duplicate audit trail—one maintained within an individual instrument's software and one in an organization-level system managed by the IT group that is automatically backed up daily to an off-site, secure location and is accessible (in read-only mode) to anyone with appropriate clearance from within or outside the company.
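The duplicate audit trail described above can be illustrated with a short sketch. The following Python is a hypothetical example, not the Lundbeck implementation; the class and file names are assumptions. Every logged action is appended simultaneously to an instrument-local file and an organization-level file (the copy that IT would back up off-site daily), and readers get a parse-only view of either trail.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

class DualAuditTrail:
    """Append-only audit log written to two locations at once:
    one alongside the instrument's own records, and one in the
    organization-level store that is backed up off-site."""

    def __init__(self, instrument_log, org_log):
        self.paths = [Path(instrument_log), Path(org_log)]
        for p in self.paths:
            p.parent.mkdir(parents=True, exist_ok=True)

    def record(self, user, action, target):
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,
            "target": target,
        }
        line = json.dumps(entry)
        for p in self.paths:  # the same entry lands in both trails
            with open(p, "a") as fh:
                fh.write(line + "\n")
        return entry

def read_trail(path):
    """Read-only view: parse the log without any means of altering it."""
    with open(path) as fh:
        return [json.loads(line) for line in fh]
```

A production system would add access control and tamper-evident storage, but the core idea holds: two independent, automatically synchronized records of every action, one of which lives outside the instrument that produced it.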
Data, outcomes, and lessons learned from this project are presented in Part II—A Bottom-Up View. Overall, the project was a success, resulting in greater access to data across the organization, improved data transparency, and lower barriers for data entry and utilization. Informatics can and should be enabling and empowering. This case study demonstrates that if conceptualized, designed, and implemented in a thoughtful way that takes into consideration the current and future capabilities and needs of individual laboratories, research groups, and the organization as a whole, informatics tools can enhance efficiency and improve productivity across disciplines and workflows. Building transparency and trust takes time, but the benefits can be dramatic and the gains in productivity well worth the investment in time and resources. The results achieved in this project indicate that over the course of a year of total transparency, in addition to consistent and efficient delivery of high-quality results, output from the same full-time employees—including both providers and customers—can increase more than threefold.