The cost of developing a new therapy continues to soar, with estimates ranging from $2.6 billion to $6.7 billion per approved drug. The cost of failure accounts for a large portion of this price tag, and the consequences of data failures extend well beyond the pocketbook. In a worst-case scenario, failures require clinical trials to be repeated, as observed in late 2021 when the Food and Drug Administration determined that two clinical research organizations had violated federal regulations and engaged in misconduct that invalidated the data for trials conducted in 2019. Deficiencies in data management not only burden biopharmaceutical organizations with tremendous costs; the aftermath also creates major delays, wastes resources, and stunts overall agility.
One potential solution is to employ laboratory information management systems (LIMS), also called laboratory management systems, to alleviate some of the woes of data management and to facilitate strategy. But LIMS have limitations of their own, and organizations must understand them to move forward prudently. Here are four reasons why LIMS are not foolproof, along with the business impact of each.
Reason 1: One size doesn’t fit all: Cross-functional business processes require platform solutions
There is a reason why the question, “Do I need a LIMS?” is a common Google search for the biopharma industry. Most scientists work in a lab of some sort and have unmet information management needs, so a “laboratory information management system” sounds like a panacea. Unfortunately, legacy LIMS and other common software solutions such as the electronic lab notebook (ELN) were designed for very specific purposes and lack the flexibility to accommodate the demands of modern biopharma development.
Perhaps no greater challenge exists in biopharma than designing and implementing an effective IT strategy. Software customized for research and development tends to be fundamentally different from software designed for manufacturing, and for good reason.
In the early stages of development, flexibility and freedom are key, but as development progresses, procedural controls become increasingly important. Traditional LIMS were designed for managing samples and their associated test processes and data, particularly for routine quality control (QC) workflows such as batch release testing in manufacturing. LIMS are rarely used in early development, for example, because they are not flexible enough.
Historically, these diverse needs have been met by piecing together a complex ecosystem of standalone or partially integrated software packages, each serving a particular group. Now, business and IT groups alike are shifting the focus from technology to business impact. In other words, discerning which ELN is the best on the market has evolved to exploring how to achieve the best outcomes.
Reason 2: Modern biopharma isn’t business as usual
Outsourcing has long played a role in biopharma development and is becoming increasingly vital both as a means of accessing highly specialized skills and technologies and providing the necessary speed and agility to meet regulatory and market demands.
Like outsourcing, cloud computing is also fundamental to enabling modern biopharma development. Using antiquated approaches fails to solve new-age problems efficiently and effectively, resulting in sluggish or stagnated development.
Patrick Pijanowski, managing director, global practice lead at Accenture Scientific Informatics Services, believes that “biopharma can achieve greater value at speed by intensifying their focus on three strategic plays: new science portfolio, digital and data-led research, and faster, smarter development. To realize the potential value of these strategic plays, a coordinated approach across specific technologies and critical enabling capabilities is needed.”
Reason 3: The stakes are higher than ever
Competition is fierce, and the gap between the possibilities modern cloud computing offers and historical norms, such as inaccessible data silos, continues to widen. The time and cost implications are staggering; years of development effort and millions of dollars are wasted due to ineffective data management.
On one end of the spectrum, artificial intelligence (AI) systems such as AlphaFold have made dramatic progress on the protein-folding problem. On the other, lab scientists still record their work in Excel spreadsheets and basic records, error-prone formats whose contents are difficult to locate, access, or reuse. Without a robust, well-maintained data backbone spanning the development lifecycle, connecting disparate data sources and developing predictive models will continue to present roadblocks.
Reason 4: Data science prompts the shift from data-driven to data-centric
Data has become one of the most important assets in the biopharma industry, and data-driven organizations now generate massive amounts of digital information. These organizations actively use this data to make better decisions; however, solutions often center on applications instead of data, leaving scientists and applications to cope with the complexity and burden of data silos and multiple data models.
Data-centric refers to a culture and technical architecture in which data is the primary and permanent asset, regardless of whether applications change. The data model always precedes the applications and remains valid long after applications change. Data science also plays an important role in the digital transformation of a data-centric organization. A data-centric approach normally results in improved data accessibility, quality, interoperability, and reusability.
Collecting more data does not yield a more data-centric organization. For example, each data set internalized in an ELN or a LIMS uses a different data model. Extracting them into a data lake without any effort to harmonize those models does nothing to unify them. As a result, systems become incrementally less data-centric, even as they become more data-driven.
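To make the distinction concrete, here is a minimal sketch of what harmonization means in practice. All record shapes, field names, and unit conversions below are hypothetical, not taken from any actual ELN or LIMS export: two systems describe the same measurement differently, and a data-centric load maps both onto one canonical model before they reach the lake.

```python
# Hypothetical exports: an ELN and a LIMS record the same measurement
# with different field names and units (all names are illustrative).
eln_record = {"experiment_id": "EXP-042", "compound": "mAb-7", "conc_mg_ml": 1.2}
lims_record = {"sample_id": "S-1138", "analyte": "mAb-7",
               "concentration": 1200, "unit": "ug/mL"}

# The canonical data model precedes (and outlives) either application.
CANONICAL_FIELDS = ("source_id", "analyte", "concentration_mg_ml")

def harmonize_eln(rec):
    """Map an ELN export onto the canonical model."""
    return {"source_id": rec["experiment_id"],
            "analyte": rec["compound"],
            "concentration_mg_ml": rec["conc_mg_ml"]}

def harmonize_lims(rec):
    """Map a LIMS export onto the canonical model, converting units."""
    factor = {"mg/mL": 1.0, "ug/mL": 1e-3}[rec["unit"]]
    return {"source_id": rec["sample_id"],
            "analyte": rec["analyte"],
            "concentration_mg_ml": rec["concentration"] * factor}

# Loading raw exports would preserve both shapes (a silo inside the lake);
# harmonizing first means every downstream consumer sees one model.
lake = [harmonize_eln(eln_record), harmonize_lims(lims_record)]
assert all(set(r) == set(CANONICAL_FIELDS) for r in lake)
```

The point is not the mapping code itself but where the model lives: in a data-centric architecture the canonical schema is defined once, independently of any application, and each source system is adapted to it rather than the other way around.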
Looking Ahead: The solution lies in harmonizing datasets with a solid data backbone
Establishing a pre-emptive strategy and processes that optimize data management is critically important to mitigating data management challenges in the biopharma industry. Coupled with a clear understanding of legacy system limitations, organizations can overcome data outages and temporal delays. They can also shift the focus to be more data-centric with digital workflows designed to create a persistent, dynamic data backbone throughout the biopharma lifecycle. This provides a solid foundation for analytics and enables insights to be shared across internal and external teams that accelerate process understanding and ensure product quality.
Biopharma companies are already in innovation mode. Now is the time to shift strategy and generate new “future-fit” R&D organizations.1
1 Billions to millions: Improving R&D productivity: https://www.accenture.com/us-en/insights/life-sciences/from-billions-to-millions-transformation