Most labs require several different informatics tools to collect, process, analyze, and store their experimental data. These tools have developed over decades, evolving with each shift in technology. With the rise of AI, there is an opportunity to create a better informatics ecosystem, one that accelerates experimental decision-making and analysis while reducing fragmentation and rework. To learn more about how this ecosystem could be structured, we talked with Andrew Wyatt, chief growth officer at Sapio Sciences.
Q: Why is the lab notebook becoming a focal point for managing AI and other lab tools?
A: The electronic lab notebook (ELN) sits at the center of experimental context, provenance, and compliance. But for most labs, it still does not sit at the center of decision-making. Today, ELNs are largely record keepers. Scientists pull data off an instrument, export it as a CSV, clean it up in Excel, and then import it into a specialist analysis tool or even a public AI tool. Only after all that does a summary get typed back into the notebook. By then, decisions have already been made elsewhere, stitched together from insights across multiple screens and systems.
That approach worked when experiments were simpler. It does not work today. Modern research depends on combining raw data with analytics, models, and external knowledge. When that work happens outside the notebook, context is lost, traceability breaks down, and previous results become surprisingly hard to trust or reuse.
The notebook is now where context needs to live. That context is what makes AI and analytics useful rather than disruptive. When these tools are available inside the ELN, interpretation happens as part of the workflow rather than as a separate cleanup step. Scientists stop shuttling data between systems just to answer basic questions. The notebook stops being a compliance checkbox and starts acting as the place where work moves forward.
Q: How does having an ecosystem of tools bring value to the scientist and their work?
A: The real value is not just time savings. It is momentum, and more importantly, context.
Scientists spend an extraordinary amount of time moving data around instead of thinking about it. They export results, clean them up, wait on someone else to run a query, and then try to reconstruct context they already had days or weeks earlier. That is not just inefficient. It strips away the experimental detail that gives results their meaning.
When tools and data live inside the notebook, that context stays intact. AI is no longer working from a pasted table or an isolated result. It knows where the data came from, how it was generated, what conditions were used, and how the experiment compares to previous runs. AI can surface plausible causes and next steps that would be invisible without context. When tools work this way, friction drops away. Scientists can explore results and test ideas without leaving their workflow.
It only works with an open approach. The focus must be on integrating the technologies scientists already use every day, whether that is AI models, data services, or scientific content, rather than trying to build a closed ecosystem or compete with specialist vendors who have spent years perfecting their tools.
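To make the point about context concrete, the sketch below shows, in rough terms, what it means for an AI query to carry an experiment's provenance with it rather than a pasted table. Everything here is hypothetical and for illustration only: the ExperimentRecord fields and the assemble_context helper are assumptions, not any vendor's actual data model or API.

```python
from dataclasses import dataclass, field


@dataclass
class ExperimentRecord:
    """One notebook entry: the result plus the context that gives it meaning."""
    experiment_id: str
    instrument: str                                     # where the data came from
    protocol: str                                       # how it was generated
    conditions: dict = field(default_factory=dict)      # e.g., temperature, pH
    results: dict = field(default_factory=dict)         # summary measurements
    previous_runs: list = field(default_factory=list)   # comparable earlier runs


def assemble_context(record: ExperimentRecord) -> str:
    """Flatten a record into the context block sent alongside an AI query."""
    return "\n".join([
        f"Experiment: {record.experiment_id}",
        f"Instrument: {record.instrument}",
        f"Protocol: {record.protocol}",
        "Conditions: " + ", ".join(f"{k}={v}" for k, v in record.conditions.items()),
        "Results: " + ", ".join(f"{k}={v}" for k, v in record.results.items()),
        "Comparable runs: " + ", ".join(record.previous_runs),
    ])


record = ExperimentRecord(
    experiment_id="EXP-0142",
    instrument="plate-reader-03",
    protocol="ELISA v2.1",
    conditions={"temperature_C": 37, "incubation_min": 60},
    results={"mean_OD450": 1.82, "cv_percent": 4.1},
    previous_runs=["EXP-0137", "EXP-0139"],
)

# In an integrated ELN, this context would accompany the scientist's question
# to whatever model the lab has approved; here we simply print the payload.
print(assemble_context(record))
```

In a real integration, that assembled block would travel with the scientist's question, so the model sees how the data was generated and what it can be compared against, not just an isolated table.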
Q: Why is AI adoption leading to fragmentation in the lab?
A: AI adoption leads to fragmentation when tools arrive faster than workflows can adapt. Until recently, most advanced analytics and AI tools could not be integrated into ELNs in a practical way. Historically, many of these tools lived inside bioinformatics or cheminformatics teams. They were powerful, but complex, and often required custom scripts or pipelines to run. A bench scientist had to fully define the question, hand it off, and wait for the result.
That model worked, but it was slow. It removed the scientist’s ability to explore data iteratively or follow unexpected leads. Analysis became a queued task rather than part of the scientific process.
Labs did what they had to do. They gave scientists direct access to newer tools, but without a way to connect them. The result is a data graveyard: multiple specialist models, multiple interfaces, and multiple versions of the same dataset.
Public generative AI has accelerated this dynamic because it is fast and flexible. If a scientist can paste results into a public AI tool and get a useful answer in seconds, they will do it, especially when official tools make that hard. The answer is not more shiny models. It is giving scientists a way to tie the ecosystem together, so exploration, analysis, and documentation happen in one place.
The most obvious cost of all this fragmentation is rework. Scientists rerun experiments simply because previous results are hard to find or interpret. That wastes time, reagents, and instrument capacity.
Q: What should lab leaders prioritize when thinking about AI adoption over the next few years?
A: The priority should not be AI for AI’s sake. It should be tools that move science forward. That means solutions that fit naturally into how scientists already work: tools that integrate with instruments, data, and workflows rather than sitting off to the side as something else to manage.
Integration matters more than novelty. When AI is part of the workflow, it reduces manual effort and improves consistency. When it is not, it just adds more fragmentation. Governance and trust still matter, but they work best when built into the workflow rather than bolted on afterward.
The biggest gains are practical: fewer repeated experiments, less time waiting in queues, and better reuse of existing data. At the end of the day, it is simple. Tools that help scientists think get used. Tools that get in the way get worked around. The real question is whether AI stays at the edges or becomes part of the core. Ultimately, it is about basic sanity for the person at the bench.
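As an illustration of governance built into the workflow rather than bolted on afterward, here is a minimal, hypothetical sketch in which every AI query made from a notebook entry leaves an audit record as a side effect of normal work. The run_ai_query function and the JSONL log format are assumptions made for the example, not a specific product's interface.

```python
import json
from datetime import datetime, timezone


def run_ai_query(experiment_id: str, question: str, model: str) -> str:
    """Run a (stubbed) AI query and append an audit entry to a JSONL log."""
    # Stand-in for a real model call; the point is the logging around it.
    answer = f"[{model} response to: {question}]"
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "experiment_id": experiment_id,
        "model": model,
        "question": question,
        "answer": answer,
    }
    # Append-only, one line per query: reviewers can later reconstruct who
    # asked what, of which model, about which experiment, and when.
    with open("ai_audit_log.jsonl", "a", encoding="utf-8") as log:
        log.write(json.dumps(entry) + "\n")
    return answer


print(run_ai_query("EXP-0142", "Why is the CV higher than in the last run?", "approved-model-v1"))
```

Because the trail is written automatically, compliance stops depending on scientists remembering to document their AI use after the fact.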
An integrated informatics ecosystem can provide significant benefits for labs. The challenge will be keeping all the tools lab scientists need while incorporating AI into a system that meets the organization’s governance requirements. The upside is faster development and innovation while maintaining compliance and accessibility.