Buried in a working paper penned in October 1980 by sociologist Charles Tilly was a consequential union: a marriage of two words, forged out of necessity to describe a concept not yet named, with repercussions not yet imagined.
Such was the first recorded mention of “big data”—a phrase that would enter the common vernacular and a practice that would take hold across industries and geographies.
To understand how data can and will be used to shape decisions in the lab, one must first understand what decisions lab managers need to make in the first place.
A new era of science means new decisions for lab managers
Ten years ago, the titles of those who supported a lab’s operations were much the same as they are today—technicians, lab managers, and IT managers. Yet the responsibilities under their purview, and the challenges that come with them, have changed drastically in the decade since.
Today’s scientists don’t just need their equipment to be operational; they need it to be transformational. Researchers now expect their tools to act as both collectors and reporters of data. To empower scientists with the data they require, operations professionals are now expected to be experts in cloud infrastructure, data security, and encryption. Meanwhile, many of the assets under their jurisdiction were manufactured before the internet even existed.
“I’ve definitely seen a paradigm shift,” says Russell Lund, facility manager at Zai Lab in Menlo Park, CA. “A lab manager needs to intricately understand every piece of equipment—what each does, why it does it, and what to do if it goes down.”
Such growing responsibilities give lab managers a litany of new decisions. How can we draw new insights from old equipment? How is our data encrypted? How do we get data into the right hands with ease and keep it out of the wrong hands with certainty?
As Lund describes, today’s lab manager role is highly technical: “I have to ensure that the computer is talking to the machine and the data is being stored properly. That means maintaining frequent contact with techs and even the representatives of the machines themselves.”
Luckily, Internet of Things (IoT) technology has enabled the collection of thousands of data points without human involvement. Today, sensors are embedded in new equipment, while legacy assets can be connected to the cloud via inconspicuous and easy-to-install sensors.
Outfitting a lab to collect data for today’s needs alone is shortsighted. It’s imperative that those seeking to leverage data in the lab consider not only the expanded data pipeline of today, but the colossal one of tomorrow.
Artificial intelligence requires operational excellence
The questions answered with data today will be asked by artificial intelligence (AI) tomorrow. Yesterday’s executive hypotheses are today’s data-driven plans and tomorrow’s fully automated discoveries.
In the not-so-distant future, robotics will handle automation as AI evaluates protocols. Eventually, discovery will require little to no human involvement at all.
But as Lily Huang, associate scientist at Pliant Therapeutics, explains, the initial impact of AI will be a welcomed one: “Many professionals might be worried about robots and AI leading to a high rate of unemployment for manpower-based jobs. I personally think that machines, especially smart machines, will take the boring tasks away from their human counterparts. AI technology has the potential to assist daily operations in the lab as well as facilitate the improvement in various processes. If well designed and executed, AI is able to identify process flaws and procedure redundancy, in addition to catching operational defects and optimization opportunities.”
As the tidal wave of data crests, some researchers are still recording measurements on paper, manually transferring their notes to spreadsheets, and individually exporting spreadsheets into databases for processing and storage.
If such habits are antiquated today, they’ll certainly be detrimental tomorrow.
Any remaining “if it ain’t broke” devotion to paper notebooks will break under the weight of a data-hungry, AI-shaped future. But eventually, so will manual collection of any kind.
To be truly transformative, AI requires input from mass quantities of data. Its collection must be copious, reliable, and automatic. Such widespread collection requires universal connection of every asset, every metric, and even the lab environment itself. IoT technology was born for such a time as this. “We were recording data by manually handwriting in a spreadsheet twice per day,” explains Joanna Schmidt, lab manager at Ionis Pharmaceuticals. “Since installing an online system, we can see all temperature changes of all freezers over time on one page. I’m moving users to under-utilized freezers to help increase the lifespan of the units.”
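Schmidt’s one-page view of freezer temperatures can be approximated in a few lines of code. The sketch below is a minimal illustration, not any vendor’s actual API: the asset names, setpoint, and readings are all hypothetical, and a real deployment would stream readings from the IoT platform rather than a hard-coded list.

```python
from statistics import mean

# Hypothetical temperature log: (freezer_id, temperature_c) readings
# as they might arrive from retrofitted IoT sensors.
readings = [
    ("freezer-A", -79.5), ("freezer-A", -78.9), ("freezer-A", -72.0),
    ("freezer-B", -80.1), ("freezer-B", -79.8), ("freezer-B", -79.9),
]

SETPOINT_C = -80.0   # nominal ultra-low-temperature setpoint
TOLERANCE_C = 5.0    # readings warmer than setpoint + tolerance count as excursions

def summarize(readings):
    """Group readings by asset and count excursions beyond tolerance."""
    by_asset = {}
    for asset, temp in readings:
        by_asset.setdefault(asset, []).append(temp)
    report = {}
    for asset, temps in by_asset.items():
        excursions = [t for t in temps if t > SETPOINT_C + TOLERANCE_C]
        report[asset] = {
            "mean_c": round(mean(temps), 2),
            "excursions": len(excursions),
        }
    return report

print(summarize(readings))
```

Grouping readings by asset and counting excursions is the same logic that lets a manager spot a struggling unit—or an under-utilized one—at a glance.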
Some growing pains remain
While most new laboratory equipment on the market comes with cloud connectivity built in, vendor-specific solutions solve one problem while creating another: data is siloed into redundant, clunky dashboards, rendering it all but useless to those who need it.
Meanwhile, according to internal research by Elemental Machines, an estimated 300 million assets aren’t yet connected to anything. Most are fully operational and widely familiar (balances, centrifuges, freezers, etc.).
But thanks to turnkey IoT sensors and vendor-agnostic cloud solutions, the world’s unconnected assets will live on and live as one. Rather than being sidelined in favor of newer connected equipment, legacy assets can be retrofitted in seconds with inconspicuous sensors. As such, tomorrow’s connectivity needs can be met while stewarding yesterday’s investments.
In the lab, data maturity advances in reverse
In most categories, maturity is a quality that comes effortlessly to the aged and arduously to the young—not so for data maturity. When it comes to data, today’s startups spring to life already pushing the boundaries of its collection, harnessing its insights, and leaning on AI to make sense of its root causes. Despite their bound booklets titled “Digitization Strategy” and secured rooms labeled “Lab of the Future,” titans of industry are challenged with wriggling their way out of longstanding practices, breaking free of tired infrastructure, and asserting their way to modern data practices over the objections of sometimes thousands of internal stakeholders. Inertia is real.
Despite the unequal hurdles facing startups and industry leaders, achieving data maturity in the lab remains imperative for both. The organizations that dominate market share tomorrow will be those that prioritize data today.
Amidst the myriad models and guidelines for data maturity in other sectors, practical handrails for leveraging data in the lab are few and far between. As such, the following offers an outline of the five stages of data sophistication in the lab. Evaluate your organization’s standing using the information below.
The Five Stages of Laboratory Data Sophistication
Stage 1: Elementary
- Asset data is available but siloed
- Equipment data populates single-asset interfaces
- Some assets remain unconnected
- Accuracy is questioned
- Access is cumbersome
Stage 2: Irregular
- Organizational data strategy plans exist but remain unclear
- Sensors are deployed for complete lab connectivity
- Data is trusted but siloed either by seniority or asset type
- Progress is stunted as data strategy is not fully prioritized
Stage 3: Championed
- Data strategy and vision are formalized, adopted, and concise
- Lab director champions the use of data and analytics
- Algorithms detect anomalies and trigger alerts
- A single universal dashboard enables access to all data anytime, anywhere
- Primary and secondary data are integrated
- Humans remain integral to analysis
Stage 4: Committed
- Lab director and company executives fully buy in to organization-wide data and analytics strategies
- Data informs business decisions and lab activity alike
- Data and analytics are viewed as a key component in driving lab innovation
- AI details the root causes of reactions, anomalies, and errors, and predicts those to come
Stage 5: Transformative
- Data and AI are central to discovery
- Discoveries are fully automated without human involvement
- Robotics handle execution while AI evaluates protocols and results
- Utilization data informs all purchasing decisions in the lab and across the organization
- A chief data officer maintains a seat on the board
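The anomaly alerting described in Stage 3 typically reduces to simple statistics. The sketch below is a generic rolling z-score detector—an assumed, illustrative approach, not any platform’s proprietary algorithm—and the sample temperature series is invented for demonstration.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag points that deviate from the trailing window's mean
    by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Skip flat baselines (sigma == 0) to avoid division by zero
        if sigma and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A stable freezer trace with one warm excursion at index 6
temps = [-80.0, -79.9, -80.1, -80.0, -79.8, -80.1, -74.5, -80.0]
print(detect_anomalies(temps))
```

A point is flagged when it sits more than `threshold` standard deviations from the mean of the preceding window; production systems layer debouncing, escalation rules, and notification routing on top of this core idea.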
For now, achieving a “transformative” level of data maturity may sound like a lofty goal and a clear competitive advantage. But a day is coming when it will be essential for survival and thus become the status quo.
Thanks to IoT, AI, and organizational prioritization of data maturity, the lab of the future is coming into focus.
Charles Tilly likely had no idea in 1980 that his offhand coinage of “big data” would one day become ubiquitous. Lab managers had little indication of how quickly assets would come to measure themselves. But for anyone willing to listen, every indication is that AI will fulfill the promises enabled by big data.
For legacy scientific and research enterprises, mature handling of data will determine whether their reign continues or ends. For emerging players, data maturity could be their ticket to disruption. In both cases, the lab managers driving the automation and optimization of data collection will earn their place as the linchpins who enabled long-elusive discoveries. The future of the lab is bright.