A little too quiet
It was during a routine laboratory walkthrough at a large pharmaceutical company, alongside members of the management team, that Jason recognized an enormous opportunity to improve operations. The lab was arranged into three bays containing about 50 liquid chromatography (LC) systems. During the walkthrough, the Vice President asked how frequently these costly instruments were used, and the laboratory staff claimed that all the LC systems were in regular use. However, Jason noticed something was missing: in a lab with 50 LC systems in operation, the click, hiss, and hum of the pumps should fill the room. Based on the lack of noise, Jason estimated that only about three of the systems were in use at the time. If this was the case in all labs across the department, how many of the 140 LC systems were sitting unused? This was the “lightbulb” moment: the realization that the organization needed accurate data to understand asset utilization. This information would be essential to inform data-driven asset lifecycle decisions and improve efficiency throughout the organization. The challenge would be implementing the right system to provide these insights.
Until this point, asset utilization data collection consisted largely of asking the opinions of various laboratory personnel. This piecemeal approach did not yield robust data: for any given instrument, one person might say “oh, I use this all the time”, whereas someone else would claim “that thing? It’s hardly ever used.” With so much variability, Jason knew the only way forward was to acquire accurate data to support asset lifecycle decisions.
When contemplating how to acquire utilization data, Jason and his colleagues initially explored the idea of manually reviewing log files. They quickly realized that this would be too time-consuming and inefficient, and continued investigating different strategies. When Jason discovered PerkinElmer’s Asset Genius, he knew it was the right solution because it captured this essential data digitally. At the time, another force driving the implementation of Asset Genius was the VP’s decision not to acquire any new LC systems until utilization of the existing systems had been evaluated.
Implementing Asset Genius
Part of the asset utilization review process was determining which parameters would be used to support key decisions. Early on, Jason realized that for utilization data to be compelling and actionable, it needed to be associated with the asset information (model number, age of asset) and the service history. “If you’re going to purge underutilized assets, you don’t want to purge the newer ones”, explains Jason.
“As we started to look at utilization across assets, we realized that the way that one asset runs may be very different from another. For example, a moisture sorption balance might require a full day to analyze a sample, whereas a UPLC can complete a sample within 5 minutes”, says Jason. Therefore, relying on runtime or utilization hours to determine asset utilization didn’t provide a full picture.
Jason realized it would be more consistent to evaluate asset utilization in days. “It’s a lot easier to explain across any asset type: was the asset used on any given day, yes or no?” he explains. Measured in utilization days, the variation in data accuracy across instruments was less than 10 percent, accurate enough to give a reliable picture of utilization. An additional benefit is that utilization days support universal cutoff periods, such as a week, month, or quarter, that apply across all asset types.
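To illustrate the idea, the “used on any given day, yes or no” metric reduces to counting distinct calendar days on which an instrument ran at least once. The sketch below uses made-up run timestamps; in practice this data would come from a monitoring system such as Asset Genius, and the variable names here are illustrative only.

```python
from datetime import datetime

# Hypothetical run-start timestamps for one instrument (illustrative data,
# not taken from the case study).
run_starts = [
    "2021-03-01 09:15", "2021-03-01 14:40",  # two runs on the same day
    "2021-03-03 08:05",
    "2021-03-10 11:30",
]

# A day counts as "utilized" if the instrument ran at least once that day,
# so we collect the set of distinct calendar dates.
utilization_days = {
    datetime.strptime(ts, "%Y-%m-%d %H:%M").date() for ts in run_starts
}

print(len(utilization_days))  # 3 utilization days, despite 4 runs
```

Because the metric collapses any number of runs into a single yes/no per day, a moisture sorption balance and a UPLC can be compared on the same scale.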
Data-driven asset lifecycle decisions
With utilization now measured in days, Jason and his colleagues created data buckets ranging from a week to a full year. Using these buckets, they identified approximately 30 LC systems that were used for less than a full week in a year. Interestingly, some of the systems with the lowest utilization were among the newest LCs. Because Asset Genius combines utilization data with asset information such as age and service history, users can understand the context behind the numbers. With this information readily available, Jason and his team reviewed all the systems and decided to eliminate the older ones, with the added benefit of pushing utilization onto the newer systems. The review also revealed sufficient capacity: no additional LC systems were required, and the only purchase needed was additional detectors for the existing systems.
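The bucketing step described above can be sketched as grouping assets by their annual utilization-day counts. The asset IDs, counts, and bucket boundaries below are assumptions for illustration, not actual figures from the lab.

```python
from collections import defaultdict

# Hypothetical annual utilization-day counts per LC system.
annual_days = {"LC-001": 3, "LC-002": 210, "LC-003": 45, "LC-004": 6}

def bucket(days):
    """Assign an asset to a utilization bucket based on days used per year."""
    if days < 7:
        return "under 1 week"
    if days < 30:
        return "under 1 month"
    if days < 90:
        return "under 1 quarter"
    return "regular use"

# Group asset IDs by bucket.
buckets = defaultdict(list)
for asset, days in annual_days.items():
    buckets[bucket(days)].append(asset)

print(dict(buckets))
# e.g. the "under 1 week" bucket holds the candidates for elimination
```

In the case study, the equivalent of the “under 1 week” bucket surfaced roughly 30 systems, which were then reviewed against age and service history before any were removed.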
Not only did the accurate utilization data in Asset Genius help guide intelligent purchasing decisions; by eliminating redundant assets, the team also freed up valuable lab space, about 15 percent of the total. With over 30 LC systems removed from the lab, the team plans to create more functional workstations and sample preparation areas to improve sample flow. The added space has also been extremely valuable during the COVID-19 pandemic, as it supports appropriate social distancing measures to ensure the safety of lab personnel.
Deeper insights: Bottlenecks and maintenance optimization
In addition to informing purchasing decisions, utilization data obtained with Asset Genius will be used to identify bottlenecks and improve laboratory workflows. This is something Jason is working on, with hopes of examining a much larger data set to provide a better picture for the entire company and develop a utilization goal to maximize asset use.
According to Jason, it will be important to look at all aspects of asset utilization to improve workflows. This includes factors such as time spent on sample preparation for a run. “LC run prep can be anywhere from a few minutes to a few hours, and that all plays into the instrument being tied up for additional time where it is in queue for a run”, he explains. Asset Genius provides the detailed data necessary for users to identify and eliminate these bottlenecks to improve efficiency and productivity.
Asset Genius also provides asset age and service history data that may be leveraged to optimize preventative maintenance schedules. A highly utilized instrument, for example, will benefit from more frequent preventative maintenance to avoid unanticipated downtime. Conversely, identifying underutilized assets and reducing the frequency of their preventative maintenance will save on costs. In performing a reliability assessment, users can also compare asset runtime versus downtime, along with service history, to identify assets for decommissioning.
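A runtime-versus-downtime reliability check of the kind described above can be sketched as a simple screen: flag assets whose repair downtime is large relative to productive runtime and whose repair count is high. The thresholds, field names, and figures below are hypothetical, not drawn from the case study or from Asset Genius itself.

```python
# Hypothetical annual reliability snapshot per asset (illustrative values).
assets = [
    {"id": "LC-007", "runtime_h": 120, "downtime_h": 200, "repairs": 6},
    {"id": "LC-012", "runtime_h": 1800, "downtime_h": 40, "repairs": 1},
]

def decommission_candidates(assets, downtime_ratio=0.5, max_repairs=3):
    """Flag assets whose downtime rivals their runtime and that need
    frequent repairs -- candidates for a decommissioning review."""
    flagged = []
    for a in assets:
        ratio = a["downtime_h"] / max(a["runtime_h"], 1)
        if ratio > downtime_ratio and a["repairs"] > max_repairs:
            flagged.append(a["id"])
    return flagged

print(decommission_candidates(assets))  # ['LC-007']
```

A screen like this only nominates candidates; as in the case study, the final call would still weigh asset age and the full service history.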
The combination of utilization history and repair history for an asset can also be used to optimize service contracts. Maintaining a premium service contract for underutilized assets is wasteful, and a less comprehensive option can be more affordable. With accurate utilization rates, asset age, and service history, labs can better assess the potential risks associated with reducing coverage on less critical assets and make informed decisions regarding service contracts. “This is where you see the return on data-driven decision-making”, explains Jason, “these decisions will have a big impact”.
Through the implementation of PerkinElmer’s Asset Genius system, Jason found a way to obtain detailed asset utilization data to support asset lifecycle decisions, resulting in significant savings for the laboratory. These results are just the beginning: as more data is collected and analyzed across assets and laboratories, more informed decisions will be made throughout the organization. By harnessing the power of Asset Genius, laboratories are digitizing more of their data, enabling data-driven decisions that support greater efficiency and productivity in their asset management programs.