Photo credit: ARM Climate Research Facility Flickr
Problem: Working with the U.S. Department of Energy’s ARM Climate Research Facility, Argonne National Laboratory, a multidisciplinary research center where scientists and engineers tackle some of the world’s largest energy and environmental challenges, is tasked with measuring a wide range of climate data, including wind, soil, cloud physics, and precipitation, often in some of the world’s harshest environments.
With ARM facilities located in remote areas around the world, often in inhospitable conditions in places like Alaska’s North Slope and aboard icebreakers near Antarctica, it isn’t possible to rely on “normal” data centers with clean power and steady air conditioning to collect crucial data. The “best” available homes for data centers in ARM and Argonne’s case are sea containers and generator-reliant sites. Remote locations like these are subject to power outages, dirty power, heavy vibration, temperature swings, and other major challenges. Yet with crucial research being conducted, Argonne cannot afford for its systems to go down and risk losing vital data. The remote data centers not only need to remain online 24/7, but must also keep up with data-heavy research: the lab needs to collect anywhere from 4 KB to 4 GB of data per hour from a variety of climate instruments.
At first, the Argonne team attempted to build a homegrown data storage solution, but it soon realized that this approach was too expensive, difficult to operate, and unreliable. As a result, Argonne set out to find a storage solution that would meet its very specific requirements.
To make sure its storage could withstand extreme environments, Argonne was on the hunt for systems that were not only highly reliable but also durable. The storage needed built-in data redundancy and replication so that there would be no single point of failure in collecting data. The systems also needed to scale to hundreds of terabytes within as little physical space as possible, while keeping up with multiple streams of data input from a variety of instruments. And, perhaps the most important requirement: if the storage system was forced offline by the complications of operating in remote locations, it needed to be able to restart at the exact point where power dropped, avoiding costly data loss.
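Neither ARM nor Nexsan has published how that resume-from-power-loss behavior is implemented, but the underlying idea can be illustrated with a simple checkpointing pattern: record how far each instrument stream has been safely written, and on restart continue from that exact offset. The sketch below is purely illustrative; the file names and the read_record helper are hypothetical and are not part of any Argonne or Nexsan software.

```python
import json
import os

CHECKPOINT = "ingest_checkpoint.json"   # hypothetical checkpoint file
ARCHIVE = "instrument_data.bin"         # hypothetical archive file on the storage array

def load_offset() -> int:
    """Return the byte offset of the last safely written record, or 0."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["offset"]
    return 0

def save_offset(offset: int) -> None:
    """Atomically persist the new offset so a power cut cannot corrupt it."""
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"offset": offset}, f)
        f.flush()
        os.fsync(f.fileno())
    os.replace(tmp, CHECKPOINT)          # atomic rename

def ingest(read_record):
    """Append instrument records, checkpointing after each durable write.

    read_record is a hypothetical callable returning the next chunk of
    instrument data as bytes, or None when the stream is idle.
    """
    offset = load_offset()
    with open(ARCHIVE, "ab") as out:
        out.truncate(offset)             # discard any partial record from the outage
        while True:
            record = read_record()
            if record is None:
                break
            out.write(record)
            out.flush()
            os.fsync(out.fileno())       # make sure the data is on disk first
            offset += len(record)
            save_offset(offset)          # only then advance the checkpoint
```

In a storage array this kind of durability is handled at the block, RAID, and replication level rather than in application code, but the principle is the same: data is acknowledged only once it is safely written, so a restart can pick up exactly where the interruption occurred.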
Solution: After assessing options from a variety of storage vendors, Argonne found Nexsan’s storage solutions, specifically Nexsan’s Unity storage, to be the only system that met all of its unique requirements. Nexsan storage was deployed quickly across seven research locations, with a total of 1 PB of raw capacity spread across more than 500 hard drives in Nexsan storage arrays.
Since deploying Nexsan, Argonne has not experienced a single serious storage failure or period of downtime. “When there have been small issues, Nexsan support has been able to provide on-site staff to replace hardware, even traveling to our most remote locations like the Azores,” said Cory Stuart, Argonne’s ARM site data system and cybersecurity manager.
Ultimately, by implementing Nexsan’s storage solutions, Argonne has been able to tackle the storage challenges of collecting data in places where it’s difficult to live, let alone run a data center, and to do so at a cost far lower than building its own solution or buying from any other vendor the lab assessed.
For more information, please visit: www.nexsan.com