Statistical Technique Cleans and Improves Nanotechnology Data

A new statistical analysis technique that identifies and removes systematic bias, noise and equipment-based artifacts from experimental data could lead to more precise and reliable measurement of nanomaterials and nanostructures likely to have future industrial applications.

by Georgia Institute of Technology

Known as sequential profile adjustment by regression (SPAR), the technique could also reduce the amount of experimental data required to make conclusions, and help distinguish true nanoscale phenomena from experimental error. Beyond nanomaterials and nanostructures, the technique could also improve reliability and precision in nanoelectronics measurements – and in studies of certain larger-scale systems.


Accurate understanding of nanoscale properties is critical to the development of future high-volume industrial applications for nanomaterials and nanostructures because manufacturers will require consistency in their products.

“Our statistical model will be useful when the nanomaterials industry scales up from laboratory production because industrial users cannot afford to make a detailed study of every production run,” says C. F. Jeff Wu, a professor in the Stewart School of Industrial and Systems Engineering at the Georgia Institute of Technology. “The significant experimental errors can be filtered out automatically, which means this could be used in a manufacturing environment.”

Sponsored by the National Science Foundation, the research was reported in the journal Proceedings of the National Academy of Sciences. The paper is believed to be the first to describe the use of statistical techniques for quantitative analysis of data from nanomechanical measurements.

Nanotechnology researchers have long been troubled by the difficulty of measuring nanoscale properties and separating signals from noise and data artifacts. Data artifacts can be caused by such issues as the slippage of structures being studied, surface irregularities and inaccurate placement of the atomic force microscope tip onto samples.

In measuring the effects of extremely small forces acting on extremely small structures, signals of interest may be only two or three times stronger than experimental noise. That can make it difficult to draw conclusions, and potentially masks other interesting effects.

“In the past, we have really not known the statistical reliability of the data at this size scale,” says Zhong Lin Wang, a Regents’ professor in Georgia Tech’s School of Materials Science and Engineering. “At the nanoscale, small errors are amplified. This new technique applies statistical theory to identify and analyze the data received from nanomechanics so we can be more confident of how reliable it is.”

In developing the new technique, the researchers studied a data set measuring the deformation of zinc oxide nanobelts, research undertaken to determine the material’s elastic modulus. Theoretically, applying force to a nanobelt with the tip of an atomic force microscope should produce consistent linear deformation, but the experimental data didn’t always show that.

In some cases, less force appeared to create more deformation, and the deformation curve was not symmetrical. Wang’s research team attempted to apply simple data-correction techniques, but was not satisfied with the results.

“The measurements they had done simply didn’t match what was expected with the theoretical model,” explains Wu, who holds a Coca-Cola chair in engineering statistics. “The curves should have been symmetric. To address this issue, we developed a new modeling technique that uses the data itself to filter out the mismatch step-by-step using the regression technique.”
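
The paper itself gives the full SPAR formulation; as a rough illustration of the step-by-step idea Wu describes, the toy Python sketch below fits the expected straight-line force-deflection profile, flags the single worst-fitting point as a likely artifact, removes it, and refits, repeating until the remaining scatter looks like ordinary noise. The function and variable names (sequential_adjustment, force, deflection) are invented for this example, and the published method adjusts profile terms by regression rather than simply discarding points.

import numpy as np

# Toy sketch only -- NOT the published SPAR algorithm. It mimics the
# step-by-step idea: fit the expected profile, identify the largest
# systematic deviation, adjust, and refit.
def sequential_adjustment(force, deflection, max_steps=5):
    """Fit the expected linear force-deflection profile, repeatedly
    discarding the single worst-fitting point until the remaining
    residuals look like ordinary noise."""
    force = np.asarray(force, dtype=float)
    deflection = np.asarray(deflection, dtype=float)
    keep = np.ones(force.size, dtype=bool)

    for _ in range(max_steps):
        # Fit the straight line the physical model predicts, using only
        # the points still considered trustworthy.
        slope, intercept = np.polyfit(force[keep], deflection[keep], deg=1)
        residuals = deflection - (slope * force + intercept)

        # Stop once the largest remaining deviation is consistent with
        # random noise (here, within two standard deviations).
        sigma = residuals[keep].std()
        worst = np.argmax(np.where(keep, np.abs(residuals), -np.inf))
        if abs(residuals[worst]) <= 2.0 * sigma:
            break
        keep[worst] = False  # treat this point as an equipment artifact

    return slope, intercept, keep

In this simplified picture, the slope that survives the adjustment is what would feed a standard beam-bending calculation of the elastic modulus.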

Ideally, researchers would track down and correct the experimental causes of these data errors, but because the errors occur at such small size scales, that would be difficult, notes V. Roshan Joseph, an associate professor in the Georgia Tech School of Industrial and Systems Engineering.

“Physics-based models are based on several assumptions that can go wrong in reality,” he says. “We could try to identify all the sources of error and correct them, but that is very time-consuming. Statistical techniques can more easily correct the errors, so this process is more geared toward industrial use.”

Beyond correcting the errors, the improved precision of the statistical technique could reduce the effort required to produce reliable experimental data on the properties of nanostructures.

Ultimately, SPAR may lead researchers to new fundamental explanations of the nanoscale world.

“One of the key issues today in nanotechnology is whether the existing physical theories can still be applied to explain the phenomena we are seeing,” says Wang, who is also director of Georgia Tech’s Center for Nanostructure Characterization and Fabrication. “We have tried to answer the question of whether we are truly observing new phenomena, or whether our errors are so large that we cannot see that the theory still works.”