
Product Focus

Diverse, Complementary Techniques

Particle sizing methodologies range from straightforward sieving and sedimentation analysis to advanced laser-based light scattering techniques, microscopy/imaging, nanoparticle tracking analysis, and others.

by
Angelo DePalma, PhD

Angelo DePalma is a freelance writer living in Newton, New Jersey. You can reach him at angelodp@gmail.com.


Dozens of techniques have emerged for analyzing particles. A particle sizing lab may use optical microscopy, scanning electron microscopy, diffraction, dynamic light scattering, and particle counting. “It depends on what you’re trying to accomplish and the materials you’re analyzing,” says Philip Plantz, PhD, who manages advanced applications at Microtrac (Largo, FL). “Each technique provides a piece of information to enable characterization.”

Particle sizing is critical for almost every manufacturing segment. But growth in such key industries as biotechnology, pharmaceuticals, nanotechnology, and energy exploration will propel the global market for particle size analyzers at an annual growth rate of 4.9 percent, according to a report, Particle Size Analysis Market by Technology, Industry & End User—Global Forecasts to 2018, sold by report aggregator RnR Market Research (Dallas, TX). Demand for particle-sizing instruments will grow most robustly in the Asia-Pacific region, approaching 6 percent per year, due to increased outsourcing of pharmaceutical manufacturing and R&D to China and India.

Overall, the report estimates 2018 instrumentation sales at $290 million. North America holds 33 percent of the global market, with Europe right behind at 32 percent. Top instrumentation players are Malvern Instruments, Horiba, Beckman Coulter, and Microtrac.

Dynamic imaging analysis

Several companies, including Microtrac, Micromeritics, Horiba, Retsch, and Fluid Imaging Technologies, advertise particle characterization based on dynamic imaging particle analysis (DIPA). Microtrac’s system uses sieve analysis and a strobe LED light source to illuminate particles, while Horiba and Micromeritics collect images of silhouettes produced by particles flowing in a stream.

Fluid Imaging (Scarborough, ME) is unique in its reliance on light microscopy and fast imaging methods, which, in addition to accurate sizing, acquire data on more than 30 additional parameters. Lew Brown, Fluid Imaging technical director, has written extensively on DIPA as it applies to the characterization of materials, biologics, minerals, powders, and other substances in the size domain down to a lower limit of approximately one micrometer.

Static microscope-based imaging is a labor-intensive technique that acquires one frame at a time. Today, computers and lasers have automated sizing and allow collection of an unprecedented amount of information in a short time. “These tools take the burden off the user,” says Plantz. But conventional microscopy is still slow.

Fluid Imaging’s FlowCAM system images particles flowing past the microscope optics in real time. A high-speed flash “freezes” particles within the image field and is synchronized with the camera shutter.

Faster techniques such as Coulter counting and laser-based dynamic light scattering (DLS) provide little information other than nominal size, or “equivalent spherical diameter.” DLS analysis, for example, assumes that particles are perfect spheres, which they most often are not, and therefore cannot accurately characterize irregularly shaped particles.
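
For readers who want the arithmetic behind “equivalent spherical diameter”: DLS converts a measured diffusion coefficient into the diameter of a sphere that would diffuse at the same rate, via the Stokes-Einstein relation. A minimal sketch, assuming water at 25 °C; the diffusion coefficient and other values below are illustrative, not taken from any instrument mentioned here.

```python
import math

def hydrodynamic_diameter(diffusion_coeff_m2_s, temp_k=298.15, viscosity_pa_s=0.00089):
    """Stokes-Einstein relation: d_h = k_B * T / (3 * pi * eta * D).

    Returns the equivalent spherical (hydrodynamic) diameter in meters.
    Defaults assume water at roughly 25 degC.
    """
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temp_k / (3 * math.pi * viscosity_pa_s * diffusion_coeff_m2_s)

# Example: an assumed measured diffusion coefficient of 4.9e-12 m^2/s
d_h = hydrodynamic_diameter(4.9e-12)
print(f"Hydrodynamic diameter: {d_h * 1e9:.0f} nm")  # roughly 100 nm
```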

Another drawback of systems that provide simple size distributions is they cannot distinguish, at the high end of the distribution, very large particles from aggregates of two or more smaller particles.

By providing size, shape, and other properties, imaging-based microscope analysis delivers true particle characterization rather than simple sizing. “They’re particle sizers, not analyzers,” Brown observes of laser-based systems. “By operating at thirty frames per second, microscope-based techniques can gather enormous quantities of data very quickly. Plus, we have a real image of each particle, so we can do filtering and pattern recognition.”
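
The additional parameters an imaging system reports largely reduce to geometry computed on each particle’s silhouette. A minimal sketch of two common descriptors, equivalent circular diameter and circularity, using invented values rather than actual FlowCAM output:

```python
import math

def shape_parameters(area_um2, perimeter_um):
    """Derive basic DIPA-style descriptors from a particle silhouette.

    - equivalent circular diameter: diameter of a circle with the same area
    - circularity: 4*pi*A / P^2, equal to 1.0 for a perfect circle
    """
    ecd = 2 * math.sqrt(area_um2 / math.pi)
    circularity = 4 * math.pi * area_um2 / perimeter_um ** 2
    return ecd, circularity

# A jagged particle of 50 um^2 area and 40 um perimeter scores well below 1.0
ecd, circ = shape_parameters(area_um2=50.0, perimeter_um=40.0)
print(f"ECD = {ecd:.1f} um, circularity = {circ:.2f}")
```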

The main drawback of DIPA is that it doesn’t work with particles smaller than one micrometer—not terribly small by today’s standards for nanomaterials. This is a consequence of the diffraction limit of visible light. It is possible to analyze images from electron and atomic force microscopy with DIPA software, but the drawback is throughput: It can take hours to acquire images of just a few particles with those techniques. “Users would not be able to characterize statistically relevant numbers of particles in a reasonable time frame,” Brown explains.
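
The one-micrometer floor follows from the resolving power of visible-light optics, which the Abbe relation approximates as the wavelength divided by twice the numerical aperture. A back-of-the-envelope sketch with typical, assumed values:

```python
# Abbe diffraction limit: smallest resolvable feature ~ wavelength / (2 * NA)
wavelength_nm = 550       # mid-visible (green) light, an assumed typical value
numerical_aperture = 0.5  # assumed objective NA for a flow-imaging system
limit_nm = wavelength_nm / (2 * numerical_aperture)
print(f"Resolution limit ~ {limit_nm:.0f} nm")
# Reliable sizing needs several resolved pixels per particle, which pushes the
# practical lower limit for image-based sizing toward one micrometer.
```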

Microscope-based DIPA shares one inherent limitation with laser-based methods: sensitivity to the orientation of particles as they travel through the imaging field. Depending on its orientation at the instant of image (or signal) acquisition, a rod-like particle may appear as anything from its full length down to a small circle, and flakes may appear as rods. A dimensionally constrained flow cell, which gives particles less freedom to rotate, largely eliminates this problem.
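
The orientation effect is simple trigonometry: a rod’s apparent length is its true length scaled by the cosine of its tilt out of the image plane. A minimal sketch with illustrative numbers:

```python
import math

def projected_length(true_length_um, tilt_deg):
    """Apparent length of a rod tilted out of the image plane by tilt_deg."""
    return true_length_um * math.cos(math.radians(tilt_deg))

# A 20 um rod viewed flat, at 60 degrees, and nearly end-on
for tilt in (0, 60, 85):
    print(f"tilt {tilt:2d} deg -> apparent length {projected_length(20, tilt):.1f} um")
```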

Single particle analysis

According to Kerry Hasapidis, president of Particle Sizing Systems (Port Richey, FL), the hottest trend in particle sizing today is the move toward single-particle analysis. “Labs are returning to single-particle methods, like microscopy, that incorporate some aspect of image analysis.”

The scientific impetus behind the change is the real-world problem of particle outliers. Almost any laser-based technique gives mean particle size distributions, but critical quality attributes are rarely based on shifts in mean particle size of plus or minus two to three microns. “The key is outliers—very fine particles that change viscosity, or very large particles,” Hasapidis says. Large outliers clog pens and inkjet print heads, or, in drugs, may cause strokes by blocking blood vessels.
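
The arithmetic behind Hasapidis’s point: a handful of large outliers barely shifts a number-weighted mean, yet dominates the sample volume (and the risk). A minimal sketch with invented diameters:

```python
# 10,000 particles at 2 um plus just five 50 um outliers (values are illustrative)
diameters_um = [2.0] * 10_000 + [50.0] * 5

number_mean = sum(diameters_um) / len(diameters_um)
total_volume = sum(d ** 3 for d in diameters_um)           # volume scales as d^3
outlier_volume = sum(d ** 3 for d in diameters_um if d > 10)

print(f"Number-weighted mean: {number_mean:.2f} um")        # barely above 2 um
print(f"Volume in outliers:   {100 * outlier_volume / total_volume:.0f} %")  # ~89 %
```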

Single-particle techniques are particularly useful at very low concentrations, such as contamination measurements. “That’s where you are looking for a needle in a haystack,” explains Philip Plantz. “Its drawback is the lower measurement limit of about one micron. Diffraction methods get down to ten nanometers, and dynamic light scattering down to one nanometer.”

Particle Sizing Systems has been riding the wave toward single-particle analysis, which combines the speed of laser-based particle characterization with the ability to report not averages but actual particle-size distributions for samples of hundreds of thousands of particles in minutes.

The basic technique grew from laser systems that counted contamination particles on the basis of how much light individual particles block, not on an aggregate measurement transformed into particle size distributions through mathematical manipulation.

“Think of a very fast microscope, but instead of having someone view, count, and measure particles, we use a detector based on the amount of light a particle obscures,” Hasapidis says. “It combines the speed of a laser and the ability to discriminate a lot of particles in a short time.”
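
Conceptually, each particle passing the beam produces a pulse proportional to the light it blocks, and a calibration curve built from size standards maps pulse height back to a diameter. A minimal sketch of that lookup; the calibration points and function names are hypothetical, not Particle Sizing Systems’ actual method:

```python
import bisect

# Hypothetical calibration: fractional obscuration pulse height -> diameter (um),
# as would be measured with monodisperse size standards
calibration = [(0.001, 1.0), (0.004, 2.0), (0.025, 5.0), (0.10, 10.0)]

def pulse_to_diameter(pulse):
    """Linearly interpolate a diameter from a single particle's pulse height."""
    pulses = [p for p, _ in calibration]
    i = bisect.bisect_left(pulses, pulse)
    i = min(max(i, 1), len(calibration) - 1)
    (p0, d0), (p1, d1) = calibration[i - 1], calibration[i]
    return d0 + (d1 - d0) * (pulse - p0) / (p1 - p0)

# Each detected pulse becomes one counted, sized particle
for pulse in (0.002, 0.03, 0.08):
    print(f"pulse {pulse:.3f} -> {pulse_to_diameter(pulse):.1f} um")
```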


For additional resources on particle sizing and characterization, including useful articles and a list of manufacturers, visit www.labmanager.com/particle-sizing