Lab Manager | Run Your Lab Like a Business

Advances in Imaging

Michelle Lowe Ocaña talks about new technologies and trends in imaging hardware

by Tanuja Koppal, PhD

Michelle Lowe Ocaña, senior imaging specialist and manager of the Neuroimaging Core Facility at Harvard Medical School, talks to contributing editor Tanuja Koppal, PhD, about new technologies and trends in imaging hardware and software that are affecting what she does in her lab. She points out some areas that still need improvement and offers useful advice on what to look for and how to think strategically when upgrading or investing in new imaging systems.


Q: What are the imaging techniques commonly used in your facility and for what applications?

A: Our facility supports a wide range of researchers. Our most popular systems at the moment are the confocal microscopes, array tomography, and the whole slide scanners. Whole slide scanning—where the tissue is cut into sections and each section is imaged in its entirety—has become a first step in basic science. The ability to see an entire organ in slices has completely transformed how basic research is done. A researcher can look at an entire brain and see where a single neuron connects to another, and how disease starts or spreads. Scanned whole slides also have an advantage over glass slides for archival purposes, as the images can be saved and accessed remotely. Confocal microscopy allows the researcher to optically section tissue to view processes and metabolites within individual cells. It lets researchers determine where a protein of interest is located within an individual cell, as well as within an organ.

Cerebellum captured on Olympus VS120 Virtual Slide Scanning Microscope. Image: Stephanie Rudolph, PhD, Wade Regehr Lab, Department of Neurobiology, Harvard Medical School

Array tomography is an imaging technique that creates super-high-resolution volumes of tissue. The technique requires the tissue to be embedded in a plastic resin to maintain structural integrity. The plasticized tissue is cut into ultrathin (50-200 nm) serial sections using a diamond knife and then mounted on a glass slide. Each slide can hold a ribbon of a hundred serial sections of tissue. Antibodies are added to the slide to attach to precise protein targets. The serial sections are imaged using basic fluorescence microscopy and the resulting images are compiled into a volume stack. The tissue can be eluted with chemicals that remove the antibodies, then restained with new antibodies for other protein targets. The process is repeated until the tissue is spent of all antigenicity, or until all the proteins of interest have been identified and imaged. The end result tells the researcher exactly where a protein is within the volume of the slice and exactly where it is in relation to all the other proteins imaged.
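The final compilation step described above, combining serial-section images into a volume stack, can be sketched in Python with NumPy. The section count, image dimensions, and random pixel data below are purely illustrative; in a real pipeline each 2-D array would be a registered fluorescence image read from the microscope.

```python
import numpy as np

# Illustrative stand-ins for aligned serial-section images; in practice
# these would be registered images read from disk, not random data.
n_sections, height, width = 100, 512, 512
rng = np.random.default_rng(0)
sections = [rng.integers(0, 65535, size=(height, width), dtype=np.uint16)
            for _ in range(n_sections)]

# Stack the 2-D sections along a new leading axis to form a (z, y, x) volume.
volume = np.stack(sections, axis=0)
print(volume.shape)  # (100, 512, 512)
```

Once assembled, the volume can be sliced along any axis to locate a labeled protein in three dimensions relative to the other imaged targets.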

Q: What changes/improvements have you seen in imaging hardware and software in recent years?

A: There have been so many improvements in imaging over the years that it’s hard to keep track of them all. Overall, hardware has become cheaper, smaller, and more efficient. Imaging cameras have become faster and more sensitive. Pixel technology has advanced so much that it’s hard to believe sensitive photon collection was out of reach for most researchers just 10 years ago. I can now buy, for under $40,000, a camera that captures 80 percent of incoming photons at a frame rate of 100 fps. Lasers and laser combiners have become smaller, less expensive, and easier to integrate. I used to have a bunch of gas lasers with all kinds of fans pushing the heat out of the room. The lasers needed to be internally aligned and the gas had a relatively short lifetime. Today, I can buy a laser online that fits in the palm of my hand, lasts at least twice as long as the old gas tubes, and puts out a minimal heat load.

The biggest change in technology, however, is in the software and the computers that drive it. It’s mind-blowing how quickly computers have evolved. My facility can’t run without them. Everything is software driven. Every year computers improve in speed, drive space, graphics capabilities, and cost. These improvements directly affect how we image things and how we view imaging as a tool. More hard drive space means we can collect more data, and improved write and transfer speeds help us collect that data faster. Software has had to keep pace with this exponential growth, acquiring and computing on ever larger datasets, ever faster.

Q: What are some areas that still need improvement?

A: Transfer speed is an area that still needs improvement. In some of our equipment, the software is designed to acquire images more slowly than the hardware can run because of the transfer speed from our imaging systems to the computer hard drive. Large data analysis is also a bottleneck. Big data is growing exponentially and becoming mainstream in research. The ability to store, retrieve, and measure the data collected can be very expensive and time-consuming.

Q: Are there any specific concerns with sample prep or post-data analysis that you think readers should be aware of in imaging?

A: In regard to sample prep, little things add up. Paying close attention to small details has big payoffs at the end. Don’t rush it and keep it clean. Every single step deserves your undivided attention, from cutting to sealing the coverslip. Make sure you have a good system to name your samples when saving them for future use. Coming up with smart, easy-to-read naming keys for your samples will improve your science and make it easy for you to find your data quickly without having to repeat experiments. Make a naming key, stick with it, and put it everywhere. Being proactive and consistent with your naming protocol will make you a hero in your lab long after you are gone.
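As one illustration of such a naming key (the fields and their order here are a hypothetical convention, not Ocaña's actual scheme), a small helper can build sortable, self-describing file names:

```python
from datetime import date

def sample_name(project, subject, stain, section, ext="tif"):
    """Build a consistent file name: date_project_subject_stain_section.ext.
    The fields and format are an illustrative convention, not a standard."""
    return f"{date.today():%Y%m%d}_{project}_{subject}_{stain}_s{section:03d}.{ext}"

# The date prefix varies by run, e.g. '20250101_ATLAS_mouse01_GFAP_s007.tif'
print(sample_name("ATLAS", "mouse01", "GFAP", 7))
```

Leading with an ISO-style date and zero-padding the section number keeps files in chronological and acquisition order under a plain alphabetical sort, which is exactly what makes data easy to find later.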

Know the bit-depth of your image acquisition system and use the full intensity range. Too many times, researchers spend countless hours creating images that cannot be accurately measured because they are out of range.
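A quick automated sanity check can catch both failure modes: clipping at the top of the range and leaving most of the bit depth unused. This is a minimal sketch assuming 16-bit data; the threshold fractions are illustrative defaults, not an established standard.

```python
import numpy as np

def check_dynamic_range(img, bit_depth=16, low_frac=0.25, clip_frac=0.001):
    """Flag images that clip or under-use the detector's intensity range.
    low_frac and clip_frac are illustrative thresholds, not a standard."""
    max_val = 2 ** bit_depth - 1
    # Saturated pixels sit exactly at the top of the range and can't be measured.
    if (img == max_val).mean() > clip_frac:
        return "clipped: reduce laser power or exposure"
    # A low peak value means most of the available bit depth went unused.
    if img.max() < low_frac * max_val:
        return "under-exposed: most of the bit depth is unused"
    return "ok"

dim = np.full((256, 256), 500, dtype=np.uint16)
print(check_dynamic_range(dim))  # under-exposed: most of the bit depth is unused
```

Running a check like this at acquisition time, rather than at analysis time, is what saves the "countless hours" spent on images that turn out to be unmeasurable.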

Q: What are some of the trends you are seeing in technology or in applications for imaging?

Sagittal section captured on Olympus VS120 Virtual Slide Scanning Microscope. Image: Jessica Saulnier, Bernardo Sabatini Lab, Department of Neurobiology, Harvard Medical School

A: Some trends in imaging that I have seen in recent years include the use of high-content and super-resolution imaging. Collecting whole slides in the case of slide scanner imaging, and imaging entire organs or organisms using technology like light-sheet imaging, are quickly becoming fundamental in biology. These technologies provide a macro-to-micro view of our biological world. Pairing them with super-resolution techniques such as Stochastic Optical Reconstruction Microscopy (STORM), Photoactivated Localization Microscopy (PALM), Stimulated Emission Depletion (STED), and Structured Illumination Microscopy (SIM) grants researchers the opportunity to image at the molecular level. Using different modalities together on a single project could provide a clearer picture of disease within an organism, from basic morphology to molecular biology.

Q: Any advice to readers who are looking to invest in or upgrade their imaging equipment?

A: There are a lot of things to evaluate when considering your next hardware purchase or upgrade. Technology and hardware change so quickly. The most important consideration is to make sure you buy the system that is most compatible with your research. We are all dazzled by the newest and coolest gadgets, but these often come with immature software and may not have real-life applications in your science. Think hard before purchasing a module or system that is cool, but that you “might” use. The software is as important as the hardware. That said, it’s really all about the optics. You need to resolve your particle. Spend the money on high numerical aperture lenses, especially when purchasing a confocal or high-resolution microscope. Also, spend the time learning the ins and outs of the system. Take it apart or have the service technician take it apart for you. Look inside; consider the layout and the moving parts. Moving parts will fail eventually. Are there a lot of them? Will you need to budget for replacements when they reach the end of their service life? How quickly can the system be serviced? Are these parts readily available, or is there a lag in manufacturing and shipping?

If investing in a new system, you want something that is upgradeable, flexible, and useful in your lab. You are going to purchase a very expensive piece of lab equipment, so make sure it’s something that your lab will use a lot. And if it’s going to be used a lot, it needs to be something that can’t easily be broken. Accidents happen. A system that is well manufactured will take that into account and include built-in safeguards to minimize catastrophe. Always consider upgrading before buying something new. I purchased a laser scanning confocal microscope six years ago. I upgraded the optics and scan galvo for better transmission, and added an autofocus module when live-cell confocal imaging was becoming an important part of our science. I added a stage with mosaic tiling when large volumes of brain slices were needed. My system can grow with the needs of the community without the need to raise significant capital. This was important to my core lab and me.