Lab Manager | Run Your Lab Like a Business


New AI Tool Speeds Up Biology and Removes Potential Human Bias

New deep learning software for analyzing the choreography of protein movement removes the need for expert human intervention and opens up the field to more labs worldwide

eLife

Scientists have developed an AI tool that analyzes how proteins move and interact faster and more accurately than human experts can, according to a study published Nov. 3 in eLife.

The software, which is freely available, dramatically speeds up the study of protein dynamics and makes it accessible to research teams across the world, rather than limiting it to the few laboratories with specialist expertise.

Proteins are the workhorses of our cells, and their movement controls a vast array of biological processes. Studying the choreography of proteins—how they move around and interact with each other—is an essential part of understanding fundamental biology. One of the main tools for studying protein motion is called single-molecule Förster Resonance Energy Transfer (smFRET). This works by labeling two or more parts of the molecule with different fluorescent tags; when the tags come into close proximity, the resulting change in fluorescence can be detected by a microscope. In this way, the movement of proteins can be visualized and measured down to the nanometer level.
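To make the physics concrete, here is an illustrative sketch (not taken from the study) of how smFRET converts fluorescence into distance: the apparent FRET efficiency E is computed from donor and acceptor intensities, and E falls off steeply with inter-tag distance r relative to the Förster radius R0 (typically around 5 nm). The intensity values and R0 below are invented for illustration.

```python
# Illustrative sketch of the smFRET distance readout (hypothetical numbers,
# not data from the study).

def fret_efficiency(i_donor: float, i_acceptor: float) -> float:
    """Apparent FRET efficiency from donor/acceptor fluorescence intensities."""
    return i_acceptor / (i_acceptor + i_donor)

def distance_from_efficiency(e: float, r0_nm: float = 5.0) -> float:
    """Inter-tag distance in nm implied by E = 1 / (1 + (r/R0)^6)."""
    return r0_nm * (1.0 / e - 1.0) ** (1.0 / 6.0)

# Tags in close proximity: the acceptor dominates, so efficiency is high.
e = fret_efficiency(i_donor=300.0, i_acceptor=700.0)
print(round(e, 2))                            # 0.7
print(round(distance_from_efficiency(e), 2))  # ~4.34 nm, below R0
```

Because of the sixth-power dependence, small changes in distance near R0 produce large changes in efficiency, which is what makes the technique sensitive at the nanometer scale.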

"Some of the challenges with smFRET include the very large data that are produced, and the steps that researchers need to take to process the images before analysis," explains lead author Johannes Thomsen, who carried out this study as a research assistant at the University of Copenhagen, Denmark, and has since graduated with a PhD. "Machine learning technologies, especially deep neural networks, have significantly improved our ability to understand large datasets without the need for human intervention. We wanted to see whether employing these technologies to smFRET data would allow automated, fast characterization of protein motions, independently of human experts."


Related Article: Building Better Artificial Intelligence for Health Care


The team chose to use deep neural networks (DNNs). Deep learning is a branch of machine learning that takes data in its raw form and looks for patterns with no prior 'knowledge'. It has the advantage of learning useful features from raw data without time-intensive pre-processing, and it offers a 'less opinionated' evaluation of the data than the more subjective analysis performed by humans. DNNs have the further advantage of learning to recognize important aspects of the data and then classifying it into groups. Although training a DNN is a computationally intensive process that can take time, once trained the model can be used easily, by non-experts, on any computer.
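The classification idea can be caricatured as follows. This is a minimal sketch with an invented architecture and invented class names, not DeepFRET's actual network: a small fully connected network maps a raw fluorescence trace directly to class probabilities, with no hand-crafted features. Here the weights are random; in practice they would be learned from labeled traces.

```python
# Minimal sketch of DNN trace classification (hypothetical architecture and
# class names; DeepFRET's real network is different). A raw trace goes in,
# class probabilities come out — no manual pre-processing.
import numpy as np

rng = np.random.default_rng(0)

CLASSES = ["usable", "noisy", "photobleached"]  # illustrative categories
TRACE_LEN, HIDDEN = 200, 32

# Untrained random weights; a real model learns these from labeled traces.
w1 = rng.normal(scale=0.1, size=(TRACE_LEN, HIDDEN))
w2 = rng.normal(scale=0.1, size=(HIDDEN, len(CLASSES)))

def classify(trace: np.ndarray) -> np.ndarray:
    """Forward pass: raw trace -> ReLU hidden layer -> softmax class probs."""
    h = np.maximum(trace @ w1, 0.0)        # ReLU hidden layer
    logits = h @ w2
    exp = np.exp(logits - logits.max())    # numerically stable softmax
    return exp / exp.sum()

probs = classify(rng.normal(size=TRACE_LEN))
print(probs)  # one probability per class; they sum to 1
```

Once trained, running such a model is just this cheap forward pass, which is why a trained network can be used by non-experts on ordinary hardware.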

The tool, DeepFRET, imports raw microscope images, locates the two fluorescence signals, corrects for background noise and, with limited human help, produces a chart showing the motion of the molecules in the sample. When tested on simulated and real data, its accuracy at detecting meaningful patterns exceeded 95 percent, outperforming human operators while needing only one percent of the time: DeepFRET evaluated a single piece of data (a trace) in around 50 milliseconds, whereas human reviewers spent an average of five seconds per trace.
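A quick back-of-the-envelope check shows how the reported per-trace timings translate into the "one percent of the time" claim, and what that means for a full dataset (the 10,000-trace dataset size below is hypothetical, chosen only for illustration):

```python
# Sanity check of the reported throughput (per-trace times from the article;
# the dataset size is a hypothetical example).
deepfret_s_per_trace = 0.050  # ~50 milliseconds
human_s_per_trace = 5.0       # ~5 seconds

speedup = human_s_per_trace / deepfret_s_per_trace
fraction = deepfret_s_per_trace / human_s_per_trace

print(speedup)         # 100.0 — a hundredfold speedup
print(fraction * 100)  # 1.0 — i.e., one percent of the time

# For a hypothetical dataset of 10,000 traces:
traces = 10_000
print(traces * deepfret_s_per_trace / 60)   # minutes for DeepFRET (~8.3)
print(traces * human_s_per_trace / 3600)    # hours for a human (~13.9)
```

At dataset scale, the difference is between minutes of computation and a full working day or more of expert review.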

"We have developed a machine learning method that can automatically, rapidly and reproducibly analyze recordings of the choreography of protein motions, with simple user interface that works on different operating systems," concludes senior author Nikos Hatzakis, associate professor at the University of Copenhagen, and affiliate associate professor at the Novo Nordisk Foundation Center for Protein Research, University of Copenhagen. "The method works equally to or better than existing methods, and requires only minimal contribution by humans. It therefore offers a tool for people with limited expertise, which we hope will contribute to the standardization and rapid expansion of this field of study."

- This press release was originally published on the eLife website