Artificial intelligence (AI) provides exciting opportunities for fast, accurate, and unbiased cell image analysis. Read our guide to machine learning in high content analysis and see the benefits AI can bring to your automated cell imaging.
Machine learning in high content analysis
Artificial intelligence. It’s creeping into so many aspects of modern life, from autonomous vehicles to voice-powered personal assistants, and even the creation of art. But it’s the application in science and healthcare where the benefits of AI, which learns and improves itself, really stand out.
One of these applications is in bioimage analysis, where the data collected is not only vast but also complex.
It’s possible to make AI part of the high content analysis (HCA, also called high content screening) workflow through machine-learning modules built into advanced image analysis software packages. Here, we give you a quick introduction to AI in HCA and one of these modules, touching on how it optimizes workflows.
But first, what is AI?
Machine learning. Deep learning. Neural networks. These closely related terms all fall under the umbrella of AI, which the Oxford dictionary defines as:
“The theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”
Essentially, AI represents any intelligence demonstrated by machines that mimics cognitive functions we would usually associate with human minds, such as learning, problem solving, and reasoning.
As the field of AI expands, so do its goals and applications.
Applying AI to image analysis
Image analysis is a prime target for machine learning.
HCA experiments can generate vast quantities of multiplexed microscopy data, and a certain amount of automation in HCA is already both common and a necessity.
Machine learning modules in image analysis software can take this to the next level by autonomously performing object classification, one of the key steps in an HCA workflow.
Image analysis can be a particularly intricate and time-consuming process if performed manually, or even semi-automatically. Because the task is difficult and extremely detailed, there is always the possibility of missing key data, misinterpretation, and bias.
Add to this the repetitive, lengthy, and often laborious workflows, and it’s clear that AI technologies provide a huge opportunity for optimizing workflows and efficiency. Their absolute consistency between uses would also remove any person-to-person variation, human error, and bias, improving data quality and confidence.
Overcoming human bias
One of the key benefits of machine learning in HCA that deserves special note is the ability to overcome human bias. When studying large data sets, humans are vulnerable to a well-described phenomenon called ‘inattentional blindness’. This is where unexpected observations go unnoticed when performing other attention-demanding tasks.
For example, having previously studied a particular cell phenotype and response in detail, you might be unintentionally looking for those same signs when presented with a large, complex data set containing many variables and measures. In doing so, you might then overlook another subtle or unexpected feature that also has biological relevance.
Unsupervised machine learning helps overcome this vulnerability, performing completely unbiased classification, with the potential to produce unexpected, valuable findings.
Machine learning classification modules in image analysis software
Following segmentation and feature extraction steps, object classification involves assigning cells to a group or class based on their phenotype. Common classification criteria include cellular morphology, sub-cellular localization, and expression level of specific markers.
It is possible to use a classifier tool to manually pick relevant features and assign classes, but this is only applicable to straightforward phenotypical changes based on a few measures. For example, you might be determining cell cycle stage based on nuclear dye intensity or classifying as live or dead in a viability assay.
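To make the manual approach concrete, here is a minimal sketch of single-measure classification in Python. The feature name and the cutoff value are hypothetical, chosen purely for illustration; in a real viability assay the threshold would be set from the specific dye and assay conditions.

```python
# Illustrative sketch (not vendor code): classifying cells as live or
# dead in a viability assay using one manually chosen threshold on a
# hypothetical dye-intensity measure.

LIVE_DEAD_THRESHOLD = 150.0  # example cutoff; assay-specific in practice


def classify_viability(cells):
    """Assign each cell a class based on a single intensity feature."""
    return [
        "dead" if cell["dye_intensity"] > LIVE_DEAD_THRESHOLD else "live"
        for cell in cells
    ]


cells = [
    {"id": 1, "dye_intensity": 42.0},
    {"id": 2, "dye_intensity": 210.0},
]
print(classify_viability(cells))  # → ['live', 'dead']
```

This works well for a straightforward change captured by one measure, but the manual threshold quickly becomes unworkable as the number of relevant features grows.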
For anything more complex, involving an expanded set of features, using AI for object classification becomes a better option. Machine learning modules, which can be added on to image analysis software, automatically assign classes to objects faster and more reliably than any human user.
Take, for example, the Phenoglyphs classification module for GE’s IN Carta software, which uses machine learning to analyze multiple measures simultaneously. It automatically identifies the most relevant features and pre-assigns classes within the dataset without any need for you to manually select measures or thresholds.
The four steps of training the Phenoglyphs module
Phenoglyphs uses a random sampling of 5000 objects in the dataset to train the model in four steps (Fig 1):
- Cluster: The module automatically selects and uses measures calculated during segmentation to create natural groupings in the data, called clusters, without human bias.
- Label: The user selects and labels all valid classes (at least two) for ranking and training.
- Rank: The module ranks the list of measures used to partition objects into classes and provides the opportunity to deselect measures with redundant information or little impact.
- Train: The module refines the classification model based on user input, including removal of objects or reassignment to more appropriate classes.
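The four steps above can be sketched in plain Python. The Phenoglyphs internals are not public, so this is a toy stand-in: clustering is a naive k-means, ranking scores each measure by how far apart the class means sit, and the feature names and values are hypothetical.

```python
import random

# Toy sketch of the cluster/label/rank/train flow; not the actual
# Phenoglyphs algorithm. Feature names and data are invented.


def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))


def kmeans(points, k=2, iters=25, seed=0):
    """Step 1 (cluster): naive k-means grouping of objects."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: dist2(p, centers[i]))
            groups[nearest].append(p)
        centers = [
            tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    labels = [min(range(k), key=lambda i: dist2(p, centers[i])) for p in points]
    return labels, centers


def rank_features(points, labels, names):
    """Step 3 (rank): score each measure by the separation between
    class means, so redundant or weak measures can be deselected."""
    scores = {}
    for f, name in enumerate(names):
        means = []
        for cls in set(labels):
            vals = [p[f] for p, lab in zip(points, labels) if lab == cls]
            means.append(sum(vals) / len(vals))
        scores[name] = max(means) - min(means)
    return sorted(scores, key=scores.get, reverse=True)


# Hypothetical per-object measures: (nuclear_area, marker_intensity)
objects = [(10.0, 1.0), (11.0, 1.2), (10.5, 0.9),   # one phenotype
           (30.0, 1.1), (29.0, 1.0), (31.0, 1.3)]   # another phenotype

labels, _ = kmeans(objects, k=2)
# Step 2 (label): a user would now name and keep the valid clusters;
# step 4 (train) would refine the model from user corrections.
print(rank_features(objects, labels, ["nuclear_area", "marker_intensity"]))
```

In this toy run, nuclear_area separates the two clusters far more strongly than marker_intensity, so it tops the ranking and the weaker measure could be deselected before training.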
Fig 1. Training the Phenoglyphs machine-learning classification module.
As a user, you only need to review and provide input on nine examples for each class before the module applies the model to the entire dataset.
This approach is semi-automated, which minimizes the need for user input at the first step of class assignment. The considerable time saved puts you firmly in the driving seat of the final phenotypical classification.
Machine-learning classification modules like Phenoglyphs can save hours or even days of manual work in high content analysis, all while providing consistent and unbiased results.