Treasure maps of tissues – navigating multi-channel fluorescence images

As multi-channel fluorescence imaging gains prominence in immuno-oncology research and immunotherapy development, AI can be used to analyse these images.
Written by Aiforia

Imagine navigating an unfamiliar city with a map that shows only terrain features, without any of the other important layers of information about the urban environment. If you are extremely lucky, you might still find the restaurant you had booked for dinner.

Similarly, a histological section can be considered a map of the tissue, and the information layers of that map are derived from the different stains applied to the section. Haematoxylin and eosin (H&E) and chromogenic immunohistochemistry (IHC) are the two gold-standard methods for tissue staining. However, although robust and visually distinctive, the information content of these stains is limited.

The question is whether H&E and IHC stains navigate pathologists and researchers well enough. Often they do, but not always: H&E provides only morphological information about the tissue, analogous to the terrain features on the city map. IHC, on the other hand, provides molecular information for a specific marker, but the analysis is typically limited to a single marker (protein or RNA) per tissue section, and the morphological information of IHC-stained sections is not as rich as that of H&E-stained sections. IHC is thus analogous to locating every restaurant in the city without pinpointing the one booked for dinner.

Clearly, more information is needed to find the correct restaurant. In the tissue context, this additional information equals the detection of more than one marker per tissue section. Here, multi-channel fluorescence IHC is superior to chromogenic detection, as it provides inherently de-convoluted information, with each marker detected in its own fluorescence channel. In chromogenic multi-marker detection, the chromogens are mixed and convoluted in a single red-green-blue image, complicating both the analysis and the interpretation of the staining.
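The contrast between the two detection schemes can be sketched in a few lines of NumPy. This is an illustrative toy (the arrays, thresholds, and stain matrix are invented for the example, not taken from any real assay): in a multi-channel fluorescence image each marker is simply its own array slice, whereas a chromogenic RGB image mixes the stains and per-stain signal must first be estimated by unmixing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-marker fluorescence image, shape (H, W, channels):
# each marker lives in its own channel, so per-marker analysis is a slice.
fluor = rng.random((64, 64, 3))
marker_a = fluor[:, :, 0]          # inherently "de-convoluted" signal
a_positive = marker_a > 0.9        # per-marker thresholding is trivial

# A chromogenic RGB image instead mixes stains: each pixel is roughly a
# combination of the stain colour vectors, so recovering per-stain signal
# requires unmixing (here a plain least-squares solve, purely illustrative).
M = np.array([[0.65, 0.07],
              [0.70, 0.99],
              [0.29, 0.11]])       # columns: colour vectors of two stains
rgb = rng.random((64, 64, 3))
pixels = rgb.reshape(-1, 3).T                      # (3, H*W)
stains = np.linalg.lstsq(M, pixels, rcond=None)[0] # (2, H*W) estimates
stains = stains.T.reshape(64, 64, 2)               # per-stain contribution maps
print(a_positive.sum(), stains.shape)
```

The point is the asymmetry: the fluorescence branch needs one line per marker, while the chromogenic branch needs a model of the stain colours before any per-marker question can even be asked.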

With the advent of digital pathology and the introduction of whole-slide scanners capable of fluorescence imaging, expectations were high for the future of fluorescence in histology. Fluorescence imaging has not met these expectations, however, largely because cost-efficient, easy-access tools for visualising and analysing whole-slide fluorescence images of tissues have been lacking, even though fluorescence scanning itself is well established. Effortless viewing of fluorescence images is especially pivotal, as fluorescent tags are not visible under a light microscope.

We at Fimmic believe that a more comprehensive utilisation of fluorescence IHC could open up new avenues in histology and pathology by allowing researchers to ask entirely new questions, not limited by current tissue analytics, ultimately translating into improved patient care. We also believe that this renaissance of fluorescence will be driven by the end-users – researchers and pathologists – once they have access to tools and technologies that allow user-friendly yet efficient fluorescence image viewing and analytics.

Thus, we developed and launched the first-in-class, fully cloud-embedded fluorescence image module on our WebMicroscope® AI Platform. The module provides our customers with a cost-efficient, easy-access solution for fluorescence image viewing and analysis, including user-friendly multi-channel visualisation of fluorescence images and powerful context-intelligent image analytics using neural networks. Because the WebMicroscope AI Platform runs in the cloud and is operated through a web browser, it delivers a smooth and seamless user experience anywhere, anytime.

Soon after the launch of the fluorescence module, we were contacted by researchers who were struggling to analyse mouse pancreas sections stained for mesencephalic astrocyte-derived neurotrophic factor (Manf) and insulin 1 (Ins1), with the aim of counting double-positive (Manf+Ins1+) cells in the fluorescence images¹. The researchers were frustrated both with developing an image analysis pipeline to detect such complex features and with running the analysis on whole-slide fluorescence images. As a solution, we used our WebMicroscope Deep Learning AI Platform to train a deep neural network that automatically detects and counts these cells, separately in the exocrine pancreas and in the islets of Langerhans, across whole-slide fluorescence images.
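The core double-positive counting task can be illustrated with a minimal sketch. This is not Aiforia's pipeline (which uses a trained deep neural network); it is a simple thresholding baseline on two synthetic channels, with all function names, thresholds, and toy data invented for the example: a pixel counts as double positive when it exceeds a threshold in both the Manf and Ins1 channels, and connected regions of such pixels are counted as cells.

```python
import numpy as np
from scipy import ndimage

def count_double_positive(manf, ins1, manf_thr=0.5, ins1_thr=0.5, min_px=4):
    """Count Manf+Ins1+ regions in two 2-D intensity arrays (toy baseline)."""
    double = (manf > manf_thr) & (ins1 > ins1_thr)  # positive in BOTH channels
    labels, n = ndimage.label(double)               # connected components
    # Measure component sizes and discard tiny speckles below min_px pixels.
    sizes = ndimage.sum(double, labels, index=range(1, n + 1))
    return int(np.sum(np.asarray(sizes) >= min_px))

# Toy data: two synthetic "cells"; only one is positive in both channels.
manf = np.zeros((32, 32))
ins1 = np.zeros((32, 32))
manf[4:10, 4:10] = 1.0
ins1[4:10, 4:10] = 1.0      # Manf+ and Ins1+ -> double positive
manf[20:26, 20:26] = 1.0    # Manf+ only -> not counted
print(count_double_positive(manf, ins1))
```

In practice, raw thresholding breaks down on real tissue (autofluorescence, touching cells, staining variability), which is exactly why a learned model operating on whole-slide context is the more robust approach described above.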

Currently, we are developing dozens of applications for both fluorescence and brightfield images on our WebMicroscope Deep Learning AI Platform, ranging from counting neuronal synapses to automatic classification of malignant tumors. The platform is a powerful and versatile tool for complex yet large-scale image analytics, integrating cloud computing with context-intelligent image analysis.

¹ Lindahl M, Danilova T, Palm E, Lindholm P, Võikar V, Hakonen E, Ustinov J, Andressoo JO, Harvey BK, Otonkoski T, Rossi J, Saarma M. MANF is indispensable for the proliferation and survival of pancreatic β cells. Cell Rep. 2014;7(2):366–375.