Optical microscopy, through its many modalities, is an unparalleled approach to investigating living systems. Beyond acquiring a snapshot, it is routinely used to gain a dynamic view of biological processes at high frame rates (tens of frames per second). Moreover, light itself can be used as a perturbation: either to locally photo-switch dyes and investigate the dynamics of labelled proteins, or to perform subcellular laser nano-ablation and observe how the biological system copes with it. In industrial setups, high-content screening is often limited to passive observation, since no generic system can perform photo-perturbative methodologies autonomously. These remain craft approaches requiring experts; they have yet to be automated, which means they cannot be used in screening or routine work. The smart autonomous microscopy (SAMic) project aims to open this perspective. We assert that semantic segmentation using a fully convolutional network (FCN) as user-customisable image processing, embedded on dedicated ARM-based electronics, will enable real-time processing to detect the right time and place to apply a perturbation (fluorescence switching or nano-ablation) and, together with our Inscoper control module, to adapt the microscope drive on the fly. To tackle these challenges, we will pursue three objectives: (i) Applicative: developing two experimental biology approaches for which SAMic is mandatory and which will drive the developments. The biological questions under investigation are active topics in our labs: the role of AurkA at mitochondria in mitochondrial dynamics, and the mechanical robustness of cell division in human cells. (ii) Technological: designing the smartCam-LEAD module (Localised Events Advanced Detector).
The main innovation lies in porting convolutional-network-powered semantic segmentation to a dedicated ARM-based microprocessor with the aim of achieving real-time performance, building on our current achievement in machine-learning image classification. We will connect it to the microscope driving module to perform autonomous experiments. (iii) Prototype: demonstrating the power of SAMic. We aim here to go beyond a development setup that can only address our two biological questions. Taking advantage of the consortium's experience in transferring technology developments to imaging facilities and to the market, we will turn our setup into a proper prototype, suitable for dissemination by the industrial partner. This involves, in particular, the design of appropriate HMIs by the industrial partner and the transfer of the prototype to the MRic microscopy facility. The partners teaming up to create SAMic, two academic labs from the IGDR and the Inscoper company, have broad expertise ranging from mathematics for image analysis to microscopy applied to biology, through computer science, electronics and instrument development. These interdisciplinary skills are a distinctive trait of our consortium and, we believe, a strong asset for the success of the project. The SAMic project aims to be a major technical and methodological breakthrough in fluorescence microscopy for investigating life mechanisms, enabling a large number of unsupervised photo-perturbative experiments. Artificial intelligence applied to fluorescence microscopy will dramatically help researchers to better observe and understand what happens within their live samples. The industrial partner Inscoper aims to offer the market an interoperable and optimised platform to control any microscopy device and achieve any image acquisition modality. Adding AI and feedback-control capabilities will position SAMic as a truly disruptive product in the market.
Creating an intelligent microscope is one of the next big challenges for the life sciences. The project SAMic will contribute to making it real.
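The closed-loop principle at the heart of SAMic, segmenting each incoming frame in real time and firing a perturbation when an event region is detected, can be illustrated with a minimal sketch. This is not the project's implementation: the `segment_frame` step stands in for the FCN (here replaced by a simple intensity threshold), and the function names, the `min_event_pixels` parameter, and the `trigger_perturbation` callback are all hypothetical placeholders for the actual ARM-embedded pipeline and Inscoper control interface.

```python
import numpy as np

def segment_frame(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the FCN semantic segmentation step.

    A real deployment would run a quantised fully convolutional
    network on the dedicated ARM module; here a crude intensity
    threshold produces the per-pixel event mask instead.
    """
    return (frame > frame.mean() + 2 * frame.std()).astype(np.uint8)

def acquisition_loop(frames, trigger_perturbation, min_event_pixels=10):
    """Closed-loop sketch: segment each frame, and when an event
    region of sufficient size appears, trigger a perturbation
    (e.g. photo-switching or nano-ablation) at its centroid.

    `trigger_perturbation(t, centroid)` is a hypothetical callback
    standing in for the microscope control module.
    Returns the list of (frame index, centroid) events fired.
    """
    events = []
    for t, frame in enumerate(frames):
        mask = segment_frame(frame)
        if mask.sum() >= min_event_pixels:
            centroid = np.argwhere(mask).mean(axis=0)
            trigger_perturbation(t, centroid)
            events.append((t, tuple(centroid)))
    return events
```

The essential design point is that detection and actuation live in the same loop: the decision of when and where to perturb is taken per frame, on the device, rather than offline after acquisition.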