At first glance, my fragmented experience might seem confusing because of the different domains I have been involved in. However, despite the knowledge I may have accumulated in those distant domains, I have clearly been following one well-defined path: data modeling and data processing. Indeed, the work I conducted mostly consisted in developing models for a better understanding of data, or in processing data so as to analyze it better.
COMING SOON: POSTDOC AT UEF
Framework: Inverse Problems in Aerosol Dynamics.
Model: … soon, I’ll tell you more.
POSTDOC AT CRAL
Framework: In astronomy, interferometry is a method that resolves details beyond the diffraction limit. Resolution is not the only advantage: the size of the instrument can also shrink. Despite the clear interest of the method and its promise, the harvested data need a processing step before they can reveal their true power. Unfortunately, no state-of-the-art algorithm manages to process the data completely; hence the choice often falls back on the most basic one, i.e. the clean-beam method, and much information is lost.
Model: The data are acquired in the Fourier domain, yet they are incomplete for two reasons. First, only a sparse region of the Fourier domain is recorded. Second, the data are not the full complex numbers, a.k.a. complex visibilities, consisting of a set of amplitudes and phases; rather, the data set comprises the amplitudes together with phases that are only partially known or unknown. We proposed and investigated several methods for image reconstruction and phase retrieval, using an acquisition model based on the Fourier propagation of light and taking advantage of the inverse-problem approach. The optimization methods rely on convex (VMLMB) and/or stochastic (simulated annealing) approaches.
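To give a flavor of the problem, here is a minimal alternating-projection sketch of phase retrieval from sparse Fourier amplitudes. It is a toy illustration, not the actual VMLMB or simulated-annealing solvers used in the work: the measured amplitudes are imposed on the sampled frequencies while a simple non-negativity prior acts in image space.

```python
import numpy as np

def phase_retrieval(measured_amp, mask, n_iter=200, seed=0):
    """Toy alternating-projection reconstruction from sparse Fourier
    amplitudes (a hypothetical stand-in for the actual solvers)."""
    rng = np.random.default_rng(seed)
    # start from a random non-negative image
    img = rng.random(measured_amp.shape)
    for _ in range(n_iter):
        spectrum = np.fft.fft2(img)
        # impose the measured amplitudes where the (u,v) plane was sampled,
        # keeping the current phase estimate; elsewhere keep the spectrum
        phase = np.exp(1j * np.angle(spectrum))
        spectrum = np.where(mask, measured_amp * phase, spectrum)
        # back to image space, enforce non-negativity (object-domain prior)
        img = np.maximum(np.fft.ifft2(spectrum).real, 0.0)
    return img
```

With a full sampling mask this reduces to a Gerchberg-Saxton-type iteration; a sparse mask mimics the incomplete (u,v) coverage of an interferometer.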
PHD AT CREATIS
Framework: Human health is unarguably of utmost importance to our society, yet it is not fully understood or handled. One considerable tool of the health system is the detection and diagnosis of diseases. In addition to being helpful in applied medicine, diagnostic tools can help find new markers and characteristics of diseases. For instance, spotting fiber patterns in the myocardium using diffusion MRI data may help the early detection of heart failure, such as after an infarction, or simply make post-intervention follow-up more accurate and non-invasive.
Model: During my PhD, I created and investigated several methods dedicated to modeling the myocardium with spatial curves, relying on the assumption that the muscle fibers are coherently arranged over a few centimeters. The extraction of these curves is called tractography. My tractography method relies on a graph-based representation of the fibers and is performed by minimizing an energy function designed to reach its minimum for the set of curves that best represents the heart. The energy decomposes into three terms: one for data fidelity and two others enforcing prior knowledge. The data-fidelity term depends on the diffusion data, namely the orientation and reliability of the measures; it assumes that water diffusion is anisotropic and that the main diffusion orientation is aligned with the myofiber orientation. The two a priori terms express the smoothness and topology of the fibers. The optimization of the graph configuration is conducted with stochastic methods of the simulated-annealing class.
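The structure of such an optimization can be sketched as follows. This is a heavily simplified, hypothetical two-term version (data fidelity plus smoothness; the topology term is omitted) on a single 2D polyline, with a Metropolis-style annealing loop over node positions; the weights, step sizes, and cooling schedule are illustrative, not those of the actual method.

```python
import math
import random

def energy(curve, field, w_data=1.0, w_smooth=0.5):
    """Simplified fiber energy: alignment of each segment with the
    diffusion orientation field, plus a smoothness prior."""
    e = 0.0
    for i in range(len(curve) - 1):
        (x0, y0), (x1, y1) = curve[i], curve[i + 1]
        dx, dy = x1 - x0, y1 - y0
        norm = math.hypot(dx, dy) or 1.0
        # data term: 1 - |cos(angle)| between segment and field orientation
        fx, fy = field(0.5 * (x0 + x1), 0.5 * (y0 + y1))
        e += w_data * (1.0 - abs(dx * fx + dy * fy) / norm)
    for i in range(1, len(curve) - 1):
        # smoothness term: squared deviation from the neighbors' midpoint
        mx = 0.5 * (curve[i - 1][0] + curve[i + 1][0])
        my = 0.5 * (curve[i - 1][1] + curve[i + 1][1])
        e += w_smooth * ((curve[i][0] - mx) ** 2 + (curve[i][1] - my) ** 2)
    return e

def anneal(curve, field, t0=1.0, cooling=0.995, n_iter=5000, seed=0):
    """Metropolis-style simulated annealing on the node positions."""
    rng = random.Random(seed)
    cur = [list(p) for p in curve]
    e_cur, t = energy(cur, field), t0
    best, e_best = [list(p) for p in cur], e_cur
    for _ in range(n_iter):
        i = rng.randrange(1, len(cur) - 1)   # keep the endpoints fixed
        old = list(cur[i])
        cur[i][0] += rng.gauss(0.0, 0.1)
        cur[i][1] += rng.gauss(0.0, 0.1)
        e_new = energy(cur, field)
        if e_new > e_cur and rng.random() >= math.exp((e_cur - e_new) / t):
            cur[i] = old                     # reject the move
        else:
            e_cur = e_new
            if e_cur < e_best:
                best, e_best = [list(p) for p in cur], e_cur
        t *= cooling
    return best, e_best
```

The stochastic acceptance of uphill moves is what lets the method escape local minima of the (non-convex) fiber energy.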
Interesting byproducts: Many byproducts were created during my work, especially for the visualization of fiber sets, which are plotted as tubes. Fortunately, the measures created for the characterization and classification of fiber sets also help to better visualize large sets of fibers. The tractography methods described in the scientific literature often generate sets that contain too many fibers to be interpreted by the naked eye. Therefore, a criterion that allows the selection of the relevant fibers is welcome.
ENGINEER AT MPI
Framework: In structural cell biology, one possible way to understand the chemical mechanisms of cells relies on the expression of proteins and their crystallization, which allows, by means of X-ray diffraction, the determination of the structure and chemical composition of those proteins. However, protein crystallization requires extensive screening, trying many experimental conditions in the hope of finding those that actually let the proteins crystallize. As a consequence, this process is partially automated and generates many experiments whose results are automatically saved as microscope images. The end user must check every result one by one, going through all the images, which is often time-consuming. Hence, an automated classification of the images is a clear improvement of the process, despite its complexity.
Model: Based on a few features, one part of the classifier is able to discard two of the nine classes, corresponding to two frequent cases: nothing happened (a.k.a. clear) and precipitate. The features were fed to a compound classifier that uses, among others, a neural network. Unfortunately, given the variability of the illumination conditions and of the material used (microscope and experiment supports), the classifier must be tuned for each set of conditions, which was not possible at the time.
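The cascade structure can be sketched as below. The feature names (`edge_density`, `granularity`) and thresholds are purely hypothetical placeholders, and the compound classifier is passed in as an opaque callable; the point is only the two-stage design, where cheap rules dispose of the frequent easy classes before the expensive classifier runs.

```python
def classify_drop(features, compound_classifier):
    """Two-stage cascade sketch (hypothetical features and thresholds):
    cheap rules discard the two frequent classes, everything else goes
    to the heavier compound classifier (e.g. one using a neural net)."""
    # Stage 1: an almost featureless image means nothing happened.
    if features["edge_density"] < 0.01:
        return "clear"
    # Stage 1: dense granular texture is typical of precipitate.
    if features["granularity"] > 0.8 and features["edge_density"] > 0.3:
        return "precipitate"
    # Stage 2: the remaining seven classes are left to the full classifier.
    return compound_classifier(features)
```

This keeps the expensive model off the bulk of the data while leaving the hard cases to it.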
INTERNSHIP AT EXPLORA NOVA
Framework: Explora Nova is a company that sells software and hardware solutions for biology and medical research. One of their products controls the acquisition of confocal microscopy images and processes the acquired data. To stay competitive, the software required an update of the deconvolution model used for the reconstruction of microscopy images.
Model: I wrote a model of confocal microscopy based on Fourier propagation theory and on the state-of-the-art scientific literature at the time. The model was used in a deconvolution algorithm, the Lucy-Richardson method, which in hindsight was poorly suited to this reconstruction problem.
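For reference, the Lucy-Richardson iteration itself is the classic multiplicative update below, written with FFT-based convolutions. The confocal PSF derived from the Fourier propagation model is replaced here by an arbitrary centered `psf` array; this is a generic textbook sketch, not the company's implementation.

```python
import numpy as np

def richardson_lucy(observed, psf, n_iter=30):
    """Classic Richardson-Lucy deconvolution via FFT convolutions.
    `psf` is any non-negative kernel centered in its array."""
    psf = psf / psf.sum()
    otf = np.fft.rfft2(np.fft.ifftshift(psf))
    est = np.full_like(observed, observed.mean())
    eps = 1e-12
    for _ in range(n_iter):
        # forward model: blur the current estimate with the PSF
        blurred = np.fft.irfft2(np.fft.rfft2(est) * otf, s=observed.shape)
        ratio = observed / (blurred + eps)
        # multiplicative update: correlate the ratio with the PSF
        est *= np.fft.irfft2(np.fft.rfft2(ratio) * np.conj(otf),
                             s=observed.shape)
    return est
```

Its multiplicative form preserves non-negativity, but it amplifies noise with iteration count, one reason it is a poor fit for regularization-hungry reconstruction problems like this one.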