Data scientist
My research experience spans particle physics and neuroscience, two scientific fields that differ
in publication culture, background knowledge, team size, and more.
In these contexts, the recording, modeling, and analysis of data have always been at the core of my work.
In particular, I undertook:
- "Big data" analysis (before the term was coined).
- Data recording, including on-call expertise.
- Testing of new clustering algorithms.
- Modeling of network dynamics.
- Modeling of biochemical processes.
- Parameter tuning optimization.
- Implementation of new visualization techniques.
In my current position, I am in charge of implementing evidence-based management within the University's senior leadership.
More specifically:
- I am implementing the Rectorate's cockpit (governance dashboard) in every facet: developing the tool, supporting the leadership team in choosing indicators, and building a network of data owners and certifiers, among other tasks.
- As the in-house bibliometrics expert, I work within a national network of specialists in research evaluation and university rankings.
- Together with several universities, within the League of European Research Universities (LERU) framework, I worked with the Elsevier SciVal development team, providing feedback on the indicators, functionality, and user experience of their products.
Past research experience
Biochemical modeling
I modeled biochemical processes at various physical scales: from calcium reaction-diffusion in a single spine to rhythmic networks of a hundred neurons. The work that attracted the most attention from the scientific community involved a 1600-compartment model of the Purkinje cell (PC), for which I demonstrated that multiple distinct parameter sets can precisely reproduce the experimentally observed activity.
Software development
Together with W. Van Geit, a graduate student in E. De Schutter's lab, I built a complete software package to automate parameter tuning: Neurofitter. It has been very well received by the computational neuroscience community and is now among the software studied in the "Modeling and Identifying Neural Systems" course at Johns Hopkins University.
Particle physics data analysis
My initial training led me to a PhD in experimental particle physics at CERN, the European laboratory for particle physics, on the world's largest electron-positron collider. Our experimental team was an international collaboration of more than 50 institutes. My dissertation dealt with the statistical analysis of two-photon collision events leading to the inclusive production of hadrons. It was, therefore, a test of quantum chromodynamics, and some of my observations remain unexplained within the current Standard Model.
Particle physics hardware maintenance
Detectors rely on very rapid, online selection of the beam crossings that carry interesting
physics events. This task is performed by hardware systems called "triggers".
Within the L3 experiment, my responsibilities included maintaining the wire chamber trigger
(outer-TEC trigger), for which I was the on-call expert during several weeks of data taking.
I also took part in the 24/7 recording of data, sharing responsibility for monitoring
the electromagnetic calorimeter and for "run control".