Once I did sudo rm -rf /
I am highly proficient with Python and the NumPy + SciPy + Matplotlib stack, which I have also used to deploy solutions on high-performance computing clusters. I have experience with machine learning libraries such as PyTorch, scikit-learn, PyUNLocBoX and COPT. I also work with MATLAB/Octave, Java, C++ and SageMath.
Here I list some projects that I have worked on. See my GitHub profile for my side projects and occasional contributions.
talon is a pure Python package that implements Tractograms As Linear Operators in Neuroimaging.
The software provides the talon Python module, which includes all the functions and tools that are necessary for filtering a tractogram. In particular, specific functions are devoted to:
- Transforming a tractogram into a linear operator.
- Solving the inverse problem associated with the filtering of a tractogram.
- Performing these operations on a GPU.
The source is available on GitLab, the package can be installed from PyPI, and the documentation is available on Read the Docs.
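Conceptually, once a tractogram is encoded as a linear operator, filtering reduces to a nonnegative linear inverse problem. A minimal sketch of that idea with NumPy and SciPy (the toy matrix and variable names are illustrative, not talon's API):

```python
import numpy as np
from scipy.optimize import nnls

# Toy "tractogram as linear operator": each column is the signal
# contribution of one streamline; y is the measured image data.
rng = np.random.default_rng(0)
A = rng.random((50, 10))        # 50 measurements, 10 candidate streamlines
x_true = np.zeros(10)
x_true[[2, 5]] = [1.0, 0.5]     # only two streamlines truly contribute
y = A @ x_true

# Filtering = solving the nonnegative inverse problem min ||Ax - y||^2, x >= 0.
x_hat, residual = nnls(A, y)
print(np.round(x_hat, 3))       # near-zero weights flag spurious streamlines
```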
wlalign is a pure Python package that implements the graph-alignment routine based on the generalization of the Weisfeiler-Lehman algorithm proposed in
our Network Neuroscience paper. The software provides the wlalign Python module, which includes all the functions and tools that are necessary for computing network alignments and similarity. In particular, specific functions are devoted to:
- Computing the graph Jaccard index of similarity between two weighted graphs.
- Solving the graph alignment problem with WL-align.
The package is available on PyPI and can be easily installed from the command line; the documentation is available on Read the Docs. I developed this package together with Emilio Cruciani.
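The graph Jaccard index mentioned above generalises set similarity to weighted graphs by taking entrywise minima and maxima of the two adjacency matrices. A conceptual NumPy sketch, not wlalign's API (assumes nonnegative edge weights):

```python
import numpy as np

def graph_jaccard(A, B):
    """Generalised Jaccard similarity between two weighted adjacency
    matrices of the same size: sum of entrywise minima over maxima."""
    return np.minimum(A, B).sum() / np.maximum(A, B).sum()

A = np.array([[0., 1.], [1., 0.]])
B = np.array([[0., 2.], [2., 0.]])
print(graph_jaccard(A, A))  # identical graphs -> 1.0
print(graph_jaccard(A, B))  # min/max = 2/4 = 0.5
```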
The Diffusion Microstructure Imaging in Python (Dmipy) package is an open-source tool for the analysis of brain tissue microstructure with diffusion MRI data, developed in the ATHENA team. Over time, I contributed to the maintenance and development of the Python package, mainly focusing on
- Multi-Tissue Multi-Compartment models.
- Generalised AMICO implementation (not in stable version yet).
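Multi-compartment modelling of this kind represents the diffusion signal as a weighted sum of per-tissue decays. A toy two-compartment illustration fitted with SciPy (the model, parameters and values are illustrative; Dmipy's actual models are richer):

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy two-compartment model: S(b) = f*exp(-b*D1) + (1-f)*exp(-b*D2),
# with signal fraction f and diffusivities D1, D2.
def signal(b, f, D1, D2):
    return f * np.exp(-b * D1) + (1 - f) * np.exp(-b * D2)

b = np.linspace(0, 3, 30)        # toy b-values
S = signal(b, 0.7, 2.0, 0.3)     # noiseless synthetic signal
(f, D1, D2), _ = curve_fit(signal, b, S, p0=[0.5, 1.5, 0.5],
                           bounds=([0, 0, 0], [1, 4, 4]))
print(round(f, 2), round(D1, 2), round(D2, 2))
```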
MRtrix3 is the reference software for the processing of dMRI data and the extraction of tractograms.
I contributed to the development of version 3.0.0 by creating the new connectomeedit program, which performs basic operations on connectome matrices, such as transposition, symmetrisation and extraction of the upper and lower triangular parts.
Occasionally, I also reported minor bugs and incompatibilities.
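The matrix operations connectomeedit performs correspond directly to NumPy primitives; here is a sketch of the same transformations on a toy connectome matrix (plain NumPy, not the connectomeedit command line):

```python
import numpy as np

# Toy directed connectome matrix (zero diagonal: no self-connections).
C = np.array([[0., 3., 1.],
              [2., 0., 4.],
              [5., 6., 0.]])

transposed  = C.T              # transposition
symmetrised = (C + C.T) / 2.0  # symmetrisation by averaging
upper       = np.triu(C)       # upper triangular part
lower       = np.tril(C)       # lower triangular part
print(symmetrised)
```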
PyUNLocBoX is a Python package which uses proximal splitting methods to solve non-differentiable convex optimisation problems. It is free software, distributed under the BSD license and available on PyPI. PyUNLocBoX is a core element of TALON, as it is used for solving the optimisation problem associated with the designed tractography filtering technique. I contributed to the development of PyUNLocBoX by adding the structured sparsity regularisation term and its proximal operator.
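A standard structured-sparsity penalty is the group l2,1 norm, whose proximal operator is block soft-thresholding. A NumPy sketch under the assumption of non-overlapping groups (illustrative, not PyUNLocBoX's implementation):

```python
import numpy as np

def prox_group_l21(x, groups, lam):
    """Proximal operator of lam * sum_g ||x_g||_2 for non-overlapping
    index groups: each block is shrunk toward zero, and blocks whose
    norm falls below lam are zeroed entirely (block soft-thresholding)."""
    out = np.zeros_like(x)
    for g in groups:
        norm = np.linalg.norm(x[g])
        if norm > lam:
            out[g] = (1 - lam / norm) * x[g]
    return out

x = np.array([3.0, 4.0, 0.1, -0.1])
groups = [np.array([0, 1]), np.array([2, 3])]
print(prox_group_l21(x, groups, 1.0))  # second group is zeroed out
```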
The Convex Optimization Modelling for Microstructure Informed Tractography (COMMIT) package is a Python package that makes it possible to solve specific instances of the tractography filtering problem. In the past I have been a maintainer and bug reporter. I also designed and implemented the architecture of the commit.solvers module, which is the backbone of the tractography filtering process. The current implementation contains corrections and changes merged by the current maintainers.
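Solvers for this class of problems are typically proximal gradient methods. As an illustration of the kind of iteration such a module orchestrates, here is a minimal nonnegative ISTA-style loop in NumPy (a generic sketch, not the actual commit.solvers code):

```python
import numpy as np

def nonneg_ista(A, y, lam=0.0, n_iter=2000):
    """Proximal gradient descent for min 0.5*||Ax - y||^2 + lam*||x||_1
    subject to x >= 0 (nonnegative soft-thresholding at each step)."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = np.maximum(x - step * (grad + lam), 0.0)
    return x

rng = np.random.default_rng(1)
A = rng.random((30, 8))
x_true = np.abs(rng.random(8))
y = A @ x_true
x_hat = nonneg_ista(A, y)
print(np.round(x_hat, 3))
```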
- REACT-fMRI: I helped Ottavia Dipasquale develop the REACT-fMRI Python package, which estimates target-enriched functional connectivity maps from functional MRI data, using Positron Emission Tomography templates as spatial priors of the density distribution of neurotransmitters in the brain.
- gfcpy: I wrote a Python script that computes the global functional connectivity from a 4D volume of time series using nibabel and numpy.
- pyopendemic: I volunteered in the Opendemic project, for which I wrote some routines for the analysis of epidemiological data. Link
- Spelling Bee: I am an avid player of the NYT's Spelling Bee, which I use as an occasion to widen my English vocabulary. I wrote a simple program that helps solve the puzzle whenever it is beyond my abilities. Link
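The gfcpy computation listed above admits a compact formulation: the mean Pearson correlation of each voxel's time series with every other voxel's. A NumPy sketch of one common definition (gfcpy's exact conventions may differ, and the nibabel I/O is omitted):

```python
import numpy as np

def global_functional_connectivity(ts):
    """ts: array of shape (n_voxels, n_timepoints). Returns, for each
    voxel, the mean Pearson correlation with all other voxels."""
    z = ts - ts.mean(axis=1, keepdims=True)
    z /= np.linalg.norm(z, axis=1, keepdims=True)  # unit-norm rows
    corr = z @ z.T                                 # full correlation matrix
    n = corr.shape[0]
    return (corr.sum(axis=1) - 1.0) / (n - 1)      # drop self-correlation

rng = np.random.default_rng(0)
ts = rng.standard_normal((5, 100))
print(np.round(global_functional_connectivity(ts), 3))
```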
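A Spelling Bee helper like the one above reduces to a dictionary filter: keep words of at least four letters that contain the mandatory centre letter and use only the seven puzzle letters (repeats allowed). A sketch with a hypothetical word list (my program's actual logic may differ):

```python
def spelling_bee_solutions(center, others, wordlist):
    """Words of length >= 4 built only from the 7 puzzle letters and
    containing the mandatory centre letter (letters may repeat)."""
    allowed = set(center) | set(others)
    return sorted(w for w in wordlist
                  if len(w) >= 4 and center in w and set(w) <= allowed)

# Hypothetical puzzle: centre letter 'l', outer letters 'a i n t p u'.
words = ["nail", "lain", "lint", "tall", "pint", "ill"]
print(spelling_bee_solutions("l", "aintpu", words))
```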