Category Archives: machine learning

Our preterm survival score is out in Scientific Reports! (now with software)

Congratulations to my student Marco Podda for his first journal paper on an ML-based preterm infant survival score.

It comes with a web service to compute the score on new data here and with the code to train it here.

And we are now also hitting the news:

https://www.unipi.it/index.php/news/item/13511-dall-intelligenza-artificiale-uno-strumento-per-valutare-la-sopravvivenza-dei-neonati-prematuri

http://www.pisatoday.it/salute/valutazione-neonati-prematuri-pisa.html

Podda Marco, Bacciu Davide, Micheli Alessio, Bellu Roberto, Placidi Giulia, Gagliardi Luigi: A machine learning approach to estimating preterm infants survival: development of the Preterm Infants Survival Assessment (PISA) predictor. In: Scientific Reports, vol. 8, 2018.

Organizing special session @CIBB2018

I am co-organizing a special session on Machine Explanation – Interpretation of Machine Learning Models for Medicine and Bioinformatics at the upcoming CIBB 2018.

Deadline for paper submission: 10 June 2018.

The aim of the session is to report original research and case studies where machine learning and deep learning models are explained and verified using clinical or bioinformatics knowledge.

Prospective contributors/participants can contact me (or another co-organizer) for details.

Organized by: Davide Bacciu (Università di Pisa, Italy), Ian Jarman (Liverpool John Moores University, United Kingdom), Jose D. Martin (Universitat de València, Spain), Alfredo Vellido (Universitat Politècnica de Catalunya, Spain)

Deep concentric reservoir paper @IJCNN-WCCI 2018

A paper on a deep concentric reservoir architecture has just been accepted for IJCNN 2018! Congratulations to my student Andrea Bongiorno for his first work!

Now available on arXiv!

Bacciu Davide, Bongiorno Andrea: Concentric ESN: Assessing the Effect of Modularity in Cycle Reservoirs. Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN 2018), IEEE, 2018.

Congratulations to new graduates!

Big day for a horde of my students graduating today with ML theses!

Antonio Bruno developed a deep learning model for learning tree transductions in the LISTIT project.

Andrea Bongiorno proposed a new deep architecture for reservoirs based on concentric topologies.

Federico Errica discussed (cum Laude!) a new deep generative model for contextual processing of graphs.

Alessio Gravina studied how to help clinicians in the early prediction of bronchopulmonary dysplasia (BPD) in low birth-weight infants.

Many congrats as well to Ahmed Alleboudy and Ruben Matino, whose external theses I followed as UNIPI supervisor.

Good news in the ESANN 2018 program

First, congratulations to my student Daniele Castellana for his first scientific paper:

Bacciu Davide, Castellana Daniele: Mixture of Hidden Markov Models as Tree Encoder. Proceedings of the European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN'18), i6doc.com, Louvain-la-Neuve, Belgium, 2018, ISBN: 978-287587047-6.

Great success also for the Deep Learning in Bioinformatics special session I am co-organizing. We received a high number of high-quality papers, but only 20% could be selected for oral presentation in the plenary. Check out the program here!

New IEEE Transactions paper!

A paper on generative tree kernels has just been accepted for publication in the prestigious IEEE Transactions on Neural Networks and Learning Systems. A nice Christmas gift for me and my co-authors, Alessio Micheli and Alessandro Sperduti!

Bacciu Davide, Micheli Alessio, Sperduti Alessandro: Generative Kernels for Tree-Structured Data. In: IEEE Transactions on Neural Networks and Learning Systems, 2018, ISSN: 2162-2388.

New journal paper accepted

A paper on randomized neural networks for preference learning with physiological timeseries data has just been accepted for publication in the Neurocomputing journal. Congratulations to my Biobeats collaborators!

Bacciu Davide, Colombo Michele, Morelli Davide, Plans David: Randomized neural networks for preference learning with physiological data. In: Neurocomputing, vol. 298, pp. 9-20, 2018.