TY - JOUR
T1 - Sliding Window Strategy for Convolutional Spike Sorting with Lasso
T2 - Algorithm, Theoretical Guarantees and Complexity
AU - Dragoni, Laurent
AU - Flamary, Rémi
AU - Lounici, Karim
AU - Reynaud-Bouret, Patricia
N1 - Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Nature B.V.
PY - 2022/6/1
Y1 - 2022/6/1
N2 - Spike sorting is a class of algorithms used in neuroscience to attribute the time occurrences of particular electric signals, called action potentials or spikes, to neurons. We rephrase this problem as a particular optimization problem: Lasso for convolutional models in high dimension. Lasso (i.e. least absolute shrinkage and selection operator) is a very generic tool in machine learning that helps us look for sparse solutions (here the time occurrences). However, for the size of the problem at hand in this neuroscience context, classical Lasso solvers fail. We present here a new and much faster algorithm. Making use of biological properties related to neurons, we explain how the particular structure of the problem allows several optimizations, leading to an algorithm with a temporal complexity that grows linearly with respect to the size of the recorded signal and can be performed online. Moreover, the spatial separability of the initial problem allows us to break it into subproblems, further reducing the complexity and making its application possible on the latest recording devices, which comprise a large number of sensors. We provide several mathematical results: the size and numerical complexity of the subproblems can be estimated mathematically by using percolation theory. We also show, under reasonable assumptions, that the Lasso estimator retrieves the true time occurrences of the spikes with large probability. Finally, the theoretical time complexity of the algorithm is given. Numerical simulations are also provided in order to illustrate the efficiency of our approach.
AB - Spike sorting is a class of algorithms used in neuroscience to attribute the time occurrences of particular electric signals, called action potentials or spikes, to neurons. We rephrase this problem as a particular optimization problem: Lasso for convolutional models in high dimension. Lasso (i.e. least absolute shrinkage and selection operator) is a very generic tool in machine learning that helps us look for sparse solutions (here the time occurrences). However, for the size of the problem at hand in this neuroscience context, classical Lasso solvers fail. We present here a new and much faster algorithm. Making use of biological properties related to neurons, we explain how the particular structure of the problem allows several optimizations, leading to an algorithm with a temporal complexity that grows linearly with respect to the size of the recorded signal and can be performed online. Moreover, the spatial separability of the initial problem allows us to break it into subproblems, further reducing the complexity and making its application possible on the latest recording devices, which comprise a large number of sensors. We provide several mathematical results: the size and numerical complexity of the subproblems can be estimated mathematically by using percolation theory. We also show, under reasonable assumptions, that the Lasso estimator retrieves the true time occurrences of the spikes with large probability. Finally, the theoretical time complexity of the algorithm is given. Numerical simulations are also provided in order to illustrate the efficiency of our approach.
KW - Lasso
KW - Neuroscience
KW - Optimization
KW - Sparsity
KW - Spike sorting
U2 - 10.1007/s10440-022-00494-x
DO - 10.1007/s10440-022-00494-x
M3 - Article
AN - SCOPUS:85130016475
SN - 0167-8019
VL - 179
JO - Acta Applicandae Mathematicae
JF - Acta Applicandae Mathematicae
IS - 1
M1 - 7
ER -