Alain Celisse

Pages by this author
Nicole Mücke, TU Berlin, 5 March 2021
Stochastic gradient descent (SGD) provides a simple and efficient way to solve a broad range of machine learning problems. Here, we focus on distribution regression (DR), which involves two stages of sampling: first, we regress from probability measures to real-valued responses; second, we (...)
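As a reminder of the basic iteration the talk builds on, here is a minimal SGD sketch on an ordinary least-squares problem. This is not distribution regression (which regresses from probability measures); the data, step size, and step count are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model y = X w* + noise (illustrative only).
n, d = 500, 5
X = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
y = X @ w_star + 0.1 * rng.normal(size=n)

def sgd_least_squares(X, y, steps=5000, lr=0.01, seed=1):
    """One-sample-at-a-time SGD on the squared loss."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        i = rng.integers(len(y))
        # Gradient of 0.5 * (x_i . w - y_i)^2 with respect to w.
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad
    return w

w_hat = sgd_least_squares(X, y)
```

With a small constant step size the iterates hover near the least-squares solution; the talk's setting replaces the Euclidean parameter space with a reproducing-kernel hypothesis space and adds the second sampling stage.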
Adeline Fermanian, LPSM, 12 February 2021
Sequential or temporal data arise in many fields of research, such as quantitative finance, medicine or computer vision. We will be concerned with a novel approach for sequential learning, called the signature method, and rooted in rough path theory. Its basic principle is to represent (...)
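To make the signature representation concrete, the following sketch computes the depth-2 signature of a piecewise-linear path directly from its defining iterated integrals (dedicated libraries such as `iisignature` or `signatory` compute signatures to arbitrary depth; this hand-rolled version is for illustration only).

```python
import numpy as np

def signature_level2(path):
    """Depth-2 signature of a piecewise-linear path.

    path : (n_points, d) array of sample points.
    Returns (S1, S2) where
      S1[i]    = integral of dx^i            (the total increment),
      S2[i, j] = integral of (x^i - x0^i) dx^j  (second-level iterated integrals).
    """
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    S1 = np.zeros(d)          # running increment x_k - x_0
    S2 = np.zeros((d, d))
    for k in range(len(path) - 1):
        dx = path[k + 1] - path[k]
        # Exact on each linear segment:
        # int (x - x0) (x) dx = (x_k - x0) (x) dx + 0.5 dx (x) dx
        S2 += np.outer(S1, dx) + 0.5 * np.outer(dx, dx)
        S1 += dx
    return S1, S2

S1, S2 = signature_level2(np.array([[0.0, 0.0], [1.0, 0.5], [0.3, 2.0]]))
```

A useful sanity check is the shuffle identity S2 + S2ᵀ = S1 ⊗ S1, which holds exactly for piecewise-linear paths.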
Arnak Dalalyan, ENSAE, 22 January 2021
This paper shows that a single robust estimator of the mean of a multivariate Gaussian distribution can enjoy five desirable properties. First, it is computationally tractable in the sense that it can be computed in a time which is at most polynomial in dimension, sample size and the logarithm (...)
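For context, a classic example of a computationally tractable robust alternative to the sample mean is the geometric median, computable by the Weiszfeld iteration. This is not the estimator studied in the talk; it is only a sketch of what "robust and polynomial-time" can look like.

```python
import numpy as np

def geometric_median(X, n_iter=200, tol=1e-9):
    """Weiszfeld iteration for the geometric median of rows of X.

    Each step reweights points by the inverse of their distance to the
    current iterate, which down-weights outliers.
    """
    mu = X.mean(axis=0)
    for _ in range(n_iter):
        dist = np.maximum(np.linalg.norm(X - mu, axis=1), tol)  # avoid /0
        w = 1.0 / dist
        mu_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

# Gaussian sample with 10% gross outliers (true mean is 0).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[:20] += 50.0
mu_robust = geometric_median(X)
```

On contaminated data the geometric median stays near the true mean while the sample mean is dragged toward the outliers; the talk's estimator additionally achieves the optimal statistical rate in the Gaussian model.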
Claire Lacour, LAMA, 8 January 2021
We consider a sample of data on the circle whose distribution is a two-component mixture. The density of the sample is assumed to be g(x) = p f(x+a) + (1-p) f(x+b), where p is the mixing parameter, f a density on the circle, and a and b two angles. The objective is to estimate (...)
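The model can be sketched as follows, taking f to be a von Mises density for concreteness (an assumption for illustration only; the values of p, a, b, and the concentration kappa are arbitrary).

```python
import numpy as np

def von_mises_pdf(x, kappa=2.0):
    """von Mises density on the circle, centred at 0 (illustrative choice of f)."""
    return np.exp(kappa * np.cos(x)) / (2 * np.pi * np.i0(kappa))

def mixture_density(x, p=0.4, a=0.0, b=np.pi / 2, kappa=2.0):
    """g(x) = p f(x+a) + (1-p) f(x+b): the two-component circular mixture."""
    return p * von_mises_pdf(x + a, kappa) + (1 - p) * von_mises_pdf(x + b, kappa)

# g integrates to 1 over any full period, since f is 2*pi-periodic.
xs = np.linspace(-np.pi, np.pi, 20001)
total_mass = np.trapz(mixture_density(xs), xs)
```

Because the shifts a and b only rotate f around the circle, each component keeps unit mass over a full period, so g is a genuine density; the statistical problem is to recover p, a, b (and f) from samples of g.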
Antoine Chambaz, MAP5, 18 December 2020
We address the practical construction of asymptotic confidence intervals (CIs) for smooth, real-valued statistical parameters by targeted learning from iid data in contexts where sample size is so large that it poses computational challenges. We observe some summary measure of all data and (...)