Description
Kernel methods are widely used in machine learning and related fields. In this talk, two kernel-based techniques are presented. In the first part of the talk, resampling-based hypothesis tests are introduced for the regression function of binary classification. These statistical tests come with exact, finite-sample guarantees on the type I error and are proved to be strongly consistent under mild statistical assumptions. The new hypothesis tests build on the theory of conditional kernel mean embeddings, which represent conditional distributions in a Bochner space. In the second part of the talk, a stochastic iterative algorithm is presented that estimates the conditional kernel mean map in a vector-valued reproducing kernel Hilbert space. The stochastic gradient method is applied to minimize an expected $L^2$ loss whose optimum is attained at the unknown conditional kernel mean function. The strong consistency of our new scheme is analyzed based on the theory of stochastic approximation.
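To illustrate how a resampling-based test can achieve an exact, finite-sample type I error guarantee, here is a minimal Monte Carlo sketch. It is not the construction from the talk: the test statistic, the Gaussian weighting kernel, its bandwidth, and the hypothesized regression function $f_0$ are all illustrative assumptions. The key mechanism is generic, though: under $H_0$ the original sample and the resampled ones are exchangeable, so the rank of the original statistic is uniform and rejecting at a rank cutoff $k$ out of $m$ gives type I error exactly $k/m$.

```python
import numpy as np

rng = np.random.default_rng(1)

def resampling_test(x, y, f0, m=100):
    """Monte Carlo test of H0: P(Y=1 | X=x) = f0(x).

    Draws m-1 synthetic label vectors from the model under H0; by
    exchangeability the rank of the original statistic among all m
    is uniform, so the resulting p-value is exactly valid.
    """
    # Gaussian weighting kernel (bandwidth is an arbitrary illustrative choice)
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / 0.02)

    def stat(labels):
        r = labels - f0(x)   # residuals w.r.t. the hypothesized regression function
        return r @ K @ r     # kernel-weighted discrepancy (illustrative statistic)

    s0 = stat(y)
    s_alt = np.array([stat((rng.random(len(x)) < f0(x)).astype(float))
                      for _ in range(m - 1)])
    return (1 + np.sum(s_alt >= s0)) / m   # Monte Carlo p-value

n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
f0 = lambda t: np.full_like(t, 0.5)            # hypothesized regression function

y_null = (rng.random(n) < 0.5).astype(float)   # labels drawn under H0
y_alt = (x > 0.5).astype(float)                # labels far from H0

p_null = resampling_test(x, y_null, f0)
p_alt = resampling_test(x, y_alt, f0)
```

With `m = 100` resamples, the smallest attainable p-value is 1/100, and rejecting when the p-value is at most 5/100 has type I error exactly 5% under $H_0$, regardless of the sample size.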
Joint work with: Balázs Csanád Csáji, SZTAKI, ELTE
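The stochastic iterative scheme of the second part can be sketched in a simplified scalar setting. This is a generic functional stochastic gradient recursion for the objective $\mathbb{E}\,\|f(X) - \varphi(Y)\|^2$, not the algorithm analyzed in the talk: the Gaussian input kernel, its bandwidth, the step-size sequence, and the use of a linear output kernel $\varphi(y) = y$ (under which the conditional kernel mean at $x$ reduces to the regression function $\mathbb{E}[Y \mid X = x]$) are all illustrative assumptions. Each iterate is stored through its kernel-expansion coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

def k(a, b, bw=0.1):
    """Gaussian input kernel (bandwidth is an arbitrary illustrative choice)."""
    return np.exp(-(a - b) ** 2 / (2 * bw ** 2))

# synthetic data; with the linear output kernel phi(y) = y, the conditional
# kernel mean evaluated at x is the regression function E[Y | X = x]
n = 500
x = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x) + 0.05 * rng.normal(size=n)

# iterate f_t represented as f_t = sum_{i,j} B[i, j] * k(., x_i) * phi(y_j)
B = np.zeros((n, n))
for t in range(n):
    # diminishing step sizes of stochastic-approximation type (illustrative)
    gamma = 0.5 / (t + 1) ** 0.6
    k_t = k(x[t], x[:t])              # kernel evaluations against past inputs
    # SGD step: f_{t+1} = f_t + gamma * k(., x_t) (x) (phi(y_t) - f_t(x_t)),
    # written out on the expansion coefficients
    B[t] = -gamma * (k_t @ B[:t])     # the -gamma * f_t(x_t) part
    B[t, t] += gamma                  # the +gamma * phi(y_t) part

def predict(x0):
    # evaluate the estimate; with phi(y) = y this approximates E[Y | X = x0]
    return k(x0, x) @ B @ y
```

Since the per-step multiplier satisfies $\gamma_t\, k(x_t, x_t) < 1$ here, each update contracts the pointwise error at the current input, which is the intuition behind the stochastic-approximation convergence analysis.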