
Learning with Boundary Conditions

Gnecco, Giorgio; Gori, Marco; Sanguineti, Marcello (2013). Learning with Boundary Conditions. Neural Computation, 25(4), pp. 1029-1106. ISSN 0899-7667.

Full text not available from this repository.

Abstract

Kernel machines traditionally arise from an elegant formulation based on measuring the smoothness of the admissible solutions by the norm of the reproducing kernel Hilbert space (RKHS) generated by the chosen kernel. It has been pointed out that they can also be formulated in a related functional framework, in which the Green's function of a suitable differential operator plays the role of the kernel. In this letter, we give our own picture of this intriguing connection, emphasizing some relevant distinctions between these two ways of measuring the smoothness of admissible solutions. In particular, we show that for some kernels there is no associated differential operator. We especially emphasize the crucial relevance of boundary conditions, which are in fact the truly distinguishing feature of the approach based on differential operators. We provide a general solution to the problem of learning from data and boundary conditions, and we illustrate the significant role played by boundary conditions with examples. It turns out that the degree of freedom left open in the traditional formulation of kernel machines is in fact a limitation, which is partly overcome by incorporating boundary conditions. This is likely to hold true in many real-world applications in which there is prior knowledge about the expected behavior of classifiers and regressors on the boundary.
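
To fix ideas about the connection the abstract refers to, here is a minimal sketch (not taken from the article itself; the quadratic loss, the differential operator P, the regularization weight \lambda, and the domain \Omega are generic placeholders). The standard RKHS formulation of kernel machines solves

\[
\min_{f \in \mathcal{H}_K} \ \sum_{i=1}^{n} \bigl(y_i - f(x_i)\bigr)^2 + \lambda \,\|f\|_{\mathcal{H}_K}^2 ,
\]

whereas the differential-operator view replaces the RKHS norm with a smoothness functional,

\[
\min_{f} \ \sum_{i=1}^{n} \bigl(y_i - f(x_i)\bigr)^2 + \lambda \int_{\Omega} \bigl((Pf)(x)\bigr)^2 \, dx .
\]

Under suitable assumptions the two formulations agree, up to boundary terms, when K is the Green's function of the operator P*P, i.e. P*P\,K(x,\cdot) = \delta_x on \Omega. In the second formulation, however, the variational problem is only well posed once boundary conditions on \partial\Omega are specified, and it is exactly this extra information that removes the freedom left open by the purely kernel-based formulation.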

Item Type: Article
Identification Number: 10.1162/NECO_a_00417
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Research Area: Computer Science and Applications
Depositing User: Giorgio Gnecco
Date Deposited: 16 Sep 2013 09:43
Last Modified: 16 Sep 2013 12:02
URI: http://eprints.imtlucca.it/id/eprint/1730
