Speaker: Doroteo Torre-Toledano.

Abstract: The Kolmogorov-Arnold Representation Theorem, KART (1957-58), establishes that any continuous multidimensional function can be expressed in terms of a finite set of binary additions and unidimensional functions. In practice this means that the only truly multidimensional function needed is the sum. This theorem was studied early in the field of machine learning, due to its promise of simplifying multidimensional problems and mitigating the curse of dimensionality, but it was discarded as irrelevant for machine learning in 1989 because it led to highly complex and wild unidimensional functions. More recently, in 2024, Z. Liu et al. revisited KART, solving some of the difficulties of applying it to machine learning. While KART proposes a 2-layer representation for all functions, Liu et al. showed that it is beneficial to use deeper networks, proposing deep Kolmogorov-Arnold Networks (KANs) as an alternative to deep MLPs. The importance of KANs lies in their easier interpretability and their ability to interact with human knowledge in terms of feature importance, feature properties such as symmetries, and even symbolic formulas. The talk presents KANs as an alternative to MLPs, particularly for small machine learning problems that require explainability, and analyzes some of their possibilities, advantages, and disadvantages compared to mainstream deep learning.
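As a small illustration of the abstract's claim that "the only real multidimensional function is the sum" (this example is not from the talk itself): KART writes any continuous f(x_1, ..., x_n) as a nested combination of sums and unidimensional functions, f(x) = sum_q Phi_q(sum_p phi_{q,p}(x_p)). A classic concrete instance is multiplication of positive numbers, which reduces to a sum flanked by unidimensional functions, since x*y = exp(log x + log y):

```python
import math

def product_via_sum(x: float, y: float) -> float:
    """Compute x*y for x, y > 0 using only unidimensional
    functions (log, exp) and a binary addition, illustrating
    the KART-style decomposition f(x) = Phi(phi1(x) + phi2(y))."""
    # inner unidimensional functions phi applied per coordinate
    s = math.log(x) + math.log(y)  # the sum is the only 2-D operation
    # outer unidimensional function Phi = exp
    return math.exp(s)

print(product_via_sum(3.0, 4.0))  # close to 12.0 up to floating-point error
```

KANs generalize this idea by making the unidimensional edge functions learnable (e.g. as splines) and stacking such layers in depth.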