Approximation with Tensor Networks

Speaker's name
Mazen Ali
Speaker's institution
LMJL
Date and time of the talk
January 26, 2021, 11:00 AM (Paris time)
Location of the talk
Zoom

We study the approximation of multivariate functions on bounded domains with tensor networks (TNs). The main contribution of this work is an answer to the following two questions, which can be seen as two perspectives on the same issue: “What are the approximation capabilities of TNs?” and “What is the mathematical structure of approximation classes of TNs?”

To answer the former, we show that TNs can near-optimally replicate h-uniform, h-adaptive, and hp-approximation. Tensor networks thus exhibit universal expressivity with respect to classical polynomial-based approximation methods, comparable to that of more general neural network families such as deep rectified linear unit (ReLU) networks. Put differently, TNs have the capacity to optimally approximate many function classes without being adapted to the particular class in question.
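As a concrete illustration of the setting (not taken from the talk itself), the sketch below compresses a tensorized univariate function in the tensor-train format, one common TN topology, via sequential truncated SVDs (the classical TT-SVD algorithm). The function, grid size, tolerance, and the helper name tt_svd are illustrative assumptions on my part; only NumPy is required.

    import numpy as np

    def tt_svd(tensor, tol=1e-8):
        """Compress a d-way array into tensor-train (TT) cores of shape
        (r_{k-1}, n_k, r_k) via sequential truncated SVDs (TT-SVD)."""
        dims = tensor.shape
        cores, r = [], 1
        C = tensor.reshape(r * dims[0], -1)
        for k in range(len(dims) - 1):
            U, s, Vt = np.linalg.svd(C, full_matrices=False)
            r_new = max(1, int(np.sum(s > tol * s[0])))  # relative truncation
            cores.append(U[:, :r_new].reshape(r, dims[k], r_new))
            C = (s[:r_new, None] * Vt[:r_new]).reshape(r_new * dims[k + 1], -1)
            r = r_new
        cores.append(C.reshape(r, dims[-1], 1))
        return cores

    # Tensorization: sample f on a dyadic grid of 2**L points, fold the
    # samples into a 2 x 2 x ... x 2 array indexed by the binary digits of
    # the grid index, then compress.
    L = 12
    x = np.arange(2**L) / 2**L
    f = np.sin(10 * x) + np.abs(x - 1 / 3)   # smooth part plus a kink
    cores = tt_svd(f.reshape((2,) * L))
    print("TT ranks:", [c.shape[2] for c in cores[:-1]])

    # Contract the cores back together to check the compression error.
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=(T.ndim - 1, 0))
    print("relative error:", np.linalg.norm(T.ravel() - f) / np.linalg.norm(f))

The printed ranks stay far below the worst-case bound min(2^k, 2^(L-k)); such low-rank tensorized representations are the discrete counterpart of the expressivity statements above.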

To answer the latter, we show that approximation classes of TNs are (quasi-)Banach spaces, that many classical smoothness spaces are continuously embedded into TN approximation classes, and that TN approximation classes themselves are not embedded in any classical smoothness space. In other words, although approximation with TNs is a highly nonlinear approximation scheme, under certain restrictions on the topology of the network the set of functions that can be approximated with TNs at a given rate is highly structured: it is a (quasi-)Banach space. Moreover, it is a very “large” space: many well-known classical smoothness spaces, such as isotropic, anisotropic, and mixed Besov spaces, are embedded therein, while it also contains exotic functions that have no smoothness in the classical sense.
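For readers unfamiliar with the terminology, the following is the standard (DeVore-style) definition of an approximation class; taking Σ_n to be the set of functions representable by a TN of complexity at most n is my paraphrase of the setting, and the precise complexity measure (e.g. number of parameters) is left unspecified here.

    E_n(f)_X := \inf_{g \in \Sigma_n} \| f - g \|_X, \qquad
    A^\alpha_q(X) := \{ f \in X : \| f \|_{A^\alpha_q(X)} < \infty \},

    \| f \|_{A^\alpha_q(X)} := \| f \|_X
      + \Big( \sum_{n \ge 1} \big[ n^\alpha E_n(f)_X \big]^q \, n^{-1} \Big)^{1/q},
    \quad 0 < q < \infty,

with the usual modification sup_{n ≥ 1} n^α E_n(f)_X for q = ∞. The (quasi-)Banach statement above says that, for suitable network topologies, this set equipped with this (quasi-)norm is complete; the embedding results place Besov balls inside such classes.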

Mathieu Ribatet is inviting you to a scheduled Zoom meeting.

Topic: Séminaire Ali Mazen
Time: Jan 26, 2021 11:00 AM Paris

Join Zoom Meeting: https://ec-nantes.zoom.us/j/94571681620