Aravindan Vijayaraghavan: 'Smoothed Analysis for Tensor Decompositions and Unsupervised Learning'

Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021
Workshop III: Mathematical Foundations and Algorithms for Tensor Computations

"Smoothed Analysis for Tensor Decompositions and Unsupervised Learning"
Aravindan Vijayaraghavan - Northwestern University

Abstract: Smoothed analysis is a powerful paradigm for proving robust, polynomial-time guarantees for algorithms on problems that are computationally intractable in the worst case. While polynomial-time smoothed analysis guarantees are desirable for tensor decompositions and related problems in unsupervised learning, they have been challenging to obtain.
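As a rough intuition for the smoothed-analysis viewpoint (not taken from the talk), the sketch below takes a worst-case, rank-deficient matrix, adds a small Gaussian perturbation, and checks that the least singular value is then typically bounded away from zero; the matrix, dimensions, and noise level are illustrative choices only.

```python
# Minimal smoothed-analysis sketch (illustrative assumptions, not the talk's setup):
# a worst-case rank-1 matrix has least singular value 0, but a tiny random
# perturbation typically pushes it polynomially far from singular.
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = np.outer(np.ones(n), np.ones(n))            # worst case: rank 1, so sigma_min(A) = 0
sigma = 1e-3                                    # magnitude of the smoothing perturbation
A_smoothed = A + sigma * rng.standard_normal((n, n))

print("sigma_min of worst-case A:   ", np.linalg.svd(A, compute_uv=False)[-1])
print("sigma_min after perturbation:", np.linalg.svd(A_smoothed, compute_uv=False)[-1])
```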

In this talk, I will describe a general framework for showing polynomial-time smoothed analysis guarantees for tensor methods and related unsupervised learning problems. The core technical component is obtaining new high-confidence lower bounds on the least singular value for a broad class of random matrix ensembles with highly dependent entries, namely entries given by low-degree polynomials of a small number of underlying base random variables. I will describe how these bounds yield polynomial-time smoothed analysis guarantees for robust decompositions of symmetric order-2t overcomplete tensors (of rank up to O(n^t)), for learning overcomplete latent variable models such as HMMs, and for other problems.
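A hypothetical numerical illustration of the kind of ensemble described above (the construction, dimensions, and noise level are my own choices, not the paper's): each column is a t-fold tensor power of a smoothed vector, so every entry is a degree-t polynomial of the base Gaussian variables, and the matrix can have more columns than the ambient dimension n while its least singular value stays well separated from zero.

```python
# Columns are flattened tensor powers a^{\otimes t} of smoothed vectors
# (worst-case base direction plus small Gaussian noise). Without the noise all
# columns coincide and the least singular value is 0; with it, the overcomplete
# matrix (r > n columns) is typically well conditioned.
import numpy as np

rng = np.random.default_rng(1)
n, t = 10, 2
r = 3 * n                                       # overcomplete regime: r > n (r can be up to O(n^t))
sigma = 0.1                                     # smoothing magnitude
base = np.ones(n)                               # adversarial base: identical columns if unperturbed

cols = []
for _ in range(r):
    a = base + sigma * rng.standard_normal(n)   # independent smoothing per column
    v = a
    for _ in range(t - 1):
        v = np.kron(v, a)                       # a^{\otimes t}, flattened to length n^t
    cols.append(v)

M = np.column_stack(cols)                       # n^t x r matrix with dependent polynomial entries
print("least singular value:", np.linalg.svd(M, compute_uv=False)[-1])
```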

Based on joint work with Aditya Bhaskara, Aidao Chen and Aidan Perreault.

Institute for Pure and Applied Mathematics, UCLA
May 6, 2021
