STA 290 Seminar Series
Thursday, October 27th, 4:10pm, MSB 1147 (Colloquium Room)
Refreshments at 3:30pm in MSB 4110 (Statistics Lounge)
Speaker: Luis Rademacher (Assistant Professor, Department of Mathematics, UC Davis)
Title: "Provably efficient high dimensional feature extraction"
Abstract: The goal of inference is to extract information from data. A basic building block in high-dimensional inference is feature extraction: computing functionals of given data that represent it in a way that highlights some underlying structure. For example, Principal Component Analysis is an algorithm that finds a basis for representing data that highlights the property of the data being close to a low-dimensional subspace. A fundamental challenge in high-dimensional inference is the design of algorithms that are provably efficient and accurate as the dimension grows. In this context, I will describe two well-established feature extraction techniques: column subset selection (CSS) and independent component analysis (ICA). I will also present work by my coauthors and me on CSS with optimal approximation guarantees, on new applications of ICA, and on ICA for heavy-tailed distributions.
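
For readers unfamiliar with the PCA example mentioned in the abstract, the following is a minimal illustrative sketch (not the speaker's method) of PCA-based feature extraction in Python with NumPy; the function name pca_features and all parameter choices are illustrative assumptions.

    # Illustrative sketch only: PCA as feature extraction.
    # Center the data and project it onto the top-k right singular vectors,
    # i.e. the basis capturing the low-dimensional structure described above.
    import numpy as np

    def pca_features(X, k):
        """Project an (n_samples, n_dims) data matrix onto its top-k principal components."""
        Xc = X - X.mean(axis=0)                  # center each coordinate
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:k].T                     # coordinates in the top-k subspace

    # Example: 200 points lying near a 2-dimensional subspace of R^10
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(200, 10))
    Z = pca_features(X, 2)                       # 200 x 2 low-dimensional representation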