STA 290 Seminar: Nikita Zhivotovskiy



Speaker: Nikita Zhivotovskiy, UC Berkeley

Title: "Does the sequential analysis of estimators lead to sharp statistical guarantees?"

Abstract: Online learning methods yield sequential regret bounds under minimal assumptions and provide in-expectation risk bounds for statistical learning. Despite the apparent advantage of online guarantees over their statistical counterparts, recent findings indicate that in many important cases, regret bounds may not guarantee tight high-probability risk bounds in the statistical setting. We discuss how online-to-batch conversions applied to general online learning algorithms can bypass this limitation. Through a general second-order correction to the loss function defining regret, we obtain nearly optimal high-probability risk bounds for several classical statistical estimation problems, such as discrete distribution estimation, linear regression, logistic regression, and—more broadly—conditional density estimation. Our analysis is based on the premise that many online learning algorithms are not restricted to using predictors from a given reference class, allowing for significant improvements in the dependencies on various problem parameters.

Faculty webpage: hosted at UC Berkeley

Seminar Date / Time: Thursday February 29, 2024, at 4:10pm (refreshments 3:30pm)

Location: Mathematical Sciences Building (MSB) 1147