STA 290 Seminar: Jane-Ling Wang

Event Date
Thursday, February 8, 2024

Location
Mathematical Sciences Building 1147

Speaker: Jane-Ling Wang, Distinguished Professor, Dept. of Statistics, UC Davis

Title: "Deep learning for survival and functional data"

Abstract: This talk explores the application of deep neural networks (DNNs) to two types of data: survival data and functional data.

Survival Data: While DNNs have demonstrated empirical success in applications to survival data, most of these methods are difficult to interpret, and mathematical understanding of them is lacking. We study the partially linear Cox model, in which the nonlinear component of the model is implemented by a deep neural network. The proposed approach is flexible and able to circumvent the curse of dimensionality, yet it facilitates interpretability of the effects of treatment covariates on survival. We establish asymptotic theory for maximum partial likelihood estimators and show that the nonparametric DNN estimator achieves the minimax optimal rate of convergence (up to a poly-logarithmic factor). Moreover, the corresponding parametric estimator of the treatment covariate effects is √n-consistent, asymptotically normal, and attains semiparametric efficiency. Numerical experiments provide evidence of the advantages of the proposed method.
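To make the model concrete: the partially linear Cox model posits hazard λ(t | z, x) = λ₀(t) exp(θᵀz + g(x)), with θ the interpretable treatment effects and g a DNN. The sketch below is a minimal NumPy illustration of the negative log partial likelihood for such a risk score, not the speaker's implementation; the tiny one-hidden-layer network, the toy data, and all function names are illustrative assumptions, and ties in the event times are assumed away (Breslow-style risk sets).

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, params):
    """Tiny MLP g(x): stands in for the nonparametric DNN component (illustrative)."""
    W1, b1, W2, b2 = params
    h = np.maximum(x @ W1 + b1, 0.0)          # one ReLU hidden layer
    return (h @ W2 + b2).ravel()

def neg_log_partial_likelihood(theta, params, z, x, time, event):
    """Negative log Cox partial likelihood for risk score theta'z + g(x).

    Assumes no tied event times; the risk set at each event time is
    everyone whose observed time is at least that event time.
    """
    eta = z @ theta + mlp_forward(x, params)  # log relative hazard per subject
    order = np.argsort(time)                  # sort by observed time, ascending
    eta, event = eta[order], event[order]
    # log-sum-exp over each risk set, via a reverse cumulative logaddexp
    lse_risk = np.logaddexp.accumulate(eta[::-1])[::-1]
    return -np.sum((eta - lse_risk)[event == 1])

# toy data: p treatment covariates z (linear part), q covariates x (DNN part)
n, p, q, hidden = 50, 2, 3, 8
z = rng.normal(size=(n, p))
x = rng.normal(size=(n, q))
time = rng.exponential(size=n)
event = rng.integers(0, 2, size=n)            # 1 = event observed, 0 = censored

params = (rng.normal(scale=0.1, size=(q, hidden)), np.zeros(hidden),
          rng.normal(scale=0.1, size=(hidden, 1)), np.zeros(1))
theta = np.zeros(p)

loss = neg_log_partial_likelihood(theta, params, z, x, time, event)
```

In the actual method, θ and the network weights would be fit jointly by minimizing this loss; the point of the sketch is only that θ enters linearly, which is what makes the treatment effects interpretable and amenable to the asymptotic theory described above.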

Functional Data: The infinite dimensionality of functional data means standard learning algorithms can be applied only after appropriate dimension reduction, typically through basis expansions. Currently, these bases are chosen a priori, without using information about the task at hand, and thus may be suboptimal. We instead propose to learn these bases adaptively in an end-to-end fashion. We introduce a DNN that employs a new basis layer whose hidden units are themselves basis functions, each implemented as a micro neural network. This architecture learns a parsimonious dimension reduction of functional inputs that focuses only on information relevant to the target rather than on irrelevant variation in the input function. Across numerous classification and regression tasks involving functional data, this method empirically outperforms other types of DNNs.
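The basis-layer idea can be sketched as follows: each hidden unit is a micro network mapping a time point t to a basis value bₖ(t), and a curve x(t) observed on a grid is reduced to scores cₖ ≈ ∫ x(t) bₖ(t) dt. The NumPy sketch below illustrates that forward pass only; the micro-network architecture, activation, Riemann-sum integration, and all names here are illustrative assumptions, not the speaker's actual design.

```python
import numpy as np

rng = np.random.default_rng(1)

def micro_net(t, W1, b1, w2, b2):
    """One learned basis function b_k(t): a tiny MLP over the scalar time argument."""
    h = np.tanh(t[:, None] * W1 + b1)    # (m, hidden)
    return h @ w2 + b2                   # (m,) basis values on the grid

def basis_layer(X, grid, basis_params):
    """Reduce each curve to K scores c_k = integral of x(t) b_k(t) dt,
    approximated by a Riemann sum on an equally spaced observation grid."""
    dt = grid[1] - grid[0]
    B = np.stack([micro_net(grid, *p) for p in basis_params], axis=1)  # (m, K)
    return X @ B * dt                    # (n, K) low-dimensional scores

# toy functional data: n curves observed on m grid points, K learned bases
n, m, K, hidden = 20, 100, 4, 6
grid = np.linspace(0.0, 1.0, m)
X = np.sin(2 * np.pi * rng.uniform(1, 3, size=(n, 1)) * grid)  # random-frequency sines

basis_params = [(rng.normal(size=hidden), rng.normal(size=hidden),
                 rng.normal(size=hidden), rng.normal())
                for _ in range(K)]

scores = basis_layer(X, grid, basis_params)  # fed to downstream network layers
```

Because the micro-network weights are trained end-to-end with the downstream prediction layers, the bases adapt to the target, unlike a fixed spline or Fourier expansion chosen a priori.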


Seminar Date/Time: Thursday February 8th, 2024, 4:10pm (Refreshments: 3:30pm)

Location: MSB 1147 (Colloquium Room)
