Special Colloquium - Xiaocong Xu, USC

January 26, 4:00 – 5:00 p.m. in Math Rm. 501

When

4 – 5 p.m., Jan. 29, 2026

Where

Math Rm. 501

Title: Estimation and Inference in Proportional High Dimensions 

Abstract: Many modern learning problems are studied in a proportional high-dimensional regime, where the feature dimension is of the same order as the sample size. In this talk, I will discuss how working in this regime affects both estimation and uncertainty quantification, and how we obtain useful and sharp characterizations for widely used estimators and algorithms.
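For readers who want a concrete picture of the regime, the short sketch below simulates Gaussian designs at a fixed aspect ratio p/n and prints the range of their singular values; the specific sizes, the normalization, and the use of NumPy are illustrative assumptions, not material from the talk.

    # Toy illustration of the proportional regime: p grows with n at a fixed
    # aspect ratio gamma = p / n (an assumption chosen here for exposition).
    import numpy as np

    rng = np.random.default_rng(0)
    gamma = 0.5                                  # fixed aspect ratio p / n
    for n in (200, 800, 3200):
        p = int(gamma * n)                       # p scales with n, unlike classical fixed-p asymptotics
        X = rng.normal(size=(n, p)) / np.sqrt(n)
        # The singular values stay spread over an interval of roughly constant
        # width, so the design never behaves like a fixed-dimensional one.
        s = np.linalg.svd(X, compute_uv=False)
        print(f"n = {n:4d}, p = {p:4d}, p/n = {p / n:.2f}, "
              f"singular values in [{s.min():.2f}, {s.max():.2f}]")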
 

The first part will focus on ridge regression in linear models. We derive a distributional approximation for the ridge estimator via an associated Gaussian sequence model with “effective” noise and regularization parameters. This reduction provides a convenient way to analyze prediction and estimation risks and to support practical tuning rules, such as cross-validation and generalized cross-validation. It also yields a simple inference procedure based on a debiased ridge construction.
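A minimal sketch of the ingredients named above, under simplifying assumptions: an isotropic Gaussian design, a standard generalized cross-validation (GCV) criterion, and one common residual-based, degrees-of-freedom-adjusted debiasing recipe. It is not the speaker's construction, only an illustration of ridge tuning and debiasing in this regime.

    # Illustrative sketch (assumptions: isotropic design, GCV tuning, a common
    # degrees-of-freedom-adjusted debiasing step), not the talk's exact method.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 600, 300                      # proportional regime: p/n = 0.5
    beta = rng.normal(size=p) / np.sqrt(p)
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(size=n)

    def ridge(X, y, lam):
        """Ridge estimator (X'X + n*lam*I)^{-1} X'y."""
        n, p = X.shape
        return np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T @ y)

    def gcv(X, y, lam):
        """Generalized cross-validation criterion for a given lam."""
        n, p = X.shape
        A = np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T)
        df = np.trace(X @ A)                       # effective degrees of freedom
        resid = y - X @ (A @ y)
        return (resid @ resid / n) / (1.0 - df / n) ** 2

    # Tuning rule mentioned in the abstract: pick lam by minimizing GCV on a grid.
    grid = np.logspace(-3, 1, 30)
    lam = min(grid, key=lambda l: gcv(X, y, l))
    b_hat = ridge(X, y, lam)

    # One common debiasing construction for isotropic designs: add back a
    # degrees-of-freedom-adjusted residual correction (an assumption here).
    A = np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T)
    df = np.trace(X @ A)
    b_debiased = b_hat + X.T @ (y - X @ b_hat) / (n - df)
    print(f"lam* = {lam:.3g}, "
          f"||b_hat - beta|| = {np.linalg.norm(b_hat - beta):.3f}, "
          f"||b_debiased - beta|| = {np.linalg.norm(b_debiased - beta):.3f}")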
 

The second part will take an algorithmic perspective. Instead of analyzing only the final empirical risk minimizer, we view gradient descent iterates as estimators along an optimization path. We characterize the distribution of the iterates and use this characterization to construct data-driven estimates of generalization error and debiased iterates for statistical inference, including in settings beyond linear regression. I will conclude with simulations that illustrate the practical implications for tuning and inference.
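The sketch below illustrates the "iterates as estimators" viewpoint with gradient descent on linear least squares, treating each iterate as an estimator and selecting a stopping time from an estimate of prediction risk. The holdout split used here is only a stand-in for the data-driven generalization-error estimates mentioned above, and all parameter choices are assumptions for illustration.

    # Illustrative sketch: gradient descent iterates as estimators, with a
    # holdout proxy (an assumption) for the generalization-error estimate.
    import numpy as np

    rng = np.random.default_rng(1)
    n, p = 800, 300
    beta = rng.normal(size=p) / np.sqrt(p)
    X = rng.normal(size=(n, p))
    y = X @ beta + rng.normal(size=n)

    # Split into a fitting set and a holdout set used only to estimate risk.
    n_fit = n // 2
    X_fit, y_fit = X[:n_fit], y[:n_fit]
    X_out, y_out = X[n_fit:], y[n_fit:]

    step = 1.0 / np.linalg.norm(X_fit, ord=2) ** 2   # 1 / largest singular value squared
    b = np.zeros(p)
    path = []
    for t in range(200):
        grad = X_fit.T @ (X_fit @ b - y_fit)          # gradient of 0.5 * ||y - Xb||^2
        b = b - step * grad
        est_risk = np.mean((y_out - X_out @ b) ** 2)  # holdout estimate of prediction risk
        path.append((t + 1, est_risk, b.copy()))

    # Treat each iterate along the optimization path as a candidate estimator
    # and stop where the estimated risk is smallest (early stopping).
    t_best, risk_best, b_best = min(path, key=lambda rec: rec[1])
    print(f"early-stopped iterate t = {t_best}, holdout risk = {risk_best:.3f}, "
          f"||b_t - beta|| = {np.linalg.norm(b_best - beta):.3f}")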