Lucas Janson presents "Recent Advances in Model-X Knockoffs"

Presentation Date: Wednesday, November 20, 2019

Location: CGIS Knafel Building (K354), 12-1:30 pm

Abstract: Two years ago in this workshop I presented my work on model-X knockoffs, a method for high-dimensional variable selection that provides exact (finite-sample) control of false discoveries and achieves high power through its flexibility to leverage any and all domain knowledge and machine-learning tools in the search for signal. In this talk, I will discuss two recent works that significantly advance the usability and generality of model-X knockoffs. First, I will show how the original assumption of model-X knockoffs, namely that the multivariate distribution of the covariates is known exactly, can be substantially relaxed to the assumption that only a model for the covariates is known; that model can have as many free parameters as the product of the sample size and the dimension, and the relaxation incurs no loss in the guarantees of knockoffs. Second, I will show how to efficiently and exactly sample knockoffs for any distribution of the covariates, even when that distribution is known only up to a normalizing constant. This dramatically expands the set of covariate distributions to which knockoffs can be applied. This is joint work with a number of collaborators, listed below in the full references for the two works:
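To make the mechanics concrete, here is a minimal, illustrative sketch (not from the talk) of the basic model-X knockoff filter for Gaussian covariates, using the equicorrelated knockoff construction and lasso coefficient-difference statistics; the simulation settings, lasso penalty, and variable names are all assumptions of this sketch:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, q = 500, 50, 0.1              # sample size, dimension, target FDR level

# Simulate covariates X ~ N(0, Sigma) with AR(1) correlation and a sparse signal.
rho = 0.3
Sigma = rho ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
beta = np.zeros(p)
beta[:10] = 1.0
y = X @ beta + rng.standard_normal(n)

# Equicorrelated Gaussian knockoffs: sample X_tilde | X from the conditional
# Gaussian that makes each (X_j, X_tilde_j) pair exchangeable.
s = min(1.0, 2 * np.linalg.eigvalsh(Sigma).min())
Sinv = np.linalg.inv(Sigma)
D = s * np.eye(p)
cond_mean = X - X @ Sinv @ D
cond_cov = 2 * D - D @ Sinv @ D
X_tilde = cond_mean + rng.multivariate_normal(np.zeros(p), cond_cov, size=n)

# Feature statistics: lasso coefficient-magnitude differences on [X, X_tilde].
coef = Lasso(alpha=0.05).fit(np.hstack([X, X_tilde]), y).coef_
W = np.abs(coef[:p]) - np.abs(coef[p:])

# Knockoff+ threshold: smallest t whose estimated false discovery
# proportion is at most q.
candidates = np.sort(np.abs(W[W != 0]))
tau = next((t for t in candidates
            if (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t)) <= q), np.inf)
print("selected features:", np.where(W >= tau)[0])
```

The flexibility the abstract mentions enters through the statistic W, which can be computed from any fitted model; the selection threshold and its finite-sample guarantee do not depend on that choice.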

D. Huang and L. Janson. Relaxing the Assumptions of Knockoffs by Conditioning. Annals of Statistics (to appear), 2019.

S. Bates, E. Candès, L. Janson, and W. Wang. Metropolized Knockoff Sampling. 2019.
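As background for the second work: Metropolis-Hastings is the standard sampling primitive that needs a target density only up to a normalizing constant, since its acceptance decision uses the density only through ratios. The sketch below shows just that generic primitive, not the paper's knockoff sampler (which requires considerably more care to yield valid knockoffs); log_f is a hypothetical user-supplied unnormalized log-density:

```python
import numpy as np

rng = np.random.default_rng(1)

def mh_step(x, log_f, step=0.5):
    """One random-walk Metropolis-Hastings update of the vector x.

    Only differences of log_f (i.e., ratios of the density) are used,
    so the normalizing constant of the target never needs to be known.
    """
    proposal = x + step * rng.standard_normal(x.shape)
    # Symmetric proposal, so the acceptance ratio is f(proposal) / f(x).
    if np.log(rng.uniform()) < log_f(proposal) - log_f(x):
        return proposal
    return x

# Example target: an unnormalized standard Gaussian (constant dropped).
log_f = lambda z: -0.5 * np.sum(z ** 2)
x = np.zeros(3)
for _ in range(1000):
    x = mh_step(x, log_f)
```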
