Wednesday, October 13, 2021
CGIS Knafel Building (K354) - 12:10-1:30 pm
Differential privacy (DP) brings provability and transparency to statistical disclosure limitation. When data users migrate their analyses onto private data products, there is no guarantee that a statistical model otherwise suitable for non-private data can still produce trustworthy conclusions. This talk contemplates two challenges in drawing good statistical inference from private data. When the DP mechanism is transparent, I discuss how approximate computation techniques can be adapted to produce exact inference with respect to the joint specification of the intended model and the DP mechanism. In the presence of mandated invariants which the data curator must observe, I underscore the importance of recognizing the associated privacy leakage, and advocate for the congenial design of the DP mechanism, as an alternative to optimization-based post-processing, as a way to preserve the statistical intelligibility of the private data product.
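To make the invariant scenario concrete, the following is a minimal sketch (not the speaker's method; all names and numbers are illustrative) of a standard Laplace mechanism followed by optimization-based post-processing: noisy subgroup counts are L2-projected onto the hyperplane where they sum to a mandated, exactly known total. The projection enforces the invariant, but it also makes the effective noise in the released counts correlated, which is the kind of complication for downstream inference that the abstract alludes to.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(counts, epsilon, sensitivity=1.0):
    """Add i.i.d. Laplace noise with scale sensitivity/epsilon to each count
    (the standard epsilon-DP Laplace mechanism for L1 sensitivity)."""
    noise = rng.laplace(scale=sensitivity / epsilon, size=len(counts))
    return counts + noise

def project_to_invariant(noisy, total):
    """Optimization-based post-processing: L2-project the noisy vector onto
    the hyperplane {x : sum(x) == total}, so the mandated total holds exactly.
    Note the adjustment spreads the residual equally, correlating the errors."""
    return noisy + (total - noisy.sum()) / len(noisy)

# Hypothetical subgroup counts; the invariant is that their total is released exactly.
true_counts = np.array([120.0, 45.0, 35.0])
total = true_counts.sum()

noisy = laplace_mechanism(true_counts, epsilon=1.0)
released = project_to_invariant(noisy, total)

# The invariant holds exactly after projection, even though each count is noisy.
assert abs(released.sum() - total) < 1e-9
```

A congenially designed mechanism would instead build the invariant into the noise-generation step itself, keeping the probabilistic description of the release simple enough for principled inference.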