DOI: 10.1093/biomet/asad078 ISSN: 0006-3444

On Selecting and Conditioning in Multiple Testing and Selective Inference

Jelle J. Goeman, Aldo Solari

Abstract

We investigate a class of methods for selective inference that condition on a selection event. Such methods follow a two-stage process. First, a data-driven collection of hypotheses is chosen from some large universe of hypotheses. Subsequently, inference takes place within this data-driven collection, conditioned on the information that was used for the selection. Examples of such methods include basic data splitting, as well as modern data carving methods and post-selection inference methods for lasso coefficients based on the polyhedral lemma. In this paper, we adopt a holistic view on such methods, considering the selection, conditioning, and final error control steps together as a single method. From this perspective, we demonstrate that multiple testing methods defined directly on the full universe of hypotheses are always at least as powerful as selective inference methods based on selection and conditioning. This result holds true even when the universe is potentially infinite and only implicitly defined, such as in the case of data splitting. We give general theory and intuitions before investigating in detail several case studies where a shift to a non-selective or unconditional perspective can yield a power gain.
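The two-stage procedure described in the abstract can be made concrete with basic data splitting. Below is a minimal sketch (not taken from the paper; all sample sizes, thresholds, and variable names are illustrative): hypotheses are selected on the first half of the data and tested on the second half with a Bonferroni correction over the selected set, alongside the non-selective alternative of applying Bonferroni to the full data over the full universe of hypotheses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

m = 100            # size of the universe of hypotheses (illustrative)
n = 40             # observations per hypothesis (illustrative)
mu = np.zeros(m)
mu[:5] = 1.0       # five hypotheses with a true signal

X = rng.normal(loc=mu, scale=1.0, size=(n, m))

# --- Selective inference via data splitting ---
# Stage 1: select a data-driven collection on the first half.
X1, X2 = X[: n // 2], X[n // 2 :]
p1 = stats.ttest_1samp(X1, 0.0, alternative="greater").pvalue
selected = np.flatnonzero(p1 < 0.1)            # data-driven collection

# Stage 2: test only the selected hypotheses on the held-out half,
# correcting for the size of the selected set.
p2 = stats.ttest_1samp(X2[:, selected], 0.0, alternative="greater").pvalue
split_rejections = selected[p2 < 0.05 / max(len(selected), 1)]

# --- Non-selective alternative: Bonferroni on the full data ---
# Multiple testing defined directly on the full universe, using all n
# observations per hypothesis.
p_full = stats.ttest_1samp(X, 0.0, alternative="greater").pvalue
full_rejections = np.flatnonzero(p_full < 0.05 / m)
```

The non-selective method pays a larger multiplicity correction (dividing by m rather than by the number selected) but uses the full sample for testing; the paper's claim is that, viewed holistically, the direct approach is always at least as powerful as the select-then-condition approach.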