Our understanding of the universe has been transformed in the past few decades as we have confirmed the existence of long-conjectured celestial objects and phenomena (black holes, neutron stars, gravitational waves, exoplanets) and discovered or posited entirely new ones (dark matter, dark energy, gamma-ray bursts, Hawking radiation). Many of these discoveries, and much of the current exploration in astronomy, involve close collaboration between astronomers and statistical scientists. Recent and upcoming telescopes, both ground-based (the Giant Magellan Telescope (GMT), the Large Synoptic Survey Telescope (LSST), etc.) and space-based (the Transiting Exoplanet Survey Satellite (TESS), the James Webb Space Telescope (JWST), etc.), are generating terabytes of new data per day that call for both extensions of existing statistical methods and entirely new methods of analysis to support scientific inquiry. Important statistical tools in this area include Bayesian time series analysis, stochastic processes, multivariate statistics, and compressed sensing.
Testing theoretical advances in high energy physics now often entails comparing data from experiments at existing and proposed accelerators, such as the Large Hadron Collider, to simulated data generated from theoretical models. Understanding and quantifying uncertainty in both the experimental and the simulated data is an area of rapid progress that requires close collaboration between physical and statistical scientists.
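One simple way to make this comparison concrete is a likelihood-based check: treat the observed event counts in each detector bin as Poisson-distributed around the expected counts predicted by a simulated model, and compare models by their log-likelihoods. The sketch below is purely illustrative; the bin counts and the two candidate models are hypothetical values, not data from any real experiment.

```python
import math

def poisson_log_likelihood(observed, expected):
    """Log-likelihood of observed bin counts given expected counts
    from a simulated model, assuming independent Poisson bins:
    sum over bins of [n*log(mu) - mu - log(n!)]."""
    ll = 0.0
    for n, mu in zip(observed, expected):
        ll += n * math.log(mu) - mu - math.lgamma(n + 1)
    return ll

# Hypothetical observed event counts in three detector bins.
observed = [12, 30, 7]

# Expected counts from two hypothetical simulated models.
model_a = [10.0, 28.0, 9.0]
model_b = [20.0, 15.0, 3.0]

# The model with the higher log-likelihood is the better fit;
# here model A's predictions track the observed counts more closely.
ll_a = poisson_log_likelihood(observed, model_a)
ll_b = poisson_log_likelihood(observed, model_b)
print(ll_a > ll_b)
```

In practice this basic idea is elaborated considerably: systematic uncertainties in both the detector response and the simulation enter as nuisance parameters, which is precisely where the statistical collaboration described above comes in.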
Recent years have seen enormous growth at the intersection of astronomy, particle physics, and Bayesian statistics.