Duke Statistical Science is distinguished by our leadership in the development of theory and methodology for modern, stochastic model-based statistical analysis and Bayesian methods, their integration with research in advanced scientific computation, and collaborative interdisciplinary applications in many fields. A complete list of our Research Areas and associated faculty follows.

Research Areas

Adversarial risk analysis is a decision-theoretic approach to games. read more »

Agent-based modeling studies the emergent behavior of systems whose components interact according to relatively simple rules. read more »

Our understanding of the universe has been entirely transformed in the past few decades as we have confirmed the existence of long-conjectured celestial objects. read more »

Causal inference concerns designs and methods of analysis for evaluating treatments or interventions in randomized experiments and observational studies. read more »

In many modern application areas, it has become common to collect data whose structure is not well represented by a simple scalar, vector, or matrix. read more »

Computational advertising uses statistical methods to improve recommender systems, optimize ad-buy decisions, and forecast click-through rates. read more »

Many agencies collect data that they intend to share with others. read more »

Research themes include: Development of structured Bayesian dynamic models for large-scale, time-varying systems; read more »

Dynamical text networks are time-evolving networks in which the nodes are documents. read more »

Research themes include: Time series and forecasting linked to applications in macroeconomics and business; read more »

Missing data plague many applied data analyses. read more »

Statistical models are constructed for a variety of purposes, but typically involve an effort to explain observables (existing or future data) in terms of some underlying structure. read more »

Multiple testing refers to simultaneously conducting more than one statistical test. read more »

As the costs of mounting new data collection efforts increase, many statistical agencies and data analysts are turning to integrating data from multiple sources. read more »

A minimal standard for data analysis and other scientific computations is that they be reproducible. read more »

Spatial analysis, or spatial statistics, includes any of the formal techniques that study spatially referenced entities. read more »

Statistical computing is the interface between statistics and computer science. read more »

Statistics education is the practice of teaching and learning statistics, along with the associated scholarly research. read more »

Mathematical models intended for computational simulation of complex real-world processes (e.g., a climate model developed to study climate change) are a crucial ingredient in virtually every field of science, engineering, medicine, and business. read more »

Data visualization describes all of the ways people transform data into visual representations. read more »