THURSDAY, JANUARY 25, 2024 | 3:30 p.m., E111 Fee Hall, Zoom
MSU Department of Economics
University Distinguished Professor
“Simple Approaches to Nonlinear Difference-in-Differences with Panel Data”
I will discuss simple strategies for estimating average treatment effects for staggered interventions when panel data are available and the response variable may warrant a nonlinear model. Identification hinges on a no-anticipation assumption and a version of parallel trends that conditions on observed covariates. When a canonical link function is paired with a quasi-log likelihood in the linear exponential family, estimation is simple and robust. The leading cases are a linear model paired with the Gaussian quasi-log likelihood, a logistic functional form (for binary and fractional outcomes) combined with the Bernoulli quasi-log likelihood, and an exponential mean paired with the Poisson quasi-log likelihood.
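As a minimal sketch of the third case (an exponential conditional mean estimated by Poisson quasi-maximum likelihood) in the simplest two-group, two-period design: the simulated data, variable names, and effect size below are invented for illustration, and a small Newton-Raphson routine stands in for a packaged QMLE estimator.

```python
import numpy as np

def poisson_qmle(X, y, iters=50):
    """Poisson quasi-MLE via Newton-Raphson (IRLS).

    Consistent for the exponential conditional mean exp(X @ beta)
    even when y is not actually Poisson-distributed.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)
        grad = X.T @ (y - mu)               # score of the quasi-log likelihood
        H = X.T @ (X * mu[:, None])         # expected Hessian (working weights mu)
        beta = beta + np.linalg.solve(H, grad)
    return beta

rng = np.random.default_rng(0)
n = 20_000
d = rng.integers(0, 2, n)   # treated-group indicator
t = rng.integers(0, 2, n)   # post-period indicator
# exponential mean with a multiplicative treatment effect exp(0.3) in the d*t cell
mu = np.exp(0.5 + 0.2 * d + 0.1 * t + 0.3 * d * t)
y = rng.poisson(mu)

X = np.column_stack([np.ones(n), d, t, d * t])
beta = poisson_qmle(X, y)
print(beta[3])  # interaction coefficient; true value is 0.3
```

The interaction coefficient recovers the treatment effect on the log-mean scale, so `exp(beta[3])` is the proportional effect on the expected outcome.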
THURSDAY, FEBRUARY 8, 2024 | 3:30 p.m., C102 Fee Hall (Patenge Room), Zoom
Assistant Professor, Biostatistics and Bioinformatics
Emory University
"Causal Effect Estimation in the Presence of Unmeasured Confounders"
A commonly employed strategy for evaluating the average causal effect (ACE) of a treatment on an outcome of interest is through finding a ‘back-door’ adjustment set that blocks all the confounding paths between treatment and outcome.
However, the use of back-door covariate adjustment as a basis for inference is often untenable in observational studies due to the presence of unmeasured factors confounding the treatment-outcome relationship. An alternative strategy involves the use of directed acyclic graphs (DAGs) with hidden/unmeasured variables to encode independence restrictions between counterfactual and observed variables within a nonparametric model. This graphical approach has led to the development of sound and complete algorithms for identifying causal parameters based on the observed data distribution. The ‘front-door’ model, originally proposed by Pearl, is perhaps the simplest example of a DAG with unmeasured confounders where no valid back-door adjustment set exists, yet the causal effect can still be identified. The front-door criterion hinges on the existence of mediators that are unaffected by unmeasured confounders and fully mediate the treatment's effect on the outcome. We propose and evaluate a set of estimation strategies for the front-door functional, and for wider extensions of this functional based on the identification theory of DAGs with hidden variables, using non/semi-parametric theory and targeted minimum loss-based estimation. Our proposed estimators are applicable to various scenarios accommodating binary, continuous, and multivariate mediators. They address the limitations of existing approaches by using data-adaptive machine learning algorithms that encode fewer modeling assumptions while ensuring desirable statistical properties such as asymptotic linearity, double robustness, efficiency, and guaranteed estimates within the target parameter space. We establish conditions on the estimation of the nuisance functionals that are sufficient to ensure root-n consistency of the estimators of the ACE as the target parameter of inference.
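For binary treatment and mediator, the front-door functional has a simple plug-in form, E[Y(a)] = Σ_m P(M=m|A=a) Σ_a' P(A=a') E[Y|M=m, A=a']. The toy simulation below (all names and the data-generating process are invented; the talk's estimators are far more general and machine-learning based) shows that this plug-in recovers the ACE even though U confounds the treatment-outcome relationship.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
U = rng.random(n) < 0.5                         # unmeasured confounder
A = rng.random(n) < (0.3 + 0.4 * U)             # treatment, confounded by U
M = rng.random(n) < (0.2 + 0.6 * A)             # mediator: depends on A only
Y = rng.random(n) < (0.2 + 0.4 * M + 0.2 * U)   # outcome: depends on M and U

def front_door(a, A, M, Y):
    """Plug-in front-door functional for binary A and M:
    E[Y(a)] = sum_m P(M=m|A=a) * sum_a' P(A=a') * E[Y|M=m, A=a']."""
    total = 0.0
    for m in (0, 1):
        p_m_given_a = np.mean(M[A == a] == m)
        inner = sum(np.mean(A == ap) * np.mean(Y[(M == m) & (A == ap)])
                    for ap in (0, 1))
        total += p_m_given_a * inner
    return total

ace = front_door(1, A, M, Y) - front_door(0, A, M, Y)
print(ace)  # true ACE in this simulation is 0.4 * 0.6 = 0.24
```

A naive contrast `np.mean(Y[A == 1]) - np.mean(Y[A == 0])` would be biased here because U opens a back-door path that no measured covariate blocks.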
THURSDAY, FEBRUARY 22, 2024 | 3:30 p.m., Zoom
Research Associate, Biostatistics
Harvard School of Public Health
“Causal Estimation of Exposure Shifts with Neural Networks: Evaluating the Health Benefits of Stricter Air Quality Standards in the US”
In policy research, one of the most critical analytic tasks is to estimate the causal effect of a policy-relevant shift in the distribution of a continuous exposure/treatment on an outcome of interest. We call this problem shift-response function (SRF) estimation. Existing neural network methods involving robust causal-effect estimators lack theoretical guarantees and practical implementations for SRF estimation. Motivated by a key policy-relevant question in public health, we develop a neural network method, and its theoretical underpinnings, to estimate SRFs with robustness and efficiency guarantees. We then apply our method to data on 68 million individuals and 27 million deaths across the U.S. to estimate the causal effect of revising the US National Ambient Air Quality Standards (NAAQS), a change recently proposed by the US Environmental Protection Agency (EPA). Our goal is to estimate the reduction in deaths that would result from this anticipated revision using causal methods for SRFs. Our proposed method, called Targeted Regularization for Exposure Shifts with Neural Networks (TRESNET), contributes to the neural network literature for causal inference in two ways: first, it proposes a targeted regularization loss with theoretical properties that ensure double robustness and asymptotic efficiency specific to SRF estimation; second, it accommodates loss functions from the exponential family of distributions to handle non-continuous outcome distributions (such as hospitalization or mortality counts).
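The basic plug-in idea behind SRF estimation can be sketched in a few lines: fit an outcome model given exposure and covariates, then average its predictions after applying the policy shift to everyone's exposure. Here a least-squares linear model stands in for TRESNET's neural network, and the data-generating process and shift size are invented for illustration; the actual method adds targeted regularization for robustness and efficiency.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
x = rng.normal(size=n)                  # measured confounder
a = 0.5 * x + rng.normal(size=n)        # continuous exposure
y = 1.0 + 2.0 * a + x + rng.normal(size=n)  # outcome

# Fit the outcome model mu(a, x) by least squares (stand-in for a neural net).
D = np.column_stack([np.ones(n), a, x])
coef, *_ = np.linalg.lstsq(D, y, rcond=None)

def srf(delta):
    """Plug-in shift-response estimate: mean prediction under a -> a + delta."""
    D_shift = np.column_stack([np.ones(n), a + delta, x])
    return np.mean(D_shift @ coef)

effect = srf(-0.5) - srf(0.0)
print(effect)  # true shift effect is 2.0 * (-0.5) = -1.0
```

In the air-quality application, the shift would map each location's pollution exposure to its value under the tighter standard (e.g., capping exposures at the new limit) rather than subtracting a constant.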
THURSDAY, MARCH 14, 2024 | 3:30 p.m., C102 (Patenge Room), Zoom
Professor
Washington University School of Medicine in St. Louis
“High-Dimensional Quantile Mediation Analysis with Application to a Birth Cohort Study of Mother–Newborn Pairs”
There has been substantial recent interest in developing methodology for high-dimensional mediation analysis. Yet most statistical methods for mediation lean heavily on mean regression, which limits their ability to fully capture the complex mediating effects across the outcome distribution. To bridge this gap, we propose a novel approach for selecting and testing mediators across the full outcome distribution. Specifically, our high-dimensional quantile mediation model provides comprehensive insight into how potential mediators impact outcomes via their mediation pathways. The method’s efficacy is demonstrated through extensive simulations. We present a real-world data application examining the mediating effects of DNA methylation on the relationship between maternal smoking and offspring birthweight. Our method is implemented in the publicly available, user-friendly function qHIMA(), which can be accessed through the R package HIMA at https://CRAN.R-project.org/package=HIMA.
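To fix ideas, a single-mediator, low-dimensional toy of quantile mediation is sketched below: a quantile regression of the outcome on exposure and mediator at quantile tau, a mean regression of the mediator on exposure, and a product-method indirect effect. Everything here is invented for illustration (including the small IRLS quantile-regression routine); it does not reproduce qHIMA()'s high-dimensional mediator selection and testing.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000
a = rng.normal(size=n)                       # exposure
m = 0.8 * a + rng.normal(size=n)             # mediator
y = 0.3 * a + 0.5 * m + rng.normal(size=n)   # outcome

def quantile_reg(X, y, tau, iters=60):
    """Quantile regression via iteratively reweighted least squares on a
    smoothed check (pinball) loss; a small stand-in for a dedicated solver."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS start
    for _ in range(iters):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1 - tau) / np.maximum(np.abs(r), 1e-4)
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, X.T @ (w * y))
    return beta

X_y = np.column_stack([np.ones(n), a, m])
beta_tau = quantile_reg(X_y, y, tau=0.5)          # outcome model at the median
alpha = np.linalg.lstsq(np.column_stack([np.ones(n), a]), m, rcond=None)[0]
indirect = alpha[1] * beta_tau[2]                 # product-method indirect effect
print(indirect)  # true value 0.8 * 0.5 = 0.4
```

Repeating this at several values of tau traces how the mediated effect varies across the outcome distribution, which is the kind of pattern mean regression cannot reveal.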
National Cancer Institute
Division of Cancer Control & Population Sciences
“Causal inference in natural experiments: Challenges and opportunities”
Public health and primary prevention depend both on biomedical approaches to health and disease, such as sanitation, vaccination, and behavioral interventions addressing fundamental causes of disease, and on policy and programmatic approaches addressing aspects of the built, natural, economic, and social environment that influence health and health behaviors. These latter aspects are not always amenable to randomized trials establishing efficacy and effectiveness, yet they may well be the most important tools we have to address the causes of disease. Increasingly, NIH has recognized the importance of rigorous policy evaluation and the need for wider acceptance and use of natural or quasi-experiments to establish an evidence base for policy approaches to health. In this seminar, Dr. Berrigan will share an example of the challenges of evaluating natural experiments, describe the NIH portfolio of time-sensitive natural experiments addressing obesity policy, and describe funding opportunities for this kind of work. Rigorous evaluation of policy approaches to public health is vital to inform policy makers and support effective strategies to improve health and reduce health disparities.
THURSDAY, APRIL 11, 2024 | 3:30 p.m., E111 Fee Hall, Zoom
MSU Foundation Professor
MSU College of Education
“Robustness of Inference to Replacement & Fragility for Logistic Regression and Hazard Functions”
Many studies in epidemiology concern outcomes such as whether a patient survived or how long a patient lived until relapse. As such, logistic regression and hazard models are often applied. But it can be a challenge to interpret the uncertainty of inferences from such models.
Standard errors are complex and often approximate, and likelihood ratio tests are not intuitive to many. Here we present methods for characterizing the sensitivity of inferences based on the Robustness of Inference to Replacement (RIR) framework. This approach generates statements such as “To nullify the inference, __% of the treatment successes would have to be replaced with cases for which the treatment had no effect” (Frank et al.). Or, for hazard functions, “To change the inference, __% of the control cases would have to be replaced with cases at the median survival time of the treatment group.” These techniques allow a broad set of stakeholders, who can think in terms of the experiences of patients, to engage in conversation about statistical inferences.
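A back-of-the-envelope version of this reasoning is the “percent bias to invalidate the inference” quantity from Frank's sensitivity framework: the share of the estimated effect that exceeds the decision threshold, which is roughly the fraction of the evidence that would have to be replaced with zero-effect cases to nullify the inference. The full RIR calculations for logistic regression and hazard functions presented in the talk are more involved; the numbers below are purely illustrative.

```python
def percent_to_invalidate(estimate, threshold):
    """Fraction of the estimate that is surplus beyond the decision threshold.

    If |estimate| <= |threshold|, the inference is already not supported,
    so nothing needs to be replaced (returns 0.0).
    """
    if abs(estimate) <= abs(threshold):
        return 0.0
    return 1.0 - abs(threshold) / abs(estimate)

# Illustrative numbers: an estimated effect of 0.60 against a significance
# threshold of 0.25 on the same scale.
print(percent_to_invalidate(0.60, 0.25))  # about 0.583, i.e. ~58% replaced
```

Statements of this form ("58% of the cases would have to be replaced") are what let non-statistician stakeholders reason about an inference in terms of patients rather than p-values.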