Child Care and Early Education Research Connections
How much bias results if a quasi-experimental design combines local comparison groups, a pretest outcome measure and other covariates?: A within study comparison of preschool effects
This study uses a within study comparison (WSC) design to conduct a novel test of how much causal bias results when researchers use a nonequivalent comparison group design type (NECGD) that combines: (a) a comparison group local to the treatment group; (b) a pretest measure of the study outcome; and (c) a rich set of 19 other multidimensional covariates. Most prior WSCs have dealt with the bias consequences of only 1 of these, revealing that each routinely reduces bias but does not necessarily eliminate it. Thus, a need exists to identify NECGDs that more robustly eliminate bias. This study is the first to examine how combining the 3 bias-control mechanisms above affects bias. The intervention we examine is a prekindergarten mathematics curriculum, for which a randomized control trial (RCT) produces a positive 1-year math effect. Final bias in the NECGD is assessed as the difference between its impact estimate and that of the RCT when each design has the same intervention, outcome, and estimand. Over the many specifications we explore, NECGD bias is less than .10 standard deviations, indicating that minimal bias results when an NECGD combines all 3 design elements. The factorial design we use in this study also tests the bias associated with 7 other NECGD types. Comparing the total pattern of results shows that the minimal bias when all 3 elements are combined is uniquely attributable to the locally chosen comparison group and not to the availability of a pretest or other covariates. In actual research practice, it is impossible to predict in advance which design elements will reduce bias, and by how much, in any given application. So further research is needed to probe whether the simultaneous use of all 3 design elements achieves minimal bias dependably across diverse applications, and not just in the preschool math context examined here.
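The core within-study-comparison logic described above can be sketched numerically: estimate the treatment effect from the RCT, estimate it again from the nonequivalent comparison group, and express their difference in standard-deviation units so it can be judged against the .10-SD criterion. The sketch below uses entirely hypothetical scores (none of the study's data) and simple difference-in-means estimators, so it illustrates the bias definition only, not the study's actual estimation models.

```python
# Minimal sketch of the within-study comparison bias calculation.
# All scores below are hypothetical and for illustration only.
from statistics import mean, pstdev

# Post-test math scores for the randomized benchmark (hypothetical).
rct_treatment = [52.0, 55.1, 49.8, 57.3, 53.6]
rct_control   = [48.2, 50.5, 47.1, 51.9, 49.4]

# The NECGD reuses the treatment group but swaps in a local,
# nonrandomized comparison group (also hypothetical).
necgd_treatment  = [52.0, 55.1, 49.8, 57.3, 53.6]
necgd_comparison = [48.3, 50.6, 47.2, 52.0, 49.5]

def effect(treated, comparison):
    """Simple difference-in-means treatment effect estimate."""
    return mean(treated) - mean(comparison)

rct_effect   = effect(rct_treatment, rct_control)
necgd_effect = effect(necgd_treatment, necgd_comparison)

# Bias = nonexperimental estimate minus experimental benchmark,
# standardized by the control group's SD so it is in SD units and
# comparable to the study's .10-SD threshold.
bias_sd = (necgd_effect - rct_effect) / pstdev(rct_control)

print(f"RCT effect: {rct_effect:.2f}")
print(f"NECGD effect: {necgd_effect:.2f}")
print(f"Standardized bias: {bias_sd:.3f}")
```

In this fabricated example the standardized bias falls under .10 SD, which is the pattern the study reports when all three design elements (local comparison group, pretest, rich covariates) are combined.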
Translational Abstract
This study examines when nonexperiments might substitute for experiments that are done in real-world settings in order to learn what works to affect some socially valued outcome. The study probes whether a similar result is achieved in an experiment and in a nonexperiment that lacks the randomly formed control group of the experiment but has instead a nonequivalent control group. However, this nonequivalent group is chosen locally to the treatment group in hopes of reducing initial group differences and of matching on whichever policies and practices are set locally. Moreover, the nonexperiment in question has a pretest measure of the study outcome and a "rich" set of other multidimensional measures with which to model whatever differences remain after the nonequivalent treatment and control groups have been selected from the same local pool. The study shows that the experimental and nonexperimental estimates are very close if the nonexperiment is defined in terms of a locally chosen comparison group, a pretest measure of the study outcome, and a rich set of other preintervention measures. This result now needs replicating. (author abstract)
Reports & Papers
Related resources include summaries, versions, measures (instruments), or other resources in which the current document plays a part. Research products funded by the Office of Planning, Research, and Evaluation are related to their project records.
You May Also Like
These resources share similarities with the current selection. They are found by comparing the topic, author, and resource type of the currently selected resource to the rest of the library’s publications.
A portrait of Head Start classrooms and programs in spring 2020: FACES 2019 descriptive data tables and study design
Reports & Papers
Linking survey data with commercial or administrative data for data quality assessment
Reports & Papers
Quality is critical for meaningful synthesis of afterschool program effects: A systematic review and meta-analysis
Reports & Papers