Date of Award

Spring 2020

Degree Name

Doctor of Philosophy (PhD)



Committee Chair

Dr. Kyna Shelley

Committee Member 2

Dr. Peter Paprzycki

Committee Member 3

Dr. Richard Mohn

Committee Member 4

Dr. Thomas Lipscomb

Committee Member 5

Dr. Thomas V. O’Brien


Evidence on the effectiveness of technology in supporting post-secondary students’ learning of introductory statistics concepts is inconclusive. Current investigations lack consideration of the synergies among technology, content, and pedagogy that influence learning outcomes in statistics education. The current study used meta-analytic procedures to address the gap between theory and practice concerning the best evidence of effective instructional practices in technology-enhanced introductory statistics classrooms. A conceptual framework based on the ADDIE model, TPACK, and constructivism guided the investigation of substantive study characteristics related to instructional design.

Findings were based on 32 studies published between 1998 and 2018 that used quasi-experimental or experimental research designs and measured statistics achievement. Hedges’ g effect sizes were computed for each study included in the meta-analysis. Random-effects analysis revealed a small average effect of 0.23 favoring technology use over no-technology control conditions. Mixed-effects results identified instructional design characteristics that were significant moderators favoring technology use. Concerning the learning context, significant effects were found among studies with undergraduate student samples (0.45), discipline-specific courses (0.31), learning goals associated with statistical literacy, thinking, or reasoning (0.42), and learning goals of statistical skills/concepts (0.28). Regarding content, design, and duration, significant effects were found among studies covering descriptive statistics or null hypothesis testing (0.74), studies using technology designed by the instructor (0.30), and studies using technology for a semester or longer (0.25). Significant effects for instruction implementation included the use of varied learning tasks (0.33), students’ cooperative, collaborative, or collective engagement (0.38), use of scaffolding (0.36), and use of technology with multiple functions for covering concepts (0.42). Concerning assessment, significant effects were found for studies using multiple formative assessment measures (0.34) and those using non-authentic assessments (0.28).
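The two computations named above, Hedges’ g for each study and a random-effects pooled average, can be sketched as follows. This is a minimal illustration only: the function names are invented for this sketch, and the DerSimonian-Laird estimator of between-study variance is one common choice, not necessarily the estimator used in the dissertation.

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Hedges' g and its approximate sampling variance for two groups."""
    # Pooled standard deviation and Cohen's d
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    # Small-sample correction factor J converts d to g
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    g = j * d
    var = j**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return g, var

def random_effects_pool(effects, variances):
    """Pooled effect under a random-effects model (DerSimonian-Laird tau^2)."""
    w = [1 / v for v in variances]
    pooled_fe = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    # Q statistic measures observed heterogeneity across studies
    q = sum(wi * (g - pooled_fe)**2 for wi, g in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight each study with between-study variance added
    w_star = [1 / (v + tau2) for v in variances]
    return sum(wi * g for wi, g in zip(w_star, effects)) / sum(w_star)
```

For example, two groups of 20 with means 10 and 8 and common SD 2 give g of roughly 0.98, slightly below the uncorrected Cohen’s d of 1.0 because of the small-sample correction.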

Non-significant results were found for report and methodological characteristics, except for studies whose description of the instructional design process was somewhat replicable (0.36). Sensitivity analyses did not indicate publication bias. However, the findings should be interpreted with the caveat that they are based mostly on studies with quality ratings indicating an unclear risk of bias (63%). Findings are discussed in light of the literature, and implications and recommendations for future research are provided.
