Association for Behavior Analysis International

The Association for Behavior Analysis International® (ABAI) is a nonprofit membership organization with the mission to contribute to the well-being of society by developing, enhancing, and supporting the growth and vitality of the science of behavior analysis through research, education, and practice.

2025 Single Case Conference; Minneapolis, MN; 2025

Program by Workshops: Sunday, September 28, 2025


 

Workshop #W11
Interactive Workgroups: Facilitating Single-Case Experimental Design (SCED) Replication Research: A Path Forward
Sunday, September 28, 2025
9:00 AM–9:50 AM
Embassy Suites Minneapolis; Topaz/Turquoise/Opal
Area: SCI; Domain: Theory
CE Instructor: Jason Travers, Ph.D.
JASON TRAVERS (Temple University), MATTHEW TINCANI (Temple University)
Description:

Single-case experimental design researchers have long emphasized the importance of replicating effects within and across participants. Replication is critical for both internal and external validity. One potential implication of this dual role is that single-case researchers may experience a trade-off between establishing robust internal validity through clear demonstration of experimental control and exploring the boundary conditions of interventions. The emphasis on strong, clear experimental control, together with an aversion to publishing null or ambiguous results, may stifle precise understanding of external validity. Single-case research has produced considerable advances toward better understanding of effective interventions and improving skills for individuals requiring support; however, serious challenges remain in defining what works for whom and under what conditions.

There is evidence that SCED studies not demonstrating robust experimental control are less likely to be published. As a result, some SCED researchers may exclude data showing weak or no experimental control from research reports submitted for peer review. In other cases, researchers may omit procedural components from study descriptions to enhance the appearance of a robust experimental effect, resulting in effects being attributed entirely to the described intervention(s). Traditional views of replication, valuable as they are, may not be sufficient to advance understanding about when, how, and why interventions are less effective or ineffective. This presentation will focus on what SCED researchers can do, individually and as a community, to address these and related challenges.

In the first part of the presentation, we will briefly review concerns about replication identified in research domains outside of single-case research (e.g., “the replication crisis”) and how these map onto SCED research. We will discuss examples of systematic replication in thematic lines of research and how these culminate in the identification of interventions and strategies as evidence-based. We will then illustrate potential ways the function of replication might be strengthened or weakened through researcher decisions.

In the second part of the presentation, we will present research scenarios and datasets that highlight the described challenges with replication. For example, we will present SCED datasets to attendees and ask them to make hypothetical procedural and reporting decisions based on the data. We will also discuss editorial and reviewing scenarios involving research reports reflecting varying degrees of experimental control. Attendees and presenters will collectively explore solutions to these scenarios and identify specific action steps to support a robust and valid SCED research literature.

Learning Objectives:
1. Participants will describe key challenges with replication in the SCED research literature.
2. Participants will identify features of SCED studies exhibiting varying degrees of experimental control, and factors affecting decisions regarding interpretation and reporting.
3. Participants will describe peer review and editorial considerations for disseminating SCED replication studies, including those with weak or no experimental control.
Activities: The audience will break into small groups for interactive discussions and applied activities related to Drs. Travers & Tincani's presentation.
Audience: Single-case researchers, reviewers, editors, and those who teach single-case designs
Content Area: Methodology
Instruction Level: Advanced
