47th Annual Convention; Online; 2021
All times listed are Eastern time (GMT-4 at the time of the convention in May).
|The Nitty Gritty of ABA Research: Special Topics in Single Subject Design
|Monday, May 31, 2021
|3:00 PM–3:50 PM
|Area: TBA; Domain: Translational
|Chair: Judah B. Axe (Simmons University)
|CE Instructor: Judah B. Axe, Ph.D.
|Abstract: Although textbooks on applied behavior analysis and single subject design contain clear guidelines for graphing data and designing studies, there are nuances of these enterprises that deserve further exploration. The first paper addresses a question that researchers, practitioners, and instructors face when graphing: Which graphing conventions are most important? The authors report and discuss survey data on behavior analysts’ ratings of the importance of different graphing conventions and correlations with demographic variables. The parallel treatments design is uncommon, not found in many textbooks, and not on the task list of the Behavior Analyst Certification Board. Therefore, the second paper is a review of studies that used this design, along with recommendations for future use. Finally, researchers often combine single subject designs, yet there are few guidelines for combining designs in the literature. Therefore, the third paper is a review of studies that combined experimental designs, along with analyses and recommendations. The purpose of these papers is to help guide behavior analysts in graphing data and designing studies.
|Instruction Level: Intermediate
|Keyword(s): combining designs, experimental designs, graphing, parallel treatments
|Target Audience: behavior analysis students, researchers, practitioners
|Learning Objectives: At the conclusion of the presentation, participants will be able to:
1. Describe which single subject design graphing conventions were rated most important and which demographic variables correlated with those ratings.
2. Define the parallel treatments design, explain the extent to which researchers adhered to its defining features, and describe recommendations for future directions.
3. Describe how and why researchers combine single subject designs.
|Graphing Conventions for Behavior Analysts: Demographic Variables Associated With Ratings of Importance
|KENDRA GUINNESS (Simmons University), Kylan S. Turner (Simmons University), Philip N. Chase (Simmons University), Judah B. Axe (Simmons University)
|Abstract: Behavior analysts should graph according to behavior analytic conventions, but the extent to which there is agreement on conventions is unclear. Graphing conventions include aesthetic features of individual graph elements as well as the positioning of graph elements in relation to one another. The current study examined which graphing conventions behavior analysts report are most important, and whether demographic variables were associated with ratings of importance. A web-based survey was completed by 631 Board Certified Behavior Analysts. Five graphing conventions were rated as very important by 80% or more of participants. Ratings of importance varied considerably for the remaining 32 conventions. Further analyses revealed that differences in ratings were associated with several demographic variables, including credential, primary work setting, and education level. These results suggest that graphing conventions are used inconsistently across the field of behavior analysis, and implications for future research and training new behavior analysts are discussed.
|A Systematic Review of Adherence to the Defining Features of the Parallel Treatments Design: Is it Still a Thing?
|SARAH FRAMPTON (Simmons University; May Institute, Inc.), Kendra Guinness (Simmons University), Judah B. Axe (Simmons University)
|Abstract: The identification of interventions that are both effective and efficient is an ongoing need in the practice of applied behavior analysis. The parallel treatments design (PTD) has been described as a powerful and useful tool for comparing interventions in applied settings. The PTD includes elements of the multiple probe design (MPD) and the adapted alternating treatments design (AATD). Execution of a PTD requires adherence to experimental tactics related to both designs, as well as adherence to particular features outlined by the originating authors (Gast & Wolery, 1988). The purpose of this systematic literature review was to evaluate (1) publication trends with the PTD; (2) applications of the PTD across behaviors and interventions; and (3) the extent to which researchers using the PTD adhered to its defining features. Outcomes are discussed with respect to the utility of the PTD and relative contributions to single-subject design research.
|A Review of Combining Single-Case Experimental Designs in Applied Behavior Analysis
|OLGA MELESHKEVICH (Simmons University), Judah B. Axe (Simmons University)
|Abstract: Many researchers using single-case experimental designs (SCEDs) combine two or more experimental designs when examining a research question, such as embedding a multielement design within a reversal design. Combining SCEDs allows researchers to study complex behavioral processes, demonstrate strong experimental control, and conduct a demonstrative analysis when within-experiment comparative, parametric, and component analyses produce limited results. As a means of commenting on the use of combined SCEDs, and because we found no prior review papers on combining SCEDs, we examined the extent to which researchers combined SCEDs in Volume 52 (2019) of the Journal of Applied Behavior Analysis. Results suggest that 18 of 71 (25%) articles contained combined SCEDs. The most prevalent combination was a multielement design within a multiple baseline design across subjects, and the most frequent type of research question was comparative. We provide recommendations on combining SCEDs in terms of controlling extraneous variables, assessing stimulus generalization, and providing both demonstrative and comparative analyses.