|HAL 9000 or R2-D2?: Accessibility of Advanced Data Analytic Techniques for Behavior Analysts|
|Monday, May 30, 2022|
|3:00 PM–4:50 PM |
|Meeting Level 2; Room 254B|
|Area: AUT/DDA; Domain: Translational|
|Chair: Jonathan E. Friedel (Georgia Southern University)|
|Discussant: Brent Kaplan (University of Kentucky)|
|CE Instructor: Shawn Patrick Gilroy, Ph.D.|
With modern computers, there is an ever-increasing promise of complex data analyses designed for behavior analysts and the types of data we collect. However, many of these useful techniques remain out of the grasp of the average behavior analyst because they do not yet exist as functional tools; they remain only promises. A wholly different behavioral repertoire related to data analysis and computer programming is necessary to translate those promises into functional tools. The goal of this symposium is to highlight the growing effort within behavior analysis to develop useful data analytic tools and applications for ourselves. Topics will cover efforts to analyze behavioral data in relation to environmental variables outside of the behavior analyst’s control (e.g., client medications), neural network computing to analyze multiple baseline data, methods to develop decision support systems for functional analysis, and development of a system for charting single-subject design data that can be easily extended to support advanced statistical analysis.
|Instruction Level: Intermediate|
|Keyword(s): computing, data analysis, statistics|
|Target Audience: |
Attendees should be familiar with pharmacotherapy and should have knowledge of multiple baseline designs and functional analyses.
|Learning Objectives: At the conclusion of the presentation, participants will be able to: 1) describe some new methods to analyze behavioral data, 2) list some of the benefits of computer aided data analysis, and 3) compare and contrast traditional visual analysis with computer aided data analysis.|
|Demonstrating an Analysis of Clinical Data Evaluating Psychotropic Medication Reductions and the ACHIEVE! Program in Adolescents With Severe Problem Behavior|
|ALISON COX (Brock University), Duncan Pritchard (Aran Hall School), Heather Penney (AmethystABA), Llio Eiri (Aran Hall School), Tim J. Dyer (Aran Hall School)|
|Abstract: Researchers report increasing trends in psychotropic medication use to treat problem behavior in individuals with intellectual and developmental disabilities, despite some controversy regarding its application and treatment efficacy. While a substantial evidence base supports the efficacy of behavioral intervention, research evaluating separate and combined intervention effects (i.e., concurrent application of behavioral and psychopharmacological interventions) is scarce. This talk demonstrates a series of analyses using the clinical treatment data of four adolescent males who engaged in severe problem behavior to retrospectively explore separate and combined intervention effects. First, we calculated individual effect sizes and corresponding confidence intervals. The results indicated that larger decreases in problem behavior may have coincided more often with behavioral intervention adjustments than with medication adjustments. Second, a conditional rates analysis indicated that surges in problem behavior may not have reliably coincided with medication reductions. Spearman correlation analyses indicated a negative relationship between behavioral intervention phase progress and weekly episodes of problem behavior, and a positive relationship between total medication dosage and weekly episodes of problem behavior. However, a non-parametric partial correlation analysis indicated that individualized, complex relationships may exist among total medication dosage, behavioral intervention, and weekly episodes of problem behavior. Although our conclusions are tentative, we will discuss potential clinical implications, as well as the rationale for behavioral researchers and practitioners to apply creative analytic strategies when evaluating the separate and combined effects of interventions on problem behavior, to further explore this understudied topic.|
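As a rough illustration of the rank-based analyses described above, the following Python sketch runs Spearman correlations on hypothetical weekly data. All variable values, phase codings, and dosages here are invented for illustration and are not the study's data, and the `partial_spearman` helper is a simplified stand-in for a non-parametric partial correlation, not the authors' procedure.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical weekly clinical data (illustrative only; not the study's data)
phase = np.array([1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4])    # behavioral intervention phase
dose_mg = np.array([6, 6, 6, 5, 5, 4, 4, 3, 3, 2, 2, 2])  # total medication dosage
episodes = np.array([14, 12, 15, 10, 9, 11, 7, 6, 8, 4, 5, 3])

# Spearman rank correlations, as in the talk's analyses
rho_phase, p_phase = spearmanr(phase, episodes)  # negative: episodes fall as phases advance
rho_dose, p_dose = spearmanr(dose_mg, episodes)  # positive: episodes fall with dosage

def partial_spearman(x, y, z):
    """Spearman correlation of x and y controlling for z:
    rank-transform all three, residualize the ranks of x and y
    on the ranks of z, then correlate the residuals."""
    rx, ry, rz = (np.argsort(np.argsort(v)).astype(float) for v in (x, y, z))
    resid = lambda a, b: a - np.polyval(np.polyfit(b, a, 1), b)
    r, _ = spearmanr(resid(rx, rz), resid(ry, rz))
    return r
```

Note that the partial correlation can flip sign or shrink relative to the simple correlations, which is the kind of "individualized, complex relationship" the abstract alludes to.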
|Artificial Neural Networks to Analyze the Results of Multiple Baseline Designs|
|MARC J. LANOVAZ (Université de Montréal)|
|Abstract: Since the start of the 21st century, few advances have had as far-reaching an impact in science as the widespread adoption of artificial neural networks in fields as diverse as fundamental physics, clinical medicine, and psychology. In behavior analysis, one promising area for the adoption of artificial neural networks involves the analysis of single-case experimental designs. The purpose of our study was to compare the predictions produced by an artificial neural network with more traditional methods of analysis. To this end, we trained a new model using 100,000 samples generated with a Monte Carlo simulation to analyze multiple baseline graphs and compared its outcomes with those produced by visual raters and the dual-criteria method. Using artificial neural networks improved power by more than 15% whereas the Type I error rate remained consistent across all three methods. Our results suggest that researchers may use artificial neural networks to develop novel models to analyze the outcomes of single-case experimental designs.|
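For readers unfamiliar with the dual-criteria comparison method named above, here is a minimal Python sketch of its core logic (after Fisher, Kelley, and Lomas, 2003). The function name and example values are ours, and the binomial test used to judge whether the resulting count is significant is omitted.

```python
import numpy as np

def dual_criteria(baseline, treatment, decreasing=False):
    """Dual-criteria (DC) sketch: project the baseline mean line and the
    baseline least-squares trend line into the treatment phase, then count
    treatment points falling beyond BOTH lines in the expected direction."""
    baseline = np.asarray(baseline, dtype=float)
    treatment = np.asarray(treatment, dtype=float)
    x_base = np.arange(len(baseline))
    x_treat = np.arange(len(baseline), len(baseline) + len(treatment))
    # Criterion 1: mean line extended from baseline
    mean_line = np.full(len(treatment), baseline.mean())
    # Criterion 2: least-squares trend line extended from baseline
    slope, intercept = np.polyfit(x_base, baseline, 1)
    trend_line = slope * x_treat + intercept
    if decreasing:
        beyond = (treatment < mean_line) & (treatment < trend_line)
    else:
        beyond = (treatment > mean_line) & (treatment > trend_line)
    return int(beyond.sum())
```

In the full method, the count returned here is compared against a binomial criterion to decide whether a reliable treatment effect is present.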
|Automating Functional Analysis Interpretation II: Better Approximating an Expert Human Rater|
|JONATHAN E. FRIEDEL (Georgia Southern University), Alison Cox (Brock University)|
|Abstract: Functional analysis (FA) has been an important tool in behavior analysis. The goal of an FA is to determine problem behavior function (e.g., access to attention) so that treatment can be designed to specifically target causal mechanisms (e.g., teaching a socially appropriate response for attention). Behavior analysts traditionally rely on visual inspection to interpret the results of an FA. However, existing literature suggests interpretations can vary across clinicians, resulting in poor interobserver agreement (Danov & Symons, 2008; Ninci et al., 2015). To increase objectivity and address interrater agreement across FA outcomes, Hagopian et al. (1997) created visual-inspection criteria to be used for FAs. Hagopian and colleagues reported improved interobserver agreement, but limitations of the criteria were noted. Therefore, Roane et al. (2013) addressed these limitations when they created a modified version. Cox and Friedel (2020) described a computer script designed to automatically interpret functional analyses based on the above-mentioned criteria. In that study, the authors noted several instances where the script provided incorrect interpretations because an experienced human rater would deviate from a strict application of the criteria. Here, we outline further refinement of the script to produce more accurate FA interpretations.|
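To give a flavor of what a criterion-based interpretation script does, the Python sketch below shows a deliberately simplified version of criterion-line logic. The 1-SD band around the control mean, the net-proportion rule, and the 0.5 threshold are assumptions chosen for illustration; the published Hagopian et al. (1997) and Roane et al. (2013) criteria contain additional rules that this sketch does not implement, and it is not the script described in the talk.

```python
import numpy as np

def criterion_line_check(control, test, threshold=0.5):
    """Simplified criterion-line sketch: draw lines one standard deviation
    above and below the control-condition mean, then compute the proportion
    of test-condition points above the upper line, net of points below the
    lower line. Flag the test condition when that net proportion meets the
    threshold. (Illustrative only; not the published criteria.)"""
    control = np.asarray(control, dtype=float)
    test = np.asarray(test, dtype=float)
    upper = control.mean() + control.std(ddof=1)
    lower = control.mean() - control.std(ddof=1)
    net = ((test > upper).sum() - (test < lower).sum()) / len(test)
    return net >= threshold
```

The refinement problem the talk describes is visible even here: a strict rule like this one will flag (or fail to flag) borderline graphs that an experienced rater would interpret differently.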
|Integrating Visual and Statistical Analysis With R: Fast, Efficient, Pixel-Perfect Charting With the fxl R Package|
|SHAWN PATRICK GILROY (Louisiana State University)|
Applied work in Behavior Analysis is moving towards regularly integrating quantitative metrics in the design, delivery, and evaluation of behavioral interventions. Efforts in this area are constrained by the tools available to practitioners. Whereas commercially available spreadsheet software supports robust charting capabilities, only the most basic types of computations are supported. Furthermore, this approach cannot be fully automated and places significant demands on the analyst. The approach presented here leverages the capabilities of the free, open-source R program to support both quantitative as well as the existing charting conventions (e.g., style, formatting) expected of commercially available spreadsheet software. The combination of which is a toolset that supports both visual analysis and the integration of robust statistical methods (e.g., multi-level modeling). This paper ends with a discussion on the importance of statistical consultation and training and exploration of free and open-source alternatives to commercial software packages.