Some Current Approaches to Behavior Analytic Training and Structured Decision-Making Models: Behavior Skills Training, Computers, and Consecutive Case-Series Analyses
Sunday, May 26, 2019
5:00 PM–6:50 PM
Fairmont, Lobby Level, Rouge
Area: TBA/EDC; Domain: Translational
Chair: Tyra Paige Sellers (Behavior Analyst Certification Board)
Discussant: Joseph Michael Lambert (Vanderbilt University)
Implementing behavior analytic interventions requires many skills, including the ability to collect, graph, and analyze data to make decisions. Applying a behavior analytic approach to designing and implementing training and clinical decision-making models can produce desirable results. In the first study, results indicated that a training package that includes Behavioral Skills Training components (i.e., instruction, modeling, rehearsal, feedback) is an effective method for training day-treatment center staff to implement a multiple stimulus without replacement preference assessment. In the second and third studies, results indicated that computer-based instruction was an effective method for training behavior analytic skills. In the second study, participants were trained through computer-based instruction modules to create simple line graphs with embedded phase change lines. In the third study, behavior technicians were trained through computer-based instruction to analyze functional analysis data and decide whether additional sessions were necessary. Finally, a fourth study demonstrated the efficacy of a data-driven process for analyzing undifferentiated functional analysis results to inform modifications to session variables.
Instruction Level: Basic
Keyword(s): assessment, computer-based instruction, functional analysis, staff training
The Effects of Didactic Training and Behavioral Skills Training on Staff Implementation of a Stimulus Preference Assessment With Adults With Disabilities
SANDRA SMITH (Utah State University), Kerry Abigail Shea (Utah State University), Tyra Paige Sellers (Behavior Analyst Certification Board)
Direct care staff working in adult day treatment centers typically receive didactic training to learn various behavior change and assessment procedures. Didactic training, however, is not an empirically supported method for achieving reliable treatment integrity. This study used a multiple baseline design across participants to compare the effects of didactic training and behavioral skills training on the treatment integrity scores of direct care staff implementing a stimulus preference assessment with adults with disabilities in a day treatment center. Under both training methods, direct care staff were taught to implement a multiple stimulus without replacement assessment with confederates and with clients with disabilities. Results indicated that behavioral skills training produced treatment integrity mastery for all participants, whereas no participant reached the mastery criterion following didactic training. Implications for staff training will be discussed.
Interactive Computer Training for Graphing Embedded Phase Change Lines in Microsoft Excel
KERRY ABIGAIL SHEA (Utah State University), Seth Walker (Utah State University), Tyra Paige Sellers (Behavior Analyst Certification Board)
Graphing data is an essential skill for those implementing behavior analytic interventions. The current investigation evaluated the effects of an interactive computer training on graphing skills using a multiple-baseline design across four participants. The computer training included four modules based on a modified version of the task analysis for embedding phase change lines from Deochand, Costello, and Fuqua (2015). Each module included instructions, video demonstrations, opportunities to practice, and prompts to self-monitor performance. Participants completed the modules independently. During baseline sessions, participants were given a data set, a case scenario, and a model graph, and had up to 20 minutes to create a graph that included the components in the model. Post-training sessions were identical to baseline except that participants could use self-monitoring checklists and task analyses during sessions. Results indicated that all participants created graphs to the mastery criterion. During a two-week maintenance check, participants created a graph to mastery only during the session in which notes were available. Participants completed the training in an average of 1 hour, 43 minutes. Future directions and recommendations for using computerized instruction to teach graphing skills will be discussed.
Teaching Behavior Technicians to Interpret Functional Analyses Using Ongoing Visual Inspection
LAUREN PHILLIPS (University of Nebraska Medical Center), Billie Retzlaff (University of Nebraska Medical Center's Munroe-Meyer Institute), Wayne W. Fisher (Munroe-Meyer Institute, University of Nebraska Medical Center), Ashley Marie Fuhrman (University of Nebraska Medical Center), Alexandra Hardee (University of Nebraska Medical Center's Munroe-Meyer Institute)
Functional analyses (FAs) identify variables that evoke and maintain target behaviors. Clinicians use FA results to introduce function-based interventions, increase the effectiveness of reinforcement-based procedures, and decrease reliance on punishment. The benefits of the FA are well established; however, many clinicians rarely conduct FAs due to time constraints. Therefore, interventions that decrease the time required to conduct an FA are valuable to the field. Structured criteria increase the consistency (Hagopian et al., 1997) and efficiency of FA interpretation when applied in an ongoing manner (i.e., ongoing visual inspection [OVI]; Saini, Fisher, & Retzlaff, 2018). Module-based training maximizes teaching opportunities while minimizing demands on a behavior analyst's time. During pretests, participants inspected FA graphs one series at a time and selected whether to continue or end the FA. During OVI training, participants completed module-based training and referenced a sheet of rules. Posttests were identical to pretests except that participants had their rules sheet and the FA graphs contained the criterion lines from the training. Training alone increased correct responding to the mastery criterion for five participants; one participant required the addition of feedback and programmed reinforcement. These results demonstrate the effectiveness of module-based OVI training.
Toward a Quantitative Decision-Making Process for Clarifying Inconclusive Multielement Functional Analysis Outcomes
CRAIG STROHMEIER (Kennedy Krieger Institute; Johns Hopkins University School of Medicine), Mirela Cengher (Kennedy Krieger Institute; Johns Hopkins University School of Medicine), Michelle D. Chin (Kennedy Krieger Institute), Jennifer R. Zarcone (The May Institute)
Despite numerous studies outlining broad categories of best practices in the functional analysis (FA) of problem behavior (e.g., inclusion of programmed antecedents and consequences [Hanley, Iwata, & McCord, 2003]), FA outcomes are sometimes inconclusive. Recently, investigators have provided guidelines for best practices when FAs are inconclusive (Rooker, DeLeon, Borrero, Frank-Crawford, & Roscoe, 2015) and evidence for the FA modifications most likely to produce differentiated outcomes (Hagopian, Rooker, Jessel, & DeLeon, 2013). Currently, however, no data-informed strategies exist to guide clinicians' decision-making when selecting categories of FA modifications such as design (i.e., pairwise), antecedent, and/or consequence variables to obtain differentiated FA results. In the current study, we used a controlled consecutive case-series design within a clinical database to identify cases that included both multielement and pairwise FAs. We used structured criteria evaluation to derive a data-informed process for making FA modifications that produce differentiated results. Structured criteria quotients (SCQs) were analyzed to determine whether multielement FA SCQs were predictive of differentiated outcomes during subsequent modified FAs. Preliminary results suggest that a positive predictive quotient score may be used to inform FA changes that produce differentiated outcomes. Between- and within-subject analyses will be presented, as well as possible recommendations for practice.