Evaluating Methods of Training Behavior Assessment and Intervention Protocols
Sunday, May 24, 2015
3:00 PM–4:50 PM
214A (CC)
Area: PRA/TBA; Domain: Service Delivery
Chair: Brenda J. Bassingthwaite (The University of Iowa Children's Hospital)
Discussant: David M. Richman (Texas Tech University)
CE Instructor: David M. Richman, Ph.D.
Abstract: Behavior analysts often train others in assessment practices and in intervention protocols. Understanding the effectiveness of different training delivery methods may lead to better outcomes. This symposium includes four papers that evaluate the outcomes of training conducted through different delivery methods. Two papers focus on training in assessment practices, and two focus on training in intervention protocols. Schwartz et al. evaluated factors contributing to trainees' acquisition of the skills needed to conduct experimental analyses independently, using a training model that incorporated both didactic and in-vivo training. Schnell et al. evaluated the use of a computer-based training module to teach individuals to analyze functional analysis data. Gibson et al. evaluated the effects of in-vivo feedback on caregivers' ability to implement a feeding protocol. Abby et al. evaluated the effects of training undergraduate students to implement early intervention protocols while manipulating the training delivery method (in-vivo vs. telemedicine). Together, the four papers discuss considerations for behavior analysts who provide training in assessment and intervention protocols.
Keyword(s): Caregiver training

Understanding Trainee Skill Development in Behavior Assessment When Participating in Iowa's Challenging Behavior Service
JESSICA EMILY SCHWARTZ (The University of Iowa), Brenda J. Bassingthwaite (The University of Iowa Children's Hospital), Shaun Wilkinson (The University of Iowa), David P. Wacker (The University of Iowa), Sean D. Casey (The Iowa Department of Education)
Abstract: Since 2009, the Iowa Department of Education (DOE) has been working to improve services for students with challenging behavior in Iowa schools through the Challenging Behavior Service (CBS). CBS is a project funded by the Iowa DOE in which behavior analysts from the Center for Disabilities and Development provide hands-on training to challenging behavior specialists across Iowa who conduct behavior assessments in schools. Training has focused on preference assessments, concurrent-operants assessments, antecedent analyses, and functional analyses. By the end of the fifth year of the project, twenty-five trainees had reached the Advanced level criterion (as defined by the NIH Competencies Proficiency Scale). We are investigating variables that influence the time needed to attain Advanced-level skills. On average, trainees needed to participate in 43 assessments to reach the Advanced level. Participation was categorized in terms of “active” and “passive” roles: active roles included preparation, decision making, and/or conducting procedures for an assessment; passive roles included data collection, graphing, and data analysis. The main purpose of this study was to evaluate variables that influenced the rate of skill acquisition for components of behavior assessment (e.g., discipline, years of experience, passive/active roles).

Effects of a Computer-Based Training Tutorial on Procedural Modifications to Standard Functional Analyses
LAUREN K. SCHNELL (Caldwell College), Tina Sidener (Caldwell College), Ruth M. DeBar (Caldwell College), Jason C. Vladescu (Caldwell College), SungWoo Kahng (University of Missouri)
Abstract: Extensive research has been conducted on training individuals with limited functional analysis experience to implement the antecedents and consequences necessary to conduct functional analysis conditions. Only a handful of studies, however, have examined how best to teach individuals to interpret the outcomes of functional analyses, and only one study has trained individuals to analyze functional analysis outcomes and make decisions about next steps in the assessment process when data are undifferentiated. The current study evaluated the use of a computer-based training tutorial to teach 10 college students to analyze functional analysis data and make decisions about implementing a series of procedural modifications. Participants completed the training in a single one-day session using an interactive software program that included written material, quizzes, voice-over narration, and feedback. Following the computer-based training tutorial, participants' mean scores improved on the posttest, the novel-stimuli probe, and the maintenance test. These results replicate previous findings in which participants were taught to identify the relevant antecedents and consequences across functional analysis conditions, interpret multielement functional analysis graphs, and respond to undifferentiated functional analysis data by suggesting a variation to the protocol.

Training Caregivers to Implement Pediatric Feeding Protocols: Is In-Vivo Feedback Sufficient?
AMANDA L. GIBSON (University of North Carolina Wilmington), Melanie H. Bachmeyer (University of North Carolina Wilmington), Caitlin A. Kirkwood (University of North Carolina Wilmington), Courtney Mauzy (University of North Carolina Wilmington), Billie J. Klein (University of North Carolina Wilmington)
Abstract: Training caregivers to implement feeding protocols accurately is a vital component of any treatment program because caregivers ultimately serve as the behavior-change agents in the natural environment; without continued procedural integrity, gains may be lost or progress halted. A limited number of published studies have examined the necessary and sufficient strategies for training caregivers to implement feeding protocols (e.g., Mueller, Piazza, Moore, & Kelley, 2003; Seiverling, Williams, Sturmey, & Hart, 2012). Using a multiple baseline design across caregiver dyads, we examined the effectiveness of in-vivo feedback alone in increasing caregivers' correct delivery of prompts and consequences while implementing feeding protocols. Interobserver agreement was assessed for at least 70% of sessions and was above 90% for each dyad. The percentage of correct prompts and consequences was low during baseline (written instructions only), increased to clinically acceptable levels with in-vivo feedback, and remained high posttraining and at follow-up for all four dyads. The effectiveness and efficiency of using in-vivo feedback alone will be discussed.

Training Undergraduates to Implement Components of Early Intensive Behavioral Intervention via Telemedicine and In-Vivo Instruction
LAYLA ABBY (Texas Tech University), David M. Richman (Texas Tech University), Anna Marie Schaefer (Texas Tech University), Laura Melton Grubb (Texas Tech University), Adam Brewer (Texas Tech University)
Abstract: The current study compared the efficacy and efficiency of telemedicine and in-vivo training for teaching seven undergraduate students to implement empirically supported components commonly used in discrete trial training. A multiple baseline probe design across skills, combined with an alternating treatments design, was used to evaluate the effects of behavioral skills training (i.e., vocal instructions, modeling, and role play with feedback) in teaching participants to implement (1) a multiple-stimulus-without-replacement preference assessment, (2) setup of the instructional context, (3) antecedent instructional prompts, and (4) consequences for accurate and inaccurate responding. Two skills were trained via telemedicine and two were trained in-vivo. Interobserver agreement was above 84% across all phases of the study, and treatment integrity was above 88%. Results showed that telemedicine training was as efficacious and efficient as in-vivo training for all four skills. Five of six participants showed high levels of maintenance of the trained skills, and the skills generalized to a more complex teaching task regardless of training modality. Finally, participants reported high acceptability ratings across training modalities; three of six participants reported a preference for in-vivo instruction, while the remaining three reported no preference.