|Improving the Implementation of Function-Based Interventions in Schools|
|Tuesday, May 31, 2016|
|2:00 PM–3:50 PM |
|Regency Ballroom A, Hyatt Regency, Gold West|
|Area: EDC; Domain: Applied Research|
|Chair: Sarah E. Pinkelman (George Mason University)|
|Discussant: Ronnie Detrich (The Wing Institute)|
|CE Instructor: Sarah E. Pinkelman, Ph.D.|
Despite the research base indicating the effectiveness of function-based interventions for reducing problem behavior and increasing socially appropriate behavior, implementation of these practices with sufficient fidelity to improve student outcomes is rare in typical classroom environments. The discrepancy between what research identifies as best practice and what actually occurs in schools calls attention to the need for implementation supports that will enable teachers and classroom staff to implement function-based interventions with high treatment fidelity. This symposium will present descriptive data regarding potential variables affecting implementation (study #1) and discuss the results of three single-subject studies that examined the effects of varying implementation supports on implementation effectiveness (studies 2, 3, and 4). Implementation supports examined in these studies include self-monitoring of fidelity, team-based review of data, direct observation and permanent product review for assessing treatment fidelity, and detailed planning procedures. Implications for practice and future research directions will be described in the context of building school capacity to effectively develop and implement function-based interventions that improve student outcomes.
|Keyword(s): function-based interventions, implementation, schools, treatment fidelity|
|Building School District Capacity to Conduct Functional Behavioral Assessment|
|M. KATHLEEN STRICKLAND-COHEN (Texas Christian University)|
|Abstract: For decades, function-based interventions have been well documented as effective for producing positive change in challenging student behaviors. However, schools continue to struggle to design and implement effective individualized interventions. One way that schools can use this approach more effectively is by building local capacity to conduct FBA and intervene with function-based strategies at the first signs of persistent problem behavior. In the present pilot study, a six-hour comprehensive training package was used by a typical district-level behavior specialist to train 36 elementary- and middle-school teachers, administrators, and school psychologists to conduct streamlined functional behavioral assessment (FBA) and to design and implement function-based behavior support plans. Study findings show that the training was effective in significantly increasing (a) participant knowledge related to function-based behavior support, t(30) = 11.23, p < .001, and (b) the use of FBA and function-based supports by participating school professionals. In a follow-up survey conducted 15 weeks after training, participating professionals also provided descriptive data related to perceived enablers of and barriers to implementing function-based support in schools (see Table 1). Practical implications of these findings and future research needed to better understand factors related to sustained implementation of effective practices in school contexts will be discussed.|
|Improving Implementation of Function-Based Interventions Using an Online Data Management Application|
|SARAH E. PINKELMAN (George Mason University)|
The success of behavioral interventions depends not just on the quality of the procedures employed, but on the extent to which those procedures are implemented. This study used a multiple-baseline across participants design to assess the impact of an online data management application on the fidelity and effects of individual student behavior support plans in typical school contexts. Three students with patterns of problem behavior and their supporting adults participated in the study. The research question examined whether a functional relation exists between use of (a) performance self-assessment and (b) student impact assessment via an online data management system and the fidelity of behavior support plan implementation by adults, as well as improvement in academic engagement and problem behavior by students. Results indicate the efficacy of the treatment package in improving treatment fidelity, decreasing student problem behavior, and increasing student academic engagement. Potential contributions of the study are discussed in terms of establishing efficient data systems that schools can use to monitor staff and student behavior and to use these data in a meaningful way that results in improved student outcomes and sustained behavior change.
|Increasing Teachers' Behavior Support Plan Implementation Adherence and Quality Through Implementation Planning|
|LISA SANETTI (University of Connecticut)|
Interventions, such as behavior support plans (BSPs), must be implemented with adequate adherence and quality to result in improved student outcomes. Despite this fact, research consistently demonstrates that teachers struggle to implement BSPs consistently and completely (Sanetti & Kratochwill, 2009). Implementing interventions is challenging, requiring the integration of intervention-related behaviors into existing routines. Therefore, implementation may be conceptualized as an adult behavior change process (Noell, 2008). Research on adult behavior change suggests that planning is important for translating intended behaviors into actions (Gollwitzer, 1999). Specifically, completion of action plans (i.e., the detailed logistical planning of implementation) facilitates successful initiation of intended behaviors, whereas coping plans promote successful maintenance of behaviors through the development of strategies to address anticipated barriers (Schwarzer, 2008). This presentation will describe the application of such an implementation planning process and its effect on promoting teachers' implementation of student BSPs in a randomized multiple-baseline design study. The results will be discussed in relation to other implementation support methods and how practitioners can use their time efficiently to effectively support implementation of behavioral interventions.
|Assessing Implementation: A Comparison of Direct Observation and Permanent Product Review|
|LISA SANETTI (University of Connecticut)|
It is widely accepted that implementation data are necessary to make valid decisions about intervention effectiveness, yet data consistently indicate that quantitative implementation data are rarely documented, especially in practice (Cochrane & Laux, 2008; McIntyre, Gresham, DiGennaro, & Reed, 2007). A common reason for this lack of implementation assessment is the absence of empirical guidance for researchers and practitioners regarding how to assess implementation of behavioral interventions (Cochrane & Laux, 2008; Sanetti & DiGennaro Reed, 2012). An empirical understanding of the relation between currently recommended assessment methods, systematic direct observation (SDO) and permanent product review (PPR), is an important step toward moving implementation assessment research forward. To this end, the purpose of this presentation is to first discuss methodological issues related to each measurement approach, and then to present data from a single-case design study in which teachers' implementation of behavior support plans was measured via SDO and PPR. The relation between these methods, the methodological issues associated with each, and their association with student outcomes will be presented. Implications of these results for implementation assessment in research and practice will be discussed.