|Improving the Quality of Data Collection and Behavior Plans in Educational, Medical, and Psychiatric Settings|
|Sunday, May 24, 2020|
|8:00 AM–8:50 AM|
|Area: OBM/DEV; Domain: Service Delivery|
|Chair: Gregory Young (Franciscan Children's)|
|Discussant: Jonathan R. Miller (.)|
|CE Instructor: Kelsey Ventura, Ph.D.|
This symposium will describe new methods that can be used across educational, medical, and psychiatric settings to improve the quality of data collection and increase the social validity of behavioral interventions. In the first study, the authors developed a data collection tool to evaluate the quality of behavior support plans developed for children with severe problem behavior in a community school. The authors demonstrated that, by using the tool, they were able to determine the strengths and weaknesses of current behavior plans and obtain a measure of quality. The second presentation focuses on a treatment package for improving staff’s data collection. The treatment package included simplified data sheets, behavior clickers, and prompts. Results demonstrate that these strategies can improve the quality of data collection on an inpatient psychiatric unit. The final presentation focuses on increasing the amount of behavior data collected by nurses on an inpatient medical rehabilitation unit. This study expands on the findings from the second presentation by systematically evaluating the effects of prompts delivered via an established medical records system. Results support the use of prompts, as demonstrated by a significant increase in the amount of data collected by nurses in prompt conditions compared to baseline conditions.
|Target Audience: |
The target audience of this talk is any behavior analyst or supervisor who is responsible for monitoring and improving staff performance within a clinical setting.
|Learning Objectives: Attendees of this symposium will be able to identify 3 strategies for improving data collection behaviors, including simplified data sheets, behavior clickers, and prompts. Attendees of this symposium will be able to identify methods for evaluating the social validity of a behavior support plan. Attendees of this symposium will be able to describe empirically supported strategies for improving direct care staff integrity of data collection and behavior plan implementation within an applied clinical setting.|
CANCELED: Evaluating Quality of Behavior Support Plans
|KELSEY VENTURA (May Institute), Uriah Hedrich (May Institute), Jennifer R. Zarcone (May Institute), Sarah Frampton (May Institute), Clare Liddon (May Institute), Yannick Andrew Schenk (May Institute), Ali Schroeder (May Institute)|
Previous research evaluating the quality of BSPs has focused on adults and community settings (e.g., Vollmer et al., 1992), although more recent research in school settings has also developed guides to evaluate positive behavior support plans (e.g., Browning Wright et al., 2007). We extended previous research by evaluating the quality of 19 BSPs for students of the May Institute in Randolph, Massachusetts. We developed an evaluation tool that could be used to score several critical dimensions of student BSPs. Independent raters coded the BSPs with the evaluation tool to identify the strengths and weaknesses of the plans. A secondary measure of social validity was created and completed by direct care staff working with these 19 students. Scores on the collective behavior support evaluation tool decreased by less than half a percent across all scored plans. However, the brief social validity measure indicated strong staff support for the accessibility of our behavior support plans. Next steps for the project are to use robust social validity measures (URP-NEEDS) and qualitative measures (BSP-QEII) to evaluate the self-made scoring tool and to improve the accessibility of behavior support plan formatting.
Development of a Treatment Package to Improve Accuracy of Data Collection on a Psychiatric Unit for Children Diagnosed With Intellectual and Developmental Disabilities
|Antoinette Donaldson (Children's Hospital Colorado), PATRICK ROMANI (University of Colorado, Anschutz Medical Campus and Children’s Hospital Colorado), Aimee Sue Alcorn (Children’s Hospital Colorado), James Linares (Children’s Hospital Colorado)|
Data collection is a hallmark of effective behavior-analytic therapy. Collecting accurate data permits a behavior analyst to evaluate the effectiveness of behavioral treatment. The current study evaluated the use of a clicker, simplified observation, and a timer to improve the accuracy of data collection on a psychiatric unit for children diagnosed with intellectual and developmental disabilities. Experiment 1, conducted within a combined multiple baseline across participants and reversal design, evaluated candidate intervention packages for four participants employed by the psychiatric unit. The interventions yielding the highest interobserver agreement (IOA) were highly individualized. Thus, we selected the most comprehensive intervention and exposed four additional participants to it during Experiment 2. Results showed that this intervention improved IOA for these additional participants, as evaluated within a multiple baseline across participants design. Results of the current study will be discussed to help other behavior analysts improve data collection practices in hospital or school settings.
The Use of Computerized Prompts to Improve Behavioral Data Collection in a Medical Setting
|GREGORY YOUNG (Franciscan Children's), Mary Laurette Hughes (Franciscan Children's), Daniel Clark (May Institute), Aimee Lyons (Franciscan Children's)|
Patients’ behavior in the medical setting often interrupts medical care (e.g., refusal of care, removal of medical equipment, self-injurious and aggressive behaviors). Direct behavioral measurements are essential to adequately design and evaluate the efficacy of behavioral interventions, in an effort to provide appropriate medical care and ensure safety. Nurses have the most frequent contact with patients and are responsible for evaluating and documenting patients’ medical and behavioral information; however, nurses typically collect indirect measurements of behavior and psychological wellbeing rather than direct behavioral observations. The present study evaluated the effectiveness of computerized prompts to increase nurses’ data-recording behaviors in a subacute pediatric inpatient medical rehabilitation unit. Initial baseline data demonstrated that nursing staff collected behavior data during only 28.17% of intervals. Computerized prompts within the hospital’s electronic medical records system (Meditech) were implemented across five sets of nurses: an A-B-A-B reversal design with one set, an A-C-B-A-B-D-B reversal design with a second set, an A-B design with two sets, and an intervention-only condition with the final set. The percentage of data collected was calculated using permanent products of weekly behavior data sheets. Results of the present study demonstrate that computerized prompts produced a clinically significant increase in the percentage of data collected.