Contemplating Service Provision Efficiencies Through Tool Development and Program Evaluation
Sunday, May 28, 2023
10:00 AM–10:50 AM
Convention Center Mile High Ballroom 1C/D
Area: DDA/OBM; Domain: Service Delivery
Chair: Alison Cox (Brock University)
Discussant: Joseph Michael Lambert (Vanderbilt University)
CE Instructor: Joseph Michael Lambert, Ph.D.
Abstract: Behavior analytic intervention research often emphasizes improving efficacy through direct intervention as a means of generating efficiencies in service provision. Examples include treatment fidelity and training studies, such as pyramidal approaches to implementer training. While important efficiencies may be gained by improving direct implementation practices, there are also many opportunities to generate efficiencies by enhancing the service provision process itself, without compromising service delivery. Developing and validating effective intake processes (i.e., client triage) is one approach. Another is assessing service delivery (i.e., program evaluation) to discern which specific elements add value and to uncover areas for improvement. The current symposium showcases strategies for assessing (and refining) service delivery that may be less common in the behavior analytic research literature. Each talk provides an in-depth description of its respective project, culminating with a discussant highlighting key aspects and areas for future research.
Instruction Level: Intermediate
Keyword(s): developmental disabilities, program evaluation, service delivery, tool development
Target Audience: Attendees should have an introductory understanding of applied research design and of tool development and validation analyses. It would also enhance attendees' experience if they are well-versed in best-practice approaches to the treatment of challenging behavior.
Learning Objectives: (1) Attendees will be able to describe the steps involved in structured tool development and validation, and recognize opportunities for application. (2) Attendees will be able to describe consecutive controlled case series designs and areas of application. (3) Attendees will be able to describe important program evaluation tenets and how they may be readily applied to ongoing clinical service provision.

Predicting Services and Outcomes Using Consecutive Case Series Data: A Quality Improvement Study
COLLIN SHEPLEY (University of Kentucky)
Abstract: Program evaluation is an essential practice for providers of behavior analytic services, as it helps providers understand the extent to which they are achieving their intended mission within the community they serve. One proposed method for conducting such evaluations is the consecutive case series design, in which cases are gathered sequentially following the onset of a specific occurrence. Given the sequential nature of data collection within a consecutive case series, analytic techniques that adopt a time-series framework may be particularly advantageous. Although such methods are commonly used for program evaluation in medicine and economics, their application within the field of applied behavior analysis is largely absent. To serve as a model for providers undertaking evaluation efforts, we conducted a program evaluation of an outpatient severe behavior clinic serving families of children engaging in challenging behavior. We employed quasi-experimental methods using an interrupted time-series analysis. Analytical models detected planned and unplanned changes in the clinic’s services over time; however, we did not identify evidence suggesting that families and children experienced improved outcomes as a result of these changes.

On the Development and Validation of an Objective Severity Tool to Classify Severe Problem Behavior
MARIE-CHANEL MONIQUE MORGAN (Brock University), Nazurah Khokhar (Brock University), Alison Cox (Brock University), Hannah Lynn MacNaul (University of Texas at San Antonio)
Abstract: Presently, there appears to be limited research on objective ways to triage clients presenting with problem behavior for the purpose of accurately predicting a matched treatment tier (e.g., dosage, intensity). Improved ‘patient flow’ may translate to efficiencies across service provision, meaning more clients receive services more quickly. This paper describes the development and validation of a severity scale tool intended to enhance an existing treatment model. Researchers recruited participants through an established research and treatment center supporting clients who present with a range of problem behavior severity at intake. The center administered an existing series of indirect and direct assessments at intake, after which all participants received tier 1, 2, and/or 3 services according to their treatment needs. Participants’ caregivers met with researchers virtually to complete the tool for their child, answering each tool item and providing permanent products (e.g., reports, pictures, data) to corroborate their verbal responses. Preliminary results suggest lower scores on the tool generally coincided with tier 1 services (i.e., the lowest level of service delivery), while descriptive analyses suggest some tool items may be more difficult for caregivers to corroborate with evidence. Outcomes informed by construct optimization, construct validation, and construct calibration will also be described.