Recent Outcomes in Survey-Based Research
Sunday, May 26, 2024
10:00 AM–11:50 AM
Convention Center, 100 Level, 104 AB
Area: DDA; Domain: Applied Research
Chair: Kavya Kandarpa (University of Cincinnati)
Discussant: Meghan Deshais (Rutgers University)
Abstract: Surveys are a useful tool for examining practitioners' decision-making, training, intervention practices, and a variety of other topics across organizations. As such, behavior analysts are increasingly using surveys to answer questions about clinical practice in applied settings. This symposium will present recent outcomes of survey-based research as well as pitfalls that behavior analysts can avoid when conducting this type of research. The first presentation will discuss functional analysis decision-making based on responses from experts in the field. The second presentation will discuss behavior-analytic treatment decisions based on data collected by behavior therapists versus caregivers. The third presentation will discuss the selection of schedule-thinning approaches during functional communication training. The fourth presentation will discuss the prevalence of artificial responding in survey research and how to identify it to preserve the validity of conclusions. Together, these presentations demonstrate current practices from assessment to treatment, as well as cautions to take when analyzing information from surveys.
Instruction Level: Basic

An Expert Survey on Functional Analysis Decision-Making
AUDREY N. HOFFMANN (Utah State University), Megan A. Boyle (Upstate Caring Partners)
Abstract: Although there are many ways of designing and conducting a functional analysis to identify the function of challenging behavior, there is a lack of decision-making guidelines for practitioners to consult when selecting, designing, conducting, and analyzing functional analyses in practice. When considering this decision-making process, it may be useful to draw from experts in functional analysis within the field of applied behavior analysis. We conducted an online survey distributed to 31 identified experts in functional analysis methodology. Experts reported on their decisions related to selecting from 12 different functional analysis methods. The survey also asked respondents about their perspectives on practitioner training in functional analysis. The majority of experts (80%) reported that the average Board Certified Behavior Analyst needs substantially more training in designing functional analyses, and 80% recommended against practitioners relying on one “go-to” type of functional analysis. Results and implications for practitioner training will be discussed.

Caregiver Versus Board Certified Behavior Analyst (BCBA) Collected Data: Impacts on BCBA Decision-Making and Preferences
KISSEL JOSEPH GOLDMAN (Kennedy Krieger Institute), Jessica L Becraft (Kennedy Krieger Institute), Jacqueline Mery (Kennedy Krieger Institute; Department of Psychiatry and Behavioral Sciences, Johns Hopkins University School of Medicine)
Abstract: Board Certified Behavior Analysts (BCBAs) rely on data to assess client progress and inform clinical decisions. Involving caregivers in data collection, both in clinical and home settings, is common for various purposes, such as establishing baselines and monitoring treatment effects. However, it is unclear whether BCBAs make similar treatment decisions depending on where and by whom data are collected. To explore this, we conducted surveys involving BCBAs and caregivers of children in behavioral services. In the first survey, participants evaluated figures showing baseline data on challenging behavior or homework completion, with where and by whom the data were collected and the behavior type specified. Most BCBAs initiated treatment if data were collected by behavior therapists and requested more information when caregivers collected data at home. In the second survey, participants were shown two figures similar to those in the first survey and asked to choose the dataset they preferred for a new client (for BCBAs) or the one they would want their clinician to use (for caregivers). Most BCBAs favored data collected by behavior therapists. We discuss outcomes in terms of the utility of collecting and using caregiver-collected data and differences in the importance that relevant stakeholders place on data.

Reinforcement Schedule Thinning Practices During Functional Communication Training: A Survey of Behavior Analysts
JESSICA L FRENCH (Children's Specialized Hospital–Rutgers University Center for Autism Research, Education, and Services), Daniel R. Mitteer (Children’s Specialized Hospital–Rutgers University Center for Autism Research, Education, and Services; Rutgers Robert Wood Johnson Medical School), Megan A. Boyle (Upstate Caring Partners), Andrew R. Craig (SUNY Upstate Medical University)
Abstract: Researchers have investigated several schedule-thinning approaches during functional communication training (FCT), including FCT with delay schedules (Delay FCT), FCT with discriminative stimuli (Signaled FCT), and FCT with delay-and-denial tolerance training (DDTT). Despite many publications on these approaches, it is unclear how often and under what conditions behavior analysts use these FCT thinning strategies. We analyzed survey data from 129 board-certified behavior analysts to determine (a) how often they use the three thinning approaches during FCT, (b) their most preferred thinning method, and (c) the conditions under which they select one approach over another. DDTT was the most commonly implemented and preferred FCT thinning strategy. Relative to DDTT, Signaled FCT was more frequently used and preferred by doctoral-level behavior analysts and those working in clinical settings. However, regardless of approach, behavior analysts reported several challenges that hold implications for practice and research; we discuss recommendations for addressing these challenges.

Prevalence of Artificial Responding in Online Surveys and Methods to Identify Fake Responses
KAVYA KANDARPA (Emory University School of Medicine, Marcus Autism Center), Summer Bottini (Emory University School of Medicine, Marcus Autism Center), Alexandra Hardee (Trumpet Behavioral Health)
Abstract: Behavior analysts are increasingly using web-based survey approaches within research. As internet-based data collection methods become more prevalent, there is a need for tools to identify and mitigate the impact of artificial responding (Griffin et al., 2022). Artificial responding has the potential to threaten data quality and the validity of conclusions. Methods for identifying artificial responding include extra validation and attention checks, advanced captcha features, and hidden questions (Brainard et al., 2022). In the current study, the research team used an online survey to assess provider burnout. We identified artificial responding during data quality checks. The team thereafter applied a set of rules for identifying and preventing artificial responding (e.g., embedded attention checks, timestamp rule-outs, picture transcripts, quality of open-response questions [Table 1]). These methods identified 78.3% of responses as artificial (742 of 948 responses). Coding open-response questions identified the most artificial responses. Interrater agreement for coding the quality of open-response questions was high (92.6%). These data suggest that methods differ in their ability to identify suspicious responses, warranting multiple methods within survey research. This presentation will review ways to identify false responding in internet-based data collection and the importance of behavior analysts utilizing such methods.
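For readers who want a concrete sense of how multi-rule screening of this kind can work, below is a minimal Python sketch in the spirit of the rules described above (an embedded attention check, a timestamp rule-out, and a crude proxy for quality coding of open responses), plus a simple percent-agreement calculation. All field names, thresholds, and rules here are illustrative assumptions, not the authors' actual procedures.

```python
# A minimal sketch of multi-rule screening for artificial survey responses.
# Everything here (field names, the 120-second threshold, the word-count
# proxy for open-response quality) is a hypothetical illustration, not the
# authors' actual coding rules.

from dataclasses import dataclass


@dataclass
class Response:
    respondent_id: str
    attention_check_passed: bool  # embedded attention-check item
    completion_seconds: float     # derived from start/end timestamps
    open_response: str            # free-text item, quality-coded by raters


MIN_PLAUSIBLE_SECONDS = 120.0     # assumed timestamp rule-out threshold


def flag_reasons(r: Response) -> list[str]:
    """Return the screening rules a response fails (empty list = retained)."""
    reasons = []
    if not r.attention_check_passed:
        reasons.append("failed attention check")
    if r.completion_seconds < MIN_PLAUSIBLE_SECONDS:
        reasons.append("implausibly fast completion")
    # Crude stand-in for human quality coding: treat very short
    # open-ended answers as low quality.
    if len(r.open_response.split()) < 3:
        reasons.append("low-quality open response")
    return reasons


def percent_agreement(rater_a: list[bool], rater_b: list[bool]) -> float:
    """Interrater agreement as the percentage of items coded identically."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)


if __name__ == "__main__":
    sample = [
        Response("r1", True, 540.0, "Burnout spikes during staffing shortages."),
        Response("r2", False, 45.0, "good"),
        Response("r3", True, 30.0, "n/a"),
    ]
    flagged = {r.respondent_id: flag_reasons(r) for r in sample}
    n_artificial = sum(1 for v in flagged.values() if v)
    print(flagged)
    print(f"{n_artificial}/{len(sample)} flagged "
          f"({100.0 * n_artificial / len(sample):.1f}%)")
    # For scale: the abstract's 742 of 948 flagged responses is
    # 100 * 742 / 948, which is approximately 78.3%.
```

In practice, automated flags like these would be combined with human review, since, as the abstract notes, methods differ in their ability to identify suspicious responses and no single rule catches everything.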