Association for Behavior Analysis International

The Association for Behavior Analysis International® (ABAI) is a nonprofit membership organization with the mission to contribute to the well-being of society by developing, enhancing, and supporting the growth and vitality of the science of behavior analysis through research, education, and practice.

51st Annual Convention; Washington DC; 2025

Event Details



Symposium #48
CE Offered: BACB/IBAO
Advances in Artificial Intelligence, Wearable Technologies, and Applied Behavior Analysis
Saturday, May 24, 2025
11:00 AM–12:50 PM
Marriott Marquis, M4 Level, Liberty N-P
Area: DDA/AUT; Domain: Applied Research
Chair: Leslie Neely (The University of Texas at San Antonio)
Discussant: Russell Lang (Texas State University-San Marcos)
CE Instructor: Leslie Neely, Ph.D.
Abstract: This symposium explores cutting-edge applications of artificial intelligence (AI) and wearable technology to enhance the accuracy, efficiency, and objectivity of data collection and analysis in applied behavior analysis (ABA). The first presentation examines the use of wearable inertial measurement units (IMUs) to quantify the occurrence and intensity of self-injurious behavior, offering a potential solution to the limitations of traditional observational methods. The second study pilots the use of accelerometers during functional analyses to identify behavioral events in real time, comparing data from live observations, video analysis, and accelerometer outputs. The third talk applies machine learning models (MLMs) to the visual analysis of alternating treatment graphs, demonstrating how AI can reduce subjectivity and improve decision-making in ABA. Finally, the fourth presentation investigates advanced neural networks for facial emotion recognition, highlighting their potential to accurately track emotional states in young children with autism spectrum disorder (ASD). Together, these studies illustrate the potential role AI and wearable technologies can play in advancing ABA research and clinical practice.
Instruction Level: Intermediate
Keyword(s): artificial intelligence, developmental disabilities, machine learning, wearable technology
Target Audience: To fully engage with the content, participants should have experience with: (1) Conducting or analyzing functional assessments (FAs) to inform behavior intervention plans, (2) Reading and interpreting behavioral graphs (e.g., alternating treatment designs) to make data-driven decisions, (3) Understanding the ethical considerations related to technology use in clinical settings
Learning Objectives: 1. Participants will be able to identify at least two technologies used to measure self-injurious behavior and describe their application in behavior analysis settings.
2. Participants will be able to describe the accuracy and reliability of machine learning models in analyzing alternating treatment graphs, as compared to traditional visual inspection methods.
3. Participants will be able to explain the process by which advanced neural networks are used to identify emotional states, such as happiness, in children with autism, and discuss the implications for individualized interventions.
 

An Exploratory Study of the Use of Wearable Technology to Supplement Measurement of Self-Injurious Behavior

KIMBERLY CANTU-DAVIS (University of Texas at San Antonio), Leslie Neely (The University of Texas at San Antonio), Katherine Cantrell Holloway (University of Texas at San Antonio), Melissa Svoboda (Baylor College of Medicine; CHRISTUS Children's Hospital), Jessica Emily Graber (Nationwide Children's Hospital), Jordan Wimberley (Autism Treatment Center of San Antonio), Sakiko Oyama (University of Texas at San Antonio)
Abstract:

The occurrence and intensity of target behaviors often reflect the social importance and urgency of intervening on such behaviors. However, accurately capturing behavioral occurrences is time-consuming, prone to human error, and subject to various biases. Additionally, the measurement of behavioral intensity—such as the force exerted during a self-injurious episode—remains largely subjective. Typically, this measurement relies on clinical judgment during the behavior's occurrence or is assessed after the fact by evaluating injuries sustained. These limitations present challenges to developing effective and timely interventions. Inertial measurement units (IMUs), which are small wearable motion capture devices, offer a potential solution. IMUs incorporate three types of sensors: accelerometers, gyroscopes, and magnetometers, all of which can detect and record movement. When these devices are affixed to a specific body segment, they can provide precise and real-time data on human movement. This study aimed to explore the feasibility of utilizing IMUs to objectively quantify both the occurrence and intensity of self-injurious behaviors. The investigation was carried out through three experiments, each designed to test different parameters and configurations of IMU usage. The results offer valuable insights into the potential application of IMUs in clinical settings and suggestions for future research.
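
The abstract does not specify how the IMU signals are converted into behavior counts or intensity estimates. As a rough, hedged sketch of one common approach (not the authors' method), the Python example below thresholds the accelerometer magnitude from a body-worn sensor to flag candidate self-injurious events and uses the peak magnitude as a crude intensity proxy; the sampling rate, threshold, and refractory window are all assumptions made for illustration.

```python
"""Illustrative sketch (not the authors' method): flag candidate
self-injurious behavior (SIB) events from a body-worn IMU accelerometer
by thresholding the acceleration magnitude. Sampling rate, threshold,
and the use of peak magnitude as an "intensity" proxy are assumptions."""

import numpy as np

FS_HZ = 100          # assumed IMU sampling rate
THRESHOLD_G = 2.5    # assumed magnitude threshold (in g) for a candidate hit
REFRACTORY_S = 0.25  # assumed minimum spacing between distinct events


def detect_candidate_events(accel_xyz: np.ndarray):
    """accel_xyz: (n_samples, 3) accelerometer readings in g.

    Returns a list of (sample_index, peak_magnitude_g) pairs, one per
    candidate event whose magnitude exceeds THRESHOLD_G.
    """
    magnitude = np.linalg.norm(accel_xyz, axis=1)   # combine x, y, z channels
    above = magnitude > THRESHOLD_G                 # samples over threshold
    refractory = int(REFRACTORY_S * FS_HZ)

    events = []
    i = 0
    while i < len(magnitude):
        if above[i]:
            # Take the local peak within the refractory window as the event.
            window = magnitude[i:i + refractory]
            peak_offset = int(np.argmax(window))
            events.append((i + peak_offset, float(window[peak_offset])))
            i += refractory                         # skip past this event
        else:
            i += 1
    return events


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated 10 s of low-level movement (gravity on z) plus three sharp spikes.
    data = rng.normal(0.0, 0.2, size=(10 * FS_HZ, 3)) + np.array([0.0, 0.0, 1.0])
    for idx in (150, 420, 800):
        data[idx] += np.array([0.0, 0.0, 3.0])
    hits = detect_candidate_events(data)
    print(f"{len(hits)} candidate events; peak magnitudes (g):",
          [round(m, 2) for _, m in hits])
```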

 

Wearable Technology to Measure the Occurrence of Self-Injury During a Functional Analysis

SAMANTHA LEE PEREZ (UTSA), Leslie Neely (The University of Texas at San Antonio), Katherine Cantrell Holloway (University of Texas at San Antonio), Karen Cantero (University of Texas at San Antonio), Sakiko Oyama (University of Texas at San Antonio)
Abstract:

Conducting a functional analysis (FA) is considered the gold standard for assessing the function of disruptive behavior and informing function-based treatment plans for individuals with autism and developmental disabilities. However, data collected during FAs are subject to human error. Accelerometers are wearable sensors that capture an individual’s movement and can be used to identify behavioral events. The purpose of this study was to pilot the use of accelerometers to identify the occurrence of self-injurious behavior events during an FA. Three participants with autism who engaged in self-hitting took part in the study. Researchers conducted an FA with the participants while they wore small accelerometer devices. Observational data were collected via (1) live observation (“clinical-grade”), (2) frame-by-frame video analysis (“research-grade”), and (3) accelerometers. Researchers established a ground-truth data set and calculated interobserver agreement across data sets. Discussion of results and recommendations for practice and future research are included.
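
The study compares event records from live observers, video coders, and the accelerometer against a ground-truth data set using interobserver agreement (IOA). As a hedged illustration of that comparison step only, the sketch below converts two event records into interval-by-interval occurrence flags and computes percentage agreement; the 10-second interval size, session length, and binary occurrence coding are assumptions for the example, not details reported in the study.

```python
"""Illustrative sketch: interval-by-interval interobserver agreement (IOA)
between two records of SIB occurrence (e.g., live observer vs. accelerometer).
The 10 s interval length, session length, and binary occurrence coding are
assumptions made for the example, not the study's exact IOA procedure."""

from typing import List


def to_intervals(event_times_s: List[float], session_length_s: float,
                 interval_s: float = 10.0) -> List[bool]:
    """Convert event timestamps (seconds) into per-interval occurrence flags."""
    n_intervals = int(session_length_s // interval_s)
    flags = [False] * n_intervals
    for t in event_times_s:
        idx = int(t // interval_s)
        if 0 <= idx < n_intervals:
            flags[idx] = True
    return flags


def interval_ioa(record_a: List[bool], record_b: List[bool]) -> float:
    """Percentage of intervals on which the two records agree."""
    agreements = sum(a == b for a, b in zip(record_a, record_b))
    return 100.0 * agreements / len(record_a)


if __name__ == "__main__":
    session = 300.0  # a 5-minute FA session (assumed)
    live = to_intervals([12.0, 47.5, 130.2, 131.0, 250.4], session)
    accel = to_intervals([12.3, 48.0, 130.5, 249.9, 280.1], session)
    print(f"Interval-by-interval IOA: {interval_ioa(live, accel):.1f}%")
```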

 

Applying Machine Learning Models to the Visual Analysis of Alternating Treatment Graphs

AMARIE CARNETT (University of Waikato), Tobias Kausch (Southern Methodist University; University of Texas at San Antonio), Adel Alaeddini (Southern Methodist University)
Abstract:

Behavior analysts commonly use visual inspection to analyze alternating treatment graphs for assessment and intervention effects. However, there is high variability and disagreement between independent raters due to the subjectivity of this method. Machine learning models (MLMs) offer a promising solution by providing a more objective, data-driven approach to graph analysis, potentially increasing both the accuracy and consistency of interpretations. This research evaluates a machine learning model's ability to produce high accuracy, low Type I error, and strong power when analyzing these graphs. Models trained on a dataset of simulated graphs were assessed for both performance and generalizability on a set of real-world graphs. The model achieved accuracy comparable to human visual raters while demonstrating the best balance of Type I error and power. These findings highlight the potential of MLMs to serve as effective tools in reducing subjectivity and improving the overall reliability and validity of alternating treatment graph analysis, thus enhancing decision-making processes in behavior analysis.
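
The abstract describes training on simulated alternating-treatment graphs and evaluating accuracy, Type I error, and power. The sketch below illustrates that general workflow with a deliberately simple stand-in: simulated two-condition data series, hand-crafted summary features, and a scikit-learn logistic regression classifier. The data-generating process, features, effect size, and classifier are all assumptions, not the models evaluated in the study.

```python
"""Illustrative sketch (assumptions, not the authors' models): train a simple
classifier on simulated alternating-treatment data and report accuracy,
Type I error, and power."""

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


def simulate_graph(effect: bool, sessions: int = 10, rng=None):
    """One simulated alternating-treatment graph: baseline vs. treatment series."""
    rng = rng or np.random.default_rng()
    baseline = rng.normal(10.0, 2.0, sessions)     # e.g., responses per minute
    shift = 4.0 if effect else 0.0                 # assumed true effect size
    treatment = rng.normal(10.0 - shift, 2.0, sessions)
    return baseline, treatment


def features(baseline, treatment):
    """Simple summary features a visual analyst might attend to."""
    mean_diff = baseline.mean() - treatment.mean()
    overlap = np.mean(treatment[:, None] >= baseline[None, :])  # data overlap
    return [mean_diff, overlap]


rng = np.random.default_rng(42)
X, y = [], []
for _ in range(2000):
    effect = bool(rng.integers(0, 2))
    X.append(features(*simulate_graph(effect, rng=rng)))
    y.append(int(effect))

X, y = np.array(X), np.array(y)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

model = LogisticRegression().fit(X_train, y_train)
pred = model.predict(X_test)

accuracy = np.mean(pred == y_test)
type_i = np.mean(pred[y_test == 0] == 1)   # false "effect" calls on no-effect graphs
power = np.mean(pred[y_test == 1] == 1)    # correct "effect" calls on effect graphs
print(f"accuracy={accuracy:.2f}  Type I error={type_i:.2f}  power={power:.2f}")
```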

 

Comparison of Advanced Neural Network Approaches to Identification of Indices of Happiness in Autistic Children

KATHERINE CANTRELL HOLLOWAY (University of Texas at San Antonio), Leslie Neely (The University of Texas at San Antonio), Adel Alaeddini (Southern Methodist University), Tobias Kausch (Southern Methodist University; University of Texas at San Antonio), Amarie Carnett (University of Waikato), Russell Lang (Texas State University-San Marcos)
Abstract:

Identifying and reliably tracking behaviors associated with emotional states, such as happiness and sadness, can greatly enhance a clinician’s ability to individualize interventions and evaluate the social validity of treatments for children with autism spectrum disorder (ASD). However, emotional states often manifest through subtle, idiosyncratic, and subjective behavioral indicators, making consistent data collection and analysis challenging. These challenges limit the practical application of such assessments in clinical settings. Recent advancements in artificial intelligence, particularly in neural network methodologies, have the potential to address these limitations by improving the accuracy and efficiency of emotion measurement. This pilot study investigates the use of facial emotion recognition technology, driven by advanced neural networks, to monitor indices of happiness and unhappiness in young children with ASD. Video data from 42 autistic children, aged 7 to 48 months, were used to train and test four different models. Among these models, the combination of convolutional and dense neural networks demonstrated the best performance, achieving the highest accuracy and recall in predicting emotional states. Limitations and suggestions for future research are discussed.
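
For readers unfamiliar with the architecture class named in the abstract, the Keras sketch below shows the general shape of a combined convolutional and dense network for binary facial-affect classification (e.g., happy vs. not happy) from cropped face images. The input resolution, layer widths, two-class output, and training settings are illustrative assumptions, not the study's model.

```python
"""Illustrative sketch (assumed architecture, not the study's model): a small
convolutional + dense network for binary facial-affect classification from
cropped face images. Input size, layer widths, and settings are placeholders."""

from tensorflow import keras
from tensorflow.keras import layers

IMG_SIZE = (64, 64)  # assumed face-crop resolution

model = keras.Sequential([
    keras.Input(shape=(*IMG_SIZE, 3)),
    # Convolutional feature extractor
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    # Dense classification head
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # P(happy)
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.Recall(name="recall")])
model.summary()

# Training would then use labeled face crops extracted from session video, e.g.:
# model.fit(train_images, train_labels,
#           validation_data=(val_images, val_labels), epochs=20)
```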

 
