Behavior Analytic Instruction in Higher Education
Sunday, May 29, 2022
10:00 AM–11:50 AM
Meeting Level 2; Room 203
Area: TBA/EDC; Domain: Applied Research
Chair: Nicole Hollins (Little Leaves Behavioral Services)
Discussant: Darlene E. Crone-Todd (Salem State University)
CE Instructor: Nicole Hollins, Ph.D.
Abstract: As teachers of behavior analysis, it is important that we apply the principles of behavior analysis to our own instruction. This matters on multiple levels: we should (1) analyze the variables related to effective and efficient learning for our students, (2) approach teaching college classes through a conceptually systematic, behavioral lens, and (3) model the use of behavior analytic tactics in our teaching so that students see us practicing what we preach. This symposium will consist of papers that exemplify the application of behavior analytic principles to college teaching in the service of effective and efficient teaching and learning. Collectively, the papers will show how variations of active student responding can be incorporated into synchronous online instruction and how they affect learning in the online classroom; how examples and nonexamples can be used to teach concept formation in the college classroom; how peer-generated examples can be used to teach students to discriminate between examples and nonexamples of behavioral concepts; and how college students can be involved in generating course content to develop more student-centered, culturally relevant material. These papers demonstrate behavior analytic, conceptually systematic, and socially significant approaches to college instruction and are useful in informing effective and efficient college teaching.
Instruction Level: Advanced
Keyword(s): active responding, college teaching, concept formation, culturally-relevant teaching
Target Audience: Prerequisite skills include mastery of behavior analytic content at the doctoral level. Experience in college instruction is helpful but may not be a necessary prerequisite.
Learning Objectives: Learners will:
1. Explain the effects of different active student responding modalities during synchronous online instruction on student engagement and test question accuracy.
2. Explain the effects of similar and dissimilar nonexamples on concept formation.
3. Explain the effects of peer-generated examples on student accuracy of identifying examples and nonexamples of behavioral concepts during interactive computerized teaching.
4. Explain Open Behavior Artifacts and student perceptions of them as an alternative to traditional semester course projects.

Can Everyone See My Slides? The Effects of Active Student Responding During Synchronous Online Instruction
NICOLE HOLLINS (Little Leaves Behavioral Services), Stephanie M. Peterson (Western Michigan University)
Abstract: Active student responding and opportunities to respond are widely regarded as best-practice instructional strategies for in-person learning. Over the past decade, many instructors have shifted from teaching primarily in person to hybrid or online formats, and the global pandemic hastened further shifts to online learning for many institutions of higher education. Given this rapid shift, it is critical to evaluate evidence-based teaching practices in online formats. A robust body of literature supports the effectiveness of embedding opportunities to respond and active student responding during in-person instruction, but to date few data evaluate these practices in synchronous online courses in postsecondary settings. Using an alternating treatments design, this study evaluated the effects of two active student response modalities on response accuracy for 17 students enrolled in a synchronous online graduate course. The results suggest that students performed more accurately on post-lecture queries in conditions that required written active student responses than in conditions that used response cards. Moreover, accurate responding maintained across the exams and the cumulative final exam. Limitations and implications for future research are discussed.

Effects of Nonexamples on Concept Formation
CATHERINE WILLIAMS (Marcus Autism Center, Emory University), Claire C. St. Peter (West Virginia University), Michael Perone (West Virginia University)
Abstract: Concept formation is affected by the examples and nonexamples provided during teaching, but the degree to which examples and nonexamples should differ is unknown. Two experiments compared concept formation across three teaching conditions: (a) nonexamples that were relatively similar to the examples, (b) nonexamples that were relatively dissimilar to the examples, and (c) no nonexamples. Arbitrary concepts were taught in Experiment 1 and biological concepts were taught in Experiment 2. Before and after teaching, tests with untaught examples and nonexamples measured concept formation. In general, concept formation improved when nonexamples were used to teach the concept compared with when only examples were used. The highest levels of concept formation occurred when the nonexamples were more similar to the examples. However, concept formation may also have been influenced by condition sequence and by the relations between stimulus features within and across conditions. The results of these experiments indicate that explicit consideration of these relations is necessary to promote concept formation in instructional and experimental arrangements.

Evaluating the Efficacy of and Preference for Interactive Computer Training to Teach Behavior Analytic Concepts
SYLVIA AQUINO (Marquette University), Stephanie A. Hood (Marquette University), Richard Tanis (Butterfly Effects, LLC), Tara Famie (University of Nebraska Medical Center’s Munroe Meyer Institute), Elizabeth Goodbody (Marquette University)
Abstract: Interactive Computer Training (ICT) involves the use of video modeling and active responding to teach new skills. ICT may be a favorable teaching modality because it can be personalized, is cost efficient over the long term, and can be referenced throughout training and thereafter. Evaluating the effectiveness of pedagogical strategies is an important step in determining what should inform teaching practices. Nava et al. (2019) demonstrated that peer-generated examples did not improve undergraduate students' acquisition of behavioral concepts; however, students preferred peer-generated examples to traditional textbook examples. In the present study, we extended Nava et al. (2019) by including peer-generated examples in ICT with embedded feedback to teach behavioral concepts. Additionally, we evaluated the relative efficacy of ICT for teaching students to distinguish between examples and nonexamples of the behavioral concepts. T-tests showed higher overall improvement and higher scores in the intervention condition for all questions, but in a week-by-week analysis significance remained for only one week; nonexample questions showed significant differences both overall and for four of five weeks. In addition, students preferred ICT to video models with text descriptions and to text descriptions alone. The implications of the study for the adoption of ICT in higher education will be discussed.

Student Open Content Generation as Active Responding: Promoting Access, Diversity, and Educational Equity
Veronica Howard (University of Alaska Anchorage)
Abstract: Involving students in content generation is one strategy to promote high-quality interaction with course content. Previous studies have demonstrated that student-generated content is a socially valid, culturally responsive strategy to promote learning for diverse learners (e.g., Nava et al., 2019). This presentation will review progress on the Open Behavior Artifacts project, an undergraduate-focused initiative in which students develop openly licensed or open-access materials on topics related to behavior analysis in lieu of a traditional semester course project. Student ratings indicate that the project was perceived as a good way to assess knowledge of behavior analytic principles. Students also reported that they would be likely to endorse the project for future students and to complete a similar content-generation project in the future. Qualitative feedback highlighted great satisfaction with the project, with students citing meaningful contributions back to the larger learning community. Samples of student materials will be shared, and implications for promoting diverse voices in behavior analysis will be discussed.