|A Component Analysis of Higher Education|
|Sunday, May 24, 2020|
|6:00 PM–6:50 PM |
|Area: EDC/TBA; Domain: Translational|
|Chair: Jesslyn N. Farros (Center for Applied Behavior Analysis (CABA) and Endicott College)|
|CE Instructor: Jesslyn N. Farros, Ph.D.|
Higher education in behavior analysis is in high demand, especially online learning options. Any modality of education must use current evidence-based teaching methods; however, little to no empirical research has been conducted on online learning methodologies. The following studies were all conducted in master's-level behavior analysis courses. The studies evaluated various aspects of those courses, including access to online forums (asynchronous discussion), instructor involvement on forums, point contingencies on forums, access to synchronous and asynchronous discussion, participation in synchronous or asynchronous discussion sessions, and grading criteria (credit/no credit vs. accuracy).
|Target Audience: |
Those interested in higher education, especially online learning.
|Online Learning: The Effect of Synchronous Discussion Sessions in Asynchronous Courses|
|JESSLYN N. FARROS (Center for Applied Behavior Analysis (CABA) and Endicott College), Lesley A. Shawler (Endicott College), Ksenia Kravtchenko (Endicott College, Global Autism Project), Mary Jane Weiss (Endicott College)|
|Abstract: Online learning is extremely prevalent in education. In 2015, close to six million students were taking at least one online learning course, representing 29.7% of all postsecondary students. Although online learning is becoming more prevalent, there has been little to no research to determine what makes it most effective. The studies that do exist either have not compared modalities or have focused on other aspects of learning. Determining the components of online learning that lead to better student outcomes will add to the current literature and improve online learning as a whole. The current study comprises four experiments that evaluated the effect of synchronous discussion sessions in asynchronous master's-level applied behavior analysis courses. Three different applied behavior analysis courses were used, and each experiment utilized a slightly different experimental design. The first two experiments focused on the addition of synchronous discussion within an asynchronous course, and the last two focused on comparing the effects of synchronous and asynchronous discussion. The primary purpose of these experiments was to determine how asynchronous and synchronous discussion affect student outcomes in asynchronous online courses.|
|The Use of Discussion Forums in Asynchronous Behavior Analysis Master's Courses|
|ALLISON ROSE BICKELMAN (Autism Behavior Intervention; Endicott College), Mary Jane Weiss (Endicott College), Tara A. Fahmie (California State University, Northridge)|
|Abstract: Asynchronous online education is increasingly popular, including in the field of behavior analysis. It is imperative that any modality of education use current evidence-based teaching methods to ensure that student learning outcomes are strong. Many online courses use discussion forums as part of the course requirements. Previous research on discussion forums is mixed in terms of effectiveness and of both student and instructor preference. Three studies were conducted in asynchronous behavior analysis master's courses to examine student outcomes with and without access to forums, with and without instructor involvement on forums, and with various point contingencies for posting on forums. Overall results indicate that forums do not have a direct, critical impact on student quiz scores and course outcomes; however, social validity measures demonstrate variability in preference for the use of the forums.|
|Comparing Grading Criteria for Readiness Assessment Tests: Accuracy versus Credit/No Credit|
|LEAH ROSENFELD (California State University, Sacramento), Megan R. Heinicke (California State University, Sacramento), Shelby Bryeans (California State University, Sacramento)|
|Abstract: Pre-lecture reading quizzes, or Readiness Assessment Tests (RATs), improve college students’ exam performance; however, implementing RATs requires instructor resources. This study compared accuracy versus credit/no credit grading criteria on exam scores, participation, and attendance in an upper-level college course using a nonequivalent control group design. Students in the credit/no credit group spent less time on RATs and performed more poorly on both RATs and unit exams across the semester compared with students who were required to respond accurately on RATs. We did not find significant differences between groups on attendance or participation measures. More students in the credit/no credit group reported liking RATs and recommended that other instructors use them, whereas more students in the accuracy group preferred RATs over in-class quizzes. Although grading for completion rather than accuracy may be less intensive for instructors, our findings suggest this choice may decrease the benefits of RATs for students.|