Improving Learning in Higher Education: Fluency, Oral Quizzes, Reducing Procrastination, and Weekly Quizzes
Sunday, May 26, 2019
11:00 AM–12:50 PM
Fairmont, Second Level, International Ballroom
Area: TBA; Domain: Applied Research
Chair: Christopher J. Perrin (Georgian Court University)
Discussant: Traci M. Cihon (University of North Texas)
CE Instructor: Judah B. Axe, Ph.D.
Abstract: The four papers in this symposium extend research demonstrating the application of principles of behavior analysis to improving learning in undergraduate and graduate courses. In a graduate course, Perrin and Wilson compared two methods of fluency-based practice of terms – see/say versus see/type definitions – and found that both improved weekly quiz scores, though the see/type method was preferred. In an online graduate course, Axe, Chase, Breault, and Neault found that oral quizzes administered at the start of live sessions improved written quiz and exam scores. In a graduate course, Bird and Chase reduced procrastination by providing study materials contingent on completing practice quizzes, though students did not prefer distributing their practice until bonus points were contingent on distributed practice. Finally, in a study focused on low-performing students in an undergraduate course, Dalfen, Fienup, and D’Ateno showed that requiring a passing criterion on weekly quizzes improved exam scores. These studies will be discussed in terms of the behavior-analytic strategies of fluency-based instruction, self-control, choice behavior, and mastery learning.
Instruction Level: Intermediate
Keyword(s): fluency, passing criteria, procrastination, quizzes/exams
Target Audience: behavior analysts, professors, graduate students, researchers

Comparing the Effects of See-Say (SAFMEDS) and See-Type (TAFMEDS) Fluency Exercises on Quiz Performance
CHRISTOPHER J. PERRIN (Georgian Court University), David M. Wilson (Georgian Court University)
Abstract: Precision teaching techniques used in higher education are often see/say activities (i.e., SAFMEDS), despite the fact that examinations are usually in a see/write or see/type learning channel. Previous researchers (e.g., Cihon, Sturtz, & Eshleman, 2012) have suggested it may be beneficial to conduct practice in the same learning channel as assessment. In a recent study, a see/type exercise (i.e., TAFMEDS) delivered by course management software was shown to improve quiz performance relative to a no-exercise condition (Perrin & Wilson, 2017). As such, the current investigation compared the effects of SAFMEDS and TAFMEDS on the weekly quiz performance of students enrolled in a graduate-level experimental analysis of behavior course. Each week, students completed either SAFMEDS or TAFMEDS in a counterbalanced fashion in preparation for an in-class written quiz. Results indicated that quiz scores were similar across the SAFMEDS and TAFMEDS conditions. In addition, the majority of students rated both SAFMEDS and TAFMEDS as “useful in preparing for the quiz”. If given a choice, more students indicated a preference for TAFMEDS (54.5%) than SAFMEDS (45.5%).

The Effects of Oral Quizzes on Written Exam Performance in an Online Graduate Course
JUDAH B. AXE (Simmons University), Philip N. Chase (Simmons University), Megan Breault (RCS Learning Center; Simmons University), Noelle Neault (Simmons University)
Abstract: There is limited research on strategies to improve exam performance in higher education, and even less in online teaching. Work from behavioral instruction has suggested that oral quizzing may improve written exam performance (Johnson & Ruskin, 1977). In the current study, 20 graduate students in an online research course were exposed to weeks with and without oral quizzes. Oral quizzes occurred during the first 15 minutes of four of the 10 live sessions, during which the instructor randomly called on students to answer questions from the previous week’s content. Written quizzes and exams were divided into questions covering content that had been the subject of an oral quiz (oral quiz condition) and content that had not (no oral quiz condition). Results indicated that 15 of the 20 participants had higher scores in the oral quiz condition. In one of the two terms, one section performed better on a post-oral exam than on a pre-oral exam. Most participants reported liking the oral quizzes and feeling that they helped in the course. Two of the three instructors rated the oral quizzes as helpful. Oral quizzes may have produced better learning because they held students accountable for verbalizing content and required more frequent studying.

Applied Behavior Analysis Master's Student Pacing: Procrastination, Preference, and Performance
ZACHARY C. BIRD (Principled Behavior Consultants; Simmons University), Philip N. Chase (Simmons University)
Abstract: Previous research has shown that a large percentage of students procrastinate and that a majority of those students wish to reduce it. The purpose of Experiment 1 of the current study was to replicate previous findings that contingent access to study materials would produce distributed studying patterns. The current study extended previous research by evaluating preference for treatment using both a choice procedure and an end-of-semester survey. Results indicated that although contingent access to study materials was successful at reducing procrastination, students chose to pace themselves when allowed. Given that the intervention was disliked by the vast majority of students and was time intensive for researchers, professors may be reluctant to use the contingent access intervention. Experiment 2 evaluated a treatment for procrastination that was likely to be preferred by students and more efficient to implement: the effects of a bonus-point contingency for pacing were evaluated. Results indicated that a majority of students who procrastinated opted to distribute their studying behavior to access bonus points in the course. The data from both studies are discussed in terms of recommendations for future research regarding the use of bonus points in college courses and implications of student and professor preference.

Passing Criterion: How Lowering Expectations for Quizzes Can Produce Higher Scores on Exams
Samantha Dalfen (Behavioral Intervention Psychological Services), DANIEL MARK FIENUP (Columbia University), Patricia A. D'Ateno (Queens College CUNY)
Abstract: Students in higher education perform better on exams when they complete frequent quizzes on the assigned reading material, but little research has investigated how different grading criteria for quizzes affect quiz and exam performance. Previous research has shown that the frequent-quiz effect has a lower impact on low-performing students, the students who need help the most. To address this limitation, we examined the effects of different passing criteria for quizzes on exam scores. A passing criterion requires a student to obtain a certain score to earn full credit for the quiz, thus lowering the response effort needed to obtain reinforcement for completing quizzes. In Experiment 1, we compared low- and high-passing criteria and found that the low-passing criterion produced better outcomes, especially for low-performing students. In Experiment 2, we compared a low-passing criterion to a standard quiz grading criterion; we also manipulated the type of exam question and whether exam questions were replicated from a previous quiz. Experiment 2 revealed that passing criteria produced higher performance on exams. Collectively, the experiments support the notion that lowering the response effort for earning points from quizzes translated to improved exam scores.