What A Semester-Long A/B Test Taught Us About Student Behavior

Back in the fall of 2015, CourseKey conducted a semester-long A/B test at San Diego State University to understand the software's impact on student performance. CourseKey examined two sections of the same course, taught by the same professor, Seth Kaplowitz of the Fowler College of Business Administration. Each section met three times a week in a traditional lecture hall; one section used no technology, while the other integrated CourseKey's software into the syllabus and teaching methods. After the data was collected and analyzed at the end of the semester, the results showed that the students using CourseKey outperformed those who didn't: participation and attendance rose dramatically, and overall grades improved. Before we take a closer look at the results, let's understand more about the study.

The Controls

[Image: experiment controls chart]


The study was conducted in a course called "Legal Environment of Business," and both sections were taught by the same instructor, Seth Kaplowitz. Each section had a maximum of 85 enrolled students, and lectures were held on Mondays, Wednesdays, and Fridays; the first section started at 11 a.m. and the second at noon. The students knew that attendance was monitored but not included as part of their overall course grade. For the duration of the semester, CourseKey team members attended and audited every session of both the control and tech-enabled lectures to manually track attendance and participation. They then compared the attendance data they collected against the sign-in sheet in the control section and the data recorded by CourseKey's software in the tech-enabled section.


Overall, the study showed a 6% increase in attendance, a 32% increase in participation, and a half-letter-grade increase in average course grades for the section using CourseKey.

A Surge In Participation

Students in the CourseKey section engaged more with the lecture, participating through CourseKey's "Ask a Question" and "Chat" functions. Over the course of the semester, students in that section asked 32 questions. Students in the tech-free control section had an equivalent opportunity to ask questions digitally through their Blackboard Q&A discussion board, but none of them posed a single question in the channel.

Higher Overall Grades

In terms of overall course grades, the students in the CourseKey section earned, on average, half a letter grade more than the students in the control section. While attendance was not a factor in the grading scheme, we can infer that stronger class attendance was correlated with better final grades, likely because attending lecture gave students more opportunities to retain the material.

Boosted Attendance

The most promising result from the A/B test, however, was the attendance metrics. The students using CourseKey had a 6-percentage-point higher average attendance rate than the students in the control section: 77% of students in the CourseKey class attended lecture, compared to 71% in the control section. As the semester progressed, both sections saw attendance decline, but the section without CourseKey fell off more sharply. During the first half of the semester, 82% of students in the CourseKey section attended class, versus 77% in the control section. In the second half, attendance dropped 10 percentage points in the CourseKey section and 13 in the control section, bringing the second-half averages to 72% and 64%, respectively.

[Image: students checking in using sound technology]

An Unexpected Discovery

Even more interesting than the higher overall attendance was the insight into student behavior: some students were cheating to give absent peers attendance credit. The control section, which used a paper sign-in sheet for the semester, reported skewed data. In every lecture, attendance was inflated by students whom a classmate signed in even though they were absent. On average, two students per observed class session were listed as present on the sign-in sheet despite being absent. In the CourseKey section, however, attendance data was collected accurately at each lecture; the data recorded by the software matched the data collected manually by the CourseKey members auditing each class. So not only did the CourseKey software incentivize more students to attend class, it tracked attendance with complete accuracy and eliminated students cheating attendance for their friends.

This study demonstrated the positive impact CourseKey's software had on the students using it. Not only did students feel inclined to attend class more often, but they also took advantage of CourseKey's communication channels to engage with their peers and instructor. Through higher attendance and more frequent interaction with the lecture, students earned higher grades than their peers in the control section. Plus, Professor Kaplowitz received accurate attendance data each session without having to worry about academic dishonesty from his students. Here's what he had to say about his experience with the software: "I approached the CourseKey team wanting to put it in that class after my attendance dropped below fifty percent. [After two years], I have seen close to a 30 percent increase in attendance!" Professor Kaplowitz retired from teaching last year, but he used CourseKey up until his final semester to track attendance data and deliver auto-graded assessments in his classes.

Interested in discovering how CourseKey can improve attendance collection for your institution? Visit www.CourseKey.com or schedule a demo with a team member below.
