Assessing the Impact of Interactive Sampling Using Audience Response Systems

Research Summary:

Iclicker technology has little impact on student performance

Audience Response Systems (ARS), such as iclickers or electronic voting systems, are becoming more common in the classroom in an effort to better engage students and promote active learning. But a new study from the Higher Education Quality Council of Ontario (HEQCO) found that the use of iclickers in one undergraduate course at McMaster University did not improve students’ understanding of course concepts or influence quiz and exam grades.

Project description

Assessing the Impact of Interactive Sampling Using Audience Response Systems examines the effectiveness of iclickers in improving students’ understanding of course concepts and overall performance. Iclickers were employed in McMaster’s largest first-year undergraduate course to create an interactive, hands-on learning environment for students. Introductory Psychology 1X03 is a blended course consisting of online web modules and weekly face-to-face tutorials of about 25 students each. Tutorials are led by undergraduate teaching assistants (TAs). Approximately 3,000 students were enrolled in the course during the Fall 2012 semester, when the experiment took place.

The experiment employed three different teaching methods during the tutorials, one for each of three concepts: a traditional lecture; a pen-and-paper lecture, in which students wrote down their responses to a series of questions and submitted them to the TAs, who analyzed and posted the results the day before each quiz; and an iclicker lecture, in which students answered questions using iclickers and TAs immediately shared the results with the class. Data from students who did not attend class or did not write the quiz or exam were excluded from the study, as were approximately 1,000 quiz scores for each concept because some TAs failed to submit their attendance forms on time.

Following the experiment, students completed a brief questionnaire about their experiences.

Findings

Due to technical difficulties, poor presentation of the course material, and lack of participation in the quiz, no data were analyzed for the first concept. For the second and third concepts, the teaching method used had no effect on students’ understanding or on their quiz and exam performance.

Students did not enjoy using the iclickers, and very few recommended their use. Because they did not have enough time to learn how to use the technology properly, students focused more on the iclicker itself than on the concepts being taught. Moreover, when the technology failed or TAs did not know how to use the software, students became distracted and active learning was compromised. On average, the pen-and-paper teaching method was judged to be more useful than the iclicker method.

Recommendations/Further research

The authors suggest that, given the amount of time required to set up iclickers, they should be used either regularly or not at all. They found that a one-time setup was more of a distraction than a benefit to both students and instructors. More generally, the authors note that teaching with ARS requires prior planning and commitment from instructors, as well as better integration into the course overall.

Assessing the Impact of Interactive Sampling Using Audience Response Systems was written by Irina Ghilic, Michelle L. Cadieux, Joseph A. Kim and David I. Shore, McMaster University.