Over the years, teachers and professors alike have come up with new, innovative ways to help students learn and retain information. One of the more recent additions to these gadgets is the iClicker, an electronic response device used in schools across Ontario. Now we must ask ourselves a simple question: is the iClicker actually effective? Through my experience with this gadget, from pondering whether or not to take it out in class, to punching in the classroom code, to trying my luck on the day's set of problems, it never appealed to me.
Many problems surround this handheld answering device. First, there is the question of ethics. When the classroom is half empty but the iClicker answer count reads 150, there is obviously a problem. From what I saw in class, there were many cases where students answered questions for their friends by simply bringing a handful of iClickers to class. Because of this, many students gain no actual knowledge from the tool. Second, we are able to choose whether or not our iClicker grades count toward our final mark. The iClicker marks themselves are not mandatory, and their weight can be shifted to other assignments and exams during the term. This automatically instills a sense of safety (and who wants to read a chapter ahead anyways?). Many students do not bother to use the iClicker simply because they are confident that their exam marks will be higher than anything they would earn playing around with the learning tool. Lastly, there is the issue of the iClicker's cost. Spending a fair amount of money on a tool that isn't helping us learn or providing us with many marks is very unattractive. Although some students have the iClicker, few find it effective for their learning.
I can definitely say that the iClicker is not a "must have" tool for teachers. It leans toward being a tool for students to gain a few percent on their grade. Who knows? Maybe by the time I graduate, someone will have invented something entirely foolproof, but only time will tell.
2 thoughts on “iClick? iClick not. — Dan Gan, CTE Co-op Student”
Hello Dan, I agree with Mark Morton above.
I used iClickers for the first time last year with ~850 students (across 3 sections), and not only was the number of responses consistent with attendance as far as I could estimate (I watch carefully, and students are warned about this), but the feedback on the course evaluations was overwhelmingly positive. The students LOVED the clickers, and attendance was up over the year before. The questions broke up the class, which improved focus, and kept students paying attention because, deep down, they wanted to answer the questions at the end of the lecture correctly. Students would discuss their responses with their neighbours if they got an answer wrong on the first try, and the proportion answering correctly went up substantially after these brief discussions.
As for cost: I think it costs a student $40, and they get $20 back when they return the clicker. $20 is not very much for anyone, and it's well worth the investment for a participation tool that is useful throughout an undergrad career.
Just wanted to add my two cents as a Biology faculty member. Clickers worked very well for me and the students always wanted more questions.
Josh Neufeld, Biology
Hi, Dan — One thing we don’t have enough of on the CTE blog is perspectives from actual students, so I’m glad that you contributed this posting about your experience with using clickers in class. I hope, though, that your experience to date with clickers hasn’t totally turned you off of them. After all, like any tool, an instructor can use clickers effectively or ineffectively (I once had a math instructor who used the blackboard very ineffectively: he would stand in front of the board while writing down an equation, and then continue to stand in front of it while talking about it, and then erase it before anyone had a chance to actually see it!). Although individual experiences with clickers vary, overall the evidence shows that they do contribute to learning in the following ways:
INCREASED STUDENT ENGAGEMENT:
Kaleta (2007) reports that “Faculty agreed or strongly agreed that there was greater student engagement (94%), participation (87%), and interaction (68%) in class as a result of clicker use. … The majority of students also agreed or strongly agreed that the use of clickers made them feel more engaged (69%) in class, increased participation (70%), and helped them pay attention (67%).”
IMPROVED CLASS ATTENDANCE:
Caldwell (2007) reports that “when clicker scores accounted for 15% or more of the course grade, attendance rose to 80-90%.”
Caldwell (2007) reports that clickers reduced end-of-term attrition from 8-12% to about 4%.
IMPROVED LEARNING OUTCOMES:
Caldwell (2007) reports that “Most reviews agree that ‘ample converging evidence’ suggests that clickers generally cause improved student outcomes such as improved exam scores or passing rates, student comprehension, and learning and that students like clickers.” Kaleta (2007) reports that “The statistical analyses of grade data collected for the 11 parallel courses between fall 2004 and fall 2005 showed a statistically significant impact of clicker use on student performance…. There was an increase of 2.23% in the number of students obtaining a grade of C or better in the courses that used clickers.” Fies (2006) reports that “There is great agreement that CRSs [clickers] promote learning when coupled with appropriate pedagogical methodologies…. The literature also indicates that CRS-supported environments lead to greater learning gains than traditional learning environments.”
Anyway, I hope that the next time you take a course that employs clickers, it enhances your learning experience rather than detracting from it!