Action Research Project & Literature Review
Does the Use of Clickers in Middle School Science Lead to Increased Student Engagement and Improved Learning?
Karen Morton, Touro University California School of Education, Vallejo, California 2012
Abstract
Students in middle school science often lack engagement in their learning. This action research project attempted to demonstrate whether or not the use of clickers would increase engagement and, therefore, improve learning. During a two-week study, two similar classes were compared, with one using clickers to respond to questions presented during lessons while the other responded to the same questions orally or with hand signals. Data were collected using teacher observations, student surveys, and a posttest on the concepts taught during the unit. Findings indicated that clickers did in fact increase engagement; however, posttest results showed that students using clickers learned less than those without clickers.
Introduction and Statement of the Problem
One of the biggest challenges public middle school teachers face is the lack of engagement of students. Not only is this time of life rife with internal hormonal changes for students, but it is full of social upheaval as well. How they are viewed by their peers drives adolescent decision making, so often what they wear is far higher on their priority lists than what they are studying in school. The state-driven science content standards in 7th and 8th grade require students to learn a multitude of facts that have little perceived relevance to their daily lives. The history of the Earth, the difference between speed and velocity, and the reproduction of plants are concepts to which few middle school students can directly relate. As teachers, we do our best to point out how knowing this information will help them in the future, but frankly, the blank stares, disruptive behavior, and lack of quality work are obvious signs that students are not engaged.
Even high-achieving students who aim to please and have excellent work ethics seem to be disengaged from much of the science content and rarely choose to pursue the more challenging options many teachers provide. They are cooperative and will complete all assignments with the goal of earning perfect scores, but are they just going through the motions, or do they actually care enough about the topic to fully absorb and apply it to new situations? These students are experts at learning teacher expectations and will participate fully in class, but unless the teacher is tapping into their higher-level cognitive abilities, they get bored and mentally check out.
The lowest-achieving students are obviously the most difficult to engage. By the time they reach middle school, they have a long record of just squeaking by academically and are far behind in the acquisition of many basic skills. They have learned to cope by completing the bare minimum of work, copying from classmates, or giving up altogether and becoming behavior problems. After all, to many adolescents it is better to appear bad than stupid, so they act out to cover their deficiencies (Katch, 1988, p. 34). Many will keep quiet in class, hoping the teacher won’t call on them, and are resigned to fail. With the rising class sizes public schools face, it is far too easy for many students to fall through the cracks. Oftentimes these students have the ability to succeed academically, but due to home life or social turmoil, they are in survival mode and doing well in school is a low priority. How can teachers better engage these students so they are more successful in school?
How can teachers engage all students so they have the opportunity to reach their individual potentials? Many teachers face this problem daily and one area that has been explored in many ways is the use of technology in the classroom. Personal response systems, or clickers, are commonly used in college level courses to involve all students and force them to interact in large classrooms (Morgan, 2008). In my five middle school classes of thirty or more students, it is easy for many to remain passive learners. Even with noble attempts to call on each student daily during class question and answer sessions, the direct participation of individual students is minimal. The more vocal scholars dominate the discussions while the rest are content to sit back and watch or are inhibited by the judgmental nature of their peers. Will being required to answer questions electronically and anonymously be of benefit to disengaged students?
This action research seeks to establish evidence for the use of clickers in a middle school science classroom to increase student engagement in science learning. Implementing the use of clickers in one of my 7th grade science classes and comparing it to a class without clickers will demonstrate whether or not they are effective in increasing student engagement. Teacher observations will be recorded, students will be surveyed on their perceptions of interest level, and a posttest on the concepts taught will be administered to each class to determine the impact clickers have on middle school students’ learning. Does the use of clickers in the middle school science classroom lead to increased student engagement and improved learning?
Review of Related Literature
Clickers have been in common use in the university classroom since the early 2000s to help instructors integrate active learning into large lecture classes. This technology is known by many names, such as personal response systems, classroom response systems, student response systems, audience response systems, zappers, or simply clickers (White, Delaney, Syncox, Akerberg, & Alters, 2011). Instructors use software and a wireless receiver to collect instant responses from students. Students submit responses through individual hand-held keypads by pushing numbered buttons corresponding to multiple-choice questions posed by the instructor. Class responses are displayed anonymously on a screen, and a histogram is created as an object for discussion. Instructors can privately review the responses of each student to see who understands the concept and who needs more help. They can also see which students are not responding at all, so no one can get away with being a passive participant.
The main purpose for incorporating clickers is to increase opportunities for active learning through student participation in lectures, as demonstrated by Addison, Wright, and Milner (2009). Their research, as well as that of others, has mostly shown a positive correlation between clicker use and increased engagement of students in the content. “With clickers, the majority of students participated more in lectures and reported that they perceived an improvement in understanding and their performance on exams” (Addison et al., 2009, p. 89). In large physiology lecture classes, Gauci, Dantas, Williams, and Kemm (2008) noted that active lectures were found to increase both student motivation and engagement. Eighty-nine percent of student respondents thought that the use of clickers motivated them to think, while eighty-three percent agreed that they were more engaged and interested (Gauci et al., 2008). This was also true in introductory psychology courses, where instructors found that “an ARS (audience response system) is a great pedagogical tool to demonstrate a concept and encourage active engagement, open discussion, personal reflection, and learning” (Micheletto, 2011, p. 11). In Micheletto’s (2011) study, students were directly involved in mini psychology experiments by responding to survey questions with their clickers. They saw the results on the screen instantly, which gave them immediate feedback and fostered lively discussions. Students’ reflection on their experience in the experiment led to deeper engagement and promoted critical thinking. In all of these studies, it is clear that clickers increased student interest and promoted more active participation.
Some of the very same research seems to contradict these findings, however. It is not so much that these studies contradict the finding of increased engagement, but rather that increased engagement does not necessarily lead to more successful learning. In Addison et al.’s 2009 study, high student ratings of increased interest and engagement barely correlated with higher exam scores. The better scores proved not to be statistically significant and occurred only with the high-achieving students, whose scores were compared to those of similar students in the same course who did not use clickers. The middle- and low-achieving students achieved no gains in test scores even after reporting increased engagement and interest. “We believe these findings have important implications for the introduction of technology into a learning environment, as a specific technology or activity will not necessarily benefit all students or even those that need it most” (p. 90). The study performed by Gauci, Dantas, Williams, and Kemm (2008) also showed inconsistent results on exam scores from clicker classes when comparing students of different achievement levels. Compared to the scores they received in a prerequisite physiology course, low-achieving students actually scored higher than both middle- and high-achieving students in the subsequent physiology course that incorporated clickers. Test scores are the most basic way to measure learning, and in these cases the clickers did not appear to relate to significant gains.
The studies that included data from surveys of student satisfaction and perception were not all positive. Morgan’s (2008) research in five college psychology classes of twenty-five to fifty students assessed the effects of clickers on attrition, grades, and student satisfaction. Even though the findings were not statistically significant, attrition was higher and grades were lower in classes that used clickers. The data showed the following:
70% of students enjoyed clickers
42% enjoyed the anonymity of clickers
65% did not like paying for clickers
74% reported that clickers interfered with discussion (p. 34).
All of these clicker studies admit to unavoidable flaws in their designs in that they could not control for all factors. How experienced the instructors were with the use of clickers, how the clickers were used in the classroom, the types of questions asked, and the allowance (or not) of time for students to discuss responses all varied. Addison et al. (2009) recognized that “effective teaching methods and question design are critical factors in the successful use of this technology” (p. 90). Because students in large classes are less likely to be given opportunities to engage in active learning, Morgan (2008) recommends “to only use clickers in large classes where more personal means of interaction might be problematic” (p. 35).
When comparing two active learning methods, clickers and class discussion, Martyn (2007) concluded that clickers benefit both students and instructors. Clickers benefit students because their answers are anonymous, removing the fear of public disclosure, and because clickers create a game atmosphere; the class using clickers enjoyed the process and became more engaged. Clickers benefit instructors because students are engaged for the entire class period and the level of understanding of each student can be continually gauged. Students receive immediate feedback, and the clickers, if used properly, promote understanding rather than recall.
Questions remain about the effectiveness of classroom clickers in engaging students and increasing learning. The majority of research has been conducted at the university level, and the results are mixed. Several teachers at my middle school have used clickers in the past but for various reasons have stopped. The aim of my action research was to find out if clickers increase the engagement of my middle school science students. It was my hope that better engagement would lead to increased learning as well, through effective questioning strategies and required participation of all students in the class.
Research Methods
A middle school science teacher for 15 years, I currently teach five heterogeneous classes of public middle school science in suburban Northern California: two classes of 8th graders and three of 7th graders. Over the years, I have noticed that my students have become less and less engaged in the subject matter. Their boredom has led to everything from poorer quality of work to significant difficulties with self-control, which leads to behavior problems. Adding a technology component by way of clickers, or personal response devices, is an attempt to increase my students’ engagement and perhaps lead to improved learning. Today’s students are accustomed to using technology in multiple ways, so introducing clickers should tap into these skills. They are also still kids, so the clickers should add a sense of fun to lessons and make a game out of learning. It is hoped that the technology will appeal to them and increase engagement, but will this also lead to increased learning?
To test this idea, I chose my 7th grade B period as the clicker class and my 7th grade D period as the control class for this study. These two classes have the most similar demographics and mean test percentages, contain equal numbers of outspoken students who dominate class discussions, and have nearly equal numbers of students whom I have identified through informal observation as disengaged most of the time. The following table illustrates the similarities between these classes.
Class Comparison
B Period D Period
Male/Female 52%/48% 58%/42%
Minority Students 45% 45%
Special Needs Students 62% 67%
Mean 4th Quarter Science Test Scores 82% 82%
Proficient or Advanced in ELA on State Test 69% 70%
Proficient or Advanced in Math on State Test 65% 58%
Proficient or Advanced in Science on State Test 72% 70%
Number of Active Participants 6 6
Number of Non-Participants 9 12
B period was chosen as the clicker class simply because it has 29 students enrolled, as opposed to 33 in D period, and only 32 clickers were available.
Classroom Performance System Pulse clickers (eInstruction®, 2000) were chosen for the study because they were already available at my school and I had received some basic training on how to use them two years previously. Identical PowerPoint lessons, with multiple-choice and yes/no questions embedded periodically, were presented over six days in both the clicker class and the control class. Each student in the experimental class was assigned a numbered clicker and pushed a button indicating an answer to each question. A radio signal was sent to a hub on the teacher’s computer, and each student’s answer was recorded individually. Responses were confidential but tallied as a class, so students could immediately see the entire class’s anonymous results on the screen. After the lesson, the teacher could review each student’s individual responses.
The unit chosen for the study was on the electromagnetic spectrum and the interaction of light with matter. In my experience, this topic has proven challenging to teach because 7th grade students have difficulty engaging with material that has little perceived relevance to their immediate lives. The following California State Standards were covered:
7.6 Physical principles underlie biological structures and functions.
a. Students know visible light is a small band within a very broad electromagnetic spectrum.
b. Students know that for an object to be seen, light emitted by or scattered from it must be detected by the eye.
c. Students know light travels in straight lines if the medium it travels through does not change.
d. Students know how simple lenses are used in a magnifying glass, the eye, a camera, a telescope, and a microscope.
e. Students know that white light is a mixture of many wavelengths (colors) and that retinal cells react differently to different wavelengths.
f. Students know light can be reflected, refracted, transmitted, and absorbed by matter.
g. Students know the angle of reflection of a light beam is equal to the angle of incidence.
Both the clicker and control classes received the same direct instruction, question and answer sessions, hands-on activities, and reflection homework assignments during the two-week study period. The questioning sessions required multiple-choice or yes/no responses. The only difference in methodology was that the experimental group used clickers during the question and answer sessions while the control group responded to the same questions with hand signals or orally when randomly called on.
Data Collection
To allow for triangulation, three types of data were collected during this action research project: teacher observations, student surveys, and a unit posttest. The first two data collection methods were qualitative and involved only the clicker class. The quantitative posttest was chosen to compare the experimental and control groups’ mastery of the concepts taught, so both groups were tested.
Data Collection Matrix
Research Question: Does the use of clickers in middle school science lead to increased engagement and improved learning?
Data Source 1: Teacher observation of students’ comments and responses to questions.
Data Source 2: Student survey of their perceptions, using a Likert scale and comments.
Data Source 3: Posttest of 15 multiple-choice questions based on the standards taught.
The first source of data consisted of observations of student comments and behaviors while the clickers were in use. These were written down by the teacher as the question and answer sessions were conducted with the clickers during the six lessons. After each session, the individual student responses to the clicker questions were also printed out. These responses were used to determine what percentage of students chose the correct answer for each question. This method of formative assessment helped determine what students were learning and informed how the teacher adjusted the lesson.
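For illustration, the per-question tally described above can be reproduced with a short script. The following is a minimal sketch, assuming hypothetical response records keyed by clicker number and an invented answer key; the CPS software itself produced equivalent per-question summaries in its printed reports.

```python
# Minimal sketch of the per-question tally; the answer key and responses
# below are hypothetical placeholders, not the actual class data.

answer_key = {1: "B", 2: "D", 3: "A"}  # question number -> correct choice

# responses[clicker_number][question_number] = choice the student entered
responses = {
    7:  {1: "B", 2: "D", 3: "C"},
    12: {1: "B", 2: "A", 3: "A"},
    21: {1: "C", 2: "D", 3: "A"},
}

for question, correct in answer_key.items():
    answered = [r[question] for r in responses.values() if question in r]
    n_correct = sum(1 for choice in answered if choice == correct)
    percent = 100 * n_correct / len(answered)
    print(f"Question {question}: {n_correct}/{len(answered)} correct ({percent:.0f}%)")
```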
The second data source was the student survey given the day after the last session and before the posttest was administered. Three basic questions were asked to determine student perceptions of the use of clickers during their science lessons. A Likert-type scale was used, and students were asked to circle their responses and then provide an explanation. The survey questions were the following:
1. How did you feel about using clickers in class?
1 2 3 4 5
Hated it Didn’t like it It was ok Liked it Loved it
2. Do you think the clickers helped you learn?
1 2 3 4 5
Not at all I’m not sure I think so Yes, somewhat Yes, definitely
3. Do you think the clickers helped you pay attention in class?
1 2 3 4 5
Not at all Not really Sometimes More than usual All the time
The final data source was the posttest given to both the clicker class and the control class. The test questions were chosen from textbook samples as well as teacher-created items. An identical posttest was given to both classes and was used as a quantifiable comparison between them. Before the use of clickers, both classes had identical mean test scores, so the results of the posttest should demonstrate whether or not there was a difference in how well students learned the material with clickers compared to without them.
Data Analysis
With the first data source, teacher observations, comments and behaviors were sorted into three categories. The participation category included observations of the number of students using the clickers during the question and answer sessions and how they participated. Were they reluctant or enthusiastic, for example? The fun factor category consisted of comments from students and their actions indicating how much they seemed to enjoy using the clickers. Both of these categories contributed to the engagement component of the research. The third category incorporated the actual responses to the questions posed during the lesson. The questions used to determine previous knowledge were separated from those used to check for understanding of the lesson. The percentages of correct answers on the lesson-based questions were compiled for each session to determine whether or not students were learning the material presented.
Student surveys were the second data source, and each question was analyzed separately. The numbers circled for each question were tallied to determine the frequency of each response. The higher numbers (4 or 5) indicated positive responses, the lower numbers (1 or 2) were considered negative, and the middle number (3) was neutral. Comments students wrote for each question were divided into negative and positive groups. The survey results were treated as student perceptions of engagement and learning during the use of clickers.
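As a concrete sketch of this tallying step, the following snippet counts hypothetical circled responses for one survey question and groups them as described above; the sample data are invented, not the actual survey results.

```python
from collections import Counter

# Hypothetical circled responses (1-5) for one survey question.
responses = [5, 4, 3, 4, 2, 5, 3, 4, 1, 4]

freq = Counter(responses)     # frequency of each response value
positive = freq[4] + freq[5]  # 4 or 5 -> positive
neutral = freq[3]             # 3 -> neutral
negative = freq[1] + freq[2]  # 1 or 2 -> negative

total = len(responses)
print(f"Positive: {positive}/{total} ({100 * positive / total:.0f}%)")
print(f"Neutral:  {neutral}/{total} ({100 * neutral / total:.0f}%)")
print(f"Negative: {negative}/{total} ({100 * negative / total:.0f}%)")
```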
The final data source, the multiple-choice posttest, was given after completion of the unit. Mean, median, and mode scores of the clicker class and the control class were calculated to compare how well each class learned the standards taught. It was assumed that the higher a class’s average posttest score, the more the students learned.
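These summary statistics are simple to compute. The sketch below uses invented raw scores (number correct out of the 15 posttest questions) to show the calculation and the conversion of the mean to a percentage; the actual class results appear in the Findings section.

```python
import statistics

# Hypothetical raw posttest scores (correct answers out of 15 questions);
# these are placeholders, not the actual class data.
scores = [11, 12, 9, 11, 13, 10, 11, 14, 12, 8]

mean_percent = 100 * statistics.mean(scores) / 15  # mean as a percentage
median = statistics.median(scores)                 # middle raw score
modes = statistics.multimode(scores)               # most frequent score(s)

print(f"Mean: {mean_percent:.1f}%")
print(f"Median: {median:g} out of 15")
print(f"Mode: {', '.join(str(m) for m in modes)} out of 15")
```

Because a class can have more than one most-frequent score, the mode row in the results table below can list two values, as it does for the control class.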
Findings
The findings of this action research show an increase in student engagement while clickers were used during the lessons, as observed by the teacher and perceived by the students. However, based on unit posttest scores, this did not seem to correspond to improved learning. The observations collected in the clicker class were divided into three categories: participation, fun factor, and evidence of learning. Participation during question and answer sessions was 100% for the students present. Questions were embedded in the PowerPoint slide show lessons. Some tested previous knowledge, but most checked for understanding as the material was presented. Each student was required to enter a response, and the wait time was up to one minute but often much shorter. Each question was locked by the teacher once all the students’ responses were entered. The same four students seemed to be the last to enter their responses, and interestingly, three of the four had been previously identified as disengaged. Several students benefited from the wait time when they discovered that they could change their answers if they changed their minds. Because their numbers flashed red when they changed answers, multiple students were observed changing their answers during each session. On the first day a student commented, “I’m not going to answer because I don’t know.” When told that he could guess and no one but the teacher would know if he was right or wrong, he was visibly relieved and didn’t hesitate to participate again. Many students commented on the anonymous nature of the clicker responses. “I like this. No one knows your answer,” was a common type of comment. Because of full participation, all students appeared to be engaged while the clickers were in use.
Many observations were directly related to how much fun students were having while using the clickers. They eagerly retrieved their clickers at the beginning of each class and were instantly ready to use them when a question appeared on the screen. Their comments included the following:
“These are cool!”
“Can we do more?”
“We should do this for the rest of the year.”
Cheers were routinely heard when the screen with the correct answer was revealed, along with how many students chose each response. After noticing that their numbers changed from light to dark blue when they entered a response, several students in the class made a game out of seeing who could be the first to respond. A common refrain with the easier questions was “Hah! I was the first one to click in!” This game seemed to trivialize the questions and, judging by their facial expressions, distracted and frustrated several students. The fun factor appeared to increase engagement on the surface, but some students clearly were more engaged in the game than in the concepts they were supposed to be learning.
Analysis of the responses after each session suggested that most students understood the material presented. Previous knowledge questions received a predictable variety of responses, from 32% to 76% correct answers. The questions that checked for understanding resulted in 70% to 93% correct responses. Learning seemed to be taking place during the lessons when clickers were used, but these observations are limited to the clicker class only. Learning was assumed to be taking place in the control class as well.
The second point of the triangulated data consisted of a student survey filled out by all 29 students who used the clickers during the action research. The first question asked how they felt about using the clickers in class. Eighty-three percent of the students responded that they liked or loved using them, none indicated that they hated or disliked the clickers, and the remaining 17% felt that they were ok. Thirteen comments indicated that students were having fun with the clickers or found them easy to use. Eight other students made specific comments about how they appreciated the anonymity of the clicker responses: “I wasn’t publicly embarrassed if I got an answer wrong.” This may have led to increased engagement because students had positive experiences with the clickers.
Survey question 2 asked students if they thought the clickers helped them learn. Sixty-six percent of students responded “yes, somewhat” or “yes, definitely.” Seven students commented that they learned more because of the immediate feedback: “You immediately got to see if the answer was right or not.” “It helped me understand the lesson better by knowing what I did wrong.” Four students recognized that the mandatory responses forced them to focus on the lesson: “It made everyone click in and that helped us learn.” The other 34% of the class weren’t sure or felt that the clickers did not help them learn: “It was hard to concentrate.” “It went too fast.” The perception of two-thirds of the class was that they learned more using clickers than they would have without them.
The third question on the survey asked if students thought the clickers helped them pay attention in class. Fifty-five percent felt that the clickers helped them pay attention more than usual or all the time, while 34% disagreed. “My attention was focused on the clickers and not talking.” “…because I wanted to know if I got the answers right or not.” Those who disagreed commented that they already paid attention in class without clickers or that the clickers were distracting: “Many people were playing with them and just wanted to get their answers in first.” Although most students felt that they were more engaged, a significant number did not find the clickers helpful, and some even noted that the clickers drew their attention away from the material they were supposed to learn.
The final point in the triangulation of data was the posttest, which was meant to measure how much students using clickers learned versus students without clickers. Although teacher and student perception was that engagement in the clicker class was high, and this should naturally lead to improved learning, the posttest showed surprising results. The only difference between the two classes in the lessons leading up to the posttest was the use of clickers in one and not the other. Both received the same direct instruction, labs, homework assignments, and question and answer sessions. The clicker class responded to questions using the clickers, while the control class responded when called on randomly or as a class using hand signals. The results of the test scores are shown in the following table.
Test Scores Clicker Class Control Class
Mean 71.3% 77.3%
Median (correct out of 15) 11 12
Mode (correct out of 15) 11 11, 12
This six-percentage-point difference in mean scores in favor of the control class, as well as the control class’s higher median and mode scores, suggests that the clickers may actually have resulted in less learning than would have occurred without them.
Students appeared to be more engaged, and many felt that the clickers helped them pay attention and learn. Yet, as one perceptive student noted, “It felt more like a game rather than we were learning.” The fun factor may have increased students’ engagement, but perhaps they were paying more attention to the technology than to the concepts the technology was meant to help them learn.
This action research demonstrated that clickers are an effective tool in a middle school science class for increasing participation and adding a pleasurable element to a lesson. All of my students reportedly enjoyed using the clickers to some extent. The clickers were a fun and novel way to participate without the threat of public humiliation if a student answered incorrectly. However, the use of clickers may have been too much fun, to the point of distraction. My students’ scores on the unit test were quite low considering the level of engagement they demonstrated in class. I would use the clickers again to ensure 100% participation and to receive timely feedback during my lessons. Also, if they were used for a longer period, I believe the novelty would wear off and the clickers would become less of a toy and more of a tool. Then perhaps clickers would increase engagement and lead to improved learning as well.
Implications
Conducting this research has given me some insight into the relationship between making learning fun and making learning meaningful. The clickers were definitely fun for my students. I have always assumed that the interest level of my students would correspond with the amount they learn. The students were engaged when using the clickers, so I expected them to do well on the unit test. This was not the case, so now I need to find out why. I would like to find out whether the clickers would be more effective as a tool for learning if they were used for a longer period of time. Perhaps the novelty of the tool needs to wear off before more data are gathered.
Presenting my findings to interested faculty members is the next step. All teachers want to maximize participation and engagement of their students, and clickers are just one tool that can do this. Many of my colleagues have considered using clickers and would benefit from the knowledge I’ve gained through this research. One of the drawbacks to using clickers is the time it takes to learn the technology and prepare the questions. Another difficulty is ensuring that the questions and the choices of answers are effectively written. Through collaboration, we could create question banks to limit the time each teacher spends on preparation and ensure a variety of high-quality questions and responses. Then, through long-term use of clickers by students in multiple classes, student learning could be re-evaluated. Clickers are one of many tools that can be implemented to improve our students’ learning experiences. Increasing participation and enjoyment are valid reasons to introduce clicker technology into the middle school science classroom.
References
Addison, S., Wright, A., & Milner, R. (2009). Using clickers to improve student engagement and performance in an introductory biochemistry class. Biochemistry and Molecular Biology Education, 37(2), 84-91. doi:10.1002/bmb.20264
eInstruction. (2000). CPS 6.0 training videos, K12. Retrieved from http://legacy.einstruction.com/support_downloads/training/CPS6/K12.html
Gauci, S. A., Dantas, A. M., Williams, D. A., & Kemm, R. E. (2008). Promoting student-centered active learning in lectures with a personal response system. Advances in Physiology Education, 33(1), 60-67. doi:10.1152/advan.00109.2007
Holt California Life Science. (2007). Light and living things (pp. 72-97).
Katch, M. (1988). Acting out adolescents: The engagement process. Child and Adolescent Social Work Journal, 5(1), 30-40. doi:10.1007/BF00757470
Martyn, M. (2007). Clickers in the classroom: An active learning approach. Educause Quarterly, 30(2), 71-74. Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/ClickersintheClassroomAnActive/157458
Micheletto, M. J. (2011). Conducting a classroom mini-experiment using an audience response system: Demonstrating the isolation effect. The Clute Institute Journal of College Teaching and Learning, 8(8), 1-14.
Morgan, R. (2008). Exploring the pedagogical effectiveness of clickers. Insight: A Journal of Scholarly Teaching, 3, 31-36.
White, P., Delaney, D., Syncox, D., Akerberg, O. A., & Alters, B. (2011). Clicker implementation models. Educause Quarterly, 34(4). Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/ClickerImplementationModels/242688
Karen Morton, Touro University California School of Education, Vallejo, California 2012
Abstract
Students in middle school science often lack engagement in their learning. This action research projects attempted to demonstrate whether or not the use of clickers would increase engagement and therefore, improve learning. During a two week study, two similar classes were compared with one using clickers to respond to questions presented during lessons while the other responded to the same questions orally or with hand signals. Data was collected using teacher observations, student surveys and a post-test on the concepts taught during the unit. Findings indicated that clickers did in fact increase engagement; however post-test results showed that students using clickers learned less than those without clickers.
Introduction and Statement of the Problem
One of the biggest challenges public middle school teachers face is the lack of engagement of students. Not only is this time of life rife with internal hormonal changes for students, but it is full of social upheaval as well. How they are viewed by their peers drives adolescent decision making, so often what they wear is far higher on their priority lists than what they are studying in school. The state driven science content standards in 7th and 8th grade require students to learn a multitude of facts that have little perceived relevance to their daily lives. The history of the Earth, the difference between speed and velocity, and reproduction of plants are concepts to which few middle school students can directly relate. As teachers, we do our best to point out how knowing this information will help them in the future but frankly, the blank stares, disruptive behavior and lack of quality work are obvious signs that students are not engaged.
Even high achieving students who aim to please and have excellent work ethics seem to be disengaged from much of the science content and rarely choose to pursue the more challenging options many teachers provide. They are cooperative and will complete all assignments with the goal of earning perfect scores, but are they just going through the motions or do they actually care enough about the topic to fully absorb and apply it to new situations? These students are experts at learning teacher expectations and will participate fully in class but unless the teacher is tapping into their higher level cognitive abilities, they get bored and will mentally check out.
The lowest achieving students are obviously the most difficult to engage. By the time they reach middle school, they have a long record of just squeaking by academically and are way behind in the acquisition of many basic skills. They have learned to cope by completing the bare minimum of work, copying from classmates or giving up altogether and becoming behavior problems. After all, to many adolescents, it’s better to appear bad than stupid so they act out to cover their deficiencies (Katch, 1988, p.34). Many will keep quiet in class, hoping the teacher won’t call on them and are resigned to fail. With the rising class sizes public schools face, it is way too easy for many students to fall through the cracks. Often times these students have the ability to succeed academically but due to home life or social turmoil, they are in survival mode and doing well in school is a low priority. How can teachers better engage these students so they are more successful in school?
How can teachers engage all students so they have the opportunity to reach their individual potentials? Many teachers face this problem daily and one area that has been explored in many ways is the use of technology in the classroom. Personal response systems, or clickers, are commonly used in college level courses to involve all students and force them to interact in large classrooms (Morgan, 2008). In my five middle school classes of thirty or more students, it is easy for many to remain passive learners. Even with noble attempts to call on each student daily during class question and answer sessions, the direct participation of individual students is minimal. The more vocal scholars dominate the discussions while the rest are content to sit back and watch or are inhibited by the judgmental nature of their peers. Will being required to answer questions electronically and anonymously be of benefit to disengaged students?
This action research seeks to establish evidence for the use of clickers in a middle school science classroom to increase student engagement in science learning. Implementing the use of clickers in one of my 7th grade science classes and comparing it to a class without clickers will demonstrate whether or not they are effective in increasing student engagement. Teacher observations will be recorded, students will be surveyed on their perceptions of interest level, and a post test on the concepts taught will be administered to each class to determine the impact clickers have on middle school students’ learning. Does the use of clickers in the middle school science classroom lead to increased student engagement and improved learning?
Review of Related Literature
Clickers have commonly been in use in the university classroom since the early 2000s to help instructors integrate active learning into large lecture classes. This technology is known by many names, such as personal response systems, classroom response systems, student response systems, audience response systems, zappers, or simply, clickers (White, Delaney, Syncox, Akerberg, & Alters, 2011). Instructors use software and a wireless receiver to collect instant responses from students. Students submit responses through individual hand-held keypads by pushing numbered buttons based on multiple choice questions posed by the instructor. Class responses are displayed anonymously on a screen and a histogram is created as an object for discussion. Instructors can privately review the responses of each student to see who understands the concept and who needs more help. They can also see which students are not responding at all so no one can get away with being a passive participant.
The main purpose for incorporating clickers is to increase opportunities for active learning through student participation in lectures as demonstrated by Addison, Wright, & Milner (2009). Their research, as well as that of others, has mostly shown a positive correlation between clicker use and increased engagement of students in the content. “With clickers, the majority of students participated more in lectures and reported that they perceived an improvement in understanding and their performance on exams” (Addison et al., 2009, p.89). In large physiology lecture classes, Gauci, Dantas, Williams, & Kemm (2008), noted that active lectures were found to increase both student motivation and engagement. Eighty-nine percent of student respondents thought that the use of clickers motivated them to think while eighty three percent agreed that they were more engaged and interested (Gauci et al., 2008). This was also true in introductory psychology courses where instructors found that “an ARS (audience response system) is a great pedagogical tool to demonstrate a concept and encourage active engagement, open discussion, personal reflection, and learning” (Michelleto, 2011, p.11) In Michelleto’s (2011) study, students were directly involved in mini psychology experiments by responding to survey questions with their clickers. They saw the results on the screen instantly which gave them immediate feedback and fostered lively discussions. Students’ reflection of their experience in the experiment led to deeper engagement and promoted critical thinking. In all of these studies it is clear that clickers increased student interest and more active participation.
Some of the very same research seems to contradict these findings, however. It is not so much that they contradict the increased engagement in student learning but rather the increased engagement does not necessarily lead to more successful learning. In Addison et al.’s 2009 study, high student ratings in increased interest and engagement barely correlated with higher exam scores. These better scores proved to be statistically insignificant and only occurred with the high achieving students. Their scores were compared to similar students in the same course that did not use clickers. The middle and low achieving students achieved no gains in test scores even after reporting increased engagement and interest. “We believe these findings have important implications for the introduction of technology into a learning environment, as a specific technology or activity will not necessarily benefit all students or even those that need it most” (p.90). The study performed by Gauci, Dantas, Williams, and Kemm (2008) also showed inconsistent results on exam scores from clicker classes when comparing students of different achievement levels. Compared to scores they received in a prerequisite physiology course, low achieving students actually scored higher than both middle achieving and high achieving students in the subsequent physiology course that incorporated clickers. Tests scores are the most basic way to measure learning and in these cases, the clickers did not appear to relate to significant gains.
The studies that included data from surveys of student satisfaction and perception were not all positive. Morgan’s (2008) research in five college psychology classes of twenty-five to fifty students focused on assessing whether clickers reduced attrition, grades, and student satisfaction. Even though the findings were not statistically significant, attrition was higher and grades were lower in classes that used clickers. The data showed the following:
70% of students enjoyed clickers
42% enjoyed the anonymity of clickers
65% did not like paying for clickers
74% reported that clickers interfered with discussion (p.34).
All of these clicker studies admit to unavoidable flaws in the designs in that they could not control for all factors. How experienced the instructors were with the use of clickers, how they were used in the classroom, the type of questions asked, and the allowance (or not) of time for students to discuss responses varied. Addison et al., (2009) recognized that “effective teaching methods and question design are critical factors in the successful use of this technology” (p.90). Because students in large classes are less likely to be given opportunities to engage in active learning, Morgan (2008) makes the recommendation “to only use clickers in large classes where more personal means of interaction might be problematic” (p.35).
When comparing two active learning methods, clickers and class discussion, Martyn (2007) drew conclusions that clickers benefit both students and instructors. Clickers are better for students because their answers are anonymous so the fear of public disclosure is removed and clickers create a game atmosphere so the class using clickers enjoyed the process and became more engaged. Clickers are better for instructors because students are engaged the entire class period and the level of understanding of each student can be continually gauged. Students receive immediate feedback and the clickers, if used properly, promote understanding rather than recall.
Questions remain on the effectiveness of clickers in the classroom in engaging students and increasing learning. The majority of research has been conducted at the university level and the results are mixed. Several teachers at my middle school have used clickers in the past but for various reasons have stopped. The aim of my action research was to find out if clickers increase the engagement of my middle school science students. It was my hope that better engagement would lead to increased learning as well through effective questioning strategies and required participation of all students in the class.
Research Methods
A middle school science teacher for 15 years, I currently teach five heterogeneous classes of public middle school science in suburban Northern California, two classes of 8th graders and three of 7th graders. Over the years, I have noticed that my students have become less and less engaged in the subject matter. Their boredom has led to everything from poorer quality of work produced to significant difficulties with their self control which leads to behavior problems. Adding a technology component by the way of clickers, or personal response devices, is an attempt to increase my students’ engagement and perhaps lead to improved learning. Today’s generation of students are accustomed to using technology in multiple ways so introducing clickers should tap into these skills. They are also still kids so the clickers should also add a sense of fun to lessons and make a game out of learning. It is hoped that the technology will appeal to them and increase engagement but will this also lead to increased learning?
To test this idea, I chose my 7th grade B period as my clicker class and my 7th grade D period as my control class for this study. These two classes have the most similar demographics, mean test percentages, both contain equal numbers of outspoken students who dominate class discussions, and nearly equal numbers of students that I have identified through informal observation as disengaged most of the time. The following table illustrates the similarities between these classes.
Class Comparison
B Period D Period
Male/Female 52%/48% 58%/42%
Minority Students 45% 45%
Special Needs Students 62% 67%
Mean 4th Quarter Science Test Scores 82% 82%
Proficient or Advanced in ELA on State Test 69% 70%
Proficient or Advanced in Math on State Test 65% 58%
Proficient or Advanced in Science on State Test 72% 70%
Number of Active Participants 6 6
Number of Non-Participants 9 12
B period was chosen as the clicker class simply because there are 29 students enrolled as opposed to 33 in D period and there are only 32 clickers available.
Classroom Performance System Pulse clickers (eInstruction® 2000) were chosen for the study because they were already available at my school and I had received some basic training on how to use them two years previously. Identical PowerPoint lessons were presented for six days in both the clicker class and the control class with multiple choice and yes/no questions embedded periodically. Each student in the experimental class was assigned a numbered clicker and pushed a button indicating their answer to each question. A radio signal was sent to a hub on the teacher’s computer and each student’s answer was recorded individually. Students were able to see the results of the entire class’s anonymous responses on the screen immediately. Student answers were confidential but tallied as a class. After the lesson the teacher was able to see individual student responses.
The unit chosen for the study was on the electromagnetic spectrum and interaction of light with matter. This is a topic that has proven in past experience to be challenging to teach because 7th grade students have difficulty engaging in a topic with little perceived relevance to their immediate lives. The following California State Standards were covered:
7.6 Physical principles underlie biological structures and functions.
a. Students know visible light is a small band within a very broad electromagnetic spectrum.
b. Students know that for an object to be seen, light emitted by or scattered from it must be detected by the eye.
c. Students know light travels in straight lines if the medium it travels through does not change.
d. Students know how simple lenses are used in a magnifying glass, the eye, a camera, a telescope, and a microscope.
e. Students know that white light is a mixture of many wavelengths (colors) and that retinal cells react differently to different wavelengths.
f. Students know light can be reflected, refracted, transmitted, and absorbed by matter.
g. Students know the angle of reflection of a light beam is equal to the angle of incidence.
Both the clicker and control classes received the same direct instruction, question and answer sessions, hands on activities, and reflection homework assignments during the two week study period. The questioning sessions required multiple choices or yes/no types of responses. The only difference in methodology was that the experimental group used clickers during the question and answer sessions while the control group responded to questions with sign language or orally when randomly called on.
Data Collection
To allow for triangulation, three types of data were collected during this action research project, teacher observation, student surveys, and a unit posttest. The first two qualitative data collection methods involved only the clicker class. The quantitative posttest was chosen to compare the experimental and control groups’ mastery of the concepts taught so both groups were tested.
Data Collection Matrix
Research Question
Does the use of clickers in middle school science lead to increased engagement and improved learning?
Data Source 1
Teacher observation of student's comments and responses to questions.
Data Source 2
Student survey of their perceptions with Likert scale and comments.
Data Source 3
Post-test of 15 multiple choice questions based on the standards taught.
The first source of data compiled consisted of observations of student comments and behaviors while the clickers were in use. These were written down by the teacher as the question and answer sessions were conducted with the clickers during the six lessons. After each session, the individual student responses to the clicker questions were printed out as well. These responses were used to determine what percentage of students chose the correct answer for each question. This method of formative assessment helped determine what students were learning and how the teacher adjusted the lesson.
The second data source was the student survey given the day after the last session and before the posttest was administered. Three basic questions were asked to determine student perceptions of the use of clickers during their science lessons. A Likert type scale was used and students were asked to circle their responses and then provide an explanation. The survey questions were the following:
1. How did you feel about using clickers in class?
1 2 3 4 5
Hated it Didn’t like it It was ok Liked it Loved it
2. Do you think the clickers helped you learn?
1 2 3 4 5
Not at all I’m not sure I think so Yes, somewhat Yes, definitely
3. Do you think the clickers helped you pay attention in class?
1 2 3 4 5
Not at all Not really Sometimes More than usual All the time
The final data source was the posttest given to both the clicker class and the control class. The test questions were chosen from textbook samples as well as teacher created items. An identical posttest was given to both the clicker class and the control class and was used as a quantifiable comparison of the two classes. Before the use of clickers, both classes had identical mean test scores so the results of the posttest should demonstrate whether or not there was a difference in how well students learned the material with clickers compared to no clickers.
Data Analysis
With the first data source, teacher observations, comments and behaviors were sorted into three categories. The participation category included observations of the numbers of students using the clickers during the question and answer session and how they participated. Were they reluctant or enthusiastic, for example? The fun factor category consisted of comments from students and their actions based on how much they seemed to enjoy using the clickers. Both of these categories contributed to the engagement component of the research. The third category incorporated the actual responses to the questions posed during the lesson. The questions used to determine previous knowledge were separated from those used to check for understanding of the lesson. The percentages of correct answers on the lesson based questions were compiled for each session to determine whether or not students were learning the material presented.
Student surveys were the second data source and each question was analyzed separately. The numbers circled for each question were tallied to determine the frequency of each response. The higher numbers (4 or 5) indicated positive responses, lower numbers (1 or 2) were considered negative while the middle (3) was neutral. Comments students wrote for each question were divided into groups of negative and positive responses. The survey results were considered as student perception of engagement and learning during the use of clickers.
The final data source, the multiple choice posttest, was given after completion of the unit. Mean, median and mode scores of the clicker class and control class were calculated to compare how well each class learned the standards taught. It was assumed that the higher the class average posttest score, the more the students learned.
Findings
The findings of this action research show an increase in student engagement while clickers were used during the lessons as observed by the teacher and perceived by the students. However, based on unit posttest scores, this did not seem to correspond to improved learning. The observations collected in the clicker class were divided into three categories; participation, fun factor, and evidence of learning. Participation during question and answer sessions was 100% for the students present. Questions were embedded into the Power Point slide show lessons. Some were testing previous knowledge but most were checking for understanding as the material was presented. Each student was required to enter a response and the wait time was up to one minute but was often much shorter. Each question was locked by the teacher once all the students’ responses were entered. The same four students seemed to be the last to enter their responses and interestingly, three of the four had been previously identified as disengaged. Several students benefited from the wait time when they discovered that they could change their answers if they changed their minds. Their numbers flashed red when they changed answers so multiple students were observed changing their answers during each session. On the first day a student commented, “I’m not going to answer because I don’t know.” When told that he could guess and no one but the teacher would know if he was right or wrong, he was visibly relieved and didn’t hesitate to participate again. Many students commented on the anonymous nature of the clicker responses. “I like this. No one knows your answer,” was a common type of comment. Because of full participation, all students appeared to be engaged while the clickers were in use.
Many observations were directly related to how much fun students were having while using the clickers. They eagerly retrieved their clickers at the beginning of each class and were instantly ready to use them when a question appeared on the screen. Their comments included the following:
“These are cool!”
“Can we do more?”
“We should do this for the rest of the year.”
Cheers were routinely heard when the screen with the correct answer was revealed, along with how many students chose each response. After noticing that their number changed from light to dark blue when they entered a response, several students in the class made a game out of seeing who could be the first to respond. A common refrain with the easier questions was “Hah! I was the first one to click in!” This game seemed to trivialize the questions and distracted and frustrated several students as observed by their facial expressions. The fun factor appeared to increase engagement on the surface but some students clearly were more engaged in the game than concepts they were supposed to be learning.
Analysis of the responses after each session suggested that most students understood the material presented. Previous-knowledge questions drew a predictably wide range of results, from 32% to 76% correct answers, while the questions that checked for understanding drew 70% to 93% correct responses. Learning seemed to be taking place during the lessons when clickers were used, but these observations are limited to the clicker class; learning was assumed to be taking place in the control class as well.
The second point of the triangulated data was a survey filled out by all 29 students who used the clickers during the action research. The first question asked how they felt about using the clickers in class: 83% of the students responded that they liked or loved using them, none indicated that they hated or disliked them, and the remaining 17% felt they were okay. Thirteen comments indicated that the clickers were fun or easy to use. Eight other students commented specifically on how they appreciated the anonymity of the clicker responses: “I wasn’t publicly embarrassed if I got an answer wrong.” These positive experiences may have contributed to increased engagement.
The second survey question asked students whether they thought the clickers helped them learn. 66% responded “yes, somewhat” or “yes, definitely.” Seven students commented that they learned more because of the immediate feedback: “You immediately got to see if the answer was right or not.” “It helped me understand the lesson better by knowing what I did wrong.” Four students recognized that the mandatory responses forced them to focus on the lesson: “It made everyone click in and that helped us learn.” The other 34% of the class weren’t sure or felt that the clickers did not help them learn: “It was hard to concentrate.” “It went too fast.” The perception of two-thirds of the class was that they learned more using the clickers than they would have without them.
The third survey question asked whether the clickers helped students pay attention in class. 55% felt the clickers helped them pay attention more than usual or all the time, while 34% disagreed. “My attention was focused on the clickers and not talking.” “…because I wanted to know if I got the answers right or not.” Those who disagreed commented that they already paid attention in class without clickers or that the clickers were distracting: “Many people were playing with them and just wanted to get their answers in first.” Although most students felt they were more engaged, a significant number did not find the clickers helpful, and some even noted that the devices drew their attention away from the material they were supposed to learn.
The final point in the triangulation of data was the posttest, which was meant to measure how much the students using clickers learned compared with the students without them. Although teacher and students alike perceived engagement in the clicker class to be high, which should naturally lead to improved learning, the posttest showed surprising results. The only difference between the two classes in the lessons leading up to the posttest was the use of clickers in one and not the other; both received the same direct instruction, labs, homework assignments, and question and answer sessions. The clicker class responded to questions using the clickers, while the control class responded when called on randomly or as a class using hand signals. The test results are shown in the following table.
Test Scores           Clicker Class    Control Class
Mean (percent)        71.3%            77.3%
Median (raw score)    11               12
Mode (raw score)      11               11, 12
This six-percentage-point difference in mean scores in favor of the control class, along with its higher median and mode, suggests that the clickers may actually have led to less learning than would have occurred without them.
Although students appeared to be more engaged, and many felt that the clickers helped them pay attention and learn, one perceptive student may have put a finger on the problem: “It felt more like a game rather than we were learning.” The fun factor may have increased students’ engagement, but perhaps they were paying more attention to the technology than to the concepts the technology was meant to help them learn.
This action research demonstrated that clickers are an effective tool for increasing participation and adding a pleasurable element to lessons in a middle school science class. All of my students reportedly enjoyed using the clickers to some extent; they offered a fun and novel way to participate without the threat of public humiliation for answering incorrectly. However, the clickers may have been so much fun that they became a distraction. My students’ scores on the unit test were quite low considering the level of engagement they demonstrated in class. I would use clickers again to ensure 100% participation and to receive timely feedback during my lessons. If used over a longer period, I believe the novelty would wear off and the clickers would become less of a toy and more of a tool; then, perhaps, they would increase engagement and lead to improved learning as well.
Implications
Conducting this research has given me some insight into the difference between making learning fun and making it meaningful. The clickers were definitely fun for my students, and I have always assumed that my students’ interest level would correspond with the amount they learn. The students were engaged when using the clickers, so I expected them to do well on the unit test. This was not the case, and now I need to find out why. I would like to know whether the clickers would be more effective as a learning tool if used for a longer period of time; perhaps the novelty needs to wear off before more data are gathered.
Presenting my findings to interested faculty members is the next step. All teachers want to maximize the participation and engagement of their students, and clickers are one tool that can do this. Many of my colleagues have considered using clickers and would benefit from the knowledge I’ve gained through this research. One drawback to using clickers is the time it takes to learn the technology and prepare the questions; another difficulty is ensuring that the questions and answer choices are effectively written. Through collaboration we could create question banks to limit the time each teacher spends on preparation and to ensure a variety of high-quality questions and responses. Then, through long-term use of clickers by students in multiple classes, student learning could be re-evaluated. Clickers are one of many tools that can be implemented to improve our students’ learning experiences, and increasing participation and enjoyment are valid reasons to introduce clicker technology into the middle school science classroom.
References
Addison, S., Wright, A., & Milner, R. (2009). Using clickers to improve student engagement and performance in an introductory biochemistry class. Biochemistry and Molecular Biology Education, 37(2), 84-91. doi:10.1002/bmb.20264
eInstruction. (2000). CPS 6.0 training videos K12. Retrieved from http://legacy.einstruction.com/support_downloads/training/CPS6/K12.html
Gauci, S. A., Dantas, A. M., Williams, D. A., & Kemm, R. E. (2009). Promoting student-centered active learning in lectures with a personal response system. Advances in Physiology Education, 33(1), 60-67. doi:10.1152/advan.00109.2007
Holt California Life Science. (2007). Light and living things (pp. 72-97). Holt, Rinehart and Winston.
Katch, M. (1988). Acting out adolescents: The engagement process. Child and Adolescent Social Work Journal, 5(1), 30-40. doi:10.1007/BF00757470
Martyn, M. (2007). Clickers in the classroom: An active learning approach. Educause Quarterly, 30(2), 71-74. Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/ClickersintheClassroomAnActive/157458
Michelletto, M. J. (2011). Conducting a classroom mini-experiment using an audience response system: Demonstrating the isolation effect. The Clute Institute Journal of College Teaching and Learning, 8(8), 1-14.
Morgan, R. (2008). Exploring the pedagogical effectiveness of clickers. Insight: A Journal of Scholarly Teaching, 3, 31-36.
White, P., Delaney, D., Syncox, D., Akerberg, O. A., & Alters, B. (2011). Clicker implementation models. Educause Quarterly, 34(4). Retrieved from http://www.educause.edu/EDUCAUSE+Quarterly/EDUCAUSEQuarterlyMagazineVolum/ClickerImplementationModels/242688