Research into Science Literacy
Why do so many students struggle in science? Why do students find it so challenging to develop a logical argument, provide solid evidence, and articulate a rational analysis of a phenomenon? Why do so many students fail to persist when faced with these challenges? Perhaps more importantly, as the world becomes increasingly dependent on technologies that necessitate broad based literacy skills, how do teachers help students to overcome these deficits?
These questions led to an initial action research plan. A preliminary review of the literature examined resiliency and buoyancy in secondary students, which eventually led the researcher to a larger investigation of the general lack of literacy skills among American students as a whole. The initial action research plan, however, narrowed the focus to an investigation of literacy scaffolds for secondary science students. The central hypothesis driving the action research plan was that if writing scaffolds targeted at improving secondary students’ literacy skills, such as constructing arguments, providing evidence, and writing analysis, are implemented, then students will learn to communicate more effectively while learning the content more deeply. The purpose of the initial study was to measure the effectiveness of peer review as a scaffold for secondary science student performance and to investigate the efficiency and efficacy of online writing, peer review, and assessment methods. The results showed that up to 35% of students who participated in the online peer review process demonstrated an increase in their writing scores of at least one performance band, as measured against the Knowledge and Thinking (Sciences) College Readiness Assessment (CRA) rubric, compared with students who did not participate in the peer review process. The peer review group also demonstrated a small increase in content test scores over the group that did not participate in peer review. This suggests that collaborative writing activities may help improve students' writing skills and content knowledge.
Background and Need
It is no secret that the United States is seeking to reform its ailing educational system. As the world becomes increasingly technologically and economically integrated, U.S. students have found themselves falling behind their global peers on nearly every measure. For example, 2012 PISA results indicate that 15-year-olds in the US ranked below average on every metric: 17th in reading, 20th in science, and 27th in mathematics ("PISA", 2012). This is of particular concern, as these same US students will find themselves competing throughout their adult careers with their peers on both a local and a global level. According to the World Economic Forum, the United States now ranks 5th in overall global competitiveness, due in part to the low quality of science, technology, engineering, and mathematics (STEM) education (Gurria, 2011). Meanwhile, the fastest growing and most lucrative careers, both locally and globally, are ones that require strong foundations in STEM education. Even in 2010, 16 of the 25 highest-paying jobs were STEM-based; those and other STEM-based occupations are projected to grow 17% over the next ten years (Langdon, 2011). Clearly, there exists a substantial need to improve US students’ educational outcomes, especially in the scientific disciplines.
President Obama spoke clearly on this issue, stating, "The nation that out-educates us today is going to out-compete us tomorrow" (Duncan, 2010). The president has challenged the educational community to improve student outcomes across the board, and specifically to bring US students’ PISA scores to the top ranks in science and mathematics. Improving the scientific and technological knowledge of students is a critical step. Equally important is strengthening the ability of US students to effectively communicate their STEM content knowledge. One major concern reported by the PISA board was the inability of US students to translate their content knowledge into real-world situations and effectively communicate solutions to real-world problems ("PISA", 2012). Not coincidentally, US employers report similar concerns about college graduates. According to a study commissioned by the American Association of Colleges and Universities, the top three skills employers seek in college graduates are 1) the ability to collaborate effectively, 2) a clear and grounded understanding of science and technology, and 3) the ability to write and communicate clearly (Hart, 2006). These needs, unmet in the era of No Child Left Behind, have pushed forward the state-led design and implementation of the Common Core State Standards (CCSS), which began in 2007 and have since been adopted by forty-three states, the District of Columbia, and a number of territories.
The purpose of the CCSS is to better prepare K-12 students for college and career, but the standards should also better prepare young people to engage in their communities and fulfill their civic responsibilities. Specifically, the CCSS call for students to demonstrate deeper knowledge, critical thinking, and analysis skills in real-world contexts and across multiple subjects. This integrated model of literacy means that reading and writing must be taught alongside content in every class (Pearson, 2010). The effects of this work are intended to become evident on a global scale; however, the work that goes into creating scientific literacy by necessity happens on a local level, class by class.
“Writing is thinking made manifest. If students cannot think clearly, they will not write well. So in this respect, writing is tangible evidence of critical thinking…” (Goldberg, 2014). Unfortunately, many students graduate from high school and college without the ability to write clearly and comprehensively on any subject (Khan, 2010; Pappalardo, 2010; Stewart, 2007; Taylor, 2003). However, if writing is a skill, then like all skills it must be taught, reinforced, and practiced. Traditionally, there has been little emphasis on writing skills outside of English classes. Science and mathematics teachers, in particular, have placed little to no importance on supporting students’ writing skills. The Common Core State Standards (CCSS) seek to change this, emphasizing not only learning to write, but also writing to learn. Embedded within the CCSS are the expectations that students should be able to use writing to recall and organize information, to provide and interpret evidence, and to analyze and build content knowledge across all content disciplines (Graham, Gillespie, & McKeown, 2013). Admittedly, teaching students to write in the scientific disciplines presents a unique challenge. With broad content requirements embedded into each class, it is difficult for many science teachers to accept the idea of taking time away from teaching core concepts in order to instruct students in writing. Another major hurdle faced by many science teachers is knowing exactly how to support writing; indeed, many teachers report feeling ill-equipped when faced with incorporating writing instruction into their classes (Kiuhara, Graham, & Hawken, 2009; Pearson, 2010). In order for teachers to overcome these challenges, effective writing scaffolds and time-efficient assessment methods must be made available to them. The purpose of this study is to measure the effectiveness of peer review on secondary science student performance, and to investigate the efficiency and efficacy of online writing and assessment methods.
Literature Review
The foundation of this study lies ultimately in the social construction of knowledge. Lev Vygotsky, in his constructivist theories about learning, maintained that learning is most likely to occur when students work together on a task to solve problems and thus arrive at a shared understanding. Most studies grounded in this framework support the understanding that learning depends on active, social interaction, often with a more skilled partner who understands some of the task at hand. These studies show that students who work in this collaborative manner tend to have better outcomes than those who work independently, especially when properly scaffolded.
The process of scaffolding also grows from Vygotsky’s idea of the Zone of Proximal Development: if a small amount of help is provided to the student at the right time, he or she will be able to achieve a level of understanding that, alone, was just out of reach. Peer review is such a process. In peer review, students submit their work for critique by other students who are at or just above their own level of ability. Peer assessment is a social learning activity, and it assumes that students will learn and grow together. This process mimics real-world practice, in which academics and professionals regularly review and critique the work of their peers through scholarly journals. The benefits of peer review have been shown to include learning to think critically (Liu, Pysarchik, & Taylor, 2002; Topping, 1998) and improved content scores along with an increased ability to explain complex concepts (Freeman, 2010; Pelaez, 2002). For example, in the Freeman study, prescribed peer-graded activities in an introductory-level biology class correlated with higher exam scores for both the test taker and the scorer, even when the scorer graded “easier” than the professor would have. In the Pelaez study, non-major biology students were better able to articulate complex concepts on exams after engaging in rigorous peer review activities. In another study, students participated in Guided Reciprocal Peer Questioning of one another, using high-level question starters to help them learn and practice higher-level thinking together. In time, after practicing this and similar strategies, students began to pay closer attention to lessons and, because they knew they would be asked to formulate these higher-level questions afterwards, began to jot down strategic questions during the presentation of the material (King, 2002). In a review comparing multiple studies of peer assessment, Keith Topping found several experiments, including a 1987 study by Hendrickson, Brady, and Algozzine, a 1989 study by H.M. Watson, and a 1994 study by L.A.J. Stefani, that reported measurable gains in student performance following peer assessments. All of these studies involved some level of scaffolding of the peer review process (Topping, 1998). In the Hendrickson study, the teacher gave students a specific rubric to follow for their comments. The Watson study provided oral suggestions and modeled a sample review. The Stefani study included specific sentence starters for the students to complete as they reviewed their peers’ work. In all cases, students reported feeling more proficient with the writing process, or the teacher reported measurable improvement in their writing performance.
There exists some criticism of peer review as a learning process. Many studies rely heavily on anecdotal evidence provided by teachers or by student surveys (Acker, 2011; Kiuhara et al., 2009; Liu et al., 2002; Odom, 2009; Trautmann, 2009) without the correlation to quantitative data shown in the Freeman, Pelaez, or King studies. Research also shows that in order to produce gains in higher-level thinking, student interactions must rise to a higher level, including the exchange of questions and explanations and allowing students to elaborate and reflect on their own thinking processes (DeBoer, 2000; Grant, 2011). Students must see high-quality examples of writing and be able to recognize what makes those examples exemplary.
Another common criticism is that students are more concerned with being “nice” than with giving effective, substantive feedback. However, studies show this can be mitigated by taking the peer review process online and providing structure to the review, which appears to increase the effectiveness of student feedback, especially when supported by a rubric (Lu, 2012; Trautmann, 2009). Peer feedback appears to improve student content scores even when peer grading alone does not show significant improvement in academic performance. For example, in the Lu study, 187 students were assessed, and their assessment activities were coded and analyzed. The study showed an increase in the number of comments given by students when the feedback was provided online, as well as an improvement in the quality of those comments. It also found that assessor students benefited from assessing their peers, especially when they were specifically prompted to identify problems and then give specific suggestions about how the writer could resolve the problem or improve the writing. The more problems identified and suggestions given by a student, the more likely he or she was to make improvements on his or her own assignment.
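To make the Lu-style analysis concrete, the minimal sketch below shows one way coded peer comments could be related to the reviewer's own improvement. All field names, records, and scores here are hypothetical placeholders for illustration; they are not data or variables from the Lu (2012) study.

```python
# Illustrative sketch: relate the feedback a reviewer gives (problems found plus
# suggestions offered) to that reviewer's own score gain. Hypothetical data only.
from statistics import correlation  # requires Python 3.10+

# Each record: comments a student gave as a reviewer, plus that student's own
# pre/post writing scores (placeholder rubric points, not study values).
reviewers = [
    {"problems_found": 1, "suggestions": 0, "pre": 4, "post": 4},
    {"problems_found": 3, "suggestions": 2, "pre": 4, "post": 6},
    {"problems_found": 5, "suggestions": 4, "pre": 5, "post": 8},
    {"problems_found": 2, "suggestions": 1, "pre": 6, "post": 7},
]

feedback_given = [r["problems_found"] + r["suggestions"] for r in reviewers]
score_gain = [r["post"] - r["pre"] for r in reviewers]

# Pearson correlation between feedback given and the reviewer's own improvement.
print(f"r = {correlation(feedback_given, score_gain):.2f}")
```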
Studies support the claim that peer review activities help students become more comfortable with revising their writing and help to improve the quality of their writing overall. This initial study seeks to confirm these findings, and also seeks to establish whether a correlation exists between students’ improvement in writing and their performance on standard science content assessments.
Results
The results of the study showed that while fully half of the students showed no change, up to 35% of the students who participated in the online peer review process demonstrated an increase in their writing scores of at least one performance band, as measured against the Knowledge and Thinking (Sciences) College Readiness Assessment (CRA) rubric, compared with students who did not participate in the peer review process. The peer review group also demonstrated a small increase in content test scores over the group that did not participate in peer review. This suggests that collaborative writing activities may help improve students' writing skills and content knowledge.
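For readers interested in how a gain of "at least one performance band" could be tabulated from pre- and post-assessment rubric ratings, the brief sketch below shows one possible calculation. The band labels, group names, and student records are hypothetical placeholders, not the actual CRA rubric levels or the study's data.

```python
# Illustrative sketch of tabulating performance-band movement by group.
# Band labels are assumed, ordered from lowest to highest.
BANDS = ["emerging", "developing", "proficient", "advanced"]

def band_change(pre: str, post: str) -> int:
    """Number of performance bands a student moved (positive = improvement)."""
    return BANDS.index(post) - BANDS.index(pre)

# Each tuple: (group, pre-band, post-band) for one student -- placeholder values.
students = [
    ("peer_review", "developing", "proficient"),
    ("peer_review", "emerging", "emerging"),
    ("peer_review", "developing", "advanced"),
    ("comparison", "developing", "developing"),
    ("comparison", "emerging", "developing"),
]

for group in ("peer_review", "comparison"):
    changes = [band_change(pre, post) for g, pre, post in students if g == group]
    moved_up = sum(1 for c in changes if c >= 1)
    print(f"{group}: {moved_up}/{len(changes)} students moved up at least one band")
```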