Collaborative Use of Student Data Gauges Performance and Guides Instruction
Improving systematic use of student data to inform the work of teachers, schools, and districts has become a hot topic in education reform. Learning Forward’s Standards for Professional Learning stress better use of data, and particularly student performance data, within an integrated approach to improving practice.
While better use of data by schools and districts is critical to improving student outcomes, the most direct impact comes from teachers using evidence of student thinking and understanding to improve instruction. However, systematically collecting, recording, analyzing, and reacting to this kind of data is challenging for teachers.
Two teachers who have harnessed this powerful practice are Ashley Hall and Amy Chimino (two of the authors of this article), both 4th-grade teachers in a not-for-profit charter system near Pittsburgh, Pa. Hall and Chimino have learned to use data formatively to influence what they do instructionally. Their students have been highly successful on measures of performance, and the teachers attribute student success, in part, to the ways that teachers assess learning and use the results from those assessments instructionally. They work collaboratively to develop routines that help them — and their students — gauge how well each student understands and can demonstrate mastery of learning standards. The routines include assessment, feedback, public use of data, reteaching, individualization of tasks, and progress monitoring.
Many of their practices serve as an example of what other teachers might do, alone or in teams, to use data more effectively to raise performance. These evidence-based practices are highly effective in these teachers’ classrooms and may be transferred to other teaching contexts if appropriate professional learning supports are in place.
One specific practice — using information from summative assessments in formative ways — is worth highlighting for practitioners who want to use data more effectively in instruction.
Propel McKeesport, the not-for-profit charter school near Pittsburgh, Pa., where Hall and Chimino teach, serves traditionally underserved students who are largely low-income (88% qualify for free or reduced-price lunch) and minority (72% African-American). Hall and Chimino shared the same 42 students during the 2010-11 school year.
On the end-of-year statewide exam, 100% of students were proficient or advanced in math and 86% in English. These results are unusual in the state, particularly given the demographics of the classrooms. Furthermore, these students made considerable progress within the school year: On an August benchmark exam intended to predict scores on the end-of-year test, only 35% were proficient or advanced in math and 47% in English.
A primary source of actionable information about students comes from what the teachers call the Monday assessment. Monday assessments are paper-based, mostly multiple-choice assessments of math and reading that probe student understanding of the required state learning standards. In form, they mimic the end-of-year standardized exam. Administered each Monday throughout the school year, the assessment includes about 24 to 26 questions in each domain and requires about 1½ to two hours per subject to administer, so much of each Monday is taken up administering and scoring the assessments. The teachers choose most questions from released items from Pennsylvania state tests, or from other states' released items, that match the learning standards they want to assess that week. The chosen items represent varying levels of challenge.
To tailor the content of the assessments to the needs of the classroom, the items fall into three categories:
Although the teachers use multiple-choice items drawn mostly from standardized exams, they use the results formatively to inform their practice. To make multiple-choice items more revealing of students' understanding of standards, the teachers require students to provide a written explanation of their answer for each item. This explanation may include why they chose a particular answer and/or the steps they took to solve the problem. These explanations are typically one or two sentences and may also include numerical symbols. A key element of the teachers' practice is to have students explain or justify their reasoning, in writing or orally. The explanations serve two purposes: They support student self-regulation by helping students monitor their own understanding, and they serve as important sources of data for the teachers. Multiple-choice formats alone may not provide enough diagnostic information, so the explanations make the items more diagnostic. For example, when asked what they conclude when a student gets a multiple-choice item correct but the explanation is wrong, one of the teachers responded, "It means they guessed."
A recognized challenge for teachers using assessment information productively is the timeliness of the data. Standardized test data often cannot be used formatively because results come back weeks or months after administration. To address this challenge, these teachers score the Monday assessment almost immediately. They score the multiple-choice items, read the explanations, and record results, including whether the explanation was correct, so that students get the results as quickly as possible.
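The per-item record the teachers keep pairs two facts: whether the multiple-choice answer was correct and whether the written explanation was correct. The combination is what makes the data diagnostic. The category labels below are illustrative assumptions, not the teachers' actual recording scheme; a minimal sketch:

```python
# Classify each test item by combining the multiple-choice result with
# the written explanation, in the spirit of the teachers' scoring.
# Category names are hypothetical, not the teachers' own labels.

def classify_item(answer_correct: bool, explanation_correct: bool) -> str:
    if answer_correct and explanation_correct:
        return "understands"      # right answer, sound reasoning
    if answer_correct and not explanation_correct:
        return "likely guessed"   # "It means they guessed."
    if not answer_correct and explanation_correct:
        return "careless error"   # reasoning sound, slip in the answer
    return "needs reteaching"     # both answer and reasoning off

# One student's items as (answer_correct, explanation_correct) pairs.
items = [(True, True), (True, False), (False, False)]
results = [classify_item(a, e) for a, e in items]
print(results)  # ['understands', 'likely guessed', 'needs reteaching']
```

A record like this makes the difference between mastery and lucky guessing visible at a glance, which is exactly what a raw multiple-choice score hides.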
The teachers record Monday assessment data using three units of analysis:
Students learn about their performance on the Monday assessment in two stages. Individual assessments are scored and returned to students the same day, showing them which test items they answered correctly and their overall percentage of correct answers. On Tuesday, a white board in the classroom prominently displays the class's test performance in several categories. The class display itemizes the percentage of students who scored in each performance category, and the students' mean score for the assessment is listed. This mean score is also graphed on the white board, showing how the class as a whole has performed over the previous weeks. The posted results often spark discussions about performance and goal setting for the following week. The teachers ask students to make explicit attributions about why they performed as they did, verbalizing the connection between effort and outcome.
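The white board summary described above amounts to two simple computations: the percentage of students in each performance category and the class mean. The cut scores below are hypothetical placeholders, not the school's actual performance bands; a minimal sketch of the arithmetic:

```python
# Summarize one week's Monday assessment scores for a class display:
# percent of students per performance category, plus the class mean.
# Cut scores are hypothetical, not the school's actual bands.

def performance_category(pct: float) -> str:
    if pct >= 80:
        return "advanced"
    if pct >= 70:
        return "proficient"
    if pct >= 60:
        return "basic"
    return "below basic"

def summarize(scores: list[float]) -> dict:
    n = len(scores)
    counts: dict[str, int] = {}
    for s in scores:
        cat = performance_category(s)
        counts[cat] = counts.get(cat, 0) + 1
    return {
        "mean": round(sum(scores) / n, 1),
        "percent_by_category": {c: round(100 * k / n, 1) for c, k in counts.items()},
    }

week = [92.0, 75.0, 75.0, 58.0]
print(summarize(week))
# mean 75.0; 25% advanced, 50% proficient, 25% below basic
```

Appending each week's mean to a running list gives the trend line the teachers graph on the white board.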
Collecting and analyzing data about students’ understanding of important learning standards provides these teachers with rich professional learning opportunities as they strive to improve their effectiveness. Monday assessments give systematic markers of student progress and understanding. Students’ successes and struggles offer suggestions for which topics need to be reviewed and re-emphasized. As the teachers monitor student progress, the data also allow for reflection on the impact of their instructional plans. As patterns of student accomplishment, as well as misunderstanding, become evident from Monday assessment data, both teachers are able to reflect on and critique the value of their lesson plans, the feedback they provide, the learning supports they design, the rhetoric they use, and the messages they transmit to students and parents.
Monday assessments also support discussions about each teacher's instructional activities because the assessment is a common routine in both of their classes. They regularly discuss students' achievement, progress, and strategies to target improvement. For example, the teachers can identify general patterns of performance in students across reading and math. If a student performs poorly on both assessments in ways that deviate from his or her typical pattern, the teachers can seek to understand larger issues influencing the student's performance and may collect data to support their working hypotheses. In addition, this common instructional routine enables them to compare each other's strategies for analyzing the data, which sometimes provides new ways to discern the progress of their students.
In general, Monday assessments represent a context for the teachers' professional learning. The teachers are standards-driven in their approach to instruction. They have developed routines, including Monday assessments, that strike the right balance between knowing about individual students and about whole-class progress. The assessments provide enough detail for these teachers to act on instructionally, and, though it takes effort to collect, record, and analyze the data, the practice is workable in classrooms. Monday assessments represent what these two teachers have learned about using assessment information to shape, in substantive ways, what they do in class. Based on students' understanding of standards, these teachers react and strive for more effective approaches to support student growth.
This article focuses primarily on these teachers’ practices in relative isolation from their school contexts, but it is important to describe how their context supports their practices. The school supports the teachers’ professional learning from data in four important ways:
There are several lessons to be gleaned from these teachers’ practices. These lessons are consistent with the broader research literature of data use in schools but provide more detail and a richer context of success than much of that literature.