Young people come to us with many challenges, but speaking their minds isn’t usually one of them. Our students tell us what they think pretty much all the time. Every utterance and every cue from their body language is a data point that communicates important information. But it can be hard for educators to prioritize student feedback while managing myriad sources of information from inside and outside the classroom.
The biggest challenge to incorporating student voice has almost nothing to do with students’ actual voices. Those are loud and clear. The challenge has to do with our eyes and ears. We need to learn and practice centering students’ perspectives in our school improvement efforts.
Equipping educators to hear students’ voices, and then translate that information into concrete behavioral and instructional changes, is what our team at Partners in School Innovation has undertaken in San Jose, California, for the last several years. Our efforts have not only suggested promising practices, but also highlighted important lessons for other educators and organizations trying to build educators’ capacity to incorporate student voice into school improvement strategies and instructional feedback loops. They have reminded us that students are a vital yet often overlooked source of professional learning for educators at all levels. For them to learn from us, we have to learn from them.
Opening our ears to student feedback
In fall 2020, just after the onset of the COVID-19 pandemic, we began working with nine schools across four districts in San Jose, California. Educators in those schools formed a network for school improvement, and our team operated as coaches and support professionals, working shoulder-to-shoulder with educators and providing real-time feedback on their daily practice while supporting the broader development of a systemwide approach to school transformation.
By 2021, the reality in San Jose reflected the broader confusion of the American public policy response to COVID-19: waves of illness and death; day-to-day uncertainty about school opening and closing; and glitchy implementation of hybrid instruction. Adding to the complexity of those broader challenges, East San Jose has the highest concentration of low-income families in the city, and about one-third of the children in these schools are classified as English learners.
It was clear to everyone that students, who were still engaged in hybrid learning, were struggling, especially in math and especially when they returned to instruction after being ill. But it was hard to figure out exactly what was going on with the students and what kind of support they needed, in part because many students resisted turning their cameras on during virtual learning. We decided to go directly to the source.
We asked students to come to one of our network sessions and share their perspectives with school and network leaders through student experience panels. To make the feedback sessions more comfortable for students and educators, we set up the panels so that students only spoke to teachers at schools they did not attend.
The first round of panels was eye-opening, and it marked an important turning point in adults’ willingness to really hear student voices. Young people shared their anxiety about returning to school amidst an ongoing pandemic, and what they said reflected a broader sense of feeling alienated by the complexity of math instruction. Without the distractions inherent in classrooms, teachers were completely focused on students’ perspectives. And they heard what they had not been able to hear before.
After the panels, teachers talked about how much they had missed by not making time to listen and process more regularly, and they demonstrated an interest in doing so more in the future. To reach that goal, our team supported the network in prioritizing student voice on a more systematic basis. We also engaged educators in learning about how to incorporate students’ feedback in meaningful ways.
Leveraging qualitative data
We knew that the collection and analysis of qualitative data would be central to our work, so we used the book Street Data as an anchor text for our leaders (Safir & Dugan, 2021). Schools often have routines for collecting and analyzing quantitative data, like test scores and attendance, but Street Data offers tools for gathering information about topics that feel more esoteric or harder to measure, like student voice. District and site leaders read the book in professional learning communities, then used their discussions as a starting point for strategic implementation.
From that work, two major approaches emerged for incorporating student voice more systemically: empathy interviews and a survey about math mindsets.
Empathy interviews
Empathy interviews have been around for half a century, used by social and product engineers to unearth foundational yet hidden truths about user experiences and to improve the design of products, processes, systems, and structures.
Conducting empathy interviews is simple: You create an interview protocol that includes targeted, open-ended questions about the problem you’re trying to address, then sit with stakeholders — in this case, students — one-on-one to ask the questions. The deepest realizations emerge from in-depth follow-up questions, so it’s critical to allow the conversation to travel where the student leads, rather than be limited to the questions on the protocol. See the box above for some sample empathy interview questions.
We used empathy interviews in San Jose middle schools to understand a variety of challenges, including underwhelming math performance and students’ struggles with school re-entry after COVID-19 infections. We wanted to ensure that these processes were easy for adults to conduct and learn from but also led to real instructional and behavioral changes. To that end, we created standardized interview protocols that could be useful for multiple situations and built simple data visualization mechanisms using Google Jamboards. We also engaged in professional learning with teachers to prepare them to be interviewers.
At August Boeger Middle School, students told us they craved more collaborative problem-solving time in math and that they always seemed to understand lessons better when teachers provided multiple methods for solving problems. Although we had worked with teachers in the past to encourage best practices on teaching multiple methods (Rittle-Johnson et al., 2017), teachers didn’t incorporate that approach consistently until they heard the feedback from students.
Educators at LeyVa Middle School learned valuable lessons about school re-entry after COVID-19 infection. Students said they wanted to hear more focus on their well-being in return-to-school messaging, whereas much of those communications focused on making up for missed work, which further alienated students who were already reeling from their own or family members’ illness.
Teachers took the feedback to heart and made a concerted effort to be more sympathetic and encouraging toward students as they tried to get back on track after being absent. They also waived some noncritical, nonsummative assignments for students so their academic load would be more manageable. Finally, school leaders set aside resources to pay teachers to provide tutoring after school for students who had missed school due to COVID-19.
Although it is difficult to prove that these conversations and changes in educators’ strategies led to changes in academic performance, there is at least some evidence of positive correlation. Across grades at Boeger Middle, for example, the share of students scoring a C or above in math class, a leading indicator of being on track for 8th-grade graduation, increased by 12 percentage points in the trimester after we implemented the interviews.
One Boeger teacher described the process of receiving student feedback as both eye-opening and a critical antecedent to professional learning. “I was surprised to see that the students already knew their challenges,” she said. “For example, one of my students shared that he has trouble seeing the whiteboard.” The teacher subsequently began using different whiteboard markers, a simple shift that enhanced the quality of her instruction while demonstrating that she was listening and cared.
Math mindsets survey
While empathy interviews are great for going deep with a small number of students, surveys can be more effective for collecting broad data across a larger group of individuals. That’s what the educators at Davis Intermediate School did when we worked with them to create and administer a math mindsets survey, which we adapted from a tool created by High Tech High Graduate School of Education and WestEd (Challen et al., 2021).
Surveys asked questions such as, “Do you feel successful in math class?” and “Do you feel like you can ask questions and express your ideas in class?” The questions were rooted in the idea that math, like any learning, is anchored in identity. If students don’t think of themselves as “math people” (that is, as capable of doing math at a high level), their engagement, excitement, participation, and performance will reflect that lack of identification.
We administered the survey, then used what we learned from our professional learning on qualitative data to create basic visualizations that highlighted how students were feeling about math instruction (see the chart above). Teachers looked at the survey data disaggregated by classroom, then created goals for improving student mindsets about math. For example, some teachers decided they wanted to increase the percentage of students responding positively to the prompt, “My peers value my ideas in math class.”
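For readers who want to replicate this step, the disaggregation itself can be done in a spreadsheet or with a short script. The sketch below is a minimal, hypothetical illustration in Python using pandas; the column names, response scale, and sample data are assumptions for illustration, not the actual Davis survey instrument or results. It computes the share of students responding positively to each prompt in each classroom, the kind of classroom-level view teachers used to set goals.

```python
# Minimal sketch: summarize survey responses by classroom.
# Column names, response scale, and data are hypothetical, not the actual Davis instrument.
import pandas as pd

# Each row is one student's response to one survey prompt.
responses = pd.DataFrame({
    "classroom": ["Room 12", "Room 12", "Room 14", "Room 14", "Room 14"],
    "prompt": ["My peers value my ideas in math class"] * 5,
    "answer": ["agree", "disagree", "strongly agree", "agree", "neutral"],
})

# Which answers count as "positive" on this (assumed) response scale.
POSITIVE = {"agree", "strongly agree"}

# Percentage of students responding positively to each prompt, by classroom.
summary = (
    responses
    .assign(positive=responses["answer"].isin(POSITIVE))
    .groupby(["classroom", "prompt"])["positive"]
    .mean()
    .mul(100)
    .round(1)
    .rename("% positive")
    .reset_index()
)

print(summary)
```

Running the script prints one row per classroom and prompt with the percentage of positive responses, which can then be charted or dropped into a simple visual for team discussion.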
The survey helped isolate areas for instructional improvement. Teachers internalized the data from the survey, changed practices, then measured student perceptions again later in the year. For example, Davis math teachers began holding one-on-one conferences with students to provide individualized support for their aspirations as mathematicians.
How to use student feedback
While working with the San Jose middle schools, we learned some important lessons, both about strategies to use for collecting student voice data and how to translate student feedback into professional learning and changes in educator practice.
Take action. Schools can get caught in a sort of “analysis paralysis” when it comes to collecting qualitative data on student voice, so it’s important to take action quickly. It’s counterproductive to overthink your approach here, as there is no such thing as the perfect strategy.
Just pick something and try it. When we wanted to administer a survey about math attitudes, for example, we didn’t hire a psychometrician to create a statistically valid instrument; we just found an existing instrument and adapted it using some questions whose answers we were curious about.
A corollary of analysis paralysis is endless measurement with no action. If you’re soliciting student feedback on instructional practice, it’s important for students to see their views reflected in classroom routines sooner rather than later. If students think they’re sharing their views, but that you’re not doing anything with that information, that’s probably worse than never having asked them anything in the first place.
Depersonalize and anonymize student feedback. Students need to feel safe enough to express their true opinions, and teachers need to feel safe enough to hear them. When we anonymized the information, it depersonalized some of the tougher feedback and made it easier for people to hear.
With the student panel discussions, it obviously wasn’t possible to anonymize the feedback, but we worked with students to prepare them and help them frame the broad themes and ideas they wanted to share. We supported them to share things that we knew had a better chance of leading to immediate shifts in school policies and educator practices.
Build habits and routines. When we first conducted empathy interviews, we thought it was something we could do once in a while, but then life got in the way and we didn’t implement interviews with enough regularity. Without routines, we found ourselves starting from scratch the next time we went to collect data. We realized it would be more effective to build a regular cadence for conducting interviews.
For example, educators at LeyVa Middle School now conduct a round of student wellness surveys every time they administer quarterly formative assessments. They are ensuring that the frequency of qualitative data collection about students’ perspectives matches that of their quantitative measures of achievement, and they are communicating to students that both sources of information are equally important to them.
Beyond frequency, another important habit involves reaching out to students who are the hardest to reach. If you only conduct interviews with student government representatives, for example, or those who speak up often in class, you’re never going to get the full story.
Connect to a theory of action. Finally, and perhaps most importantly, it’s critical to link your exploration about student voice to your identified challenges and goals. In the case of the San Jose network, we had clear goals around both math instruction and improving school re-entry for students affected by COVID-19. We reoriented grade-level team meetings, observation protocols, and feedback practices to discuss instructional strategies to achieve those goals. We would have used different instruments, and explored different interview questions, if our goals had been different.
Improvement science requires following the data where it leads you, but you cannot solve a problem that you have not yet identified. While empathy interviews can unearth new challenges, you should be clear about whether you are searching for problems or looking for solutions to already identified challenges. Otherwise, your efforts may become too broad to be effective.
As with any school improvement strategy, the strength of implementation will depend on a broader set of systems for incorporating feedback into professional learning: structures for peer observation; routines for sharing feedback after observations; the presence of grade-level teaming in the school schedule; and dedicated time for reflection on new practices, to name a few.
If you are intentional about these strategies and supports, you’re bound to learn some surprising things about your school and its scholars. For example, we learned that even students who seemed the most disengaged really cared about their math grades and that students are eager and able to shape instructional practices in ways that support their ultimate success.
Each piece of information we glean from students widens our aperture for improving schools. If we want to see the whole picture of what students need to succeed, it is essential that we are consistent and intentional about listening to them.
References
Challen, C., Sharrock, D., & Taylor, C. (2021). High Tech High MAIC student agency survey. WestEd.
Rittle-Johnson, B., Star, J.R., & Durkin, K. (2017). The power of comparison in mathematics instruction: Experimental evidence from classrooms. In D.C. Geary, D.B. Berch, & K.M. Koepke (Eds.), Mathematical cognition and learning (vol. 3, pp. 273-296). Elsevier.
Safir, S., & Dugan, J. (2021). Street data. Corwin.