    FOCUS

    Pilot program aims high

    Challenges en route help district find success in measuring impact

    By Eric Celeste
    April 2017
    Vol. 38 No. 2

    Denver Public Schools’ Professional Learning Center was still relatively new when it decided to tackle a problem of practice that has vexed systems and departments across the country: How to measure the impact of professional learning. To do so, the Professional Learning Center created a comprehensive new measurement approach — one it designed, tested, and refined using lessons learned from Learning Forward’s Redesign PD Community of Practice and the work of Thomas Guskey. The complex task came with an added challenge: The approach would be tested within the confines of another department’s high-profile effort to launch an early literacy initiative involving 2,500 teachers.

    Ultimately, the initiative gave the center a strong foundation to continue measuring the impact of professional learning at the district and school levels. The effort certainly wasn’t without struggles, modifications, and lessons learned — some within the context of the community of practice, others through districtwide implementation — but those lessons eventually helped the program bear fruit.

    “You can imagine, in a large urban district, there was a lot of support to allow us to do this with the department launching the initiative, but there was anxiety, too,” says Theress Pidick, executive director of the Professional Learning Center in Denver Public Schools. “They had a lot of sensible concerns. What data will we collect? What tools will we use? What’s the process for collecting the data? What are we going to do when we get the data? Who is going to have access to the data? How will we use it?

    “Collaboration was essential. We had to partner closely with the early literacy team, our accountability, research, and evaluation department, and our instructional superintendents before starting. But the early literacy initiative — impacting roughly 2,500 teachers — was itself so important, we felt it was vital that our measurement efforts be a part of it.”

    Despite the wide scope of the early literacy initiative and the ongoing challenges of standing up the nascent Professional Learning Center itself, the center was able to pilot an impact measurement process that it is now expanding and adapting to individual departmental and school needs. The center overcame challenges along the way at least in part because of three factors: the Denver team’s collaboration within the 22-district Redesign PD Community of Practice run by Learning Forward; a focus on implementation and meaningful data; and the steadying belief that quality professional learning, coupled with ongoing support and feedback, leads to better student results.

    “We went in with the intent that we would learn a lot this year, and then we would be able to generalize the tools and processes and apply them to other departments, initiatives, or schools across the district,” Pidick says. “We think we’ve done just that.”

    Fixing what’s broken

    Denver’s Professional Learning Center embarked on the impact measurement initiative in part because of the findings of several 2015 district studies on the state of its professional learning. The studies revealed that the district needed to address several important areas related to professional learning quality and impact. Pidick and her new team in the Professional Learning Center developed a measurement strategy to fix what was broken and to systematically collect data to better understand the quality and impact of professional learning.

    One tactic they employed was to create two new roles: professional learning partners — learning leaders who could help subject-matter experts and others provide educators with a high-quality learning experience — and a professional learning analyst. The four professional learning partners initially planned to work with central office experts as well as with the instructional superintendents who supervise principals. That direction changed after Pidick observed that some administrators didn’t make full use of the partners’ intended role.

    “Initially, there was some skepticism about why these roles were needed,” she says, “so the professional learning partners shifted their primary workload to the central office departments. We went where we were needed and wanted the most to get traction.”

    The professional learning analyst was central to the strategy, assisting with the development of the measurement tools, the design of the data collection processes, and the ongoing analysis and report development. This was the first time the district had dedicated a specific resource to measuring the quality and impact of professional learning.

    Another factor in improving professional learning across the district was being part of Learning Forward’s Redesign PD Community of Practice, in which the 22 participating systems have committed to making dramatic progress on one of two problems of practice by mid-2017. These problems of practice are:

    How to strengthen the measurement of the impact of professional development on teacher practice and make decisions based on these measures; and

    How to increase the coherence and relevance of professional development, such that teachers experience professional development as useful, timely, and relevant to their classroom practice, and abandon those initiatives that distract or dilute teachers’ focus.

    The measurement problem of practice was perfect inspiration for Pidick and her team, and they spent spring 2016 planning and preparing for what needed to be in place before the district’s summer early literacy learning opportunities launched. They reasoned that a better handle on measuring quality and impact would yield critical evidence that, in turn, would create greater coherence. “It was a little bit of chicken or the egg,” Pidick says, “but we dove right into the measurement of impact approach.”

    First, the team devised a theory of action: If the central office and schools get access to data and analysis that systematically measure the quality of professional learning and the impact of that professional learning on changing teacher practice and student achievement, this will enable central office and schools to engage in a cycle of continuous improvement.

    Then came the hard part: devising and implementing a plan to collect and analyze meaningful and relevant professional learning data and tie it to student outcomes.

    They started, as most learning leaders do when faced with a measurement challenge, with Thomas Guskey. (See “Where do you want to get to?” by Thomas Guskey on p. 32.) After returning from the community of practice’s initial convening in December 2015, they pitched to the district’s senior leaders the idea of using Guskey’s five critical levels of professional development evaluation — teachers’ reactions, teachers’ learning, organization support and change, teachers’ use of new knowledge and skills, and student learning outcomes — as a framework for their measurement approach. The steps in this process would be applied to the early literacy initiative.

    Pidick and her team used early literacy as their entry point to think through the following:

    What would these five levels look like in practice?

    What tools would they need?

    How would they gather the data?

    How would they create reports?

    Once they could answer those questions for the early literacy initiative, their goal was to be able to apply what they’d learned to other departments, initiatives, and schools across the district.

    Implementation challenge

    Then came implementation, where, far too often, great ideas and best intentions aren’t enough to ensure success. To avoid this, the Denver team, working with district partners, gathered feedback from the ongoing check-ins with Learning Forward’s Redesign PD Community of Practice to align each step with its equivalent on Guskey’s five levels.

    For level one, the group needed a mechanism to observe and measure the quality of professional learning. In collaboration with district departments, the professional learning partners developed a tool for observing professional learning — the Framework for Effective Professional Learning — that gauges quality based on design and facilitation. (See this and other data tools in the data section of the team’s website at https://plc.dpsk12.org/data-culture.)

    For levels two and three, the team sought to capture teacher satisfaction and perceived learning. “We had surveys that we were able to administer to participants who attended the early literacy professional learning this past summer and the follow-up modules during the school year,” says senior research analyst Brooks Rosenquist, “so that we could get their initial perceptions about how they experienced professional learning and what they thought they had learned.”

    Level four, changes in teacher practice, was a bit trickier. For that, the Professional Learning Center and early literacy teams focused on classroom walk-throughs and monitoring — gathering data on strengths and areas of opportunity: how instructional practices were changing as a result of the professional learning and what could be emphasized in future opportunities. To do this, they modified tools from the Instructional Practice Guide, developed by Achieve the Core to help teachers and those who support them make instructional shifts related to the Common Core State Standards (see www.achievethecore.org/page/2730/instructional-practice-for-the-ccss).
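
    Taken together, the levels map onto concrete data sources. The sketch below, in Python, is a schematic summary of that mapping as described in this article; the class and field names are illustrative assumptions, not the district’s actual tooling.

        # Illustrative only: a schematic summary of the measurement plan
        # described in the article. Class and field names are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class LevelPlan:
            level: str        # which of Guskey's five levels
            data_source: str  # evidence source named in the article

        MEASUREMENT_PLAN = [
            LevelPlan("1: quality of professional learning",
                      "session observations using the Framework for Effective Professional Learning"),
            LevelPlan("2-3: teacher satisfaction and perceived learning",
                      "participant surveys after summer sessions and follow-up modules"),
            LevelPlan("4: changes in teacher practice",
                      "classroom walk-throughs with tools adapted from the Instructional Practice Guide"),
            LevelPlan("5: student learning outcomes",
                      "student work and achievement data"),
        ]

        for plan in MEASUREMENT_PLAN:
            print(f"Level {plan.level} -> {plan.data_source}")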

    Through the summer and fall of 2016, the team collected observation data from 200 different professional learning sessions and almost 10,000 individual survey responses. These included large summer sessions and smaller gatherings throughout the fall.

    “We had this model, but we learned some lessons along the way,” Rosenquist says. “While it was relatively easy for the literacy team to grasp the ‘quality’ concepts of professional learning, providing feedback to colleagues using this new framework for professional learning was a bit more tricky. We had to ask ourselves, ‘What’s the best way to do that?’ ”

    This question was important because it spoke to a real-world challenge often faced by change agents in a system. “Educators in central office are experts in their content areas and in the pedagogy of teaching and learning in the K-12 context,” Rosenquist says. “Most regularly leverage participant surveys (teacher perceptions) as the source of quantitative data to inform improvements to professional learning. However, we were asking them to expand that sphere of data to include professional learning quality, teacher knowledge, teacher behavior, and student outcomes.

    “Fortunately, our central office staff is extremely dedicated to continuing to improve the ways we support teachers,” Rosenquist says. “We learned that we needed to frame the various types of quantitative and qualitative measures so that departments could see changes in their sessions over time or be able to compare the impact between different sessions to drive continuous improvement.”
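
    As a rough illustration of that framing, the sketch below aggregates hypothetical survey responses so a department could compare mean ratings between sessions and watch them change over time. The schema (session names, months, a 1-5 rating scale) is invented for the example and is not the district’s data model.

        # Illustrative only: aggregating hypothetical survey responses so a
        # department can compare sessions and track change over time.
        import pandas as pd

        responses = pd.DataFrame({
            "session": ["Foundations", "Foundations", "Module 1",
                        "Module 1", "Module 2", "Module 2"],
            "month":   ["Jul", "Jul", "Sep", "Sep", "Oct", "Oct"],
            "rating":  [3, 4, 4, 5, 5, 4],  # participant satisfaction, 1-5
        })

        # Compare impact between different sessions.
        print(responses.groupby("session")["rating"].agg(["mean", "count"]))

        # See changes over time.
        print(responses.groupby("month")["rating"].mean()
                       .reindex(["Jul", "Sep", "Oct"]))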

    Pidick agrees that the team has learned many lessons from collecting data. While the reports Rosenquist is creating are meaningful for the early literacy team as it plans its next steps, she says, the team needs to rethink how to present data to give senior leaders greater visibility into what the team is doing, how things are improving and changing, and how the team is responding to and collaborating with partners while extending its reach to other departments and schools.

    Some general takeaways have already emerged from this work:

    Teachers expect that the professional development sessions that they experience will incorporate active learning and be differentiated to their needs.

    In general, teachers most value sessions that provide concrete tools, strategies, and resources.

    In general, teachers value bite-sized, actionable sessions over more theoretical ones and smaller sessions over larger ones, especially when those sessions are school-based rather than centrally located and delivered.

    While the districtwide summer event was held centrally to establish a common language and understanding of the foundations of early literacy across the district, the monthly modules that followed have been designed centrally but customized and delivered by teacher leaders at each site.

    “Not only have teachers responded more positively to these sessions than the more centralized event,” Rosenquist says, “but this approach is more aligned with our district’s vision of schools as the unit of change. Although some of these takeaways may seem obvious, our recommendations are stronger and hopefully more persuasive, given that they are supported by data from within our own organizational context.”

    From district to school level

    At the same time they were evaluating the best way to present the data gathered from the districtwide literacy initiative, Pidick and her team felt they needed to see how the focus on professional learning quality and impact worked at the school level, especially given the research literature, which suggests that the most effective professional learning is ongoing, job-embedded, and experienced with colleagues rather than individually.

    While the research suggests this, they wanted to do their due diligence and collect impact data to determine what professional learning investments were adding the greatest value within their own context. They had planned to expand their evaluation focus after the 2016-17 school year, but “we pretty much midway through were saying to ourselves, ‘I think we could do some things early on about measurement at school level that could play a valuable role for school leadership teams in assessing impact,’ ” Pidick says.

    That’s because her group found a similar void at the school level: no consistent way of measuring the quality and impact of professional learning. The district was investing a lot of staff time and school dollars in building educator capacity, but very few schools were systematically collecting and reflecting on the data that resulted from their efforts. “We quickly realized that what we were building and implementing with the early literacy group could be customized and provided for people at the school level. We could collect and provide a much richer data set to help inform them in making continuous improvement decisions,” Pidick says.

    “I think that an important takeaway from this work is that we’re now using it across the board with our work in the Professional Learning Center, and the power is in seeing the impacts on student learning,” says Laura Summers, the district’s associate director for learning communities and data culture. “In the schools we’re working with, we’re actually going to each of [Guskey’s] levels and seeing how professional learning that is facilitated in schools has an impact on teacher practices, level four, as well as level five, student outcomes.”

    For example: The team has created a service model to apply the tools and processes to measure the quality and impact of current school-level professional learning. They’ve observed sessions at a school, then given that school feedback on the quality of those sessions. They’ve also conducted learning walk-throughs in classrooms to determine to what extent teacher practice changed because of the professional learning — level four of the Guskey model.

    At one school, for example, they noticed coherence between the focus of a professional learning session and what teachers were implementing in their classrooms. But when they looked at student work — level five — they saw disconnects. By studying the student work, the team was able to provide the principal and instructional leadership team with information to inform their discussions and decisions about adjusting their approach to improve student outcomes.

    The keys to getting buy-in for this approach at the school level are trust, transparency, and communication. The team worked to ensure that the schools identified their own areas of need and that the team was responsive to their unique circumstances and requests. They established clear roles, tasks, and timelines and developed an authentic partnership that had continuous improvement, not evaluation, at its core.

    “We want to build capacity as well as look at continuous improvement,” Summers says. “Measurement is sometimes viewed alongside evaluation, and we don’t want it to feel like we’re evaluating them. We really want them to be thinking about how measurement of professional learning quality and impact is a necessary part of their overall development strategy if they are to change student outcomes. That’s the goal in everything we’re trying to do — create impact.”

    Eric Celeste (eric.celeste@learningforward.org) is Learning Forward’s associate director of publications.