The X Factor is 'Why'

A Clearly Defined Purpose Boosts The Impact of Data Analysis

By Anne Conzemius
August 2012
Vol. 33 No. 4

“We’ve been doing this for years, and nothing’s ever happened. This is the first time we’re actually going to do something with the data.”

This exclamation came from a teacher who has been participating in summer data retreats for more than 10 years. She has found them enjoyable, but never really knew the purpose of the retreats or what would happen as a result. When using data for professional learning, looking at the numbers is not enough. The numbers are a reflection of past and current practice. They have to be understood at a deeper level, contextualized within a given purpose or goal, and ultimately translated into actions that bring about improved results.

This teacher’s experience is not uncommon. In its work with schools, a team of consultants and coaches from QLD Learning, a school improvement company, has discovered a long and varied list of approaches educators use to look at, analyze, decipher, encode, decode, correlate, graph, chart, and scrutinize their equally long and varied stream of data sources. If we step back to examine what’s going on, it’s fairly easy to see that the emphasis on using data has been just that — using data. But to what end? This is the question that has been missing in so many cases.

As Stephen Covey (1989) advised, begin with the end in mind. Leadership’s role is to begin any data conversation by stating why we are engaging in the analysis and what we expect to extract from it that will help us learn and improve. To do so, leaders must first be clear about the purpose that is driving their inquiry into the data.

If we are clear about why we are looking at the data in the first place, we can be very streamlined about the approaches we take to pursue answers. Thus, the second important role of leadership is to provide appropriate resources, tools, and processes that teachers need to make good use of collaborative learning time. When data are well-organized, relevant, and targeted toward specific analyses or purposes, teachers’ professional learning from the data will be efficient and profound.

Five Uses for Data

There are five generally accepted reasons to use data as a part of an educator’s ongoing professional practice. Of course, there are many other more specific reasons one might look at data, but these five cover the overarching need in an educational setting. The five major purposes for using data are:

  1. To enhance understanding and gain perspective;
  2. To create focus and monitor progress;
  3. To guide decisions or solve problems;
  4. To measure impact and implementation of a new initiative or strategy; and
  5. To generate new learning and innovation.

Note that using data for accountability and evaluation of individuals or groups is not on this list. Accountability and evaluation are inherent in each of these five uses individually and to an even greater extent when all five are considered collectively. That is why it is not advisable to use any single data source or process to evaluate or hold an individual or a group accountable for the results.

For each purpose, there is a distinct analytical process that can be applied to various data sources. Knowing the purpose helps zoom in on the type of data we want to examine, the most appropriate analysis in which to engage, and the metrics that we will use to achieve our purpose. Additionally, the purpose determines which graphical tools are most useful for interpreting and reporting the results. See “Matching tools to purpose” above.

1. Use Data to Enhance Understanding and Gain Perspective.

All understanding of data is relational. What that means is that we cannot gain understanding or perspective by simply looking at the numbers. The numbers tell a story in relation to something else, such as a standard, an expectation, a goal, a larger group or population, a subgroup, or a similar group. For example, if we were to say seven children earned a score of 100% on this measure, that is all we would know. It is a fact, but it is not a useful fact unless we know the total number of students who took the test or performed the task. Is it seven out of seven (100%) or is it seven out of 700 (1%)? If the latter, how did the other 99% score?

If our purpose is to understand how our students are doing, it is possible to compare a particular performance result against a standard, an expectation, or a goal. That comparison gives us perspective because we are analyzing the data against an acceptable or ideal level of performance. Another type of comparison might be against a larger group, such as a state or national norm, an entire district or school, a grade level, or all courses using that same measure.

In the case of standardized, norm-referenced tests, determining the percent of students meeting or exceeding a standard is useful for gaining perspective on how our own students are doing on a specific standard. Because the data are derived from a very large population of test takers of similar age, percentiles, quartiles, and scale scores can also be useful. These are not, however, particularly useful metrics for understanding individual learners’ needs or strengths. Common summative and formative assessments are much better tools for understanding the performance of an individual student or of a group of students at any particular grade level or in any particular course.

Comparative analyses are only as good as the object of comparison being used. If the annual data retreat focuses its analysis on comparing the school’s or district’s performance against the state’s overall results, all we will know is whether we’re better, worse, or about the same as the state average. That does little to compel or inform action, which is why so many teachers return to their classrooms and continue to do what they’ve always done, happy to know that others in the state are worse off.

2. Use Data to Create Focus and Monitor Progress.

When the purpose of looking at data is to prioritize areas of need so that improvement efforts can be targeted and monitored for progress, it is unlikely that the status quo will prevail. Specific actions related to student learning gaps are much more likely to occur because they are evident as a result of how the data were analyzed. Also, the translation of the data into goal setting is immediate and strategic.

SMART goals (specific and strategic, measurable, attainable, results-based, and time-bound) are an expression of both the direction and amount of improvement that is desirable and achievable within a given time period. They identify performance gaps that must be closed to achieve the school’s or district’s vision. Because they are created collaboratively by teachers who will be responsible for achieving them, they represent the commonly held aspirations for professional learning and student achievement.

The data analysis process that is used to create and monitor SMART goals looks for the greatest area of need for the school, team, or classroom. The process is based on the Pareto principle, which is a method for teasing out the few areas of focus that will have the greatest overall impact on the whole. This highly focused approach gives specificity to the goal as well as to the actions that will be needed to reach the goal (Conzemius & Morganti-Fisher, 2011).

The analysis looks at three types of gaps in student achievement and identifies specific standards, skills, and subgroups that require attention. Data are placed into a template that allows for quick and easy calculations. Two of the calculations determine how far the school or team is from where they are required to be (accountability gap) and how far they are from where they want to be (proficiency gap). A third calculation shows how much improvement has been made over time in each subject area. After the greatest area of need has been isolated, teams use a color-coding process called zone analysis to highlight the specific skills, standards, or knowledge gaps that contribute to their current performance levels. These areas are prime candidates for improvement in curriculum, instruction, and assessment. These are also the areas that will be measured and monitored on an ongoing basis to ensure that all students are making continuous progress toward the goal.
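
To make the template’s arithmetic concrete, here is a minimal sketch in Python. The subjects, percentages, and targets below are hypothetical, and the code illustrates only the logic of the three calculations, not QLD Learning’s actual template.

    # Percent of students proficient by subject: hypothetical figures only.
    current  = {"reading": 68.0, "math": 61.0, "science": 74.0}
    required = {"reading": 80.0, "math": 80.0, "science": 80.0}  # accountability target
    desired  = {"reading": 90.0, "math": 90.0, "science": 90.0}  # local vision or goal
    baseline = {"reading": 63.0, "math": 60.0, "science": 70.0}  # three years earlier

    for subject in current:
        accountability_gap = required[subject] - current[subject]  # required minus actual
        proficiency_gap = desired[subject] - current[subject]      # desired minus actual
        improvement = current[subject] - baseline[subject]         # growth over time
        print(f"{subject:8s} accountability gap {accountability_gap:5.1f}  "
              f"proficiency gap {proficiency_gap:5.1f}  improvement {improvement:+5.1f}")

The subject showing the largest gaps and the least improvement becomes the candidate for the greatest area of need; zone analysis then drills into the specific skills and standards behind that number.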

Let’s go back to our enlightened teacher who has spent 10 years of her summer vacations looking at raw numbers and state averages. What was different this time? The analytical process she and her colleagues engaged in led them to develop a schoolwide SMART goal which, in turn, gave them what they needed to focus their improvement efforts and monitor their progress throughout the upcoming school year. They left with a plan. They knew where they wanted to be and how far they had to go to get there. They knew what to do next in the form of a few highly strategic actions, and they had a means for monitoring their progress along the way.

3. Use Data to Guide Decisions or Solve Problems.

One way data can be used to promote professional learning is through a standard problem-solving or decision-making process. The data the team uses will depend on the problem or decision being considered, but the process and the analyses are fairly standard. Let’s look at the problem-solving process. Problem solving could occur in a number of areas, including student behavior, staff communications, technology issues, scheduling, and facilities use.

The Problem-Solving Process

  1. Agree upon and operationally define the specific problem to be addressed. This first step should result in a clear and agreed-upon statement and definition of a single, specific problem that is familiar to or being experienced by all members of the team.
  2. Understand the problem in its current state. How often does it occur? Where and when does it occur? Who is most affected and how? How long has it been a problem?
  3. Identify potential causes of the problem. Use the cause-and-effect diagram (a tool specifically designed to assist teams to identify root causes of a problem) (Conzemius & O’Neill, 1999) to organize categories of potential causes and the underlying factors that are perpetuating the problem.
  4. Isolate and quantify the most pervasive causes. Gather data that will verify or reveal the intensity, location, and frequency of the causes. A Pareto analysis might also be useful in this context (see the sketch following this list).
  5. Create a SMART goal for problem resolution that eliminates or dramatically reduces the most pervasive causes. In this case, the SMART goal is based on eradicating the causes of performance gaps as opposed to working on a single greatest area of academic need.
  6. Create a plan for achieving the goal that includes specific actions, roles, responsibilities, and timelines for the team. Action plans that are linked directly to team goals provide a map for the team’s problem-solving journey. When action plans are thorough and specific, each person on the team will have a clear picture of how he or she will contribute to solving or eliminating the problem. Plans can also be monitored for implementation, which gives the team ongoing feedback on its progress.
  7. Monitor the impact of solutions on problem resolution. Keep an ongoing record of evidence that shows the problem has been resolved and not just redirected to another area or is newly manifest in another way.
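
As an illustration of the Pareto analysis mentioned in step 4, here is a minimal sketch in Python. The cause categories and referral counts are hypothetical; the point is to rank causes and isolate the few that account for most of the problem.

    # Hypothetical counts of behavior referrals, tallied by suspected cause.
    causes = {
        "hallway transitions": 120,
        "unclear expectations": 85,
        "lunchroom crowding": 40,
        "substitute coverage": 30,
        "other": 25,
    }

    total = sum(causes.values())
    cumulative = 0
    for cause, count in sorted(causes.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += count
        print(f"{cause:22s} {count:4d}  {100 * cumulative / total:5.1f}% cumulative")

    # The causes that together account for roughly 80% of referrals are the
    # "vital few" that a problem-resolution SMART goal would target.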

The use of data in this category informs decision making at each step of the process, and the analyses are straightforward.

4. Use Data to Measure Impact and Implementation.

Every time a new initiative is put into place, a plan for measuring its effectiveness should accompany its implementation. There are two characteristics of effectiveness that need to be measured: Is the program or process being implemented with fidelity, and is it having the level of impact that makes it worthy of our investment? Neither question should be an afterthought. If we want to know whether the initiative is working, we need baseline information that represents the starting point for each.

It is rare to find a school or district that applies this level of scrutiny to new initiatives at any point in the process, much less as a routine or planned part of a systemic change strategy. The process and data used to conduct these analyses are not complex, nor do they take inordinate amounts of time to put into place. My colleague, Terry Morganti-Fisher, and I use a measurement system to help districts create a measured approach to systemwide improvement. This system incorporates the use of the SMART Goal Tree Diagram (see diagram on p. 24) as the graphic organizer for planning, goal setting, and monitoring three distinct aspects of the impact and implementation of any new initiative. The three aspects are: results (typically in the form of student learning data); strategic focus (evidence gathered against a rubric of implementation expectations); and capacity building (survey results that demonstrate a change in practice, climate and/or culture) (Conzemius & Morganti-Fisher, 2011).
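
As a rough illustration of tracking those three strands against a baseline, here is a minimal sketch in Python. The scales and scores are hypothetical, and the structure only gestures at the approach; the actual SMART Goal Tree Diagram is described in Conzemius & Morganti-Fisher (2011).

    # Hypothetical baseline and current evidence for one initiative,
    # organized by the three measurement strands named above.
    initiative = {
        "results": {"baseline": 63.0, "current": 71.0},         # % proficient
        "strategic_focus": {"baseline": 1.8, "current": 3.1},   # implementation rubric, 1-4
        "capacity_building": {"baseline": 2.2, "current": 3.4}, # staff survey mean, 1-5
    }

    for strand, scores in initiative.items():
        change = scores["current"] - scores["baseline"]
        print(f"{strand:18s} baseline {scores['baseline']:4.1f} -> "
              f"current {scores['current']:4.1f} ({change:+.1f})")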

Many of the data tools that we use to focus and monitor improvement at the school, team, and classroom levels are equally useful for measuring impact and implementation at the system level. One major question at this level is whether the amount of improvement warrants the investment of time, training, and ongoing expense associated with continuing the program. That is often difficult to ascertain and requires enough time for the initiative or program to become fully implemented. In the best of worlds, a pilot approach to implementation is desirable.

5. Use Data to Generate New Learning and Innovation.

In each of the four categories already covered, the approach to using data is either to look backward or to examine data that depict current performance. The highest level of professional learning for any organization is what Senge (1995) refers to as “generative learning.” It is the learning that accompanies innovation, a critical shift of focus required to prepare students with 21st-century skills. Instead of reading trends or patterns in data from the past or comparing student performance to that of some group or against a standard set of competencies, the generative learner is entrepreneurial in his or her approach to change.

Action research is an example of generative learning. It is different from problem solving because it isn’t focused on a problem per se. It isn’t process improvement because it may address a process or strategy that doesn’t currently exist. It is less about understanding where we are than it is a strategy for helping us think through and test innovations in teaching and learning. Data are generated along the way.

Another example of generative learning is the application of new tools for learning, such as virtual technologies, social media, collaboratories focused on community action, and projects that extend beyond the school’s walls and timelines. We have only begun to understand the power of the data that these new methodologies will provide.

Making Data Relevant

The practice of carrying boxes of charts, tables, and graphs into summer retreats or spending hours in front of spreadsheets trying to understand the significance of every possible data point is not sustainable. When leaders bring data to the table, it must be for a specific, defined purpose. Data need to be organized in a manner that allows for relatively quick understanding and must be concise and relevant to the purpose at hand. Graphic organizers, templates, tools, and processes are especially helpful for collaborative teams because they help to maintain focus and direction. They should be simple to use and aligned to the analyses that will be conducted. These general tips will take the pain — and some of the resistance — out of using data as a critical tool for professional learning in our schools.


Authors

Anne Conzemius

Anne Conzemius (aconzemius@qldlearning.com) is cofounder and president of QLD Learning, a school improvement company in Madison, Wis. Conzemius has authored or coauthored several books, including More Than a SMART Goal: Staying Focused on Student Learning (Solution Tree, 2011).

References

Conzemius, A.E. & Morganti-Fisher, T. (2011). More than a SMART goal: Staying focused on student learning. Bloomington, IN: Solution Tree.

Conzemius, A.E. & O’Neill, J.K. (1999). Building shared responsibility for student learning. Alexandria, VA: ASCD.

Covey, S. (1989). The 7 habits of highly effective people. New York: Simon & Schuster.

Senge, P. (1995). The fifth discipline: The art & practice of the learning organization. New York: Doubleday.

