IDEAS

Program evaluation and design go hand-in-hand in Tennessee

By Joe Anistranski, Karen Harper and Stephanie Zeiger
Categories: Data, Evaluation & impact, Improvement science/networks, Technology
February 2024
Educators and researchers often think of program design and program evaluation as separate endeavors, even intentionally creating a firewall between them. But what if designers and evaluators worked together, combining their insights to strengthen both the program and the study of it? In our work evaluating a statewide professional learning program in Tennessee, we have found that this approach benefits everyone involved.

Since 2022, we have undertaken the ambitious task of evaluating the Reach Them All computer science initiative's professional learning for school- and district-level educators across Tennessee (see sidebar on p. 49). We have engaged in a collaborative evaluation design process, begun at the start of the initiative, to explore how the professional learning program works. We made strategic evaluation decisions based on the program



Joe Anistranski (joe.anistranski@nwea.org) is a professional learning strategy manager at NWEA. 

Karen Harper (karen.harper@liftoffconsulting.com) is a computer science professional learning consultant at the Tennessee STEM Innovation Network-Battelle.

Stephanie Zeiger (zeigers@battelle.org) is a computer science education consultant at the Tennessee STEM Innovation Network-Battelle.




The Learning Professional

