4c. Evaluate results of professional learning programs

Reflection

Professional learning is a costly endeavor, consuming time and funds at the district level. Naturally, administrators want to see a return on that investment in the form of increased student learning. Without getting into a debate over how well standardized tests measure learning, there must be ways to tie professional learning to results in the classroom. Those results could come from year-end summative data such as standardized tests, but they are more likely to show up in daily classroom activities.

When I evaluate digital tools in my own classroom, I look for how the tool helped my students access the higher levels of Bloom's taxonomy, how it helped them develop 21st-century learning skills, and whether it increased their engagement compared to other digital or paper-based tools.

This may look different depending on the school, or even the classroom within the school. To explore substandard 4C, I have included frameworks that coaches can use for evaluation. These frameworks are designed to give coaches quantitative measures from which to give qualitative feedback to teachers. I have also included evidence from a program evaluation I recently completed. Part of my goal was to gauge the effectiveness of a district's 1:1 roll-out. Data was collected from students and staff to measure the efficacy of the initiative; for my purposes, efficacy was defined as the amount and quality of device use.

Evidence: Tools to Evaluate 21st-Century Teaching

Effective learning is cyclical in nature. Needs are identified, goals are put into place, learning occurs either individually or in groups, and then coaches follow up on the extent to which the learning goals were met. Then new goals can be made and the process repeats. Lifelong learning means that no one is ever really “done” learning.

Due to this cyclical nature, I feel it is appropriate to carry the evidence I used for substandard 4A (identifying needs) into this substandard (evaluating results).

A framework for providing feedback is a valuable tool for coaches. In my 2018 post on Tools to Evaluate 21st Century Learning, I explain three resources and provide forms that coaches can use in the evaluation process. An overview of those resources appears below; for more detailed information and additional documents, please see my post.

Evaluation Tool 1: Council for 21st Century Learning

The Council for 21st Century Learning is committed to supporting 21st-century learning by offering consulting and training to districts and schools. Their work begins with a diagnostic to identify areas of need. Support is then provided through coaching, workshops, and presentations.

Evaluation Tool 2: Strengthening Your Reflective Commentary

This tool was created by AJ Castley and appears among the self-assessment methods offered by the Warwick Learning and Development Centre. The form provides teachers with seven open-ended questions for considering their work across three areas: teaching, assessing, and curriculum design. Within each broad question are more specific questions designed to walk teachers through a deep analysis of, and reflection on, what went well and what could be improved within a given lesson. Some of the guiding questions include "Why did you do it that way? How else might you have done it?"

Evaluation Tool 3: Learning Design Matrix

The Learning Design Matrix was adapted from Eeva Reeder, a frequent Edutopia contributor on project-based learning. Within the four-square matrix, teachers and coaches can consider elements of a 1) Standards-Based Task, 2) Engaging Task, and 3) Problem-Based Task, along with how technology enables and/or accelerates learning in that task. Rather than treating the matrix as a comprehensive to-do list, it is helpful to choose several key elements and consider how a lesson you've taught or want to teach fits within them.

Evidence: Program Evaluation

In early 2019 I had the opportunity to evaluate the progress of a district's 1:1 device roll-out.

For confidentiality reasons, I am limited in what I can say about this evaluation. I hope that by including the survey below, along with the goals of my project, readers will at least get a glimpse of how to begin with goals in mind and develop survey questions from them.

I created my survey using Google Forms because the data is incredibly easy to work with once you've collected responses: they flow into a linked Google Sheet that can be downloaded as a CSV. Google Forms is also easy for respondents to access and can be completed on mobile devices without changes to formatting. The ability of Forms to display Likert scales was essential in analyzing the perspectives of different groups of respondents.
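As one illustration of how that CSV export can be analyzed, here is a minimal sketch in Python using pandas. The file name, the "Role" column, and the Likert item names are all hypothetical stand-ins, not my actual survey questions.

```python
import pandas as pd

# Map Likert labels (as the Form records them) to numeric scores.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

# Hypothetical CSV downloaded from the Form's linked Google Sheet.
df = pd.read_csv("survey_responses.csv")

# Hypothetical Likert items; substitute your own question text.
likert_cols = [
    "Technology helps me learn",
    "I feel confident using my device",
]
for col in likert_cols:
    df[col] = df[col].map(LIKERT)

# Compare average attitude scores for teachers versus students.
print(df.groupby("Role")[likert_cols].mean().round(2))
```

Converting the labels to numbers up front makes it simple to compare groups, which is exactly what the project goals below call for.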

The goals for my project were as follows:

  • Gain an objective measure of attitudes towards technology among both teachers and students

  • Differentiate those attitudes: if negative attitudes existed, what were they attributable to?

  • Determine ways in which students and teachers are currently utilizing technology

  • Determine what has gone well with the 1:1 initiative

  • Determine areas of need that may still exist

Below is a link where anyone interested may make a (free) copy of the survey for personal use.

Having gone through the evaluation process, I would pass along the following advice:

  • Gather support from as many people as you can. Having your principal, dean, librarian, technology coach(es), and department heads on board will make the process go smoothly.

  • Reach out to participants in multiple ways. To gather data, I sent multiple emails but also followed up with announcements at staff meetings and personally asked individual teachers to respond.

  • Incentivize responses. A little encouragement in the form of a random gift card drawing will encourage participants to give up their valuable time to complete your survey.

  • If possible, work with a captive audience (such as taking time out of a staff meeting) to administer the survey.

  • Consider the sample size. While I initially jumped at the chance to administer the survey to the entire student body, processing that much data was quite overwhelming; a random sample would have been more manageable (see the sketch after this list).

  • Have a volunteer complete the survey before disseminating it to your audience. They can give you constructive criticism along with an estimate of how long it takes to complete.

  • When framing your feedback, consider how you can do so through inquiry questions instead of outright critiques. For example: What would it look like if we hosted a digital library of resources for teachers?
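On the sample-size point above, drawing a random sample is straightforward once you have a student roster. The sketch below assumes a hypothetical roster file and sample size; pandas' sample method does the work.

```python
import pandas as pd

# Hypothetical roster export with one row per student.
roster = pd.read_csv("roster.csv")

# Draw a random sample of 200 students (adjust n to your population)
# rather than surveying the entire student body.
invitees = roster.sample(n=200, random_state=42)
invitees.to_csv("survey_invitees.csv", index=False)
```

Setting random_state makes the draw reproducible, which helps if you ever need to regenerate the same invitee list.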
