Measure of...
Teachers’ perceptions of key aspects of their collaborative, ongoing, job-embedded professional learning experiences (e.g., professional learning communities [PLCs] and grade-level or department meetings)
Use this measure:
WestEd’s Professional Learning Measure (Google Form) [Select “Make a copy” to view.]
Measurement instrument overview
Survey items are designed to provide professional learning (PL) facilitators with data to inform the design and implementation of ongoing teacher professional learning. The survey should take about 3–5 minutes to complete at the end of a professional development (PD) session. Items include both a standardized set for use in all locations and a small set that allows limited local customization to align with users’ goals, needs, and interests. The measure draws heavily on prior work by Laura Desimone and her colleagues and is still undergoing development and testing.
Connection to student learning
WestEd’s Professional Learning Measure supports instructional leaders, designers, and facilitators of collaborative professional learning in assessing teacher perceptions of the learning experience.
Drawing on research about teacher learning, the items in this survey align with several key design features of collaborative professional learning recognized by the field:
- A specific content-area focus. Studies suggest that activities that focus on subject matter content and how students learn that specific content lead to increases in teacher knowledge and skills, improvements in practice, and, under the right conditions, increases in student achievement (e.g., Banilower et al., 2007; Carpenter et al., 1989; Cohen, 1990; Cohen & Hill, 2001; Desimone et al., 2002; Garet et al., 2001; Kennedy, 1998; Smith et al., 2007).
- Active learning. Effective PD provides opportunities for teachers to “engage in the same style of learning they are designing for their students” (Darling-Hammond et al., 2017, p. v). Effective PD experiences engage teachers in interactive, highly contextualized learning activities, drawing on authentic classroom artifacts, such as lesson and unit plans, student work, and classroom video clips or transcripts, and embedding learning within teachers’ daily work (National Research Council, 2000).
- Coherence. The most effective teacher learning experiences are those that align with identified district goals and priorities, existing initiatives, and teachers’ individual and collective needs and interests (Desimone, 2011; Fullan, 2024). Coherence can also refer to learning experiences that (a) are highly specified to the material teachers will be teaching (conceptual proximity) or (b) occur immediately prior to teaching content relevant to the professional learning experience (temporal proximity) (Penuel et al., 2011; Santagata et al., 2011).
- Opportunities for collective participation and/or collaboration. PD has a greater impact when it creates space for teachers to share ideas and collaborate in their learning, ideally in job-embedded contexts that include teachers from the same school, grade, or department (Desimone, 2009). Collaboration supports teachers not only in sharing information and learning from each other but also in creating communities with the power to positively change the culture and instruction of their entire grade level, department, school, and/or district (Darling-Hammond et al., 2017).
- Access to models of effective practice. Teachers are most likely to shift their practice when they have a clear vision of what best practices look like, and one of the best ways for teachers to understand what a particular practice looks like is to see it in action. Models can include opportunities to observe teachers skilled in the practice in question in a real classroom, either in person or via video, as well as access to high-quality lesson or unit plans, sample student work, peer observations, and written cases of teaching (Darling-Hammond et al., 2017).
- Coaching and expert support. Coaching and expert support involve the sharing of expertise about content and evidence-based practices, focused directly on teachers’ individual needs. Common structures for providing expert support include one-on-one coaching in the context of a teacher’s own classroom; experts sharing knowledge as facilitators of workshops; and remote mentors utilizing technology to communicate with educators, with “experts” including trained master teachers, instructional leaders, and researchers and university faculty (Darling-Hammond et al., 2017).
- Opportunities for feedback and reflection. High-quality professional learning frequently provides built-in time for teachers to think about, receive input on, and make changes to their practice by facilitating reflection and soliciting feedback. While feedback and reflection are two distinct practices, they work together to help teachers move thoughtfully toward expert visions of practice (Darling-Hammond et al., 2017). Feedback may occur as part of a mentoring or coaching relationship, a peer observation relationship, or a small-group workshop, or it may be provided by a specially trained expert in a particular area (Darling-Hammond et al., 2017).
- Sustained duration. Effective PD provides teachers with adequate time to learn, practice, implement, and reflect upon new strategies that facilitate changes in their practice. Experts on teacher PD have consistently found that in order to translate professional learning into meaningful changes in practice, teachers need ongoing, recurring opportunities to engage in learning a single set of concepts or practices (e.g., Darling-Hammond, 2009; Desimone, 2009; Desimone et al., 2002; Knapp, 2003; Yoon et al., 2007). Effective PD initiatives occur not in the context of single, “one-off” workshops but typically span weeks, months, or even academic years.
What we know about how well this measure works for its intended use
This measure is still undergoing development and testing. During small-scale testing with middle school PLCs, teacher volunteers found most questions straightforward and easy to interpret but also offered several suggestions for improving question clarity. In the same testing, WestEd researchers found high levels of agreement across the PLCs on some questions and more variation on others, with the variation often explainable in light of researchers’ observations of the PLC meetings. Based on their review of the collected data, the district team provided a few suggestions for improving the data dashboard and survey.
Frequency
This tool should be administered by the facilitator(s) as often as necessary to support improvement goals. The survey is designed to be completed at the conclusion of a collaborative professional learning session.
Measurement routine details
Prior to the start of a collaborative professional learning experience, the measurement team should identify which items from the item bank match the local education agency’s goals, needs, and interests and also adjust the language, where indicated, to match the collaborative learning context.
The survey is designed to be administered at the end of a professional learning session. Facilitators should set aside at least 5 minutes for teachers to complete it.
Administering the survey in a timely manner supports teacher recall and allows the facilitator to orient teachers toward the segment of the PD that facilitators are most interested in learning more about. Because the survey was designed to be aggregated at the level of a professional learning group, as many of the same teachers as possible should complete the survey during each administration to allow for greater confidence when comparing responses across administrations.
While it is possible to complete the survey on paper, an electronic version (e.g., Google Form) can better protect teacher anonymity and may also be logistically easier in terms of collecting and analyzing data.
Once surveys are completed, the facilitator and key instructional leaders should review the results to make sense of the data together. The group can then engage in collaborative inquiry and discuss next steps and changes to try out.
The facilitator can then implement the changes in subsequent sessions, collect additional feedback, and continue to debrief results with instructional leaders. This routine establishes a learning cycle through which PL designers and facilitators can use the survey tool to measure changes in teacher perceptions of the professional learning experience over time.
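To make this learning cycle concrete, the sketch below shows one way a facilitation team might track group-level results across administrations. It is illustrative only and not part of the WestEd tool: it assumes Python with pandas, one CSV export per session (e.g., downloaded from Google Forms), and invented item columns coded on a 1–5 scale; every file and column name is a hypothetical placeholder.

```python
# Illustrative sketch only (not part of the WestEd measure): tracking
# group-level survey results across administrations. All file names and
# column names below are hypothetical placeholders.

import pandas as pd

# One CSV export per survey administration (e.g., downloaded from Google Forms).
SESSION_FILES = ["plc_survey_session1.csv", "plc_survey_session2.csv"]

# Hypothetical closed-response (Likert) items, coded 1-5.
LIKERT_ITEMS = ["content_focus", "active_learning", "coherence", "collaboration"]

def group_means(path: str) -> pd.Series:
    """Aggregate one administration to group-level item means (never per teacher)."""
    responses = pd.read_csv(path)
    return responses[LIKERT_ITEMS].mean()

# One row per administration, one column per item.
trend = pd.DataFrame({path: group_means(path) for path in SESSION_FILES}).T

print(trend.round(2))         # group means per session
print(trend.diff().round(2))  # session-to-session change in group means
```

Because the measure is interpreted at the group level, the sketch reports only item means per session; the session-to-session differences can then feed the debrief conversations between the facilitator and instructional leaders.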
Data analysis details
This survey is designed to be aggregated at the level of a professional learning group. To protect the anonymity of teachers, the survey should not be disaggregated in any way and should not be used to assess the progress of individual teachers. Open-response items, by contrast, generate data at the individual level, so even in anonymized surveys they might reveal a participant’s identity. Prior to sharing these responses, facilitators should review them and either remove responses that reveal individual identity or mask them (i.e., remove identifying information) so that individuals cannot be identified.
Administering the survey using a platform such as Google Forms, Qualtrics, or SurveyMonkey can support the efficient collection and visualization of data.
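As a companion illustration of the aggregation and masking guidance above, the following hypothetical Python sketch separates closed-response items, which are safe to aggregate, from open-response items, which should be reviewed and masked by hand before sharing. Again, all file and column names are invented for the example.

```python
# Illustrative sketch only: aggregate closed-response items at the group
# level and route open responses to a separate file for manual review.
# All file names and column names are hypothetical placeholders.

import pandas as pd

responses = pd.read_csv("plc_survey_export.csv")  # e.g., a Google Forms export

LIKERT_ITEMS = ["content_focus", "active_learning", "coherence", "collaboration"]
OPEN_ITEMS = ["most_useful_part", "suggested_changes"]

# Share only group-level aggregates; never per-teacher rows.
summary = responses[LIKERT_ITEMS].agg(["mean", "std", "count"])
summary.to_csv("group_summary.csv")

# Export open responses separately, shuffled so row order cannot be linked
# back to individual respondents; the facilitator then reads these and
# removes or masks any identifying details before sharing.
open_responses = (
    responses[OPEN_ITEMS]
    .sample(frac=1.0)        # shuffle rows
    .reset_index(drop=True)  # drop original row order
)
open_responses.to_csv("open_responses_for_review.csv", index=False)
```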
An annotated copy of the survey (available on the Practical Measures, Routines, and Representations [PMRR] website) connects the survey items to teacher learning and can support facilitators’ investigation of teachers’ responses. (PMRR is developing protocols to support facilitators in engaging in cycles of inquiry.) The few open-response items on the survey can add nuance to teacher responses and, when analyzed alongside closed responses, can provide a window into teachers’ thinking.
Conditions that support use
- As with most practical measures, use of the survey should be part of an ongoing plan for instructional improvement in mathematics.
- It is important for the facilitator to have a collaborator (such as an instructional leader involved with teacher learning) to help debrief PL sessions, make sense of survey results, plan next steps, and reflect on whether resulting changes are leading to improvements. The collaborator does not need to be present at the PD sessions and can hold a role similar to or different from the facilitator’s, such as a subject-matter coach, a district leader, or an external consultant.
- To ensure that teachers feel comfortable providing truthful responses, it is important to assure them that their responses will not be seen by school- or district-level personnel who evaluate teachers. It is also important for teachers to feel that those collecting the data genuinely value their responses; we recommend reviewing the aggregated responses with the group at the next meeting.
- As with all practical measures, the survey is intended to inform improvement efforts and guide conversations and reflections. The tool should never be used to evaluate a facilitator’s or teacher’s work.
Challenges
Some schools or districts, especially those without a preexisting culture of data sharing, may have privacy concerns around sharing and discussing data. School or district settings lacking some of the supporting conditions listed above may face challenges in using this tool. If a school or district has not yet created a theory of improvement related to teacher professional learning, it may be difficult to select the right items from the item bank and to decide how to respond to the data.
Measure development and testing
In spring 2023, the WestEd Math Practical Measures (MPM) team worked with a small school district in the Northeast to try out the survey with three middle school PLCs.
The team began testing by conducting interviews, first with district administrators and then with four teachers, to understand how practitioners new to practical measurement would judge the usefulness of the survey for their district and school contexts and whether teachers would interpret the questions in the intended ways. The next test focused on triangulating sources of PLC data (PLC observations and a survey administration) to determine whether the survey results reflected what researchers observed in the PLC meetings. An additional test involved sharing data with district administrators to understand whether the data gathered from teachers aligned with district administrators’ existing knowledge about the PLCs. Each of these tests followed a Plan-Do-Study-Act (PDSA) cycle: do the test, study the results and document the learning, act on the learning by revising and improving the survey, and plan subsequent tests to continue the improvement process.
WestEd MPM members met with four middle school teachers to conduct cognitive interviews. During these interviews, teachers participated in a “think-aloud” discussion with researchers, reading each survey question aloud and describing their thought processes as they interpreted the question and selected a response option. Teacher respondents agreed that most questions were straightforward and easy to interpret, but they also offered several suggestions for improving question clarity.
After the cognitive interviews, WestEd MPM team members shared the resulting survey draft with district administrators, had them select items from the question sets that were most important to them, and arranged to visit the district to observe PLCs and administer surveys. On a day when regular PLC meetings were being held toward the end of the school year, WestEd team members joined PLCs at three different schools for the full length of each meeting (30–50 minutes each). Researchers documented observations in narrative field notes and administered the survey electronically in each group during the last few minutes of the meeting. Eighteen teachers completed the survey.
Discussion following this test centered on whether the survey was picking up variation in PLC responses and how well any survey could capture aspects of PLCs that might not be immediately observable within a stand-alone PLC meeting. We found high levels of agreement across the PLCs on some questions and more variation on others; oftentimes, the variation made sense in light of our observations. The test confirmed that we had produced a useful practical measure. However, we noted that these were generally high-functioning PLCs, and we wondered whether the survey would succeed at picking up variation among lower-functioning PLCs.
As a final step in initial survey testing, the WestEd team met with district administrators to discuss the data. Those present agreed that the results from only 3 PLCs and 18 teachers should not be seen as representative of all PLC experiences in the district. The team noted the overwhelmingly positive results and discussed the alignment between the data and district leaders’ existing understanding of the schools’ PLCs. Based on their data review, the district team offered a few suggestions for improving the data dashboard and survey. Future plans include additional testing to explore variation with a larger group of PLCs from a single district or multiple districts.
References
Banilower, E. R., Heck, D. J., & Weiss, I. R. (2007). Can professional development make the vision of the standards a reality? The impact of the National Science Foundation’s Local Systemic Change Through Teacher Enhancement initiative. Journal of Research in Science Teaching, 44(3), 375–395.
Carpenter, T. P., Fennema, E., Peterson, P. L., Chiang, C. P., & Loef, M. (1989). Using knowledge of children’s mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26(4), 499–531.
Cohen, D. K. (1990). Revolution in one classroom. Journal of Education Policy, 5(5), 103–123. https://doi.org/10.1080/02680939008549067
Cohen, D. K., & Hill, H. C. (2001). Learning policy: When state education reform works. Yale University Press.
Darling-Hammond, L. (2009). Recognizing and enhancing teacher effectiveness. The International Journal of Educational and Psychological Assessment, 3(1), 1–24.
Darling-Hammond, L., Hyler, M. E., & Gardner, M. (2017). Effective teacher professional development [Research brief]. Learning Policy Institute.
Desimone, L. M. (2009). Improving impact studies of teachers’ professional development: Toward better conceptualizations and measures. Educational Researcher, 38(3), 181–199.
Desimone, L. (2011). A primer on effective professional development. Phi Delta Kappan, 92(6), 68–71. https://lfp.learningforward.org/handouts/Dallas2018/8133/Effective%20PD%20DeSimone.pdf
Desimone, L., Porter, A., Garet, M., Yoon, K., & Birman, B. (2002). Effects of professional development on teachers’ instruction: Results from a three-year longitudinal study. Educational Evaluation and Policy Analysis, 24(2), 81–112.
Fullan, M. (2024). Coherence framework extended. https://michaelfullan.ca/coherence-the-right-drivers-in-action-for-schools-districts-and-systems/coherence-framework-extended/
Garet, M. S., Porter, A. C., Desimone, L., Birman, B. F., & Yoon, K. S. (2001). What makes professional development effective? Results from a national sample of teachers. American Educational Research Journal, 38(4), 915–945.
Kennedy, M. (1998). Form and substance in inservice teacher education (Research Monograph No. 13). National Institute for Science Education, University of Wisconsin–Madison.
Knapp, M. S. (2003). Professional development as policy pathway. Review of Research in Education, 27(1), 109–157.
National Research Council. (2000). How people learn: Brain, mind, experience, and school (Expanded ed.). The National Academies Press. https://doi.org/10.17226/9853
Penuel, W. R., Gallagher, L. P., & Moorthy, S. (2011). Preparing teachers to design sequences of instruction in Earth systems science: A comparison of three professional development programs. American Educational Research Journal, 48(4), 996–1025.
Santagata, R., Kersting, N., Givvin, K. B., & Stigler, J. W. (2011). Problem implementation as a lever for change: An experimental study of the effects of a professional development program on students’ mathematics learning. Journal of Research on Educational Effectiveness, 4(1), 1–24.
Smith, T. M., Desimone, L. M., Zeidner, T. L., Dunn, A. C., Bhatt, M., & Rumyantseva, N. L. (2007). Inquiry-oriented instruction in science: Who teaches that way? Educational Evaluation and Policy Analysis, 29(3), 169–199.
Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the evidence on how teacher professional development affects student achievement (Issues & Answers Report, REL 2007-No. 033). U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest.