Measure of...
Students’ experiences with the whole-class discussion or small-group work in a specific lesson.

Use this measure:
PMRR’s measures can be accessed by registering at the PMRR website: pmr2.org.

Note: PMRR’s “Recommended Conditions for Use of the Practical Measures of the Classroom Learning Environment” and annotated versions of the measures (accessed through the PMRR website) provide important guidance on using the PMRR measures as part of instructional improvement work and describe the research base informing these measures. Much of this overview is drawn from these resources.
Measurement instrument overview
These surveys will help teachers, coaches, and administrators capture students’ perspectives on the extent to which whole-class discussion or small-group work supported their learning.
The Whole-Class Discussion Survey consists of 10 questions. The Small-Group Work Survey consists of 14 questions (several of which are also included in the Whole-Class Discussion Survey).
Approximately five minutes should be set aside for students to complete the survey, either at the end of a class period or immediately following the whole-class discussion or small-group work during the class.
Connection to student learning
The survey items are designed to capture the aspects of whole-class discussions and small-group work that research indicates make a difference for student learning. These include:i
- The cognitive demand of the task as implemented
- What students are accountable for in the discussion or group work (the correct answer, as opposed to making sense of mathematical ideas)
- Opportunities for students to listen to, reason about, and make sense of others’ ideas
- Establishing a classroom culture and norms and routines for small groups in which students want to share their ideas, feel that their ideas are valued, and feel comfortable sharing tentative or exploratory mathematical thinking
- Centering students’ thinking in instruction
- The teacher’s role during small-group work
PMRR’s annotated versions of the surveys, accessed through the PMRR website (user registration required), contain more information on these aspects of whole-class discussion and small-group work, and the research base behind them.
i PMRR (2020). Whole Class Discussion Survey: Annotated copy and Small Group Survey: Annotated copy. Retrieved December 2020 from University of Washington: https://www.pmr2.org
What we know about how well this measure works for its intended use
To develop the surveys, PMRR reviewed existing research on aspects of whole-class discussion and small-group work that are tied to student learning. Based on that research, they generated an initial set of survey items. Those items were tested in district classrooms to determine whether student responses reflected differences in learning opportunities across classrooms. The PMRR team also conducted cognitive interviews with students to understand whether the items were accessible and meaningful to them.
The development process is described in further detail in PMRR’s white paper, Practical Measures to Improve the Quality of Small-Group and Whole-Class Discussions. A full list of ongoing research regarding the use of the PMRR measures can be found on the PMRR website.
Frequency
While the surveys are quick for students to complete, they should be administered at a frequency that aligns with improvement goals and professional learning. For example, a teacher might administer the survey at the end of a coaching cycle, focus on implementing a targeted change to instruction during the next coaching cycle, and then administer the survey again a week later at the conclusion of that coaching cycle.
Measurement routine details
[Note: The whole-class discussion and small-group work surveys can be used in a variety of improvement contexts, including efforts to support coaching cycles with teacher-coach pairs, to support inquiry within a professional learning community, or to pilot and test particular lessons or curricula. This guide focuses on using the survey as a tool to support a coaching cycle.]
The teacher-coach pair might start by administering the survey to gather initial data about the extent to which whole-class discussions or small-group work are supporting student learning. These data can facilitate discussion about a specific instructional strategy that the teacher will incorporate into whole-class discussion or small-group work over the course of a coaching cycle.
Teachers administer the relevant survey either immediately following the whole-class discussion or small-group work or at the end of a lesson that incorporated it. Students can take the survey on paper or online via Google Forms, which allows student responses to be compiled and reviewed more quickly.
To help a teacher assess whether a new instructional strategy has improved the students’ learning, the teacher-coach pair might compare responses from the end of the previous coaching cycle to responses at the end of the current coaching cycle. To best support teachers in making sense of why students responded the way they did, survey data should be analyzed alongside student work, coach observations, and teacher reflections.
Data analysis details
The Whole-Class Discussion Survey items focus on five aspects of whole-class discussion that research indicates make a difference for student learning opportunities. The annotated copy of the survey provides sample improvement goals and conversation starters linked to each aspect that can support teachers and coaches as they explore the survey data.
To understand how a teacher’s focal instructional strategies might be shaping student learning experiences, survey data can be compared across time points or across class periods. Survey data can also be aggregated across classrooms to help coaches and school leadership better understand the type of instructional supports that teachers need.
The surveys were designed to yield data aggregated at the classroom level rather than at the individual student level. PMRR recommends “as many students as possible complete the classroom measure at each administration. For comparability across [survey] administrations, we recommend that as many of the same students complete the classroom measure on each administration. For example, if there are 20 students in one section/period of your math class, we recommend that as many of those 20 students complete the classroom measure at each administration so that you can more confidently compare responses across different administrations.”ii
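To make these classroom-level comparisons concrete, the short sketch below shows one way the tabulation could be done. It is an illustrative example only, not part of the PMRR toolkit or the Edsight platform; the file name and the column names (administration, item, response) are assumptions about how exported survey responses might be organized.

```python
import pandas as pd

# Illustrative sketch only (not part of the PMRR toolkit or Edsight):
# aggregate yes/no survey responses at the classroom level and compare
# administrations. Assumes a hypothetical CSV export with one row per
# student response and columns "administration", "item", and "response".
responses = pd.read_csv("whole_class_discussion_survey.csv")  # hypothetical file
responses["said_yes"] = responses["response"].str.strip().str.lower() == "yes"

# Classroom-level summary: share of "yes" responses and number of students
# answering each item on each administration (no individual-level reporting).
summary = (
    responses.groupby(["administration", "item"])["said_yes"]
    .agg(pct_yes="mean", n_students="size")
)
summary["pct_yes"] = (100 * summary["pct_yes"]).round(0)
print(summary)

# Side-by-side view of each item across administrations, e.g., the end of
# one coaching cycle versus the end of the next.
print(summary["pct_yes"].unstack("administration"))
```

As PMRR’s guidance quoted above notes, comparisons like this are most meaningful when roughly the same students respond on each administration, and they should be read alongside student work, coach observations, and teacher reflections rather than on their own.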
Data broken down demographically can yield helpful insights, provided that users take an asset-based perspective toward students and use these disaggregated data to focus on the impact of teacher practice and instructional strategies on student responses.
The Edsight platform, developed by the PMRR team, allows users to schedule measurement, take notes, collect and visualize data, and look at trends over time.
Additional issues to consider around survey data analysis and aggregation can be found in PMRR’s Recommended Conditions for Use document.
ii PMRR (2020). Recommended conditions for use of the practical measures of the classroom learning environment. https://docs.google.com/document/d/1esTawjoP96RpJSKq-zkeBinKl2axgKhYxi-JCXWEleo/edit#
Conditions that support use
- Using the surveys within the context of regular coaching or a professional learning community will help teachers make sense of their data and connect the data to targeted instructional changes.
- To best make sense of student responses, survey data should be analyzed alongside other information, such as student work, coach observations, and teacher reflections.
- Positioning the surveys as a way to elicit student feedback and voice can help users understand the survey as a tool for exploring practice rather than as an accountability or evaluation tool.
- When discussing survey results, users should bring an asset-based perspective and a willingness to reflect on their own practices so that the data are not used simply to confirm existing assumptions. A context of ongoing professional learning allows a coach or school leader to shape conversations about survey data and prevent the data from reinforcing problematic ways of characterizing students.
Challenges
- Some schools or districts, especially those without a preexisting culture of data sharing, may have privacy concerns around sharing and discussing data.
Other tools and resources to support use
PMRR’s Recommended Conditions for Use of the Practical Measures of the Classroom Learning Environment: The Recommended Conditions for Use document provides guidance regarding how PMRR’s classroom practical measures should and should not be used in service of instructional improvement. PMRR generated these “conditions of use” based on systematic inquiry into the use of the measures in their partner districts. The document includes recommended conditions for using the measures, recommendations for data analysis, and frequently asked questions about preparing to administer the measures, administering them, and analyzing the resulting data.
PMRR’s annotated versions of the surveys, accessed through the PMRR website, connect items with the research base on student engagement and math learning and include sample conversation starters and improvement goals linked to each item.
Metro Nashville Public Schools: Embedding practical measures in coaching cycles
To help understand whether the instructional changes that teachers were making related to whole-class discussion were resulting in improvement, teacher-coach pairs embedded the Whole-Class Discussion Survey into their one-on-one coaching cycles. Teacher-coach pairs co-planned to set goals and select tasks around the whole-class discussion, administered the survey as part of classroom instruction, and used data from select items to inform their debriefing discussions.
As an example, at the end of one teacher’s coaching cycle, over half of the students responded “yes” to the item, “Did you have trouble understanding other students’ thinking in today’s whole-class discussion?” This data point provided an opening for the teacher-coach pair to unpack why so many students reported having trouble understanding their peers’ thinking and to discuss two of the coach’s observations: the teacher tended to rephrase students’ thinking during whole-class discussions (rather than allowing students to rephrase each other’s thinking), and students tended to share without building on or making sense of their peers’ thinking.
The combination of these survey data, the coach’s observations, and the teacher’s reflections prompted the teacher to develop an instructional improvement goal of having students “revoice” and make sense of each other’s ideas during whole-class discussions rather than relying on the teacher to make sense of student thinking. Over the next month, the teacher focused on pressing and supporting students to rephrase the thinking of their peers. The survey was administered again one month later during the next coaching cycle, and the teacher-coach pair was able to see that the percentage of students who had trouble understanding other students’ thinking in the whole-class discussion had decreased by more than half, suggesting that the instructional change the teacher made was an improvement.
Source: Kochmanski, N. (2020). Aspects of high-quality mathematics coaching: What coaches need to know and be able to do to support individual teachers’ learning [Doctoral dissertation]. Vanderbilt University.
Several supporting factors helped teachers and coaches use the surveys in targeted ways to improve instruction in the context of ongoing coaching cycles. The surveys helped teachers investigate the impact of their current practice on student learning, identify specific instructional changes that might improve students’ learning, and assess whether those instructional changes actually improved student learning. Schools also had preexisting routines built around analyzing and sharing data, and teachers and coaches were accustomed to working together to debrief and discuss learnings.
Interviewees:
Kara Jackson, PMRR Co-PI and associate professor at the University of Washington College of Education
Paul Cobb, PMRR Co-PI and research professor at Vanderbilt University
Other sources:
PMRR (2020). Recommended conditions for use of the practical measures of the classroom learning environment. https://docs.google.com/document/d/1esTawjoP96RpJSKq-zkeBinKl2axgKhYxi-JCXWEleo/edit#
PMRR (2020). Whole Class Discussion Survey: Annotated copy and Small Group Survey: Annotated copy. Retrieved December 2020 from University of Washington: https://www.pmr2.org
Jackson, K., Henrick, E., Cobb, P., Kochmanski, N., & Nieman, H. (2016). Practical measures to improve the quality of small-group and whole-class discussion [White paper]. Retrieved August 8, 2019, from https://bit.ly/2Kyhpx2