Measure of...
The extent to which students are experiencing learning conditions that matter for student engagement and learning.
Use this measure:
To register for Copilot-Elevate and access the survey platform, visit the PERTS Copilot-Elevate website. A summary of the measures and associated learning conditions can be viewed on the Copilot-Elevate: Measures Summary page.
Note: The PERTS Copilot-Elevate website contains comprehensive resources for survey users. Much of the information in this summary measurement guide is drawn from these materials. Those considering using the survey should visit the website for additional resources and information.
Measurement instrument overview
The full survey consists of 18 items spanning six “learning conditions.” It takes students 5–10 minutes to complete, but can be made shorter. The survey is customizable, allowing educators to focus on the particular learning conditions they feel are relevant to their contexts and to choose only the survey items relevant to those learning conditions. The survey is designed for students in grades 6–12.
Through the PERTS Copilot-Elevate platform, educators receive a confidential report with data overall and disaggregated by student demographic characteristics.
Connection to student learning
Research has shown that the six learning conditions measured by the PERTS Copilot survey matter for student engagement and success:
- Affirming cultural identity: Students feel more connected to and motivated in classes that affirm their background.
- Classroom belonging: Students need to feel a sense of community, mutual support among peers, and affirmation to feel safe and connected.
- Feedback for growth: Students need supportive feedback that helps them recognize their own potential to grow.
- Meaningful work: Students need to understand how schoolwork is relevant to their own lives and goals.
- Student voice: Students need opportunities to take ownership of their learning by sharing their knowledge and perspectives in the classroom.
- Teacher caring: Students need to feel valued and respected in the learning environment.
(For further information, including summaries of and links to the research base behind these learning conditions, see: Copilot-Elevate Measures Summary.)
The results of the PERTS Copilot survey provide students’ perspectives on the extent to which they are experiencing these learning conditions. Importantly, Copilot survey data reports help educators surface differences in how students from different racial, ethnic, and gender groups experience learning conditions, allowing educators to take additional steps to understand and mitigate disparities.
What we know about how well this measure works for its intended use
Researchers have empirically analyzed the relationship between the Copilot survey items and student outcomes as well as between the use of the Copilot survey and student outcomes.
A 2019 PERTS study found that teachers using Copilot were able to improve learning conditions over time and that better learning conditions were linked to “higher and more equitable achievement.”i
Findings included that “over 80% of teachers improved one or more classroom learning conditions when they leveraged Copilot over multiple cycles of inquiry and action” and that “students who experienced positive learning conditions in a class were 30% more likely to earn an A or B in that class.”ii These benefits were even more pronounced for students of color.
PERTS also linked better learning conditions to better social and emotional learning outcomes, finding that “when learning conditions improved, students were 86% more likely to experience a higher sense of belonging, 24% more likely to develop a growth mindset, and 2x more likely to report they ‘tried their very best’ in class.”iii
i Gripshover, S., & Paunesku, D. (2019). How can schools support academic success while fostering healthy social and emotional development? PERTS, Stanford University.
ii Gripshover & Paunesku, 2019.
iii Gripshover & Paunesku, 2019.
Frequency
When used as part of an improvement or inquiry cycle, educators typically administer the survey once every two to six weeks.
Measurement routine details
To take the survey, students need access to an internet-enabled device such as a laptop, tablet, or smartphone. The survey should take no more than 10 minutes to complete.
Before the survey can be administered, users need to create a Copilot account through the PERTS website and do some basic work to set up their roster and select the survey items they are using (more information about setting up and administering the survey can be found through the PERTS support page under About Copilot Elevate).
PERTS recommends using surveys as part of “multiple cycles of inquiry and action” to allow educators to engage in ongoing feedback and practice. Educators start a cycle by collecting feedback from their students through the survey to understand how students are experiencing learning conditions.
Following survey administration, educators are emailed a confidential report with disaggregated data for their class. After reflecting on survey results, educators can create an action plan and test strategies to improve student experiences. PERTS has developed a Baseline Meeting Guide for teams of educators to use to reflect on survey results and identify learning conditions to target for improvement. The PERTS platform pairs survey measures with evidence-based strategies that teachers can try to improve learning conditions.
At the close of the cycle, educators can administer the survey again to understand if strategies have shifted student experience so they can continue to make adjustments to their practice. Cycles typically last two to six weeks.
While educators can use the survey individually, the PERTS Copilot platform also enables a group of educators to collaborate as part of a joint project, sharing access to the same tasks, learning modules, and aggregated group data.
Data analysis details
Copilot survey reports contain graphs reflecting the percentage of students responding positively to survey questions (i.e., responding “Agree” or “Strongly Agree” to positively worded questions on a 7-point scale). Responses are organized by learning condition. Only teachers can see their individual survey data.
[Figure: example of a graph for an individual survey item over time]
Copilot survey reports also automatically disaggregate data by gender and race-ethnicity group (students are asked what racial and ethnic groups they identify with the first time they take the survey). Educators can also specify a “target group” of students in order to understand the experiences of groups of students who may experience support differently (for example, English language learners or students who are members of multiple intersecting demographic groups).
To balance the need for student confidentiality with the importance of providing disaggregated data about opportunity gaps, reports group together students who self-identify as Black, Latinx, Native American, and/or Pacific Islander; and students who identify as White or Asian (the PERTS Copilot FAQs contain more information on the national statistics on disparities in academic and disciplinary outcomes that informed this grouping). When fewer than five students are in a disaggregated category, responses are hidden to protect confidentiality.
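As a rough illustration of the reporting rules described above, the sketch below computes the percent of positive responses for one survey item and suppresses any disaggregated group with fewer than five students. Only the five-student suppression rule comes from the guide; the 1–7 response coding, the assumption that values of 6 and 7 correspond to “Agree” and “Strongly Agree,” and all field names are illustrative, not drawn from PERTS materials.

```python
from collections import defaultdict

MIN_GROUP_SIZE = 5       # groups smaller than this are hidden (per the guide)
POSITIVE_THRESHOLD = 6   # assumed coding: 6 = Agree, 7 = Strongly Agree

def percent_positive(responses):
    """Percent of responses counted as 'Agree' or 'Strongly Agree'."""
    positive = sum(1 for r in responses if r >= POSITIVE_THRESHOLD)
    return round(100 * positive / len(responses))

def disaggregate(records, group_key):
    """Summarize one survey item by a demographic key, hiding small groups."""
    groups = defaultdict(list)
    for record in records:
        groups[record[group_key]].append(record["response"])
    report = {}
    for group, responses in groups.items():
        if len(responses) < MIN_GROUP_SIZE:
            report[group] = None  # hidden to protect confidentiality
        else:
            report[group] = percent_positive(responses)
    return report

# Illustrative data: responses to one item on a 1-7 scale
records = [
    {"gender": "female", "response": 7},
    {"gender": "female", "response": 6},
    {"gender": "female", "response": 4},
    {"gender": "female", "response": 6},
    {"gender": "female", "response": 2},
    {"gender": "male", "response": 5},
    {"gender": "male", "response": 7},
]

report = disaggregate(records, "gender")
# female: 3 of 5 responses positive -> 60; male: only 2 responses -> hidden (None)
```

A production report would apply the same suppression rule to every demographic breakdown (gender, race-ethnicity, and any user-specified target group) before rendering the graphs.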
[Figure: example of responses to a survey item, disaggregated by a user-specified target group, race, and gender]
Conditions that support use
- PERTS stresses the importance of introducing students to the survey to build buy-in on its purpose and to emphasize the confidentiality of individual students’ responses (the PERTS document Communication with Students: Maximize Survey Participation & Cultivate Authentic Feedback offers some best practices).
- In addition to introducing students to the purpose of the survey, PERTS recommends debriefing survey results with students. Debriefing demonstrates that educators value and are paying attention to survey responses, encourages students to take the survey seriously, and can counteract survey fatigue.
Challenges
- PERTS recommends that students take the survey in no more than four classes to prevent survey fatigue.
- It can be hard for some teachers to see survey responses from their students indicating shortcomings of the learning environment. Teachers benefit from using the survey in a context in which they are supported to view the data as opportunities for their own learning and growth.
Other tools and resources to support use
The PERTS website includes many resources to support survey users, including:
- Research about how each learning condition affects engagement and equity
- Best practices for communicating with students about the survey to promote authentic feedback
- Strategies and activities that help educators cultivate each learning condition in their classrooms
- Resources for using the Copilot survey as part of a series of improvement cycles. These include a Baseline Meeting Guide to help teams reflect on survey results, identify and learn about target learning conditions, explore research-based strategies to try, and record intended practice changes and learning
- An extensive support page with details about setting up and administering the survey
Districts or schools wanting to use the Copilot survey on a larger scale (beyond small teams of educators) can contact PERTS to learn more about support options.
Measure in Practice
Mineola Public Schools, a small district in Long Island, NY, had been working to cultivate a growth mindset amongst its students, teachers, staff, and parents. The district was looking for an opportunity to deepen its student engagement work and to understand whether student engagement efforts were actually working. To better explore this question, district leadership selected inquiry cycles using PERTS Copilot-Elevate (i.e., PERTS’ data-driven professional learning program) as one of its professional development offerings in the 2019–20 school year.
Mineola began its work with Copilot-Elevate through a small pilot with 11 middle school teachers teaching students in grades 5–8 in math, humanities, and special education. The pilot teachers wanted to better understand student perceptions of learning conditions. While the group was committed to engaging and supporting their students, they were eager to see if this commitment translated into student experience. As one teacher explained, “I wanted to see if my students really realized how much I care about them, and I wanted to make sure that I made math come to life because I know a lot of students don’t make the connection to why math is important to everyday lives.”
The Mineola teachers were deliberate in introducing the survey to their students, either adapting and personalizing the template survey introduction letters available on the PERTS website or having conversations with students to explain the intent behind the surveys. The teachers emphasized that they were looking for honest answers and stressed the anonymity of the data. One teacher said, “I think the most crucial point is that we made students really aware that we were all embarking on this journey together and that their answers were going to help us . . . they had a hand in shaping their classroom environment.”
After administering the first survey, the teachers came together and reviewed the data as a team. They were “shocked” by the first round of data. One teacher explained, “I was focused on teacher caring, maintaining an environment where everyone feels ok to take educational risks . . . when I first got the data back, I was shocked and disappointed that I did not have 100% [positive responses] . . . it can be challenging to be so vulnerable with this data, but it puts energy on us to change things up.”
As the team reviewed the survey data, the comparatively lower responses on the “Feedback for Growth” learning condition stood out. The team first reflected on why feedback responses were low, hypothesizing that some students might see feedback as a threat or that students might not understand what feedback for growth actually looks like (according to one teacher, her students described feedback as “when a teacher tells you what you did wrong on a test”).
These reflections led to the realization that as a precursor to improving student perceptions of feedback, they needed to work with students to communicate the intention behind feedback for growth. As one teacher recalled, “We had to pause and teach what feedback was. We know we were giving them feedback, but they might not be aware of it.” After deliberately introducing their students to the concept of feedback for growth, the teachers began testing the “stars” (areas where students are doing well) and “steps” (what students will do to continue to get better) strategy from the PERTS library of strategies related to improving specific learning conditions. While the team of teachers all focused on this same strategy, they adapted it to fit their own classrooms. Iterations included writing feedback on sticky notes for students, having students give themselves feedback on their own work and then talking through it with a teacher, and using a seating chart to track which students had received “star” and “step” feedback during a lesson (leading to the realization that not all students were receiving feedback as equitably as the teachers had intended and that those who received feedback more frequently might feel singled out).
As teachers implemented these strategies, they observed that students were more receptive to feedback and that “stars” made students more willing to discuss the “steps.” When teachers met to review the Copilot survey data several weeks later, the data bore this out, showing an improvement in student perceptions and motivating teachers to continue sharing, testing, and refining their strategies. When the COVID-19 pandemic forced a shift to virtual learning, the teachers were able to see that survey results on feedback and teacher caring had dropped, spurring teachers to adapt their strategies in these areas to meet student needs in a distance context.
The Mineola teachers emphasized the vulnerability that comes not just with seeing their individual data, but with sharing that data with their colleagues. This vulnerability was something that students picked up on and appreciated too. As one teacher described, the idea that “I want to improve myself and improve together . . . is freeing, and I think that alone increases [student] engagement. That’s an intervention in itself.”
Interviewees:
Sarah Gripshover, Director of Research at PERTS
Dave Paunesku, Executive Director at PERTS
Christina Ross, Program Manager, Blueprint Initiatives at Baltimore City Public Schools
The “Measure in Practice” vignette was drawn from a session presented by Teacher Panelists from Mineola Middle School and Sarah Gripshover at the 2020 Carnegie Summit on Improvement in Education.
Carnegie Summit Mineola Middle School Teacher Panelists:
- Gina Amzler
- Lindsey Borges
- Staci Durnin
- Michelle Frasconga
- Heather Hazen
- Diana Kohl
- Jennifer Maichin
- Kerry Murphy
- Courtney Serio
- Anthony Tramonte
- Leslie VanBell