This tool supports the assessment of the rigor, or cognitive demand, of a mathematical task as the task is written (before it is implemented in the classroom).
Use this measure:
PMRR’s measures can be accessed by registering at the PMRR website: pmr2.org.
Note: PMRR’s “Recommended Conditions for Use of the Practical Measures of the Classroom Learning Environment” and annotated versions of the measures (accessed through the PMRR website) provide important guidelines on using the PMRR measures as part of instructional improvement work and describe the research base informing these measures. Much of this overview is drawn from these resources.
Measurement instrument overview
The Rigor of the Task Analysis tool is designed to be used by a coach or district leader with a teacher to discuss the rigor of a task selected for a lesson. The tool helps educators answer the question, “What do students have to do to successfully complete the task?”
The tool consists of a rubric to categorize a task as one of three levels of rigor: “Using Procedures,” “Making Sense of Procedures,” or “Problem Solving.” To help users understand how to use the rubric to categorize tasks, the tool includes examples of tasks related to three math topics — measurement, data, and algebra — followed by a discussion of why each example task was categorized at the level of rigor it was.
Connection to student learning
Research suggests that the rigor of the tasks that teachers select as the basis for their instruction influences all phases of lessons and thus students’ opportunities to learn.i
The PMRR team writes, “As tasks increase in rigor, they admit a wider range of solution strategies, which in turn provide a basis for productive classroom discussions that focus on significant mathematical ideas.”
The Rigor of the Task Analysis tool helps educators distinguish between three levels of rigor:
- Using Procedures: Students can solve the task by using a previously taught procedure and do not need to explain or demonstrate why the procedure works in order to be successful in the classroom.
- Making Sense of Procedures: Students have to explain and/or demonstrate why a procedure works in order to be successful. The cognitive demand of these tasks is higher than Using Procedures because students have to demonstrate that they understand the procedure, often by using models.
- Problem Solving: Students have to figure out which procedures to use by analyzing the task and identifying underlying mathematical relations. The cognitive demand of these tasks is higher than Making Sense of Procedures because students have to analyze tasks mathematically in order to figure out how they can be solved.
PMRR’s annotated version of the Rigor of the Task Analysis Tool, accessed by registering through the PMRR website, contains more information on these levels of rigor and the research base behind them.
i PMRR (2020). Rigor of the Task: Annotated copy. Retrieved December 2020 from University of Washington: https://www.pmr2.org
What we know about how well this measure works for its intended use
The Rigor of the Task tool was informed by the Instructional Quality Assessment (IQA) Classroom Observation Tool, which was used in research to assess students’ opportunities to engage in cognitively demanding mathematics and has been tested for reliability and validity by external researchers. The PMRR team found that the IQA, while useful, required that teachers and coaches undergo extensive training before they could understand and apply it correctly. PMRR worked to create a simplified tool capturing three levels of rigor, along with illustrative examples of each level across three content strands. Throughout the development process, math educators and coaches worked with the PMRR team to collaboratively create and refine the tool. The team has found a positive relationship between the rigor of the task, as assessed by this tool, and the quality of math discourse. The context in which this finding emerged is described in greater detail in the Measure in Practice section below.
A full list of ongoing research regarding the use of the PMRR measures can be found on the PMRR website.
The Rigor of the Task tool should be used with a frequency that aligns with improvement goals and professional learning. For example, the tool might be used more intensively during unit planning when educators are selecting the best tasks to employ during various phases of a unit.
Assuming teachers and coaches have been oriented to the importance of using rigorous tasks and supported through professional learning to understand elements of rigor, it should not take long to review a task with the rubric and determine its level of rigor. However, when planning how often and when to use the tool, improvers should also account for the time needed for the rich discussion of rigorous tasks that the tool is designed to instigate.
Measurement routine details
This rubric is used by teachers, coaches, or district math specialists to assess the rigor of a task as that task is written, before the lesson is implemented. Teachers engaging with the tool should do so in the context of coaching or a professional learning community.
Focusing on a specific task, teachers and coaches start by asking, “What do students have to do to successfully complete the task?”
The tool gives teachers the option of categorizing a task as one of three levels: Using Procedures (least rigorous), Making Sense of Procedures, or Problem Solving (most rigorous). Each level is accompanied by an example and a rationale explaining why the task was categorized as it was, illustrated across three different content strands: data, measurement, and algebra.
The tool is designed to encourage productive discussion between a teacher and a coach about the rigor of a task as written. It also supports teachers and coaches in negotiating a shared understanding about different levels of rigor and about when it is appropriate to use more rigorous tasks (the end goal of using the tool should not be to simply decide whether or not a particular task is rigorous).
While the tool is designed to be used with individual tasks, it can also help teachers and their coaches keep track of students’ learning opportunities over time. For example, in reviewing the mathematical goals for a given instructional unit, they might use the tool to ensure that they include tasks that require a mix of Using Procedures, Making Sense of Procedures, and Problem Solving. Such a mix of learning opportunities is consistent with what is expected of students in many states’ math college- and career-ready standards.
Data analysis details
This tool asks one question: What do students have to do to successfully complete the task?
- If the student can complete the task simply by using a procedure to solve a familiar type of task, the task is scored “1” (Using Procedures).
- If the student must not only use a procedure but also demonstrate or figure out why the procedure works, the task is scored “2” (Making Sense of Procedures).
- If the student must analyze the problem in order to figure out what procedure(s) to use, the task is scored “3” (Problem Solving).
Teachers, coaches, and other instructional leaders can collect these data periodically to look for trends in the level of cognitive demand of the tasks being used in individual classrooms, schools, or districts.
Conditions that support use
- The PMRR team stresses the importance of “embedding the measures in support for teacher learning” and the critical role coaches and leaders with deep mathematical instructional expertise play in helping teachers engage with the tool, make sense of data, and identify next steps.
- The PMRR team took care to introduce the tool to teachers deliberately, first supporting teachers to understand that rigorous tasks are critical to productive student discussions and then introducing the tool as a way to discuss and co-analyze the level of rigor of a task. This positioned coaches as collaborators whose role was to help teachers investigate the impact of a task’s rigor on student learning, rather than as evaluators. The PMRR team observed that the effective use of the Rigor of the Task tool was supported by a collaborative, nonevaluative coaching culture.
- As with all practical measures, the Rigor of the Task tool is intended to inform improvement efforts and guide conversations and reflection. The tool should never be used to evaluate a teacher’s work.
- Because the Rigor of the Task tool is essentially a rubric, organizations that have made use of other rubrics for teacher evaluation, or that lack an improvement-focused, collaborative coaching culture, could be especially susceptible to misusing the tool for evaluative or accountability purposes.
Other tools and resources to support use
PMRR’s Recommended Conditions for Use of the Practical Measures of the Classroom Learning Environment. The Recommended Conditions for Use document provides guidance regarding how PMRR’s classroom practical measures should and should not be used in service of instructional improvement. PMRR generated these “conditions of use” based on systematic inquiry into the use of the measures in their partner districts. Included in the document are recommended conditions for using the measures, recommendations for data analysis, and frequently asked questions about preparing to administer the measures, administering the measures, and analyzing the resulting data after administration.
PMRR’s annotated version of the tool, accessed through the PMRR website, connects items with the research base on student engagement and math learning and includes sample conversation starters and improvement goals linked to each item.
The Rigor of the Task tool was born out of a research–practice partnership between Metro Nashville Public Schools and PMRR. It was later used extensively in a partnership between PMRR and Federal Way Public Schools, a school district outside of Seattle. Federal Way middle school math teachers were expected to use a freely available online curriculum, but there was little guidance about which aspects of the lessons to prioritize and how. District math specialists were concerned that the rigor, or cognitive demand, of the tasks that most teachers were selecting from the online curriculum was low, thereby limiting students’ opportunities to develop conceptual understanding and procedural fluency. In response, the district undertook a Curriculum Guide Writing Initiative and recruited middle school math teachers to serve as Curriculum Guide Writers. The district math specialists’ goal was to support the Curriculum Guide Writers to revise previously developed units to increase the rigor of the lessons. The district also identified “early implementer” teachers who were willing to pilot these revised lessons and provide feedback.
To understand the extent to which a piloted lesson supported mathematical problem-solving among students, the Federal Way–PMRR team relied, in part, on two other PMRR classroom measures — the whole-class discussion survey and the small-group work survey — which capture students’ perspectives on the kinds of learning opportunities provided through a piloted lesson.
Data from these surveys allowed the PMRR team and district math specialists to highlight for Curriculum Guide Writers the connection between student opportunities to engage in mathematical discussion and the rigor of the task. For example, lessons centered on rigorous tasks were linked to students reporting they had more opportunities to make sense of ideas during discussion. This led the Curriculum Guide Writers to a realization: If they wanted to improve the quality of mathematical discussions for students, they needed to first focus on the rigor of the tasks they were choosing.
The Rigor of the Task tool provided the Curriculum Guide Writing Team and district math specialists with a shared language and structure to collaboratively analyze and discuss the rigor of the tasks they were developing. To introduce the rubric to the Curriculum Guide Writers, the PMRR team and district math specialists chose sample tasks from lessons that the writers had already developed and asked them to analyze these tasks using the tool. Next, the PMRR team surfaced the relationship between those categories of rigor and student responses on the discussion survey measures. Then, once the PMRR team, district math specialists, and Curriculum Guide Writing Team had a shared understanding of how to apply the tool and of the importance of task rigor to student discussion, the district math specialists were able to effectively press and support the writing team to increase the rigor of the tasks during their regular curriculum writing meetings.
Importantly, the point of the rubric was not merely to decide whether a task was rigorous. Rather, the power of the tool lay in its ability to provide Curriculum Guide Writers and math specialists with the language and distinctions to discuss the rigor of a task as written and come to a negotiated understanding about how to select and adapt tasks to support student problem-solving. For example, when one district math specialist noticed that the writers had centered an in-class lesson on procedural tasks but had included a rigorous task in the accompanying homework assignment, the math specialist queried, “What would happen if we were to assign the procedural tasks as homework and focus the main lesson around the more rigorous task?” Rather than telling the writers the order was “wrong,” the math specialist’s question pushed the writing team to think through the pros and cons of embedding the more rigorous task into the main lesson versus homework. This captured the spirit in which the Federal Way district math specialists and Curriculum Guide Writers engaged with the tool — discussing their reasoning, making sense of the different tasks available to them, and making intentional decisions about how to sequence and improve the tasks.
Kara Jackson, PMRR Co-PI and associate professor at the University of Washington College of Education
Paul Cobb, PMRR Co-PI and research professor at Vanderbilt University
PMRR (2020). Recommended conditions for use of the practical measures of the classroom learning environment. https://docs.google.com/document/d/1esTawjoP96RpJSKq-zkeBinKl2axgKhYxi-JCXWEleo/edit#
PMRR (2020). Rigor of the Task: Annotated tool. Retrieved December 2020 from University of Washington: https://www.pmr2.org