During her CueThink pilot, eighth-grade math teacher Krista Porter from Burleson, Texas, used CueThink to support her students’ mathematical reasoning while preparing for her state assessment. Porter explained, “The flexibility of CueThink allowed me to pull questions [released from past state tests] and put them in my question bank and have the students work on them. The students were able to go through the various problem-solving methods and figure out what worked best. When one student had a particularly interesting solution (right or wrong), the others could comment on it.”
This blog post describes four ways in which released test questions can be used within CueThink to uncover student thinking and mathematical reasoning:
Solve the original multiple choice problem
Change the multiple choice problem into an open response question
Shorten the multiple choice problem into a scenario
Create an error-containing thinklet that incorrectly answers the question
For each strategy, the following example question from the Texas Education Agency document “State of Texas Assessments of Academic Readiness (STAAR®) Incorporating Process Standards” is used or adapted to illustrate how simple the process is:
“Of the 250 sheep in a flock, 34% are white. What is the total number of white sheep in the flock?
A) 85 (correct answer)
B) 216
C) 165
D) Not here”
Solve the original multiple choice problem within CueThink
Enter the question exactly as it appears on the assessment, multiple choice options and all. Even though the problem retains its answer choices, students still have to analyze the information, formulate a plan, determine a solution, justify that solution, and evaluate the problem-solving process and the reasonableness of the solution within CueThink.
Students analyze the given information in the Understand Phase as they notice and wonder about both the question and the multiple choice options. Frequently, multiple choice questions include two sets of similar answers; noticing this pattern can help students increase the probability of solving the problem correctly. In the Plan Phase, students are prompted to formulate a plan or strategy. Taking the time to write a detailed plan prevents students from simply guessing the answer. Determining and justifying the solution are equally important skills that students practice in the Solve Phase. And instead of worrying whether your students check their work, use the Review Phase questions to prompt them to evaluate the problem-solving process and the reasonableness of the solution.
Once a student has solved the problem in CueThink, the learning is just beginning. Using the annotation process, peers view student work and analyze mathematical relationships by looking at the variety of possible strategies used to solve the problem.
Even though there is only one correct answer, there are a number of possible strategies that students could use to solve the problem. Some possible strategies are:
Find 30% of 250, find 4% of 250, and add those values together.
Find 1% of 250 and multiply that value by 34.
Find 35% of 250 and subtract 1% of 250.
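Quick arithmetic confirms that all three strategies arrive at the same answer:

```latex
\begin{align*}
0.30 \times 250 + 0.04 \times 250 &= 75 + 10 = 85 \\
(0.01 \times 250) \times 34 &= 2.5 \times 34 = 85 \\
0.35 \times 250 - 0.01 \times 250 &= 87.5 - 2.5 = 85
\end{align*}
```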
Giving students the opportunity to view peers’ work and evaluate their strategies helps them understand which strategy is most logical and efficient.
Translate the original multiple choice problem into an open response question for students to solve within CueThink
Changing a question from multiple choice to open response is as simple as removing the answer choices. Without answer choices, peers and teachers are better able to identify gaps in understanding. In a multiple choice problem, a student can always guess or deduce the correct option: random selection alone gives a 25% chance with four choices. For example, in the problem above, a student might not be able to find 34% of 250 because 34% is not a benchmark percentage, but might know that 50% of 250 is 125. Since 34% of 250 must be less than 125, this understanding allows the student to eliminate answers B) 216 and C) 165 and leaves a 50/50 chance of guessing the correct answer. The greater the chance of guessing, the greater the probability that the teacher will be misinformed about a student’s level of understanding.
By translating the multiple choice question into an open response question, students must generate the correct answer on their own. Seeing each student’s exact answer along with their problem-solving process helps teachers determine whether students fully understood the content assessed in the question. There is no longer a risk that a student guessed the correct answer without understanding the concept.
Pose a scenario based on the original multiple choice problem to extend the rigor
Changing a problem into a scenario means removing both the multiple choice answer options and the question itself. Without an explicit question, students pose and then solve their own question or questions. This openness lends itself to natural differentiation because students can generate increasingly complex questions to challenge themselves.
From the example state assessment question, “Of the 250 sheep in a flock, 34% are white. What is the total number of white sheep in the flock?” the scenario could be “Of the 250 sheep in a flock, 34% are white. Write a mathematical question you could solve using the given information.” This scenario still addresses the content goals of assessing students’ ability to solve problems involving ratios, rates, and percents but could also become much more complex based on the question the students write.
Some possible questions a student could pose are:
How many white sheep are there?
If the rest of the sheep are black, how many more black sheep than white sheep are there?
If there are an equal number of black and grey sheep, how many of each color sheep are there?
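Quick arithmetic shows how the rigor grows with each posed question:

```latex
\begin{align*}
\text{white sheep} &= 0.34 \times 250 = 85 \\
\text{black sheep} &= 250 - 85 = 165, \qquad 165 - 85 = 80 \text{ more black than white} \\
\text{equal black and grey} &\Rightarrow 165 \div 2 = 82.5 \text{ of each color}
\end{align*}
```

Note that the third question yields 82.5, a non-whole number of sheep. That result can itself spark a valuable Review Phase conversation about whether an answer makes sense in context.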
Increasing students’ autonomy makes them more responsible for determining their own path and naturally inclined to show teachers the depth of their understanding of the content.
Create an error-containing thinklet answering a state assessment question for students to analyze within CueThink
In their article “Get the Goof!,” Michelle H. Pace and Enrique Ortiz cite Bray (2011) when they state, “research suggests that focusing students on analyzing and discussing mathematical errors can emphasize classroom discourse that builds on students’ thinking, promotes conceptual understanding, and mobilizes students as a community of learners” (2016). Thus, teachers can empower students and directly address misconceptions by presenting error-containing thinklets based on state assessment questions. The Massachusetts DOE and other states release sample student work and scoring guides from previous years’ assessments. Each release packet contains a series of student work samples aligned with a specific score. Select one or two pieces of work that contain errors and use them to create thinklets. Then ask students both to score the thinklet using the rubric and to create a new version of the thinklet that fixes the mistakes. Read our blog to learn more about the Benefits of Error Analysis.
Results of using CueThink to prepare for state assessments
Ms. Porter was extremely pleased by her students’ results with CueThink. “The students learned so much!” she said, adding that “[CueThink] helped my students visually see the problem solving. They collaborated with each other to boost confidence and skills in critical thinking and reasoning. [The group using CueThink] took the test and had a 94% passing rate. The one student who did not pass missed by one question.” Porter attributed that student’s result to test anxiety, not a lack of knowledge.
Share Your Math Stories
Let us know if you try this activity with your students! Email us, blog or tweet using the hashtag #makemathsocial. We’re always looking to share user stories on our blog, so please email firstname.lastname@example.org for a #makemathsocial blog or interview slot!
“Massachusetts Comprehensive Assessment System.” Massachusetts Department of Elementary and Secondary Education, 13 Oct. 2017, www.doe.mass.edu/mcas/student/2017/.
Pace, Michelle H., and Enrique Ortiz. “Get the Goof!” Teaching Children Mathematics, vol. 23, no. 3, 2016, p. 138, doi:10.5951/teacchilmath.23.3.0138.
State of Texas Assessments of Academic Readiness (STAAR®) Incorporating Process Standards. Texas Education Agency, Jan. 2017, tea.texas.gov/student.assessment/hb3plan/hb3-sec1ch2.pdf.
Trautz, Caryn. “Benefits of Error Analysis, CueThink Style.” CueThink, 1 June 2015, www.cuethink.com/blog/2015/5/20/error-analysis-cuethink-style.