Making Sense of 6th Grade Virginia SOL Results Through Distractor Analysis

Dr. Virginia Lewis

As a former middle school mathematics teacher and a current assistant professor of mathematics education, I have a professional interest in improving the information available to teachers regarding student performance on the Virginia Standards of Learning (SOL) assessments.  For teachers to make purposeful changes to instruction, "the link between assessment and instruction needs to be made strong and explicit" (National Research Council, 1999, p. 5).  For this reason, my dissertation examined the results of the Virginia Mathematics SOL assessments to better understand why students select incorrect answer choices.  The intent of the study was to reveal these reasons so that teachers could better understand how to use this information, along with their own distractor analysis, to improve students' understanding of mathematics and their performance on mathematics assessments.  The following research questions guided this study.

· What are the most likely explanations of middle school students' incorrect distractor choices when responding to multiple-choice items on the Virginia Mathematics SOL assessment?

· How do the incorrect distractor choices differ among students who pass, pass advanced, fail basic, and fail below basic on the Mathematics SOL assessment?

In order to examine these questions, the Virginia Department of Education (VDOE) provided the frequency and percentage of students who selected each answer choice for each of the fifty items on the 2007 and 2008 sixth-, seventh-, and eighth-grade released tests.  These data are not normally available to teachers and were obtained through a special data request.  The VDOE also provided the frequencies by pass level, which made it possible to examine the responses of students who excelled on the assessment (Advanced), passed the assessment but were not considered to have excelled (Proficient), failed but were close to passing (Fail Basic), and failed well below the passing score (Fail Below Basic).  This article focuses on the sixth-grade data.  The number of students in the data set at each pass level is shown in Table 1.

Table 1. Frequency of students in the data set for pass level by year

Pass Level            2007      2008
Advanced             6,102    14,866
Proficient          15,205    20,067
Fail Basic          10,280    11,559
Fail Below Basic     4,119     4,064
Total               35,706    50,556

The VDOE has published a series of online presentations titled Student Performance Analysis that "are intended to provide mathematics educators with further insight into the concepts that challenged students statewide" (VDOE, 2015, "Using Statewide SOL Results," para. 1).  Each presentation focuses on student performance on the spring SOL assessments for a particular grade level by sharing example items similar to those identified as challenging for mathematics students.  The multiple-choice items in the presentations include the answer choices.  The incorrect answer choices, provided as alternatives to the correct answer on a multiple-choice item, are called distractors.  The correct answer as well as the most commonly selected distractor are identified for each item in the presentation.  In the notes area of the presentation, the VDOE offers the most likely reasons why students would pick that distractor.  It is important to note that these reasons reflect experts' judgments about why a student would choose that particular distractor.
The results of my study are intended to provide additional information for teachers about the misconceptions and issues students are having when they select incorrect responses on the Mathematics SOL assessments.  Even though the Mathematics Standards of Learning were revised in 2009, after my research study was completed, the results remain relevant.  While my study did not involve technology-enhanced items, the older released items are still useful for preparing students for the current assessments, and many assessment items continue to be in the multiple-choice format.

Data Analysis

It is not possible to ask students about their thinking during a standardized assessment.  In situations like this, Patton (1990) advised the researcher to "do the best job he could in describing the patterns that appeared to him to be present in the data" (p. 483).  Kloosterman and Lester (2007) used distractor analysis to better understand student performance on individual items on the Main NAEP.  Their work served as the inspiration for this project, which used a similar framework to analyze the distractor data for the sixth-grade SOL assessment.  While we cannot know for sure how many students picked each distractor for a particular reason, we are able to anticipate student misconceptions and to examine distractors on an assessment to identify the most likely reasons students would select those choices.  In fact, it is standard practice to purposefully use anticipated student errors to create distractors when writing a multiple-choice assessment.
In this study of the sixth-grade SOL assessment results, the explanations for why a student would likely select a distractor were generated by a Document Analysis Team (DAT).  The DAT was made up of three document analysts who independently examined the tests.  These document analysts had "different biases, different strengths, so they can complement each other" (Miles & Huberman, 1994, p. 267).  The DAT, which consisted of a sixth-grade mathematics teacher, a sixth-grade special education teacher, and me, generated possible explanations for why a student would choose each distractor.  Each member of the team independently analyzed twenty-five items before meeting as a team to discuss possible explanations for each distractor.  Since there were one hundred items on the 2007 and 2008 tests combined, the DAT met four times over approximately two months to analyze all one hundred items.  Members of the team generated the same explanation for choosing a distractor about two-thirds of the time.  When more than one explanation was offered during a team meeting, multiple explanations were accepted as reasonable.  In a small number of cases a team member withdrew her explanation from consideration when other explanations were more plausible.
The major categories used to classify the possible explanations for each distractor were based on the Mathematical Abilities described in the National Assessment of Educational Progress (NAEP) framework for the Main NAEP mathematics assessment (National Center for Education Statistics, 2011).  Once the DAT produced a list of possible explanations for each distractor, I categorized each explanation as a Conceptual Understanding (Conceptual), Procedural Knowledge (Procedural), or Problem Solving error.  After all the distractor explanations were categorized, these categorizations were coded alongside the frequencies provided by the VDOE to look for patterns in the data.
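To make the coding step concrete, the sketch below shows one way categorized distractor data could be tallied.  It is a minimal Python illustration, not the analysis procedure used in the study, and the item numbers, categories, and frequencies in it are hypothetical placeholders rather than actual VDOE data.

```python
from collections import defaultdict

# Minimal sketch: each record pairs a distractor's selection frequency (the
# kind of count provided by the VDOE) with the error category assigned by the
# DAT.  All values below are hypothetical placeholders, not the study's data.
coded_distractors = [
    {"item": 8,  "choice": "F", "category": "Problem Solving", "frequency": 15620},
    {"item": 8,  "choice": "G", "category": "Problem Solving", "frequency": 4500},
    {"item": 40, "choice": "H", "category": "Conceptual",      "frequency": 13900},
    {"item": 40, "choice": "G", "category": "Problem Solving", "frequency": 4800},
    {"item": 40, "choice": "F", "category": "Problem Solving", "frequency": 2700},
]

# Tally incorrect responses by error category.
totals = defaultdict(int)
for record in coded_distractors:
    totals[record["category"]] += record["frequency"]

# Report each category as a percentage of all incorrect responses,
# analogous to the layout of Table 3.
incorrect_total = sum(totals.values())
for category, count in sorted(totals.items()):
    print(f"{category}: {100 * count / incorrect_total:.0f}% of incorrect responses")
```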

Summary of the Results

In both 2007 and 2008, Conceptual and Problem Solving explanations for possible error were much more common than Procedural errors.  Table 2 shows that Conceptual and Problem Solving errors most likely accounted for nearly one-fourth of answer choices selected on both assessments. 
An examination of the percentage of incorrect responses, displayed in Table 3, revealed that more than 80% of incorrect responses (84% for 2007 and 86% for 2008) were likely the result of Conceptual or Problem Solving errors.  This demonstrates the much smaller impact of Procedural errors on student performance on this assessment (14% for 2007 and 13% for 2008).  Distractor explanations classified as Conceptual errors included a lack of understanding of mathematical terms, an inability to interpret or use symbols correctly, or a lack of relational understanding among the various forms of rational numbers such as fractions, decimals, and percents.  The Problem Solving classification mostly reflected a failure to use self-monitoring and control strategies effectively, a failure to focus on the conditions of the problem, or a failure to answer the question that was asked.  Distractor explanations were classified as Procedural errors mostly when a student would select that distractor because of a failure to apply a series of steps correctly.

Table 3. Percentage of Incorrect Sixth Grade Student Response by Category

Response Categorization    2007 (%)    2008 (%)
Conceptual Errors              43          44
Problem Solving Errors         41          42
Procedural Errors              14          13
Unknown Errors                  2           1
Total                         100         100

Example Analysis of Two Items from the 2008 SOL Assessment

The following examples were chosen from the sixth-grade SOL assessment to illustrate the impact of Conceptual and Problem Solving errors.  Before reading further, solve the problems in Figures 1 and 2.  Then brainstorm a list of anticipated misconceptions, procedural errors, or problem-solving errors you think could cause a student to select each distractor, and compare your thinking to the analysis that follows.

Analyzing an Item from the Computation and Estimation Strand

According to the classification at the end of the 2008 released SOL test, the VDOE placed the question in Figure 1 in the Computation and Estimation reporting category.  Fifty percent of all students (25,311 of 50,556) answered this question correctly by selecting Choice J.


Figure 1.  Item 8 from the 2008 6th grade Standards of Learning Released Test

One way to solve this problem correctly is to first determine the whole-number miles run in one week by adding all the whole numbers: 0 + 2 + 3 + 2 + 3 + 2 + 10 = 22.  Students could then combine the fractional parts: ½ + ½ + ½ = 1 ½.  Next, combine the whole-number and fractional parts, 22 + 1 ½, to find the number of miles run in one week, 23 ½.  Finally, double this mixed number to find the miles run in two weeks, 23 ½ x 2 = 47, and then double again to find the number of miles run in four weeks, 47 x 2 = 94 miles.
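For readers who want to verify the arithmetic, the short Python sketch below reproduces this calculation with exact fractions.  The per-day distances are reconstructed from the worked solution above (whole numbers summing to 22 plus three half-miles); they are an assumption about the table in Figure 1, which is not reproduced in this article.

```python
from fractions import Fraction

# Daily distances for one week, reconstructed from the worked solution above:
# whole numbers totaling 22 miles plus three half-mile parts (an assumption
# about Figure 1, which is not shown here).
one_week = [Fraction(0), Fraction(2), Fraction(7, 2), Fraction(5, 2),
            Fraction(7, 2), Fraction(2), Fraction(10)]

weekly_total = sum(one_week)        # 23 1/2 miles in one week
four_week_total = 4 * weekly_total  # 94 miles in four weeks

print(weekly_total)      # 47/2
print(four_week_total)   # 94
```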

Examination of the distractors reveals that the other fifty percent of students, those who chose an incorrect answer, most likely made a Problem Solving error by failing to answer the question that was asked.  Notice this type of error seems to have affected students at all pass levels.  This information is useful for teachers as it provides evidence that students at all pass levels would benefit from instruction focused on the problem-solving process.  Students need to check their work at the end of this process to be sure the solution they found answers the original question posed in the problem.

Table 4. Item 8 Percentage of Students Selecting Each Answer Choice by Pass Level

Answer Choice    Below Basic    Basic    Proficient    Advanced    Overall
F                    42.1        42.1       32.0         17.7        30.9
G                    24.9        15.7        7.1          1.7         8.9
H                    19.2        16.6       10.0          2.5        10.0
J (correct)          13.4        25.6       50.9         78.0        50.1


Distractor F (23 ½ miles) was the most popular incorrect response selected regardless of pass level.  The DAT determined students selecting this answer choice most likely correctly combined the rational numbers to determine the total distance for 1 week instead of 4 weeks as requested in the problem.  Selecting Distractor G (47 miles) is likely due to students finding the total distance for 2 weeks but not completely solving the problem to determine the total distance in 4 weeks.  Students selecting Distractor H (70 ½ miles) most likely found the total distance for 3 weeks instead of 4 weeks.

The analysis of this item provides evidence that students at all pass levels are likely selecting incorrect choices due to Problem Solving errors and not just due to conceptual or procedural misunderstandings.  In the current reporting of the SOL assessment results, the teacher and student are aware only that a question in the Computation and Estimation strand was missed.  They are unaware that the most likely reason for missing this item is a Problem Solving error rather than a computation or estimation error.  This is important to know because the instruction needed to improve performance on this item should focus on the problem-solving process and not on a particular Computation and Estimation topic.

Analyzing an Item from the Patterns, Functions, and Algebra Strand

Figure 2.  Item 40 from the 2008 6th grade Standards of Learning Released Test

The VDOE reported the item in Figure 2 belonged to the Patterns, Functions, and Algebra reporting category for the 2008 released test.  Only 57.6% of the students (29,117 of 50,556) selected the correct answer, Choice J.  A student solving this problem may recognize the numbers in the "value" column are all perfect squares.  This may help the student see that squaring the "term" produces the "value."  Another way to solve this problem is to examine the possible patterns posed in the answer choices.  However, to successfully select the correct pattern using this method, the student must "check" the pattern to be sure it is true for each term and its corresponding value in the table.
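The sketch below illustrates this check-every-row strategy in Python.  Because Figure 2 is not reproduced here, the (term, value) pairs are reconstructed from the discussion of this item (terms 11, 12, 13, and so on with squared values), and the rule labels simply paraphrase the four answer choices.

```python
# Candidate rules paraphrasing the four answer choices for Item 40.
candidate_rules = {
    "F: add 110 to the term":     lambda term: term + 110,
    "G: multiply the term by 11": lambda term: term * 11,
    "H: double the term":         lambda term: term * 2,
    "J: square the term":         lambda term: term * term,
}

# (term, value) pairs reconstructed from the discussion of this item;
# Figure 2 itself is not shown, so these rows are an assumption.
table = [(11, 121), (12, 144), (13, 169), (14, 196)]

# A rule describes the pattern only if it holds for every row, not just the first.
for label, rule in candidate_rules.items():
    holds_everywhere = all(rule(term) == value for term, value in table)
    print(f"{label}: {'holds for every row' if holds_everywhere else 'fails'}")
```

Only the squaring rule survives a check of every row; the add-110 and multiply-by-11 rules match the first row only, which mirrors the Problem Solving errors described below.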

The other 42.4% of the students most likely selected an incorrect answer due to a Conceptual or Problem Solving error.  In Table 5, Distractor H (Double the term to get the value) was the most popular distractor for students at all pass levels.  Students selecting this distractor most likely confused the pattern concepts of doubling and squaring.  For example, in a doubling pattern the term is multiplied by 2 to find the value, 11 x 2 = 22, 12 x 2 = 24.  In a squaring pattern the term is multiplied by itself to find the value, 11 x 11 = 121, 12 x 12 = 144.    

This result suggests students would benefit from instruction that focuses on these two pattern concepts and the difference between the terms “double” and “square” in order to avoid this type of Conceptual error in the future.

Distractor G (Multiply 11 by the term to get the value) was the second most popular distractor for students at all pass levels.  Students who selected response G probably did not focus on the conditions of the problem.  They likely multiplied the first term by 11, with a correct result of 121, but did not check to see if this pattern held true for all other terms.  If the other terms were checked, it would be evident the pattern does not hold: 12 x 11 = 132, 13 x 11 = 143.  This error is considered a Problem Solving error because it could have been prevented by checking the pattern for all terms and corresponding values.  Students are likely to make this type of mistake when they fail to take in all the information in the table as important conditions of the problem.

It is likely that students who chose Distractor F (Add 110 to the term to get the value) also did not check their pattern for all the terms in the table.  They recognized that when they added 110 to the first term the result was 121.  However, they likely did not realize this pattern worked only for the first term.  If students checked an "add 110" pattern with the second term, they would have calculated 12 + 110 = 122, not the 144 listed as the second value in the table.  This error could be prevented in the future if students more effectively self-monitored their problem-solving process to be sure the pattern holds true for every term in the table.

It is important to recognize this problem was classified by the VDOE as belonging to the Patterns, Functions, and Algebra strand.  While the 27.5% of students who selected Choice H chose an incorrect answer because of a misunderstanding of pattern concepts, the remaining roughly 15% of students who missed this problem chose an incorrect answer due to a Problem Solving error.  Teachers need to be aware of both of these reasons for incorrectly solving this problem, because the instructional methods needed to address the Conceptual and Problem Solving errors are very different.  It is important to look beyond the strand classification of each item in order to better understand why students are selecting incorrect responses.

Implications for Instruction

Focusing on a Concept

“Conceptual understanding refers to an integrated and functional grasp of mathematical ideas” (Kilpatrick, Swafford, & Findell, 2001, p. 118).  As a constructivist, I believe that in order to learn new mathematical ideas students incorporate them into their existing ideas by building connections.  “The greater the number of connections to a network of ideas, the better the understanding” (Van de Walle, Karp, & Bay-Williams, 2010, p. 24).  For example, students who have a misconception that confuses squaring and doubling patterns could use color tiles to explore these patterns and make visual connections between the concepts.  A discussion of these visual patterns and a re-examination of a question similar to the item in Figure 2 will help students assess their own answer choices and explicitly recognize that they were confusing these patterns.  In this way students will adapt their connected web of ideas to include a better understanding of the differences between squaring and doubling patterns.  The instructional methods chosen to improve conceptual understanding will differ as the concepts vary.  Teachers need to be familiar with and anticipate the Conceptual errors that are most likely to occur in the various mathematical strands.

Focusing on Problem-solving Strategies

To reduce the impact of Problem Solving errors, teachers must explicitly focus on the problem-solving process during mathematics instruction.  Polya (1945) described problem solving as a four-phase process: understand the problem, devise a plan, carry out the plan, and look back.  Effective problem solvers use this same general process regardless of the mathematical strand of a specific problem.  In order for students to devise a proper plan, they need a tool bag filled with various strategies they can rely on when solving problems.  To foster the development of these strategies, teachers need to provide students "frequent opportunities to explain their problem-solving strategies and solutions” (National Council of Teachers of Mathematics [NCTM], 2000, p. 258).  Sharing strategies helps students build their strategic competence (Kilpatrick, Swafford, & Findell, 2001) so they can more effectively analyze the conditions of other problems and select an appropriate strategy to find a solution.
Students can take turns sharing strategies for solving problems while teachers facilitate class discussion that illuminates similarities and differences in these strategies. This sharing process helps students make sense of how their own strategies connect to the strategies of their peers. Teachers can then ask students to solve a similar problem using another student’s strategy to encourage students to develop multiple ways to tackle the same problem. During these activities students may also discover they prefer the strategy of another student because it is more efficient or is more generally applicable for a wider variety of problems.

Focusing on Self-monitoring

The results of this study showed students also need to be taught how to monitor their own problem-solving process.  Teachers can use think-aloud activities to foster students’ development of self-monitoring by sharing their own thinking with students or by providing opportunities for students to share their thinking out loud.  As part of the looking-back phase of the problem-solving process, students “should be honestly convinced of the correctness of each step” (Polya, 1945, p. 13).  Students also need to understand the importance of reflecting on their process when they think they have solved the problem.  "Good problem solvers realize what they know and don't know" and "understand that the problem-solving process is not finished until they have looked back at their solution and reviewed their process" (NCTM, 2000, p. 261).  Self-monitoring also helps students realize when an approach is not working so they can change to a more appropriate strategy.

Conclusion

This study revealed Conceptual errors were likely responsible for approximately 40% of incorrect choices on the 2007 and 2008 Virginia sixth-grade SOL assessments.  Conceptual errors stretched across all strands and pass levels; therefore, the instruction needed to address these errors will vary widely depending on the mathematical strand.  It is important for teachers to understand which concepts are causing their students difficulty and also the impact poor conceptual understanding has on their students’ achievement.  Once a teacher is aware of possible misconceptions, he or she can design instruction to "intentionally diagnose and discriminate the known misconceptions" (Nesher, 1987, p. 36).
Throughout the school year students are given multiple-choice assessments to gauge their understanding of concepts. Teachers can work together as a team to brainstorm possible explanations for why their students would select particular incorrect choices.  This analysis will help them identify possible student misconceptions and errors in order to target instruction to the specific needs of the students beyond what is possible if only the strand of the question is considered in the analysis.  Teachers can then choose or create appropriate tasks and activities that focus on these misconceptions in order for students to "realize for themselves where they were wrong" (Nesher, 1987, p. 35). 
The results of this study also revealed Problem Solving errors are likely having a major impact on students’ performance on the sixth-grade SOL assessments.  Teachers may not be aware of the extent to which Problem Solving errors are affecting their students’ assessment results.  According to the Mathematics Standards of Learning for Virginia Public Schools, "The development of problem-solving skills should be a major goal of the mathematics program at every grade level" (VDOE, 2009, p. 17).  It is important to note that similar Problem Solving errors cause students to incorrectly answer questions in multiple strands.  This means a sustained focus on problem solving will positively impact performance in multiple strands on the SOL assessments for students at all pass levels.
It is my hope this article helps teachers realize the extent to which Problem Solving errors are impacting the performance of their students when answering questions in all of the strands.  Focusing on problem solving all year long, throughout all the strands, is likely to positively impact student performance across all strands.  This focus on process is generally applicable, while the instruction needed to address Conceptual errors requires very specific information about the concepts involved.  In other words, if teachers want to improve student performance and do not know where to begin, they should start with problem solving.
I hope this discussion will also help teachers think about how they can make more effective use of the results of multiple-choice items they use with students throughout the year. These items are constantly providing information about student misconceptions/errors and can be very useful in designing future instruction that focuses on specific concepts or general aspects of the problem-solving process if we look deep enough.


References

Kilpatrick, J., Swafford, J., & Findell, B. (Eds.). (2001). Adding it up: Helping children learn mathematics. Washington, DC: National Academy Press.

Kloosterman, P., & Lester, F.K., Jr. (Eds.). (2007). Results and Interpretations of the 2003 Mathematics Assessment of the National Assessment of Educational Progress. Reston, VA: National Council of Teachers of Mathematics.

Miles, M.B., & Huberman, A.M. (1994). Qualitative Data Analysis (2nd ed.). Thousand Oaks, CA: SAGE Publications.

National Center for Education Statistics. (2011, June 19). Mathematical Abilities. Retrieved from http://nces.ed.gov/nationsreportcard/mathematics/abilities.asp

National Council of Teachers of Mathematics. (2000). Principles and Standards for School Mathematics. Reston, VA: Author.

National Research Council. (1999). Testing, teaching, and learning. Washington, DC: Author.

Nesher, P. (1987). Towards an Instructional Theory: The Role of Student’s Misconceptions. For the Learning of Mathematics, 7(3), 33-40.

Patton, M.Q. (1990). Qualitative evaluation and research methods (2nd ed.). Newbury Park, CA: SAGE Publications.

Polya, G. (1945; 2nd edition, 1985). How to Solve It. Princeton, NJ: Princeton University Press.

Van de Walle, J., Karp, K., & Bay-Williams, J. (2010). Elementary and middle school mathematics: Teaching developmentally (7th ed.). Boston, MA: Allyn & Bacon.

Virginia Department of Education. (2015). Using statewide SOL test results to guide instruction. Retrieved from http://www.doe.virginia.gov/testing/sol/performance_analysis/index.shtml

Virginia Department of Education. (2009). Mathematics Standards of Learning for Virginia Public Schools. Retrieved from http://www.doe.virginia.gov/testing/sol/standards_docs/mathematics/review.shtml

 

Dr. Virginia Lewis
Assistant Professor
Longwood University
lewisvv@longwood.edu

 


 

