The MAP (Missouri Assessment Program) 4th grade mathematics test includes nationally normed multiple-choice items from the TerraNova assessment from CTB/McGraw-Hill (p. 2 in the MAP examiner's manual at the Missouri DESE web site). Columbia Public Schools students' median national percentile rank dropped five percentage points on these items after TERC Investigations was adopted. The adoption was phased, with Fairview Elementary adopting TERC in all grades for the 2001-2002 school year and the remaining elementary schools adopting TERC for fourth grade in 2003-2004.
According to Resendez and Manley (2005), cited on the U.S. DOE website, the TerraNova CTBS is a reliable and valid standardized test that offers broad coverage of the mathematics content in most textbooks and reflects the National Council of Teachers of Mathematics (NCTM) standards.
According to Assessment Standards for Missouri Public Schools on the Missouri DESE web site, "the advantages of these items are: 1) they are effective in measuring students’ breadth of content knowledge; and 2) a large number of these items can be administered and scored in a short amount of time."
A major criticism of TERC materials is their inadequate coverage of standard mathematical content like definitions and terminology, a criticism which is clearly supported by CPS test results.
12 comments:
You need to take this particular piece of data down when trying to support your argument. While it is fair to say that the score(s) have dropped for the CPS TerraNova, you might need to explain to people the actual data you are presenting. Since you are "Parents For Real Math", let me explain this in real math terms. The only thing this data tells me is the "median score dropped." In other words, the student score in the middle dropped. It does not mean that the overall scores have dropped. You cannot fairly say that "overall" scores have dropped or increased. Yes, I said increased because it is possible for the "median/middle score" of a data set to drop while all of the students above the median improved (maybe significantly). Additionally, many of the scores below the median could have increased significantly. Here is an "investigation" for each of you to do. Create a data set with a median of 40%. Now create 2 new data sets with the median being 35%. However, make one data set such that the lower scores increase and the upper scores increase. Make the other data set such that the lower scores decrease and the upper scores decrease. If you have middle school or high school students, ask them to "explore" these with you.
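For readers who want to try the suggested "investigation," here is a minimal sketch of the commenter's point, using made-up scores (not CPS data): the median can fall even while every score except the middle one rises.

```python
from statistics import median

# Hypothetical score sets, not actual CPS data.
before = [20, 30, 40, 50, 60]   # median is 40
after = [25, 34, 35, 55, 65]    # median is 35

# The median dropped by 5 points...
print(median(before), median(after))  # 40 35

# ...yet every score except the middle one went up
# (20->25, 30->34, 50->55, 60->65); only 40->35 fell.
```

Of course, with roughly 1200 scores per year rather than five, a sustained 5-point median shift is much harder to attribute to movement in a single score.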
Dear anonymous,
There are around 1200 student scores in each year of data so we're not talking about a small data set where one student's bad day might shift the median 5 percentage points. Although your scenario might be plausible with a handful of scores and creative middle school logic, most reasonable people would agree that with 1200 scores each year, the median is a good indicator of overall achievement.
I have also looked at the number of CPS students scoring in each quartile. The percent of CPS students scoring above the national average dropped from 65.9% to 62.3% after CPS adopted Investigations (even after excluding the first years Investigations was adopted because the scores dropped so precipitously.)
I don't see how you can think these scores are irrelevant.
Dear Mrs. Pruitt,
Thank you for your response to my email. I did not say the data was irrelevant. I simply stated that you needed to explain more about the data and what it can/cannot tell us. It is a starting point for a discussion around the data, but it is difficult without more data to condemn the math curriculum with this data. I have heard many conversations in this community about this math curriculum. I have talked to many parents. Many of the people voicing opposition to this curriculum state that students in Columbia are not learning to "add, subtract, multiply and divide." However, we have half of our students scoring better than 65% of their peers nationwide. Surely, you can agree that to make this type of statement (on radio, recently) was incorrect and not fair to the Columbia teachers and Columbia school district. To a concerned parent like myself, it would seem highly unlikely to justify this type of statement in light of the data. Yes, the median has dropped, but what else does the data tell us? Since you have the data for the "quartiles," why don't we see a comparison of the % of students in each quartile over the 5 years? This would allow us to better understand what is happening above and below this median value. What is the range of the data set each year? Where are the interquartiles of the data? This would provide much better data than the median alone.
Mrs. Pruitt,
It is me (anonymous) again. I have been reviewing the statements made about the implementation of this new math curriculum. How was the curriculum phased in here in Columbia? If I understand correctly from your post, Fairview adopted it at all grades in 2001-2002. If one school adopted the program during the 2002 school year, is it fair to use 2000-2001 as your baseline for the data? Fairview only has around 80-100 students in 4th grade. Out of 1200 students, is it fair to blame the adoption of the curriculum for the drop in scores on the 2002 MAP data when only a small percentage of the population was impacted by the curriculum? Does this make sense, or do I not understand your post? As I said earlier, I like this type of discussion around the data. I still disagree with the assumption that we attack the curriculum based on the median dropping, but there are some other issues to consider. I'm a very discriminating consumer and I am not easily persuaded one way or the other. I have seen people twist and manipulate data to fit their purposes way too many times. I want to know the data well before I make accusations. Thank you for letting me be a part of this discussion. By the way, I am a new Columbia school district parent. I have a daughter in K. So far I am pleased with what she is learning. They have spent a lot of time learning numbers, counting and recognizing patterns. As we all know, we understand the world through pattern recognition, and it plays a huge part in how students understand mathematics. I'm looking forward to many more conversations around this math curriculum. Thank you.
Mrs. Pruitt,
I just found an interesting site with more math data on the department of elementary and secondary education website. Check out the MAP data below:
Year    % Step 1/Progressing    % Proficient/Advanced
2001    22.8%                   44.0%
2002    23.8%                   44.4%
2003    21.7%                   39.5%
2004    17.1%                   46.2%
2005    16.1%                   46.1%
This data would support my theory that it is possible that students below and above the median improved while the middle part of the data saw a decrease. In order to be fair and open in this discussion, this is not Terra Nova data. From what I know, the Terra Nova is part of the overall MAP score. So this data shows positive trends over a similar time period.
Now, if we blame the curriculum for the drop in the median scores (students in the middle), we need to recognize that this curriculum is having a positive impact on students in the lower levels. Additionally, it is moving more students up into the upper levels on the MAP data.
Another thing I noticed was that in 2001, 1.8% of the 4th grade students were in "Level Not Determined." It explained that these students did not get a MAP score (therefore, no Terra Nova score?) because they did not have a valid attempt. In 2005, that number was down to 0.3%. So there were more students providing a valid attempt. My question is: who are the students in that 1.8%? Are they the best and the brightest of Columbia public school students, or do they tend to be more of a challenge group? Could it be some of both? Could these students' participation have had some impact on the drop in the median? It is possible.
Okay, I'm done for now. I need to go play with my little girls. Again, thanks for letting me voice my thoughts as I try to understand both sides of this issue. As a parent, I want to make a reasoned and informed decision on where I stand with regard to this issue. I am a parent for real math, but I also must use real math to analyze and study all of the information available to make good decisions. Too many people are like sheep and follow without questioning and understanding what is being presented. That is not me. So I apologize for that. I will probably bug you more in the future as I grapple with this. Thanks.
Do you have the trend data for the 10th grade MAP test? I noticed on Columbia's website they had the MAP data posted.
For 2006 and 2007, the median Terra Nova score was 77 and 79 percent! I would be interested in seeing a historical graph like you have presented for 4th grade with the 10th grade Terra Nova data. Is having over 50% of our students scoring better than 79% of 10th graders nationwide an indication that this integrated math is working?
Would we use the same logic for the 10th grade test as you are using for the 4th grade test? If so, maybe you need to amend your criticism of the high school math curriculum or at least admit the data is somewhat inconclusive.
Again, this is just another point for us to have a reasoned and intelligent discussion about what the data can and cannot tell us.
Hmm, do you have a graph of the 10th grade Terra Nova scores for 2001-2007? I would really like to see that graph in relation to the 4th grade Terra Nova data. Again, it will give us some interesting data to have a conversation around. Can you tell I love to look at data from lots of perspectives? I am very skeptical of all data because it can be shaped and presented in such dubious ways to sell a weak argument.
In looking at the Columbia web site, I found the 2006 and 2007 Terra Nova data for 10th grade. It kind of jumped out at me. Columbia had median scores of 77 and 79, respectively. I would really like to see whether we have seen a decrease or an increase at the high school level.
If this Core-Plus at the high school is so bad, why do we have 50% of our students scoring better than 79% of 10th graders nationwide? If we can use the median test scores for 4th grade to denigrate the elementary curriculum, should we not use the 10th grade data to validate the success of Core-Plus? Wouldn't the same logic be applied to both sets of data?
Just some food for thought as I try to wrap my brain around all of this data.
Don't you just love data! Man, I do!! I can't get enough of it.
Oh, I see now. I submitted the same comments twice because you have changed the settings on this blog! Am I to assume that you are not one in favor of an open dialogue about this data?
Dear Anonymous,
First, let me clarify some points of information in your comments. Then, I will try to address some of your questions, and finally I will summarize your arguments and rebut them.
On December 14, 2007 3:03 PM you said:
"If one school adopted the program during the 2002 school year, is it fair to use 2000-2001 as your base line for the data. Fairview only has around 80-100 students in 4th grade. Out of 1200 students, is it fair to blame the adoption of the curriculum for the drop in scores on the 2002 MAP data when only a small percentage of the population was impacted by the curriculum."
Here is verbatim the information I received from Chip Sharp, Secondary Mathematics Curriculum Coordinator for CPS, regarding the timeline of implementation:
"Investigations in Number, Data and Space was implemented in as a “phase-in procedure” across the district. We piloted the program in K-2 in some buildings during the 00-01 school year. We implemented the program district-wide in K-2 during the 01-02 school year. During the 02-03 school year we added third grade. During the 03-04 school year we added grades four and five. However, Fairview Elementary implemented the program K-5 during the 2001-2002 school year."
Additional information you may find helpful is that prior to the opening of Paxton Keeley Elementary in fall 2002 (with mostly students from the Fairview attendance area) Fairview Elementary was by far the largest elementary school in CPS with over 150 students in fourth grade each year from 1998-99 through 2000-2001. Combined enrollment at Paxton Keeley and Fairview in 2002-2003 was 189 students, more than 15% of CPS fourth graders.
Therefore, in my comparison I eliminated years 2001-2002 and 2002-2003 since it would be impossible to separate students studying Investigations from students using a more "traditional" curriculum during those years.
In addition, I did not use 2000-2001 as my base year. I used three academic years starting in fall 1998 through spring 2001.
On December 14 at 2:14 pm you said:
"What is the range of the data set each year?"
I presume the range would be 1-99% since the data is percentile rank.
On December 14 at 2:14 pm you said:
"Where are the interquartiles of the data?"
I have not requested interquartile data. I strongly suggest you contact CPS for data.
You argue that CPS students score above the national average and that this indicates that the reform curricula are effective.
On December 14, 2007 2:14 PM you said:
"However, we have half of our students scoring better than 65% of their peers nationwide." (regarding fourth grade TerraNova scores)
On December 18 at 11:47 pm you said:
"For 2006 and 2007, the median terra nova score was 77 and 79 percent! I would be interested in seeing a historical graph like you have presented for 4th grade with the 10th grade Terra Nova data. Is having over 50% of our students scoring better than 79% of 10th graders nation wide an idication that this integrate math is working."
However, the relevant question is: has the new curriculum raised or lowered achievement? And are students more or less prepared to compete on a national and international basis for post-secondary education and jobs?
Yes, CPS students are above average---if you are new to Columbia, you may not be aware that we have one of the highest rates of graduate degrees per capita in the nation so we would expect CPS test scores to be consistently above the national average. But what is the trend over time, and specifically, do measures of achievement show improvement since the new curricula were adopted?
For fourth graders, the Terra Nova has the advantage of having a stable historic record and national norming. The MAP test is relatively new and the "cut scores" for proficient and above were adjusted in 2006. The MAP test is specific to Missouri so it is not useful for comparing student achievement on a national basis.
Let me address your comments specifically about the 10th grade MAP scores.
There is a problem with looking at 10th grade scores without separating students in the algebra pathway from students in the integrated pathway. Using the 10th grade Terra Nova scores before and after the adoption of Core Plus would be like judging Investigations based on 4th grade scores in 2001-2002 and 2002-2003 when only some students were using the reform curriculum. I would like very much to see a comparison of student scores distinguished by their enrollment in integrated vs. algebra pathway. Perhaps as this debate unfolds, that information will be made public. The CPS Secondary Math Task Force report released last spring did publish some information regarding integrated vs. algebra pathway ACT scores that showed integrated students scored five points lower than algebra pathway students. (A similar comparison for honors students had only six student scores in integrated so I would argue it is statistically useless although that information appeared only in the text report which is not available on the CPS web site. Only the presentation summary is available as far as I can tell.) If the Terra Nova scores showed a significant departure from historic trends after Core Plus was adopted, we might hypothesize that the change was due to the majority of students being enrolled in Core Plus. But really we need the scores separated by pathway.
Another complicating factor in looking at 10th grade scores is that students in 10th grade last year were not exposed to Investigations throughout elementary school. Depending on which elementary school they attended, they would have had Investigations probably only in fourth and fifth grade. As future cohorts enter 10th grade, we will have a clearer view of the cumulative effects of reform math on CPS students, although not a clear comparison with a more traditional alternative.
You also asked if it was "fair" to use 2001 as the baseline for 4th grade scores, and then you used 2001 as your baseline for comparison for the 10th grade scores. I have not looked at the 10th grade Terra Nova to confirm your data. If you have a link to Terra Nova scores (including scores starting several years before Core-Plus was adopted), I would be interested in seeing those scores.
And back to the original topic of my post, the fourth grade Terra Nova scores:
On December 3, 2007 12:42 PM you said:
"The only thing this data tells me is the "median score dropped." In other words, the student score in the middle dropped. It does not mean that the overall scores have dropped. You cannot fairly say that "overall" scores have dropped or increased. Yes, I said increased because it is possible for the "median/middle score" of a data set to drop while all of the students above the median improved (maybe significantly). Additionally, many of the scores below the median could have increased significantly."
On December 14, 2007 3:03 PM you said:
"…it is possible that students below and above the median improved while the middle part of the data saw a decrease."
So we agree that a drop in the median does indicate that scores in the middle decreased. In fact, the percentage of CPS students scoring in the top quartile dropped from an average of 40% to 36% when you compare the three base years to the 4 years since TERC was adopted by all CPS elementary schools. So I think we can agree that the highest scoring students have not improved. I repeat my earlier comment: "The percent of CPS students scoring above the national average (on the fourth grade Terra Nova items) dropped from 65.9% to 62.3% after CPS adopted Investigations."
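To make the summary statistics in this exchange concrete, here is a short sketch (using invented percentile ranks, not the actual CPS records, which run to roughly 1200 scores per year) of how "percent scoring above the national average" and "percent in the top quartile" are computed from individual national percentile ranks:

```python
# Invented national percentile ranks for illustration only.
ranks_base = [12, 35, 48, 55, 61, 72, 80, 88, 91, 97]
ranks_later = [15, 33, 45, 48, 58, 70, 77, 84, 90, 95]

def pct_above(ranks, cutoff):
    """Percent of students whose national percentile rank exceeds `cutoff`."""
    return 100 * sum(1 for r in ranks if r > cutoff) / len(ranks)

# Share above the national average (50th percentile).
print(pct_above(ranks_base, 50))   # 70.0
print(pct_above(ranks_later, 50))  # 60.0

# Share in the top national quartile (above the 75th percentile).
print(pct_above(ranks_base, 75))   # 40.0
print(pct_above(ranks_later, 75))  # 40.0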
Mrs. Pruitt,
Thank you for you response. It was very informative. I appreciate you being patient with me as I wrestle with this data.
I'm still curious why we should totally throw the data out for the 2001-2002 and 2002-2003 school year. If what I understand the implementation as you described it, the majority of the 4th graders were in a traditional curriculum and they had been taught a traditional curriculum for their whole schooling up to that point. Some of the 4th graders at Fairview had received instruction using Investigation during that school year. So the majority of their mathematical knowledge and understading was grounded in a more traditional curriculum of sorts.
So could this data idicate a downturn in math scores in Columbia due to something other than a math program? I haven't seen anyone discussing this type of possibility.
The following year, 2002-2003, they implemented the 3rd grade curriculum with only Fairview using the Investigations. Again, the majority of the 4th grade math students taking the test would not be receiving instruction using investigations. The 4th graders at Fairview would have had 3 years of their math instruction in a traditional curriculum and 2 years in the investigations.
So what I see is a downturn in math scores coming at about the time we are in the implementation process. The scores are not reflective of the curriculum change at all in my opinion. The 2001-2002 data was impacted very little if any by the curriculum change. The 2002-2003 data was impacted by a small group of students out of 1200 and they were minimally impacted.
Does this make sense?
As a concerned CPS parent, I notice that much of your website is against the current math curriculum used in the elementary schools. What research is available that shows that the old traditional way of teaching is better than what the teachers are using now?
Dear cpsmathmom,
You have accurately ascertained that this blog is concerned with pointing out the problems with the current math curriculum at CPS. But assuming that we therefore propose to return to "traditional" or "conventional" math instruction is a misapprehension. Aside from the test data from CPS reported on this blog and elsewhere, there is precious little research to support any particular curricula. A 1998 report to the California State Board of Education reviewed 8727 studies of mathematics education and found only 110 which met minimum criteria for experimental studies of mathematics education as high quality research. The National Math Advisory Panel report released recently reviewed over 16,000 research publications and concluded that research on many topics is quite thin. They do recommend however that "students should understand key concepts, achieve automaticity as appropriate (e.g., with addition and related
subtraction facts), develop flexible, accurate, and automatic execution of the
standard algorithms, and use these competencies to solve problems."
Our argument is not with emphasis on concepts; instead we disagree with the avoidance of teaching standard definitions and procedures, including memorization of basic facts (like the multiplication tables, the standard stack-and-carry column addition algorithm and other generally applicable techniques).
There is solid research showing that instruction in procedures and concepts support each other. Try Rittle-Johnson, B., Siegler, R. S., & Alibali, M. W. (2001). Developing conceptual understanding and procedural skill in mathematics: An iterative process. Journal of Educational Psychology, 93, 346-362. The current elementary math curriculum Investigations often teaches techniques that are applicable in only special cases. It avoids teaching generally applicable concepts like converting fractions to common denominators, for one example. This is not just an important procedural skill, but also an important concept.
You might also think we prefer "conventional" classrooms where teachers lecture and students sit and listen, then work on problems alone. Frankly, there's nothing wrong with expecting students to sit and listen, but we have specific objections to using "discovery" methods exclusively as promoted by Investigations and other reform math programs. Discovery methods are certainly a useful tool for teachers, but they have some drawbacks and should be part of a balanced approach. Discovery isn't appropriate for every student or for every topic and can be extremely slow, resulting in a delayed acquisition of the basic skills and knowledge needed to advance to algebra.
So what do we propose? A curriculum with a balanced approach with appropriate performance benchmarks leading to algebra for every student. A great first step is to ensure that math lessons include techniques that are universally applicable and thus powerful while teaching for understanding. Students will learn procedures whether by design or incidentally---those procedures should be powerful enough to allow students to progress to mastery and build on prior knowledge to acquire new skills.
I personally have suggested that the current elementary curriculum review committee consider Singapore mathematics as an option. If you'd like to discuss a specific curriculum in detail please feel free to email or call me directly. Any recommendation of a particular curriculum should be the result of community debate and hopefully consensus. I truly hope this blog can contribute to this process based on what we really know about student performance at CPS.
Post a Comment