The charts below show that the number of CPS students enrolling in remedial math courses at the post-secondary level jumped after Core Plus was adopted in 2001-02. Remedial math enrollment rates were in the single digits before 2001; by 2006, 20% of Hickman graduates enrolled in remedial math at colleges and universities. Rock Bridge's results are just as alarming.
The original data used to prepare the charts are available at the DHE website here: http://www.dhe.mo.gov/hsgrad2005table1c1.shtml The Missouri Department of Higher Education reports high school graduates' enrollment and preparation at Missouri public colleges and universities for each year since 1996. This includes all students who enroll full- or part-time as degree- or non-degree-seeking first-time freshmen at all two- and four-year post-secondary institutions in Missouri.
State Representative Ed Robb currently has a bill in the Missouri House that would allow students at two-year or four-year colleges or universities to seek tuition reimbursement from their high schools when they have to take post-secondary remedial courses. Whether you agree or disagree with this course of action, it's clear that the problem warrants attention.
It's no surprise that ACT scores (chart below) also went down during the same period. These are immediate negative results, much like the immediate deleterious effect of Connected Math on the IAAT scores of 7th graders, which we reported on this blog. What is surprising is that the district isn't looking at or reporting these results in its Secondary Math Task Force report to the Superintendent. The historical trend in Iowa Algebra Aptitude Test results was never examined either.
The Columbia school district continues to compare itself only to other Missouri schools, and plans to narrow those comparisons even further by suggesting that CPS should only be compared to schools that use experimental curricula like Core Plus. This is despite the fact that, according to a report by the Missouri Alliance on Math, Engineering, Technology, and Science Education, only 17 percent of Missouri's 10th-grade students scored proficient or advanced in math in 2005.
8 comments:
Since there are two math sequences at CPS, how can you tell whether the ACT data support your argument? It is conceivable that the students taking the integrated sequence did better than those in the traditional sequence, or the reverse may be true. You could remedy this problem by showing ACT data for students in the integrated sequence and comparing it to data for students in the traditional sequence. Your position would predict that the students in the traditional track would do much better than those in the integrated track. Can you provide this data?
Mark, isn't it "coincidental" that the ACT scores started dropping AFTER the new curricula were introduced? From what I have been told, only around 10% of students are in the more traditional sequence. I think that is a strong indicator that the new curricula have affected performance. I am not sure that the district or state has a breakdown of how students in the traditional vs. integrated sequences perform on the ACT. I think the main issue raised is that the district is only providing the public with positive information and neglecting to report the downward trend that has occurred since these curricula were implemented.
Dear Mark,
You make an excellent point. There are several other indications that the decline in ACT scores and the rise in remedial math enrollment are due to Core Plus, but you can weigh whether they are convincing.
First of all, the majority of students are enrolled in integrated math, so the average ACT scores will be heavily weighted toward the Core Plus students' scores. In addition, the ACT scores and remedial math enrollment rates include only students who enroll at post-secondary schools in Missouri, while most of the top students (those in the traditional track) will attend college out of state. The timing of the increase in remedial math enrollment, coinciding with the introduction of Core Plus, is also suggestive.
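To illustrate the weighting point, here is a minimal sketch with made-up numbers. The track shares and subgroup means below are assumptions chosen for illustration only, not district data:

```python
# Hypothetical illustration (all numbers are assumptions, not district data):
# if roughly 90% of test-takers come from the integrated (Core Plus) track,
# the published composite average is dominated by that group's scores.
integrated_share, traditional_share = 0.90, 0.10   # assumed track shares
integrated_mean, traditional_mean = 20.5, 24.0     # assumed subgroup ACT means

composite = integrated_share * integrated_mean + traditional_share * traditional_mean
print(composite)  # ~20.85 -- sits much closer to the integrated-track mean
```

In other words, whatever the traditional-track students score, a 90/10 split means the reported average will track the integrated-track results almost exactly.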
I would love to be able to look at AP scores (Calculus AB, Calculus BC, and Statistics) broken down by the number of integrated- vs. traditional-track students sitting the exams, and to analyze the score distributions for those groups. I think the school district could do this if it chose to.
I understand that the school district doesn't have access to ACT and SAT scores of all their students (that's part of why using the ACT as an exit exam would be beneficial to students and schools) but they need to examine data they do have objectively and thoroughly.
There are no Integrated students in Calculus BC. At least not yet. I don't know of any Calc AB students from Integrated either, but there might be a couple.
It's interesting that the greatest year to year decreases in ACT scores at both HHS and RBHS occurred from 1997 to 1998 and 1998 to 1999, well before the introduction of Core Plus.
As to the question of causation, the seemingly simultaneous increase in remedial enrollments and introduction of Core Plus shows only that the subject should be investigated. I'm sure that the AP Statistics students could investigate this and determine whether it is just common response (e.g., 75% of people involved in car accidents reported having eaten french fries in the previous week; eating french fries, of course, does not cause car accidents), confounding variables (more than one variable involved in the cause-and-effect relationship), or a genuine cause-and-effect relationship. This is much more complicated than just looking at a few bar graphs.
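For what it's worth, here is a minimal sketch of the kind of comparison an AP Statistics class could run if track-level ACT scores were released. Every value below is invented for illustration; real CPS records would replace them.

```python
# A minimal sketch of a two-sample comparison between tracks. All values here
# are hypothetical placeholders, not real CPS data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
integrated = rng.normal(loc=20.5, scale=4.5, size=300)   # hypothetical integrated-track scores
traditional = rng.normal(loc=23.0, scale=4.5, size=40)   # hypothetical traditional-track scores

t, p = stats.ttest_ind(integrated, traditional, equal_var=False)  # Welch's t-test
print(f"Welch t = {t:.2f}, p = {p:.4f}")
# Even a statistically significant gap would not settle causation: students are
# not randomly assigned to tracks, so confounding variables remain in play.
```

As the comment notes, a result like this would only justify a closer investigation, not prove that the curriculum caused the change.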
The data shown above do not indicate whether there has been a change in the number of students taking the ACT over the years, or a change in the percentage of total students taking the ACT. Without this information, interpreting the data presented in the graphs is very difficult. One could even say that without this information, the graphs could be misinterpreted (as is happening here).
I'm new to this discussion, so forgive me if this has already been brought up elsewhere. It seems to me it is difficult to make the statement "ACT scores decline under Core Plus." The other responses gave very valid reasons why it is difficult, but I have one glaring issue with this statement not mentioned yet: the ACT scores cited are "composite" scores, the overall ACT results (English, math, reading, and science combined) for Columbia students over this period. Apparently, we are blaming the decline in the composite score entirely on the change in the mathematics curriculum. I believe it is very, very difficult to make that leap; there are many other factors contributing to it.

Another comment regarding the increase in the number of remedial math students: when I looked at the data for many other schools, I saw similar trends over the same period. Are we to assume they are all using "Core Plus"? Many of these schools show similar increases, and a few schools that I am familiar with do not use "Core Plus" or anything like it yet have very similar results. So I'm not sure this can be totally blamed on the "Core Plus" curriculum.

When trying to make these broad, sweeping indictments of a program, it is very important to consider the data source and other contributing factors, and to make reasoned, informed decisions about what can and cannot be said from the data. It is easy to take data and make them say what we want. A parent "for real math" must be a savvy consumer of the data being presented: ask lots of questions and look at all factors related to the data being submitted. In my mind, many of these arguments are so weakly supported that I find it difficult to buy into the hype concerning our math program. I am open to more data being presented on this topic, but the arguments here are clearly arranged to paint a picture that I do not believe the data support. If I were a math teacher in this district or another, I could easily use some of this data and these arguments to create some fantastic discussions about the poor use of data to support a point of view.
You can use all the logic you want to try to decide whether Core-Plus affects ACT scores or not, but I really don't think that's the major issue. The real issue is whether or not to get rid of Core-Plus altogether, and I definitely think Core should be eliminated from schools, or at least offered as optional. Core books don't have glossaries, they have numerous errors in both the answer keys and word problems, and they are the most frustrating math books that I've ever used. Talk to any student in a Core mathematics class and you will get a very similar response.
Parents of Core students dislike the curriculum just as much because they can't help their kids with math homework at all. The wordage is completely different than in any other math book, and there are many concepts that students are just expected to know, or that they're expected to remember clearly from years past.
Basically, Core-Plus has brought me nothing but frustration. Because of this curriculum, I am dropping math next semester, and I will only take math next year if my school decides not to replace F.S.T. with another level of Core.
That's how bad it is.