PWCS Passing Fewer Students on VA Math SOL

PWCS has been passing fewer students on the VA Math SOL exams than other districts in the state since Math Investigations was mandated county-wide.

We recently received a copy of a presentation provided by the PWC Math department to elementary teachers as part of an in-service.  That presentation claimed that grade 3, 4, & 5 pass rates have increased since 2007 and indicated that Math Investigations, and the inquiry-based approach to instruction PWCS has adopted, were the cause of those increases.  That implication is not entirely accurate.  It could even be considered misleading, if we assume district employees are in the business of reporting the truth rather than affirming that the programs they support are peachy-keen.

First, Math Investigations wasn’t mandated county-wide in Grades 3, 4, & 5 in 2007.  In fact, the first full MI year for Grade 3 was the Spring of 2008, for Grade 4 it was 2009, and for Grade 5 it was 2010.  So claiming that increases from 2007 to 2011 are due to Math Investigations isn’t correct, as MI wasn’t used in PWC classrooms until 2008, 2009, or 2010, depending on grade level.

Second, the state only began testing 4th graders in math in 2006, so pass rates in 2006 and 2007 reflect teachers adapting to testing in Grade 4.  The lowest 4th grade pass rates, both county-wide and across the state, are in 2006 and 2007.

Third, the presentation ignores Spring 2011 pass rates for Grade 5 and instead presents Spring 2010 data, despite claiming that pass rates have increased from 2007 to 2011.  Perhaps that’s because Grade 5 pass rates for Spring 2011 were down 3 points, to 90%.

Fourth, the presentation lacks an unbiased point of comparison.  When our children compete for slots in state colleges and universities, they don’t compete solely against students in Fairfax, Loudoun, and Stafford; they compete with students from across the state.  Yet the presentation never compares PWCS to the state as a whole.  Perhaps that’s because our rank in the state has been declining since MI was mandated in 4th and 5th grades, and in 3rd grade only rebounded to pre-MI levels in 2011.

The chart below contains the percentage of students passing the Math SOL by grade level since 2007.

Yellow indicates a Math Investigations Year.

Based on pass rates alone, things don’t look that bad.  Third grade pass rates were unchanged from before MI at first and are up 3% for the last two years; fourth grade pass rates are up 2% from before MI; and fifth grade pass rates were unchanged from before MI in 2010, and the decline in 2011 might be a fluke.  So, judged on pass rates alone, things look OK.

But if you compare PWC pass rates to VA pass rates, you notice that pass rates across the state have been increasing at a greater rate than in PWC, and PWC is getting closer and closer to the state averages.  Some of that convergence is to be expected as pass rates rise over time and approach 100% passing, but increases in PWC have not kept up with increases statewide.  In fact, our rank in the state, based on PWC pass rates compared with pass rates in every other district in the state, has declined, as the chart below indicates.

Yellow indicates a Math Investigations Year.

As you can see from this chart, our rank for 3rd grade pass rates dropped after MI was mandated, but rebounded in 2010 and by 2011 was back to pre-MI levels.  Our rank for 4th grade pass rates has dropped for the past three years, and our rank for 5th grade pass rates has dropped for the past two years.  These declines appear to coincide with the implementation of Math Investigations and “inquiry-based instruction” in PWCS.  If our goal is to be average for the state, then we’re doing a great job of bringing our students’ performance down to the state average.  But if our goal is to be among the best districts in the state, then we’ve got problems, and those problems appear to have begun when Math Investigations and “inquiry-based instruction” began in PWCS.
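For readers who want to check how the state rank in the chart above is calculated, here is a minimal sketch in Python of the computation as we understand it: the rank is the percentage of Virginia districts whose pass rate is at or below PWCS’s pass rate for that grade.  The pass rates in the example are hypothetical placeholders, not actual VA DOE figures.

```python
# A minimal sketch of how a statewide percentile rank like the one charted
# above can be computed. All pass rates below are made-up placeholders,
# not actual VA DOE figures.

def percentile_rank(district_rates, our_rate):
    """Percent of districts whose pass rate is at or below our pass rate."""
    at_or_below = sum(1 for rate in district_rates if rate <= our_rate)
    return 100.0 * at_or_below / len(district_rates)

# Hypothetical pass rates (percent passing) for every district in the state.
state_pass_rates = [88, 91, 85, 93, 90, 87, 94, 89, 92, 86]
pwcs_pass_rate = 90

print(f"PWCS percentile rank: {percentile_rank(state_pass_rates, pwcs_pass_rate):.0f}")

# If the statewide distribution shifts up while our pass rate stays flat,
# the same pass rate produces a lower rank, which is the pattern described above.
```

The point of using a rank rather than raw pass rates is that it controls for year-to-year changes in the exam itself: if the rest of the state improves while our pass rate holds steady, our rank falls.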

9 Responses to “PWCS Passing Fewer Students on VA Math SOL”

  1. jackie h Says:

Parents and our School Board, once again, should be furious! The millions spent to date on MI and professional development have barely kept student performance stagnant?!?! Frederick got rid of MI, cutting their losses after they saw no improvement after only 1.5 years!!! Pass-Advanced scores and what used to be high-achieving elem schools have tanked (language arts too!). The School Board should have fired the Math Dept years ago when we first presented them w/ real data.

    They continued to turn a blind eye. And as we’re gearing up to present real data yet again, shall we feel optimistic that our School Board will finally admit the wrongdoing and do the obvious?

  2. Tracy Says:

I am not sure that I understand how the state ranking chart is to be read. Is it a percentage? Can you help me to understand it? If it says 50, is that the average score? Is 50 then better or worse?

I am hearing that adoption of the new math books will mean that MI will be gone. Is that true? Also, the adoption of new books is a function of a scheduled adoption period and not an admission of failure in the current program. Is that accurate?

My feeling is that Math Investigations was far from cost-effective. Do you have a breakdown of the cost of implementing it? Can you include the cost of training our teachers? What will the cost of the new adoptions be?

    • KimS Says:

      The rank is PWCS’s percentile rank, so a rank of 59 means we pass as many or more students than 59% of the districts in Virginia. A rank of 50 means we pass as many or more students than 50% of the districts in Virginia. It’s a way of taking into account changes in the exams from year to year which might cause pass rates to go up or down and might lead people to think we’re doing better or worse than we are.

      We want the highest number possible, so if we go from being in the 59th percentile to being in the 49th percentile, like we did for 3rd grade pass rates in the Spring 2008 exams, that’s bad. A decline in state rank means that more districts passed the same or a greater percentage of students on the VA SOLs than we did when you compare 2007 to 2008 pass rates.

      If you look at our pass rates and our state rank, for example the third grade pass rates and state rank for 2007 and 2008, you see that we passed the same percentage of students but our rank went down 10 points. That means that while we passed the same percentage of students in 2007 and 2008, our performance relative to the other districts in the state declined as about 10% more districts passed a greater percentage of students than PWCS did in 2008.

      The rank shows that despite unchanged pass rates, our performance relative to the rest of the state is declining. Third grade finally rebounded to pre-MI levels in 2011, but 4th and 5th grade rank remains below pre-MI levels.

    • KimS Says:

      With regards to textbooks, while no decisions have been made regarding math textbooks yet, MI was not submitted to the VA DOE for review against our standards of learning. As such, it is not on the list of state recommended textbooks. Legally we can adopt a textbook that wasn’t reviewed or is not recommended by the state, but in the wake of the Five Ponds controversy, I find it unlikely that the school board would be willing to go that route – especially for a text that is as controversial as MI has been here in PWC.

      So MI is probably out as the primary instructional resource for elementary schools. That doesn’t mean it won’t still be used as teachers use instructional materials from many different sources, just that it won’t be the primary resource.

The VA DOE follows a somewhat set schedule for updating / revising the standards of learning for each subject taught in VA schools. New textbooks are then reviewed against the updated SOLs, and local school districts select the materials they believe best meet their students’ needs. Math standards were revised last year, with local textbook adoption happening this year. History / Social Studies standards were revised 2 years ago, with local textbook adoption happening last year.

So you are correct that the adoption of the new textbooks is a function of the state schedule and not an admission of failure with the current program. In fact, were you to ask administration officials, they’d likely tell you that MI has been a complete success in the county. That’s what the Principal at Patriot High School told parents at the open houses last Spring and what teachers were told at their in-service a few weeks ago.

    • KimS Says:

Last comment, on the cost-effectiveness of MI – I can’t answer that. When we looked at the cost several years ago, it appeared that MI cost about the same as any other program, but that didn’t factor in the cost of professional development or of purchasing / producing materials to supplement MI where it does not meet state standards.

      The cost of a new program will vary depending on the program selected and the “extras” the publisher throws in or we purchase from them (like on-line black line masters or manipulative kits). We budget for textbook replacements each year so that cost doesn’t slam the district the year we buy new books.

Teacher training is one of those costs that is difficult to nail down. It does seem like our focus from a PD standpoint has shifted from teaching what the expectations are at each grade level and beyond, and what common mistakes mean, to “how do you teach subject X with this program”. I’m not sure the approach we seem to be following is the one I’d prefer, as it is program-dependent; when we switch to a new program, it feels like all of that training is wasted.

  3. Dyanne Says:

I just read the study your group prepared in 2009. What a well-prepared and objective tool to support your concerns.
I was curious as to the School Board’s response to the report.
Perhaps you should “re-publicize” your study on Facebook in light of this latest round of decreasing test scores?
It would be nice for the study to gain more attention and have more people question why PWCS is so steadfastly continuing with this program despite the mounting evidence to its success.

    • pwceducationreform Says:

      Thanks Dyanne!

      We’re working on a presentation of test scores similar to what we did in 2009. It should be out sometime in the next month. Just getting the source report, which is prepared by Pearson and provided to the district along with our SOL test data, is difficult and generally requires a FOIA request, fees, and some degree of back and forth as the district plays dumb on the existence of the report. We have never been given access to the reports from before MI was mandated.

If memory serves, the school board’s response to our 2009 report was to shrug their shoulders. They did ask the district to respond to our concerns that our students were struggling in certain subjects, and the district’s response was that they don’t look at the SOL data because the questions change every year. While the questions do change every year, they typically follow a consistent pattern, which is why our kids spend the six weeks before the SOL exams studying old exam questions; the changes are frequently along the lines of asking 6 x 8 one year and 8 x 7 the next. To date the school board has not requested that the district prepare a report detailing content deficiencies and identifying what changes they’ve implemented to address those deficiencies, which is what we wanted to see.

      I’ve seen no evidence that the district has suggested or implemented changes to the instructional program to address content deficiencies. Some teachers and schools have, but the centrally mandated program as a whole does not appear to have been adjusted to address those content deficiencies.

One of the difficulties we’ve consistently run into in arguments with the district is the data itself. We pull the data directly from the VA DOE’s assessment and achievement database, which is available to the public on the VA DOE web site at this link ( https://p1pe.doe.virginia.gov/datareports/assess_test_result.do ). PWCS gets their data directly from Pearson, adjusts it, and dumps it into a data warehouse. Sometimes the two data sets – the VA DOE data and the PWCS data – don’t tie. The PWCS data is not available to the public, so whenever we present our data the district’s response is (1) we don’t know where they got their data, (2) our data is better, and (3) our data shows that there are no problems. We then end up in a debate about the data instead of about the instructional practices that are leaving our kids behind.

  4. Dyanne Says:

    typo correction:

mounting evidence AGAINST its success.

