How was Investigations selected?

One of the questions parents kept asking was how Investigations was selected. It has a controversial record, meets fewer SOLs than any other text, isn't approved for use in Grade 5, carries a heavy professional development requirement for teachers, requires the purchase of expensive materials, and represents a fundamental change in what the county teaches, when it teaches it, and how. All of that caused many parents to wonder how Investigations was selected and whether the board was prepared for the controversy it should have known was coming.

One of the parents leading the fight against Investigations set out to learn just that. He submitted a FOIA (Freedom of Information Act) request to the county to obtain information regarding how the text was reviewed, selected, and recommended to the school board. That request revealed a number of interesting things.

The process was biased in favor of constructivist math from the beginning.

Dr. Ruth Parker, a nationally recognized leader in the reform math movement and an admitted supporter of the Investigations text series, conducted the kickoff meeting for the textbook review process, where she provided her thoughts on the criteria the county should look for in a new textbook. I've attended one of Dr. Parker's speeches, and she is an amazing and compelling speaker. She is a committed educator, and it is evident that she was and still is a gifted teacher.

But she is also biased. She truly believes in the constructivist philosophy and is an open advocate of Investigations. As she is quoted stating on her company web site, "the only math program I have ever publicly advocated for, a K through 5 program (is) called Investigations in Number, Data and Space."

I have no doubt that in her classroom her students excelled under her tutelage. But not every teacher is a Dr. Parker, and I'm not convinced that the philosophy she advocates works for every teacher and every student.

My concern isn't with Dr. Parker, because I respect and admire her commitment and dedication. My concern is with the fact that the county invited someone who is clearly strongly biased towards one side of the math debate, and who has spoken in support of one of the textbooks under consideration, to kick off what was supposed to be an unbiased process. Dr. Parker's involvement at that stage could have been tempered had the school system invited any of a number of equally gifted and equally qualified speakers who support a traditional approach to mathematics education, to balance what the members of the textbook review committee heard. But the county didn't do so. Dr. Parker was introduced and presented as an unbiased educator, not as a supporter of one side of the debate. Her involvement, however well intentioned, tainted the entire review process.

It is also worth mentioning that Dr. Parker has returned to PWC every year since Investigations was adopted, under contract with the county, to conduct teacher training. I am not questioning Dr. Parker's qualifications to conduct this training. I do, however, wonder whether it was a conflict of interest to hire Dr. Parker to advise the textbook review committee on what criteria to look for in a textbook when she so openly advocates for one of the textbooks under consideration, and then to hire her to conduct training for PWC teachers once the textbook she advocates was selected.

The criteria reviewers were asked to assess the books against were skewed towards constructivist math and appear to have been disregarded.

The FOIA request asked the county to provide information on the process and evaluative criteria used to select Investigations. Based on the boxes of information we were permitted to review, it appears that the county developed a list of criteria that committee members, parents, and teachers were supposed to look for when reviewing textbooks. The instructions given to reviewers asked that they reference the pages where the criteria are exhibited. That request seems to have been difficult to meet, because few of the reviews received have page references.

The list of criteria includes "supports conceptual understanding of the traditional algorithms and treats them as just one of many ways to solve problems." Taken at face value, that appears to be a wonderful objective, until you examine it as it relates to the debate between constructivist and traditional programs. How and when the standard algorithms are taught, and how they are supported in the classroom, is one of the key differences between constructivist and traditional programs.

Traditional programs are based on the belief that learning the standard algorithms to the point of automaticity frees the mind to explore and develop greater understanding of mathematical processes. As a result, students in traditional programs learn the standard algorithms in grade 1 or 2 and are supported in class and through homework until they master the processes to the point of automaticity. Alternate strategies, like mental math or manipulating the numbers to make the equation easier to solve, are taught to students after they've mastered the standard algorithms.

Constructivist programs are based on the belief that the standard algorithms are dangerous – that teaching them inhibits learning and obscures place value. As a result, constructivist texts push students to develop their own strategies for solving problems and delay teaching the standard algorithms until later grades, if they're taught at all. Until the standard algorithms are covered, assuming they are covered, teachers are prohibited from teaching them, and students who have learned them outside the classroom are not supported in understanding the processes.

When viewed through that prism, that one criterion takes on a different meaning. It seems to have been emphasized heavily, because a number of the reviews completed by committee members state "only seems to teach one" or "only the traditional algorithms." The wording of the criterion, coupled with the emphasis reviewers placed on it, indicates to me that the committee facilitators were attempting to skew the selection in favor of a constructivist text.

Eventually, it appears, assessing each textbook against the long list of criteria became overwhelming, and the criteria were tossed in favor of a list of good/bad characteristics. Despite repeated attempts to ascertain the process the committee followed and the criteria upon which it made its decision, we still do not know how Investigations was selected, and no one who served on the committee has stepped forward to express concern with how the discussions were conducted.

Alignment with state Standards of Learning seems to have been de-emphasized.

Alignment with the SOLs is not one of the criteria against which the texts were reviewed. Committee members were given a copy of the state list of approved texts and were instructed to go to the VA DOE web site to review the state's assessments in more detail if they so desired, but there is no evidence of any discussion of how many standards each text met or whether each text was or was not approved by the state.

Why is this important?  Because the state of Virginia says which textbooks can be used as primary texts in classrooms and which texts can only be used as supplemental materials.  Investigations is not approved for use as a primary text in Grade 5. 

The state develops and publishes the standards of learning and reviews textbooks against those standards. Textbooks that meet a set percentage of the standards are recommended for use in schools throughout the state. Textbooks that do not meet that percentage are not recommended and can only be used as supplemental material. The version of Investigations released at the time the committee was working did not meet the state's percentage of standards for Grade 5 and was not recommended for use. Version 1 was only approved for use in grades K through 4.

The version of Investigations the school board approved, and that the county is using in most classrooms, is version 2. That version was issued after the state conducted its review of textbooks and after the county convened the committee to select textbooks. Version 2 has not been assessed by the state for alignment with the standards of learning. County staff have stated that they conducted their own assessment of version 2 against the standards of learning and believe that it would have met the percentage of standards required by the state to be authorized for use as a primary text. When this assessment was completed is not evident from the materials obtained by the FOIA request.

Only TERC was permitted to submit promotional and marketing materials to the county during the textbook selection process.

County staff wanted to make the textbook review process as free from bias as possible. To that end, they prohibited publishers from contacting schools or providing promotional or marketing materials to county schools or committee members. All textbook publishers and developers seem to have honored this request, except TERC.

TERC provided marketing and promotional materials to county staff on the differences between version 1 and version 2 of Investigations. Although other publishers had subsequent releases of their texts due out, no other publisher or textbook developer was permitted to provide this information.


Ultimately, the textbook review committee recommended Investigations to the school board, and the school board authorized it for use in all county schools. That means the decision to remove Math Investigations or allow an alternate instructional track lies with the school board.



Dr. Parker's 2007 response to questions, in which her support for and endorsement of Investigations is evident:

Dr. Parker’s Sept 2009 statement endorsing Investigations:


2 Responses to “How was Investigations selected?”

  1. Ed Says:

    The following is an open letter stating what has been discovered through FOIA requests and meetings with the math department.
    Clearly there is a lot of confusion over what really happened in the math text selection process.
    According to the letter dated May 24th, 2005, the text ratings sheets would be used to evaluate the math texts.
    Yet the letter to me from the school system dated December 4th, 2008 clearly states that the sheets were merely used as a discussion point for the committee. The ratings on which the final decision was made were based on subjective criteria that differed for each text and a scoring rubric for the final rankings does not exist.
    Therefore, the process could not possibly have been free of bias and was invalid.
    How could the board sign off on the decision without understanding how that decision was reached?
    I ask that the board investigate why the change in process was made and on whose authority.
    It certainly appears to have been engineered to get the result that the math department wanted.
    In addition, the adoption process clearly lists correlation with SOLs as a major consideration yet this was discounted by the math department in the adoption kick-off. The committee was instructed in writing not to consider SOL correlation in clear violation of the adoption rules.
    After all, they have stated on many occasions their absolute faith in Investigations as the best way to teach mathematics; how could they not let that kind of commitment cloud their judgment?

  2. Defending The Teach Math Right Team « PWC Education Reform Blog Says:

    […] biased selection process. The school board, for their own reasons, seems to have decided that an unbiased process is bad. […]
