By School Graduation Rates

As requested, listed below are the graduation rates for each high school in PWCS. They are listed overall and for economically disadvantaged (ED) and not economically disadvantaged (not ED) students. They are also broken out for males and females, as that was a shocker yesterday. If the numbers are too small for you to read, like they are for me, here's a PDF of the same file with larger print ===> By school grad rates – 2013.

They’ve also been uploaded to the Test Scores page, which has lots of data for you to look at if you ever get bored and want to look at numbers.

Note: An update below the chart clarifies what these percentages mean.

By School Rates - 2013

UPDATE:

The numbers presented in the chart above are the percentage of that population of students that graduated with a particular type of diploma. For instance, 100% of the ED students at Brentsville who graduated achieved a Standard diploma. That doesn't mean 100% of Brentsville's ED students graduated.

The chart below shows the on-time graduation rate and dropout rate for each school in the county.

Continuing the Brentsville example: in the class of 2013, 83% of Brentsville's ED students graduated on time, with 100% of those graduates earning a Standard diploma; 14% dropped out, and the rest either earned a GED or are still enrolled.

This chart will probably be the more controversial of the two, because of the dropout rates at some schools. The numbers on the chart are percentages. The numbers for All and ED students came directly from the state report linked as the source. The numbers for not ED students are not provided by the state and were calculated from the All and ED numbers. The chart is here in PDF and on the Test Scores page ===> On-time and drop out rates – by school – 2013
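For anyone who wants to check our arithmetic, the not-ED calculation is just un-mixing a weighted average: the overall rate is the ED rate and the not-ED rate weighted by each group's share of the cohort, so the not-ED rate can be solved for directly. Here's a minimal sketch; the numbers in the example are hypothetical, not taken from the state report.

```python
def not_ed_rate(all_rate, ed_rate, ed_share):
    """Back out the not-ED rate from the overall and ED rates.

    all_rate, ed_rate: percentages (0-100) for the whole cohort and for
    ED students. ed_share: fraction (0-1) of the cohort that is ED.

    The overall rate is a weighted average:
        all_rate = ed_share * ed_rate + (1 - ed_share) * not_ed_rate
    so we solve that equation for the not-ED rate.
    """
    return (all_rate - ed_share * ed_rate) / (1 - ed_share)

# Hypothetical school: cohort is 30% ED, 90% overall on-time rate,
# 80% ED on-time rate.
print(round(not_ed_rate(90, 80, 0.30), 1))  # 94.3
```

Note that this only works when the ED share of the cohort is known (or can be derived from the state's enrollment counts); the formula divides by the not-ED share, so it breaks down for a school where nearly every student is ED.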


On time and dropout rates by School - 2013

Lowest Test Scores in the Area

At the last school board meeting Harry Wiggins, Chairman of the PWC Democratic Committee, addressed the board. He stated that PWCS had the lowest test scores in the area. Dr. Otiagbe appeared taken aback by that and asked the Superintendent to address it. The Superintendent responded that PWCS consistently exceeded state averages but didn't elaborate further, as it was quite late.

Both are mostly correct.

Confusing, isn’t it?  I’ll try to break it down a bit.


Opting out of State Mandated Testing in Virginia

Can you opt your child out of SOL testing in Virginia?  Yes you can, depending on what grade / tests your child is set to take and where you live.

Testing, and its effects on the quality of instruction and learning in Virginia and the nation, is a huge topic, especially now that SOL testing season is right around the corner. Because we get asked about this quite frequently, we've added a page containing information about opting your child out of state SOL testing in Virginia.

You can find that page here ===> Opting Out of State Testing in Virginia.


Va Dept of Ed Blows Off 60% of Virginia Students

This summer, after receiving a waiver from NCLB's 100% passing requirement, the VA Dept of Ed established new goals for the percentage of students passing the SOLs in all school divisions and public schools in the state. The new targeted pass rates were set for the state overall and for groups of students; groups that are based on race.

The Va Dept of Ed took lots of flak from the public because the pass rate targets for 2013 – 2017 for Black and Hispanic students are lower than the pass rate targets for White and Asian students. The Dept of Ed was accused of having lower expectations for Black and Hispanic students than it does for White and Asian students.

The VA Dept of Ed responded that the goals don't reflect lower expectations for Black or Hispanic students, as the final goal, 73% passing by 2017, is uniform across all groups. Instead of setting a uniform starting point that didn't reflect actual pass rates, the Dept of Ed set the 2012 actual pass rates as the point from which annual increases are expected. The goals for annual increases for Black and Hispanic students, according to the Dept of Ed, will be challenging but are achievable.
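To see how that plays out, here's a sketch of one plausible reading of the approach: equal yearly steps from a group's 2012 actual pass rate up to the uniform 73% goal in 2017. The starting rate in the example is hypothetical, and the Dept of Ed's actual published targets may be computed differently.

```python
def annual_targets(start_2012, final=73, years=5):
    """Pass rate targets rising in equal annual steps to the final goal.

    start_2012: a group's actual 2012 pass rate (percent).
    Returns the targets for each of the next `years` years, assuming
    the gap to `final` is closed in even increments.
    """
    step = (final - start_2012) / years
    return [round(start_2012 + step * y, 1) for y in range(1, years + 1)]

# Hypothetical group starting at a 54% pass rate in 2012:
print(annual_targets(54))  # [57.8, 61.6, 65.4, 69.2, 73.0]
```

Notice what happens for a group already at or above 73%: the step is zero or negative, so `annual_targets(73)` is just `[73.0]` five times over, which is exactly the "no increase expected" problem described below.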

Unfortunately the Dept of Ed blew off White and Asian students as they aren’t expected to improve at all.

In 2012, 73% or more of White and Asian students passed the state SOLs in every elementary grade level or high school subject tested. For White and Asian students, who comprise 60% of Virginia students, no increase in the percentage of students passing the state SOLs is expected through 2017. Here are the actual 2012 pass rates and the 2012 – 2017 targeted pass rates from the VA Dept of Ed for each racial group ===> Pass Rate Targets & 2012 Actual Pass Rates

According to the Va Dept of Ed, the percentage of Black and Hispanic students passing the SOL exams is expected to increase by nearly 20 points over the next 5 years, but no increase is expected for White and Asian students. In these times of scarce resources, with pressure on school divisions to meet state pass rate targets, where do you think the resources will be allocated – to the schools that are struggling to meet the state pass rate targets or the schools that have already met them? If every child deserves an education that helps them achieve the most they can, is that fair? Wouldn't it be more fair for the state to expect the percentage of students passing the SOLs to increase in each and every group, not just in select groups?

If you have any concerns with the goals the Dept of Ed has established, you may want to contact your state representatives.

PWCS Strategic Plan and State AMOs

Every organization, whether private or public, is evaluated on how well they do their jobs.  For private companies that evaluation comes from their customers, who will go someplace else if they’re dissatisfied.  Because government entities are monopolies and citizens, for the most part, don’t have alternatives to choose from, evaluating performance for them is a bit more complicated.  Police departments are evaluated based on crime rates, fire departments are rated on response times, and schools are rated on test scores.

At the January 16, 2013 School Board meeting, PWCS requested board approval of a few changes to the Strategic Plan. These changes were characterized by Chairman Johns and Dr. Walts as “semantic” in nature and were necessary because the state had received a waiver from NCLB and reset the Annual Measurable Objectives (AMOs) for school divisions. AMOs list, among many other things, the targeted percentage of students expected to pass the SOL exams each year. The school division requested that the proposed changes be presented and voted on the same night.

At the time I’d requested that the school board delay voting on the proposed changes to give citizens time to understand and examine them and provide feedback to their elected officials before the vote was taken.  Citing the “semantic” nature of the changes, the school board voted on the changes on the 16th and approved them.

I think this was a poor decision.


Blame the Teachers

Blame the teachers.  That’s the advice from PWCS for anyone with concerns about the Math program.

According to PWCS staff, teachers developed the assessments and have the authority and autonomy to select and use any materials they want in any order no matter what the recommended pacing might suggest.  Teachers can even disavow instructional materials entirely, if they so desire.  Lesson pacing is just a suggestion, not set in stone, and teachers can teach lessons in any order they want based on what they believe will best meet the needs of their diverse community of students.  Our teachers teach the curriculum, not a textbook.

Anyone who has concerns or questions about what's being taught, the manner in which it's being taught, the order in which it's being taught, what's being tested, or how responses are being graded needs to bring those concerns to the attention of their child's teacher. Central office, and the Math Department, exist solely to create learning environments that enhance teacher professional development and student learning.

Got that teachers?  According to PWCS, when it comes to Math, it’s all on you.

If any teachers are feeling like they just got thrown under the bus by PWCS, watch out for the tires, they sting.


Please Write to Your Delegates and School Board Members

Tell them that students should be able to add, subtract, multiply, and divide without calculators BEFORE they're forced into Algebra or higher level courses. Heck, tell them to order the VA DOE to stop pushing calculators on our elementary school children. Tell your Delegates that you support HB 469. Tell your school board members not to oppose it.


Accentuating the Positive or Misleading the Public?

Most of us who have ever worked for a private business have experience in trying to highlight the things we do well. A restaurant with an extensive wine list will emphasize the variety of wines it offers in adverts, and maybe not mention that it doesn't serve dessert. A business that's received awards for processing the greatest volume of home mortgages won't mention how many of its mortgages ended up in foreclosure. A weight loss company will advertise that more of its customers reach their goal weight than in any other national weight loss program. It just won't mention that it's only 5% of its customers.

Business people know that highlighting what you do well attracts new customers and brings old customers back.

But what about government entities, like the police, fire and rescue, or public schools?   Government entities exist to serve the public; the money they receive to perform their duties comes from the taxpayers.  Unlike private companies, who are subject to the whims of free choice and will lose customers and go out of business if they don’t do a good job, government entities will still get taxpayer money no matter how well or poorly they do their jobs.

Because government entities are going to get their taxpayer money no matter how well or poorly they perform their duties, the public requires that government entities report various measures of their performance.  Police departments are expected to report crime statistics, and police chiefs are held accountable when crime rates increase.  Fire and rescue departments are expected to report response times, and held accountable when slow response times jeopardize public safety.   Public schools are expected to report test scores, and, in theory, held accountable when scores don’t meet community expectations.

What would you think if a school division selectively reported test scores to highlight success and failed to report when test scores were below expectations?

Several weeks ago PWCS reported on the ACT scores for several of our schools.  The press release was titled, “Prince William ACT Scores Exceed National Scores” and stated the following:

August 22, 2012

Students in Prince William County Public Schools (PWCS) scored above the national composite average for the fifth straight year on the college benchmark ACT. Students in four Prince William County High Schools exceeded the state average while overall Division scores remained the same as last year. Battlefield, Brentsville District, Forest Park, and Osbourn Park students achieved composite scores of 23.5, 24.0, 22.4 and 23.4 respectively, exceeding the state composite score of 22.4 and joining other PWCS schools in besting the national outcome of 21.1.

 
PWCS students exceeded the national average score in all four content areas–English, mathematics, reading, and science–on the ACT administered last year. More students than ever took the test. The number of PWCS graduating seniors taking the ACT has increased each year since 2007-08, with a 28 percent increase overall from that year to 2011-12.
 
ACT scores assess high school students’ general educational development and ability to complete college-level work. Unlike an aptitude or reasoning test, the ACT is designed to be an achievement test, measuring what a student has learned in school. The ACT consists of multiple-choice questions. ACT results are reported on a scale of 1 to 36, with 36 being the highest possible score. Visit the ACT Web site for more information.

Sounds great, right?

Except for what’s missing – like how PWCS fared when compared with Virginia as a whole on the ACT and how the other seven Prince William County public high schools fared on the ACT.  One look at those scores and you understand why.

PWCS scored below Virginia averages in every content area tested on the ACT.  Let me say that again, and in bold, because it’s really important. PWCS scored below Virginia averages in every content area tested on the ACT. The other 7 PWC high schools that weren’t listed in the press release scored below county and US averages in every content area tested on the ACT, with the exception of Woodbridge HS, which scored above US averages but below state and county averages.

Unlike the SOL, the ACT is a voluntary test, so these scores represent our top tier students, the students we’d expect to go to college.  And in that top tier group, ACT scores only exceed state averages in 4 of our 11 high schools; that’s 36%.   The other 7, 64%, are below state averages.

Still feel like cheering?

We saw the same approach with SOL scores. PWCS reported how well our students performed relative to state averages. A school board member reported on his Facebook page that PWCS had exceeded state averages in 21 of 28 areas tested, and proclaimed success!

What we weren’t told, again, was the most telling.

Yes, pass rates were above state averages in 21 of 28 areas tested, but we were below state averages in 7 of the 9 areas tested in High School.  That kind of explains why our SAT and ACT scores continue to lag behind state averages.

Yes, quoting the school division’s press release, “PWCS students achieved higher pass rates than the state on all elementary math tests”.   Unfortunately, pass rates were below state averages in 4 of the 9 areas tested for Math, and below state averages in all of the High School level math subjects tested on the SOL.

Yes, Reading scores went up after several consecutive years of declining, but they were up by the same margin statewide, so that increase may be due to a testing irregularity as opposed to the results of improved instruction.

When compared with other school divisions in our area, our test scores only exceeded Manassas and Manassas Park.  Our pass rates were below Fauquier, Fairfax, Loudoun, and Stafford on just about every subject tested on the SOL. We’ve been below Fairfax and Loudoun for years, but Fauquier and Stafford have only recently begun to kick our butts. At one point a few years ago our SOL scores in Math were close to Loudoun’s. Not so much anymore.

Which brings me to another, final point. Why do we compare ourselves with Virginia averages and then declare success when we do about average for Virginia? We're one of the most affluent counties in the state. Like our neighbors in Fairfax, Loudoun, and Stafford, our citizens tend to be highly educated. We should be doing better than average for the state. We should be one of the top performing school divisions in the state. But we're not.

I can understand accentuating the positive.  I totally get that.

But our schools are struggling, especially at the high school level and especially with high school Math.  Our SAT and ACT scores show that.  They have for years.

So do our SOL scores.  They have for years.

But you have to be willing to look at the scores to see that, and I’m not sure PWCS is willing to do that. They’re too busy accentuating the positive to step back and admit that there might be some negative. From what I’ve read, our school division appears to believe that things in our schools are just peachy keen and that average for the state is an acceptable goal.

I think it’s unacceptable.

PWCS 2012 SOL Scores – The Good, The Bad, and The Ugly

The overall pass rates for the Spring 2012 SOL exams have been released, and the results provide some insight into how well our children are performing relative to their peers at other schools and in other school divisions. Here are the good, the bad, and the ugly…

The Good:  PWCS continues to pass a greater percentage of students overall than the state average on many SOL exams, with most of those better-than-state-average results occurring in grades K – 8. Grade 3 reading pass rates, after declining for several years in a row, increased. K – 8 Reading, Writing, Science, and Math pass rates continue to exceed state averages.

The Bad:  Of the 10 High School subjects tested at the state level, PWCS only exceeds state averages in 3: Geography, US & Virginia History, and World History I. Since 2007 PWCS has been at or below state averages in Biology, Chemistry, Earth Science, Algebra I, Algebra II, Geometry, and World History II, as this file demonstrates (High School – VA to PWCS).

The Ugly: When compared with overall pass rates from surrounding school divisions (Fairfax, Fauquier, Loudoun, Manassas, Manassas Park, and Stafford), PWCS is among the lowest performing divisions – generally competing with Manassas or Manassas Park for the lowest overall pass rates in almost every subject.  Demographics, particularly the percentage of economically disadvantaged students, may play a role here, but the state hasn’t released demographic data, yet, so we can’t ascertain what role it plays.  Once that data has been released, we’ll report on it and let you know how ugly the ugly is.

What can we conclude?

All is not well in PWCS.  While PWCS passes more students than the state average on many K – 8 exams, I’m not sure doing average for the state is how I’d define a “World Class School System”.  When combined with our SAT scores, which lag behind both US and State averages, PWCS is in need of improvement – especially in High School level Math.

Unfortunately in the past PWCS has been more intent on selectively reporting “success” than on openly and honestly admitting where problems exist and proposing solutions to those problems outside of spending more money on Smart Boards and other “technology”.  That will continue unless our school division leaders demand honest and complete reporting of test scores from the school division.  We don’t present test score data to shame individual schools or demoralize our teachers.  We present that data so that our citizens know the full truth about PWCS’ performance and, hopefully, will demand better from our school division and our children’s schools.

One note:  PWCS, as it always does, will ignore our numbers and analysis because they say they don't know where we got our numbers. We got our numbers from the VA DOE's searchable database of assessment data, which you can find here, and the Excel spreadsheet of 2011 & 2012 test score data the VA DOE released on August 14, 2012. If PWCS' data differs from ours, then PWCS needs to explain why their data, which comes from a PWCS database and not the VA DOE database, differs from the state data.

2012 SOL Exam Scores

We’ve updated our Test Scores page to include two new reports.

The first report highlights where each PWCS High School falls below VA or County averages for all subjects tested for 2011 & 2012 (see All PWCS High Schools By Subject For 2011 & 2012)

The second report highlights where Fairfax, Fauquier, Loudoun, Manassas, Manassas Park, PWCS, and Stafford county schools fall below state averages for all subjects tested for 2011 & 2012.  (see SOL Exam Pass Rates for School Divisions & Subjects for 2011 & 2012).