If you want to practically guarantee lower scores on state-mandated tests or college entrance exams, the single best thing you can do is spend weeks ignoring content and studying old test questions instead. Studies dating back more than a decade demonstrate how ineffective drilling on old test questions is at improving student performance on standardized tests, so it’s not as if our schools are unaware that what they’re doing doesn’t work. Which makes it rather odd that the primary test-prep strategy employed in PWCS is studying old exam questions and test-taking strategies rather than reviewing content.
We’ve all gotten the letters, sometimes as much as six weeks before the test date: “We will begin preparing for the SOL exams next week. Please have your son/daughter bring a 2-inch three-ring binder to class to keep his/her study materials in.” The first year my children took state exams I rushed out to buy a binder with images my child enjoyed so that he wouldn’t be intimidated by preparing for these high-stakes tests. One of my children had a Pokémon binder while the other chose his father’s old Donald Duck binder.
We eagerly awaited the first batch of study materials to go over with our child because we wanted to be supportive and wanted him/her to do well on the test. We were a little surprised when we opened the binder for the first time and realized that it held nothing but copies of old test questions. We assumed the questions were just a review of content they’d studied that day and grilled our children: “Did you do test prep today in school?”, “What did you do to prepare for your SOLs today?”, and “Did you spend your whole day looking at old test questions, or did you study stuff other than questions like these?”
I remember my surprise when I realized that my child wasn’t reviewing content to prepare for the SOLs; he was just studying old test questions. And by studying old test questions I don’t mean studying the content those questions addressed, but literally “studying” why answer B was correct and A, C, and D were incorrect. Even worse was realizing that this practice began as early as four to six weeks before the state exams.
No wonder most textbooks are designed around 140–160 days of instruction rather than the 180 days in a typical school year!
You may think that with the high stakes placed on testing, schools have to do whatever they can to prepare students, and that studying old test questions is a tried-and-true method. Except that multiple studies have demonstrated that this practice of cramming with old test questions lowers test scores; that this “cramming” actually undermines student learning. Worse, because the kids are so burned out from the high-stress cramming, they spend the next three weeks of school watching movies, playing games, and having sidewalk-chalk or bubble-soap days.
So the question arises: why do we cram for tests by studying questions from old exams when study after study over the past 20 years has demonstrated that such an approach is detrimental to student learning and test scores?
Here are just a few of the studies:
(Entries marked with an asterisk are research reviews or meta-analyses.)
*Becker, B. J. (1990, Fall). Coaching for the Scholastic Aptitude Test: Further synthesis and appraisal. Review of Educational Research, 60(3), 373–417.
Briggs, D. & Hansen, B. (2004). Evaluating SAT test preparation: Gains, effects, and self-selection. Paper presented at the Educational Testing Service, Princeton, NJ, May.
Crocker, L. (2005). Teaching for the Test: How and Why Test Preparation Is Appropriate. In Defending Standardized Testing, ed. R. P. Phelps, 159-174. Mahwah, N.J.: Lawrence Erlbaum.
*Camara, W. (2008). College admission testing: Myths and realities. In R. P. Phelps (Ed.), Correcting fallacies about educational and psychological testing, Washington, DC: APA.
Camara, W. (1999). Is commercial coaching for the SAT I worth the money? College Counseling Connections. The College Board. New York, NY, 1(1), Fall.
*DerSimonian, R. & Laird, N. (1983). Evaluating the effect of coaching on SAT scores: A meta-analysis. Harvard Educational Review, 53, 1–5.
Fraker, G.A. (1986–87). The Princeton Review reviewed. The Newsletter. Deerfield, MA: Deerfield Academy, Winter.
*Kulik, J. A., Bangert-Drowns, R. L., & Kulik, C-L. C. (1984). Effectiveness of coaching for aptitude tests. Psychological Bulletin, 95, 179–188.
*Messick, S. & A. Jungeblut (1981). Time and method in coaching for the SAT, Psychological Bulletin 89, 191–216.
Moore, W. P. (1991). Relationships among teacher test performance pressures, perceived testing benefits, test preparation strategies, and student test performance. PhD dissertation, University of Kansas, Lawrence.
National Advertising Division. (2010). The Princeton Review voluntarily discontinues certain advertising claims; NAD finds company’s action ‘necessary and appropriate’. New York, NY: National Advertising Review Council, Council of Better Business Bureaus, CBBB Children’s Advertising Review Unit, National Advertising Review Board, and Electronic Retailing Self-Regulation Program. Retrieved June 25, 2010 from: http://www.nadreview.org/DocView.aspx?DocumentID=8017&DocType=1
Palmer, J. S. (2002). Performance incentives, teachers, and students. PhD dissertation. Columbus, OH: The Ohio State University.
*Powers, D.E. (1993). Coaching for the SAT: A summary of the summaries and an update. Educational Measurement: Issues and Practice. 24–30, 39.
*Powers, D.E. & D.A. Rock. (1999). Effects of coaching on SAT I: Reasoning Test scores. Journal of Educational Measurement. 36(2), Summer, 93–118.
Powers, D. & Camara, W. Coaching and the SAT I. College Board Research Note 99-07. New York, NY: The College Board.
Robb, T. N. & Ercanbrack, J. (1999). A study of the effect of direct test preparation on the TOEIC scores of Japanese university students. Teaching English as a Second or Foreign Language, 3(4), January.
Roediger, H. L. & Karpicke, J. D. (2006a). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1, 181–210.
Roediger, H. L., & Karpicke, J. D. (2006b). Test-enhanced learning: Taking memory tests improves long-term retention. Psychological Science, 17, 249–255.
Smyth, F. L. (1990). SAT coaching: What really happens to scores and how we are led to expect more. The Journal of College Admissions, 129, 7–16.
Snedecor, P.J. (1989). Coaching: Does it pay—revisited. The Journal of College Admissions. 125, 15–18.
Tuckman, B. W. (1994, April). Comparing incentive motivation to metacognitive strategy in its effect on achievement, Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. (ERIC ED368790).
Tuckman, B. W. & Trimble, S. (1997, August). Using tests as a performance incentive to motivate eighth-graders to study, Paper presented at the annual meeting of the American Psychological Association, Chicago, IL. (ERIC: ED418785).
Zehr, M.A. (2001). Study: Test-preparation courses raise scores only slightly. Education Week. April 4.