Jun 09 2012

Confessions of a recovering value-adder

In general, the ‘better’ I teach, the better my students do on my biweekly exams.  A big part of teaching is monitoring student understanding during the lessons and trying to ensure that by the time the teacher-made test is administered, the students will do well.  There are exceptions.  I’ve taught units full of higher-order thinking skills, only to put on the test what I considered ‘easier’ mechanical skills, figuring the students would have no problem with them after all the work I had done getting them to really think about the concepts, and then they bombed the ‘easy’ stuff.  That’s what it’s like to be a teacher sometimes.

The crux of my classroom management philosophy, the one that helped me get two books published, is that students will behave better for a teacher they feel they have ‘learned’ from (as measured by their scores on the first quiz) than for one whose first quiz they did poorly on.  Good grades make students feel confident in themselves and in their teacher.

But there is a big difference between a teacher-made test covering a few weeks of material the students were specifically taught and a big end-of-the-year multiple-choice state test that God-knows-who wrote.  That’s why I don’t have much confidence in the use of high-stakes tests in decisions to shut down schools, fire teachers, praise other schools, and make their CEOs very wealthy.

When I hear about how this charter school got such great results (100% passed! What does that even mean?  What if they all just passed by a little?), I have to chuckle, since I know how little it means.  I can say this with confidence because, when necessary, I can add value with the best of them.
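
To make that parenthetical concrete, here is a minimal sketch, using entirely made-up numbers, of how a ‘100% passed’ headline can hide very different results; the cutoff score and both score lists below are invented purely for illustration:

    # Illustrative only: two imaginary schools both report "100% passed,"
    # yet their students' actual scores look very different.
    PASSING_SCORE = 65  # hypothetical cutoff

    school_a = [65, 66, 65, 67, 66, 65, 68, 66]   # everyone barely clears the bar
    school_b = [82, 91, 77, 88, 95, 84, 79, 90]   # everyone passes comfortably

    def summarize(name, scores):
        pass_rate = 100 * sum(s >= PASSING_SCORE for s in scores) / len(scores)
        average = sum(scores) / len(scores)
        print(f"{name}: pass rate {pass_rate:.0f}%, average score {average:.1f}")

    summarize("School A", school_a)  # pass rate 100%, average score 66.0
    summarize("School B", school_b)  # pass rate 100%, average score 85.8

Real score distributions are messier, of course, but the point stands: a pass rate alone says nothing about how far above the bar the students actually landed.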

My first experience of ‘teaching to the test’ was in my fourth year of teaching in Houston.  At that time, the state test was called the TAAS.  Though it wasn’t a test that was going to get our school shut down (this was in 1994-1995) or teachers fired, the test was important to my students.  If they didn’t pass it, they would not get a true diploma when they finished 12th grade, but merely a ‘certificate of attendance.’

Students first took the TAAS in the fall of 10th grade and had to pass three tests: math, reading, and writing.  If they passed one, they didn’t have to take that one again.  They got a second chance in the spring of 10th grade, then a third chance in the fall of 11th grade, a fourth in the spring of 11th grade, a fifth in the fall of 12th grade, and a final sixth chance in the spring of 12th grade.

Our school had 300 freshmen, 150 sophomores, and about 100 juniors and 100 seniors.  After the fall TAAS math test, I found that 60 of the 100 seniors had failed again.  The principal suggested that we run two math TAAS review classes of thirty students each.  I took one class, and the other was taught by the best teacher in our school, Sheila Whitford.  Sheila was not a math teacher, but she wanted to teach this math class because she felt that the test was more of a reading test than a math test.  The kids just didn’t understand what the questions were asking.

Sheila studied up on how to teach math and ran a very creative class where students did art projects illustrating the concepts to get a deep learning experience.

I created a pure test-prep curriculum which I ran like a drill sergeant.

As the test neared, our school geared up for it.  I actually created an ad campaign with the slogan ‘Kick some TAAS.’  There were posters on the walls, and I even made buttons for kids to wear.  People loved the slogan so much that I made a follow-up which probably should have gotten me fired, ‘Don’t be a TAAShole,’ and, if that wasn’t enough, ‘Don’t be a flunking TAAShole.’  I wish I were making this up, but this is true.

The students in my TAAS review class were not dumb.  In fact, I had taught almost all of them over the previous years in Algebra, Geometry, and Algebra II.  Those courses were much more difficult than these low-level TAAS questions, but for some reason the students just struggled with them.

My test-prep paid off.  Twenty-eight out of thirty in my class finally passed the TAAS on their sixth try.  Sheila Whitford didn’t have as much luck.  I believe half of her students passed.

I’ve added value more recently, back in 2002, when my friend, at the time a middle school assistant principal in the Bronx, asked if I’d be willing to teach a summer school course to help some of their best students take the New York specialized high school test to get into a school like Brooklyn Tech, Bronx Science, or Stuyvesant High School (where I now teach).  No student from this middle school had gotten into one of those schools in a long, long time.  I now know that many of the students who get into these schools begin test prep for the one exam that determines admission five years before taking it.  With four weeks of training, these Bronx middle schoolers were going to have trouble catching up.

Though I didn’t perform a miracle, one of the fifteen students did get into Brooklyn Tech, where she enrolled and eventually graduated.  I am sure the others came much closer to making the cutoff than they would have without my coaching.

Even now, I supplement my income as a private math tutor for students (mostly from private schools) who want to improve their SAT scores.  Last year I helped a student from Dalton go from a 470 on the math section of the PSAT to a 720 on his math SAT.  This took forty weeks of test prep.

As an expert value-adder, I will be the first to say that it means nothing.  I am not a magician.  I am an illusionist.  These tests are just too simplistic and predictable.  The students didn’t really ‘learn’ anything meaningful that would enhance their lives or help them succeed in college.

Schools today, particularly the ‘high-performing’ ones in high-poverty areas, are, I believe, little more than test-prep factories.  At some schools, the test scores vary inversely with the amount of actual value the teachers add to their students’ lives.

8 Responses

  1. Cal

    “These tests are just too simplistic and predictable.”

    The first is flatly not true. I’m a test prep instructor who has coached hundreds of kids (I’ve done it longer than I’ve taught math), and the tests aren’t simple and predictable. You could not take any kid and work with him for 40 weeks and push the kid up to a 720 unless one of two things were true. Either the kid was a bright kid who just didn’t understand the test (likely), or you were literally divorcing content from the test and teaching the kid like a robot to respond to stimuli. And that kid would have to work hundreds of hours on his own as well. And if that’s what happened, you should be ashamed of yourself for engaging in it. That’s how a Korean kid who can’t speak English can score an 800 on the SAT verbal without understanding English, and it’s fraud.

    But regardless, it’s simply untrue to say that the tests don’t mean anything, that they are simple and predictable, or that anyone can do that. It’s the smart guy’s arrogance that dismisses the real difficulty of the tests, and trivializes the intelligence that it takes to do well on this test, assuming that one isn’t becoming a robot.

    • Gary Rubinstein

      Well, for sure the kid underperformed on the PSAT and overperformed on the SAT. I might have conveyed that I think the tests are ‘easy,’ but if you read my five-part analysis of the New York State tests, you’ll see that I think they were unnecessarily tricky. I guess, though, what I mean in this post is that it is possible to do test prep to the exclusion of ‘real’ learning, and that can get test scores up artificially.

  2. Great analysis. I too have done test prep, preparing ESL students for the English Regents exam. The exam was so simple and predictable that I was able to get many of them to pass. However, aside from the fact that they needed to pass it to graduate, it had no educational significance for these kids.

    My students desperately needed instruction in usage and grammar to avoid having to take no-credit remedial college courses (courses I’ve actually taught). I could have made them better writers, better speakers of English, rather than better test-takers on one single test they ought not to have been taking anyway.

    But who am I, compared to the mighty New York Regents?

  3. Carol Burris

    Of course this is so true. What parent (and I am one) thinks that because an SAT tutor helped their child increase their SAT score, the child is more prepared for college? Of course that is not the case. Such destructive nonsense…

    • BEN

      It is true that getting a high score on a test through cramming is not going to make a student smarter, just more able to pass the test. But how could you get students to pass the tests without force-feeding them the material?

      My answer: Have all students take a logic course freshman year. Any student can be forced to follow instructions and learn, and following instructions is good preparation for when these students are mowing lawns in five years. This is how most classes are taught, which creates the problem of students knowing only what they have been told to memorize. If students were taught to think rationally BEFORE they entered college, perhaps they would be more motivated to learn during high school and pass the state’s test the first time.

  4. I’d be curious to hear what these students for whom you “added value” have to say about their test prep experiences. Have you followed up with them? Do they also believe that they didn’t learn anything meaningful?

    Also, what are your thoughts on the PARCC assessment (http://www.parcconline.org/about-parcc), which is supposedly aligned with the Common Core? (A high-stakes test aligned to the curriculum—imagine that!)

  5. Gretel

    I had an interesting experience recently watching some adults being interviewed and asked some math questions. I noticed that some of them had trouble predicting what the interviewer wanted to hear, so they didn’t come across as well as the one candidate who was really excellent at it. I don’t think that candidate was more prepared; the interviewees had about the same content knowledge. The questions were poorly phrased, and the candidates who did not do well tried to ask a number of clarifying questions before giving an answer. The one who did well seemed to listen hard to the questioner and then gave his answers in a tone that suggested they were guesses. But they were exactly what the interviewer wanted to hear!

    This is perhaps a long-winded way to suggest that saying what people want to hear is a separate skill from understanding and answering questions correctly. I’ve found that it takes students who actually want to understand and answer questions a much longer time to get passing scores than the ones who are able to take a guess at what the test “wants to hear”.

  6. I thought you would like this article about how a wonderful math teacher was ranked the worst 8th grade teacher in NYC according to Value Added. Sorry if you’ve already referenced it:
    http://www.washingtonpost.com/blogs/answer-sheet/post/meet-the-worst-8th-grade-math-teacher-in-nyc/2012/05/15/gIQArmlbSU_blog.html

