Wednesday, March 30, 2011

Value-added?

The LA Times (the deplorable publication that it is) published its own analysis of LAUSD teachers using a value-added method.  It got people all up in arms.

Value-added gets me all hot and bothered (in the good sense).  Just look at this and tell me you don’t get a funny feeling in your nether parts:

y = Xβ + Zv + ε

where β is a p-by-1 vector of fixed effects; X is an n-by-p matrix; v is a q-by-1 vector of random effects; Z is an n-by-q matrix; E(v) = 0, Var(v) = G; E(ε) = 0, Var(ε) = R; Cov(v, ε) = 0.  It follows that V = Var(y) = Var(y − Xβ) = Var(Zv + ε) = ZGZᵀ + R.

That is statistical porn!

That is the mixed-effects model that LAUSD is allegedly using to assess teachers with the “value-added” method.  Essentially, we use a student’s previous test scores to predict the growth we would expect from that student (given demographic characteristics); we then see what actually happens, and the difference between the predicted and actual values tells us how much “value” a teacher added (or subtracted).
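To make this concrete, here is a minimal sketch (my own illustration, not LAUSD's actual code) of fitting a random-intercept model of the form above with Python's statsmodels, on entirely made-up data.  Every column name and number is hypothetical; the estimated random intercept for each teacher plays the role of v, the "value added."

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic data: one row per student. All values here are invented.
n = 500
teachers = [f"T{i}" for i in range(20)]
df = pd.DataFrame({
    "teacher": rng.choice(teachers, n),
    "prior_score": rng.normal(600, 50, n),  # last year's test score
    "low_income": rng.integers(0, 2, n),    # an example demographic control
})
hidden_effect = {t: rng.normal(0, 10) for t in teachers}  # true teacher effects
df["score"] = (130 + 0.8 * df["prior_score"] - 15 * df["low_income"]
               + df["teacher"].map(hidden_effect) + rng.normal(0, 20, n))

# Fixed effects (Xβ): prior score and demographics.
# Random effects (Zv): one intercept per teacher -- the "value added."
model = smf.mixedlm("score ~ prior_score + low_income", df,
                    groups=df["teacher"])
result = model.fit()

# Each teacher's estimated random intercept: how far their students' actual
# scores sit above or below what the student-level variables alone predict.
value_added = {t: float(re.iloc[0]) for t, re in result.random_effects.items()}
for t, v in sorted(value_added.items(), key=lambda kv: kv[1]):
    print(t, round(v, 1))
```

The real model presumably uses several years of scores and many more controls, but the bones are the same: predict, observe, attribute the residual.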

It is a complicated statistical model.  It’s hard for me to argue for the use of this model with teachers because I’m not sure that I agree with assessing people’s performance using a mathematical model that takes a few graduate degrees to understand.  But the alternative seems even more unacceptable to me.

Currently, most teachers are deemed "excellent" through observation, even though the majority of our students are not able to pass standardized tests (yes, I realize that this is not the only assessment of a student's progress, but alas, it is what counts right now).  We also look at what percentage of a teacher's students were "proficient."  The problem with this is that a teacher who starts the year with 80% of his or her students proficient is judged the same as a teacher who begins the year with 20% proficient students and ends it with 80% proficient.

Simple Growth Model: If we just look at a snapshot of a student's progress in the academic year (either using a pretest and posttest, or comparing a student's previous score to the current score), then we get a very hazy picture of what happened, and we have essentially no way to control for demographics that will obviously affect academic growth.
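By contrast, the simple growth model boils down to something like this (again a toy sketch with invented numbers): raw gains averaged per teacher, with nothing adjusting for who the students are.

```python
import pandas as pd

# Hypothetical pretest/posttest scores for students of two teachers.
df = pd.DataFrame({
    "teacher":  ["T1", "T1", "T1", "T2", "T2", "T2"],
    "pretest":  [320, 340, 310, 420, 430, 410],
    "posttest": [360, 375, 350, 455, 470, 445],
})
df["gain"] = df["posttest"] - df["pretest"]

# Average raw gain per teacher -- the whole model.
print(df.groupby("teacher")["gain"].mean())
# Both teachers show similar gains, but if T1's students face far greater
# obstacles outside school, identical raw gains may mean very different
# teaching effectiveness. Nothing here controls for that.
```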

Chicago Public Schools has a pretty easy and intuitive explanation of the value-added analysis.  They have a PowerPoint that goes into some Oak Tree Analogy - which may be useful.  But I think this document provides the best explanation:  Value-added explanation.
