Parsing the “Reimagine SPF” debacle

Figuring out precisely what caused the sudden erosion of support for the Reimagine SPF recommendations to overhaul Denver Public Schools’ accountability system is like trying to untangle a snarled ball of yarn.

But it’s hard to read online posts and not come away suspecting that powerful forces really are opposed to measuring schools in any meaningful way. The bottom line is that beleaguered Denver school board President Carrie Olson decided to pull the recommendations for reimagining the School Performance Framework off the June 11 agenda after alliances collapsed and confusion reigned.

Given the chaotic environment, this was a wise move. But reaching any kind of consensus between now and August, when the issue will resurface, is likely to be harder than ever.

We’ve entered such a through-the-looking-glass world that the reasonable arguments for keeping a DPS-created SPF fell completely by the wayside, in favor of a recommendation to replace it with the state’s SPF. Read an earlier post about this issue here. This recommendation won the support of the Denver Classroom Teachers Association.

But a new, multi-measure dashboard (recommendation 2) and creation of a “collaborative continuous learning and improvement cycle to assess the ongoing performance of schools across our three value domains: Academics, Whole Child, and Culture/Climate” (recommendation 3) suddenly became the new “corporate reformer” conspiracy to shame teachers and (gasp!) measure schools against one another.

School board members eager to do the DCTA’s bidding suddenly found that their initial support of the recommendations had placed them on the wrong side of the issue, from the DCTA’s perspective. That must have made them uncomfortable.

To get a full measure of how far this discourse has devolved, consider this excerpt from a post on the Communities United for Real Education website:

“This “new” SPF will still allow DPS to continue to compare and evaluate schools, even though the wording is all around the whole child and climate and culture. It’s a ruse. They will still be able to close schools as NO board policies are changing related to that by passing this set of recommendations. The real purpose of these two recommendations is to continue marketing schools, promoting the choice system, and requiring that misleading data about schools be published while continuing the inevitable comparisons across schools, regardless of their needs and the different programs that exist.”

See what I mean about no measurement or accountability? On what planet is comparing and evaluating schools a bad idea? In what solar system is marketing schools considered a negative? In what universe is publishing data about schools “misleading”?

In a lengthy Facebook comment, Karen Mortimer, an SPF committee member, after establishing her bona fides as an “anti-reformer,” took issue with the CURE post quoted above. It’s worth quoting Mortimer’s comment at length:

These recommendations would actually result in the DPS SPF being blown to smithereens, not perpetuated. It is not “SPF Lite”. It isn’t even an SPF because there are no points that roll up to a score. There is no mechanism to rank schools. There are no winners and losers. There would no longer be any blue, green, yellow, orange or red schools. In our process as a committee, the pro-reformers who wanted to keep the DPS SPF and who decried moving to the state SPF for high stakes accountability, were completely shut down…

Recommendations #2 & #3 MUST NOT be weaponized to punish, shame or close schools. Absolutely not. And, even then, it’s important for everyone to remember that the District doesn’t close schools, the Board does.

The dashboard could provide families, and the broader community, information that you have been wanting for years such as suspension/expulsion rates disaggregated by race, numbers of teachers of color per school, the degree to which teachers feel supported in their schools, the level of mental health supports DPS is providing per school, whether or not students of color feel they have adequate social/emotional supports, whether or not parents feel welcome in a school, and many other data points.

Community – I ask – do we really want this information to stay hidden from us? How is it of benefit to us as champions of public education to only have the State SPF and absolutely no other data? How could we be in favor of completely silencing parent, student and teacher voice?

The dashboard’s purpose is also to allow schools to show where they are excelling and for the community to know where schools are struggling IN ORDER TO hold the District accountable in its support of schools. But, by the same token, if a school IS truly not serving our students well, don’t we as a community want to know? Are we saying that schools should not be held accountable to how they are serving our kids ESPECIALLY when it comes to the Whole Child?

Mortimer gets it partially right here. Of course the measurements she advocates are essential to holding the district accountable. But it’s not just the district. It’s principals, teachers, and all of us who need to be held to account for schools struggling to educate all of our children.

Yes, teachers have been singled out unfairly in some quarters and asked to shoulder all the blame for systemic failures. Yes, that is wrong and simple-minded. But it’s equally obtuse to argue that all the blame should be heaped on one system — the school district — though that system obviously deserves sharp criticism.

What about teacher preparation programs? What about teacher associations that ardently defend contractual rights and then whine about teachers not being treated as professionals? There is plenty of blame to go around, and simply redistributing the shame won’t get us anywhere.

For argument’s sake, let’s give the individuals and groups who want to eliminate accountability and measurement their way. Let’s stop measuring anything and just trust that teachers will get it right. Then let’s come back together in five years and see what our graduation, dropout, and remediation rates look like, not to mention our achievement gaps.

But wait. We won’t be able to do any of that. We won’t have any data.

Let’s call it the ostrich approach to school improvement.

And that, I’m afraid, is precisely the point.