Sunday, January 20, 2013

Why High-School Rankings Are Meaningless—and Harmful

How much value can there be in an index that rates thousands of schools? When it reinforces the worst tendencies in our education system, not much.

Over the past month or so, in newspapers and local-news websites all around the country, public high schools and school districts have been trumpeting reports about how they've done on various national rankings of high schools. For instance, here's Bill Runey, principal of Attleboro High School in Massachusetts. "We're really proud of this," he said in a press release put out by the school district. He was referring to the fact that Attleboro had been ranked 1,947th in the nation on the Washington Post's annual ranking of "America's Most Challenging High Schools."
On a local level, school rankings have long been the sort of thing city magazines thrive on, along with "best of" issues that purport to tell readers where to buy the best burger or get the best waxing. Within a single metropolitan area (or even a single state), rankings of public schools may have some utility if they are done thoughtfully, using sensible metrics. Parents might use that information to find an affordable home near good schools while staying within reasonable reach of their workplaces. It's harder to fathom the logic of ranking high schools nationwide: few families will move out of state or across the country on the basis of claims about school quality.
Without taking away from whatever credit Runey and Attleboro High School deserve for their achievements, let's call national rankings of high schools what they are: nonsense. There is no way to say, with any degree of accuracy at all, where any given high school ranks in relation to others in terms of how good it is or how challenging it is. And the claim that Attleboro High School, which was not even fully accredited as recently as seven years ago, is now in the top ten percent of America's high schools -- among the most challenging -- seems improbable, at best.
And yet, every year since 1998, Jay Mathews, an education journalist at the Washington Post, has been putting together a ranking of what he calls "America's Most Challenging High Schools," or the Challenge Index. For years, this national list was published by Newsweek, which was owned by the Washington Post Company. When the Post sold off Newsweek in 2010, it kept the Mathews index for itself. Newsweek then produced its own ranking, which has been continued by the Daily Beast. And, of course, US News & World Report, an organization famous for fueling Americans' obsessions with rankings (colleges, law schools, hospitals, etc.), started its own high-school list, too.
All of these lists have flaws that stem from the inherent absurdity of presuming to rank schools around the country according to how good or challenging they are. And they all come in for criticism. Recently, Matthew Di Carlo, a senior research fellow at the Albert Shanker Institute, took a critical look at the Newsweek/Daily Beast and US News rankings, finding some good and some bad features in each of them.
But it's the Mathews "Challenge Index" that has given rise to the sharpest criticism over time (see here, here, and here, for example) because of its methodology, which is reductionist in the extreme. It uses only one factor to calculate its rankings: It divides the number of Advanced Placement (AP), International Baccalaureate (IB), and Cambridge (AICE) exams taken at each school by the number of graduating seniors. Note that the numerator is not even the number of such exams passed, but merely the number taken. So, a given school can rise on the list by increasing the number of its students who take "advanced" classes.
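To make the arithmetic concrete, here is a minimal sketch, in Python, of the index as described above: the number of AP/IB/AICE exams taken divided by the number of graduating seniors. The schools and figures below are hypothetical, invented purely for illustration; the point is that pass rates never enter the calculation, so a school that funnels everyone into exams outscores a more selective one.

```python
# A minimal sketch of the Challenge Index as described in the text:
# AP/IB/AICE exams *taken* divided by graduating seniors.
# All school names and numbers below are hypothetical.

def challenge_index(exams_taken: int, graduating_seniors: int) -> float:
    """Exams taken per graduating senior; pass rates are ignored."""
    return exams_taken / graduating_seniors

schools = [
    # (name, exams taken, exams passed, graduating seniors)
    ("Selective Prep", 300, 270, 200),  # discerning: 90% of exams passed
    ("Exam Mill High", 900, 180, 200),  # funnels everyone in: 20% passed
]

for name, taken, passed, seniors in schools:
    print(f"{name}: index = {challenge_index(taken, seniors):.2f}, "
          f"pass rate = {passed / taken:.0%}")
```

On these invented numbers, Exam Mill High scores 4.50 to Selective Prep's 1.50, despite passing only a fifth of the exams it administers; nothing in the formula rewards the passing.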

Conversely, schools that are more discerning about which students ought to be taking AP classes end up suffering in the rankings. The list thus produces nonsensical anomalies: high schools with very low graduation rates rank far above excellent schools that refuse to game the system, including schools that, like Scarsdale High School, have joined the growing number that have eliminated AP courses entirely so that, as Bruce Hammond puts it, "students and teachers could rediscover their passion and creativity" once freed of what is too often a rigid and stultifying AP curriculum.
To their credit, US News and Newsweek/Daily Beast, which also use AP and IB courses as a measure, have made their rankings more sophisticated and reasonable by adding other measures of a school's quality, such as (in the Daily Beast's case) graduation rates and college-acceptance rates, and (in the case of US News) performance on state accountability tests and the proficiency rates of a school's least advantaged students on those tests. (For explanations of their methodologies, see here for the Daily Beast and here for US News.)
Despite steady criticism over the years, Mathews has retained and defended the simple formula he uses to calculate his Challenge Index, refusing to factor in other appropriate measures of school quality beyond the number of students taking advanced classes. (His only concession has been to add a separate list, which he calls "The Catching Up Schools," that takes into account how impoverished the student body is, as measured by the percentage of students who qualify for federal lunch subsidies. He also now notes that information in a separate column on his main ranking, along with the percentage of graduates who passed at least one "college-level" test during their high-school career, but he does not factor those data into his rankings.) Because Mathews otherwise insists on using only AP and IB exams as his measure, the Challenge Index typically comes in for the sharpest criticism of all these rankings. The essential criticisms can be summarized as follows:
1) The inherent impossibility of measuring relative quality in schools. Quality is a very subjective matter, especially in something as intangible as education. And using a simple measure to rank thousands of schools certainly cannot capture the relative quality of schools or indicate which are better than others.
Mathews says his index doesn't purport to identify or rank "the best" schools or otherwise measure quality. He says he's merely identifying the "most challenging" schools, as indicated by the number of students who take what he calls "college-level courses." But when his ranking was published by Newsweek, it was actually billed as a list of "America's Best High Schools." Like most journalists, Mathews probably doesn't write his own headlines, and he may have been as irritated by the use of the word "best" as many of his readers. But given how his bosses have billed the lists over time, readers inevitably treat them as verdicts on overall quality. As Valerie Strauss, a sharp critic of the Mathews ranking and a colleague of his at the Washington Post, points out, Mathews "doesn't use that word ['best'] to describe his rankings, but what do you think people take away from them?"
2) Focusing only on AP and other "advanced" courses is silly. Aside from the obvious and already noted objection that looking only at such courses fails to take into account all the other indicators of school quality, some people (I include myself here) say that many of these courses simply aren't all they're cracked up to be, which makes their use as a proxy for quality even more ludicrous.
This isn't the place to rehash the many criticisms one can lodge against AP courses. I did that last October in a piece here. But it's worth noting that since then AP courses have taken more high-profile blows, such as Dartmouth's decision in January to join the list of schools refusing to give college credit for high scores on AP exams, citing concerns that AP courses "are not as rigorous as college courses."
Then, a few weeks later, Kenneth Bernstein, an award-winning high-school teacher (recently retired) and nationally known blogger ("teacherken") garnered nationwide publicity and hundreds of thousands of readers for a letter he published, warning college professors that the current U.S. obsession with high-stakes testing is producing high-school graduates who don't think as analytically or as broadly as they should. He devoted much of his attention in the piece to AP courses, calling them "responsible for some of the problems" professors will encounter with students headed their way.
Most importantly (and damningly), in April, Stanford University researchers released a careful review of more than 20 research studies of the AP experience, the results of which challenged four common assumptions about the AP program: (1) that it gives students several advantages in college; (2) that it helps narrow achievement gaps and promote educational equity for traditionally underserved students; (3) that it enriches students' high-school experiences; and (4) that schools with AP programs are better than schools without them. Denise Pope and her Stanford colleagues found problems with all of these claims.
In the face of continuing evidence that the merits of many AP courses are exaggerated, it's hard to understand why Mathews continues to make them the bedrock of his ranking system. He says that he's just interested in bringing the benefit of more challenging coursework to larger groups of students. But even if he disagrees with specific criticisms of AP courses, one would think that the overall quantity of criticism at this point would be enough to moderate what he himself has called his "obsession" with the program. But obsessed he is: By my count, he has devoted his space in the Washington Post to some aspect of AP courses more than fifty times in the last four and a half years.
3) The Challenge Index has been partly responsible for fueling the tremendous growth in AP enrollments around the country over the past ten years.
Of course, many students take AP courses because they're genuinely interested in challenging themselves with what can be a rigorous course of study and because they're intellectually curious about the subject matter. Unfortunately, too many others take these courses because they're feverishly trying to impress college admissions officers by stacking their record with large numbers of AP courses.
But many students who end up in AP courses are there because they are unwitting pawns of their principals, local school boards, or education bureaucrats, who are pushing more students to take AP classes to improve their schools' ranking on the Challenge Index and other such lists. Remember that the Mathews index doesn't take into account how students perform on the AP exams, just that they take them. The incentive to vacuum kids into these classes ends up packing AP courses with too many students who don't belong there.
In short, by being partly responsible for the explosive growth in AP enrollment over the past decade, the Mathews ranking -- and, to a lesser extent, the others -- amplifies the absurdity that pervades contemporary public education in the United States. Cramming students' heads with information and then subjecting them to standardized tests seems to have supplanted helping students learn as the preferred modus operandi of many education officials, whose behavior is shaped more by perverse incentives than by educational common sense.
That's the reason to care about this.
If these sorts of rankings didn't actually shape school behavior, everyone would be perfectly justified in ignoring Mathews and the Washington Post as they spend time and other resources assembling his list. The ranking itself is meaningless. But the harm that it and other lists of its kind do to public education, and the role they play in driving the College Board's revenues, can't be overlooked. These lists may sell papers and draw readers to websites, but those of us outside that business have a duty to push back against this kind of reductionism wherever we see it.
