The statements and opinions herein are
solely those of the author and do not necessarily represent the views of any organizations or individuals associated with the author, past or present.

Wednesday, January 13, 2016

The perversion of STARS

If STARS is the gold standard of campus sustainability assessment, we're getting robbed.

By Dave Newport, LEED AP

OK, so STARS isn’t perverted. The STARS website hasn’t been hacked with porn. Not getting kinky.  

Just couldn’t resist a salacious headline. So cheap.

But STARS’ data are being perverted. That perversion is impacting all of us negatively.

And there’s a really easy fix. I’ll come back to that.

First, the table below lays out the perversion. It lists the top 25 campuses by STARS score alongside the rankings of some third parties that use STARS data to derive their own lists [see footnotes].

A quick scan of the relative rankings reveals all sorts of anomalies. Highly ranked STARS campuses are ranked low—using the same data—or left out of other rankings entirely. And vice versa.
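
For anyone who wants to poke at the comparison themselves, the idea is simple: rank campuses by STARS score, line that order up against a third-party rank, and flag the big gaps. Here's a minimal sketch; the campuses, scores, and ranks in it are made up for illustration, not the actual table data.

```python
# Minimal sketch: flag gaps between a STARS-score ranking and a third-party ranking.
# All campuses, scores, and ranks below are invented for illustration only.

stars_scores = {"Campus A": 85.2, "Campus B": 81.7, "Campus C": 78.4, "Campus D": 75.9}
third_party_rank = {"Campus A": 12, "Campus B": 1, "Campus C": None, "Campus D": 3}  # None = unranked

# Rank campuses by STARS score, highest first.
stars_rank = {name: place for place, name in enumerate(
    sorted(stars_scores, key=stars_scores.get, reverse=True), start=1)}

for name, s_rank in stars_rank.items():
    t_rank = third_party_rank.get(name)
    if t_rank is None:
        print(f"{name}: STARS #{s_rank}, absent from the third-party list")
    elif abs(t_rank - s_rank) >= 5:
        print(f"{name}: STARS #{s_rank} vs. third-party #{t_rank}  <-- anomaly")
```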


A couple of other curiosities pop out. With great and sincere respect for the work done there, UC-Irvine, with a STARS score (66.00) below any in the Top 25, has twice been named Sierra's #1 school. Likewise, Sierra's #1 school in 2013, UCONN, submits its STARS data as a Reporter (no score or rating).

Yet Colorado State University, with the highest STARS score ever recorded and the first and only campus so far to achieve a STARS Platinum rating, is only #4 in Sierra and #12 in Princeton Review?

That’s preposterous—and perverted. No, I don’t work at CSU; supposedly they are my campus’ arch-rival. Whatever.

I’m just saying: fair is fair.

Curiouser and curiouser

To be sure, anybody has the right to interpret data as they see fit. And all rankings are inherently subjective; there are few perfect, absolute measurement standards (a planet's distance from the Sun, maybe?). Nor do rankings normally predict future performance: LEED buildings are rated at birth, but can go bad if they are not given appropriate care and feeding. Most important, rankings should be built as well as they can be on a foundation of quality data and interpretation, as STARS is.

The basis for journalists' rankings has been called into question before. Sustainabilistas organized and asked third parties for major reforms in their rankings back in the day. In the interest of credibility, journalists' opinions should be supported by a transparent methodology.

Yet in some instances “it's not always clear how those scores are calculated,” the Chronicle of Higher Education reported in 2010.  “Two years ago, a researcher at Sierra revealed to the Chronicle that the magazine's ratings were assigned somewhat haphazardly, based on the impressions of staff members.” Sierra then fired back at the Chronicle.

[An opportunity to comment was extended to Sierra Editor Jason Mark but had not received a response by this blog's publication time.]

For its part, Sierra seems to have tried to improve its methodology and transparency in the years since. They publicly reported a major faux pas in their 2015 calculations, correcting it after the issue had gone to press. Indeed, Sierra has reported computational errors in every annual ranking since 2012. But to their credit they 'fessed up; that takes stones. And this year, Sierra's new Editor announced they are using a consultant to provide data analysis and are adding questions about divestment. Kudos for both.

Yet spurious rankings promoted by iconic publications such as Sierra and Princeton Review can have a negative impact on campuses that score curiously low. Sustainability budgets and personnel decisions can suffer from perceived underachievement. Administrators wonder what they have to do to be recognized; likewise, they can question, and have questioned, the effectiveness of using STARS at all if all their STARS data earn them are bad marks from third parties.

The same goes for overachievement: campuses that may not score as high in STARS terms but are highly ranked by third parties may be disincentivized, in the minds of some administrators, from further accomplishment. Things have come too easily. And campuses that are busting it to wring out every possible STARS point may look at the high ranking of lower-scoring STARS schools and wonder if there is a better way to shine.

I'm undoubtedly not the only sustainabilista who has fielded questions from administrators when my campus (the first STARS Gold and a former Sierra #1) isn't favorably ranked in some publication and so-and-so campus is.

I hate that question.

The STARS ‘gold standard?’

Eight years ago, as STARS was emerging, the Chronicle reported that “many think STARS will become the gold standard of sustainability evaluations.”

But has it?

Certainly the opinions of a few journalists can’t be credibly compared with the broad input and consensus of experts in the field such as STARS represents. STARS is open-source, inclusive, informed by subject matter experts (you), democratic, and constantly improving. Because of that, STARS is literally bought into by over 700 campuses that have paid to play and invested substantially in the data collection and quality control STARS requires. 

Third parties can boast none of that.

Then recently I was meeting with my Provost to plan our next STARS submittal when he popped the question. He's a supporter of STARS, a scientist and researcher who likes data, and he has championed our submittals in the past. His question was a fair one, but one I had never been asked.

“What is the ROI (Return On Investment) of STARS?”

I had just outlined the 18-month process it typically takes us to complete a STARS submittal. Along the way we invest significant personnel time and cost, we divert effort from other activities, we pay a consultant to review the results with a fresh pair of eyes, and we incur other direct and indirect costs. So his question was prompted by process costs, not by an evaluation of sustainability's efficacy.

I mumbled some perfunctory STARS cheerleader propaganda—and then told him we’d come back with an analysis.

Thinking about it since, my answer is that as an instrument that measures campus sustainability, STARS is unsurpassed. As an architecture of the elements needed to advance sustainability, it is all there—and for the most part weighted appropriately.

However, as an instrument that contributes to our reputation, it is ineffective. STARS does not publish rankings, so our results are under-promoted at a minimum. And the deep, quality data we all work hard to compile in our STARS reports are taken by high-visibility third-party publications and perverted: they change the weightings of STARS credits to suit their own purposes and scramble the results.

That’s good for them, bad for us.
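
To make the "scramble" concrete, here's a minimal sketch of how reweighting alone reorders the very same data. The campuses, categories, scores, and weights are invented; they aren't real STARS credit weights or any publication's actual methodology.

```python
# Minimal sketch: the same category-level data, reordered purely by reweighting.
# Campuses, categories, scores, and weights are invented for illustration; they
# are not real STARS credit weights or any publication's actual methodology.

campuses = {
    "Campus A": {"Academics": 90, "Engagement": 85, "Operations": 60, "Planning": 80},
    "Campus B": {"Academics": 65, "Engagement": 70, "Operations": 95, "Planning": 75},
}

def rank(weights):
    """Order campuses by their weighted-average score, highest first."""
    def weighted_score(scores):
        return sum(scores[c] * w for c, w in weights.items()) / sum(weights.values())
    return sorted(campuses, key=lambda name: weighted_score(campuses[name]), reverse=True)

equal_weights = {"Academics": 1, "Engagement": 1, "Operations": 1, "Planning": 1}
ops_heavy = {"Academics": 1, "Engagement": 1, "Operations": 4, "Planning": 1}

print(rank(equal_weights))  # ['Campus A', 'Campus B']
print(rank(ops_heavy))      # ['Campus B', 'Campus A'] -- same data, flipped order
```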

Instead of seeing credible, studied, and informed rankings based on the best methodology derived by experts in the field, the public gets arbitrary results that are promoted widely. Sierra even touts that its “ranking and the stories surrounding it draw national media attention from outlets such as the New York Times, CNN, NPR, and many others.” And the problem is compounded because prospective freshmen and campus administrators surf Princeton Review. Greenies surf Sierra.

But only sustainabilistas surf STARS.

If STARS is the gold standard of campus sustainability assessment, we’re getting robbed.

Not a good ROI.

The STARS ruse

At STARS' genesis, we made the decision not to rank campuses, just rate them. The consensus was that campuses would be more inclined to use STARS for its intended purpose, driving campus sustainability, if the movement weren't designed as an arms race of rankings.

I concurred with the policy then. And perhaps that is, in part, why 700+ campuses are on the STARS team. It was the best thing to do at the time.

But that policy was also a bit of a ruse. Obviously, STARS does rank schools, albeit in a very categorical manner: Platinum schools rank higher than Gold, which rank higher than Silver, and so on. But we steadfastly stayed away from producing more granular rankings. That Jedi approach remains the policy to this day.

In the current climate of data being appropriated and respun by novices, that policy increasingly undercuts STARS’ goal to “enable meaningful comparisons over time and across institutions using a common set of measurements developed with broad participation from the international campus sustainability community.”

Indeed, the comparisons are not being made from STARS data by STARS standards. They are being skewed by Sierra, which reaches 1.2 million readers and says its green schools issue is its most popular, and by Princeton Review, which claims 1.5 million web visits a month, most of them prospective freshmen and/or their parents.

STARS touts that “more than 700 institutions on six continents are already using the STARS Reporting Tool, making STARS the most widely recognized standard for higher education sustainability in the world.” [emphasis added]

But it’s not. That statement is inaccurate. See above.

Likewise, STARS’ website proclaims the number one reason to participate is to “gain international recognition for your sustainability efforts.”

But that recognition is among STARS wonks. The rest of the planet is reading Princeton Review and Sierra.

Time for a new chapter.

Claim your lane

There’s no way anybody can convince the third parties to accept and publish the STARS ranking scores verbatim. Not going to happen. 

Neither is US News & World Report waiting at the presses to ink up the STARS scores and publish them with their rankings. Wish that they would. Maybe someday.

Frankly, we want third parties to be interested in our work and to promote it, so long as there is a reference standard that sets a credible bar for all. Then, when others tailor their rankings and commentary to their own niche or perspective, the breadth of our work isn't diminished.

So the big fix is easy.

STARS simply needs to move past the charade that it doesn't rank. If STARS is, as the website touts, “the most widely recognized standard for higher education sustainability in the world,” then it should act like it. STARS should state clearly that its rankings are the most credible and definitive measure of campus sustainability in the world.

Because they are.

STARS needs to claim that territory or abdicate its influence to a room of journalists somewhere. In fact, Sierra is already claiming the space, and I don't blame them for staking out unclaimed territory. They “believe [Sierra's] ranking provides a broad and accurate picture of how schools are performing in terms of sustainability.”

Wrong.

Sierra’s ranking, however well intended, provides a picture of how an environmental group’s journalists think higher education should operate sustainably. But they don’t have one campus sustainability expert on their staff. STARS owns the franchise on well-studied, credible, definitive campus sustainability data.

STARS should stop giving away its franchise.

To its credit, STARS is testing a rankings approach, and so far the reception has been very good. Last year, STARS published its first Sustainable Campus Index and, according to STARS staff, received significantly more media attention as a result. STARS also added overall score as an exportable data point in the "Score Display" data view, so it is easier for interested parties to compile STARS rankings, as I did. One sustainabilista at a very credible campus told me these recent STARS moves have compelled them to submit full-on STARS reports now, so his campus can be ranked. So, the appetite for STARS rankings may be there.
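
For what it's worth, once overall score is exportable, compiling a ranking takes only a few lines. Here's a minimal sketch, assuming the Score Display view has been saved as a CSV with "Institution" and "Score" columns; the file name and column headers are my assumptions, not the tool's actual export format.

```python
# Minimal sketch: turn an exported score list into a ranked table.
# The file name and the "Institution"/"Score" column headers are assumptions,
# not the actual format of the STARS "Score Display" export.

import csv

with open("stars_score_display.csv", newline="") as f:
    # Skip Reporter entries, which have no score.
    rows = [row for row in csv.DictReader(f) if row.get("Score")]

# Sort by overall score, highest first, and print a simple ranking.
ranked = sorted(rows, key=lambda r: float(r["Score"]), reverse=True)

for place, row in enumerate(ranked, start=1):
    print(f"{place:>3}. {row['Institution']}  {float(row['Score']):.2f}")
```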

Or maybe the appetite isn't there. Unlike a broken clock, I am wrong more than twice a day, as I am told frequently before I even leave the house in the morning...

So, what do you think? At the right of this blog is a quick poll that will give a broad-brush read on which way to go. Please register your opinion. [1.28.16 Poll closed. 81% in favor]

But if you subscribe to STARS’ credibility, as I do, then this conclusion is inescapable: We’re going to get ranked by somebody anyway. Why not by the best?

Even if you have questions about STARS' data quality or other imperfections in STARS, it's still the best game in town. Any limitations in STARS extend to the third parties because they use STARS data. Plus, they introduce their own errors (see, e.g., Sierra's repeated computational errors above).

Bottom line: our own STARS data are enabling others to rank us poorly. To pervert STARS' findings. STARS should publish and promote its own credible rankings. Protect its franchise.

It’s time to claim our lane.

-30-

Coda: I say 'we' a lot in this blog. That's probably confusing. Sometimes the 'we' refers to my former role on the STARS Steering Committee. Sometimes it's referring to my place among all sustainabilistas. Sometimes I'm just confused. Now you are. 


Footnotes


[i] STARS data as of Dec 15, 2015. This list may have changed as schools file new/revised STARS reports or old STARS reports expire. Table contains current STARS reports, including v1.2 and v2.0 submittals. V1.2 campuses listed in italics.
Data Sources:
http://www.sierraclub.org/sierra/2015-5-september-october/cool-schools-2015/full-ranking
http://www.princetonreview.com/college-rankings?rankings=top-50-green-colleges
http://www.bestcolleges.com/features/greenest-universities/