(I thought about titling this "Lies, damn lies, and assumptions", or even "Making an ASS of U and ME". So many decisions, so little time. Sigh...)
One of the hot topics in nerd-topia this week has been an item in The Sunday Times (London) which made a big deal about an estimate that every time someone does a Google search, 7 grams of CO2 get emitted. The article prominently quotes Harvard researcher Alex Wissner-Gross, but Wissner-Gross claims that the seven-gram number came from the reporter, and that he declined to confirm it when asked. His version of the story is somewhat corroborated by the fact that the original item also attributed an estimate of 7-10 grams of CO2 to author Chris Goodall. There are lots of other wrinkles to the story, but both Wissner-Gross and Google itself came up with estimates of 0.2 grams, or about 97% less than the featured estimate.
In an apparently unrelated item, Rob Watson, editor of the GreenerBuildings web newsletter, was teeing off on a set of building-efficiency estimates produced by an anonymous (but apparently reputable) engineering firm. Those calculations showed the building in question to be reasonably efficient, given its age. Watson's audit said quite the opposite -- that the HVAC equipment was seriously oversized and the building grossly inefficient.
What ties these two stories together is that the disparities originate not in the mathematical calculations, but in the assumptions that determined which numbers got crunched in the first place.
In the Google case, Chris Goodall assumed that a single search accounted for fifteen minutes of (desktop, not Google server) computer usage. (I might take fifteen minutes to browse through the links a search returns and investigate a good share of them, but that's not the search per se.)
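Just to show how much that one assumption drives the answer, here's a back-of-envelope sketch in Python. The fifteen-minute figure is Goodall's assumption; the desktop wattage, grid carbon intensity, and per-search server energy are my own illustrative guesses, not numbers from the Times article or from Google.

```python
# Back-of-envelope: how the "15 minutes of desktop time" assumption drives
# the per-search CO2 number. Every input except the 15 minutes is an
# illustrative guess, not a figure from the article.

GRID_G_CO2_PER_KWH = 500         # assumed grid carbon intensity (g CO2 per kWh)

# Goodall-style accounting: charge the search with 15 minutes of desktop use
desktop_watts = 80               # assumed desktop + monitor draw (W)
desktop_kwh = desktop_watts * (15 / 60) / 1000
print(f"Desktop accounting: {desktop_kwh * GRID_G_CO2_PER_KWH:.1f} g CO2")        # ~10 g

# Server-only accounting: charge the search with just the data-center work
server_kwh_per_search = 0.0003   # assumed energy per query at the data center
print(f"Server accounting:  {server_kwh_per_search * GRID_G_CO2_PER_KWH:.2f} g CO2")  # ~0.15 g
```

Same arithmetic, wildly different answers. The only thing that changed is what you decided to count.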
In the case of the building, the engineers apparently assumed absurdly unrealistic numbers for outside temperature, building occupancy, equipment and lighting loads and -- by artificially inflating the numerator in their equation (load divided by power consumed) -- managed to calculate a seriously misleading figure for building efficiency.
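The same point in miniature, with the ratio the engineers were computing (load served divided by power consumed). Every number below is invented for illustration; none comes from Watson's audit or the firm's report.

```python
# Toy illustration of how inflated load assumptions flatter the calculated
# efficiency ratio (load served / power consumed). All numbers are invented.

def efficiency(assumed_load_kwh: float, metered_power_kwh: float) -> float:
    """Calculated 'efficiency': load served divided by power consumed."""
    return assumed_load_kwh / metered_power_kwh

metered_power = 1_000_000   # what the building actually drew (kWh/yr), hypothetical

# Engineers' assumptions: hot design temperatures, full occupancy, heavy plug loads
inflated_load = 850_000
# Audited reality: milder weather, partial occupancy, lighter equipment loads
audited_load = 450_000

print(f"With inflated assumptions: {efficiency(inflated_load, metered_power):.0%}")  # 85%
print(f"With audited loads:        {efficiency(audited_load, metered_power):.0%}")   # 45%
```

The denominator is a meter reading; the numerator is whatever you assume it to be.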
I don't know that malfeasance was involved in either case, but I do smell (at the very least) conflicts of interest. Were I a reporter, I suspect that a story blaming a well-known tech company for big emissions numbers would get me more column inches (and more prominent placement) than the same story with smaller numbers. And were I (I'm guessing here) an engineer for the company which designed a building's systems in the first place, I'd be less likely to get in trouble by finding that building reasonably efficient than by proving it a complete disaster.
But whether we're talking malfeasance, misfeasance, or just differences in perspective, I think these stories emphasize just how important it is to make explicit the assumptions behind any estimate -- particularly when the estimate is of something (like greenhouse gas emissions from a campus) which can't easily be measured directly.
As a new semester starts -- and I start working with a new cohort of students on a new set of class projects -- that's a lesson I know I'm going to have to deliver repeatedly. Were the projects only about proper use (in some abstract sense) of quantitative techniques, assumptions might not matter. But at Greenback we use student labor (it's cheap and readily available) to get better, and smarter, about energy efficiency. And if a project is likely to have real-world impacts, I think it needs reality-tested (or at least realistic) assumptions.