I recommend Ars Technica’s well-written summary of a recent paper in PLoS Medicine that analyzed the scientific publishing industry through a purely economic model. I believe that the for-profit model of most scientific journals is broken and ripe for disruption by open access journals such as PLoS Medicine, so I’m inclined to look at the paper from a biased viewpoint. I will note, though, that since the paper is in a PLoS journal, I can choose to look at the results more critically without having to pay through the nose to do so.
It’s sort of a joke, especially among industrial scientists, that physicists are OK at lots of things but not excellent at anything, and that this explains why there are so few physicists in industry doing physics. While there is some accuracy to the joke, the truth is that the one thing physicists excel at is measurement. As my graduate advisor used to point out, all of physics is counting. The trick is just to figure out the right things to count and the right way to count them. That’s the essence of measurement, and it’s not always as easy as it seems.
Everybody needs to measure stuff. And whether you’re in a traditional business, own a Web 2.0 startup, or are just an average gal or guy, the need to measure things quickly and precisely has gotten a lot more intense in the past decade. You want to understand where your business is in the Long Tail, or how “sticky” your website is, or how much your coffee habit is costing you annually. And when I say precisely, I mean precisely in a, well, precise sense. To speak precisely: precise and accurate are not the same thing. And this is the first thing to understand about measurement – a measurement is only valid when it is both sufficiently accurate and sufficiently precise.
Accuracy vs. precision
When you measure something accurately, your measurement gives you a number that is very close to the truth. You may not get the same number each time you make your measurement, but you know that it’s close to the actual value. When you measure something precisely, you’ll get close to the same result each time, but you may not be close to the actual answer. Ideally, we want our measurements to be both accurate and precise. In reality, most folks have a higher tolerance for a lack of accuracy than for a lack of precision: most people will settle for something that is off from the truth by a good bit, so long as they get consistent answers from it. If you check the measuring cups in your kitchen drawers, you will find that they are pretty precise. Fill your 1/4 cup measure four times and dump it into your 1 cup measure, and it will fill it up exactly. (Or at least it did on each of the three sets of measures in my kitchen.) Yet I have no idea – nor do I care – whether the cups are calibrated properly. Do they deliver exactly 1 cup? If you care about that kind of accuracy, you’ll probably be using a graduated cylinder, not a plastic measuring cup. For most of us, the fact that the cups are precise is more important.
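To make the distinction concrete, here is a small simulated sketch. The “true value,” the bias, and the scatter numbers are invented for illustration, not taken from any real instrument: a systematic offset models inaccuracy, random scatter models imprecision.

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 1.0  # the quantity we are trying to measure, e.g. 1 cup

def measure(bias, spread, n=1000):
    """Simulate n repeated measurements with a systematic bias
    (inaccuracy) and random scatter (imprecision)."""
    return [TRUE_VALUE + bias + random.gauss(0, spread) for _ in range(n)]

# Accurate but imprecise: centered on the truth, widely scattered.
accurate = measure(bias=0.0, spread=0.10)
# Precise but inaccurate: tightly clustered, but offset from the truth.
precise = measure(bias=0.05, spread=0.01)

for name, data in [("accurate/imprecise", accurate),
                   ("precise/inaccurate", precise)]:
    error = statistics.mean(data) - TRUE_VALUE  # accuracy: average offset
    scatter = statistics.stdev(data)            # precision: consistency
    print(f"{name}: mean error = {error:+.3f}, scatter = {scatter:.3f}")
```

The first series averages out near the truth but bounces around; the second gives nearly the same number every time, yet that number is wrong.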
Does our tolerance for inaccuracy seem surprising? If you use Google Analytics to track your website stats, it shouldn’t be. Google can’t know, accurately, how many unique visitors actually visited your site. How can they? Even though they set a cookie to track your visitors, a lot of folks using Firefox will only accept cookies for that session, thus preventing Google from counting them over multiple visits. I do essentially the same thing with Omniweb. A lot of folks using IE will occasionally flush all of their cookies as a privacy measure. Each time the Google Analytics cookie for your site gets deleted, that user looks like a new user to Google. This means your unique visitor count is artificially high, as is your percentage of new users visiting your site. But, really, it doesn’t matter. You don’t care how accurate that number is, because whether you have 570 or 450 unique visitors per day isn’t as important as the trend. Is that number going up or down? Is it higher on Saturday mornings or weekday nights? As long as the measurement is precise, then those trends can be analyzed meaningfully.
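The same point can be sketched numerically. The daily counts and the 20% overcount factor below are invented for illustration: if the tool inflates every day’s count by a roughly constant factor, the relative trend survives intact.

```python
# Hypothetical daily unique-visitor counts: what actually happened, and
# what an analytics tool reports if deleted cookies consistently inflate
# the unique count by 20%.
true_counts = [500, 520, 510, 540, 560, 555, 580]
reported = [round(c * 1.20) for c in true_counts]  # inaccurate, but precise

def day_over_day_growth(counts):
    """Relative change from each day to the next."""
    return [(b - a) / a for a, b in zip(counts, counts[1:])]

# A constant multiplicative bias cancels out of relative trends:
for t, r in zip(day_over_day_growth(true_counts),
                day_over_day_growth(reported)):
    print(f"true: {t:+.3f}   reported: {r:+.3f}")
```

The reported numbers are all wrong in absolute terms, but the day-over-day growth rates match the true ones, which is why trend analysis on a precise-but-inaccurate counter is still meaningful.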
Now we know what makes a measurement valid, and we understand that a large fraction of the time, we don’t need as much accuracy as we need precision. While I didn’t explicitly talk about it, it’s important to note that validity is predicated only upon sufficient accuracy and precision. Your car’s fuel gauge is neither terribly accurate nor terribly precise, but it represents a valid measurement because it gives you the data with sufficient accuracy and precision to keep you from running out of gas.
There are four other things to keep in mind about a measurement, which I’ll call the Four ‘R’s: Relevance, Range, Resolution, and Reproducibility.
I just read this piece by Alex Steffen on the WorldChanging blog and highly recommend it. The key quote from the piece, in my opinion, is this one:
[I]f we’re going to avert ecological destruction, we need to not only do things differently, we need to do different things.
What he’s saying here is something that I’ve pointed out to my colleagues in the innovation community: sustainability is not about making things with less stuff, or that last longer, or that aren’t toxic, or even that can be infinitely cradle-to-cradle recycled. Sustainability requires us to invent things that make it possible to live more sustainably. If the things, the stuff, that we have and use make it easier to live sustainably than not to, then we will live sustainably.
It’s not an easy problem to solve, for the same reason that truly groundbreaking innovation is not easy. It is pretty straightforward to imagine a novel solution for a market that already exists. It is much harder to invent a new market. I think that the kinds of products that will help people live sustainably are products for a market that doesn’t exist yet. Our business strategists don’t know how to value them, so our market analysts can’t compute a return on investment, so no investment is made. And truthfully, our scientists and engineers don’t always have the global perspective necessary to understand what types of solutions are needed.
The point of Steffen’s article was to underline the importance of community in making these changes in our systems. I think that it is also important to understand the systems themselves. As we grow in our understanding of the network of interactions and dependencies in our economy and our society, this understanding will allow us to break out of unsustainable patterns and replace them with ones that are equally understood, but are sustainable to the best of our knowledge. And because we’ll be building from a base of understanding, we’ll be able to look at them in a rational fashion 40 years from now, when we understand the ways in which the new patterns are not sustainable.
It may be that at first, these more-sustainable patterns will be obvious: things that folks like Steffen have been telling us about for years, like community gardening, reducing sprawl, and increasing bike transport. But as with everything else, the low-hanging fruit will be quickly exhausted. At that point, progress will only be made through deeper understanding. It will be interesting to see how the tools for gaining that understanding develop.
I recently read Ian Ayres’ excellent book, Super Crunchers. For folks who read and enjoyed Freakonomics, this book is a must-read, covering more cases where clever statistical analyses have uncovered interesting and useful results. The goal in writing the book, according to Ayres, was to encourage people to learn to think statistically. On the other side of the link is a discussion of some errors in experimental design, why their treatment in Ayres’ book frustrates me and why the average person should care.
I posted an article earlier critiquing the media reaction to the recent reports on biofuels and land use management. Worldchanging has just posted a similar critique, which goes deeper into the specifics of each report, so I highly recommend it. What they point out is that the Science articles are nuanced, and that the media in general either missed the nuance or ignored it.
No one grandstands quite like Craig Venter. Whether it’s leading a team racing the government to the first sequenced human genome, succeeding, or admitting that his team beat the government by sequencing his own genome, this guy has style like few others in science. And while physicists at least have the reputation of having large egos installed as part of their graduate training, Venter’s ego is apparently physicist-sized, at least according to Wired and Forbes.
That being said, there is something phenomenally inspiring about the folks who have no shame about tackling the really big problems. This is a constructive sort of hubris, the kind that Larry Wall correctly identified as a virtue. Venter’s glorious hubris was on display this week at the TED conference, where he announced that he was working on a project to engineer a bacterium that turns carbon dioxide into methane and octane, and that he expects results on these fourth-generation fuels within 18 months.
In alternative energy circles, a recent Science magazine online article published by a group from Minnesota has been making a lot of waves in the media. This article from the Seattle Times is typical of the coverage. There are a couple of issues with both the article and the coverage of the article that I’d like to point out.
First, let me tackle the article. While no one will dispute that corn ethanol is an extremely poor choice for a biofuel feedstock, it is also true that the article focused on current biofuel technology. This implicitly assumes that all new biofuels will be roughly equally bad for the environment. Clearly, this is not the case, since algal-derived biodiesel and similar biomass-derived fuels will not contribute equally to global warming through the destruction of ecosystems. The article also assumed by implication that biofuels are the primary driver behind the conversion of ecosystems to cropland. Past data indicate that this is almost certainly not the case, since slash-and-burn agriculture was prevalent in the Amazon basin well before biofuels became a cause célèbre. The issues around land use in the developing world would exist with or without biofuels contributing, since there is rarely an incentive for the governments that control these lands to preserve them. Rain forests do not yield significant economic benefit to those who live near them. All the biofuel boom has done is exacerbate the situation. Hopefully, it will bring attention to the root causes of the destruction of these ecosystems – namely, food security and poverty.
The media has been largely guilty of indulging in shrill hatchet jobs on the nascent biofuel industry based on this article. I am certainly not implying that the authors of the Science report intended this; rather, I think the natural tendency to take potshots at large targets is to blame. Nevertheless, I think it’s important that people interested in short-term energy development continue to work on capturing energy from biomass. With any luck, we’ll solve both the petroleum problem and the disappearing-ecosystems problem at the same time.