World Crisis Index

Intrade, the Ireland-based prediction market, has launched a World Crisis Index. This index is a sum of the prices of 8 current markets Intrade is making in the area of global crises, including markets on recessions and growth rates in industrialized countries, US unemployment rates, the possibility of new US military action, and other issues. This sum is then normalized and reported. The Intrade markets first came to my attention via an email from Robin Hanson, who is arguably the world’s leading expert on prediction markets. Intrade had a good deal of success in predicting the outcomes of the last election cycle.

I followed the market fluctuations in the electoral issues pretty closely last year, specifically through Intrade’s partnership with Rasmussen Reports. What was interesting to me was how well the markets predicted changes in press coverage, from positive to negative or, more interestingly, from sparse to dense and vice versa.

BTU vs. BTU

Robert Rapier at the R-squared Energy Blog has written a very good analysis of how ethanol may be a more efficient transport fuel than gasoline, despite the fact that ethanol contains fewer BTUs per gallon than gasoline. The upshot is that because of ethanol’s incredibly high octane rating (over 100!), it is possible to run an engine on ethanol at a much higher compression ratio than one could with gasoline, which allows you to extract more work from the ethanol than you could from the gasoline.
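To put a rough number on that compression-ratio argument, here is a back-of-the-envelope sketch using the ideal Otto-cycle efficiency. The compression ratios (10:1 for a gasoline engine, 14:1 for an ethanol-optimized engine) and the heat-capacity ratio are my own illustrative assumptions, not figures from Robert’s post.

```python
# Back-of-the-envelope Otto-cycle comparison. The compression ratios and
# gamma below are illustrative assumptions, not numbers from Robert's analysis.

GAMMA = 1.35  # effective heat-capacity ratio for a fuel-air mixture (assumed)

def otto_efficiency(compression_ratio: float, gamma: float = GAMMA) -> float:
    """Ideal Otto-cycle thermal efficiency: 1 - r**(1 - gamma)."""
    return 1.0 - compression_ratio ** (1.0 - gamma)

eta_gasoline = otto_efficiency(10.0)  # typical gasoline engine (assumed ratio)
eta_ethanol = otto_efficiency(14.0)   # high-compression, ethanol-optimized (assumed ratio)

print(f"Gasoline @ 10:1 -> ideal efficiency {eta_gasoline:.1%}")
print(f"Ethanol  @ 14:1 -> ideal efficiency {eta_ethanol:.1%}")
print(f"Relative gain from the higher compression ratio: "
      f"{eta_ethanol / eta_gasoline - 1:.1%}")
```

Real engines fall well short of these ideal numbers, but the point stands: the higher compression ratio lets the engine turn a larger fraction of each BTU of ethanol into useful work, which narrows the raw BTU-per-gallon gap.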

(note: Robert works for Accoya, one of the most interesting green materials companies out there. I currently have a sample of their product and if I order from them for a project I’m working on, I’ll blog about it here.)

Science and medicine

Some time ago, at the suggestion of my good friend Daniel Hornbaker, I read an interesting but poorly argued book by Steve Salerno that detailed the fraudulence and predatory practices of the 8 G$ self-help industry. Recently, Salerno published an article in the Wall Street Journal that discussed some of the fraudulent activities in the complementary and alternative medicine (CAM) field. The disturbing part of the article for me was that despite continual failures to show any efficacy of CAM treatments, the NCCAM, a federally funded center within the National Institutes of Health, continues to receive funding.

While I’m very interested in scientific investigations of the traditional pharmacopoeia, such as what the Bent Creek Institute is doing here in town – i.e. lots of extractions and chromatography – I’m concerned that mainstream emphasis on unscientific treatments will lead to a lot more deaths like this one.

Investment in R&D for sustainable technology

I just finished Common Wealth, by Jeffrey Sachs. The book is a fairly dry layout of why we aren’t meeting the UN’s Millennium Development Goals and what the consequences of that failure may be. I can’t recommend the book to the casual reader because it is so dense, but it does contain a fair amount of useful data for those of us who are thinking in the Bright Green mode.

One tidbit that I found interesting was Sachs’ estimate of the investment in research and development in sustainable technology required to address the issues the book covered: climate change, water and food security, disease, and so on. This required investment was set at 0.2% of the GNP of the developed world. By his calculations, which were likely made in 2007, this amount is equal to 70 billion dollars. While his estimation methodology was unfortunately not clearly disclosed, let’s run with it for the time being.
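As a quick sanity check on that figure (my own back-of-the-envelope arithmetic, not Sachs’), 0.2% only works out to roughly 70 billion dollars if the developed world’s combined GNP in 2007 was on the order of 35 trillion dollars, which is in the right ballpark:

```python
# Sanity check on Sachs' figure; the GNP estimate is my assumption, not his.
developed_world_gnp = 35e12   # dollars, rough 2007 estimate (assumed)
required_fraction = 0.002     # 0.2% of GNP

required_investment = developed_world_gnp * required_fraction
print(f"${required_investment / 1e9:.0f} billion per year")  # ~ $70 billion
```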

By comparison, the 2007 NSF budget was 5.9 G$ (source: NSF.gov), the NIH budget was 29 G$ (source: NIH.gov), and the Department of Defense research budget was 72 G$ (source: Defenselink). Exclusive of other, smaller research programs, such as the Department of Energy research programs and NASA, this represents around 107 G$ in funded research. For further perspective, the 2007 cost of the Iraq War (specifically excluding Afghanistan and other “War on Terror” expenditures) was 123 G$ (source: CBO).

The implication of these numbers is that it appears to be quite feasible to fund the required research and development in sustainable technology, perhaps even unilaterally. Further, investing that 70 G$ above and beyond current research funding would at least partially address the “green jobs” development that President-elect Obama has been advocating. While some portion of this money would go to academic grants, some non-trivial portion of the funding should be made available through an SBIR/STTR program. Additionally, some technology-driven small-business development funds, something like an angel investment fund for sustainable technology, would encourage green-job growth while meeting these sustainable technology R&D goals.

It also seems reasonable that such an initiative would incentivize growth in the science and engineering fields. Despite a lot of ado about the need to train more scientists and engineers, many technical fields are, and have been, producing a glut of students with advanced degrees (as Daniel Greenberg and various industry publications, such as Physics Today and C&E News, have pointed out). It also goes without saying that once a technical professional transitions from science and engineering to business or law, they do not return – the disparity in pay scales is generally insurmountable, at least in my experience. Driving the demand for technical professionals with these R&D incentives could absorb at least part of this glut, preventing the loss of the most talented individuals from the technical fields.

Above all, the goal of this funding is worthwhile: many of the challenges facing the world have solutions that are, in whole or in part, technological. While I am always skeptical of throwing money at problems, I find a world of difference between things like funding direct food aid to developing countries and funding research in drylands agriculture and permaculture in order to improve cropland yields while reversing soil degradation. The former simply spreads the wealth, while the latter very clearly creates new wealth for the entire world. Political scientists and economists argue that when these Millennium goals are met, conflicts over scarce resources in the developing world will dwindle. It seems reasonable, then, that the best investment in foreign aid and development should start here. Hopefully, President-elect Obama’s advisors will encourage him to champion this opportunity to make such an investment in sustainable technology.

Genius Grants

I make it a point to read up on each year’s MacArthur Fellows. These MacArthur “Genius Grants” are unlike Nobel Prizes in that they are more often awarded on the strength of what the recipient will accomplish in the future than on the strength of what the recipient did years ago. More importantly, I’ve found at least one Fellow every year whose work has been so inspiring to me that I’ve continued to follow it over the years. The first of these was Dr. Angela Belcher, a professor of Materials Science at MIT. I’ve also been pleased to see folks whose work I’ve admired receive the award, such as Saul Griffith, the founder of Squid Labs, and David Macaulay, the incredible illustrator of “The Way Things Work.”

This year, one of the most inspiring recipients of the MacArthur Fellowship is an agriculturalist named Will Allen. His non-profit, Growing Power, maintains an urban farm in Milwaukee, providing fresh vegetables to the residents of the distressed inner city there. Regular readers here will note that I have a strong interest in urban agriculture and small-lot permaculture, so it is especially rewarding to see the MacArthur Foundation take interest in the kind of project that Will Allen is leading.

The New York Times published a great article about a month back on Will Allen and Growing Power, and MAKE magazine has a video interview with him.

Scientific publishing and the winner’s curse

I recommend Ars Technica’s well-written summary of a recent paper in PLoS Medicine that studied the scientific publishing industry using a purely economic model. I believe that the for-profit model of most scientific journals is broken and ripe for disruption by open-access journals, such as PLoS Medicine, so I’m inclined to look at the paper from a biased viewpoint. I will note, though, that since the paper is in a PLoS journal, I can choose to look at the results more critically without having to pay through the nose to do so.

Measuring things

It’s sort of a joke, especially among industrial scientists, that physicists are OK at lots of things but not excellent at anything, and that this explains why there are so few physicists in industry doing physics. While there is some accuracy to the joke, the truth is that the one thing physicists excel at is measurement. As my graduate advisor used to point out, all of physics is counting. The trick is just to figure out the right things to count and the right way to count them. That’s the essence of measurement, and it’s not always as easy as it seems.

Everybody needs to measure stuff. And whether you’re in a traditional business, own a Web 2.0 startup, or are just an average gal or guy, the need to measure things quickly and precisely has gotten a lot more intense in the past decade. You want to understand where your business is in the Long Tail, or how “sticky” your website is, or how much your coffee habit is costing you annually. And when I say precisely, I mean precisely in a, well, precise sense. To speak precisely, precise and accurate are not the same thing. And this is the first thing to understand about measurement: a measurement is only valid when it is both sufficiently accurate and sufficiently precise.

Accuracy vs. precision

When you measure something accurately, your measurement gives you a number that is very close to the truth. You may not get the same number each time you make your measurement, but you know that it’s close to the actual value. When you measure something precisely, you’ll get close to the same result each time, but you may not be close to the actual answer. Ideally, we want our measurements to be both accurate and precise. In reality, most folks have a higher tolerance for lack of accuracy than they do for lack of precision. As long as the measurement is in the right ballpark, most people will settle for something that is off from the truth by a good bit, so long as they get consistent answers from it. If you check the measuring cups in your kitchen drawers, you will find that they are pretty precise. Fill your 1/4 cup measure up 4 times and dump it in your 1 cup measure and it will fill it up exactly. (Or at least it did on each of the three sets of measures I had in my kitchen.) Yet I have no idea – nor do I care – whether the cups are calibrated properly. Do they deliver exactly 1 cup? If you care about that kind of accuracy, you’ll probably be using a graduated cylinder, not a plastic measuring cup. For most of us, the fact that the cups are precise is more important.
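To make the distinction concrete, here’s a toy simulation (my own illustration, with made-up numbers): one “measuring cup” is biased but consistent, the other is unbiased but sloppy. The bias shows up in the mean, while the lack of precision shows up in the scatter.

```python
import random

random.seed(1)
TRUE_VOLUME = 236.6  # one US cup, in milliliters

def precise_but_inaccurate():
    # Consistently delivers ~5% too much: small scatter, large bias.
    return TRUE_VOLUME * 1.05 + random.gauss(0, 0.5)

def accurate_but_imprecise():
    # Centered on the true value, but with a lot of scatter.
    return TRUE_VOLUME + random.gauss(0, 10.0)

def summarize(name, samples):
    mean = sum(samples) / len(samples)
    spread = (sum((x - mean) ** 2 for x in samples) / len(samples)) ** 0.5
    print(f"{name}: mean = {mean:.1f} mL (bias {mean - TRUE_VOLUME:+.1f} mL), "
          f"std dev = {spread:.1f} mL")

summarize("Precise, inaccurate", [precise_but_inaccurate() for _ in range(1000)])
summarize("Accurate, imprecise", [accurate_but_imprecise() for _ in range(1000)])
```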

Does our tolerance for inaccuracy seem surprising? If you use Google Analytics to track your website stats, it shouldn’t be. Google can’t know, accurately, how many unique visitors actually visited your site. How can they? Even though they set a cookie to track your visitors, a lot of folks using Firefox will only accept cookies for that session, thus preventing Google from counting them over multiple visits. I do essentially the same thing with OmniWeb. A lot of folks using IE will occasionally flush all of their cookies as a privacy measure. Each time the Google Analytics cookie for your site gets deleted, that user looks like a new user to Google. This means your unique visitor count is artificially high, as is your percentage of new users visiting your site. But, really, it doesn’t matter. You don’t care how accurate that number is, because whether you have 570 or 450 unique visitors per day isn’t as important as the trend. Is that number going up or down? Is it higher on Saturday mornings or weekday nights? As long as the measurement is precise, those trends can be analyzed meaningfully.
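Here’s a toy example of why the trend survives a consistent overcount (the visitor numbers and overcount factor below are made up purely for illustration):

```python
# Illustrative only: if cookie churn inflates unique-visitor counts by a
# roughly constant factor, the reported numbers are wrong (inaccurate) but
# the week-over-week trend survives.
true_daily_uniques = [430, 445, 440, 460, 455, 470, 480]   # made-up "truth"
overcount_factor = 1.25                                     # assumed cookie churn

reported = [round(v * overcount_factor) for v in true_daily_uniques]

true_trend = true_daily_uniques[-1] - true_daily_uniques[0]
reported_trend = reported[-1] - reported[0]

print("reported counts:", reported)
print(f"true change over the week: {true_trend:+d}, "
      f"reported change: {reported_trend:+d}  (same direction)")
```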

Now we know what makes a measurement valid, and we understand that a large fraction of the time, we don’t need as much accuracy as we need precision. While I didn’t explicitly talk about it, it’s important to note that validity is predicated only upon sufficient accuracy and precision. Your car’s fuel gauge is neither terribly accurate nor terribly precise, but it represents a valid measurement because it gives you the data with sufficient accuracy and precision to keep you from running out of gas.

There are four other things to keep in mind about a measurement, which I’ll call the Four ‘R’s: Relevance, Range, Resolution, and Reproducibility.

Worldchanging on Walkscore

The folks at Worldchanging point out the critical flaws in Walkscore. I had a similar take on the site about a year ago, though one that was a lot less in-depth. Check out the Worldchanging article for a very insightful take on why WalkScore’s approach is outdated (terrible business model), as well as some commentary on the Second Life tool called Carbon Goggles.

My take on Carbon Goggles is something between “obvious” and “pointless.” Second Life has utterly failed to impress me in the suspension-of-disbelief department, so I’m much more likely to be moved by data on carbon impact than by something that gimmicky. I suspect that while such gimmicks do tend to be effective in getting points across to folks who are not intimately familiar with the subject at hand, the Second Life audience is hardly ill-informed about climate change.

Summertime and new responsibilities

The summer is always crazy busy here, and this summer is no exception. This year, a confluence of a project at work reaching a critical point, some added job responsibilities, and the usual summer craziness has conspired to keep me from blogging as much as I would have liked.

Recently, I’ve taken on the role of intellectual property champion for the central research division. This means that I’ll be keeping track of trends in what areas we’re inventing, what we’re filing on, and how well we’re doing in generating useful IP for the company. What’s most exciting is that I will be working to devise new, more useful metrics and measurements for the company’s IP performance. This has proved to be an interesting exercise thus far.

What strikes me most is how little anyone understands about best practices in this area, despite the fact that IP has become central to most of the American economy. I’ve been perusing a report from the General Counsel Roundtable of the Corporate Executive Board that outlines what that group considers to be best practices, and they are certainly good, but the implementation guidance is very vague. I’m left with the question of whether that vagueness is due to legitimate differences between companies in how the practices are best implemented, or mainly to the inertia of corporate culture. I expect that I’m going to learn a lot from this.