10 February, 2013

Week II: Refining an idea

http://www.nature.com/nature/journal/v406/n6799/images/4061047az-005.gif

Ray Kurzweil is a fascinating dude. In his book, The Singularity Is Near, he offers the following suggestion: conceptualize the universe as bits of information (binary 1s and 0s, or even qubits) rather than particles of energy or matter. Every bit of energy or matter (or information, henceforth used interchangeably) can be represented with a digital value. We can use information to represent and measure physical things - everything from the DNA code in cellular life to the position of an electron (but not a dead cat in a box). Information is perhaps the true building block of the universe. A bitwise representation of data - that is to say, a single digit, 0 or 1 - can be used to describe many things: the presence or absence of an instruction, a value, an object, a unit...

If we take Mr Kurzweil's suggestion, it opens a path for shifting ideas about how we use energy. In conversations about energy consumption, we're usually referring to wholesale consumption of natural resources. We burn fossil fuels and in return we get power to keep the machines running and the blogs blogging. It's a remarkable process, and one that's largely taken for granted. But it has a major drawback - much of the energy present in the fuel goes to waste.

For reference, let's discuss energy in terms of joules, the standard unit of energy (equivalently, of heat or work). A gallon of gasoline is worth about 132 million joules. It takes 4.19 joules to heat 1 mL of water by 1 degree Celsius. Moving around on a gallon of gas has an inherent drawback - it generates pollutants, and the whole process that converts gasoline into useful energy at your car's powertrain is perhaps 26% efficient at best. That means in the ideal case (which your car isn't), you're only getting 34.32 million joules out of the 132 million you paid about $5.35 for if you bought gas in Victoria today. The rest is wasted as heat and combustion by-products. It's a long way from high efficiency.
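To make the arithmetic concrete, here's a quick back-of-the-envelope sketch in Python using only the figures quoted above (132 million joules per gallon, 26% efficiency at best, about $5.35 per gallon):

    # Rough energy budget for one gallon of gasoline, using the figures quoted above.
    GALLON_JOULES = 132e6      # energy content of a gallon of gasoline (~132 million joules)
    EFFICIENCY = 0.26          # optimistic conversion efficiency of the car's powertrain
    PRICE_DOLLARS = 5.35       # price of a gallon in Victoria today

    useful_joules = GALLON_JOULES * EFFICIENCY     # energy that actually moves the car
    wasted_joules = GALLON_JOULES - useful_joules  # lost as heat and combustion by-products

    print(f"Useful energy: {useful_joules / 1e6:.2f} million joules")   # ~34.32
    print(f"Wasted energy: {wasted_joules / 1e6:.2f} million joules")   # ~97.68
    print(f"Useful joules per dollar: {useful_joules / PRICE_DOLLARS / 1e6:.1f} million")  # ~6.4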

In many cases, using information to move ideas makes more sense than using gasoline to move people. For instance, a meeting can take place over Skype or teleconference, and the vital information can be transmitted at a much lower energy cost. The number of joules spent per bit of information shared in that meeting thereby drops; if everybody drove, the meeting would be a lot less efficient. This approach has an important consequence - it leaves a surplus of energy relative to what would have been spent if everyone had met in person. That lower overhead frees the surplus to be reinvested in making the whole information process more efficient. The system becomes a positive feedback loop in which each cycle is potentially less costly than the one before.
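As a rough illustration of the joules-per-bit idea, here is a hedged comparison of a one-hour video meeting against driving to the same meeting. Every number in it - the data transferred, the network's energy per bit, the trip length, the fuel economy - is an illustrative assumption chosen for scale, not a measured figure; the point is the orders of magnitude, not the exact values.

    # Illustrative only: every constant below is an assumption chosen for scale.
    MEETING_BITS = 500e6 * 8        # assume a one-hour video call moves ~500 megabytes
    NETWORK_J_PER_BIT = 1e-6        # assume ~1 microjoule per bit moved end to end

    DRIVE_KM = 40                   # assume a 40 km round trip to attend in person
    CAR_J_PER_KM = 132e6 / 40       # assume ~40 km per gallon, i.e. ~3.3 million joules per km

    meeting_joules = MEETING_BITS * NETWORK_J_PER_BIT
    driving_joules = DRIVE_KM * CAR_J_PER_KM

    print(f"Remote meeting: {meeting_joules:,.0f} joules")      # ~4,000 joules
    print(f"Driving there:  {driving_joules:,.0f} joules")      # ~132,000,000 joules
    print(f"Driving costs roughly {driving_joules / meeting_joules:,.0f} times more energy")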

As the demand for lower-overhead meetings increases, the infrastructure necessary to transmit that information will become more efficient at an exponential rate. This tendency is described by Moore's Law, first proposed in 1965 and revised in 1975. Moore's prediction - that the number of transistors that fit on a chip will double roughly every two years - has proven incredibly accurate, and it is often quoted as a doubling of processing power every 18 to 24 months. As a result, the cost in dollars and joules per bit of transmitted data decreases at a correspondingly exponential rate, and a graph of transistor counts on a single microprocessor since 1971 shows this exponential growth clearly.
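As a sketch of what that doubling implies, the snippet below projects transistor counts forward from the Intel 4004 (roughly 2,300 transistors in 1971), assuming a clean doubling every two years; real chips only track this idealized curve approximately.

    # Project Moore's Law forward from the Intel 4004 (~2,300 transistors, 1971),
    # assuming transistor counts double every two years.
    BASE_YEAR, BASE_TRANSISTORS = 1971, 2_300
    DOUBLING_PERIOD_YEARS = 2

    def projected_transistors(year):
        doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
        return BASE_TRANSISTORS * 2 ** doublings

    for year in (1971, 1981, 1991, 2001, 2011, 2013):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors")  # 2013: ~4.8 billion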

The key advantage in treating energy as information is that the byproduct of information is more information. Consider the simplified life cycle of a gallon of gasoline: mining or drilling for bitumen or oil, several stages of purification and refinement, then pipelines, tanker delivery, and finally point-of-sale delivery to the end user. The oil is extracted from a finite source, and each stage of its life has an energy cost and unwanted byproducts, both of which grow as the source is depleted and as the machinery and infrastructure age and demand more maintenance. Thus the net energy delivered per gallon of gasoline shrinks at each transition because of the high processing overhead.

By contrast, when data is processed, refined, then transmitted, the net information output is larger at each transition. The source will not be depleted, since it can be copied non-destructively. With an ever-decreasing cost in joules per bit, data refinement opens headroom to invest more energy in search algorithms, pattern recognition, and more efficient ways to extract meaningful information from massive quantities of data. Consider the Large Hadron Collider - in my opinion, the coolest experiment EVER - which generates about 15 million gigabytes of data every year! The LHC's findings are openly published and accessible worldwide. As that information is shared, analyzed and applied, we all benefit. A tank of gas gets you a few hundred kilometers, but a think tank full of scientists gets you a few hundred new developments, ideas, technological advances, and experiments.
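To put 15 million gigabytes a year in perspective, here's a quick conversion into an average sustained data rate (taking a gigabyte as 10^9 bytes):

    # Convert ~15 million gigabytes per year into an average data rate.
    BYTES_PER_YEAR = 15e6 * 1e9             # 15 million gigabytes, with 1 GB = 10^9 bytes
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    avg_bytes_per_second = BYTES_PER_YEAR / SECONDS_PER_YEAR
    print(f"~{avg_bytes_per_second / 1e6:.0f} megabytes every second, around the clock")  # ~475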

Gallons of gas have a hard energy limit - about 132 million joules each. Information density, by contrast, keeps increasing exponentially. Bits per square centimeter of computing real estate are now in the billions, and the cost of putting them there is lower than ever. Despite these amazing advances, building information networks still requires intensive resource extraction. But even that is changing - fiber optic networks carry far more data per unit of material and energy than traditional copper wire, and they're being installed around the globe at an amazing pace. The rate at which physical resources are translated into information continues to increase exponentially, with a lower net energy cost at each development cycle.

A bitwise view of energy is an interesting paradigm because it shifts resource extraction from a destructive process to a creative one. It rewards discoveries that increase access to information and expand overall storage and transmission capacity. Although there are mathematical limits to information density and computational speed, we've yet to come close to them at a conventional scale. Ray Kurzweil's singularity theory suggests that past a certain point, our capacity to compress data will make technology and biology indistinguishable as we begin to embed technology into ourselves - an altogether fascinating view of how our treatment of information may ultimately help us transcend the physical limitations of hard matter.


Word count: 1,044
