I'm not a tree-hugger, but I'm conscious of the environment. I recycle, drive a 4-cylinder, try to eat local/organic, and put my trash at the curb instead of burning it. But, I never really stopped to consider my direct connection to the environment through my coding practices.

When we think about abstraction, we usually don't go further than the operating system or the computer processor. We run a high-level language function; it's converted into dozens of low-level language statements, then hundreds of lines of machine code, which the processor runs. But all of that happens only if the processor has power. Let's take a look at the levels of abstraction that extend beyond your CPU.


Coal

Coal was formed from the remains of vegetation that grew millions of years ago. The plants that formed coal captured energy from the sun through photosynthesis, and that energy was used to create the elements within plant tissue. The element most important for us is carbon, which gives coal most of its energy. No environmentally negative factors here.


Electricity

Of course, coal is not the only form of energy used to generate electricity, but it's the largest source in the US. Coal power stations have machines that convert the heat energy from the combustion of coal into mechanical energy, which then powers the electric generator. It's the byproducts and wasted heat energy from this process that cause the negative environmental effects. The process is not 100% efficient, and those inefficiencies aren't bottled up; they're sent to our air, lakes, and rivers.

On average, a ton of coal generates 2,460 kilowatt-hours of electricity. Let's make that simpler:

1 kilowatt-hour of electricity = .81 pounds of coal
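That division is worth sanity-checking. A minimal sketch in JavaScript, assuming the 2,460 kilowatt-hours-per-ton average above and 2,000 pounds per US ton:

```javascript
// Average electricity generated per ton of coal, per the figure above.
const KWH_PER_TON = 2460;
const LBS_PER_TON = 2000; // US (short) ton

// Pounds of coal burned for each kilowatt-hour generated.
const lbsCoalPerKwh = LBS_PER_TON / KWH_PER_TON;

console.log(lbsCoalPerKwh.toFixed(2)); // "0.81"
```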

The Kilowatt

We Americans aren't fluent in the metric system, but we know the kilowatt better than any other unit of measurement when it comes to electricity. Why? Because that's what we see on our electric bill: the kilowatt-hour. But, let's not forget that the kilowatt-hours we use at home don't directly translate to the kilowatt-hours generated at the power plant. A certain amount of energy is lost in transmission over the power lines.

So, how does this translate to our computer's CPU power demands? A kilowatt-hour may be enough to power your light bulb for an entire day, but a CPU is vastly more complex. A light bulb performs one operation consistently, from the time you turn it on to the time you turn it off, so calculating its energy usage is straightforward. But your computer's processor is more dynamic.


The CPU

CPUs used to maintain a steadier power consumption, but today's CPUs are more energy conscious when idle. The big question is, "how much energy is saved when it's idle, compared to when it's working?" Apparently, this is not an easy question. It depends, of course, on who you ask, what the rest of the system is doing, and which processor you have. Many forums I visited seemed to suggest that an Intel i7 would drop from 95 watts under load to 75 watts at idle.

I also had difficulty finding out exactly what idle means. Is the processor idle between each sequential instruction sent its way? Or does it require a length of time before it settles down into its idle state? Again, I couldn't reach a solid conclusion. So, for this article, I'm going to make the following assumptions:

The difference between idle and working = 20 watts = .02 kilowatts

Additional kilowatt-hours used for 1 CPU instruction = the number of hours the instruction takes * .02

I'm emphasizing the word additional, because the CPU uses power whether it's processing an instruction or not.
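As a sketch, the assumption above turns into a one-line function. Keep in mind the .02-kilowatt gap is my assumption from those forum threads, not a measured value:

```javascript
// Assumed gap between a working and an idle CPU (see assumptions above).
const ADDITIONAL_KILOWATTS = 0.02; // 20 watts

// Additional kilowatt-hours consumed by keeping the CPU busy for `hours`
// beyond its idle draw.
function additionalKwh(hours) {
  return hours * ADDITIONAL_KILOWATTS;
}

console.log(additionalKwh(1)); // 0.02 — one hour of extra work costs .02 kWh
```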

The next question: how long does a CPU take to execute one instruction? It varies, of course, so let's again use the Intel i7. Wikipedia says 82 billion instructions per second! You might be thinking: why are we at all concerned with power consumption when one measly kilowatt-hour covers 50 hours of busy CPU at our assumed .02 kilowatts, which at 82 billion instructions per second comes to 14.7 quadrillion instructions?

1 kilowatt-hour, we remember, is created by the combustion of .81 pounds of coal.

1 pound of coal gets us 18.2 quadrillion instructions. Good job, coal, you're not so bad after all. But hold on, we haven't worked our way through the remaining levels of abstraction. How much does 18.2 quadrillion instructions really buy us?
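Chaining the figures so far — 82 billion instructions per second, the assumed .02-kilowatt working gap, and .81 pounds of coal per kilowatt-hour — the quadrillions above fall out of a few multiplications:

```javascript
const INSTRUCTIONS_PER_SECOND = 82e9; // the Intel i7 figure cited above
const ADDITIONAL_KILOWATTS = 0.02;    // assumed working-vs-idle gap
const LBS_COAL_PER_KWH = 0.81;

// One "additional" kilowatt-hour keeps the CPU busy for 1 / .02 = 50 hours.
const busyHoursPerKwh = 1 / ADDITIONAL_KILOWATTS;

const instructionsPerKwh = INSTRUCTIONS_PER_SECOND * 3600 * busyHoursPerKwh;
const instructionsPerLbCoal = instructionsPerKwh / LBS_COAL_PER_KWH;

console.log(instructionsPerKwh);    // ≈ 1.476e16 — the 14.7 quadrillion above
console.log(instructionsPerLbCoal); // ≈ 1.82e16 — the 18.2 quadrillion above
```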


JavaScript

Huh, how did I get to JavaScript? I skipped machine code, assembly language, the OS, and the browser environment. I did this for the dramatic effect of showing how many CPU instructions are required for a simple JavaScript statement.

var x = 1 + 1;

I'm now thinking to myself, "how do I begin to determine how many machine code instructions that little statement took?"
I intentionally avoided anything user-interface or input related, to simplify this task.

We first need to find out what JavaScript "sits on". Until Google's V8 engine came along and compiled it to machine code, JavaScript was always interpreted, and it still is in most browsers, by a JavaScript engine. The first JavaScript engine, SpiderMonkey, was written in C (and later C++).


C++

What does it take for C++ to run a JavaScript statement such as the one above? I'd have to imagine that this particular statement is nearly one-to-one. C++ allows us to declare and initialize a variable in one line of code, even if that initialization involves arithmetic.

int a = 1 + 1;

Assembly Language

x:      dd 1            ; operand 1
y:      dd 1            ; operand 2
a:      dd 0            ; result

mov eax, [x]            ; load x into a register
add eax, [y]            ; add y to it
mov [a], eax            ; store the result in a

Is the above proper assembly language? It's a rough NASM-style x86 sketch: three data declarations and three instructions. I don't write assembly for a living, but it illustrates the number of instructions required to perform a simple operation. In this case, each assembly instruction produces one machine instruction, so the most basic JavaScript statement might translate to at least a half-dozen lines of machine code. I'm talking about a statement that involves nothing but the processor and RAM (no display, peripherals, disk drive, etc.). A more complex statement, such as document.write(), may actually consist of hundreds of machine instructions. So we can easily see how the inefficiency of JavaScript code is magnified by orders of magnitude when you look at it from a machine code standpoint.

But this still doesn't really mean anything. It's too difficult to relate additional lines of machine code directly to environmental impact! But what if we convert that into a more recognizable metric: time.

If a set of poorly optimized JavaScript statements takes 100 milliseconds to run, and the optimized version of that same code takes 50 milliseconds to run, you are keeping that processor busy for 50 milliseconds beyond what is necessary.
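To make "poorly optimized" concrete, here's a contrived, hypothetical example (not taken from any real page): the slow version does a linear array scan for every lookup, O(n * m) overall, while the fast version builds a Set once so each lookup is O(1). Same answer, far fewer CPU cycles:

```javascript
// A contrived workload: count how many of `queries` appear in `haystack`.
const haystack = Array.from({ length: 5000 }, (_, i) => i * 2);
const queries = Array.from({ length: 5000 }, (_, i) => i * 3);

// Unoptimized: scans the whole array for every query — O(n * m).
function countHitsSlow(hay, qs) {
  let hits = 0;
  for (const q of qs) {
    if (hay.indexOf(q) !== -1) hits++;
  }
  return hits;
}

// Optimized: build a Set once, then each lookup is O(1) — O(n + m).
function countHitsFast(hay, qs) {
  const set = new Set(hay);
  let hits = 0;
  for (const q of qs) {
    if (set.has(q)) hits++;
  }
  return hits;
}

console.log(countHitsSlow(haystack, queries) === countHitsFast(haystack, queries)); // true
```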

50 milliseconds = .05 seconds = .000014 hours

.000014 hours * .02 additional kilowatts drawn by the active CPU = .00000028 kilowatt-hours

So, an additional .00000028 kilowatt-hours are required by your one inefficient algorithm.

Not bad, that only means .00000022 pounds of coal. But, that's just one algorithm, in one page load, by one user.

If your website gets 10,000 page loads today, that number hops up to .0022 pounds of coal wasted. Yikes, that's getting a little scary!

Over a year, that turns right back into that original .81 pounds of coal, enough to generate 1 kilowatt-hour of electricity, to power the additional CPU cycles required by that same inefficient algorithm.
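Fittingly, the whole chain can be replayed in JavaScript itself. All the constants are the assumptions from earlier in the article; any small drift from the rounded figures above is just floating-point arithmetic being more honest than my truncation:

```javascript
const ADDITIONAL_KILOWATTS = 0.02; // assumed working-vs-idle gap
const LBS_COAL_PER_KWH = 0.81;
const PAGE_LOADS_PER_DAY = 10000;

const wastedMs = 50; // extra runtime of the unoptimized code, per page load
const wastedHours = wastedMs / 1000 / 3600;            // ≈ .000014 hours

const kwhPerLoad = wastedHours * ADDITIONAL_KILOWATTS; // ≈ .00000028 kWh
const coalPerLoad = kwhPerLoad * LBS_COAL_PER_KWH;     // ≈ .00000023 lbs

const coalPerDay = coalPerLoad * PAGE_LOADS_PER_DAY;   // ≈ .0022 lbs
const coalPerYear = coalPerDay * 365;                  // ≈ .82 lbs

console.log(coalPerYear); // roughly the .81 pounds we started with
```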

Again, that's just ONE algorithm, on one page on the web.

Now, my brain is exhausted from all this math, but multiply this by the number of pages on the web, and the number of poorly written algorithms on a page, and you've got an awful lot of coal!

Who votes that the Windows Task Manager should get another column for "Coal Burned"?
