Energy Efficiency vs. Energy Intensity – and Why it Matters
I hate energy efficiency.
Now, before you send the Twitter hordes after me, I’ll explain. When you hear “energy efficiency,” what do you think of? The uncomfortable hue of a compact fluorescent bulb? Hearing the office AC cut out promptly at 5pm? Being a little colder than you might like in January?
Don’t get me wrong, I can’t stand wasting resources—including energy. It’s just that “energy efficiency” frequently connotes sacrifice, giving something up. It’s certainly appropriate in many situations. It just isn’t the best descriptor when it comes to IT.
When Dell drafted our 2020 Legacy of Good Plan, we deliberately chose to focus on “energy intensity” for a product portfolio energy target. ‘Intensity’ is a term more common in the economist’s lexicon than the IT professional’s, but it does a better job of describing consequences and requirements: if you have to do this much work, you’re going to need that much energy.
Our goal is to reduce the energy intensity of our product portfolio by 80 percent by 2020, compared to our 2011 baseline (see our other goals). And we’re a good part of the way there. If running a set of workloads consumed 100 kilowatt-hours of energy in 2011, that same work would only take 57 kWh today. Our customers usually describe what they need to do, then ask about energy consumption. They typically don’t tell us how much energy they have, then ask how much compute they can buy (although there are always a few exceptions).
Read the entire article here: Energy Efficiency vs. Energy Intensity – and Why it Matters, via the fine folks at Dell.