Energy Internet and eVehicles Overview

Governments around the world are wrestling with how to prepare society for inevitable climate change. To date most attention has focused on reducing greenhouse gas emissions, but there is growing recognition that, regardless of what we do to mitigate climate change, the planet is going to be significantly warmer in the coming years, with all the attendant problems of more frequent droughts, flooding, severe storms, etc. As such, we need to invest in solutions that provide a more robust and resilient infrastructure, especially for our electrical and telecommunications systems, to withstand this environmental onslaught while at the same time reducing our carbon footprint.

Linking renewable energy with high-speed Internet using fiber to the home, combined with autonomous eVehicles and dynamic charging, where a vehicle's batteries are charged as it travels along the road, may provide a whole new "energy Internet" infrastructure for linking small distributed renewable energy sources to users, one far more robust and resilient to climate change than today's centralized command-and-control infrastructure. These new energy architectures will also significantly reduce our carbon footprint. For more details please see:

Using autonomous eVehicles for Renewable Energy Transportation and Distribution: and

Free High Speed Internet to the Home or School Integrated with solar roof top:

High level architecture of Internet Networks to survive Climate Change:

Architecture and routing protocols for Energy Internet:

How to use Green Bond Funds to underwrite costs of new network and energy infrastructure:

Wednesday, December 16, 2009

Physicist Models Humanity as "Heat Engine" Argues Difficult to Decrease CO2

[While I agree with some of Garrett's conclusions, I remain optimistic that we can decouple energy consumption from CO2 emissions; hence, I argue, energy efficiency is attempting to solve the wrong problem. I agree with him that energy consumption seems to be inextricably tied to a growing economy and there is little we can do to change that. But what we need to do is use energy that produces little or no CO2. Thanks to Jerry Sheehan for this pointer -- BSA]

Physicist Models Humanity as "Heat Engine" Argues Difficult to Decrease CO2

ScienceDaily (Nov. 24, 2009) — In a provocative new study, a University of Utah scientist argues that rising carbon dioxide emissions -- the major cause of global warming -- cannot be stabilized unless the world's economy collapses or society builds the equivalent of one new nuclear power plant each day.

"It looks unlikely that there will be any substantial near-term departure from recently observed acceleration in carbon dioxide emission rates," says the new paper by Tim Garrett, an associate professor of atmospheric sciences.


The study -- which is based on the concept that physics can be used to characterize the evolution of civilization -- indicates:

* Energy conservation or efficiency doesn't really save energy, but instead spurs economic growth and accelerated energy consumption.
* Throughout history, a simple physical "constant" -- an unchanging mathematical value -- links global energy use to the world's accumulated economic productivity, adjusted for inflation. So it isn't necessary to consider population growth and standard of living in predicting society's future energy consumption and resulting carbon dioxide emissions.
* "Stabilization of carbon dioxide emissions at current rates will require approximately 300 gigawatts of new non-carbon-dioxide-emitting power production capacity annually -- approximately one new nuclear power plant (or equivalent) per day," Garrett says. "Physically, there are no other options without killing the economy."

Getting Heat for Viewing Civilization as a "Heat Engine"

Garrett says colleagues generally support his theory, while some economists are critical. One economist, who reviewed the study, wrote: "I am afraid the author will need to study harder before he can contribute."

Garrett treats civilization like a "heat engine" that "consumes energy and does 'work' in the form of economic production, which then spurs it to consume more energy," he says.


Garrett says his study's key finding "is that accumulated economic production over the course of history has been tied to the rate of energy consumption at a global level through a constant factor."

That "constant" is 9.7 (plus or minus 0.3) milliwatts per inflation-adjusted 1990 dollar. So if you look at economic and energy production at any specific time in history, "each inflation-adjusted 1990 dollar would be supported by 9.7 milliwatts of primary energy consumption," Garrett says.

Garrett tested his theory and found this constant relationship between energy use and economic production at any given time by using United Nations statistics for global GDP (gross domestic product), U.S. Department of Energy data on global energy consumption during 1970-2005, and previous studies that estimated global economic production as long as 2,000 years ago. Then he investigated the implications for carbon dioxide emissions.

"Economists think you need population and standard of living to estimate productivity," he says. "In my model, all you need to know is how fast energy consumption is rising. The reason why is because there is this link between the economy and rates of energy consumption, and it's just a constant factor."

Garrett adds: "By finding this constant factor, the problem of [forecasting] global economic growth is dramatically simpler. There is no need to consider population growth and changes in standard of living because they are marching to the tune of the availability of energy supplies."

To Garrett, that means the acceleration of carbon dioxide emissions is unlikely to change soon because our energy use today is tied to society's past economic productivity.

"Viewed from this perspective, civilization evolves in a spontaneous feedback loop maintained only by energy consumption and incorporation of environmental matter," Garrett says. It is like a child that "grows by consuming food, and when the child grows, it is able to consume more food, which enables it to grow more."

Perhaps the most provocative implication of Garrett's theory is that conserving energy doesn't reduce energy use, but spurs economic growth and more energy use.

"Making civilization more energy efficient simply allows it to grow faster and consume more energy," says Garrett.

He says the idea that resource conservation accelerates resource consumption -- known as Jevons paradox -- was proposed in the 1865 book "The Coal Question" by William Stanley Jevons, who noted that coal prices fell and coal consumption soared after improvements in steam engine efficiency.

Garrett says often-discussed strategies for slowing carbon dioxide emissions and global warming include increased energy efficiency, reduced population growth and a switch to power sources that don't emit carbon dioxide, including nuclear, wind and solar energy and underground storage of carbon dioxide from fossil fuel burning. Another strategy is rarely mentioned: a decreased standard of living, which would occur if energy supplies ran short and the economy collapsed, he adds.

"The problem is that, in order to stabilize emissions, not even reduce them, we have to switch to non-carbonized energy sources at a rate about 2.1 percent per year. That comes out to almost one new nuclear power plant per day."

"If society invests sufficient resources into alternative and new, non-carbon energy supplies, then perhaps it can continue growing without increasing global warming," Garrett says.


ICT and wireless can eliminate 6.9 Gt of CO2

[Another study pointing to the significant impact that ICT can have on reducing CO2. But once again they don't mention the need for a proper carbon audit as per ISO 14064; without one, any claims of CO2 reduction are only hot air. Too many people still confuse energy efficiency with carbon reduction. The two are not the same. We need to decouple energy issues from carbon emissions and focus on reducing the latter -- BSA]

ICT can eliminate 5.8 Gt of CO2 by 2020 - IDC

A new IDC report, dubbed the G20 ICT Sustainability Index, has identified some 5.8 billion tons (gigatons) of CO2 that can be eliminated by 2020 with the "focused use of ICT-based solutions." The report was released last week in parallel with the United Nations COP15 meetings in Copenhagen.

As a comparison, the GSMA last month released its own projections that highlighted the potential CO2 reductions from the use of mobile technology at 1.15 Gt CO2e by 2020.


Friday, December 11, 2009

Huge jump in carbon footprint from telecom and Internet

Huge jump in carbon footprint from telecom and Internet

About 37 percent of the carbon footprint of the entire information and communication technology sector (ICT) in 2007 was due to the energy consumption of telecom infrastructure and devices, according to the Climate Group (14 percent came from data centers, and 49 percent came from PCs and peripherals). Contrast that with telecom’s carbon footprint figure in 2002 which was 28 percent of ICT’s carbon footprint.

UK’s Carbon reduction commitment legislation – the shape of things to come globally for universities and business

UK’s Carbon reduction commitment legislation – the shape of things to come globally for universities and business

The UK has passed legislation called the Carbon Reduction Commitment (CRC).

The CRC is a groundbreaking piece of legislation designed to help the UK meet its carbon reduction targets by 2020. Basically, the CRC scheme will apply to organisations that had a half-hourly metered electricity consumption greater than 6,000 MWh per year in 2008. Organisations qualifying for the CRC would have all their energy use covered by the scheme; this includes emissions from direct energy use as well as electricity purchased. Initially, it is estimated, around 5,000 organisations will qualify, including supermarkets, water companies, banks, local authorities and all central Government Departments. Qualifying organisations mostly fall below the threshold for the European Union Emissions Trading Scheme, but account for around 10% of the UK's carbon emissions.

The organisations involved will need to register or make an information disclosure by 30 September 2010. A financial penalty (£5,000 plus a per diem charge for each subsequent working day an organisation fails to submit a report) will be imposed on organisations that fail to meet the deadline.

The first year of the scheme (April 2010-2011) is called the footprint year. Companies are required to submit an audited report of their emissions during the footprint year by 29 July 2011. Again financial penalties will be imposed for failing to meet the deadline.

In the second year, (2011-2012) participants will have to purchase emissions allowances to cover their forecast emissions for 2011/12. And in 2013 auctioning of carbon allowances begins, with all the income from the auctions recycled back to participants by the means of an annual payment based on participants’ average annual emissions since the start of the scheme.

There will be a bonus or penalty according to the organisation's position in a CRC league table. The league table will be made public, thereby enhancing the transparency of companies' carbon reporting and, hopefully, shaming egregious emitters into reducing their carbon footprint.

I have gone into a bit of detail about the CRC here because it is difficult enough to find information about the scheme, and most UK businesses appear to be wholly unprepared for its implementation. The UK Department of Energy and Climate Change (I think it is interesting that the UK has a government department for climate change in the first place -- how many other governments do?) has an easy-to-follow guide to the CRC [PDF] available for download which will help.

The CRC is going to be closely watched by other countries and you can be sure it will be used as a model by many to reduce their carbon emissions.

Wednesday, December 9, 2009

Emerging standards for greenhouse gas emissions for ICT

[These two projects are very significant, as we are now starting to see some quantifiable standards for measuring claims of CO2 abatement through ICT. As you know, there are many studies claiming significant reduction of CO2 through ICT, but up to now there has been no independent process to validate these claims. Various energy efficiency schemes are probably the most egregious example of these hand-waving arguments. While energy efficiency may reduce costs, it is a very ineffective tool for reducing CO2 emissions as compared to purchasing renewable energy credits or sourcing renewable energy directly. If you are interested in the following opportunities, please contact David Wright or Tony Vetter directly as listed below. Thanks to Bill Munson from ITAC for these pointers -- BSA]


(a) Product Life Cycle GHG Costs
(b) Supply Chain GHG Costs

The World Resources Institute and the World Business Council for Sustainable
Development have developed standards on how to implement ISO 14064 for
companies and for projects, which have become widely used, particularly in
countries implementing the Kyoto Protocol.

They have now developed 2 new (draft) standards for Greenhouse Gas (GHG)
Accounting for (a) Product Life Cycle GHG Costs and (b) Supply Chain GHG
Costs. They are looking for companies to test-drive their draft standards
with a view to providing feedback on how the drafts can be updated to
provide a final standard. They are particularly interested in
sector-specific information, and the ICT sector is of great importance in
this area, because of its potential to impact GHG emissions both positively
and negatively.

Professor David Wright at the University of Ottawa is
able to work with a company on this project. A rough division of
responsibilities would be that the company would assess its GHG emissions,
and Dr Wright would assess the impact on the draft standard.

Shown below is a proposal from the International Institute for Sustainable Development, which is seeking support from ICT industry partners
for their CANARIE study. The key contact is their Project Manager, Global Connectivity, Tony Vetter, he can be reached at or 613-288-2024.

ICT network operators and equipment vendors are looking to a variety of
solutions to reduce the GHG footprint of the world's ICT infrastructure.
Efficiency in how data centres consume energy may be part of the solution;
however using renewable energy is another “zero-carbon” option. CANARIE
Inc. invited proposals to their Green IT Pilot Program for projects that
will accelerate the development of, and participation in, national and
international "zero-carbon" cyber infrastructure and network platforms.

CANARIE has awarded funding to IISD for a project to assess the business
case for moving University IT assets to remote, zero-carbon data centre
facilities. Central to the business case will be an examination of whether
Universities could qualify for tradable “carbon offsets” (credits for GHG
reductions achieved which can be sold to industries who need them), a
revenue opportunity which could help underwrite the costs associated with
relocating their IT assets.

We think this project will be of interest to CIOs of all large
organizations because moving IT assets to zero-carbon facilities has not
previously been considered for generating carbon offsets. Further, there
may be other unexpected barriers to relocation of IT assets that could be
resolved through appropriate policy interventions, including jurisdictional
barriers arising from data security policies and capital financing rules,
and challenges associated with the availability of national
telecommunications infrastructure. These will also be explored through this project.

Due to the nature of how carbon credit awarding mechanisms are evolving,
IT organizations may in the end not be able to benefit from the carbon
reductions that their IT initiatives could help realize. This is due to
the concept of “additionality” – whether a project is deemed likely to
have occurred anyway without the support of revenues generated by
selling carbon offset credits.
This project’s assessment could open the door to broader acceptance of
IT asset relocation as a carbon reduction activity that should be
supported through carbon offset financial instruments.
Revenue opportunities from carbon credit trading could accelerate the
development of national and international "zero-carbon" cyber infrastructure.

The tasks of this project will be to:
* estimate, depending on data availability, the aggregate carbon footprint of IT assets and associated data centres at three Canadian Universities;
* assess the feasibility for Universities to generate carbon offsets if their IT departments were to move location-agnostic IT assets to remote data centre facilities powered by renewable sources of energy;
* assess the feasibility of quantifying and selling these offsets in registries and carbon exchanges;
* assess the business case for University IT departments to move IT assets to remote, zero-carbon data centre facilities, with attention to the role of offset revenues if accessible to the relevant business unit;
* assess the implications of study findings for scaling similar IT asset relocation schemes for government agencies and institutions, as well as the private sector.

Anticipated insights:
* characterization of the carbon incentives or disincentives to scaling the relocation of IT assets to zero-carbon facility initiatives;
* long-term implications of University IT asset growth projections and associated carbon penalties;
* characterization of organizational boundaries encountered in carbon accounting processes for facilities expenditures, energy consumption and GHG emissions;
* characterization of jurisdictional barriers, resulting from data security policies and capital financing rules, to the migration of University, other public sector, and private sector IT infrastructure and services;
* characterization of the adequacy of national broadband infrastructure for supporting cost-effective access for remote relocation of IT infrastructure and services.

We believe that some ICT companies may be interested in helping to determine
whether relocating IT assets to zero-carbon facilities might qualify for
tradable “carbon credits” in the emerging regimes, as well as in the
identification of other barriers and policy gaps that would impede the
feasibility of such initiatives.


Thursday, December 3, 2009

Calit2 and CANARIE See Campuses as Living Labs for a Greener Future

**Universities Challenged to Develop Technology Solutions for a
Carbon-Constrained World**

Calit2 and CANARIE See Campuses as Living Labs for a Greener Future

On-Site Wind Power Provides 100% of Power to Data Center

[Forward-looking data center companies like Other World Computing
understand that data centers are quickly becoming the new "heavy
industry" of the information age. If the Gartner forecast of 650%
growth comes true, we need to find alternative zero-carbon solutions
for data centers and networks. CANARIE's recent announcement
to fund Greenstar is a good example of this
approach. The Greenstar network is a university-industry partnership
involving companies like Cisco and Ericsson to build the world's first
zero-carbon Internet, enabling the deployment of follow-the-wind/follow-the-sun
data networking. -- BSA]

On-Site Wind Power Provides 100% of Power to Data Center

We've heard of data centers that are running on green power, though
this is often achieved by buying energy credits from distant
generating facilities. But Woodstock, IL-based Other World Computing
is the first to have 100% on-site wind power to run its operations.
The 39 meter (128 foot) diameter, 500 kW turbine is expected to
generate an estimated 1,250,000 kilowatt hours (kWh) per year. This is
more than twice as much electricity as is used by all of OWC's
operations. The facility is grid-tied, and will sell the excess power
back to the local utility, as well as being able to utilize grid power
as backup during slack wind periods.
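The figures quoted above imply a plausible capacity factor for the turbine and put an upper bound on OWC's own consumption; a back-of-envelope check:

```python
# Back-of-envelope check of the OWC turbine figures quoted above.
turbine_kw = 500.0
annual_kwh = 1_250_000.0
hours_per_year = 8760.0

# Implied capacity factor, a reasonable value for onshore wind
capacity_factor = annual_kwh / (turbine_kw * hours_per_year)
print(f"Implied capacity factor: {capacity_factor:.1%}")  # ~28.5%

# "more than twice as much electricity as is used by all of OWC's
# operations" implies OWC consumes under half the turbine's output.
max_owc_kwh = annual_kwh / 2
print(f"OWC operations use under {max_owc_kwh:,.0f} kWh/year")
```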

Top data center challenges include social networks, rising energy costs

Data growth will hit 650% over next half-decade, Gartner says

Enterprise data needs will grow a staggering 650% over the next five
years, and that's just one of numerous challenges IT leaders have to
start preparing for today, analysts said as the annual Gartner Data
Center Conference kicked off in Las Vegas Tuesday morning.

Rising use of social networks, rising energy costs and a need to
understand new technologies such as virtualization and cloud computing
are among the top issues IT leaders face in the evolving data center,
Gartner analyst David Cappuccio said in an opening keynote address.

The energy cost of two racks of servers, at full density, can exceed
$105,000 a year, he said. And servers are only growing denser, with
new blades that incorporate servers, storage, switches, memory and I/O
capabilities. At today's prices, the money spent on supplying energy
to an x86 server will exceed the cost of that server within three
years, he said.
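Cappuccio's $105,000-a-year figure can be turned into an implied power draw. A sketch, assuming a $0.10/kWh commercial tariff (the keynote did not state one):

```python
# What average draw does "$105,000 a year for two racks" imply?
annual_cost_usd = 105_000.0   # two racks at full density, per the keynote
tariff_usd_per_kwh = 0.10     # assumed commercial electricity rate

annual_kwh = annual_cost_usd / tariff_usd_per_kwh   # 1,050,000 kWh/year
avg_draw_kw = annual_kwh / 8760.0                   # ~120 kW for two racks
per_rack_kw = avg_draw_kw / 2.0                     # ~60 kW per rack
print(f"~{per_rack_kw:.0f} kW continuous per rack "
      f"at ${tariff_usd_per_kwh}/kWh")
```

A sustained 60 kW per rack is only plausible for the dense blade configurations the keynote describes, which is the point of the warning.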

Monday, November 30, 2009

Green IT Conferences for research community

Green IT Conferences for research community

Thanks to Jordi Torres, Barcelona Supercomputing Center for sending a link to

New web page for Green Computing research community

A group of outstanding researchers has set up a simple new web page to make it easier for the research community to find updated information about emerging conferences in Green Computing, the next wave in computing.

The site includes a list of research conferences focused on green computing and energy-aware computer and network technologies. It has been designed to make it easier for researchers to find information about new conferences (and conference tracks) in the area. Hopefully the page will serve to improve research in this important area.

Monday, November 23, 2009

The impact of cap and trade on your web server

For more information on this item please visit my blog at or

[Doug Alder of Rackforce has put together an excellent in-depth analysis of the impact that cap and trade (with carbon at $20/tonne) will have on web and compute servers located in jurisdictions that are dependent on coal-based power. While the pending cap and trade bills in the US Congress will mitigate most of the costs for consumers, industry and institutions will not be similarly protected. The EPA estimates that cap and trade will raise the cost of electricity for these organizations by an "average" of 60%, with significantly higher prices in states dependent on coal-powered electricity. To put this in context, cap and trade will cost an organization at least an additional $65-$150 per year per server (200 W) if those servers are located in a coal-powered state or province versus one powered by renewable energy such as hydro-electricity. Considering that most businesses and universities have thousands of servers, the aggregate bill could be gigantic. Some excerpts from his excellent blog -- BSA]

Power Sources and Their Coming Importance To Your Business
Do you have your own website? If you do, it's hosted on a server. Do you know where that server is located? Do you know the carbon footprint that server has where it is hosted? Do you care? If you do, pay attention and you'll learn something.
Looking first at energy, we will see how the source of that energy matters when considering the carbon footprint of a data center.

Let's look at an example of two data centers, one in West Virginia and the other in British Columbia. Based on the data from Stats Canada, Environment Canada, and the US Department of Energy that I researched, I was able to build a spreadsheet showing the likely carbon cost for operating a server in each province and state. (Terms: gCO2eq/kWh = grams of CO2 equivalent per kilowatt-hour; mTCO2eq/MWh = metric tonnes of CO2 equivalent per megawatt-hour; PUE = Power Usage Effectiveness, a way of measuring how efficiently a data center uses its incoming power, i.e. the ratio of total power used by the data center to the power required to operate the ICT [Information Communications Technology] equipment (servers, switches, routers); 1:1 would be perfect but is basically impossible.)

Now let’s see what that could mean to your business.

Say each data center is 120,000 sq. ft. of raised floor (not at all unusual). Now allow a standard 32 sq. ft. per cabinet. That gives a theoretical maximum of 3,750 racks (120,000/32); allowing at least 20% of that space for aisles and the various components needed to run a data center, call it 3,125 usable racks. Each rack can hold a maximum of 42U worth of gear (a standard rack), but some of that will be the power distribution units of the data center and likely some of their networking gear too, so in general you will get around 36U of usable space. Assume you put 36 1U 200 W servers in those slots. That gives you 112,500 servers in those 3,125 racks. In BC, each of those servers would cost you an additional $1.06 per year. In West Virginia the extra cost would be $65.72 per server, which translates to $2,365.92 instead of $38.16 per rack per year. How will you justify that extra $2,327.76 per rack per year to your shareholders?

The calculations above, though, were theoretical. They were based on a data center with perfect utilization of energy: for every watt of power required to run the ICT equipment, the data center draws only one watt of incoming power. Sadly, that is not the case, and the average data center today has a Power Usage Effectiveness (PUE) rating of 2.5 (and many are much worse). That means they need to purchase 2.5 watts of power for every watt delivered to their customers' equipment. Now go back to the last paragraph and multiply those final numbers by 2.5. Your extra cost is now $5,819.40 per rack.
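The per-rack comparison is straightforward arithmetic from the per-server figures; a sketch that reproduces it (the per-server carbon costs are Alder's, everything else is multiplication):

```python
# Reproduce the per-rack carbon cost comparison from the per-server figures.
COST_PER_SERVER_BC = 1.06    # $/server/year, hydro-powered British Columbia
COST_PER_SERVER_WV = 65.72   # $/server/year, coal-powered West Virginia
SERVERS_PER_RACK = 36

def extra_carbon_cost_per_rack(pue):
    """Extra annual carbon cost of a WV rack vs a BC rack at a given PUE."""
    return (COST_PER_SERVER_WV - COST_PER_SERVER_BC) * SERVERS_PER_RACK * pue

ideal = extra_carbon_cost_per_rack(pue=1.0)     # perfect facility
typical = extra_carbon_cost_per_rack(pue=2.5)   # industry-average facility
print(f"PUE 1.0: ${ideal:,.2f}/rack/year; PUE 2.5: ${typical:,.2f}/rack/year")
```

At PUE 1.0 the difference is $2,327.76 per rack per year; at the industry-average PUE of 2.5 it is 2.5 times that, $5,819.40.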

If your company is a public company then, as carbon taxes and/or carbon Cap &Trade becomes legislated then you will have a fiduciary responsibility to your shareholders to seek out the option that has you paying the least amount of taxes in order to maximize your returns. If you are a private company then you still need to consider the source of what powers your servers lest your competition beats you to it and gets a substantial edge in costs over you.

Small Windpower Can Make a Difference in Remote Telecom Facilities
Small Windpower Can Make a Difference in Remote Telecom Facilities
Yesterday at 7:12pm

In the spirit of James Burke, it is always fun to follow the leads and find the connections. In this case, we start with a USA Today article, "Wind backs up Honolulu airport power." Hawaii and clean tech are among my personal interests. The crux of the story is how the Hawaii Department of Transportation (DOT) has supplemented the power consumed with 16 small 1 kW wind turbines. There is nothing remarkable about a 16 kW system; 16 kW would be fine to offset daily power use for a utility building (in this case, the backup power for the Honolulu airport). How these small turbines were mounted is what drew my attention.

The system is a state Department of Transportation pilot project and data is being gathered to determine the system's cost savings and energy output. It was installed at the end of June and cost about $100,000.

We've seen many different wind systems that take advantage of a building's real estate. But the leading rooftop edge has particularly interesting aerodynamic benefits. Building aerodynamics is a whole specialty realm of engineering, currently focused on the physical stress loads on a building's structure.

AeroVironment, the maker of the small, modular wind turbines installed at Honolulu's airport, is on to something that could significantly change the way we look at structures. AeroVironment is a revolutionary aviation company; they understand aerodynamics from a flight perspective. Yet, with their Architectural Wind Services, they are applying that knowledge to leverage "the natural acceleration in wind speed resulting from the building's aerodynamic properties. This accelerated wind speed can increase the turbines' electrical power generation by more than 50% compared to the power generation that would result from systems situated outside of the acceleration zone." Imagine what would happen if the expertise from AeroVironment were synergized with a company like Force Technologies. What could be gained by mindfully designing a building to capitalize on natural wind dynamics and use the changes the building imposes on those dynamics to recoup energy?

As a minimum today, we can see telecom buildings in remote rural areas use AeroVironment's small wind technology to cost-effectively offset power utilization. The list price for 12 units ranges between $134,000 and $180,000. In most areas of the US, at commercial electrical rates, that would be roughly a 5-year payback on the investment. Given that most telecommunications facilities have lifecycles that last decades, this is an interesting investment in energy offsets. Move this to a developing-country installation, where you have higher electricity rates, fuel costs (generators), and unpredictable power, and the attractiveness increases. Then add the utilization of space: AeroVironment's installation on the building does not interfere with other roof-mounted solar installations or pole/antenna-mounted wind systems. So this specific design can be used as a local power-producing suite, offsetting the electrical cost of the telecommunications facility while opening the door to feed-in tariffs for any excess (if there are feed-in tariffs).

Sunday, November 22, 2009

World on course for catastrophic 6° rise, reveal scientists

World on course for catastrophic 6° rise, reveal scientists


The world is now firmly on course for the worst-case scenario in terms of climate change, with average global temperatures rising by up to 6C by the end of the century, leading scientists said yesterday. ...

Friday, November 20, 2009

E.U. to Mandate 'Nearly Zero' Power Use by Buildings

[This is a very significant announcement for universities and businesses in Europe. Europe has been well ahead of the rest of the world in implementing solutions to address climate change. As such, I think they will be well positioned to be the big winners as we move to a low-carbon economy. As mentioned in the article, buildings are responsible for 36% of Europe's GHG emissions. And according to several studies, ICT represents 30-40% of the energy consumption in a typical office building. For universities, ICT may represent 50% of the electrical consumption in a typical research facility. More astounding, according to the International Energy Agency (IEA), the aggregate electrical consumption of ICT in many homes is now greater than the aggregate consumption of traditional appliances such as fridges, stoves, etc. We desperately need new solutions to address the impact of ICT in our buildings, such as using 400/60 Hz multiplexed power systems over existing copper wire, where the 400 Hz power is reserved for small-scale renewable power to drive low-power ICT equipment. Excerpts from NY Times -- BSA]

E.U. to Mandate 'Nearly Zero' Power Use by Buildings

Most significantly, the European Union directive will require that nearly all buildings, including large houses, constructed after 2020 include stark efficiency improvements or generate most of their energy from renewable sources, coming close to "nearly zero" energy use.

European countries will also be required to establish a certification system to measure buildings' energy efficiency. These certificates will be required for any new construction or buildings that are sold or rented to new tenants. Existing buildings will also have to, during any major renovation, improve their efficiency if at all feasible.

Buildings are responsible for about 36 percent of Europe's greenhouse gas emissions, and stricter efficiency requirements have been sought for the past several years as absolutely necessary for the bloc to meet its goal of cutting emissions 20 percent from 1990 levels by 2020. Other regions should take note, said Andris Piebalgs, the E.U. energy commissioner, in a statement.

"By this agreement, the E.U. is sending a strong message to the forthcoming climate negotiations in Copenhagen," Piebalgs said. "Improving the energy performance of buildings is a cost effective way of fighting against climate change and improving energy security, while also boosting the building sector and the E.U. economy as a whole."

"Gartner Says More Than 30 Percent of ICT Energy Use Is Generated by PCs and Associated Peripherals," Gartner news release, April 20, 2009.

Electricity consumption by consumer electronics exceeds that of traditional appliances in many homes

Thursday, November 19, 2009

NCAR's new data center - an embarrassment to the climate community

NCAR's new data center - an embarrassment to the climate community

The National Center for Atmospheric Research (NCAR) and its managing organization, the University Corporation for Atmospheric Research (UCAR), is building a new supercomputing center in Wyoming. The current NCAR data center in Mesa has outgrown the facility's capacity, and a new facility that can accommodate future expansion is needed. The Wyoming facility will contain some of the world's most powerful supercomputers dedicated to improving scientific understanding of climate change, severe weather, air quality, and other vital atmospheric science and geoscience topics. The center will also house a premier data storage and archival facility that holds irreplaceable historical climate records and other information.

NCAR is probably the world's premier research facility for undertaking climate modeling and research. So it is very bizarre that such an organization would undertake to build a new data center in a state where almost 100% of the electricity comes from coal-fired generating plants. What is even more outrageous is that one of the principal partners in the project, Cheyenne Light Fuel and Power, is leading a campaign to stop cap and trade.

NCAR's strategy to build a data center in Wyoming also highlights the absurdity of claims to build an energy-efficient data center with a low PUE in a LEED-qualified building. These claims are meaningless when all of the electricity is coal generated. If NCAR was genuinely concerned about the environment, a much smarter move would have been to locate the data center a few hundred kilometers west in Idaho, where almost all of the electricity is generated from hydro. Relocating to Idaho would do more for the environment than even the most stringent energy efficiency measures and LEED-qualified buildings. It would also send an important message that new jobs and business opportunities are only going to occur in those jurisdictions that provide clean, renewable energy.
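
A back-of-envelope sketch of this point: a data center's emissions scale roughly with IT load x PUE x grid carbon intensity, so the grid factor can dwarf building efficiency. The loads, PUE values, and intensities below are illustrative assumptions, not measured figures for NCAR or any utility:

```python
# Why grid carbon intensity dominates PUE: compare an efficient facility on
# coal power against a wasteful one on hydro. All inputs are assumptions.

def annual_co2_tonnes(it_load_mw, pue, grid_kg_per_kwh):
    """Annual CO2 (tonnes): IT energy x PUE x grid carbon intensity."""
    kwh_per_year = it_load_mw * 1000 * 8760  # MW -> kW, x hours per year
    return kwh_per_year * pue * grid_kg_per_kwh / 1000

# Assumed: 5 MW IT load; ~0.95 kg CO2/kWh for coal, ~0.05 for hydro.
coal_efficient = annual_co2_tonnes(5, 1.2, 0.95)  # low-PUE building, coal grid
hydro_wasteful = annual_co2_tonnes(5, 2.0, 0.05)  # high-PUE building, hydro grid
print(f"Efficient on coal: {coal_efficient:,.0f} t/yr")
print(f"Wasteful on hydro: {hydro_wasteful:,.0f} t/yr")
```

Under these assumptions even a PUE-2.0 facility on hydro emits an order of magnitude less CO2 than a PUE-1.2 facility on coal, which is the argument for siting over efficiency.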

I suspect NCAR is being seduced to locate its new data center in Wyoming because of the low price of electricity that comes from coal fired plants. But that strategy may backfire on them as Cheyenne Light Fuel and Power claims that their electricity prices will increase 73% with cap and trade.

Australian ISP goes carbon-neutral

[A great example of a forward thinking ISP. No mention of whether
they also plan to earn carbon offsets by going carbon neutral. Some
excerpts -- BSA]

Australian ISP goes carbon-neutral

While most carriers are reluctant even to set targets for reducing
their carbon footprint, Australian ISP Internode has already been
carbon-neutral for a year.

The company, which has over 170,000 subscribers Australia-wide,
sources 100% of its electricity needs from renewable energy, and has
molded its equipment upgrade purchasing decisions towards energy
efficiency and sustainability.

The company has also started to invest in its own renewable energy
infrastructure, choosing to run a number of remote sites via solar
cells. With operators forced to pay a premium for piping power to
remote areas - and to provide expensive, long-lasting battery backups
- it is becoming cost-competitive to run these sites on solar, Lindsay
said.

Becoming carbon-neutral is “not as expensive an undertaking as most
people looking at it would imagine,” Lindsay said. In South
Australia, green power costs around 20% more than traditional forms of
power, and that is the dominant cost.

The positive publicity benefits of the decision likely outweigh any
extra financial burden, he added.

“Any telecom company can do what we've done,” Lindsay said.
“It's not as big a challenge as it looks. It comes down to the
fundamental question – do the shareholders of the business care more
about the dividend this year, or about the long-term impact of people
on the planet?”

Tuesday, November 17, 2009

The impact of Cyber-infrastructure in a carbon constrained world

[With the growing power of supercomputers and data centers, people
are starting to realize that cyber-infrastructure may soon have a
significant impact on the environment because of its huge electrical
consumption and the resultant CO2 emissions if the electricity that
powers these systems comes from coal fired electrical plants. As I
mentioned in a previous blog the UK Meteorological Office new
supercomputer is one of the single biggest sources of CO2 emissions
(Scope 2) in the UK. Paradoxically this is the same computer that is
being used for climate modeling in that country. Thanks to a pointer
from Steve Goldstein we learn that even America’s spy agency –NSA,
is also running into energy issues and as such is building a huge new
data centers in Utah and Texas, of which both will probably use dirty
coal based electricity as well. There are also rumors that NCAR is
building a new cyber-infrastructure center in Wyoming (presumably
which will also use coal based electricity) which sort of undermines
its own credibility as America’s leading climate research institute.
I suspect very shortly with all the new announcements of grids and
supercomputers from OSG to Jaguar, that cyber-infrastructure
collectively in the US will be one of the top sources of CO2 emissions
as it is now in the UK. This is an unsustainable path and will come to
haunt those cyber-infrastructure organizations, particularly if
Congress passes a cap and trade bill. Cap and trade will increase the
price of electricity for institutions and businesses by an
“average” of 60% according to the EPA. But electrical prices will
be substantially more in states that are totally dependent on coal
fired electrical generation. Not only that, under the proposed cap and
trade bills any organization that emits over 25,000 tons of CO2 per
year (which includes most universities and research institutions) will
be required to purchase emission allowances or offsets if they want to
exceed their current level of emissions. It is not only traditional
power generators, cement plants or manufacturers that will be affected
by cap and trade. Most of the US higher ed and cyber-infrastructure
research facilities will be similarly affected. However there is some
good news: Cyber-infrastructure, if done right, can be a powerful tool
for reducing CO2 emissions. Larry Smarr and I recently gave a talk on
this topic at Educause which is now available per the link below –

Cyber-Infrastructure in a Carbon Constrained World

See also article in Educause Review

Slides are available on Slideshare

Weather supercomputer used to predict climate change is one of
Britain's worst polluters

The Met Office has caused a storm of controversy after it was
revealed their £30million supercomputer designed to predict climate
change is one of Britain's worst polluters. The massive machine - the
UK's most powerful computer with a whopping 15 million megabytes of
memory - was installed in the Met Office's headquarters in Exeter,
Devon. It is capable of 1,000 billion calculations every second to
feed data to 400 scientists and uses 1.2 megawatts of energy to run -
enough to power more than 1,000 homes.

New NSA data centers in Utah and Texas


"..."As strange as it may sound," he writes, "one of the most urgent
problems facing NSA is a severe shortage of electrical power." With
supercomputers measured by the acre and estimated $70 million annual
electricity bills for its headquarters, the agency has begun browning
out, which is the reason for locating its new data centers in Utah and
Texas. And as it pleads for more money to construct newer and bigger
power generators, Aid notes, Congress is balking.

"The issue is critical because at the NSA, electrical power is
political power. In its top-secret world, the coin of the realm is the

More electrical power ensures bigger data centers. Bigger data
centers, in turn, generate a need for more access to phone calls and
e-mail and, conversely, less privacy. The more data that comes in, the
more reports flow out. And the more reports that flow out, the more
political power for the agency.

Shortage of uranium may limit construction of nuclear plants

From Slashdot

"Uranium mines provide us with 40,000 tons of uranium each year. Sounds like that ought to be enough for anyone, but it comes up about 25,000 tons short of what we consume yearly in our nuclear power plants. The difference is made up by stockpiles, reprocessed fuel and re-enriched uranium — which should be completely used up by 2013. And the problem with just opening more uranium mines is that nobody really knows where to go for the next big uranium lode. Dr. Michael Dittmar has been warning us for some time about the coming shortage (PDF) and has recently uploaded a four-part comprehensive report on the future of nuclear energy and how socioeconomic change is exacerbating the effect this coming shortage will have on our power consumption. Although not quite on par with zombie apocalypse, Dr. Dittmar's final conclusions paint a dire picture, stating that options like large-scale commercial fission breeder reactors are not an option by 2013 and 'no matter how far into the future we may look, nuclear fusion as an energy source is even less probable than large-scale breeder reactors, for the accumulated knowledge on this subject is already sufficient to say that commercial fusion power will never become a reality.'"

Dr Dittmar's study:

Monday, November 2, 2009

Rethinking Cyber-infrastructure - Dan Reed on the future of Cyber-infrastructure and Green IT

[Dan Reed is well known in the academic computing and cyber-infrastructure community. Dan is Microsoft’s Corporate Vice President for Extreme Computing where he also works with Tony Hey –the founder of eScience in the UK. Previously, he was the Chancellor’s Eminent Professor at UNC Chapel Hill, as well as the Director of the Renaissance Computing Institute (RENCI) and the Chancellor’s Senior Advisor for Strategy and Innovation for UNC Chapel Hill. Dr. Reed has served as a member of the U.S. President’s Council of Advisors on Science and Technology (PCAST) and as a member of the President’s Information Technology Advisory Committee (PITAC). He has also been Director of the National Center for Supercomputing Applications (NCSA) at UIUC, where he also led National Computational Science Alliance, a fifty institution partnership devoted to creating the next generation of computational science tools. He was also one of the
principal investigators and chief architect for the NSF TeraGrid -- BSA]

Dan Reed recently gave a great presentation on the Future of Cyber-Infrastructure at a SURA meeting. You can see a copy of his presentation at

His basic thesis is that the bulk of academic computing will probably move to commercial clouds. Although there will still remain some very high-end closely coupled applications that need dedicated supercomputers, the majority of academic computing can be done with clouds. Despite the presence of grids and HPC on our campuses, most academic applications still run on small clusters in closets or on stand-alone servers. Moreover, the challenge with academic grids is building robust, high-quality middleware for distributed systems and solving the myriad political problems of sharing computation resources across different management domains. As well, the ever increasing costs of energy, space and cooling will soon force researchers to start looking for computing alternatives. Clouds are a solution to many of these problems and in many ways represent the commercialization of the original vision for grids.

Dan also ruminates about the possibility of building “follow the
sun/follow the wind” cloud architecture on his blog, which of course
is music to my ears:


**Geo-dispersion: The Other Alternative**

If it were possible to replicate data and computation across multiple, geographically distributed data centers, one could reduce or eliminate UPS costs, and the failure of a single data center would not disrupt the cloud service or unduly affect its customers. Rather, requests to the service would simply be handled by one of the service replicas at another data center, perhaps with slightly greater latency due to time of flight delays. This is, of course, more easily imagined than implemented, but its viability is assessable on both economic and technical grounds.

In this spirit, let me begin by suggesting that we may need to
rethink our definition of broadband WANs. Today, we happily talk of
deploying 10 Gb/s lambdas, and some of our fastest transcontinental
and international networks provision a small number of lambdas (i.e.,
10, 40 or 100 Gb/s). However, a single-mode optical fiber has much
higher total capacity with current dense wavelength division
multiplexing (DWDM) technology, and typical multistrand cables contain
many fibers. Thus, the cable has an aggregate bandwidth of many
terabits, even with current DWDM.
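
The aggregate-capacity arithmetic behind "many terabits" can be sketched directly; the fiber-pair and channel counts below are assumed, era-typical values rather than figures from any specific cable:

```python
# Aggregate capacity of a multistrand DWDM cable, as a simple product.
# Counts and channel rates are illustrative assumptions.

def cable_capacity_tbps(fiber_pairs, channels_per_fiber, gbps_per_channel):
    """Total cable bandwidth in Tb/s across all fiber pairs and lambdas."""
    return fiber_pairs * channels_per_fiber * gbps_per_channel / 1000

# e.g. 72 fiber pairs, 80 DWDM channels per fiber, 10 Gb/s per channel
print(cable_capacity_tbps(72, 80, 10), "Tb/s")  # 57.6 Tb/s
```

So even with 10 Gb/s lambdas, provisioning one or two of them per route uses a tiny fraction of what the installed glass could carry, which is Reed's point.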

Despite the aggregate potential bandwidth of the cables, we are
really provisioning many narrowband WANs across a single fiber.
Rarely, if ever, do we consider bonding all of those lambdas to
provision a single logical network. What might one do with terabits of
bandwidth between data centers? If one holds an indefeasible right of
use (IRU) or owns the dark fiber, one need only provision the
equipment to exploit multiple fibers for a single purpose.

Of course, exploiting this WAN bandwidth would necessitate dramatic
change in the bipartite separation of local area networks (LANs) and
WANs in cloud data centers. Melding these would also expose the full
bisection bandwidth of the cloud data center to the WAN and its
interfaces, simplifying data and workload replication and moving us
closer to true geo-dispersion and geo-resilience. There are deep
technical issues related to on-chip photonics and ROADMs
(reconfigurable optical add-drop multiplexers), among others, to make
this a reality.

In the end, these technical questions devolve to risk assessment and
economics. First, the cost of replicated, smaller data centers without
UPS must be less than that of a larger, non-replicated data center
with UPS. Second, the wide area network (WAN) bandwidth, its fusion
with data center LANs, and their cost must be included in the economic
analysis.
These are interesting technical and economic questions, and I invite
economic analyses and risk assessments. I suspect, though, that it is
time we embraced the true meaning of high-speed networking and put our
eggs in multiple baskets.

Thursday, October 15, 2009

Fossil Fuel Is the New Slavery: Morally and Economically Corrupt

[Some people may think this commentary is too strident and over the top, but I believe Robin Chase eloquently captures the urgency of doing something about climate change -- BSA]

Fossil Fuel Is the New Slavery: Morally and Economically Corrupt

A century and a half ago, fossil fuels replaced slaves as the underpriced energy source driving American economic growth. And like slavery, our deep economic dependence makes change difficult, despite the incontrovertible reality that our fossil-fueled system is profoundly wrong. America could not thrive while captured by the slave economy, nor can she thrive while in thrall to a carbon-based economy.

It required almost a hundred years and a devastating civil war to rid the US of slavery. Business interests fought to retain the morally and economically corrupt status quo. Favorable economics prompted blindness and slow response to the moral imperative for ending slavery. Favorable economics today cloud the minds of many legislators and business interests, who cling to our system of underpriced fossil fuels. Despite the best efforts of Congressmen Waxman and Markey, the climate bill out of Congress proposed 2020 goals of only 17 percent reductions in CO2 over 2005 levels and passed by the narrowest of margins. Science tells us our 2020 goals need to be 25 to 40% reductions over 1990 levels. Senators Boxer and Kerry have proposed 20%, a step in the right direction.

Ownership of another human being and reaping the benefit of their labor is repugnant. While burning fossil fuels is not as intimately observable or viscerally felt, a direct link from our actions to real individual suffering can be traced.

It takes a look back at the past several decades to appreciate the true costs of burning fossil fuel: air, water, and soil pollution, environmental degradation, wars and military entanglements to protect access to the sources, transfer of Americans' earnings to foreign economies, political empowerment of those we buy from, and climate change. Unfortunately, our individual pocketbooks don't feel the true costs of what it takes for Americans to enjoy the energy derived from a ton of coal, or a barrel of oil. And that's why we make so little effort to use it efficiently, conservatively, or wisely. Drill, baby, drill. Burn, baby, burn.

And there will be more casualties. The best estimates of the slave trade's death toll are 15 to 20 million people over its 400-year history. Failure to move to a new low-carbon energy source will result in a similar magnitude of unforgivable suffering and death. The World Health Organization says that climate change was responsible for 300,000 deaths this last year, predicting as many as 9 million excess deaths over the next 20 years alone. Almost all of these initial victims will be among Africa and Asia's poorest who have no voice and no vote with regard to what happens in the US Congress.

Slavery destroyed familial and cultural bonds as well as removed the ability to earn a living. The same goes for global warming. Long-held ways of life are disappearing rapidly as ice melts, rains don't come, and sea levels rise. The Global Humanitarian Forum recently released a report stating that 2030 will also see 310 million more people suffering adverse health consequences related to increased temperature, 20 million more people falling into poverty, and 75 million extra people displaced by climate change -- in addition to the excess deaths.

Delaying real change is intolerable. Unlike slavery, the global warming legacy will be forever irreparable and unrecoverable. New predictions indicate a good chance of a nine degree global temperature increase this century. What we eat, where we live, how we live, and indeed who lives will be changed. Forever. Again, we face an undeniable moral imperative.

This fall, Congress continues the debate over how quickly our country addresses our broken energy status quo. Just as in moral battles fought before, the correct action and way of life will ultimately prevail. Let's pass a climate bill that reduces CO2 emissions, on a timetable and in a quantity that science dictates, to avert the terrible calamity and suffering that lies ahead if we don't.

Back in 1860, our country was at a frightening and wrenching crossroads as we faced what appeared to many Americans to be an impossibly difficult decision: to accept the line drawn of no new expansion of its morally corrupt energy source and to commit to building America's future on a new economic footing. Today, we are at the same crossroads. Americans will ultimately deliver the correct moral response. The question is, can we do it in time to avert unpardonable suffering around the globe and without tearing our nation in two?

Follow Robin Chase on Twitter:

Read more at:

Iceland looks to serve the world

Since the financial crisis, Iceland has been forced to retreat from high-octane bubble living back to nature.

Not, you might think, the most obvious place to stick millions of the world's computer servers which are, for all their uses, rather less attractive.

But the country now wants exactly that - to become home to the world's computing power.

Behind all the large internet companies lurk massive and ever growing data centres chock full of servers churning away.

Google for instance is thought to have around a million of the things, but even less IT intensive operations, banks for example, need hundreds of thousands of servers to store all their data.

[Image: thermal image of a computer. Caption: Up to 60% extra energy is required to cool computer servers in the UK]

The problem is that while these computers look innocuous, they use a lot of energy.

There is of course the power you need for the servers themselves, but almost as significant is the energy used to keep them cool.

"For every watt that is spent running servers," says Dr Brad Karp, of University College London, "the best enterprises most careful about minimising the energy of cooling and maximising efficiency typically find they are spending 40-60% extra energy on just cooling them."
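Karp's 40-60% figure can be restated as a rough PUE-style ratio and a savings estimate for moving to free cooling. The fleet energy and overhead fractions below are assumptions for illustration only:

```python
# Turning "40-60% extra energy on cooling" into a PUE-style ratio and an
# estimate of the kWh a cold climate could save. Inputs are assumptions.

def pue_from_cooling_overhead(overhead_fraction):
    """Approximate PUE as (IT + cooling) / IT, ignoring other overheads."""
    return 1.0 + overhead_fraction

def cooling_kwh_saved(it_kwh, old_overhead, new_overhead):
    """Annual kWh saved if ambient cooling cuts the overhead fraction."""
    return it_kwh * (old_overhead - new_overhead)

print(pue_from_cooling_overhead(0.5))            # PUE of roughly 1.5
print(cooling_kwh_saved(1_000_000, 0.5, 0.25))   # 250,000 kWh saved per year
```

Under these assumptions, halving the cooling overhead for a 1 GWh/year server fleet saves a quarter of the fleet's own consumption again, which is the "big savings" the article points to.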

Cold rush

In Iceland, with its year round cool climate and chilly fresh water, just a fraction of this energy for cooling is needed. It means big savings.

Just outside Reykjavik, work is well advanced on the first site which its owners hope will spark a server cold rush.

In around a year - if all goes according to plan - the first companies will start leasing space in this data centre.

And if this proves successful more sites are planned.

The company expects demand to be huge because as the number of servers around the world grows, a big environmental cloud is looming - all that energy use means an increase in CO2 production.

Iceland has far more power than it can domestically use.

"The data centre industry now is on par with the airline industry as far as the carbon footprint," says Jeff Monroe, head of Verne Global - a data centre company working in Iceland.

[Pull quote: "A company would save greater than half a million metric tons of carbon annually" - Jeff Monroe, CEO of Verne Global]

"But, if you think about the growth of those two industries, the growth of the data centre industry is exponentially greater than the airline industry.

"The two are going to cross and we think that - just like the legislation that was passed in the UK concerning carbon footprint and power utilisation - it is going to be a growing concern across the industry."

So data centres are already producing as much CO2 as airlines.

While it has been below the radar until now, Verne Global thinks that with cloud computing on the rise, the carbon footprint of the digital world will soon become "unacceptably high".

And this is where Iceland's natural resources really come into their own.

Enormous savings

The volcanic forces which shaped the landscape have also gifted the country masses of geothermal power - 100% of the country's electricity is renewable and basically carbon free, much generated from water heated far below the ground.

Mr Monroe explains what would happen if a company moved its data centre to Iceland.

"The carbon savings would be enormous.

[Image: Icelandic power station. Caption: All of Iceland's electricity is renewable and basically carbon free]

"For example, if a large internet media company operating thousands and thousands of servers relocated its servers to Iceland, that company would save greater than half a million metric tons of carbon annually."

So you have the cooler climate and an abundance of green energy.

But you would not want to move your previous data centre to what is effectively the middle of nowhere unless it had some good connections.

Iceland has been busying itself laying fibre optic cables to connect the country with North America and Europe.

The cables coming in provide a capacity of more than five terabits/sec - all with server farms in mind.

Travelling down this pipe, data sited in Iceland is just 17 milliseconds from London. Sitting at home on YouTube you would never know, but even that is too slow for some.
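
As a sanity check on the 17 ms figure, the dominant term is propagation delay in fiber, which can be estimated from the route length. The ~1,700 km submarine route length and the refractive index below are assumptions, not Farice specifications:

```python
# Round-trip propagation delay over a fiber route.
# Route length and refractive index are assumed values for the sketch.

def fiber_rtt_ms(route_km, refractive_index=1.468):
    """Round-trip time in ms: light travels at c/n inside the fiber."""
    speed_km_per_ms = 299_792.458 / refractive_index / 1000
    return 2 * route_km / speed_km_per_ms

# Assuming a ~1,700 km cable path from Iceland to London:
print(f"{fiber_rtt_ms(1_700):.1f} ms round trip")
```

With these assumptions the result comes out just under 17 ms, consistent with the quoted latency; routing and equipment delays would add a little on top.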

Big industry

Gudmundur Gunnarsson, head of communications company Farice, explains some of the problems.

"There are very sensitive financial services that cannot even go outside the M25 in London", he says.

"So everything has to be within that circle, but for approximately at least 70% of other traffic, this delay is more than satisfactory."

Even where speed is not an issue however, the allure of Iceland is not for everyone.

Companies will have to overcome their natural server-hugging tendencies, and some may harbour security fears of storing their data remotely.

But having been through the financial mill Iceland hopes and believes in the next five to 10 years this will be one of its biggest industries.

And, in an irony not lost on a country brought to its knees by finance, one early customer rumoured to have signed a deal to move servers here is - well who else - one of America's biggest investment banks.

MUST VIEW: Two cool zero carbon Internet companies

[I have had the distinct pleasure of being introduced to 2 very cool zero carbon Internet companies, which to my mind exemplify the future of the Internet. They also illustrate that the future high tech business opportunities are in building solutions for a zero carbon economy. The first is a new search engine company that links search queries with social networks of people making similar searches. But what is quite unique is that the company's servers are solely powered at renewable energy sites around North America, including various solar and hydro powered data centers. The second company is Eseri, which provides a green virtual desktop based IT solution, integrated from 75 of the world's best open source software components. The system is hosted in an enterprise class data center running on low power optimized servers, using green hydro power in Montreal, Quebec. Access is through fully interactive virtual desktops from any device, anywhere you can access the Internet. Eseri combines the best of open source with clouds. It is an ideal solution for those who are keen on using open source on their desktop but are scared off by setting up Ubuntu or a similar OS and all the attendant open source applications. The other attraction of combining open source with clouds is that you don't get vendor lock-in with either the application software or the cloud services. You can move your applications and data off the service at any time. Well worth checking out -- BSA]

Wednesday, October 7, 2009

How your network can reduce your carbon footprint

Great presentation by Rod Wilson from Nortel (sorry Ciena) and Jerry Sheehan of UCSD (Cal-It2)



How the Internet will revolutionize Smart Meters and Smart Grids

[As many of you know, I have long complained about the current generation of smart meters. They are too focused on the needs of the electrical utility in terms of minimizing peak load, and have little benefit to the consumer or the environment. In most electrical systems the utility owns and controls the meter. This reminds me of the days when the telephone company owned the telephone, which greatly inhibited innovation. It wasn't until regulators forced the telcos to allow direct interconnection of devices to the network that the computer and networking revolution took off. I think we face the same challenge with smart meters. In Germany, companies can arrange to interconnect their own government-approved meter directly. And as you can see from this article, it is already creating innovation. I also see that Google's new power meter is all about bypassing the utility. Most exciting, the IETF is now undertaking development of protocols for smart grids. The IETF has always been the pre-eminent standards body because of its insistence on meritocracy rather than politics to drive standards and a can-do culture of "rough consensus and working code". The IETF and the Internet developed the essential protocols that freed us from the tyranny of the telco and their walled gardens; hopefully they will do the same thing for the utilities and smart grids. Some excerpts -- BSA]

Why Google’s PowerMeter Gadget Partnership Is a Big Power Play

With Google’s endless projects — from book search to a browser killer to Blogger — you’re probably wondering why I’m so excited about a new partnership deal for the company’s PowerMeter energy management tool. Well, here’s why: For the first time, consumers can now access PowerMeter via a gadget called the TED-5000, made by startup Energy Inc., and users don’t need to go through their utility or have a smart meter (a digital two-way electricity meter) installed to access it. In other words, Google has finally bypassed the utility with PowerMeter, which is an important step for both bringing consumer energy management products to the mainstream, and pushing utilities to more quickly embrace information technology networks and broadband.

From a posting by Richard Shockey on David Farber's IPer list

Subject: The IETF and the SmartGrid

The general internet community needs to be aware of activities in North
America that directly relate to the use of IETF protocols in the Electric
Utility industry. This activity is generally referred to as the SmartGrid.
Though the issues immediately deal with technical and policy decisions in
the US and Canada, the SmartGrid concept is gaining significant momentum in
Europe and Asia as well.

The SmartGrid has many definitions but as a practical matter it is a
substantial re-architecture of the data communications networks that
utilities use to maintain the stability and reliability of their power
grids. Many of the requirements for the SmartGrid in North America came
out of the 2003 Northeast power outage, which demonstrated a substantial
lack of investment in utility IT systems.

Of particular note is the desire by utilities to extend the reach of their
communications networks directly to the utility meter and beyond, ultimately
into the customer premises. This is generally referred to as the
Advanced Meter Interface (AMI). One of the use cases driving this
requirement is the next generation of plug-in hybrid electric vehicles. The
utilities, correctly IMHO, want to precisely control the timing of how these
vehicles are recharged so not to create a unique form of DOS attack and take
out the grid when everyone goes home at night. This is a principal use case
in 6lowpan ( ID below ). Increasingly energy flows are becoming
bi-directional creating needs for more computational intelligence and
capability at the edge.

What is going on? Why should the IETF community care?

The United States Government, as part of the Energy Independence and
Security Act of 2007 gave the National Institute of Standards and Technology
( NIST ) principal responsibility "to coordinate development of a framework
that includes protocols and model standards" for the SmartGrid.

After several meetings sponsored by NIST in recent months, NIST released a
preliminary report. Several folks from the IETF community attended those
meetings, myself included. There were multiple troubling stories about how
those meetings were organized, but I'll leave those tales to others.

One of the requests from NIST and the SmartGrid community was a list of Core
Internet protocols that NIST could refer to. Fred Baker has been working on
that task. ( below )

Others and I are deeply concerned by how this effort is developing.
There is no current consensus on what the communications architecture of the
SmartGrid is or how IP actually fits into it.

The Utility Industry does not understand the current IPv4 address exhaustion
problem, or its consequences if they want to put an IP address on every
utility meter in North America.
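Some rough arithmetic illustrates the scale of the concern. The meter count and the size of the 2009 IANA free pool used below are order-of-magnitude assumptions for illustration, not figures from the posting.

```python
# Back-of-the-envelope: public IPv4 space versus one address per
# North American utility meter. Both counts are rough assumptions.

SLASH8 = 2 ** 24                # addresses in one /8 block
meters = 150_000_000            # assumed North American meter count
free_slash8s_2009 = 24          # assumed unallocated /8s in IANA's pool, ~2009

blocks_needed = meters / SLASH8
free_pool = free_slash8s_2009 * SLASH8

print(f"Meters alone need ~{blocks_needed:.0f} /8 blocks")
print(f"That is {meters / free_pool:.0%} of the assumed remaining free pool")

# The same meters are a rounding error in IPv6 (2**128 addresses):
print(f"Share of IPv6 space: {meters / 2**128:.1e}")
```

Under these assumptions a per-meter addressing plan would by itself consume a large fraction of what was left of IPv4, which is why the IPv6 readiness of the underlying protocols matters so much.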

What is equally troubling is that many of the underlying protocols that
utilities wish to deploy are not engineered for IPv6. We have an example of
that in a recent ID.

Obviously, there are significant CyberSecurity issues in the SmartGrid
concept and NIST has produced a useful document outlining the requirements
and use cases.

How the SmartGrid interfaces with or bridges with Home Area or Enterprise
Local Area networks is unclear, to put it mildly.

I want to use this message to encourage the community to read the attached
documents and get involved in this effort as appropriate. Additional NIST
documents will be published shortly with an open public comment period.

I strongly urge members of the IETF community to participate in this comment
period and lend their expertise as necessary.

It's useful and important work.


Title : Core Protocols in the Internet Protocol Suite
Author(s) : F. Baker
Filename : draft-baker-ietf-core-03.txt
Pages : 32
Date : 2009-10-03

This note attempts to identify the core of the Internet Protocol Suite. The
target audience is NIST, in the Smart Grid discussion, as they have
requested guidance on how to profile the Internet Protocol Suite. In
general, that would mean selecting what they need from the picture presented

A URL for this Internet-Draft is:


skype: pocketpro

Tuesday, October 6, 2009

Canadian Government CTO speaks about clouds and Green IT

[Jirka Danek, the Canadian Government CTO, gave a great talk today at GTEC on Cloud computing and Green IT. A great example of the potential of clouds for government services is the web site in the USA. The Obama administration is driving the adoption of clouds for government and developing policies to help agencies segregate the data and processes that can be moved to the cloud where there are few concerns about privacy and security -- BSA]

Cloud Computing
and the
Canadian Environment

Today there is a tremendous opportunity for Canada to position itself as a world leader in Cloud Computing.

Many public and private organizations are looking at Cloud Computing as a long-term software and hardware service source and data storage solution.

Large organizations across Canada and abroad have started to embrace Cloud Computing and many are currently looking at location options adapted to their needs.
Due to its geographical characteristics, low-density population, IT expertise, quality construction standards, legislative framework (including the Privacy Act and the Personal Information Protection and Electronic Documents Act) and low-cost green energy, Canada is considered a prime location for Cloud Computing.
Major organizations in the Canadian IT industry, as well as the Government of Canada and the provinces and territories, are beginning to realize Canada’s advantage and the benefits of positioning Canada as an economical and strategic choice for Cloud Computing.

There is a tremendous opportunity for Canada to position itself as a world leader in Cloud Computing and to benefit from the economic, environmental and technological returns of this new public utility.

Cloud Computing refers to the use of Internet-based computer technology for a variety of services i.e., software, hardware, data, etc. It incorporates different concepts including:
- Software as a Service (SaaS) – a model of software deployment where an application is licensed for use as a service provided to customers on demand;
- Web 2.0 – the second generation of web development and design, that aims to facilitate communication, secure information sharing, interoperability, and collaboration on the Web;
- Infrastructure as a Service (IaaS) also known as Hardware as a Service (HaaS) – the delivery of computer infrastructure as a service; and
- Other recent technology trends which provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers.

The underlying concept dates back to 1960 when John McCarthy opined that "computation may someday be organized as a public utility". The term Cloud had already come into commercial use in the early 1990s to refer to large ATM networks and by the turn of the 21st century, the term "Cloud Computing" had started to appear. Amazon played a key role in the development of Cloud Computing by modernizing its data centres after the dot-com bubble and, having found that the new cloud architecture resulted in significant internal efficiency improvements, providing access to its systems by way of Amazon Web Services in 2002 on a utility computing basis.

In 2007, Google, IBM, and a number of universities embarked on a large scale Cloud Computing research project to build data centers that students could tap into over the Internet to program and research remotely. Cloud Computing became a hot topic by mid-2008 and numerous related events and conferences started to take place.
In June 2008, Jeffrey Hewitt, vice-president of research with Gartner Inc., concluded that Canada's abundant and low-cost hydroelectric power, cooler ambient temperatures, fibre cable networks and proximity to the United States can help it take advantage of the growing Cloud Computing trend to provide services and Web applications that are economically sound and environmentally friendly.
Hewitt also highlights that “the nurturing of a domestic Canadian server infrastructure to provide web-based resource support could provide long-term growth prospects in terms of servers and the resulting content and services, as well as could help to push this North American country well beyond its current server installed base.”

The majority of Cloud Computing infrastructure as of 2009 consists of reliable services delivered through data centers and built on servers with different levels of virtualization technologies. The services are accessible anywhere in the world, with the Cloud appearing as a single point of access for all the computing needs of consumers.

Some countries are already embarking on the Cloud Computing journey. However, large corporations and governments of all sizes state privacy protection and data security as the main concerns regarding implementation of data holding centres in Asia, Europe, Russia, Brazil and other countries that don’t have the legislative framework in place to adequately safeguard strategic information and assets.

Strategic Considerations:
Cloud Computing
The Cloud Computing trend has intensified as businesses struggling in dismal economic conditions can reduce costs by using applications online as paid services instead of buying, installing and maintaining software on their own machines.

Through Cloud Computing, customers can minimize capital expenditure as infrastructure is owned by the provider and does not need to be purchased for one-time or infrequent intensive computing tasks.

Device and location independence enables users to access systems, regardless of their location or what device they are using.
Multi-tenancy enables sharing of resources and costs among a large pool of users, allowing for:
- Centralization of infrastructure in areas with lower costs (e.g., real estate, electricity, etc.)
- Peak-load capacity increases (i.e., users need not engineer for highest possible load-levels)
- Utilisation and efficiency improvements for systems (often utilized at only 10-20%).
- On-demand allocation and de-allocation of CPU, storage and network bandwidth.
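The 10-20% utilization figure above is what makes multi-tenant consolidation pay off. A back-of-the-envelope sketch, with the fleet size and target utilization assumed purely for illustration:

```python
import math

# How many shared, virtualized servers replace a fleet of dedicated
# servers running at the 10-20% utilization mentioned above, if the
# shared pool is driven to an assumed 70% target?

fleet_size = 100           # assumed dedicated servers
avg_utilization = 0.15     # midpoint of the 10-20% range from the text
target_utilization = 0.70  # assumed safe ceiling for a shared pool

# Total useful work, measured in "fully busy server" units:
useful_work = fleet_size * avg_utilization
consolidated = math.ceil(useful_work / target_utilization)

print(f"{fleet_size} dedicated servers -> {consolidated} shared servers")
print(f"Hardware/energy reduction: ~{1 - consolidated / fleet_size:.0%}")
```

Even with generous headroom in the shared pool, the same workload fits on roughly a fifth of the machines, which is where the centralization, peak-load and efficiency benefits in the list above come from.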

Reliability improves through the use of multiple redundant sites, which makes it suitable for business continuity and disaster recovery. Scalability meets changing user demands quickly without users having to engineer for peak loads.
Security typically improves in Cloud Computing due to the centralization of data, increased security-focused resources, and because providers are able to devote resources to solving security issues that many customers cannot afford.
Sustainability in the Cloud comes about through improved resource utilisation, more efficient systems, and carbon neutrality. Nonetheless, computers and associated infrastructure are major consumers of energy.
Maintainability is another characteristic of Cloud Computing as the vendor is able to release new versions of their service automatically, relieving the client of the hassles related to installing software upgrades on their local servers with each new release.

Advantage Canada

Due to its geographical characteristics, cooler temperatures and low-density population (particularly as one moves farther north in Canada), IT expertise, quality construction standards, legislative framework (including the Privacy Act and the Personal Information Protection and Electronic Documents Act) and low-cost green energy, Canada is considered a prime location for Cloud Computing.

Canada has a reputation of being a highly desirable outsourcing location for companies from around the world because of factors such as our well-educated talent pool, multicultural population, geopolitical stability and relatively low cost of conducting business.

Canada has a definite advantage over other northern countries like Iceland, Finland, Russia, Korea and China, to become a world leader in Cloud Computing; namely on the security and privacy fronts, but also for the reliability of its utility network and electricity supplies, the quality of its traditional and specialized workforce, and for its environment track record.

Canada’s geographical position next to the United States, in addition to existing trade agreements between the North American partners, enables Canada to take advantage of a prosperous regional market and a global market worth US$3.5 trillion.

The Government of Canada spends approximately $5 billion annually on information technologies (IT) and Budget 2009 has set aside $12 billion to accelerate and expand federal investments in different infrastructure projects, including:
- $225 million over three years to develop and implement a strategy on extending broadband coverage to unserved communities to close the broadband gap in rural and remote communities.
- $1 billion over five years for the Green Infrastructure Fund to support projects such as sustainable energy;
- $500 million over the next two years for infrastructure projects in small communities;
- $750 million for leading-edge research infrastructure through the Canada Foundation for Innovation; and
- $500 million to Canada Health Infoway to encourage the greater use of electronic health records.

The Government of Canada has created a public-private partnership (P3) Crown corporation, PPP Canada Inc., to administer the Public-Private Partnerships Fund and work with the public and private sectors to encourage the further development of Canada’s P3 market.

The Government’s northern strategy aims to strengthen Canada’s sovereignty, advance economic and social development, promote environmental sustainability, and improve governance in the region.

Cloud Computing experts agree that privacy and security of personal information is emerging as the most important hurdle vendors must jump in order to attract customers. Through federal and provincial legislation, Canada has a strong legislative basis to protect personal information that is collected. Ontario Privacy Commissioner Ann Cavoukian, who is recognized as a world leader in IT privacy issues, has been quoted as saying, “User-centric private identity management in the Cloud is possible, even when users are no longer in direct possession of their personal data, or no longer in direct contact with the organization(s) that do possess it.”

Glance into the Future
As real-estate and energy costs rise and as security and privacy concerns increase, public and private sector organizations, large and small, are expected to seek ways to consolidate their data centres and desktop application services in a secure, controlled and low-cost environment.

One of the most significant current IT trends is the advent of the Netbook. This next generation laptop computer is designed to take advantage of Cloud Computing. The Netbook allows the user to access, from anywhere in the world, his or her personal infrastructure and software profile, as well as use different levels of wireless communication and take advantage of Web 2.0 tools.

Due to its low cost ($250-$400) and its portable size and weight (because it doesn’t need powerful processors and extensive memory capacity) the Netbook is gaining in popularity with corporations and their employees, as well as with private users such as students and families. Approximately 400,000 Netbooks were sold in 2007, an estimated 11.4 million were sold in 2008 and some 21.5 million units are expected to be sold in 2009. According to Information Network, Netbook sales are expected to reach 139 million units in 2013.

There are approximately 1.2 million servers in the U.S. Federal Government today and approximately 120,000 in the Government of Canada. Consolidation of these servers is inevitable and trust (i.e., the ability to effectively manage privacy protection and security) and cost efficiency will be the determining factors in consolidations.

As issues such as the carbon footprint left by large cooling systems, energy consumption and the current pressures on an old and outdated grid become more complex and expensive to address, it is expected that inexpensive green energy IT solutions such as hydroelectricity and wind power, and the benefits of northern regions, will gain pre-eminence.

Web 2.0 is still relatively new, yet Internet experts have already introduced the concept of Web 3.0. Many compare Web 3.0 to a giant database. While Web 2.0 uses the Internet to make connections between people, Web 3.0 will use the Internet to make connections with information.

As our journey through the relatively recent history of the Internet has proved, more and more information is expected to be provided through the Web, and individuals and organizations are expected to make use of an ever-increasing number of sophisticated audiovisual tools.

The rising use of the Internet increases server workloads and the need for appropriate data storage, as well as the demand for software and hardware services, mobility and global access.

Research and Development funding is expected to rise in the coming years, in both the public and private sector. Many will see Cloud Computing as a sound place to invest and prepare for future needs. For example, Microsoft announced that it would invest a record US$9 billion in R&D in 2009. They also stated that Cloud Computing would be a major field of investment.

United States President Obama is renowned for his proactive position on the benefits brought by technology and R&D. In his first couple of months in Office he introduced an aggressive agenda on the technology, energy, environment and R&D fronts.

Recent research indicates that 75% of Chief Information Officers (CIOs) expect to need and use Cloud Computing in the near future. Research also suggests that the US Government could save US$6.6B over the next three years through Cloud Computing. On the energy front alone, it is estimated that $5B in electrical power could be saved in the US through Cloud Computing.

Way Forward:
The move toward Cloud Computing is inevitable and is happening across the globe, and Canada has a definite advantage over other countries.
Canadians can benefit through prompt, coordinated and sustained action within Canada, across jurisdictions, and through private-public partnerships.
Canada also needs to show leadership on the international scene, starting with its southern neighbour, the United States, which could become one of Canada’s best allies and supporters, since Cloud Computing supports the President’s agenda and Corporate America’s next step.

There exists an opportunity for the Government of Canada to show leadership through the development of a broader Cloud Computing vision. A coordinated effort with Canada’s private sector leaders in the field would be beneficial.

The Government of Canada could also engage provincial, territorial and municipal counterparts in defining Canada’s Cloud Computing position through a comprehensive Canadian Cloud Computing Strategy.

Monday, October 5, 2009

We need network neutrality for the electrical grid in order for smart meters to take off

[As many of you know, I have long complained about the current generation of smart meters. They are too focused on the needs of the utility in terms of minimizing peak load, and have little benefit to the consumer or the environment. In most electrical systems the utility owns and controls the meter. This reminds me of the days when the telephone company owned the telephone, which inhibited innovation. It wasn't until regulators forced the telcos to allow direct interconnection of devices that the computer and networking revolution took off. I think we face the same challenge with smart meters. In Germany, companies can arrange to interconnect their own government-approved meter directly. And as you can see from this article, it is already creating innovation. Some excerpts -- BSA]

What Cisco Can Learn From A Yello Strom Smart Grid Pilot

Networking giant Cisco could learn a whole lot from its partnership with German utility Yello Strom, which I once called the coolest utility in the world, and which focuses heavily on smart grid consumer hardware and the use of the Internet for the power grid. While Cisco included Yello Strom as a partner in its smart grid announcement last month, the networking company announced more details about a 70-home pilot project using Yello Strom’s sophisticated “Sparzähler” or smart meter this morning. If Cisco aims to some day develop a Linksys-based home energy management product, the project detailed today could provide some important information for that effort.

Yello Strom is also one of the only utilities I’ve heard of that has developed and sells its own sophisticated smart meters. In July Martin Vesper, Yello Strom’s executive director, told us that the company looked at the smart meters that were already available on the market, and found only tools that focused on helping energy efficiency from a utility perspective. Not seeing anything they liked, or anything that would get consumers excited, they developed their own, which looks like it would be at home in the window of an Apple store, is built off of Microsoft Windows CE, and has both a small web server and client inside. Yello’s meter is a lot more sophisticated than other smart meters.

This unusual environment — a sophisticated, innovative smart meter, and potentially a home broadband connection — will be a very interesting environment within which Cisco can run a pilot program. It could enable Cisco to get an interesting perspective for how it could roll out any type of Linksys, broadband-based, home energy management product, which Cisco has actively been looking into.

Are Returns from Smart Grid Investments Too Weak for VCs?

The smart grid might be the Megan Fox of cleantech right now (hot), but will venture-backed smart grid startups be able to deliver the type of returns that VCs commonly like (somewhere around 10 times their investment)? Not really, suggested venture capitalist Vinod Khosla at the AlwaysOn GoingGreen conference in Sausalito, Calif., earlier this month (watch the video clip here). During a panel on the first morning of the event Khosla called smart grid investments from a VC perspective “interesting, but marginal,” at “10 to 15 percent.”

Indeed, Khosla hasn’t made any direct investments in bringing information technology to the power grid over the years, despite the fact that he played a fundamental role in the development of information technology — as co-founder of Sun Microsystems and an investor with Kleiner Perkins funding broadband firms like Juniper.


Wednesday, September 30, 2009

Understanding impact of cap and trade (Waxman-Markey) on IT departments and networks

[For further in-depth analysis on this subject, please see the upcoming Educause Review special publication on this topic and presentations by Dr. Larry Smarr and yours truly at the Educause summit in Denver in November.

There has been a lot of discussion about climate change and what IT departments should do to reduce energy consumption. Most of this is being driven by corporate social responsibility. But a few organizations are undertaking processes to understand the impact of cap and trade on the bottom line of their IT and network operations. When the real cost of cap and trade starts to be felt a lot of organizations will be looking at their IT departments as the low hanging fruit in terms of reducing energy consumption and concomitant GHG emissions.

Only marginal energy reductions are possible with traditional electricity-hogging loads such as lighting, heating and air conditioning. IT holds out the promise of much more significant savings because of its inherent flexibility and the intelligence to support "smart" solutions. Several studies indicate that ICT represents at least 30% of the energy consumption in most organizations, and as much as 50% within certain sectors such as telecoms, IT companies themselves and research universities. Hard, quantifiable data is difficult to find, but CANARIE is funding 3 research projects to do a more detailed analysis of actual electrical consumption by ICT and cyber-infrastructure for at least one sector in our society - research universities. (Preliminary results are already pretty scary!)

To date the various cap and trade systems have had little impact because either emission permits have been effectively given away, or the underlying price of carbon has had a negligible impact on the cost of electricity. This is all about to change, first with the Waxman-Markey bill (HR 2454) now before the Senate, and with the move to auction permits in the European Emissions Trading System (ETS). Even if the Waxman-Markey bill fails to pass in the Senate, there are several regional cap and trade initiatives that will be implemented by US states and Canadian provinces in the absence of federal leadership. So, no matter which way you cut it, electrical costs for IT equipment and networks are projected to jump dramatically in the next few years because of cap and trade. On top of that there may be energy shortages as utilities move to shut down old coal plants where it does not make economic sense to install carbon capture and sequestration (CCS) systems to comply with the requirements of these cap and trade systems.

The US Environmental Protection Agency (EPA) has done some extensive modeling and economic analysis of the impact of the Waxman-Markey bill. It is probably the best source for a general understanding of how various cap and trade systems around the world are going to affect IT operations. Even though some of the particulars of the bill may change in the US Senate, the broad outline of this bill as well as those of other cap and trade systems will remain essentially the same. Details of the EPA analysis can be found here:

Surprisingly, there has been little analysis by the IT industry itself of the impact of cap and trade on the sector. IT may be among the most significantly affected because of its rapid growth and because several key sectors of society, such as university research, banking, hospitals and education, depend overwhelmingly on it. Although IT overall consumes only 5-8% of all electricity, depending on which study you use, and contributes 2-3% of global CO2 emissions, IT accounts for over 30% of electrical consumption in most businesses and even more at research universities. What is of particular concern is that IT electrical consumption is doubling every 4-6 years, and the next generation broadband Internet alone could consume 5% of the world's electricity. Data centers as well are projected to consume upwards of 12% of the electricity in the US.
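The "doubling every 4-6 years" claim can be restated as an annual growth rate using the standard compound-growth identity r = 2**(1/T) - 1, which makes the pace easier to compare against efficiency gains:

```python
# Convert a doubling time T (years) into the implied annual growth rate
# and the resulting multiple over a decade.

for doubling_years in (4, 5, 6):
    annual = 2 ** (1 / doubling_years) - 1   # r = 2**(1/T) - 1
    decade = (1 + annual) ** 10              # compounded over 10 years
    print(f"double every {doubling_years}y -> {annual:.1%}/year, "
          f"x{decade:.1f} per decade")
```

Even at the slow end of the range, IT electricity demand compounds at roughly 12-19% per year, i.e. a three- to five-fold increase per decade, which is why efficiency gains alone struggle to keep up.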

There are number of important highlights in the Waxman-Markey bill that will be of significance to IT departments and networks:

1. The proposed cap reduces GHG emissions to 17% below 2005 levels by 2020 and 83% by 2050.

2. Most of the GHG reduction will be from the electricity sector and purchase of international offsets in almost equal portions.

3. GHG emissions from the electricity sector represent the largest source of domestic reductions. Although transportation accounts for 28% of emissions in the US, only about 5% of the proposed reductions will come from that sector, and gasoline prices are expected to rise by only a paltry $.13 in 2015, $.25 in 2030 and $.69 in 2050 (much to the relief of the oil industry, Canada’s tar sands and owners of SUVs).

4. The share of low or zero carbon primary energy rises substantially, to 18% of primary energy in 2020, 26% by 2030 and 38% by 2050, although this is premised on a significant increase in nuclear power and CCS. True renewables make up only 8% in 2015, 12% in 2020, and 20% in 2030.

5. Increased energy efficiency and reduced energy demand simultaneously reduces primary energy needs by 7% in 2020, 10% in 2030, and 12% in 2050.
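To get a feel for the trajectory implied by the cap in point 1 above, here is a simple straight-line interpolation between the two stated targets (17% below 2005 levels by 2020, 83% below by 2050). The intermediate-year figures are an illustration of the slope, not numbers from the bill.

```python
# Allowed emissions as a fraction of 2005 levels, linearly interpolated
# between the two Waxman-Markey targets quoted in the text.

targets = {2020: 0.17, 2050: 0.83}   # fraction *below* 2005 emissions

def cap_fraction(year):
    """Allowed emissions relative to 2005 (1.0 = 2005 levels)."""
    (y0, r0), (y1, r1) = sorted(targets.items())
    if year <= y0:
        return 1 - r0
    if year >= y1:
        return 1 - r1
    r = r0 + (r1 - r0) * (year - y0) / (y1 - y0)  # linear between targets
    return 1 - r

for year in (2020, 2030, 2040, 2050):
    print(year, f"{cap_fraction(year):.0%} of 2005 emissions allowed")
```

On this straight-line reading, allowed emissions fall by roughly 22 percentage points per decade after 2020, a pace that makes the later assumptions about nuclear, CCS and offsets carry a great deal of weight.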

As you can imagine there are many uncertainties and controversial assumptions that affect the economic impacts of H.R. 2454 and many other cap and trade bills. Briefly these are some of them:

(a) The degree to which new nuclear power and CCS are technically and politically feasible. HR 2454 assumes a dramatic increase in nuclear power and deployment of CCS. If either fails to materialize then the GHG reduction targets will not be met. The assumption of growth in nuclear power is particularly suspect, as any new nuclear plants in the foreseeable future will first be needed to replace the many aging systems now at the end of their operating life.

(b) The availability of international offset projects. Given the controversy that already exists over international offsets, many question the assumption that this volume of offsets can be purchased, particularly when every other country with a cap and trade system will be pursuing the same market.

(c) The amount of GHG emissions reductions achieved by the energy efficiency provisions. In the IT sector in particular growth of IT products and services may simply outweigh any gains made in efficiency.

Although the impact of HR 2454 on consumer electrical costs will be minimal, it's a different story for business and industrial users. The EPA estimates that the "average" price of electricity will increase by 66% for commercial users. But there will be huge regional variances in these prices, depending upon the amount of electricity that is produced from coal without CCS. In regions largely dependent on coal-generated electricity, the cost increase will be almost entirely driven by the market price of carbon.

If your electricity is mostly generated by coal, which includes most of the mid-west in the USA and western Canada, then a rough rule of thumb is that 1000 g of CO2 is produced for every kilowatt-hour of electricity, which gives an easy one-to-one conversion from annual kilowatt-hours consumed to metric tons of CO2. A typical research university has a 40 MW average load, which translates into about 350,000 MWh of annual consumption. This would result in 350,000 mTCO2e. If carbon trades at $25/ton, then the increased cost to the institution will be in excess of $8 million per year.
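The university example can be worked through explicitly, including the higher carbon-price scenarios mentioned elsewhere in this post. The inputs mirror the text: a coal-heavy grid at roughly 1000 g CO2 per kWh, a 40 MW average load, and carbon at $25, $100 or $350 per metric ton.

```python
# Worked version of the rule-of-thumb calculation in the paragraph above.

avg_load_mw = 40          # typical research university average load
hours_per_year = 8760
co2_kg_per_kwh = 1.0      # ~1000 g CO2/kWh for coal generation

mwh_per_year = avg_load_mw * hours_per_year       # annual consumption in MWh
# 1 MWh at 1 kg CO2/kWh is exactly 1 metric ton of CO2:
tonnes_co2 = mwh_per_year * 1000 * co2_kg_per_kwh / 1000

for price in (25, 100, 350):  # $/ton scenarios discussed in this post
    print(f"${price}/ton -> ${tonnes_co2 * price / 1e6:.1f}M per year")
```

At $25/ton the result matches the "in excess of $8 million per year" figure in the text; at the EPA's high-end scenario the exposure grows by more than an order of magnitude, which is the real planning risk.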

However, if many of the assumptions in Waxman-Markey fail to come to pass, particularly the availability of international offsets, then the cost of carbon could jump dramatically. (To protect against this, the US Senate is proposing a "collar" to limit variability in the price of carbon.) The EPA analysis has various projections for carbon, and depending on the scenario the cost could go up to as much as $350 per ton if the objective of 17% GHG reductions by 2020 and 83% by 2050 is to be achieved. The Nicholas Stern report in the UK suggests that carbon must trade at $100 a ton to achieve meaningful GHG reductions.

One of the main concerns about the Waxman-Markey bill is that it is too little and too late. More and more evidence points to much more rapid warming of the planet than even the most pessimistic computer models have forecast. Although we had a wet and cool summer in eastern North America, average global sea temperatures set a new record high this year. The latest study from the UK Meteorological Office, which incorporates CO2 feedback cycles for the first time, suggests that the US could warm by 13-18F and the Arctic by 27F by 2060. The bottom line is that Waxman-Markey is just a starting point for probably much more stringent GHG reduction policies. The IT sector needs to prepare for this worst-case eventuality. If nothing else, it should be part of any disaster planning scenario. This will be the mother of all disaster planning scenarios, because unlike other natural disasters that might affect IT operations it will be long term, if not effectively permanent.

However, there is some good news for the ICT sector. One of the requirements of Waxman-Markey (Title I, Subtitle A, Sec. 101) is that retail electricity providers meet a minimum share of sales with electricity savings and qualifying renewable generation, funded through the purchase of offsets or other credits. Nominal targets begin at 6% in 2012 and rise to 20% by 2020. The ICT sector is probably best qualified to take advantage of these requirements by adopting follow-the-wind/follow-the-sun architectures and relocating, as much as possible, computers and databases to renewable energy sources. The key to taking advantage of these opportunities is to start planning now. Several papers from MIT and Rutgers indicate that savings of up to 45% in electrical costs are possible with such a strategy. These savings will become more significant with the advent of cap and trade.
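A follow-the-wind/follow-the-sun strategy ultimately reduces to a scheduling decision: place work at whichever site has the cheapest carbon-adjusted electricity right now. A minimal sketch of that decision follows; the site names, energy prices and grid intensities are all invented for illustration.

```python
# Pick the data-centre site with the lowest carbon-adjusted electricity
# cost. All site data below is hypothetical.

sites = {
    # name: (energy cost $/MWh, grid intensity tCO2/MWh) -- assumed values
    "hydro-site":  (45, 0.01),
    "coal-site":   (35, 1.00),
    "wind-site":   (40, 0.05),
}

def effective_cost(energy_cost, intensity, carbon_price):
    """Energy cost plus the cap-and-trade carbon cost, in $/MWh."""
    return energy_cost + intensity * carbon_price

def best_site(carbon_price):
    """Site with the lowest carbon-adjusted cost at a given $/ton price."""
    return min(sites, key=lambda s: effective_cost(*sites[s], carbon_price))

print("carbon free:", best_site(0))    # coal wins on raw energy price
print("$25/ton:    ", best_site(25))   # low-carbon sites take over
```

The point of the sketch is that once carbon carries a price, the ranking of sites flips without any change to the workload itself, which is the economic mechanism behind relocating computation to renewable sources.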

-- BSA]

Blog Archive