This blog is about using ICTs to develop climate change preparedness solutions built around Energy Internet and autonomous eVehicles
Energy Internet and eVehicles Overview
Governments around the world are wrestling with the challenge of how to prepare society for inevitable climate change. To date most people have focused on how to reduce greenhouse gas emissions, but there is growing recognition that, regardless of what we do to mitigate climate change, the planet is going to be significantly warmer in the coming years, with all the attendant problems of more frequent droughts, flooding, severe storms, etc. As such we need to invest in solutions that provide a more robust and resilient infrastructure to withstand this environmental onslaught, especially for our electrical and telecommunications systems, and at the same time reduce our carbon footprint.
Using autonomous eVehicles for Renewable Energy Transportation and Distribution: http://goo.gl/bXO6x and http://goo.gl/UDz37
Free High Speed Internet to the Home or School integrated with solar rooftop: http://goo.gl/wGjVG
High level architecture of Internet Networks to survive Climate Change: https://goo.gl/24SiUP
Architecture and routing protocols for Energy Internet: http://goo.gl/niWy1g
How to use Green Bond Funds to underwrite costs of new network and energy infrastructure: https://goo.gl/74Bptd
Monday, August 31, 2009
Why IT professionals will become Chief Green Officers
Weather supercomputer used to predict climate change is one of Britain's worst polluters
http://www.dailymail.co.uk/sciencetech/article-1209430/Weather-supercomputer-used-predict-climate-change-Britains-worst-polluters.html
Gartner Says More Than 30 Percent of ICT Energy Use is Generated by PCs and Associated Peripherals
PCs and associated peripherals contribute approximately 31 percent of worldwide information and communication technology (ICT) energy use…
http://www.gartner.com/it/page.jsp?id=941912
Why IT Pros Will Become Chief Green Officers
http://www.greenercomputing.com
The next big corporate "C"-level job will be the Chief Green Officer (CGO). And if IT staff plays their cards right, they'll walk right into that high-paying, high-visibility, high-payoff job. Here's why.
Greening an enterprise requires far more than a background in energy, engineering, or the environment. It's all about data, and the people who know best how to manage that data will become CGOs.
These facts aren't lost on the big IT vendors. Cisco and others are rushing to release hardware and software for greening the enterprise, and at the center of it all are the IT staff who will be buying…
Monday, August 24, 2009
Will Vint Cerf revolutionize the smart grid in the same way he revolutionized the Information Highway?
http://googlepublicpolicy.blogspot.com/2009/08/where-smart-grid-meets-internet.html
Where the smart grid meets the Internet
Posted by Vint Cerf, Chief Internet Evangelist
The term "smart grid" means many things to many people. At the most basic level, the smart grid is defining smarter ways to deliver and use energy -- but did you know that the smart grid is also defining new ways to generate and exchange energy information?
Building information technology into the electricity grid will revolutionize the way our homes and businesses use energy. The first step will be to develop open protocols and standards to allow smart grid devices and systems to communicate with one another. That's why Google and other stakeholders are participating in a working group coordinated by the National Institute of Standards and Technology (NIST) to develop interoperability standards for a nationwide smart grid.
When people talk about networks for exchanging information -- particularly among millions of end users -- the first thing that often comes to mind is the Internet. So it makes sense to take the successful processes used to create Internet standards and apply them to this new energy information network.
Google, for example, believes in the wisdom of crowds (we've used that wisdom to enhance our products and we continue to get feedback on future products via Google Labs and Google Code Labs). And we've found that a good way to harness the wisdom of crowds is to create open standards to solve network issues. Some of the key principles to developing truly open standards include open and free access to:
• Process. The customers of the smart grid information network are energy producers and consumers, hardware and software developers and energy regulators. Collaborate, and make sure all parties are represented during the standards discussion.
• Drafts. There are a lot of people with networking expertise who are not directly involved with smart grid; make it easy for them to participate, for example, by hosting meetings online and posting documents that are universally accessible for review.
• Comments. Allow comments resulting from current standards drafts to influence future drafts.
• Final standards. If people can't access the standard, they can't implement the standard!
• Standards unencumbered by patents. If implementers need to worry about licenses to practice the standard, it is not really a completely open standard.
The smart grid is essentially a nascent energy Internet. Thanks to the open protocols and standards on which it was built, the Internet has grown into a thriving ecosystem, delivering innovative products and services to billions of users worldwide. Applying the same principles of openness to the development of standards for our nation's electric grid would create a smarter platform for products and services, helping consumers conserve energy and save money.
Another excellent paper on low carbon Internet architectures
The paper describes a framework for Internet services to take advantage of data centers that pay different (and possibly hourly varying) electricity prices, data centers located in different time zones, and data centers located near sources of green energy.
He is now finishing up a new paper in which he and his colleagues demonstrate how the (carbon-intensive) energy consumption of Internet services can be capped at low cost, to limit their carbon footprints.
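The capping idea lends itself to a simple illustration: meter the carbon-intensive ("brown") energy a service consumes and defer deferrable work once a budget is exhausted, letting it resume when green power is available. The sketch below is my own toy rendering of that idea, not the paper's actual mechanism; all names and numbers are assumptions.

```python
# Toy sketch of capping the "brown" (carbon-intensive) energy used by a
# service: work runs freely on green power, but draws from a fixed brown
# budget otherwise and is deferred once that budget is exhausted.
# All names and numbers here are illustrative assumptions.

class CarbonCappedService:
    def __init__(self, brown_budget_kwh):
        self.brown_budget_kwh = brown_budget_kwh  # cap for the accounting period
        self.brown_used_kwh = 0.0

    def try_run(self, job_kwh, green_power_available):
        """Run a job now if it fits under the cap; otherwise signal a deferral."""
        if green_power_available:
            return True                      # green energy is not counted against the cap
        if self.brown_used_kwh + job_kwh <= self.brown_budget_kwh:
            self.brown_used_kwh += job_kwh   # charge the job to the brown budget
            return True
        return False                         # over budget: defer until green power returns

service = CarbonCappedService(brown_budget_kwh=100.0)
for hour, (job_kwh, green) in enumerate([(30, False), (50, False), (40, False), (40, True)]):
    ran = service.try_run(job_kwh, green)
    print(f"hour {hour}: job of {job_kwh} kWh {'ran' if ran else 'deferred'}")
```

The hard part, which is what the paper addresses, is enforcing such a cap at low cost, i.e. deferring or relocating work without degrading the service.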
Also, yours truly and my colleagues at CRC, Inocybe and i2CAT will have a paper on the low carbon Internet published in the Journal of Lightwave Technology. And in partnership with Larry Smarr, Jerry Sheehan and Tom DeFanti at Calit2, we are writing a paper for a special edition of EDUCAUSE Review on this topic, coming this fall.
--BSA]
How Amazon Kindle eBook addresses climate change through de-materialization
http://earth2tech.com/2009/08/19/why-the-kindle-is-good-for-the-planet/
According to a fascinating report from the Cleantech Group, called The Environmental Impact of Amazon’s Kindle, one e-Book device on average can displace the buying of about 22.5 physical books per year, and thus deliver an estimated savings of 168 kg of CO2 per year.
As Emma Ritch, author of the report, puts it:
Multiplied by millions of units and increased sales of e-books, e-readers will have a staggering impact on improving the sustainability and environmental impact of one of the world’s most polluting industries: the publishing of books, newspapers and magazines.
The report takes a look at the effect of the book and magazine publishing industries on both trees and carbon emissions: the U.S. book and magazine sectors accounted for the harvesting of 125 million trees in 2008, and an average book has a carbon footprint of 7.46 kilograms of CO2 over its lifetime. A book’s carbon footprint also can double if you drive to the store and buy it, versus having it shipped in the mail. So in a similar way to how downloading digital music and listening to it on your computer has a much better carbon footprint than driving to the store and purchasing a CD, the savings for e-Books are about both dematerialization and eliminating the need for transportation.
If a Kindle user uses the device to its full storage capacity, Ritch says it can “prevent the emission of nearly 11,185 kg of carbon dioxide equivalent,” and for the Kindle DX that jumps to a savings of 26,098 kg of carbon emissions. But a more average user, who probably won’t use the full storage capacity, will buy about three e-books per Kindle per month, and the report predicts that the average consumer would displace closer to 168 kg of CO2 per year.
Considering all of the projected e-Book devices sold between 2009 and 2012 in the U.S. (and taking into account that e-Books don’t often replace books in a 1-to-1 ratio), the report says that e-Books could save 9.9 billion kg of CO2 from being emitted.
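A quick arithmetic check shows the report's headline figures are internally consistent: 22.5 displaced books at the report's 7.46 kg lifetime footprint per book works out to roughly the 168 kg of CO2 per year quoted above.

```python
# Sanity check of the Cleantech Group figures quoted above.
co2_per_book_kg = 7.46        # report's average lifetime footprint of one printed book
books_displaced_per_year = 22.5

annual_savings_kg = co2_per_book_kg * books_displaced_per_year
print(f"{annual_savings_kg:.1f} kg CO2 per year")  # -> 167.9, i.e. ~168 kg
```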
Ontario Government launches study to build low carbon data centers in Northern Ontario
Developing Green Technology In Thunder Bay
August 17, 2009
McGuinty Government Supports Local Economy And Innovation
NEWS
New low carbon data centres that provide alternative, less expensive and greener data storage facilities are being studied in Thunder Bay.
Rapidly growing global demands on the data management sector have resulted in increased energy consumption to operate and manage these large amounts of data, as well as to maintain proper climatic conditions at data storage facilities.
With support from the Northern Ontario Heritage Fund Corporation (NOHFC), the Northwestern Ontario Innovation Centre will explore opportunities to develop a data centre industry in Northern communities that use green technology. A feasibility study is expected to be completed by October 2009.
QUOTE
“This exciting project will not only research the economic development potential of establishing low carbon data centres in Northern Ontario, it will also investigate opportunities for transferring the excess energy to other public facilities. That means everyone involved could experience reduced operating costs.”
- Michael Gravelle, Minister of Northern Development, Mines and Forestry, and Chair of the NOHFC
QUICK FACTS
• The NOHFC is providing $25,000 to the Northwestern Ontario Innovation Centre to conduct the study.
• The study will examine feasible energy options, effective ways to reduce the industry’s carbon footprint, and the potential economic benefits such as job creation and revenue generation.
LEARN MORE
• Northern Ontario Heritage Fund Corporation Programs
• Growth Plan for Northern Ontario
Anne-Marie Flanagan, Minister’s Office, 416-327-0655
Michel Lavoie, Communications Branch, 705-564-7125
ontario.ca/north-news
More on how data centers can save millions with follow the wind/follow the sun software
http://earth2tech.com/2009/08/19/how-data-centers-can-follow-energy-prices-to-save-millions/
How Data Centers Can Follow Energy Prices to Save Millions
Companies that own numerous data centers across the globe could save millions of dollars a year in electricity costs if they dynamically shifted computing power to wherever and whenever energy prices are cheapest. At least that’s according to a study out this week from the Massachusetts Institute of Technology and Carnegie Mellon (hat tip Ars Technica).
In other words, companies that have lots of data centers can take advantage of cheap bandwidth, smart software and fluctuating hourly energy prices to shift computing power to a data center in a location where it’s an off-peak time of the day and energy prices are low. Commonly that’s in the middle of the night, which is why industry-watchers like Rich Miller, editor of Data Center Knowledge, call the process “following-the-moon.”
For data centers that are more “energy proportional” — using energy efficiently across a range of activity levels, from idle to peak load, as I explained on GigaOM Pro (subscription required) — and don’t have any constraint on bandwidth use and speed, the savings could be as high as 13-30 percent.
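In its simplest form, "following the moon" is just a cost comparison: send deferrable load to the site where the current price of power, net of a penalty for hauling the data there, is lowest. A minimal sketch, with made-up sites, prices and penalties (the study's actual model is much richer):

```python
# "Follow-the-moon" dispatch sketch: route deferrable load to the data
# center where electricity is currently cheapest (typically where it is
# the middle of the night), net of a penalty for moving the data there.
# All sites, prices and penalties below are made-up illustrations.

def pick_site(load_kwh, sites, current_prices_per_kwh, transfer_penalty):
    """Return the site with the lowest total cost for this load."""
    def total_cost(site):
        return load_kwh * current_prices_per_kwh[site] + transfer_penalty[site]
    return min(sites, key=total_cost)

sites = ["virginia", "oregon", "dublin"]
prices = {"virginia": 0.09, "oregon": 0.04, "dublin": 0.12}   # $/kWh right now
penalty = {"virginia": 0.0, "oregon": 2.0, "dublin": 5.0}     # $ to move the work

best = pick_site(load_kwh=500, sites=sites, current_prices_per_kwh=prices,
                 transfer_penalty=penalty)
print(best)  # -> "oregon": cheap off-peak power outweighs the transfer cost
```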
The study is interesting because while some companies with massive distributed data centers are starting to employ these tactics (data center software maker Cassatt, for example, sells a product that dynamically shifts loads to find the cheapest energy prices), this is still a relatively new concept. It’s particularly interesting for companies that offer cloud computing services, selling scalable on-demand computing as a service, since they could use their massive networks to create significant savings and pass those savings on to their customers.
Given that many cloud computing providers are already shifting computing loads to different locations to provide fast delivery and on-demand bandwidth, the researchers suggest that adding an energy price policy wouldn’t be that difficult. And as longtime IT energy researcher Jonathan Koomey found in one of three reports released this week, cloud computing companies are already leading the charge in being smarter about energy use.
An alternative, which Stacey on GigaOM has written about, is helping data centers shift computing loads to tap into renewable power. But unfortunately for the time being, until clean power drops in price, that’s not going to save you a whole lot of money.
How downloading music can help fight climate change
http://greeninc.blogs.nytimes.com/2009/08/17/the-carbon-case-for-downloading-music/
A new study has found that downloading music is substantially better from an emissions perspective than buying compact discs.
The study, which was funded by both Microsoft and Intel and authored by two academics at Carnegie Mellon University and a third affiliated with Stanford University, found that buying an album digitally reduces carbon dioxide emissions by 40 to 80 percent relative to a “best-case” CD-purchasing scenario.
This “best-case” CD scenario involves a customer buying a CD online and having it delivered via a light-duty truck; the more carbon-intensive options examined by the study are express air shipment of the CD, and the customer visiting a store to buy the CD.
The advantage for digital comes largely because CDs must be manufactured, packaged and transported over long distances.
EU Gov't Study: People Won't Pay For Content; New Business Models Needed from the wow dept
Just as more and more European countries are trying to ban or block sites like The Pirate Bay, it seems like a few more politicians should take the time to read the new EU study on digital competitiveness (found via P2P Blog). In it, the authors study the question of paid content and "pirated" content, and find that an awful lot of people have absolutely no interest in paying for content, no matter what -- and that the entertainment industry is exaggerating the impact of things like file sharing, since so few people would actually pay for the content in the first place (even if it weren't available for free). Rather than blaming "piracy," the report properly notes that it's a shift in technology (from atoms to bits) that has created the business model problems today:
De-materialisation of creative content distribution is shaking up the business models of the creative industries, with both potential opportunities and potential losses and bringing new players into the media industries' landscape.
[snip]
Monday, August 17, 2009
Green IT solutions: energy efficiency versus purchasing renewable power?
Of course, the big problem with RECs, or renewable energy resale, is cost. Renewable power can be substantially more expensive than dirty power, so purchasing RECs is really an indicator of an IT organization’s true commitment to becoming green. An alternative approach is to acquire your own renewable power by building windmills on site or relocating IT equipment to renewable energy sites. For example, Ecotricity (www.ecotricity.co.uk) in the UK has a novel business plan in which it will build a windmill at a data center at its own cost, in return for a long-term contract for the power delivered from the windmill. Other organizations such as MIT and the University of Massachusetts are looking to relocate their computing and cyber-infrastructure equipment to remote data centers powered by local hydroelectric dams and/or windmills.
The challenge with this scenario is the reliability of power: how can you build a reliable IT infrastructure on intermittent power sources such as windmills, solar panels, etc.? This is where clouds, grids and virtualization, connected by high speed networks, can play a critical role. Computing and data jobs can quickly be moved from site to site depending on the availability of local renewable power. This is the essence of CANARIE’s Green IT program. And, as illustrated by the recent paper from researchers at MIT and CMU, this approach can save up to 40% on energy costs alone.
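A toy sketch of this follow-the-wind/follow-the-sun scheduling: relocatable jobs are placed on whichever connected site currently has spare renewable generation, falling back to grid power only when none does. Site names and megawatt figures are invented for illustration; CANARIE's actual program is of course far more involved.

```python
# Toy "follow the wind / follow the sun" scheduler: place each relocatable
# job on a site that currently has spare renewable generation, falling
# back to grid power only when no green capacity is left.
# Site names and megawatt figures are invented for illustration.

def place_jobs(jobs_mw, site_green_headroom_mw):
    """Greedy placement of job loads (in MW) onto renewable headroom."""
    placements = []
    headroom = dict(site_green_headroom_mw)   # work on a copy
    for job in jobs_mw:
        green_sites = [s for s, mw in headroom.items() if mw >= job]
        if green_sites:
            site = max(green_sites, key=lambda s: headroom[s])  # most surplus first
            headroom[site] -= job
            placements.append((job, site))
        else:
            placements.append((job, "grid-powered fallback site"))
    return placements

# Example: wind is blowing hard at one site, solar output is modest at another.
headroom = {"thunder-bay-wind": 6.0, "ottawa-solar": 1.5}
for job, site in place_jobs([2.0, 2.0, 2.0, 1.0], headroom):
    print(f"{job} MW job -> {site}")
```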
Even for consumers, using renewable power can be a much simpler way of reducing their footprint than buying energy efficient devices. According to the IEA, consumer IT devices use more power in aggregate than all traditional appliances combined in many homes. A lot of this power consumption is taken up by “vampire” loads when devices are in standby mode, such as wall chargers, TVs, set top boxes, etc. A simple solution would be for the consumer to install a small solar panel and/or wind turbine and power all these vampire loads with a “multiplexed power” system using the existing household copper wiring. A vampire power network could use something like a 400 Hz power frequency, which would be filtered out by traditional AC devices but could easily power appropriately modified AC/DC converters, chargers, switching power supplies, etc. – BSA]
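Some rough arithmetic shows why vampire loads are an attractive target: a handful of always-on devices can draw tens of watts continuously, which adds up to hundreds of kilowatt-hours a year. The wattages and capacity factor below are illustrative assumptions, not measurements.

```python
# Back-of-envelope estimate of household "vampire" (standby) load and the
# solar capacity needed to offset it. All figures are rough assumptions.

standby_watts = {                 # typical always-on draws (assumed)
    "set-top box": 15,
    "TV": 5,
    "wall chargers (several)": 6,
    "game console": 10,
    "modem/router": 8,
}

total_w = sum(standby_watts.values())
annual_kwh = total_w * 24 * 365 / 1000   # continuous draw, all year
panel_w = total_w / 0.20                 # assume ~20% average capacity factor

print(f"standby load: {total_w} W -> {annual_kwh:.0f} kWh/year")
print(f"offsetting panel (assumed 20% capacity factor): ~{panel_w:.0f} W")
```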
[IMHO the following pieces contain some misleading claims about energy efficiency]
Google policy blog: The vast potential of energy efficiency
http://googlepublicpolicy.blogspot.com/2009/08/vast-potential-of-energy-efficiency.html
International Science Grid This Week
How Green is my Grid
http://www.isgtw.org/?pid=1001940
Energy-Aware Internet Routing scheme
http://hardware.slashdot.org/story/09/08/17/1413233/An-Electricity-Cost-Aware-Internet-Routing-Scheme
http://www.technologyreview.com/business/23248/
An Internet-routing algorithm that tracks electricity price fluctuations could save data-hungry companies such as Google, Microsoft, and Amazon millions of dollars each year in electricity costs. A study from researchers at MIT, Carnegie Mellon University, and the networking company Akamai suggests that such Internet businesses could reduce their energy use by as much as 40 percent by rerouting data to locations where electricity prices are lowest on a particular day.
Modern datacenters gobble up huge amounts of electricity and usage is increasing at a rapid pace. Energy consumption has accelerated as applications move from desktop computers to the Internet and as information gets transferred from ordinary computers to distributed "cloud" computing services. For the world's biggest information-technology firms, this means spending upwards of $30 million on electricity every year, by modest estimates.
Asfandyar Qureshi, a PhD student at MIT, first outlined the idea of a smart routing algorithm that would track electricity prices to reduce costs in a paper presented in October 2008. This year, Qureshi and colleagues approached researchers at Akamai to obtain the real-world routing data needed to test the idea. Akamai's distributed servers cache information on behalf of many large Web sites across the US and abroad, and process some 275 billion requests per day; while the company does not require many large datacenters itself, its traffic data provides a way to model the demand placed on large Internet companies.
The team then devised a routing scheme designed to take advantage of daily and hourly fluctuations in electricity costs across the country. The resulting algorithm weighs up the physical distance needed to route information--because it's more expensive to move data further--against the likely cost savings from reduced energy use. …The team found that, in the best scenario--one in which energy use is proportional to computing--a company could slash its energy consumption by 40 percent. "The results were pretty surprising," Maggs says.
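The tradeoff described above reduces to a single comparison per request: rerouting to a distant replica pays off only when the electricity saving exceeds the distance-related cost of carrying the traffic there. A simplified sketch, with coefficients and prices that are illustrative rather than taken from the paper:

```python
# Simplified version of the tradeoff in the energy-aware routing study:
# serve a request from the replica whose energy cost plus a distance
# penalty is lowest. Coefficients and prices are illustrative only.

ENERGY_KWH_PER_REQUEST = 0.002   # assumed energy to serve one request
DISTANCE_COST_PER_KM = 1e-8      # assumed $ penalty per km of network path

def cheapest_replica(replicas):
    """replicas: list of (name, price_per_kwh, distance_km) tuples."""
    def cost(replica):
        name, price, dist = replica
        return ENERGY_KWH_PER_REQUEST * price + DISTANCE_COST_PER_KM * dist
    return min(replicas, key=cost)[0]

replicas = [
    ("nearby, pricey power", 0.12, 100),
    ("far away, cheap power", 0.05, 3000),
]
print(cheapest_replica(replicas))
# -> "far away, cheap power": at these weights the price gap outweighs
#    the longer network path, so rerouting is worthwhile.
```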
Spiraling energy consumption has become a major concern for the world's largest Web companies; a report published by McKinsey & Company and the Uptime Institute in July 2008 estimates that datacenter energy usage will quadruple during the next decade in the absence of efforts to improve efficiency.
Koomey suggests that spiraling energy costs could encourage some companies to consider radical steps such as rerouting data: "Electricity use is a big enough component of data-center costs that this just might work."
Wednesday, August 5, 2009
Texas Data Centers to be powered by wind
http://www.datacenterknowledge.com/archives/2009/07/20/wind-powered-data-center-planned/
A Texas startup plans to build a data center powered by energy from huge “wind farms” in the Texas panhandle and the Gulf of Mexico. Baryonyx Corp. has been awarded three wind energy leases for 8,000 acres onshore in Dallam County, Texas and another 38,000 acres in the Gulf of Mexico, the company said. Baryonyx has also acquired 8 acres of land in Stratford, Texas for its data center project.
Baryonyx was formed in May to build data center projects powered by renewable energy resources. Its primary focus will be wind energy, but the company is also developing plans to eventually use hydrogen fuel cells and solar power to support its facilities when wind generation ebbs due to weather conditions.
The project is the most ambitious effort yet to harness wind power to provide electricity for data centers. Green House Data has built a 10,000 square foot facility in Cheyenne, Wyoming that runs primarily on wind energy, while Microsoft has demonstrated wind-powered containers packed with servers.
100 Turbines for Data Center
Baryonyx plans to build a 28,000 square foot data center in Stratford, which will be powered by 100 wind turbines built on the adjacent land that will generate up to 150 megawatts of power. Each of the turbines will be able to generate up to 3.3 megawatts of power. Capacity not needed by the data center will be sold to local utilities. Baryonyx said it will take about 3 years to reach the operational phase for the wind-powered data center.
Leases Support Texas Schools
Once the wind farms are built and producing energy, they will pay royalties to the state’s Permanent School Fund…
----
From 2009 to 2014, projected rises in anthropogenic influences and solar irradiance will increase global surface temperature by 0.15 ± 0.03 °C, at a rate 50% greater than predicted by the IPCC.
So conclude Judith Lean, of the US Naval Research Laboratory, and David Rind, of NASA’s Goddard Institute for Space Studies in a new Geophysical Research Letters study, “How Will Earth’s Surface Temperature Change in Future Decades?”
http://www.agu.org/journals/pip/gl/2009GL038932-pip.pdf
Also a major new study, “Impacts of climate change from 2000 to 2050 on wildfire activity and carbonaceous aerosol concentrations in the western United States” finds a staggering increase in “wildfire activity and carbonaceous aerosol concentrations in the western United States” by mid-century under a moderate warming scenario:
We show that increases in temperature cause annual mean area burned in the western United States to increase by 54% by the 2050s relative to the present-day … with the forests of the Pacific Northwest and Rocky Mountains experiencing the greatest increases of 78% and 175% respectively. Increased area burned results in near doubling of wildfire carbonaceous aerosol emissions by mid-century.
http://ulmo.ucmerced.edu/pdffiles/08JGR_Spracklenetal_submitted.pdf