Energy Internet and eVehicles Overview
Governments around the world are wrestling with the challenge of how to prepare society for inevitable climate change. To date most people have focused on how to reduce greenhouse gas emissions, but there is now growing recognition that, regardless of what we do to mitigate climate change, the planet is going to be significantly warmer in the coming years, with all the attendant problems of more frequent droughts, flooding, severe storms, etc. As such we need to invest in solutions that provide a more robust and resilient infrastructure to withstand this environmental onslaught, especially for our electrical and telecommunications systems.
Linking renewable energy with high speed Internet using fiber to the home, combined with eVehicles and dynamic charging, where a vehicle's batteries are charged as it travels along the road, may provide a whole new "energy Internet": an infrastructure for linking small distributed renewable energy sources to users that is far more robust and resilient in the face of climate change than today's centralized command and control infrastructure. For more details please see:
Free High Speed Internet to the Home or School Integrated with solar roof top: http://goo.gl/wGjVG
High level architecture of Internet Networks to survive Climate Change: http://goo.gl/juWdH
Architecture and routing protocols for Energy Internet http://goo.gl/niWy1g
Wednesday, July 28, 2010
Carbon and Computers in Australia - Full Report [PDF - 1.42MB]
Thursday, July 22, 2010
Unfortunately most of this funding is currently focused on energy research and carbon sequestration. Few yet recognize the importance of IT in reducing GHG emissions. The only exception is Quebec, with its recent announcement of $60 million for Green IT research.
I am working closely with various groups around the world, such as CAL-IT2 at UCSD, PROMPT inc., ClimateCHECK, CSA and others, to help educate the administrators of these funds on the importance of funding IT research, cyber-infrastructure and networks. More importantly, researchers and cyber-infrastructure providers need to understand that any application for funding must go through a much more rigorous analysis of the benefits of the research in reducing GHG emissions. Simple hand-waving exercises on energy efficiency, as is often the case with traditional research proposals, will not be sufficient. Understanding how to genuinely reduce carbon, GHG protocols and the standards process will be essential if a researcher or research institution hopes to tap into these funds.
For more details on how to receive funding from these programs please see my NYSERnet presentation:
[Excerpts from Andy Revkin article in the NY Times -- BSA]
Filling the Global Energy Research Gap
By ANDREW C. REVKIN
Earlier this week, the International Energy Agency released a batch of new findings and reports as its contribution to the Obama administration’s “Clean Energy Ministerial” meeting in Washington. In any case, a more important analysis was the agency’s fresh look at trends in government support for research, development and demonstration of low-carbon energy technologies and ways for countries to collaborate to accelerate energy innovation.
The report describes how India, despite its poverty, has moved ahead with an initiative for raising money for energy research that the United States — thanks to a lack of leadership, congressional polarization and fear of anything remotely resembling a tax — has so far been unable to do: India has created a National Clean Energy Fund for research and innovation financed by a levy of $1.10 (U.S.) per metric ton of mined or imported coal. It’s a very modest fee that has created hundreds of millions of dollars to stimulate Indian research and testing of promising technologies.
I think that, particularly with presidential leadership, there could be more than 60 Democrats and Republicans in the Senate who could get behind the case for fueling an American energy quest this way, or with a directed 2-cent-per-gallon nudge to the gasoline tax, which alone would triple our research budget compared to the pre-stimulus level.
Here’s an excerpt and link to the full report:
The IEA’s Energy Technology Perspectives 2008 called for a clean energy revolution to address global energy security, energy access and environmental challenges. The recently released Energy Technology Perspectives 2010 confirms that the transition to a low-carbon economy has begun. The past decade has seen an investment surge in clean energy technologies as governments made bold commitments to fund LCETs.
The 2008-09 green stimulus spending announcements were welcome increases in public RD&D, but it seems they are largely one-time commitments. Further, some governments are backing away from their stimulus spending announcements, and industry is reducing its investments. This is particularly worrisome as clean energy technologies continue to cost more, on an unsubsidized basis, than traditional fossil-based technologies, and it is unlikely that a global price on CO2 will be settled in the near future. A great deal more must be done to bridge the gap between the estimated $10 billion in annual pre-stimulus spending and the $40 billion to $90 billion needed to achieve sustainable energy goals.
As I have long argued, the low-hanging fruit for organizational change at universities is cyber-infrastructure, networks and clouds. Computers, networks and HPC systems constitute a significant portion of the energy consumption at many universities. We have the solutions in hand to eliminate the huge carbon footprint of cyber-infrastructure and to enhance the quality of science that can be done with such facilities. Universities and R&E networks, to my mind, should be at the forefront of organizational change to deal with the challenges of climate change. For more thinking along this line please see the following paper in Educause Review that I wrote with Larry Smarr, Tom DeFanti and Jerry Sheehan, “Cyber-infrastructure in a Carbon Constrained World”
Organizing Teaching and Research to Address the Grand Challenges of Sustainable Development
National Research Council
Committee on Stabilization Targets for Atmospheric Greenhouse Gas Concentrations
Climate Change and Higher Education
“Cyber-infrastructure in a Carbon Constrained World”
More on revenue opportunities for R&E and open access networks - building next generation "5G" wireless network http://bit.ly/dck1kR
New revenue opportunities for R&E networks in helping universities reduce their energy costs http://bit.ly/dqvN70
Cloud helps universities reduce costs by 74% - more clouds reduce energy costs http://bit.ly/c5mT58
Cloud computing breakthru! CENIC & PNWGP have connected 10G lightpaths to Amazon compute & storage, OOI CI early user http://bit.ly/aG0a06
IEEE Greenhouse Gas standards for 5G networks and Green ICT http://bit.ly/bqYNyN
CO2 emissions from US datacenters greater than all CO2 emissions from Netherlands or Argentina http://bit.ly/cW6jEY
Amazon joins Top500 supercomputer list with its Cluster Compute service ... http://bit.ly/99zipE
What A Price on Carbon Would Cost University Data Center Operators http://bit.ly/9AOZzH
Moving beyond cyber-infrastructure - greening and moving HPC into the cloud http://bit.ly/bNGrXy
Industry and universities must prepare for next Y2K - "CO2K" http://bit.ly/9UMpMo
OECD recommends that basic research in ICT should be supported through carbon offset mechanisms http://bit.ly/a8VhNk
Enabling Innovation with next generation wireless 5G Internet + clouds - technical details http://bit.ly/c3iZsZ
85% of research computing can be done using clouds http://bit.ly/cC1eQ7
The Rise of Research-driven Cloud Computing http://bit.ly/bA9YjL
More on building a 5G wireless mobile R&E green network http://bit.ly/a5zQFL
Tuesday, July 20, 2010
Pike Research has released a report, called Green Telecoms Networks, which looks at green telecoms initiatives worldwide - the opportunities, technology requirements and environmental impact.
The report focuses on the direct impact of green technologies and practices on telecoms networks and reaches the headline conclusion that green telecom network infrastructure investments will be worth $122bn by 2014, representing over 46% of telecoms capital expenditure worldwide. Of that, 63% of the investments will be for mobile networks.
The Asia Pacific region is expected to lead the capex spending by 2014, followed by Europe. Global emissions reductions by then (compared with doing nothing) are estimated at 24%, with a 46% reduction from mobile networks.
Mobile networks, base stations and switching centres will be a focus since they can consume 70%-80% of an operator’s network energy usage. While renewable energy solutions continue to face ROI issues because of their initial cost, their implementation becomes more attractive as business cases move to a total cost of ownership (TCO) model. Pike Research predicts that renewable energy will power 4.5% of the world’s mobile base stations by 2014, up from just 0.11% in 2010. The figure will be higher, at 8%, in developing countries.
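The ROI-versus-TCO point can be illustrated with a toy comparison. All figures below are hypothetical, not from the Pike Research report: a renewable power system for a base station typically loses on upfront cost but can win once fuel and logistics are counted over the equipment's lifetime.

```python
# Toy TCO comparison for powering a remote base station (illustrative
# figures only -- not from the Pike Research report).
def tco(capex: float, annual_opex: float, years: int) -> float:
    """Total cost of ownership over the period, ignoring discounting."""
    return capex + annual_opex * years

diesel = tco(capex=20_000, annual_opex=12_000, years=10)  # cheap upfront, costly fuel
solar = tco(capex=60_000, annual_opex=2_000, years=10)    # costly upfront, cheap to run

print(diesel, solar)  # 140000 80000: solar loses on upfront cost but wins on TCO
```

On a short payback view the diesel option looks better, which is exactly why the shift to TCO-based business cases matters for renewable deployments.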
Fixed networks have declining subscriber numbers, and hence costs, so are less of a concern. Nevertheless, emissions reductions of 15% are still expected by 2014 from technology improvements at the component/board level.
Pike Research also points out that there are lots of opportunities for both fixed and mobile network operators to reduce emissions from data centres, both in the design of the facilities and in the IT itself, through server consolidation and virtualisation, for example.
Monday, July 19, 2010
CLOUD HELPS UNIVERSITIES CUT COSTS BY 74 PER CENT
Queensland University of Technology (QUT) in Australia was among the first institutions in the region to provide enterprise software to the university system via the cloud. According to officials at the institution, this has allowed all of the universities that took part in the program to extend software beyond in-house machines in a scalable fashion while cutting costs across the board. Since universities typically need to invest in everything required to keep IT running for students, including software licenses, hardware (servers and backup) and a large staff to maintain these systems, making the switch to the cloud is a worthwhile investment of time and effort.
Queensland University of Technology in Australia leveraged cloud computing to provide enterprise software to more than 140 universities in Asia Pacific. Glenn Stewart, Professor of Information Systems, explained how the university dramatically reduced costs while enjoying greater assurance and scalability.
Stewart heads the SAP University Competence Centre (UCC), which provides, on a non-profit basis, an SAP suite of business software to over 800 academics and 42,000 students from 140 universities in Asia Pacific and Japan.
If an individual university were to run the software without the help of the UCC, it would need to invest in hardware, as well as recruit and train specialised staff. “Servers and backup facilities could easily bring the start-up cost to A$200,000 (US$173,000). Replacement and recurring staff cost would be another A$150,000 (US$130,000) per year,” said Stewart. This would be a major obstacle for any university looking to use the enterprise software to support teaching.
The introduction of the UCC in 2000 allowed universities to pay A$30,000 (US$26,000) for the use of the software on five clients. Now, by migrating the services into a private cloud, each university pays A$7,800 (US$6,760) for the same package, a reduction of more than 74 per cent.
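The quoted reduction checks out arithmetically, using the per-university figures from the article:

```python
# Checking the quoted reduction: A$30,000 under the original UCC model
# versus A$7,800 in the private cloud (figures from the article).
old_cost = 30_000
new_cost = 7_800
reduction = (old_cost - new_cost) / old_cost
print(f"{reduction:.0%}")  # 74%
```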
“Virtualisation and cloud computing has enabled QUT to host the needs of many universities. Individual institutions do not need to buy hardware, hire and train people, and manage all that. There has been significant cost savings for all institutions involved,” observed Dr Robert LoBue, Vice President, Global University Alliances, SAP.
QUT started to move its services into the private cloud last October (2009). Today, it has finalised 80 per cent of its migration, and expects to complete 95 per cent by the end of this year.
The decision to use cloud technology was straightforward, said Stewart. “In 2005, we started to deliver our services using virtualisation. We would have needed over A$1 million (US$866,900) of hardware, but that helped cut it down by half. Still, it did not provide the scalability we desired. Cloud computing halted our capital expenditure and moved that into operational expenditure. We are now able to provide services on demand, and provide the lowest cost of service to the universities we serve.”
The key benefit of putting services on to the cloud is the ability to scale, according to LoBue. “At the end of 2008, there were 44 universities in the programme. Slightly over a year later, we have extended services to 140 education institutions,” he added.
Open Source Energy Savings
Any large company that wants to save energy by turning computers on and off automatically should consider Condor. With commercial solutions that do the same costing hundreds of thousands of dollars at large installations, Condor is clearly worth a look.
Condor is a hybrid example of high-quality, community-built software. Since the project started in 1988 at the University of Wisconsin, the code for Condor has been viewable under an odd proprietary license. This did not slow the use and community improvement of the software, which has become robust over the years and has been deployed on millions of computers. Condor supports all the operating systems a typical company or research institution would have, and it is rock solid in terms of stability and function for its intended purpose: carving up work and sending it out to any number of computers for processing.
Now, following the path of so many other open source projects, companies including Red Hat and Cycle Computing are transforming Condor into a product. Condor allows large numbers of computers, whether servers, desktops or engineering workstations, to be used as a massive high-performance or high-throughput computing facility.
"Condor enables open and cost-effective high throughput computing to environments scaling up to 30,000 processors," says Jason Stowe, CEO of Cycle Computing, which offers support and management tools for Condor.
Condor's expansion toward power management is just one example of the way that the functional footprint of open source is rapidly expanding. Cycle Computing combines Condor and Hadoop, which allows file systems to be provisioned by farms of computers, to create cloud-like capabilities from internal resources. By adding cloud servers to the mix, the size of the computing environment can expand and contract as needed.
Paul Cormier, president of products and technologies at Red Hat, is working on combining a large collection of open source projects into a cloud provisioning and management suite. "The move to cloud computing as the next generation architecture has only been possible by integrating many of these open source projects, such as Condor," says Cormier. "It is only natural that the software for creating and managing these virtual environments come from the world of open source as well."
Greening the grid: Purdue turns server pool into power management hub
This post starts as a throwback to the utility and grid computing applications that used to dominate headlines.
The high-performance grid in question is Purdue University’s DiaGrid, which aggregates the idle compute power of 28,000 processors at the university and on campuses in Indiana, Kentucky and Wisconsin. What initially started as a project mainly focused on effective resource utilization has, over time, become a potential method for harvesting energy savings across the connected systems, says John Campbell, associate vice president at Purdue’s Rosen Center for Advanced Computing. What makes this possible is the Condor and CycleServer management tools from Cycle Computing.
The directive is pretty simple at the university, which is trying to eke every available dollar out of the workstations and academic computers across the high-performance computing cluster, which are typically idle between midnight and 7 a.m. “Either join Condor or turn off your machine at night to conserve power,” Campbell says. Eventually, they won’t have to make that decision: the software will automate the shutdown of idle machines.
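Purdue's actual automation isn't described in detail here, but the per-machine decision it implies can be sketched as a simple policy. The window, idle threshold and function names below are assumptions for illustration, not Purdue's real settings:

```python
from datetime import time

# Hypothetical per-machine policy: overnight, an idle machine should either
# be running Condor jobs or be powered down to save energy.
OVERNIGHT_START, OVERNIGHT_END = time(0, 0), time(7, 0)
IDLE_THRESHOLD_MIN = 30  # minutes of inactivity before a machine counts as idle

def action(now: time, idle_minutes: int, condor_enabled: bool) -> str:
    overnight = OVERNIGHT_START <= now < OVERNIGHT_END
    if not overnight or idle_minutes < IDLE_THRESHOLD_MIN:
        return "stay-up"
    return "run-condor-jobs" if condor_enabled else "power-down"

print(action(time(2, 0), idle_minutes=120, condor_enabled=False))  # power-down
```

In practice this kind of logic lives in the management layer (Condor/CycleServer), which evaluates machine state and issues the power-down rather than each workstation deciding for itself.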
To get a sense of the impact that DiaGrid has had on the Purdue IT budget, Campbell notes that if the university was forced to replace the computing cycles that the grid coordinates, it would have to spend roughly $3 million in new hardware, not to mention all the power required to run those new systems. The power discussion has grown louder in the past two years, Campbell says, as the campus seeks to get the most utilization out of every watt of power consumed. That conversation has inspired other universities to join the DiaGrid project.
Jason Stowe, CEO and founder of Cycle Computing, says many of the company’s clients — which include the likes of JP Morgan Chase, Lockheed Martin, Eli Lilly and Pfizer — are looking at how high-performance technical computing clusters can play a role in managing power costs. While the power savings might not be enormous, grid utility applications can play a key role in making sure a company’s existing power draw is used as effectively as possible.
“Rather than buying new machines and more data center space, these technologies can help them make better use of what they have and let them power down nodes that are no longer in use,” Stowe says.
Thursday, July 15, 2010
I thought you might be interested in the email below from IEEE Standards Association. If you think it is merited, please share it with your audience regarding GHG standards and renewable power. I've added some more information about the standards portal and ClimateCHECK's work with green data centre GHG standardization.
The IEEE P1595 Greenhouse Gas (GHG) Standard project will use a new online standards development platform developed by ClimateCHECK, in association with the Greenhouse Gas Management Institute (GHGMI), the world's largest community of GHG experts. The online standards platform provides global accessibility for experts to collaborate in authoring documents online, with the functionality to track edits, comments and balloting, as well as task management and reporting. The accessibility of the online tools, coupled with purpose-built standards development functionality, is designed to enable greater productivity of experts with the objective of saving time and costs while maintaining high quality. The process governance functionality, which is flexible enough to incorporate standards templates and procedures from different standards initiatives, provides transparency and additional credibility to the work of the standards developers.
IEEE GHG standards are relevant to 5G Networks and Green IT because the standards will be useful for data centre design, engineers and CFOs in the business case, and will provide the methodologies to support green power claims. ClimateCHECK will use the standards portal as part of benchmarking and performance work with McGill University and the UCSD supercomputing centre to create GHG metrics. Using ClimateCHECK's online collaborative solution, subject matter experts and stakeholders can more effectively design green standards and standards-based quantitative green metrics to help transition from qualitative green claims. These tools and this approach will benefit all the stakeholders in the business decision-making process and provide competitive advantage to green product vendors.
From: IEEE Standards Association [mailto:email@example.com]
Sent: July-12-10 10:02 AM
Subject: Call for Participation for P1595(TM) Working Group
IEEE CALLS FOR PARTICIPATION TO DEVELOP STANDARDS FOR QUANTIFYING GHG EMISSIONS FROM SMALL HYDRO AND WIND POWER PROJECTS, AND GRID BASELINE CONDITIONS
The IEEE Standards Association announced a call for participation for the IEEE P1595(TM) Working Group to help develop new standards for quantifying greenhouse gas (GHG) emission credits from small hydro and wind power projects and for grid baseline conditions. The IEEE P1595 Working Group is part of the Climate Change Technology Sub-Committee (CCTSC) of the Energy Development and Power Generation Committee (EDPGC) of the IEEE Power and Energy Society (IEEE-PES).
The IEEE P1595 standard will use protocols for wind power, small hydro and grid baseline developed by the Government of Canada's Department of Natural Resources - CANMET Energy Technology Centre (NRCan-CETC) as its seed documents. These protocols were developed in accordance with the ISO 14064 Part 2 International Standard for GHG Projects, which is used by regulated carbon offset credit markets such as those in the Province of Alberta and the Province of British Columbia. ISO 14064 Part 2 has also been adopted by the Voluntary Carbon Standard (www.v-c-s.org).
The IEEE P1595 working group will work in cooperation with ClimateCHECK, a collaborative solutions provider in the GHG and clean technology markets, and will use ClimateCHECK's online standards development platform, which was developed in association with the Greenhouse Gas Management Institute (GHGMI).
Those interested in joining the IEEE P1595 Working Group, or who would like more information, should contact the P1595 Working Group Chair, Jim McConnach, at firstname.lastname@example.org, phone +1 705 645 5524, or CCTSC Chair Tom Baumann at email@example.com, phone +1 613 795 1158.
For more see: http://ieeestandards.org/ct.html?rtr=on&s=8nv,1e8t4,2xny,d9np,g3tu,isdz,3wwy
Also the P1595 Working Group will be meeting at the IEEE-PES 2010 General Meeting in Minneapolis, July 25 to 29 see: http://ewh.ieee.org/conf/pesgm10/
IEEE Standards Association
445 Hoes Lane
Piscataway, NJ 08854
The Coming 'C' Change in Datacenters
by Edward J. Lucente, Vice President of Business Development, Data Center Rebates, Inc.
Recently, I was at the Uptime Institute in New York and had several conversations about carbon dioxide (CO2) management for datacenters. Energy consumed by US datacenters in 2010 will reach 3 percent of overall US energy production. This will double in about five years given that the annual growth in datacenter energy consumption is 10 percent. Increases in datacenter CO2 emissions should mirror energy consumption increases since most datacenters will be unable to convert to greener, cleaner, renewable energy sources.
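As a quick arithmetic check on growth figures like these: the doubling time implied by a constant annual growth rate r is ln(2)/ln(1+r), so 10 percent per year actually doubles in about seven years; a rate closer to 15 percent would be needed to double in five.

```python
import math

# Doubling time (in years) implied by a constant annual growth rate r.
def doubling_time(r: float) -> float:
    return math.log(2) / math.log(1 + r)

print(round(doubling_time(0.10), 1))  # 7.3 (years)
print(round(doubling_time(0.15), 1))  # 5.0 (years)
```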
The good folks at the Uptime Institute estimate that datacenter CO2 emissions will quadruple between 2010 and 2020; also that annual global datacenter CO2 emissions are already on par with the CO2 emissions of the airline industry, or even entire countries. Maybe we should put datacenters in airplanes and keep all the CO2 flying around.
Annual CO2 emission comparisons (Mt = millions of metric tons)
US datacenters 170 Mt
Argentina 142 Mt
Netherlands 146 Mt
Malaysia 178 Mt
The IT professionals that I spoke with are becoming familiar with their datacenters' "carbon footprint." They understand that by managing CO2 emissions they will be better prepared for existing or future greenhouse gas (GHG) regulations. (GHG also includes water vapor, methane, nitrous oxide, and ozone.)
Also, I noticed that a number of application software companies have sprouted up to promote carbon management information systems that deal with issues around CO2 compliance standards, CO2 inventory baselining, and financial management of carbon allowances and credits. Certainly, innovative application solutions will be needed to help datacenter professionals and executives navigate these CO2 management challenges.
The federal government will be among the early adopters of carbon management software. The US federal government's demand for carbon management software is expected to grow from its current level of $36 million to $294 million by 2017, according to a new report by Pike Research.
In the United States, government regulations concerning CO2 include the EPA's GHG Reporting Rule and the pending Kerry-Lieberman bill, known as "cap and trade." Under the EPA's GHG Reporting Rule, suppliers of fossil fuels or industrial greenhouse gases, manufacturers of vehicles and engines, and facilities that emit 25,000 metric tons or more per year of GHG emissions are required to submit annual reports to EPA. This would include the largest datacenters, and there is a concern that over time this floor of 25,000 metric tons would be reduced by government. Currently, over a dozen US states are contesting this new EPA law in court, so stay tuned.
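To see how a large datacenter could cross that 25,000 metric ton reporting floor, a back-of-the-envelope estimate is enough. The facility size and the grid emission factor below are assumptions for illustration; real emission factors vary widely by region and fuel mix.

```python
# Back-of-the-envelope annual CO2 for a datacenter drawing constant power.
# The grid emission factor is an assumption; real factors vary widely by grid.
HOURS_PER_YEAR = 8760
GRID_FACTOR_T_PER_MWH = 0.5  # metric tons CO2 per MWh (assumed)

def annual_tons_co2(avg_power_mw: float) -> float:
    return avg_power_mw * HOURS_PER_YEAR * GRID_FACTOR_T_PER_MWH

print(annual_tons_co2(6.0))  # 26280.0 -- a 6 MW facility would cross the 25,000 t floor
```

Under these assumptions, a datacenter drawing a sustained 6 MW would already be over the threshold, which is why a lower floor would sweep in many more facilities.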
The passage of the Kerry-Lieberman bill in 2010 is less certain, especially now with the oil spill crisis in the Gulf of Mexico, but it is potentially far reaching. If passed, it would require many businesses to measure, monitor, or manage GHG offsets, abatement projects, GHG sources, GHG reporting, carbon prices, and various protocols. This could be a nightmare for datacenter professionals. Just the bill's preamble scares me, especially the "for other purposes" language:
To secure the energy future of the United States, to provide incentives for the domestic production of clean energy technology, to achieve meaningful pollution reductions, to create jobs, and for other purposes.
Call for Action
I tend to believe that government mandates are less efficient delivery mechanisms than programs developed through private industry and self-regulation; what concerns me is that I have not seen the IT industry take a more proactive, self-regulatory role with regard to managing and minimizing CO2 emissions. Consider these questions:
• Why should the IT industry wait around for government standards on CO2 emissions?
• Shouldn't datacenter professionals control and develop their own CO2 management information systems since they understand best their unique IT and business environments?
• Why wouldn't a CEO, Corporate Sustainability Officer (CSO), Corporate Social Responsibility (CSR) executive, or CIO want to take more control of their destiny?
As mentioned, IT shops can choose from various application solutions and turn to energy efficiency consultants for additional guidance. Datacenters that reduce their CO2 emissions will also reduce their energy bills (OpEx) and total cost of ownership.
I suggest, therefore, that the IT industry create its own "carbon efficiency consortium" to establish carbon management information standards and solutions aimed at reducing CO2 emissions in datacenters. This would be an industry-led, self-regulatory body that provides thought leadership on CO2 management and shares best practices and recommendations for carbon management.
My bet is that datacenter professionals who develop internal management information systems for carbon management now will achieve significant cost savings ahead of their competitors. It's not just about a greener planet; it's about building a sustainable and competitive IT and industry advantage.
About the Author
Edward J. Lucente is vice president of business development at Data Center Rebates, Inc., an IT efficiency consultancy based in Carlsbad, Calif., whose professional services focus on datacenter energy efficiency (DCEE), leasing integrated with technology refreshes, and negotiation of IT energy rebates. Please feel free to email comments to email@example.com.
Tuesday, July 13, 2010
“Ecolo TIC” (Green ICT)
This proposed $60 million program is structured as a partnership between government and business leaders in this field. The project will enable the development and demonstration of new products or systems that reduce energy consumption or otherwise have a positive effect on the environment.
This initiative, which aims to share risk with companies in the ICT sector, will leverage the strengths of Quebec, including private research centres, public research groups, universities and SMEs, whose work will contribute to specific projects. It will create a pool of world-class expertise in the niche of green ICT, where technological innovation and maximized commercial benefit go hand in hand.
The government will invest $30 million over three years. The contribution of the private sector will be $30 million for the same period.
Internet2 and NOAA Partner To Provide New High Capacity National Research Network for Climate Research
New NWave Network To Support 80 Terabytes of Climate Research Data Per Day
ANN ARBOR, Mich. – July 13, 2010 - Internet2 and the National Oceanic and
Atmospheric Administration (NOAA) today announced a partnership to deploy a
highly reliable, high capacity nationwide network that will serve to
significantly enhance the capabilities of NOAA’s researchers and their
partners across the country.
Funded through the American Recovery and Reinvestment Act (ARRA), the new
high capacity research network called "NWave” will be built on a set of
10-Gigabit per second dedicated waves on the national Internet2 Network. The
network waves will be used to provide dedicated, high speed, and high
capacity connection between climate and weather researchers and NOAA’s key
high performance computing sites across the nation.
Climate scientists around the country leverage these HPC resources to
understand, predict, and explain changes in climate. This is accomplished by
developing and applying state-of-the-art, computationally intensive coupled
climate models for advancing climate research, predicting climate from weeks
to decades, and projecting future climate out to several centuries. These
climate predictions and projections are expected to generate approximately 80
terabytes of data per day to support decision makers regionally to globally
with timely and authoritative information. NWave provides the critical high
capacity network links that can support these large data flows between sites
as well as provide the capabilities to allow NOAA scientists the ability to
easily share computational resources with the U.S. Department of Energy and
other U.S. government agencies.
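As a sanity check, 80 terabytes per day averages out to roughly 7.4 Gbit/s, which indeed fits within a single 10-Gigabit wave (assuming decimal terabytes and an evenly spread flow; bursty traffic would need headroom):

```python
# Average sustained rate needed to move 80 TB per day
# (decimal terabytes and an evenly spread flow assumed).
TB_PER_DAY = 80
bits_per_day = TB_PER_DAY * 1e12 * 8
rate_gbps = bits_per_day / 86_400 / 1e9  # seconds per day -> Gbit/s
print(round(rate_gbps, 1))  # 7.4
```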
“NOAA is a world leader in understanding and predicting the earth’s environment
through its global network of observations, advanced modeling, and weather
and climate research,” said Joe Klimavicz, CIO and director of high
performance computing and communications at the National Oceanic and
Atmospheric Administration. “This new high speed research network will
greatly increase our ability to transparently access large volumes of higher
resolution and more complex climate and weather analyses, predictions and
projections.”
“The Internet2 community is excited to be an enabler of NOAA’s critical
climate and weather research. The Internet2 Network will connect researchers
across the country to the high-performance computing resources that are an
absolute requirement for the kinds of distributed, collaborative
environmental observations and analyses that will unleash the next wave of
discoveries about our natural world,” said Rob Vietzke, Internet2 executive
director of network services.
NWave will be backed by the operational expertise of the Indiana University
Global Research Network Operations Center (GRNOC), which will provide
24x7x365 professional network support as it does for the Internet2 Network
and other advanced research and education networks in the country.
Internet2 is an advanced networking consortium led by the research and
education community. An exceptional partnership spanning U.S. and
international institutions who are leaders in the worlds of research,
academia, industry and government, Internet2 is developing breakthrough
cyberinfrastructure technologies that support the most exacting applications
of today—and spark the most essential innovations of tomorrow. Led by its
members and focused on their current and future networking needs since 1996,
Internet2 blends its human, IP and optical networks to develop and deploy
revolutionary Internet technologies. Activating the same partnerships that
produced today’s Internet, our community is forging the Internet of the
future. For more information, see http://www.internet2.edu.
NOAA understands and predicts changes in the Earth’s environment, from the
depths of the ocean to the surface of the sun, and conserves and manages our
coastal and marine resources. For more information, visit
Tuesday, July 6, 2010
I particularly like the quote from the recent ITIF report on Debunking the Myths of Climate Change “Incidentally, although energy efficiency technologies and measures are certainly an important part of attaining a lower carbon footprint, in reality these are short-run, stop-gap solutions. If we add all of the potential savings from energy efficiency, they only abate about 25 percent of GHG emissions. To make matters worse, the “low hanging fruit” will grow smaller over time, decreasing returns to our efforts.”
If we are truly concerned about climate change we need to adopt policies that genuinely reduce GHG emissions. This is why any proposed “green” solution needs to be developed as a GHG standard accepted by the various GHG registries according to the ISO 14064 standard, and this applies to any proposed research project as well. Only then can any claim of being green be independently verified as reducing GHG emissions. Although this process is much harder than energy-efficiency hand waving, it will result in genuinely low carbon solutions. The intellectual challenge of building low carbon solutions is much greater than most of the lazy thinking associated with energy efficiency – but on the upside the outcomes can generate real investment, jobs and economic growth.
In my opinion there are two rules of thumb to building solutions for a low carbon economy:
(1) They must use only renewable power sources, in order to de-couple energy production from GHG emissions
(2) They must not involve electric utilities or the grid
For examples of some ideas on products and services for a low carbon economy please see my presentation to the National Research Council:
Debunking the Myths of Global Climate Change
Numerous advocacy groups, scholars, think tanks and others have proposed a variety of steps to address global warming based on a set of assumptions about the green economy. Yet, while we need to take bold action to address climate change, much of what passes for conventional wisdom in this space is in fact either wrong or significantly exaggerated.
In our recent report, “Ten Myths of Addressing Global Warming and the Green Economy,” ITIF explains how the debate on policy responses to climate change is fueled by an array of myths, ranging from assumptions that high carbon taxes will generate needed clean innovations to the belief the U.S. is the natural leader in the clean energy sector. If we are to effectively address climate change and at the same time become globally competitive in the clean energy industry, policies need to be guided by careful and reasoned analysis.
Perhaps the most prevalent myth is that carbon taxes or a cap-and-trade regime alone will drive significant GHG reductions and save the planet. The current neoclassical economics-inspired solution focuses on pricing carbon and letting markets work. Proponents have faith that increasing the price of carbon will induce behavior change. But this will only happen when there is a viable and affordable substitute. Adherence to this entrenched myth overlooks the fact that radical innovation in the energy sector is essential to the transformation in how we produce and consume energy in the future. Our strategy must be based on innovation to make the dent we have to make in our greenhouse gas production.
And, by the way, cap and trade–the darling of the moment–isn’t a globally sustainable option. It’s a myth that developing nations can afford to pay a premium for low-carbon energy when they are having trouble enough with providing the basics of food and shelter. The conventional policy response is that the United States (and Europe) should either bribe poor nations with massive clean development aid so they can afford more expensive clean energy, or we should penalize them with border adjustable carbon taxes. And neither option comes for free since the United States would need to increase taxpayer-financed aid subsidies to meet developing countries’ clean energy demand. The end result is that U.S. taxpayers would pay twice in a global cap-and-trade regime—once for their own consumption and once for developing nations’. The only globally sustainable option is the creation of affordable (read “grid parity”) clean energy for all nations.
The reality, however, is that we don’t have the technology we need to make needed reductions in global GHG emissions at a price at or below the price of fossil fuels—no matter what advocates like former vice president Al Gore say. This notion plays into the policy advice that suggests we just need to raise the price of coal and oil a bit, and technology will fly from the shelf and into the market. This ignores a fundamental truth that the needed breakthroughs in clean energy face daunting challenges, including lowering materials and processing costs, improving conversion efficiencies, and gaining better manufacturing yields. Moreover, clean energy innovators recover only a portion of the benefits their technologies produce. Most companies prefer to “free ride” off existing dirtier technologies, making the rational business decision to underinvest in fundamentally new green technologies. To spur the technology we need, government must step in, incentivize basic R&D and propel these technologies through the “valley of death” – the phase in the development of technologies between research and commercial introduction in the marketplace.
Incidentally, although energy efficiency technologies and measures are certainly an important part of attaining a lower carbon footprint, in reality these are short-run, stop-gap solutions. If we add all of the potential savings from energy efficiency, they only abate about 25 percent of GHG emissions. To make matters worse, the “low hanging fruit” will grow smaller over time, decreasing returns to our efforts. To reduce our GHG emissions by 85 percent by 2050, we need radical innovation to provide clean energy alternatives, rather than just using carbon-based fuels a bit more efficiently.
• China Fears Warming Effects of Consumer Wants
GUANGZHOU, China — Premier Wen Jiabao has promised to use an “iron hand” this summer to make his nation more energy efficient.
But even as Beijing imposes the world’s most rigorous national energy campaign, the effort is being overwhelmed by the billionfold demands of Chinese consumers.
Chinese and Western energy experts worry that China’s energy challenge could become the world’s problem — possibly dooming any international efforts to place meaningful limits on global warming.
If China cannot meet its own energy-efficiency targets, the chances of avoiding widespread environmental damage from rising temperatures “are very close to zero,” said Fatih Birol, the chief economist of the International Energy Agency in Paris.
Aspiring to a more Western standard of living, in many cases with the government’s encouragement, China’s population, 1.3 billion strong, is clamoring for more and bigger cars, for electricity-dependent home appliances and for more creature comforts like air-conditioned shopping malls.
As a result, China is actually becoming even less energy efficient. And because most of its energy is still produced by burning fossil fuels, China’s emission of carbon dioxide — a so-called greenhouse gas — is growing worse. This past winter and spring showed the largest six-month increase in tonnage ever by a single country.
China’s goal has been to reduce energy consumption per unit of economic output by 20 percent this year compared with 2005, and to reduce emissions of greenhouse gases per unit of economic output by 40 to 45 percent in 2020 compared with 2005.
But even if China can make the promised improvements, the International Energy Agency now projects that China’s emissions of energy-related greenhouse gases will grow more than the rest of the world’s combined increase by 2020. China, with one-fifth of the world’s population, is now on track to represent more than a quarter of humanity’s energy-related greenhouse-gas emissions.
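The arithmetic behind this point is worth making explicit: China's pledge is an *intensity* target (emissions per unit of GDP), so absolute emissions can still rise sharply if the economy grows fast enough. A minimal sketch, where the 10% annual GDP growth rate is an illustrative assumption (not a figure from the article) and the 45% cut is the upper end of the pledge quoted above:

```python
# Why an intensity target can still mean rising absolute emissions:
# emissions = GDP x (emissions per unit of GDP).
gdp_growth = 0.10           # assumed annual GDP growth (illustrative)
years = 2020 - 2005         # 15-year window covered by the pledge
intensity_cut = 0.45        # upper end of the 40-45% intensity target

gdp_factor = (1 + gdp_growth) ** years            # economy ~4.2x larger
emissions_factor = gdp_factor * (1 - intensity_cut)
print(f"Absolute emissions in 2020: {emissions_factor:.2f}x the 2005 level")
```

Under these assumptions absolute emissions roughly double even with the full 45% intensity cut achieved, which is consistent with the IEA projection quoted above.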
Industry by industry, energy demand in China is increasing so fast that the broader efficiency targets are becoming harder to hit.
Although China has passed the United States in the average efficiency of its coal-fired power plants, demand for electricity is so voracious that China last year built new coal-fired plants with a total capacity greater than all existing power plants in New York State.
While China has imposed lighting efficiency standards on new buildings and is drafting similar standards for household appliances, construction of apartment and office buildings proceeds at a frenzied pace. And rural sales of refrigerators, washing machines and other large household appliances more than doubled in the past year in response to government subsidies aimed at helping 700 million peasants afford modern amenities.
As the economy becomes more reliant on domestic demand instead of exports, growth is shifting toward energy-hungry steel and cement production and away from light industries like toys and apparel.
Obama's Energy Pipe Dreams
… we won't soon end our "addiction to fossil fuels." Oil, coal, and natural gas supply about 85 percent of America's energy needs. The U.S. Energy Information Administration (EIA) expects energy consumption to grow only an average of 0.5 percent annually from 2008 to 2035, but that's still a 14 percent cumulative increase. Fossil-fuel usage would increase slightly in 2035, and its share would still account for 78 percent of the total.
Unless we shut down the economy, we need fossil fuels. More efficient light bulbs, energy-saving appliances, cars with higher gas mileage may all dampen energy use. But offsetting these savings will be more people (391 million vs. 305 million), more households (147 million vs. 113 million), more vehicles (297 million vs. 231 million) and a bigger economy (almost double in size). Although wind, solar, and biomass are assumed to grow as much as 10 times faster than overall energy use, they provide only 11 percent of supply in 2035, up from 5 percent in 2008.
"Clean energy" won't displace oil or achieve huge reductions in greenhouse-gas emissions—for example, the 83 percent cut by 2050 from 2005 levels included in last year's House climate-change legislation. Barring major technological advances (say, low-cost "carbon capture" to pump CO2 into the ground) or an implausibly massive shift to nuclear power, this simply won't happen. It's a pipe dream. In the EIA's "reference case" projection, CO2 emissions in 2035 are 8.7 percent higher than in 2008.
A good overview of the challenges of building a low carbon infrastructure