Energy Internet and eVehicles Overview
Governments around the world are wrestling with the challenge of how to prepare society for inevitable climate change. To date most people have focused on how to reduce greenhouse gas emissions, but there is growing recognition that, regardless of what we do to mitigate climate change, the planet is going to be significantly warmer in the coming years, with all the attendant problems of more frequent droughts, flooding, severe storms, etc. As such we need to invest in solutions that provide a more robust and resilient infrastructure to withstand this environmental onslaught, especially for our electrical and telecommunications systems.
Linking renewable energy with high-speed Internet using fiber to the home, combined with eVehicles and dynamic charging where a vehicle's batteries are charged as it travels along the road, may provide a whole new "energy Internet" infrastructure for linking small distributed renewable energy sources to users, one that is far more robust and resilient to climate change than today's centralized command-and-control infrastructure. For more details please see:
Free High Speed Internet to the Home or School Integrated with solar roof top: http://goo.gl/wGjVG
High level architecture of Internet Networks to survive Climate Change: http://goo.gl/juWdH
Architecture and routing protocols for Energy Internet: http://goo.gl/niWy1g
Friday, December 10, 2010
eScience and Community based open source climate modelling
Given the political and societal impact of climate change, community-based open models are critical, especially as we try to analyze local impacts of climate change mitigation and adaptation. I applaud initiatives such as CIM-Earth, RDCEP and Grass Roots Climate Modeling. I might add that open source community-based modeling is an ideal application for zero-carbon networks/clouds that are powered by renewable sources. The models generally are easily partitionable and have long run times, and so can adapt easily to a computational cloud where the underlying infrastructure is constantly changing depending on the availability of energy. Well worth the look – BSA]
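To make the "adapts to variable energy availability" point concrete: an ensemble of independent model runs can checkpoint its state and simply idle when renewable power is unavailable, resuming later with no work lost. The sketch below is purely illustrative, not code from any of the projects mentioned; the `power_available` signal and the checkpoint format are invented for the example.

```python
import json
import os
import random

CHECKPOINT = "member_state.json"

def power_available():
    # Hypothetical stand-in for a real renewable-supply signal;
    # here we simulate intermittent availability (70% of the time).
    return random.random() < 0.7

def load_state():
    # Resume from the last checkpoint if one exists.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)
    return {"step": 0, "value": 0.0}

def save_state(state):
    # Write to a temp file then rename, so a power loss
    # mid-write never corrupts the checkpoint.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "w") as f:
        json.dump(state, f)
    os.replace(tmp, CHECKPOINT)

def run_member(total_steps=100):
    state = load_state()
    while state["step"] < total_steps:
        if not power_available():
            continue           # idle until renewable supply returns
        state["value"] += 0.1  # stand-in for one model time step
        state["step"] += 1
        save_state(state)
    return state

final = run_member()
print(final["step"])  # 100: the run completes despite interruptions
```

Because each ensemble member is independent and checkpointed, the scheduler can kill and restart workers freely as the underlying green-powered infrastructure comes and goes.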
In an increasingly interconnected and human-modified world, decision makers face problems of unprecedented complexity. For example, world energy demand is projected to grow by a factor of four over the next century. During that same period, greenhouse gas emissions must be drastically curtailed if we are to avoid major economic and environmental damage from climate change. We will also have to adapt to climate change that is not avoided. Governments, companies, and individuals face what will be, in aggregate, multi-trillion-dollar decisions.
These and other questions (e.g., relating to food security and epidemic response) are challenging because they depend on interactions within and between physical and human systems that are not well understood. Furthermore, we need to understand these systems and their interactions during a time of rapid change that is likely to lead us to states for which we have limited or no experience. In these contexts, human intuition is suspect. Thus, computer models are used increasingly to both study possible futures and identify decision strategies that are robust to the often large uncertainties.
The growing importance of computer models raises many challenging issues for scientists, engineers, decision makers, and ultimately the public at large. If decisions are to be based (at least in part) on model output, we must be concerned that the computer codes that implement numerical models are correct; that the assumptions that underpin models are communicated clearly; that models are carefully validated; and that the conclusions claimed on the basis of model output do not exceed the information content of that output. Similar concerns apply to the data on which models are based. Given the considerable public interest in these issues, we should demand the most transparent evaluation process possible.
I argue that these considerations motivate a strong open source policy for the modeling of issues of broad societal importance. Our goal should be that every piece of data used in decision making, every line of code used for data analysis and simulation, and all model output should be broadly accessible. Furthermore, the organization of this code and data should be such that any interested party can easily modify code to evaluate the implications of alternative assumptions or model formulations, to integrate additional data, or to generate new derived data products. Such a policy will, I believe, tend to increase the quality of decision making and, by enhancing transparency, also increase confidence in decision making.
I discuss the practical implications of such a policy, illustrating my discussion with examples from the climate, economics, and integrated assessment communities. I also introduce the use of open source modeling at the University of Chicago's new Center for Robust Decision Making on Climate and Energy Policy (RDCEP), recently funded by the US National Science Foundation.
Extracts from Ian Foster’s notes:
We are dealing with wicked or messy problems.
Extreme importance, sensitivity to assumptions, political dimensions.
Need for transparency, broad participation, innovation in approaches.
Open source fulfills all of these properties.
Climate Modeling has distinct characteristics:
-- It is inordinately complex, involving a system of systems.
-- Highly dependent on data that is sparse and inadequate
-- No immediate way to test projections—any solution is an experiment with outcome far in future
-- Consequences of failure to make the right decision are substantial
-- Human decision making is ultimately part of the mix.
Thus, a computer simulation of an Airbus A380 is not the sort of problem I am talking about—complex though it may be.
There are many strong reasons for making models open source:
-- Permits more extensive comparisons among models
-- Many eyes provide opportunities to improve models
-- Increases confidence in outputs: transparency is good.
All climate models should be open. We probably don’t need so many models.
What do we mean by “open”? Accessible, understandable, extensible/modifiable. Not just legally open, but open in software engineering terms as well.
What should be “open”? Everything required to construct, run, and evaluate models. Data. Data calibration routines. Models.
CIM-EARTH is a collaborative, multi-institutional project to design a large-scale integrated modeling framework as a tool for decision makers in climate and energy policy. CIM-EARTH is intended to enhance economic detail and computational capabilities in climate change policy models, and to nucleate and support a broad interdisciplinary and international community of researchers and policymakers.
Human prosperity depends fundamentally on energy usage, and that usage must increase many times over if the developing world is to advance out of poverty. Supplying energy to meet human needs is difficult enough; worse still is that a byproduct of most energy production – carbon dioxide from fossil fuel burning – leads to unwelcome changes in the world’s climate. Energy usage now threatens the prosperity it helped create. The problem is inherently a global one: governments, industries, and individuals worldwide are linked in a single energy system whose emissions then affect climate throughout the world. Many scientists now feel that transforming the means by which we capture and use energy is the defining challenge of our time.
Grass Roots Climate Modeling
Want to help propel climate science, but don't have a spare supercomputer to offer? Now you can contribute your spare PC cycles to run climate models that could help researchers predict our future weather patterns.
The recently launched WeatherAtHome.net project allows people from around the world to help run climate simulations while their personal computers are otherwise idle. Like other volunteer computing projects, such as Folding@Home and DrugDiscovery@Home, WeatherAtHome.net relies on the kindness of strangers to contribute processor cycles to drive a large-scale computer simulation -- in this case a climate model.
WeatherAtHome.net is actually a subset of climateprediction.net, which was launched in 2003. The latter focuses on climate change scenarios on a global scale. WeatherAtHome.net, on the other hand, targets regional climates to help scientists develop local models of weather and climate. The three initial regions the project is working on are the Western US, Southern Africa and Europe, three areas with large human populations that look to be particularly vulnerable to climate change.
An article this week in Climate Central describes the utility of the approach:
“Statisticians are usually happy with 30 runs through a model,” says Philip Mote, a climate scientist from Oregon State University who is now collaborating with the international team, headed by researchers at Oxford University, that launched Climateprediction.net a few years back. Mote’s part of the project is aimed at better modeling future climate in the western United States. “We’ve got almost 45,000, so we’re already in great shape.”
Mote goes on to say that because of the complexity of the landscapes and coastal environments, these three regions have been particularly difficult to model. As a result, the cloud and snow patterns tend to be too complex to be easily simulated in a global model.
However, the distributed nature of the volunteer computing grid means the simulations are less sophisticated than those run on a tightly coupled supercomputer, so you don't get as complete a picture of the future climate. But the sheer number of simulations that can be attempted allows the WeatherAtHome researchers to compensate with more, if less accurate, data points.
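The trade-off Mote describes has a simple statistical basis: the standard error of an ensemble mean shrinks like σ/√N, so 45,000 noisier runs can pin down an average better than 30 precise ones. A small Python illustration (the per-run noise levels here are invented purely for the example, not taken from either project):

```python
import math
import random

def ensemble_mean_error(n_runs, run_noise, truth=1.0, seed=42):
    # Average n_runs noisy estimates of `truth`; return absolute error.
    rng = random.Random(seed)
    mean = sum(truth + rng.gauss(0, run_noise) for _ in range(n_runs)) / n_runs
    return abs(mean - truth)

# Few high-fidelity runs vs. many low-fidelity runs:
few_precise = ensemble_mean_error(30, run_noise=0.1)
many_noisy = ensemble_mean_error(45_000, run_noise=1.0)

# Theoretical standard errors, sigma / sqrt(N):
print(0.1 / math.sqrt(30))      # ~0.018 for the small, precise ensemble
print(1.0 / math.sqrt(45_000))  # ~0.005 for the huge, noisy ensemble
```

Even with ten times the per-run noise, the huge ensemble's mean is the tighter estimate, which is exactly why volunteer grids can trade fidelity for run count.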
If you're interested in downloading the software and donating some of your idle PC time for this umm... killer application, check out the WeatherAtHome site at http://climateprediction.net/weatherathome.