Given our species' tendency to waste energy when it is plentiful and cheap (witness our history of carbon-based fuel use), what happens to energy efficiency (as a practice and an industry) when we reach grid parity and energy is, for all intents and purposes, limitless and free?
This is not an easily dismissed question.
Think about it: when we reach the point that solar photovoltaics cost as much to install and maintain as carbon-based fuels, then there will be an en masse shift to solar sources of energy (both at the utility and consumer levels).
When that happens, PV costs will be driven down even further which will only cause more investments into solving remaining barriers (read: storage). As those solutions are brought to scale, PV market penetration will get even deeper and be able to reach previously unreachable markets (think low-income, developing countries, etc.) only further driving down costs and increasing investments.
As barriers to mass adoption of PV are solved and PV generated electricity becomes the chief source of fuel for everything (transportation, manufacturing, heating/cooling, etc.) we will, as a species, reach a point where the amount of energy consumed per capita is based purely on how many panels and how much storage each consumer can afford.
At that point, our chief energy source is, for all intents and purposes, unending (provided the sun doesn't go supernova ahead of schedule) and free (again, limited only by each consumer's capacity to acquire and maintain PV solar panels).
When that happens, who cares about efficiency?
Got too many appliances plugged in? No problem - just get more panels.
Need more energy to manufacture your widgets? No problem - just get more panels.
Want to walk around your home in shorts in January? No problem - just get more panels.
I apologize for raising existential questions (I find them just as unsettling and annoying as the next person) but we should at least start pondering situations that could conceivably happen in our lifetimes.
Please share your comments. I am very interested in hearing this community's thoughts on this question.
I have had several small and large businesses base their decision on the age-old question of 'Is it cost effective?' If the estimated energy savings would pay for the job in 12 months or less, they considered it cost effective and would consider signing that contract. Even 25 years ago, when energy was cheap, they had that same mindset.
MANY times I was hit with that mindset as soon as I walked through the door: "If this is going to be cost effective in the first year, we will consider it. If not, you're wasting your time." With them, even showing the long-term savings AND other added benefits seldom changed that thought process.
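That 12-month rule of thumb amounts to a simple payback calculation. Here is a minimal sketch of the arithmetic those business owners were doing in their heads (the dollar figures are hypothetical, not from any actual job):

```python
def simple_payback_months(project_cost, monthly_energy_savings):
    """Months until cumulative energy savings cover the upfront cost."""
    return project_cost / monthly_energy_savings

# Hypothetical example: a $2,400 efficiency upgrade saving $250/month.
months = simple_payback_months(2400, 250)
print(round(months, 1))   # 9.6
print(months <= 12)       # True -> passes the 12-month test
```

Note that this simple test ignores everything past the payback point: long-term savings, equipment life, and comfort never enter the calculation, which is exactly the blind spot described above.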
I never got hit with that mindset from a homeowner, not that I recall anyway.
I think that same cost effective mindset is still out there, although probably not as prevalent as years ago.
If energy were free or nearly free, I think the only reason they would consider an energy upgrade would be for all the 'other' benefits besides reducing the actual heating/cooling cost on their monthly bill.
I still do not believe that most energy professionals emphasize those 'other' added benefits, with extended equipment life from reducing the load at the top of that list, in my opinion.
As far as utility companies go, energy will never be free; there are too many stockholders to appease. They do what the government mandates them to do in the name of energy efficiency, no matter what their advertising says.
Show them something that can drop a small company's energy requirements substantially, and they don't like you, and they will let you know that in no uncertain terms.
You raise a good question. The concept of "free and unlimited" energy is not new, in fact it is arguable that this idea is what got us into the trouble we're in to begin with (http://en.wikipedia.org/wiki/Too_cheap_to_meter). What we did was build historically inefficient buildings and hit an energy, and now a climate, crisis. The idea of "free energy" is continually compelling and when it's also seen as having no environmental impact, all controls are lost.
This is one of the main objections I have with "net zero energy" schemes at present - they encourage people with high electrical bills to install PV panels, not examine their energy use. In fact, people with high energy bills are the greatest beneficiaries of PV incentives, people who conserve are not rewarded. This approach is less equitable, less successful and more expensive than a "Feed in Tariff" (FiT) like they've implemented in Germany where they've installed something like SIX TIMES the amount of solar we have in this country (https://financere.nrel.gov/finance/content/germany-solar-feed-in-ta...).
I have not thought through what happens if PV (AND storage) is unlimited some day, but it seems likely that before we reach that point, we will see the end of "net zero," as utilities will no longer be interested in or willing to barter surplus summer power for winter consumption. The paradigm will shift: winter power will be the challenge, and will remain so. The reason for higher energy demand in houses (at least) in winter is that it's cold outside. It's cold outside because there isn't much sun. Because there isn't much sun, solar systems don't produce much energy. 'Round and 'round we go, trying to treat the symptom with the disease, as it were.
Germany has already seen negative wholesale summer electrical prices due to extra production in summer, and their average yearly renewable fraction is "only" on the order of 20% (http://www.economist.com/news/briefing/21587782-europes-electricity...).
So, I think it likely that "net zero" will go away, and the marginal cost of solar power will be driven down by utilities (with tracking arrays, distributed generation and other means beyond the homeowner) throughout the summer and into the shoulder seasons to the point that a home's PV array will be of value only to the extent that it directly offsets power consumption. Winter power will be at a premium, and storage will be a huge (and expensive) endeavor. All of this argues for efficiency to the point that it conserves expensive power at a lower cost.
The U.S. is catching up to Germany very quickly on solar installs. As of the end of 2013, the Germany-to-U.S. cumulative installed solar capacity ratio was about 2.6 to 1. At the end of 2014, it will be closer to 2 to 1. The U.S. will likely exceed Germany in cumulative solar capacity by ~2017-2018, with a combination of power purchase agreements (PPAs) at utility-scale plants and (mostly) net-metered commercial and residential installs. Germany (and the EU in general) is the rare marketplace in the entire world where PV installations have slowed significantly (just 3.4GW for Germany in 2013, compared to ~6GW-7GW each in 2012, 2011 and 2010). There's strong evidence (in hindsight) that, from an economic standpoint, Germany (and others in the EU) significantly overpaid for solar incentives. Of course, their money helped prime the pump for the rest of the world, bringing PV up to scale and driving costs lower and lower. We are now benefiting from this gift from the "early adopters". The second wave of huge PV investment is coming from the U.S., China and the rest of Asia, and in 10 years' time it will likely make Germany's investments in the 2000s look modest by comparison.
Regarding net-metering - yes, it will have to go away eventually, market by market as PV % increases. Some utilities and regulators around the country are already working on new rules. At current levels of grid penetration for most of the U.S., I don't have any issues with net-metering or incentive levels for net-metered systems. My main issue (and I suspect you would agree with me) is that we (the U.S.) have failed to "incentivize" efficiency to the same level that we do renewable production.
While individual houses in most U.S. locations typically use more energy in winter than in summer, the same can't be said for our electrical grids as a whole. Larger commercial buildings in many climates have cooling demand for 9 months out of the year. Since we have much much more cooling energy demand than Germany, I suspect that most of our grids will still be "summer peakers" rather than "winter peakers" in the near term. This may change over time for some grids as more and more of our on-site natural gas, propane and heating oil use switches over to electricity.
Interestingly though, if you read this NREL article (https://financere.nrel.gov/finance/content/germany-solar-feed-in-ta...), it would suggest that we are spending as much as Germany per installed kW if only federal tax credits and accelerated depreciation are considered, with no accounting for local or state incentives. If you account for our higher levels of insolation, we could be doing what Germany did at HALF the cost. The difference is that we've been quite a bit less successful. Remember that our population is about 320 million and Germany's is around 80 million, so reaching parity in terms of total solar installation isn't that impressive. Given our relatively abundant solar radiation, we should be dwarfing them. In fact, they should probably have installed their PV here, if there were a way to make that work financially!
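The per-capita point can be made concrete with the figures quoted in this thread (~2:1 Germany-to-U.S. capacity ratio at the end of 2014, populations of roughly 80 million and 320 million); this is a rough back-of-the-envelope sketch, not a precise dataset:

```python
# Figures quoted above (approximate):
germany_capacity = 2.0   # relative units; U.S. cumulative capacity = 1.0
us_capacity = 1.0
germany_pop_m = 80       # population, millions
us_pop_m = 320

# Capacity per person, Germany relative to the U.S.
per_capita_ratio = (germany_capacity / germany_pop_m) / (us_capacity / us_pop_m)
print(per_capita_ratio)  # 8.0 -> Germany has ~8x the solar capacity per person
```

So even at "parity" in total installed capacity, the U.S. would still trail Germany roughly eightfold per capita, which is the sense in which matching the raw total isn't that impressive.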
My main issue with net-metering is exactly what George points to - it provides no incentive for people installing PV to use any less energy than what they "pre-purchase" with their PV system, since whatever they save is "worthless" on the market. Further, they are incentivized to install only the exact size system they predict will meet their needs, since any extra generation is "worthless", as is any energy conserved. How can this have a happy ending?
What I DO like about a feed-in tariff is that production and consumption are decoupled. If you want to install a PV system and sell power, have at it (for that matter, it's a lot simpler than granting tax credits, since it's performance-based. Put your PV in wrong and it's your problem, no one else's. This is likely a big reason PV is so much cheaper in Germany on soft costs.) If you decide to save energy, you save utility bills and have more energy to sell. VERY simple, you get a bill for what you use, you get a check for what you generate.
I feel that net metering primarily benefits utilities, who "barter" cheaper base load winter and night energy for valuable summer near-peak energy. Again, this won't last long if a lot of people (including the utilities) jump into the solar energy market. Each new system eats into the profit of summer production disproportionately, because PV generates much more power in summer. As long as the percentage of solar power is relatively insignificant, net metering works. Again, how forward-thinking is this?
Conservation won't die with the solar movement, if anything consumers will be MORE aware of peak use.
With solar, consumers will simply be paying for available peak capacity instead of overall kWh consumption. Use all the power you want for cheap during low-demand times, but pay high rates during peak demand times. Only one-third of current electric costs is fuel; the other two-thirds is distribution costs. Even if the fuel is free, the distribution costs, which are tied directly to the ability to meet peak demand, remain.
Time-of-use rates are already becoming commonplace in residential settings due to peak demand issues. They have long been popular in commercial applications, but until smart meters dropped in price they weren't practical for residential use. Even if a consumer wants to generate their power locally, peak demand issues remain. The customer pays for peak capacity and must spread the load over time in order to keep panel costs down. 5-ton central AC units get replaced by mini-splits. Devices with large heating elements (dryers/water heaters/ovens) get run only off-peak, if they are kept at all.
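The shift from flat per-kWh billing to peak-sensitive time-of-use billing can be sketched as follows. The rates and usage split here are purely illustrative (not any utility's actual tariff), chosen only to show why spreading load off-peak lowers the bill even when total consumption is unchanged:

```python
def flat_bill(kwh, rate=0.12):
    """Traditional billing: every kWh costs the same."""
    return kwh * rate

def tou_bill(peak_kwh, offpeak_kwh, peak_rate=0.30, offpeak_rate=0.06):
    """Time-of-use billing: peak-hour energy costs several times more."""
    return peak_kwh * peak_rate + offpeak_kwh * offpeak_rate

# Same 1000 kWh/month, billed three ways:
print(round(flat_bill(1000), 2))     # 120.0  flat rate
print(round(tou_bill(400, 600), 2))  # 156.0  40% of use during peak hours
print(round(tou_bill(100, 900), 2))  # 84.0   load shifted mostly off-peak
```

Under the flat tariff there is no reason to care when you run the dryer; under time-of-use, moving the same kWh off-peak cuts the bill nearly in half, which is exactly the pressure that pushes big heating loads to off-peak hours.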