Federal Reserve officials have said recently they expect GDP growth in 2011 could be as high as 4%. A poll conducted by Dr. John Paglia, and released today, shows small companies (less than $5 million in revenue) believe the overall U.S. economy will continue to be sluggish, posting 1.87% GDP growth over the next twelve months. Larger companies ($100+ million in annual revenue) have a slightly rosier picture, forecasting a 2.14% increase in GDP in 2011. Dr. Paglia polled 1,224 privately held businesses, capital suppliers, intermediaries, and service providers.
The findings are available at http://bit.ly/privatecapeconpoll
1. GDP seen at 1.98% with probability of recession at 28.43%.
2. Housing expected to decline 1.76% while S&P seen increasing by 6.46%.
3. ‘Increased access to capital’ seen as the policy that would do the most to spur job creation.
4. Most participants ‘somewhat more confident’ in U.S. economic growth in 2011.
5. Likewise, most participants ‘somewhat more confident’ in the growth prospects of privately held businesses.
6. Most participants feel more incentivized to innovate today.
7. 80% of business owners feel economic stimulus measures were distributed unfairly.
8. Business owners believe a stronger dollar would be more beneficial than a weaker dollar.
9. Compared to one year ago, respondents are more likely to invest in the U.S., Brazil, India, Canada, Australia, and China, and less likely to invest or expand in Japan, the EU, Mexico, and Russia.
10. Most respondents believe raising the $14.3 trillion U.S. debt ceiling would be detrimental to U.S. businesses.
While climate change remains the most significant environmental concern, the BP oil spill has provided a glimpse into the true cost of carbon-based energy dependence and underscored the urgent need to put in place the mechanisms necessary to support a technological shift in our nation’s energy sector.
While the past 40 years have shown great ebb and flow in government efforts to redirect our energy markets (recall the Arab oil embargo of 1973 and cries for energy independence), at no time has the stage been so “set” to transform the energy sector. As articulated by Tom Friedman in his latest NY Times Op-Ed, the BP oil spill disaster provides a great moment of opportunity for the administration to set in place the policy measures necessary to fully engage on the New Energy for America plan.
Unfortunately, while the S&P 500 is up 20%, the NEX (the WilderHill New Energy Global Innovation Index) is down 20% over the same 12-month period…not exactly the sort of returns that will incentivize capital markets to invest in new energy businesses focused on achieving what the administration has called “the moral, economic and environmental challenge of global climate change, and building a clean energy future that benefits all Americans.”
While subsidies, along with ARRA stimulus funding, have helped some companies with the economic learning curve inherent in creating novel Renewable Energy and CleanTech solutions, a much stronger signal needs to be sent if we are truly committed to energy independence and climate change mitigation. A carbon tax or a cap-and-trade program that puts a price on carbon is such a signal.
Such a signal, coupled with increased R&D investment and enhanced deployment incentives, can go a long way toward providing the economic incentives investors demand when putting money in harm’s way. In fact, large-scale participation by the capital markets will not only accelerate clean technology innovation, it will also help bring large emitting nations on board at last: irrespective of Copenhagen’s politics, the economic incentive to participate in a new green economy will simply be too large to ignore.
So while the past 40 years have been a demonstration of the disconnect between what is needed versus what is politically feasible, the BP disaster in the Gulf of Mexico provides a rare window of opportunity for the administration to connect these pieces and make its New Energy for America vision a reality.
Scientists have estimated that CO2 levels in the atmosphere must be limited to 450 ppm, widely considered the maximum concentration at which the worst effects of global warming can still be avoided. This corresponds to limiting the planetary temperature rise to 2 degrees Centigrade.
For reference, the concentration of greenhouse gases before the Industrial Revolution was 280 parts per million by volume; today we are at roughly 380 ppm; and the generally agreed maximum, as noted, is 450 ppm. Based on these thresholds, scientists and researchers have built forward-looking scenario projections of what CO2 levels will be in 2020, 2030, and even 2050. While such projections are necessary to begin the dialog needed to avert climate change, they are most likely wrong.
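A quick back-of-the-envelope sketch shows how these scenario numbers come about, and how sensitive they are to the assumptions baked into them. The ~2 ppm/year growth rate below is an illustrative assumption of my own, not a figure from this article:

```python
# Naive linear extrapolation of atmospheric CO2 concentration.
# The growth rate is an illustrative assumption, not a sourced figure.
PREINDUSTRIAL_PPM = 280     # pre-Industrial Revolution level cited above
CURRENT_PPM = 380           # present-day level cited above
CEILING_PPM = 450           # widely cited "safe" maximum
GROWTH_PPM_PER_YEAR = 2.0   # assumed constant annual growth (illustrative)

def ppm_in(years, start=CURRENT_PPM, rate=GROWTH_PPM_PER_YEAR):
    """Project CO2 concentration `years` from now at a constant rate."""
    return start + rate * years

years_to_ceiling = (CEILING_PPM - CURRENT_PPM) / GROWTH_PPM_PER_YEAR
print(f"Projected ppm in 20 years: {ppm_in(20):.0f}")        # 420
print(f"Years until {CEILING_PPM} ppm at constant rate: {years_to_ceiling:.0f}")  # 35
```

Halve or double the assumed rate and the 2030 and 2050 figures swing by tens of ppm, which is exactly the fragility at issue: the constant-rate assumption encodes a world without disruptive innovation.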
It is impossible to know what innovations will be developed over the next 5-10 years, much less what will be happening in 30 years. The past no longer provides reliable insight into our future, given the volatility that exponential technological change has injected. As John Seely Brown has observed, the 20th century was driven primarily by moments of disruption (such as electrification, flight, the automobile, and telephony), followed by years of stability as infrastructure was built up to realize an innovation’s economies of scale (through manufacturing, transportation and distribution efficiencies).
The stability of the 20th century afforded this approach, as business had reasonable means to extrapolate from the past and determine what to build, where, and how much. Our last 100 years were built on a factory business model that relied on organizational efficiency, hierarchy and control in order to minimize variance.
“I think there’s a world market for about 5 computers.”
- Thomas J. Watson, Chairman of the Board, IBM (around 1948)
The factory model of the past is no longer relevant in the 21st century. Embedded, obscure modeling assumptions (based on past industrialization patterns) introduce significant quantitative flaws into projected outcomes, including projections of what CO2 levels in the atmosphere will be in 2030 or 2050. As Vinod Khosla has so articulately expressed, given the current rate of technological change, trying to predict what 2030 will be like is akin to predicting, back in 1910, what 2010 would be like!
The fact is that never in the history of mankind have we had so much constant flux and unpredictability, driven by exponential technological innovation. Consider how technology in the past few years has affected:
- how we commute (e.g. GPS in cars and phones),
- how we create, share and consume knowledge (e.g. Google and collaborative web technologies),
- business transaction costs (e.g. how iTunes has collapsed the cost of delivering content to users), and
- the impact of time and geography (e.g. email, Skype, FTP - connect anywhere, anytime).
Who could have predicted Google back in 1990? Or that Lehman Brothers and General Motors would go bankrupt in 2008 and 2009? The future is being invented each day, influenced by a confluence of forces we could not have even imagined just 10 years ago! So how can we predict what will be happening 20 years from now?
“Heavier-than-air flying machines are impossible.”
- Lord Kelvin, President, Royal Society, 1895
What we have learned is that technology greatly expands the art of the possible, and that today’s impossible will be tomorrow’s common sense. In Philip Tetlock’s words: “We are not natural falsificationists: we would rather find more reasons for believing what we already believe than look for reasons that we might be wrong. In the terms of Karl Popper’s famous example, to verify our intuition that all swans are white we look for lots more white swans, when what we should really be looking for is one black swan.”
Since we have no idea what amazing inventions will occur over the next 20 years - inventions that may fundamentally change how we produce, manage, distribute and consume energy - I predict that we have no idea what CO2 levels in our atmosphere will be in 2030 and beyond.