Monday, November 28, 2011

Climategate 2, more unethical Team behavior

The second batch of e-mails (Climategate 2; you have to register), documenting the “behind the scenes” activities of the scientists who have been among the stronger advocates for the uniqueness of current global warming over the last two millennia, has been released. The e-mails have now been perused in a running series of posts at various blog sites, and, while there is no immediately obvious major new revelation, the contemptible behavior of these purportedly exemplary individuals is getting an increasing amount of sunlight. They also show how thick the whitewash was in the “enquiries” into scientific misconduct that followed the release of the original, Climategate 1, e-mails.

As an example, New Zealand Climate Change has shown in two posts (here and here) how “reputable” scientists (Phil Jones, Michael Mann, Jim Salinger, Tom Wigley, Barrie Pittock, Mike Hulme and others – i.e. "the Team") worked to get Chris de Freitas, then an editor of the journal Climate Research, removed from that position, and also worked to try to get him fired from his academic position. His “crime” was to accept for publication, following peer review, a paper that challenged the validity of the initial Mann “hockey stick” paper and its conclusion that the late 20th century was the warmest of the last millennium. (That particular conclusion was later changed by Dr Mann, though much later than these events.)

To explain the heinous nature of this particular activity requires some background, and also some information that has only emerged since. For the sake of brevity I am only going to summarize the story, though adding some detail not in the NZ post.
At the beginning of the story, back in 1997, the state of historical climate science thinking was that between AD 900 and AD 1300 (roughly) the world was going through a warming period, roughly akin to the one we are currently seeing, and known as the Medieval Warm Period (MWP). This was followed by a much colder period that lasted from AD 1400 to AD 1800, known as the Little Ice Age (LIA). Evidence for this can, for example, be found in a study of the sediments laid down in lakes in Finland. In 2005 Mia Tiljander submitted her thesis on these sediments, in which she found, in concurrence with earlier studies of sediments in other lakes in the region, that:
During the Roman period there was in AD 140-220 an 80-year-long period in the Lake Korttajärvi area when organic matter deposition and the sedimentation was similar to that during the Medieval Warm Period (MWP), interpreted as milder climate condition. After this period, a clear mineral matter – organic matter varve structure existed, until the beginning of the MWP.

The MWP, AD 980-1250, was an exceptional period. The MWP is characterized by thinly laminated varves rich in organic matter, almost lacking the mineral pulses (i.e. spring floods), indicating mild climatic conditions. This period was interrupted by a colder period from AD 1115- 1145, dominated by mineral-matter-rich varves. The sediment deposited during the MWP was highly organic and dark brownish in colour. Based on pollen and diatom studies (Kauppila 2002), the MWP was a two-stage event. AD 980-1100 was warm and dry, a cold spell (AD 1115-1145) interrupted the warm trend and the following period AD 1145-1220 was again warm and even drier than the first stage.
In light of subsequent discussion of her work (which comes later) it should be noted that the disturbance of the annual sediment layers (the varves) by human activity occurred only in sediments deposited after AD 1720, i.e. towards the end of the LIA. The picture of the historic climate, as it was then understood, can thus be outlined by a plot from the first IPCC report.

Global Temperature plot from the first IPCC report (IPCC via John Daly )

This picture was challenged with the publication by Michael Mann, Raymond Bradley and Malcolm Hughes of a paper in Nature in 1998 (MBH 1998) that introduced the “Hockey Stick” plot to the world. Contrary to the prevailing opinion this paper suggested that there was a steady decline in temperatures from 1000 AD to around 1850 AD (the handle of the hockey stick), following which temperatures rose steadily to their present high levels (the blade).

The original “Hockey Stick” from MBH 1998 (MBH via John Daly ) (Note the thickness of the green error bars).

This plot made it easier to argue that current temperatures were a direct result of industrial activity, through the generation of increasing levels of carbon dioxide as the world used more fossil fuel. And, as a result, MBH concluded:
Our results suggest that the latter 20th century is anomalous in the context of at least the past millennium. The 1990's was the warmest decade, and 1998 the warmest year, at moderately high levels of confidence.
This conclusion, and the elimination of the MWP and LIA (despite the fact that the curve and conclusion overrode the combined papers of hundreds of scientists who had worked to validate their existence), were seized upon by the Global Warming community and used, without further discussion, as the “New Bible.” The curve was, for example, prominently featured in the 2001 Third Assessment Report of the IPCC. I have seen it used by the current Secretary of Energy as the valid plot of temperatures over the past millennium, and thus as justification for the programs he espouses. And this despite the torrent of valid criticism of the curve, and the fact that the original plot referred only to the Northern Hemisphere.

To sustain the credibility of this plot, the warming of the MWP and the cooling of the LIA had to be minimized. Both the initial set of e-mails and the new set document that the Team recognized this and worked to do it. One immediate challenge was to respond to a paper published in 2003 in the journal Climate Research by Willie Soon and Sallie Baliunas, which had looked at some of the earlier data (data which MBH 1998 had neither considered nor shown to be invalid) and concluded that:
Furthermore, the individual proxies can be used to address the question of whether the 20th century is the warmest of the 2nd millennium locally. Across the world, many records reveal that the 20th century is probably not the warmest nor a uniquely extreme climatic period of the last millennium.
Such effrontery and direct challenge to the authority of the Team could not go unanswered, and the Team swung into action (e-mail 31). It is interesting to note that in that correspondence Phil Jones recognizes the work of Jean Grove, an early climate scientist who reviewed the data validating the presence of the Little Ice Age in an excellent seminal book, and who died in 2001.
What we want to write is NOT the scholarly review a la Jean Grove (bless her soul) that just reviews but doesn't come to anything firm. We want a critical review that enables agendas to be set.
That was not how I remembered the text at all, and so I re-read the beginning of that book; it shows that he was wrong in that statement. Dr. Grove wrote:
Historical evidence of Little Ice Age events is much more plentiful in Europe than elsewhere but the documentation from other continents though scantier, is supported by a great volume of field evidence (e.g. Hope et al 1976, Hastenrath 1984) which is presented in Chapters 7, 8 and 9. It emerges that the Little Ice Age was a global phenomenon and it is shown in Chapter 10 that it was not unique to the Holocene.
But it was not enough just to write a rebuttal paper (which would be normal scientific practice). Although a rebuttal paper was written, with Tom Crowley suggesting that it appear in EOS, this was not considered sufficient. Beginning with an e-mail from Mike Hulme (e-mail 2272), the Team began to focus on the editor, Chris de Freitas, who had accepted the paper.
Whilst we do not know who reviewed the Soon/Baliunas manuscript, there is sufficient evidence in my view to justify a "loss of confidence" in the peer review process operated by the journal and hence a mass resignation of review editors may be warranted. This is by no means a one-off - I could do the analysis of de Freitas's manuscripts if need be.

I am contacting the seven of you since I know you well and believe you may also have similar concerns to me about the quality of climate change science and how that science is communicated to the public. I would be interested in your views on this course of action - which was suggested in the first place by me, once I knew the strength of feeling amongst people like Phil Jones, Keith Briffa, Mike Mann, Ray Bradley, Tom Crowley, etc. CSIRO and Tyndall communication managers would then think that a mass resignation would draw attention to the way such poor science gets into mainstream journals.
Pressure was brought to bear, initially forcing changes in the editorial practices at the Climate Research journal, and then leading to the resignations of the Editor in Chief (Hans von Storch), as well as the editor who had accepted the paper (Chris de Freitas), and two others (Clare Goodess, who was at the University of East Anglia with the CRU group, and Mitsuru Ando) who protested the publication and who were encouraged to resign by the Team (e-mail 4808). It led to an editorial by the publisher, explaining why they could not allow the publication of papers in the journal to be governed by individuals outside the peer-review process (i.e. Team members).

The lack of objectivity of the editors of the journals in which the Team publish (such as Science, for example) is shown by the Team being solicited by those journals to write opposing articles, as a counter to the Soon/Baliunas paper (e-mail 2469).
Phil Jones and I are in the process of writing a review article for Reviews of Geophysics which will, among other things, dispel the most severe of the myths that some of these folks are perpetuating regarding past climate change in past centuries. My understanding is that Ray Bradley, Malcolm Hughes, and Henry Diaz are working, independently, on a solicited piece for Science on the "Medieval Warm Period".
Of course, since then, Mann, among others, has admitted to the presence of an MWP and an LIA, but it is a little late . . .

The Team were still not satisfied, and, as the New Zealand post points out, they suggested that a letter be sent to the head of the university at which Dr de Freitas works (e-mail 3052).
I have had thoughts also on a further course of action. The present Vice Chancellor of the University of Auckland, Professor John Hood (comes from an engineering background) is very concerned that Auckland should be seen as New Zealand's premier research university, and one with an excellent reputation internationally. He is concerned to the extent that he is monitoring the performance of ALL his senior staff, from Associate Professor upwards, including interviews with them. My suggestion is that a band of you review editors write directly to Professor Hood with your concerns. In it you should point out that you are all globally recognized top climate scientist. It is best that such a letter come from outside NZ and is signed by more than one person.
The e-mail goes on to suggest the form of the letter that should be sent:
Instead we have discovered that this person has been using his position to promote ‘fringe’ views of various groups with which they are associated around the world. It perhaps would have been less disturbing if the ‘science’ that was being passed through the system was sound. However, a recent incident has alerted us to the fact that poorly constructed and uncritical work has been allowed to enter the pages of the journal.

A recent example has caused outrage amongst leading climate scientists around the world and has resulted in the journal dismissing (??).. from the editorial board. We bring this to your attention since we consider it brings the name of your university and New Zealand into some disrepute. We leave it to your discretion what use you make of this information.
The strength of the punishment that the Team inflicted on the journal and those associated with the story kept the peer-reviewed papers “in line” for some time, and so it was only this last September (after some 8 years) that it became necessary to force the resignation of another editor, Wolfgang Wagner, to remind the scientific press as to who is in charge here.

Let me, however, end this rather lengthy post with another piece of Team dishonesty. You may remember that I began by quoting Mia Tiljander’s work on Finnish lake sediments. Well, due to an odd circumstance, when this work was examined by the Team the results were inverted. As a result, instead of showing the MWP that actually existed, her results were included as showing that it did not.

This was admitted in the first Climategate release (e-mail 3511) where Darrell Kaufman wrote
Regarding the "upside down man", as Nick's plot shows, when flipped, the Korttajarvi series has little impact on the overall reconstructions. Also, the series was not included in the calibration. Nonetheless, it's unfortunate that I flipped the Korttajarvi data.
In fact that is not a completely true statement either, since, as Mia Tiljander noted, the original data clearly showed an MWP, and if the data were good enough to use inverted to disprove that it existed, surely they should also be used to prove its presence when turned the right way around. But, as Steve McIntyre has noted on the subject, while the original perpetrator may have submitted a correction, the good Dr Mann has yet to admit that he used data improperly.

Courtesy of Climate Audit, you can judge for yourselves, comparing perhaps with the top figure, whether inverting the data made any difference.

How using the Tiljander data properly (New) reveals the MWP and LIA (Tiljander via Climate Audit), while the Team's use (Old) hides them.

In further discussion, the more recent sediments (which Tiljander noted were disturbed), which produced the uptick in the “OLD,” incorrect use, continued to be used by the Team. As Andrew Montford noted in "The Hockey Stick Illusion":
The big selling point of Mann’s new paper was that you could get a hockey stick shape without tree rings. However, this claim turned out to rest on a circular argument. Mann had shown that the Tiljander proxies were valid by removing them from the database and showing that you still got a hockey stick. However, when he did this test, the hockey stick shape of the final reconstruction came from the bristlecones. Then he argued that he could remove the tree ring proxies (including the bristlecones) and still get a hockey stick – and of course he could, because in this case the hockey stick shape came from the Tiljander proxies. His arguments therefore rested on having two sets of flawed proxies in the database, but only removing one at a time. He could then argue that he still got a hockey stick either way.


And a short P.S. Steve McIntyre has just posted that the Team tried the same nasty tricks to try to discredit Willie Soon at Harvard.


Tuesday, November 22, 2011

OGPSS - a second look at natural gas in the Bakken

The recent post on natural gas flaring drew a few comments, when posted on The Oil Drum, that I would like to discuss before moving on to start reviewing the situation in Russia.

Most particularly I want to discuss the change in the regulations that are now imposed in the Bakken, and then go on to review the alternate ways in which that stranded gas might be used. I would like to thank those who commented, particularly benamery21 who pointed me to the set of presentations in a Webinar that covers the change. There was also a comment by aws-classifieds that led to a Wood Mackenzie view of the current situation.

The Wood Mackenzie view mirrored to some extent what Sid Green and Rockman have both been noting about the current economics of the shale gas drilling bonanza. Harry Flashman also commented. The point is that much of the growth in drilling for the natural gas in shales such as the Marcellus is coming from Joint Ventures, and
the pool of potential shale gas investors has expanded beyond experienced onshore North American operators to include companies willing to commit up to several billion dollars to enter and develop plays. It can be argued that they have neither the skills nor experience to efficiently exploit the resource and evaluate selling gas into the most liquid, dynamic natural gas market, alien to their own typically regulated natural gas markets. 



The continued waves of new investors have committed around US$90 billion to exploiting the shale gas plays in North America, throwing a lifeline to the established players who have been able to continue to hold and exploit their lease holdings using their new investors’ money.
The article goes on to note that the service companies have been making the most out of the current efforts, and as a result companies are beginning to move more of the well servicing in house (including hydraulic fracturing). The article notes, as has been highlighted particularly in comments on earlier posts on the topic, that the majority of shale gas plays are failing to break even at current prices. In consequence it anticipates a market correction, following which only the very best shale gas plays (and that will include those with high liquid content) will survive.

On that note, it is worth looking at the prediction by the North Dakota Department of Mineral Resources at that recent Webinar, which Ben pointed to.

Predicted future oil production from the Bakken shale in North Dakota (ND DMR )

Note that this anticipates that production will only rise by perhaps 150 kbd over current levels. This is perhaps a cautionary tale for those who are looking to see sustained production from these fields of up to 1 mbd through the rest of the decade. DMR sees production declining around 2015, though only at around 1% per year, under a sustained drilling program.

The State notes, however, that 7.6% of the energy value coming from the wells, and 4.3% of the economic value, is currently going up in smoke, as over 30% of the gas coming from those wells continues to be flared. Regulations do limit the amount of such gas that is flared, through a control on the amount of oil that can be pumped from a well before it is connected into a gas-gathering network.
Typically, wells are allowed to produce at a maximum efficient rate or MER for a period of time (generally 30-60 days) in order to evaluate the potential of the well and stabilize the production. After such time, the well production is then restricted to 200 bopd (barrels of oil per day) for another 30-60 days, then 150 bopd for an additional 30-60 days, and finally 100 bopd until the well is connected to a gas-gathering system. Further extensions may be granted provided certain conditions are met. But whatever the reason, until that connection is made, the only alternative is to burn the gas.
There is a small catch if the gas continues to be flared. If the company applies for an exemption, it can be relieved of paying taxes and royalties on the natural gas flared for a year, but after that taxes and royalties can be imposed – even if the gas continues to be flared.
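To give a feel for the volumes involved, here is a minimal sketch of what that restriction schedule implies for a single well still awaiting a pipeline connection. The gas-oil ratio, the assumed 400 bopd maximum efficient rate, and the 60-day step lengths are illustrative assumptions on my part, not North Dakota figures.

# A minimal sketch (Python) of how the flaring restriction schedule translates
# into flared gas volume for one unconnected well. The gas-oil ratio, the
# assumed 400 bopd MER, and the 60-day step lengths are illustrative only.
def flared_gas_mcf(days_unconnected, gor_cf_per_bbl=1000, mer_bopd=400):
    """Rough estimate of gas flared (Mcf) by a well awaiting a pipeline hookup."""
    schedule = [(60, mer_bopd),   # maximum efficient rate period (assumed 60 days)
                (60, 200),        # first restriction step
                (60, 150),        # second restriction step
                (10**6, 100)]     # 100 bopd until the well is connected
    total_cf, remaining = 0.0, days_unconnected
    for duration, oil_rate in schedule:
        days = min(duration, remaining)
        total_cf += days * oil_rate * gor_cf_per_bbl
        remaining -= days
        if remaining <= 0:
            break
    return total_cf / 1000.0  # thousand cubic feet

print(f"~{flared_gas_mcf(365):,.0f} Mcf flared in the first year")  # ~63,500 Mcf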

However, one “out” is to use the natural gas to produce electric power, provided that the generator consumes at least 75% of the natural gas from the well. At the same time new gas plants are being constructed, so that, by the end of next year, capacity should be up to around 1,000 MMcf/day. The use of on-site generators was also mentioned in the Webinar, though all the presentations are sequenced, so this appears lower down the site.

Two ways are proposed for effectively using the gas that is currently flared. The first is a modification to the requirement for electrical generators, with a project by Blaise Energy. There is a case study of an existing operation available, in which the company installed a 300-kW generator near one well, and a 340-kW generator by another, in North Dakota. The power is then sold to a local electrical co-operative, and the company now claims agreements to generate 4 MW, with another 15 MW in view.
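As a rough, back-of-the-envelope check on how much otherwise-flared gas generators of that size can absorb (the heat rate and gas heating value below are generic assumptions, not Blaise figures):

# Rough estimate (Python) of daily gas consumption for a small well-site
# generator. The heat rate and gas heating value are generic assumptions.
def gas_use_mcf_per_day(kw, heat_rate_btu_per_kwh=10_000, hhv_btu_per_cf=1_030):
    kwh_per_day = kw * 24
    btu_per_day = kwh_per_day * heat_rate_btu_per_kwh
    return btu_per_day / hhv_btu_per_cf / 1000  # Mcf per day

for size_kw in (300, 340):
    print(f"{size_kw} kW generator: ~{gas_use_mcf_per_day(size_kw):.0f} Mcf/day")
# ~70 and ~79 Mcf/day - a useful but modest fraction of a new well's gas stream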

Generators at a well site, using the natural gas to generate electricity (Blaise Energy ).

The alternative approach, which is being developed by Bakken Express, is to capture and compress the natural gas at the well head, and then transport it either to a pipeline or to an end user. The initial proposal is looking at five wells, and will generate both Compressed Natural Gas (CNG) and Natural Gas Liquids (NGL) at the well, which can then be delivered.

Skid mounted compressors at a well, separating and compressing the gas. (Bakken Express )

The company uses specialized tube containers to hold the CNG, and to move it to the customer.

Truck used to transport CNG (Bakken Express)

Should the fields be larger, and further from a reliable pipeline (Yamal for some reason pops back into mind, and we’ll get there) then it might be possible, as Sid Green has suggested, that the process might be carried one step further, and the gas liquefied.

For the present, however, with the growth of CNG-powered vehicles it is perhaps enough to envisage that, before too long, the carriers shown above might become a more familiar sight pulling into the local filling station.

And with that thought I wish you all a Happy Thanksgiving.


Sunday, November 20, 2011

El Niño, La Niña and Regional US temperatures

Changes in the temperatures in the Pacific that are described as El Niño and La Niña events. (NOAA)

One of the increasingly obvious drivers for the weather we’re likely to see over the next year comes from the relative state of the temperatures of the Pacific Ocean. This is generally known as the El Niño effect when the waters are warmer, and La Niña when they are cooler than normal, as shown above. Collectively the state is measured by the Oceanic Niño Index (ONI). The effect, over the past 60 years, can be compared with the regional temperatures over that period, derived in an earlier post. Remember that the regional plots were artificially shifted vertically so that the changes in their shapes could be compared.

A comparison of average station temperatures in the Pacific states (red), mountain states (dark green), Midwest (light green) and the Atlantic states (purple), with the plots separated by a block change in the temperature values for each region, and compared to the Pacific Oceanic Niño Index (ONI), the lower blue filled-in curve.

The relative importance of the ONI to temperatures in different parts of the world (particularly Texas), and the derivation of the plot, follow below.

Considering the volumes of water that are involved, the relative changes in temperatures are quite significant. Consider the current situation, where, as we head into a La Niña winter, the temperatures across the Pacific are as much as 1 deg C below normal.

Temperature variations from the average across the Pacific (Australia is in the lower left, the USA on the upper right). (NOAA )

There is a growing recognition that these changes control the monsoons in India, going back to the paper published in Science back in 2006 by Kumar et al. This paper explained why monsoon failure always happened with El Niño events, even though not all El Niño events led to monsoon failure. There are thus “flavors” to the events and their results, depending on whether they happen in the winter or the summer. The typical global impacts are shown with these illustrations, which show how the timing has influence:

First El Niño:

The changing influence of an El Niño event, depending on timing (after Kumar et al)

India only becomes dry in the second of the two cases, with the monsoon months occurring between June and the end of August, i.e. the lower condition. (And there is also an Arizona Monsoon which might be one of the few areas of the US to be immediately impacted in that case).

With La Niña, the conditions change, and with the event occurring in the summer India gets the needed rain.

The impacts of a La Niña event, depending on timing (after Kumar et al )

The condition that we are moving into at the present is the upper of these latter two pictures, which is not good news for the folk in Texas who have been hoping for rain. NOAA is now predicting that the drought will last into the summer and high temperatures will continue through 2012.

Temperature projections for the next year (NOAA)

Precipitation projections for the next year (NOAA )

The impact of these events can, therefore, be quite severe. However, there is a continual swing from one condition to the other, hence the broader description of the phenomenon that has come into vogue, that of the El Niño Southern Oscillation, or ENSO, or more generally the Oceanic Niño Index (ONI). And the United States sees different impacts in different regions of the country. For example, consider the contrast between conditions on the East Coast and in the West:

Change in snowfall on the East Coast as a result of ENSO (NOAA)

Change in snowfall in the West as a result of ENSO (NOAA )

There has been some debate in the blogosphere as to what effect this has all had on US temperatures. Well, there are a couple of different points that have to be considered in that discussion, points that seem to have been lost in some of the “religious” aspects of the extreme ends of the debate. The El Niño has been partially blamed for the high global temperatures in 1998. However, if we take just the last 60 years of data, we can look at how those data fit with the plots of regional temperatures that I have previously posted to the site.

The ONI plot from 1950 (GGWeather )

Now how does this look relative to the temperature changes that have occurred in the US? There are a couple of things to bear in mind here, things that seem lost to the folk at Real Climate. The first is this graphic, which seems to have dropped from the view of most of those who looked at the recent release of pre-papers from the BEST study.

Map of stations in and near the United States with at least 70 years of measurements; red stations are those with positive trends and blue stations are those with negative trends. (The BEST Project).

The second point is the change in temperature profiles (which BEST does not look at, rather concentrating on individual results) for the different regions of the US.

Average variation in time for four regions of the country, with the results adjusted as shown to separate the curves, and show them in order (bottom to top) from West to East.

In the above plot the regional temperatures were separated by adding and subtracting fixed offsets from the actual temperatures (as shown in the legend); the section of the plot after 1950 can then be used to make the comparison.

The ONI plot can now be superimposed below this, with the amplitudes of roughly the same size (since one is in Centigrade and the other in Fahrenheit). The result shows that the ONI (the lower, filled blue plot) has some influence on the West Coast temperature (the lowest red curve), less as one moves through the Mountain states (dark green) and the Midwest (lighter green), with the effects smoothed out by the time they reach the Atlantic Coast (upper purple). The fall in temperature along the Atlantic Coast is emphasized by this plot, and clearly has nothing to do with what is happening in the Pacific.
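For readers who want to reproduce this kind of overlay, here is a minimal sketch of the procedure, assuming the regional averages and an annual ONI series are already in hand; the file names, column names and offsets are hypothetical, not the ones used for the figure.

import pandas as pd
import matplotlib.pyplot as plt

# Region -> (column name in a hypothetical data file, vertical offset in deg F)
regions = {
    "Pacific":  ("pacific_F",   0.0),
    "Mountain": ("mountain_F",  5.0),
    "Midwest":  ("midwest_F",  10.0),
    "Atlantic": ("atlantic_F", 15.0),
}

temps = pd.read_csv("regional_temps.csv", index_col="year")   # hypothetical file
oni = pd.read_csv("oni_annual.csv", index_col="year")["oni"]  # hypothetical file

fig, ax = plt.subplots()
for name, (col, offset) in regions.items():
    series = temps.loc[temps.index >= 1950, col]
    ax.plot(series.index, series + offset, label=name)        # offset each region

# Draw the ONI as a filled trace beneath the temperature curves
ax.fill_between(oni.index, oni - 10.0, -10.0, alpha=0.4, label="ONI")
ax.set_xlabel("Year")
ax.set_ylabel("Offset temperature / ONI")
ax.legend()
plt.show()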

A comparison of average station temperatures in the Pacific states (red), mountain states (dark green), Midwest (light green) and the Atlantic states (purple), with the plots separated by a block change in the temperature values for each region, and compared to the Pacific Oceanic Niño Index (ONI), the lower blue filled-in curve.

Now rumor has it that something similar happens in the Atlantic?


Saturday, November 19, 2011

More pipelines reverse flow as US demand falls

A recent post covered the plan to reverse the flow of oil along the pipeline that runs, sensibly, from Detroit to Portland, ME (or, if you are Canadian, since it actually runs most of the way through Canada, from Sarnia to Montreal and beyond). The move is one way of distributing the increasing flows of oil that are being extracted from the oil sands of Alberta, to take advantage of existing pipelines and to meet distributed demand.

And while the plans for the Keystone pipeline are currently on hold, there is another pipeline which is also being reversed. This is the Seaway pipeline, which runs from Freeport, TX to Cushing, OK. It can carry some 400 kbd of oil, and the intent is to have the reversal completed by 2013. This will get around some of the bottleneck that has been building at Cushing, and move more crude down to the refineries along the Gulf. The hope is that it will allow domestic oil to replace some of the crude that is currently imported to those refineries, as domestic production rises to an estimated 6.4 mbd by 2016. (With that additional rise of around 0.7 mbd from Alberta still, hopefully, coming down to the Gulf Coast as well, by 2015.) It is expected that the first 150 kbd may flow by the second quarter of next year.

There are also plans, by 2013, to reverse the flow of the Longhorn pipeline, which runs between El Paso and Houston, and increase its capacity to 225 kbd, so that it carries Permian Basin oil to the Houston refineries.

Changes planned for the Longhorn pipeline (Magellan)

As these moves are planned, it is worth noting that the demand for gasoline has been falling in the US. This is evident from the latest EIA TWIP. The fall has been as much as 500 kbd, which is significant.

Changes in U.S. gasoline demand (EIA TWIP )

It has even caught the attention of OPEC, in their latest (November) Monthly Oil Market Report, which contrasts the fall over two years, showing the drop from pre-recessionary times. (The peak was 9.6 mbd). OPEC sees overall US demand falling 0.11 mbd in 2011, and only picking up by that amount in 2012.

Fall in summer driving demand for gasoline (OPEC Nov 2011 MOMR)

In passing, the MOMR also notes that Russian production of passenger cars rose 46% y-o-y, which may mean that export volumes will fall more rapidly than currently anticipated. Oil consumption in the countries of the FSU is reported at 4.2 mbd, a gain of 0.1 mbd y-o-y. (Sales of U.S. vehicles were up by 10%.) OPEC's current projection, recognizing the economic problems of Europe, is that world demand will grow to 89 mbd next year, with half of the 1.2 mbd increase coming from Asia, and 0.48 mbd coming from China.

Turning back to the change in driving habits in the United States, the recovery – as evidenced by vehicle miles travelled - faltered and fell back earlier this year, showing the inherent problem still resident in the U.S. economy.

Rolling 12-month average of the vehicle miles travelled on all roads in the United States, through September 2011 (FHWA )

Current declines are seasonal and seem to be following the patterns of previous years.

Urban travel on US highways, by month (FHWA )

However there are reports of healthier numbers now coming out of the U.S. economy, and so hopefully, these trends will turn around.


Monday, November 14, 2011

The Icelandic volcano at Katla is becoming yet more active

I realize that any claims I have as a prophet have faded, as the ash that lies on the Myrdalsjokull glacier from the Eyjafjallajokull eruption last year failed to be blown away under the summer sun. The underlying volcano, Katla, while the focus of the earthquake activity in the region, has done little more than produce a relatively small jokulhlaup (or glacial flood) or two. But I can’t help stirring that old pot at least one more time, because the earthquake activity at the site remains focused in the caldera, and there has been a recent increase in the frequency with which magnitude 3+ quakes have occurred there. I was checking on the one that happened last week, and noticed that there have been two new ones since I last looked.

Earthquakes in the last 24 hours around the Katla volcano in Iceland (Icelandic Met Office)

UPDATE: The Icelandic Met Office has another page, which shows the quake map in a different mode, and also gives the frequency over the past day. It currently shows an interesting pattern for the quakes of the last 24 hours.

Earthquakes around Katla in the last 24 hours (Icelandic Met Office ) The line follows that of the perimeter of the caldera (the black hatched line).

Note that this site also shows the corrected magnitudes of the earthquakes, while the earlier site gives an automated assessment. And, as an update, there are subsequent tremors that are falling along the same boundary.

The larger earthquakes are also increasing in size, these two being a 3.8 and then a 3.3, occurring about 3 km apart, which, together with the intervening activity (the red dots, including a 2.4), suggests that there just might be a fissure opening. (The now corrected values are given in the lower figure.)

Three days ago Jón Frímann noticed signs from his geophones indicating that a new dike had intruded into the caldera. The current activity is a little north of that event; it is also very shallow, with the first and larger quake at 2.5 km depth, while the subsequent ones are down at around 1.1 km.


OGPSS - gas flares, their significance in Russia

Over the weekend I went to a talk on the promise of shale oil and gas, given by Sid Green, a friend and one of those members of the National Academy with Washington influence in regard to the future of the fossil fuel business. (He appears much more a Yerginite than a follower of Matt Simmons, as was evident from his conclusion that the fuels from the shale deposits of the country will be our short-term savior, a proposition that I have provided some evidence to doubt.) However, it was in his introduction by Joseph Smith, the new Laufer Chair of Energy at MS&T, that a slide appeared that usefully prefaces where this, the Tech Talk series, will go next. This is the slide:

A poster from NOAA showing the light emitted at night from city lights (white), fires (red), boats (blue) and gas flares (green). (The picture was put together over the period from January to December 2003.)

It was that large green blob sitting just below the Yamal Peninsula in Russia that caught my eye. It shows the volume of natural gas in Russia that is being flared off because it is stranded, i.e. there is currently no way to ship it to market.

UPDATE: I have changed the end of the post to reflect a written answer to my question from Sid Green.

Interestingly, given that preponderance of flaring in Russia, one can also go to a paper given at a Russian meeting in which the ubiquity of gas flaring operations around the world is more evident.

City lights and gas flares around the world, from data collected in 1994-95 (city lights in grey, flares in red).

As I noted in a recent post on developments in the Bakken shale, up to 30% of the natural gas that is being produced with the oil is being flared at the moment because there is no way of getting it to market. (And in 1994, note the amount being flared in the North Sea.) It is not just a problem for large wells. For many years I drove between Rolla, MO and Crane, IN, spending the night in Vincennes, usually arriving late, with my drive through Eastern Illinois illuminated by flares from the small stripper wells along the way. And it is possible to see flares from the rigs operating in the Gulf of Mexico.

Gas flares illuminating the night in the Gulf of Mexico (NOAA)

In the past Gregor Macdonald has also documented the flares that are found off the coast of Nigeria. And there was some suspicion that these represented the greatest volume of gas being burned off in this way.

Flares around Nigeria, color coded by duration. Those active in 2006 and 2000 are yellow. Those active in 2000 but not 1992 or 2006 are green. Those active in 1992 but not 2000 or 2006 are blue. (NOAA)

A 2007 survey, carried out by NOAA for the World Bank, showed that Russia was burning roughly twice the volume of gas as that lost in Nigeria. A close look shows how the plumes from the flares dominate the Siberian night-time sky.

Thermal plumes from gas flares in Siberia

The NOAA report indicates that around 160 billion cubic meters (bcm) of gas is flared each year, roughly a quarter of the volume of natural gas that is used in the United States each year. And while countries such as Nigeria have been able to reduce the amount that is flared, countries such as Russia, Kazakhstan and Iraq have increased the volumes flared. (The report also explains how the images above were generated.) The region of Russia with the most gas flaring is that around Khanty-Mansiysk, which accounts for roughly half the Russian total. In 2007 a conference on the subject heard that Russia was flaring around 50 bcm per year, with Khanty-Mansiysk contributing 24 bcm of this total. (The gas is flared because this is currently where about half of Russia’s oil production is coming from.) At that time the goal was set that, by the end of this year (2011), some 95% of this natural gas should be utilized. There have been a variety of ways suggested to reach that goal. Again, putting the volumes in context, Russia commercially produced some 600 bcm of natural gas in 2006, as well as some 10 million barrels of oil a day.
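As a quick sanity check on those quoted volumes (the US annual consumption figure of roughly 650 bcm, about 23 Tcf, is my own assumption for the period, used only to test the “roughly a quarter” comparison):

# Quick check (Python) of the flaring volumes quoted above. The US consumption
# figure is an assumption (~650 bcm, roughly 23 Tcf per year), not from NOAA.
global_flared_bcm = 160
us_consumption_bcm = 650
russia_flared_bcm = 50
khanty_mansiysk_bcm = 24
russia_produced_bcm = 600

print(f"Global flaring vs US use: {global_flared_bcm / us_consumption_bcm:.0%}")                    # ~25%
print(f"Khanty-Mansiysk share of Russian flaring: {khanty_mansiysk_bcm / russia_flared_bcm:.0%}")   # ~48%
print(f"Russian flaring vs its marketed production: {russia_flared_bcm / russia_produced_bcm:.0%}") # ~8%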

Flaring around the Khanty-Mansiysk region south and east of Yamal. (World Bank)

At present Russia has reached a new peak in crude oil production of some 10.34 mbd for October, while Saudi production is estimated to have risen to 9.8 mbd. Russia is thus currently the largest oil producer, and so it is time to look at where (other than just the region shown above) the oil is coming from, and what the prospects are for the longer-term production and export of energy from that country.

The natural gas production picture is not quite that rosy, even with the reduction in gas flaring that has been undertaken. Gazprom is reported to have reduced supply as prices in Europe have risen towards $15 per kcf (thousand cubic feet), almost four times the price of gas in the United States. Russia as a whole produced some 1.8 bcm/day (about 63 bcf/day), of which Gazprom produced 1.35 bcm/day, both figures down from the same time last year. At the same time domestic consumption of natural gas has risen by some 1.3 bcf/day. Russia supplies about a quarter of European demand, and as production falls off in some of the fields of Western Europe that portion may increase. However, the global supply of natural gas is still quite healthy, with countries seeking to find domestic sources in the gas shales that might lower their import needs. Thus the power that Gazprom was able to wield just a couple of years ago has now been somewhat reduced.
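Since that paragraph mixes metric and imperial gas units, a trivial helper keeps them straight (1 cubic meter is about 35.3 cubic feet):

# Conversion helper (Python) for the mixed units quoted above.
CF_PER_M3 = 35.31  # cubic feet per cubic meter

def bcm_to_bcf(bcm):
    return bcm * CF_PER_M3

print(f"1.80 bcm/day = {bcm_to_bcf(1.80):.1f} bcf/day")  # ~63.6 bcf/day
print(f"1.35 bcm/day = {bcm_to_bcf(1.35):.1f} bcf/day")  # ~47.7 bcf/day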

All of these factors strengthen the conclusion that this series should now move to look at some of the fields in Russia, and given that Dr Yergin has proved to be a better historian than prophet, that probably means that I should go away and re-read The Prize before it starts. (Though The Quest is an easier read.) After all, one wonders how many of us, a week ago, could have found Khanty-Mansiysk on a map?

Location of Khanty-Mansiysk on a map of Russia (Google Earth)

In passing, Secretary Salazar has just announced that permitting will allow the Natural Buttes Project to move forward. Anadarko are expected to develop up to 3,675 wells in the Uintah Basin over the next decade to supply more cheap natural gas into the market, and likely keep the price pressure on the production of gas from gas shales. Which brings me full circle back to the opening of the post, since the question that I asked Sid at the presentation was “How long can the gas shale companies afford to sell their gas at under $4 per kcf, when it is costing them more than this to produce it?”

UPDATE: I had a short snippy version of Sid's reply here, but he was kind enough to send a critique and deeper explanation, which gives a better answer. To summarize his answer, he feels very much in agreement with the points that Rockman has been making in comments on a number of my posts on this topic.

He notes that financing for the E&P companies has recently largely come from "venture capital" money. The companies are able to recover much of their own capex in the first months, but he was careful to note that this did not imply that they are able to make a profit. And with that return they are able to continue on the “tread mill” of drilling another well, and another . . . .

He quoted costs, and noted that a recent WSJ article said that Pioneer was reported to have costs of around $2.48 per kcf. Though, if I can interject, they are drilling the Eagle Ford, and thus the costs attributed to the gas may be lower, since they are, as he notes, making most of their money from the associated liquids. (And Rockman is not that excited about the general situation down there.)

However, he thinks Haynesville costs are up to $3.50 and Marcellus up to $4.00 or more per kcf. As a result the number of rigs might soon start falling, though he has hopes that, with some technical improvements, the cost figures might come under better control, and he senses that others in the industry are also anticipating greater production at lower cost from some of the better ideas that are being developed. These relate (HO thinks) to better control of the fracture paths induced out into the formations. But without much change, operations will move over to more liquid-productive areas, and the natural gas situation will not be sustainable as it is.

As an additional side comment, there are now animated maps of the Barnett, Bakken and Eagle Ford plays, showing the wells drilled each year, and the production totals, under the Shale Play Development History Animations section of the EIA map page. (H/t this week's TWIP, which is on the Bakken and Eagle Ford.)


The Keystone pipeline - A Canadian response

Well that didn’t take long! (With a hat tip to Art Berman.) I mentioned at the end of the last post that the loss of the oil market through postponement of the Keystone pipeline would not sit well with Canada. The delay in the pipeline removes, at least temporarily, sales of some 600 kbd of Albertan crude from the Canadian inventory, with the concomitant losses to provincial and national government revenue.

Speaking in Hawaii, the Canadian Prime Minister (who met with the Chinese President while on the island) said:
“This does underscore the necessity of Canada making sure that we’re able to access Asian markets for our energy products, and that will be an important priority of this government going forward,”
And to emphasize the point, the Canadian Finance Minister has noted that the pipeline plan itself might not survive the delay, since TransCanada Corp, which was to operate the pipeline, has to retain its customers, even as the Finance Minister flies to China to market energy exports, and perhaps sell the available supply there. And in the broader scheme of things:
Canada will seek to join a new Asian trade block as it tries to increase energy exports to the region following the U.S. decision to delay approval of TransCanada Corp.’s $7 billion Keystone XL pipeline, Prime Minister Stephen Harper said.
To recap from the earlier posts in the OGPSS series on Canadian pipelines, there are plans by Enbridge to build a “Northern Gateway” pipeline from the oil sands to the West Coast, and this has already found support from China. Given the irritation that the United States has just provided the Canadian Government, it is worth repeating the comment by the President of Enbridge:
I challenge any of you to name one other country in the world that only has one market for its largest export. Right now our most valuable resource is landlocked in North America and isolated from the world market. That means it is often isolated from world price. The August spread between West Texas Intermediate and Brent Crude, the world price, was $22 per barrel. Canadian heavy crude has more often than not sold at a discount to U.S. light crude that goes well beyond the quality differential – simply because of lack of market.
It should not be forgotten in this debate that the greatest objection to the pipeline seemed to come from Nebraska. Lest you forget, Nebraska is among the largest producers of corn ethanol west of the Mississippi River. It has 34 ethanol plants, converting 769 million bushels of corn a year into 2 billion gallons of ethanol (the equivalent of 130 kbd out of the national production of around 900 kbd).
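The conversion behind that bracketed figure is simple enough to check (42 US gallons to the barrel):

# Quick check (Python) of the ethanol arithmetic: 2 billion gallons per year
# expressed as thousand barrels per day.
GALLONS_PER_BARREL = 42

def gal_per_year_to_kbd(gallons_per_year):
    return gallons_per_year / GALLONS_PER_BARREL / 365 / 1000

nebraska_kbd = gal_per_year_to_kbd(2.0e9)
print(f"Nebraska ethanol output: ~{nebraska_kbd:.0f} kbd")             # ~130 kbd
print(f"Share of ~900 kbd national output: {nebraska_kbd / 900:.0%}")  # ~14%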

Official map of the planned pipeline from the Department of State.
TransCanada Keystone Pipeline, LLC (Keystone) is proposing to construct and operate a crude oil pipeline from Hardisty (Alberta), Canada to Patoka, Illinois (view map of project). The pipeline will be able to deliver 435,000 barrels per day (bpd) of crude oil to existing terminals in Missouri (Salisbury) and Illinois (Wood River and Patoka). The system capacity could be expanded in the future up to 591,000 bpd.

The proposed project includes 1,073 miles of new pipeline in the U.S. (Keystone Mainline). The Keystone Mainline will be comprised of 1,018 miles of 30-inch diameter pipeline from the Canadian border to Wood River, Illinois and 55 miles of 24-inch diameter pipeline from Wood River to Patoka, Illinois. Keystone may also construct an additional pipeline segment from near the Nebraska-Kansas border to Cushing, Oklahoma consisting of 291 miles of 30-inch diameter pipeline.



Thursday, November 10, 2011

OGPSS - A review of North American future production

This is a good point to recap the intent of the current series of OGPSS articles. Internally, this marks the end of the segment that has dealt with North American oil and natural gas production. The series is not an in-depth review of individual fields but rather seeks to paint a broad background against which to judge the realities of the coming changes in the global supply:demand balance. More detailed discussions are provided by the excellent analysis of such folks as Jean Laherrère, who is even now examining in more detail the production records from the wells in the deep waters of the Gulf of Mexico.

Perhaps more critically, it comes when there are increasing indications that the investment being placed in renewable energy around the world, as a replacement for fossil and nuclear power generation, might not be adequate to meet the need. For example, this week the British Institution of Mechanical Engineers has pointed out to the Scottish Parliament that its renewable targets of 20% of total energy and 100% of electricity production by 2020
did not appear to be supported by a rigorous engineering analysis of what is physically required to achieve a successful outcome in the timescale available.
It is relatively easy for politicians to promise that energy supply is adequate into the foreseeable future. Euan Mearns, for example, has repeatedly written, and guest-hosted, articles on The Oil Drum which show that supply requires infrastructure and planning over many years, without which (and the absence is evident) those promises become not only unrealistic, but also unattainable.

Just as I wrote this, the White House announced that the decision on the Keystone pipeline would be postponed until after the election next year. In a sense the Administration view echoes the opinion of one of those opposed to the pipeline:
“It’s not like we’ll have an oil shortage,” said Mark Lewis, a partner at the law firm Bracewell & Giuliani who specializes in oil and gas pipelines. “We’ll continue to get product from the sources we get product from.”

If Canadian tar-sands oil moves to Asia, Lewis said, then the U.S. would just continue getting its oil from regions like the Middle East. And the Canadian oil sands would most likely still be developed, with that product entering the same global market.
This series is testing the truth of those remarks and that opinion by looking at the realities of future production and the oilfields that it will come from. As mentioned above, this began with a look at the resources of North America, and it is time to review what was found.

When this series began last April it set out to look at the potential future changes in the North American oil and natural gas supply; there were three different parts to that supply. The first is the historic oil, that coming from existing wells and fields, often from stripper wells around the country. Then there were the supplies from fields that are currently being developed, and where production is continuing to rise, such as in the Bakken fields of North Dakota and Montana. And then there is the potential that will come from finding and developing new fields in the relatively near term. Perhaps as an indication of my own “Oops” moment, I used the Energy Plan proposed by Governor Perry to illustrate the potential future additional fossil fuel that might be available over the next two decades. That was very similar in its promises to a Wood Mackenzie report prepared for API and released in September. To give my comments on those projections, I will look first at the summary of projected increases in oil production from that report. This is divided into two parts; the first assumes that current policies continue, and it sees energy supply growing as follows:

Total US Production – using current conditions. (Wood Mackenzie )

The plot shows an increase in oil production from 2010 to 2030 of some 1.2 mbd, which in the overall scheme of things is not really that much change from the present over the twenty-year interval (0.7% per year). Natural gas, on the other hand, is expected to rise by some 14.4 bcf/day (1.2% per year), which is a little more impressive.
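As a check that the quoted per-year rates and 20-year gains hang together (assuming the percentages are compound annual rates, which is my reading rather than Wood Mackenzie's stated method), one can back out the 2010 base production they imply:

# Back out the 2010 base production implied by the quoted gains and growth
# rates, assuming simple annual compounding (an assumption on my part).
def implied_base(gain, annual_rate, years=20):
    return gain / ((1 + annual_rate) ** years - 1)

print(f"Implied 2010 oil base: {implied_base(1.2, 0.007):.1f} mbd")      # ~8.0 mbd
print(f"Implied 2010 gas base: {implied_base(14.4, 0.012):.0f} bcf/day") # ~53 bcf/day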

However, if all the wishes of the industry were to come to pass, (the development policy case) then Wood Mackenzie sees considerable potential for supply growth.

Production with an accelerated development program (Wood Mackenzie )

In that developed condition Wood Mackenzie sees oil production increasing by 7.6 mbd, to 15.4 mbd, by 2030, while natural gas production rises by 36.8 bcf/day, to 96.9 bcf/day.

The first concern that I have with these plots is that of the declining reserve. In previous posts I have discussed various estimates for the decline in production that occurs in existing wells over time. This occurs with both oil and natural gas wells, and changes with rock type, well type and other factors. For example, a recent post by Fractracker on the performance of 756 Marcellus wells notes that horizontal gas wells declined an average of 39% in the last year, while vertical wells declined 47.6%, though those numbers do not reflect wells that were closed because they were no longer producing (16.7% of the horizontal wells and 6.9% of the vertical wells). Figures from other gas shales have reported decline rates in the Haynesville, for example, of 85%. In conventional oil wells (and reservoirs such as the Bakken are, for this discussion, considered as unconventional) the decline rates now lie above 5%. What this means for US production is that, just to stay even, the industry has to add at least 360,000 bd of new production each year to maintain a roughly 7.2 mbd level of total oil production. (And that is likely to be, at best, the average gain in production over the next ten years from the Bakken and Niobrara combined.)
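The replacement figure is simple to verify from the numbers quoted above:

# Quick check (Python): new production needed each year just to offset a 5%
# decline on roughly 7.2 mbd of existing production.
total_production_mbd = 7.2
decline_rate = 0.05

replacement_kbd = total_production_mbd * decline_rate * 1000
print(f"~{replacement_kbd:.0f} kbd of new production needed per year")  # ~360 kbd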

So where does Wood Mackenzie see the gains in production from the “development” case?

Regions where additional production may be obtained (Wood Mackenzie )

The problem that the map shows is that much of the new discovery and development is anticipated to come from the offshore, whether it is East Coast, West Coast, Gulf of Mexico or Alaska. In comments on the post discussing drilling off the Atlantic coast, my quote on it taking four to ten years for production following the start of leasing led to kindly admonishment by Art Berman, Harry Flashman and Rockman on the overly optimistic view that this represented, with the reality of development being more likely to occur over decades. This holds even more true for resources off the Alaskan coast, where the on-land infrastructure is not currently in place. And it fails to recognize the antagonism that state governments (such as those I mentioned from New Jersey) have to offshore drilling on both coasts.

The current increase in oil production in the United States is, in part, due to the development of long horizontal wells, with multiple fractures along those wells to improve flow to economic levels from shales that were previously uneconomical to drill. As long as enough production can be achieved from such wells to keep them viable for the oil industry, they will be developed. When they cease to be viable, they won’t. The rapid decline rates that are found for these wells (as was shown in the posts on the Bakken and Niobrara) mean that, as production moves from the sweeter spots in the reservoirs out to the leaner fringes, a point will be reached, in a few years, where those economic factors will come into play. As that time approaches, the price of oil (for a variety of reasons) will be sufficiently higher than it is today to encourage greater overall production than might be anticipated from today’s figures, but it will be nowhere near enough to meet the levels of US demand. And those economics will still curtail overall production.

The gain in production that Wood Mackenzie sees in moving to the “development phase” does not really kick in, were all things to come to pass, until about 2016. And even then the main initial gains are assumed to come from the easing of regulations, which is, I rather suspect, a wish, rather than any reflection on what might realistically happen.

Possible gains in future production over the current condition (Wood Mackenzie )

The delay in granting permission for the Keystone pipeline will shift some 700 kbd of US supply out another year at least. But that does bring up the issue of the oil sands. They are the one source where the deposit size is known and adequate, and where gains to help meet demand could be achieved. However, with the delay and possibly even denial of the pipeline, it is now quite possible that the Canadian Government and the other parties concerned might hear again the same blandishments from China that persuaded the Turkmen, and when we finally get around to giving grudging permission for Canada to sell us more oil, that additional tar sand oil may already be heading overseas through pipelines running west.

Whether, at that point, we can then
continue to get our oil from regions like the Middle East
is a point that I will look at as this series continues, and we start to examine the long term potential of sustained oil supply from regions overseas.

Oh, and a quick P.S. for those who remember the post on the Alaskan pipeline: in August it was running at an average of 538,623 bd, with the annual average now running at 568,471 bd.


Friday, November 4, 2011

OGPSS - Drilling off the Atlantic Coast

When the Presidential Administration changes, particularly when that change involves different political parties, the effects on energy policy, and ultimately on energy availability and price, can be, but are not always, significant. In recent posts I have cited Governor Perry’s Energy Plan, were he to come to power, and have noted that, for some issues, the change in power may not have much effect. This is particularly true where the energy reserve is already being produced, with production levels being, to a degree, controlled by things such as price, rig availability and the potential promise of a well.

But there are some regions of the country where the policies of the Government can have a greater impact on potential production. And the last region that I have left to discuss of those included in the Governor’s Plan, that of the Outer Continental Shelf (OCS), and particularly that off the Atlantic seaboard, is one such. It is not, however, only Governor Perry who recognizes the potential promise of the region. This is a theme that ex-Speaker Gingrich, another Presidential candidate, has also sounded. His remarks were directed at the benefits of dredging Charleston’s port, thereby supporting the OCS activities and potentially adding 8,800 jobs in South Carolina.

The most likely first area to be leased is that off the coast of Virginia, which was designated as Lease Sale 220.

Region designated for Lease Sale 220 by the Department of the Interior, in November 2008. (Bureau of Ocean Energy Management, Regulation and Enforcement )

At the time it was estimated that the area might contain 130 million bbl of oil, and 1.14 Tcf of natural gas.


President Obama had continued that support, proposing to open the Atlantic Seaboard to drilling back in March 2010. Unfortunately, while the initial plan had been for lease sales to begin off the coast of Virginia this year, that was put on hold after the Deepwater Horizon disaster.

Following the sealing of the well in the Gulf, and after due review, the Department of the Interior released a statement last December that noted, relative to the Atlantic and other locations:
Based on lessons learned from the Deepwater Horizon oil spill, the Department has raised the bar in the drilling and production stages for equipment, safety, environmental safeguards, and oversight. In order to focus on implementing these reforms efficiently and effectively, critical agency resources will be focused on planning areas that currently have leases for potential future development. As a result, the area in the Eastern Gulf of Mexico that remains under a congressional moratorium, and the Mid and South Atlantic planning areas are no longer under consideration for potential development through 2017. The Western Gulf of Mexico, Central Gulf of Mexico, the Cook Inlet, and the Chukchi and Beaufort Seas in the Arctic will continue to be considered for potential leasing before 2017.
However
Because the potential oil and gas resources in the Mid and South Atlantic are currently not well-known, Interior will move forward with an environmental analysis for potential seismic studies in the Mid and South Atlantic OCS to support conventional and renewable energy planning. No lease sales will be scheduled in the Atlantic in the 2007-2012 program or in the 2012-2017 program.

The current emphasis has been on the Mid-Atlantic States, particularly Virginia and South Carolina, where there is support for the idea. In contrast, New Jersey has expressed concern over the risks of offshore drilling in the past and Governor Corzine was particularly outspoken against offshore drilling or leasing acreage, feeling that it would threaten the tourism and fishing industries of the state. Governor Christie (who replaced Governor Corzine) has retained that opposition to drilling off New Jersey, but is apparently not as passionate about states further south.

The OCS was first surveyed by the USGS back in 1974, and from 1976 some 40 wells were drilled, though none found commercial quantities. President Carter approved a 5-year leasing plan in 1980 that saw an additional 50 leases sold over the next two years. But in 1990 President Bush withdrew the Atlantic OCS from lease potential. That action was in the process of being reversed at the time of the GOM oil spill. (Linda Bennan)

The possible change in the party that holds the Presidency might therefore influence the opening of leases to develop the Atlantic OCS. Given, however, that it was a Republican President who withdrew the acreage in the past, and that it has been advanced by Democratic Presidents, that is not necessarily a conclusive indicator, particularly given the objections of some of the Republican Governors.

But even if the decision is made to go ahead, the EIA considers that it will take at least four years from the sale of leases until production from any identified reserves can begin, and in some cases such an interval is considered optimistic. (Art Berman has pointed out that the sequence of first carrying out new seismic work, then drilling an initial well, followed by appraisal wells if that first well were to be successful, and then the construction of production platforms, would likely move the timeline out closer to ten years.) Estimates of the total oil and natural gas available vary quite widely. The BOEMRE thinks that there is 0.5 to 1 billion barrels of oil in the Mid Atlantic and 0.03 to 0.15 billion barrels in the South Atlantic. But with the slow rates of development it is unlikely that the production will exceed the estimate of the EIA, which foresaw, some years ago, that this new OCS production might possibly do no more than stabilize production of offshore oil.

Projected OCS production (2007) (EIA )

It was not expected to stabilize offshore natural gas production.

Lower 48 offshore NG production with and without increased OCS (EIA )

In which regard, the model that the EIA use in predicting production from these fields is:
For currently producing fields, a 20-percent exponential decline is assumed for production except for natural gas production from fields in shallow water, which uses a 30-percent exponential decline. Fields that began production after 2008 are assumed to remain at their peak production level for 2 years before declining.
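Here is a minimal sketch of how that stated model might be applied to a single new field; the plateau-then-decline reading, the year-on-year form of the decline, and the example peak rate are my assumptions, not EIA code.

# Minimal sketch (Python) of the decline assumptions quoted above: a 20% per
# year decline for producing fields (30% for shallow-water gas), with new
# fields held at their peak rate for two years before declining. This is my
# reading of the EIA text, not their actual model.
def production_profile(peak_rate, years, decline=0.20, plateau_years=2):
    """Yearly production rates for a field that starts at its peak rate."""
    profile = []
    for year in range(years):
        if year < plateau_years:
            profile.append(peak_rate)                                    # held at the peak
        else:
            t = year - plateau_years + 1
            profile.append(peak_rate * (1 - decline) ** t)               # year-on-year decline
    return profile

# Example: a hypothetical field peaking at 100 kbd, followed for 8 years
print([round(q, 1) for q in production_profile(100.0, 8)])
# [100.0, 100.0, 80.0, 64.0, 51.2, 41.0, 32.8, 26.2]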

In passing, lease sales are anticipated to restart in December for the Gulf of Mexico, in waters ranging from 16 to 10,900 feet deep. The prospects are considered to have the potential for the production of between 200 and 450 million barrels of oil, and 1.5 to 2.65 Tcf of natural gas. However, minimum bids for the acreage have risen from $37.50 to $100 an acre.
(H/t Gail and Art)
