Sunday, October 30, 2011
BEST and a growing controversy.
Some time ago it appeared that the Berkeley Earth Surface Temperature (BEST) study was going to be a new review, supported both by those who believe in Anthropogenic Global Warming and by those more dubious about the quality of the data that led to that conclusion. The first papers have now been released, and the controversy over what they found, which seemed relatively minor at the beginning, is starting to get some notice.
Bishop Hill has drawn attention to an article in the London Mail online in which Dr. Judith Curry voices concern about the papers and the way the information was released. Up in Canada, Steve McIntyre is raising further points of concern in successive posts at Climate Audit, and within the comments at both sites additional areas of contention are being identified.
As those who have read the individual state reviews posted here over the last two years (and listed down the left side of the page) will know, I have concerns of my own, which I have sent to the attention of the BEST Executive Director. To validate them I went back to each of the data tables used to generate the graphs for each state and checked them for correctness. I found that, at the beginning, the curves for temperature variation with latitude, longitude, elevation and population had included the temperatures for the GISS stations in a state as well as the USHCN station data. This was a mistake, as the continuing study showed how different the two sets of data are. I have therefore re-plotted the data using only temperatures from the USHCN set (both the homogenized and the TOBS data).
I will now go back and replace the plotted graphs for each state with the correct ones, but this may take a few days, so in the meantime please be patient.
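The re-plotting step described above reduces to filtering each state's station table to USHCN entries only, then re-fitting temperature against latitude (and similarly for longitude, elevation and population). A minimal sketch follows; the record format, with hypothetical "network", "lat" and "temp_f" fields, is my own stand-in for the actual state data tables, not BEST's or USHCN's format.

```python
# Sketch of the re-plotting step: keep only USHCN stations (dropping
# the mistakenly included GISS stations) and fit mean annual
# temperature against latitude by ordinary least squares.

def fit_temp_vs_latitude(records):
    """Return (slope, intercept) of temperature vs latitude,
    using USHCN stations only."""
    pts = [(r["lat"], r["temp_f"]) for r in records
           if r["network"] == "USHCN"]
    n = len(pts)
    mean_x = sum(x for x, _ in pts) / n
    mean_y = sum(y for _, y in pts) / n
    sxx = sum((x - mean_x) ** 2 for x, _ in pts)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in pts)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

# Invented illustrative records; real input would be a state's table.
stations = [
    {"network": "USHCN", "lat": 36.0, "temp_f": 58.0},
    {"network": "USHCN", "lat": 38.0, "temp_f": 54.0},
    {"network": "USHCN", "lat": 40.0, "temp_f": 50.0},
    {"network": "GISS",  "lat": 39.0, "temp_f": 60.0},  # excluded
]
slope, intercept = fit_temp_vs_latitude(stations)
# With these toy numbers the fit gives about -2 deg F per degree of
# latitude; including the GISS station would bias the fit warm.
```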
In the meantime you might ponder this plot, which is Figure 4 from their paper on the Urban Heat Island effect.
Map of stations in and near the United States with at least 70 years of measurements; red stations are those with positive trends and blue stations are those with negative trends. (The BEST Project).
Read more!
Friday, October 28, 2011
Katla, volcanic ash, and other opinions on effects on climate
There has been an interesting set of quotations over on Dr. Curry’s website, which led back, through a post by Roger Pielke Sr., to an article by Paul Voosen about how the changing aerosol and particulate content of the atmosphere could be helping to control climate. While not wishing to get into that debate at the moment (though the quotes are interesting, and should be read in context), one of the “complaints” of the scientists has been that there has not been a major volcano with a significant impact on climate since Mount Pinatubo erupted in 1991. That eruption, the second largest of the 20th century, ejected some 20 million tons of fine ash and sulphur dioxide into the upper atmosphere, reducing global temperatures by about 1 deg F for a couple of years.
In the current article one of the scientists at the Mauna Loa observatory, John Barnes, has been monitoring the particulate content of the stratosphere and has noted that the count is going up without such a major eruption. It is that fact, discovered four years ago, that leads into a discussion of why the world isn’t heating as fast as has been predicted. (In that discussion no one brings up the new BEST study and the fourth figure in the Urban Heat Island paper, which shows that large sections of the United States have had a negative, rather than a positive, trend over the period of study. Not that readers of the state temperature reviews listed in the lower right column will find that a new discovery – see, for example, the discussion of the temperatures of the “Flat Middle”.)
Which is, in itself, a good lead-in to noting that Katla just earned another two stars (signifying earthquakes greater than magnitude 3.0), with one being in the crater and one to the north-west, towards the area of intense activity earlier this summer.
Earthquakes at Katla in the last 24 hours (Icelandic Met Office )
This does not necessarily bring an eruption much closer, though the smaller quakes are clustering again, and Jon commented that there does not yet appear to be much harmonic activity (which would signify significant magma movement), though there may be a small dyke intrusion into the caldera.
The question at the moment, I suppose, is whether it will be the clustering that can be seen here under the right-hand star (the M3+ quakes) that intensifies prior to an eruption, or whether we will not get a major eruption until that cluster spreads to cover a broader region around the caldera. There are arguments on both sides.
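One rough way to track the question posed above is to take a day's quake catalog, pull out the "star" events, and measure how tightly they cluster. The sketch below does exactly that; the catalog rows (latitude, longitude, magnitude) are invented for illustration, and real data would come from the Icelandic Met Office feed in whatever format it actually provides.

```python
import math

def stars_and_spread(quakes, min_mag=3.0):
    """Return the M >= min_mag events ("stars") and the maximum
    pairwise distance between them, in km."""
    stars = [q for q in quakes if q[2] >= min_mag]

    def dist_km(a, b):
        # Equirectangular approximation; adequate at caldera scale.
        dlat = math.radians(b[0] - a[0])
        dlon = (math.radians(b[1] - a[1])
                * math.cos(math.radians((a[0] + b[0]) / 2)))
        return 6371.0 * math.hypot(dlat, dlon)

    spread = max((dist_km(a, b) for a in stars for b in stars),
                 default=0.0)
    return stars, spread

# Invented catalog for one day: (lat, lon, magnitude).
catalog = [
    (63.63, -19.10, 3.2),   # in the crater
    (63.67, -19.20, 3.1),   # to the north-west
    (63.64, -19.12, 1.8),   # small event, not a "star"
]
stars, spread_km = stars_and_spread(catalog)
```

A shrinking spread with a rising star count would suggest the first scenario (local intensification); a growing spread would suggest the second (the cluster broadening around the caldera).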
But this may be the future source of some of those particles that John Barnes continues to seek, if the eruption lives up to some of its previous history. At the moment all we can do is wait and see whether the eruption comes this year or not, and, if it does, how it will answer the question.
Read more!
Labels:
BEST,
Iceland volcano,
Katla,
particulates,
volcanic dust
Wednesday, October 26, 2011
OGPSS - the Niobrara, the Tuscaloosa and the Chattanooga
When Governor Perry recently released his Energy Plan, I noted in my review that it bore a close resemblance to the position paper prepared by CERA for the API, and that it could in consequence be considered, perhaps, the “best shot” of the oil and gas industry at predicting where new North American resources might come from in the next couple of decades. One distinction that I did not make, and that is somewhat difficult to decide, is the extent to which these developments will come to pass regardless of who is the next President.
Of the different regions that are suggested as sources to increase oil availability for the United States, I have already discussed most of the options, whether the Bakken Shale, the off-shore resources of Alaska and of the Gulf of Mexico, or gas shale reserves such as the Marcellus. One of the two regions and resources I had not covered was the increase in production that can be anticipated from the Niobrara shales of Colorado, Wyoming and adjacent states (the other is the offshore Atlantic). Since this is a new enough development that it may not be well known, I thought to set out some basic information about the Niobrara in this post.
The development of fuel sources from the fine-grained shales that underlie much of the country has been one of the great success stories of the past decade, and the abundance of natural gas that it has generated has helped keep the price of that critical fuel low, thereby providing an advantage to American industry. And if one looks at the shale plays potentially available around the country, there are a number that have yet to be developed significantly. I am not sure why the Niobrara was singled out relative to the others as a likely beneficiary of the Governor’s attention, nor what gains would be made in terms of either production or jobs from any action the Governor might take as President, given that, as with the Bakken, the drivers are much more likely to be geological and oil-price related than regulatory, particularly at the Federal level.
Shale plays around the contiguous United States (EIA).
The most likely of the still relatively unknown resources (outside the Niobrara) to be developed in the near future may well be the Tuscaloosa and the Chattanooga. The Tuscaloosa has been drilled vertically since the 1960’s, but with the potential to have attributes similar to the Eagle Ford, i.e. a liquids-rich resource, it has a marketable product in those liquids that may make it more attractive. Exploratory drilling has begun, with two horizontal wells slated to be completed this year, at a cost of perhaps $8 million apiece to reach the reservoir and extend a lateral running out about 7,500 ft. Whether the production will cover those costs profitably remains to be determined. The reservoir varies from 200 to 800 ft thick, lies at depths between 11,000 and 14,000 ft, and may contain as much as 7 billion barrels of oil.
The Chattanooga, on the other hand, is a relatively shallow gas play, which may make it easier and less costly to develop, particularly if existing wells can be deepened. The oil production anticipated so far (from vertical wells) is only on the order of 6 bd, with 25 – 300 Mcf of natural gas. It is not, in short, likely to bring significant production to the table – at least on the basis of current prospects.
And this is where the Niobrara has the greater potential. It is considered to be somewhat similar to the nearby Bakken, which has now reached production levels reportedly of around 400 kbd, with individual wells producing from around 150 to just under 600 bd.
Peak well production rates in the Bakken (EOG Resources )
The Niobrara is not a continuous reservoir, but rather broken into a series of smaller sections:
Niobrara locations in some seven states in the West (OGJ )
The production numbers are still somewhat tentative, given the small number of wells that have been brought into production, but EOG reports, from its wells in Colorado for example, one well with an initial production of 645 bd, though that has now fallen to between 250 and 300 bd, while a second is now running at 225 bd. The company is planning a total of 45 wells this year. Further north, in Wyoming, Entek is reporting low oil flows even before fracturing.
Well costs for exploration and development are reported to be between $4 and $5.5 million to drill down to the reservoir, which lies at depths of 7,000 to 8,500 ft, and then complete a horizontal section of perhaps 4,500 ft with perhaps 18 hydraulic-fracture stages.
Anadarko plans for development of the Niobrara
Anadarko, which is reported to have one well that initially produced at 550 bd, expects to run between one and four rigs, building up to a drilling pace of around 16 – 20 wells per rig per year on a 40-acre spacing. The Wyoming section of the Niobrara currently has some 13 rigs, while the Bakken in North Dakota has around ten times that number. Rigs are, however, also reported to be in short supply.
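The drilling-pace numbers above imply a simple range for annual well count: rigs multiplied by wells per rig per year. A small helper makes the arithmetic explicit; the 16 – 20 wells-per-rig figures come from the paragraph above, and the four-rig case is just the top of Anadarko's reported range.

```python
# Annual well count implied by a rig fleet at a given drilling pace.

def wells_per_year(rigs, per_rig_low=16, per_rig_high=20):
    """Low/high bounds on wells drilled per year."""
    return rigs * per_rig_low, rigs * per_rig_high

low, high = wells_per_year(4)   # Anadarko's upper rig count
# four rigs -> 64 to 80 wells per year
```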
(There is an Anadarko video)
The story would not, however, be complete without noting that China is also aware of the play, and in January bought a third of the Chesapeake Niobrara operation. Given that one of their wells came in with an initial production of 1,270 bd of oil and 2.4 mcf of natural gas, Chesapeake seem to retain their ability to find the sweet spots in reservoirs, relative to others for whom the 200 – 500 bd average is more typical.
And a final reminder, from the Bakken post, of the relatively short life that can be found for a typical well producing oil from a fractured shale deposit.
Typical Bakken well production (ND DMR )
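The short well life shown in that plot is usually described with a decline curve. The sketch below uses a hyperbolic (Arps) decline; the initial rate, decline constant and b-factor are illustrative values of my own, not the ND DMR fit, and real Bakken parameters would come from matching actual well histories.

```python
# Hyperbolic (Arps) decline: rate falls steeply in the first year
# or two, then flattens into a long low-rate tail.

def arps_rate(qi, di, b, t_years):
    """Production rate (bd) after t_years under hyperbolic decline.
    qi: initial rate (bd), di: nominal decline (1/yr), b: b-factor."""
    return qi / (1.0 + b * di * t_years) ** (1.0 / b)

qi = 600.0   # initial rate, bd (illustrative)
di = 1.5     # nominal decline, 1/yr (illustrative)
b = 1.2      # b-factor (illustrative)
rates = [arps_rate(qi, di, b, t) for t in (0, 1, 2, 5)]
# With these toy parameters the well loses over half its initial
# rate within the first year, consistent with the shape of the plot.
```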
Read more!
Labels:
Bakken,
Chattanooga,
natural gas production,
Niobrara,
shale oil,
Tuscaloosa
Monday, October 24, 2011
The Delinquent Teenager - the IPCC and the WWF
I mentioned last week that Donna Laframboise has just written a new book with the lengthy title “The Delinquent Teenager Who Was Mistaken for the World's Top Climate Expert”, which, for this post, I will shorten to “The DT”. The book is a fascinating exposé of the reality of the structure of the Intergovernmental Panel on Climate Change (IPCC). The IPCC issues reports which it describes as “an update of knowledge on the scientific, technical and socio-economic aspects of climate change.” Four reports have been issued to date, with work ongoing on the fifth, which is referred to as the Fifth Assessment Report (AR5) and which will be released in the 2013/2014 timeframe. The book deals with the way in which the current report, AR4, released in 2007, and which might be considered the “Climate Bible” (Donna’s words), was compiled, and with the makeup of the writing teams for AR5.
The work is widely reported as being that of the world’s experts in the different aspects of climate and its impacts, with the assessment reports bringing those ideas together under the supervision of panels of additional experts in the various fields. It is thus claimed to summarize the opinions of thousands of scientists and to mark the quintessence of thinking on the state of knowledge. With that reputation it is quoted as the justifying document for a wide-ranging set of governmental actions around the globe, including the move by the EPA to control greenhouse gases and, for example, the moves in Europe to restrict the use of coal in power stations and to justify increased investment in wind and solar energy technologies.
But that curtain of respectability based on the “authority” of the authors has been shattered through the revelations of the book. And while it should be noted that many of the individual facts have been discussed in the past, they have never been chained together to shake the edifice so thoroughly and justifiably as is done here.
There are, of course, those who will denigrate the work; the World Wildlife Fund (WWF) has issued a press release in which it refutes the “ludicrous” claim that it infiltrated the IPCC. However, the power of the book comes in the detail that allows me to look up the role of the WWF. Chapter 4 of the report of Working Group 2 deals with species extinctions, and it has five of the 10 most senior personnel affiliated with the WWF. The WWF has, as Donna notes (and documents), worked to persuade IPCC authors to join one of its panels, and she goes on (in footnote 31-5) to list 78 IPCC personnel who have thus become involved (also on her website). The facts belie the WWF protestations.
But it is in the revealed mediocrity and mendacious nature of the reports themselves that The DT is most disturbing. Rather than the IPCC relying purely on the skilled knowledge of the world’s experts, and their papers, as it claims is the practice, considerably less upright behavior is documented throughout the book. Very junior scientists are recruited to draft significant portions of the material, and many documents used turn out to be from non-peer-reviewed sources. It is not that I am necessarily impressed with “peer review” – as an academic I have very mixed feelings about those who consider that nothing is established until it appears in a peer-reviewed publication, despite physical evidence or other real-world experience – but if you claim that the material is peer-reviewed and, as is shown here, it is not, then you are at best proven unreliable. The actions of the “Team” in filtering papers that try to get into such journals, and the consequences for those who doubt the “Climate Bible” and for the editors who publish them, are now well documented in books such as Andrew Montford’s “The Hockey Stick Illusion” and the Mosher and Fuller book “Climategate: the CRUtape Letters”. The “gatekeepers” to the journals are working hard to ensure that only their views are heard – hence the value of these books.
The DT illustrates, for example, the actions of Kevin Trenberth who, with little documented background in the area of the possible effects of climate change on hurricane intensity, led that chapter of the Climate Bible and in the process overrode the expert knowledge of Chris Landsea, who had commented that “there are no known scientific studies that show a conclusive physical link between global warming and hurricane frequency and intensity.” (The full e-mail with this objection is a WebCite link within the electronic version of The DT.)
This really is a highly informative, well-documented look at the IPCC which deserves wide distribution, and it should be read by all those who cite the IPCC as justification for a multiplicity of policies. Sad to say, I rather doubt that it will get the hearing and the proper respect that is its due. What has become evident over time is that the pathway to “getting the word out” lies through a band of “environmental reporters” who would have no work if there were no “environmental crisis.” Rather than review opposing viewpoints or disquieting information, such as that found in this book, their practice has been to ignore the message in the hope that it will go away. After a while they can then say “oh, but it was brought up, and really it was but a storm in a teacup!” They then move on, blissful in their disregard for anything but maintaining a status quo that keeps them gainfully (???) employed.
I would, however, suggest that it is well worth taking the time to understand what foundation the IPCC is built on, because there is, in all things, “a time when all things come to those who wait.” And as the ball of that discontent grows you will be able to understand both its context and what other legs of straw will also fall, in time, as the edifice crumbles. This is a worthy beginning to that eventual result. It is unfortunate that the world could not hold out for a change in IPCC structure that would allow the true picture to emerge, but, as the book shows, the powers that are entrenched have their own agenda, and the controls to maintain it . . . for the present.
Read more!
Labels:
climate change,
IPCC,
Kevin Trenberth,
The Delinquent Teenager,
WWF
Sunday, October 23, 2011
U.S. temperatures and the BEST study
For those who have been following my review of the temperatures around the United States, there was an interesting set of papers released by the Berkeley Earth Surface Temperature (BEST) study in the last week. Four sets of findings have been released, based on:
1) Statistical Methods
2) Urban Heat Island
3) Station Quality
4) Decadal Variations.
For the moment I am most interested in the second of these, since it relates to the effect of population around the measuring station on the resulting temperature reported. This is not quite the same as the UHI effect, although they are both generated by the same factors.
In the paper, to greatly over simplify, the authors appear to have looked at the trends in temperatures in urban and rural sites, and then looked at the difference between the trends in temperature since 1950 for each. They report that the overall trend (i.e. temperatures in urban areas – temperatures in rural areas) is negative rather than the expected positive. They find an average negative trend of 0.342 deg F per hundred years. (Or 0.00342 deg F per year).
I have a couple of concerns about the results they have reported, which requires that I carry out some comparative plotting of my own to generate similar trends. Since I will be looking at the 48 contiguous states (though some will drop out as being too small) this will take a little while. So I ask you to bear with me if posting is a bit sparse while I run these numbers.
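To make the comparison concrete before I start plotting, here is a minimal sketch of the kind of trend-difference calculation involved, using made-up station series (neither the data nor the slopes are BEST's; this only illustrates the method):

```python
import numpy as np

# Illustrative only: two synthetic annual-mean series, 1950-2010, with
# built-in linear trends (deg F per year) plus a little noise. The rural
# series is given the slightly steeper warming, as BEST report finding.
rng = np.random.default_rng(0)
years = np.arange(1950, 2011)
urban = 55.0 + 0.0120 * (years - 1950) + rng.normal(0, 0.05, years.size)
rural = 54.0 + 0.0154 * (years - 1950) + rng.normal(0, 0.05, years.size)

# Least-squares slope (deg F per year) for each series.
urban_slope = np.polyfit(years, urban, 1)[0]
rural_slope = np.polyfit(years, rural, 1)[0]

# BEST report the urban-minus-rural difference; a negative value means
# the urban sites warmed *less* than the rural ones over the period.
diff_per_year = urban_slope - rural_slope
diff_per_century = diff_per_year * 100.0
print(f"urban - rural trend: {diff_per_year:+.5f} deg F/yr "
      f"({diff_per_century:+.3f} deg F/century)")
```

Note that the unit conversion quoted above checks out: a trend of -0.342 deg F per hundred years is indeed -0.00342 deg F per year.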
Labels:
BEST,
elevation,
latitude,
north American temperatures,
population,
UHI
Wednesday, October 19, 2011
OGPSS - Governor Perry, an Energy and Jobs Plan
Editorial Comment: I usually would not consider this a technical talk, but rather more political; however, I have just about finished reviewing the potential for growth of the reserves in North America. In that context, as I delved into Governor Perry's recently announced Energy Plan, I realized that it followed fairly closely the recommendations of the American Petroleum Institute and other energy alliances. Given, therefore, that it might be considered the "best shot" of the oil and gas industry at predicting how to increase oil and gas production in the United States, I will treat it as such a plan, and have removed my own comments from this post. (Though I may make some in a following post. I am also using his numbers rather than other values that might be available.)
One of the relevant (to this site) facets of the current Republican debates at the start of this Presidential race has been the Energy Plan that Governor Perry put forward the other day. Because it actually gets specific about where some of the projected 1.2 million jobs he anticipates adding to the American economy will come from, and because that detail has not received a lot of publicity, I plan to briefly review it here, together with some of the source documents that were used to generate it. Please note that this is not an endorsement, but rather an illustration of one of the plans that have been suggested. Here is the summary illustration.
The jobs anticipated by Governor Perry’s Energy Plan
The entire plan is available, as a 40-page pdf, and in its shortest summary version was condensed into:
My “Energizing American Jobs and Security” plan will commence or expand energy exploration from the Atlantic coast to the western seas off Alaska. We will end the bureaucratic foot-dragging that has reduced offshore drilling permits in the Gulf of Mexico by eighty percent. We will tap the full potential of the Marcellus Shale in Pennsylvania, Ohio and West Virginia. We will unleash exploration in our Western states, which have the potential to produce more energy than what we import from Saudi Arabia, Iraq, Kuwait, Venezuela, Columbia, Algeria, Nigeria and Russia combined.
The Governor puts current U.S. consumption at roughly 19 mbd, with domestic supplies producing around 7.5 mbd. The nation runs on oil, with transportation using 72% of the oil, and 96% of the country's transportation fuel needs are supplied by oil and gas.
The Governor inserts a quote from Governor Jindal of Louisiana that states:
According to a recent study by IHS CERA, in 2012 alone the Gulf of Mexico could create 230,000 jobs, increase revenues and royalty payments to state and federal treasuries by $12 billion, and contribute some 400,000 barrels per day of oil production towards US energy independence if the federal government accelerates the pace of permitting activity to a level that reflects the industry's capacity to invest.
This quote refers to the report “Gulf of Mexico - Restarting the Engine” by CERA, which tabulates the difference achievable between a slow permitting environment and an enhanced one over the next two years, and uses it (in more specific detail) to develop the summary table:
Projected gain in opportunities in the GOM with an enhanced permitting process (CERA )
(It should be noted that roughly 94% of these opportunities in 2011, 97% in 2012, and all of those in 2013 would be in the deepwater offshore.)
In looking next at Alaska, the Governor sees the opportunity to develop the National Petroleum Reserve, with its 896 million barrels of oil and 53 Tcf of natural gas, as well as the Alaskan Outer Continental Shelf (under the Chukchi and Beaufort Seas), which may contain as much as 10.2 billion barrels. The report by Northern Economics to Shell is quoted as anticipating some 55,000 jobs around the entire country coming from the development, with some 35,000 of those jobs being in Alaska (of which the breakdown would be 30,000 from the Beaufort OCS and 25,000 from the Chukchi Sea OCS). It is anticipated that the increased production will fill the Alaskan pipeline again, with jobs being generated to make the connections, which is why the 1.2 million job figure is only reached over time. Wood Mackenzie produced a report for API that is also used as a reference by the Governor, and it shows the job growth (broken down a little by source) as:
Growth in jobs related to changes in Energy Plans (Wood Mackenzie )
I am assuming that the increased production from Alaska would fit in the “Increased Access” category. And please note that the Wood Mac report carries its projections out to 2030, while the Governor is only talking of the jobs through 2020 (which is where the 1.2 million number comes from).
And while the Governor is largely discussing this plan in terms of jobs, this is, after all, an Energy site, and so I will also add the anticipated change in oil production that is foreseen from this change in the situation – again from Wood Mackenzie.
Gains in Production from changing regulations and access (Wood Mackenzie )
The jobs numbers were derived as a count of specific jobs generated in the industry, with a 2.5 multiplier then applied to add their effect on the general economy. (This they consider to be conservative, given that in some cases it might be as high as 5.) The overall addition of oil to the national reserve is considered to be roughly 60 billion barrels, broken down as follows:
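The multiplier arithmetic, as I read it, works like this (the 10,000 direct jobs below are a hypothetical figure for illustration, not one from the report; note too that some studies define the multiplier as total jobs over direct jobs rather than indirect over direct):

```python
# Hypothetical illustration of the jobs-multiplier arithmetic.
# Reading "a 2.5 multiplier to add their effect on the general
# economy" as: each direct industry job supports 2.5 further jobs.
direct_jobs = 10_000     # hypothetical count of direct industry jobs
multiplier = 2.5         # the report's "conservative" figure
indirect_jobs = direct_jobs * multiplier
total_jobs = direct_jobs + indirect_jobs
print(int(total_jobs))   # 35000
```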
Anticipated gains in reserves added through changes in regulation and access. (Wood Mackenzie)
The Governor is a little more conservative in the oil that he anticipates coming from the Atlantic OCS, anticipating only some 3.2 billion barrels of oil and 28 Tcf of natural gas, as well as the creation of some 10,000 new jobs. He references a report from the Consumer Energy Alliance as his source for some of this information. Note that, in contrast to the Alliance, he only anticipates that drilling would occur offshore Virginia and the Carolinas.
Looking at increasing production of oil in the Western States, he cites the Blueprint for Western Energy Prosperity (site registration required) from the Western Energy Alliance. This projects that some 500,000 jobs could be created, along with the production of 1.3 mbd of oil and an additional 1 Tcf of natural gas from Western Resources.
The increase in oil production is anticipated to come from the Bakken fields (currently at 289 kbd and anticipated to increase to 650 kbd by 2020), and from development of the Niobrara formation in Colorado and Wyoming, which, from sensibly zero, has recently started to be developed and is anticipated to produce some 286 kbd by 2020. However the total gain in production from the two, over existing production in the West, is anticipated to be 529 kbd.
Natural gas, transported through the Rockies Express and Ruby pipelines is expected to add 1 Tcf of production, which the Alliance shows divided between the Western States.
Anticipated future gas production from the Western States (Western Energy Alliance )
The Alliance makes the point, as does the Governor, that reaching these levels requires a reduction in legislation, and regulation, and improved access to federal lands.
Approval of the Keystone Pipeline (a topic of current debate) is expected to add 20,000 new jobs. That leaves the allowance of increased development of the Marcellus and Eagle Ford shales (dependent on the allowed use of fracking) to add, respectively, 250,000 jobs in the New York, Pennsylvania, Ohio region and 68,000 jobs in Southwest Texas, and you have the Governor's 1.2 million.
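Tallying the components cited through the post gives a useful sanity check on the headline figure (my tally, using the numbers as quoted; the shortfall against 1.2 million is presumably made up by jobs, such as the Alaskan pipeline work, that build out over the decade):

```python
# Jobs components as quoted in the post (not an official breakdown).
components = {
    "Gulf of Mexico (CERA, enhanced permitting)": 230_000,
    "Alaska (Northern Economics, nationwide)": 55_000,
    "Atlantic OCS": 10_000,
    "Western states (Western Energy Alliance)": 500_000,
    "Keystone XL": 20_000,
    "Marcellus (NY/PA/OH region)": 250_000,
    "Eagle Ford (Southwest Texas)": 68_000,
}
total = sum(components.values())
print(f"{total:,} jobs")  # 1,133,000
```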
The jobs anticipated by Governor Perry’s Energy Plan
To achieve this the Governor proposes:
1. Immediately return to pre-Obama levels of permitting in the Gulf, followed by responsibly making more of the Gulf available for energy production.
2. Open the ANWR Coastal Plain (1002), National Petroleum Reserve Alaska (NPR-A), and the Alaskan OCS (Beaufort and Chukchi Seas) for development.
3. Open the Southern Atlantic OCS off-shore resources for development.
4. Immediately approve the Keystone XL Pipeline.
5. Expand on-shore oil and gas development in Utah, Colorado, North Dakota, Montana, New Mexico, and Wyoming, authorizing more development on federal lands.
6. Oppose federal restrictions on natural gas production, including hydraulic or nitrogen fracturing and horizontal drilling.
As I mentioned at the beginning I will make some comments on this, in light of my recent posts on North American Energy in a later post.
Sunday, October 16, 2011
A new book and the temperatures in the Mountains
One of the salient points that Donna Laframboise makes in her new book on the IPCC lies in the predilection of the IPCC authors to rely on computer modeling over factual data. I will probably have more to say on this in my planned review of the book (which so far is fascinating to read and damning in the details) once I finish it, but it does suggest that posts such as these, on the actual variations in US temperatures over the past century, will languish in outer darkness – oh, well!!
To recap where we are, after looking at the temperatures of the individual contiguous states of the American Union, I have started to look at regional changes and to compare them. A significant drop of around four degrees Fahrenheit that occurred between 1950 and 1965 along the Eastern Seaboard, and which inter alia led to a southern migration of the black-capped chickadee, has since been reversed in the warming of that region post-1965. In contrast in the middle of the country there has been sensibly no warming trend since 1895 (which might explain why the Governor of Texas finds it harder to believe in global warming, since his state hasn’t seen any).
Yet on the West Coast the trends show a relatively consistent increase in temperature since 1895. And so the next place to look is to see what happened in those states that lie along the Rockies and provide the mountainous barrier between the West and the Middle. In the following post I am going to derive this graph, and comment along the way as I get there.
Average variation in time for four regions of the country, with the results adjusted as shown to separate the curves, and show them in order (bottom to top) from West to East.
For this exercise I am going to include Montana, Idaho, Wyoming, Nevada, Utah, Colorado, Arizona and New Mexico. I have previously used both the homogenized data from the USHCN and the Time of Observation corrected (TOBS) data, with a preference for the latter since, as I noted with the individual state evaluations, it has shown a consistently better correlation across the states with latitude and elevation than the data after manipulation.
Combining first the average temperatures for the states, using the homogenized data, one gets a relatively flat curve, but one with a decided kick over the last thirty years, the sign that many consider the marker for Anthropogenic Global Warming.
Average of the state temperatures for the Mountain Region, averaging the values for the different states in the region.
However, before there is much comment on this, it should be noted (when this is compared with the figure above it) that that trend is not as clear in the four regions as I ultimately graphed them. But before getting to that let me just put up a set of comparison graphs, using the USHCN homogenized data, to show that the temperatures of the eight states that I looked at appear to vary relatively consistently. By setting them one above the other it should be easier to cross-compare.
These are relatively consistent, though they appear to smooth a bit moving south. But to get back to that kick at the end of the plots: as you may know, and as I have noted in the individual state plots, the USHCN data is “homogenized” from the raw data. When one takes the average of this “manipulation” and subtracts the Time of Observation corrected values from it, one gets this curve as the average change made by the climate scientists in reporting the values for the stations in the evaluation (all 233 of them).
Difference between the average USHCN homogenized temperatures and the raw temperature averages, as corrected for time of observation (the TOBS values).
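The adjustment curve is a straightforward year-by-year subtraction; a sketch of the calculation, with made-up series standing in for the two USHCN regional averages (none of these numbers are the real data):

```python
import numpy as np

# Stand-in series (deg F by year); the real curve uses the 233-station
# regional averages. These numbers are made up purely for illustration.
years = np.arange(1895, 2011)
tobs = 45.0 + 0.010 * (years - 1895)          # TOBS-corrected raw average
homog = tobs + 0.005 * (years - 1895) - 0.2   # "homogenized" average

# The plotted adjustment is simply homogenized minus TOBS, year by year;
# a rising difference means the homogenization adds warming over time.
adjustment = homog - tobs
print(f"{adjustment[0]:+.3f} deg F in {years[0]}, "
      f"{adjustment[-1]:+.3f} deg F in {years[-1]}")
```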
Given that, the rest of the analysis will go back to using the TOBS values. Firstly this changes the average a little:
Average temperatures for the Mountain states, using the Time of Observation corrected raw data.
The change in the slope of the curve drops the rise in temperature per century from 1.7 deg F to 1.08 deg F.
However, this curve was obtained by simply averaging the average state temperatures, and there are varying numbers of stations, and varying areas, for the different states. When one weights the values by the relative number of stations in each state (equivalent to averaging all the stations in the region), one gets:
Average temperatures for the Mountain states, using the Time of Observation corrected raw data and averaging the 233 stations in the region.
In each state, the area that each station “covers” (i.e. the area of the state divided by the number of stations) varies from 8,500 sq miles in Nevada, to 2,123 sq miles in Utah. To try and account for this I have weighted the state values by the areas of the states, and this gives this plot:
Average temperatures for the Mountain states, using the Time of Observation corrected raw data and weighting the state averages by the area of the state.
With this plot the steadily increasing temperature suggested in the USHCN plot does not appear as evident, and the temperatures since about 1990 seem relatively steady.
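The three ways of forming the regional mean can be sketched as follows, using hypothetical state means, station counts and areas (illustrative only; the real region has 233 stations across eight states):

```python
import numpy as np

# Hypothetical per-state annual means for one year (deg F), with
# station counts and areas (sq mi). Four states shown for brevity.
state_mean = np.array([44.1, 47.3, 42.0, 51.6])
n_stations = np.array([30, 25, 40, 20])
area = np.array([147_040, 83_570, 103_640, 110_570])

# 1) Simple average of the state averages.
simple = state_mean.mean()
# 2) Station-count weighting: equals the mean over all stations,
#    provided each state mean is itself the mean of its stations.
by_station = np.average(state_mean, weights=n_stations)
# 3) Area weighting: each state counts by the ground it covers.
by_area = np.average(state_mean, weights=area)

print(f"simple {simple:.2f}  station-weighted {by_station:.2f}  "
      f"area-weighted {by_area:.2f}")
```

The three answers differ whenever station density varies across the states, which is exactly the situation described above (one station per 8,500 sq miles in Nevada against one per 2,123 sq miles in Utah).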
So how do the regional values now compare?
Comparison of average temperatures for the different regions of the United States (as I have defined them)
Obviously the mountain temperatures are lower than those of the other regions given that there is a fall in temperature with elevation. For the combined region this looks like this:
Variation in state temperature as a function of average station elevation in the state.
The r-squared value is much lower for this than it is for the individual stations in the region, and so I will have to look at the effects of latitude, elevation and population separately.
But to conclude I wanted to see how the temperature patterns changed across the country, and the overlay of the regional temperatures made that a little difficult to see, so I “shifted” the curves by adding and subtracting temperatures from each set, so that the comparison of shapes could be made, and that is where we came in:
Average variation in time for four regions of the country, with the results adjusted as shown to separate the curves, and show them in order (bottom to top) from West to East.
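The “shifting” is nothing more than adding a fixed offset to each regional series so the curves stack in order without overlapping; a minimal sketch with stand-in series:

```python
import numpy as np

# Stand-in anomaly series for the four regions (deg F); the real plot
# uses the regional averages derived above.
rng = np.random.default_rng(1)
years = np.arange(1895, 2011)
series = {name: 50.0 + rng.normal(0, 0.5, years.size)
          for name in ["West", "Mountain", "Middle", "East"]}

# Separate the curves with a fixed offset per region (bottom to top,
# West to East, as in the figure). Only the shapes matter afterwards;
# the absolute y-axis values no longer mean anything.
offsets = {"West": 0.0, "Mountain": 4.0, "Middle": 8.0, "East": 12.0}
shifted = {name: s + offsets[name] for name, s in series.items()}
print({name: round(s.mean() - series[name].mean(), 1)
       for name, s in shifted.items()})
```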
Labels:
Arizona state temperatures,
Colorado,
Idaho,
IPCC,
Montana,
Mountain state temperatures,
Nevada,
New Mexico,
Utah,
Wyoming
Thursday, October 13, 2011
OGPSS - the future of oil sand production in Alberta
If one examines the forecast future supply of liquid fuels that the EIA projects in its most recent International Energy Outlook (that for 2011), the agency projects considerable growth in unconventional supplies of liquid fuels.
EIA projections of future growth in liquid fuel supplies (EIA)
For North America, the projections foresee considerable growth both in production within the United States (rising from 8.5 to 12.8 mbdoe) and from Canada (rising from 3.4 mbdoe in 2008 to 6.6 mbdoe in 2035).
Regional growth in liquid fuels generation 2008 to 2035 (EIA )
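Those endpoints imply fairly modest, but sustained, rates of growth; a quick check of the compound annual rates they require (my arithmetic, not the EIA's):

```python
# Implied compound annual growth rates from the EIA endpoints above.
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

us = cagr(8.5, 12.8, 2035 - 2008)      # US liquid fuels, mbdoe
canada = cagr(3.4, 6.6, 2035 - 2008)   # Canadian liquid fuels, mbdoe
print(f"US {us:.2%}/yr, Canada {canada:.2%}/yr")
```

That works out to roughly 1.5% a year for the United States and 2.5% a year for Canada, sustained over 27 years.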
The growth in unconventional fuels is most critically anticipated as coming from oil sands, with biofuels (a topic for another day) close behind.
Future sources of unconventional liquid fuels (EIA )
It is, in passing and for folks such as Dr. Yergin perhaps, worth noting that the EIA does not see a significant role for oil from shales through 2035. But the projection highlights the criticality of the Athabasca oil sands to the future well-being of the North American fuel supply chain.
There have been a significant number of posts over the years, both at Bit Tooth and at The Oil Drum, in which I and others have written about these reserves, which are playing an increasing part in North American oil supply and will likely also grow to supply other nations, particularly China (a topic dating back to the start of The Oil Drum). In this post I will therefore just briefly overview the reservoirs, and the technologies used to extract the fuel, in looking at the projected outlook for the future – given that this has been reviewed and revised a number of times in the past. Some measure of that variation comes from the predictive curves that Sam Foucher posted back in 2006.
Predictions of oil sand production from 5 years ago (The Oil Drum )
Considering first the resource: the amount of oil that exists within the oil sands of Alberta has been estimated at between 1.7 and 2.5 trillion barrels. It is found in three major deposits, the largest of which is the Athabasca; the others are Cold Lake and Peace River.
Locations of the major oil sands in Canada (CAPP)
There are several different ways in which the oil can be recovered, of which the one generating the most visibility is where the sands lie close enough to the surface that they can be mined. Early use of single, large-scale bucket wheel excavators did not work out well, since production is tied to the well-being of a single machine. As a result the sands are mined by perhaps 15 shovels within the mining pit, each scooping about 100 tons of sand at a time, and loading trucks, which then carry the material, 300 tons at a time, to an in-mine crusher which breaks the rock into small fragments. These fragments (with waste rock largely removed) are then mixed with hot water and pumped to large tanks at the primary Upgrader, where the sand, water and bitumen are separated.
Loading a truck; it takes 2 to 4 shovel loads to fill the truck bed, depending on its size.
There is a Youtube video of the process available.
The sand also contains small particles of clay, which are difficult to settle out of the water, and the large tailings ponds used for this have been the focus of considerable controversy. This concern is recognized, and considerable efforts are being made to reduce the time that it takes for the settlement, and for site reclamation. That effort is continuing with a collaborative effort between the mining companies and four universities to improve the technology over that currently used.
Some 80% of the reserves lie too deep for mining to be an effective solution, and with the bitumen in its natural form too thick to flow easily to wells, ways had to be found to encourage that flow. The most common, and the one most often projected for future development, is Steam Assisted Gravity Drainage (SAGD).
Artist's illustration of the SAGD process (Devon Canada Corp)
There is a video of the SAGD process on YouTube.
One of the problems with SAGD is the need to generate the steam that is fed into the upper pipe and migrates up into the sand to heat and thin the oil. The steam is typically raised by burning natural gas, and as Dave Cohen noted some years ago, the supply of that natural gas becomes more of an issue as the size of the operation continues to grow.
One of the more innovative of the suggested techniques overcomes that problem by burning some of the oil in place, in a process known as Toe to Heel Air Injection (THAI). A section of the oil sand is ignited; the heat drives off the more volatile oil ahead of the advancing flame front, while the residual coke left behind sustains the burn. The process has been shown to be viable.
The illustration is an artist's impression of a side view of the site: the blue dotted horizontal line represents the recovery well, and air is fed into the formation from a higher well.
The video showing the THAI process is also on YouTube.
However, at present THAI has only a limited planned role, though that may grow as more information on the technique becomes available.
So what is the likelihood that, using these techniques, production will rise to the levels that the EIA projects? In its October 3rd edition the Oil and Gas Journal (OGJ) listed (registration required) the projects currently planned for Canada in the period through 2020 and beyond. The list contains 144 projects, many of which are the separate stages of larger developments. Taking just those that relate to oil sand production, roughly 70% plan on using thermal methods to recover deeper oil (mainly SAGD, though there are small amounts of THAI, cyclic steam and electrothermal). The remaining 30%, roughly 2 mbdoe of the increase in production, is planned to come from surface mining.
It is unlikely that the full 6 mbdoe of increased production will all come to pass. However, given that these are all defined projects, many broken into separate phases that can be scaled back or advanced depending on market conditions and supply – particularly in the later years – the increases that the EIA projects seem eminently reasonable to anticipate. (Caveats on water and natural gas availability are discounted at present.)
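The split between recovery methods can be sketched with a quick back-of-envelope calculation; this is a minimal illustration that treats the rounded 70/30 shares and the 6 mbdoe total as exact, which they are not.

```python
# Rough arithmetic on the OGJ project list discussed above: of the
# ~6 mbdoe of planned new Canadian oil sand capacity, roughly 70% is
# to come from in-situ thermal methods and 30% from surface mining.
# Treating these rounded shares as exact is an assumption made purely
# for illustration.
planned_increase_mbdoe = 6.0
shares = {
    "thermal (SAGD, THAI, cyclic steam, electrothermal)": 0.70,
    "surface mining": 0.30,
}

for method, share in shares.items():
    print(f"{method}: ~{planned_increase_mbdoe * share:.1f} mbdoe")
```

The surface mining share comes out at about 1.8 mbdoe, consistent with the "roughly 2 mbdoe" quoted above.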
I would be remiss not to thank HereinHalifax, who led me to the report on Eastern Canadian supplies. It pointed out that the flow of crude in the pipeline from Sarnia to Montreal, about which I wrote earlier, was originally eastward, carrying western crude; the flow was reversed as the US market for Alberta oil developed, and Eastern Canadian refineries switched their supply to tanker imports.
The increasing plans to produce oil from Alberta depend, to a degree, on the ability of pipelines to carry the resulting product to market. Production is therefore likely to hinge on the success of those advocating the various pipelines. And to a considerable extent these are political, rather than technical, decisions.
Postscript: Emmanuel has pointed out that when I discussed the fields in Eastern Canada I did not mention the Old Harry Field, which has been the subject of much debate between Quebec and the Federal Government, and which may be twice as large as the Hibernia field. With apologies (and thanks to Kristin) here is the location:
Location of the Old Harry Field (Source Evidentia)
EIA projections of future growth in liquid fuel supplies (EIA)
For North America, the projections foresee considerable growth in production both within the United States (rising from 8.5 to 12.8 mbdoe) and in Canada (rising from 3.4 mbdoe in 2008 to 6.6 mbdoe in 2035).
Regional growth in liquid fuels generation 2008 to 2035 (EIA )
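The implied average annual growth rates behind those projections can be backed out from the endpoint figures; a minimal sketch, using only the numbers quoted above:

```python
# Implied compound annual growth rates for the EIA projections quoted
# above: US 8.5 -> 12.8 mbdoe and Canada 3.4 -> 6.6 mbdoe, 2008 -> 2035.
def cagr(start, end, years):
    """Compound annual growth rate between two endpoint values."""
    return (end / start) ** (1.0 / years) - 1.0

years = 2035 - 2008  # 27 years
print(f"US:     {cagr(8.5, 12.8, years) * 100:.2f}% per year")
print(f"Canada: {cagr(3.4, 6.6, years) * 100:.2f}% per year")
```

Canada's projected supply nearly doubles, which works out to about 2.5% per year, against roughly 1.5% per year for the US.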
The growth in unconventional fuels is anticipated to come most critically from oil sands, with biofuels (a topic for another day) close behind.
Future sources of unconventional liquid fuels (EIA )
In passing, and perhaps for folks such as Dr Yergin, it is worth noting that the EIA does not see a significant role for oil from shales through 2035. But the projection highlights how critical the Athabasca oil sands are to the future well-being of the North American fuel supply chain.
Labels:
Alberta,
crude oil production,
oil mining,
oil sands,
SAGD,
THAI
Wednesday, October 5, 2011
Regional US temperatures, the West, the Middle and the East
The relative changes in the temperature profiles between the Central States and those along the Atlantic Coast raise the question of what the conditions have been on the West Coast. And so I have combined the state temperatures of California, Oregon and Washington. As with the other regions, the first average I took was of the homogenized data, simply averaging the three states to get a sense of what has occurred over the past hundred and ten years.
Average temperature over time for the Pacific Coast, using the average of state temperatures, and homogenized USHCN reported temperatures
The most obvious immediate conclusion is that the temperature drop from 1950 to 1965 which is so evident on the East Coast does not appear to have happened in the West. And the flat temperatures over the last hundred years found in the Central States don't appear to hold either.
Moving to the Time of Observation (TOBS) corrected temperatures, but again initially just averaging the state values, one gets:
Average temperature for the states along the West Coast, averaging the TOBS average state temperatures.
In contrast with the Middle States there does appear to be a steady increase, with time, in the average temperature. Looking at the average of all the stations (a total of 138), the data continues to show that rise:
Average temperature for the states along the West Coast, averaging the TOBS average station temperatures.
The area that each station covers falls from an average of 3,000 sq miles in California, through 2,460 in Oregon, to 1,621 in Washington. When one adjusts for area using this weighting, and adds a trend line to show the average temperature change, one gets:
Average temperature for the states along the West Coast, weighting the TOBS averages by area.
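The area weighting described above can be sketched as follows. The per-station areas are from the text; the station counts and the temperatures are hypothetical stand-ins, chosen only so the total is 138 stations as quoted.

```python
# Sketch of area-weighted averaging: each state's TOBS average is
# weighted by (sq miles per station) x (number of stations), i.e.
# effectively by the area the state's stations cover.
# Per-station areas are from the text; station counts are HYPOTHETICAL,
# chosen so that they sum to the 138 stations mentioned above.
states = {
    "California": (3000, 54),  # (sq miles per station, station count)
    "Oregon":     (2460, 44),
    "Washington": (1621, 40),
}
# Hypothetical TOBS mean temperatures (deg F) for one illustrative year
temps = {"California": 59.0, "Oregon": 48.0, "Washington": 47.0}

total_area = sum(a * n for a, n in states.values())
weighted = sum(temps[s] * a * n for s, (a, n) in states.items()) / total_area

print(f"area-weighted mean: {weighted:.2f} F")
```

Because California's stations each cover nearly twice the area of Washington's, its temperatures pull the weighted mean well above a simple three-state average.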
There is still no sign of the fall in temperatures in the 1950s; in fact the temperature goes up. Comparing the results for the three states:
Average individual West Coast TOBS temperatures over time
The plots are roughly similar in form, with the two more northern states showing almost identical patterns and a slightly wider range of fluctuation.
When one compares the three regions that I have looked at to date, however, the difference between them is quite clear. Well, actually, if I plot the data itself the difference is a little obscured, because the curves superimpose.
Average temperatures for different regions of the country over the past century.
The Middle states are clearly warmer than those on the two coasts, which I hadn't expected, but it is more difficult to separate the two coastal variations. Because of this overlap I have moved the lines a little apart, so that the changes in pattern can be seen more clearly. For this plot, therefore, ignore the temperature values on the left: the values have been offset so that the curves are separated and the differences in behavior each year can be seen.
Pattern of temperature change for three regions of the United States over the past century.
While there is some congruity between the Central and Atlantic values, these are clearly different from the Western states. Perhaps I should see what is happening in the mountains?
I am also curious to see what effect El Niños have on the land temperatures, but I think I will add in the mountain states before putting those effects onto the plot.
I might start here but any other suggestions would be welcome.
Monday, October 3, 2011
Two stars for Katla (a volcanic intensity scale)
Having just finished the post on Eastern Canada, I glanced at the Icelandic Activity Map on my way to shutting things down for the evening. Katla, the volcano that lies under the Myrdalsjokull glacier, has seen a steadily increasing level of earthquake activity over the summer, as I have been noting. This now seems to be picking up a little more, with two quakes in the caldera region that have reached magnitude 3 or more (signified by a star rather than a dot). The younger (green) of the two shown was a 3.3, while the older is reported at 3.0.
Earthquakes in the region of the Katla volcano over the past 24 hours (Icelandic Met Office)
UPDATE: And just as I finish the post on Western US temperatures and check on Katla, the activity seems to be picking up even more, with a 3.9 earthquake sitting on top of the earlier 3.3 (you can see it as a yellow star in the background). There is also considerably more activity in the caldera, which has to be worrying.
Activity on the evening of the 5-6th October (Icelandic Met Office).
Jon is posting some of the seismic information on the quakes at the Iceland Volcano and Earthquake site, and the Icelandic Met Office has now issued a statement which also shows the unusual nature of the current activity, in that it includes a number of earthquakes above a level 3, and that the activity still appears to be increasing. A more distributed graph of the recent activity can be found here.
Labels:
earthquake,
Eyjafjallajokull,
Iceland volcano,
Katla,
Myrdalsjokull
OGPSS - Pipelines and Eastern Canada
I wrote last time about the pipelines that are being projected to take advantage of increased production from the oil sands in Alberta, Canada, with the intent of using that to lead into a review of the oil sands operation itself. However, Gail has raised the issue of the proposed reversal of the pipeline (Line 9) that runs from Sarnia, Ontario to Montreal, so that oil would flow to the Suncor refinery in Montreal, rather than the current flow the other way. (Sarnia sits just across the Michigan border.) The reversal is planned to continue beyond Montreal, and to include reversing the flow of a supply pipeline from Portland, ME, so that oil from Alberta might also supply northern New England. The project has been called “The Trailbreaker” and was postponed two years ago when demand declined.
The overall question of the relative roles of exports and imports in the Canadian oil supply equation was also raised in comments on the previous post. As a result I am going to look a bit more at the overall Canadian picture, particularly on the East Coast, and will postpone the oil sands piece a week (sorry, Neil). To begin, it helps to look at the pipeline projects that Enbridge currently has in mind.
Overview of the Enbridge planned pipeline projects (Enbridge)
The Ozark pipeline, shown running from Cushing to Wood River, was closed on Friday, September 30th, when it was found to be exposed where it crosses the bottom of the Mississippi River. It carries around 240 kbd, and the shutdown is precautionary, since no leak has been detected.
The dark green arrow from Cushing to Houston will be known as the Wrangler pipeline, which would carry up to 800 kbd from Cushing to the refineries along the Gulf, competing with the Keystone XL, which is planned to bring an additional 500 kbd down from Alberta (about 500 kbd flows through existing connections). The intent is to move oil away from Cushing, where a glut has developed.
With this surplus, and the ability to move oil around the Mid-West, the need for additional feed down from Montreal dissipates – hence the argument for reversing the flow, so that crude from Alberta can be sent back up that pipe to the refineries in Eastern Canada. At present oil comes by tanker to Portland, ME and then enters the Portland-Montreal Pipe Line (PMPL) system, flowing through either an 18-inch or a 24-inch line the 236 miles up to Montreal, and continuing on to Sarnia. A certain increase in energy security would be achieved for the East Coast of North America if the oil refined there were produced in North America, instead of coming from Russia and the Middle East. At present the flow would be reversed in only one of the pipes, the 18-inch line from Portland. Some 200 tankers a year offload in Portland, sending around 95 kbd up to Montreal. (The pipeline came into being in 1941, to avoid exposing oil tankers moving into Quebec to U-boats.)
Portland Montreal Pipe Line (Wikipedia)
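As a rough consistency check on the Portland figures (about 200 tanker offloads a year supporting roughly 95 kbd of pipeline flow), the implied average cargo per tanker can be backed out; this is arithmetic on the quoted numbers only, not data from the pipeline operator.

```python
# Implied average tanker cargo for the Portland, ME offloading figures
# quoted above: ~200 tankers a year supplying ~95 kbd to Montreal.
tankers_per_year = 200
flow_kbd = 95

barrels_per_year = flow_kbd * 1000 * 365
barrels_per_tanker = barrels_per_year / tankers_per_year
print(f"implied average cargo: {barrels_per_tanker:,.0f} barrels per tanker")
```

The result, somewhat over 170,000 barrels per delivery, is in the range of a modest crude tanker, so the two quoted figures hang together.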
If one looks at the overall flows of oil in Canada, the increasing reliance on Albertan oil can be appreciated.
Flows of Canadian crude oil in thousand cubic meters/day – (multiply by 6.3 to get kbd) in 2009 (Canadian NEB)
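The caption's unit conversion is worth spelling out: one cubic metre is about 6.29 barrels, which the caption rounds to 6.3. A minimal helper:

```python
# Convert the NEB's flow units (thousand cubic metres per day) to
# thousand barrels per day (kbd), as the caption above suggests.
BARRELS_PER_M3 = 6.2898  # standard oil-industry conversion, ~6.3

def thousand_m3_per_day_to_kbd(volume_km3d):
    """Convert thousand m3/day to thousand barrels/day (kbd)."""
    return volume_km3d * BARRELS_PER_M3

# Example: a flow reported as 100 thousand m3/day
print(f"{thousand_m3_per_day_to_kbd(100):.0f} kbd")
```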
Canadians consume around 1.5 mbd, of which half is gasoline, and 30% diesel. I discussed their exports in the last post.
Interestingly, the largest exporter of gasoline from Canada to the US lies further east: the Irving Oil refinery in Saint John, New Brunswick, a 300 kbd refinery that ships 75% of the gasoline it produces into the Eastern United States.
There are two other refineries in the region. The Dartmouth refinery at Halifax, Nova Scotia refines 82,000 bd of crude, largely for the Eastern provinces of Canada; the crude is off-loaded from tankers, refined, and then loaded onto smaller coastal tankers for delivery.
Yet further east is the Come by Chance refinery, now the North Atlantic Refinery, in Newfoundland. In its time it was one of the largest bankruptcies in Canadian history, but it now refines some 115 kbd of crude that arrives in 320,000 ton tankers at the nearby ice-free port. It still suffers from that bankruptcy, in that a subsequent sale was conditional on the product not being sold in Canada (outside of Newfoundland and Labrador), which means that the products largely ship to the United States.
Yet one must go even further East to find the oil fields of the Eastern Canada Sedimentary Basin. And unfortunately while the nearest refinery might be near an ice-free port, the fields themselves are not.
Eastern Canadian Sedimentary basins and hydrocarbon fields (Center for Energy )
The fields include White Rose, which is expected to yield around 230 million barrels at around 120 kbd, with a life of 12 – 15 years. First oil was in 2005.
Terra Nova is estimated at 406 million barrels. With a production of around 125 kbd it is anticipated to last 18 years. Production began in 2002.
Seasonal floating ice in these parts varies from 1.6 to 5 ft thick, and an average wind speed of around 20 mph also leads to ice buildup on the tankers and superstructure, making icing a consideration in operations, as are icebergs. (Remember the picture of the tug pulling an iceberg? That was taken around here.)
Hibernia is also producing at around 126 kbd, and by August 2010 had produced 704 million barrels of the 1.4 billion barrels potentially recoverable from the field. Approval has now been given to extend development to the south.
Current plans to develop Hebron, some 6 miles north of Terra Nova, also include the West Ben Nevis and Ben Nevis fields. Production is expected to begin in 2017, at a rate of around 150 – 180 kbd of oil and 200 – 350 kbd of water. Reserves are considered to be in the 400 – 700 million barrel range.
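The Hebron figures above imply a rough field life, assuming (unrealistically, since real fields decline) a constant production rate; a sketch using only the quoted ranges:

```python
# Implied field life for Hebron from the figures above: reserves of
# 400 - 700 million barrels produced at 150 - 180 kbd. A constant
# production rate is assumed purely for illustration.
def field_life_years(reserves_mmbbl, rate_kbd):
    """Years to produce the stated reserves at a constant rate."""
    return reserves_mmbbl * 1_000_000 / (rate_kbd * 1000 * 365)

low = field_life_years(400, 180)   # smallest reserves, fastest rate
high = field_life_years(700, 150)  # largest reserves, slowest rate
print(f"implied life: {low:.0f} to {high:.0f} years")
```

The bounding cases give roughly 6 to 13 years, which underlines the point that these East Coast fields, while useful, are not long-lived giants.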
Although these numbers are significant, and will supply a considerable fraction of the need in this part of the world and further south, they do not promise the sustained increases in production required to offset the declining output seen in many of the world's major oilfields. For help in that direction it is necessary to head west. And so it is time for the review of the oil sands.
There is, however, considerable controversy over the operation of the oil sands in Alberta, particularly from those objecting to the mess that is made of the landscape. In general the illustrations of oil sand operations that are used show the site during mining, when the black color of the sand makes for a dismal picture. It might be more informative to show what the sites are like afterwards. And so, as a prelude to more discussion, here is a photo of some reclaimed land. It is here that the wood bison (as opposed to buffalo) roam.
Reclaimed land after the oil is removed from the sand (CAPP )
Labels:
Canada,
Cushing,
Enbridge,
Hibernia,
Keystone XL,
Labrador,
Maine,
New Brunswick,
oil pipelines,
White Rose