Comments on the general approach to modeling depletion

Discuss research and forecasts regarding hydrocarbon depletion.

Re: Comments on the general approach to modeling depletion

Unread postby DigitalCubano » Tue 17 Jan 2006, 03:43:12

WebHubbleTelescope wrote:If someone wants to do this kind of calculation without having to pay the government's consultants again with our tax dollars, let's do it here.


...and off he goes on another red herring over irrelevant minutiae. They used an Excel plug-in to run a Monte Carlo analysis. Big-friggin'-deal.

Now onto the next step, WHT. Are we due for another figure from your model or are we still short a diatribe on the font size of the report?

Re: Comments on the general approach to modeling depletion

Unread postby WebHubbleTelescope » Tue 17 Jan 2006, 09:35:05

DigitalCubano wrote:
WebHubbleTelescope wrote:If someone wants to do this kind of calculation without having to pay the government's consultants again with our tax dollars, let's do it here.


...and off he goes on another red herring over irrelevant minutiae. They used an Excel plug-in to run a Monte Carlo analysis. Big-friggin'-deal.

Now onto the next step, WHT. Are we due for another figure from your model or are we still short a diatribe on the font size of the report?


Thanks for the compliment. No one else seems to cover the minutiae (and the obvious). You finally get what I am trying to do!

For instance, take a look at the way the USGS describes the Monte Carlo analysis. I'd like to track down exactly how they come up with their discovery predictions.
[figure: USGS illustration of their Monte Carlo discovery analysis]

One thing I find striking is that their probability of a discovery is biased away from Time=0, which I think implies some sort of start-up cost or latency. Alternatively, it could be a form of Poisson statistics: if they are looking at the time to accumulate some number (N) of discoveries, then for N>0 that waiting-time distribution is also biased away from zero. The size distribution looks like a wigged-out lognormal.
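As a sketch of that Poisson reading (this is my own illustration, not the USGS's actual procedure, and all parameter values here are made up): if discoveries arrive as a Poisson process, the waiting time to the N-th discovery is gamma-distributed, and its density is pushed away from Time=0 for N>1; field sizes can then be drawn from an assumed lognormal.

```python
import random

# Hypothetical sketch (not the USGS code): discovery timing as a Poisson
# process, field sizes as a lognormal.  The waiting time to the N-th
# Poisson event is gamma-distributed, so for N > 1 its density is biased
# away from Time=0 -- one way to read the shape of their timing curve.

def nth_discovery_time(n, lam, rng):
    """Arrival time of the n-th event of a rate-`lam` Poisson process."""
    return sum(rng.expovariate(lam) for _ in range(n))

def field_size(mu, sigma, rng):
    """Field size drawn from an assumed lognormal distribution."""
    return rng.lognormvariate(mu, sigma)

rng = random.Random(2006)
times = [nth_discovery_time(3, 1.0, rng) for _ in range(10000)]
sizes = [field_size(0.0, 1.0, rng) for _ in range(10000)]

# Mean arrival time of the 3rd event at rate 1 is N/lam = 3 (exact fact
# about the gamma distribution; the sample mean should land near it).
print(sum(times) / len(times))
print(sum(sizes) / len(sizes))
```

Running enough of these draws and histogramming them is essentially what the Excel plug-in does.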

Overall it looks like straight probability and statistics with little or no geology knowledge required, which is what I have been saying all along.

Re: Comments on the general approach to modeling depletion

Unread postby ReserveGrowthRulz » Tue 17 Jan 2006, 12:22:43

WebHubbleTelescope wrote:Overall it looks like straight probability and statistics with little or no geology knowledge required, which is what I have been saying all along.


I think it unlikely that the USGS would do a Monte Carlo simulation as a stand-alone assessment without geology behind it. Considering that WHT apparently has a vested interest in NOT finding that geology (and, when someone points it out to him, in discrediting the work if at all possible), does anyone else want to look and show him he's just wrong again, or do I have to go do this as well?

Re: Comments on the general approach to modeling depletion

Unread postby WebHubbleTelescope » Tue 17 Jan 2006, 21:25:56

ReserveGrowthRulz wrote:
WebHubbleTelescope wrote:Overall it looks like straight probability and statistics with little or no geology knowledge required, which is what I have been saying all along.


I think it unlikely that the USGS would do a Monte Carlo simulation as a stand-alone assessment without geology behind it. Considering that WHT apparently has a vested interest in NOT finding that geology (and, when someone points it out to him, in discrediting the work if at all possible), does anyone else want to look and show him he's just wrong again, or do I have to go do this as well?


Somebody please show me how I have gotten something fundamentally wrong. I've been working the math since last June and it seems to provide a good foundation to build from.
Code:
    R(t+dt) = R(t) + (T(t) - R(t) * E(t)) * dt
    P(t)    = E(t) * R(t)

where

    R(t) = current reserves
    T(t) = discovery curve (e.g., triangular)
    E(t) = extraction rate (yearly or daily)
    P(t) = yearly (or daily) production

Other, "virtual" reserves can be inserted into the flow to model latencies. The Monte Carlo part comes in when you want to estimate the T(t) distribution.
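Here is a minimal Python sketch of those difference equations, under made-up parameter values (triangle from year 0 to 80 peaking at 30, a total of 1000 units discovered, constant E = 0.05/yr), just to show the mechanics:

```python
# Sketch of the reserve-flow model above, Euler-stepped yearly.
# All parameter values are illustrative, not fitted to any real basin.

def triangular_T(t, t_start, t_peak, t_end, total):
    """Triangular discovery curve whose area integrates to `total`."""
    height = 2.0 * total / (t_end - t_start)
    if t_start <= t <= t_peak:
        return height * (t - t_start) / (t_peak - t_start)
    if t_peak < t <= t_end:
        return height * (t_end - t) / (t_end - t_peak)
    return 0.0

def simulate(years=200, dt=1.0, E=0.05):
    R, reserves, production = 0.0, [], []
    for i in range(int(years / dt)):
        t = i * dt
        T = triangular_T(t, 0.0, 30.0, 80.0, 1000.0)
        P = E * R                     # P(t) = E(t) * R(t)
        R = R + (T - P) * dt          # R(t+dt) = R(t) + (T(t) - R(t)*E(t))*dt
        reserves.append(R)
        production.append(P)
    return reserves, production

reserves, production = simulate()
# Production peaks some years after the discovery peak, and every unit
# discovered is either produced or still sitting in R (mass balance).
print(production.index(max(production)))
```

The lag between the discovery peak and the production peak falls out of the R(t) integration for free, which is part of the appeal.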

I have several objectives for this approach, and I'd be happy to achieve any one of them:
1. Introducing something fundamentally new to the discussion.
2. Going beyond the heuristics and empirical relationships that I see plastered everywhere, to an approach that applied-math-minded people can understand.
3. Resurrecting some old but perhaps forgotten techniques buried in the literature (so far nothing has shown up).
4. Showing analogies to other physical processes, like an RC circuit in electronics or a first-order damped system in mechanical dynamics.
5. Using the formulation to analyze history or make predictions based on current data.
6. Demonstrating an alternative to, and the weaknesses of, conventional approaches such as the logistic curve and the Gaussian.

{update} 7. Coming up with an open-source modeling environment, where all source code and data are made public.
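On item 4, the analogy is exact for constant inputs: dR/dt = T0 - E*R is the same first-order linear ODE as an RC circuit charging toward a step input, with time constant 1/E, and it has the closed form R(t) = (T0/E)(1 - exp(-E*t)). A quick check of the Euler stepping against that closed form (illustrative numbers only):

```python
import math

# RC-circuit analogy: constant discovery input T0, constant extraction
# rate E.  dR/dt = T0 - E*R  ==>  R(t) = (T0/E) * (1 - exp(-E*t)),
# the step response of an RC circuit with time constant 1/E.

def euler_R(T0, E, t_end, dt=0.001):
    """Euler-integrate dR/dt = T0 - E*R from R(0) = 0."""
    R = 0.0
    for _ in range(int(t_end / dt)):
        R += (T0 - E * R) * dt
    return R

def exact_R(T0, E, t):
    """Closed-form solution of the same ODE."""
    return (T0 / E) * (1.0 - math.exp(-E * t))

T0, E = 10.0, 0.05
print(euler_R(T0, E, 40.0), exact_R(T0, E, 40.0))
```

The steady-state reserve level is T0/E, just as the capacitor charges to the supply voltage.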

fire away
