30 Second bites of Info about Resource Estimation and Reconciliation

If you’d like advice, tips and hints about Resource Estimation and Reconciliation in quick, easy-to-read 30 Second bites… here they are!

30 Second bites of Resource Estimation Info
Authors: Ian Glacken – Director of Geology; Paul Blackney – Principal Consultant

What to do if you have conditional bias:  Get more and closer-spaced data, estimate into larger volumes (this generally works – but not always), think about a change of support if this isn’t possible, and choose an estimation method which doesn’t exaggerate the bias. Author: Ian Glacken

Be aware of the limitations of ID estimation:  Inverse distance (ID) is often touted as a viable alternative to Ordinary Kriging (OK), and it can be – but you must appreciate its two serious limitations and correct for them.  The first is that, unlike OK, ID makes no allowance for data clustering; the second is that the choice of a power is essentially arbitrary.  As long as you are aware of these issues, ID can sometimes be used as a ‘check’ technique for OK. Author: Ian Glacken
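
As a quick illustration (ours, not a standard implementation), here is a minimal Python sketch of an inverse-distance estimate; the function, coordinates and grades are all hypothetical. It shows both limitations at once: the power is a free parameter that shifts the answer, and a tight cluster of samples carries far more weight than a single isolated sample, because nothing in the method accounts for redundancy between close-spaced data.

```python
# Minimal inverse-distance weighting (IDW) sketch; illustrative only.
# Assumes no sample sits exactly at the block centre (distance > 0).
import numpy as np

def idw_estimate(sample_coords, sample_grades, block_centre, power=2.0):
    """Inverse-distance weighted mean of the samples at a block centre."""
    distances = np.linalg.norm(sample_coords - block_centre, axis=1)
    weights = 1.0 / distances ** power   # the power is an arbitrary choice
    weights /= weights.sum()             # normalise weights to sum to 1
    return float(np.dot(weights, sample_grades))

# Two tightly clustered samples near the block centre plus one lone, more
# distant sample: the cluster dominates the estimate (no declustering, unlike
# OK), and the estimate also shifts with the chosen power.
coords = np.array([[20.0, 20.0], [21.0, 20.0], [60.0, 60.0]])
grades = np.array([2.0, 2.1, 0.5])
for p in (1.0, 2.0, 3.0):
    est = idw_estimate(coords, grades, np.array([30.0, 30.0]), power=p)
    print(f"power {p}: estimate {est:.3f} g/t")
```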

Always look at a range of top-cut (capping) techniques:  There is no single ‘magic bullet’ technique for top-cutting a set of data; the appropriate top-cut will depend upon the nature of the data, the support of the data and the location of the sample points.  Best practice is always to look at two or three different approaches to cutting and see whether two or more of them agree.  If the outcome is sensitive to the top-cut level chosen, then it will pay to carry out a sensitivity test on different cuts. Author: Ian Glacken
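
As a purely illustrative example of looking at several methods side by side, the Python sketch below applies three common top-cut candidates – two percentile-based cuts and a mean-plus-two-standard-deviations cut – to the same skewed data set and reports the effect on the mean grade and on contained metal. The data and thresholds are hypothetical, not a recommendation.

```python
# Compare several candidate top-cuts on one data set; illustrative only.
import numpy as np

rng = np.random.default_rng(42)
assays = rng.lognormal(mean=0.0, sigma=1.2, size=500)  # skewed, gold-like g/t

candidates = {
    "97.5th percentile": np.percentile(assays, 97.5),
    "99th percentile":   np.percentile(assays, 99.0),
    "mean + 2 std dev":  assays.mean() + 2.0 * assays.std(),
}

for name, cap in candidates.items():
    capped = np.minimum(assays, cap)                   # apply the cap
    metal_removed = 1.0 - capped.sum() / assays.sum()  # lost contained metal
    print(f"{name}: cap {cap:.2f} g/t, mean {assays.mean():.2f} -> "
          f"{capped.mean():.2f} g/t, metal removed {metal_removed:.1%}")
```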

Capping before compositing or after?  This is a controversial topic with adherents on both sides, although most believe that capping should be carried out after compositing, when the data have equal support.  If you are going to cap before you composite the data, then the capping should be carried out on the variable as ‘metal’ or accumulation (grade times length), rather than on the assay itself. Author: Ian Glacken
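
If capping before compositing is unavoidable, the arithmetic might look like the short sketch below, which caps the accumulation (grade times length) and back-transforms to grade; the sample values and the capping percentile are hypothetical.

```python
# Capping on accumulation (grade x length) before compositing; illustrative.
import numpy as np

grades  = np.array([1.2, 35.0, 0.8, 4.5])  # g/t, samples of unequal length
lengths = np.array([1.0, 0.3, 2.0, 1.0])   # m

accumulation = grades * lengths            # metal per sample (g/t x m)
cap = np.percentile(accumulation, 95)      # capping level is illustrative only
capped_grades = np.minimum(accumulation, cap) / lengths  # back to grade
print(np.round(capped_grades, 2))
```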

Don’t ignore the effects of clustering on data statistics:  If you don’t correct for clustering, any statistics that you derive from the data will be wrong.  Don’t assume that just because a set of drilling data is (more or less) on a regular grid there are no clustering effects.  It only takes a few minutes to look at the effects of clustering, and it can save (or damn) your project.
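
Cell declustering is one quick way to make that check. The sketch below is a minimal, hypothetical illustration: each sample is weighted by the inverse of the number of samples sharing its cell, so a tight cluster of high grades no longer drags the mean upwards.

```python
# Minimal cell declustering; cell size and data are hypothetical.
import numpy as np

def declustered_mean(coords, values, cell_size):
    """Weight each sample by 1/(samples in its cell), then average."""
    cells = np.floor(coords / cell_size).astype(int)   # assign samples to cells
    _, inverse, counts = np.unique(cells, axis=0,
                                   return_inverse=True, return_counts=True)
    weights = 1.0 / counts[inverse.ravel()]            # downweight clustered data
    weights /= weights.sum()
    return float(np.dot(weights, values))

coords = np.array([[1.0, 1.0], [1.5, 1.2], [1.2, 0.8],   # high-grade cluster
                   [40.0, 40.0], [80.0, 10.0]])          # two lone samples
values = np.array([5.0, 5.5, 6.0, 1.0, 0.8])
print(f"naive mean:       {values.mean():.2f}")
print(f"declustered mean: {declustered_mean(coords, values, 10.0):.2f}")
```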

Remember that declustered data statistics are not perfect: Applying declustering is a good idea for many drillhole data sets, but recognise that it’s not a perfect process.  During optimisation of the declustering process, it should be apparent that parameter selection varies the outcome to some degree.  Recall this imprecision when comparing whole-of-domain grade statistics after block grade estimation – differences may be within the precision of the declustering process and therefore not necessarily worthy of follow-up action. Author: Paul Blackney

Every step of the resource estimation process should be validated in some way:  While this may sound onerous and time-consuming, it doesn’t need to be.  For instance, check the downhole compositing process by determining the total sample length before and after compositing – it should be the same or very nearly so.  The cause of any difference should be determined and either corrected or understood before proceeding. Author: Paul Blackney
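
For instance, a compositing check of that kind takes only a few lines; in the hypothetical sketch below the interval column names (‘from’, ‘to’) and the one-percent tolerance are assumptions, not a standard.

```python
# Check total sampled length before and after compositing; illustrative only.
import pandas as pd

def check_composite_lengths(raw: pd.DataFrame, comps: pd.DataFrame,
                            tol: float = 0.01) -> None:
    """Raise if total composited length drifts from the raw sample length."""
    raw_len  = (raw["to"] - raw["from"]).sum()
    comp_len = (comps["to"] - comps["from"]).sum()
    diff = abs(raw_len - comp_len)
    print(f"raw {raw_len:.2f} m, composites {comp_len:.2f} m, diff {diff:.2f} m")
    if diff > tol * raw_len:   # >1% difference: investigate before proceeding
        raise ValueError("composited length differs from raw sample length")
```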

Try to formalise the confidence in your resource categories:  At least for a given deposit and/or commodity, try to communicate to all the stakeholders what error (plus or minus, in metal terms) is meant by Measured, Indicated or Inferred Resources.  Everyone will have an idea; you just want everybody to have the same idea for the same category.  Commercial personnel and engineers will generally have a more optimistic view of the confidence intervals than geologists – you all need to be on the same page to manage expectations. Author: Ian Glacken

Don’t assume that geological and grade continuity are the same thing:  In an ideal world your deposit will have both geological and grade continuity – but they are not the same, nor always present to the same degree.  Some strongly continuous geological structures can have very poor grade continuity, and some discontinuous structures can in fact have good grade continuity.  The scale of examination is important, and the most important scale is the scale of mining. Author: Ian Glacken

The volume-variance relationship is the key to resource estimation:  Understanding the relationship between samples of a certain size (support) and their variability, and how this variability changes at different supports, can explain many of the apparent problems in a resource estimate.  Using one scale of estimate when you need a smaller or a larger scale is still a very common mistake. Author: Ian Glacken
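
The direction of the effect is easy to demonstrate numerically. In the hypothetical sketch below, uncorrelated point samples are averaged into progressively larger ‘blocks’ and the variance falls as 1/n; in a real deposit spatial correlation slows the decline, but the principle is the same.

```python
# Volume-variance effect with uncorrelated samples; illustrative only.
import numpy as np

rng = np.random.default_rng(0)
points = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # point-support grades

for n in (1, 4, 16, 64):      # number of point samples averaged per "block"
    blocks = points[: (len(points) // n) * n].reshape(-1, n).mean(axis=1)
    print(f"support {n:>2} samples/block: variance {blocks.var():.3f}")
```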

30 Second bites of Reconciliation Info
Author: Ian Glacken – Director of Geology

Stockpile volumes:  Remember that if you have a large stockpile, and particularly one which has a second layer on it, you will need to increase the loose rock bulk density, as compression will have occurred from trucks driving over the broken rock.  You may need to increase the loose rock bulk density by as much as 30%.  It may pay to survey the volume of a discrete area and compare this with the tonnage removed from it (as measured by the primary crusher weightometer) to calibrate the bulk density.
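
The calibration arithmetic itself is one division, as the hypothetical figures below show: the dry tonnes measured over the weightometer divided by the surveyed volume gives the back-calculated loose bulk density.

```python
# Back-calculate loose bulk density from a trial area; figures are hypothetical.
surveyed_volume_m3 = 12_500.0   # surveyed volume of the discrete stockpile area
crusher_tonnes_dry = 23_750.0   # dry tonnes removed, per crusher weightometer

calibrated_density = crusher_tonnes_dry / surveyed_volume_m3
print(f"calibrated loose bulk density: {calibrated_density:.2f} t/m3")

nominal_loose_density = 1.60    # t/m3 before compaction, hypothetical
increase = calibrated_density / nominal_loose_density - 1.0
print(f"implied compaction increase: {increase:.0%}")   # ~19% here
```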

Accounting for development volumes underground:  Unless development is strictly on lines and the ore is either much wider or much narrower than the development face, the drives may meander and stripping cuts may need to be taken to straighten up the profiles.  Don’t forget to account for any ore stripping in the end of month development volumes.

Grab samples from ore passes:  Deriving grades from underground ore passes, especially in precious metal mines, is a hit-and-miss activity at best, and may give a woefully biased sample at worst.  However, if you must sample ore passes (and it’s not recommended), take lots of small samples (5-10 kg) over a few days, rather than one larger (20-30 kg) sample per shift.

Stockpile sampling methods that don’t work:  All of them.  Well, apart from processing the entire stockpile (and nothing else) through a two-stage crushing system and taking an unbiased sample through a stream cutter, a dry Vezin sampler or, at a pinch, a cross-belt sampler.

Beware temporary stockpiles underground: These may be the home for parcels of ore for a few shifts and may not be part of the official stockpile list.  It is very easy to miscount or double count material movements when temporary and ephemeral stockpiles are used.  A robust material tracking system may help to resolve counting issues but extreme diligence is called for!

Process mapping is the key to reconciliation:  The heart of a robust reconciliation system is understanding your current process flow, in terms of measurement points, stockpiles, surveys, moisture, tonnage and grade recording, materials handling and re-handling, blending, and the treatment of ore from diverse sources.  Once you understand this, you can streamline or refine the documentation and recording.

Where is the mine/mill changeover point?  Surprisingly, it’s not always at the same place at every mining operation.  The logical point for the end of the mine recording (claimed tonnage and grade) is when ore goes into the primary crusher, but in some cases the ‘mine’ stream stops at the crushed ore stockpile (the cone) and in others the mill stream starts at the ROM pad.  Once this transition point is fixed and understood, a much clearer comparison of ‘claimed’ and ‘actual’ figures follows.

Don’t ignore moisture in reconciliation!  Although much of Australia is dry we do have wet spells, and any mine in the tropics or in the northern hemisphere is going to have to cope with water to a greater or lesser degree at some time during the year.  Every reconciliation reading needs to be on a dry tonnes basis, which means that for wet (or potentially wet) readings, moisture needs to be measured and subtracted.  Thankfully this is an easy task – but one that is often ignored.
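
The correction itself is a single line of arithmetic, as in this hypothetical example:

```python
# Convert a wet tonnage reading to dry tonnes; figures are hypothetical.
wet_tonnes = 10_000.0
moisture_fraction = 0.06   # 6% moisture by weight, measured for this parcel

dry_tonnes = wet_tonnes * (1.0 - moisture_fraction)
print(f"dry tonnes: {dry_tonnes:,.0f}")   # 9,400 t
```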

Spreadsheets aren’t databases:  Spreadsheets are the cornerstone of reconciliation reporting and analysis but are not the best repositories for raw data.  Any large spreadsheet which is not locked down is almost certain to have errors, and if this is the main storage area for the raw reconciliation data then there will be problems.  Databases, whether through a production tracking system or a custom industrial-strength application, are no-brainers for storing raw reconciliation data.

Who does the reconciliation?  It shouldn’t matter!  A robust and auditable reconciliation system is one within which the outcome doesn’t depend upon the operator.  In other words, the flow and the data reduction and processing should be totally objective – any approach which relies on individual subjectivity is doomed to failure.
