August 2017

Geology, grade and geometallurgical modelling – spoilt for choice?

  • By Ian Glacken FAusIMM(CP), Geology Director, Optiro

The breadth of tools available to aid geological, grade and geometallurgical modelling is as extensive as it has ever been, but evaluators need to ensure that the methods and products chosen, the data exchange between them and the interaction of the various software packages are communicated adequately through workflow mapping.

The evolution of modelling practices

When I first started the long journey in resource estimation, modelling using digital technology was simple – it couldn’t be done – or rather, the choice of methods and software was stark. Generating a three-dimensional shape was slow and painful. The choice of grade estimation techniques was limited to the polygonal/sectional/nearest neighbour approach, and creating block models, let alone viewing them, was a bridge too far. Geostatistics was limited to theory on paper, and even spreadsheets, those modern-day Swiss Army knives, were in their infancy. Data was invariably keyed in by hand, resulting in many entry and transcription errors, not to mention the effects of mind-numbing boredom leading to further slips of the finger.

Essentially, if one did not want to do the whole estimation on paper (and most did, which perhaps is no bad thing), the aid provided by digital tools was severely limited. The annual Mineral Resource estimation process at the large mining operation where I carried out my geological apprenticeship stretched the resources of the (manual) drafting section and severely dented Australia’s paper mill output.

I am as happy as anyone to note that times have changed a great deal, though perhaps not as much as in most other technology-driven industries, notably the petroleum sector. Nonetheless, the resource estimator of the second decade of the 21st century has a bewildering array of tools at their disposal, all pressed into the service of not only grade and volume models, but also models of lithology, weathering, alteration, mineralogy, geomechanical and geotechnical variables, acid consumption and generation, and metallurgical recovery, to name an incomplete list.

Even in straightforward single-commodity deposits such as a structurally-hosted orogenic gold mine, it is highly likely that there will be accessory commodities such as sulfur (as a proxy for flotation of sulfide-hosted gold, for instance), by-product silver or contaminants such as copper, arsenic or bismuth. So the task is considerably larger than twenty or thirty years ago, but the array of tools at the disposal of the modern estimator is correspondingly broader, to the point where choosing software products for a resource estimate can be almost overwhelming. Beyond the productivity gains from huge increases in processing and visualisation power, the technological advances are remarkable in their diversity – notably so for a conservative, generally technically risk-averse industry such as mining – and the choice and power on offer will only increase.

Too much choice?

But is there a danger of too much choice? While the major mining software vendors would like us to think that we would carry out almost all of the geological, grade and geometallurgical modelling functions using only their software, companies other than exploration juniors will typically use four, five or even more products for resource evaluation projects, each with their own specialty and niche functions. This is not a bad thing; it ensures that the best-in-class products are preferred, and through increased licensing and sales, not to mention major company patronage, those products and functions continue to develop and persist as market leaders.

So it then becomes an issue of the workflow, the data interchange between the various products, and the communication and documentation of what is becoming an increasingly sophisticated set of tasks and activities leading to products (the models) that go far beyond simple grade and tonnage reporting. So while we should embrace the diversity in modelling technology and the advantages it gives us, we need to ensure that the understanding and documentation of what we are doing is not overlooked in the glare of the shiny tools we use these days.

As an example of the sort of benefits modern resource evaluation products generate, as a consulting group we recently worked on a project where a drill hole database included vein dip and strike measurements from oriented core. These were supplemented by in-pit readings of the same suites of veins. The orebody had been structurally subdivided into a series of lithostructural units by cross-cutting faults and by stratigraphic contacts. Using the tools in several of the leading mining software products, we were able to:

  • input the combination of drill hole dip and dip direction readings and in-pit measurements
  • subdivide these by lithostructural domain
  • determine the predominant vein orientations per domain
  • highlight subpopulations on the stereonets
  • see where these particular hole and pit readings were located in 3D space
  • generate spatial subdomains
  • export the preferred orientations to a geostatistical analysis software product for further data processing.

Nothing particularly remarkable, but the relative ease and speed with which this set of actions was completed would have been unthinkable perhaps only five years ago.
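
As an illustration only, the sketch below reproduces one of those steps – deriving a predominant vein orientation per lithostructural domain – outside any commercial package. The domain names, readings and column labels are all invented; this is a minimal sketch, not a substitute for the stereonet tools described above.

```python
import numpy as np
import pandas as pd

# Hypothetical structural readings: dip / dip direction per lithostructural domain
readings = pd.DataFrame({
    "domain":  ["LSU1", "LSU1", "LSU1", "LSU2", "LSU2"],
    "dip":     [62.0, 58.0, 65.0, 40.0, 44.0],       # degrees below horizontal
    "dip_dir": [118.0, 124.0, 115.0, 250.0, 244.0],  # degrees clockwise from north
})

def pole_vectors(dip, dip_dir):
    """Upward unit normals (east, north, up) to planes given by dip / dip direction."""
    d, a = np.radians(dip), np.radians(dip_dir)
    return np.column_stack([-np.sin(d) * np.sin(a),
                            -np.sin(d) * np.cos(a),
                            np.cos(d)])

def mean_plane(group):
    """Resultant of the pole vectors, converted back to dip / dip direction.
    A plain vector mean is only safe for poles clustered in one hemisphere;
    production work would use the eigenvectors of the orientation tensor."""
    n = pole_vectors(group["dip"].to_numpy(), group["dip_dir"].to_numpy()).sum(axis=0)
    n /= np.linalg.norm(n)
    dip = np.degrees(np.arccos(n[2]))
    dip_dir = np.degrees(np.arctan2(-n[0], -n[1])) % 360.0
    return pd.Series({"dip": dip, "dip_dir": dip_dir, "readings": len(group)})

print(readings.groupby("domain")[["dip", "dip_dir"]].apply(mean_plane))
```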

Apart from the very early days of modelling software in the 1980s, evaluators have always had to rely on a mix of products to achieve their desired results. Nowadays, choosing the right software and ensuring that the data and parameters are transferable between the various offerings is more important than ever. Thankfully, most of the major vendors of ‘specialised’ software – that is, not the so-called generalised mining packages (GMPs) – take special care to ensure both that data is easily imported and that the products of the processing – the models, solids, analyses or parameters – can be exported for further processing.

Could the work be done using one product?

But are there advantages in using a single GMP for all evaluation activities? When we look at the broad workflow for a geological, grade and geometallurgical modelling assignment, we may consider the following activities and their use of digital software.

Data collection and storage

This includes geoscientific data, primarily drill holes, but potentially grab samples, pseudo drill holes (such as face channels or chips) and often drilling metadata. In almost all cases the evaluator will have access to dedicated, industrial-strength geoscientific database packages that offer a multitude of import modes, on-entry validation, multiple layers of security and access, and flexible query and extraction. It is certainly possible to use the database facilities of a GMP as the primary storage for geoscientific data, but a dedicated geoscientific database will almost always serve better.
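
As a flavour of the on-entry validation such packages provide, the fragment below sketches two of the simplest checks – inverted sample intervals and samples extending beyond the hole – against hypothetical collar and assay tables (all names and values are invented).

```python
import pandas as pd

# Hypothetical extracts; a real geoscientific database runs such checks on entry
collars = pd.DataFrame({"hole_id": ["DH001", "DH002"],
                        "total_depth": [120.0, 85.0]})
assays = pd.DataFrame({"hole_id": ["DH001", "DH001", "DH002"],
                       "from_m": [0.0, 1.0, 80.0],
                       "to_m":   [1.0, 0.5, 90.0],
                       "au_ppm": [0.8, 1.2, 3.4]})

merged = assays.merge(collars, on="hole_id")
bad_interval = merged["to_m"] <= merged["from_m"]     # inverted or zero-length intervals
beyond_hole = merged["to_m"] > merged["total_depth"]  # samples past the end of the hole
print(merged[bad_interval | beyond_hole])
```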

Data quality assessment

The examination and assessment of both data collection quality (collar and downhole surveys) and assay/density quality is sometimes addressed by the major database vendors through add-ins, but mostly via home-grown software – Excel or other products – which varies between practitioners. The logical home for quality control (QC) analysis is the database product, since the extraction and formatting of the data is so closely allied to the other database activities. But the fact is that the database add-on packages do not currently address all the needs of a robust and detailed QC analysis, which must cover the treatment and assessment of accuracy (standards), contamination (blanks), precision (duplicates), particle size analysis, reporting of anomalies and correction through a variety of tools. So we invariably see an export of suitably-formatted QC data from the primary database. Nor do the GMPs currently appear to have the tools to address all the requirements of a comprehensive quality assurance/quality control (QA/QC) system.
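
To make the three core tests concrete, here is a minimal sketch of how accuracy, contamination and precision checks might be coded once the QC data has been exported. The thresholds and values are invented, and a production QA/QC system would add charting, batch tracking and failure workflows.

```python
import numpy as np
import pandas as pd

# Hypothetical QC extract from the primary database (all values invented)
qc = pd.DataFrame({
    "type":     ["standard", "standard", "blank", "duplicate", "duplicate"],
    "expected": [2.50, 2.50, 0.0, np.nan, np.nan],   # certified value where applicable
    "original": [np.nan, np.nan, np.nan, 1.10, 0.45],
    "assay":    [2.61, 2.42, 0.02, 1.02, 0.52],      # lab result, Au g/t
})

# Accuracy: relative bias of standards against the certified value
stds = qc[qc["type"] == "standard"]
bias = (stds["assay"] - stds["expected"]) / stds["expected"]
print(f"mean standard bias: {bias.mean():+.1%}")

# Contamination: blanks flagged above a nominal threshold
# (here an assumed five times a 0.01 g/t detection limit)
blanks = qc[qc["type"] == "blank"]
print("failed blanks:", (blanks["assay"] > 0.05).sum())

# Precision: half absolute relative difference (HARD) for duplicate pairs
dups = qc[qc["type"] == "duplicate"]
hard = 0.5 * (dups["assay"] - dups["original"]).abs() / (0.5 * (dups["assay"] + dups["original"]))
print("duplicate pairs with HARD > 10%:", (hard > 0.10).sum())
```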

Geological and geometallurgical modelling

Whether of lithologies, structures, weathering, alteration, mineralogy, processing characteristics, or combinations thereof, modelling is often the purview of the GMPs, which tackle most of the procedures and processes required with a greater or lesser degree of efficacy. However, based upon observations of current practice, the use of specialised shape modelling software such as Leapfrog is becoming almost ubiquitous. In the increasingly important area of implicit modelling, the GMPs aspire to be a one-stop shop, with varying degrees of success. Most evaluators use a combination of products and techniques, with the final geological and geometallurgical outputs – generally solids or surfaces – finding their way into a GMP for interaction with drilling and sampling data. In most cases the transfer is straightforward, so the use of a specialist package such as Leapfrog does not pose any issues.
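
Implicit modelling is, at heart, the interpolation of a signed-distance-like field whose zero surface is the modelled contact. The toy two-dimensional sketch below uses a radial basis function interpolant – the broad family of methods behind implicit modelling tools – on invented data; real implementations add anisotropy, trends, polarity handling and meshing.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical 2D section: signed distances to a lithological contact,
# negative inside the unit, positive outside (all values invented)
xy = np.array([[0., 0.], [10., 2.], [20., 1.], [5., 8.], [15., 9.], [8., -6.]])
signed_dist = np.array([-1.0, -0.5, -1.2, 2.0, 1.5, 3.0])

# Radial basis function interpolant of the signed distance field
field = RBFInterpolator(xy, signed_dist, kernel="linear")

# Evaluate on a grid; the contact is the zero isoline of the field
gx, gy = np.meshgrid(np.linspace(0, 20, 50), np.linspace(-10, 10, 50))
grid = field(np.column_stack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
inside = grid < 0  # boolean mask of the modelled unit
print(f"modelled unit occupies {inside.mean():.0%} of the section")
```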

Data conditioning

The flagging, coding and compositing of data is generally carried out within the GMP, where there are a variety of treatments of the data to provide a set of coded composites. Care needs to be taken with importing flagged composites from one package to another, as the desurveying routine – the interpolation of the downhole surveys between measured points – does vary between implementations and can cause headaches in some transfers. Some of the more advanced data conditioning tasks – such as contact analysis or declustering – are available in some GMPs but are generally found in specialist statistical or geostatistical software.
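
The compositing half of this step is easy to state precisely. The sketch below is a deliberately minimal length-weighted downhole compositor on invented data; it ignores gaps, density weighting, residual composites and the desurveying subtleties noted above.

```python
import numpy as np
import pandas as pd

def composite(assays, length=2.0):
    """Length-weighted downhole compositing to fixed intervals (a minimal
    sketch: assumes holes start at 0 m and ignores gaps and residuals)."""
    out = []
    for hole, g in assays.groupby("hole_id"):
        end = g["to_m"].max()
        for top in np.arange(0.0, end, length):
            base = min(top + length, end)
            # overlap of each assay interval with the composite window
            ov = (np.minimum(g["to_m"], base) - np.maximum(g["from_m"], top)).clip(lower=0)
            if ov.sum() > 0:
                out.append({"hole_id": hole, "from_m": top, "to_m": base,
                            "au_ppm": np.average(g["au_ppm"], weights=ov)})
    return pd.DataFrame(out)

assays = pd.DataFrame({"hole_id": ["DH001"] * 3,
                       "from_m": [0.0, 1.5, 2.5], "to_m": [1.5, 2.5, 4.0],
                       "au_ppm": [0.8, 2.0, 1.1]})
print(composite(assays))
```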

Statistics and geostatistics

All the major GMPs provide facilities for exploratory data analysis of flagged and composited data, and all allow for the generation of variograms for the definition of spatial continuity. However, the ease of variogram directional choice and modelling does vary between GMPs, and many evaluators choose to generate variograms within ‘specialist’ software. Use of these specialist products becomes more relevant when coestimation – the use of correlated variables such as gold and sulfur, for instance – is being considered. In particular, many geometallurgical variables interact and require more advanced variogram modelling. In general, the export of composited sample data to a specialist product is straightforward, as is the import of the modelled parameters. Where the specialist software packages excel is in the optimisation of estimation parameters, which is still an advanced option (or is not present) in many GMPs.
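
For orientation, the core calculation behind any variography tool is simple to sketch. Below is a minimal omnidirectional experimental semivariogram on synthetic data (uncorrelated values, so it shows pure nugget); real workflows add directional tolerances, declustering weights and model fitting.

```python
import numpy as np

def experimental_variogram(coords, values, lags, tol=0.5):
    """Omnidirectional experimental semivariogram: half the mean squared
    difference over pairs whose separation falls within each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    i, j = np.triu_indices(len(values), k=1)   # each pair counted once
    gammas = []
    for h in lags:
        mask = np.abs(d[i, j] - h) <= tol
        gammas.append(sq[i, j][mask].mean() if mask.any() else np.nan)
    return np.array(gammas)

rng = np.random.default_rng(7)
coords = rng.uniform(0, 100, size=(200, 2))  # hypothetical composite locations
values = rng.normal(1.0, 0.3, size=200)      # hypothetical grades (no spatial structure)
print(experimental_variogram(coords, values, lags=np.arange(5, 50, 5)))
```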

Estimation, validation and classification

The initial primary purpose of most of the GMPs was the generation of block models within constraining solids, and all the main products perform this task particularly well. Some of the specialist geostatistical products also carry out estimation, generally with more advanced options than the GMPs. The visualisation features within the GMPs allow for rapid visual validation of the modelling against the input samples, and the more sophisticated validation tasks can generally be carried out within the more extensive GMPs – but it may be necessary for the evaluator to export the models to a more specialised package for some of the necessary validation. Again this is generally a relatively easy task; in some cases the native digital block model format of the GMP can be read directly, without an export step. Classification, using either a formula or a graphical selection of blocks, is also very easily achieved within a GMP.
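
One of the standard validation exercises – the swath plot – reduces to a simple grouped comparison, sketched below on synthetic data. The column names and 50 m swath width are arbitrary assumptions; a real check would also plot the swaths and weight the composites for clustering.

```python
import numpy as np
import pandas as pd

# Hypothetical northing swaths: compare block-model means against
# (ideally declustered) composite means, slice by slice
rng = np.random.default_rng(11)
comps = pd.DataFrame({"north": rng.uniform(0, 500, 1000),
                      "au": rng.lognormal(0.0, 0.5, 1000)})
blocks = pd.DataFrame({"north": rng.uniform(0, 500, 4000),
                       "au": rng.lognormal(0.0, 0.35, 4000)})  # smoother, as kriging would be

bins = np.arange(0, 501, 50)
swath = pd.DataFrame({
    "composites": comps.groupby(pd.cut(comps["north"], bins), observed=True)["au"].mean(),
    "model":      blocks.groupby(pd.cut(blocks["north"], bins), observed=True)["au"].mean(),
})
swath["rel_diff"] = swath["model"] / swath["composites"] - 1
print(swath.round(3))  # large systematic differences flag estimation problems
```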

Change of support and more advanced estimation methods

Generating tonnage-grade curves for block sizes other than the production size requires geostatistical manipulation of the block model, as does the generation of recoverable resources – the tonnage and grade within a larger block that is available at a range of cut-off grades. Some GMPs have made advances in these areas, and some of the more common recoverable resource techniques, such as multiple indicator kriging and uniform conditioning, are now available and are reasonably robust. However, for a full-featured estimation of recoverable resources the evaluator generally needs to turn to a specialist software package. Similarly, the estimation of non-linear variables for geometallurgical purposes (such as pH or recovery) generally requires more advanced software.
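
As a hint of what the change-of-support arithmetic involves, the sketch below applies the simplest correction of all – the affine correction – to synthetic point grades. The assumed 50 per cent variance reduction would in practice come from the variogram, and recoverable-resource methods such as uniform conditioning go well beyond this.

```python
import numpy as np

def affine_correction(point_grades, point_var, block_var):
    """Affine change of support: shrink grades towards the mean so the
    distribution takes the (smaller) block variance while keeping its shape."""
    m = point_grades.mean()
    f = np.sqrt(block_var / point_var)  # variance reduction factor, f <= 1
    return m + f * (point_grades - m)

rng = np.random.default_rng(3)
au = rng.lognormal(0.0, 0.6, 5000)  # hypothetical point-support grades
au_block = affine_correction(au, au.var(), 0.5 * au.var())  # assumed variance ratio

for cog in (1.0, 1.5, 2.0):
    # tonnage proxy: proportion of material above cut-off at each support
    print(f"cut-off {cog}: points {np.mean(au > cog):.1%}, blocks {np.mean(au_block > cog):.1%}")
```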

Quantitative risk assessment

The assessment of grade and modelling risk, considered essential by many as an adjunct to a minimal-error grade estimate, is becoming more common among practitioners and is, to a certain degree, possible within GMP software. The gap between academic and specialist work and everyday production applications is diminishing yearly, and this will eventually lead to more robust implementations of conditional simulation within the mainstream packages. One of the main drivers for quantitative risk modelling may be risk-qualified resource classification, an approach that has been adopted by many of the world’s top miners. For now, the generation of risk around a static estimate requires the use of more specialised software or even self-generated (ie programmed) routines.
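
Downstream of whatever package generates the realisations, the risk post-processing itself is straightforward, as the sketch below shows on stand-in random numbers; the probability and spread thresholds are invented, not a recommended classification rule.

```python
import numpy as np

# Hypothetical stack of conditional simulation realisations: one grade per
# block per realisation (random numbers stand in for real simulations here)
rng = np.random.default_rng(42)
n_real, n_blocks = 50, 500
reals = rng.lognormal(0.0, 0.4, size=(n_real, n_blocks))

cutoff = 1.2
p_above = (reals > cutoff).mean(axis=0)          # per-block probability above cut-off
spread = reals.std(axis=0) / reals.mean(axis=0)  # coefficient of variation per block

# A simple risk-qualified flag: confident blocks have low spread and a
# decisive probability either side of the cut-off (thresholds invented)
confident = (spread < 0.3) & ((p_above > 0.8) | (p_above < 0.2))
print(f"{confident.mean():.0%} of blocks classified as low-risk")
```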

Workflow communication

This brief review of the status of geological, grade and geometallurgical modelling procedures and workflows reveals that, in most cases, the pursuit of best practice demands that four or more separate software products be used, and that the transfer of data in and out of the GMP may take place three or four times during an evaluation exercise. While data flow within the GMP is often handled via a series of scripts, routines or macros, these do not control the external software, and it becomes critical that the increasingly complex workflow is recorded. Figure 1 is a simplified map of only part of an estimation process, involving three different software products and more than five inter-package transfers. The overall process, including the database capture and export, used five separate products, including a GMP, and seven transfers.
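
One low-tech way to keep that record is to log every inter-package transfer as structured data as it happens. The sketch below is purely illustrative; the generic product names and payloads stand in for whatever a real workflow uses.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Transfer:
    """One inter-package data transfer in the evaluation workflow."""
    source: str
    target: str
    payload: str
    note: str = ""
    when: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat(timespec="seconds"))

log: list[Transfer] = [
    Transfer("database package", "shape modeller", "drill holes + structural readings",
             "collar/survey/assay export"),
    Transfer("shape modeller", "GMP", "lithostructural solids", "wireframes checked closed"),
    Transfer("GMP", "geostatistics package", "flagged composites", "2 m composites for variography"),
]

for t in log:
    print(f"{t.when}  {t.source} -> {t.target}: {t.payload} ({t.note})")
```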


Conclusion

As the demands on resource (grade, geology and geometallurgy) evaluators increase to match the more complex predictive needs of their engineering and processing clients, we owe it to ourselves to ensure, firstly, that we are using the best tools for the job; secondly, that the tools that do the job most efficiently and productively are promoted; and finally, and most importantly, that the complex sequence and mix of tools, products and inter-package transfers that constitutes the workflow is both documented efficiently and communicated – not only to peers, but also to those who would invest and promote on the basis of the output, ie company management, analysts and shareholders. An excellent job of estimation and modelling is thus only partially complete without the documentation and communication of the approach. It could easily be argued that workflow mapping and reporting has not kept pace with technological developments, and that the industry has much to gain, in terms of elevating best practice, from sharing workflow details.

Feature image: boonchoke/Shutterstock.com.
