Development of an optical sorting algorithm to utilise digital images for the rapid discrimination of target minerals from gangue

Koh, Amini, McLachlan, Beaton

Presented at the Preconcentration Digital Conference November 2020

ABSTRACT

Preconcentration greatly reduces the mass of ore fed to the processing plant while upgrading its grade. Optical sorting is one of the most commonly used preconcentration methods. It relies on optical sensors that classify the material stream into accept and reject streams. Currently, near-infrared (NIR) or colour cameras are commonly used to classify the stream against reflectance or colour thresholds set by experts. Imaging is performed with line-scan sensors as the material passes on a belt or chute, and a separation apparatus such as a diverter gate or air jet then separates the streams according to the classification made by the algorithm.

Separation efficiency depends on the classification algorithm. Existing algorithms rely on low-level feature discrimination, such as thresholding individual pixel values for reflectance or colour hue, which limits discrimination between minerals with low colour contrast. Human experts segment minerals using texture, colour distribution and shape, but these visual rules are difficult to translate into mathematical thresholds.
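
To make the limitation concrete, the following is a minimal Python sketch of the kind of pixel-threshold rule described above. It is not the authors' implementation; the threshold value, accept fraction and the classify_particle function are illustrative assumptions.

    # Illustrative sketch of threshold-based optical sorting logic (not the
    # authors' implementation). Assumes a greyscale reflectance image of one
    # scanned particle, with brighter pixels indicating the target mineral.
    import numpy as np

    REFLECTANCE_THRESHOLD = 180   # illustrative expert-set threshold (0-255)
    ACCEPT_FRACTION = 0.30        # illustrative fraction of bright pixels needed to accept

    def classify_particle(reflectance_image: np.ndarray) -> str:
        """Return 'accept' or 'reject' using individual pixel values only."""
        bright_fraction = np.mean(reflectance_image > REFLECTANCE_THRESHOLD)
        return "accept" if bright_fraction >= ACCEPT_FRACTION else "reject"

    # Example: a synthetic 64x64 particle image
    particle = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
    decision = classify_particle(particle)   # would drive the diverter gate or air jet

Because the rule sees only individual pixel values, two minerals with similar reflectance are indistinguishable to it, regardless of differences in texture or grain shape.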

Deep learning methods such as Convolutional Neural Networks (CNNs) provide high-level feature discrimination similar to that of a human expert. Instead of relying on pre-defined features, CNNs learn complex features from the dataset, constructing a hierarchy of features from pixel clusters that captures the texture, shape and colour spectra of the mineral grain surface to provide better separation.
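
As an illustration only, the PyTorch sketch below shows how a small CNN stacks convolutional layers into such a feature hierarchy. The layer sizes, the two-class output (target mineral versus gangue) and the MineralCNN name are assumptions for the example, not the network used in this study.

    # Minimal PyTorch sketch of a CNN that builds a hierarchy of features
    # (texture/colour -> shapes -> class scores). Layer sizes and the
    # two-class output are illustrative assumptions.
    import torch
    import torch.nn as nn

    class MineralCNN(nn.Module):
        def __init__(self, num_classes: int = 2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),   # low-level texture/colour
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),  # mid-level shapes
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),  # high-level grain patterns
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(64, num_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.features(x)
            return self.classifier(torch.flatten(x, 1))

    # Example: score a batch of four 128x128 RGB grain images
    logits = MineralCNN()(torch.randn(4, 3, 128, 128))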

In this study, a state-of-the-art deep learning instance segmentation method is used to outline grain boundaries and classify grains against the background (mineral segmentation). The algorithm is trained on images collected from a flotation feed of a gold deposit, where the gold is mechanically locked inside pyrite. The method can process on-line video input at 5 frames per second to provide a live grade estimate. The proposed method is inexpensive as it requires only a camera and a desktop computer.
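
For a sense of how such an on-line pipeline fits together, the sketch below runs a generic pre-trained instance segmentation model (torchvision's Mask R-CNN, used here only as a stand-in for the method described in the paper) over video frames and reports the fraction of frame area covered by detected grains as a crude live-grade proxy. The score threshold, target label index, grade formula and video file name are illustrative assumptions.

    # Sketch of on-line instance-segmentation inference for a live grade proxy.
    # The torchvision Mask R-CNN is a stand-in model; thresholds, label index
    # and the grade formula are illustrative assumptions.
    import cv2
    import torch
    from torchvision.models.detection import maskrcnn_resnet50_fpn

    model = maskrcnn_resnet50_fpn(weights="DEFAULT").eval()  # pretrained COCO weights as placeholder
    SCORE_THRESHOLD = 0.5
    TARGET_LABEL = 1   # hypothetical label index for the target grains

    def frame_grade(frame_bgr) -> float:
        """Fraction of frame area covered by detected target grains."""
        rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
        tensor = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
        with torch.no_grad():
            out = model([tensor])[0]
        keep = (out["scores"] > SCORE_THRESHOLD) & (out["labels"] == TARGET_LABEL)
        if keep.sum() == 0:
            return 0.0
        masks = out["masks"][keep, 0] > 0.5          # (N, H, W) boolean grain masks
        covered = masks.any(dim=0).float().mean()    # union of grains / frame area
        return covered.item()

    cap = cv2.VideoCapture("flotation_feed.mp4")     # hypothetical video source
    ok, frame = cap.read()
    if ok:
        print(f"live grade proxy: {frame_grade(frame):.3f}")
    cap.release()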

AUTHORS

E J Y Koh1,2, E Amini3, G J McLachlan4 and N Beaton5

1. Technical Specialist, Cooperative Research Centre for Optimising Resource Extraction (CRC ORE), Brisbane, Queensland, 4069.

2. PhD Candidate, School of Mathematics and Physics, University of Queensland, Brisbane, Queensland, 4067.

3. IES Utilisation Manager, Cooperative Research Centre for Optimising Resource Extraction (CRC ORE), Brisbane, Queensland, 4069.

4. Professor of Statistics, School of Mathematics and Physics, University of Queensland, Brisbane, Queensland, 4067.

5. General Manager - Integrated Extraction Simulator, Cooperative Research Centre for Optimising Resource Extraction (CRC ORE), Brisbane, Queensland, 4069.

Keywords

Optical sorting, deep learning, preconcentration, on-line characterisation, instance segmentation, mineral segmentation

ACKNOWLEDGEMENTS

The authors would like to thank Reyhaneh Hosseini Tabatabaei for providing the thin section flotation samples used in this study. This research would not have been possible without the financial support from CRC ORE. CRC ORE is part of the Australian Government’s CRC Program, which is made possible through the investment and ongoing support of the Australian Government. The CRC Program supports industry-led collaborations between industry, researchers and the community.
