Avoiding bottlenecks in analyser speed

Louise Davis

Most concentrator plants rely on online analysis as a data source for process optimisation. The reactions in, for instance, a froth flotation tank cell are usually described by an exponential function: after a short initial delay, most of the change happens in a brief period of time, and then the rate of change declines.
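To make the shape of that curve concrete, here is a minimal Python sketch of the first-order recovery model commonly used for flotation kinetics, R(t) = R_inf * (1 - exp(-k*t)); the rate constant and ultimate recovery below are purely illustrative values, not figures from the article.

import math

def recovery(t_minutes, r_inf=0.90, k=0.5):
    # First-order flotation recovery: r_inf is the ultimate (asymptotic)
    # recovery and k the rate constant per minute; both are hypothetical.
    return r_inf * (1.0 - math.exp(-k * t_minutes))

for t in (1, 2, 5, 10, 20):
    print(f"t = {t:2d} min -> recovery = {recovery(t):.2f}")

# Output: 0.35, 0.57, 0.83, 0.89, 0.90 - most of the change happens in the
# first few minutes, after which the rate of change tails off.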

This is where analyser update speed becomes a factor. If updates are slow, operators may miss the sudden change and only detect it after the fact, leaving periods when the process parameters are not set for the actual conditions. With ever-changing ore contents, mill grades and so on, these windows can add up to a substantial amount of time each day.

A key item to consider is the detection limit as a function of measurement time. In every plant there will be a few sample streams where the concentrations of the main elements are very low. These are typically tailings streams, but with many mines moving into lower-grade zones, samples from the mills may also have quite low metal content.

Consider a hypothetical case with 20 sample streams, of which four are low-content head-grade samples and four are tailings samples. Let’s say the other 12 streams have content such that 30 seconds is sufficient to get good results; they account for a total of six minutes of measurement time. If the analyser can also produce good measurements in 30 seconds for the eight low-content streams, the total cycle time will be 10 minutes. If the analyser instead needs 1-2 minutes per low-grade stream, the total cycle time grows to 14-22 minutes.
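As a back-of-the-envelope check of that arithmetic, here is a short Python sketch using the stream counts and measurement times from the example above (the analyser's own sample changeover overhead is ignored here):

def total_cycle_minutes(n_normal, t_normal_s, n_low, t_low_s):
    # One full round-robin pass over all streams, measurement time only.
    return (n_normal * t_normal_s + n_low * t_low_s) / 60.0

print(total_cycle_minutes(12, 30, 8, 30))   # 10.0 min if low-grade streams also take 30 s
print(total_cycle_minutes(12, 30, 8, 60))   # 14.0 min at 1 min per low-grade stream
print(total_cycle_minutes(12, 30, 8, 120))  # 22.0 min at 2 min per low-grade stream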

In practice, some streams are measured more frequently than others, for the reasons discussed above. But if some streams require long measurement times, they delay the next update for all other streams. The total cycle time could be around 12 minutes for 20 streams with 30-second measurements, but as much as 30 minutes if longer measurement times are needed for the low-grade streams.

Standard deviation
Online measurements will always have a relative standard deviation of a few per cent. To the human eye it is quite easy to spot the current value and trend despite these variations, but a process control algorithm has to perform some sort of averaging of the measurements to make the signal useful for control.

Either way, update time is critical. If one were to average the last five measurements, with a 12-minute total cycle time the oldest measurement would be 60 minutes old. With a 20-minute total cycle time, two of the values would be more than an hour old, and probably no longer relevant. This poses a risk of constant sub-optimisation. A faster analyser is therefore a natural first step towards improving productivity and profitability today.
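The data-age arithmetic behind those figures can be sketched in a few lines of Python, assuming one new value per stream per cycle and a plain five-point moving average, with ages counted just before the next update arrives:

def ages_before_next_update(cycle_min, window=5):
    # Ages of the last `window` measurements just before a new one arrives;
    # at that point the most recent value is already one full cycle old.
    return [i * cycle_min for i in range(1, window + 1)]

for cycle in (12, 20):
    ages = ages_before_next_update(cycle)
    print(f"{cycle}-min cycle: ages = {ages} min, oldest = {ages[-1]} min")

# 12-min cycle: ages = [12, 24, 36, 48, 60] min, oldest = 60 min
# 20-min cycle: ages = [20, 40, 60, 80, 100] min - two values more than an hour old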


Mikael Normark is CEO of Xore. www.xore.se

 
