Screenhouse optimal control
Managing a screenhouse in the mining industry involves several critical challenges, each important for ensuring efficiency, productivity, and optimal use of resources.
Running the Correct Number of Screens: Balancing the number of screens in operation is crucial for maintaining efficiency. Operating too few screens can lead to bottlenecks, where ore is not processed quickly enough, causing delays and reduced throughput. Conversely, running too many screens can lead to wasted energy and increased wear and tear, as well as higher operational costs. Finding the right balance requires a deep understanding of the ore characteristics and processing requirements.
Running the Screens at Optimal Screening Efficiency: Screening efficiency is determined by the accuracy and speed with which the desired material is separated from the undesired material. Factors affecting this efficiency include the type and condition of the screens, the properties of the ore (such as size, moisture content, and hardness), and the operating parameters of the screens. Optimal efficiency is achieved by regularly monitoring these factors and adjusting the screening process as necessary.
Distributing the Ore Evenly Across the Bins: Even distribution of ore to the bins is essential for consistent processing. Uneven distribution can lead to overloading of some bins and underutilisation of others, resulting in inefficiencies and potential damage to equipment. This can be managed through automated control systems that monitor and adjust the feed rate and distribution based on real-time data.
Utilising the Capacity of the Screen House Bins: Maximising the use of bin capacity ensures efficient storage and handling of processed ore. This involves not only the physical capacity of the bins but also considering factors like flow characteristics of the ore, which can affect how much material can be effectively stored and retrieved. Advanced monitoring and control systems can help optimise this aspect by providing real-time data on bin levels and material flow.
Discharging Ore Evenly from the Screen House: Ensuring a consistent and controlled discharge of ore from the screenhouse is crucial for downstream processes. Irregular discharge can cause operational issues in subsequent stages, like material handling and further processing. Implementing automated control systems that synchronise the discharge rate with downstream requirements can mitigate this challenge.
Minimising Screen Start/Stop Cycles: Frequent starting and stopping of screens can lead to increased wear and tear, higher maintenance costs, and reduced lifespan of the equipment. It's important to optimise the operation schedule of the screens to minimise these cycles. This can be achieved through careful planning and scheduling, as well as implementing smart control systems that can adapt to changing processing needs in real-time.
Automatic Feeder Selection Logic: Implement a control system that uses automatic feeder selection logic. This system would analyse real-time data on mass balance, ore characteristics, and screen efficiency to determine the optimal number of screens to run at any given time. It ensures the screens are used efficiently, reducing energy consumption and wear while maintaining throughput. The system adapts to changes in ore characteristics, ensuring consistent processing quality.
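As a minimal sketch of what such selection logic could look like, the function below picks the smallest number of screens that can handle the current infeed. The inputs (infeed tonnage, per-screen capacity, measured efficiency, available screens) are illustrative placeholders, not names from the actual Minealytics system.

```python
import math

def screens_to_run(infeed_tph: float,
                   screen_capacity_tph: float,
                   efficiency: float,
                   available_screens: int) -> int:
    """Pick the smallest number of screens that can process the current infeed.

    infeed_tph          - current mass flow into the screen house (t/h)
    screen_capacity_tph - nominal capacity of one screen (t/h)
    efficiency          - measured screening efficiency, 0..1
    available_screens   - screens not locked out for maintenance
    """
    if infeed_tph <= 0:
        return 0
    effective_capacity = screen_capacity_tph * efficiency
    needed = math.ceil(infeed_tph / effective_capacity)
    # Never request more screens than are physically available.
    return min(needed, available_screens)

print(screens_to_run(1800, 650, 0.85, 4))  # -> 4 screens needed at this infeed
```

In practice the real logic also weighs wear, energy cost, and ore characteristics; this example only shows the core capacity calculation.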
AI Modelling of Screen Discharge and Efficiency: Develop deep neural network models that simulate screen discharge and efficiency. These models use vast amounts of data from sensors and cameras to continually update and refine the understanding of how different ores behave on the screens. This leads to more accurate predictions of screen performance, enabling proactive adjustments to maintain optimal efficiency.
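For illustration only, a small feed-forward network along these lines could be trained to predict screen discharge from a handful of sensor features. The feature names and the synthetic training data below are invented for the example; the models described above are deep networks trained on far richer sensor and camera data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical features: feed rate (t/h), moisture (%), particle d50 (mm), hours since screen reline.
rng = np.random.default_rng(0)
X = rng.uniform([500, 2, 5, 0], [2500, 12, 40, 4000], size=(1000, 4))
# Synthetic target standing in for measured undersize discharge (t/h).
y = 0.6 * X[:, 0] - 15 * X[:, 1] - 4 * X[:, 2] - 0.05 * X[:, 3] + rng.normal(0, 20, 1000)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Predict discharge for a new operating point.
print(model.predict([[1800, 6.5, 22, 1200]]))
```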
Automated Calibration Cycles: Integrate automated calibration cycles into the operation. These cycles would regularly update screen efficiency curves based on real-time data, keeping the system calibrated and optimised for current conditions. This ensures that the screening process is always operating at peak efficiency, adjusting for wear and changes in ore properties.
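One way such a calibration cycle could work is to refit an efficiency curve against recent plant data on a fixed schedule. The exponential curve form and the sample values below are assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

def efficiency_curve(feed_tph, e_max, k):
    """Assumed form: efficiency decays as the screen is loaded harder."""
    return e_max * np.exp(-k * feed_tph)

# Recent (feed rate, measured efficiency) samples - illustrative values only.
feed = np.array([600, 900, 1200, 1500, 1800, 2100], dtype=float)
eff = np.array([0.94, 0.91, 0.87, 0.82, 0.76, 0.70])

params, _ = curve_fit(efficiency_curve, feed, eff, p0=[0.95, 1e-4])
e_max, k = params
print(f"Updated curve parameters: e_max={e_max:.3f}, k={k:.2e}")
```

Re-running this fit each cycle keeps the efficiency curve aligned with screen wear and changing ore properties.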
Optimisation Scheme for Mass Balance: Implement a global optimisation scheme that analyses the complete mass balance of the screen house. This system calculates the optimal time the shuttle conveyor should spend above each bin to balance bin levels effectively. This results in optimal utilisation of bin capacity, reducing the likelihood of overflow or underutilisation and ensuring a smooth flow of material through the screen house.
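A heavily simplified sketch of the allocation idea follows: it splits the shuttle conveyor's time over the bins in proportion to their remaining headroom, so emptier bins receive more ore. The real scheme optimises the full mass balance, including discharge; the numbers here are placeholders.

```python
import numpy as np

def shuttle_dwell_fractions(bin_levels: np.ndarray, bin_capacity: np.ndarray) -> np.ndarray:
    """Split the shuttle conveyor's time over bins in proportion to remaining headroom.

    bin_levels   - current fill of each bin (t)
    bin_capacity - capacity of each bin (t)
    Returns the fraction of the planning interval to spend above each bin.
    """
    headroom = np.clip(bin_capacity - bin_levels, 0.0, None)
    if headroom.sum() == 0:
        # All bins full: spread time evenly and rely on downstream discharge.
        return np.full_like(headroom, 1.0 / len(headroom))
    return headroom / headroom.sum()

levels = np.array([400.0, 750.0, 150.0, 600.0])
capacity = np.array([800.0, 800.0, 800.0, 800.0])
print(shuttle_dwell_fractions(levels, capacity))  # more time over the emptier bins
```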
Bin Skipping Logic: Develop bin skipping logic to be integrated into the control system. This logic would allow for the strategic skipping of certain bins to prevent overfilling or to allow time for material processing. It enhances the flexibility of ore distribution, allowing for more effective handling of varying ore volumes and types.
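Bin skipping can be expressed as a simple predicate layered on top of the dwell-time allocation. The high-level threshold and the settle flags in this sketch are placeholders for whatever criteria the real control system applies.

```python
def bins_to_skip(bin_levels, bin_capacity, high_limit=0.95, settle_flags=None):
    """Return the indices of bins the shuttle should skip on this pass.

    A bin is skipped when it is close to its high limit, or when it has been
    flagged as needing time to settle or be processed before taking more ore.
    """
    settle_flags = settle_flags or [False] * len(bin_levels)
    skip = []
    for i, (level, cap, settling) in enumerate(zip(bin_levels, bin_capacity, settle_flags)):
        if settling or level >= high_limit * cap:
            skip.append(i)
    return skip

print(bins_to_skip([400, 780, 150, 600], [800, 800, 800, 800],
                   settle_flags=[False, False, False, True]))
# -> [1, 3]: bin 1 is nearly full, bin 3 is flagged for processing time.
```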
Continuous Learning: Regularly and autonomously verify and update the AI models, as well as the validity of constraints and the dynamics of those constraints. This involves assessing whether the existing controls are still optimal for current operations and making automatic adjustments as necessary. This ensures that the AI models remain effective over time, adapting to changes in ore properties, equipment performance, and operational goals.
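A minimal illustration of the kind of check a continuous-learning loop might run: compare recent prediction error against a tolerance and trigger a retrain when it drifts. The retrain callable and the error limit are invented for the example.

```python
import numpy as np

def check_and_update(model, recent_X, recent_y, error_limit, retrain):
    """Retrain the model if its recent prediction error has drifted too far.

    model       - current predictive model exposing a predict() method
    recent_X/y  - latest plant measurements and observed outcomes
    error_limit - acceptable mean absolute error before a retrain is triggered
    retrain     - callable that fits and returns a fresh model
    """
    errors = np.abs(model.predict(recent_X) - recent_y)
    if errors.mean() > error_limit:
        return retrain(recent_X, recent_y), True   # model replaced
    return model, False                            # existing model still valid
```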
Genetic algorithms and reinforcement learning are two powerful methods in machine learning and artificial intelligence that Minealytics utilises to optimise complex industrial processes, such as determining the optimal time for a shuttle conveyor to spend above each bin in a screenhouse. Given the variability in infeed, the changing ore properties, screen wear, and the dynamics of discharge feeders, these methods can adapt and learn to improve efficiency.
Genetic Algorithms (GAs)
The screenhouse operation is modelled as an optimisation problem, where the goal is to maximise throughput and efficiency while minimising wear and tear. Each potential solution (i.e., a set of times the shuttle conveyor spends above each bin) is encoded as a 'chromosome'. The Minealytics GA iteratively improves the solutions. It selects the best-performing solutions, combines them (crossover), and introduces random changes (mutations) to explore new solutions. The effectiveness of each solution is evaluated using a fitness function, which considers factors like throughput efficiency, wear minimisation, and balance of bin levels.
GAs are excellent at searching through large, complex solution spaces to find global optima, avoiding local optima traps. They can adapt to changes over time, making them suitable for environments where ore properties and screen wear vary. GAs can evaluate many solutions in parallel, speeding up the optimisation process, and they pair well with machine learning models that mimic the target process, which allows new candidate populations to be evaluated faster and more safely than on the live plant.
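To make the mechanics concrete, here is a toy GA over shuttle dwell fractions. The fitness function simply penalises imbalance between projected bin levels and stands in for the much richer fitness described above; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

BIN_LEVELS = np.array([400.0, 750.0, 150.0, 600.0])   # current fill (t)
INFEED = 1200.0                                        # ore arriving this interval (t)

def fitness(dwell_fractions: np.ndarray) -> float:
    """Higher is better: penalise spread in projected bin levels."""
    projected = BIN_LEVELS + dwell_fractions * INFEED
    return -projected.std()

def normalise(pop: np.ndarray) -> np.ndarray:
    pop = np.clip(pop, 1e-6, None)
    return pop / pop.sum(axis=1, keepdims=True)

def run_ga(pop_size=60, generations=200, mutation_scale=0.05):
    pop = normalise(rng.random((pop_size, len(BIN_LEVELS))))      # chromosomes = dwell fractions
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # keep the fitter half
        # Crossover: blend random pairs of parents.
        pairs = rng.integers(0, len(parents), size=(pop_size, 2))
        children = (parents[pairs[:, 0]] + parents[pairs[:, 1]]) / 2
        # Mutation: small random perturbations to explore new solutions.
        children += rng.normal(0, mutation_scale, children.shape)
        pop = normalise(children)
    return max(pop, key=fitness)

print(run_ga())  # dwell fractions that roughly level the bins
```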
Reinforcement Learning (RL)
The Minealytics RL agent represents the control system, and the screenhouse operation is the environment. The state includes information about the bin levels, ore properties, and screen conditions. Actions are the time allocations for the shuttle conveyor above each bin. Rewards are given for achieving desired outcomes (e.g., efficiency, minimal wear). The agent learns a policy—i.e., a strategy to decide actions based on the state—to maximise cumulative rewards over time.
RL learns optimal strategies through trial and error interactions with the environment, making it well-suited for complex, dynamic systems. The Minealytics RL control continuously adapts its policy in response to changing conditions in the screenhouse operation. RL directly optimises for a defined objective, such as maximising throughput or minimising operational costs.
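A toy tabular Q-learning loop in the same spirit is sketched below: the state is a coarse description of which bin is fullest, the action is the bin to feed next, and the reward penalises imbalance. It illustrates the agent/environment framing only, not the actual Minealytics controller, and the feed and discharge figures are made up.

```python
import numpy as np

rng = np.random.default_rng(2)
N_BINS = 4
FEED = 100.0          # tonnes placed in the chosen bin per step
DISCHARGE = 60.0      # tonnes drawn from every bin per step

def step(levels, action):
    """Feed one bin, discharge all bins, and return reward = negative imbalance."""
    levels = levels.copy()
    levels[action] += FEED
    levels = np.clip(levels - DISCHARGE, 0.0, None)
    return levels, -levels.std()

def state_of(levels):
    """Coarse state: index of the fullest bin."""
    return int(np.argmax(levels))

Q = np.zeros((N_BINS, N_BINS))   # Q[state, action]
alpha, gamma, epsilon = 0.1, 0.9, 0.1

for episode in range(500):
    levels = rng.uniform(0, 800, N_BINS)
    for _ in range(50):
        s = state_of(levels)
        # Epsilon-greedy action selection: which bin to place ore in next.
        a = rng.integers(N_BINS) if rng.random() < epsilon else int(np.argmax(Q[s]))
        levels, r = step(levels, a)
        s_next = state_of(levels)
        # Q-learning update toward reward plus discounted best future value.
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])

print(np.argmax(Q, axis=1))  # learned policy: for each "fullest bin" state, which bin to feed
```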
Conclusion
Both genetic algorithms and reinforcement learning offer robust approaches to optimising complex and dynamic systems like a screenhouse operation. Genetic algorithms are particularly powerful in exploring a wide range of potential solutions and finding global optima, while reinforcement learning excels in learning and adapting from continuous interaction with the environment, making it highly effective where conditions change frequently.