IT in Manufacturing


Real-time analytics for industrial batch processes

February 2014

Companies that run batch manufacturing processes for specialty chemicals, food, pharmaceuticals, biotech and other products can utilise data that is already being collected to make real-time decisions and take appropriate action. Real-time batch analytics can help companies gain a better understanding of their processes, minimise variation and know immediately where improvements to the process should be made.

Historically, batch processes have been difficult to control and analyse because each batch is unique: batches are not the same length, time lags vary, raw materials can differ, and there are often differences in equipment, operating conditions and process activity. Advanced batch controls can be complex. Understanding how these variations impact batch quality while the batch is running can provide enormous benefits.

Real-time batch data analysis

Manufacturers use batch analytics software to compare batches and uncover potential problems in real time. One analytics solution of which ARC is aware enables users to compare an ideal, ‘golden’ batch to other batches to better understand how variables affect the current batch. The supplier does this with a technique called ‘dynamic time warping’ (DTW). DTW aligns the data so that the parameters of many batches can be compared while compensating for differences in timing, matching parameters from historical batch data to the variations found in the live batches. The data analytics software can be used to determine how the batch is progressing and to predict whether a batch will meet specification or have to be reworked, modified, or even discarded.
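
To make the alignment idea concrete, the following is a minimal Python sketch of dynamic time warping in general, not the supplier’s implementation: it computes the distance and warping path between a hypothetical ‘golden’ profile and a current batch of a different length. The data is made up for illustration.

```python
# Minimal DTW sketch: align a current batch trajectory with a 'golden'
# reference of a different length. Illustrative only.
import numpy as np

def dtw_align(reference, batch):
    """Return the DTW distance and warping path between two 1-D series."""
    n, m = len(reference), len(batch)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(reference[i - 1] - batch[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],       # batch sample stretched
                                 cost[i, j - 1],       # reference sample stretched
                                 cost[i - 1, j - 1])   # samples matched one-to-one
    # Trace the optimal warping path back from the end of both series
    path, i, j = [(n - 1, m - 1)], n, m
    while (i, j) != (1, 1):
        if i == 1:
            j -= 1
        elif j == 1:
            i -= 1
        else:
            step = int(np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]]))
            if step == 0:
                i, j = i - 1, j - 1
            elif step == 1:
                i -= 1
            else:
                j -= 1
        path.append((i - 1, j - 1))
    return cost[n, m], path[::-1]

# Hypothetical example: a golden profile and a slower-running current batch
golden = np.sin(np.linspace(0, 3, 100))
current = np.sin(np.linspace(0, 3, 130)) + 0.02
distance, path = dtw_align(golden, current)
print(f"DTW distance: {distance:.2f}, aligned point pairs: {len(path)}")
```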

Significantly, the batch analytics software expands the range of processes that can take advantage of advanced process control, and it can be used to interpret data to help optimise the process in real time. The software compares batch trajectories across different batches and relates them to other parameters and variables. The primary multivariate methods employed are principal component analysis (PCA) and projection to latent structures (PLS). PCA provides a concise overview of a data set; it is used to recognise patterns such as outliers, trends, groups and relationships, and it helps detect abnormal operation. PLS establishes relationships between input and output variables and develops predictive models of a process for quality predictions.
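
As an illustration of these two methods, the hedged sketch below applies scikit-learn’s PCA and PLS to made-up batch data: PCA gives a concise overview of historical batches, PLS relates the process inputs to a lab-measured quality result. The variable layout and data are assumptions, not the vendor’s actual model structure.

```python
# PCA for an overview of historical batches, PLS for quality prediction,
# both on synthetic data. Variable names and sizes are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 50 historical batches x 8 unfolded process parameters (temperatures,
# pressures, feed rates, ...), plus one lab-measured quality value per batch
X = rng.normal(size=(50, 8))
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=50)

X_scaled = StandardScaler().fit_transform(X)

# PCA: concise overview of the data set; score plots reveal outliers and groups
pca = PCA(n_components=2).fit(X_scaled)
scores = pca.transform(X_scaled)
print("variance explained:", pca.explained_variance_ratio_)

# PLS: relates input variables to the quality output for predictions
pls = PLSRegression(n_components=2).fit(X_scaled, y)
print("predicted quality of first batch:", pls.predict(X_scaled[:1]).ravel())
```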

Integrated MVA analysis

In addition, the included model-predictive multivariate analysis (MVA) software enables users to adjust batch trajectories and predictions for control using a comparative model. MVA helps the engineer look at the batch parameters holistically, identify how the variables interact and uncover what is contributing to a particular condition. The real-time analytics can help determine how all the variables affect the batch. By drilling down on individual parameters, an engineer can determine whether something is out of range or ‘not quite right’, make decisions about the process and take appropriate action. The software can also help predict when problems are beginning to develop so that corrective measures can be taken.
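
One common way such a drill-down is done in generic multivariate monitoring (not necessarily this vendor’s method) is to compute a Hotelling’s T² statistic on the PCA scores of a new batch and, when it is high, rank how far each parameter sits from its historical average. The sketch below, on synthetic data with hypothetical parameter names, illustrates that pattern.

```python
# Generic MVA-style drill-down: Hotelling's T^2 on PCA scores, followed by
# a crude per-variable ranking of deviations. Synthetic data only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X_hist = rng.normal(size=(60, 6))                 # hypothetical historical batches
scaler = StandardScaler().fit(X_hist)
pca = PCA(n_components=3).fit(scaler.transform(X_hist))

def t_squared(x_row):
    """Hotelling's T^2 of one batch observation under the PCA model."""
    t = pca.transform(scaler.transform(x_row.reshape(1, -1)))[0]
    return float(np.sum(t**2 / pca.explained_variance_))

new_batch = rng.normal(size=6)
new_batch[2] += 4.0                               # inject an out-of-range parameter
print("T^2 =", round(t_squared(new_batch), 1))

# Drill down: squared standardised deviation from the historical mean,
# a simple way to rank which parameter is 'not quite right'
contrib = scaler.transform(new_batch.reshape(1, -1))[0] ** 2
ranking = sorted(zip([f"param_{k}" for k in range(6)], contrib),
                 key=lambda p: -p[1])
for name, c in ranking[:3]:
    print(name, round(c, 2))
```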

The analysis can examine conditions and measurements that impact product quality. By visualising the data, the engineer can determine whether a batch should be used for model generation. Comparing data from multiple batches, with their parameters aligned, is critical to determining how a batch is doing.

During data extraction and model building, data for the selected batches are automatically aligned with the correct parameters using dynamic time warping. By generating models and using the dynamic time warping screens, the manufacturer can determine whether a parameter differs from batch to batch. In the past it was not easy to access, visualise and compare data, or to generate models to compare parameters on the fly while the batch was running. This technology makes it far easier to do so.
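
A minimal sketch of what such automatic alignment can look like, reusing the dtw_align helper from the earlier sketch: the warping path maps each sample of a historical batch onto the golden batch’s timeline, so every batch ends up with the same number of samples and parameters can be compared at equivalent points in the recipe. This illustrates the general approach, not the product’s actual procedure.

```python
# Warp a historical batch onto the reference timeline using a DTW path
# (assumes dtw_align from the earlier sketch is defined).
import numpy as np

def warp_to_reference(reference, batch, path):
    """Average the batch samples that the DTW path maps to each reference index."""
    aligned = np.zeros(len(reference))
    counts = np.zeros(len(reference))
    for ref_idx, batch_idx in path:
        aligned[ref_idx] += batch[batch_idx]
        counts[ref_idx] += 1
    return aligned / np.maximum(counts, 1)

golden = np.sin(np.linspace(0, 3, 100))             # hypothetical reference profile
historical = np.sin(np.linspace(0, 3, 80)) + 0.05   # shorter historical batch
_, path = dtw_align(golden, historical)             # from the earlier sketch
aligned = warp_to_reference(golden, historical, path)
print(aligned.shape)   # (100,) -- now on the same timeline as the golden batch
```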

No PhD required

The included model-building tools enable workers who are familiar with their process to step through model creation themselves. Users generate models by selecting which batches should be used to build them, compare the results with other models and check the predictions. Lab analysis data can be used to validate the models and confirm when a model is performing well, helping determine what is working and what needs to be improved.
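
The sketch below illustrates that workflow on synthetic data: select the batches used to build the model, fit a PLS quality model, then check its predictions against held-out lab analysis results. The data, split and threshold-free validation are assumptions, not the product’s actual procedure.

```python
# Build a PLS quality model from selected batches and validate it against
# held-out lab analysis results. Synthetic data only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
X = rng.normal(size=(40, 10))                   # 40 aligned, unfolded batches
lab_quality = X @ rng.normal(size=10) + rng.normal(scale=0.2, size=40)

# 'Select which batches should be used to generate the model'
X_model, X_check, y_model, y_check = train_test_split(
    X, lab_quality, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3).fit(X_model, y_model)

# Compare predictions with the held-out lab results to see if the model works well
r2 = r2_score(y_check, pls.predict(X_check).ravel())
print(f"R^2 against held-out lab results: {r2:.2f}")
```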

Applying batch analytics in the brewing industry

A major brewing manufacturer is using this batch analytics software as part of a beta trial to identify process problems. According to one of its engineers, the brewer used the software to model its Briggs Lauter Tun – the unit that separates the extracted wort (the sugar-rich liquid) from the spent grain – and to identify the critical quality parameters during production runs.

The brewer runs 60 to 80 batches a week on this tun and loses money whenever a batch deviates from standard operating procedure. The company chose this unit because it was already collecting a lot of data on it. The batch analytics software builds a model for the batch process or unit and executes alongside the running process. The models help predict quality parameters, identify the variables affecting the process and detect faults early. The model was built to compare the running real-time batch against historical batches; it lets users drill down on individual parameters and compare them with other batches to determine if something is out of range or otherwise not right. Using the model’s advanced statistics, the company determined that the steam pressure solenoid was plugged.

According to the plant engineer, “Creating batch process models can be particularly challenging for batch applications because of the inherent time variability from batch to batch. Batch lengths vary because of equipment, operating conditions, faults in one stage of the batch, time lags, and raw material variations. The analytics can be used to compare the current batch against what we consider to be a good batch to find the cause of a problem.”

The multivariate analysis built into the model showed the parameter outliers and helped identify the parameters that might be an issue. The company was also able to use the DTW feature, which overlays different batches and matches their parameters, to identify abnormal conditions with its pH meters. The company corrected this problem, improving efficiency, and is now using the technology to identify other challenges.

For more information contact Paul Miller, ARC Advisory Group, +1 781 471 1126, [email protected], www.arcweb.com
