Many years ago at a paper mill I was working with, superintendents read the values of their key performance indicators (KPIs) off circle charts and transferred them manually to a whiteboard every morning. Most of the morning meeting was spent discussing those values and what had happened over the past 24 hours.
A few years later at the same mill, they merely had to turn on a projector, and all the KPIs were displayed on a screen, with any abnormal values highlighted in red. The meeting was now focused on the problem areas and what was planned for the coming day. All this was made possible by online data from thousands of sources stored in the data historian system and a focus on management by KPIs.
This is just one useful way of creating value from the "big data" available in a pulp and paper mill. Here are some other things that can be done using the mill's data:
- Troubleshoot process issues
- Calculate monthly averages for accounting and budgeting
- Build predictive models to guide operators, supporting tools such as alarm limits or faster grade changes
- Enable sophisticated control strategies, allowing tighter control and less variability, ultimately leading to savings on chemicals and energy and to higher productivity
But leveraging value from data has its challenges. Assuming that all data are always accurate can be rash. For example, calculated averages may include many zeroes recorded during downtime, or the meters generating the data may be out of calibration.
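To illustrate the downtime pitfall, an average computed naively over raw readings can differ sharply from one that excludes downtime zeroes. The sketch below uses hypothetical hourly steam-flow values; the data and the zero-as-downtime convention are assumptions for illustration only:

```python
# Hypothetical hourly steam-flow readings (t/h); zeroes indicate downtime.
readings = [42.0, 41.5, 0.0, 0.0, 43.2, 42.8, 0.0, 41.9]

# Naive average over all readings, downtime included.
naive_avg = sum(readings) / len(readings)

# Average over running hours only, excluding downtime zeroes.
running = [r for r in readings if r > 0.0]
running_avg = sum(running) / len(running)

print(f"naive: {naive_avg:.2f} t/h, excluding downtime: {running_avg:.2f} t/h")
```

With these numbers the naive figure understates the true running rate by more than a third, which is exactly the kind of distortion that can mislead monthly accounting.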
In one mill, I discovered huge discrepancies in monthly steam generation statistics, and the accounting department was using outdated estimates to allocate steam usage by each department. It turned out that many of the steam flow meters were not working properly. There were also many steam leaks across the mill that were not being repaired. The same mill had added data measurement points to their storage system without adding storage capacity, with the result that data older than eight months were being discarded to make room for new data, which is a little like throwing out the baby with the bathwater.
Every facility should have a data strategy that supports the long-term goals of the facility. What are the biggest opportunities for gains in quality, cost and throughput? What kind of investments need to be made in the data infrastructure and in the human resources to maintain it? The measurement and collection system should be treated as a vital piece of infrastructure that needs to be kept in good working order. And maintaining and using the data system requires a set of skills that includes IT expertise, instrumentation expertise and good process knowledge – a combination rarely found in one person, but achievable with a cross-functional team that works well together. Do you have a good data strategy in place?
Martin Fairbank, Ph.D.
Martin Fairbank has worked in the forest products industry for 31 years, including many years for a pulp and paper producer and two years with Natural Resources Canada. With a Ph.D. in chemistry and experience in process improvement, product development, energy management and lean manufacturing, Martin currently works as an independent consultant, based in Montreal. He is also an author, having recently published Resolute Roots, a history of Resolute Forest Products and its predecessors over the last 200 years.
Martin Fairbank Consulting
Industry Experience
- Pulp and Paper
- Materials Recycling
- Biorefinery
- Manufacturing
- Government
Services
- Carbon Markets
  - Carbon credits
  - Carbon footprint
  - Life cycle analysis
- Project Assessment
  - Proposal writing for government funding
  - Project technical evaluation for funding agencies
- Chemical Regulations
  - Regulation compliance advice
  - Chemical questionnaires demystified
- Continuous Improvement
  - Process improvement
  - Lean manufacturing