Overall equipment effectiveness is a fundamental measure for starting and monitoring improvement. Debbie Giggle discovers how to put the measure in place and avoid the pitfalls that can cause the process to stall
Overall equipment effectiveness (OEE) is already well-established as a way of measuring and improving business performance, but many businesses are still missing out on the benefits it can deliver.
“The level of understanding of OEE is generally high,” said management consultant Richard Jones of MCP, “but many companies find that the textbook theory and the shop floor reality are two very different things. Some find it problematic to make the leap from their traditional methods of performance measurement to an OEE-based system. Others start to measure OEE and then fail to maximise its potential, or discontinue after a pilot trial. But, once harnessed, a clear understanding of OEE and the root causes affecting it can be a powerhouse for continuous improvement.”
Alister Jones of Cadbury Schweppes backed this up saying: “Measurement of OEE is the fundamental starting point from which we build our continuous improvement activities. It informs decisions and helps us to determine where to apply effort and capital expenditure. It could be argued that the benefits of all small group activity on the relevant production lines are either directly or indirectly due to OEE.”
So why aren’t more UK businesses maximising the potential of OEE?
For some, moving to an OEE-based system of performance measurement in the first place can be something of a culture shock.
Richard Jones said: “Traditionally, targets may have been set by looking at past performance and then assessing present-day performance in relation to it. Shopfloor teams may be accustomed to hitting something like 98 per cent or 104 per cent of target. When OEE is measured properly, however, the baseline figure for a company new to the technique might be around the 60 per cent mark. Unless the members of the board are very clued up about OEE, it can be difficult for a production manager to break this news to the senior management team, even though facing an uncomfortable truth holds the key to significant improvement.”
In these insecure times it’s easy to see that ‘shooting the messenger’ might be a fear. Equally, shopfloor personnel might feel demoralised at a perceived ‘moving of the goalposts’ unless the process is communicated effectively.
But that doesn’t explain why some companies, having cleared the hurdle of measuring OEE, don’t go on to reap the rewards.
Richard Jones said: “OEE means different things to different people. One potential pitfall is that the manufacturer might fail to measure OEE correctly or collect incomplete data. Or there can be differences in interpretation.”
At Cadbury Schweppes in the West Midlands, true OEE is measured from start to finish of the process on designated production lines. Two years ago a decision was made to set a global, group-wide definition for the measurement of OEE.
“We have 123 production sites in total,” explained Alister Jones of Cadbury Schweppes. “Not all of these were involved in this project, but the sites most experienced in OEE worldwide decided on an agreed definition of what should be measured and how. There were only subtle differences between the ways the sites were working, but it was important to have a common language that could be understood, and measurements that could be compared like-for-like across multiple countries.”
Experts advise that OEE should be a true reflection of equipment performance and therefore include changeover periods and stoppages for planned maintenance.
Richard Jones continued: “It can be tempting to just measure stoppages during uptime, but that might give an incomplete picture, especially where changeovers between batches can be time-consuming. At MCP we advise our clients to use a relatively simple equation: availability × performance × quality, using the full working shift pattern as the basis – not just actual uptime.
“To set appropriate initial targets, simply measure each of the performance categories for a suitable time and then take the highest scores from each category and multiply these together for your first OEE baseline.”
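As a rough illustration of that calculation, the short Python sketch below multiplies availability, performance and quality measured against the full shift pattern, then derives a first target by multiplying the best score seen in each category. Every figure in it is hypothetical, chosen only to show the arithmetic.

```python
# Illustrative OEE sketch: availability x performance x quality, measured against
# the full working shift pattern rather than just actual uptime. All figures are hypothetical.

shift_time_hours = 16.0      # full planned shift pattern (e.g. two 8-hour shifts)
run_time_hours = 12.0        # time the line actually ran (changeovers and stoppages lost)
ideal_rate_per_hour = 500    # theoretical maximum output rate (units/hour)
total_units = 5_100          # units actually produced
good_units = 4_950           # units right first time

availability = run_time_hours / shift_time_hours                    # 0.75
performance = total_units / (run_time_hours * ideal_rate_per_hour)  # 0.85
quality = good_units / total_units                                  # ~0.97

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")  # roughly 62%, close to the ~60% baseline quoted for newcomers

# First target via the 'best of best' method: take the highest score observed
# in each category over the measurement period and multiply them together.
best_availability, best_performance, best_quality = 0.82, 0.90, 0.99
first_target = best_availability * best_performance * best_quality
print(f"Initial OEE target = {first_target:.1%}")  # about 73%
```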
Flexsys Rubber Chemicals, Ruabon, applies a slightly different form of measurement. “We use the Oliver Wight checklist for operational excellence on a worldwide basis across the group,” explained Roger Mason. “About two and a half years ago, however, we decided to introduce OEE as a secondary measure, just at the UK site, to drive improved attainment of daily, weekly and monthly targets. Our aim was to adhere as closely as possible to our production schedules, as these have a direct impact on on-time delivery and customer satisfaction. Service levels for customers are a key to success in our marketplace, where competitors in low-cost economies make it difficult to win and keep customers on price alone.”
While the textbooks advocate measuring periods of planned downtime within OEE, Flexsys decided that this was inappropriate in its highly automated, process-orientated production environment.
“The nature of our business is such that nothing is made for inventory. So if a production process is not needed it is immediately shut down. Any other action would simply generate waste. We decided that including periods of no current demand within OEE would drive the wrong kind of behaviour from employees by encouraging them to run machinery when it was not necessary. Instead, we are focused on delivering customer orders as efficiently as possible. Our data relates to theoretical maximum, actual run rate, number of hours available for production and quality.”
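A minimal sketch of how that variation might be computed, assuming availability is taken against the hours production was actually required rather than the full shift, and using the four data categories Roger Mason names. The figures are invented, not Flexsys’s own.

```python
# Demand-based OEE sketch: periods with no customer demand are excluded, so availability
# is measured against the hours production was actually required. All values are illustrative.

hours_required = 100.0      # hours production was needed to meet the schedule (no-demand time excluded)
hours_run = 92.0            # hours the process actually ran within that window
theoretical_max_rate = 2.0  # theoretical maximum rate, tonnes per hour
actual_output = 170.0       # tonnes actually produced
in_spec_output = 166.0      # tonnes right first time

availability = hours_run / hours_required                         # 0.92
performance = actual_output / (hours_run * theoretical_max_rate)  # ~0.92
quality = in_spec_output / actual_output                          # ~0.98

oee = availability * performance * quality
print(f"OEE against demanded hours = {oee:.1%}")  # roughly 83%
```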
As well as measuring its production processes, Flexsys monitors the OEE of its utility services, including process water, water treatment, compressed air and power from a 3.5-megawatt on-site power station. Individual pieces of plant are measured and these performances are multiplied by the availability of utilities and services.
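That roll-up might look something like the sketch below; the plant OEE and utility availabilities shown are purely illustrative assumptions, not Flexsys figures.

```python
# Hypothetical roll-up: the OEE measured for an individual piece of plant is multiplied
# by the availability of the utilities and services it depends on.

plant_oee = 0.88
utility_availability = {
    "process water": 0.99,
    "compressed air": 0.985,
    "power station": 0.995,
}

effective_oee = plant_oee
for service, availability in utility_availability.items():
    effective_oee *= availability

print(f"Effective OEE including utilities = {effective_oee:.1%}")  # about 85%
```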
Flexsys has been pleased with the results it has achieved since adding OEE to its range of business metrics. At the outset, two and a half years ago, OEE was at around the 60 per cent mark, contributing directly to its problems in meeting production schedules. Today OEE is running at around the 85 to 90 per cent mark. Performance against production schedules has been significantly improved, with particular impact on attainment of weekly targets. Two years ago approximately 25 per cent of weekly production schedules were met, but today this has improved to around 90 per cent.
The improved levels of performance have fed back into the scheduling system. Order delivery times were previously calculated using demonstrated production times rather than theoretical ones, for the obvious reason of avoiding disappointed customers. As performance has improved, the demonstrated production times have improved with it, and shorter lead times are now quoted to customers, strengthening Flexsys’s competitive position.
Both Flexsys and Cadbury Schweppes agreed that the way they introduced OEE to their employees has had a fundamental impact on their successful outcomes.
Alister Jones of Cadbury Schweppes said: “The people involved in collecting the OEE data must have a clear understanding of why they’re doing it, and why it’s important. It is very natural for people to fudge the figures if they don’t see the point behind them. But that undermines the accuracy and, if the credibility of the data isn’t there, it ceases to be a valuable tool.
“To work well OEE needs to be a grass-roots process. When we’re introducing OEE on a new line it is a team-centric activity. Operators will use simple tally charts to record the information, and will carry out their own calculations. As time goes on the team might decide that there’s an easier way of doing things and ask for back-up. At that stage we will introduce IT. It is never imposed as a kind of ‘black box’ that employees don’t understand, or might even resent.”
Richard Jones had additional advice for companies wishing to extend their use of OEE, or maintain impetus. “Some companies find their OEE improvements hit a high point and then tail off when there is a change of staff,” he said. “You need to ensure that the skills and training are in place at all times to maintain standards. Companies reliant on contract or temporary staff can find this particularly challenging.
“In addition, if you’ve had a successful OEE pilot trial, you need to approach each subsequent introduction with the same systematic approach. The temptation is to rush things when you cascade OEE across the rest of the site. The operators however will need to work through the process just as methodically and will need time to get the new process of measurement bedded-in.”
In conclusion, though, the biggest potential pitfall is for a manufacturer to establish a system for measuring OEE and then fail to do anything with the resulting data.
Roger Mason said: “Viewed sensibly, collection of OEE data by operators is another form of waste, unless you are going to harness the knowledge you’ve gained and be proactive about making improvements. You need to get to the root cause.”