
PRODUCTION AND OPERATIONS MANAGEMENT
Vol. 7, No. 1, Spring 1998. Printed in U.S.A.

A SYSTEMATIC STRATEGY FOR OPTIMIZING MANUFACTURING OPERATIONS*
JONELL KERKHOFF, THOMAS W. EAGAR, AND JAMES UTTERBACK

Alcoa Industrial Chemicals Division, Pittsburgh, Pennsylvania 15215, USA
Materials Science and Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA
Sloan School of Management, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139, USA
A manufacturing optimization strategy is developed and demonstrated, which combines an asset utilization model and a process optimization framework with multivariate statistical analysis in a systematic manner to focus and drive process improvement activities. Although this manufacturing strategy is broadly applicable, the approach is discussed with respect to a polymer sheet manufacturing operation. The asset utilization (AU) model demonstrates that efficient equipment utilization can be monitored quantitatively and improvement opportunities identified so that the greatest benefit to the operation can be obtained. The process optimization framework, comprised of three parallel activities and a designed experiment, establishes the process-product relationship. The overall strategy of predictive model development provided by the parallel activities comprising the optimization framework is to synthesize a model based on existing data, both qualitative and quantitative, using canonical discriminant analysis; to identify main-effect variables affecting the principal efficiency constraints identified using AU, operator knowledge, and order-of-magnitude calculations; and then to refine this model using designed experiments, where appropriate, to facilitate the development of a quantitative, proactive optimization strategy for eliminating the constraints. Most importantly, this overall strategy plays a significant role in demonstrating, and facilitating employee acceptance, that the manufacturing operation has evolved from an experience-based process to one based on quantifiable science. (MANUFACTURING OPTIMIZATION, MULTIVARIATE STATISTICAL ANALYSIS, PROCESS-PRODUCT RELATIONSHIP, ASSET UTILIZATION)

* Received July 1994; revisions received July 1996 and April 1997; accepted April 1997.
Copyright © 1998, Production and Operations Management Society

1. Optimizing the Manufacturing Process

A basic and first level definition of manufacturing excellence is making product to customer specification in the most cost-effective manner with efficient use of resources (equipment and people) and delivering to the customer on time. Numerous articles and books have been published on manufacturing excellence, which typically encompasses just-in-time manufacturing, total quality management, total productive maintenance (TPM), and employee involvement (Hall 1984; Schonberger 1986). Five basic objectives of manufacturing excellence are


• Match throughput with demand: make only what is needed
• Reduce inventory
• Maintain high quality throughout the operation
• Reduce lead times
• Reduce operating expenses.

Achieving these objectives will maximize efficient operation in a cost-effective manner while fulfilling customers' demands for high quality, short lead times, and flexibility. Operational judgment is key in achieving these objectives because various trade-offs exist in reducing inventory versus reducing lead times versus reducing operating expenses. The goals and approaches selected to accomplish these objectives must be grounded in an integrated production and inventory strategy suited for the business's product customization and delivery performance expectations on a make-to-stock, make-to-order, or assemble-to-order basis. Driving continual and rapid improvements in these basic objectives will result in continuing improvements in quality, delivery performance, manufacturing efficiencies, and operating costs, all of which will contribute to the profitability objectives of the enterprise.

Looking across U.S. manufacturing operations, we find that these operations are at different stages in achieving manufacturing excellence. Additionally, the opportunities for improvement vary considerably with the type of industry and from operation to operation within an industry. Such operational divergence facilitates the evolution of a virtual continuum of paths toward manufacturing excellence. Nowhere is this more apparent than in the recent proliferation of methods for defining improvement opportunities by identifying and eliminating bottlenecks for capacity-constrained processes and by improving manufacturing efficiencies for unconstrained processes. In various types of manufacturing operations, opportunities for process improvement are often missed or given incomplete attention because of a lack of discipline in collecting data, analyzing data, and executing a quantitative systematic plan for improvement. This paper proposes a strategy for capturing improvement opportunities offered by the abovementioned manufacturing excellence objectives by

• Identifying and quantifying the opportunities for achieving efficient operations through use of an asset utilization (AU) process
• Focusing on these opportunities with specific quantitative tools to eliminate the most critical root causes of time and material losses and provide the largest gain to efficient operation and overall profitability of the business.

The framework for reducing quality losses described here uniquely incorporates the above-listed phases together, in a systematic manner, to provide quantitative insight into the manufacturing operation. This paper will discuss this strategy and will present a practical application for a polymer sheet manufacturing operation.

2. Background on the Polymer Sheet Forming Operation

The process for polymer sheet manufacturing is based largely on technology developed many decades ago. The polymer sheet forming process is a continuous casting operation. A schematic example of a typical continuous casting process is shown in Figure 1. A viscous polymer stream is cast onto a wheel and conveyed through an oven system to create a sheet of specific thickness and characteristics. This sheet is wound onto large rolls, which are then sent to other operations within the company. Critical features of this sheet include thickness profile, thickness uniformity, absence of defects, and sheet modulus (rheology). Teams of operators in the polymer operation are responsible for operating a group of machines and performing basic maintenance. Individual process engineers are involved with day-to-day process improvement activities for specific groups of machines.

FIGURE 1. Schematic Diagram of a Typical Continuous Casting Operation (viscous polymer in; casting; ovens; coating; windup).

In addition to the machine teams, process improvement teams also drive improvement activities by machine function. These cross-functional teams are comprised of engineers, working within the polymer operations, who cover machine functions such as casting, coating, and conveyance. These groups have worked historically on improvements for their own set of machines or functional area. Improvements primarily had been driven by fixes of machine-specific problems. It had often been observed that the same product, produced on different machines, exhibited different performance characteristics. At the initiation of this project, a strongly held belief within operations was that this manufacturing process is an art and not a science. This case study focuses on a set of machines in the polymer manufacturing operations, and illustrates the application of AU to identify and quantify improvement opportunities through root cause analysis and the application of a process optimization framework to understand and quantify key process-product relationships as a mechanism for capturing the quality improvement opportunities identified by AU.

3. Identifying and Quantifying Improvement Opportunities

A process for identifying and quantifying opportunities for improvement is AU. The AU process looks at how we can most effectively match demand requirements with equipment utilization and efficient operation. Through the use of AU, improvement opportunities are quantified by evaluating the root causes of time and material losses on equipment across an operation (Stewart 1991; Beckerman 1992; Kerkhoff 1994, 1995, 1996). The authors would like to state clearly that this AU approach must be driven coherently across an operation. The goal is not to drive each piece of equipment to 100% AU, which would result in excess inventory or work in process. The AU process employed here focuses on specific aspects, such as scheduled and unscheduled maintenance, material flow through the operation, feedstock issues, throughput inefficiencies, production rates, product quality issues, and waste. There are other approaches similar to AU. Examples include the overall equipment effectiveness approach described as a part of TPM (Nakajima 1988) and the frequently employed uptime metric. However, the uptime metric is limited, in that it only looks at sources of downtime, excluding equipment performance measurements for speed efficiency, setup, and quality. The name of the process is not important.

TABLE 1
Definition of Asset Utilization and the Manufacturing Productivity Parameters

Asset utilization = % availability * % run time efficiency * % run speed efficiency * % yield
% Availability = (calendar time - downtime) / calendar time
% Run time efficiency = (available time - setup time) / available time
% Run speed efficiency = actual pounds produced / (run time * standard rate)
% Yield = conforming pounds produced / actual pounds produced
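To make the Table 1 arithmetic concrete, the short Python sketch below computes the four productivity parameters and the overall AU number for a single machine. The time and production figures are hypothetical and are not taken from the paper.

    # Hypothetical monthly figures for one machine (illustrative only).
    calendar_time_h = 720.0       # hours in the month
    downtime_h = 80.0             # scheduled/unscheduled maintenance, no operation, idle
    setup_time_h = 20.0           # product changeovers and transitions
    actual_lb = 450_000.0         # pounds actually produced
    conforming_lb = 380_000.0     # pounds meeting customer specification
    standard_rate_lb_h = 800.0    # standard (maximum) rate, lb/h

    available_time_h = calendar_time_h - downtime_h
    run_time_h = available_time_h - setup_time_h

    availability = available_time_h / calendar_time_h
    run_time_efficiency = run_time_h / available_time_h
    run_speed_efficiency = actual_lb / (run_time_h * standard_rate_lb_h)
    yield_ = conforming_lb / actual_lb
    asset_utilization = (availability * run_time_efficiency
                         * run_speed_efficiency * yield_)

    print(f"Availability         {availability:6.1%}")
    print(f"Run time efficiency  {run_time_efficiency:6.1%}")
    print(f"Run speed efficiency {run_speed_efficiency:6.1%}")
    print(f"Yield                {yield_:6.1%}")
    print(f"Asset utilization    {asset_utilization:6.1%}")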

What is important is that the process employed covers equipment performance completely and that the users are committed to the rigor and discipline of its use. The authors promote the AU process, undertaken in a systematic manner, to identify, quantify, and focus opportunities for improvement.

3.1. Asset Utilization Definitions

Improvement opportunities are identified by measuring an overall AU number and four key manufacturing productivity parameters: availability, run time efficiency, run speed efficiency, and yield. These parameters are defined in Table 1 and shown schematically in Figure 2. Each of the parameters looks at a specific part of a manufacturing process infrastructure. Availability determines the percent of time that the equipment is available to run product. Downtime, which is time spent on scheduled and unscheduled maintenance, no operation, and idle time caused by lack of customer orders, is tracked by this metric.

FIGURE 2. A Time Diagram Showing the Asset Utilization Parameters and Their Relationship to Calendar Time.

The no operation category is time that the equipment is down because of situations beyond its control, such as equipment being down in other parts of the operation, material flow problems, or incoming material and supplies that are not available or are of poor quality. Run time efficiency examines the percentage of cycle time that is spent actually running product versus setting up for other products. Time spent on product changeovers and transitioning is accounted for in this measurement. Run speed efficiency determines the percentage of time that the equipment ran at maximum speed. Time spent running at actual operating speed is compared with the maximum equipment speed. Run speed efficiency is calculated easily by determining how the actual amount of material produced compares with the amount of material that should have been produced at maximum speed or standard rate. Yield is the percent of time that quality product is produced on the equipment. To calculate yield, the amount of time spent running waste or running substandard product must be assessed. The AU process is applied to specific pieces of equipment across a manufacturing operation. Data collected for this analysis are tailored specifically for each functional type of equipment. The AU data are then analyzed, by equipment, to identify root causes of time losses and quantify the magnitude of these root cause effects, so that a Pareto analysis may be employed to provide a relative ranking of these opportunities for improvement.
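The Pareto step mentioned above amounts to ranking the loss categories for a piece of equipment by the time they consume. A minimal sketch, with invented loss categories and hours, is:

    # Hypothetical time losses (hours per month) by root-cause category for
    # one machine; the category names and figures are invented for illustration.
    losses_h = {
        "casting quality rejects": 58.0,
        "unscheduled maintenance": 46.0,
        "scheduled maintenance": 30.0,
        "reduced run speed": 25.0,
        "product changeovers": 12.0,
        "no operation / upstream outage": 9.0,
    }

    total = sum(losses_h.values())
    cumulative = 0.0
    print(f"{'Root cause':32s} {'hours':>7s} {'share':>7s} {'cum.':>7s}")
    for cause, hours in sorted(losses_h.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += hours
        print(f"{cause:32s} {hours:7.1f} {hours/total:7.1%} {cumulative/total:7.1%}")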

3.2. Guidelines for Implementing the AU Process

There are three important guidelines for productively using an AU process:
1. The AU process should be employed to drive toward predictable equipment and operations. Unscheduled maintenance and quality loss events measured by AU denote that equipment and processes are not predictable or reliable. Events or conditions leading to unscheduled maintenance and quality losses should be eliminated.
2. Improvement activities should focus on increasing the AU of any capacity-constrained equipment or, in the case of unconstrained equipment, the slowest producing piece of equipment, rather than across all equipment with a given function. If a piece of equipment is not the rate-limiting step in an operation, efforts to increase its utilization will only result in increasing inventory and reducing profitability.
3. It must be communicated clearly that the goal of the AU process is to increase efficient equipment utilization as a way to reduce costs. AU should not be driven to 100%.

Reflecting back on the five basic objectives of manufacturing excellence, it is important that each operation make the product mix required in the most efficient manner and in the minimum amount of time needed to meet the demand, or make only what is needed. To achieve these objectives, operations must be predictable and reliable and material flow must be synchronized across the operation. The forecast demand and/or actual orders, the inventory levels, and the inventory goals determine what product mix must be made for a given time frame. By knowing the production rates, the time required to make this product mix on the rate-limiting equipment can be determined. This time should be translated into the AU goal for that time period, to ensure that only the product mix that is needed is made. The actual AU can be monitored and compared with this AU goal.
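Translating a required product mix into an AU goal follows directly from the Table 1 definitions: the four ratios telescope, so AU equals conforming output divided by the theoretical maximum output over calendar time. A minimal sketch, with hypothetical planning figures, is:

    # Hypothetical planning figures for the rate-limiting machine (not from the paper).
    demand_lb = 350_000.0        # conforming product mix required this period, lb
    standard_rate_lb_h = 800.0   # standard (maximum) rate, lb/h
    calendar_time_h = 720.0      # hours in the planning period

    # AU = conforming lb / (calendar time * standard rate), so the AU goal
    # needed to make only what is demanded is:
    au_goal = demand_lb / (calendar_time_h * standard_rate_lb_h)
    print(f"AU goal for the period: {au_goal:.1%}")   # about 61% here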

4. Reducing the Root Causes of Productivity Losses

A process optimization framework was developed through this project for reducing process variability and increasing product quality as a result of a %yield improvement opportunity identified through AU. The process optimization framework is comprised of two parts.

FIGURE 3. Functional Flow Diagram for the Casting Process.

The first part strives to link the knowledge and experience of personnel within the operation with fundamental theory and statistical techniques, by using multivariate canonical discriminant analysis to quantify the relationship between key process conditions and product attributes, based on existing process and product attribute data (both numerical and descriptive). The second part uses the learnings from the first part for developing a designed experiment, which further quantifies the magnitude of process effects on product attributes by changing process conditions in a controlled manner. The learnings from both parts are then employed to develop a real-time predictive model for the casting process signals based on the polymer sheet thickness profile attributes.

4.1. Manufacturing Optimization Framework: Three Parallel Activities

A schematic diagram of the casting zone was shown earlier in Figure 1. The viscous polymer stream flows into the casting hopper reservoir at a specific temperature and viscosity. The polymer flows from the casting hopper reservoir through a slot of fixed dimensionality, forming a catenary between the hopper slot and the wheel surface. In the casting flow diagram in Figure 3, two functions of the casting are highlighted as critical to the casting process. These functions, distribute flow of polymer in the hopper and shape the catenary between the hopper and the wheel surface, are the first steps in creating the polymer sheet. The process conditions associated with these two functions directly and dramatically affect the final polymer sheet profile and edge shape quality. A framework was developed for understanding and quantifying the process-product relationship, outlined in Figure 4. This framework examines the cause and effect relationships between casting process conditions and the resulting sheet product attributes by using three parallel activities to select key casting process parameters and determine their effect on polymer sheet metrics.
FIGURE 4. The Process Optimization Framework Developed for Examining the Process-Product Relationships (on-going parallel efforts: knowledge/experience, process data, and theoretical calculations, each yielding key parameters).


These parallel activities serve to better characterize the casting functions from three points of reference: knowledge and experience of operations personnel, process data analysis with multivariate statistical tools, and order-of-magnitude theoretical calculations. Valuable information about any process resides with the engineers, operators, and maintenance personnel working in the operation. It is critical that the knowledge, opinions, and experience of these people be captured in a systematic format for driving and focusing the casting process improvement activities. Tools such as fault tree diagrams are appropriate for this purpose. The second critical activity is the evaluation of casting process data with valid and appropriate statistical techniques. Multivariate statistical tools such as principal components analysis, canonical discriminant analysis, and partial least squares analysis can be employed successfully to evaluate large populations of attribute data to identify the main process parameters as well as codependent sources of variability. Examples of use include instrument calibration, mixture analyses, and archeological classification of artifacts. The theoretical basis of these tools has been discussed previously (Fredericks, Lee, Osborn, and Swinkels 1985; Haaland and Thomas 1988; Massart, Vandeginste, Deming, Michotte, and Kaufman 1988; and Jackson 1991) and is outside the scope of this paper. The third activity serves to link the first and second activities to the fundamental theory of the casting process. In the present example, order-of-magnitude calculations were used to determine the magnitude of change anticipated on the cast sheet attributes with changing process conditions.

The learnings and output of these three activities were incorporated coherently into a designed experiment as the next step in the process optimization framework. A screening experiment (Daniel 1973; Cotter 1979; Montgomery 1984) was designed to focus on eight key process parameters. The final step in the process optimization framework was the development of a predictive model for the real-time detection of process conditions leading to out-of-spec product. The authors have striven to develop a process improvement approach that is applicable in general terms for common machine functions and can be applied easily and efficiently to all machines across the enterprise in terms of manpower, time, and cost. Likewise, this approach also could be tailored to tolerate machine-to-machine differences, facilitating the development of an enterprise-wide, function-independent approach for identifying and prioritizing opportunities for manufacturing optimization.

5. Results and Discussion

Results and findings of the AU analysis and the process optimization framework are presented below.

5.1. AU Analysis for a Set of Polymer Machines

The AU calculations were performed for a set of eight polymer machines within the sheet manufacturing operations using 9 consecutive months of historical process data. The average 9-month data values are shown in Table 2 for these eight machines. Availability values ranged from 64 to 97%. Most of the downtime was caused by unscheduled maintenance and scheduled maintenance activities. There was little idle time across the set of polymer machines evaluated. Run time efficiency values approached 100%, ranging from 95 to 99%. High values for this parameter were expected, because this is a continuous operation with a large number of dedicated machines and minimal product changes. The run speed efficiency values ranged from 74 to 91%. A major root cause of running at lower speeds was the occurrence of quality problems at the higher operating speeds.

TABLE 2
Productivity Parameter and Asset Utilization Calculations

Machine   Availability (%)   Run Time Efficiency (%)   Run Speed Efficiency (%)   Yield (%)   Asset Utilization (%)
C         78                 96                        75                         68          38
D         95                 99                        88                         80          66
F         64                 95                        74                         69          31
G         95                 98                        84                         85          66
H         81                 96                        91                         72          51
I         93                 98                        85                         79          61
M         86                 97                        85                         81          57
O         97                 99                        79                         85          64

Note. All machine values are the 9-month average.

Yield values ranged from 68 to 85%. Time spent running any product that does not meet customer quality specifications affects this metric. The resulting AU numbers ranged from 31% to 66%. This shows a difference in utilization of approximately 35% across the machines evaluated. Examining the root causes of quality losses further pinpointed specific yield improvement opportunities by quantifying the types of waste and reject generated across the machines. This analysis suggested that approximately 60% of the waste and reject material on these machines was attributed to the casting process. Improvement efforts in the second phase of the optimization strategy were directed at reducing casting quality losses to capture machine inefficiencies across these machines.

5.2. Process Optimization Framework

The framework for process optimization is presented by discussing the three parallel activities and their linkage and the screening experiment findings for the prediction model.

5.2.1. KNOWLEDGE AND EXPERIENCE. The engineers, operators, and maintenance personnel working on the polymer machines were extremely valuable resources for information about the casting process. Two undesirable product conditions were downselected by the group as most frequently occurring in cast sheet. These are widthwise thickness variability and edge condition variability. Fault tree diagrams were developed to organize this process information, obtained from brainstorming sessions conducted over a 4-month period. These diagrams helped to understand the relationship between the casting process conditions and undesirable product quality. A fault tree diagram is shown in Figure 5 for variability in the thickness profile. These diagrams and the process by which they are generated are critically important for capturing the knowledge, opinions, and learnings of experienced personnel, which is so often lost. Perhaps even more important is the effect that the fault tree development process had for drawing the local experts into the optimization process and ensuring their buy-in for subsequent experimentation. The process conditions and signals ranked by the group as reliably indicating that thickness problems were occurring in casting were casting hopper temperature, polymer temperature, casting valve pressure, and filter pressure.

5.2.2. STATISTICAL TOOLS FOR EXAMINING PROCESS DATA. The second of three parallel activities in this framework was to examine historical data from the casting process areas. The goal of this work was to determine if a predictive model, using inputs from existing process signals, could be developed from historically recorded qualitative product attribute metrics. This process is important for determining if adequate process data (or attributes) are being monitored or new sensor inputs are required.

FIGURE 5. The Fault Tree Diagram Showing the Potential Causes for Variations in Thickness Profile within the Casting Zone (branches: Manpower, Measurement, Materials, and Method).

Canonical discriminant analysis was used for the data set evaluations discussed in this work.

5.2.2.1. Historical Process Data Set Reduction and Examination. The first issue to resolve was whether preexisting data were valid for model development. A total of 34 types of casting signals were available from machine C, representing temperature, pressure, and valve position signals. Continuous, historical signals were collected for the three months of May, June, and August 1992. The data sets were reworked to contain process data that corresponded only to rolls that were tested (off-line) for thickness profile. This step eliminated all process data for which no final product metrics were available. The process conditions resulting in "bad" thickness test results were broken down into three categories, rated by the quality lab as action grade, passable limit, and nonpassable. The process conditions resulting in "good" test results were labeled as good. The resulting data set reflected the casting process conditions at the time of actual product testing over the 3-month time period.

5.2.2.2. Prediction Model for Polymer Sheet Thickness Quality. The statistical analyses were completed by J. P. Twist of the Chemometrics Laboratory at the company using canonical discriminant analysis. Initial assessment of this data set (Figure 6) showed three major groupings of data: one for the May-June time period, one for early August, and one for late August. The data labeled as good, action grade, passable limit, and nonpassable could not be separated into independent clusters. Subsequent investigations into the chronological significance of these groupings showed that equipment maintenance work performed between each of the three time periods resulted in major equipment changes and modifications to process set point values. These changes were of significant magnitudes to result in different process states. The data were grouped chronologically into time periods because of the overriding effect of the different process conditions on the thickness categories. In the next round of analyses, data from each of the three time periods, shown in Figure 6, were evaluated individually. The data within each time period could be separated into clusters, based on the thickness categories of good, action grade, passable limit, or nonpassable, using canonical discriminant analysis.

FIGURE 6. A Plot of the Two Major Principal Component Vectors Showing the Separation of Data into Three Groups: (1) May-June 1992; (2) August 1-16, 1992; and (3) August 17-31, 1992.
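A grouping plot of the kind labeled PRIN1/PRIN2 in Figure 6 can be produced by projecting the standardized casting signals onto their first two principal components. The sketch below uses a plain SVD-based projection; the array of signals is a random placeholder, since the original data are not reproduced in the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    # Placeholder: 178 tested rolls x 34 casting signals (temperatures,
    # pressures, valve positions); the real values are not given in the paper.
    X = rng.normal(size=(178, 34))

    # Standardize each signal, then project onto the two leading principal components.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    scores = Z @ Vt[:2].T        # (PRIN1, PRIN2) coordinates for each observation

    print(scores.shape)          # (178, 2): one point per tested roll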

The canonical discriminant analysis results are shown in Figure 7 for the May-June 1992 data. Differentiation of the four thickness categories can be observed easily in Figure 7 and represents a significant accomplishment for applying multivariate statistical tools to distinguish product quality test results using preexisting process conditions (or signals indicative thereof). The canonical discriminant analysis results were similar for the other two time periods, in that unambiguous separation of process data, based on the same four thickness categories, was also achieved.

FIGURE 7. A Plot of the Two Major Discriminant Analysis Vectors Showing the Separation of the G Data from the A, N, P Data for the August 1-16, 1992 Time Period.

A prediction model was developed, based on our ability to use discriminant analysis to relate and separate qualitative thickness results to casting process conditions. As a check on the canonical discriminant analysis results, the prediction models were tested and validated for their ability to predict thickness quality, as originally taken from an independent population of quality test results. As shown in Table 3, the prediction model correctly determined action grade quality 98.89% of the time, good quality 100%, nonpassable quality 100%, and passable limit quality 100% of the time. Similar analyses were obtained for the two sets of August 1992 data. Out of the 34 casting process signals evaluated by the canonical discriminant analysis, only eight of these signals were used (i.e., significantly weighted) in the prediction models. It is significant to note that only a few signals (out of a myriad of available process signals) were key to establishing the relationship between the casting process and the thickness product data. Even though a single predictive model could not be built for all of the 3-month data set because of significant step-function process changes (arising from maintenance activities), a valid predictive model based on the same key signals was developed for each time period when the process was unchanged (i.e., between maintenance activities).

TABLE 3
Comparison of Actual Data with the Prediction Model Results

Generalized squared distance function: D_j^2(X) = (X - x̄_j)' COV^{-1} (X - x̄_j)
Posterior probability of membership in each grouping: Pr(j|X) = exp(-0.5 D_j^2(X)) / Σ_k exp(-0.5 D_k^2(X))

Number of observations and percentages classified in each grouping:

             AG              G               NP              PL              Total
AG events    89 (98.89%)      0 (0.00%)       1 (1.11%)       0 (0.00%)       90 (100.00%)
G events      0 (0.00%)      15 (100.00%)     0 (0.00%)       0 (0.00%)       15 (100.00%)
NP events     0 (0.00%)       0 (0.00%)      28 (100.00%)     0 (0.00%)       28 (100.00%)
PL events     0 (0.00%)       0 (0.00%)       0 (0.00%)      45 (100.00%)     45 (100.00%)
Total        89 (50.00%)     15 (8.43%)      29 (16.29%)     45 (25.28%)     178 (100.00%)
Priors        0.2500          0.2500          0.2500          0.2500
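The two formulas above translate directly into code. The sketch below is a minimal NumPy version with simulated signals: it computes the generalized squared distance to each group mean, converts the distances to posterior probabilities, assigns each observation to the most probable group, and tabulates a Table 3-style confusion matrix. The pooled covariance matrix is an assumption on our part (the paper does not say whether pooled or within-group covariances were used); the equal priors of 0.25 match the Priors row of Table 3.

    import numpy as np

    rng = np.random.default_rng(1)
    groups = ["AG", "G", "NP", "PL"]

    # Simulated stand-ins for the casting signals: 8 signals per observation,
    # one block of training observations per thickness category.
    train = {g: rng.normal(loc=i, scale=1.0, size=(40, 8))
             for i, g in enumerate(groups)}

    means = {g: x.mean(axis=0) for g, x in train.items()}
    pooled = sum(np.cov(x, rowvar=False) * (len(x) - 1) for x in train.values())
    pooled /= sum(len(x) - 1 for x in train.values())
    cov_inv = np.linalg.inv(pooled)

    def posteriors(x):
        """Generalized squared distances D_j^2 and posterior membership probabilities."""
        d2 = np.array([(x - means[g]) @ cov_inv @ (x - means[g]) for g in groups])
        w = np.exp(-0.5 * (d2 - d2.min()))   # shift for numerical stability
        return w / w.sum()

    # Classify an independent test set and tabulate the confusion matrix.
    test = {g: rng.normal(loc=i, scale=1.0, size=(20, 8)) for i, g in enumerate(groups)}
    confusion = np.zeros((4, 4), dtype=int)
    for i, g in enumerate(groups):
        for x in test[g]:
            confusion[i, int(np.argmax(posteriors(x)))] += 1
    print(confusion)   # rows: actual category, columns: predicted category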

Three of the key casting process signals selected by the prediction model, casting hopper water valve position, casting hopper temperature, and polymer temperature, play important roles in maintaining the polymer temperature and the casting hopper temperature during casting. Two additional key signals, casting valve pressure and polymer filter pressure, indicate the stability of the polymer viscosity. By applying multivariate statistical techniques for qualitatively characterizing key process parameters and correlating these to product attribute data, we were able to demonstrate that metrics reflecting polymer rheology are the principal indicators of process quality in polymer sheet forming operations.
5.2.3. ORDER-OF-MAGNITUDE THEORETICAL CALCULATIONS. The third activity, conducted in parallel to the other two discussed above, examined the theoretical considerations of the casting process to determine the magnitude of change anticipated on the casting conditions. Order-of-magnitude calculations were used to examine the effects of changing polymer temperature and viscosity. Although this work is too lengthy to address completely in this paper, the calculations served to validate the findings of the first two thrusts, discussed above. One of the principal goals of the order-of-magnitude calculations was to preclude meaningless experimental design scenarios and offer yet another opportunity to discover potential main-effect variables that could affect observed process performance. When examining the commonality of these three parallel activities, it is crucial to note that the casting process signals cited in the knowledge and experience activity corroborated the key casting signals determined by the multivariate statistical analysis of historical data and the order-of-magnitude theoretical calculations. This was a significant step toward demonstrating that the process is a science and not an art.


5.2.4. DESIGNED SCREENING EXPERIMENT. On the basis of the data obtained through the above three parallel activities, a screening experiment was employed for the next step in the process optimization framework (Daniel 1973; Cotter 1979; Montgomery 1984). The screening experiment was designed to examine quantitatively the casting process-sheet thickness profile relationships as a mechanism to verify the casting functions, distribute flow and shape catenary. It was hypothesized that casting conditions would affect sheet thickness profile directly (as main-effect variables) or through interactions with one another. Because of the constraints of time and lost sheet production over the testing period, the screening experiment was limited to the evaluation of individual casting parameters as main-effect terms. Interaction terms were not considered. Production constraints often preclude the evaluation of interaction terms, unless these are identified as main effects from historical data, because the time required to explore these variables in a meaningful (statistically significant) manner is unavailable. Production losses caused by experimentation can be considerable when, as in the present case, changes to certain main-effect production line conditions, like polymer temperature, require as much as 3 hours to reach thermal equilibrium. Eight casting parameters were selected for evaluation as main-effect variables. Casting signals and sheet samples were collected for each experiment or set of run conditions (an illustrative run layout is sketched below, after the thermal results).

5.2.4.1. Thickness Profiles for Thermal Experimental Conditions. Experimental results are discussed for the four experiments in which the polymer temperature and the casting hopper temperature were varied. Examples of the thickness profile data for these four experiments, labeled as experiments 1, 2, 11, and 12, are shown in Figure 8. Each trace is vertically offset to separate the profiles for ease of viewing. Temperature conditions were observed to affect the resultant thickness profiles in a dramatic manner. The thickness traces for experiments 1 and 12 are for casting conditions in which the polymer temperature is greater than the hopper temperature by approximately 10°F, showing little change from center to edge. The thickness traces for experiments 2 and 11 are for casting conditions in which the polymer temperature is less than the hopper temperature by approximately 10°F. These profiles have the largest edge-to-center difference. All the profiles shown in Figure 8 exhibit peaks or valleys, which originate from hopper struts. It is important to note that these are extreme temperature differences and rarely are observed during normal production operations.
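For reference, the sketch below generates one common 2k+2-run screening layout for eight factors: a baseline run with every factor low, one run per factor with only that factor high, one run per factor with only that factor low, and a final run with every factor high. This is offered only as an illustration in the spirit of the screening designs cited above (Cotter 1979); the paper does not report the actual run matrix used.

    # Eight casting parameters, coded -1 (low) / +1 (high); names follow Table 4.
    factors = ["polymer temperature", "hopper temperature", "edge control flow",
               "edge temperature", "atmosphere setting 1", "atmosphere setting 2",
               "atmosphere temperature", "vacuum under hopper"]
    k = len(factors)

    runs = [[-1] * k]                                                     # all factors low
    runs += [[+1 if j == i else -1 for j in range(k)] for i in range(k)]  # one high at a time
    runs += [[-1 if j == i else +1 for j in range(k)] for i in range(k)]  # one low at a time
    runs.append([+1] * k)                                                 # all factors high

    print(f"{len(runs)} runs of {k} factors")                             # 2k + 2 = 18 runs
    for r in runs[:3]:
        print(r)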

FIGURE 8. Four Thickness Profiles from Experimental Conditions Showing the Effect of Changing Polymer and Casting Hopper Temperatures.

5.2.4.2. Fourier Analysis as a Quantitative Measurement Tool. The significance of observations drawn from designed experiments of industrial processes is largely dependent on the sensitivity and resolution of the measurements (both process and end-use product) employed. When adequate measures do not exist, they must be developed. For sheet forming processes, it is crucial that differences in thickness profiles, like those shown in Figure 8 for specific process conditions, be recorded using the best quantitative metrics available. Given that the predictive model developed to this point was based on four qualitative thickness categories, derived from limited profilometry measurements of sections of as-produced sheet, we chose to increase the rate of profilometry sampling and, more importantly, developed a Fourier transform-based data evaluation tool for evolving this metric from a qualitative ranking to a quantitative analysis. To illustrate the power of Fourier transformation, as applied to the present measures, Figure 9 shows Fourier analysis results for the thickness profiles of Figure 8. These results were plotted as the square of the amplitudes of the resulting frequencies. The peak at 1 contains information about the magnitude of the edge profile. The peak at 0.067 shows the presence and magnitude of strut lines. Sheet samples were analyzed using the preexisting quality bench metrics for thickness profile. These four metrics, (1) mean thickness across the sheet, (2) range of thickness across the sheet, (3) maximum range in 3% increments across the sheet (max 3%), and (4) maximum range in the first 35% of the sheet from both edges (max 35%), determine if a thickness sample is within specification. In Figure 10, the quality bench metrics are compared with the Fourier metrics for the experimental scans shown in Figures 8 and 9. There is virtually no change in the preexisting quality bench metrics for the thickness profile changes that occurred with the extreme experimental casting conditions. The Fourier metrics developed for this test exhibit both a larger dynamic range of response (3.5 orders of magnitude) and a higher sensitivity to these thickness profile changes. As demonstrated by the present example, preexisting metrics could well have been a principal constraint limiting new knowledge obtained from designed experiments based solely on these measurements. For monitoring sheet profile, Fourier metrics offer improved diagnostics of product attribute data, particularly for applications where assignation of defect origin (i.e., strut marks, roll off center, etc.) is important.
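A Fourier metric of the kind described above can be computed in a few lines: transform a widthwise thickness trace and read off the squared amplitudes at the spatial frequencies of interest. The profile below is synthetic (a smooth edge-to-center shape plus a periodic strut-line ripple), and the sampling interval is an assumption; only the general procedure mirrors the text.

    import numpy as np

    # Synthetic widthwise thickness trace (arbitrary units): an edge-to-center
    # roll-off plus small periodic strut marks every 15 position units.
    x = np.arange(0.0, 300.0, 1.0)
    profile = 100.0 - 0.5 * np.cos(2 * np.pi * x / 300.0)    # edge/center shape
    profile += 0.2 * np.sin(2 * np.pi * x / 15.0)            # strut-line ripple

    # Squared Fourier amplitudes, as plotted in Figure 9.
    amp2 = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
    freq = np.fft.rfftfreq(len(profile), d=1.0)              # cycles per position unit

    def peak(f_target):
        """Squared amplitude at the frequency bin closest to f_target."""
        return amp2[np.argmin(np.abs(freq - f_target))]

    print("edge-shape component:", peak(1.0 / 300.0))        # one cycle across the web
    print("strut-line component:", peak(1.0 / 15.0))         # the periodic strut ripple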

FIGURE 9. Fourier Results Showing the Effect of Changing Polymer and Hopper Casting Temperatures on Thickness Profile Attributes.

FIGURE 10. Plot Comparing the Fourier Metric Peak of 1 with the Four Quality Bench Metrics.

5.2.4.3. Results of the Screening Experiment. After the sheet sample analyses were completed, experimental calculations (Cotter 1979) were performed to determine which factors had significant effects on the cast sheet profile. Table 4 lists the eight casting parameters and rankings for the Fourier and two quality bench metrics, mean and range. These experimental results confirm the findings (hypotheses) of the parallel activities comprising the process optimization framework, discussed above, which suggested that polymer temperature and hopper temperature are the most significant main-effect variables affecting sheet thickness profile. The results of the screening experiment are a significant step toward quantitatively correlating casting process conditions with resultant cast sheet thickness quality. The prediction model, developed from qualitative data collected during the parallel activities of the optimization framework using multivariate statistical tools (see Section 5.2.2.2), can now be refined based on the quantitative Fourier thickness quality metrics. Refinement of the prediction model is the first and most important step toward developing a proactive optimization strategy for the two casting functions, distribute flow and shape catenary.
TABLE 4
Rankings of the Fourier and Quality Bench Metrics for the Casting Parameters*

Casting Parameter        Fourier (1.00)   Fourier (0.067)   Mean   Range
Polymer temperature      1                2                 1      1
Hopper temperature       2                1                 4      3
Edge control flow        5                3                 2      4
Edge temperature         3                4                 4      5
Atmosphere setting 1     4                5                 3
Atmosphere setting 2                                        4
Atmosphere temperature
Vacuum under hopper

* Rankings from 1 to 5, with 1 having the largest magnitude value.
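The rankings in Table 4 rest on main-effect estimates of the usual two-level kind: for each factor, the average response at its high level minus the average at its low level, ranked by absolute magnitude. The sketch below shows that calculation on invented data; it is a generic main-effect contrast, not necessarily the exact estimator of Cotter (1979) used by the authors.

    import numpy as np

    rng = np.random.default_rng(2)
    factors = ["polymer temp", "hopper temp", "edge control flow", "edge temp",
               "atmosphere 1", "atmosphere 2", "atmosphere temp", "vacuum"]

    # Invented screening data: 18 runs, balanced +/-1 levels per factor, and a
    # Fourier thickness metric y for each run (all made up for illustration).
    X = np.column_stack([rng.permutation(np.tile([-1.0, 1.0], 9)) for _ in factors])
    true_effects = np.array([2.0, 1.5, 0.2, 0.4, 0.3, 0.1, 0.1, 0.05])
    y = X @ true_effects + rng.normal(scale=0.2, size=len(X))

    # Main effect of each factor: mean response at +1 minus mean response at -1.
    effects = np.array([y[X[:, j] > 0].mean() - y[X[:, j] < 0].mean()
                        for j in range(len(factors))])

    for rank, j in enumerate(np.argsort(-np.abs(effects)), start=1):
        print(f"{rank}. {factors[j]:18s} effect = {effects[j]:+.2f}")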


The overall strategy of predictive model development provided from the parallel activities comprising the optimization framework presented above is to synthesize a model based on existing data, both qualitative and quantitative; identify main-effect variables affecting the principal efficiency constraints identified using AU, operator knowledge, and order-of-magnitude calculations; and then refine this model using designed experiments, where appropriate, to facilitate the development of a quantitative, proactive optimization strategy for eliminating the constraints.

6. Highlights, Accomplishments, and Recommendations

A manufacturing optimization strategy with a unique combination of tools has been presented and is comprised of an AU model and a process optimization framework using multivariate statistical analysis. The AU model demonstrates that efficient equipment utilization can be assessed and serves as the principal identification metric by which improvement activities can be focused on areas where the greatest benefit to the operation can be accomplished. The process optimization framework, made up of three parallel activities (formally capturing operator knowledge, multivariate statistical analysis, and order-of-magnitude calculations) and a designed experiment, established the process-product relationship. This framework also served to quantify the effect of process conditions on product attributes and selected key process parameters for the verification strategy. One of the most significant results from the parallel activities in this work was the development of a prediction model, from preexisting process data, that capably established the relationship between process conditions and qualitative product attribute data. Fourier analysis was employed for the quantitative evaluation of thickness profile and dramatically improved the diagnostic utility of thickness profile data for process monitoring. Most importantly, as a result of this manufacturing optimization strategy, the polymer sheet manufacturing operation can now be said to be a process based on quantifiable science instead of a process practiced as an art.

6.1. Capacity Gained through the AU Process

The AU process and the four productivity parameters act as drivers for identifying and quantifying opportunities for increasing capacity and for reducing operational costs with existing equipment by improving the overall efficiencies of equipment utilization. Examination of the %yield values for the eight polymer machines studied in this work shows that the %yield values listed in Table 5 range from 68 to 85%. Six machines have values less than 85%. If the quality losses could be reduced so that the %yield values across all eight machines could be improved to 85% or greater, the benefit to the operation would be equivalent to a net capacity gain of 60% of an additional machine.
TABLE 5
Capacity Gains through Yield Improvements of 85%

Machine   Current Yield (%)   Yield Improvement (%)   Capacity Gain (%)
C         68                  85                      17.00
D         80                  85                       5.00
F         69                  85                      16.00
G         85
H         72                  85                      13.00
I         79                  85                       6.00
M         81                  85                       4.00
O         85

Effective net gain for the set of eight machines: 0.6 machines
Effective net gain across all machines in operation: 1.1 machines
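Following the paper's convention of expressing each machine's gain as its percentage-point yield improvement, the Table 5 arithmetic can be reproduced directly:

    # Current %yield per machine, from Table 2 / Table 5.
    current_yield = {"C": 68, "D": 80, "F": 69, "G": 85,
                     "H": 72, "I": 79, "M": 81, "O": 85}
    target = 85   # benchmark yield already demonstrated on in-plant machines

    gains = {m: target - y for m, y in current_yield.items() if y < target}
    print(gains)                                                  # per-machine gains, points
    print(f"net gain: {sum(gains.values()) / 100:.1f} machines")  # about 0.6 machines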


Similarly, %yield improvements to 85% or greater on all low-efficiency machines across the operation would result in a net capacity gain of one additional machine, which is an important zero (or low) capital opportunity for the operation. It is significant to note that this benefit is realized from improvement activities that increase %yield to 85%, which is a realizable goal as benchmarked on in-plant machines. Additional net capacity gains can be achieved with improvement activities that focus on increasing the other productivity parameters and the overall AU number, as discussed below. A schematic of how the AU process helps to identify and drive improvement activities is shown in Figure 11. Polymer sheet capacity gain provides two opportunities for the polymer operations. First, if additional capacity is needed, a capacity increase can be realized without additional capital expenditures. Second, if there is no need for additional capacity, the overall number of machines in operation can be reduced, providing savings in environmental and operating costs.

6.2. Additional Work and Activities

Although project deadlines constrained the nature and magnitude of the improvements realized in the example discussed above (polymer sheet manufacturing), a larger scope of additional improvement opportunities remained. The AU process may be used to assess quantitatively these opportunities and provide a framework for root cause analysis to define process optimization activities, as outlined in Figure 11. As listed in Table 2, run time efficiencies are high, averaging 97%, as might be expected for continuous, specific-product dedicated machines where setup times and product changes have been minimized. Availability numbers are next highest, averaging 87%. The principal production-controlled factors contributing to lost availability are unscheduled maintenance and scheduled maintenance. Key activities for minimizing unscheduled maintenance are the implementation of preventative maintenance and equipment reliability programs. The largest productivity gains suggested from the AU data in Table 2 are from run speed efficiency and yield, averaging 82 and 77%, respectively. Increasing run speed would increase throughput but would adversely affect product quality (as per prior machine experience), contributing to even lower yield numbers. Owing to the apparent dependency often observed between these two metrics, optimizing %yield (as discussed above) allows for the development of a better understanding of the key process parameters contributing to yield losses.

identify

and Quantify

Improvement = A
l

Opportunities * RTE * Y

Asset Utilization

RSE

[e][r]
FIGURE

Increase

Efficient

Operation

[-YiEzgF] [W]
- improve product

by focusing

on Root Causes

test methods

11. The Asset Utilization Process as a Driver for Manufacturing Improvement Activities.

84

JONELL

KERKHOFF,

T. W. EAGER,

AND

JAMES

UTTERBACK

The next step in improving %yield would be to perform a designed experiment, focusing on the major process factors or casting parameters identified in the screening experiment discussed above. The knowledge gained from this phase of the optimization process then could be employed to reexamine increasing run speeds under deliberately (or more intelligently) controlled process conditions wherein yield losses are minimized. Depending on the magnitude of the impact of run speed on product quality, designed experiments may again be required, at increased run speeds, to determine if new (or reprioritized) critical process variables have emerged under the new process conditions.

6.3. Applying the Manufacturing Optimization Strategy to Other Manufacturing Processes

The manufacturing optimization strategy established through this work is comprised of the AU process and the process optimization framework. The AU process can be adapted readily across different operations, which are set up as continuous, batch, or job shop operations. Batch or job shop operations typically would have lower run time efficiency numbers than a continuous operation because of the setup time and product change times required for each batch or piece to be produced. The AU process has been applied successfully to continuous polymer sheet manufacturing, batch and semicontinuous chemicals operations, batch aluminum rolling and finishing operations, job shop forging production, and bauxite mining operations (Stewart 1991; Kerkhoff 1994, 1995) as a tool for identifying and quantifying opportunities for improvement. The process optimization framework can be applied across different operations, wherever there is a need to reduce process variability and improve product quality. The strengths and unique features of this framework are the quantitative linkage of knowledge and experience of operations personnel with theoretical foundations and multivariate statistical tools to quantify the relationships of more than one key process signal to product quality attributes. Additionally, the model developed from this framework can be implemented in an on-line manner to predict whether or not product quality will be in specification.¹
¹ The authors express their appreciation to the internship company, with a special thanks to J. Paulson and D. Grant for their help and support throughout this project. Additional thanks go to the division managers for providing the opportunity to do this internship project in their division and to the people throughout the manufacturing operations for their friendliness and openness. We acknowledge the support and resources made available through the Leaders for Manufacturing Program, a partnership between MIT and major U.S. manufacturing companies. A special thanks and note of appreciation goes to the intern student's sponsoring company, Alcoa, for providing the opportunity, encouragement, and support to attend this program.

References

BECKERMAN, S. (1992), "The Case of the Hidden Factory," Alcoa News, Spring, 1-5.
COTTER, S. (1979), "A Screening Design for Factorial Experiments with Interactions," Biometrika, 66, 2, 317.
DANIEL, C. (1973), "One-at-a-Time Plans," Journal of the American Statistical Association, 68, 353.
FREDERICKS, P. M., J. B. LEE, P. R. OSBORN, AND D. A. SWINKELS (1985), "Materials Characterization Using Factor Analysis of FT-IR Spectra. Part 1: Results," Applied Spectroscopy, 39, 303.
HAALAND, D. AND E. V. THOMAS (1988), "Partial Least-Squares Methods for Spectral Analyses with Application to Simulated and Glass Spectral Data," Analytical Chemistry, 60, 1202.
HALL, R. W. (1984), Attaining Manufacturing Excellence, Business One Irwin, Homewood, IL.
JACKSON, J. E. (1991), A User's Guide to Principal Components Analysis, John Wiley & Sons, New York.
KERKHOFF, J. (1994), Asset Utilization for Tabular Operations, Internal Alcoa Memo.
KERKHOFF, J. (1995), Asset Utilization Analysis for Alcoa Industrial Chemicals, Internal Alcoa Report.
MASSART, D. L., B. G. M. VANDEGINSTE, S. N. DEMING, Y. MICHOTTE, AND L. KAUFMAN (1988), Chemometrics: A Textbook, Elsevier, Amsterdam.
MONTGOMERY, D. C. (1984), Design and Analysis of Experiments, John Wiley & Sons, New York.
NAKAJIMA, S. (1988), Introduction to TPM, Productivity Press, Cambridge, MA.
SCHONBERGER, R. J. (1986), World Class Manufacturing, The Free Press, New York.
STEWART, D. F. (1991), "Asset Utilization: A Competitive Weapon," Alcoa Worldwide Manufacturing Conference, Aluminum Company of America, Alcoa Technical Center, April.


