Research Article
Developing a lean measurement system to enhance process improvement
- P. Lewis, G. Cooke
- Published online by Cambridge University Press: 06 March 2014, pp. 145-151
A key ingredient underpinning process improvement is a robust, reliable, repeatable measurement system. Process improvement activity needs to be supported by accurate and precise data because effective decision making, within process improvement activity, demands the use of “hard” data. One of the oldest and most established process improvement methods is Deming’s Plan-Do-Check-Act (PDCA) cycle, which relies on the check phase, a measurement activity in which data are gathered and evaluated. Recent expansions of the PDCA cycle, such as the Six-Sigma Define-Measure-Analyse-Improve-Control (DMAIC) methodology, place significant importance upon measurement. The DMAIC cycle incorporates the regimented requirement for the inclusion of measurement system analysis (MSA) into the breakthrough strategy. The call for MSA within the DMAIC cycle is to provide the improvement activity with a robust measurement system that will ensure a pertinent level of data during any validation process. The Lean methodology is heavily centred on the removal of the seven Mudas (wastes) from a manufacturing process: defects, overproduction, transportation, waiting, inventory, motion and processing. The application of lean, particularly within the manufacturing industry, has led to a perception that measurement is a waste within a manufacturing process because measurement processes identify defective products. The metrologists’ pursuit of measurement excellence could be construed as a hindrance by the “cost down” demands emanating from the same organisation’s lean policy. So what possible benefits does enforcing the regimes of the lean and quality philosophies upon the measurement process have, and how does this ultimately enhance the process improvement activity? The fundamental principle to embed within any process improvement is the removal of waste.
The process improvement techniques embedded within lean and quality concepts are extremely powerful practices in the drive to eradicate waste, but there are numerous contextual problems with the application of process improvement activity and its associated measurement system. The application demands of the organisation may be subject to a number of financial and resource constraints, which may introduce reasons not to apply stringent measurement methodology and practice. The failure of various process improvement activities due to poorly managed measurement activity has arguably never been comprehensively analysed. Process improvement theory fully emphasises the need for robust applied measurement systems, so how can process improvement and measurement be systematically aligned to gain benefits? The aim of this paper is to consider whether lean philosophies can be integrated and applied within measurement systems. The discussion seeks to identify whether the seven Muda exist within measurement systems and whether addressing them will lead to benefits for process improvement activities.
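The MSA requirement that DMAIC mandates is commonly quantified with a gage repeatability and reproducibility (R&R) study. The sketch below uses simulated data and a deliberately simplified variance decomposition (not the full ANOVA method, and not from the paper) to show how the %GRR figure of merit separates measurement-system variation from part-to-part variation:

```python
import numpy as np

# Simulated crossed gage study: 2 operators x 3 repeats x 3 parts.
# All numbers are illustrative, not from the paper.
rng = np.random.default_rng(0)
true_part = np.array([10.0, 10.5, 9.8])          # three part means, mm
op_bias = np.array([0.00, 0.05])                 # two operator biases, mm
# shape (operator, repeat, part)
data = np.array([[true_part + op_bias[o] + rng.normal(0, 0.02, 3)
                  for _ in range(3)] for o in range(2)])

# Repeatability: pooled variance of repeats within each operator/part cell
repeatability = data.var(axis=1, ddof=1).mean()
# Reproducibility: variance of operator means about the grand mean
reproducibility = data.mean(axis=(1, 2)).var(ddof=1)

grr = repeatability + reproducibility            # total gage variance
part_var = data.mean(axis=(0, 1)).var(ddof=1)    # part-to-part variance
pct_grr = 100 * np.sqrt(grr / (grr + part_var))  # %GRR as share of total spread

print(f"%GRR = {pct_grr:.1f}%")
```

A low %GRR indicates that the measurement system contributes little to the observed spread, i.e. it is fit to support the validation phases of a process improvement activity.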
Visual tolerance analysis for engineering optimization
- W. Zhou Wei, M. Moore, F. Kussener
- Published online by Cambridge University Press: 06 March 2014, pp. 153-162
Classic methodologies of design of experiments (DOE) are widely applied in design, manufacture, quality management and related fields. The resulting data can be analysed with linear modelling methods such as multiple regression, which generates a set of equations, Y = F(X), that enable us to understand how varying the mean of one or more inputs changes the mean of one or more responses. To develop, scale-up and transfer robust processes to manufacturing, we also need to set the control tolerances of each critical X and understand the extent to which variation in the critical X’s propagates through to variation in the Y’s, and how this may impact performance relative to requirements (or specifications). Visual tolerance analysis provides a simple way to understand and reduce propagation of variation from X’s to Y’s using models developed from DOEs or historical data. This paper briefly introduces the concept of tolerance analysis and extends this to visual tolerance analysis through defect profiles and defect parametric profiles. With the help of visual tolerance analysis, engineering and statistical analysts can work together to find the key factors responsible for propagating undesired variation into responses and determine how to reduce these effects to deliver a robust and cost-effective process. A case study approach is used to aid explanation and understanding.
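The propagation of variation from X’s to Y’s that tolerance analysis quantifies can be sketched with a simple Monte Carlo simulation. The transfer function, tolerances and specification limits below are illustrative assumptions, not the paper’s case study:

```python
import numpy as np

# Monte Carlo sketch of tolerance analysis: propagate variation in the X's
# through a DOE-style model Y = F(X) and estimate the defect rate on Y.
def y_model(x1, x2):
    # Illustrative fitted model with an interaction term
    return 3.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1 * x2

rng = np.random.default_rng(1)
n = 100_000
# Each X varies normally within its control tolerance (sigma = tol / 3)
x1 = rng.normal(1.0, 0.10 / 3, n)
x2 = rng.normal(2.0, 0.20 / 3, n)
y = y_model(x1, x2)

lsl, usl = 3.5, 3.8                              # assumed spec limits on Y
defect_rate = np.mean((y < lsl) | (y > usl))
print(f"mean Y = {y.mean():.3f}, sd = {y.std(ddof=1):.4f}, "
      f"defect rate = {defect_rate:.2%}")
```

Tightening the tolerance of whichever X dominates the variance of Y (here x1, because of its larger sensitivity) and re-running the simulation is the numerical counterpart of reading a defect parametric profile.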
Assessing measurement uncertainty in CMM measurements: comparison of different approaches
- S. Ruffa, G.D. Panciani, F. Ricci, G. Vicario
- Published online by Cambridge University Press: 06 March 2014, pp. 163-168
Manufactured parts are affected by size and form errors, which need to be assessed against geometrical and dimensional tolerances in order to meet the functional requirements they have been conceived for. The compliance assessment of workpieces with specifications depends on measurements and is unavoidably affected by several uncertainty contributions. According to the geometrical product specifications and verification (GPS) standards, measurement uncertainty consists of method and implementation uncertainty. The literature proposes different approaches for the evaluation of implementation uncertainty; however, a standardized method has not yet been achieved. This paper analyses the various elements to be considered when choosing a specific approach for evaluating the implementation uncertainty of circular features. The most common manufacturing signatures affecting circular profiles were considered, together with the number of points necessary for a reliable estimation of implementation uncertainty.
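One concrete contribution to implementation uncertainty is the sampling strategy itself. The Monte Carlo sketch below (an assumed Kasa least-squares fit with white probing noise, not the paper’s signature models) shows how the dispersion of a fitted radius depends on the number of probed points:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit: returns xc, yc, r."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (xc, yc, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return xc, yc, np.sqrt(c + xc**2 + yc**2)

rng = np.random.default_rng(2)
true_r, noise = 10.0, 0.005            # mm; illustrative probing noise
sds = {}
for n_pts in (8, 16, 32):
    radii = []
    for _ in range(500):               # repeated virtual measurements
        t = rng.uniform(0, 2 * np.pi, n_pts)
        x = true_r * np.cos(t) + rng.normal(0, noise, n_pts)
        y = true_r * np.sin(t) + rng.normal(0, noise, n_pts)
        radii.append(fit_circle(x, y)[2])
    sds[n_pts] = np.std(radii, ddof=1)
    print(f"{n_pts:2d} points: sd of fitted radius = {sds[n_pts]:.5f} mm")
```

The dispersion shrinks roughly with the square root of the number of points; a real manufacturing signature (lobing, ovality) would add a systematic component on top of this random one.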
Multivariate SPC for total inertial tolerancing
- M. Pillet, A. Boukar, E. Pairel, B. Rizzon, N. Boudaoud, Z. Cherfi
- Published online by Cambridge University Press: 06 March 2014, pp. 169-175
This paper presents the joint use of the T2 chart and total inertial tolerancing for process control. We show an application of these approaches to the machining of mechanical workpieces using a cutting tool. When a single cutting tool affects several manufactured dimensions of the workpiece, a maladjustment of the tool due to bad settings induces a correlation between these dimensions. Thanks to total inertial steering, this correlation structure is known. This paper shows how T2 charts allow one to take this correlation into account when detecting the maladjustment of the cutting tool. The total inertial steering approach then allows one to calculate the value of the tool offsets needed to correct this maladjustment. We present this approach using a simple theoretical example for ease of explanation.
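A minimal sketch of the Hotelling T2 statistic on two correlated dimensions (illustrative numbers, not the paper’s case study) shows why the chart exploits the correlation structure: a shift inconsistent with the known correlation is flagged at a much smaller amplitude than the same shift along it.

```python
import numpy as np

rng = np.random.default_rng(3)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])        # assumed tool-induced correlation
ref = rng.multivariate_normal([0.0, 0.0], cov, size=200)  # in-control reference data

mu = ref.mean(axis=0)
S_inv = np.linalg.inv(np.cov(ref, rowvar=False))

def t2(x):
    """Hotelling T2 distance of one observation from the reference mean."""
    d = np.asarray(x) - mu
    return float(d @ S_inv @ d)

along = t2([1.5, 1.5])      # shift consistent with the correlation
against = t2([1.5, -1.5])   # same amplitude, against the correlation
print(f"T2 along correlation:   {along:.1f}")
print(f"T2 against correlation: {against:.1f}")
```

Two univariate charts would score both shifts identically; the T2 chart separates them, which is what allows a tool maladjustment (a shift along the known correlation direction) to be recognised as such.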
The role of measurement and modelling of machine tools in improving product quality
- A.P. Longstaff, S. Fletcher, S. Parkinson, A. Myers
- Published online by Cambridge University Press: 06 March 2014, pp. 177-184
Manufacturing of high-quality components and assemblies is clearly recognised by industrialised nations as an important means of wealth generation. A “right first time” paradigm for producing finished components is the desirable goal to maximise economic benefit and reduce environmental impact. Such an ambition is only achievable through an accurate model of the machinery used to shape the finished article. In the first analysis, computer aided design (CAD) and computer aided manufacturing (CAM) can be used to produce an instruction list of three-dimensional coordinates and intervening tool paths that translate the intent of a design engineer into an unambiguous set of commands for a manufacturing machine. However, in order for the resultant manufacturing program to produce the desired output within the specified tolerance, the model of the machine has to be sufficiently accurate. In this paper, the spatial and temporal sources of error and various contemporary means of modelling them are discussed. Limitations and assumptions in the models are highlighted and an estimate of their impact is made. Measurement of machine tools plays a vital role in establishing the accuracy of a particular machine and calibrating its unique model, but is an often misunderstood and misapplied discipline. Typically, the individual errors of the machine will be quantified at a given moment in time, but without sufficient consideration either of the uncertainty of the individual measurements or of the complex interaction between each independently measured error. This paper draws on the concept of a “conformance zone”, as specified in ISO 230-1:2012, to emphasise the need for a fuller understanding of the complex measurement uncertainty model of a machine tool. Work towards closing the gap in this understanding is described and limitations are noted.
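As a first-order illustration of how independently measured errors and their calibration uncertainties combine at a point in the working volume (a deliberately simplified two-term model with assumed values, not the full ISO 230-1 formulation):

```python
import math

# Two independently measured error sources contributing to the X-direction
# error at a commanded position (x, y). All numbers are illustrative assumptions.
x, y = 500.0, 300.0                   # mm, commanded position

x_pos_err, u_x_pos = 12e-6, 2e-6      # X positioning error per mm of travel
squareness, u_sq = 15e-6, 3e-6        # XY squareness error, rad

# First-order linear superposition of the X-direction contributions
ex = x_pos_err * x + squareness * y
# Independent sources: standard uncertainties combine in quadrature
u_ex = math.hypot(u_x_pos * x, u_sq * y)

print(f"estimated X error: {ex * 1e3:.2f} um +/- {u_ex * 1e3:.2f} um (k=1)")
```

Even this toy case shows the point the paper makes: the combined calibration uncertainty is a non-negligible fraction of the predicted error itself, so a conformance decision needs the uncertainty model, not just the error model.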
Uncertainty inherent in empirical fitting of distributions to experimental data
- G. Barbato, G. Genta, R. Levi
- Published online by Cambridge University Press: 06 March 2014, pp. 185-191
Treatment of experimental data often entails fitting frequency functions, in order to draw inferences about the population underlying the sample at hand and/or to identify plausible mechanistic models. Several families of functions are in current use, providing a broad range of forms; an overview is given in the light of historical developments, and some issues in identification and fitting procedures are considered. Except in the case of fairly large, well-behaved data sets, empirical identification of the underlying distribution among a number of plausible candidates may turn out to be somewhat arbitrary, entailing a substantial uncertainty component. A pragmatic approach to the estimation of an approximate confidence region is proposed, based upon identification of a representative subset of distributions marginally compatible with the data at hand at a given level. A comprehensive confidence region is defined by the envelope of the subset of distributions considered, and indications are given to allow first-order estimation of the uncertainty component inherent in empirical distribution fitting.
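The idea of a subset of compatible distributions can be sketched as follows. This is an assumed implementation using a Kolmogorov-Smirnov screen on a simulated sample, not the authors’ exact procedure:

```python
import numpy as np
from scipy import stats

# Fit several candidate families to the same sample, keep those compatible
# with the data at a given level, and note how much their tail estimates
# disagree. Sample and candidate set are illustrative assumptions.
rng = np.random.default_rng(4)
sample = rng.gamma(shape=4.0, scale=1.0, size=60)

candidates = {"norm": stats.norm, "lognorm": stats.lognorm,
              "gamma": stats.gamma, "weibull_min": stats.weibull_min}

compatible, tail_q99 = [], {}
for name, dist in candidates.items():
    params = dist.fit(sample)
    # KS test as a rough compatibility screen (the p-value is optimistic,
    # since the parameters were fitted from the same data)
    p = stats.kstest(sample, name, args=params).pvalue
    if p > 0.05:
        compatible.append(name)
        tail_q99[name] = dist.ppf(0.99, *params)

spread = max(tail_q99.values()) - min(tail_q99.values())
print("compatible families:", compatible)
print(f"99th-percentile estimates span {spread:.2f} units")
```

The spread of the 99th-percentile estimates across the compatible families is a first-order picture of the uncertainty component the abstract describes: the data alone cannot distinguish these models, yet their tail predictions differ.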
Performance study of dimensionality reduction methods for metrology of nonrigid mechanical parts
- H. Radvar-Esfahlan, S.-A. Tahan
- Published online by Cambridge University Press: 06 March 2014, pp. 193-200
The geometric measurement of parts using a coordinate measuring machine (CMM) has been widely adopted in the advanced automotive and aerospace industries. However, for the geometric inspection of deformable free-form parts, special inspection fixtures are used in combination with CMMs and/or optical data acquisition devices (scanners). As a result, the geometric inspection of flexible parts is a time-consuming and costly process. A general procedure to eliminate the use of inspection fixtures, based on distance-preserving nonlinear dimensionality reduction (NLDR) techniques, was developed in our previous work, in which we sought geometric properties that are invariant to inelastic deformations. In this paper we present a systematic comparison of some well-known dimensionality reduction techniques in order to evaluate their accuracy and potential for non-rigid metrology. We demonstrate that even though these techniques may provide acceptable results on artificial data in fields such as pattern recognition and machine learning, this performance cannot be extended to all real engineering metrology problems where high accuracy is needed.
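Classical multidimensional scaling is the basic building block shared by the distance-preserving NLDR methods compared in such studies. The sketch below is an illustrative re-implementation (not the authors’ code): it recovers a 2-D configuration from a pairwise-distance matrix and checks how well the distances are preserved.

```python
import numpy as np

def classical_mds(D, k=2):
    """Embed n points in k dimensions from an n x n Euclidean distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gram matrix
    w, V = np.linalg.eigh(B)                     # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]                # top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Points on a plane rigidly embedded in 3-D: MDS should recover them
# (up to rotation/reflection) with essentially no distance distortion.
rng = np.random.default_rng(5)
P2 = rng.normal(size=(30, 2))
R = np.linalg.qr(rng.normal(size=(3, 3)))[0][:, :2]   # random orthonormal frame
P3 = P2 @ R.T
D = np.linalg.norm(P3[:, None] - P3[None, :], axis=-1)

X = classical_mds(D, k=2)
D_hat = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
print("max distance distortion:", np.abs(D - D_hat).max())
```

On this exact, noise-free input the distortion is at machine-precision level; the paper’s point is that with real, noisy geodesic distances on deformed parts, such methods lose exactly the accuracy that metrology requires.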
Average outgoing quality of calibration lab facilities
- M. A. Al Reeshi, Q. Yang
- Published online by Cambridge University Press: 06 March 2014, pp. 201-207
Quality assurance is an integral part of any calibration facility. Both the calibration facility and its customers are interested in the outgoing quality of the facility’s production. In most calibration labs the inspection of calibrated items is performed according to a suitable sampling inspection policy. Some of these policies are very good at assuring the quality of the calibration services offered, but do not provide a clear assessment of the outgoing quality of the entire production of the facility. This paper develops two methods for calculating the average outgoing quality (AOQ) of a calibration lab that uses a multistage sampling inspection policy. The policy structure is presented first, along with the exact procedure for inspectors to follow and the methods to calculate the AOQ. The two methods differ from one another in the type of data required to calculate the AOQ. The first method requires the technicians’ production figures, the number of items subjected to inspection and the number of failing items found. The second method requires only the number of technicians at each level of the multistage inspection policy. The performance of the two methods is verified by building a simulation model in an Excel worksheet. The model simulates the calibration facility with appropriate parameters, and then compares the two methods with the actual AOQ. The paper further discusses the advantages and disadvantages of each method in a broader context of quality assurance.
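The first, count-based method can be sketched as follows. This is an assumed interpretation with illustrative numbers (and the simplifying assumption that failures found in inspection are reworked), not the paper’s exact formulation:

```python
# Estimate AOQ from each technician's production, the items sampled for
# inspection, and the failures found. All counts are illustrative assumptions.
technicians = [
    # (items produced, items inspected, failures found)
    (400, 40, 1),
    (350, 70, 4),
    (500, 25, 0),
]

total_produced = sum(p for p, _, _ in technicians)
escaped = 0.0
for produced, inspected, failed in technicians:
    defect_rate = failed / inspected        # defect rate estimated from the sample
    # Inspected failures are reworked; the uninspected remainder is assumed
    # to carry the estimated defect rate out of the lab.
    escaped += defect_rate * (produced - inspected)

aoq = escaped / total_produced              # average outgoing quality
print(f"estimated AOQ = {aoq:.4%}")
```

The second method in the abstract would replace the per-technician counts with the number of technicians at each inspection level, trading data requirements for precision.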
Static temperature analysis and compensation of MEMS gyroscopes
- Q.J. Tang, X.J. Wang, Q.P. Yang, C.Z. Liu
- Published online by Cambridge University Press: 06 March 2014, pp. 209-214
MEMS gyroscopes, a type of angular rate sensor, are widely used, but their accuracy tends to be low in practical applications, especially under the influence of temperature, so they generally require error compensation. Based on an analysis of the gyroscope’s operating principle, this paper shows that the resonant frequency and measuring precision of the gyroscope are dependent on temperature and temperature gradients. The paper thus proposes a compensation model based on both temperature and temperature gradients. The experimental results demonstrate that the thermal drift of the zero bias can be effectively suppressed, and that the accuracy can be improved by one order of magnitude after compensation. Compared with compensation methods based on temperature alone, the new method gives significantly better performance. The new error compensation model not only integrates the differences under different temperature conditions, but also reduces the repeatability errors. It provides a theoretical basis for accurate compensation of the gyroscope thermal error in practical applications and is applicable to other MEMS gyroscopes.
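A compensation model of the general form the abstract describes, bias = b0 + b1·T + b2·dT/dt, can be fitted by ordinary least squares. All coefficients and data below are simulated assumptions, not the paper’s measurements:

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(0.0, 600.0, 1.0)                    # time, s
T = 25 + 15 * (1 - np.exp(-t / 200)) + 0.002 * t  # warm-up + slow drift, deg C
dT = np.gradient(T, t)                            # temperature gradient, deg C/s
bias_true = 0.02 + 0.004 * T + 1.5 * dT           # assumed zero-bias law, deg/s
bias_meas = bias_true + rng.normal(0, 0.002, t.size)

A = np.column_stack([np.ones_like(T), T, dT])     # design matrix [1, T, dT/dt]
coef, *_ = np.linalg.lstsq(A, bias_meas, rcond=None)
residual = bias_meas - A @ coef                   # compensated zero bias

print("fitted [b0, b1, b2]:", np.round(coef, 3))
print(f"bias sd before: {bias_meas.std():.4f}  after: {residual.std():.4f}")
```

Dropping the dT/dt column and refitting leaves the warm-up transient uncompensated, which is the gap between temperature-only compensation and the gradient-aware model that the abstract highlights.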
The yield estimation of semiconductor products based on truncated samples
- K. Gu, X.Z. Jia, H.L. You, T. Liang
- Published online by Cambridge University Press: 06 March 2014, pp. 215-220
Product yield reflects the potential product quality and reliability: high yield corresponds to good quality and high reliability. Yet consumers usually cannot know the actual yield of the products they purchase. Generally, the products that consumers get from suppliers are all eligible. Since the quality characteristic of eligible products lies within the specifications, the observations of the quality characteristic follow a truncated normal distribution. Based on maximum likelihood estimation, this paper proposes an algorithm for calculating the parameters of the full Gaussian distribution before truncation from truncated data and for estimating product yield. The confidence interval of the yield result is derived, and the effect of sample size on the precision of the calculated result is also analyzed. Finally, the effectiveness of this algorithm is verified with a real example.
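The truncated-sample MLE and the resulting yield estimate can be sketched as follows. This is an assumed implementation with illustrative spec limits and process parameters, not the paper’s algorithm or data:

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

# Observations of the quality characteristic are truncated to the spec window
# [lsl, usl]; recover the full Gaussian by maximum likelihood, then estimate yield.
lsl, usl = 9.0, 11.0
rng = np.random.default_rng(7)
full = rng.normal(10.2, 0.8, 5000)               # unknown "true" process
sample = full[(full >= lsl) & (full <= usl)]     # only eligible items are observed

def neg_log_lik(theta):
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    norm_c = stats.norm.cdf(usl, mu, sigma) - stats.norm.cdf(lsl, mu, sigma)
    if norm_c <= 0:
        return np.inf
    # log-likelihood of a normal distribution truncated to [lsl, usl]
    ll = stats.norm.logpdf(sample, mu, sigma).sum() - sample.size * np.log(norm_c)
    return -ll

res = minimize(neg_log_lik, x0=[sample.mean(), sample.std()], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
yield_hat = stats.norm.cdf(usl, mu_hat, sigma_hat) - stats.norm.cdf(lsl, mu_hat, sigma_hat)
print(f"mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}, estimated yield = {yield_hat:.3f}")
```

Naively using the truncated sample’s mean and standard deviation would overestimate yield, because truncation shrinks the apparent spread; the MLE corrects for the missing tails.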