Six Sigma! – another milestone on the road to excellence.
David Hutchins, Chairman, David Hutchins International Limited
First published in 'Tell Me', the journal of the Quality Methods Association (QMA)
What is Six Sigma?
Six Sigma is the name given to a management concept originated by Motorola in the late 1980s, with stunning results. It enabled Motorola to become one of the first winners of the prestigious Malcolm Baldrige National Quality Award and is claimed to be responsible for very impressive improvements in all aspects of business performance.
Six Sigma successes
The results were so impressive that Six Sigma soon attracted the attention of other serious organisations in the USA and, in particular, Jack Welch, the high-profile Chief Executive of GE. There, the concept was introduced company-wide with equally impressive results. Business performance at GE is now so impressive that the company has become a benchmark for the rest of industry. Other star performers such as AlliedSignal, Navistar, Polaroid and Bombardier also developed Six Sigma programmes, soon became disciples of the concept and advocated its use down through their supply chains.
Six Sigma in the United Kingdom
With such a success record and backed by these influential names, it was bound to be only a matter of time before Six Sigma began to appear in the United Kingdom. During 1998, a number of British subsidiaries and suppliers to these large companies found themselves being introduced to the concept by their American customers with the result that Six Sigma is becoming an important new approach to business performance improvement.
How does Six Sigma relate to other initiatives?
Six Sigma could be said to be one of a family of concepts within the general umbrella of Total Quality Management (TQM). Fundamental to the Six Sigma approach is the belief that all variation, no matter how rare, incurs a cost. Ultimately, the goal is to completely eliminate all sources of error no matter how small. The concept is applied to all functions of the business both in operations, support functions, accounts and so forth.
In the early stages, the tools which people are trained to use are relatively simple. They include techniques to collect and analyse data, diagnostic tools for cause analysis, and the means to implement solutions and hold the gains. Using terms borrowed from the martial arts, teams that have demonstrated the ability to use these tools effectively in the solution of real, work-related problems are said to have achieved 'Six Sigma Green Belt' status. As progress is made, however, new techniques are introduced to enable the maturing teams and individuals to tackle more complex problems. When this level is reached, the teams are regarded as 'Six Sigma Black Belt'. Achievement of this status is a strong motivator for the teams.
All of this is set in a structured approach using goals deployment, establishing Key Performance Indicators (KPIs) with tangible metrics to conduct gap analysis and to set stretch goals.
The ultimate aim is the creation, implementation and development of an organisation in which everyone is involved in working towards their company being the best in its field.
One of the attractions of Six Sigma is the fact that the most important practical and usable techniques are mostly quite simple even for those who are not mathematically inclined. This is true even at the Six Sigma Black Belt level. Only the most sophisticated of the techniques employed require a degree in mathematics to obtain their benefits. The need for these is relatively rare and in such cases outside assistance may be sought if these are required. Experience indicates that even the crudest application of the simplest concepts can produce stunning results.
The development of the Six Sigma concept by Motorola was the result of an observation that their leading competitors in Japan, operating TQM-style management, were able to conceive, develop and manufacture new products with lead times only a fraction of those achieved by Motorola, and completely defect-free from the point of launch. It was clear to Motorola that if they allowed this situation to continue, it would not be long before they were in severe economic trouble.
They learned that the concepts used so effectively by their Japanese rivals were based on an observation by Dr Genichi Taguchi, an eminent Japanese engineer and statistician, that all variation, irrespective of the value of specification tolerances or limits, has a consequential cost. This cost increases as a quadratic function of the deviation from the target value. The only way in which these variables can be isolated and eliminated is through the use of the statistical method.
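Taguchi's quadratic loss can be sketched in a few lines of code. The cost constant k below is an illustrative assumption; in practice it would be calibrated from the cost of a part at the specification limit:

```python
def taguchi_loss(y, target, k):
    """Taguchi's quadratic loss: the cost of a deviation from target
    rises with the square of that deviation, even within tolerance."""
    return k * (y - target) ** 2

# Doubling the deviation from target quadruples the loss:
print(taguchi_loss(12, target=10, k=5))   # deviation 2 -> loss 20
print(taguchi_loss(14, target=10, k=5))   # deviation 4 -> loss 80
```

This is why Six Sigma treats "just inside the tolerance" as still costly: the loss accrues continuously, not only when a limit is breached.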
Unfortunately, in the West this was academic, because the only problem-solvers capable of using such methods were the limited numbers of qualified technical personnel. If these were to be used on the scale necessary to make such a quantum leap in improvement activity, the cure would cost more than the disease. This was not a problem for the Japanese, because they had Quality Control Circles (also known as Kaizen activities or Self-Directing Work Groups) of direct employees specially trained in such tools.
Motorola recognised that terms such as 'Quality Circles', 'Kaizen', 'Just in Time', 'Total Quality', etc. had all been badly represented in the past, so it was important to start with something fresh. The use of the term 'Six Sigma' was inspirational. It enabled Motorola to avoid all of the discredited terms. It also enabled Motorola to focus everyone on the ultimate goal of driving out variation whilst at the same time avoiding the use of another much-maligned term, Statistical Process Control.
To the statistician, the term Six Sigma is immediately recognised as a measure of the probability of finding a value which occurs so rarely as to appear only a few times in a million chances. This assumes that the data is more or less normally distributed about the average. Used with regard to programmes such as we are discussing here, the term is not intended to be so precise. The object is to convey the idea that no variation should be regarded as tolerable: we should always seek better and better ways of doing things. After all, the competition will, so really there is little choice.
Before illustrating the concept with case examples, for the benefit of those readers who are unfamiliar with statistically based techniques, it is necessary to introduce a few fundamentals and to display the basic simplicity of the statistical concepts normally involved. This applies to office operations as well as production, and the attraction of Six Sigma is that all functions and activities in an organisation can use the tools employed.
For all practical purposes, there are basically only two kinds of data:
Variable Data, and Attribute (or Countable) Data
Variable Data is exactly what it says it is: it embraces everything which varies on a continuous or finely incremental scale, such as readings from a digital or analogue instrument. Such data includes characteristics such as dimension, weight, speed, time, voltage, current, capacitance, viscosity, moisture content, lead times, credit periods, etc.
Attribute Data includes all right/wrong, good/bad, is/isn't, did/didn't, error/no error, missing/not missing etc type situations.
There is a sub-group of this type of data, called subjective data because it is assessed by the senses. It includes activities such as wine tasting, tea tasting, judging loudness by ear rather than with an instrument, feel, smell, etc.
In Japan, long before taking students on journeys into mathematics, teachers would provide them with simple practical tools which work and produce results. The Japanese have also found that over 80% of the problems which arise in industry require nothing more than this simple basic knowledge. Of the two types of data, probably 80% of all applications relate to attribute rather than variable data. All attribute data behaves in much the same way, regardless of whether we are studying such diverse matters as 'errors in invoices', 'broken gear teeth', 'missing files', 'absenteeism', etc.
Attribute data always begins from zero possible occurrences. For example, a roll of cloth may have zero, one, two, three or more oil stains, marks, or tears. In theory it could have an infinite number. A batch of invoices might contain zero, one, two, three or more errors.
If attribute data is plotted in chart form, it will always produce predictable patterns, which will vary depending upon the average number of occurrences of the feature being observed. For example, suppose a sample of 50 cans was taken at random from production, and the number found to be defective recorded in chart form. If the average number of defectives in the sample was about 7%, the result of a series of samples would appear as follows:
Note that the spread of the data leans slightly to the left but is more or less evenly distributed about the mean. Now suppose an improvement had been made to the process, and the data plotted again to see the effect of the changes. The result might be as follows. Here we can see that not only has the average decreased to 3.5%, but the shape of the distribution has changed as well, with a definite skew to the left.
The chart shows clearly that the process has been improved. Now let us take the situation one stage further, and suppose that further improvements had been made. The result might well look like this.
Again, there can be no question that improvements have been achieved. Even though the improvement is only from 3.5% to 1%, the shape of the distribution of the data is very different. This is a hypothetical example, intended simply to demonstrate the simplicity of using frequency diagrams for attribute data. Now let us consider a real example of the use of this method.
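The hypothetical cans example can be reproduced in a few lines of code. The 7% and 1% defect rates come from the example above; the number of samples and the random seed are illustrative assumptions:

```python
import random
from collections import Counter
from statistics import mean

random.seed(42)

def sample_defectives(p, sample_size=50, n_samples=500):
    """Count defectives found in repeated random samples of cans,
    each can defective with probability p."""
    return [sum(random.random() < p for _ in range(sample_size))
            for _ in range(n_samples)]

before = sample_defectives(0.07)   # before improvement
after = sample_defectives(0.01)    # after improvement

# Frequency diagrams: how many samples contained 0, 1, 2, ... defectives
print(sorted(Counter(before).items()))  # bulk of samples around 3-4 defectives
print(sorted(Counter(after).items()))   # piled up against zero, strongly skewed
```

Plotting the two tallies as histograms gives exactly the changing shapes described above: the lower the defect rate, the more the distribution piles up against zero.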
Case 1 - The leaky catheter bag!
Having to wear a catheter bag may be bad news; having to wear a leaky one is even worse, so it is hardly surprising that the company which makes this product conducts leak tests on 100% of its output. The test requires that the product is immersed in a tank of water to a depth of approximately 5½ inches. Air is then pumped into the bag at a pressure of 7 p.s.i.; bubbles indicate any leaks.
A project team concerned with the problem of leaks collected a large sample of batch cards and then plotted the number of leaky bags per shift. The result looked like this.
It was nothing like the curves shown in previous figures.
One of the team members then suggested that the chart might be concealing three separate sets of data. The question then arose: what might cause the difference? One member remarked that there were three different welding heads: one in the store, one on the machine, and the other being cleaned. Perhaps there was a head-to-head difference. When the data was re-plotted on this basis, this was found to be the case.
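The team's stratification step can be mimicked in code. The per-head leak rates, shift size and seed below are invented purely to illustrate how pooled data can conceal a head-to-head difference:

```python
import random
from statistics import mean

random.seed(7)

# Illustrative assumption: each welding head produces leaks at a
# different underlying rate (leaky bags expected per 200-bag shift).
HEAD_RATES = {"head_A": 2, "head_B": 9, "head_C": 10}

def leaks_for_shift(expected_leaks, bags=200):
    """Count leaky bags in one shift of `bags` bags."""
    return sum(random.random() < expected_leaks / bags for _ in range(bags))

shifts = [(head, leaks_for_shift(rate))
          for head, rate in HEAD_RATES.items() for _ in range(30)]

# Pooled, the data looks like one messy distribution...
print("pooled mean:", round(mean(n for _, n in shifts), 1))

# ...but stratified by welding head, the difference stands out:
for head in HEAD_RATES:
    print(head, "mean:", round(mean(n for h, n in shifts if h == head), 1))
```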
The challenge for the team was then to find out why one of the heads was better than the other two. When they found the answer, they were able to bring the other heads up to the level of the best. The total time spent on data collection and analysis was three hours, and the cost of the improvements was negligible. The benefit saved the company thousands of pounds per annum in defect costs, not to mention an improved reputation with its customer, the Department of Health & Social Security. The technique was extremely simple, and the problem was solved by a team of process workers!
Mathematically, the treatment of variable data is quite different from that of attribute data. But again, for practical purposes, in project work and for very simple control purposes, the means of plotting the data is very similar to that for attribute data, the only minor difficulty being the selection of scales on the graph.
For practical use, the resulting charts of variable data are usually very revealing. Here are a few examples.
CASE 1. SERVICE TIME – COUNTER STAFF.
When the average service time spent dealing with customers in a bank, a shop, a hotel check-in, or other similar data is plotted, the information revealed can determine the number of staff required to meet a given level of demand. It can also indicate the dispersion of transaction times for different types of service and point to possible means of improvement. For example, it is possible that the dispersion may be due to differences in the types of customer demand. By segregating these different customers, the service time for some may be greatly reduced.
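A rough staffing calculation of the kind described can be written as follows. The demand figures and the 85% utilisation target are illustrative assumptions, not rules from the article:

```python
import math

def staff_required(arrivals_per_hour, avg_service_minutes, utilisation_target=0.85):
    """Estimate counter staff needed from the measured average service time.
    Offered load (in erlangs) = arrival rate x average service time;
    dividing by a target utilisation leaves headroom for variation."""
    load = arrivals_per_hour * (avg_service_minutes / 60)
    return math.ceil(load / utilisation_target)

print(staff_required(40, 6))   # 40 customers/hour at 6 minutes each -> 5 staff
```

Segregating customer types, as the paragraph suggests, would mean running this calculation per type with each type's own (shorter or longer) average service time.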
It can be seen from the foregoing example that much can be revealed from the simplest of techniques. Techniques which can be used by anyone, without any need for mathematics or deep statistical theory.
Techniques such as those described above are taught in simple but effective educational programmes aimed at empowering the workforce. Through training in these basic skills, Japanese workers have managed to solve literally millions of problems of the level described in this text. Before moving on, let us take a look at a further example of the power of these simple techniques.
CASE 2 – BANDAGE WEAVING.
The previous example of the catheter bags was recorded during a real-life exercise on the first day of a three-day course in basic data collection and analysis for line managers and supervisors. None had any previous background in these techniques, yet in just a short time thousands of pounds of real savings had been made using the tools described.
On the second day, the participants were introduced to the collection of variable data. One group decided to check the lengths of bandages ready for shipment in the finished goods stores. Printed on the boxes, the length was specified as 10 metres. About 50 boxes were selected at random and their lengths measured. The results were as follows:
Chart to be inserted
The chart showed that, on average, the company was giving away over 1 metre of bandage per box. It was estimated that this had probably been going on for some thirty years, and had cost the company £35,000 for each separate size of bandage. For a small company, this was a great deal of money. The worst aspect was that the customer was probably totally unaware of the gift. How many people measure bandages? Certainly no one other than a Trading Standards Officer would even bother.
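The arithmetic behind the finding is simple enough to sketch. The measured lengths, box volumes and cost per metre below are invented, since the article quotes only the average giveaway and the £35,000 total:

```python
from statistics import mean

# Hypothetical sample of measured bandage lengths in metres
# (the course's actual 50 measurements were not published).
lengths = [11.2, 11.0, 10.9, 11.3, 11.1, 11.0, 11.2, 10.8, 11.1, 11.4]
NOMINAL = 10.0          # length printed on the box

giveaway_per_box = mean(lengths) - NOMINAL
print(round(giveaway_per_box, 2))   # just over 1 metre given away per box

# Assumed annual volume and material cost, for illustration only:
boxes_per_year, cost_per_metre = 20_000, 0.50
print(round(giveaway_per_box * boxes_per_year * cost_per_metre))  # pounds per year
```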
CASE 3 - PLANT MAINTENANCE.
Using exactly the same technique as described above, an engineering company obtained the co-operation of its workers to collect data on plant break downs.
Variable data was plotted for time to failure on certain components, which were known to have relatively short lives.
If failures occurred during the shift, disruption was high, costing around two hours of downtime for each occurrence. It was found that on average this happened about every 260 hours of operation. This did not seem to be a big problem, but in fact the real cost was much higher than was realised: not the direct cost of lost time, but the accumulation of work in progress (most of it not progressing). There was also the question of reduced predictability of production output, which in turn led to stock outages, and to keeping excessive stocks in finished goods warehouses to safeguard against them.
These costs appear on the balance sheet as stocks and inventory, and they also suffer depreciation and incur interest charges. Worse still, unscheduled breakdowns result in lower levels of plant utilisation. The usual counter to this would be to obtain additional plant, with appropriate manning levels. These appear as direct costs on the profit and loss account, while fixed but depreciating assets on the balance sheet tie up precious working capital.
When the failure pattern was revealed, it was agreed that the feed fingers should be replaced every 200 hours, even though they were still functional. These changes were made outside shift time, causing no disruption of work flow, and giving better predictability of scheduling, no log jams in production and lower work in progress.
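The economics of the change can be sketched numerically. The 260-hour average life and 2 hours of downtime per in-shift failure come from the example; treating a 200-hour planned change done outside shift time as preventing all in-shift failures is a simplifying assumption:

```python
def downtime_per_1000_hours(mtbf_hours, failure_downtime=2.0,
                            replace_interval=None, planned_downtime=0.0):
    """In-shift downtime per 1000 operating hours.
    With no replace_interval the component runs to failure; with one,
    changes happen on schedule outside shift time (a simplification)."""
    if replace_interval is None:
        return (1000 / mtbf_hours) * failure_downtime
    return (1000 / replace_interval) * planned_downtime

print(round(downtime_per_1000_hours(260), 1))              # run to failure: ~7.7 h
print(downtime_per_1000_hours(260, replace_interval=200))  # planned changes: 0.0 h
```

The direct saving is modest, which is the article's point: the larger benefits were in predictability, work in progress and plant utilisation.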
When collected by the workforce, this data is almost free, and the workers enjoy being involved. Breakdowns are probably as much a frustration for them as they are for the scheduling department, and for the Managing Director when he is forced to analyse his factory costs.
In Japan, it is estimated that over 80% of all problem solving requires nothing more than the application of these simple tools.
CASE EXAMPLE 4 – ‘THE CRACKED BED!’
A medium-sized company employing around 400 people in the automotive components industry experimented with the implementation of Xbar/R control charts in the manufacture of high-precision automotive components. These are required to work in an exacting environment involving high levels of stress and temperature change, and need stable physical properties. Nevertheless, whilst these demands required the very highest levels of machine capability at the final stages of production, surprisingly wide latitudes of dimensional accuracy were tolerated in the early production processes.
These inaccuracies were thought to be unimportant prior to the heat treatment process as it was thought that at this stage, scale and other surface problems were such that considerable grinding would be necessary afterwards at which time the variations could be removed.
The first experiment was conducted on an automatic lathe. In order to set up the parameters on the control chart, 10 consecutive samples of 4 items were selected from the process. Care was taken at this stage to ensure that none of the settings on the machine was altered and that operation was continuous with homogeneous raw material. On an automatic machine, this meant ensuring that the supply of raw material was uninterrupted and did not require replacement during the sample period. This is important when setting up the basic control parameters.
Measurements were taken on the items selected and the action and warning limits plotted on the chart.
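For readers unfamiliar with Xbar/R charts, the limit calculation runs along these lines. The constants are the standard Shewhart values for subgroups of 4, but the measurement data below is invented for illustration:

```python
from statistics import mean

# Standard Shewhart Xbar/R chart constants for subgroup size n = 4
A2, D3, D4 = 0.729, 0.0, 2.282

def xbar_r_limits(subgroups):
    """Centre lines and control limits computed from the set-up subgroups
    (the text describes 10 consecutive samples of 4 items)."""
    xbars = [mean(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar, rbar = mean(xbars), mean(ranges)
    return {"xbar_UCL": xbarbar + A2 * rbar, "xbar_CL": xbarbar,
            "xbar_LCL": xbarbar - A2 * rbar,
            "R_UCL": D4 * rbar, "R_CL": rbar, "R_LCL": D3 * rbar}

subgroups = [[10.02, 10.00, 9.98, 10.01], [9.99, 10.03, 10.00, 9.97],
             [10.01, 10.00, 10.02, 9.99], [10.00, 9.98, 10.01, 10.02]]
limits = xbar_r_limits(subgroups)
print(round(limits["xbar_LCL"], 3), round(limits["xbar_UCL"], 3))
```

Note that these formulae assume roughly normally distributed subgroup averages, which is precisely the assumption that failed in this case.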
It was found that several of the readings fell outside the lines which had been calculated from the data, and this did not seem to make sense. If the lines were constructed from the data, how could the data itself be outside them? At first it was thought that some error had been made in the calculations, so these were checked several times, as were the theory and the measurements of the components.
It was then realised that the formula used to construct the chart was based on the assumption that the data was distributed according to the laws of the normal distribution for process variables. When the averages of the samples of 4 were arranged in histogram form, it was found that this was not the case: the data did not appear to conform to any recognisable distribution.
The engineer consulted the setters of the machines, and also the operators. Many possible theories were suggested as to why this should be the case.
The suggestion by one setter of a 'cracked bed' proved correct. The machine had been purchased second hand, and it was apparently known to the setters that the machine bed had been cracked from the time of purchase. When asked why they hadn't volunteered this before, the reply was: "no one ever asked!"
The bed of the machine was repaired by the tool room. Again samples were taken, and still the results behaved in an abnormal way. However, whilst the engineer was in attendance at the machine, he noticed that one of the operators checked a component and decided to adjust the size, which he did with the aid of a mallet!
The tool was held in the tool holder by a simple screw. The operator slackened the screw, tapped the tool lightly with a mallet, measured the next component, then attended to another machine. At the same time, it was noticed that the surface finish on the components produced was very irregular. When the tool was inspected, it was found to have been reground to extremely peculiar dimensions, and the shape of the cutting tool was seen to vary considerably depending upon which setter did the regrinding. It was obvious that tool replacement must somehow be standardised in accordance with best practice. Both problems were solved simultaneously.
A new tool holder was purchased with a micrometer adjustment, eliminating the need for the mallet. The tools themselves, which were of hardened steel, were replaced by a new design which used replaceable, pre-ground carbide tips. This ensured the use of correct tool cutting angles and fool-proofed against unauthorised experimentation. They also required far fewer adjustments. When these changes were introduced, there was a remarkable change in the results. Not only did the machine behave precisely according to the rules of statistical theory, but the variation had been reduced to such an extent that it was debatable whether any form of in-process inspection was necessary, let alone the use of Xbar/R charts.
The concept was subsequently applied to the entire machine shop. The setters and operators who had originally viewed the whole exercise with great amusement changed their attitude first to one of interest, and finally asked to be able to use the techniques themselves.
It was not long before frequency diagram curves were appearing for all sorts of measures. One setter used the technique to prove that one brand of drill lasted 20% longer than its competitors and required fewer regrinds. This also made a significant reduction in both lead times and machine downtime.
Eventually, Xbar/R charts, attribute charts or the simplified charts were established on all key operations in the plant. In some cases, the reduction in variation eliminated the need for a number of subsequent operations. In the case of the 3" lathe, the accuracy produced eliminated the need for two grinding operations before heat treatment and an expensive grinding operation afterwards.
In addition, prior to and during the period of implementation of these methods, the company had purchased a variety of plant, but on an ad hoc basis, without much control and from vague specifications. Virtually none of the machines purchased lived up to the claims of their manufacturers, and all resulted in excessive use of production engineering time to obtain the results required.
The engineer who had been responsible for the introduction of statistical process control decided to use this method to develop a new plant acceptance sampling procedure. Statistical process capability was established on the new plant at the supplier's premises prior to shipment. The tests were then repeated after satisfactory installation at the plant. Stage payments were made to the suppliers following satisfactory results from the sampling. The final payment followed six months later, when the plant had proved capable of maintaining its contracted performance to specification. All plant purchased using this approach performed significantly more accurately, with fewer breakdowns, and for several years longer than plant purchased using traditional methods.
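A capability study of the kind used for plant acceptance can be sketched as follows. The Cp/Cpk formulae are standard; the trial measurements and the Cpk >= 1.33 acceptance threshold are a common convention offered for illustration, not the company's actual contract terms:

```python
from statistics import mean, stdev

def capability_indices(measurements, lsl, usl):
    """Process capability: Cp compares the tolerance width to the process
    spread; Cpk also penalises an off-centre mean."""
    mu, sigma = mean(measurements), stdev(measurements)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Invented pre-shipment trial data (nominal 10.00, tolerance +/- 0.10):
trial = [10.01, 9.99, 10.00, 10.02, 9.98, 10.01, 10.00, 9.99, 10.01, 10.00]
cp, cpk = capability_indices(trial, lsl=9.90, usl=10.10)
print(round(cp, 2), round(cpk, 2), "accept" if cpk >= 1.33 else "reject")
```

Repeating the study after installation, as the engineer did, guards against capability lost in transit or through poor installation.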
It is clear that the use of Six Sigma as a management concept provides considerable potential opportunities for industry.
In the words of Bob Galvin, the now-retired CEO of Motorola and Six Sigma champion:
“We did it because it was necessary if we were to confront Japanese competition, but now we do it because it represents a better way to manage our business and to involve all of our people. If the Japanese did not exist, we would have had to have invented them”!
Training in Six Sigma is one of DHI's major activities. We are one of the longest-established exponents of the concept, with a heritage which merges with the work of Dr Juran, whose teachings form the backbone of Six Sigma. All of our Six Sigma materials have been created by ourselves, and we hold the copyright on both our published materials and our methods.
Please contact us for a quotation to assist with your Six Sigma programme, and please check the Six Sigma courses advertised on this web site. They include:
Six Sigma Appreciation
Six Sigma Yellow Belt
Six Sigma Green Belt
Six Sigma Brown Belt
Six Sigma Black Belt
Transactional Six Sigma
Six Sigma Master Black Belt certification.
All of our Six Sigma courses have been certified by IAQC.