What makes some businesses better at thriving in disruption?

Process improvement and change go hand in hand. But disruption? Disruption is a word loaded with negative connotations. For everyone who embraces the idea of digital disruption, there will be hundreds who dread it.

But dread is precisely the wrong response: digital disruption isn’t the future, it’s the present, and if you think the ice isn’t breaking up under your feet, it’s just because you’re standing on a deceptively large sheet of the stuff. Be warned, the thaw is here.

It’s an irony that we are far happier with digital disruption in our personal lives than we are at work, possibly because we understand the processes of our daily lives much better. We all know it’s a pain to find a parking space and feed the right change into the meter, so we don’t mind an app that tells us where the spaces are and gives us a frictionless platform for payment.

Knowledge of the process, then, gives us the courage not only to weather change but to embrace it with enthusiasm. As Ivan Seselj says in this article, ‘Teams that clearly understand a process can spot problems and improvements more easily.’ We clearly understand and can spot problems in our personal lives, but the processes that go on around us at work may be more of a mystery. Evolved over time, their complexity is part of their charm. When disruptive technology comes along, some of us panic.

Digital disruption is more seismic to an industry than an individual organisation tweaking their processes, but like a major earthquake, it has been waiting to happen for some time. We’ve long been able to generate data, and have only been waiting for computers to get up to speed in sorting the signal from the noise. It isn’t just that there is valuable information to be crunched out of the numbers we have, it’s that the processing has crossed a threshold of affordability. Data plus analysis can now deliver real value to businesses and offer a return on investment that is giving early adopters a sudden and massive advantage over their competitors. Digital disruption is not killing businesses; late adoption of technology is killing businesses.

The inertia is understandable, even if it’s undesirable. Jeff Cole puts it well: ‘Whenever you roll out a process change of any significance that impacts people – be it positively or negatively – they need time to absorb the change and become acclimated to the new way… Getting them involved early on in the design allows them to become acclimated much faster once the rollout period hits.’

Disruption favours the agile, because only the truly agile are able to take advantage of the new opportunities it opens up. Start-ups have less fear because they have less to lose, but the big beasts should take a leaf out of their younger competitors’ playbook.

Going forward, we will be getting better data, and more intelligent analysis. We can’t change (or necessarily predict) what that will mean for our businesses, but as new possibilities unfold, we can certainly choose our response.


How to choose the right process management software


Your processes are not being consulted or used by your teams – perhaps because they’re out of date or incomplete. You know you need to change the way you manage process information, but where do you start? There are so many different BPM tools and approaches that it can be overwhelming – so much so that some companies simply revert to their current way of doing things. Don’t be that company!

Here are a few things you should definitely consider to make sure you select the best process management option for your organization.

Do the groundwork first

Before considering a BPM tool, first decide what your organization plans to achieve with its process improvement efforts. Why is it important, and what is driving the strategy to focus on a process-centric approach?

Secondly, understand your organization’s current situation and level of maturity: how are you managing processes today, and what resources are available to drive and support your process improvement efforts?

Thirdly, know what your future development requirements are as they will impact your platform selection criteria. You’ll want a tool that supports your strategic direction.

6 Criteria to consider when selecting process management software

One size doesn’t fit all when it comes to BPM software. Here are 6 criteria to consider when you’re evaluating which process management software to invest in to reach your process improvement goals.

Think about who is going to use the software and their communication preferences. Will it be used by technical process specialists or by frontline staff? The interface should make it easy for users to engage with the tool; if it’s too hard, it won’t be used. Look for functionality that saves the user time and effort, like automatic generation of a process map from the user’s text input, and a search function that improves ease of use rather than causing frustration.

Most processes are not static, so you’ll also need to think about change management. For example, automatic notifications displayed via a dashboard can give users visibility of the current state of their processes, and give managers confidence that the right people are being notified of process changes.

Your process management software should give business analysts information that can support their efforts to make processes lean, and help them utilise tools like Six Sigma and Kaizen. If this is an approach you plan to adopt, you’ll need to be able to identify opportunities to reduce waste, remove non-value add activities and spot cost reduction opportunities.

To facilitate this analysis of current processes and identification of improvement opportunities, the BPM tool should be made available to the whole organization. Your teams know your processes inside out and can weigh in on what’s not working in your as-is processes, so you can plan your to-be processes. This is a critical component in identifying opportunities for automation and improvement.

Your process management software should provide version control that records who uses the system, and tracks changeable components for auditors. Governance is critical to the success of your process improvement efforts so make sure the tool you select makes this easy. To drive engagement across teams, translate measures of success into people’s job reviews and key performance indicators so you don’t find yourself back where you started with out-of-date processes that no one uses.

Content Management
Collaboration is key to process improvement. Consider how the BPM tool will enable and encourage active collaboration and the exchange of ideas, and how it will accommodate and track suggestions for improvement and feedback. These should be traceable, and successful execution should be tracked.

Providing teams with access to process information where and when it’s needed is a challenge for many organizations.  Creating a single location and one version of process truth will help drive engagement with your processes. Ideally, process guidance should be available in the places and systems that teams already visit every day, like the company intranet or via URL links in ERP systems or in CRMs like Salesforce. Think about how your teams will access process information. Where and when will they need it? How can process management software enable the accessibility you require for your teams?

Your process management software will need to grow with you as your business expands and changes. This is frequently where tools like Word, Visio and PowerPoint can’t keep up as process mapping tools, because they aren’t scalable. Beyond the technology, your rollout plan should also include the amount and type of resourcing you’ll need to implement and embed the technology.

Cost should never be the only selection criterion for your business process software decision. Regardless of the cost, you should question the return on investment if you don’t start to see small wins within a short period of time. You also want a vendor that treats your investment like their own and gives you a voice to table suggestions. A flexible provider takes you with them on the process improvement journey and should be quick to respond to your queries.

There’s no question. The right business process management software can enable your process improvement efforts, but only if you select the right tool and the right process management approach for your business. Thinking about your BPM requirements from the perspective of usability, governance, analysis, content management, technology and cost will help ensure you select the right approach to meet your current and future business process management requirements, and will ultimately ensure maximum return on your process investment.

Spring Cleaning Fancies Lean Six Sigma 5S

There is something awe-inspiring about having a home and garage that are neat and clean. It goes beyond having a beautiful or expensive home; it actually brings more value to who you are as an individual or a family. As an individual, it says you take pride in your home, and as a family it says we are all working together in harmony.

Luckily, Six Sigma has a magic tool to help you achieve a clean, well-organized household. It is the Lean 5S tool.

The Ease of The 5S Tool Applied to the Home

Here’s a quick breakdown of the 5S tool as applied to your home:

Sort (Seiri): Remember, Rome wasn’t built in a day, so choose one small area at a time to tackle. Choose an area that has been particularly bothersome and start. Let’s take your home office. Label three separate boxes Trash, Donate, and Give away.

Categorize anything you haven’t used in the last couple of years or don’t have an interest in, such as old fax machines, books on topics that don’t interest you, old cellphone cables…you get the idea. Also categorize any paperwork that doesn’t serve you any longer (mainly because of online and internet storage). This excludes tax returns for the last seven years — those you must keep in a safe place!

Straighten (Seiton): Organize the items you have decided to keep in such a way that the frequently needed items are easily obtained. For example, keep trays for:

  • Unpaid Bills
  • Paid Bills
  • Important to Remember — You can use a fill-in calendar to accompany the important to remember tray.

Shine (Seiso): Clean, dust, and organize so that frequently used items are easy to reach and everyone can see where they are located. This will eliminate the waste of buying something over again.

Standardize (Seiketsu): This Lean Six Sigma tool works great in a garage, where storage takes place. The garage is often where you store items that aren’t frequently needed, so until you really need them, they are buried and hard to get. This Lean 5S tool can be used to create a more efficient and effective way of placing and locating such items.

Sustain (Shitsuke): Everyone in the family is held accountable for maintenance. This could include the once-a-month chore of making sure all items are well organized and in their place. The Sustain part of the Lean 5S tool is easy as long as there is consistency in the chore.

The PDCA Cycle as Part of the Continuous Improvement Process

There are many great tools and templates used in Six Sigma, and today we are going to spend a little time with PDCA (Plan-Do-Check-Act). PDCA is a template, or cycle, used for problem solving. W. Edwards Deming originally created PDCA back in the 1950s, intending it to be used with a continuous improvement procedure to help rebuild Japanese industries.

PDCA: The Problem Solving Cycle


P (Plan): In this stage, the purpose is to investigate the situation in its current state. After you clarify the nature of the problem, define the problem you want to fix. Identify potential root causes. Capture and analyze real data, and write a mission statement.

D (Do): In this stage, you want to make the team aware of the gravity of the real problem. Use and analyze the data you have collected, and make sure you are defining and implementing a solution plan.

C (Check): Here you want to monitor and evaluate the effect of the current implementation. Seek countermeasures to improve the current solution. Once again, collect current data with the improvements in place and if need be, train those who are directly affected by the new solution plan.

A (Act): Continuously monitor the new performance measures and decide if any adjustments need to be made. If adjustments are needed, integrate them into the new working practice. If the solution is not working, abandon the plan, ask the team what was learned, and start a new PDCA cycle with a new target goal.

Problems usually accumulate over time, and if you use continuous improvement tactics (such as kaizen) as a normal part of the business culture, you will truly circumvent many serious issues.


For those interested in a career that can take them anywhere around the world, project management might provide the answer. It’s also a great career for those who are willing to stay in the profession and reap the benefits of experience.

That’s especially true for those who earn certification in project management and process improvement methodologies such as Six Sigma.

The 2018 Project Management Salary Survey from the Project Management Institute (PMI®) offers a look at the opportunities around the world for those wanting to make a career in project management. The results show that salaries have increased for 70% of those surveyed in the past 12 months.

And 26% reported increases of 5% or more during that same period.

Global Reach

The project management profession has taken flight around the world. The PMI survey included 33,000 project management workers in 37 countries.

PMI estimated last year that by 2027, organizations around the world will need a staggering 88 million people working in project management. Of that number, about 75% will be in China and India, two countries with large populations that are experiencing massive growth.

Of the workers included in the 2018 salary survey, those with certification in project management reported salaries that are 23% higher on average across all countries.

Six Sigma and Project Management

PMI has done studies on the connection between Six Sigma and project management, a topic the organization says has “not yet received the attention it deserves in project management conferences and publications.”

Six Sigma methodologies such as DMAIC and DMADV have been a main component of successful strategies for many businesses, including Motorola, Toyota, Boeing and DuPont. The benefits of Six Sigma include an understanding of what customers require from a product or service, eliminating mistakes in processes, improving product quality and delivery, reducing waste and costs and a commitment to continuous process improvement.

Six Sigma can be applied to a project to manage scope, quality, cost and time. PMI reports that the methodology has such potential in project management that it “may have the appearance of an impossible dream.”

That’s not the case, however, and those who apply Six Sigma coupled with project management certification present very attractive job candidates to potential employers.

Around The World

Where are the opportunities in project management? About a third of the 33,000 project management professionals included in the PMI salary survey work in the United States. However, there also were large numbers of project managers in other countries. Those with the most included:

  • Canada
  • India
  • Spain
  • Italy
  • Germany
  • Australia
  • Brazil
  • United Kingdom
  • China
  • France
  • Japan

The highest median annual salaries – expressed in U.S. dollars – were in Switzerland ($130,966), the U.S. ($112,000), Australia ($108,593) and Germany ($88,449). However, median salaries in 15 countries were over $70,000 a year, and that includes project managers at all levels of experience.

Experience Pays Off

The survey also showed the value of not only getting certified in the profession, but also of sticking with it. In every country, experience translated into significant increases in salary.

Salaries in the U.S. provide a good example. According to PMI, these are the median salary levels for project management professionals in the U.S., based on years of experience.

  • Less than 3 years: $75,000
  • Three to 5 years: $85,000
  • Five to 10 years: $100,000
  • 10 to 15 years: $115,000
  • 15 to 20 years: $122,000
  • 20 years or more: $130,000

That trend repeats itself across almost every country. Of the 37 countries surveyed, 12 reported median annual incomes of more than $90,000 after 20 years.

Clearly, project management offers a secure and potentially lucrative career path. It also gives people an opportunity to work around the world in a profession that transcends political borders and unites people on the shared mission to improve how organizations operate.


Where once only our computers, phones and gadgets were connected to the internet, the internet of things (IoT) is now making it a truly connected world.

Cars, appliances, road signs, weather stations, pollution detectors, medical patients, even farm animals – if anyone or anything can have a sensor put on it, it can be a part of IoT.

IoT is part of what is called Industry 4.0, or the fourth major upheaval in the history of industry. It is most associated with manufacturing but can be applied to many other industries.

Industry 4.0 involves the following factors:

  • The rapid increase in the amount of data, computational power and connectivity (IoT is a major driver in this area)
  • The increased use of data analytics to drive business strategy
  • The emergence of machine learning and artificial intelligence
  • Advances made in transferring digital instructions to the physical world (obviously, another big IoT area)

Uses of IoT

What all this means for businesses is the ability to gather information from objects and people in the “real world” through sensors that share information with each other as well as being connected to the internet.

The data gathered from the IoT will mean even more information for organizations to use in streamlining operations.

For example, it can help supply chain managers know where trucks, trains and ships are in the supply route, as well as the amount of inventory in a warehouse. For healthcare operations, sensors worn by patients can transmit real-time data on their physical condition to medical professionals far away. It’s also the technology behind the invention of autonomous cars.

By 2020, projections call for 26 billion devices to be connected to the IoT.

Lean Six Sigma and IoT

As with all things involving data, the challenge has moved from collecting and storing data to figuring out the best ways to put it to use. That’s where the tools and techniques of Lean Six Sigma can come into play.

With the IoT, the amount of data available will grow even larger than it already is today. For example, retailers will have finely grained information on customers’ buying habits. Manufacturers will have details on every phase of any process. Transportation officials will have a better grasp on where problems are occurring in a city and how to allocate tax dollars to improve roadways and provide better public transportation.

However, before it can be used, data must be cleaned and formatted in a way that it can be analyzed for the type of insights that can improve a business. That’s the first step in any Lean or Six Sigma process – analyzing the data to find out where an operation stands and what changes need to be made.
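
A rough illustration of that first cleaning step, as a minimal pandas sketch – the file name and column names ("timestamp", "temperature") are hypothetical:

```python
# Minimal sketch of cleaning raw IoT sensor data before analysis.
# File and column names are hypothetical.
import pandas as pd

raw = pd.read_csv("sensor_readings.csv")
raw["timestamp"] = pd.to_datetime(raw["timestamp"], errors="coerce")
raw["temperature"] = pd.to_numeric(raw["temperature"], errors="coerce")

clean = (
    raw.dropna(subset=["timestamp", "temperature"])  # drop rows that failed to parse
       .drop_duplicates()                             # remove repeated transmissions
       .sort_values("timestamp")
)

# Resample to hourly means so the stream is in a shape analysts can work with
hourly = clean.set_index("timestamp")["temperature"].resample("1h").mean()
print(hourly.head())
```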

By aligning IoT and Industry 4.0 with Lean Six Sigma methodology, organizations can better leverage vast amounts of data to make operations more efficient and provide better products and services to customers.

Applying Lean Six Sigma

In a way, IoT should be viewed as another channel to gather data. Turning that data into actionable business intelligence still requires a consistent and successful strategy. That means using proven methods such as Lean Six Sigma.

Six Sigma focuses on eliminating defects in a process. Lean focuses on cutting out waste in a process by keeping only those steps that add value to the end user. With the big data that IoT will provide organizations, the tools and techniques provided by Six Sigma are more important than ever.

For example, studies have already been done on applying Lean Six Sigma to supply chains. Managing supply chains has become more complex and competitive, with businesses now having a global reach and all of them trying to beat each other on providing the best service.

In one such study, researchers found that a “Lean Six Sigma approach in global supply chain using Industry 4.0 and IoT creates an ideal process flow that is highly optimized as well as perfect and free from defects and wastage.”

While only a theory, the study provides a blueprint for those wanting to couple Lean Six Sigma quality control and continuous process improvement with the potential of IoT and Industry 4.0.

Three Areas Of Focus

Businesses and consultants have started to dig deep into the use of Lean Six Sigma and the expansion in the amount of available data. The demand for quality control and process improvement is high. Otherwise, organizations might find themselves lost in a forest of data.

In The Manufacturer, a website that focuses on the manufacturing industry, three areas were identified where Lean Six Sigma can accelerate implementation of new technologies:

  • Ensuring that customer engagement remains forefront as organizations move into Industry 4.0
  • Using the lessons learned from previous advances in technology to optimize the use of IoT
  • Using risk assessment in the implementation of IoT and Industry 4.0

IoT and Industry 4.0 offer businesses advantages in data collection and potential business intelligence. However, the need for a quality process to be in place has not diminished. Managers and employees with knowledge in Lean Six Sigma continuous process improvement are better positioned to take an active role in ensuring that new technologies are incorporated into an operation in a meaningful way.

They also will understand that no leap forward in technology will give an organization an advantage without having a solid process in place, such as the framework provided by Lean Six Sigma.

Check Imaging Improvements with Lean Six Sigma

Moving information electronically offers tremendous opportunities for cost efficiencies and improved levels of customer satisfaction for the banking industry. But paperless processes also carry heightened risks for failure. A single process breakdown, when automated within a high-speed electronic world, can result in problems for hundreds of thousands of customers. One such process is check imaging – potentially a huge opportunity for reduced costs and happier customers, but if poorly executed, a huge legal liability. This is a banking issue of global proportions. Deutsche Bank’s Alex Brown projects that a quarter of the world’s large banks will be implementing check imaging within the next two years.

In this case study Lean Six Sigma was used as the method of extracting the benefits of the imaging process, while negating the potential headaches that can occur in a poor process.

What is Lean Six Sigma? Many people have heard of Six Sigma and know that it deals with reducing defects, improving quality and eliminating variation. Lean is a discipline focused on improving process speed and eliminating waste. Lean Six Sigma is the synergistic union between the two, as quality improves speed and speed improves quality. By integrating Lean speed and Six Sigma quality, the rate of improvement in quality, cost and speed increases much faster and goes much further than either discipline could achieve separately.

Reducing the Risk with Paperless Transactions

The required rate of improvement in this case study was urgent. The stakes were high for the client. Undoubtedly, the business case for imaging checks is strong. Supplying customers with electronic images of canceled checks instead of returning the actual checks significantly improves costs by reducing outlays for staff, transportation and postage. Additionally, the case for imaging relates to improved customer service, transaction speed and fraud prevention. Moreover, it is a stronger business case today than it was a few years ago. Fraud has increased – it currently costs the industry between $12 billion and $16 billion a year, according to Carreker Corp. And the cost of electronic storage is now from one-tenth to one-twentieth as expensive as it was just four years ago, according to Bank Administration Institute (BAI).

But poor execution of check imaging carries a huge downside. It can impact a bank in three broad areas: Customer retention, cost and legal ramifications. In this case:

  • Unhappy customers – Image mismatching (i.e. when a customer’s check is incorrectly matched to another customer’s account) was causing customers to turn away. The service failed to meet customers’ expectations and was causing concern about security. As customers began to use the service more, delays in accessing their checks became an issue.
  • Higher costs – The value proposition for imaging is rooted around its impact on efficiency and cost. In this case, the high degree of mismatching was requiring the efforts of an entire full-time team to rectify the issues.
  • Potential lawsuits – The Privacy Act and the Patriot Act have increased the need for information to be accurately stored and appropriately communicated. Violating the Privacy Act – albeit in error – leaves financial institutions open to lawsuits from those individuals and corporations whose information went astray.

Identifying Levers for Improvement

In the beginning the client was not focused on mismatching, but rather simply on defects within the imaging process. They knew they had a problem, but weren’t sure where. There were three main areas of focus that determine quality in the imaging process:

  • Images are available.
  • Images received are correct (alternative: mismatched images).
  • Images are legible.

A Pareto analysis was used to graphically depict the contribution of each type of defect to the total. The output: 62 percent of defects fell under the mismatch category.

Pareto Analysis of Image Defects

As illustrated in the figure above, the greatest opportunity lay in solving the issue of mismatches.
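
That kind of Pareto breakdown is straightforward to reproduce. The sketch below uses hypothetical defect counts, chosen so that mismatches account for roughly 62 percent of the total:

```python
# Minimal sketch of a Pareto analysis of imaging defects.
# Counts are hypothetical, chosen so mismatches make up ~62% of defects.
import pandas as pd

defects = pd.Series({
    "Mismatched image": 620,
    "Image not available": 240,
    "Illegible image": 140,
})

pareto = defects.sort_values(ascending=False).to_frame("count")
pareto["percent"] = 100 * pareto["count"] / pareto["count"].sum()
pareto["cumulative_percent"] = pareto["percent"].cumsum()
print(pareto)
```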

What kind of opportunity exists in banking today? To have a Six Sigma level of image integrity means 99.9997 percent of images are properly indexed. Currently, banks are between three and four sigma. In other words, for every one million items a bank processes per day, between 6,210 and 66,800 images are defective, and up to 41,416 images are mismatched.
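
Those figures follow from the standard sigma-to-DPMO conversion with the conventional 1.5-sigma shift. A quick check, as a minimal sketch assuming SciPy is available:

```python
# Minimal sketch: defects per million opportunities (DPMO) at a given sigma level,
# using the conventional 1.5-sigma long-term shift.
from scipy.stats import norm

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities at a given sigma level."""
    return norm.sf(sigma_level - shift) * 1_000_000

for sigma in (3, 4, 6):
    print(f"{sigma} sigma: ~{dpmo(sigma):,.0f} DPMO")

# At three sigma, ~66,800 defects per million; ~62% of those are mismatches here
print(f"Mismatches at 3 sigma: ~{0.62 * dpmo(3):,.0f} per million items")
```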

From Root Cause to Solution

In this case study, the key was to identify what lay behind the mismatching errors. The team documented two root causes. The first was network failures. The second arose when there was a major hardware malfunction at the time of capture and the sorter operator did not follow proper recovery procedures.

The first action was to implement a critical-to-customer safeguard, ensuring the errors did not pile up at the customers’ doors. The team concluded that by implementing a tracking number checkpoint, mismatches would be caught during capture – in other words, before they reached the customer. The mismatches could then be quickly rectified. The second action was to implement improved training and sorter operator incentives to ensure proper recovery procedures at the time of a malfunction.

The team also built a value stream map of the process which included key performance data such as wait times, setup times, rework loops and processing times. Creating the map highlighted wasted time and effort that usually isn’t apparent, even to people embedded in the process. Each step was categorized into value-add or non-value-add, and the team was able to identify and eliminate significant waste and dramatically decrease cycle time by removing non-value-add steps in the process. All told, the team was able to cut the cycle time by 50 percent, raising productivity levels and driving out $500,000 in cost.
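
The cycle-time arithmetic behind a value stream map can be sketched very simply; the step names, durations and value-add flags below are hypothetical:

```python
# Minimal sketch: cycle-time impact of removing non-value-add steps from a
# mapped process. Step names, durations (minutes) and flags are hypothetical.
steps = [
    ("Capture image",    4.0, True),
    ("Wait for batch",   6.0, False),
    ("Index image",      3.0, True),
    ("Manual re-keying", 4.0, False),
    ("Quality check",    3.0, True),
]

total = sum(minutes for _, minutes, _ in steps)
value_add = sum(minutes for _, minutes, is_va in steps if is_va)

print(f"Current cycle time: {total:.0f} min")
print(f"After removing non-value-add steps: {value_add:.0f} min "
      f"({100 * (total - value_add) / total:.0f}% reduction)")
```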

MSA: Be Sure that Your Data Is Valid

When I was trained as a Green Belt and Black Belt, I didn’t fully understand the value of measurement system analysis (MSA). The concept worked, but I thought it was something you do in one project out of 10 – only when you have an obvious opportunity to prepare a study with the possibility of assessing repeatability and reproducibility.

As time went by, I was trained as a Master Black Belt. Working in Anheuser-Busch InBev, I started to sit through Green Belt project coaching sessions. I coached projects myself and then began teaching Green Belts and Black Belts. And during each Measure phase discussion I heard, especially from non-manufacturing project leaders: “Yes, but how does MSA apply to me? I improve market share, I can’t use gage R&R there.”

Having worked with several providers of training content, I still feel that a good general overview of the question is missing. “Hard stuff” such as gage R&R or attribute agreement analysis explanations are easy to find, but there isn’t good guidance on how to do MSA in “softer” cases.

This article attempts to provide a high-level overview of how to choose the right method for data validity checks.

Select the Right KPI

Any improvement project has to deal with data by definition. Peter Drucker, the founder of modern management, said, “You can’t improve what you don’t measure.” Well, let’s be precise. Without data, you may be able to improve things, but you’ll never prove you have succeeded. This is why in all project management trainings, SMART (specific, measurable, achievable, relevant, time-bound) objectives are discussed.

What also matters is the right level of the key performance indicator (KPI). In manufacturing, every company measures its performance on the factory level and on the individual machine level. However, this is not enough for process improvement.

Manufacturing Example

Consider a manufacturing example. A manufacturing company wants to improve the overall equipment efficiency (OEE) of a packaging line. We use the Pareto principle to get to the bottom of things. If there is no granular measurement system in place, it has to be established; but let’s say we have one. We break down OEE into categories and see that planned downtime accounts for 30 percent of total capacity loss; we spend too much time on changeovers. The next step would be to identify the number of changeovers and the duration of each, and then apply the single minute exchange of dies (SMED) approach to reduce changeover time. The right primary metric for this project could be average changeover time. It has to be measured reliably. How can I be sure to trust this figure?
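
As a minimal sketch of that breakdown, with purely illustrative numbers for a single shift:

```python
# Minimal sketch: break a shift's capacity loss into categories, Pareto-style,
# then compute the chosen primary metric (average changeover time).
# All numbers are illustrative.
scheduled_minutes = 480  # one shift

loss_minutes = {
    "Planned downtime (changeovers)": 144,  # ~30% of scheduled time
    "Unplanned breakdowns": 60,
    "Speed losses": 40,
    "Quality losses": 20,
}

for category, minutes in sorted(loss_minutes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{category}: {100 * minutes / scheduled_minutes:.0f}% of capacity")

changeovers = [32, 41, 36, 35]  # minutes per changeover observed during the shift
print(f"Average changeover time: {sum(changeovers) / len(changeovers):.1f} min")
```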

Non-Manufacturing Example

Now let’s take a non-manufacturing example. The logistics department of a fast-moving consumer goods (FMCG) company wants to reduce the number of on-time, in-full (OTIF) non-compliance incidents caused by transport-related issues. A service-level KPI is not granular enough to provide the required visibility; we need to look at categories within the service level. However, targeting the share of transport-related issues within the total number of issues is wrong, because this KPI could improve simply when other OTIF categories deteriorate. The right KPI here would be the percentage of transport-related issues relative to the total number of shipments, as the example below shows.
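
A tiny worked example with made-up figures shows why the denominator matters:

```python
# Minimal sketch: why "share of transport issues among all issues" is the wrong KPI.
# All figures are made up.
shipments = 10_000
transport_issues = 120
other_issues = 280

share_of_issues = transport_issues / (transport_issues + other_issues)  # 30%
rate_per_shipment = transport_issues / shipments                        # 1.2%

print(f"Share of all issues: {share_of_issues:.0%}")
print(f"Transport issues per shipment: {rate_per_shipment:.1%}")

# If other categories get worse (280 -> 480) while transport stays at 120,
# the first KPI "improves" to 20% even though nothing actually got better.
```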

So, we selected the right KPI. Then what? Let’s measure the baseline, improve the performance and demonstrate the difference, right? Not quite. We still need to make sure we can trust these figures.

Select the Right Data Validation Method

If you have a project related to KPI improvement, how do you validate the accuracy of the data? In general, you have to consider if you can plan the experiment or not. However, even if you only have historical data, much can be done to evaluate the level of trust in the measurement system.

Figure 1 is an attempt to provide an algorithm for getting to the right validation method.

Figure 1: Decision Tree

Let’s illustrate this decision tree with a few examples.

1. A beverage company produces bottles with twist-off caps. Twist-off torque needs to be measured. It is continuous data that is measured directly with a device. However, measurements for the same bottles cannot be retaken, as it is a destructive test. Therefore, the nested gage R&R method needs to be applied.

2. A beer company produced a limited volume of a special brand in the last few years. Due to customer complaints, we suspect there were some issues with the bitterness of the product. The beer is not produced anymore and we don’t have access to the quality control facility, but we do have historical data. Bitterness could be measured continuously, but we can’t retake the tests. We have to do exploratory data analysis to make an assumption about whether we can trust the figures.

3. A printing company uses liquid inks during production. In order to provide good quality, inks should have the right viscosity, which is measured by printing operators at the machine. Viscosity is measured indirectly as the number of seconds it takes for ink to run out of a standard measuring funnel. It is continuous data that is measured directly. We can evaluate both repeatability and reproducibility. If we have balanced data (i.e., equal number of measurements per sample between and within operators, and/or measurement devices), we go for a traditional gage R&R crossed study.

If the data is not balanced, or we want to take additional factors into account (for example, interactions between different operators and different measuring devices), we go for an expanded gage R&R.
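
For the balanced crossed case described above, the variance components can be estimated from a two-way ANOVA. Here is a minimal sketch using statsmodels; the column names ('part', 'operator', 'viscosity') and the helper function are hypothetical, not a specific vendor’s implementation:

```python
# Minimal sketch of an ANOVA-based crossed gage R&R study on a balanced dataset.
# Assumes a DataFrame with hypothetical columns 'part', 'operator', 'viscosity'.
import numpy as np
import statsmodels.api as sm
from statsmodels.formula.api import ols

def crossed_gage_rr(df, n_parts, n_operators, n_replicates):
    model = ols("viscosity ~ C(part) * C(operator)", data=df).fit()
    anova = sm.stats.anova_lm(model, typ=2)
    ms = anova["sum_sq"] / anova["df"]  # mean squares per source

    ms_error = ms["Residual"]
    ms_inter = ms["C(part):C(operator)"]

    # Standard variance-component estimates (negative estimates truncated at zero)
    var_repeatability = ms_error
    var_interaction = max((ms_inter - ms_error) / n_replicates, 0.0)
    var_operator = max((ms["C(operator)"] - ms_inter) / (n_parts * n_replicates), 0.0)
    var_part = max((ms["C(part)"] - ms_inter) / (n_operators * n_replicates), 0.0)

    var_grr = var_repeatability + var_operator + var_interaction
    var_total = var_grr + var_part
    return 100 * np.sqrt(var_grr / var_total)  # %GRR of total variation

# Usage (df built elsewhere): print(f"%GRR = {crossed_gage_rr(df, 10, 3, 2):.1f}%")
```

As a common rule of thumb, a %GRR under about 10 percent is considered an acceptable measurement system, while values over 30 percent call for improvement.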

4. A steel company tries to reduce its total electricity consumption, which is measured by the factory meter. The meter is calibrated, but to what level can we trust it? It is continuous data that is measured directly, but there is no way to re-test it. The only way to verify is to find an alternative measurement system and then compare the figures using a paired t-test. This alternative could be the electricity provider’s meter or the sum of the individual meters installed within the factory (in both cases, power grid losses should also be taken into account).

This method will never tell you exactly how much you can trust the measurement system. What you will get, however, is a confidence interval for the difference between two or several values from different measurement systems, so you can decide whether you are comfortable with that difference. Important note: analysis of variance (ANOVA) and the t-test assume normality of data. For non-normal data, use the respective non-parametric tests.
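
A minimal sketch of that comparison, with made-up monthly readings (in MWh) for the factory meter and the provider’s meter:

```python
# Minimal sketch: paired t-test and confidence interval for the difference between
# two measurement systems. Readings (in MWh) are made up.
import numpy as np
from scipy import stats

factory  = np.array([812, 790, 845, 830, 805, 798, 821, 836])
provider = np.array([818, 795, 851, 828, 811, 804, 826, 842])

t_stat, p_value = stats.ttest_rel(factory, provider)

diff = factory - provider
ci_low, ci_high = stats.t.interval(0.95, df=len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"95% CI for mean difference: [{ci_low:.1f}, {ci_high:.1f}] MWh")
```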

5. A pharmaceutical packaging line has a control unit that rejects every package with a weight under the tolerance (this can happen if a leaflet is missing). Discrete output (accept/reject) is the function of the continuous input (weight in grams). Therefore, to assess the accuracy of this control unit, an attribute gage study method needs to be applied.

6. During recruitment interviews, assessors evaluate applicants’ cultural fit on a scale from 1 to 5. Based on this and other factors, a different person may decide whether to offer the job to a candidate. In the first case, it is a discrete ordinal measurement (you can only have integer numbers). In the second case, it is discrete binary (yes/no). In both cases, you can apply attribute agreement analysis.
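
For the binary hire/no-hire case, a very small agreement check between two assessors might look like the sketch below; the ratings are made up, and a full attribute agreement analysis with more assessors and repeated ratings would go further (e.g., Fleiss’ kappa):

```python
# Minimal sketch: agreement between two assessors' yes/no decisions using
# percent agreement and Cohen's kappa. Ratings are made up.
from sklearn.metrics import cohen_kappa_score

assessor_a = ["yes", "no", "yes", "yes", "no", "no", "yes", "no", "yes", "no"]
assessor_b = ["yes", "no", "no",  "yes", "no", "yes", "yes", "no", "yes", "no"]

agreement = sum(a == b for a, b in zip(assessor_a, assessor_b)) / len(assessor_a)
kappa = cohen_kappa_score(assessor_a, assessor_b)

print(f"Percent agreement: {agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")  # for the 1-5 ordinal scale, weighted kappa helps
```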

Some Examples of Exploratory Data Analysis

A few simple graphical analysis tools can tell you more than thousands of words. It has been said that there are only three rules about data: “First rule: Plot the data! Second rule: Plot the data! Third rule: Plot the data!” For our purposes, that means to look at the data distribution, check how the process behaves with time and look for some abnormalities – these steps can help you identify if there is anything wrong with the data. Two examples follow.
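
A minimal sketch of those first plots, assuming a pandas Series of daily measurements (synthetic data here):

```python
# Minimal sketch of "plot the data": a histogram for the distribution and a
# time-series plot to look for shifts, spikes and flat spots. Data is synthetic.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
values = pd.Series(rng.normal(15, 1.5, 200),
                   index=pd.date_range("2024-01-01", periods=200, freq="D"))

fig, (ax_hist, ax_ts) = plt.subplots(1, 2, figsize=(10, 4))
values.plot.hist(bins=30, ax=ax_hist, title="Distribution")  # multi-modality? rounding spikes?
values.plot(ax=ax_ts, title="Time series")                   # shifts? weekly averaging?
plt.tight_layout()
plt.show()
```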

Example 1: Bitterness of Beer

Minitab’s graphical summary in Figure 2 below shows that this data is not normal. There is a tri-modality of the data (distinctive spikes in three places) that could be caused either by shifts in the process or by other factors.

Figure 2: Summary for Bitterness

No natural process behaves like this; therefore, we cannot draw any reliable conclusions before we understand the nature of this abnormality. Could the spikes at 12, 15 and 18 units have any connection with the process specifications? Yes. The lower tolerance is at 12 and the upper tolerance is at 18, which puts the middle value (and target) at 15. This can be seen in the capability analysis shown in Figure 3.

Figure 3: Process Capability of Bitterness

Certain data manipulation had taken place: the measured values (especially those exceeding the upper specification) were rounded to a “comfortable” figure. Fortunately, the capability analysis shows what the true process looks like.

The practical conclusion here is that the historical data was not reliable. The entire process of taking and recording measurements needed to be improved.

Example 2: Use of Glue

One of the operations in beer packaging production is the application of labels to the bottles. Melted glue is used for this, and its consumption is controlled and reported. For the glue consumption optimization effort, look first at the historical data (Figure 4).

Figure 4: Historical Data for Glue Consumption

Without even getting a p-value, it is clear that the data is not normal. Why might that be? A look at a time series (Figure 5) might help explain.

Figure 5: Time Series for Glue Consumption

What the histogram showed was a weekly averaging-out – not a picture of daily consumption. Actual measurements were clearly taken once a week, and each figure was spread evenly across several days. There was a requirement for daily consumption reporting; however, nobody seemed bothered by identical figures from day to day.

As a result, actual daily measurements were introduced, making much more detailed information available for future process optimization.


No matter what process we optimize, we always need data. Whenever we use data, we must always question its accuracy. If statistical analysis can be conducted to estimate measurement error, it should be done. If it can’t be done, we should seek out indirect ways to understand the level of trust we can have in the given measurement system.

The No. 1 Reason Why Continuous Improvement Projects Fail

In a previous article I wrote about the reasons why so many lean manufacturing, Six Sigma, and other improvement programs fail. In this article I’m going to expand on reason No. 1: the Academy Award Syndrome.


Academy Award Syndrome

The Academy Award Syndrome is where a program or project is launched to much fanfare, ceremony, and expense, but six months later all that remains is a bunch of faded posters on the wall, boxes of expensive and unused workbooks, and an even more cynical and jaded bunch of employees than we had before.

In our society we are generally becoming more cynical. We are certainly overwhelmed with the launches of new initiatives from our politicians, from companies that are trying to sell us new products, and from the daily media cycle that supports these launches.

Within a company our new project or initiative that we are all enthusiastic about is probably not the first that has ever occurred. If previous initiatives have been launched and then abandoned, our employees can be very cynical, and this can cripple our program.

When we launch a big initiative in the textbook way, as prescribed in many books written about the Toyota Production System, we have top management announce it to show their commitment. We have training, we have posters, T-shirts, lunches, and videos. I’m not kidding here; I have seen it happen, sadly more than once.

Two distinct effects are produced

1. A lot of time and money is spent designing the right T-shirt, video, and poster, so that all the brand themes are consistent and cultural subtleties about appropriate colors are observed. The list of ways to paralyze the program before it begins is long. The knock-on effects are to delay the start of the program, spend money before we have made it, and frustrate the people who ultimately have to do the work to make it all happen. Meanwhile your focused, effective competitor is six months further down the track, and you have not even organized the kick-off BBQ!

2. Your employees, having seen improvement programs come and go like the four seasons, can adopt the following behaviors:
• Sit back and wait to see if this thing will die the same way as all the others.
• Actively resist just to be proven right.
• Have a negative perception about the project before it has started.

All of these things spell death!

What do we do differently?

The effect we want to generate is a change in behaviors that quickly produces improved results. That’s it! Then we want to repeat that effect every hour, every day, forever.

We start by just introducing one or two changes in how we work, staying focused on those changes until we see the effects we want to see appear. Then we introduce a few more, and a few more.

The point here is that we go about this in a low-key, results-focused, and supportive fashion.

The ultimate compliment you can be paid is when one of your people says something like, “Production, quality, safety, and housekeeping just seem different, better, cleaner, quieter. Not sure what is happening differently.”

Actors collect their Academy Awards after the movie is released successfully. We should do the same.

Comments are welcome and encouraged.

In my next article I will deal with the No. 2 reason why lean manufacturing and Six Sigma projects fail to deliver a spectacular return on investment: the Magic Wand Disease.



Ten years ago today, quality improvement lost one of its most important pioneers.

Joseph Juran, the famed author and quality improvement expert, died on February 28, 2008, at the age of 103. Juran had single-handedly done more to create the foundation for modern process and quality improvement methods than any other person.

His contributions came in many forms.

Juran took the Pareto Principle – the observation that roughly 20% of a country’s population holds about 80% of its wealth – and translated it into business. He developed the Juran Trilogy, which addressed the planning, control and improvement of quality in products. He completely upended the traditional early 20th-century approaches to making operations more efficient.

Juran’s Background

“Quality planning consists of developing the products and processes required to meet customers’ needs.”

 Born in Romania in 1904, Juran immigrated to the United States when he was eight. His family settled in Minneapolis, Minn. Juran did well in math in school and became an expert chess player. After high school graduation, he earned a bachelor’s degree in electrical engineering from the University of Minnesota.

He worked at Western Electric’s Hawthorne Works, eventually moving into Bell Labs’ statistics-driven quality control department. His job involved working with a team that tested quality improvement innovations. This early work essentially set the course of his life.

In the 1930s, he rose to the position of chief of industrial engineering. He also earned a law degree from Loyola University Chicago School of Law, but never practiced.

During World War II, Juran worked for the government’s Lend-Lease Administration, focused on streamlining shipment processes. But, more importantly, during this time he came across the works of the Italian economist Vilfredo Pareto (1848–1923).

Pareto’s Principle

“The vital few and the trivial many.”

By the time Juran had left regular office work to become a consultant on quality improvement, he had developed a theory that Pareto’s 80/20 principle could be applied to any organization’s operation.

For example, he argued that most defects are the result of a small percentage of all the causes of defects, according to the Economist. For another, 20% of a team’s members are going to produce 80% of a project’s successful results. And 20% of a business’s customers will create 80% of the profit.

Juran felt organizations, armed with that knowledge, would focus less on meaningless minutiae and more on identifying the 20%. That means eliminating the 20% of mistakes causing the majority of defects, rewarding the 20% of employees causing 80% of the success and serving the 20% of loyal customers that drive sales.

In a way, Pareto’s Principle puts numbers to the idea that in business, as in life, things are not evenly distributed. Pareto was studying land ownership in Italy. But Juran saw that it applied to business as well.

A Focus on People

“It is most important that top management be quality-minded. In the absence of sincere manifestation of interest at the top, little will happen below.”

Despite his expert knowledge in math and statistics, Juran emphasized people. He learned early on that the biggest roadblock to process improvement is not the tools of the trade, but rather the need to make a cultural shift among the people involved.

That started at the top. Juran felt a major impediment to quality improvement is a disengaged upper management set in its current ways.

As usual, the highly quotable Juran summed up the situation well. In addition to the quote at the start of this section, Juran also had this to say about the need for leadership at the top of an organization:

“Observing many companies in action, I am unable to point to a single instance in which stunning results were gotten without the active and personal leadership of the upper managers.”

Juran Trilogy

“Goal setting has traditionally been based on past performance. This practice has tended to perpetuate the sins of the past.”

In his focus on people and how they work in processes, Juran took a different approach than others working in the growing quality improvement field. In doing so, he completely changed how companies looked at reducing inefficiencies.

Juran found the hidden costs in how companies tended to deal with defects. In the early 20th century, that often meant dealing with the issue after it had occurred rather than focusing time and money on making quality improvements to keep defects from happening.

He also felt that the resulting poor product quality cost companies more than they fully accounted for, including damage to a company’s reputation that led to a loss of customers.

He also advocated for creating operations that ran efficiently without the need for costly inspections.

He developed the Juran Trilogy, which involved three principal areas:

Quality planning – This involves identifying your customers, determining their needs and developing products that respond to their needs.

Quality improvement – Develop a process to create the product and then optimize that process.

Quality control – Create a process that can operate under minimal inspection.

The Juran Trilogy was formally published in 1986 and quickly became established as a must-read for those involved with quality improvement around the world.

Visit To Japan

“All improvement happens project by project and in no other way.”

In 1954, Juran made a trip that changed quality and process improvement forever. He spoke to business leaders in Japan about his ideas on quality improvement and control. While he had done this in other places, the Japanese took his message much more seriously.

In the subsequent years, Japanese business leaders developed the ideas that led to Six Sigma and Lean. They instituted process improvement methodologies that have since been copied around the world. While these Japanese leaders deserve the credit for developing these ideas, Juran also deserves credit for sparking the ideas.

Over the years, Juran also wrote books. Further insight into his ideas can be found in the pages of the “Quality Control Handbook,” “Juran’s Quality Handbook” and “Managerial Breakthrough: A New Concept of the Manager’s Job.”

In 1979, Juran established the Juran Institute in Connecticut. The institute is a training, certification and consulting company that focuses on quality management, including Lean and Six Sigma.

One of the best quotes about Juran comes from Joseph De Feo, the CEO of the Juran Institute, upon Juran’s death. He said: “Dr. Juran recently told me that he wanted everyone to know he had a wonderful life and hoped that his contributions to improving the quality of our society will be remembered.”

Of that, there is little doubt.