Top IT/ITES Company
Designation Process Excellence Lead
Hiring For Top IT/ITES company
Job Description Is responsible for creating a continuous improvement culture across the global BPO by fostering the Six Sigma methodology.
Ensures the Continuous Improvement approach across Engagements and Centers
Is responsible for mentoring Lean Six Sigma Black Belt, Green Belt and Yellow Belt projects as required.
Is responsible for ensuring 100% of staff are trained on Lean Six Sigma Yellow Belt.
Supports process alignment based on GPM
Reports monthly Delivery Excellence progress against targets to the Engagement Stakeholders through a Balanced Scorecard
Fosters the global sharing of innovation, new methodologies and best practices.
Supports QMS implementation as required
Desired Profile
1. Minimum 6+ years of total experience.
2. Minimum 2+ years of experience in Process Excellence or continuous process improvement.
3. Black Belt certified (preferred).
4. Should be willing to relocate to Salem (Tamil Nadu).
5. Willing to work a 6-day week.
Experience 6 – 11 Years
Industry Type BPO / Call Centre / ITES
Role Team Leader-Quality Assurance/Quality Control
Functional Area ITES, BPO, KPO, LPO, Customer Service, Operations
Employment Type Full Time , Permanent Job
Education UG – Any Graduate – Any Specialization

PG – Any Postgraduate – Any Specialization

Doctorate – Doctorate Not Required

Compensation 6,50,000 – 8,00,000 P.A.
Location Salem
Keywords Lean Six Sigma Black Belt, Process Excellence, Green Belt, Continuous Improvement, Delivery Excellence, QMS Implementation
Contact Anvitha M

Driving better process change in half the time

What if I told you that “Process Improvement” is really just a code word for “Process Change”? At some point you’ll need to tell people involved in the process to stop doing things the old way and please start doing them the new way. The track record of being successful in process change is kind of dismal – the global corporate landscape is littered with burned-out husks of supposedly slam-dunk successful improvements gone awry in rollout. Well-intentioned and intelligent teams surround the smoking wreckage, scratching their heads and pondering what the heck went wrong.

What is your definition of a successful process change? I’ve always been partial to this one: You achieve your improvement objectives on time, on budget, and when the dust settles, there aren’t a lot of dead bodies lying about. That means you didn’t bulldoze through an organization with your process change using some form of brute-force implementation with zero regard for any collateral damage you may be generating in other areas. Brute force is tempting because it is fast, but successful change that sticks takes time – though it shouldn’t take forever. Process changes can absolutely move faster than the glacially slow speed at which many corporate changes poke along.

So how can we generate fast process change that sticks and do so in a healthy fashion? Thanks for asking – I can tell by your question that you are a smart and attractive person… Sometimes, changing our project management mindset by asking a few questions right up front can help. Here are three questions; simply asking them early in the improvement effort has helped some teams cut their implementation time in half:

Tip #1: People Count (A Lot) So How Can We Involve Them Early & Often?
It’s easy in methods like Lean and Six Sigma to get so focused on the technical aspects of architecting process changes that we don’t consider the people aspect until rollout – and that’s way too late! As early in the improvement effort as the discovery or definition phases, we can involve those who will be impacted by the change. A subtle but important note: It’s better to have a change done with you rather than to you. Early involvement builds a sense of ownership that can significantly reduce resistance later in rollout. Tools like Catchball, FMEAs, fishbone diagrams and process mapping are all great ways to involve members of your larger target audience – not just your core team.

Tip #2: How Will We Assess Our Human-Side Risks? 
You and I can have the best process in the world but if the humans who need to actually follow that process don’t do so, we’ve wasted our time and money. Any improvement or change that may be controversial or large is likely to meet with some resistance. Better to know about that in advance so you can manage it. Tools like a Stakeholder Assessment can help you think through which groups will be impacted, what their reaction and potential questions may be, and then plan (in advance) ways to mitigate any anticipated risks.
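
As a rough illustration, a stakeholder assessment can be thought of as structured data about groups, their impact and their likely reaction. Here is a minimal sketch in Python; the fields, scoring scales and example entries are invented for illustration, not part of any standard template:

```python
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    """One row of a simple stakeholder assessment (illustrative fields)."""
    group: str
    impact: int                 # 1 (barely affected) to 5 (heavily affected)
    support: int                # -2 (opposed) through +2 (champion)
    concerns: list = field(default_factory=list)
    mitigation: str = ""

# Hypothetical entries for a hypothetical process change
assessment = [
    Stakeholder("Front-line agents", impact=5, support=-1,
                concerns=["new screens slow me down"],
                mitigation="involve agents in the pilot; hands-on training"),
    Stakeholder("Team leads", impact=4, support=1,
                concerns=["reporting will change"],
                mitigation="preview new dashboards two weeks before go-live"),
]

# Highest-risk groups: heavily impacted but not yet supportive
at_risk = [s.group for s in assessment if s.impact >= 4 and s.support <= 0]
print("Prioritize engagement with:", at_risk)
```

However you capture it, the point is the same: know in advance who will feel the change most and who is least on board, and plan mitigation before rollout.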

Tip #3: What Is Our Plan For Robust Communications Throughout The Project?
Many teams grossly underestimate the effort and volume of communications that go into a successful process change. Your project may be just one of 48 different things an employee will hear about in a given day. Don’t expect your artfully crafted poster or masterfully written email to be remembered. Think of multiple communications using multiple methods to get your message across. It bears repeating that we should be communicating with our target audience over the full duration of the improvement project – not just at the end. Tools like a Communications Plan are essential to ensuring a robust flow of communications to the right audiences, from the right senders, at the right times.

These are very healthy questions to be asking and I hope they aid you as they have others. Now, if you’ll excuse me, my pets are indicating that “Dogs count (a lot) so how about involving us in some belly rubs and play time?” Happy change!

3 Essentials of Applying LSS to AEC Firms

Applying Lean Six Sigma (LSS) to the architecture/engineering/construction (AEC) industry can create unique challenges. LSS is not heavily used in the service industry and almost never used in AEC businesses. When LSS is presented to an AEC firm, one of the first comments from the company’s team members is, “If it ain’t broke, don’t fix it.” This comment comes from a lack of understanding of what is possible. It comes from an attitude of “We’ve always done it this way.” What they do not realize is that their system is broken.

It has been estimated that the service industry is operating with a level of waste at close to 80 percent. Many AEC firms do not even attempt to reduce that waste and have no problem passing the cost of that waste on to their clients.

Introducing LSS to an engineering office, architectural office, or a team of construction workers can be daunting. In general, members of the AEC industry perceive LSS as not applicable and not useful to their day-to-day operations. This attitude comes from resistance to change and from rejecting LSS before understanding how it works. They do not believe the LSS success stories and think they are overstated. This resistance, however, is certainly not insurmountable; it does require a deliberate approach.

1. Make LSS Strategic

It is critical that LSS be presented to the organization as the new method of operation, not just an experiment. It needs to be clear that LSS is not a fad that will fade but a permanent approach to doing business. This means that LSS needs to be built into the company strategy. It needs to be apparent in the long-term vision, mission, objectives and metrics, and needs to be cascaded to every operational unit throughout the organization.

Every member of the organization should not only have a clear understanding of what LSS is but should also understand how it applies to his or her daily activities. Members of an organization should understand how to use the LSS tools that apply to their daily efforts and see a direct connection from their application of those tools to the organizational system as a whole. This requires an extensive training program to ultimately touch every person in the organization.

2. Make LSS Cultural

Far too often the application of LSS becomes a recurring cycle of selecting a specific tool to use, celebrating temporary improvements, and then getting frustrated when the same old issues come right back. LSS is far more than a collection of tools. It is a powerful management and cultural paradigm that should affect every decision, behavior and action throughout the organization. Ultimately, those short-term temporary improvements need to evolve into long-term continuous improvement; this requires that LSS be deliberately infused into the culture of the organization.

Making LSS a vital part of the culture requires that the purpose and importance of LSS be included in the principles, values and beliefs fostered by the organization. It needs to become foundational to the purpose of the organization and essential to the organization’s competitive advantage and long-term success. Instead of just applying LSS tools, develop an LSS culture of continuous improvement.

3. Make LSS Scientific

If the organization is going to effectively apply LSS, members of the organization will need to witness its results. They will need clear evidence that the effort will be worth it. This means that the metrics used by the organization need to track the improvements directly associated with an LSS approach; the metrics should be strategically focused and not just randomly applied. If LSS efforts are not being measured, members of the organization will not take them seriously. Team members will assume the improvement does not matter. When the positive impact of LSS is made clear using effective metrics, the organization’s commitment to LSS is magnified.

It should be made clear that metrics are not to be used to punish or assign blame. They are used to drive performance and influence behavior. This means that metrics should not only include key performance indicators but also key behavioral indicators. This combination measures results as well as the behaviors contributing to those results. It is difficult to achieve high performance without encouraging ideal behaviors that create and sustain that level of performance.

Example: Engineering Calculations

A staff engineer spends 20 hours over three days preparing the calculations on a project. After completing the calculations, the staff engineer sends them to a senior engineer for review. The senior engineer does not get around to reviewing the calculations for two days. He then spends two hours reviewing the calculations and provides feedback.

The staff engineer then spends three additional hours revising the calculations and returns them to the senior engineer. After two more days, the senior engineer spends another hour reviewing and provides additional feedback. The staff engineer then spends two more hours making revisions and sends the calculations back to the senior engineer. The senior engineer then sends the calculations to the engineer responsible for stamping. After four days and three hours of review time, the engineer in charge provides feedback to the staff engineer, including several changes to the design approach of the project. The staff engineer spends six hours revising the calculations, sends them back to the engineer in charge and cc’s the senior engineer. After four more days, the staff engineer follows up with the engineer in charge to determine if the calculations are approved and stamped. Two days later, the engineer in charge sends the stamped calculations to the staff engineer.

Does this process seem rational? Or is it overburdened with repetitiveness and redundancies? In this example, the inefficiencies of the firm’s operations resulted in nearly 45 percent wasted time. This wasted time resulted in an extra cost of more than $1,500. This cost is covered by the firm, the client or both. It took the firm 17 days to produce a final product that could have been produced in four days if the waste and delays were eliminated. Unfortunately, this type of scenario is not uncommon.

Applying an LSS perspective to this example, the total time spent on the project was 17 days times 8 hours per day, which equals 136 hours. The amount of time actually spent working on the project was 37 hours. The total waste was 99 hours (73 percent). The goal should be 37 hours. (In the Lean world even that would be too much, since there are probably inefficiencies in the way the work itself was performed.)
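
The arithmetic is easy to verify. A minimal sketch, using the touch times from the narrative and the article’s 8-hour-day assumption:

```python
# Touch time from the example, in hours
touch_time = sum([
    20,  # staff engineer: initial calculations
    2,   # senior engineer: first review
    3,   # staff engineer: first revision
    1,   # senior engineer: second review
    2,   # staff engineer: second revision
    3,   # engineer in charge: review
    6,   # staff engineer: design-approach revisions
])                                    # = 37 hours

elapsed_hours = 17 * 8                # 17 days at 8 working hours per day = 136
waste_hours = elapsed_hours - touch_time
print(f"Touch time: {touch_time} h")
print(f"Elapsed:    {elapsed_hours} h")
print(f"Waste:      {waste_hours} h ({waste_hours / elapsed_hours:.0%})")
# -> Waste: 99 h (73%)
```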

Example: Purchase Orders

Excessive control can seriously damage performance, as happened in the purchasing office of one of the world’s largest oil and gas conglomerates. In this case, because of a history of inaccurate purchase orders, a series of controls was put in place to make sure that all purchase orders were properly vetted. A purchase order now required 16 approval signatures before it could be released.

The process of routing this document often took six to eight months. The business needed to streamline the process. Each of the 16 signees was asked what they looked for on the document to determine whether or not to sign it. Every one of the 16 shared that all they looked for was whether another specific individual had signed the document. If so, they were sure the purchase order had to be correct.

In the end, they had 16 control points for every purchase order, all of which created waste and time delay, and none of which added value, since none of the signees actually checked the document. The failed system drove this behavior because it caused a false sense of security for all the signees. It would have been better to have one control point where the document was looked at carefully. Then there would be more accuracy and significantly less waste.

This type of redundancy and waste is also seen in the engineering example. Using the same logic, there was 90+ percent waste in the way project submittals were created. And this waste equates directly to dollars.

Conclusion

Few organizations in the AEC industry use LSS. They continue to do things the way they have always been done. They operate with huge amounts of waste and make little effort to reduce it. This should be exciting to leaders of organizations in the AEC industry: if LSS can be used to reduce that waste, huge competitive advantages can be gained. A business will be able to provide unmatched service, production speed and quality at unbeatable prices.

When working to apply LSS, make it strategic, cultural and scientific. That is the formula for producing sustainable results that will render the competition irrelevant.

Case Study: Improving Value for Clients and Employees

On her drive into work, Melissa Garcia was listening to Beethoven’s Piano Sonata No. 14 in C Sharp Minor and sipping on the mocha java blend she’d brewed in her kitchen just minutes before. It was a brisk fall Thursday morning, and Melissa was headed into the office where she was three weeks into her new job as the senior VP of operations for Portland’s Pacific Northwest Technology Partners (PNTP). As she maneuvered her late model sedan into the parking garage at 6:08 a.m. and navigated into the nearly empty lower level parking lot, her brakes let out a defiant squeal. “I’d better get those looked at before my trip to Mount Hood next weekend,” she thought to herself.

Melissa had recently been promoted to her new role and had transferred to company headquarters to serve in that capacity. With thoughts of yesterday’s staff meeting with her peers still filling her mind, Melissa was mentally preparing for her 10 a.m. meeting with her boss, Meifeng Li. Early in the interview process for the new role, Melissa felt she had a strong supporter in Meifeng. Meifeng was interested in Melissa’s overall prior experience, accomplishments and proven ability to transform a large organization.

PNTP Operations Mission Statement

To contribute to the success of our stakeholders through the utilization of our unique capabilities and distinct expertise.

Opportunity Statement

Meifeng’s main concern, and one of Melissa’s key challenges in her new role, was introduced as a staffing concern. PNTP’s operations unit had remained fairly consistent in size at about 600 employees. However, increasing competition was driving PNTP’s margins down. That, coupled with increasing client demands – from both new clients and increased demand from existing clients for products and support services – had prompted senior staff to ask Meifeng to report on the efficacy of her operation. Melissa’s prior experience provided her and Meifeng a solid foundation on which to build their approach to addressing these pressing issues.

Approach

Melissa spent a good bit of the morning reflecting on the task at hand to make sure she was properly prepared for her morning meeting. Throughout the interview process for her new role, Melissa had presented her approach to a similar business problem she had faced in her previous job. In that project, a large percentage of the highly skilled employees in her organization had expressed concern, through the annual employee feedback process, that a significant portion of their time was spent on routine and mundane client requests. These employees were wired to manage higher-complexity tasks and deliver quality service to their key strategic clients. They yearned to be able to do what they felt they did best in their jobs every day. The project took about eight months from the discovery phase to pilot and validation.

The leadership team was impressed with the approach, diligence, flexibility and focus on the employee feedback that the core project team displayed. The team took a “seek first to understand” approach and was guided by the principle to strive for simplicity in their outputs. The project plan, though comprehensive, could best be summed up as follows:

I. High-level work type analysis
II. Go deep on key activities
III. Formulate recommendations
IV. Prioritize and vet solutions
V. Pilot and validate outcomes
VI. Wide-scale implementation
VII. Sustain the gain

I. High-Level Work Type Analysis

Many of the highly skilled employees had been in their roles servicing their clients’ needs for many years. Over the course of those many years, as the clients’ needs evolved and competition to serve those needs by industry competitors rose, the employee workload had become more broad-based with wide variation. This variation stemmed from ever-increasing client demand, coupled with the clients’ varying degrees of willingness to leverage self-provisioning technologies that the company worked diligently to implement.

As an added complexity, the employees recognized that their role was largely relationship-based. Team members struggled with the client discussions geared toward self-provisioning solutions for fear that they would lose the hard-earned goodwill they had garnered by serving their clients’ needs in a more personal, one-on-one fashion.

In an attempt to provide transparency to the workload environment, Melissa and the team set out to answer a seemingly simple question: “What, specifically, do our employees do for their clients every day?” With a basic understanding of their workload, the team believed they could formulate hypotheses around the type of work that:

  1. Could be eliminated altogether.
  2. May be better self-provisioned by the client and positioned appropriately (incentivized).
  3. Likely would be managed with more efficacy through improved technological solutions.
  4. Is best served remaining as activities completed by the highly skilled employees.
  5. Should continue to be managed internally by a more appropriately skilled resource pool.

Anchored around the annual employee feedback process, and specifically a question aimed at determining if the employees felt they had the opportunity to do what they do best every day, the project team decided to involve the employees themselves to garner insight into the activities and tasks they do each day. To meet their goal of striving for simplicity, the core project team envisioned a four-box visualization of their data findings, based on the activities’ frequency and complexity. They gained agreement on this approach and sought to involve as many employees in the data gathering process as necessary to gain a significant cross-section of client types served, employee experience, tenure and a host of other related conditions.

During the data collection phase, efforts focused on: creating an intuitive data collection plan template (Figure 1); generating readily understandable operational definitions for the visual display criteria being leveraged (frequency and complexity); determining the approach for choosing and training the data collectors; piloting and modifying the collection plan; collecting and reviewing the data for integrity; creating graphical displays; summarizing the findings; and reviewing the high-level findings with the leadership team.

Figure 1: Data Collection Template

Melissa was pleased with the fruits of the team’s efforts from the outset. Nearly 100 highly skilled employees had been trained and worked diligently to collect information on more than 5,000 activities completed during a month-long timeframe. In an effort to protect against bias, the team was also instructed on how to document activities performed at predetermined intervals that were likely to have fallen outside of the collection period.

Initial analysis indicated a clear picture emerging across the frequency and complexity scales. Nearly 80 percent of the activities fit into three distinct categories (Figure 2). The evidence to maximize the use of existing technology was surfacing. And employee engagement in the overall data collection process signaled a potential strength for future communication and change management efforts.

Figure 2: Nine-Box Data Review

The team set about creating their simple visual depiction of the data collection as part of their preliminary analysis. Given the data definitions used while collecting the data, they determined a nine-box was more in line with the results. As per their initial project approach plans, the axes for the visual centered on frequency and complexity. The nine-box, highlighting just the top three categories, represented 80 percent of all data collected. Further, within those three categories, there were 55 distinct subcategories of activities. The team performed a significant amount of due diligence in analyzing the data and activities of these subcategories.
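
A minimal sketch of how activities could be bucketed into such a nine-box on frequency and complexity scales; the three-level cut-points, scores and example activities below are invented for illustration and are not the team’s actual operational definitions:

```python
from collections import Counter

def level(value, low_cut, high_cut):
    """Map a numeric score to a low/medium/high band (illustrative cut-points)."""
    if value < low_cut:
        return "low"
    return "medium" if value < high_cut else "high"

# Hypothetical records: (activity, occurrences per month, complexity score 1-10)
activities = [
    ("password reset", 120, 1),
    ("portfolio rebalancing review", 4, 9),
    ("status report", 20, 3),
]

boxes = Counter()
for name, freq, complexity in activities:
    box = (level(freq, 10, 50), level(complexity, 4, 7))
    boxes[box] += 1
    print(f"{name}: frequency={box[0]}, complexity={box[1]}")

# Counts per cell of the nine-box (frequency band, complexity band)
print(boxes)
```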

II. Go Deep on Key Activities

High-level analysis seemed promising. Having earned the green light to dig deeper into the data they had collected, Melissa and her team were excited to begin synthesizing that data into information and, ultimately, business intelligence that could be acted upon. The team decided to create a decision tree (Figure 3) to get a more refined view of the 55 subgroups that made up the top three categories of data. The core team was divided into three workstreams, one for each of the top three categories indicated by the data. The team created a decision tree approach that they could use to determine whether the subcategorical work would “fall out” into one of the five categories listed earlier:

  1. Eliminate
  2. Increase self-provisioning by client
  3. Further automation candidate
  4. Remain in work group with existing highly skilled employees
  5. Remain in work group but with alternate staffing approach

Figure 3: Decision Tree
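
The routing logic such a decision tree encodes might look something like the following sketch; the predicates and their ordering are assumptions for illustration only, not the team’s actual questions:

```python
def route(activity):
    """Route one activity subcategory to a disposition bucket (illustrative rules)."""
    if not activity["adds_value"]:
        return "1. Eliminate"
    if activity["client_can_self_serve"]:
        return "2. Increase self-provisioning by client"
    if activity["rule_based"]:
        return "3. Further automation candidate"
    if activity["requires_expert_judgment"]:
        return "4. Remain with existing highly skilled employees"
    return "5. Remain in work group, alternate staffing approach"

example = {"adds_value": True, "client_can_self_serve": False,
           "rule_based": True, "requires_expert_judgment": False}
print(route(example))  # -> 3. Further automation candidate
```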

III. Formulate Recommendations

With the subcategories effectively categorized into the five primary buckets laid out above, the workstream leads and their teams next set out to brainstorm potential solutions. The teams were instructed to arrive at “best case” solution scenarios based on minimal limitations to both technology spend and policy-change opportunities. This solution set would provide Melissa with an optimistic look at what could be. They were also instructed to create a “tempered expectations” solution set view based on likely benefits given significant technology spend and policy change constraints.

IV. Prioritize and Vet Solutions

Melissa continued to be pleased with the team’s progress. With this information in hand, her next step was to attain sponsor buy-in on the “best case” and “tempered expectations” exercise results. Once that buy-in was attained, the team set about working to:

  • Further flesh out solutions
  • Create criteria to leverage to prioritize those solutions
  • Work with the finance team to create and articulate feasible business cases
  • Work with the organizational design team to frame organizational change scenarios and their implications

The team, again, turned to a simple nine-box display to articulate their position. This time, however, the axes were anchored around client impact and operational impact criteria. The team generated additional subcriteria for each of these axes, as well as a weighting strategy for those subcriteria. This ensured that each potential solution was objectively weighed against the others and provided insight into the highest degree of cost/benefit impact to the organization.
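
As a rough illustration of that weighting approach, here is a minimal sketch; the subcriteria, weights and 1-5 scores are invented, not the team’s actual criteria:

```python
# Illustrative subcriteria and weights (each axis's weights sum to 1.0)
client_weights = {"satisfaction": 0.6, "disruption_risk": 0.4}
ops_weights = {"cost_savings": 0.5, "implementation_effort": 0.3, "cycle_time": 0.2}

def weighted_score(scores, weights):
    """Weighted sum of 1-5 scores; assumes scores has a value for every weight key."""
    return sum(scores[name] * weight for name, weight in weights.items())

# Hypothetical solutions scored 1-5 on each subcriterion
solutions = {
    "Self-service portal": (
        {"satisfaction": 4, "disruption_risk": 2},
        {"cost_savings": 5, "implementation_effort": 2, "cycle_time": 4},
    ),
    "Workflow automation": (
        {"satisfaction": 3, "disruption_risk": 4},
        {"cost_savings": 4, "implementation_effort": 3, "cycle_time": 5},
    ),
}

for name, (client_scores, ops_scores) in solutions.items():
    x = weighted_score(client_scores, client_weights)   # client-impact axis
    y = weighted_score(ops_scores, ops_weights)         # operational-impact axis
    print(f"{name}: client impact={x:.1f}, operational impact={y:.1f}")
```

Plotting each solution’s two scores places it in a cell of the nine-box, which is what lets the team compare candidates objectively.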

V, VI and VII. Pilot and Validate, Implement and Sustain

Based on the clear-cut evidence provided, the team gained widespread approval to pilot the most beneficial solutions. That pilot was meticulously planned and executed; the validated results exceeded initial expectations. Though there were some unforeseen bumps in the road, swift mitigation occurred and the organization moved to full-scale implementation at the conclusion of the planned pilot. As Melissa was offboarding from her prior organization, her core team was putting their control-and-response plan into place to create a measurement system and ownership structure to give them the best chance of sustaining the gains realized throughout the pilot.

It was 9:50 a.m. now. Melissa was ready to have a robust conversation with her manager in 10 minutes. She quickly called her mechanic to make an appointment for the brakes on her car, logged out of her preferred “Classical Music for Classic Rock Aficionados” desktop station, then strode confidently down the hall to her boss’ office for her one-on-one, stopping along the way to freshen up her coffee from the office breakroom.

SPURRED BY ESSA, SCHOOL DISTRICTS TURN TO PROCESS IMPROVEMENT FOR BETTER STUDENT OUTCOMES

Process improvement helps businesses make better products and healthcare operations achieve better patient outcomes. Education systems now hope to do the same for students.

Improving student outcomes and streamlining often cumbersome school system operations are the focus of new initiatives across the nation.

Part of the push behind the move is the Every Student Succeeds Act (ESSA). Signed by President Obama in 2015, the act replaces the federal No Child Left Behind Act enacted in 2002. Unlike the previous law, ESSA gives much more leeway to local school districts to decide how best to spend educational dollars.

It’s also made continuous process improvement a priority in school districts nationwide.

The Need For Process Improvement in Education

The goal of every school district is to provide a quality education to every student regardless of age, neighborhood, race or economic status. However, that requires the best possible management of resources, smart financial strategies and understanding the issues behind low-performing schools.

These issues often are referred to in phrases such as equity, closing achievement gaps, improved quality of instruction and increasing outcomes for students (in other words, preparing them for success in careers or college).

Some states see ESSA as an opportunity. Many have started to revamp entire education systems, according to Education Week. This includes turning to two important components of Lean and Six Sigma: evaluating data and constant, consistent review of operations.

For example:

Tennessee has developed a School Improvement Support Network that supports districts in improving low-performing schools. Part of the effort is on training school district and state officials on the needs of low-performing schools and the root causes of their problems.

New York has already established a five-year plan that sets goals for student achievement and graduation rates. The plan also includes continuous evaluation of those goals and adjusting them based on student outcome data.

New Mexico has developed a real-time data system that tracks issues such as how many students in each grade are behind in terms of earning credits and how many students have transferred schools.

Examples of Six Sigma in Education

The use of process improvement methodology in schools is nothing new. In some cases, Lean, Six Sigma and Lean Six Sigma are directly involved with quality improvement efforts at school.

The Des Moines School District in Iowa has created a Department of Continuous Improvement that has reduced paper timesheet submissions by 97% and saved $80,000 in textbook inventory costs.

At Temple University, Six Sigma Green Belt Nichole Humbrecht, a senior in engineering, has applied the methodology to the school’s charity fundraising efforts. She works with HootaThon, an annual dance event that raises money for the Child Life Services department of the Children’s Hospital of Philadelphia.

A host of universities have also applied Lean and Six Sigma to reduce waste and save costs in a variety of areas, including recycling efforts and improving patient satisfaction at university-affiliated hospitals.

Clearly, education provides as many opportunities for applying Lean and Six Sigma as business does. Training employees and earning certification in Six Sigma can be the first step toward making education systems more effective and efficient.

Statistical Analysis: The Underpinning of All Things Quality

There are many subjects that we cover regularly here at Quality Digest. Chief among these are standards (ISO 9001 or IATF 16949, for example), methodologies (such as lean, Baldrige, or Six Sigma), and test and measurement systems (like laser trackers or micrometers). One topic, however, is consistently at the very top of the list when it comes to audience popularity—industrial statistics, including statistical process control (SPC).

It’s no secret why statistics hold such a place of honor among QD readers. Quite simply, without exquisite measurement and the ability to understand if a process is in or out of control, continuous improvement is impossible. Statistical analysis is the very underpinning of all things quality, and it touches on everything from proper management to understanding how to best leverage technological innovation.
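
As a concrete example of what “in or out of control” means in practice, here is a minimal individuals-chart (I-chart) calculation using the standard moving-range estimate of sigma (the d2 constant of 1.128 for subgroups of two); the measurement values are made up:

```python
# Individuals (I) chart: control limits from the average moving range,
# using d2 = 1.128 for moving ranges of two consecutive points.
data = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 10.2]

mean = sum(data) / len(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
sigma_hat = mr_bar / 1.128

ucl = mean + 3 * sigma_hat   # upper control limit
lcl = mean - 3 * sigma_hat   # lower control limit
print(f"Center={mean:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")

out_of_control = [x for x in data if not lcl <= x <= ucl]
print("Out-of-control points:", out_of_control or "none")
```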

With that in mind, I recently had the opportunity to exchange thoughts with Neil Polhemus, Ph.D., the chief technology officer for Statgraphics Technologies Inc. Late last year he released the book, Process Capability Analysis: Estimating Quality. A slightly edited version of our conversation follows.

Mike Richman: What are the costliest mistakes people make when it comes to industrial statistical analysis?

Neil Polhemus: I think the costliest mistake is confusing quantity of data with quality of data. Statisticians learned long ago that all data are not created equal. I remember attending a class on designed experiments when J. Stuart Hunter, Ph.D. contrasted practical accumulated records calculation (PARC) analysis with design of experiments (DOE). A PARC analysis relies on gathering together whatever data happens to be available, entering it into the computer, running some analyses, and hoping that useful models emerge. He contrasted this with the statistical DOE, where the data to be collected are carefully planned to allow maximum information to be generated by its analysis. Dr. Hunter made it crystal clear to all of us that statistical thinking is even more important before the data are collected than afterward. Optimizing the sampling plan has a huge effect on how successful one’s analytical efforts are likely to be.

MR: How would you describe the difference between “data” and “information”?

NP: Data are all around us. It consists of everything that we observe. Our eyes collect data, our ears collect data, and so do our senses. Yet that raw data is not useful until our brain puts it into a context where it can be applied to solve some practical problem, such as keeping our car in the center of the proper lane on a highway as we drive to work. Data analytics is all about taking data, some of it structured but much of it unstructured, and extracting from it inferences that can be applied to making correct predictions. When we fit a statistical model, our goal is to take a set of observed values and extract enough information to make predictions about situations that have not yet been observed. Of course, all of this occurs in a dynamic multivariate context where what we are trying to model has changed as soon as we’ve observed it.

MR: What about so-called “Big Data”? Does it offer specific opportunities and limitations in helping manufacturers improve?

NP: Collecting large amounts of data from industrial processes is now the rule rather than the exception. Most manufacturers have large databases from which they can examine the performance of their processes in myriad ways. Structuring that data in such a way that the relevant data can be accessed in a timely fashion is a huge challenge. I gave a talk recently at the Symposium on Data Science and Statistics. You could see there that getting the data in the proper format is often much harder than analyzing it. With Big Data, we also need to replace the idea of “statistical significance” with that of “practical significance.” With millions of observations, every P-Value you calculate will equal 0. The small sample tests we all learned about in Stat 101 are no longer useful. The primary concern becomes whether the data have been collected in a manner that eliminates any sampling biases.
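
Polhemus’s point about p-values and large samples is easy to reproduce. In this sketch (the sample sizes and the 0.005-sigma shift are arbitrary choices), a practically negligible shift becomes “statistically significant” once n is large enough:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
target = 100.0

for n in (30, 1_000, 1_000_000):
    # Process mean shifted by a practically meaningless 0.005 sigma
    sample = rng.normal(loc=target + 0.005, scale=1.0, size=n)
    statistic, p_value = stats.ttest_1samp(sample, popmean=target)
    print(f"n={n:>9,}  p-value={p_value:.4g}")
# With small n the tiny shift is invisible; with a million observations
# it is "significant" -- hence practical, not merely statistical, significance.
```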

MR: That’s an interesting take; can you offer a few examples of biases that you may run into in collecting data sets?

NP: Suppose you wish to compare a new methodology for producing a product with the method you’ve been using for many years. So you set up a pilot line to generate samples using the new method, and compare the results with samples that you’re currently shipping. Is that really a fair comparison? Are you using the same operators? Are they doing things the same on the pilot line as they would on a real production line? Does someone have a vested interest in demonstrating that the new method is better (or worse) than the current method? It’s always hard to collect data where everything remains the same except what you’re trying to test. Dr. Hunter used to refer to “lurking variables” that affect a response without your knowledge. Unless we protect against those unseen effects, we can easily make the wrong decisions.

MR: Do you think that Six Sigma, which of course is so widely in use now, still has things to teach us about performance improvement? How do you separate the PR side of the methodology from its nuts-and-bolts statistical analyses?

NP: Any methodology that gets practitioners to look at their data rather than simply filing it away is clearly useful. Although it’s easy to criticize specific Six Sigma assumptions, the general DMAIC approach is sound. You define the problem, be sure you can measure adequately what you need to measure, collect data to analyze, improve the process if possible, and institute controls to be sure that improvements are sustained over time. Nothing to argue with there. Was it a radical change from what industrial statisticians had been saying for decades? I don’t think so. But it reached the ears of upper management in such a way that institutional changes were made. Sometimes the promises may have been greater than what could really be delivered, and sometimes it was applied in ways that it was not designed for, but overall I believe that Six Sigma was beneficial. However, in the era of big data, major changes need to be made to the methodologies normally associated with Six Sigma. Methods based on small sample statistics break down when the data sets get large.

MR: Finally, what do you see as the biggest technological changes that will affect quality and statistics in, say, the next 10 or 20 years? Is it automation, the Internet of Things, quantum computing, or something else entirely?

NP: I’m not great at forecasting the future, but it’s always seemed to me that the goal we should be working toward is removing drudgery from our lives. A lot of people go to work each day, do a repetitive job that they don’t enjoy, and come home too tired to give their loved ones the attention they deserve. The more we automate these types of jobs, the more time people will have to do the activities they enjoy and the things that really count in life. How do we do this? By building quality into products, by constructing statistical algorithms that are as good or better than expert humans, by linking all our devices to an intelligent control system, and by turning data into useful real-time information. A brave new world is coming very quickly. Hopefully our social systems can handle the change.

Six Sigma Job – Associate, Wealth Management Operations

Job Description

Job Description: Incedo is looking for an associate who will be responsible for:
Account Ops
Effective audit of workflow to ensure accuracy
Conduct proactive audits of various work types and review of exception reports
Work together in a team environment to eliminate/reduce opportunity for errors.
Audit using the company's internal systems as per defined guidelines
Return defective units to operation team for correction
Daily reporting
Other duties and responsibilities as assigned

Key Skills: 1-5 years of experience in Account Ops (preferably in the financial industry)

Strong communication and organizational skills, detail oriented, self-motivated, high-energy, and self-directed problem solver, with an execution focus
Ability to resolve problems through root cause analysis
Financial and operational acumen
Ability to read, interpret and comprehend financial documents
Working knowledge of Microsoft Office
Demonstrate a structured and methodical approach
Process improvement and commitment to quality mindset
Flexible and open to multiple shifts

Six Sigma certification (nice to have)

Salary: Not Disclosed by Recruiter
Industry: Banking / Financial Services / Broking
Functional Area: Financial Services, Banking, Investments, Insurance
Role Category: Operations/Processes/Finance/Legal
Role: Operations Officer
Employment Type: Permanent Job, Full Time
Keyskills: Operations Management, Process Improvement, Six Sigma Certified, Root Cause Analysis, Operation Team, Wealth Management, Key Skills, Problem Solving, MS Office.

Memorial Day: We Honor All Who Contributed Enormously

In the spirit of Memorial Day weekend, we at 6sigma.us would like to deviate from our regular format and take time to honor those who have passed on. Because of their efforts, our world is that much better.

In the United States, Memorial Day is a holiday to honor all of those who died while serving our country in the armed forces. Those who fought for the United States of America fought for our freedom, and that is why we are able to enjoy the many benefits that we have.

We honor our fallen by taking a moment of silence, visiting our ancestors who fought and perished, who now live in spirit at our national cemeteries, memorials, and in our hearts.

If a family member is one who perished fighting for our country, many of us hold family gatherings, and through the power of spirit thank them for their service and carry them in our hearts.

In the same spirit, we here at 6Sigma.us would like to honor Six Sigma’s forefathers. We honor heroes such as Bill Smith, known as the father of Six Sigma, and many others without whom Six Sigma would not exist.

We thank the countless people we will never know for their enormous contributions, and the pioneers who invented Six Sigma, whose work has saved millions of livelihoods.

LEAN SIX SIGMA GOES BACK TO SCHOOL

Every industry is built on a foundation of processes and systems – technology, manufacturing, consulting, thought leadership and more.

If there are underlying processes, then fundamentally, those processes can be improved. That’s why it’s so silly when some people say that Six Sigma and Lean are manufacturing-focused methodologies.

They’re not. They’re process improvement methodologies. And, at the risk of sounding redundant, every industry is built on processes.

Take education as an example. It’s the effective design and delivery of instruction. Methods and styles change from teacher to teacher and student to student, but it’s still a process-based medium. Think about all the administrative work that goes into a school or university. Or the enrollment process.

All this can be improved and strengthened, just like the Toyota production factories and Motorola assembly lines from the early days of Six Sigma.

It might seem far-fetched to you, but it’s absolutely possible, and one school is doing its best to demonstrate it.

A Lesson in Lean Six Sigma

In Racine County, Wisconsin, the Racine Unified School District has applied Lean Six Sigma techniques to redefine the enrollment process at its middle school. As part of their Lean Six Sigma initiative, the school’s board members were asked to complete low-level process improvement training, which has already led to some fascinating ideas.

“From our perspective, you can remove Lean Six Sigma from the title and say we bring people together to look at a process from a holistic standpoint,” said Kamaljit Jackson, Unified’s senior accountability and efficiency officer in an article for The Journal Times, a local newspaper.

The district wants to get every staff member involved in process improvement brainstorming (which is a tried-and-true Lean Six Sigma tenet) – everyone from teachers, to receptionists, to janitors. And collectively, the school district wants to figure out how to gather and use data more effectively.

“We have a lot of data points here, and we need to use those data points to make decisions on what strategies we need to use to make sure our students are successful on the academic side,” Jackson said.

Lean Six Sigma and Education

The Racine Unified School District is just one example, and if they’re successful in their process improvement initiative, dissenters might call them an outlier. But the principles of Lean Six Sigma are sound and valid – and there’s no reason they wouldn’t work at other levels of education too.

In fact, they have.

Singapore Management University (SMU) created significant change in their institution by training staff on Lean Six Sigma methodologies. Staff that have already earned their Green Belts are given additional training and guidance to further develop their process improvement knowledge and skills. And, to advance in the training program, staff actually have to put the lessons into action. Everyone is expected to implement and contribute to a Lean Six Sigma project designed to improve the university.

Meanwhile, Miami University in Ohio has been called one of the most efficient schools in the country because of its work with Lean and Six Sigma principles – from big, campus-wide initiatives to small, granular improvements (like digitizing the fingerprinting processes the campus police use).

All industries are founded on processes, and all processes can improve – even if that improvement is only slight. But even so, any improvement – especially in education – is a step in the right direction.