I. What is Lean Six Sigma?
A. Enterprise-wide view
1. History of continuous improvement
Describe the origins of continuous improvement and its impact on other improvement models.
2. Value and foundations of Six Sigma
Describe the value of Six Sigma, its philosophy, history and goals.
3. Value and foundations of Lean
Describe the value of Lean, its philosophy, history and goals.
4. Integration of Lean and Six Sigma
Describe the relationship between Lean and Six Sigma.
5. Business processes and systems
Describe the relationship among various business processes (design, production, purchasing, accounting, sales, etc.) and the impact these relationships can have on business systems.
6. Six Sigma and Lean applications
Describe how these tools are applied to processes in all types of enterprises: manufacturing, service, transactional, product and process design, innovation, etc.
B. Leadership
1. Enterprise leadership responsibilities
Describe the responsibilities of executive leaders and how they affect the deployment of Six Sigma in terms of providing resources, managing change, communicating ideas, etc.
2. Organizational roadblocks
Describe the impact an organization’s culture and inherent structure can have on the success of Six Sigma, and how deployment failure can result from a lack of resources, management support, etc. Identify and apply various techniques to overcome these barriers, and recognize the critical success factors for Six Sigma.
3. Change management
Describe and use various techniques for facilitating and managing organizational change.
4. Six Sigma projects and kaizen events
Describe how DMAIC projects and kaizen events are selected, when to use DMAIC and Six Sigma instead of other problem-solving approaches, and the importance of aligning their objectives with organizational goals.
5. Six Sigma roles and responsibilities
Describe the roles and responsibilities of Six Sigma participants: black belt, master black belt, green belt, champion, process owners and project sponsors.
II. Organizational Process Management and Measurement
A. Impact on stakeholders
Describe the impact Six Sigma projects can have on customers, suppliers and other stakeholders.
B. Critical to x (CTx) requirements
Define and describe various CTx requirements (critical to quality (CTQ), customer (CTC), satisfaction (CTS), delivery (CTD), etc.) and the importance of aligning projects with those requirements.
C. Benchmarking
Define and distinguish between various types of benchmarking, including best practices, competitive, collaborative, etc.
D. Business performance measures
Define and describe various business performance measures, including balanced scorecard, key performance indicators (KPIs), the financial impact of customer loyalty, etc.
E. Financial measures
Define and use financial measures, including revenue growth, market share, margin, cost of quality (COQ), net present value (NPV), return on investment (ROI), cost benefit analysis, etc.
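Two of these measures reduce to simple arithmetic; the sketch below computes NPV and ROI (the function names and the example project figures are illustrative, not from any standard):

```python
# Minimal sketch of NPV and ROI; figures below are hypothetical.

def npv(rate, cash_flows):
    """Net present value of cash flows, where cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def roi(gain, cost):
    """Return on investment as a fraction of the initial cost."""
    return (gain - cost) / cost

# Example: a project costing 100 that returns 60 per year for two years,
# discounted at 10% per year.
project_npv = npv(0.10, [-100, 60, 60])   # about 4.13
project_roi = roi(120, 100)               # 0.20, i.e. 20%
```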
III. Team Management
A. Team formation
1. Team types and constraints
Define and describe various types of teams (e.g., formal, informal, virtual, cross-functional, self-directed, etc.), and determine what team model will work best for a given situation. Identify constraining factors including geography, technology, schedules, etc.
2. Team roles
Define and describe various team roles and responsibilities, including leader, facilitator, coach, individual member, etc., and use the RACI model (responsible, accountable, consulted, informed) to clarify them.
3. Team member selection
Define and describe various factors that influence the selection of team members, including required skills sets, subject matter expertise, availability, etc.
4. Launching teams
Identify and describe the elements required for launching a team, including having management support, establishing clear goals, ground rules and timelines, and how these elements can affect the team’s success.
B. Team facilitation
1. Team motivation
Describe and apply techniques that motivate team members and support and sustain their participation and commitment.
2. Team stages
Facilitate the team through the classic stages of development: forming, storming, norming, performing and adjourning.
3. Team communication
Identify and use appropriate communication methods (both within the team and from the team to various stakeholders) to report progress, conduct milestone reviews and support the overall success of the project.
C. Team dynamics
Identify and use various techniques (e.g., coaching, mentoring, intervention, etc.) to overcome various group dynamic challenges, including overbearing/dominant or reluctant participants, feuding and other forms of unproductive disagreement, unquestioned acceptance of opinions as facts, group-think, floundering, rushing to accomplish or finish, digressions, tangents, etc.
D. Time management for teams
Select and use various time management techniques including publishing agendas with time limits on each entry, adhering to the agenda, requiring pre-work by attendees, ensuring that the right people and resources are available, etc.
E. Team decision-making tools
Define, select and use tools such as brainstorming, decision matrices, nominal group technique, multi-voting, the Pugh matrix, etc.
F. Management and planning tools
Define, select and apply the following tools: affinity diagrams, tree diagrams, process decision program charts (PDPC), matrix diagrams, interrelationship digraphs, prioritization matrices and activity network diagrams.
G. Team performance evaluation and reward
Measure team progress in relation to goals, objectives and other metrics that support team success and reward and recognize the team for its accomplishments.
IV. Define
A. Voice of the customer
1. Customer identification
Segment customers for each project and show how the project will impact both internal and external customers.
2. Customer feedback
Identify and select the appropriate data collection method (surveys, focus groups, interviews, observation, etc.) to gather customer feedback to better understand customer needs, expectations and requirements. Ensure that the instruments used are reviewed for validity and reliability to avoid introducing bias or ambiguity in the responses.
3. Customer requirements
Define, select and use appropriate tools to determine customer requirements, such as CTQ flow-down, quality function deployment (QFD) and the Kano model.
B. Project charter
1. Problem statement
Develop and evaluate the problem statement in relation to the project’s baseline performance and improvement goals.
2. Project scope
Develop and review project boundaries to ensure that the project has value to the customer.
3. Goals and objectives
Develop the goals and objectives for the project on the basis of the problem statement and scope.
4. Project performance measures
Identify and evaluate performance measurements (e.g., cost, revenue, schedule, etc.) that connect critical elements of the process to key outputs.
C. Project tracking
Identify, develop and use project management tools, such as schedules, Gantt charts, toll-gate reviews, etc., to track project progress.
Also see Project Management.
V. Measure
A. Process characteristics
1. Input and output variables
Identify these process variables and evaluate their relationships using SIPOC and other tools.
2. Process flow metrics
Evaluate process flow and utilization to identify waste and constraints by analyzing work in progress (WIP), work in queue (WIQ), touch time, takt time, cycle time, throughput, etc.
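Two of these flow metrics lend themselves to quick calculation; this sketch computes takt time and applies Little's Law to relate WIP, throughput and flow time (example figures are hypothetical):

```python
def takt_time(available_time, customer_demand):
    """Takt time: available work time divided by units demanded."""
    return available_time / customer_demand

def avg_flow_time(wip, throughput):
    """Little's Law: average time in the system = WIP / throughput."""
    return wip / throughput

# Example: a 480-minute shift meeting demand for 240 units gives a takt
# of 2 min/unit; 30 units of WIP at 15 units/hour means 2 hours in system.
takt = takt_time(480, 240)
flow_time = avg_flow_time(30, 15)
```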
3. Process analysis tools
Analyze processes by developing and using value stream maps, process maps, flowcharts, procedures, work instructions, spaghetti diagrams, circle diagrams, etc.
B. Data collection
1. Types of data
Define, classify and evaluate qualitative and quantitative data, continuous (variables) and discrete (attributes) data and convert attributes data to variables measures when appropriate.
2. Measurement scales
Define and apply nominal, ordinal, interval and ratio measurement scales.
3. Sampling methods
Define and apply the concepts related to sampling (e.g., representative selection, homogeneity, bias, etc.).
Select and use appropriate sampling methods (e.g., random sampling, stratified sampling, systematic sampling, etc.) that ensure the integrity of data.
4. Collecting data
Develop data collection plans, including consideration of how the data will be collected (e.g., check sheets, data coding techniques, automated data collection, etc.) and how it will be used.
C. Measurement systems
1. Measurement methods
Define and describe measurement methods for both continuous and discrete data.
2. Measurement systems analysis
Use various analytical methods (e.g., repeatability and reproducibility (R&R), correlation, bias, linearity, precision to tolerance, percent agreement, etc.) to analyze and interpret measurement system capability for variables and attributes measurement systems.
3. Measurement systems in the enterprise
Identify how measurement systems can be applied in marketing, sales, engineering, research and development (R&D), supply chain management, customer satisfaction and other functional areas.
4. Metrology
Define and describe elements of metrology, including calibration systems, traceability to reference standards, the control and integrity of standards and measurement devices, etc.
D. Basic statistics
1. Basic terms
Define and distinguish between population parameters and sample statistics (e.g., proportion, mean, standard deviation, etc.)
2. Central limit theorem
Describe and use this theorem and apply the sampling distribution of the mean to inferential statistics for confidence intervals, control charts, etc.
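The theorem can be demonstrated numerically. This sketch (assuming a uniform population, chosen purely for illustration) compares the observed spread of sample means with the sigma/sqrt(n) prediction:

```python
import random
import statistics

random.seed(42)

# Population: uniform on [0, 1], with mean 0.5 and variance 1/12.
# The central limit theorem says means of samples of size n are
# approximately normal with standard deviation sigma / sqrt(n),
# even though the population itself is not normal.
n, trials = 30, 2000
sample_means = [statistics.fmean(random.random() for _ in range(n))
                for _ in range(trials)]

observed_sd = statistics.stdev(sample_means)
predicted_sd = (1 / 12) ** 0.5 / n ** 0.5   # sigma / sqrt(n), about 0.053
```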
3. Descriptive statistics
Calculate and interpret measures of dispersion and central tendency and construct and interpret frequency distributions and cumulative frequency distributions.
4. Graphical methods
Construct and interpret diagrams and charts, including box-and-whisker plots, run charts, scatter diagrams, histograms, normal probability plots, etc.
5. Valid statistical conclusions
Define and distinguish between enumerative (descriptive) and analytic (inferential) statistical studies and evaluate their results to draw valid conclusions.
E. Probability
1. Basic concepts
Describe and apply probability concepts such as independence, mutually exclusive events, multiplication rules, complementary probability, joint occurrence of events, etc.
2. Commonly used distributions
Describe, apply and interpret the following distributions: normal, Poisson, binomial, chi square, Student’s t and F distributions.
3. Other distributions
Describe when and how to use the following distributions: bimodal, exponential, lognormal and Weibull.
F. Process capability
1. Process capability indices
Define, select and calculate Cp and Cpk to assess process capability.
2. Process performance indices
Define, select and calculate Pp, Ppk and Cpm to assess process performance.
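A minimal sketch of the index arithmetic (the spec limits and process figures are invented for illustration; Pp and Ppk use the same formulas as Cp and Cpk, but with the overall long-term sigma instead of the within-subgroup sigma):

```python
def cp(usl, lsl, sigma):
    """Cp: potential capability, ignoring process centering."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Cpk: actual capability, penalizing an off-center process."""
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Example: spec 10 +/- 0.6, process mean 10.1, within-subgroup sigma 0.1.
example_cp = cp(10.6, 9.4, 0.1)           # 1.2 / 0.6 = 2.0
example_cpk = cpk(10.6, 9.4, 10.1, 0.1)   # min(0.5, 0.7) / 0.3, about 1.67
```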
3. Short-term and long-term capability
Describe and use appropriate assumptions and conventions when only short-term data or attributes data are available and when long-term data are available. Interpret the relationship between long-term and short-term capability.
4. Process capability for non-normal data
Identify non-normal data and determine when it is appropriate to use Box-Cox or other transformation techniques.
5. Process capability for attributes data
Calculate the process capability and process sigma level for attributes data.
6. Process capability studies
Describe and apply elements of designing and conducting process capability studies, including identifying characteristics and specifications, developing sampling plans and verifying stability and normality.
7. Process performance vs. specification
Distinguish between natural process limits and specification limits, and calculate process performance metrics such as percent defective, parts per million (PPM), defects per million opportunities (DPMO), defects per unit (DPU), process sigma, rolled throughput yield (RTY), etc.
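These yield and defect metrics are simple arithmetic; a hedged sketch (the example counts are hypothetical, and the 1.5-sigma shift is the usual convention rather than a requirement):

```python
import math
from statistics import NormalDist

def dpu(defects, units):
    """Defects per unit."""
    return defects / units

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def rty(*first_pass_yields):
    """Rolled throughput yield: product of the yields of each process step."""
    return math.prod(first_pass_yields)

def sigma_level(dpmo_value, shift=1.5):
    """Short-term process sigma, using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + shift

# Example: 15 defects in 1000 units with 5 opportunities each -> 3000 DPMO.
example_dpmo = dpmo(15, 1000, 5)
example_rty = rty(0.98, 0.97, 0.99)   # about 0.941
```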
VI. Analyze
A. Measuring and modeling relationships between variables
1. Correlation coefficient
Calculate and interpret the correlation coefficient and its confidence interval, and describe the difference between correlation and causation. NOTE: Serial correlation will not be tested.
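A sketch of the calculation, using the Fisher z-transformation for the approximate confidence interval (the data and helper names are invented for illustration):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient, computed from sums of squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def r_confidence_interval(r, n, z_crit=1.96):
    """Approximate 95% CI for r via the Fisher z-transformation."""
    z = math.atanh(r)
    se = 1 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# Invented data: y is close to 2x, so r should be near +1.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 13.9, 16.1]
r = pearson_r(x, y)
ci = r_confidence_interval(r, len(x))
```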
2. Regression
Calculate and interpret simple and multiple linear regression, and apply and interpret hypothesis tests for regression statistics. Use the regression model for estimation and prediction, analyze the uncertainty in the estimate, and perform a residuals analysis to validate the model. NOTE: Models that have non-linear parameters will not be tested.
3. Multivariate tools
Use and interpret multivariate tools such as principal components, factor analysis, discriminant analysis, multiple analysis of variance (MANOVA), etc., to investigate sources of variation.
4. Multi-vari studies
Use and interpret charts of these studies and determine the difference between positional, cyclical and temporal variation.
5. Attributes data analysis
Analyze attributes data using logit, probit, logistic regression, etc., to investigate sources of variation.
B. Hypothesis testing
1. Terms and concepts
Define and interpret the significance level, power, and type I and type II errors of statistical tests.
2. Statistical vs. practical significance
Define, compare and interpret statistical and practical significance.
3. Sample size
Calculate sample size for common hypothesis tests (e.g., equality of means, equality of proportions, etc.).
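For equality of two means with a known common sigma, the normal-approximation formula n = 2((z_alpha/2 + z_beta) * sigma / delta)^2 per group can be sketched as follows (the function name is mine):

```python
import math
from statistics import NormalDist

def n_per_group(alpha, power, sigma, delta):
    """Per-group sample size for a two-sided, two-sample test of equal means
    (normal approximation, known common sigma, detectable shift delta)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Example: detect a shift of half a standard deviation
# at alpha = 0.05 with 80% power -> 63 per group.
n = n_per_group(0.05, 0.80, sigma=1.0, delta=0.5)
```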
4. Point and interval estimates
Define and distinguish between confidence and prediction intervals. Define and interpret the efficiency and bias of estimators. Calculate tolerance and confidence intervals.
5. Tests for means, variances and proportions
Use and interpret the results of hypothesis tests for means, variances and proportions.
6. Analysis of variance (ANOVA)
Select, calculate and interpret the results of ANOVAs.
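The one-way ANOVA F statistic can be computed directly from sums of squares; a minimal sketch with invented data:

```python
import statistics

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square
    divided by within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = statistics.fmean(x for g in groups for x in g)
    ss_between = sum(len(g) * (statistics.fmean(g) - grand_mean) ** 2
                     for g in groups)
    ss_within = sum((x - statistics.fmean(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Three groups with clearly separated means and small within-group spread,
# so F should be large.
groups = [[5.1, 4.9, 5.0], [5.6, 5.4, 5.5], [4.6, 4.4, 4.5]]
f_stat = one_way_anova_f(groups)
```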
7. Goodness-of-fit (chi square) tests
Define, select and interpret the results of these tests.
8. Contingency tables
Select, develop and use contingency tables to determine statistical significance.
9. Non-parametric tests
Select, develop and use various non-parametric tests, including Mood’s Median, Levene’s test, Kruskal-Wallis, Mann-Whitney, etc.
C. Failure mode and effects analysis (FMEA)
Describe the purpose and elements of FMEA, including risk priority number (RPN), and evaluate FMEA results for processes, products and services.
Distinguish between design FMEA (DFMEA) and process FMEA (PFMEA), and interpret results from each.
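The RPN itself is just the product of three ratings; a small illustrative sketch (the failure modes and ratings below are hypothetical):

```python
def rpn(severity, occurrence, detection):
    """Risk priority number: each factor is typically rated 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes with invented ratings.
failure_modes = {
    "label misprint": rpn(7, 4, 3),      # 84
    "seal leak": rpn(9, 2, 5),           # 90
    "wrong part shipped": rpn(8, 3, 2),  # 48
}
# The highest-RPN mode is addressed first.
highest_risk = max(failure_modes, key=failure_modes.get)
```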
D. Additional analysis methods
1. Gap analysis
Use various tools and techniques (gap analysis, scenario planning, etc.) to compare the current and future state in terms of pre-defined metrics.
2. Root cause analysis
Define and describe the purpose of root cause analysis, recognize the issues involved in identifying a root cause, and use various tools (e.g., the 5 whys, Pareto charts, fault tree analysis, cause and effect diagrams, etc.) for resolving chronic problems.
3. Waste analysis
Identify and interpret the 7 classic wastes (overproduction, inventory, defects, over-processing, waiting, motion and transportation) and other forms of waste such as resource under-utilization, etc.
E. Design of experiments (DOE)
1. Terminology
Define basic DOE terms, including independent and dependent variables, factors and levels, response, treatment, error, etc.
2. Design principles
Define and apply DOE principles, including power and sample size, balance, repetition, replication, order, efficiency, randomization, blocking, interaction, confounding, resolution, etc.
3. Planning experiments
Plan, organize and evaluate experiments by determining the objective, selecting factors, responses and measurement methods, choosing the appropriate design, etc.
4. One-factor experiments
Design and conduct completely randomized, randomized block and Latin square designs and evaluate their results.
5. Two-level fractional factorial experiments
Design, analyze and interpret these types of experiments and describe how confounding affects their use.
6. Full factorial experiments
Design, conduct and analyze full factorial experiments.
VII. Improve
A. Waste elimination
Select and apply tools and techniques for eliminating or preventing waste, including pull systems, kanban, 5S, standard work, poka-yoke, etc.
B. Cycle-time reduction
Use various tools and techniques for reducing cycle time, including continuous flow, single-minute exchange of die (SMED), etc.
C. Kaizen and kaizen blitz
Define and distinguish between these two methods and apply them in various situations.
D. Theory of constraints (TOC)
Define and describe this concept and its uses.
E. Implementation
Develop plans for implementing the improved process (i.e., conduct pilot tests, simulations, etc.), and evaluate results to select the optimum solution.
F. Risk analysis and mitigation
Use tools such as feasibility studies, SWOT analysis (strengths, weaknesses, opportunities and threats), PEST analysis (political, environmental, social and technological) and consequential metrics to analyze and mitigate risk.
VIII. Control
A. Statistical process control (SPC)
1. Objectives
Define and describe the objectives of SPC, including monitoring and controlling process performance, tracking trends, runs, etc., and reducing variation in a process.
2. Selection of variables
Identify and select critical characteristics for control chart monitoring.
3. Rational subgrouping
Define and apply the principle of rational subgrouping.
4. Control chart selection
Select and use the following control charts in various situations: X-bar and R, X-bar and s, individuals and moving range (ImR), attribute charts (p, np, c, u), short-run SPC and moving average.
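For an X-bar/R chart the limits come from tabled constants; a sketch for subgroups of size 5 (the data are invented; A2, D3 and D4 are the standard table values for n = 5):

```python
import statistics

# Standard control chart constants for subgroup size n = 5.
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Lower limit, center line and upper limit for the X-bar and R charts
    (assumes every subgroup has size 5, matching the constants above)."""
    xbar = statistics.fmean(statistics.fmean(s) for s in subgroups)
    rbar = statistics.fmean(max(s) - min(s) for s in subgroups)
    return {
        "xbar": (xbar - A2 * rbar, xbar, xbar + A2 * rbar),
        "range": (D3 * rbar, rbar, D4 * rbar),
    }

# Invented subgroup data.
subgroups = [[10.1, 9.9, 10.0, 10.2, 9.8],
             [10.0, 10.1, 9.9, 10.0, 10.0],
             [9.9, 10.2, 10.0, 9.8, 10.1]]
limits = xbar_r_limits(subgroups)
```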
5. Control chart analysis
Interpret control charts and distinguish between common and special causes using rules for determining statistical control.
B. Other control tools
1. Total productive maintenance (TPM)
Define the elements of TPM and describe how it can be used to control the improved process.
2. Visual factory
Define the elements of a visual factory and describe how they can help control the improved process.
C. Maintain controls
1. Measurement system re-analysis
Review and evaluate measurement system capability as process capability improves, and ensure that measurement capability is sufficient for its intended use.
2. Control plan
Develop a control plan for ensuring the ongoing success of the improved process including the transfer of responsibility from the project team to the process owner.
D. Sustain improvements
1. Lessons learned
Document the lessons learned from all phases of a project and identify how improvements can be replicated and applied to other processes in the organization.
2. Training plan deployment
Develop and implement training plans to ensure continued support of the improved process.
3. Documentation
Develop or modify documents including standard operating procedures (SOPs), work instructions, etc., to ensure that the improvements are sustained over time.
4. Ongoing evaluation
Identify and apply tools for ongoing evaluation of the improved process, including monitoring for new constraints, additional opportunities for improvement, etc.
IX. Design for Six Sigma (DFSS) Frameworks and Methodologies
A. Common DFSS methodologies
Identify and describe these methodologies.
1. DMADV (define, measure, analyze, design and validate)
2. ICOV (identify customer needs, characterize design, optimize design and validate)