Monday 29 April 2019

Integrating Lean and Six Sigma

Both the Lean and the Six Sigma methodologies have proven over the last twenty years that it is possible to achieve dramatic improvements in cost, quality, and time by focusing on process performance. Whereas Six Sigma is focused on reducing variation and improving process yield by following a problem-solving approach using statistical tools, Lean is primarily concerned with eliminating waste and improving flow by following the Lean principles and a defined approach to implement each of these principles.


The impressive results companies such as Toyota, General Electric, Motorola, and many others have accomplished using either one of them have inspired many other firms to follow their example. As a result, most companies have either a Lean or Six Sigma program in place. However, using either one of them alone has limitations: Six Sigma will eliminate defects but it will not address the question of how to optimize process flow; and the Lean principles exclude the advanced statistical tools often required to achieve the process capabilities needed to be truly ‘lean’. Therefore, most practitioners consider these two methods as complementing each other. And while each approach can result in dramatic improvement, utilizing both methods simultaneously holds the promise of being able to address all types of process problems with the most appropriate toolkit. For example, inventory reduction not only requires reducing batch sizes and linking operations by using Lean, but also minimizing process variation by utilizing Six Sigma tools.

Therefore, many firms are looking for an approach that combines both methodologies into an integrated system or improvement roadmap. However, the differences between Six Sigma and Lean are profound:

Aspect | Lean | Six Sigma
Goal | Create flow and eliminate waste | Improve process capability and eliminate variation
Application | Primarily manufacturing processes | All business processes
Approach | Teaching principles and “cookbook style” implementation based on best practice | Teaching a generic problem-solving approach relying on statistics
Project Selection | Driven by value stream map | Various approaches
Length of Projects | 1 week to 3 months | Typically a few months
Infrastructure | Mostly ad-hoc, no or little formal training | Dedicated resources, broad-based training
Training | Learning by doing | Learning by doing

Developing an integrated improvement program that incorporates both Lean and Six Sigma tools requires more than including a few Lean principles in a Six Sigma curriculum or training Lean Experts as Black Belts. An integrated improvement strategy has to take into consideration the differences and use them effectively:

◈ Lean projects are very tangible, visible, and can oftentimes be completed within a few days (whereas Six Sigma projects typically require a few months). An integrated approach should emphasize Lean projects during the initial phase of the deployment to increase momentum.

◈ Lean emphasizes broad principles coupled with practical recommendations to achieve improvements. For example, Lean suggests a technique to analyze and reduce changeover time that does not require sophisticated analysis and tools. However, Lean principles are oftentimes inadequate to solve some of the more complicated problems that require advanced analysis. Therefore, Six Sigma needs to be introduced during the first year of the deployment to ensure that the improvement roadmap includes a generic problem-solving approach.

◈ An integrated improvement program needs to be fueled by a vision of the future state and by a pipeline of specific projects that will help close the gap between current and future state. Lean introduced value stream mapping as the central tool to identify the gaps and to develop a list of projects that can be tackled using Lean or Six Sigma methodology.

◈ Whereas the Six Sigma process and tools can be applied to virtually every process and industry, the Lean approach is much more specific and the content needs to be adjusted to industry needs: For example, reducing set-up time in a plant that has lines dedicated to a single product is pointless. Therefore, the Lean curriculum needs to be adjusted to meet the needs of the specific business.

◈ Training is effective but only when combined with application. Lean principles are typically taught as separate workshops, with each workshop combining a short training session on the principle with direct application on the shop floor. Six Sigma training is broken down into the phases of the DMAIC (Define, Measure, Analyze, Improve, Control) process with time between each training session to apply the tools learned to the project. The extensive analysis required for Six Sigma projects suggests that a workshop structure as used for Lean training would not be effective.

The integrated approach to process improvement (using Lean and Six Sigma) will include:

◈ Using value stream mapping to develop a pipeline of projects that lend themselves either to applying Six Sigma or Lean tools.

◈ Teaching Lean principles first to increase momentum, introducing the Six Sigma process later on to tackle the more advanced problems.

◈ Adjusting the content of the training to the needs of the specific organization – while some manufacturing locations could benefit from implementing the Lean principles with respect to housekeeping, others will have these basics already in place and will be ready for advanced tools.

The following example shows how one could approach the integration of Lean and Six Sigma into a comprehensive improvement roadmap.


Integrating Lean and Six Sigma Roadmap

From a training perspective, the Lean principles would be taught first, using the simpler projects identified through the value stream map as training projects for the Lean workshops. A Black Belt therefore would learn how to apply these Lean principles working on a real-life problem. In addition, a Lean Black Belt would complete a large Lean project over the course of the training to become certified. The Six Sigma process would be introduced once the Lean principles have been taught. Again, the training participants would work on one specific project identified by value stream mapping.

As a result, a Lean Black Belt in this example would receive in total 30 days of classroom training, would participate in five Lean workshops, and complete one large Lean and one large Six Sigma project over the course of one year. Such a Black Belt would be capable of applying Lean and Six Sigma tools to a variety of business problems and choosing the appropriate approach to address the problem at hand.

Saturday 27 April 2019

Introducing SixSigma3.0

It is a time of change. The world is developing faster with each passing day. Continuous process improvement technologies, which could make these changes more significant and effective, trail far behind this trend. Who suffers? Companies that lose out to the competition, people who cannot find a new place in the changed labor market, and countries that waste resources that could go toward creating a new economy. A new, efficient business process improvement methodology is a world priority.


Over the last three years, the Association Six Sigma (in Russia) has been working to create a new version of Six Sigma that is capable of adequately meeting new challenges. But first, a review of Six Sigma’s history.

Evolution of Six Sigma


Initially, Six Sigma originated at Motorola in 1986; I call this Six Sigma 1.0. Then, over the course of 15 years, the methodology quickly spread to the largest corporations, which continued to work on it, adding tools and approaches from the experience of the Toyota Production System and theory of constraints (TOC). With the proliferation of computers and dramatic increase in productivity, many Six Sigma tasks can be solved in more efficient ways today than in years past. There was a need to update Six Sigma.

Several disparate attempts were made to create a new version of Six Sigma. These were Lean Six Sigma, Green Belts v.2.0, Six Sigma 2.0. But in these versions, nothing fundamentally changed. Therefore, we propose SixSigma3.0 as the beginning of a new stage in the development of the methodology, in which there will be a single standard, fully upgraded toolset and an integrated mechanism for regular updates. To distinguish this new version of Six Sigma from all other variations, the name should be written as a single word without spaces.

What SixSigma3.0 Should Be


SixSigma3.0 should become a disciplined, professional business process optimization methodology aimed at producing measurable effect. The name is important – written as one word so that clients do not have to guess which methodology they are using. All developments in the methodology should be registered in a centralized standard and distributed universally as upgrades by an authorized international organization – in the manner that computer software is upgraded. We hope that one of the internationally recognized Six Sigma certifying bodies (ASQ, IASSC, ISSSP) will be interested in taking on this role.

SixSigma3.0 Methodology


The SixSigma3.0 methodology should include only the most effective concepts, technologies and tools that have a proven track record of efficient application, chosen from the continuous improvement systems that exist in this field: Six Sigma, Lean, Kaizen and TOC. Handpicked tools should be organically incorporated into the SixSigma3.0 methodology. Most of the tools require refinement to shorten the learning curve and improve the efficiency of practical use.

The main objectives of the new methodology are:

1. Building logic algorithms that help the practitioner achieve missions as well as develop mechanisms for continuous data flow and seamless exchange of results between tools – eliminating the need to re-enter data. This will help to automate the process improvement professional’s job in the future.

2. Replacement of continuous improvement terminology with the terminology of computer games.

3. Projects will be replaced with Missions or combat tasks. The practitioner’s obligation is to achieve the Mission under any conditions. DMAIC (Define, Measure, Analyze, Improve, Control) will be replaced by 10 Mission Operational Tasks (MOTs), some of which will be performed in parallel and some in sequence. This approach removes many of the problems with the implementation of traditional Lean Six Sigma, such as blurred responsibilities, unjustified hopes for management assistance and doubts about what to do in each phase of DMAIC.

Toolset for SixSigma3.0


Divide the existing tools of continuous improvement into two groups: Basic Kit and Advanced Toolset. The Basic Kit should include the most effective tools that the Effector must master professionally, with a guarantee that this is his or her confirmed minimal competence level. When hiring, the employer should be guaranteed this competency level.

The development of the Basic Kit should be entrusted to an international organization. The Advanced Set will include effective tools specially modified for practical use after thorough analysis.

Effector – New Name for Old Profession


We propose to name the new occupation Effector – a professional in business process improvement methodologies capable of achieving optimization effect. Without a short recognizable name, the occupation lacks drive and energy.

No more Yellow, White, Red, Green or Black Belts; no sensei, operational excellence specialists, Lean leaders or Champions – no more confusion.

◈ Effector: a specialist in systems for improving business processes and operational efficiency.
◈ Effector-in-Chief: manager responsible for operational excellence in the organization.

The SixSigma3.0 Effector should play a significantly more important role than Green and Black Belts in traditional Six Sigma.

The weakness of the traditional approach is a shift in responsibility for the success of projects toward leadership support, corporate culture and other external circumstances. In fact, the entire responsibility for the success of the projects should lie with the SixSigma3.0 Effectors. An Effector should be able to perform the task in any conditions. Only in this way can the SixSigma3.0 methodology take root in an organization and benefit it.

The selection and training of Effectors should be based on the following principles:

◈ Strict prior assessment
◈ Special motivation
◈ Total mastery of the methodology and Basic Kit
◈ Continuous study and use of tools from the Advanced Toolset
◈ View each project as a combat mission
◈ High discipline


Training and Certifying Effectors


The duration of basic training should be limited to three to four days (or 20 training hours), which is the requirement of most customers. Training should be more practical and applicable to real world situations. A book of knowledge should be constantly updated with practical experience.

It is necessary to simplify the certification procedure. After successful examination in an authorized organization, the trainee will receive a temporary certification for a period of one year. Certification must be confirmed after the successful implementation of an operational task; confirmation will be granted by a committee of three or more certified Effectors from the authorized organization, who will assess the results of the operational task.

The Mission of SixSigma3.0


Unlike traditional Six Sigma, the optimization tasks in SixSigma3.0 are set by management as Missions.
The Mission is the most important element of the SixSigma3.0 methodology. A Mission is a task with any type of goal: problem solution, streamlining of business process, or improvement of high-level corporate key performance indicators (KPIs) or certain process KPIs.

Unlike a project manager, the Mission leader can use various approaches to achieve the task: project management planning, fast just-do-it projects, a Kaizen blitz or Agile tools (Scrum and sprint).

Solving business problems as missions could add additional energy to operational efficiency initiatives and increase the responsibility of Mission leaders for their accomplishment. The mental attitude of the leader and the team should be focused on unconditional success.

Mission goals and inputs are concisely and comprehensively described in a Mission Brief.

No More DMAIC


At the heart of Lean Six Sigma’s methodology is a five-step solution to business problems – DMAIC. The experience of applying DMAIC demonstrates that the methodology itself needs improvement. Its steps are not equivalent in weight, and its prescribed sequence of actions and tools often leads to difficulties in projects.

We suggest that the improvement mission in SixSigma3.0 be guided by Mission Operational Tasks (MOTs), which are executed sequentially, but with the possibility of returns and parallel implementation. Each operational task is a key factor in the successful implementation of the mission.

10 Mission Operational Tasks (MOTs)

  1. Scanning: Scan the business landscape to identify areas of inefficiency and missed opportunities.
  2. Capture the goal: Determine the reason for an optimization Mission and identify target business process, main metrics (KPIs) and target values.
  3. Execution: Ensure support of the Mission by the company’s leadership, secure necessary resources, organize an efficient Mission implementation team and choose the appropriate project execution approach (project, just do it, Agile, Scrum, sprint, etc.).
  4. Process: Research the Mission’s target process, process baseline parameters, levels of KPIs, regulatory requirements and customer needs.
  5. Data: Analyze the Mission’s needs for data and install a data collection system.
  6. Analysis: Search for problems in the process that impede the implementation of the Mission’s goals, identify cause-and-effect relationships, factors of influence and root causes.
  7. Development: Develop solution packages for troubleshooting problems from the registry of root problems using modern business improvement and design technologies.
  8. Implementation: Test the developed solutions for mistakes and inefficiencies before full-scale implementation/deployment.
  9. Control: Develop a set of measures for monitoring and fixing implemented solutions.
  10. Report: Prepare and present the Mission Report to top management, covering the economic effect, lessons learned and opportunities to apply the solutions to other business areas.

Nine Reasons for Switching to SixSigma3.0

  1. Revive the brand in new conditions. Currently we see a decline in the interest in continuous improvement methodologies.
  2. Create a new occupation in the field of management and make it catch on (the occupation lacks short and recognizable names like lawyer, manager, auditor, etc.) We propose a new name – Effector.
  3. Improve efficiency compared to other continuous improvement methodologies, which are weighed down by too much unnecessary knowledge and a lack of professionalism.
  4. Ensure continuous development of the methodology and its relevance in a changing world.
  5. Maintain a single basic level of training for operational improvement specialists, as opposed to the existing system in which every organization has its own program and approaches to Belt training.
  6. Revive interest in improving business processes. For psychological reasons, there is an undeserved lack of interest in process improvement among millennials and Generation Z.
  7. Initiate emergence of SixSigma3.0 professionals’ community. Today, operational excellence professionals are separated by their adherence to different methodologies.
  8. Develop and constantly upgrade a SixSigma3.0 standard based on application practices and research developments. The existing Six Sigma standards never change.
  9. Develop professional software that provides automation of the methodology and direct exchange of data and calculation results between tools. Currently, Six Sigma education is based on manual calculations and standalone tools – we must create demand for advanced Six Sigma software.

Friday 26 April 2019

Understanding Statistical Distributions for Six Sigma

Many consultants remember the hypothesis testing roadmap, which was a great template for deciding what type of test to perform. However, think about the type of data one gets. What if there is only summarized data? How can that data be used to make conclusions? Having the raw data is the best case scenario, but if it is not available, there are still tests that can be performed.

In order to not only look at data, but also interpret it, consultants need to understand distributions. This article discusses how to:

◈ Understand different types of statistical distributions.
◈ Understand the uses of different distributions.
◈ Make assumptions given a known distribution.

Six Sigma Green Belts receive training focused on shape, center and spread. The concept of shape, however, is limited to just the normal distribution for continuous data. This article will expand upon the notion of shape, described by the distribution (for both the population and sample).

Getting Back to the Basics


With probability, statements are made about the chances that certain outcomes will occur, based on an assumed model. With statistics, observed data is used to determine a model that describes this data. This model relates to the distribution of the data. Statistics moves from the sample to the population while probability moves from the population to the sample.

Inferential statistics is the science of describing population parameters based on sample data. Inferential statistics can be used to:

◈ Establish a process capability (determine defects per million).
◈ Utilize distributions to estimate the probability of a variable occurring given known parameters.

Inferential statistics are based on a normal distribution.


Figure 1: Normal Curve and Probability Areas

The normal distribution can be expanded on to learn about other distributions. The appropriate distribution can be assigned based on an understanding of the process being studied, in conjunction with the type of data being collected and the dispersion or shape of the distribution. This understanding can assist with determining the best analysis to perform.

Types of Distributions


Distributions are classified in the same ways as data is classified – continuous and discrete: 

◈ Continuous probability distributions are probabilities associated with random variables that are able to assume any of an infinite number of values along an interval.
◈ Discrete probability distributions are listings of all possible outcomes of an experiment, along with their respective probabilities of occurrence. 

Distribution Descriptions


Probability mass function (pmf) – For discrete variables, the pmf is the probability that a variate takes the value x.

Probability density function (pdf) – For continuous variables, the pdf is the probability that a variate assumes the value x, expressed in terms of an integral between two points. 

In the continuous sense, one cannot give a probability of a specific x on a continuum – it will be some specific (and small) range. For additional insight, think of x + Δx where Δx is small. 

The notation for the pdf is f(x). For discrete distributions: 

f(x) = P(X = x)

Some refer to this as the probability mass function, since it is evaluating the probability upon that one discrete mass. For continuous distributions, one mass cannot be established. 

Cumulative distribution function (cdf) – The probability that a variable takes a value less than or equal to x.


Figure 2: Normal Distribution Cdf

The cdf progresses to a value of 1 because there cannot be a probability greater than 1. Once again, the cdf is F(x) = P(X ≤ x). This holds for both continuous and discrete distributions. 
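To make the pmf/cdf distinction concrete, here is a minimal sketch, assuming Python with scipy is available (the article itself uses no code, and the parameter values are purely illustrative). It evaluates a pmf at a single discrete value and a cdf for a continuous variable, including the probability over a small interval from x to x + Δx.

```python
# Sketch: pmf for a discrete variable, pdf/cdf for a continuous one.
# Assumes Python with scipy installed; all values are illustrative only.
from scipy import stats

# Discrete: f(x) = P(X = x) for a binomial variable (10 trials, p = 0.2).
print(stats.binom.pmf(2, n=10, p=0.2))     # probability of exactly 2 "successes"

# Continuous: a single point carries no probability mass, so use the cdf,
# F(x) = P(X <= x), or the probability over a small interval x to x + dx.
z = stats.norm(loc=0, scale=1)
print(z.cdf(1.0))                          # P(X <= 1.0), about 0.8413
print(z.cdf(1.01) - z.cdf(1.0))            # P(1.0 < X <= 1.01), a small range
```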

Parameters


A parameter is a population description. Consultants rely on parameters to characterize the distributions. There are three parameters: 

◈ Location parameter – the lower or midpoint (as prescribed by the distribution) of the range of the variate (think of the mean)
◈ Scale parameter – determines the scale of measurement for x (magnitude of the x-axis scale) (think of the standard deviation)
◈ Shape parameter – defines the pdf shape within a family of shapes

Not all distributions have all three parameters. For example, the normal distribution has just the mean (location) and standard deviation (scale). Only those two need to be known to describe a normal population. 
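As a small sketch of how these parameters show up in practice (assuming Python with scipy; the numbers are made up), the normal distribution is frozen with only a location and a scale, while a family such as the Weibull also takes a shape argument.

```python
# Sketch: location, scale and shape parameters as exposed by scipy.
# Assumes scipy; the parameter values are illustrative only.
from scipy import stats

# Normal: fully described by location (mean) and scale (standard deviation).
normal = stats.norm(loc=50, scale=2)

# Weibull: adds a shape parameter (c) on top of location and scale.
weibull = stats.weibull_min(c=1.5, loc=0, scale=100)

print(normal.mean(), normal.std())     # 50.0 2.0
print(weibull.median())                # depends on both shape and scale
```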

Summary of Distributions


The remaining portion of this article will summarize the various shapes, basic assumptions and uses of distributions. Keep in mind that there is a different pdf and different distribution parameters associated with each. 

Normal Distribution (Gaussian Distribution)



Figure 3: Normal Distribution Shape

Basic assumptions: 

◈ Symmetrical distribution about the mean (bell-shaped curve)
◈ Commonly used in inferential statistics
◈ Family of distributions characterized by μ and σ

Uses include: 

◈ Probabilistic assessments of measurements that cluster symmetrically around a mean (e.g., dimensions, weights, cycle times)
◈ Basis for control charts, process capability studies and many inferential tests
◈ Approximating the distribution of sample means (central limit theorem), even when the underlying data is not normal

Exponential Distribution



Figure 4: Exponential Distribution Shape

Basic assumptions: 

◈ Family of distributions characterized by its mean, μ
◈ Distribution of time between independent events occurring at a constant rate
◈ Mean is the inverse of the Poisson distribution’s rate
◈ Shape can be used to describe failure rates that are constant as a function of usage

Uses include probabilistic assessments of: 

◈ Mean time between failure (MTBF)
◈ Arrival times
◈ Time, distance or space between occurrences of the events of interest
◈ Queuing or wait-line theories
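As a hedged illustration of the MTBF use above, here is a minimal sketch assuming Python with scipy and a made-up 400-hour mean time between failures (not a figure from the article).

```python
# Sketch: exponential model of time between failures.
# Assumes scipy; the 400-hour MTBF is a made-up figure for illustration.
from scipy import stats

mtbf = 400.0                       # mean time between failures, in hours
ttf = stats.expon(scale=mtbf)      # for the exponential, scale equals the mean

print(ttf.cdf(100))                # P(failure within 100 hours)
print(ttf.sf(1000))                # P(surviving past 1,000 hours)
print(ttf.mean())                  # 400.0, the inverse of the constant failure rate
```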

Lognormal Distribution



Figure 5: Lognormal Distribution Shape

Basic assumptions:

◈ Asymmetrical and positively skewed distribution that is constrained by zero
◈ Distribution can exhibit many pdf shapes
◈ Describes data that has a large range of values
◈ Can be characterized by μ and σ (of the logged data)

Uses include simulations of: 

◈ Distribution of wealth
◈ Machine downtimes
◈ Duration of time
◈ Phenomena that have a positive skew (tails to the right)
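A quick simulation sketch of the machine-downtime use above (assuming Python with numpy; the log-scale parameters are invented) shows the positive skew and the zero lower bound.

```python
# Sketch: simulating positively skewed machine downtimes with a lognormal.
# Assumes numpy; mean=1.0 and sigma=0.5 (on the log scale) are made-up values.
import numpy as np

rng = np.random.default_rng(seed=1)
downtimes = rng.lognormal(mean=1.0, sigma=0.5, size=1000)   # e.g., minutes

print(downtimes.mean(), np.median(downtimes))   # mean pulled above the median (skew)
print(downtimes.min())                          # bounded below by zero
```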

Weibull Distribution



Figure 6: Weibull Distribution Pdf

Basic assumptions: 

◈ Family of distributions
◈ Can be used to describe many types of data
◈ Fits many common distributions (normal, exponential and lognormal)
◈ The differing factors are the scale and shape parameters

Uses include: 

◈ Lifetime distributions
◈ Reliability applications
◈ Failure probabilities that vary over time
◈ Can describe burn-in, random, and wear-out phases of a life cycle (bathtub curve)
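The bathtub-curve use above can be sketched by varying the shape parameter (assuming Python with scipy; the 1,000-hour scale is illustrative): a shape below 1 corresponds to burn-in, 1 to random (exponential) failures, and above 1 to wear-out.

```python
# Sketch: Weibull shape parameter and the bathtub-curve phases.
# Assumes scipy; the scale of 1,000 hours is illustrative only.
from scipy import stats

for shape in (0.5, 1.0, 2.0):   # <1 burn-in, =1 random (exponential), >1 wear-out
    life = stats.weibull_min(c=shape, scale=1000)
    # Fraction of units expected to fail by 500 hours under this shape:
    print(shape, round(life.cdf(500), 3))
```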

Binomial Distribution



Figure 7: Binomial Distribution Shape

Basic assumptions: 

◈ Discrete distribution
◈ Number of trials are fixed in advance
◈ Just two outcomes for each trial
◈ Trials are independent
◈ All trials have the same probability of occurrence

Uses include: 

◈ Estimating the probabilities of an outcome in any set of success or failure trials
◈ Sampling for attributes (acceptance sampling)
◈ Number of defective items in a batch size of n
◈ Number of items in a batch
◈ Number of items demanded from an inventory
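For the acceptance-sampling use above, here is a minimal sketch assuming Python with scipy; the sample size of 50 and 2 percent defect rate are made-up figures.

```python
# Sketch: binomial model for the number of defectives in a fixed sample.
# Assumes scipy; n = 50 items and p = 0.02 defect rate are illustrative.
from scipy import stats

n, p = 50, 0.02
defects = stats.binom(n=n, p=p)

print(defects.pmf(0))      # probability of zero defectives in the sample
print(defects.pmf(2))      # probability of exactly two defectives
print(defects.sf(2))       # probability of more than two defectives
```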

Geometric



Figure 8: Geometric Distribution Pdf

Basic assumptions: 

◈ Discrete distribution
◈ Just two outcomes for each trial
◈ Trials are independent
◈ All trials have the same probability of occurrence
◈ Waiting time until the first occurrence

Uses include: 

◈ Number of failures before the first success in a sequence of trials with probability of success p for each trial
◈ Number of items inspected before finding the first defective item – for example, the number of interviews performed before finding the first acceptable candidate 
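A minimal sketch of the inspection example above, assuming Python with scipy and a made-up 5 percent defective rate. Note that scipy's geom counts the trial on which the first "success" occurs, which maps directly to "items inspected until the first defective."

```python
# Sketch: geometric model of inspections until the first defective item.
# Assumes scipy; p = 0.05 (5% defective) is an illustrative rate.
from scipy import stats

p = 0.05
trials = stats.geom(p=p)    # counts the trial on which the first "success" occurs

print(trials.pmf(10))       # P(first defective is found on the 10th item)
print(trials.cdf(20))       # P(first defective found within 20 items)
print(trials.mean())        # 1/p = 20 items inspected on average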

Negative Binomial



Figure 9: Negative Binomial Distribution Pdf

Basic assumptions: 

◈ Discrete distribution
◈ Predetermined number of occurrences – s
◈ Just two outcomes for each trial
◈ Trials are independent
◈ All trials have the same probability of occurrence

Uses include: 

◈ Number of failures before the sth success in a sequence of trials with probability of success p for each trial
◈ Number of good items inspected before finding the sth defective item
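A sketch of the "failures before the s-th success" use above, assuming Python with scipy; s = 3 and p = 0.1 are invented values. scipy's nbinom counts failures before the s-th success, matching the description above.

```python
# Sketch: negative binomial model of failures before the s-th success.
# Assumes scipy; s = 3 successes and p = 0.1 per trial are illustrative.
from scipy import stats

s, p = 3, 0.1
failures = stats.nbinom(n=s, p=p)   # counts failures before the s-th success

print(failures.pmf(10))             # P(exactly 10 failures before the 3rd success)
print(failures.mean())              # s * (1 - p) / p = 27 expected failures
```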

Poisson Distribution



Figure 10: Poisson Distribution Pdf

Basic assumptions: 

◈ Discrete distribution
◈ Length of the observation period (or area) is fixed in advance
◈ Events occur at a constant average rate
◈ Occurrences are independent
◈ Rare event

Uses include: 

◈ Number of events in an interval of time (or area) when the events are occurring at a constant rate
◈ Number of items in a batch of random size
◈ Design reliability tests where the failure rate is considered to be constant as a function of usage
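A sketch of the event-count use above, assuming Python with scipy and a made-up average of 1.2 events per interval.

```python
# Sketch: Poisson model of event counts in a fixed interval.
# Assumes scipy; an average of 1.2 events per interval is an illustrative rate.
from scipy import stats

rate = 1.2
counts = stats.poisson(mu=rate)

print(counts.pmf(0))                 # P(no events in the interval)
print(counts.sf(3))                  # P(more than 3 events)
print(counts.mean(), counts.var())   # mean and variance both equal the rate
```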

Hypergeometric


Shape is similar to Binomial/Poisson distribution.

Basic assumptions:

◈ Discrete distribution
◈ Number of trials are fixed in advance
◈ Just two outcomes for each trial
◈ Sampling is done without replacement, so trials are not independent
◈ This is an exact distribution – the Binomial and Poisson are approximations to this
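Because the hypergeometric is the exact distribution for sampling without replacement, a quick sketch (assuming Python with scipy; the lot and sample sizes are made up) can show how close the binomial approximation comes when the sample is a small fraction of the lot.

```python
# Sketch: hypergeometric (sampling without replacement) vs. binomial approximation.
# Assumes scipy; a lot of 500 with 20 defectives and a sample of 30 are illustrative.
from scipy import stats

lot, defectives, sample = 500, 20, 30
exact = stats.hypergeom(M=lot, n=defectives, N=sample)
approx = stats.binom(n=sample, p=defectives / lot)

print(exact.pmf(2))     # exact probability of 2 defectives in the sample
print(approx.pmf(2))    # binomial approximation, close when the sample is a
                        # small fraction of the lot
```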

Other Distributions


There are other distributions – for example, sampling distributions and the χ², t and F distributions.

Thursday 25 April 2019

How to Move Through a Project Manager Career Path


Being a project manager can be highly rewarding, with the opportunity to work in different business sectors, such as IT, construction, engineering and retail. There is plenty of job variation, interaction with people, working with new technologies, improving your skills and opportunities to train and further your career. We’ve highlighted below a project manager career path that can be followed, along with some useful tips to help you progress.

Step 1: Get qualified


Gaining a recognised project manager qualification or a degree in management is a good first step. Many project managers started their career path via progression through a company, often studying for their qualifications as part of their wider career.

Aside from PRINCE2, anyone can study project management as part of an undergraduate or Master’s degree course. Depending on your time and budget, you might find it easier to learn PRINCE2 through an online course.

Step 2: Take on project management support roles


Most project managers, with or without qualifications, start their career as project assistants, IT and management support staff, coordinators and analysts. They gain experience in their industry and learn from other project managers. Some people volunteer as a project coordinator or support member of staff, where there are often more opportunities.


If you entered into a project management role without getting qualified, as in Step 1, but feel comfortable applying for, or getting promoted to, a project manager position, now’s the time to study for a qualification. The general timescale for moving from project support to project management is around five to ten years. It is recommended that you keep a record of your project management work, which can go a long way towards a future qualification.

Step 3: Run smaller projects


A great way of showing initiative is to suggest and run small projects, e.g. developing a new process or adapting a current process to make it more efficient. Map it out and identify what can be automated or even eliminated. Test your new process to ensure it is an improvement over the original one, and follow up with a report to management on how time can be saved, as well as any cost efficiencies.

Step 4: Beyond project management


From here, there are opportunities to progress up the corporate ladder into roles such as program or IT director, chief operating officer (COO) and chief executive officer (CEO). Another career path is to establish an outsourcing project management company.

Project management career advice


Network: Project Insight offer valuable insights into a project management career path from successful project managers such as Bertrand Duperrin of Emakina and Dan Pink, author of To Sell is Human, who champions networking as a great way to progress a project manager career.

Be flexible: Randstad believe that communication, delegation and flexibility are key to developing a career in project management. Managing clients is not always an easy task and learning to compromise between what the client wants and the practicalities of what is possible is a good platform to work from.

Communicate: It can be all too easy to do everything yourself but that defeats the objective of being a project manager. Great communication and relationship building are crucial, as is learning to delegate jobs to team members, particularly specialist tasks.

Keep learning: Self-development and learning from others, particularly in the early years of a project manager career, are vital. Keeping up to date with new technologies and the latest thinking helps you stay ahead. Above all, always be prepared to expect the unexpected and learn from the mistakes of others.

Stay positive: A career in project management can be diverse and rewarding. One project manager role won’t necessarily reflect that of another and there are no strict rules on the path you take as you progress along a project manager career path. But adopting the right attitude, developing the right skills, taking opportunities as they arise and learning as you go are all great steps in your career.

Wednesday 24 April 2019

Climbing up the Project Management Career Ladder


Project Management is a wide-ranging field of work with many different roles. As with most careers, the higher salaries come with progression and advancement. This can seem challenging when it comes to project management as there are numerous different methodologies used, which makes it difficult to know which will be the most beneficial to your career.

All organisations undertake projects as it is necessary for businesses to improve their products and services in order to keep up with the competition. Not all organisations make use of formal project management, but the number that do is constantly increasing as it has been proven to increase the chances of success. This means that there is a consistent demand for qualified project management professionals.

Qualifications speak loudly in the project management field and it will significantly enhance your chances of a rewarding career if you gain the most sought-after certifications.

Earn the most sought-after certifications


With the numerous project management methodologies that are used in organisations across the globe today, it can be troublesome to decide which courses to study. However, there are certain qualifications that carry significant weight in the field of project management. Gaining these particular certifications will ensure that your CV is appealing to potential employers when applying for many roles, even those outside of traditional project management.

1. Project Management Fundamentals

If you are brand new to project management, the Project Management Fundamentals course provides the ideal starting point in your studies. This course will teach you the core skills and knowledge required to begin a career in the project management field. Learn about the methods and processes, stakeholder management, risk management, quality control, the importance of effective communication and the key planning activities associated with effective project management.

2. PRINCE2®

PRINCE2 is the leading project management method in the UK and is also widely used in organisations around the world. This method can be tailored to any size or type of project and is managed in stages to ensure that the project sticks to its objectives. The PRINCE2 certification consists of two levels – Foundation and Practitioner. The Foundation level will teach you the 7 Themes, Principles and Processes associated with the methodology and the terminology that is used. The Practitioner level will ensure that you are able to practically apply the theory to real-world project situations.

3. AgilePM®

Agile is a project management method that was originally created for the purpose of software development. Other methods do not always react well to changes during the course of the project, but the Agile method addresses this issue. This method uses an incremental and iterative approach to delivery, which enables it to work unexpected changes into the required phase of the project. The AgilePM certification also consists of two levels – Foundation and Practitioner. During the Foundation-level training, you will learn about project planning, project control, the implementation of the Agile methodology and the roles and responsibilities required during a project. The Practitioner level course will ensure that you become familiar with lifecycles and objectives, how to maintain control of an Agile project and the techniques necessary to manage an Agile project.

4. Change Management

A highly beneficial addition to the toolbox of any project management professional is the internationally recognised change management certification. Change is inevitable in any organisation and it is essential to manage it effectively. The change management qualification covers organisational change, the impact of change on individuals, stakeholder communication, benefits management and the roles which are required in order to achieve successful change. Gaining this certification will enable you to implement the techniques that will assist a smooth transition during change initiatives, which are essentially projects.

5. Business Analysis

Another qualification which will give you an advantage over other project management candidates is the business analysis certification. This is accredited by BCS (British Computer Society) and will add tremendous value to your CV. Business analysis enables you to discover the root cause of issues within an organisation and determine the best solution to rectify these issues. A business analysis certification will demonstrate to potential employers that you are able to assist in the optimisation of business performance and the streamlining of the systems which already exist within the organisation. This course will ensure that you have an understanding of strategy analysis, business system modelling, business process modelling, developing a business case and implementing change.

Enhance your soft skills


Most project management experts agree that there are certain soft skills which are recommended if you want to be a successful and well-rounded project management professional.

◈ Organisation – Strong organisational skills are absolutely essential as the majority of the work involved requires you to organise a multitude of different elements on a daily basis.

◈ Communication – The ability to communicate clearly with stakeholders, team members, suppliers and all others involved with a project is one of the most important soft skills for a project management professional.

◈ Creativity – It is crucial for a Project Manager to be creative in finding solutions to problems and resolving them with minimal impact to the overall success of the project.

◈ Leadership – Good leaders lead by example and effective leadership skills are vital to project management as you will need your team to respect you and to trust your judgement throughout the course of a project.

◈ Motivation – An unmotivated team can have an actively negative impact on the outcome of a project, so you must be able to understand and motivate your team to enhance their productivity and make the most of their input.

Climb the rungs


◈ Bottom rung – Project Office Administrator – £24,000

Project Office Administrators generally earn a salary of £24,000 and this is a common starting role in the field of project management. This position requires assisting with project management duties such as making appointments, performing site visits, preparing reports, tracking project progression, coordinating meetings, hiring contractors and monitoring compliance regulations. This role will provide ample experience on real-world projects and give you the footing you need to progress in your project management career.

◈ Middle rung – Project Manager – £55,000

With some experience, you will move through the ranks and gain additional skills that are learned when practically applying your knowledge. As a Project Manager in the UK, you can earn a salary of £55,000. Some of the roles and responsibilities associated with this position are monitoring and reporting project progress to stakeholders, developing the project budget, estimating time and cost, planning and managing resources, benefits realisation and risk analysis. A project manager is ultimately responsible for the planning, execution, monitoring, controlling and successful closing of a project.

◈ Top rung – Programme Manager – £77,500

A project is a single initiative and a programme is a collection, or portfolio, of projects. As a Programme Manager you will be able to earn a salary in the region of £77,500 or more. Programme Managers do not manage individual projects but coordinate the various projects in the portfolio and the teams working on each one. Strong experience as a Senior Project Manager is required for this position, as well as vast knowledge of project management methodologies.

The role typically entails managing the programme budget, planning the overall programme, monitoring the progress of each project to ensure objectives are met, managing the usage of resources, stakeholder management, mitigating and resolving issues and risks, defining programme controls and ensuring that deliverables are aligned across the programme.

Project management is a field with relatively clear-cut career progression and the keys to unlocking senior roles and higher salaries are certifications and experience gained from working your way up the ladder. There will almost certainly always be a need for project management, which makes this an excellent career choice.

Monday 22 April 2019

Creating a Fresh View of Six Sigma Data and Tools

Six Sigma DMAIC and DFSS roadmaps provide the guidance needed for using facts and data to understand problems, opportunities and solutions to get results in a wide variety of project settings. For the routine cases, they give practitioners what they need. There are some situations, though, that can benefit from a broader view of Six Sigma capabilities and tools. For example:


1. Some project situations do not map neatly to the textbook DMAIC or DFSS cases. More than one new enthusiastic Black Belt starts out with a checklist-view of DMAIC or DFSS, trying to fit every project into too rigid a set and sequence of tools and artifacts. They learn pretty soon that there is a bit of art involved in seeing the nuances of any one project, and in working the ones that seem to sit somewhere in between DMAIC and DFSS.

2. Organizations that really get Six Sigma recognize that its approach to making better use of better data applies to all manner of learning and decision support – some much smaller scale and others much broader than Six Sigma projects. As Six Sigma thinking moves out of the textbook and into the business and technology bloodstream, people learn to see the applicability of Six Sigma tools and capabilities in a new way.

Without taking anything away from the familiar DMAIC and DFSS roadmaps, a broader, more general view fostered by non-textbook situations is worth exploring. This fresh perspective may provide ideas for extending the leverage of the Six Sigma toolkit – in routine ways and in some new ways.

Organizing the Tools and Capabilities


The Six Sigma Capabilities and Tools chart below is a bit daunting at first, but it illustrates a number of useful notions and it outlines the scope of this article. A few things worth noticing on first review:

◈ The right side depicts tools that operate on numeric data – where most practitioners started their view of Six Sigma. The left side depicts tools that operate on language data, which deserves to be an equal citizen in the toolkit.

◈ The chart is designed to be read from the bottom, using the center to illustrate what a team or individual is accomplishing. The entry point then is where everyone starts their encounters with prospective information – unfiltered “Events.”

◈ Progress in DMAIC or DFSS, or in things bigger and smaller than Six Sigma projects, involves moving upward on the chart. The sun does not rise and set on pure statistics (insert slings and arrows here) – but, rather, on helping make the right use of the right kinds of data to get to the solutions and results depicted at the top. Statistics might well be used as a means to understanding value and risk along the way; in the end, the goal is understanding and capitalizing on that value and risk.

Figure: Six Sigma Capabilities and Tools Chart

From Events to Facts and Data


No one has to do any work to get events – they arrive with great regularity and for free. The email stating that customer satisfaction with a company’s software is down 10 percent, the presentation claiming far fewer defects since the new software development process, and so many more events vie for attention and acceptance. Six Sigma realizes that unfiltered raw experience like this can seem more compelling and believable than it should. Measurement systems analysis (MSA) provides an array of tools for seeing, segmenting and quantifying different sources of error introduced in the process of translating observations about events into what can be called “facts and data.” The statistical side of MSA helps a lot in the situations it fits, and the more general logical side of MSA helps strengthen the fact and data content in any situation. Six Sigma practitioners challenge any “event translating” process in terms of:

1. Accuracy: How do the prospective facts agree with the truth? What could make them be biased?

2. Repeatability: The variation due to the automated parts of the system. To what degree would the system (the automated part and the same person using it) make the same translation using the same raw observations?

3. Reproducibility: The human part of the error. To what degree would the same person, playing their prescribed role in the translation system, get the same result when translating the same thing?

4. Stability: What could make the system performance change over time?

5. Linearity: Does system performance vary over the range of values being measured?

The MSA considerations help improve numerical data. In language data, there also is potential bias (in the form of emotion, solutions, unsupported judgment) and sampling noise. These can be addressed by using some of the tools depicted in the “language facts and data” section of the Six Sigma Capabilities and Tools chart.

Facts and Data Uncover Patterns


Facts and data are a necessary input to the pursuit of exploring patterns in data. With believable, unbiased and reasonably complete facts, project teams can look for contrasts, trends, correlations and interactions using patterns to trigger their insights. Pattern detection comes naturally to humans. People are very good at seeing “What’s different in this picture?” or “Which factors seem to be associated with (or not) changes in some measure or observation of interest?”

Graphs and charts, of course, leverage the knack for detecting patterns by organizing data in succinct ways that tease out a variety of different patterns for viewers. The table below outlines a few of the most familiar graphical tools. Sometimes the patterns themselves tell enough about what is going on – or they provoke the right “why?” questions to help a team check other patterns to get to the bottom of the causes or dynamics that connect factors (x’s) with important project result measures (Ys).

Some Common Graphical Tools and Statistical Counterparts

Analysis Question | Graphical Tools | Statistical Tools
How is the data distributed? | By Value: Histogram, Box Plot; By Count: Pareto Chart | Normality Tests, Chi-Square
How does data vary over time? | Run Chart (Time Series Plot) | Control Chart
How do changes in x’s compare and contrast to changes in Ys? | Scatter Diagram, Box Plot, Multi-Vari Chart | Variation Tests, Central Tendency Tests, ANOVA
How do x’s interact with one another? | Interaction Plot | Regression Analysis
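As a hedged sketch of a couple of the statistical counterparts in the table (assuming Python with scipy and numpy, using randomly generated data that is not from any real project), the snippet below runs a normality test and a simple regression of a result measure Y on an input x.

```python
# Sketch: two statistical counterparts from the table above.
# Assumes scipy/numpy; the data is randomly generated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)
x = rng.normal(loc=10, scale=1, size=60)        # a process input, x
y = 3.0 * x + rng.normal(scale=2, size=60)      # a result measure, Y

print(stats.shapiro(x))         # "How is the data distributed?" - normality test
print(stats.linregress(x, y))   # "How do changes in x compare to changes in Y?"
```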


Separating Signal from Noise


Graphical tools help to a point, but a Six Sigma team sometimes needs to discern differences and relationships in a more refined way – making statements about their significance. That leads, of course, to the statistical tools, which pick up where the graphs leave off, separating the significant signal from chance noise. These tools are Six Sigma’s strong suit.

For numerical data, the main thrust of any hypothesis test is to challenge the sample data and the prospective findings about a comparison being made based on these things:

1. The sampling process itself influences the noise level in data – that can show up as bias or central tendency shift and/or variation noise. Averaging values that are gathered under an unbiased plan is the common weapon for the bias component.

2. Any comparison being made should weigh the presumption that chance alone can explain any apparent differences. This is, of course, the null hypothesis.

3. Any statements about the significance of a reported effect (e.g., trend, difference, relationship) should be qualified against the risk that innocent noise is being confused as a significant signal. This, of course, creates the p-value, confidence interval or critical value of the test that a team can use to show how well the data did or did not pull signal from the noise.
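A minimal sketch of this signal-versus-noise logic (assuming Python with scipy; the target of 45 and the sample values are invented) runs a one-sample t-test and reads its p-value exactly as described above.

```python
# Sketch: weighing a sample against the null hypothesis that chance explains the shift.
# Assumes scipy/numpy; the target and the sample values are made up.
import numpy as np
from scipy import stats

target = 45.0
sample = np.array([45.6, 44.9, 46.1, 45.8, 45.2, 46.0, 45.5, 44.8, 45.9, 45.7])

t_stat, p_value = stats.ttest_1samp(sample, popmean=target)
print(t_stat, p_value)
# A small p-value says the apparent shift from target is unlikely to be chance noise;
# a large one says the data did not pull a signal out of the noise.
```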

For language data, there is a parallel notion of separating signal from noise. A team can distill and average language data with tools like the KJ method (affinity diagramming). Using the rules of abstraction, a team can distill themes and messages that are reasonably supported by lower-level samples of language data.

Building Useful Models


If one thinks of a model as any construct that helps to answer questions about a system, then it can be agreed that there are many kinds of models. Some common model types include:

◈ First Principles
◈ State Machine Models
◈ Regression Analysis
◈ Design of Experiments
◈ Neural Networks
◈ Fuzzy Logic
◈ Monte Carlo

Six Sigma might use any of the model types listed, as they all can be used to do “what if?” analysis – predicting Y changes based on hypothetical x changes. Better yet, they can be used in the reverse direction to do “what’s best?” analysis, setting Y to a desired value and figuring the suitable settings of one or more input x’s that could result in the desired Y.
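As a sketch of the “what if?” and “what’s best?” directions (assuming Python with numpy and a hypothetical fitted model whose coefficients are invented, not taken from any real data):

```python
# Sketch: using a simple empirical model in "what if?" and "what's best?" mode.
# Assumes numpy; the model coefficients and noise level are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=3)

def predict_y(x1, x2):
    """Hypothetical fitted model: Y = 2.0*x1 - 0.5*x2 plus noise."""
    return 2.0 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=np.size(x1))

# "What if?": Monte Carlo spread of Y at a proposed operating point (x1=5, x2=2).
y_sim = predict_y(np.full(10_000, 5.0), np.full(10_000, 2.0))
print(y_sim.mean(), y_sim.std())

# "What's best?": search candidate x settings for the one closest to a desired Y.
target_y = 8.0
candidates = [(x1, x2) for x1 in np.linspace(3, 6, 31) for x2 in np.linspace(0, 4, 41)]
best = min(candidates, key=lambda c: abs(2.0 * c[0] - 0.5 * c[1] - target_y))
print(best)
```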

The quote “All models are wrong – some models are useful” (attributed to the noted statistician George Box) is a reminder that empirical models have to live with uncertainty and noise. Still, a useful model, viewed with appropriate caution, can provide useful insights into the dynamics of x and Y, and the operating or design choices a team has for moving a Y result to where it would like the result to be.

Looking Ahead: A second article discussing the upper portion of the Six Sigma Capabilities and Tools chart is forthcoming. It will consider the ways that models are used to help generate ideas, develop and select solutions, and verify and monitor the results delivered.

Friday 19 April 2019

Specification Limits: Proceed with Caution

It’s natural for a manufacturing group to be concerned with a product’s specification limits – the upper and lower limits imposed on the process. Specification limits are sometimes designated by the producer, by definitions of quality or, most frequently, by a customer, in an attempt to narrow the distribution of a product’s properties. Unfortunately, to keep a process in control, teams often develop strategies based on those specification limits instead of ones that better reflect the natural variation of that process.

Practitioners should understand that specification limits are something forced upon a process; not part of its natural voice. Knowledge of this concept can help them learn how to develop and follow statistically based control limits as part of a more reasonable process control strategy.

Specifications vs. Statistics


Donald Wheeler introduces this subject in Understanding Industrial Experimentation, Second Ed. (SPC Press, 1992): “The traditional approach to the problem of product variation has been that of specifications. By using specification limits to define some neighborhood of a process’ target, say target +/- Δy, manufacturers have hoped to place acceptable bounds upon the degradation in performance for the product. As long as the quality characteristic Y falls within these specification limits, the product is said to be satisfactory. When the value for Y falls outside these limits, the product is suddenly deemed to be unsatisfactory and certain actions are invoked to remedy the situation. …[However,] specification limits are artificial boundaries used to make arbitrary decisions about what product to use. They are a naïve attempt to deal with the problems created by the variation of product characteristics. All product is considered to be either good or bad, and the dividing line between good stuff and bad stuff is seen to be a sharp cliff.”

Figure 1 shows how a control strategy based on specification limits works: Good stuff and bad stuff are separated by sometimes arbitrary walls.


Figure 1: Specification-based Process Control

Statistical process control based on the voice of the process (i.e., its 3-sigma limits) provides a better basis for controlling a process. Under a specification-based process control strategy, if a process’ natural control limits fall within its specifications, an under-control situation can easily arise, wherein the product may drift around its target. On the other hand, if a process’ natural control limits fall outside its specifications, an over-control situation can easily arise, resulting in an inflation of process variation.

Ideally, what’s needed is a mindset that does not simply consider anything within specifications “good,” but instead continually minimizes variation and strives to center the process using statistically based process rules. Figure 2 shows the results from such a control strategy.


Figure 2: Statistical-based Process Control

The Taguchi Loss Function helps illustrate the loss in product quality that stems from following specification limits rather than statistically based control limits. The line in Figure 2 is derived from the following:

{σLT² + (average property – target)²},

where σLT = the process’ long-term sigma.

Notice how this function continually drives a process at its target under conditions of minimum variation. Typically, this function is multiplied by a constant, which turns the loss into estimated dollars used here. The Taguchi Loss Function is an easily calculated, production-run metric that reflects both process variation over the course of the run and the quality of its targeting.
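A minimal sketch of this run-level metric, assuming Python with numpy; the RV values, target and cost constant below are invented stand-ins, not the article's data.

```python
# Sketch: run-level Taguchi loss from the formula above.
# Assumes numpy; the measurements, target and cost constant are made up.
import numpy as np

rv = np.array([45.2, 45.6, 44.8, 45.9, 46.3, 45.1, 44.7, 45.4, 46.0, 45.8])
target = 45.0
k = 100.0                     # cost constant ($ per unit of squared deviation)

sigma_lt = rv.std(ddof=1)     # long-term sigma over the run
loss = k * (sigma_lt**2 + (rv.mean() - target) ** 2)
print(round(loss, 1))         # estimated loss for the run, in dollars
```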

Example: An Under-control Process


The following example helps show the differences in process management between specification-based and statistically based strategies. Figure 3 is a typical Shewhart chart used for monitoring and controlling the relative viscosity (RV) of a nylon 66 polymer. Generally, the RV measurement system itself is a very good one, contributing just 2.5 percent to total process variability. Notice, however, that this polymer’s lower and upper specification limits (LSL and USL) fall well outside its lower and upper control limits (LCL and UCL) – the limits based on this process’ natural voice. Note too, that none of these data points – RV measurements made every 12 hours – fell outside the specification limits.


Figure 3: In-process Relative Viscosity Data

Production management is usually happy with a run like this, as nothing was produced outside specs. However, depending on their tolerance for variation, the customer might not be happy. Initially, the customer could receive a polymer with an RV well above target (area B in Figure 3), followed by a polymer with an RV well below target, and then again above target (area C). The circled regions in Figure 3 each had data points falling outside the control limits; if the process was being managed statistically, these regions would allow the line engineer to make adjustments to re-target the process. However, the effectiveness of this retargeting would depend on how useful the process’ gain factors were – in other words, how much of an impact an appropriate process adjustment would have on this polymer’s RV.

The capability metrics of the process tell the story. Its Cp, or the voice of the process, is a very capable 1.59, while its Ppk, the voice of the customer, at 0.80, indicates far less practical capability because of the process’ tendency to drift. The combination implies that this process would be judged very capable if 1) the long-term sigma (σLT) – the yellow polynomial fitting line in the chart – could be reduced to short-term sigma (σST), a measure of point-to-point variation, and 2) the process targeting were improved.
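To make the Cp/Ppk contrast concrete, here is a hedged sketch in plain Python; the specification limits, sigmas and mean are illustrative stand-ins rather than the article's actual values, and they simply show how drift inflates long-term sigma and pulls Ppk well below Cp.

```python
# Sketch: Cp vs. Ppk for a drifting process.
# The spec limits, sigmas and mean below are illustrative, not the article's data.
lsl, usl = 42.0, 48.0
sigma_st = 0.63        # short-term (point-to-point) sigma
sigma_lt = 1.20        # long-term sigma, inflated by drift
mean = 45.5            # run average, off target

cp = (usl - lsl) / (6 * sigma_st)
ppk = min((usl - mean) / (3 * sigma_lt), (mean - lsl) / (3 * sigma_lt))
print(round(cp, 2), round(ppk, 2))   # high Cp, much lower Ppk
```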

Finally, the quality of the process targeting is evident in the Taguchi Loss Function (Figure 4), which is clearly asymmetrical – it is weighted to the high side of target. Note, this plot is zeroed at the 45 RV target; control limits would sit at +/- 1.4 deviation and specification limits at +/- 3 deviation. The perfect run would show a symmetrical “smile shape” bounded by +/- 1.4, and its minimum value would fall to σST².


Figure 4: Taguchi Loss Function for RV

Improving Performance


A strategy that controls a process only by reacting to violations of its specification limits is never a good thing. When those specs fall well outside the natural, variation-driven control limits, the process can easily become under-controlled, as evidenced in the RV example above. When the specs fall within the control limits, on the other hand, the process can easily become over-controlled, with operators making adjustments that are not called for given the natural voice of the process. When appropriate adjustments are made, however, the Taguchi Loss Function slides toward its minimum, and the customer’s performance (and satisfaction) likely improves.

Wednesday 17 April 2019

How To Turn Process Data Into Information

A repeated series of actions and variables is a process. A collection of processes is a system. Virtually perfect Six Sigma quality results from an optimal interaction of all the variables in a given system.

Process and system questions we all face at work include: Which variables are the most important to the customer? Am I being efficient? Am I being effective? Am I using the best methods to complete my tasks? Is there a better way?

The science of data collection has two keys we can use to answer these questions. By systematically observing processes and systems, we learn faster than we do through trial and error. As Yogi Berra said, “You can see a lot just by looking.” When we look and learn, we can improve. Six Sigma counts, measures, and graphs speed learning.

Key #1: Develop Crystal Clear Operational Definitions


Define precisely what you mean to count or measure before you start to count and measure. This is tricky business. For example, try writing down your own definition of the word ‘pan’ before reading on.

Here is another example of why clear operational definitions are crucial even to a simple process like counting. Count the number of f’s in the following paragraph.

FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS OF INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.

Depending on how you decided to define the letter “f,” there are seven possible correct answers. There are no lowercase f’s, so zero is one correct answer. If you decided to count any F, upper or lower case, there are six. If you proofread phonetically – in other words, if you defined “f” by the sound of the letter – the F in each OF sounds like a “v,” so you could have counted 1, 2, 3, 4, 5, or 6. Any one of these answers, taken in the context of its definition, could be considered correct.
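
A short Python sketch (ours, not part of the original exercise) makes the point concrete: the same paragraph yields different “correct” counts depending on which operational definition of “f” you code up.

# Three operational definitions of "f" applied to the same text.
text = ("FOR CENTURIES IMPORTANT PROJECTS HAVE BEEN DEFERRED BY WEEKS OF "
        "INDECISION AND MONTHS OF STUDY AND YEARS OF FORMAL DEBATE.")

lowercase_only = text.count("f")                       # definition 1: lowercase f only
any_f = text.upper().count("F")                        # definition 2: any F, either case
f_sounding = any_f - text.upper().split().count("OF")  # definition 3: exclude the v-sounding F in each "OF"

print(lowercase_only, any_f, f_sounding)               # prints: 0 6 3

Each count is “correct” for its own definition, which is exactly why the definition must be written down before data collection starts.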

Key #2: Array Your Data in Columns and Rows


Tables are the proven way to array the data you collect in columns and rows. Data can be collected from any process.


In our computer age, operational definitions for ‘row’ and ‘column’ have changed. Rows are now called “records.” Columns are “fields.” Fields describe details about each record. Recording each record in its proper sequence is exceptionally important, so record the data sequence for every Six Sigma analysis. A four-field array is sketched below.
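
Here is a minimal sketch of such a four-field array in Python using pandas; the field names and values are hypothetical, chosen only to show records (rows) and fields (columns).

# A hedged example of a four-field data array: one record per observation,
# one clearly defined field per column, with the collection sequence preserved.
import pandas as pd

records = pd.DataFrame(
    {
        "sequence": [1, 2, 3, 4],                                        # order of collection
        "date": ["2019-04-01", "2019-04-02", "2019-04-03", "2019-04-04"],
        "operator": ["A", "B", "A", "C"],
        "cycle_time_min": [12.4, 11.9, 13.1, 12.2],
    }
)
print(records)   # each row is a record; each column is one operationally defined field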


This Six Sigma data array of fields and records tells us a little about each observation. The more fields, the richer our understanding can be; a table of the same size with twice as many fields holds twice as much data. Rich data – meaning each column/field has a crystal clear operational definition – can yield rich information. Many times we collect dozens of fields for each recorded observation. Since data collection is time consuming and expensive, design your collection plan with care before you begin.


Spreadsheet example: Excel spreadsheets help accounting and finance professionals scan vast amounts of data and spot patterns. As helpful as this format is to their trained eyes, a traditional spreadsheet often mixes different fields and records together within its rows and columns. This heterogeneous characteristic does not lend itself to a computer-powered, statistically valid Six Sigma analysis.


Six Sigma array – spreadsheet example: A Six Sigma array presents A/R aging dollars data in a way that leverages 21st-century computing power. The sequential time frame sits in the first column; the entire table goes back years, week-by-week, record-by-record; and each field is homogeneous. Uniform arrays promote accurate Six Sigma statistical analyses as well as better eyeball assessments. A sketch of such an array follows.
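
As a hedged illustration, the snippet below reshapes a small, hypothetical wide-format A/R aging spreadsheet into the homogeneous, week-by-week array described above; the dollar figures are placeholders, not real accounts-receivable data.

# Reshape a traditional wide A/R aging layout into a homogeneous array:
# one record per week per aging bucket, with a sequential time frame.
import pandas as pd

wide = pd.DataFrame(
    {
        "week_ending": ["2019-03-29", "2019-04-05", "2019-04-12"],
        "current": [120_000, 115_500, 131_200],
        "31_60_days": [42_300, 39_800, 45_100],
        "61_90_days": [18_700, 21_400, 17_900],
        "over_90_days": [9_600, 8_200, 11_300],
    }
)

# melt() turns the aging buckets into one homogeneous field, which is easy
# to chart, filter, and analyze statistically, week by week.
tidy = wide.melt(id_vars="week_ending", var_name="aging_bucket", value_name="dollars")
print(tidy.sort_values(["week_ending", "aging_bucket"]).to_string(index=False))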


Four Rules For Data Analysis


Statistical methods deliver the highest level of evidence for making judgments. A statistical analysis is the surest way to confirm whether what we have observed is due to chance variations, or due to something special. Without an analysis, we have only intuition, assumptions, and guesses. Sometimes our assumptions are correct, sometimes not. Six Sigma reduces guesswork. Statistical analyses increase confidence in the judgments we must make in an uncertain world.

1. Calculate the average value for your set of numbers. A set of numbers is called a data set. The average, or mean, is symbolized in Six Sigma by x̄, pronounced “X-bar.” Excel’s paste function key (fx) will calculate averages automatically.

2. Calculate the standard deviation for your data set. The symbol for the standard deviation is the Greek letter sigma (σ). Excel’s paste function key (fx) or other Six Sigma software will calculate this statistic automatically.

3. Calculate the probability information for your data set. Probability information tells us whether the differences we see in our counts or measures are due to random chance, or whether they are most likely due to one factor or a combination of variables. Excel’s ANOVA (analysis of variance) function automatically calculates an essential probability statistic called the F ratio.

4. Graph your data using an appropriate analytic graph. Excel’s bar graphs and pie charts are descriptive, not analytic. The affordable Six Sigma software add-ins illustrated later in this manual work seamlessly with Excel to graph the average, standard deviation, and probability information in more meaningful ways. The sketch following this list works through all four rules in sequence.
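
The following Python sketch walks through the four rules on hypothetical cycle-time data for two methods; it uses numpy, scipy, and matplotlib in place of Excel’s fx and ANOVA dialogs, and none of the numbers come from a real project.

# A minimal, hypothetical walk-through of the four rules.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

method_a = np.array([12.4, 11.9, 13.1, 12.2, 12.8, 12.5])
method_b = np.array([11.2, 11.6, 10.9, 11.4, 11.8, 11.1])

# Rule 1: the average (x-bar) and Rule 2: the standard deviation (sigma)
print("A: mean=%.2f sigma=%.2f" % (method_a.mean(), method_a.std(ddof=1)))
print("B: mean=%.2f sigma=%.2f" % (method_b.mean(), method_b.std(ddof=1)))

# Rule 3: probability information -- a one-way ANOVA F ratio and p-value
f_ratio, p_value = stats.f_oneway(method_a, method_b)
print("F=%.2f  p=%.4f" % (f_ratio, p_value))

# Rule 4: an analytic graph -- a run chart in sequence, with the overall mean line
combined = np.concatenate([method_a, method_b])
plt.plot(range(1, len(combined) + 1), combined, marker="o")
plt.axhline(combined.mean(), linestyle="--")
plt.xlabel("Observation (in sequence)")
plt.ylabel("Cycle time (minutes)")
plt.title("Run chart of cycle time")
plt.show()

If the p-value is small, the difference between the two methods is unlikely to be due to chance alone – exactly the kind of judgment the four rules are meant to support.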

Clearly, basic spreadsheet skills are an expected competency for Six Sigma executives who expect their Black Belt experts to produce the best results. Computing power makes Six Sigma initiatives possible. Prioritize learning strategies according to your personal needs.

When we turn our counts and measures into accurate statistical pictures, patterns emerge. These precious, time- and money-saving patterns would otherwise remain buried in columns and rows of numbers. Learning to recognize these patterns is an indispensable Six Sigma skill. Valuing the information conveyed by these patterns is one of the most important contributions executive leaders can make to Six Sigma projects.