Monday, 17 June 2019

Introducing the E3P3 Process Improvement Methodology

Organizations involved with business process improvement use different methodologies, approaches, tools and techniques for implementing quality management programs. These programs go by different names in different organizations – TQM (total quality management), Six Sigma, operational excellence and so on. Regardless of the approach and name used, each organization must select and combine its approaches, tools and techniques properly to ensure a successful implementation. Organizations looking for business process improvement opportunities often begin with PDCA – Plan, Do, Check and Act – an iterative four-step management method.

The concept of the PDCA cycle was originally developed by Walter Shewhart in the 1920s and is often referred to as the Shewhart Cycle. It was taken up and promoted effectively from the 1950s on by quality management guru W. Edwards Deming, and is also known as the Deming cycle or Deming wheel. When any organization executes basic process thinking approaches, the main emphasis is on PDCA. All other methods and approaches have evolved from this revolutionary method.

Understanding PDCA

In PDCA, the four stages are as follows:

1. Plan: The Plan stage focuses on selecting the project, establishing the objectives, defining the process and design, or revising business process components to improve results.

2. Do: The Do stage refers to implementing what is planned – implementing the action plans and measuring their performance.

3. Check: This stage verifies that the process achieved the desired results and checks the effectiveness of the corrective action taken.

4. Act: Finally, the Act stage entails containment, disposition and correction, analyzing any differences and their causes, as well as taking action to improve things.

Quality management systems of the past have emphasized process thinking using PDCA to ensure, for example, the shipment of acceptable product to the customer. In most instances of its use, Check is synonymous with inspection activities – sometimes leading to more than 100 percent inspection of products and no clear verification of the processes. Organizations have been dependent on this inspection phase to ensure the removal of unacceptable products before shipment to the customers.

In actual practice, however, the Plan stage has become a black box for finding opportunities – organizations wait for problems to become visible in their business processes before acting. And Act has become the weakest link in the PDCA cycle, as the least emphasized stage of an improvement cycle. Act is seen as either a standardization process or an improvement process. The input to Act comes from Check, which provides inadequate and insufficient feedback for action. The data available from Check for analysis is frequently attribute type, leading to options such as OK/NOT OK, YES/NO and GO/NO GO. Root causes for such attribute decisions are routinely attributed to a single incoming source of variation – the operators and technicians within the organization. The ultimate root cause is thus left to the technicians and operators working on the shop floor, and the ultimate improvement action becomes “train the technicians/operators.” The remaining incoming sources of variation (the other of the 6Ms aside from man – method, material, machine, measurement systems and milieu) are given less importance in improvement actions.

With continuous customer demand on one hand and evolving customer needs on the other, methods like Lean Six Sigma play a major role in continually improving all business processes and products. Defect levels have reached single digits in percentage terms and, most of the time under Six Sigma, are measured in parts per million rather than as a large proportion of defects.
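
As a concrete illustration of the parts-per-million framing, here is a short sketch of the standard DPMO (defects per million opportunities) calculation; the numbers are hypothetical:

```python
# Sketch: converting defect counts to DPMO (defects per million opportunities),
# the scale Six Sigma uses instead of percent defective.
def dpmo(defects, units, opportunities_per_unit):
    """DPMO = defects / (units * opportunities) * 1,000,000."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical example: 15 defects found in 1,000 units, each with
# 4 defect opportunities.
print(dpmo(15, 1000, 4))  # 3750.0
```

At 3,750 DPMO the defect rate is 0.375 percent – small as a percentage, but clearly visible on the ppm scale.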

Methodologies like Lean Six Sigma have been working relentlessly to ensure that all aspects of a business are improved, bringing process output to a level close to perfection, as demanded by customers. In such cases, because of increasing product and process complexity and performance expectations, verifying the output for acceptance is no longer sufficient. Organizations must look at the output against robust target performance levels, trying to get as close to the target as possible. The deviation from the target must be understood and continually reduced. According to Taguchi’s philosophy, the goal is to produce closer to the target and continually reduce variability around it.
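
Taguchi’s view of deviation from target is usually expressed as a quadratic loss function, L(y) = k(y − T)², where T is the target and k is a cost coefficient. A minimal sketch with assumed values:

```python
# Sketch of Taguchi's quadratic loss: L(y) = k * (y - T)^2.
# Loss grows with ANY deviation from target, not just outside spec limits.
def taguchi_loss(y, target, k):
    return k * (y - target) ** 2

target = 10.0
k = 2.0  # assumed cost coefficient (hypothetical)
for y in (10.0, 10.5, 11.0):  # on target, small deviation, larger deviation
    print(y, taguchi_loss(y, target, k))
```

Note that doubling the deviation quadruples the loss, which is why continually shrinking variation around the target pays off even when all parts are within specification.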

The Revolutionary PDCA Cycle

Figure 1 shows a map of the PDCA cycle.

Figure 1: PDCA Cycle

Introducing the Evolutionary E3P3 Methodology

The evolutionary E3P3 (“e-cube, p-cube”) cycle is made up of six steps: Evaluate, Evolve, Execute, Perfect, Progress, and Preach and practice. The E3P3 cycle is based on the closed-loop feedback diagram shown in Figure 2.

Figure 2: E3P3 Cycle

The E3P3 methodology incorporates the philosophies of quality management gurus such as Shewhart, Kaoru Ishikawa, Joseph Juran, Genichi Taguchi and Deming.

Table 1: Basic Steps Used in E3P3

Evaluate
·  Select project to find what and where to improve
·  Define problem, set targets and schedules

Evolve
·  All 6Ms are delivered as inputs to the process
·  Provide assurance of good inputs (all Ms) to the process
·  Manage appropriate, defined and foolproof resources proactively
·  Ensure the 6Ms can all do well

Execute
·  Effectively execute the plan

Perfect
·  Ensure that the process is performing as planned and that the process output is on target. (If the process output is not on target, the gap in the Perfect step must be recognized and acted upon.)

Progress
·  Leads to improvement in the process (variation is reduced)

Preach and practice
·  Sustainability of the achievement
·  Horizontal deployment

Shewhart’s techniques taught that work processes could be brought under control by determining when a process should be left alone and when intervention was necessary. This highlighted that business processes should be assessed at regular intervals to find gaps in the areas of improvement. Ishikawa stressed quality audits – internal as well as external – to understand the present as-is condition of the business or work process. The gap between reality and requirements enables the organization to determine whether action is needed.

Similarly, according to Ishikawa, the most likely process inputs, the Ms, are grouped in the material, method, machine and manpower (4M) categories. These four Ms, the main sources of variation in any process, must be managed proactively instead of being identified after the fact through root cause analysis. To act proactively, the business processes of any organization need to be assessed on a regular basis to find out exactly where each process stands.

Table 2: Basic Steps Used in the E3P3 Methodology vs the PDCA Cycle

Evaluate and Evolve (Find Opportunities, Prepare Inputs)
PDCA Cycle (Plan):
· Plan a change or a test aimed at improvement (who, what, when and where)
· Determine and analyze root cause(s) of the problem(s)
· Determine the interventions necessary to correct the problem(s)
· Determine the expected outcomes
· Determine the responsible parties for the improvement of the problem
E3P3 Methodology:
· Evaluate available area, business processes and/or opportunities
· Analyze current state
· Develop baseline
· Identify gaps/problems in areas/business processes
· Determine the root cause(s) of the problem(s)
· Prepare appropriate, well-defined and foolproof 6Ms required for improvement
· Plan resources
· Determine goals and targets
· Determine and set metrics for improvement
· Identify and define roles and responsibilities
· Design optimal processes, scheduling the steps for correction
· Set robust targets

Execute (Do Well)
PDCA Cycle (Do):
· Implement the plan and measure its performance
· Carry out the change or the test
· Monitor results and collect data
E3P3 Methodology:
· Implement the plan and measure its performance on a trial basis
· Carry out the change or the test
· Monitor results and collect data
· Continuously check for efficiency
· Permanently implement if the trial is successful
· Measure for performance
· Finalize
· Train employees

Perfect (On Target?)
PDCA Cycle (Check):
· Assess, review and evaluate the result of the change
· Measure the progress
· Check for unforeseen consequences
· Complete the data analysis
E3P3 Methodology:
· Verify that the process achieved the desired results against the set target value
· Measure the process against the set target value (if not at target value, identify variation and remove)
· Summarize learnings
· Check for unforeseen consequences

Progress (Reduce Variability in Process)
PDCA Cycle (Act):
· Decide on changes needed to improve the process
· Standardize process changes
· Communicate changes to all involved
· Provide training
E3P3 Methodology:
· Decide on changes needed to improve the process
· Standardize process changes
· Communicate changes to all involved
· Execute learning from root cause analysis of deviation from the set target value, then optimize
· Practice more to continually improve

Preach and Practice (Sustain)
PDCA Cycle:
· Identify any training needs or any developments needed for full implementation of improvement
· Fully adopt the solution for process improvement
· Continue to monitor solution
· Find other opportunities for improvement
E3P3 Methodology:
· Horizontal deployment
· Sustain developments

Let’s look more closely at some of the E3P3 stages.

◈ Evaluate: The Evaluate stage is not clearly defined, but assumed, in PDCA. In E3P3, it focuses on continuously assessing the business process (the process health) to look for improvement opportunities.

◈ Evolve: This is the main preparation stage, where available good inputs or the best inputs to the process are assured before starting the improvement activities. This stage assures that the good 6Ms are going into the processes as a part of the process improvement initiative.

◈ Perfect: The Perfect stage ensures that the process is performing as planned and that the process output is on target. If the process output isn’t on target, the gap in the “perfect” step must be recognized, and appropriate actions need to be generated and executed.

◈ Preach and practice: This stage, which is assumed in the PDCA cycle, is to ensure horizontal deployment in similar processes or situations. It lays a foundation for continuous learning for employees and across all business processes within the organization.

Using PDCA or E3P3

Whenever process improvement activities are initiated in an organization, going directly to the Plan stage, as defined by the PDCA cycle, is difficult. The organization needs to identify and prioritize the vital few from the large number of available projects. In such cases, the Evaluate stage of E3P3 identifies and prioritizes those vital few improvement projects in a systematic way.

Similarly, the Perfect stage in E3P3 is focused on checking against targets set from the beginning, whereas Check in PDCA is subjective in nature. When implementing E3P3, the PDCA approach can be kept on standby for people who still believe PDCA to be the relevant problem-solving approach. As the situation and requirements evolve to cater to customer requirements, however, E3P3 can replace the PDCA approach in all improvement projects, from manufacturing to service industries.

E3P3 is a systematic process improvement methodology well suited to today’s business process improvement needs and tools. It combines the philosophies and principles of five significant quality leaders, as shown in Table 3.

Table 3: Philosophies Embedded in E3P3 (stage: quality leader)
Evaluate (assess current state, find the gaps)
Evolve (proactively manage inputs, the 6Ms): Shewhart, Ishikawa
Execute (ensure execution at its best)
Perfect (output on target)
Progress (reduce variability)
Preach and practice (sustainability)

The E3P3 process improvement methodology can be implemented in any business process improvement initiative. E3P3:

◈ Begins with a systematic understanding of the present situation of the business processes, continuously measures the health of the business processes,
◈ Identifies the processes that need improvement,
◈ Prioritizes the vital few process improvement projects,
◈ Executes as per the plan with assured good inputs to the process,
◈ Verifies the output against a set target,
◈ Makes necessary arrangements and modifications if the output does not match the target value, and
◈ Deploys the same learnings in similar processes across the organization.

Why Use E3P3

Consider that the initial Evaluate stage in E3P3 is not only for evaluating the problem, but also for evaluating key organizational support, identifying and evaluating the stakeholders and team members, and evaluating the exact needs of each and every stakeholder in the process.

In PDCA, Plan is defined qualitatively, leaving no scope for setting a concrete target value (unlike in E3P3). In PDCA, implementation is geared only toward finding a solution to the present issue or problem, with no set target to match the requirements of the client and the competition.

Friday, 14 June 2019

PMP vs PRINCE2 vs ITIL Comparison

A common question is: what really is the difference between PMP and PRINCE2, and which one should I go for? Here are some high-level differences between the two project management standards, PRINCE2 (PRojects IN Controlled Environments) and PMP (Project Management Professional).

Aspect | PRINCE2 | PMP
Levels | 3 (Foundation, Practitioner and Professional) | 1 (PMP)
Organization | APM Group, UK | PMI, USA
Recognition | Worldwide | Worldwide
Approach | Process-based: defines what, how, when and who for a series of management processes; dictates the right process to follow | Knowledge-based: tools, techniques and best practices that can be applied when managing projects
Roles | Defines the roles of everyone involved in a project | Focuses on the project manager's role
Content | Business case, product/project planning, project assurance, management by exception, clear definition of roles and responsibilities; these topics are not fully covered by PMP | Procurement, EVM, critical path, communications management and HR management, which are not covered by PRINCE2; details the PM's role but says little about others' roles
What | Practical project management methodology | PMP/PMBoK: comprehensive information about all aspects of project management
Numbers | 70K PRINCE2 Practitioners | 240K PMI members and 200K PMP certified
Countries | Popular in the UK and Europe | Popular in the USA, India, etc.
Style | Highly prescriptive on processes | Largely descriptive, prescriptive in places
Driver | Business-case driven | Customer-requirements driven
Questions | 75; 50% passing | 200; 61% passing
Exam duration | 1 hour (Foundation) | 4 hours
Eligibility | Not mandated | 4,500 hours of PM experience
Training | Not mandated | 35 contact hours from a Registered Education Provider
Price | 200 pounds | 555 USD
Recertification | N/A for Foundation; Practitioners should re-register within 3 to 5 calendar years of original certification | Required every 3 years; 60 PDUs needed

I think PMP takes a good approach for teaching the subject content of each knowledge area, but it is less effective at providing guidance for running a particular project.

PRINCE2 is more applicable if you are looking for a PM job in the UK or with a company that uses the PRINCE2 methods. PRINCE2 and PMP are not competitors but complements; a PM can benefit from earning both certifications.

PRINCE2 or PMP by itself may not make you a better project manager; you also need project management experience. Take the course, study and get certified; when you go back to managing projects, you are equipped with a whole new set of tools, techniques and knowledge. It is a cyclic process: as you apply your newly learned skills, you will earn more skills and, with time, become a better project manager. Those who do not put the knowledge to use will not become better PMs. If you are looking at a certification from the point of view of gaining knowledge, the two complement each other. If you are looking at it as a marketing tool, however, it depends on the country in which you are working. PMP appears to have a wider following in North America and parts of Asia, whereas PRINCE2 has its followers in Europe (and perhaps Australia). Both certifications are good and offer many benefits; I strongly encourage you to take up at least one of them.

Comparison with ITIL 

ITIL is one of the fastest growing frameworks, and it helps organizations increase IT efficiency, improve quality and control costs. ITIL is a framework mainly for the management of IT services across the entire lifecycle. Activities in the framework are combined into processes that focus on continual improvement of process, product, people and partner relations.

ITIL addresses how IT organisations as a whole should operate; PMP addresses how individual projects within the organisation can be delivered. The two are complementary. ITIL applies mainly to the IT industry, whereas PMP claims to be universal and independent of domain.

ITIL is process focussed (like PRINCE2) rather than technology focussed, meaning you don’t need to know the technology. In today’s competitive market, ITIL certification is a big plus, as it positions you as a service management expert. The world is shrinking, so it is no longer true that only IT services professionals pursue ITIL. If you earn it in addition to PMP, you will definitely have an edge in the market.

Thursday, 13 June 2019

Step-by-Step Guide to Building Dynamic Pareto Charts

While working in environments where costly software is not easily accessible across the organization, an Excel-based Pareto chart can come in handy. A dynamic Pareto chart template can be a practical aid to practitioners in this situation. By following the steps described here, practitioners can create a Pareto chart that can be updated with a click of a button, provided they input the data labels and their respective frequencies.

Building the Template

This simple chart can be created in six stages. Practitioners need not be Excel wizards; the template is designed so that they may directly input the formulas provided. Once practitioners are comfortable with all the steps, they may wish to modify the prototype further. (Note: Excel 2003 was used to create the graphs, but it should not be much of a problem to replicate this in other versions of Excel.)

1. Create Table

Open a blank worksheet and save as “Pareto.” In the sheet, create a table with a column of items and the frequency of their occurrence (Figure 1).

Figure 1: Items and Frequency in Table

The table has been filled with some nonsensical data for the sake of example. Cells E1 and F1 are deliberately left blank.

2. Input Formulas

Fill cells with the following information:

◈ In B27, enter the formula

◈ In D2, enter the formula
Copy D2 and paste the contents in D3 to D26.

◈ In E2, enter the formula
Copy E2 and paste the contents in E3 to E26.

◈ In F2, enter the formula
Copy F2 and paste the contents in cells F3 to F26.

◈ In C2, enter the formula
Copy C2 and paste the contents in C3 to C26.

At this point, the table should look like Figure 2.

Figure 2: Table after Entering Formulas

3. Create Pareto Chart

To transfer the table data into a chart, follow these steps:

◈ Select cells A1 to C12.
◈ Go to Insert -> Chart
◈ In the chart dialogue box, select the Custom Types tab. Choose the chart type Line – Column on 2 Axes. Click Finish.

4. Input Formulas into Chart

To transfer the formulas to the chart, follow these steps:

◈ Click any blank cell in the Pareto sheet. Go to Insert -> Name -> Define.
◈ Enter the name “DataLabel” (Figure 3).
◈ In the Refers to box, input the formula
◈ Click OK.

Figure 3: Define Name Dialogue Box

◈ Click any blank cell in the Pareto sheet. Go to Insert -> Name -> Define.

◈ Enter the name “Frequency.”

◈ In the Refers to box, input the formula

◈ Click OK.

◈ Click any blank cell in the Pareto sheet. Go to Insert -> Name -> Define.

◈ Enter the name “Cumulative.”

◈ In the Refers to box, input the formula

◈ Click OK.

◈ Right-click on the chart, just within the edge. Select Source Data. Under the Series tab, click Frequency.

◈ In the Category (X) axis labels field, input the formula

◈ In the Values field, input the formula
=Pareto.xls!Frequency (Figure 4).

◈ Click Cumulative in the series field. In the Values field, enter the formula

◈ Click OK.

Figure 4: Source Data Dialogue Box

5. Create Macro to Update Pareto

To easily update the Pareto, follow these steps:

◈ Go to Tools -> Macro -> Record New Macro.
◈ Type “Pareto” in the Macro name field.
◈ In the space for Shortcut key, type “P”.
◈ Click OK.
◈ Select the range A1 to F26.
◈ Go to Data -> Sort.
◈ In the Sort by drop down list, choose Column B. Click the Descending radio button.
◈ Click OK.
◈ Click Stop recording on the macro menu.

The dynamic Pareto template is now ready. Practitioners may input different label names and their respective frequencies. To generate a Pareto chart, simply hit Shift + Control + P.
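
For readers who want the same result outside Excel, here is a sketch in Python of the computation the template performs: sort the categories by frequency, descending, then accumulate percentages. The category names and counts are hypothetical:

```python
# Sketch: the core Pareto computation - sort by frequency (descending),
# then compute the cumulative percentage column used for the line series.
def pareto_table(data):
    total = sum(data.values())
    rows, cum = [], 0.0
    for label, freq in sorted(data.items(), key=lambda kv: kv[1], reverse=True):
        cum += freq
        rows.append((label, freq, round(100.0 * cum / total, 1)))
    return rows

# Hypothetical defect tallies
counts = {"Scratch": 40, "Dent": 25, "Crack": 20, "Stain": 10, "Other": 5}
for row in pareto_table(counts):
    print(row)
```

The bar series comes from the frequency column and the line series from the cumulative-percentage column, exactly as in the Line – Column on 2 Axes chart built above.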

6. Modify

The chart can be further modified by going to View -> Toolbars -> Forms. From the resulting toolbox, choose the Command button. Drag the button onto the sheet and size it as appropriate. When releasing the button, a dialogue box should appear, asking about assigning a macro. Choose Pareto from the list. Rename the command button Create Pareto.

After hiding Columns D, E, F and row 27, the final Pareto should look like Figure 5.

Figure 5: Completed Dynamic Pareto

Whenever the input fields are updated, practitioners can simply hit the command button, and voila! The Pareto chart is generated.

Wednesday, 12 June 2019

Characterizing the Measurement Process

Measurement system analysis (MSA) can be fun, always impresses the customer, and can often result in an important surprise or two. Learn why gage R&R is only one part of the equation for achieving nearly risk-free measurements of all types.

Characterizing measurement error is one of the most important yet overlooked and misunderstood aspects of any measurement process. Gage R&R is used in many forms to assess the measurement precision (spread). Determining measurement accuracy (central location) requires an understanding of stability, bias and linearity. Indeterminate measurements (statistical) may also require consideration of response-to-control (RtC) variable correlation and autocorrelation. When precision and accuracy measurements are assessed in combination, the analysis is referred to as measurement capability analysis (MCA) or MSA.

Process Accuracy Measurements


Stability

Stable processes are those that are free from special cause variation. Statistical process control (SPC), scatter plots, or other forms of statistical analysis are used to measure process stability. Stability determination requires enough data sampled to cover a wide range of possible variation contributors that apply to the process being measured. Possible contributors include:

1. Part variation: piece-to-piece, raw material lot-to-lot, piece area-to-area, etc.
2. Tooling variation: cavity-to-cavity, tool-to-tool, tool wear over time, etc.
3. Human variation: operator-to-operator, supervisor-to-supervisor, set-up lead to lead, number of other tasks performed at the same time, ergonomic conditions, etc.
4. Time variation: Sample time-to-time, hour-to-hour, shift-to-shift, day-to-day, week-to-week, month-to-month, season-to-season, year-to-year, lunch and other break times, etc.
5. Location variation: machine-to-machine, building-to-building, plant-to-plant, state-to-state, country-to-country, etc.
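
One common way to operationalize the SPC stability check described above is an individuals (XmR) control chart. A minimal sketch, using hypothetical hourly measurements:

```python
# Sketch: individuals (XmR) control limits, a basic SPC stability check.
# Limits = mean +/- 2.66 * average moving range, where 2.66 ~ 3 / d2
# (d2 = 1.128 for a moving range of size 2).
def xmr_limits(xs):
    mean = sum(xs) / len(xs)
    moving_ranges = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

# Hypothetical hourly measurements of a process characteristic
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1]
lcl, center, ucl = xmr_limits(data)
print(lcl, center, ucl)
```

Points falling outside the computed limits signal special cause variation, i.e., the intervention case Shewhart distinguished from leaving the process alone.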


Bias

Bias in a sample is the presence or influence of any factor that causes the data population or process being sampled to appear different from what it actually is. To measure process measurement bias, a higher measurement authority is compared to the data average. For determinate measurements this process is referred to as calibration. For indeterminate measurements, average values are compared to a target or specification value.


Linearity

Linearity refers to measurements being statistically different from one end of the measurement space to the other. A measurement process may be very capable of measuring small parts but much less accurate when measuring large parts, or one end of a long part may be measured more accurately than the other end. Linearity is measured using measurement standards calibrated to higher authorities, traceable to NIST. Indeterminate measurement linearity is measured in the form of interaction effects.

If equipment demonstrates non-linearity, one or more of these conditions may exist:

1. Equipment not calibrated at the upper and lower end of the operating range
2. Error in the minimum or maximum master
3. Worn equipment
4. Possible poor internal equipment design characteristics

Response-to-Control (RtC) Variable Correlation and Autocorrelation

Measurements are often used to control processes that in turn affect subsequent measurements. When indeterminate measurements are taken at a slower rate than the time it takes for process changes to be fully implemented, measurement adequacy is not an issue. However, low correlation between the control variable (used to control the process) and the response variable (being measured) can mask the time a process change appears to take to be implemented; this situation exists when the response variable is not the same as the control variable, which is often the case. For example, using SPC to control a process where data is taken and process changes are made every hour, yet where poor RtC correlation masks changes that take a full day to implement, would not be adequate. RtC variable correlation is measured using regression analysis.

Similarly, autocorrelation affects measurements when correlation between paired values of mathematical functions taken at constant intervals incorrectly indicate the degree of periodicity of the function. Woodall states that much lower levels of autocorrelation than Wheeler’s suggested 0.7 minimum recommended level can have a substantial effect on a control chart’s statistical performance.
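
A quick way to screen for the autocorrelation concern above is to compute the lag-1 autocorrelation of the measurement series before trusting standard control-chart limits. A sketch with illustrative data:

```python
# Sketch: lag-1 autocorrelation of a measurement series. Values near 0
# suggest successive measurements are independent; large values suggest
# control-chart limits computed the usual way may be misleading.
def lag1_autocorr(xs):
    n = len(xs)
    mean = sum(xs) / n
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(n - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

trend = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]  # a steadily drifting series
print(round(lag1_autocorr(trend), 3))
```

Per the Woodall observation quoted above, even values well below 0.7 can distort a chart's statistical performance, so this check is worth running routinely.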

Process Precision Measurements

Gage R&R

Gage R&R statistically isolates different types of variation in the measurement process. These types of variation include:

◈ Repeatability = equipment variation = within variation
◈ Reproducibility = appraiser variation = between variation
◈ Residual or pure error
◈ Variation due to interaction effects. For example, out of several inspectors, one might have a tendency to read one gage differently than others

Gage R&R can be applied to any kind of measurements (attribute or variables, indeterminate or determinate).

There are many overlapping methods outlined in the literature that can be used to perform Gage R&R. A few of these methods are as follows:

1. Analysis of variance (ANOVA) method
2. Average and range method
3. Within part variation (WIV) method
4. Automotive industry action group (AIAG, Southfield, MI) method
5. Short range method for non-destructive testing
6. Short range method for destructive testing
7. Long range method for non-destructive testing
8. Long range method for destructive testing
9. The Instantaneous method (one appraiser for equipment variation only).

The two most common method types used and supported by statistical software are the ANOVA method and the average and range method.
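
The average and range idea can be sketched in simplified form. This is an illustration of the concept only, not the full AIAG calculation (which adjusts reproducibility for repeatability); the operator names and measurements are hypothetical:

```python
# Simplified sketch of the average-and-range idea behind gage R&R:
# repeatability (equipment variation) from the average range of repeat
# trials, reproducibility (appraiser variation) from the spread of the
# operator averages. NOT the full AIAG procedure.
import statistics

# trials[operator][part] = repeat measurements of the same part (hypothetical)
trials = {
    "op_a": {"part1": [10.1, 10.2], "part2": [12.0, 11.9]},
    "op_b": {"part1": [10.4, 10.5], "part2": [12.3, 12.2]},
}

d2 = 1.128  # d2 constant for subgroups of size 2

# Repeatability: average within-cell range divided by d2
ranges = [max(m) - min(m) for parts in trials.values() for m in parts.values()]
repeatability = (sum(ranges) / len(ranges)) / d2

# Reproducibility: spread of the operator averages
op_means = [statistics.mean([x for m in parts.values() for x in m])
            for parts in trials.values()]
reproducibility = statistics.pstdev(op_means)

print(round(repeatability, 4), round(reproducibility, 4))
```

In this toy data the operators repeat themselves tightly but disagree with each other, so reproducibility dominates; a real study would use the AIAG constants and more parts, trials and appraisers.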

Other MSA Components

Other components that might be included in MSA are precision-to-tolerance ratios, measurement capability indices (Cpk for measurement system capability), procedures, plans or protocol, and summary reports.
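
The precision-to-tolerance ratio mentioned above is straightforward to compute; a sketch with assumed values:

```python
# Sketch: precision-to-tolerance (P/T) ratio, a common MSA acceptance metric.
# P/T = k * sigma_ms / (USL - LSL); k = 6 here, though some references
# use 5.15. Ratios near 0.1 are commonly treated as acceptable.
def pt_ratio(sigma_ms, lsl, usl, k=6.0):
    return k * sigma_ms / (usl - lsl)

# Hypothetical: measurement-system sigma of 0.05 against a 10.0 +/- 1.5 spec
print(round(pt_ratio(0.05, 8.5, 11.5), 3))
```

A small P/T ratio means the measurement system consumes little of the tolerance band, leaving most of it available for genuine process variation.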

Important uncertainty contributors to consider when documenting related contextual information include:

◈ Environment

◈ The reference element of the measuring equipment

◈ The measuring equipment and setup

◈ Software and calculations

◈ The metrologist

◈ The measuring object

◈ The definition of the measurand (standard or perfect / ideal measurement)

◈ The measuring procedure

◈ Physical constants

◈ Intentional bias arising from cost or resource concerns. The “operator model” is a relatively recent concept developed by ISO Technical Committee 213 that accounts for intentional uncertainty. This is probably the most overlooked uncertainty contributor.

Monday, 10 June 2019

Top 7 Benefits of Having ITIL Skills

The challenges in digital data management are getting more complex because of the increasing amount of data required by businesses. IT Infrastructure Library (ITIL) certification gives IT professionals the proficiency to structure and implement a tailor-made IT service management strategy with a deep understanding of particular requirements. ITIL management allows collecting, analyzing and distributing data by following a time-tested methodology. As more businesses realize the benefits of ITIL management, demand for ITIL-certified professionals is increasing fast in almost all business sectors, including education, e-commerce and healthcare.

ITIL Certification – A Qualification by Choice: 

The tiered structure of ITIL certification allows candidates to choose the certification type and level according to their personal career objectives. ITIL certification, one of the top IT certifications, is offered at five levels to help IT professionals boost their careers in a progressive manner.

The ITIL intermediate certification modules are designed to produce competent ITIL experts in specific areas such as:

◒ OSA (operational support & analysis)
◒ PPO (planning, protection and optimization)
◒ RCV (release, control and validation)
◒ SOA (service offerings and agreements)
◒ ITIL service operation
◒ ITIL service transition
◒ ITIL managing across the lifecycle …

7 Key Benefits of Having ITIL Skills:

ITIL is a globally recognized set of best-in-class management practices. ITIL certification helps you learn the widely used concepts, terms and processes that improve an organization’s growth. More organizations across almost all business sectors are accepting ITIL implementation as a necessity for surviving in a competitive market environment. Before joining any particular ITIL training course, you should know the benefits for performance and career growth. The key benefits experienced by most ITIL-certified experts, irrespective of their role in services management, are:

1. Worldwide Recognized Qualification:

ITIL certification sets an international benchmark for your qualification and service management skills. Leading international service providers recognize ITIL certification as a prerequisite for services management experts; therefore, it helps boost your career even at the international level.

2. Acquaintance with Standard Language:

Many service managers use advanced service management processes without knowing the standard terminology behind them. ITIL certification helps you learn the standard language and processes used worldwide.

3. Smart Approach to Improvement Initiatives:

Smart professionals work in smarter ways to demonstrate their skills and value. ITIL courses and workshops equip you to identify the potential of improvement initiatives.

4. Helps to Introduce Proactive Culture: 

ITIL training builds the confidence to innovate new ways of improving customer satisfaction. It helps you focus better on customers' expectations and users' experience. The expertise gained in using the ITIL framework and tools helps you improve service delivery quality by developing a proactive culture.

5. Instills Confidence & Refines Capabilities: 

The quality of service delivery depends on the capabilities of the people involved; organizations need confident and capable service management experts to compete with rivals. ITIL certification courses are designed to produce confident service managers with the improved capabilities needed to address challenges in specific areas.

6. Makes You A Key Contributor To Organization’s Growth:

An ITIL certification course improves your competence, productivity and ability to build better relationships with customers and within the organization. ITIL expertise helps you make processes more cost-efficient by optimizing the use of available resources. This holistic approach to improving ROI, with an eye on risk factors, helps the organization achieve sustainable growth.

7. Career Boost: 

Successful completion of an ITIL course gives you a globally recognized qualification and expertise, which tends to translate into better pay. Many project experts report a salary increase of around 15 percent after getting ITIL certification. Beyond salary, you gain a wider landscape with more opportunities to progress.

ITIL Certification Course: Is It For You? 

More and more organizations worldwide are adopting the time-tested ITIL framework, so job trends make ITIL training a smart choice for IT service professionals. ITIL certification courses are designed to benefit:

◈ Professionals engaged in another business sector but planning to move to a company providing IT services
◈ IT service management professionals willing to update their skills
◈ Mid-level and senior-level IT professionals
◈ IT consultants

Friday, 7 June 2019

ITIL - Key Benefits

It is remarkable that the IT Infrastructure Library, or ITIL®, is now 20 years old. Now in its third version, ITIL is the most widely accepted framework for IT Service Management in the IT world. It is a practical, no-nonsense approach to the identification, planning, delivery and support of IT services to the business.

Since ITIL is an approach to IT "service" management, the concept of a service must be discussed. A service is something that provides value to customers. Services that customers can directly utilize or consume are known as "business" services. An example of a business service with widespread applicability across industries is Payroll. Payroll is an IT service used to consolidate information, calculate compensation and generate paychecks on a regular periodic basis. Payroll may rely on other "business" services, such as "Time Tracking" or "Benefits Management", for the data needed to calculate an employee's correct compensation for a given time period.

The Core Benefits of ITIL Include:

◈ Alignment with business needs. ITIL becomes a benefit to the business when IT can proactively recommend solutions in response to one or more business needs. The IT Strategy Group recommended in Service Strategy, and the execution of Service Portfolio Management, give IT the opportunity to understand the business's current and future needs and develop service offerings that address them.

◈ Negotiated achievable service levels. Business and IT become true partners when they can agree upon realistic service levels that deliver the necessary value at an acceptable cost.

◈ Predictable, consistent processes. Customer expectations can be set, and are easier to meet, through the use of predictable processes that are applied consistently. As well, good-practice processes are foundational and can help lay the groundwork for meeting regulatory compliance requirements.

◈ Effectiveness in service delivery. Well-defined processes, with clearly documented accountability for each activity as recommended through the use of a RACI matrix, can significantly increase the efficiency of processes. In conjunction with efficiency metrics that indicate the time required to perform each activity, service delivery tasks can be optimized.

◈ Quantifiable, improvable services and processes. The adage that you can't manage what you can't measure rings true here. Consistent, repeatable processes can be measured and therefore can be better tuned for accurate delivery and overall effectiveness. For example, suppose that a critical success factor for incident management is to reduce the time to restore service. When predictable, consistent processes are used, key performance indicators such as Mean Time to Restore Service can be captured to determine whether this KPI is trending in a positive or negative direction, so that the appropriate adjustments can be made. Additionally, under ITIL guidelines, services are designed to be measurable. With the proper metrics and monitoring in place, IT organizations can monitor SLAs and make improvements as necessary.

◈ A common language – terms are defined.

Wednesday, 5 June 2019

Use Realistic Tolerancing and Poka-yoke for Process Control

Many organizations attempt to improve their processes. The first instinct for many new quality personnel is to attempt 100 percent inspection. A more experienced approach to quality control is to control your inputs to eliminate defects. Consider the following scenario for an organization that had a limited capital budget but needed to ensure its product was safely delivered in its own packaging. The organization used poka-yoke (or mistake proofing) and realistic tolerancing to control its process improvements regarding seal strength.

The Seal Strength Situation

This company was focused on reducing the percentage of poorly sealed product. They knew the relationship between the output – the force to break a seal, referred to as “seal strength” – and three inputs – time, pressure, and temperature. Other inputs affecting seal strength are polymer thickness and seal bar variability. In addition, all processes have noise variables which can impact performance.

The company had previously addressed the input variable of time. They instituted a way to confirm that the length of sealing time – the time when pressure is applied while hot air is blown onto the surfaces – is maintained within a narrow range. Contact sensors were inexpensively installed along with a timer. If the time requirements are not met, then the package is rejected using the same mechanism already in place for low package weight controls.

Applying Poka-yoke to the Sealing Process

A simple designed experiment showed that to achieve satisfactory sealing results, there was a tight range of acceptable values for the pressure used to clamp the materials. Past standard operating procedures had incorrectly documented this range, so team members were retrained to seek the desired results. The maintenance staff added visual controls indicating the proper pressure settings, and the equipment was programmed to shut down the sealing process if the hydraulic pressure went outside the desired range. This is an example of a poka-yoke, a control technique new to the company: when a critical input – in this case, the hydraulic pressure – is outside its acceptable range, defects are prevented from being passed along in the process by shutting down the sealing process.
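
The interlock logic behind such a poka-yoke is simple enough to sketch in a few lines. The pressure limits below are hypothetical – the article does not give the company's actual settings – but the pattern (halt the process on an out-of-range input, rather than inspect the output) is the one described above:

```python
# Hypothetical poka-yoke interlock: stop the sealer when a critical
# input (hydraulic pressure, in psi) drifts outside its allowed range.
PRESSURE_MIN_PSI = 60.0   # assumed lower limit, for illustration only
PRESSURE_MAX_PSI = 80.0   # assumed upper limit, for illustration only

def sealing_action(pressure_psi: float) -> str:
    """Decide what the sealer should do for a given pressure reading."""
    if PRESSURE_MIN_PSI <= pressure_psi <= PRESSURE_MAX_PSI:
        return "run"
    # Out-of-range input: shut down so no defective seal can be made.
    return "shutdown"

print(sealing_action(72.0))  # prints "run"
print(sealing_action(85.0))  # prints "shutdown"
```

The defect is blocked at the input, before any product is made – the opposite of relying on 100 percent inspection downstream.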

The designed experiment also revealed that the temperature of the air forced across the materials being sealed had the largest influence on seal integrity. Unfortunately, the supply of heated air varied significantly because of the source and seasonal and daily ambient temperature swings. The team did not have much capital left to address the widely varying temperature. Although the instrumentation engineer wanted to invest in a more advanced control scheme, the lack of funding led him to propose a simpler idea. He suggested that the team use a poka-yoke approach here, too, similar to that used for sealing pressure. If the heated air temperature was too high or too low, the sealing process would stop and the partially sealed package would be rejected. What was needed then was the acceptable range of air temperatures.

Applying Realistic Tolerancing

To determine that range, the team applied the technique of realistic tolerancing – a way of controlling a process that uses regression analysis: if the specification limits of the output Y are known, the corresponding acceptable range of the input X can be derived from the fitted regression. After implementing the new pressure controls, the team gathered process data on seal strength and the temperature of the hot air delivered to the packaging material. A regression analysis was done with 95 percent prediction bands. (Note: Prediction bands are typically set at 95 percent, but they can be set at 99 percent depending on the comfort level of the business; if the regression fit is good, with a high R², then 99 percent is more appropriate. Prediction bands show the region where individual observations are expected to fall – 99 percent prediction bands are wider, which narrows the allowed range of the Xs.)

Realistic tolerancing uses these prediction bands and seal strength specifications to determine the range of air temperature allowed for successful seals. The fitted line plot shown in Figure 1 shows that to achieve the optimized seal strength levels, specified as between 275 lb-f (pound force) and 375 lb-f, the temperature should be maintained between 189 degrees Fahrenheit and 231 degrees Fahrenheit.

Figure 1: Optimized Air Temperatures

If more noise could be eliminated from the process, as represented by less variation around the best-fit line, the acceptable range of temperature widens, as shown in Figure 2. This noise includes the imprecision of the measuring procedure/device for seal strength (which is why Six Sigma promotes the need to confirm proper precision of critical measures). Although not part of this example, consider the hypothetical situation shown in Figure 2: with less noise in the process, the acceptable range of hot air temperatures widens to 183 degrees Fahrenheit through 241 degrees Fahrenheit.

Figure 2: Optimized Air Temperatures After Noise Reduction

A team member suggested using 99 percent prediction bands, which narrowed the acceptable temperature range to 187 degrees Fahrenheit through 237 degrees Fahrenheit (Figure 3). This allows for fewer defects because more of the process variation is accounted for. Another (more advanced) technique that may be used here is guard banding, in which the limits for the input are tightened by narrowing the output's specification limits by as much as 1.5 sigma on each side. Figure 3 shows the same regression and tolerance analysis as Figure 2, but with lower variation and the use of 99 percent prediction bands.
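
The guard-banding arithmetic itself is straightforward; a small sketch, using an assumed process standard deviation (the article does not report one), shows how the output specification is tightened before solving for the input range:

```python
# Guard banding (sketch): pull each output spec limit in by up to
# 1.5 sigma before deriving the allowed input range.
sigma = 10.0             # assumed seal-strength std. dev. (lb-f)
LSL, USL = 275.0, 375.0  # seal strength spec from the article

guard = 1.5 * sigma
LSL_guarded = LSL + guard
USL_guarded = USL - guard
print(LSL_guarded, USL_guarded)  # 290.0 360.0
```

The tightened limits (here 290–360 lb-f) are then used in place of the original spec in the tolerancing analysis, which in turn narrows the allowed input range and adds a margin of safety against measurement and process drift.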

Figure 3: Optimized Air Temperatures with 99% Prediction Bands

The company was satisfied with these results; customer complaints for unsealed product were eliminated. They used realistic tolerancing with 99 percent prediction bands and achieved success with the reduced variation (see Figure 3). The company began to encourage its continuous improvement teams to use poka-yoke and realistic tolerancing not only to sustain these gains, but also to find new areas in which their processes could be mistake-proofed. Keep in mind that realistic tolerancing is most powerful when just a few inputs affect critical process results.

Maintaining the Gains

Once a process has been improved, controls must be put in place. Many techniques can be used to sustain the gains; poka-yoke and realistic tolerancing can be used separately or together to prevent defects in the process. Consider these control techniques along with others such as statistical process control. Any controls put in place should be documented, which allows for easy confirmation of how improved results were achieved in the past.