Wednesday 8 January 2020

Doing Some Software Six Sigma and Agile Mythbusting

These are interesting times in the world of software development. Urgency is high around reducing development cycle times and costs and growing the business value of software and IT assets. Six Sigma, CMMI, Lean software development and Agile methods are variously working together (or not quite) to address that urgency and show results. Out of the lively discussions on the subject have come a couple of notions about Six Sigma that started with a grain of truth but have since hardened into accepted truths that are out of step with current facts and experience. At least two of these “myths” deserve to be confronted and shaken up.

Myth No. 1: Six Sigma is about statistics…software development is not statistical, therefore there is no fit.

Six Sigma Is About Statistics


As the name implies, Six Sigma certainly has roots in statistics, but it has evolved to be much more than that. Figure 1 illustrates that the first incarnation of Six Sigma, at Motorola in the 1980s, had a strong focus on reducing defects and their rate of escape to customers. Six Sigma in those days revolved around defects per unit (DPU), defects per million opportunities (DPMO), and sigma levels. Even then, DPU and sigma levels were more of a common language for assessing process and product capability than a reflection that all improvements and tools were statistical. Those measures helped Motorola shift its manufacturing view from yield (good units) to defects within units, surfacing useful details on causes and cures.

Figure 1: Six Sigma Evolution
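
To make those measures concrete, here is a minimal sketch (in Python, with invented inspection counts) of how DPU, DPMO and an approximate sigma level relate; the sigma conversion uses the conventional 1.5-sigma shift.

```python
from statistics import NormalDist

def dpu(defects: int, units: int) -> float:
    """Defects per unit."""
    return defects / units

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Approximate sigma level, applying the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5

# Hypothetical inspection data: 34 defects across 1,000 units, 10 opportunities each
print(dpu(34, 1_000))              # 0.034 defects per unit
print(dpmo(34, 1_000, 10))         # 3,400 DPMO
print(round(sigma_level(3.4), 1))  # 6.0 -- the famous 3.4 DPMO benchmark
```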

Companies using Six Sigma to reduce defect rates came to understand the need to focus improvement work around business drivers (reducing the most important costs – not chasing 3.4 DPMO for everything that moves) and to reduce wasted time (which is not always 1:1 connected with defects). That Lean performance perspective is illustrated in the second box in Figure 1. While Six Sigma and Lean grew out of different origins, recognition of their natural synergy led to their integration as “Lean Six Sigma” some time ago.

In most of today’s markets, reducing defects, costs and wasted time is a price of admission. Strong competition in those areas reduces profit margins and presents customers with several high-quality choices. Growing a business (and sometimes simply sustaining one) can hinge on a company getting ahead of the discovery curve on the next products, features and services of value. Design for Six Sigma (DFSS) informs a development process about prospective value and risk – and their intersection with product and process architecture, design and feature decisions.

In all the above, statistics can be a means for qualifying and quantifying meaningful patterns and differences, but they are not an end in themselves or the only approach. Language data, time and state data, and graphical and cluster analysis are examples of non-statistical kinds of information and methods that may be used in Six Sigma work. In basic terms, the Six Sigma thought process for DMAIC and DFSS projects is about:

◉ Working on the right problem or opportunity.
◉ Quickly reaching shared understanding with customers, the business and technology about the relevant environment(s) and (stated and latent) requirements.
◉ Identifying practical success measures (and baselines and targets).
◉ Identifying factors that may influence success (value and risk).
◉ Gathering facts and data about causes and/or dynamics that matter in problem-solving or design.
◉ Considering a full range of solution alternatives, using the facts to select the best (a weighted-criteria sketch of this step follows the list).
◉ Demonstrating practical results – first with pilot, model or prototype (to reduce risk and learn about tuning and robust design).
◉ Delivering sustainable and scalable gains.
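
One common aid for that selection step is a weighted decision matrix. A minimal sketch, with invented alternatives, criteria, weights and scores:

```python
# Hypothetical weighted decision matrix for selecting among solution alternatives.
criteria_weights = {"customer value": 0.5, "risk reduction": 0.3, "cost to build": 0.2}

# Scores on a 1-9 scale (higher is better), invented for illustration.
alternatives = {
    "rewrite module":  {"customer value": 8, "risk reduction": 3, "cost to build": 2},
    "refactor module": {"customer value": 6, "risk reduction": 7, "cost to build": 6},
    "buy component":   {"customer value": 7, "risk reduction": 5, "cost to build": 8},
}

def weighted_score(scores: dict) -> float:
    return sum(criteria_weights[c] * s for c, s in scores.items())

# Rank alternatives by weighted score, best first.
for name, scores in sorted(alternatives.items(), key=lambda kv: -weighted_score(kv[1])):
    print(f"{name}: {weighted_score(scores):.1f}")
```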

When statistics are helpful anywhere on that list, a well-rounded Six Sigma practitioner needs to know if and how to bring them to bear. When the thought process involves other kinds of data or analysis, that is okay too – provided that the significance of findings and results can be documented.

Software Development Is Not Statistical


This is partly true, but not completely. If one takes a strict view of the software development process, the availability and quality of useful process measures and the inherent variation in the work, there is a lot for a statistician not to like. Software development is not statistical in the same way that hardware development is. That does not mean there are no areas where statistics can be usefully applied.

The table below illustrates some distinctions between hardware and software. In hardware it is relatively easy to build a functioning first article – put the right few engineers and select materials in a lab for a while and they will build a one-off of almost anything. In hardware, the value and risks associated with volume delivery of many exact copies of the one-off are huge. Mechanical, electrical and physical variation in the supply of components and process conditions are a key focal point for hardware Six Sigma. Waste (scrap, excess inventory, etc.) is pretty easy to see in hardware, and the process and product factors driving it are often quite measurable (dimensions, temperatures, rates, etc.).

Contrasting Six Sigma Value and Issues for Hardware and Software

First Article Build
◉ Hardware: a one-off is easy; it is volume delivery of identical units that is difficult.
◉ Software: a full build is challenging (effort, duration, defects, net value); defects and waste are invisible; inputs and result are intangible.

Volume Delivery (1…n Copies)
◉ Hardware: the factory contends with component variation, process variation, defects, delays and scrap; defects and waste are visible; inputs and result are tangible.
◉ Software: making 1…n copies is easy, but variation can still be the enemy in areas like the target hardware, user-to-user differences and use-environment differences.

Capability and Robust Design
◉ Hardware and software alike must anticipate performance under the whole range of real-world use conditions.

In software, a fully functional first-article build is not trivial. Reproduction of 1 through n copies is not hard. Anyone can easily make digitally faithful copies of install kits. The value and risk are not in the tangible “tolerances” as they are in hardware, but in the less tangible variation and measures of success related to schedule, cost (mostly effort cost) and quality (things missing, wrong or extra) from pre-requirements through installation and hand-off.

While statistics do not take the shape of a “million widgets” problem, as in hardware, they can help answer useful questions about what to measure (and how and how much), how to make best use of incomplete data, how to separate signal from noise in data samples, and how to identify and learn from patterns and contrasts in a set of observations.
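
As a hypothetical illustration of separating signal from noise, a two-proportion z-test can suggest whether an apparent difference in escaped-defect rates between two releases is a real change or just sampling noise; the counts below are invented for the example.

```python
from statistics import NormalDist

def two_proportion_pvalue(defects_a: int, n_a: int, defects_b: int, n_b: int) -> float:
    """Two-sided z-test for a difference between two defect-escape rates."""
    p_a, p_b = defects_a / n_a, defects_b / n_b
    pooled = (defects_a + defects_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Invented data: 12 escaped defects in 400 work items vs. 4 in 380 after a change
p = two_proportion_pvalue(12, 400, 4, 380)
print(f"p-value: {p:.3f}")  # the smaller the p-value, the less likely the gap is noise
```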

Myth No. 2: Design for Six Sigma imposes a waterfall life cycle model – emphasizing “requirements up front” in direct conflict with Agile insights and experience.

Design for Six Sigma Imposes a Waterfall Life Cycle Model


Design for Six Sigma has dug a bit of this hole for itself. Descriptions of tollgate reviews, and language that stresses the value of understanding requirements better up front, could lead one to believe that DFSS is part and parcel of a waterfall model.

The DFSS thought process, though, is flexible and sits above the distinction between waterfall and iterative life cycles. Figure 2 illustrates that regardless of the size of work batches or their sequence, development work still shakes out into some small number of familiar activities. In whatever sequence or frequency these activities are visited, they make use of, and deliver, certain kinds of data and incremental knowledge products (shown in the buffers in Figure 2). DFSS contains a set of tools and proven approaches that can support any of those activities: gathering information, learning, formulating and selecting decisions, and tuning and documenting results.
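
A rough sketch of that idea, with invented activity and buffer names: the same activities read from and write to the same knowledge-product buffers whether they are visited once in one large waterfall batch or many times in small iterations.

```python
# Hypothetical knowledge-product buffers shared by all development activities.
buffers = {"requirements": [], "design_decisions": [], "test_results": []}

def understand_requirements(batch: str) -> None:
    buffers["requirements"].append(batch)        # deliver incremental knowledge

def design(batch: str) -> None:
    latest_needs = buffers["requirements"][-1]   # make use of earlier knowledge
    buffers["design_decisions"].append(f"design for {latest_needs}")

def verify(batch: str) -> None:
    buffers["test_results"].append(f"results for {batch}")

activities = [understand_requirements, design, verify]

# Waterfall: one large batch visits each activity once.
for activity in activities:
    activity("whole product")

# Iterative: small batches revisit the same activities; buffers fill incrementally.
for story in ["story 1", "story 2", "story 3"]:
    for activity in activities:
        activity(story)

print(len(buffers["requirements"]), "requirement increments,",
      len(buffers["test_results"]), "verification increments")
```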

Figure 2: DFSS As a Set of Activities and Knowledge-Product Buffers

DFSS Emphasizes ‘Requirements Up Front’


On the surface, there is a strong DFSS message about understanding requirements “up front.” In practice, though, there is not as much incompatibility with Agile insight as it may seem. It all comes down to managing value and risk. When the value of moving ahead, the risk of being wrong and the value of doing more requirements work all point toward forward progress, DFSS allows it, with notes about what still needs to be learned.

Figure 3 illustrates the ends of the spectrum of Agile versus waterfall thinking about requirements. Many projects that might describe themselves at one end of that spectrum find that the practical answer is somewhere in the middle. Understood and applied right, DFSS can do a lot to streamline the work anywhere on that continuum.

Figure 3: Spectrum of Agile Versus Waterfall Thinking About Requirements

Related to the idea that DFSS forces “all requirements work” up front is the notion that more attention to requirements understanding means more unnecessary features and bloated code. It is a little counterintuitive, but better requirements understanding can translate into crisper decisions about what not to do – reducing unnecessary features and code size. Requirements understanding brings light to the decision-making – not necessarily weight to the code.
