Divide and Conquer

August 27, 2006
Filed under: Process, Project management 

Ideally, the process of software development consists of a series of appropriately selected steps, each of which clarifies some aspect of the system as you proceed from inception to delivery. Done properly, there are no “and then a miracle happens” steps along the way, as there can be when you dive into the code right after throwing some concepts together on the back of an envelope.

The goal is to have precisely the right number of steps: none that fail to add clarity in some respect, and enough to make the process predictable and repeatable. There is generally no way to identify these in advance; the right set depends entirely on the nature of the product being built and the knowledge and experience of the team involved. At the highest level we can say that we need to know what we want to build before we decide how we are going to build it, and decide how before we actually build it, but this is insufficient as practical guidance for the team.

On many large projects, or projects developed under the guidance of an exhaustively detailed methodology, there can be a great deal of waste in dogmatically applying steps that add no intrinsic value. I’ve been involved in the development of large systems where the decomposition of the system into hardware and software components (a critical step when done properly) was essentially a cut-and-paste exercise of moving system-level requirements into their appropriate subsystem-level specs.

What was missing in this exercise was the analysis and clarification of the interfaces between the subsystems, the very thing needed to support the system’s ability to achieve its goals. We had literally performed functional decomposition, which the textbooks will tell you is a critical stage of software development. We should have seen it, more appropriately, as functional decomposition and alignment: ensuring that the pieces we broke the system into held together. The glue was the missing value-add, and the key part of this activity.

For example, let’s say that we have a system that needs to shut down if the temperature gets too high. We have a system-level requirement (which needs to be more precisely specified, of course) that could break down into a requirement for the hardware to provide a temperature sensor and a requirement for the software to provide a controlled shutdown feature. This is functional decomposition, to be sure, but there is a little something missing.

We need to understand the mechanism by which the sensed temperature is transferred to the software. Does the software poll periodically (and at what frequency), or does a hardware-based trigger send a signal to the software? Are there conditions where the system may be too busy to poll at the required frequency, potentially preventing a timely shutdown? What are the valid ranges and parameters associated with the messages or signals, and are they subject to change at design time or at run time? Does the information need to be cached for audit purposes? Does the shutdown event need to be logged?

Perhaps more than a little something missing, and I’m sure this is not an exhaustive list of questions. This is all requirements-level work, too.
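
To make the missing glue concrete, here is a minimal sketch in C of what pinning those answers down might look like, assuming a polled sensor. Every name, threshold, and rate below is hypothetical; each one stands in for an interface decision that belongs at the requirements stage.

```c
/* Minimal sketch: the "glue" decisions for the over-temperature
 * shutdown example, captured as code. All names, thresholds, and
 * rates are hypothetical -- they stand in for decisions that belong
 * at the requirements stage, not at integration time. */

#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define POLL_PERIOD_MS    100  /* decided: software polls at 10 Hz        */
#define TEMP_VALID_MIN_C  -40  /* decided: sensor's valid reporting range */
#define TEMP_VALID_MAX_C  125
#define TEMP_SHUTDOWN_C    85  /* decided: threshold that forces shutdown */

/* Stand-in for the hardware interface under negotiation. */
static int16_t hw_read_temp_c(void) { return 90; /* placeholder reading */ }

/* Stand-in for the audit log the requirements call for. */
static void audit_log(const char *event, int16_t value)
{
    printf("AUDIT: %s (%d C)\n", event, value);
}

/* One poll cycle: returns true if a controlled shutdown must begin. */
static bool check_temperature(void)
{
    int16_t t = hw_read_temp_c();

    if (t < TEMP_VALID_MIN_C || t > TEMP_VALID_MAX_C) {
        audit_log("sensor reading out of range", t);
        return true;  /* decided: fail safe on an invalid reading */
    }
    if (t >= TEMP_SHUTDOWN_C) {
        audit_log("over-temperature shutdown", t);
        return true;  /* decided: shutdown events are logged */
    }
    return false;
}

int main(void)
{
    if (check_temperature())
        printf("initiating controlled shutdown\n");
    return 0;
}
```

Whether or not anything is actually coded this early, the point stands: each constant and branch above answers one of the questions from the list, and answering it at the requirements stage is far cheaper than discovering it at integration.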

This glue is the value-added information that is the critical output of this stage of analysis, and it is where the analysis most often falls short. Without it, the coordination of the subsystems to achieve the overall goal is left to be discovered later. That discovery will happen, but in the context of individual decisions or integration-stage surprises: often made incorrectly, or found far later in the project through trial and error, where the cost is significantly higher.

In most projects, this higher cost will not be measured or well understood.

We spend most of our time in the implementation stage, inefficiently making what should have been requirements decisions. This failure to add value can happen at every stage of development, and we need to take care to add the appropriate understanding at each one.

We must ensure that all the work we do adds the appropriate understanding to the project. Dividing the system into smaller chunks is important, but relatively trivial to perform. Adding the glue that binds these chunks together is the tough work, and the work most often neglected until later. Discovering this information as early as possible is the key to conquering your projects. – JB
