
Continuous Delivery: You're Doing It Wrong!

I see many successful DevOps teams (yes, there are quite a few already!) at a point in their journey where they have learned to work together as one, without any walls between them, and are now focused on increasing delivery speed. That's when they put Continuous Delivery (CD) at the top of their agenda.

When most people think about CD, they think of improving the build-test-deploy-operate cycle. They don't think about how to accelerate and improve the intake process. Ensuring that quality is built into the application (not tested for after the fact) is the key to achieving the desired acceleration through CD. Testing and quality assurance organizations are definitely trying to transform themselves to become the enablers of that acceleration in the CD pipeline.

The Gap in CD Acceleration

Organizations working to achieve CD soon realize that they are great at building things right and with speed. However, they are still unsure as to whether they are building the right things. There is a difference between the two and, in my experience, this is a common gap in CD initiatives.

Just last week, I was invited to an all-day DevOps transformation workshop at a large multinational company that gathered its business, development, testing, operations, PMO, and enterprise architecture teams. It was the first session in their journey, and the objective that day was for the different teams to agree on common terminology and identify the priority areas they wanted to start improving. They asked me for coaching to make sure they were going down the right path.

It was clear that day that all conversations inevitably converged on how to accelerate the code development, testing, and deployment to the different environments. However, no one questioned whether they were using the right inputs (i.e., clear, unambiguous requirements) before they started coding, testing, and deploying. That's when I jumped in to drive the session.

So, how do you ensure you are building the right things and building them fast? You focus on improving and accelerating the requirements-gathering process, regardless of whether you're in a traditional or an Agile organization.

A good example of this is the way requirements are communicated across different teams. Requirements are the foundation of everything in the software development lifecycle (SDLC), and yet, after 30 years, they are still communicated the same way: as written text. They are captured in Word or Excel documents or in requirements management tools, and the process is completely manual.


Naturally, a manual process becomes a bottleneck in a highly automated CD pipeline where the ultimate goal is speed with quality. Worse, requirements written in free text are often ambiguous and open to interpretation. Ambiguity is the cause of approximately 56% of defects introduced in the application code. The first version of the famous figure below was released over 10 years ago by projectcartoon.com, but unfortunately, it is still relevant: each team involved in the SDLC has a different interpretation of, and different expectations for, the requirements.

[Figure: the projectcartoon.com "tree swing" cartoon, showing how each team interprets the same requirement differently]

Agile software development methodologies, and more recently CD practices, all aim to prevent this from happening. They shift the software development paradigm to short iterations and continuous feedback across teams so that communication gaps and inaccurate expectations, both of which lead to requirements ambiguity, are identified and addressed earlier in the lifecycle. This truly "shifts left" all quality-related activities to prevent defects instead of testing for defects. However, most of these activities are manual, which slows down the CD pipeline.
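One way to squeeze the ambiguity out of a requirement without slowing the pipeline down is to express it as an executable specification. Here is a minimal sketch in Python with pytest-style tests; the apply_discount function and the 10%-off-$100-orders business rule are invented for illustration, not taken from any particular tool:

```python
# Prose version of the (hypothetical) requirement:
#   "Loyal customers should get a discount on large orders."
# Ambiguous: what is "loyal"? What is "large"? How much discount?

def apply_discount(order_total: float, loyalty_years: int) -> float:
    """Orders of $100 or more from customers with 2+ loyalty years
    get a 10% discount; everything else is full price."""
    if order_total >= 100 and loyalty_years >= 2:
        return round(order_total * 0.90, 2)
    return order_total

# Executable specification: each example pins down one boundary the
# prose left open, so every team reads the requirement the same way.
def test_discount_applies_at_exact_boundaries():
    assert apply_discount(100.00, 2) == 90.00   # both thresholds met

def test_no_discount_below_order_threshold():
    assert apply_discount(99.99, 5) == 99.99    # order too small

def test_no_discount_for_new_customers():
    assert apply_discount(250.00, 1) == 250.00  # not loyal enough
```

Run with pytest, these examples are a requirements review, a test suite, and documentation at the same time, and they fail loudly the moment anyone's interpretation drifts.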

Still Too Much Manual Effort

In addition to requirements acceleration being overlooked in CD initiatives, I also see many companies I work with still doing a lot of things manually that they could be automating.

  • Test cases are still designed the same way: by reading requirements documents or user stories and writing test cases and test steps by hand. The process is manual and unsystematic, and test coverage depends on the individual writer and their understanding of the requirements (see the first sketch after this list).
  • Test automation is still not pervasive in the SDLC. Around 70% of all testing is still done manually. Even teams that have achieved better levels of automation struggle, because creating the automated tests is itself a very manual process: developers doing test-driven development (TDD) and test automation engineers still have to write the automation code by hand. Depending on the developer or engineer, automated test scripts will achieve different levels of coverage and must be maintained over time, which requires yet more manual effort.
  • Test data is a perennial bottleneck and pain point for most organizations, consuming around 50% of a tester's time. At a very simple level, teams usually address it by taking a copy of production data, masking it, and making it available to preproduction environments (see the second sketch after this list). This is typically a blend of manual and automated steps that is very time-consuming (i.e., days to weeks). Unfortunately, based on what I typically see across the organizations I work with, the data variety in production usually covers only 10-20% of the test cases, so additional time is needed to manually create, prepare, and massage data.
  • Interfaces to other systems (i.e., APIs, web services, etc.) are always challenging because, usually, 56% of those dependencies are not available to developers and testers. The interfaces either don't have the capacity required by the system under test (SUT) or simply have not been built yet. This slows down development and testing, as the interfaces have to be stubbed before the application code in scope can be tested (see the third sketch after this list).
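To make the first point concrete, here is a minimal sketch of systematic test-case design: instead of a writer inventing cases by hand from a requirements document, the input model is declared once and the cases are derived from it. The checkout parameters and values below are hypothetical:

```python
# Derive test cases from a declared input model rather than from an
# individual's reading of a requirements document.
from itertools import product

# Input model for a hypothetical checkout flow.
model = {
    "payment":  ["credit_card", "paypal", "gift_card"],
    "shipping": ["standard", "express"],
    "customer": ["guest", "registered"],
}

# Exhaustive combination; real tools prune this with pairwise or
# risk-based selection, but the principle is the same.
cases = [dict(zip(model, values)) for values in product(*model.values())]

for i, case in enumerate(cases, start=1):
    print(f"TC-{i:02d}: {case}")
# 3 * 2 * 2 = 12 cases, with coverage determined by the model,
# not by whoever happened to write the test plan.
```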
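For the test data point, here is a minimal sketch of masking a production extract before it reaches a test environment. The record layout and masking rules are assumptions; a real pipeline would also have to preserve referential integrity across many tables:

```python
# Mask direct identifiers in a production record while keeping the
# fields that tests actually depend on.
import hashlib

def mask_record(record: dict) -> dict:
    """Replace direct identifiers; pass non-identifying fields through."""
    masked = dict(record)
    # Deterministic pseudonym: the same customer masks the same way
    # everywhere, so joins across tables keep working.
    token = hashlib.sha256(record["email"].encode()).hexdigest()[:8]
    masked["name"] = f"Customer-{token}"
    masked["email"] = f"user-{token}@example.test"
    masked["card_number"] = "****-****-****-" + record["card_number"][-4:]
    return masked

production_row = {
    "name": "Jane Doe",
    "email": "jane.doe@example.com",
    "card_number": "4111-1111-1111-1111",
    "order_total": 142.50,   # non-identifying, kept as-is for tests
}
print(mask_record(production_row))
```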
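And for the interfaces point, a minimal sketch of stubbing an unavailable dependency so the SUT can still be exercised. The CreditCheckClient and its response shape are hypothetical:

```python
# Stand in for a downstream service that has no test capacity or does
# not exist yet, so the application logic can still be tested.
class CreditCheckClient:
    """The real client would call a partner API."""
    def score(self, customer_id: str) -> int:
        raise NotImplementedError("downstream service unavailable")

class CreditCheckStub(CreditCheckClient):
    """Canned responses standing in for the real dependency."""
    def __init__(self, scores: dict, default: int = 600):
        self.scores = scores
        self.default = default

    def score(self, customer_id: str) -> int:
        return self.scores.get(customer_id, self.default)

def approve_loan(client: CreditCheckClient, customer_id: str) -> bool:
    # Application logic under test: approve when the score is 700+.
    return client.score(customer_id) >= 700

stub = CreditCheckStub({"good-customer": 720, "risky-customer": 580})
assert approve_loan(stub, "good-customer") is True
assert approve_loan(stub, "risky-customer") is False
print("loan logic verified against the stubbed interface")
```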

So we hear a lot of talk about automation across the CD pipeline and about eliminating as many manual activities as possible to achieve maximum acceleration. However, it is clear that we still have work to do.

Requirements as the Foundation for CD

When an architect is designing a house, they start with a rough sketch or a blueprint. That blueprint is then reviewed iteratively with the client. Once the client is satisfied, the architect baselines the blueprint and shares it with the other specialists in the design process, such as the interior designer and the electrical engineer. They then add their designs as layers on top of the foundational blueprint created by the architect.

Obviously, this is a very high-level take on the process of designing a house, but stay with me. You'll get the analogy coming up.

Architects and the other roles described above use computer-aided design (CAD) software in design projects. The software ties all design layers (from the different specialists) to the foundational blueprint and keeps full traceability across them. Through the CAD tool, each person can see the full project with all layers turned on, or simply turn on their own layer (i.e., plumbing, electrical) to see only what pertains to them. The main advantage is that everyone involved is working off the same foundational blueprint and everyone understands how it all fits together at the project level.

The best part is when there is a change in any of the layers. The software automatically identifies the impact of the change across all layers and prompts each affected user to take action. It even offers automated suggestions to the owner of each impacted layer on how to implement a fix. See below an example of multiple layers in a factory design project represented in a CAD tool.

[Figures: three views of a factory design project in a CAD tool, with different design layers turned on]

Truly Shifting Left and Achieving Maximum Lifecycle Acceleration

Now, imagine if these building design concepts, techniques, and tools existed in the software design world. Unambiguously define a requirement? Keep full traceability across application screens, code, manual tests, automated tests, test data, interfaces (virtual or real), defects, etc.? There is no need to keep imagining. I know many companies that have already achieved true full CD pipeline acceleration by starting with the requirements and then moving on to coding, testing, and releasing.
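As a thought experiment, here is a minimal sketch in Python of what that CAD-style traceability could look like for software artifacts: a graph from a requirement to its downstream layers, with automatic impact flagging when the requirement changes. All artifact names (REQ-12, TEST-31, and so on) are invented for illustration:

```python
# A traceability graph across SDLC layers, analogous to CAD layers
# tied to a foundational blueprint.
from collections import defaultdict

# Edges point from an artifact to the artifacts derived from it.
trace = defaultdict(list)
trace["REQ-12 checkout discount"] = ["UI-04 cart screen",
                                     "CODE pricing.py",
                                     "TEST-31 discount boundaries"]
trace["TEST-31 discount boundaries"] = ["DATA-7 loyal-customer set"]
trace["CODE pricing.py"] = ["STUB-2 credit-check virtual service"]

def impacted(artifact: str) -> list:
    """Walk the graph and collect everything downstream of a change."""
    hit, stack = [], [artifact]
    while stack:
        for child in trace[stack.pop()]:
            if child not in hit:
                hit.append(child)
                stack.append(child)
    return hit

# Change the requirement; every dependent layer is flagged for review,
# just as a CAD tool prompts the owner of each impacted layer.
print(impacted("REQ-12 checkout discount"))
```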

Stay tuned and I'll share a detailed approach on how to apply these building design techniques to software design.
