Legacy Is Our Legacy

After our Halloween webcast Tales in Testing: Paranormal Service Virtualization Use Cases, I discovered the terrifying truth that I'm officially a millennial (by 13 weeks). At the same time, if I had been an information technology (IT) system delivered over four decades ago, I would most likely have been written in COBOL and found lurking on DECtape somewhere in a dark server room! It's official: I am legacy!

But what if I told you that the statement "legacy is our future" is not a contradiction and that everything within IT is just a node?

Why Should You Care About Your Legacy Today?

Modern organizations and applications are built on top of legacy systems and therefore depend on them. We'll demonstrate this by building an imaginary connected device.

Let us say our device is a simple physical switch that has two straightforward states, either a zero or a one. This is the legacy upon which the device will be built.

The switch alone is simple enough to check: it is either on or off. But add another five physical switches and you now have 2^6 = 64 possible states and 6! = 720 possible orders in which they could be activated. Scale that same arithmetic up to the hundreds of interacting components in a real system and the possibilities that could be checked quickly rival the number of atoms in the universe.
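
A quick back-of-the-envelope sketch of that growth, using only the Python standard library (the component counts are illustrative and not taken from any particular system):

    from math import factorial

    switches = 6
    states = 2 ** switches                    # 64 distinct on/off combinations
    activation_orders = factorial(switches)   # 720 possible activation sequences
    print(f"{switches} switches -> {states} states, {activation_orders} activation orders")

    # Scale the same arithmetic up to a modest number of interacting
    # components and the count of possible orderings already exceeds the
    # estimated number of atoms in the observable universe (~10**80).
    components = 60
    print(f"{components} components -> {factorial(components):.3e} possible orderings")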

This device is still simple by modern standards and is built with legacy parts. What if we now add a simple microservice to the switch and connect the system into an Internet of Things (IoT) mesh that can be controlled directly from an Amazon Echo?

Suddenly, we have gone from a simple legacy toggle switch to a machine-to-machine (M2M) device, and then to an IoT-connected device powered by artificial intelligence and enabled with deep learning algorithms.

If this is not enough to worry about, there are the added digital risks that come with exposing cognitive adaptive technology to digital predators. Just look at the recent examples of hackers exploiting IoT-connected crock pots. How do you currently fail forward or self-heal from systemic failure? How can you stay alive after a massive distributed denial-of-service attack? Think of the botnet attacks stemming from the IoT cannons covered in our recent blog. These modern threats feel better placed in an episode of Mr. Robot than in the new reality for C-level executives (such as chief digital officers and chief risk officers).

All this potential risk to your brand, just so I can ask Alexa to turn on the heating in my UK home while presenting onstage at STARWEST's "Think you can just 'test' that API? Think again" session in California last month. The cognitive adaptive technology that we are discussing already exists. And if you aren't already testing it both functionally (remember: no UIs) and, more importantly, non-functionally (security, performance, and resilience), then you need to become ready, capable, and enabled.

Just think of the number of information transformations (Turing machines: abstract entities that can transform information) and API calls (or their equivalents, programmable looms and punch cards) you need to make to bring these legacy technologies out of the dark ages and into the modern world. Now imagine testing all of those!

What Is Legacy?

Put simply, legacy is an unknown node that was once known: something of value that a human operator in the past delivered into the ecosystem, but which is not self-managing or self-regulating. Often, knowledge of these valuable components leaves with whoever created them. For example, the COBOL programming language was first introduced in the late 1950s; nearly 40 years later, in 1997, Gartner estimated that there were 200 billion lines of COBOL in existence, running 80% of all business programs.

What's the Problem With Having Legacy for Testing?

Understanding. Unless you completely understand every component of your ecosystem of ecosystems and its related behaviors, the intention, know-how, and logic behind a system remain questionable. Without sufficient knowledge of the system, testers and developers are left to question the purpose of each system component. Such uncertainty is the enemy of assurance, and we should aim to reduce epistemic uncertainty, striving to make all knowable knowledge known.

Working with completely undocumented legacy systems is one of the core challenges the CA Agile Requirements Designer development team has had to contend with. The complexity of these systems means that there are more possibilities to be tested than there are atoms in the universe. Before a change, organizations often attempt to gather knowledge from as many teams as possible and rely on individual subject matter experts to create tests. The time spent planning a change sometimes dwarfs the development time spent executing it, which is nowhere near reactive enough to develop systems that can keep up with constantly changing user needs.

How Can You Understand Your Legacy?

As the above examples suggest, you need to understand legacy components and how they relate to everything developed after them. To test systems thoroughly enough to achieve true assurance, this understanding should go right down to the level of individual nodes and how they relate.

The traditional approach to observing legacy systems that are not self-describing is reflection. Today, the standard approach is to apply heuristic techniques to problem-solving, learning, or discovery, creating cognitive maps that reflect the current understanding of how the known nodes relate. These maps can then be maintained and built upon as knowledge is discovered.

The challenge with creating cognitive maps that route from a single node is that they typically only represent known knowledge insofar as it can be expressed by the teams that contribute to the maps. It can therefore exclude the quantitative intelligence necessary to provide the bigger picture, which means the cognitive maps do not represent all the possibilities present in the individual nodes. By contrast, true assurance requires the ability to sufficiently test against these possibilities.
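
As a simple illustration of the idea, a cognitive map can be represented as a directed graph of known nodes and the relationships discovered between them. The sketch below is hypothetical (the node names and fields are invented and do not come from any particular tool); it simply shows how such a map might be recorded, extended as new knowledge is uncovered, and queried for the bigger picture.

    # A minimal, hypothetical cognitive map: nodes are system components,
    # edges are the relationships the teams currently know about.
    cognitive_map = {
        "mainframe_billing": {"depends_on": ["db2_customer"], "era": "1970s"},
        "db2_customer": {"depends_on": [], "era": "1970s"},
        "billing_api": {"depends_on": ["mainframe_billing"], "era": "2010s"},
    }

    def add_knowledge(node, depends_on=None, era="unknown"):
        """Record a newly discovered node, or extend an existing one."""
        entry = cognitive_map.setdefault(node, {"depends_on": [], "era": era})
        for dep in depends_on or []:
            if dep not in entry["depends_on"]:
                entry["depends_on"].append(dep)

    def upstream_of(node):
        """Walk the known edges to find every component a node depends on."""
        seen, stack = set(), [node]
        while stack:
            for dep in cognitive_map.get(stack.pop(), {}).get("depends_on", []):
                if dep not in seen:
                    seen.add(dep)
                    stack.append(dep)
        return seen

    # As discovery continues, the map is maintained rather than rebuilt.
    add_knowledge("iot_gateway", depends_on=["billing_api"], era="2020s")
    print(upstream_of("iot_gateway"))  # billing_api, mainframe_billing, db2_customer

The point is not the code itself but the shape of the data: every newly discovered relationship makes the picture of the legacy estate a little more complete.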

How Can You Re-Discover Legacy Enough to Test It Sufficiently?

Legacy systems can be discovered by probing nodes as part of an exploratory approach. This involves searching, investigating, and interrogating nodes to obtain their contextual interrelationships, offering an acceptable and scientific approach to discovery.

This probing aims to uncover and record all knowable information in a self-discoverable manner. The purpose is to discover the behaviors of the underlying nodes within the subsystem, as well as the key interactions that make up the nervous system across the ecosystem. The end goal is to encapsulate the whole system holistically, leveraging a systems thinking approach and lean systems engineering techniques to apply the correct mindset, heuristics, and business goals.
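
To make the probing idea concrete, here is a hypothetical sketch that sends lightweight requests to a handful of candidate endpoints and records whatever is knowable from the outside (status, content type, latency), so the results can feed a map like the one above. The URLs are invented for illustration; a real discovery pass would be driven by whatever protocols the legacy nodes actually speak.

    import json
    import time
    import urllib.request

    # Hypothetical endpoints standing in for legacy nodes we want to probe.
    CANDIDATE_NODES = [
        "http://legacy-billing.internal/health",
        "http://iot-gateway.internal/api/v1/devices",
    ]

    def probe(url, timeout=5):
        """Interrogate one node and record what is knowable from the outside."""
        started = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return {
                    "url": url,
                    "status": resp.status,
                    "content_type": resp.headers.get("Content-Type"),
                    "latency_s": round(time.monotonic() - started, 3),
                }
        except OSError as exc:  # URLError, timeouts, refused connections
            # A failed probe is still knowledge: the node exists in our map
            # even if it is unreachable or behaves unexpectedly.
            return {"url": url, "error": str(exc)}

    if __name__ == "__main__":
        discovered = [probe(url) for url in CANDIDATE_NODES]
        print(json.dumps(discovered, indent=2))

Each probe result is a small, recordable fact about a node; accumulated over time, those facts become the self-discoverable record described above.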

Once a visualization of the whole ecosystem has been achieved, the understanding of the legacy system can be maintained. This will be a constant iterative process of uncovering more knowledge (including that obtained through testing) and might encompass technologies such as machine learning, training algorithms, and intelligent analytics.

How Can I Test Legacy? 

Full life cycle virtualization enables you to test hypotheses from day zero, even for the most complex ecosystems of ecosystems. Before a single line of code has been written, testing can be achieved via the virtualization of the following (a minimal virtual service is sketched after the list):

  • Functional CX (globally unique identifiers).
  • Nonfunctional UX (link virtualization).
  • Security and penetration (endpoints virtualization).
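
As a miniature illustration of what that day-zero virtualization can look like, the sketch below stands up a tiny virtual service that returns canned JSON for an endpoint that does not exist yet, so functional and non-functional tests can be pointed at it before any real code (or any fragile legacy dependency) is available. It is a hand-rolled stub built on the Python standard library only; the path and payload are invented, and it is not a stand-in for any particular service virtualization product.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Canned behavior for an endpoint that has not been built yet.
    VIRTUAL_RESPONSES = {
        "/api/v1/switch/status": {"device_id": "demo-001", "state": "on"},
    }

    class VirtualService(BaseHTTPRequestHandler):
        """A minimal virtual service: answers known paths with canned JSON."""

        def do_GET(self):
            payload = VIRTUAL_RESPONSES.get(self.path)
            if payload is None:
                self.send_response(404)
                self.end_headers()
                return
            body = json.dumps(payload).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Tests can point at http://localhost:8080 long before the real
        # service, or the legacy system behind it, is available.
        HTTPServer(("localhost", 8080), VirtualService).serve_forever()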

With legacy, you are not testing from day zero; you are starting at the other end and trying to recover the knowledge that should have been established and maintained from the start. Testing legacy therefore means moving from existing, poorly understood operations back to the design phases (i.e., OpsDesign). With the knowledge needed to test a system available up front, best practices can be applied to provide true assurance that the legacy systems, and everything built on them, are functioning as they should.

Using the above principles of discovery, imagine the potential of opening Pandora's box and discovering the entire biosphere of legacy components.

Whether you label certain endpoints (AS/400, 1980s), links (IPX, 1980s), or nodes (DB2, 1970s) within your organization as legacy or even retro, now is the time to start your digital transformation journey and digitally remaster the exponential innovation within your organization.

The next step is to unlock your organization's digitalized DNA so that you can become enterprise digital and go beyond legacy (remember that legacy is our legacy!).
