The 13 Haunts of Testing

Ah, Halloween. A tradition we celebrate every year. We take a break from our fast-paced lives and pretend to walk and talk with monsters, zombies, and scary clowns. We get a kick out of spooking ourselves with talk of witching spells, ghouls, goblins, and old decrepit houses. We laugh at the slightly scary, and then just move on. The next day, we wake up, eat some trick-or-treat candy, and all is normal.

Halloween is fun because it's not real, which means it's not that scary.

But for software developers and testers, the nightmares of gremlins in the code that cause buggy software are real, and really scary. Recently, Business Computing World wrote an article about how software failures are costing the economy over $1 trillion. The BBC reported that nearly 50% of all businesses in the UK experienced a cyber-attack or breach.

Nightmares on Testing Street

Too often, software development and testing teams can't escape the frights of software quality as they work to create applications that inspire rather than scare. These haunts are especially real when teams have to adopt new Agile development methods to meet the pressure of delivering innovative software, fast. But modernizing testing practices has proved a huge barrier to matching the speed development teams are working at and business leaders are demanding.

Why is that? Too many testing processes have been manual, killing the fast time-to-market approach. A telling stat is that 93% of testers say automation is important, but only 20% say they have 80% or more test automation coverage. Additionally, legacy testing tools simply weren't designed for speed and sprints, and traditional testing Centers of Excellence restricted use of those tools. And DevOps teams have lacked the visibility and analytics to know whether code was passing specified quality criteria as it progressed, and whether what they were developing really did what the business requirements specified.

Wake Up From Your Quality Nightmares With Continuous Testing

On Friday, October 13th, a group of thought leaders in the QA space is holding a Continuous Testing Virtual Summit. This summit includes people from Sauce Labs and the Selenium community, CloudBees and the Jenkins community, CA Technologies, TestPlant, Veracode, Perfecto Mobile, BlazeMeter, Sprint, Lincoln Financial Group, DZone, and TechWell, all talking about how to take the scare out of testing. To get ready for the event, I've put together what I'd say are the 13 biggest hurdles to achieving agility in testing.

These pains lead to what I term the "13 Haunts of Testing":

  1. Testing with legacy tools that can't keep up with the speed of Agile – These Frankenstein-era tools just can't keep pace with a continuous application delivery model. They were built for an age when apps were released two to four times a year. Testing teams don't have months to test now, but hours. And often, the tools required a rocket scientist (or at least someone trained in the software) to actually use them, and they would only get distributed from a restrictive Center of Excellence that controlled everything.
  2. Never investigating how to accelerate and improve the intake/requirements process – It has been said that over half of application quality issues stem from unclear business requirements. Despite the evolution of some areas of testing, requirements are still specified through written natural language, which leads to ambiguity and poor testability.
  3. Manually creating test cases – This wastes time and introduces errors. Bad idea. Test cases are too often designed by hand and built on incomplete requirements. Test automation still requires a human being to manually create the automation scripts first, which then have to be manually maintained sprint after sprint.
  4. Incomplete testing coverage – Creating thousands of test cases and just hoping you have the right coverage is a huge problem. A financial organization recently showed us how they had 27,000 test cases hoping to cover all possible user navigation scenarios. Once we did some digging, those 27,000 test cases only covered 30% of possible scenarios. By modeling all the business requirements, you can automatically reduce test cases to a minimum and then maximize your coverage (see the pairwise-combination sketch after this list).
  5. Unable to access test environments – If you don't have unlimited access to test environments, 3rd-party systems, or data for testing, that's a bottleneck. This has often meant either playing ping pong while you wait, spending tons of money to replicate production environments, or just covering your eyes and letting the code go without really testing its resiliency. Bad idea. (A minimal service-virtualization stub is sketched after this list.)
  6. Doing load testing late in the software development lifecycle – You're just begging for performance problems if this is what you typically do. The same goes for stress tests (probing the system's capacity limits), soak tests (holding a continuous expected load over time), spike tests (applying a heavy load that ramps up abruptly), and isolation tests (re-running a test to confirm that a previously detected system error was actually fixed). Skip the war-room chaos and shift load testing earlier in the SDLC (a minimal load-test sketch follows this list).
  7. Waiting weeks to get data to test with – Getting access to test data is often one of the biggest adversaries to testing at speed. A common figure is that testers spend half their time waiting for data in order to run their tests. According to Subhendu Pattnaik, "the new age software development and testing methodologies need faster, and more recurring release cycles to tackle various new challenges posed by the digital world. In this context, it becomes inevitable for organizations to create a test environment with the necessary test data and testing standards." Test data can be considered any kind of information fed into the application being tested – names, addresses, birth dates, account numbers, and so on. The right data includes "happy path" data, out-of-range data, and negative data. The more representative the data, the more thorough your tests will be. Finding appropriate test data shouldn't be like trick-or-treating from door to door to get what you want.
  8. Testing with Personally Identifiable Information (PII data) – When looking for test data, just grabbing data from production should be the last thing you do. This is just a huge no-no. With the GDPR (General Data Protection Regulation) coming into full force in May 2018, you should stop the practice now of taking production data and using it in your testing. The last scare you need is ending up on the front page of some paper because someone hacked your development system and stole PII data. Data sub-setting, masking, virtualization, and synthetic generation should all be part of your plan (a masking and synthetic-data sketch follows this list).
  9. Waiting to run tests until the user interface is complete – The UI is critical to test; it's where you interact with your end users, in a sense, face to face. But skipping the component and integration tests in the backend is scary. The UI is just the tip of the iceberg. Request/response-pair testing must begin early so that costly mistakes can be fixed before the UI is wrapped around them.
  10. Coding without running security testing early in the app lifecycle – Just look at Equifax. Security must be at the forefront, and something that is built in from the beginning. To banish the banshees of security gaps, you'll want tools that not only identify risks in near real time while the coding is happening, but also identify good coding behavior that helps improve the capabilities of the team.
  11. Manual API testing – We've established that API testing is critical. Today's apps are made up of hundreds of API calls. With the number of tests needed to thoroughly validate an API, and the explosion of APIs, the manual approach is time-consuming and labor-intensive. Most API testing tools only provide a way to execute a single test that you created manually. What is wicked cool is auto-generating thousands of API testing scenarios and then executing all of them with the click of a button (a generation sketch follows this list).
  12. Testing after development sprints are complete – Too often, Agile is treated as an enemy of testing. But it shouldn't be. Testing should instead begin early, right as requirements gathering begins. Alex Martins explained, "Shift left (testing) is about bringing thinking earlier into the development lifecycle. In other words, it is about thinking upfront, and involves a process of testers working earlier to challenge the sources of knowledge that inform both design and development. In this way, the likelihood that the design will faithfully reflect the user's desired functionality is maximized."
  13. Relying on manual testing hand-offs and being blind to your testing status at release – If you are treating testing as a specialized, isolated workflow operating with traditional, manual testing procedures and technologies, you might as well be digging a grave for your application. To make your application succeed, you must eliminate the silos, essentially creating a seamless team focused on quality that includes developers, QA, release managers, operations, and more. Analyzing your end-to-end testing strategy, and orchestrating the application delivery pipeline to promote releases from one environment to the next as soon as all the tests pass, is what can set you up for success (a simple quality-gate sketch closes out the examples below).
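
To make haunt #4 concrete, here is a minimal sketch of pairwise (all-pairs) test selection, one common way to shrink a combinatorial explosion of cases while still covering every pair of parameter values. The parameters and values are hypothetical, and this greedy approach is only an illustration, not the model-based engine a commercial tool would use.

```python
# A greedy all-pairs sketch: pick full combinations until every cross-parameter
# value pair has appeared in at least one selected test.
import itertools

# Hypothetical test parameters for an application under test.
PARAMS = {
    "browser": ["Chrome", "Firefox", "Safari"],
    "account_type": ["free", "premium"],
    "locale": ["en", "de", "fr"],
    "device": ["desktop", "mobile"],
}

def pairs_of(combo, names):
    """All (parameter, value) pairs that co-occur in one full combination."""
    assignment = list(zip(names, combo))
    return set(itertools.combinations(assignment, 2))

def all_pairs(params):
    names = list(params)
    every_combo = list(itertools.product(*params.values()))
    uncovered = set().union(*(pairs_of(c, names) for c in every_combo))
    tests = []
    while uncovered:
        # Greedily take the combination that knocks out the most uncovered pairs.
        best = max(every_combo, key=lambda c: len(pairs_of(c, names) & uncovered))
        tests.append(dict(zip(names, best)))
        uncovered -= pairs_of(best, names)
    return tests

if __name__ == "__main__":
    selected = all_pairs(PARAMS)
    total = len(list(itertools.product(*PARAMS.values())))
    print(f"exhaustive combinations: {total}, pairwise-selected tests: {len(selected)}")
    for test in selected:
        print(test)
```

Even on this toy example, 36 exhaustive combinations collapse to roughly a dozen tests while still exercising every pair of values together.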
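
For haunt #5, the sketch below shows the idea behind service virtualization in its simplest form: a local stub that answers in place of an unavailable third-party system, so tests are not blocked waiting on a shared environment. The route, port, and canned payload are made up for illustration.

```python
# A minimal service-virtualization stub standing in for an unreachable
# third-party dependency during testing.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    "/partners/credit-score": {"score": 742, "status": "OK"},  # hypothetical partner API
}

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        self.send_response(200 if body else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(body or {"error": "unknown stub route"}).encode())

    def log_message(self, *args):
        pass  # keep test output quiet

if __name__ == "__main__":
    # Point the application under test at http://localhost:8081 instead of the real partner.
    HTTPServer(("localhost", 8081), StubHandler).serve_forever()
```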
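
For haunt #6, here is a minimal load/spike-test sketch using only the Python standard library. The target URL and load figures are placeholders; a real project would more likely reach for a purpose-built tool such as JMeter or BlazeMeter, but the sketch shows how little it takes to start measuring latency under concurrent load early in the lifecycle.

```python
# A minimal spike/load sketch: fire concurrent requests and report latency.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "https://staging.example.com/api/health"  # hypothetical endpoint
CONCURRENT_USERS = 25                                  # size of the "spike"
REQUESTS_PER_USER = 20

def one_user(_):
    """Simulate one user hitting the endpoint repeatedly and record each latency."""
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                resp.read()
            latencies.append(time.perf_counter() - start)
        except Exception:
            latencies.append(float("inf"))  # count failures as "infinite" latency
    return latencies

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = [lat for user in pool.map(one_user, range(CONCURRENT_USERS)) for lat in user]

    ok = sorted(l for l in results if l != float("inf"))
    print(f"requests: {len(results)}, failures: {len(results) - len(ok)}")
    if ok:
        p95 = ok[max(0, int(len(ok) * 0.95) - 1)]
        print(f"median latency: {statistics.median(ok):.3f}s, p95: {p95:.3f}s")
```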
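
Haunts #7 and #8 both come down to getting safe, representative data quickly. The sketch below contrasts two approaches under an assumed, simplified customer record: deterministically masking fields copied from production, and generating fully synthetic records that never touch production at all. The field names and formats are hypothetical.

```python
# A minimal sketch of masking PII versus generating synthetic test data.
import hashlib
import random
import string

def mask_value(value: str, keep: int = 2) -> str:
    """Deterministically mask a sensitive field, keeping a short prefix for readability."""
    digest = hashlib.sha256(value.encode()).hexdigest()[:8]
    return f"{value[:keep]}***{digest}"

def mask_customer(record: dict) -> dict:
    """Mask PII fields copied from production before they reach a test environment."""
    masked = dict(record)
    for field in ("name", "email", "account_number"):
        masked[field] = mask_value(str(record[field]))
    return masked

def synthetic_customer(i: int) -> dict:
    """Generate a fully synthetic record -- no production data involved at all."""
    return {
        "name": f"Test User {i}",
        "email": f"user{i}@example.test",
        "account_number": "".join(random.choices(string.digits, k=10)),
        "birth_date": f"19{random.randint(60, 99)}-0{random.randint(1, 9)}-1{random.randint(0, 9)}",
    }

if __name__ == "__main__":
    prod_record = {"name": "Jane Doe", "email": "jane@corp.com", "account_number": "1234567890"}
    print(mask_customer(prod_record))
    print([synthetic_customer(i) for i in range(3)])
```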
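
For haunt #11, this sketch shows the spirit of auto-generating API test scenarios: combine pools of happy-path, boundary, and negative values for each parameter, derive the expected outcome from the business rule, and execute everything in one run. The endpoint, payload shape, and validation rule are all assumptions for illustration.

```python
# Auto-generate API test cases by combining parameter value pools.
import itertools
import json
import urllib.request
import urllib.error

BASE_URL = "https://staging.example.com/api/accounts"  # hypothetical API

# Value pools: happy-path, boundary, and negative values for each parameter.
AMOUNTS    = [1, 9999, 0, -50, "abc"]
CURRENCIES = ["USD", "EUR", "", "???"]
REGIONS    = ["us-east", "eu-west", "narnia"]

def expected_status(amount, currency, region) -> int:
    """Encode the assumed business rule: only well-formed requests should return 200."""
    valid = isinstance(amount, int) and amount > 0 and currency in ("USD", "EUR") and region != "narnia"
    return 200 if valid else 400

def run_case(amount, currency, region) -> bool:
    payload = json.dumps({"amount": amount, "currency": currency, "region": region}).encode()
    req = urllib.request.Request(BASE_URL, data=payload, headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            status = resp.status
    except urllib.error.HTTPError as err:
        status = err.code
    return status == expected_status(amount, currency, region)

if __name__ == "__main__":
    cases = list(itertools.product(AMOUNTS, CURRENCIES, REGIONS))  # 5 * 4 * 3 = 60 cases
    failures = [c for c in cases if not run_case(*c)]
    print(f"generated {len(cases)} cases, {len(failures)} failed")
```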
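
Finally, for haunt #13, a simple quality-gate script gives a pipeline something objective to promote against instead of a manual hand-off. The results-file format and thresholds below are hypothetical; a real pipeline would typically read JUnit XML, coverage reports, or a test-management API, but the exit-code contract with the CI server is the important part.

```python
# A quality-gate sketch a CI job (Jenkins, etc.) could run between pipeline stages.
import json
import sys

THRESHOLDS = {
    "pass_rate": 0.98,      # at least 98% of functional tests must pass
    "coverage": 0.80,       # at least 80% coverage of modeled requirements
    "open_blockers": 0,     # no open blocker defects
}

def gate(results: dict) -> list:
    """Return human-readable gate violations; an empty list means promote the build."""
    violations = []
    pass_rate = results["passed"] / max(results["total"], 1)
    if pass_rate < THRESHOLDS["pass_rate"]:
        violations.append(f"pass rate {pass_rate:.1%} below {THRESHOLDS['pass_rate']:.0%}")
    if results["coverage"] < THRESHOLDS["coverage"]:
        violations.append(f"coverage {results['coverage']:.0%} below {THRESHOLDS['coverage']:.0%}")
    if results["open_blockers"] > THRESHOLDS["open_blockers"]:
        violations.append(f"{results['open_blockers']} blocker defect(s) still open")
    return violations

if __name__ == "__main__":
    with open(sys.argv[1]) as fh:          # e.g. results.json produced by the test stage
        summary = json.load(fh)
    problems = gate(summary)
    for p in problems:
        print(f"GATE FAILED: {p}")
    sys.exit(1 if problems else 0)         # non-zero exit blocks promotion to the next stage
```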

The truth is, quality must be built into the application from the very beginning, not tested for after the fact. As stated by Business Computing World, "This means quality must be the key consideration right from the moment of conception." That is the key to achieving the desired acceleration and quality as you bring those feature-rich applications to market.

Again, join the thought leaders in continuous testing at the Virtual Summit on October 13th to see how you can escape these haunts of testing.
