The Curiosity Blog

Use SDLC Traceability to Fix Test Automation

Written by Mantas Dvareckas | 26 July 2022

The increased demand for new software, faster delivery and better quality calls for greater automation across every stage of the software delivery lifecycle (SDLC). In fact, 46% of survey respondents believe that increasing the level of automation is the most important factor in making testing and development more efficient [1].

However, even with the introduction of test automation, overall automation rates across the SDLC remain low. End-to-end automation is often held back by persistent silos, conflicting data insights, poor requirements, and an enduring reliance on repetitive manual work.

These challenges reflect the fragmented nature of modern delivery pipelines and a lack of general traceability.

A lack of traceability creates a wide range of issues, from inadequate user stories and test coverage, to a lack of system understanding and repetitive manual processes. This in turn leads to teams being reactive rather than proactive when tackling quality issues, scrambling to fix bugs and maintain tests. The fragmentation of tools and teams further adds to misunderstanding and miscommunication across the whole SDLC, impeding development.

By contrast, the introduction of traceability can help teams make the right choice for the right reasons, by linking all aspects of the SDLC and eliminating fragmentation. Organisations looking to overcome software delivery problems must therefore consider implementing SDLC traceability.

What is SDLC Traceability, and where would I use it?

The goal of traceability is to keep track of and record the history of an item or component, and its impact on interrelated assets. Throughout the SDLC, there are a range of linked components, assets and data points. These include requirements, code, test data, environments and test cases. If you can more formally link these changing components, you can achieve a healthier software delivery ecosystem and deliver better quality at speed.
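
To make this concrete, here is a minimal sketch (in Python, with entirely made-up asset names) of what formally linking SDLC components can look like: each asset simply records which other assets it relates to, so those relationships can be queried rather than held in someone's head.

    from dataclasses import dataclass, field

    @dataclass
    class Asset:
        """One SDLC asset: a requirement, test case, code module, data set or environment."""
        asset_id: str
        kind: str
        linked_to: list[str] = field(default_factory=list)  # ids of related assets

    # Hypothetical example: a user story and the assets formally linked to it
    assets = {
        "US-101": Asset("US-101", "requirement", ["checkout.py", "TC-17", "orders.csv"]),
        "checkout.py": Asset("checkout.py", "code", ["US-101"]),
        "TC-17": Asset("TC-17", "test", ["US-101", "orders.csv"]),
        "orders.csv": Asset("orders.csv", "test data", ["TC-17"]),
    }

    # "Which tests are linked to user story US-101?"
    print([a.asset_id for a in assets.values()
           if a.kind == "test" and "US-101" in a.linked_to])  # ['TC-17']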

However, modern delivery pipelines are often fragmented, as organisations use a variety of tools across different teams and silos, with only poor links between them. This fragmentation creates a lack of understanding between stages and teams in the SDLC, as well as an inability to automatically respond to changes made across tools and teams.

Implementing better traceability gives organisations the ability to track user stories, objects, data, scripts, test runs, and everything else across the whole SDLC. Traceability creates transparency and accessibility for every step of development, and is key to overcoming bottlenecks through automation. This in turn supports numerous goals for enterprise software delivery:

  • Risk mitigation
  • Faster releases
  • Operational efficiency
  • Proving (not just meeting) data compliance

Want to learn more about using traceability to optimise software delivery? Watch our free on-demand webinar on enhancing SDLC traceability through test automation!

Symptoms of Low Traceability in Testing

Test automation is critical for continuous integration, delivery and deployment. Automated testing allows teams to automate repetitive-but-necessary testing tasks, along with tasks like load testing, which would be practically impossible to perform manually.

However, the truth is that test automation is not a magic, plug-and-play solution. It typically requires financial investment, skilled developers, and time.

The return on this test automation investment is often reduced by a reliance on slow and repetitive scripting, as well as a range of manual processes surrounding automated test execution. In fact, a third of DevOps teams' time is spent on manual CI/CD tasks like detecting code quality issues [2].

This dependency on overly manual processes reflects a lack of traceability. Poor tracing across the SDLC means that tests must be created or updated by hand to test changes in code, user stories, and beyond.

Alongside limited automation adoption, poor traceability has a range of negative consequences for testing:

  1. Finding bugs late - mounting remediation costs.
  2. An over-reliance on manual test creation.
  3. Manually checking and fixing brittle tests.
  4. Bottlenecks created when finding, waiting for, or creating test data.
  5. Mismatched data breaking automated tests.
  6. Mounting technical debt - always playing catch-up.
  7. Misunderstanding what needs testing.
  8. Not knowing with confidence when testing is “done”.

Without better traceability, these challenges can be near impossible to overcome for organisations that are already struggling with growing application complexity and rapid system change.

Organisations must consider a different approach to test automation, one that has traceability built into its foundation. Model-based test generation enables one such approach, auto-generating targeted tests based on changes to formally linked assets.

Building Traceability Through Models

Model-based test generation offers an approach that can generate targeted tests on demand, linking test generation to changes in user stories, code, and beyond. Model-based testing thereby introduces the flexibility needed to update test assets in-sprint, as well as to generate rigorous automated tests continuously.
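
As a rough, hand-rolled illustration of the idea (not Test Modeller's generation engine, and using an invented checkout journey), a model can be treated as a directed graph of steps, with every path through the graph becoming a generated test case:

    # A toy model of a user journey: each step maps to the steps it can lead to.
    model = {
        "Start": ["Log in"],
        "Log in": ["Search product", "Invalid password"],
        "Invalid password": ["End"],
        "Search product": ["Add to basket"],
        "Add to basket": ["Pay", "Remove item"],
        "Remove item": ["End"],
        "Pay": ["End"],
    }

    def generate_tests(model, node="Start", path=None):
        """Enumerate every path through the model; each path is one test case."""
        path = (path or []) + [node]
        if node == "End":
            return [path]
        tests = []
        for next_step in model.get(node, []):
            tests.extend(generate_tests(model, next_step, path))
        return tests

    for i, steps in enumerate(generate_tests(model), start=1):
        print(f"Test {i}: " + " -> ".join(steps))

When a step in the model changes, the paths (and therefore the tests) can simply be regenerated, which is where the in-sprint flexibility comes from.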

With Curiosity’s Test Modeller, requirements, tests, code check-ins, and models can further be tracked within a Traceability Lab.

As a change occurs in one file or tool, Test Modeller’s Traceability Lab will run impact analysis to flag risks across user stories, tests, code, and data. Generating tests, user stories and data further allows rapid responses to these quality risks, testing and developing at the pace of change.
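
The mechanics behind that kind of impact analysis can be pictured with a small sketch (hypothetical asset names, not the Traceability Lab's actual logic): starting from the changed asset, follow the recorded links and flag everything downstream for review.

    from collections import defaultdict

    # Hypothetical traceability links: each asset maps to the assets that depend on it.
    dependants = defaultdict(list, {
        "US-101 (user story)": ["checkout.py (code)", "TC-17 (test)"],
        "checkout.py (code)": ["TC-17 (test)", "TC-18 (test)"],
        "TC-17 (test)": ["orders.csv (test data)"],
    })

    def impacted_by(changed_asset):
        """Walk the traceability links and collect every downstream asset."""
        to_visit, flagged = [changed_asset], set()
        while to_visit:
            current = to_visit.pop()
            for dependant in dependants[current]:
                if dependant not in flagged:
                    flagged.add(dependant)
                    to_visit.append(dependant)
        return sorted(flagged)

    print(impacted_by("US-101 (user story)"))
    # ['TC-17 (test)', 'TC-18 (test)', 'checkout.py (code)', 'orders.csv (test data)']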

Models built previously in Test Modeller also become reusable assets that can be used as subflows, enabling users to quickly build end-to-end flowcharts with full traceability between components.

The Traceability Lab: Linking Tests, Code and Data to Requirements

Test Modeller’s Traceability Lab provides a real-time checklist of potential quality improvements. The checklist acts as an early warning system, allowing cross-functional teams to respond to emerging risks.

For example, the Traceability Lab analyses changing user stories, identifying ambiguity, incompleteness, and other potential risks. Using Test Modeller, risks identified in user stories can then be mitigated by generating Jira user stories and subtasks, supporting accurate and unsiloed requirements communication across projects and teams.
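
For illustration only, this is roughly what creating such a Jira story looks like at the REST API level. It uses plain Jira REST calls rather than Test Modeller's built-in integration, and the site URL, credentials and project key are placeholders.

    import requests

    # Placeholder values: swap in your own Jira site, API token and project key.
    JIRA_URL = "https://your-domain.atlassian.net"
    AUTH = ("you@example.com", "your-api-token")

    def create_story(summary, description):
        """Create a Jira Story via the standard REST API (v2 accepts plain-text descriptions)."""
        payload = {
            "fields": {
                "project": {"key": "PROJ"},
                "issuetype": {"name": "Story"},
                "summary": summary,
                "description": description,
            }
        }
        response = requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=AUTH)
        response.raise_for_status()
        return response.json()["key"]

    # e.g. raised for a user story that the analysis flagged as ambiguous
    print(create_story("Clarify checkout timeout rules",
                       "The original story does not state what happens after 30 seconds of inactivity."))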

The same models can further auto-generate and maintain optimized test scripts and data, reducing both testing bottlenecks and design bugs.

All of this combines to create a solution where cross-functional teams can work using their favourite tools and formats, while also collaborating in parallel from tracked resources, visual models, and matching user stories.

Watch this demo to see Test Modeller’s Traceability Lab in action.

Continuous Quality, from Design to Release

With Test Modeller’s model-based tools and Traceability Lab, organisations can fix test automation while fostering a culture of quality across the whole SDLC.

If you don’t think greater traceability is required at your organisation, ask your teams these simple questions:

  • Do you understand all of your data?
  • Do you have all the data to develop and test the requirement?
  • Does the test have the data needed to run?
  • Do you know which data is being used for each test?
  • Do you know all the code affected by a requirement change?
  • If you change your code, do you know which requirements need to be updated?
  • Do your tests actually test your requirements?
  • Are your tests linked to your requirements? (See the sketch after this list.)
  • Do you have all the data to develop with?
  • If the data characteristics change, do you know which code to change?
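
If the answer to several of these questions is "no", even a lightweight convention is a start. The sketch below tags pytest tests with hypothetical requirement IDs via a custom marker (which would need registering in pytest.ini to avoid warnings), giving a traceability tool something concrete to link against.

    import pytest

    # Tag each test with the requirement(s) it verifies, so a report
    # (or `pytest -m requirement`) can show which user stories have test coverage.
    @pytest.mark.requirement("US-101")
    def test_checkout_applies_discount():
        basket_total, discount = 100.0, 0.1
        assert round(basket_total * (1 - discount), 2) == 90.0

    @pytest.mark.requirement("US-102")
    def test_checkout_rejects_empty_basket():
        items = []
        assert len(items) == 0  # placeholder assertion for the example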

Speak with a Curiosity expert to get started!

Footnotes:

[1] Capgemini, Sogeti (2021), World Quality Report 2021-22. Retrieved from https://www.capgemini.com/gb-en/research/world-quality-report-wqr-2021-22/  

[2] Dynatrace (2021), Global DevOps Report. Retrieved from https://www.dynatrace.com/monitoring/solutions/devops-report/