Automate data comparisons for rigorous test automation
Automatically test changes in back-end systems, ensuring that shallow test assertions do not mask critical and potentially costly bugs. Running automated data comparisons during test automation gives testers confidence that tests have achieved the right results for the right reasons, verifying expected results in databases and other back-end systems.
Shallow test assertions leave systems exposed to costly bugs
Too often, testing today cannot ensure that it gets the right results for the right reasons. Shallow test assertions might verify that a message has been displayed in a UI, but this does not robustly test expected results in back-end and integrated systems. This leaves systems exposed to costly bugs, such as databases that are not updated correctly, or line-of-business interactions that fail to trigger critical business processes. Automated testing today requires robust assertions that cover the full range of functionality under test. Otherwise, bugs in test assertions risk hiding bugs in the system, as inaccurate tests “pass” when they shouldn’t.
Find bugs in back-end systems earlier and at less cost to fix
Integrating automated database comparisons into UI and API testing rapidly boosts testing rigour, testing complex systems across multiple tiers. The automated snapshot comparisons at the data level give testers assurance that their tests have passed for the right reasons, verifying that tests have made the expected changes in back-end systems. With Curiosity’s database comparison utility, QA are alerted early when the database has not been updated correctly as tests run, exposing a potentially critical and costly bug. Testers and developers can embed the snapshot database comparisons seamlessly in open source and commercial test automation frameworks, finding defects in back-end systems while they are quick and affordable to fix.
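The before-and-after snapshot pattern can be sketched in a few lines. The example below is a minimal, hypothetical illustration (it is not Curiosity’s utility), using an in-memory SQLite database and a made-up `orders` table: the test captures the table before and after the action under test, then asserts directly against the back-end data rather than the UI response.

```python
import sqlite3

def snapshot(conn, table, key):
    """Capture every row of a table, keyed by its primary-key column."""
    cols = [c[1] for c in conn.execute(f"PRAGMA table_info({table})")]
    rows = conn.execute(f"SELECT * FROM {table}").fetchall()
    return {row[cols.index(key)]: dict(zip(cols, row)) for row in rows}

# In-memory database standing in for the back-end system under test.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 'PENDING')")

before = snapshot(conn, "orders", "id")

# The change a UI or API test would trigger (hypothetical business logic).
conn.execute("UPDATE orders SET status = 'SHIPPED' WHERE id = 1")

after = snapshot(conn, "orders", "id")

# The assertion verifies the expected change in the database itself,
# not merely a message displayed in the UI.
assert before[1]["status"] == "PENDING"
assert after[1]["status"] == "SHIPPED"
```

The same snapshot calls can wrap any test step in a framework such as pytest or JUnit, so a test only passes when the expected database change actually occurred.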
The rapid snapshot comparisons compare data before and after tests run, providing both a summary and a granular report of database changes. Testers can understand exactly which data has been added, deleted or modified by UI or API tests, verifying that critical back-end processes have run correctly. Reporting for the automated data comparisons is easy to set up in Curiosity’s web portal, while integration with model-based test generation embeds comparisons automatically in auto-generated tests. This integrated approach removes test case creation and scripting bottlenecks, while creating coverage-optimised tests with robust assertions. It enables rigorous in-sprint testing, finding potentially costly bugs before they slip into production.
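To make the added/deleted/modified classification concrete, here is a simple sketch of how two snapshots (dictionaries of rows keyed by primary key, as an assumption) can be diffed into that kind of report. The function and sample data are illustrative only, not the product’s actual output format.

```python
def diff_snapshots(before, after):
    """Classify rows as added, deleted, or modified between two snapshots."""
    added    = {k: after[k] for k in after.keys() - before.keys()}
    deleted  = {k: before[k] for k in before.keys() - after.keys()}
    modified = {k: (before[k], after[k])
                for k in before.keys() & after.keys()
                if before[k] != after[k]}
    return {"added": added, "deleted": deleted, "modified": modified}

# Hypothetical before/after snapshots of an orders table.
before = {1: {"id": 1, "status": "PENDING"}, 2: {"id": 2, "status": "PAID"}}
after  = {1: {"id": 1, "status": "SHIPPED"}, 3: {"id": 3, "status": "NEW"}}

report = diff_snapshots(before, after)
summary = {kind: len(rows) for kind, rows in report.items()}
# summary == {"added": 1, "deleted": 1, "modified": 1}
```

The `report` dictionary carries the granular row-level detail, while `summary` gives the high-level counts a tester would scan first.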
Robust assertions for in-sprint testing
Watch this demo of automated MySQL data comparisons during test automation for an eCommerce system, to discover how:
Curiosity’s database comparison engine integrates seamlessly with open source and commercial test automation frameworks, snapshotting and comparing back-end systems before and after testing.
The automated data comparisons compare different instances of data as tests run, rapidly ensuring that expected results have been achieved in critical back-end systems.
The high-performance comparisons provide a summary and granular report of data that has been added, modified, or deleted by test activity, enabling thorough assertions during automated testing.
Test generation from Test Modeller embeds the data comparisons in auto-generated test scripts, rapidly creating the tests needed to find bugs while they are quick and affordable to fix.
A data comparison report and step-by-step web reporting combine with run results overlaid in visual system models, enabling the rapid identification and root cause analysis of potentially costly bugs.