The Complexities of Work Flow Testing Simplified with GenRocket by Hycel Taylor on Dec 11, 2017


Arguably, one of the most challenging tasks for a tester is testing complex workflows. Complex workflow testing involves exercising multiple business units that interact with each other, in a given sequence, over some period of time, while validating that the steps between interacting business units produce the correct results.

With all of the moving parts within a complex workflow (the steps that must be defined, the business methods that must be tested, and the test data that must be generated), the task is already a major challenge. But even more challenging is generating not just any kind of test data, but data that can be dynamically mapped and shared among business units from one workflow step to the next, and the next one after that, and so on, until the entire workflow is successfully tested.

For many testers, this can seem an almost impossible task, or at least one that takes hundreds of hours to implement and many more to maintain. In practice, testing complex workflows is far less daunting when you use the full capabilities of the GenRocket Test Data Generation (TDG) platform in conjunction with the Test Data Mapping Design Pattern. With the GenRocket TDG platform, you can easily:
  • Model, maintain and generate synthetic test data for multiple business units
  • Manage organization variables that allow business units to share common values during dynamic test data generation
  • Script complex workflows using the GenRocket API
  • Dynamically manage related data between workflows using the Test Data Mapping Design Pattern
Before we talk about the benefits of the GenRocket platform to successfully test complex workflows, we first need to define the Test Data Mapping Design Pattern.

Test Data Mapping Design Pattern

The design pattern maps a synthetically generated value, used as a key, to some other value. The mapping is held either in memory or in a temporary database table with the following three columns:
  • namespace – defines a unique identifier for a set of key/value pairs
  • key – defines a key to reference a value within the namespace
  • value – defines a value to be referenced by a key within the namespace
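A minimal in-memory sketch of this pattern in Python may help make it concrete; the `TestDataMap` class and its method names are illustrative, not part of GenRocket:

```python
class TestDataMap:
    """In-memory store of (namespace, key) -> value mappings."""

    def __init__(self):
        self._entries = {}

    def put(self, namespace, key, value):
        # Record that a synthetic key corresponds to an actual value.
        self._entries[(namespace, key)] = value

    def get(self, namespace, key):
        # Resolve the actual value previously recorded for a synthetic key.
        return self._entries[(namespace, key)]


# Map synthetic user id 1 to the database-assigned primary id 10001.
tdm = TestDataMap()
tdm.put("user", 1, 10001)
print(tdm.get("user", 1))  # -> 10001
```

The same three-part shape (namespace, key, value) works unchanged whether the map lives in memory, as here, or in a temporary database table.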

Database Insertion Mapping Example

Suppose we are inserting new rows into a database table called user. The user table is currently populated with 10000 rows of data and has a key column, id, that is auto-generated by the database; the last auto-generated number was 10000.

When the first row of synthetic user data is inserted into the database, the synthetically generated user id starts at 1. However, the number that is auto-generated as the primary key for the id is 10001. Thus, the mapping of the synthetically generated id to the auto-generated id is stored in a Test Data Map as [user, 1, 10001] where:
  • namespace = user
  • key = 1
  • value = 10001
Now that the synthetically generated id is mapped to the auto-generated id, anything that synthetically generates a key of 1 can retrieve the actual row in the user table whose primary id is 10001.
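The scenario above can be reproduced end to end with an in-memory SQLite database; the `test_data_map` table and its `map_key`/`map_value` column names are illustrative choices, not dictated by any product:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute(
    "CREATE TABLE test_data_map (namespace TEXT, map_key INTEGER, map_value INTEGER)"
)

# Simulate a user table already populated up to id 10000.
conn.execute("INSERT INTO user (id, name) VALUES (10000, 'existing user')")

# Insert the first synthetic row; the database assigns the next id.
synthetic_id = 1
cur = conn.execute("INSERT INTO user (name) VALUES ('synthetic user')")
auto_id = cur.lastrowid  # database-assigned primary key: 10001

# Store the mapping [user, 1, 10001].
conn.execute(
    "INSERT INTO test_data_map VALUES ('user', ?, ?)", (synthetic_id, auto_id)
)

# Later, anything that generates key 1 can resolve the actual row.
row = conn.execute(
    "SELECT map_value FROM test_data_map WHERE namespace = 'user' AND map_key = ?",
    (synthetic_id,),
).fetchone()
print(row[0])  # -> 10001
```

SQLite assigns the next rowid after the current maximum for an `INTEGER PRIMARY KEY` column, which mirrors the auto-generated id behavior described above.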

But how does this translate to using synthetically generated test data with the Test Data Mapping Design Pattern to automate the testing of complex workflows involving multiple business units?

GenRocket Platform to the Rescue

This is where GenRocket Projects and GenRocket Organization Variables can be used to model test data that can represent multiple business units that share a common set of variables.

Within a GenRocket organization, each business unit’s data can and should be modeled within its own project; this allows the Domains of a business unit to be modified and versioned as it changes over time.

The common set of generated values that are shared between business units, for relational integrity, may be defined within a set of Organization Variables and referenced by any Project Domain variable within the organization. For example:
  • Organization Variable A may be referenced as a starting generation number for the customer Id in some Domain Attribute Generator parameter within Project A and mapped to a primary Id returned by some database function.
  • Within Project B, a Domain Attribute Generator parameter may query for a customer Id, using the Organization Variable A, in order to retrieve the actual primary Id.
  • In Project C, a Domain Attribute Generator parameter may use Organization Variable A to generate a history record that can map back to a given customer but does not contain the customer’s actual primary Id.
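The three cases above can be sketched in plain Python. Everything here is hypothetical (the `org_variables` dictionary, the `project_*` functions, and the id values); the sketch only illustrates how a shared Organization Variable lets separate projects reference the same customer:

```python
# Hypothetical organization-level variable shared by three "projects";
# not GenRocket's actual API, just an illustration of the idea.
org_variables = {"A": 1}  # starting generation number for customer ids

# Test Data Map: (namespace, key) -> value, as in the mapping pattern.
test_data_map = {}


def project_a_insert_customer(primary_id):
    """Project A: generate a customer keyed by Organization Variable A
    and map the synthetic id to the primary id the database returned."""
    synthetic_id = org_variables["A"]
    test_data_map[("customer", synthetic_id)] = primary_id
    return synthetic_id


def project_b_lookup_customer():
    """Project B: use Organization Variable A to retrieve the actual
    primary id for the shared customer."""
    return test_data_map[("customer", org_variables["A"])]


def project_c_history_record():
    """Project C: build a history record that references the customer
    by synthetic id only, never by the actual primary id."""
    return {"customer_ref": org_variables["A"], "event": "created"}


primary_id = 50001  # pretend value returned by some database function
project_a_insert_customer(primary_id)
print(project_b_lookup_customer())  # -> 50001
print(project_c_history_record())
```

Because all three projects read the same shared variable, they stay relationally consistent without any project ever hard-coding another project's keys.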

Organization Variables

Within GenRocket, the possibilities for modeling and managing shared data between multiple business units are flexible, easily modifiable and virtually limitless.

By using the GenRocket API with your favorite scripting language, in conjunction with the Test Data Mapping Design Pattern, GenRocket Scenarios can be loaded, modified and executed in real time to successfully test very complex workflows.

The sequence diagram below describes how a user may use the GenRocket API, with a scripting language, to load and execute a set of GenRocket Scenarios, execute the business units that use the generated data, and then map the resulting data to a mapping table for use with the next set of Scenarios.
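That flow can be sketched as a simple driver loop. Here `run_scenario` and `execute_business_unit` are placeholders standing in for calls to the GenRocket API and to the system under test; they are not real API names, and the step names are invented:

```python
# Test Data Map carried from one workflow step to the next.
test_data_map = {}  # (namespace, key) -> value


def run_scenario(step, mappings):
    """Placeholder for loading and executing a GenRocket Scenario,
    feeding it previously mapped values so generated data stays related."""
    return {"synthetic_key": 1}


def execute_business_unit(step, data):
    """Placeholder for invoking the business unit under test; returns
    the actual keys it produced (e.g. database-assigned ids)."""
    return {"actual_key": 10000 + data["synthetic_key"]}


# Invented workflow steps, executed in sequence.
workflow = ["create_user", "open_account", "post_transaction"]

for step in workflow:
    data = run_scenario(step, test_data_map)
    result = execute_business_unit(step, data)
    # Map the synthetic key to the actual key for use by later steps.
    test_data_map[(step, data["synthetic_key"])] = result["actual_key"]

print(test_data_map)
```

Each iteration generates data, runs a business unit against it, and records the resulting mapping, so every later step can resolve the real keys produced earlier in the workflow.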



In summary, complex workflows involving multiple business units are difficult to test because, by their nature, they have many moving parts; but that does not mean they can’t be tested.

Using GenRocket, multiple business units can be modeled and versioned within their own Projects and have their data generated entirely synthetically. Common generated values that allow multiple business units to share relational data can be defined within Organization Variable Sets. And by scripting testing workflows using the GenRocket API, in conjunction with the Test Data Mapping Design Pattern, complex workflows can be fully tested and fully automated while having the test data dynamically generated in real time.

Also published on Medium.