Testing Integrations: Finding the Lines

~ by Randy Wagner

Testing Integrations: A Two-Pronged Attack

Integrations within the Guidewire sphere are often a double-edged sword for QA. The tool or system that Guidewire (GW) is being integrated with is typically a functioning production system that the GW center is simply tapping into. It has probably been tested thoroughly, but it is now part of a GW configuration that must be validated.

Dual Validation Approach

Validating the configuration requires a dual approach, and the trick is to find two distinct lines for your teams, as described below.

The Integration Team

The integration team typically does all the configuration between the customer and the vendor. They build and test the backend setup for communication, whether real-time or batch, to ensure that the data flows with the appropriate responses.

The X-Center Team

The code configuration within the UI is the x-center team’s responsibility. If there are new fields or field modifications, the x-center team has to make those changes.

These two teams need to work together to ensure the smooth flow of data. This brings us to two different lines.

Line 1 – Integration Team QA

The QA for the integration team has to ensure that the outbound and inbound messages fire correctly, that the data transmits accurately, and that the results populate appropriately.

This individual is not testing the vendor’s application itself, which the vendor has already vetted. However, they must do enough testing to prove that each field value is managed correctly or triggers the right dependencies. For most purposes, we can test every field, but likely not every permutation of every field.
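For illustration, here is a minimal sketch in Python of what that field-level check might look like. Everything in it is hypothetical: the payload builder, the stubbed vendor client, and the field names are illustrative stand-ins, not a real Guidewire or vendor API. A real suite would fire the actual outbound message against the vendor’s test endpoint.

```python
# Hypothetical sketch: field-level checks on an outbound/inbound exchange.
# The vendor client and all field names are illustrative, not a real API.

def build_outbound_payload(claim: dict) -> dict:
    """Map internal claim fields to the vendor's expected message shape."""
    return {
        "street": claim["lossLocation"]["addressLine1"],
        "city": claim["lossLocation"]["city"],
        "postalCode": claim["lossLocation"]["zip"],
    }

class StubVendorClient:
    """Stands in for the vendor system; echoes a normalized address back."""
    def validate_address(self, payload: dict) -> dict:
        return {
            "street": payload["street"].upper(),
            "city": payload["city"].upper(),
            "postalCode": payload["postalCode"],
            "status": "VALID",
        }

def test_each_field_round_trips():
    claim = {"lossLocation": {"addressLine1": "12 Main St",
                              "city": "Springfield",
                              "zip": "01101"}}
    outbound = build_outbound_payload(claim)
    response = StubVendorClient().validate_address(outbound)

    # One assertion per field: prove every field is managed correctly,
    # without trying to cover every permutation of every field.
    assert response["status"] == "VALID"
    assert response["street"] == "12 MAIN ST"
    assert response["city"] == "SPRINGFIELD"
    assert response["postalCode"] == "01101"
```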

The first line to find is how much testing is needed between the code configured for the customer and how the data lands and behaves in the vendor’s system. Sometimes the vendor does the validation on their side; sometimes they provide a UI for the client to verify the results. You’ll have to determine this early to make sure your QA estimates are accurate.

Line 2 – Project Teams QA

The second line is between the project teams. Once data is coming back from the integration team, there is testing that needs to happen within the x-center team.

For example, if I have an address validation in ClaimCenter, I need to know that all of the right fields are being verified by the integration and are populating when I select an alternate address.

If there is a workflow in the x-center driven by the data returned, then it’s up to the x-center team to validate it. This QA analyst has to ensure their test cases capture the behavior of the returned data.
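Continuing the address-validation example, here is a minimal sketch of what the x-center side of that test might assert. The screen model and workflow flag are hypothetical stand-ins for the real center behavior, not Guidewire objects.

```python
# Hypothetical sketch: validating that a selected alternate address
# populates the screen fields and triggers any dependent workflow.

class AddressScreenModel:
    """Illustrative stand-in for the x-center UI model."""
    def __init__(self):
        self.fields = {}
        self.workflow_started = False

    def select_alternate(self, suggestion: dict):
        # Populate every mapped field from the integration's response.
        self.fields["street"] = suggestion["street"]
        self.fields["city"] = suggestion["city"]
        self.fields["postalCode"] = suggestion["postalCode"]
        # A downstream workflow driven by the returned data.
        if suggestion.get("status") == "VALID":
            self.workflow_started = True

def test_alternate_selection_populates_fields_and_fires_workflow():
    suggestion = {"street": "12 MAIN ST", "city": "SPRINGFIELD",
                  "postalCode": "01101", "status": "VALID"}
    screen = AddressScreenModel()
    screen.select_alternate(suggestion)

    assert screen.fields == {"street": "12 MAIN ST",
                             "city": "SPRINGFIELD",
                             "postalCode": "01101"}
    assert screen.workflow_started
```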

Who Owns What?

When pressed for time, especially in later development sprints, both teams try to close stories and limit “unnecessary testing,” a term each team will define differently. There is often what might be called a “spirited discussion” about who owns what. In an ideal world, we would have overlapping test coverage for the benefit of the product and the client, but we rarely have that luxury.

At the end of the day, the integration team builds and tests the flow and storage of data, while the x-centers manage the UI and any resulting behavior in the center. And if the system changes after Production, regression testing is required on the affected integrations, even if the integration itself has not changed.

Finding the lines isn’t always easy, but it is necessary to get the most out of your QA teams.


Randy Wagner is Director of Quality Assurance for CastleBay Companies. He has 20 years of consulting experience across the private and public sectors, spanning Guidewire InsuranceSuite, InsuranceNow, and Duck Creek, with specializations in quality assurance, configuration management, and automation.