Tuesday, November 24, 2009

Testing Framework Review

I was invited to review the existing automated testing solution created by the testing team of the Equifax project. They are facing problems related to the QARun tool and lack a clear understanding of testing methodology and framework creation.

Objective:
Review the existing framework of automated tests for a group of J2EE applications, and define a strategy for creating an automated regression test suite to increase the efficiency of the testing team.

Application Architecture:
A set of applications related to user credit details. All the data and logic are handled within mainframes. Multiple small J2EE applications basically take care of reporting and some decision making based on the reports.

Data Flow:
Input Web Forms => XML => C++ => Mainframe Flat File => Mainframe Processing => Mainframe Flat File => C++ => XML => Output Report on Web Screen

A very good set of test data is already available and is maintained by the test team. This is a significant strength.

The tool being used is QARun. It is incapable of identifying the objects on the web screen required to validate the test criteria.

Current Solution (Using QA Run):
1. Fill up the form in browser
2. Submit to generate report on screen
3. Select all the data on screen using the Select All command.
4. Copy it into a txt file.
5. Save it to a folder as the baseline.
6. In the next release, repeat all the above steps.
7. Compare the two files generated above and validate that each and every character in the txt files matches in both location and value.
8. If they don't match, look up the root cause manually.

I identified the following problems:

1. No clear distinction between GUI validation and data validation test cases.
2. The current methodology is fragile: any cosmetic change to the report layout breaks the character-by-character comparison, even when the data is correct.
3. No clear boundaries for the data inside the txt files, making reliable validation difficult.

Solutions Discussed:
1. Define GUI validation test cases and data validation test cases separately.
2. Get them signed off by the stakeholders for priority and importance.
3. As discussed with the DEV team, generate an XML from the existing DOM object within the code which is being used to generate the report.
4. Using XPath and VBScript (under QARun), code the data validation scenarios as automated test scripts.
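To illustrate the idea, here is a minimal sketch in Python (the real scripts would be VBScript under QARun, and the report XML structure and field names below are hypothetical): each data validation test becomes an XPath lookup against the report XML plus an expected value, so layout changes no longer break the check.

```python
# XPath-style data validation against a report XML, instead of
# character-by-character comparison of captured screen text.
import xml.etree.ElementTree as ET

# Hypothetical report XML, as generated from the DOM object in the code.
REPORT_XML = """
<report>
  <customer id="C1001">
    <creditScore>720</creditScore>
    <decision>APPROVED</decision>
  </customer>
</report>
"""

def check(tree, xpath, expected):
    """Validate that the node at `xpath` carries the expected value."""
    node = tree.find(xpath)
    actual = node.text if node is not None else None
    return actual == expected, actual

tree = ET.fromstring(REPORT_XML)
cases = [
    (".//customer[@id='C1001']/creditScore", "720"),
    (".//customer[@id='C1001']/decision", "APPROVED"),
]
for xpath, expected in cases:
    ok, actual = check(tree, xpath, expected)
    print(f"{'PASS' if ok else 'FAIL'}: {xpath} -> {actual}")
```

Because each test targets one field by path, a failure points straight at the field that changed, removing most of the manual root-cause lookup.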

Action Items:
1. Define naming conventions for test script files / test objects.
2. Set up a process for getting signoff from stakeholders.
3. Educate the test team on the naming conventions and the use of SVN.
4. Synchronize with the DEV team to get their help as and when required.
5. Team to learn and try out XPath.

Next Steps:
1. Understand the gap, if any, between the DOM object and the resulting report web page.
2. Create a simple test script using an XML as input data, with the test condition written using XPath in VBScript.
3. Present to stakeholders.

This was a very interesting exercise for me, and I hope it leads to a win-win solution for both the test team and the stakeholders.
