A data-driven framework collates all of the possible test data and stores it in a single place. The data in this file is then fed through an automated process that runs the tests multiple times without manual effort.
There are two steps in this process:
1. All the input values are stored in a file such as a .csv, .xls, or .xml file. This file covers test data for the best-case, worst-case, positive, and negative test scenarios.
2. A test script is developed that reads this file, uses its values as variables, and runs iteratively over every data set in the file (a sketch follows this list).
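As an illustration, the minimal Python sketch below assumes a hypothetical login_data.csv file with the columns username, password, and expected, plus a stand-in check_login function; a real project would substitute its own data file and application call.

```python
import csv

def check_login(username, password):
    # Stand-in for the application under test (hypothetical).
    return username == "admin" and password == "s3cret"

def run_data_driven_tests(data_file):
    # Step 2: read every row of the data file and apply the same test logic to each.
    with open(data_file, newline="") as f:
        for row in csv.DictReader(f):
            expected = row["expected"] == "True"
            actual = check_login(row["username"], row["password"])
            status = "PASS" if actual == expected else "FAIL"
            print(f"{status}: username={row['username']!r}")

if __name__ == "__main__":
    # Step 1: login_data.csv holds one row per scenario (positive, negative,
    # best case, worst case) under a header line "username,password,expected".
    run_data_driven_tests("login_data.csv")
```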
Important observations:
1. Collect different sets of test data in a file or database.
2. Create a script that can read this data.
3. Store the results of each automated run from step 2 and analyze them by comparing actual results against expected results (a sketch follows this list).
4. Repeat the test with the next set of data in the input file.
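One way to realise steps 3 and 4 is to record each actual/expected pair in a results file for later analysis. The sketch below reuses the hypothetical CSV layout and check_login stand-in from the earlier example; the file names are placeholders.

```python
import csv

def run_and_record(data_file, results_file, test_fn):
    # Step 4: iterate over every data set; step 3: store actual vs. expected results.
    with open(data_file, newline="") as src, open(results_file, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["username", "expected", "actual", "result"])
        for row in csv.DictReader(src):
            actual = str(test_fn(row["username"], row["password"]))
            verdict = "PASS" if actual == row["expected"] else "FAIL"
            writer.writerow([row["username"], row["expected"], actual, verdict])

# Example usage with the stand-in from the previous sketch:
# run_and_record("login_data.csv", "login_results.csv", check_login)
```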
Advantages
1. DDT makes regression testing easier and faster.
2. Any change in the test script or test data will not affect the other since both are maintained separately.
3. DDT keeps test logic and test data logically separated, which makes the testing process easier to follow. Because of this separation, both can be reused repeatedly across different test cases and scenarios.
Limitations
1. Crafting the right data set is an essential skill in a DDT environment, where data is the driving factor.
2. Debugging programming errors when a failure occurs is a difficult task for testers with no programming knowledge.
3. The programmer needs to ensure that all the data in the test file has actually run through the script (a simple coverage check is sketched after this list).
4. The process of DDT followed in a team should be documented to include details about managing test scripts, analyzing test results in various environments, etc.
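As a rough illustration of point 3, the check below compares the number of rows in the input file with the number of recorded results; the file layout follows the hypothetical CSV examples above.

```python
import csv

def verify_full_coverage(data_file, results_file):
    # Fail loudly if any data row was skipped by the test script.
    with open(data_file, newline="") as f:
        data_rows = sum(1 for _ in csv.DictReader(f))
    with open(results_file, newline="") as f:
        result_rows = sum(1 for _ in csv.DictReader(f))
    if data_rows != result_rows:
        raise AssertionError(f"only {result_rows} of {data_rows} data rows were executed")
```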
Types of Data-Driven Testing
DDT varies depending on the type of input used to test the application (a sketch covering several of these sources follows the list).
1. Comma-separated values (.csv) files
2. Excel file (.xls)
3. Database tables
4. Table variables
5. Script arrays
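To show that the same test logic can sit on top of several of these sources, here is a sketch that reads rows from a CSV file, a SQLite database table, and an in-memory script array; the file name, table name, and column layout are assumptions, and Excel (.xls) input is omitted because it requires a third-party library.

```python
import csv
import sqlite3

def rows_from_csv(path):
    # Source 1: comma-separated values file.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def rows_from_db(db_path):
    # Source 3: database table (SQLite here, purely for illustration).
    conn = sqlite3.connect(db_path)
    conn.row_factory = sqlite3.Row
    for row in conn.execute("SELECT username, password, expected FROM login_data"):
        yield dict(row)
    conn.close()

# Source 5: a script array, i.e. test data held directly in the script.
SCRIPT_ARRAY = [
    {"username": "admin", "password": "s3cret", "expected": "True"},
    {"username": "guest", "password": "wrong", "expected": "False"},
]

def run_tests(rows, test_fn):
    # The test logic is identical no matter where the rows come from.
    for row in rows:
        actual = str(test_fn(row["username"], row["password"]))
        print("PASS" if actual == row["expected"] else "FAIL", row["username"])
```

Calling run_tests(SCRIPT_ARRAY, check_login) or run_tests(rows_from_csv("login_data.csv"), check_login) exercises the same script against two different sources.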
Best Practices
1. Perception of a test case- Rather than checking only the basic flow and accuracy of the application under test, look for additional qualities while testing. Observing load latency with a huge volume of data, or API response times during heavy load, are a few examples of taking the testing perception to another level.
2. Avoid manual intervention- When multiple navigations or redirection paths are involved, write the test case so that manual intervention is avoided at all costs, even for complex navigation.
3. Emphasis on negative test cases- Negative test cases exercise the application's exception handling. If negative test cases are carefully written and the application under test is run through them, unhandled exceptions are far less likely to surface later (a short example follows this list).
4. Focus on realistic, production-like data during the data-driven process.
5. Virtual APIs should be driven by meaningful data.
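As an example of practice 3, the pytest sketch below drives a hypothetical parse_age function with invalid inputs and asserts that each one raises the expected exception; the function, data rows, and test name are illustrative placeholders.

```python
import pytest

def parse_age(value):
    # Hypothetical function under test: converts a string to a non-negative age.
    age = int(value)  # raises ValueError for non-numeric input
    if age < 0:
        raise ValueError("age must not be negative")
    return age

# Negative data rows: each pairs an invalid input with the exception we expect.
NEGATIVE_CASES = [("abc", ValueError), ("-5", ValueError), ("", ValueError)]

@pytest.mark.parametrize("raw, expected_error", NEGATIVE_CASES)
def test_parse_age_rejects_invalid_input(raw, expected_error):
    # The test passes only when the expected exception is actually raised.
    with pytest.raises(expected_error):
        parse_age(raw)
```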