When I first encountered automation testing, I did not really understand the data-driven and keyword-driven approaches; they seemed a bit mysterious, when in fact they largely come down to parameters and function values. They also reflect some characteristics that distinguish testing from development (mainly system testing here), as well as the path the technology's evolution has taken.
Record-and-playback, by contrast, tightly couples automated test scripts to test cases, which makes the scripts costly to maintain and runs counter to the user-oriented focus of system testing.
Every automation testing tool vendor promotes its tool as easy to use: testers without a technical background can supposedly just record the testing process and then play back the recorded scripts to automate everything. Such claims are highly irresponsible.
Now let’s analyze why automation testing cannot rely solely on recording/playback.
Scripts created through recording are mostly hard-coded using scripting languages, and when the application changes, these hard-coded values need to be modified as well. Therefore, maintaining these recorded scripts is very costly, to the point of being nearly unacceptable.
All test scripts must be recorded when the application can be executed correctly; if defects are found during recording, testers must report them to the defect management system, and only after the defect is fixed can the recording continue. In such cases, relying solely on recorded scripts for testing is highly inefficient.
These recorded scripts are also not very reliable: even when the application has not changed, they may fail to execute because of unexpected conditions. And if the wrong scripting settings were chosen during recording, the script has to be re-recorded from scratch.
In summary, while creating automated test scripts through recording seems easy, it actually encounters the following issues:
① Most testers do not have a technical background and find it difficult to fully master the testing tools;
② The application must reach a certain level of stability before recording test scripts can begin;
③ Recorded test scripts are too tightly coupled with test data;
④ The cost of maintaining automated test scripts is very high.
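The brittleness described above can be sketched in a few lines. Assume a hypothetical playback API with `click` and `type_text` calls (here faked by a tiny `FakeUI` class so the sketch runs on its own); a recorded script hard-codes locators and data together, while a small refactor separates them so a UI change touches only one table:

```python
# Illustrative sketch: why recorded scripts are brittle.
# FakeUI is a stand-in for a recording tool's playback API (hypothetical).

class FakeUI:
    """Minimal fake UI so the sketch runs without a real tool."""
    def __init__(self):
        self.log = []
    def click(self, locator):
        self.log.append(("click", locator))
    def type_text(self, locator, text):
        self.log.append(("type", locator, text))

# A recorded script hard-codes locators and data together:
def recorded_login(ui):
    ui.click("window('Login_v1').button('OK')")   # brittle locator
    ui.type_text("field_user", "alice")           # hard-coded data
    ui.type_text("field_pass", "secret123")       # hard-coded data

# Separating data from actions localizes future changes:
LOCATORS = {"ok": "window('Login_v1').button('OK')",
            "user": "field_user", "pwd": "field_pass"}

def login(ui, user, pwd, loc=LOCATORS):
    ui.click(loc["ok"])
    ui.type_text(loc["user"], user)
    ui.type_text(loc["pwd"], pwd)

ui = FakeUI()
login(ui, "alice", "secret123")
print(len(ui.log))  # 3 actions performed
```

When the login window is renamed, only the `LOCATORS` table changes; the recorded variant would need every hard-coded line fixed by hand.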
What is data-driven? Many people assume it means putting parameters in an Excel file and reading them when the scripts run. In fact, that is not data-driven at all; it is merely a more advanced form of parameterization.
So why is it called data-driven? The data must actually drive something. Ask yourself: can a plain Excel parameter sheet control the business flow of a test? It cannot. So how is driving achieved?
Putting test data in separate files is therefore just advanced parameterization. To be data-driven, the data must control the business flow of the test. For example, when testing a web application with many pages, the data can decide which page each step operates on (that is, navigate to the corresponding page based on the data). Data-driven is a lower-level form of keyword-driven: it controls at the function level, while keywords control at the action level. In other words, data-driven should control the entire testing process.
In some complex test cases, a single case contains many test flows, and different flows use different input data. Here the test data includes not only parameters but also control fields for the business process (which can be understood as logical parameters); this further reflects the meaning of data-driven.
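The idea of a control field can be made concrete. In this sketch (all names hypothetical) each data row carries a `flow` column alongside its parameters, so the data itself selects which business flow executes, rather than merely filling in values:

```python
# Sketch: test rows carry a control field ("flow") plus parameters,
# so the data itself drives which business flow runs.
def flow_deposit(acct, amount):
    acct["balance"] += amount
    return acct["balance"]

def flow_withdraw(acct, amount):
    if amount > acct["balance"]:
        return "rejected"
    acct["balance"] -= amount
    return acct["balance"]

FLOWS = {"deposit": flow_deposit, "withdraw": flow_withdraw}

test_rows = [
    {"flow": "deposit",  "amount": 100, "expected": 100},
    {"flow": "withdraw", "amount": 30,  "expected": 70},
    {"flow": "withdraw", "amount": 999, "expected": "rejected"},
]

def run(rows):
    acct = {"balance": 0}
    results = []
    for row in rows:
        actual = FLOWS[row["flow"]](acct, row["amount"])  # data picks the flow
        results.append(actual == row["expected"])
    return results

print(run(test_rows))  # [True, True, True]
```

The `flow` column is the "logical parameter" the text describes: change it and a different business path runs, with no change to the script.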
Data-driven automation testing is a testing method proposed to address the tight coupling between development and testing. By establishing a mapping table that associates software metadata defined by testing and development, a loose coupling relationship is created between testing and development.
Whether testers modify test scripts or developers modify software, only the metadata mapping table needs to be updated, thus allowing testing and development to proceed in synchronization. This can reduce the workload of debugging test scripts and better achieve automated testing.
A data-driven automation testing framework is one that reads input and output test data from a data file (such as an ODBC data source, Excel file, CSV file, ADO object, etc.) and then passes the values as variables into pre-recorded or manually written test scripts.
These variables carry the inputs and the expected outputs used to verify the application. The logic for reading the data file, tracking test status, and recording test information lives in the scripts, while the test data lives only in the data files, so the scripts become merely a "driver", a mechanism for transferring data.
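A minimal version of such a driver can be sketched with the standard `csv` module. The CSV text is inlined here so the example is self-contained, and `login_under_test` is a hypothetical stand-in for the real application call; the point is that the script contains only reading and verification logic, and all data stays in the (conceptual) file:

```python
import csv
import io

# Sketch of a data-driven "driver": the script only reads rows and
# feeds them to the function under test; all data lives in the CSV.
CSV_DATA = """username,password,expected
alice,secret123,ok
bob,wrongpass,denied
"""

def login_under_test(username, password):
    # Hypothetical stand-in for driving the real application.
    return "ok" if (username, password) == ("alice", "secret123") else "denied"

def driver(csv_text):
    results = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        actual = login_under_test(row["username"], row["password"])
        results.append((row["username"], actual == row["expected"]))
    return results

print(driver(CSV_DATA))  # [('alice', True), ('bob', True)]
```

Adding a new test case means adding a CSV row; the driver script never changes.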
Data-driven scripts are application-specific scripts, recorded or hand-coded in the automation tool's proprietary language, with test data supplied through variables. These variables act as carriers for key application inputs, allowing external data to drive the application through the script.
These data-driven scripts often contain hard-coded data, sometimes including very fragile identification strings in window components. When this happens, the scripts can easily become non-functional due to changes in the application.
Another common characteristic of data-driven scripts is that all the test-design effort ultimately ends up expressed in the automation tool's scripting language, or duplicated across both manual and automated test scripts. This means everyone involved in developing or executing the automated tests must be proficient in the programming languages of the test environment and the automation tools.
1) Advantages:
① Test scripts can be synchronized with application development, and when application functionality changes, only the business functionality part of the scripts needs to be modified;
② By utilizing a modeled design, repetitive scripts are avoided, reducing the cost of creating or maintaining scripts;
③ Test input data, verification data, and expected test results are stored separately in data files, facilitating modification and maintenance by testers;
④ By judging whether the return value is “True” or “False,” error handling can be performed, enhancing the robustness of test scripts;
⑤ Automation test developers create data-driven testing processes, while testers create test data;
⑥ During testing, results are collected and presented in the context of input data, simplifying manual result analysis.
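Advantages ④ and ⑥ above can be sketched together: each step returns True/False instead of crashing the run, and every result is reported alongside the input data that produced it (all function names here are illustrative):

```python
# Sketch: True/False error handling (advantage ④) plus reporting
# results in the context of their input data (advantage ⑥).
def step(name, func, *args):
    """Run one test step; convert any failure into a False result."""
    try:
        func(*args)
        return (name, args, True)
    except Exception:
        return (name, args, False)

def must_be_positive(x):
    if x <= 0:
        raise ValueError(x)

report = [step("positive-check", must_be_positive, n) for n in (5, -1, 3)]
for name, args, passed in report:
    print(name, args, "PASS" if passed else "FAIL")
```

Because the failing input (`-1`) sits right next to its FAIL verdict, manual result analysis needs no digging through logs.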
2) Disadvantages:
① Must be very proficient in the scripting languages of automation testing tools;
② Each script corresponds to multiple data files, which need to be stored in their respective directories based on the functionality of the scripts, increasing complexity;
③ Testers need to maintain corresponding test plans based on specific test data and write this data into various files with different requirements;
④ When editing data files, the format expected by the test scripts must be followed, or errors will occur when the scripts process them. A data-driven framework is simpler and faster to use when dedicated technical staff maintain it, but that maintenance is demanding; without sustained upkeep, the model can eventually break down even after long periods of use.
The concept of keyword-driven naturally arises from an object-oriented perspective, where the same business logic is naturally written into a class or function to be called as a keyword by different test scripts.
When the testing framework evolves to the point where every testing step can be completed through functions and classes, it reaches the advanced stage of keyword-driven testing: developing a test case becomes a matter of combining test data with keywords, which reduces to the familiar task of filling in tables, ultimately yielding tests driven entirely by data and keywords.
In a keyword-driven framework, you can create keywords and associated methods and functions. Then, you create a function library that contains logic for reading keywords and calling related actions.
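The core of such a function library can be sketched in a few lines. The keywords, actions, and `state` dictionary below are all hypothetical; the essential pattern is a table of (keyword, arguments) rows and a dispatcher that looks each keyword up and invokes its function:

```python
# Sketch of a keyword-driven core: a table of (keyword, args) rows
# and a library mapping each keyword to an action function.
state = {}  # stand-in for application state

def open_page(name):
    state["page"] = name

def enter_text(field, text):
    state[field] = text

def verify(field, expected):
    assert state.get(field) == expected, f"{field} != {expected}"

KEYWORDS = {"OpenPage": open_page, "EnterText": enter_text, "Verify": verify}

test_table = [
    ("OpenPage",  ["login"]),
    ("EnterText", ["username", "alice"]),
    ("Verify",    ["username", "alice"]),
]

def run_table(table):
    for keyword, args in table:
        KEYWORDS[keyword](*args)  # look up the keyword, invoke the action
    return True

print(run_table(test_table))  # True when every step passes
```

Writing a new test case is now literally filling in a table; only adding a brand-new keyword requires touching code.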
Keyword-driven automation testing (also known as table-driven test automation) is a variant of data-driven automation testing that supports tests composed of different sequences or multiple paths. It is an application-independent automation framework that is also suitable for manual testing.
The keyword-driven framework builds on the data-driven approach: its tables contain instructions (keywords), not just data. The tests are developed as keyword data tables, independent of the automation tool used to execute them. Keyword-driven automation testing is thus an effective improvement on, and supplement to, data-driven automation testing.
This automation testing model mainly consists of a core data-driven engine, component functions, support libraries, and application mapping tables. Automated testing begins with an initial script that passes high-level test tables to a high-level driver, which, when processing these tables, encounters mid-level test tables and calls a mid-level driver. Similarly, when the low-level driver processes low-level tables, it attempts to keep the application synchronized with the tests. When the low-level driver encounters a low-level keyword component, it determines the type of component and calls the corresponding component function module to process the instruction.
All these elements rely on information in the mapping table, which serves as a bridge between the automation testing model and the application being tested. The support library mainly handles functions such as file processing, logging, and email sending.
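The layered model described above can be sketched as nested drivers, with an application mapping table bridging logical names and concrete locators. Every table, driver, and locator string here is hypothetical; the point is the flow of control from high-level tables down to component functions via the map:

```python
# Sketch of the layered keyword-driven model: high-level tables refer
# to mid-level tables, which refer to low-level keyword rows; the
# application map translates logical names into concrete locators.
APP_MAP = {"LoginButton": "win.login.btn_ok"}  # logical -> concrete (hypothetical)

def component_click(logical_name, log):
    # Component function: resolve the name via the map, then act.
    log.append(("click", APP_MAP[logical_name]))

COMPONENTS = {"Click": component_click}

LOW = {"do_login": [("Click", "LoginButton")]}   # low-level keyword rows
MID = {"login_suite": ["do_login"]}              # mid-level table
HIGH = ["login_suite"]                           # high-level table

def low_driver(name, log):
    for keyword, arg in LOW[name]:
        COMPONENTS[keyword](arg, log)  # dispatch to component function

def mid_driver(name, log):
    for low_name in MID[name]:
        low_driver(low_name, log)

def high_driver(tables):
    log = []
    for mid_name in tables:
        mid_driver(mid_name, log)
    return log

print(high_driver(HIGH))  # [('click', 'win.login.btn_ok')]
```

If the login button's real locator changes, only `APP_MAP` is edited; every table at every level keeps using the logical name `LoginButton`.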