Background
When verifying the correctness and reliability of a system, hand-written use case scenarios alone cannot cover everything that happens in a full production environment, so a set of traffic replay tools is needed. Before the system officially goes live, real online requests are used to exercise the system to be launched: under normal requests, to find out whether anything goes wrong, and under large numbers of requests, to find the system's performance bottlenecks. Commonly used traffic replay tools include GoReplay, tcpcopy and so on.
The traffic regression test function of the automatic test module in the Choerodon performance platform mainly uses GoReplay to record the HTTP requests and responses generated by operating the product interface into a traffic file, which is then imported into the Choerodon platform to generate use cases for management and execution. By introducing GoReplay and its use in the Choerodon performance platform, this article helps readers understand the concept and usage of Choerodon traffic regression testing.
GoReplay
GoReplay, originally named gor, is easy to use and fairly complete in functionality, which is why we use it for traffic recording. GoReplay is the simplest and safest way to test an application with real traffic before it goes into production.
As an application grows, the amount of work required to test it also grows exponentially. GoReplay offers a simple idea of reusing existing traffic for testing, which makes it very powerful. It can analyze and record application traffic without affecting the application, eliminating the risk of putting a third-party component in the critical path.
Block diagram of the working principle of GoReplay:
Installation
- Download address: https://github.com/buger/goreplay/releases
- Enter the following command in the environment:
wget https://github.com/buger/goreplay/releases/download/v1.1.0/gor_1.1.0_x64.tar.gz
In this way, we can get the gor_1.1.0_x64.tar.gz compressed file.
- Then unzip it with the following command:
tar -xvf gor_1.1.0_x64.tar.gz
After decompression we get a gor executable file; move it to a directory on the PATH so that the gor command can be used to record traffic.
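A minimal sketch of putting the binary on the PATH (assuming /usr/local/bin is on the PATH of the server; the target directory is only an example):

```bash
# Move the extracted gor binary to a directory that is already on the PATH
sudo mv gor /usr/local/bin/

# Confirm that the shell can now find the gor command
command -v gor
```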
GoReplay basic options
Available inputs:
- --input-raw: used to capture HTTP traffic; you should specify an IP address or interface and an application port.
- --input-file: accepts a file previously recorded with --output-file.
- --input-tcp: used by a GoReplay aggregation instance if you decide to forward traffic from multiple GoReplay forwarder instances to it.
Available outputs:
- --output-http: replays HTTP traffic to the given endpoint; accepts a base URL.
- --output-file: logs the incoming traffic to a file.
- --output-tcp: forwards the incoming data to another GoReplay instance.
- --output-stdout: used for debugging; outputs all data to stdout.
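To illustrate how these inputs and outputs combine, here is a minimal sketch, assuming the service listens on port 8080 and that http://test.example.com:8080 is a hypothetical test environment:

```bash
# Print captured traffic to the terminal; useful for a quick debugging check
sudo gor --input-raw :8080 --output-stdout

# Mirror live traffic from port 8080 to a test environment in real time
sudo gor --input-raw :8080 --output-http "http://test.example.com:8080"

# Replay a previously recorded traffic file against the same test environment
gor --input-file requests.gor --output-http "http://test.example.com:8080"
```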
Practice
1. Recording traffic
1.1 First, install gor 1.1.0 on the server;
1.2 Then enter the following command:
sudo nohup gor --input-raw :8080 \
--http-allow-method GET \
--http-allow-method POST \
--http-allow-method PUT \
--http-allow-method DELETE \
--input-raw-track-response \
--input-raw-timestamp-type PCAP_TSTAMP_HOST \
--input-raw-buffer-size 32mb \
--prettify-http \
--output-file-append \
--output-file requests.gor &

The options used here are:
- --input-raw :8080 listens on the service port (the default gateway port is 8080).
- --http-allow-method restricts recording to the listed methods, here GET, POST, PUT and DELETE.
- --input-raw-track-response also captures the response messages.
- --input-raw-timestamp-type PCAP_TSTAMP_HOST specifies the timestamp type.
- --input-raw-buffer-size 32mb controls the size of the system buffer used to hold TCP packets.
- --prettify-http automatically decodes Content-Encoding: gzip and Transfer-Encoding: chunked requests and responses.
- --output-file-append appends to the file so that only one .gor file is generated in the end.
- --output-file requests.gor specifies the name of the result file.
This command listens on the service port and records requests of the specified types; here the recorded request types are GET, POST, PUT and DELETE. The responses are captured as well, and the requests are appended to a single file, here named requests.gor.
1.3 After the command is executed, the output is as follows:
The [1] 19436 shown here is the PID of the GoReplay process. After we finish recording, we can use this PID to terminate GoReplay.
1.4 At this point, GoReplay has started recording traffic. The tester can now start testing the system under test, and the requests issued during this period will be recorded.
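While recording, a simple way to confirm that traffic is really being captured is to check that the output file keeps growing as testers operate the system (a generic shell check, assuming the recording command above was run in the current directory):

```bash
# Run this a few times while testers operate the system;
# the file size should keep increasing if requests are being recorded
ls -lh requests.gor
```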
Note that before officially recording the related functions, it is recommended that testers refresh the page so that the self interface, which returns the current user's information, is requested; having the response of this interface in the recording makes it easier to analyze the use cases when the traffic file is imported later. If the self interface was not recorded and no user information acquisition interface is provided at import time, the user to which a request belongs cannot be resolved, and the use cases generated from such requests will be ignored.
1.5 After recording traffic for a period of time, execute the following command to stop GoReplay:

sudo kill -15 <PID of the gor process>

In our case, for example:

sudo kill -15 19436

This terminates the gor process.
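If the PID printed at startup was not noted down, it can be looked up with standard process tools before sending the signal (a generic shell sketch, not specific to GoReplay):

```bash
# List running gor processes together with their PIDs and command lines
pgrep -af gor

# Then stop the recording gracefully; replace <PID> with the PID found above
sudo kill -15 <PID>
```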
1.6 At this point, in the directory where the recording command was executed, you can find the traffic file requests.gor.
At this point, the recording is complete.
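Before importing the file into the platform, you can take a quick look at its contents to confirm that requests and responses were actually captured (assuming the default plain-text output format, since no compressed file extension was used):

```bash
# Peek at the beginning of the recorded traffic file
head -n 20 requests.gor
```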
2. Import the traffic file
2.1 Enter the Choerodon traffic regression test page:
2.2 Click the import traffic file button at the top right of the traffic regression test page to enter the traffic import interface:
2.3 Choose the directory in which the generated use cases will be placed; here we choose the test collection directory. Click the upload button to upload the requests.gor file we just recorded. After the upload is confirmed, a file import record will appear immediately below.
If the number of imported use cases is 0, the possible reasons are:
① During recording, the primary key encryption function of the system under test was not turned off;
② During recording, the self interface was not requested to obtain user information, and no user information acquisition interface was provided at import time;
③ A user information acquisition interface was provided, but the recorded traffic file is too old and exceeds the user's token expiration time, so the authentication information in the traffic file has expired, the user cannot be identified, and no use cases can be generated;
④ None of the requests are JSON requests;
⑤ None of the request methods are GET, POST, PUT or DELETE.
2.4 After the file is successfully imported, the corresponding use case will be generated in the selected directory. The path, request method, menu, user, and request time corresponding to each use case will be displayed in the list.
- Path: The path requested in the use case.
- Request method: the HTTP method of the request in the use case.
- Menu: The menu to which the corresponding request belongs in the use case.
- User: The name of the user who performed this request during the recording process.
- Request time: the execution time corresponding to the request during the recording process.
3. Use case batch processing
3.1 For the use cases obtained by importing the traffic file, the ID parameters used by each request will change when the cases are executed later. We therefore need to use the use case batch processing function to replace the ID parameters in each request path, request parameter and request body with variables.
Before that, we also need to select a POST type request and extract the ID generated in the response body as a variable for reference in subsequent use cases.
First, select a traffic regression set in the tree structure on the left side of the page, and then click the use case batch processing button at the top, and the batch processing page will appear on the right.
3.2 Use the search bar to filter the use cases. The supported search methods are:
- Enter search criteria to query: you can search for any content; use cases whose path, request or response contains the search value will be displayed in the list below.
- Quick filtering: the preset quick filter is "use cases containing numeric values", which directly searches for all use cases whose path, request or response contains a numeric value and helps further narrow down the ID query range. Saved custom filter conditions are also stored in the quick filter drop-down box.
- Request method filtering: Allows filtering out GET, POST, PUT and DELETE type use case requests.
- Use case status filtering: supports filtering use case requests whose status is "processing complete" or "unprocessed".
- Regular expression filtering: supports using regular expressions to filter out the use case requests that match a given pattern.
- Directory filtering: Support filtering out the use case requests under each directory.
- Menu filtering: Support filtering out the use case requests under the corresponding menu.
- Specific field: used to specify the effective location of the search value. Support positioning: path, request header, request parameter, request body, response header and response body.
3.3 Extract variables on the page. In this interface, you need to find the use case request that generates the ID and extract the ID parameter from its response body as a variable. The specific steps are as follows:
- Locate the target use case through the various options in the search bar.
  - In the quick filter of the search bar, select "use cases containing numeric values" to first filter out all use cases that contain numeric values.
  - In the request method filter, select POST to narrow the results down to the target use cases.
  - Select the menu where the function block to be processed is located, or enter relevant content in the search criteria to further narrow the search scope.
  - Finally, find the target use cases one by one among the filtered use case requests.
- Tick a target use case and click Add Variable Extraction below; the variable extraction interface will pop up on the right.
- Select the extraction source: usually the response body JSON, but this depends on the location and format of the target variable; response body JSON, response body XML, response body text and response headers are supported.
- Input variable name: The variable name entered here will be used as the variable referenced in subsequent use cases.
- Selector: use the selector to locate the variable to be extracted in the chosen source (for example, if the POST response body is JSON such as {"id": 1001, ...}, the selector should point at the id field).
After the variable has been extracted successfully, a batch ID replacement also needs to be performed for the use cases whose requests use the ID parameter, replacing it with the extracted variable. With this function, configurable parameters can be extracted as variables in batches, for example common project IDs, tenant IDs or other resource IDs in requests.
Value replacement function:
- Select the replacement area: supports selecting the path, request parameter, request header, request body, response header or response body; used to locate the specific area to be replaced in all the selected use cases.
- Enter the source value: the exact value of the original ID parameter; occurrences of this ID value will subsequently be replaced with the extracted variable.
- Enter the replacement value: Enter the variable that needs to be quoted here.
Example: if the variable extracted before is named id, enter ${id} here. For instance, if a recorded use case path were /api/v1/resources/1001 (a hypothetical path) and 1001 is the source value, the path becomes /api/v1/resources/${id} after replacement.
- Use case status replacement: select in the drop-down box the target status to which the selected use case requests should be changed; use case requests that have already been handled can be set to the processed state in batches. After returning to the list, the status of these use cases becomes "processing complete".
Summary
The traffic regression test of the Choerodon full-scenario performance platform uses GoReplay to record product interface operations in batches and manages the resulting use cases centrally, which makes subsequent batch regression testing easy. This greatly reduces repetitive and time-consuming work for testers, such as writing scripts and collecting test data, and improves the team's testing efficiency.
Reference Materials
https://www.cnblogs.com/sunsky303/p/9072871.html
https://blog.csdn.net/xqtesting/article/details/109722583
This article was originally created by the Choerodon technical team; please indicate the source when reprinting: official website
About Choerodon
The Choerodon full-scenario performance platform provides systematic methodology together with collaboration, testing, DevOps and container tools, helping companies connect the requirements, design, development, deployment, testing and operation processes and improve management efficiency and quality in one stop. From team collaboration to the DevOps tool chain, and from platform tools to systematic methodology, Choerodon fully meets the needs of collaborative management and engineering efficiency, runs through the entire end-to-end process, and helps teams become faster, stronger and more stable. Click here to try Choerodon.