GitLab Performance Tool (GPT) Quick Start
What is GitLab Performance Tool (GPT)
The GitLab Performance Tool (`gpt`) is built and maintained by the GitLab Quality Engineering - Enablement team to provide performance testing of any GitLab instance. The tool is built on the industry-leading open-source tool k6 and provides numerous tests designed to effectively performance test GitLab.
GitLab recommends running GPT against your own GitLab environment to get an effective performance test. We do not recommend running it against a production instance; only do so if it is really required, and then run it at the quietest possible time. Depending on your system environment, the test may take at least 4 hours.
NOTE: This quick start was written and adapted based on documentation for GPT v2 (2.10.0). Always check the official GitLab project documentation, GitLab Performance Tool, for the latest changes.
Requirements:
- A separate workstation or server with Docker installed.
- The workstation must be able to connect to the GitLab instance.
Initializing the Environment
On your workstation with Docker installed, please run the following in your terminal:
Preparing the Environment
This step generates the data that will be used by the tests later. More details are available in the GPT project documentation.
Create a Personal Access Token with the API scope from an Admin user:
- In the top-right corner on your GitLab UI, select your avatar.
- Select Edit profile.
- In the left sidebar, select Access Tokens.
- Enter a name and optional expiry date for the token.
- Select the api scope.
- Select Create personal access token.
- Save the personal access token somewhere safe. After you leave the page, you no longer have access to the token.
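Before continuing, you can optionally confirm that the token works by calling the GitLab REST API. The hostname below is a placeholder; replace it (and `<TOKEN>`) with your own values. A valid Admin token returns that user's JSON profile, while an invalid one returns a 401 error:

```shell
# Placeholder host: replace gitlab.example.com with your instance URL.
# A valid token returns the user's JSON profile; an invalid one returns 401.
curl --header "PRIVATE-TOKEN: <TOKEN>" "https://gitlab.example.com/api/v4/user"
```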
Next, edit your environment file under `./k6/config/environments/`. In this example, we will use the 2k users environment, so the file is `2k.json`. Replace the value for `"url"` with the URL of your GitLab instance and the value for `"user"` with the username of the Admin user created in the step above.
Edit the following lines:
```json
"url": "<your gitlab url>",
"user": "<username that the access token belongs to>",
```
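If you prefer to script the edit, a small sketch like the following works. The file name and field layout are assumed from the GPT repository; the sketch operates on a scratch copy purely for illustration:

```shell
# Create a scratch copy of the environment file for illustration;
# in practice, edit ./k6/config/environments/2k.json in place.
cat > 2k.json <<'EOF'
{
  "environment": {
    "name": "2k",
    "url": "<your gitlab url>",
    "user": "<username that the access token belongs to>"
  }
}
EOF

# Point the config at your instance and Admin user (values are examples).
sed -i 's|"url": "[^"]*"|"url": "https://gitlab.example.com"|' 2k.json
sed -i 's|"user": "[^"]*"|"user": "root"|' 2k.json

# Show the updated fields.
grep -E '"(url|user)"' 2k.json
```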
NOTE: If you have a top-level group named `gpt`, replace the value for `"root_group"` with another unique top-level group name.
Run the following docker command to generate the data needed for the performance test:
```shell
docker run -it -e ACCESS_TOKEN=<TOKEN> -v $(pwd)/k6/config:/config -v $(pwd)/results:/results gitlab/gpt-data-generator --environment 2k.json
```
Replace `<TOKEN>` with your personal access token created in Step 1.
Running the Tests
Full details and explanations are available in the GPT project documentation.
Run the following Docker command:
NOTE: Replace `<TOKEN>` with your personal access token, and use `60s_40rps.json` as the options file to test against a 2k users environment. [Optional] You may replace `<OPTIONS-FILE>` with one of the following recommended options files based on your target environment user count:
- 1k -
- 2k -
- 3k -
- 5k -
- 10k -
- 25k -
- 50k -
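The run command itself follows the same pattern as the data generator. Based on the upstream GPT documentation, a typical invocation looks like the sketch below; the image name `gitlab/gitlab-performance-tool` and flags are assumptions taken from those docs, so verify them against the GPT version you are using:

```shell
# <TOKEN> is a placeholder; 60s_40rps.json targets a 2k users environment.
docker run -it -e ACCESS_TOKEN=<TOKEN> \
  -v $(pwd)/k6/config:/config -v $(pwd)/results:/results \
  gitlab/gitlab-performance-tool --environment 2k.json --options 60s_40rps.json
```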
Viewing Test Output and Results
After starting the tool, you will see it run each test in order. Once all tests have completed, you will be presented with a results summary; an example summary is linked in the GPT project documentation.
For your reference, here are the test runs from GitLab:
- Latest Results - Our automated CI pipelines run multiple times each week and will post their result summaries to the wiki here each time.
- GitLab Versions - A collection of performance test results against several selected release versions of GitLab.
There are known issues when running the GitLab Performance Tool. Some tests run against parts of the product that are known to be non-performant:
- Improve performance of users API under load.
- Check for other issues the Quality team has raised about other tests.
For more details on other possible problems, see the Troubleshooting section.
Cleaning Up the Test Data
This step deletes the test data that was generated.
Method 1: Run the following Docker command
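Based on the upstream GPT documentation, the data generator can clean up after itself via a `--clean-up` flag; the flag name is an assumption taken from those docs, so check the documentation for your GPT version:

```shell
# Placeholders as before; this asks the generator to remove the data it created.
docker run -it -e ACCESS_TOKEN=<TOKEN> \
  -v $(pwd)/k6/config:/config -v $(pwd)/results:/results \
  gitlab/gpt-data-generator --environment 2k.json --clean-up
```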
Method 2: Delete the top-level group `gpt` (or the unique name you set in your environment JSON) from the GitLab UI.
NOTE: There is no preference of one method over the other as both will delete the top-level group.
Reviewing results for customers
Customers often ask for their GPT results to be reviewed as part of building out a Reference Architecture.
- Check the GPT issues list for known errors or issues.
- Ask for help from support team members with GPT experience.
- Alternatively, reach out to the Quality Engineering - Enablement team, who manage GPT, in the #gitlab-performance-tool channel on Slack.
- The Reference Architecture group can also review the results as well as the environment as a whole on request. This can be done by asking the customer’s CSM to raise an issue on the Reference Architectures project with the