Documentation: https://codspeed.io/docs/reference/pytest-codspeed
pip install pytest-codspeed
In a nutshell, pytest-codspeed
offers two approaches to create performance benchmarks that integrate seamlessly with your existing test suite.
Use @pytest.mark.benchmark
to measure entire test functions automatically:
import pytest

@pytest.mark.benchmark
def test_sum_squares():
    data = [1, 2, 3, 4, 5]
    # The entire test function is measured.
    result = sum(i**2 for i in data)
    assert result == 55
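Since @pytest.mark.benchmark is a regular pytest marker, it composes with the rest of pytest. As an illustrative sketch (the test name and sizes below are examples, not from the documentation above), you can stack it with @pytest.mark.parametrize to benchmark the same code at several input sizes, with each parameter collected as its own benchmark:

import pytest

@pytest.mark.benchmark
@pytest.mark.parametrize("size", [10, 100, 1_000])
def test_sum_squares_scaling(size):
    data = list(range(size))
    # Check against the closed-form sum of squares for range(size).
    assert sum(i**2 for i in data) == (size - 1) * size * (2 * size - 1) // 6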
Since the marker measures the entire test function, you might want to use the benchmark
fixture instead for precise control over what code gets measured:
def test_sum_squares_fixture(benchmark):
    data = [1, 2, 3, 4, 5]
    # Only the benchmarked callable is measured, not the setup above
    result = benchmark(lambda: sum(i**2 for i in data))
    assert result == 55
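The fixture follows the pytest-benchmark calling convention, so a callable and its arguments can also be passed directly instead of being wrapped in a lambda. A minimal sketch, assuming a hypothetical helper sum_squares (not part of the plugin):

def sum_squares(data):
    return sum(i**2 for i in data)

def test_sum_squares_direct(benchmark):
    data = [1, 2, 3, 4, 5]
    # Arguments after the callable are forwarded to it when it runs.
    result = benchmark(sum_squares, data)
    assert result == 55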
Check out the full documentation for more details.
If you want to run the benchmark tests locally, you can use the --codspeed
pytest flag:
$ pytest tests/ --codspeed
============================= test session starts ====================
platform darwin -- Python 3.13.0, pytest-7.4.4, pluggy-1.5.0
codspeed: 3.0.0 (enabled, mode: walltime, timer_resolution: 41.7ns)
rootdir: /home/user/codspeed-test, configfile: pytest.ini
plugins: codspeed-3.0.0
collected 1 item
tests/test_sum_squares.py . [ 100%]
Benchmark Results
┏━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━┓
┃ Benchmark        ┃ Time (best) ┃ Rel. StdDev ┃ Run time ┃ Iters  ┃
┣━━━━━━━━━━━━━━━━━━╋━━━━━━━━━━━━━╋━━━━━━━━━━━━━╋━━━━━━━━━━╋━━━━━━━━┫
┃ test_sum_squares ┃     1,873ns ┃        4.8% ┃    3.00s ┃ 66,930 ┃
┗━━━━━━━━━━━━━━━━━━┻━━━━━━━━━━━━━┻━━━━━━━━━━━━━┻━━━━━━━━━━┻━━━━━━━━┛
=============================== 1 benchmarked ========================
=============================== 1 passed in 4.12s ====================
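Since --codspeed is an ordinary pytest flag, the usual test selection options still apply. For example, plain pytest -k filtering (standard pytest behavior, not specific to pytest-codspeed) lets you benchmark a single test:

$ pytest tests/ --codspeed -k test_sum_squares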
You can use the CodSpeedHQ/action to run the benchmarks in GitHub Actions and upload the results to CodSpeed.
Here is an example of a GitHub Actions workflow that runs the benchmarks and reports the results to CodSpeed on every push to the main
branch and on every pull request:
name: CodSpeed

on:
  push:
    branches:
      - "main" # or "master"
  pull_request:
  # `workflow_dispatch` allows CodSpeed to trigger backtest
  # performance analysis in order to generate initial data.
  workflow_dispatch:

jobs:
  benchmarks:
    name: Run benchmarks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.13"
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          run: pytest tests/ --codspeed
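Note that this workflow assumes requirements.txt installs pytest and pytest-codspeed along with your project's dependencies, and that CODSPEED_TOKEN is stored as a repository secret holding the upload token from your CodSpeed project settings.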