Description
Hey folks,
I guess this is the wrong place to post this, but at this point I feel a bit helpless and doubt I'd get any substantial insights on StackOverflow. So what's the story? I started working on a Kaggle project with a friend in a fresh repo. We use `mypy` in our CI build pipeline. At some point GitHub Actions started reporting really long execution times on the `mypy` step, while running `mypy` locally takes barely seconds, as it's still a very small repo. Let me show you one of the runs:
`mypy` takes almost 2 minutes for just a few files in the repo. Setting up Python 3.10 also takes considerable time, but that's mostly restoring the `poetry` cache with some heavy libraries, which is fine. At first I thought we might have a problem with the `--install-types` flag, but that's not the case. I went through the CI history and found that `mypy` execution times went up once we added `pytorch` to the project's dependencies. I guess `mypy` is running some type resolution/discovery for the installed dependencies. Is there any way to cache that in a remote CI setting to reduce the execution times?
repo: https://github.com/piotr-rarus/kaggle-rsna-breast-cancer
build workflow: https://github.com/piotr-rarus/kaggle-rsna-breast-cancer/blob/main/.github/workflows/build.yml
- name: Mypy
  run: |
    poetry run mypy --incremental --show-error-codes --pretty src
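One possible fix, sketched here under assumptions (the step name and cache key scheme are illustrative, not from the repo's actual workflow): GitHub Actions' `actions/cache` can persist mypy's `.mypy_cache` directory between runs, keyed on `poetry.lock` so the cache is rebuilt whenever dependencies change. The step would go before the existing Mypy step:

```yaml
# Sketch: restore mypy's incremental cache before the Mypy step.
- name: Cache mypy
  uses: actions/cache@v3
  with:
    path: .mypy_cache
    # Key on the lockfile hash so a dependency bump invalidates the cache.
    key: mypy-${{ runner.os }}-${{ hashFiles('poetry.lock') }}
    restore-keys: |
      mypy-${{ runner.os }}-
```

With incremental mode (which is on by default), mypy stores its analysis of third-party packages such as `torch` in `.mypy_cache`, so a restored cache should let subsequent runs skip most of that work.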
All relevant environment info is stated in the repo's `pyproject.toml`. Dependencies are managed using `poetry` and pinned down in `poetry.lock`: https://github.com/piotr-rarus/kaggle-rsna-breast-cancer/blob/main/pyproject.toml
By no means do I expect a fix; this isn't a bug. Some substantial insights into proper CI `mypy` usage would be most welcome :)
Take care, you're my favourite Python library.