
refactor: enhance error reporting in pytest output for better debugging #3943


Merged
germa89 merged 7 commits into main from feat/enhance-error-reporting-in-pytest-output on May 22, 2025

Conversation

@germa89 (Collaborator) commented May 22, 2025

Description

As described in the title: enhance the error reporting in the pytest output to make debugging test failures easier.

Issue linked

Spun out of #1300.

Checklist

Summary by Sourcery

Enhance Pytest output by parsing report details to provide concise, consistent summaries of test failures, errors, and skips

Enhancements:

  • Introduce a get_error_message helper to extract a concise file path, error type, and cause from pytest reports (a rough sketch of such a helper follows this summary)

Tests:

  • Update Pytest short_test_summary hook to use formatted messages for failed, error, and skipped tests
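
The merged helper lives in tests/conftest.py; the following is only a non-authoritative sketch of the idea summarized above, and its parsing details, signature, and formatting are assumptions rather than the actual code from this PR:

```python
# Hypothetical sketch of a get_error_message-style helper; the merged
# implementation in tests/conftest.py may differ in names and details.
def get_error_message(rep):
    """Build a short 'path - type: cause' summary from a pytest report."""
    # Skipped tests typically carry a (path, lineno, reason) tuple as longrepr.
    if rep.skipped and isinstance(rep.longrepr, tuple):
        path, lineno, reason = rep.longrepr
        return f"{path}:{lineno} - {reason}"

    lines = (rep.longreprtext or "").splitlines()
    if not lines:
        return rep.nodeid

    # For failures/errors the last line usually reads "E   ExcType: cause".
    last = lines[-1].rstrip()
    if last.startswith("E "):  # pytest prefixes error lines with "E "
        last = last[2:].lstrip()
    exc_type, _, cause = last.partition(":")
    return f"{rep.nodeid} - {exc_type.strip()}: {cause.strip()}"
```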

@germa89 germa89 requested a review from a team as a code owner May 22, 2025 12:32
@ansys-reviewer-bot (Contributor) commented:

Thanks for opening a Pull Request. If you want to perform a review, write a comment saying:

@ansys-reviewer-bot review

sourcery-ai bot (Contributor) commented May 22, 2025

Reviewer's Guide

Centralize error-message parsing in a get_error_message helper and apply it across the pytest summary output, while extending the reporting to skipped tests for improved debugging.

File-Level Changes

Change: Introduce a get_error_message helper for consistent error parsing (tests/conftest.py)
  • Added get_error_message, which extracts the file path, error type, and cause from rep.longreprtext or from a tuple longrepr
  • Handles multi-line and single-line representations with conditional parsing
  • Trims prefixes and formats the output as "path - type: cause"

Change: Refactor the summary hooks to use the helper and include skipped tests (tests/conftest.py)
  • Replaced the direct longreprtext slicing for failed and error reports with get_error_message
  • Added a loop that reports skipped tests through get_error_message
  • Updated the write_line calls so the formatting is consistent across all outcomes
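
The PR only touches tests/conftest.py. As a rough illustration of how a summary hook could route each outcome through a get_error_message-style helper (the hook wiring, names, and formatting here are assumptions, not the merged code):

```python
# Sketch only: route each outcome through a get_error_message-style helper
# (see the earlier sketch). The actual hook in tests/conftest.py may differ.
def pytest_terminal_summary(terminalreporter, exitstatus, config):
    for outcome in ("failed", "error", "skipped"):
        reports = terminalreporter.stats.get(outcome, [])
        if not reports:
            continue
        terminalreporter.write_line(f"--- {outcome.upper()}: {len(reports)} test(s) ---")
        for rep in reports:
            terminalreporter.write_line(get_error_message(rep))
```

terminalreporter.stats maps outcome names ("failed", "error", "skipped", ...) to lists of test reports, which is what makes a single loop over all outcomes possible.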


@germa89 germa89 self-assigned this May 22, 2025
@germa89 germa89 enabled auto-merge (squash) May 22, 2025 12:32
@germa89 (Collaborator, Author) commented May 22, 2025

@pyansys-ci-bot LGTM.

@pyansys-ci-bot (Contributor) left a comment:

✅ Approving this PR because germa89 said so in here 😬

LGTM

@github-actions bot added the "enhancement" label (Improve any current implemented feature) on May 22, 2025
sourcery-ai bot (Contributor) left a comment:

Hey @germa89 - I've reviewed your changes - here's some feedback:

  • Rather than parsing rep.longreprtext.splitlines(), consider using rep.longrepr.reprcrash.path and rep.longrepr.reprcrash.message for a more robust extraction of the filename, line number, and error message (a rough sketch of this approach follows this list).
  • The inline get_error_message function could be moved out of the hook and into a shared utility to make it easier to test and reuse across other pytest hooks.
  • Be cautious with ast.literal_eval on rep.longreprtext lines, as complex or parameterized failures might not match the expected tuple format—adding a fallback for unparsable cases would avoid exceptions in the summary.
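
As a rough sketch of the reprcrash-based approach suggested in the first bullet, combined with the fallback the third bullet asks for (illustrative only, not code from this PR):

```python
# Rough illustration of the reviewer's suggestion: prefer longrepr.reprcrash
# when present and fall back gracefully, so unparsable or skipped reports
# never raise inside the summary hook.
def get_error_message(rep):
    longrepr = getattr(rep, "longrepr", None)
    crash = getattr(longrepr, "reprcrash", None)
    if crash is not None:
        # reprcrash carries the crash location and a one-line message.
        first_line = crash.message.splitlines()[0] if crash.message else ""
        return f"{crash.path}:{crash.lineno} - {first_line}"
    if isinstance(longrepr, tuple):  # skipped tests: (path, lineno, reason)
        path, lineno, reason = longrepr
        return f"{path}:{lineno} - {reason}"
    return getattr(rep, "longreprtext", "") or rep.nodeid
```

reprcrash is populated for failures and errors but not for skips, which is why the tuple and raw-text fallbacks remain useful.
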
Here's what I looked at during the review
  • 🟡 General issues: 1 issue found
  • 🟢 Security: all looks good
  • 🟢 Testing: all looks good
  • 🟢 Documentation: all looks good

codecov bot commented May 22, 2025

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 88.34%. Comparing base (a8de41a) to head (ccf4930).
Report is 2 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3943      +/-   ##
==========================================
+ Coverage   88.27%   88.34%   +0.06%     
==========================================
  Files         187      187              
  Lines       14840    14906      +66     
==========================================
+ Hits        13100    13168      +68     
+ Misses       1740     1738       -2     

@germa89 germa89 merged commit 426ed0b into main May 22, 2025
45 checks passed
@germa89 germa89 deleted the feat/enhance-error-reporting-in-pytest-output branch May 22, 2025 16:23