
Unified Interface: Absolute Tolerance#1660

Open
PaulJonasJost wants to merge 11 commits into develop from unified_interface_tolerance

Conversation

@PaulJonasJost
Collaborator

Adds a tolerance setting to the unified interface. Some optimizers are not supported, in particular Dlib with its epsilon parameter, since epsilon is not strictly a tolerance!

@codecov-commenter

codecov-commenter commented Dec 15, 2025


Codecov Report

❌ Patch coverage is 72.91667% with 13 lines in your changes missing coverage. Please review.
✅ Project coverage is 83.12%. Comparing base (aae71ef) to head (be9fd7a).
⚠️ Report is 14 commits behind head on develop.

Files with missing lines Patch % Lines
pypesto/optimize/optimizer.py 72.91% 13 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop    #1660      +/-   ##
===========================================
- Coverage    84.31%   83.12%   -1.19%     
===========================================
  Files          164      164              
  Lines        14328    14393      +65     
===========================================
- Hits         12080    11964     -116     
- Misses        2248     2429     +181     


@PaulJonasJost
Collaborator Author

closes #1646

@PaulJonasJost PaulJonasJost self-assigned this Dec 15, 2025
@PaulJonasJost PaulJonasJost marked this pull request as ready for review December 16, 2025 14:44
Member

@dweindl dweindl left a comment


Thanks, @PaulJonasJost. Good feature to add, but I think this requires a few changes:

It's currently unclear

  • whether it's an absolute or relative tolerance. (included in the docstring, but not in the method name. we might want to support both in the future, so the method name should be unambiguous.)
  • what the tolerance applies to. many gradient-based optimizers will support tolerances on the objective as well as its gradient, so it needs to become clear which one is specified here.

tol
Absolute tolerance for termination.
"""
self._set_option_tol(tol, "tol")
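For illustration, a minimal sketch of how the disambiguated setter suggested in the review could look. The names `set_f_abs_tol` / `supports_f_abs_tol` and the `f_abs_tol` option key are taken from this discussion, not from a released pyPESTO API, and the class below is a hypothetical stand-in rather than the actual `Optimizer` base class.

```python
class OptimizerSketch:
    """Hypothetical stand-in for an optimizer holding an options dict."""

    def __init__(self):
        self.options = {}

    def supports_f_abs_tol(self) -> bool:
        # Each concrete optimizer would override this capability check.
        return True

    def set_f_abs_tol(self, tol: float) -> None:
        """Set the absolute tolerance on the objective function value.

        The name spells out both the quantity it applies to (f, the
        objective) and the kind of tolerance (absolute), as requested
        in the review.
        """
        if not self.supports_f_abs_tol():
            raise NotImplementedError(
                "Check supports_f_abs_tol() before calling set_f_abs_tol()."
            )
        self.options["f_abs_tol"] = float(tol)


opt = OptimizerSketch()
opt.set_f_abs_tol(1e-8)
```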
Member


From what I remember, ipopt has quite complex termination criteria. While various tolerances are supported, I think just hitting this single value is insufficient for termination, so it might be a bit confusing. Not completely sure whether it should be added here or not.
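To illustrate the point: IPOPT's convergence test combines several tolerances, so a single `tol` value does not by itself determine termination. The option names below are standard IPOPT options (see the IPOPT options reference); the dict is only a sketch of how they relate, not pyPESTO code.

```python
# Sketch: IPOPT terminates when the overall scaled NLP error drops
# below `tol`, but it can also stop at the looser "acceptable" level
# if that level is held for `acceptable_iter` consecutive iterations.
ipopt_options = {
    "tol": 1e-8,             # desired overall (scaled) NLP error
    "acceptable_tol": 1e-6,  # looser fallback level ...
    "acceptable_iter": 15,   # ... accepted after this many iterations
    "max_iter": 3000,        # hard iteration cap, independent of tol
}
```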

Collaborator Author


True, the IPOPT documentation says as much in a fairly lengthy passage. I would in general be fine with removing it, but I am not sure how to handle supports_rel_tol later. I think we can say it does not directly support f_abs_tol, but this one is a bit harder. Removed it for now!

@PaulJonasJost PaulJonasJost requested a review from dweindl March 5, 2026 10:42
f"Check supports_f_abs_tol() before calling set_f_abs_tol()."
)

def _set_option_tol(self, tol: float, option_key: str) -> None:
Member


class Optimizer does not have an options attribute, we should not try accessing it here. I'd leave validation to the optimizer and just set the options directly in each optimizer without this extra method.
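A sketch of the suggested alternative: each optimizer subclass writes its backend-specific option key directly, with no shared helper on a base class that lacks an `options` attribute, and validation is left to the underlying library. Class names are illustrative; the `fatol` key matches scipy's Nelder-Mead convention, while other backends use different keys.

```python
class ScipyLikeOptimizer:
    """Hypothetical optimizer wrapping a scipy-style backend."""

    def __init__(self):
        self.options = {}

    def set_f_abs_tol(self, tol: float) -> None:
        # scipy's Nelder-Mead names this option "fatol";
        # the exact key depends on the chosen method.
        self.options["fatol"] = float(tol)


class OtherBackendOptimizer:
    """Hypothetical optimizer whose backend uses a different key."""

    def __init__(self):
        self.options = {}

    def set_f_abs_tol(self, tol: float) -> None:
        # Each subclass maps the unified name to its own option key
        # directly, instead of going through a shared _set_option_tol.
        self.options["f_tol_abs"] = float(tol)
```

The unified name stays the same across subclasses; only the mapping to the backend option differs.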

Comment on lines -825 to -827
# We explicitly cast to float, as the IpoptOptimizer requires
# the provision of a float for the max_wall_time option.
self.options["max_wall_time"] = float(seconds)
Member


Why removed?

Collaborator Author


That should not have happened. I think something got mixed up when I merged develop...
