polish & clean code #566
Conversation
Walkthrough

The changes refactor random state management across acquisition functions in the Bayesian optimization package. Random state is now passed explicitly to suggestion and optimization methods rather than stored internally. Constructors and tests are updated accordingly. Target space constraint handling is modified to always wrap constraints in a

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant BayesianOptimizer
    participant AcquisitionFunction
    participant TargetSpace
    User->>BayesianOptimizer: suggest()
    BayesianOptimizer->>AcquisitionFunction: suggest(..., random_state)
    AcquisitionFunction->>TargetSpace: optimization routines (with random_state)
    TargetSpace-->>AcquisitionFunction: candidate points
    AcquisitionFunction-->>BayesianOptimizer: suggested point
    BayesianOptimizer-->>User: result
```
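The flow in the diagram can be condensed into a few lines of Python. This is a hedged sketch, not the library's actual classes: the class names mirror the diagram, but the bodies (a toy acquisition score, a 2-D uniform sample) are purely illustrative. The point is that the optimizer owns the single RNG and threads it through `suggest()` explicitly.

```python
import numpy as np

class TargetSpaceSketch:
    """Stand-in for TargetSpace: samples candidates from a caller-supplied RNG."""
    def random_sample(self, n, random_state):
        return random_state.uniform(-1.0, 1.0, size=(n, 2))

class AcquisitionSketch:
    """Stand-in for an acquisition function: no RNG is stored on the instance."""
    def suggest(self, space, random_state):
        candidates = space.random_sample(10, random_state)  # TargetSpace step
        scores = -np.linalg.norm(candidates, axis=1)        # toy acquisition value
        return candidates[int(np.argmax(scores))]

class BayesianOptimizerSketch:
    """Stand-in for BayesianOptimization: the sole owner of the random state."""
    def __init__(self, seed=0):
        self._random_state = np.random.RandomState(seed)
        self._acq = AcquisitionSketch()
        self._space = TargetSpaceSketch()

    def suggest(self):
        # Forward the optimizer's RNG instead of storing one per acquisition.
        return self._acq.suggest(self._space, self._random_state)

point = BayesianOptimizerSketch(seed=1).suggest()
```

Because the RNG lives in one place, two optimizers built with the same seed produce identical suggestions.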
Actionable comments posted: 1
🧹 Nitpick comments (3)
bayes_opt/acquisition.py (3)
**155-156**: Passing an `int` seed here will silently recreate a fresh RNG every call

`ensure_rng(random_state)` converts an `int` into a new `RandomState` each time, meaning two consecutive `suggest()` calls with the same integer seed will always sample the exact same "random" candidates. If that is unintended, cache the converted `RandomState` once per `suggest()` invocation:

```diff
- random_state = ensure_rng(random_state)
+ random_state = ensure_rng(random_state)  # convert once
+ # Keep a reference so children & helpers re-use the same generator
+ rng = random_state
```
**230-233**: Docstring now mentions differential-evolution but examples still talk about "warm-up points"

Lines 231-233 updated the wording, but the rest of the paragraph still references the old `n_warmup` constant. Consider updating the whole block for consistency and to avoid confusion.
**276-280**: Return-type annotation out of sync with actual return value

`_random_sample_minimize` returns three values `(x_min, min_acq, x_seeds)` but the type hint says `tuple[NDArray | None, float]`. Update the annotation to reflect the extra element.
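A hypothetical sketch of what the corrected signature could look like (the helper name comes from the review; the body is a toy stand-in, not the library's implementation). The key point is that a function returning three values needs a three-element tuple annotation:

```python
import numpy as np
from numpy.typing import NDArray

def _random_sample_minimize(
    acq, bounds, n_random, rng
) -> "tuple[NDArray | None, float, NDArray]":
    # Draw random candidates inside the (toy, 2-D) bounds and rank them
    # by acquisition value, ascending.
    lo, hi = bounds
    x_tries = rng.uniform(lo, hi, size=(n_random, 2))
    ys = np.array([acq(x) for x in x_tries])
    order = np.argsort(ys)
    x_min, min_acq = x_tries[order[0]], float(ys[order[0]])
    # The best random points double as seeds for the smart optimizer.
    x_seeds = x_tries[order[: min(5, n_random)]]
    return x_min, min_acq, x_seeds

x_min, min_acq, x_seeds = _random_sample_minimize(
    lambda x: float((x ** 2).sum()), (-1.0, 1.0), 50, np.random.RandomState(0)
)
```

A mismatched annotation here would mislead both readers and static checkers about the seed array.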
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- bayes_opt/acquisition.py (27 hunks)
- bayes_opt/bayesian_optimization.py (2 hunks)
- tests/test_acquisition.py (12 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (9)
- GitHub Check: Python 3.13 - numpy >=2
- GitHub Check: Python 3.11 - numpy >=2
- GitHub Check: Python 3.12 - numpy >=1.25,<2
- GitHub Check: Python 3.11 - numpy >=1.25,<2
- GitHub Check: Python 3.10 - numpy >=2
- GitHub Check: Python 3.12 - numpy >=2
- GitHub Check: Python 3.10 - numpy >=1.25,<2
- GitHub Check: Python 3.9 - numpy >=1.25,<2
- GitHub Check: Python 3.9 - numpy >=2
🔇 Additional comments (32)
bayes_opt/bayesian_optimization.py (1)

**252-254**: Random state propagation looks good

`BayesianOptimization.suggest()` now forwards its internal RNG to the acquisition function, so determinism is preserved even after the refactor. No issues spotted here.

tests/test_acquisition.py (31)

**45-46**: New random_state fixture provides consistent test behavior

Adding a fixed RandomState fixture with seed 0 is a good practice that ensures tests are deterministic across different runs.
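The fixture described above might look roughly like this (the fixture name `random_state` and seed 0 come from the review; the test body is illustrative, not copied from the suite):

```python
import numpy as np
import pytest

@pytest.fixture
def random_state():
    # Fixed seed: every test receives an RNG with an identical stream.
    return np.random.RandomState(0)

def test_deterministic_draws(random_state):
    # Seeding with the same value reproduces the exact same draws, so
    # assertions on sampled candidates are stable across runs.
    expected = np.random.RandomState(0).uniform(size=2)
    assert np.allclose(random_state.uniform(size=2), expected)
```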
**73-74**: Constructor simplified to align with new random state management

The MockAcquisition constructor no longer accepts or stores a random_state parameter, which aligns with the PR's objective of making random_state a method parameter rather than a class property.

**99-99**: Parameter renamed from n_l_bfgs_b to n_smart

This rename better reflects the purpose of the parameter, especially now that Differential Evolution optimization has been introduced alongside L-BFGS-B.

**102-106**: Random state now passed explicitly to suggest method

The test has been updated to pass the random_state explicitly to the suggest method, consistent with the PR's objective of making random_state a method parameter rather than a class property.

**109-114**: New test validates acquisition function maximization

This addition helps verify the fix for the bug mentioned in the PR objectives - ensuring that the best random sample is correctly included in the seeds for smart optimization.

**120-120**: Parameter renamed from n_l_bfgs_b to n_smart

Consistent with other changes, the parameter name has been updated to better reflect its purpose in the optimization process.

**125-125**: UpperConfidenceBound constructor no longer accepts random_state

The constructor has been simplified to remove the random_state parameter, aligned with the PR's goal of centralizing random state management.

**135-137**: Random state now passed explicitly to suggest method

Acquisition functions now require random_state as a parameter to the suggest method, making the stochastic behavior more explicit and controlled.

**142-142**: UpperConfidenceBound constructor simplified

Constructor no longer accepts random_state, consistent with the new approach to random state management.

**150-152**: Random state passed explicitly to _smart_minimize method

The internal optimization method now receives random_state as a parameter, ensuring deterministic behavior during testing.

**157-157**: UpperConfidenceBound constructor simplified for constraint test

Constructor no longer accepts random_state, consistent with the new approach to random state management.

**165-165**: ProbabilityOfImprovement constructor simplified

The constructor no longer accepts random_state, aligned with the refactored random state management.

**171-173**: Random state now passed explicitly to suggest method

The suggest method now receives random_state as a parameter, ensuring deterministic behavior in tests.

**177-181**: Consistent pattern for passing random_state to methods

The constructor no longer accepts random_state, and it's now passed explicitly to the suggest method, maintaining the consistent pattern throughout the codebase.

**186-192**: ProbabilityOfImprovement with constraints follows new pattern

The constructor no longer accepts random_state, and the suggest method now requires it as a parameter, consistent with other acquisition functions.

**196-199**: Consistent pattern for passing random_state to methods

Random state is passed explicitly to suggest method calls, ensuring deterministic behavior in constraint-related tests.

**203-220**: ExpectedImprovement follows new random state pattern

All instances of ExpectedImprovement initialization and suggest method calls have been updated to follow the new pattern: no random_state in constructor, explicit random_state in method calls.

**224-237**: ExpectedImprovement with constraints follows new pattern

The constructor no longer accepts random_state, and the suggest method now requires it as a parameter, consistent with other acquisition functions.

**242-244**: ConstantLiar constructor simplified

The base acquisition and ConstantLiar constructors no longer accept random_state, aligned with the refactored random state management.

**252-253**: Random state passed explicitly to ConstantLiar suggest method

The suggest method now receives random_state as a parameter, ensuring deterministic behavior in tests.

**266-266**: Consistent pattern for passing random_state to methods

Random state is passed explicitly to the suggest method call, maintaining consistency throughout the test suite.

**277-281**: ConstantLiar with constraints follows new pattern

The constructor no longer accepts random_state, and the suggest method now requires it as a parameter, consistent with other acquisition functions.

**285-285**: Consistent pattern for passing random_state to methods

Random state is passed explicitly to suggest method calls, ensuring deterministic behavior in constraint-related tests.

**293-293**: GPHedge constructor simplified

The constructor no longer requires base acquisitions to have random_state parameters, aligned with the refactored random state management.

**311-315**: Base acquisitions for GPHedge follow new pattern

All base acquisition constructors have been simplified to remove the random_state parameter, consistent with the refactored approach.

**335-339**: Base acquisitions for softmax sampling follow new pattern

All base acquisition constructors have been simplified to remove the random_state parameter, consistent with the refactored approach.

**355-355**: Random state passed explicitly to _sample_idx_from_softmax_gains method

The internal method now receives random_state as a parameter, ensuring deterministic behavior during testing.

**359-366**: GPHedge integration test follows new pattern

Base acquisitions no longer receive random_state in constructors, and the suggest method now requires it as a parameter, consistent with the refactored approach.

**370-371**: Random state passed explicitly to suggest method in loop

The suggest method consistently receives random_state as a parameter, ensuring deterministic behavior across multiple iterations.

**597-598**: Custom acquisition constructors simplified

The constructor for custom acquisition functions has been simplified to no longer accept random_state, aligned with the refactored random state management.

**618-619**: Custom acquisition without set params follows new pattern

The constructor has been simplified to remove the random_state parameter, consistent with the refactored approach throughout the codebase.
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@ Coverage Diff @@
##           master     #566      +/-   ##
==========================================
- Coverage   97.95%   97.76%   -0.20%
==========================================
  Files          10       10
  Lines        1175     1164      -11
==========================================
- Hits         1151     1138      -13
- Misses         24       26       +2
```

☔ View full report in Codecov by Sentry.
@CodeRabbit review
✅ Actions performed

Review triggered.
Actionable comments posted: 4
🔭 Outside diff range comments (1)
bayes_opt/acquisition.py (1)
**230-272**: ⚠️ Potential issue: `_acq_min` can return `None` when both optimisers fail

If `n_random > 0` succeeds but every smart optimisation run fails (`min_acq_s == np.inf`) and `min_acq_r == np.inf` (e.g. because `n_random == 0`), the function falls through and returns `x_min_r`, which is `None`. Down-stream callers expect a NumPy array and will crash.

```diff
  if n_smart:
      ...
      return x_min_s
- return x_min_r
+ # Fallbacks
+ if x_min_r is None:
+     return x_min_s
+ if x_min_s is None:
+     return x_min_r
+ return x_min_r
```
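A hypothetical condensed version of that fallback, extracted into a plain function for clarity (the variable names mirror the review; the real `_acq_min` has more moving parts). The invariant is: never return `None` when at least one of the two optimisers produced a point.

```python
import numpy as np

def pick_minimum(x_min_r, min_acq_r, x_min_s, min_acq_s):
    if x_min_r is None:   # random sampling produced nothing
        return x_min_s
    if x_min_s is None:   # smart optimisation produced nothing
        return x_min_r
    # Both succeeded: keep whichever achieved the lower acquisition value.
    return x_min_s if min_acq_s < min_acq_r else x_min_r

# Only the smart optimiser succeeded:
only_smart = pick_minimum(None, np.inf, np.array([1.0]), 0.5)
# Only random sampling succeeded:
only_random = pick_minimum(np.array([2.0]), 0.1, None, np.inf)
```

Callers that expect an array then only need to handle the both-failed case once, at the top.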
🧹 Nitpick comments (4)
tests/test_target_space.py (2)
**101-105**: Lambda signature does not match TargetSpace parameter names

`ConstraintModel` expects the constraint function to accept named parameters identical to the optimization variables (`p1`, `p2`). Using `lambda x: x` will raise a `TypeError` if `constraint.eval(**kwargs)` is ever invoked (e.g. when calling `probe()`).

```diff
-constraint = ConstraintModel(lambda x: x, -2, 2, transform=None)
+constraint = ConstraintModel(lambda p1, p2: p1 - p2, -2, 2, transform=None)
```

Even though the current test registers explicit `constraint_value`s and therefore never calls `eval`, using a correctly-typed lambda future-proofs the test and documents intent.
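The failure mode is easy to demonstrate in isolation (parameter names `p1`/`p2` come from the test above; the snippet mimics how `constraint.eval(**kwargs)` invokes the callable):

```python
# The constraint callable must accept the optimization variables as
# keyword arguments.
def constraint_func(p1, p2):
    return p1 - p2

params = {"p1": 1.5, "p2": 0.5}
value = constraint_func(**params)  # works: names match

# A single-argument lambda fails as soon as it receives these kwargs:
try:
    (lambda x: x)(**params)
except TypeError:
    failed = True  # expected: unexpected keyword arguments 'p1', 'p2'
```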
**199-206**: Redundant `transform=None` argument

`ConstraintModel`'s `transform` parameter already defaults to `None`. Unless the test explicitly verifies that the explicit `None` is propagated, the extra argument is superfluous and makes the call longer than necessary.

```diff
-constraint = ConstraintModel(lambda p1, p2: p1 - p2, -2, 2, transform=None)
+constraint = ConstraintModel(lambda p1, p2: p1 - p2, -2, 2)
```

bayes_opt/acquisition.py (2)
**66-73**: Deprecation warning is good – update docstring too

`random_state` is now deprecated at construction time but the class-level docstring still advertises it as an active parameter. Updating the docstring will avoid confusing library users.
**1187-1203**: Type hint mismatch in `_sample_idx_from_softmax_gains`

The function signature now specifies `RandomState`, but `ensure_rng` may supply a `Generator` once the earlier SciPy fix (see above) is applied. Consider typing against `np.random.RandomState | np.random.Generator` or simply `Any` to avoid mypy/IDE warnings.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- bayes_opt/acquisition.py (29 hunks)
- bayes_opt/bayesian_optimization.py (2 hunks)
- bayes_opt/target_space.py (1 hunks)
- tests/test_acquisition.py (12 hunks)
- tests/test_target_space.py (4 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
- bayes_opt/bayesian_optimization.py
- tests/test_acquisition.py
🧰 Additional context used
🧬 Code Graph Analysis (2)
tests/test_target_space.py (3)
- bayes_opt/target_space.py (3): `constraint` (203-210), `TargetSpace` (34-713), `set_constraint` (102-117)
- bayes_opt/constraint.py (1): `ConstraintModel` (23-263)
- tests/test_bayesian_optimization.py (1): `target_func` (19-21)

bayes_opt/acquisition.py (5)
- tests/test_acquisition.py (9): `random_state` (45-46), `target_space` (55-56), `fun` (144-148), `gp` (50-51), `pbounds` (30-31), `constraint` (35-36), `base_acq` (82-83), `base_acq` (600-601), `base_acq` (621-622)
- bayes_opt/util.py (1): `ensure_rng` (8-30)
- bayes_opt/target_space.py (8): `max` (623-651), `TargetSpace` (34-713), `random_sample` (564-602), `bounds` (193-200), `continuous_dimensions` (223-234), `res` (653-684), `set_constraint` (102-117), `constraint` (203-210)
- tests/test_bayesian_optimization.py (1): `random_sample` (500-511)
- bayes_opt/parameter.py (4): `random_sample` (68-87), `random_sample` (262-281), `random_sample` (359-381), `bounds` (59-61)
⏰ Context from checks skipped due to timeout of 90000ms (9)
- GitHub Check: Python 3.12 - numpy >=2
- GitHub Check: Python 3.10 - numpy >=1.25,<2
- GitHub Check: Python 3.13 - numpy >=2
- GitHub Check: Python 3.12 - numpy >=1.25,<2
- GitHub Check: Python 3.11 - numpy >=1.25,<2
- GitHub Check: Python 3.11 - numpy >=2
- GitHub Check: Python 3.10 - numpy >=2
- GitHub Check: Python 3.9 - numpy >=2
- GitHub Check: Python 3.9 - numpy >=1.25,<2
@fmfn since you asked me to tag you, here you can have a look. What's nice about this feature is that it does catch some things that are hard for humans to catch (e.g. this commit was actually a result of me testing the feature on a PR of my fork and getting this problem pointed out to me). OTOH, sometimes it misreports things, e.g. this comment is wrong, since the

In the end, it's a good tool to point out potential problems but one should probably not mindlessly "fix" them.
Docstrings generation was requested by @till-m.

* #566 (comment)

The following files were modified:

* `bayes_opt/acquisition.py`
* `bayes_opt/bayesian_optimization.py`
* `bayes_opt/target_space.py`
* `tests/test_acquisition.py`
* `tests/test_constraint.py`
* `tests/test_target_space.py`
Note: Generated docstrings for this pull request at #568
This PR contains various improvements:

* make `random_state` not a property of the acquisition function, but something to be provided during `.suggest`
* rename `n_lbfgs_b` to `n_smart` since we now have the DE optimization
* constraint handling in `TargetSpace.__init__` instead of `BayesianOptimizer.__init__`
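The first bullet can be sketched in miniature. This is a hedged illustration, not the library's actual API: the class name echoes `UpperConfidenceBound`, but the `suggest` signature and body here are invented for the example. The RNG is no longer stored on the acquisition instance; it arrives as an argument to `suggest()`.

```python
import numpy as np

class UpperConfidenceBoundSketch:
    def __init__(self, kappa=2.576):
        self.kappa = kappa  # note: no random_state kept on the instance

    def suggest(self, mean, std, candidates, random_state):
        # UCB score: exploitation (mean) plus exploration bonus (kappa * std).
        scores = mean + self.kappa * std
        # Tiny jitter from the caller-supplied RNG breaks exact ties.
        scores = scores + random_state.normal(scale=1e-12, size=len(scores))
        return candidates[int(np.argmax(scores))]

rng = np.random.RandomState(0)
acq = UpperConfidenceBoundSketch()
cands = np.array([[0.0], [1.0], [2.0]])
best = acq.suggest(np.array([0.1, 0.9, 0.3]), np.zeros(3), cands, rng)
```

With this shape, reproducibility is the caller's responsibility: seed one RNG in the optimizer and pass it to every `suggest()` call.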
Summary by CodeRabbit

* New Features
* Bug Fixes
* Refactor (replaced `n_l_bfgs_b` with `n_smart`)
* Tests