
Add a verbose option to BOptimizer that prints progress information. …#103

Merged
icouckuy merged 8 commits into master from feat_print on Sep 3, 2018

Conversation

@icouckuy
Contributor

Fixes #82.

Prints:

  • the log likelihood (MLL) of all models
  • the function minimum over the points satisfying all constraints (fmin)
  • the constraint values at that minimum (if not all constraints are satisfied, this is mentioned and fmin falls back to the unconstrained function minimum)
  • the acquisition function minimum (alpha)

All of this is printed only if verbose=True is passed to the constructor of BOptimizer (defaults to False).

Some comments:

  1. For the moment I use a precision of 3 significant digits to align values reasonably well. Perfect alignment over a whole run is only possible with a (very large) fixed column width (padding).
  2. To avoid cluttering the progress information, BOptimizer now always suppresses the stdout of the underlying model optimization and optimizer. Messages about NaN gradients etc. are not very useful anyway, and if the final model contains NaNs this will show up in the printed MLL.
  3. Multiobjective optimization is not supported yet. What should be printed for fmin?
  4. I have updated some notebooks (not all of them) to use verbose=True, and will commit them once the print format is finalized. Those notebooks can serve as tests, since correctness is hard to check with unit tests.
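The stdout suppression mentioned in comment 2 can be done generically with a redirect. This is a minimal sketch of the technique, not the PR's actual implementation; `silently` and `noisy_optimize` are hypothetical names:

```python
import io
from contextlib import redirect_stdout

def silently(fn, *args, **kwargs):
    # Capture anything the callable prints (e.g. NaN-gradient warnings)
    # so it cannot clutter the verbose progress lines.
    sink = io.StringIO()
    with redirect_stdout(sink):
        result = fn(*args, **kwargs)
    return result

def noisy_optimize():
    print("Warning: NaN in gradient")  # would otherwise pollute the log
    return 42

print(silently(noisy_optimize))  # prints only: 42
```

Note this only catches Python-level prints; output written directly by native code bypasses `redirect_stdout`.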

Unconstrained:

iter #  0 - MLL [-14.2] - fmin [0.0469] - alpha [-2.92]
iter #  1 - MLL [-15.5] - fmin [0.0202] - alpha [-6.11]
iter #  2 - MLL [-16.9] - fmin [0.0202] - alpha [-6.71]

Constrained:

iter #  0 - MLL [-15.4, -16.4] - fmin [-1.17] - constraints [-0.464] - alpha [-0.486]
iter #  1 - MLL [-16.4, -16.7] - fmin [-1.31] - constraints [-0.803] - alpha [-0.194]
iter #  2 - MLL [-14.3, -18.9] - fmin [-1.31] - constraints [-0.803] - alpha [-0.112]

@codecov-io

codecov-io commented Aug 21, 2018

Codecov Report

Merging #103 into master will increase coverage by 0.09%.
The diff coverage is 100%.


@@            Coverage Diff             @@
##           master     #103      +/-   ##
==========================================
+ Coverage   99.81%   99.91%   +0.09%     
==========================================
  Files          18       18              
  Lines        1095     1118      +23     
==========================================
+ Hits         1093     1117      +24     
+ Misses          2        1       -1
Impacted Files Coverage Δ
gpflowopt/bo.py 100% <100%> (+1.05%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update ef924f5...b65fb3d.

gpflowopt/bo.py Outdated
valid_X = X[valid, :]
valid_Y = Y[valid, :]
# Split between objectives and constraints
valid_Yo = valid_Y[:, self.acquisition.objective_indices()]
@icouckuy Aug 21, 2018
Won't work if no feasible samples are found. Will fix, but I also found that EI won't work in any case if nothing is feasible (came across this before).
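A defensive guard for the no-feasible-samples case mentioned here could look like the sketch below. It mirrors the valid-mask indexing in the snippet above but is hypothetical code, not the fix that was actually committed; the fallback follows the PR description, where fmin reverts to the unconstrained minimum when nothing is feasible:

```python
import numpy as np

def best_feasible(X, Y, valid):
    # valid: boolean mask marking points that satisfy all constraints.
    valid = np.asarray(valid, dtype=bool)
    if not valid.any():
        # No feasible point: fall back to the unconstrained minimum.
        idx = int(np.argmin(Y[:, 0]))
        return X[idx], Y[idx, 0], False
    valid_X, valid_Y = X[valid, :], Y[valid, :]
    idx = int(np.argmin(valid_Y[:, 0]))
    return valid_X[idx], valid_Y[idx, 0], True

X = np.array([[0.0], [1.0], [2.0]])
Y = np.array([[3.0], [1.0], [2.0]])
print(best_feasible(X, Y, [False, False, False])[1])  # 1.0 (unconstrained fallback)
print(best_feasible(X, Y, [True, False, True])[1])    # 2.0
```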

@icouckuy icouckuy requested a review from gpfins August 22, 2018 12:12
@icouckuy icouckuy requested a review from javdrher August 29, 2018 15:06
@icouckuy
Contributor Author

icouckuy commented Sep 3, 2018

Code coverage decreased because:

  • the constraint-only approach is not tested
  • the case where there are no feasible samples is not exercised either

Looking at the modifications needed to cover those, I think we can merge this now and focus on GPflowOpt 1.0, as the tests have changed significantly anyway. We should not forget to test those cases in GPflowOpt 1.0 then.

@icouckuy icouckuy removed the request for review from gpfins September 3, 2018 19:34
@icouckuy icouckuy merged commit 4835f02 into master Sep 3, 2018
@icouckuy icouckuy deleted the feat_print branch September 3, 2018 19:46
Linked issue: Add printing flag to bayesianoptimizer.optimize