
Commit

Merge pull request #997 from Simnol22/fixing-tutorials
Fixing some details for the tutorials to run.
bouthilx authored Sep 23, 2022
2 parents 9f11b7d + 5660f24 commit f6ca7d8
Showing 4 changed files with 9 additions and 3 deletions.
2 changes: 1 addition & 1 deletion docs/src/tutorials/pytorch-mnist.rst
@@ -55,7 +55,7 @@ be called only once because Oríon only looks at 1 ``'objective'`` value per run

 .. code-block:: python

-        test_error_rate = test(args, model, device, test_loader)
+        test_error_rate = test(model, device, test_loader)
         report_objective(test_error_rate)
2 changes: 1 addition & 1 deletion docs/src/tutorials/scikit-learn.rst
@@ -19,7 +19,7 @@ Sample script

 .. literalinclude:: /../../examples/scikitlearn-iris/main.py
    :language: python
-   :lines: 1-2, 5-9, 13-30
+   :lines: 1-9, 13-30

 This very basic script takes one positional argument for the hyper-parameter *epsilon*,
 which controls the loss in the script.
3 changes: 3 additions & 0 deletions examples/tutorials/code_1_python_api.py
@@ -7,6 +7,9 @@
 1-d ``rosenbrock`` function with random search and TPE and visualize the regret curve to compare
 the algorithms.
+
+Note for macOS users: you will need to either run this page as a Jupyter notebook in order for it to compile, or
+encapsulate the code in a main function and run it under ``if __name__ == '__main__'``.
 We first import the only function needed, :func:`build experiment <orion.client.build_experiment>`.
 """
 from orion.client import build_experiment
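The macOS note added above exists because multiprocessing on macOS uses the ``spawn`` start method, which re-imports the script in child processes; module-level code then re-executes unless it is behind the main guard. A minimal sketch of the layout the note asks for (the ``main`` body here is a hypothetical placeholder for building and running the experiment):

```python
# On macOS, multiprocessing "spawn" re-imports this module in child processes,
# so all executable code must sit behind the __main__ guard.
def main():
    # Hypothetical body: building and running the Oríon experiment would go here.
    return "experiment finished"

if __name__ == "__main__":
    print(main())
```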
5 changes: 4 additions & 1 deletion examples/tutorials/code_2_hyperband_checkpoint.py
@@ -35,6 +35,9 @@
 checkpoints.
 We will demonstrate below how this can be done with PyTorch, but using Oríon's Python API.
+
+Note for macOS users: you will need to either run this page as a Jupyter notebook in order for it to compile, or
+encapsulate the code in a main function and run it under ``if __name__ == '__main__'``.

 Training code
 -------------
@@ -122,7 +125,7 @@ def build_data_loaders(batch_size, split_seed=1):
 # Next, we write the function to save checkpoints. It is important to include
 # not only the model in the checkpoint, but also the optimizer and the learning rate
 # schedule when using one. In this example we will use the exponential learning rate schedule,
-# so we checkpoint it. We save the current epoch as well so that we now where we resume from.
+# so we checkpoint it. We save the current epoch as well so that we know where we resume from.


 def save_checkpoint(checkpoint, model, optimizer, lr_scheduler, epoch):
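The comment fixed above lists what belongs in a resumable checkpoint: model, optimizer, learning-rate schedule, and the current epoch. A framework-agnostic sketch of that idea (plain dicts stand in for the PyTorch ``state_dict`` objects the tutorial actually saves; all names here are illustrative):

```python
import pickle
import tempfile

def save_checkpoint(path, model_state, optimizer_state, lr_scheduler_state, epoch):
    # Persist everything needed to resume training: weights, optimizer state,
    # LR-schedule state, and the epoch we stopped at.
    with open(path, "wb") as f:
        pickle.dump(
            {
                "model": model_state,
                "optimizer": optimizer_state,
                "lr_scheduler": lr_scheduler_state,
                "epoch": epoch,
            },
            f,
        )

def load_checkpoint(path):
    with open(path, "rb") as f:
        return pickle.load(f)

# Usage: save after epoch 3, then read back the resume point.
path = tempfile.mktemp(suffix=".ckpt")
save_checkpoint(path, {"w": [0.1]}, {"lr": 0.01}, {"gamma": 0.99}, epoch=3)
checkpoint = load_checkpoint(path)
```

With real PyTorch objects, each ``*_state`` argument would be the corresponding ``.state_dict()``, restored on resume via ``.load_state_dict()``.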
