Bugfix cmd-line overrides and add instance method (#50)
* Fixed a silent failure with command line overrides caused by calling parse_known_args. Switched command line overrides to the class level only to prevent verbosity and clashing. Updated tests to reflect the new command line override syntax. Fixed missing help for iterables of spock class types.

* Added an isinstance analog for checking if an object is a spock-decorated class. Updated docs for Tuple length restrictions.

* WIP. Now correctly catching a List of spock classes (for repeated classes) in the command line override. Need to figure out how to map to the correct portion of the payload. Fixed Enum overrides not getting caught correctly.

* WIP: Override of a list of spock classes working in the debug script but failing the test.

* finally figured out the right logic for overrides. added tests to catch this in the future.

* finally figured out the right logic for overrides. added tests to catch this in the future.

* updated docstrings

* updated docs

* added missing info from the docstring
ncilfone authored Apr 5, 2021
1 parent 3efd1a4 commit 2fa32fc
Showing 21 changed files with 510 additions and 192 deletions.
50 changes: 44 additions & 6 deletions docs/advanced_features/Command-Line-Overrides.md
@@ -32,7 +32,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
optimizer: Optimizer
cache_path: Optional[str]
@@ -70,14 +70,52 @@ But with command line overrides we can also pass parameter arguments to override
file:

```bash
- $ python tutorial.py --config tutorial.yaml --cache_path /tmp/trash
+ $ python tutorial.py --config tutorial.yaml --DataConfig.cache_path /tmp/trash
```

- Each parameter can be overridden at the global level or the class specific level with the syntax `--name.parameter`. For
- instance, our previous example would override any parameters named `cache_path` regardless of what class they are
- defined in. In this case `cache_path` in both `ModelConfig` and `DataConfig`. To override just a class specific value
- we would use the class specific override:
+ Each parameter can be overridden **ONLY** at the class-specific level with the syntax `--classname.parameter`. For
+ instance, our previous example would only override `DataConfig.cache_path` and not `ModelConfig.cache_path` even
+ though they share the same parameter name (due to the different class names).

```bash
$ python tutorial.py --config tutorial.yaml --DataConfig.cache_path /tmp/trash
```
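The class-level scoping can be illustrated with a short standalone sketch. This is not spock's internal implementation; `parse_class_overrides` and its behavior are hypothetical, showing only the idea behind the `--ClassName.param` syntax:

```python
# Hypothetical sketch of class-level override parsing -- not spock's actual
# internals. It shows why `--DataConfig.cache_path` maps to exactly one class.
import argparse


def parse_class_overrides(argv):
    """Collect unknown args of the form --ClassName.param value into a
    {class_name: {param: value}} mapping."""
    parser = argparse.ArgumentParser()
    parser.add_argument('--config')
    _, unknown = parser.parse_known_args(argv)
    overrides = {}
    pending = None
    for token in unknown:
        if token.startswith('--') and '.' in token:
            # Split only on the first dot: class name, then parameter name
            pending = tuple(token[2:].split('.', 1))
        elif pending is not None:
            cls_name, param = pending
            overrides.setdefault(cls_name, {})[param] = token
            pending = None
    return overrides


print(parse_class_overrides(
    ['--config', 'tutorial.yaml', '--DataConfig.cache_path', '/tmp/trash']))
# {'DataConfig': {'cache_path': '/tmp/trash'}}
```

Because the class name is part of the flag, a `cache_path` defined in `ModelConfig` is never touched by a `--DataConfig.cache_path` override.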

### Overriding List/Tuple of Repeated `@spock` Classes

For a `List` of repeated `@spock` classes, the syntax is slightly different to allow for the repeated nature of the type.
Given the example code below:

```python
from spock.config import spock
from typing import List


@spock
class NestedListStuff:
one: int
two: str

@spock
class TypeConfig:
nested_list: List[NestedListStuff] # To Set Default Value append '= NestedListStuff'
```

With YAML definitions:

```yaml
# Nested List configuration
nested_list: NestedListStuff
NestedListStuff:
- one: 10
two: hello
- one: 20
two: bye
```
We could override the parameters like so (note that the length must match the length defined in the YAML):
```bash
$ python tutorial.py --config tutorial.yaml --TypeConfig.nested_list.NestedListStuff.one [1,2] \
--TypeConfig.nested_list.NestedListStuff.two [ciao,ciao]
```
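Conceptually, the bracketed values are applied element-wise across the repeated instances. A minimal stand-in (the function name and dict payload are hypothetical, not spock's internal representation):

```python
# Hypothetical element-wise mapping of a bracketed override such as '[1,2]'
# onto a repeated-class payload; illustrative only, not spock's internals.
def apply_repeated_override(payload, value_str):
    """payload: list of dicts, one per repeated class instance.
    value_str: bracketed override targeting a single parameter."""
    values = value_str.strip('[]').split(',')
    if len(values) != len(payload):
        raise ValueError('override length must match the YAML-defined length')
    return values


nested = [{'one': 10, 'two': 'hello'}, {'one': 20, 'two': 'bye'}]
for entry, new_one in zip(nested, apply_repeated_override(nested, '[1,2]')):
    entry['one'] = int(new_one)
print(nested)
# [{'one': 1, 'two': 'hello'}, {'one': 2, 'two': 'bye'}]
```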
2 changes: 1 addition & 1 deletion docs/advanced_features/Defaults.md
@@ -34,7 +34,7 @@ class ModelConfig:
lr: float = 0.01
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
```

2 changes: 1 addition & 1 deletion docs/advanced_features/Inheritance.md
@@ -38,7 +38,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
optimizer: Optimizer

2 changes: 1 addition & 1 deletion docs/advanced_features/Local-Definitions.md
@@ -35,7 +35,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
optimizer: Optimizer
cache_path: Optional[str]
2 changes: 1 addition & 1 deletion docs/advanced_features/Optional-Parameters.md
@@ -34,7 +34,7 @@ class ModelConfig:
lr: float = 0.01
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
```

2 changes: 1 addition & 1 deletion docs/advanced_features/Parameter-Groups.md
@@ -36,7 +36,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
optimizer: Optimizer

2 changes: 1 addition & 1 deletion docs/basic_tutorial/Building.md
@@ -25,7 +25,7 @@ class Activation(Enum):
class ModelConfig:
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation
```

2 changes: 1 addition & 1 deletion docs/basic_tutorial/Configuration-Files.md
@@ -31,7 +31,7 @@ class Activation(Enum):
class ModelConfig:
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation
```

23 changes: 13 additions & 10 deletions docs/basic_tutorial/Define.md
@@ -20,9 +20,11 @@ standard library while `Enum` is within the `enum` standard library):
| int | Optional[int] | Basic integer type parameter (e.g. 2) |
| str | Optional[str] | Basic string type parameter (e.g. 'foo') |
| List[type] | Optional[List[type]] | Basic list type parameter of base types such as int, float, etc. (e.g. [10.0, 2.0]) |
- | Tuple[type] | Optional[Tuple[type]] | Basic tuple type parameter of base types such as int, float, etc. (e.g. (10, 2)) |
+ | Tuple[type] | Optional[Tuple[type]] | Basic tuple type parameter of base types such as int, float, etc. Length enforced unlike List. (e.g. (10, 2)) |
| Enum | Optional[Enum] | Parameter that must be from a defined set of values of base types such as int, float, etc. |

Use `List` types when the length of the `Iterable` is not fixed and `Tuple` when length needs to be strictly enforced.
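The distinction can be demonstrated with `typing` introspection; the `check_len` helper below is a hypothetical stand-in for the kind of validation spock performs, not its actual code:

```python
# Hypothetical length check illustrating why Tuple[int, int, int] fixes the
# arity while List[int] does not -- not spock's actual validator.
from typing import List, Tuple, get_args, get_origin


def check_len(annotation, value):
    if get_origin(annotation) is tuple:
        args = get_args(annotation)
        if args and args[-1] is not Ellipsis and len(value) != len(args):
            raise ValueError(f'expected {len(args)} items, got {len(value)}')
    return value


print(check_len(Tuple[int, int, int], (32, 32, 32)))  # (32, 32, 32)
print(check_len(List[int], [1, 2, 3, 4]))             # any length is accepted
try:
    check_len(Tuple[int, int, int], (32, 32))
except ValueError as err:
    print(err)  # expected 3 items, got 2
```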

Parameters that are specified without the `Optional[]` type will be considered **REQUIRED** and therefore will raise an
Exception if no value is specified.

@@ -58,7 +60,7 @@ class Activation(Enum):
class ModelConfig:
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation
```

@@ -103,7 +105,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation
```

@@ -123,11 +125,11 @@ spock Basic Tutorial
configuration(s):
ModelConfig (Main model configuration for a basic neural net)
- save_path Optional[SavePath] spock special keyword -- path to write out spock config state (default: None)
- n_features int number of data features
- dropout List[float] dropout rate for each layer
- hidden_sizes Tuple[int] hidden size for each layer
- activation Activation choice from the Activation enum of the activation function to use
+ save_path Optional[SavePath] spock special keyword -- path to write out spock config state (default: None)
+ n_features int number of data features
+ dropout List[float] dropout rate for each layer
+ hidden_sizes Tuple[int, int, int] hidden size for each layer
+ activation Activation choice from the Activation enum of the activation function to use
Activation (Options for activation functions)
relu str relu activation
@@ -141,8 +143,9 @@ In another file let's write our simple neural network code: `basic_nn.py`

Notice that even before we've built and linked all of the related `spock` components together we are referencing the
parameters we have defined in our `spock` class. Below we are passing in the `ModelConfig` class as a parameter
`model_config` to the `__init__` function where we can then access the parameters with `.` notation. We could have
also passed in individual parameters instead if that is the preferred syntax.
`model_config` to the `__init__` function where we can then access the parameters with `.` notation (if we import
the `ModelConfig` class here and add it as a type hint to `model_config` most IDE auto-complete will work out of the
box). We could have also passed in individual parameters instead if that is the preferred syntax.
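The pattern reads roughly like the sketch below (`ModelConfig` is shown as a plain dataclass stand-in so the snippet is self-contained; in the tutorial it is the `@spock`-decorated class):

```python
# Stand-in sketch of dot-notation access with a type hint; ModelConfig here is
# a plain dataclass, not the real @spock class from the tutorial.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ModelConfig:
    n_features: int
    dropout: List[float]
    hidden_sizes: Tuple[int, int, int]
    activation: str


class BasicNet:
    def __init__(self, model_config: ModelConfig):
        # The type hint on model_config is what lets IDEs auto-complete
        # model_config.<parameter> out of the box
        self.n_features = model_config.n_features
        self.hidden_sizes = model_config.hidden_sizes


net = BasicNet(ModelConfig(64, [0.1, 0.2], (32, 32, 32), 'relu'))
print(net.hidden_sizes)  # (32, 32, 32)
```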

```python
import torch.nn as nn
23 changes: 21 additions & 2 deletions docs/basic_tutorial/Saving.md
@@ -24,7 +24,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation
```

@@ -83,7 +83,26 @@ def main():
# A simple description
description = 'spock Tutorial'
# Build out the parser by passing in Spock config objects as *args after description
- config = ConfigArgBuilder(ModelConfig, desc=description, create_save_path=True).save().generate()
+ config = ConfigArgBuilder(ModelConfig, desc=description).save(create_save_path=True).generate()
# One can now access the Spock config object by class name with the returned namespace
# For instance...
print(config.ModelConfig)
```

### Override UUID Filename

By default `spock` uses an automatically generated UUID as the filename when saving. This can be overridden with the
`file_name` keyword argument. The specified filename will be appended with `.spock.cfg.file_extension` (e.g. `.yaml`,
`.toml`, or `.json`).

In: `tutorial.py`

```python
def main():
# A simple description
description = 'spock Tutorial'
# Build out the parser by passing in Spock config objects as *args after description
config = ConfigArgBuilder(ModelConfig, desc=description).save(file_name='cool_name_here').generate()
# One can now access the Spock config object by class name with the returned namespace
# For instance...
print(config.ModelConfig)
2 changes: 1 addition & 1 deletion examples/tutorial/advanced/tutorial.py
@@ -25,7 +25,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: Optional[List[float]]
- hidden_sizes: Tuple[int] = (32, 32, 32)
+ hidden_sizes: Tuple[int, int, int] = (32, 32, 32)
activation: Activation = 'relu'
optimizer: Optimizer
cache_path: Optional[str]
2 changes: 1 addition & 1 deletion examples/tutorial/basic/tutorial.py
@@ -35,7 +35,7 @@ class ModelConfig:
save_path: SavePath
n_features: int
dropout: List[float]
- hidden_sizes: Tuple[int]
+ hidden_sizes: Tuple[int, int, int]
activation: Activation


49 changes: 1 addition & 48 deletions spock/backend/attr/builder.py
@@ -46,54 +46,7 @@ def print_usage_and_exit(self, msg=None, sys_exit=True, exit_code=1):
sys.exit(exit_code)

def _handle_help_info(self):
# List to catch Enum classes and handle post spock wrapped attr classes
enum_list = []
for attrs_class in self.input_classes:
# Split the docs into class docs and any attribute docs
class_doc, attr_docs = self._split_docs(attrs_class)
print(' ' + attrs_class.__name__ + f' ({class_doc})')
# Keep a running info_dict of all the attribute level info
info_dict = {}
for val in attrs_class.__attrs_attrs__:
# If the type is an enum we need to handle it outside of this attr loop
# Match the style of nested enums and return a string of module.name notation
if isinstance(val.type, EnumMeta):
enum_list.append(f'{val.type.__module__}.{val.type.__name__}')
# if there is a type (implied Iterable) -- check it for nested Enums
nested_enums = self._extract_enum_types(val.metadata['type']) if 'type' in val.metadata else []
if len(nested_enums) > 0:
enum_list.extend(nested_enums)
# Grab the base or type info depending on what is provided
type_string = repr(val.metadata['type']) if 'type' in val.metadata else val.metadata['base']
# Regex out the typing info if present
type_string = re.sub(r'typing.', '', type_string)
# Regex out any nested_enums that have module path information
for enum_val in nested_enums:
split_enum = f"{'.'.join(enum_val.split('.')[:-1])}."
type_string = re.sub(split_enum, '', type_string)
# Regex the string to see if it matches any Enums in the __main__ module space
# for val in sys.modules
# Construct the type with the metadata
if 'optional' in val.metadata:
type_string = f"Optional[{type_string}]"
info_dict.update(self._match_attribute_docs(val.name, attr_docs, type_string, val.default))
self._handle_attributes_print(info_dict=info_dict)
# Convert the enum list to a set to remove dupes and then back to a list so it is iterable
enum_list = list(set(enum_list))
# Iterate any Enum type classes
for enum in enum_list:
enum = self._get_enum_from_sys_modules(enum)
# Split the docs into class docs and any attribute docs
class_doc, attr_docs = self._split_docs(enum)
print(' ' + enum.__name__ + f' ({class_doc})')
info_dict = {}
for val in enum:
info_dict.update(self._match_attribute_docs(
attr_name=val.name,
attr_docs=attr_docs,
attr_type_str=type(val.value).__name__
))
self._handle_attributes_print(info_dict=info_dict)
self._attrs_help(self.input_classes)

def _handle_arguments(self, args, class_obj):
attr_name = class_obj.__name__