test: Detect generate failures #1729
Conversation
Currently, since we don't exit on a command failure, it has been missed that generation is broken for the dataplane. You can see an example in these logs from a previous workflow (https://github.com/SeldonIO/MLServer/actions/runs/8880776962/job/24381621013):

```
Writing mypy to dataplane_pb2.pyi
Traceback (most recent call last):
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/datamodel_code_generator/__main__.py", line 428, in main
    generate(
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/datamodel_code_generator/__init__.py", line 462, in generate
    results = parser.parse()
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/datamodel_code_generator/parser/base.py", line 1153, in parse
    self.parse_raw()
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/datamodel_code_generator/parser/openapi.py", line 571, in parse_raw
    specification: Dict[str, Any] = load_yaml(source.text)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/datamodel_code_generator/__init__.py", line 51, in load_yaml
    return yaml.load(stream, Loader=SafeLoader)
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/__init__.py", line 81, in load
    return loader.get_single_data()
  File "/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/yaml/constructor.py", line 49, in get_single_data
    node = self.get_single_node()
  File "yaml/_yaml.pyx", line 673, in yaml._yaml.CParser.get_single_node
  File "yaml/_yaml.pyx", line 687, in yaml._yaml.CParser._compose_document
  File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
  File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
  File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
  File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
  File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
  File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
  File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
  File "yaml/_yaml.pyx", line 845, in yaml._yaml.CParser._compose_mapping_node
  File "yaml/_yaml.pyx", line 731, in yaml._yaml.CParser._compose_node
  File "yaml/_yaml.pyx", line 847, in yaml._yaml.CParser._compose_mapping_node
  File "yaml/_yaml.pyx", line 860, in yaml._yaml.CParser._parse_next_event
yaml.scanner.ScannerError: mapping values are not allowed in this context
  in "<unicode string>", line 324, column 23
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/site-packages/grpc_tools/protoc.py:21: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
  import pkg_resources
Writing mypy to model_repository_pb2.pyi
poetry run black .
Skipping virtualenv creation, as specified in config file.
Skipping .ipynb files as Jupyter dependencies are not installed. You can fix this by running ``pip install "black[jupyter]"``
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/dataplane_pb2_grpc.py
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/model_repository_pb2.py
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/model_repository_pb2.pyi
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/dataplane_pb2.py
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/model_repository_pb2_grpc.py
reformatted /home/runner/work/MLServer/MLServer/mlserver/grpc/dataplane_pb2.pyi
reformatted /home/runner/work/MLServer/MLServer/mlserver/types/model_repository.py
```

This will now be detected and surfaced. Linting can carry on, and the generation check becomes a separate step. The intent is to surface this failure but still let PRs go in, as they currently do, since we don't currently have (some) required status checks.

I've also reduced what's installed for the lint job, which dramatically speeds it up.
Along with this, I've cleaned up run commands that didn't need multi-line syntax.
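The core of the fix is making the workflow stop treating a failed generator as success. As a rough illustration only (the real workflow runs the generator through Make/CI, not this code), this is the difference `check=True` makes when shelling out from Python:

```python
import subprocess

# Illustrative sketch: the command here is a stand-in for a failing generator.
# Without check=True, a nonzero exit status is silently ignored:
result = subprocess.run(["sh", "-c", "exit 1"])
print("ignored failure, returncode =", result.returncode)  # prints 1, but no error

# With check=True, the same failure is surfaced as an exception:
try:
    subprocess.run(["sh", "-c", "exit 1"], check=True)
except subprocess.CalledProcessError as e:
    print("generation step failed with exit code", e.returncode)
```

The CI equivalent is letting the shell exit nonzero on the first failing command rather than continuing to the next step.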
bonus: Using a YAML linter tool, the YAML error can be reproduced [1].
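For reference, the same class of `ScannerError` can be triggered with a minimal document. The snippet below is hypothetical (it is not the actual content at line 324 of the dataplane spec); it just shows the pattern PyYAML rejects, an unquoted scalar containing `: `:

```python
import yaml  # PyYAML, the same loader datamodel-code-generator uses

# Hypothetical minimal reproduction: the second ": " makes the scanner see a
# mapping value where none is allowed, mirroring the error in the CI logs.
doc = "summary: Error: something went wrong\n"
try:
    yaml.safe_load(doc)
except yaml.scanner.ScannerError as e:
    print("ScannerError:", e.problem)
```

The usual fix on the spec side is quoting the offending scalar.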
nice catch!
could you create a ticket to sort out the actual generate issue for the pydantic model? is it part of the v2 pydantic upgrade work?
@sakoush: Will do! I'm working through now whether this is a strict blocker or not.
update: Created a ticket. This doesn't necessarily block the Pydantic v2 migration, as the changes can still be made; it just may be a bit more manual. I'll know more soon.
* build: Add Python version to matrix for all jobs

  These 2 were missing this supported version. It would've caught me using `TypeAlias` on 3.9 earlier.

* build: Add granularity in types generation

  This makes it easier to know which types generation is failing. It's a follow-on from #1729.
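On the `TypeAlias` point: `typing.TypeAlias` only exists from Python 3.10, so importing it unconditionally fails on 3.9, which is exactly the kind of break an expanded version matrix catches. A small sketch of the behavior:

```python
import sys

# typing.TypeAlias was added in Python 3.10; on 3.9 the import below raises
# ImportError, which a CI matrix that includes 3.9 would have surfaced.
try:
    from typing import TypeAlias

    Vector: TypeAlias = list  # illustrative alias, not from the PR
    print("TypeAlias available on", ".".join(map(str, sys.version_info[:2])))
except ImportError:
    # On 3.9 the usual workaround is typing_extensions.TypeAlias instead.
    print("typing.TypeAlias not available on this interpreter")
```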