[RLlib][Windows] error in custom_env.py #30525
Comments
It seems to work when I use Ray 3.0; however, no instructions told me that.
By the way, I am on Windows.
The same error is generated while running the custom model / custom environment example.
The solution that works for me:
I can confirm it's working on master (3.0) on Windows as well as 2.1. This script is part of our CI testing suite, so it's constantly checked.
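Since the confirmed fix in this thread is running a newer Ray (2.1 or master/3.0), a small guard at the top of the script can fail with a clearer message than the `AttributeError`. This is a sketch only; the 2.1 cutoff for the new config API is an assumption based on this thread, and `supports_algorithm_config` is a hypothetical helper, not part of Ray:

```python
def supports_algorithm_config(ray_version: str) -> bool:
    """Return True if this Ray version is assumed to ship the new
    AlgorithmConfig builder API.

    The (2, 1) minimum is an assumption based on this issue thread,
    where 2.1 and master (3.0) are reported to work.
    """
    parts = ray_version.split(".")
    major, minor = int(parts[0]), int(parts[1])
    return (major, minor) >= (2, 1)


# Example checks against version strings from this thread:
print(supports_algorithm_config("2.0.1"))     # older release hits the error
print(supports_algorithm_config("2.1.0"))     # reported working
print(supports_algorithm_config("3.0.0.dev0"))  # master, reported working
```

In a real script you would compare `ray.__version__` and exit early with an upgrade hint instead of printing.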
What happened + What you expected to happen
When I installed RLlib and ran my first RLlib example script ("custom_env.py"), I came across an error:
```
(rllib2) C:\Users\Raymond\Desktop\rayRL>python custom_env.py --framework torch
Running with following CLI options: Namespace(as_test=False, framework='torch', local_mode=False, no_tune=False, run='PPO', stop_iters=50, stop_reward=0.1, stop_timesteps=100000)
2022-11-21 20:41:05,424 INFO worker.py:1528 -- Started a local Ray instance.
Traceback (most recent call last):
  File "custom_env.py", line 161, in
    get_trainable_cls(args.run)
AttributeError: 'dict' object has no attribute 'environment'
```
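The error pattern suggests the script calls a builder method like `.environment(...)` on whatever the default config object is, and on the installed Ray version that object is a plain `dict` rather than a config object with builder methods. A minimal stand-in reproducing the failure mode without Ray installed (the `AlgorithmConfig` class below is a toy stand-in, not Ray's actual class):

```python
class AlgorithmConfig:
    """Toy stand-in for a builder-style config object (assumption,
    not Ray's real AlgorithmConfig)."""

    def __init__(self):
        self.env = None

    def environment(self, env):
        # Builder pattern: record the env and return self for chaining.
        self.env = env
        return self


new_style_config = AlgorithmConfig()  # newer API: object with builder methods
old_style_config = {"env": None}      # older API: plain dict

# Works with the builder-style object:
new_style_config.environment("CartPole-v1")

# Fails with the dict, matching the traceback above:
try:
    old_style_config.environment("CartPole-v1")
except AttributeError as e:
    print(e)  # 'dict' object has no attribute 'environment'
```

This is why upgrading Ray (so the default config is the builder object the example expects) resolves the error.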
I am very confused.
Versions / Dependencies
Installed as per the instructions.
Reproduction script
```
python custom_env.py
```
Issue Severity
No response