
I can't get it to work #145

Open
equaerdist opened this issue Sep 10, 2024 · 2 comments


equaerdist commented Sep 10, 2024

I did everything according to the instructions:

```
git clone https://github.com/black-forest-labs/flux
python -m venv .venv
source .venv/Scripts/activate
pip install -e ".[all]"
```

Then I ran the model with `python -m flux --name flux-schnell --loop`.
The first time, a 45 GB bin file was downloaded and everything loaded; on the following runs I got a warning message:

```
You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
D:\Program Files\python\Lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
```

and then the script closed.
I assume this is an error in the source files; I did not change anything by hand.
My PC configuration:
AMD Ryzen 5 5600
16 GB DDR4
RX 580
Windows 10
I run everything in PowerShell.
Thanks for your attention.
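One thing worth checking that the report does not mention: the RX 580 is an AMD card, so PyTorch's CUDA backend cannot be available on this machine, and flux would have to run on CPU, where the ~45 GB checkpoint competes with only 16 GB of RAM. A minimal sketch (not from the thread; it assumes PyTorch may or may not be importable) to confirm which device is actually usable:

```python
import importlib.util

def cuda_available() -> bool:
    """True only if PyTorch is installed and detects a CUDA device.

    On an AMD RX 580 under Windows this returns False, so flux would
    need --device cpu, and the ~45 GB weights must fit in system RAM.
    """
    if importlib.util.find_spec("torch") is None:
        return False
    import torch
    return torch.cuda.is_available()

print(cuda_available())
```

If this prints False, a silent exit after the tokenizer warnings is more likely an out-of-memory kill than a bug in the warnings themselves.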

@MartinAbilev

Something similar happens for me; neither CUDA nor CPU works.

```
$ python demo_gr.py --name flux-schnell --device cuda
You are using the default legacy behaviour of the <class 'transformers.models.t5.tokenization_t5.T5Tokenizer'>. This is expected, and simply means that the legacy (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set legacy=False. This should only be set if you understand what it means, and thoroughly read the reason why this was added as explained in huggingface/transformers#24565
H:\flux\.venv\Lib\site-packages\transformers\tokenization_utils_base.py:1601: FutureWarning: clean_up_tokenization_spaces was not set. It will be set to True by default. This behavior will be depracted in transformers v4.45, and will be then set to False by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Segmentation fault
(.venv)
```
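A segmentation fault like this kills the process with no Python traceback. One standard-library way to see where it dies (a general suggestion, not something from this thread) is to enable faulthandler before loading, either by running `python -X faulthandler demo_gr.py ...` or programmatically:

```python
import faulthandler

# With faulthandler enabled, a segmentation fault makes Python dump
# the stack of every thread to stderr before the process dies, which
# shows whether the crash really happens inside the T5 loading call.
faulthandler.enable()
print(faulthandler.is_enabled())  # True
```

The resulting dump usually points at the native extension (e.g. a torch or transformers C++ frame) that crashed.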

@MartinAbilev

The code stops when loading T5:

```python
t5 = load_t5(device, max_length=256 if is_schnell else 512)
```
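For context, the ternary in that line only selects the T5 token budget per model variant; a trivial restatement (the function name and variant strings here are assumptions for illustration, not flux's actual source):

```python
def t5_max_length(name: str) -> int:
    # Mirrors the `256 if is_schnell else 512` in the quoted line:
    # flux-schnell uses a 256-token T5 context, other variants 512.
    is_schnell = name == "flux-schnell"
    return 256 if is_schnell else 512

print(t5_max_length("flux-schnell"))  # 256
```

So the crash is not in this expression itself but inside `load_t5`, i.e. in the T5 weights/tokenizer loading it triggers.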
