Full finetuning on OpenShift using the latest fms-hf-tuning image fails when the padding_free and multipack properties are set in the training configuration.
Error message:
/home/tuning/.local/lib/python3.11/site-packages/accelerate/utils/launch.py:253: FutureWarning: `fsdp_backward_prefetch_policy` is deprecated and will be removed in version 0.27.0 of 🤗 Accelerate. Use `fsdp_backward_prefetch` instead
  warnings.warn(
/home/tuning/.local/lib/python3.11/site-packages/tuning/config/acceleration_configs/acceleration_framework_config.py:284: UserWarning: An experimental acceleration feature is requested by specifying the '--padding_free' argument. Please note this feature may not support certain edge cases at this juncture. When the feature matures this message will be turned off.
  warnings.warn(
ERROR:sft_trainer.py:Traceback (most recent call last):
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/sft_trainer.py", line 643, in main
    trainer = train(
              ^^^^^^
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/sft_trainer.py", line 207, in train
    ).get_framework()
      ^^^^^^^^^^^^^^^
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/config/acceleration_configs/acceleration_framework_config.py", line 230, in get_framework
    raise e
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/config/acceleration_configs/acceleration_framework_config.py", line 217, in get_framework
    self.to_yaml(f.name)
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/config/acceleration_configs/acceleration_framework_config.py", line 316, in to_yaml
    configuration_contents = self.to_dict()
                             ^^^^^^^^^^^^^^
  File "/home/tuning/.local/lib/python3.11/site-packages/tuning/config/acceleration_configs/acceleration_framework_config.py", line 294, in to_dict
    raise ValueError(
ValueError: An acceleration feature is requested by specifying the '--padding_free' argument, but the this requires acceleration packages to be installed. Please do:
  - python -m fms_acceleration.cli install fms_acceleration_aadp
When I installed the needed dependency with python -m fms_acceleration.cli install fms_acceleration_aadp before running the image's default command, finetuning completed without issue.
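For reference, the manual workaround described above might look like the following inside the running container. The training invocation is an illustrative placeholder for the image's default command, not a confirmed entrypoint:

```shell
# Workaround sketch (assumption: run inside the fms-hf-tuning container
# before training starts).
# 1. Install the missing acceleration plugin named in the error message.
python -m fms_acceleration.cli install fms_acceleration_aadp
# 2. Then launch training as the image normally would (placeholder command;
#    the real image's default command/arguments may differ).
python -m tuning.sft_trainer
```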
Describe the bug
See the summary and error message above. It seems that the dependency installation is missing in https://github.com/foundation-model-stack/fms-hf-tuning/blob/926fb9bf8e36a585e4b35199efbdaf9066c224d4/build/Dockerfile
Platform
Please provide details about the environment you are using, including the following:
quay.io/modh/fms-hf-tuning@sha256:f8f732c340488734bf9023953d14bb2410991bd3ff2a519ad2ce07c531353797
Sample Code
See additional context.
Expected behavior
Finetuning using fms-hf-tuning passes without any additional configuration when the padding_free and multipack parameters are provided in the configuration.
Observed behavior
Finetuning failed; I had to install the missing dependency in the running container before invoking the finetuning command.
Additional context
Finetuning configuration used: