
v 0.0.7 #84

Merged 35 commits on Dec 25, 2022
Commits
7e78c8d
Add parameter to control rank of decomposition (#28)
brian6091 Dec 13, 2022
6aee5f3
Merge branch 'master' of https://github.com/cloneofsimo/lora into dev…
cloneofsimo Dec 14, 2022
9f31bd0
feat : statefully monkeypatch different loras + example ipynb + readme
cloneofsimo Dec 14, 2022
fececf3
Fix lora inject, added weight self apply lora (#39)
DavidePaglieri Dec 15, 2022
65438b5
Revert "Fix lora inject, added weight self apply lora (#39)" (#40)
cloneofsimo Dec 15, 2022
4975cfa
Merge branch 'master' of https://github.com/cloneofsimo/lora into dev…
cloneofsimo Dec 15, 2022
9ca7bc8
fix : rank bug in monkeypatch
cloneofsimo Dec 15, 2022
6a3ad97
fix cli fix
cloneofsimo Dec 15, 2022
40ad282
visualizatio on effect of LR
cloneofsimo Dec 15, 2022
a386525
Fix save_steps, max_train_steps, and logging (#45)
hdon96 Dec 16, 2022
6767142
Enable resuming (#52)
hdon96 Dec 16, 2022
24af4c8
feat : low-rank pivotal tuning
cloneofsimo Dec 16, 2022
046422c
feat : pivotal tuning
cloneofsimo Dec 16, 2022
0a92e62
Merge branch 'develop' of https://github.com/cloneofsimo/lora into de…
cloneofsimo Dec 16, 2022
4abbf90
v 0.0.6
cloneofsimo Dec 16, 2022
d0c4cc5
Merge branch 'master' into develop
cloneofsimo Dec 16, 2022
986626f
Learning rate switching & fix indent (#57)
hdon96 Dec 19, 2022
bbda1e5
Re:Fix indent (#58)
hdon96 Dec 19, 2022
46d9cf6
Merge branch 'master' into develop
cloneofsimo Dec 19, 2022
e1ea114
Merge branch 'develop' of https://github.com/cloneofsimo/lora into de…
cloneofsimo Dec 19, 2022
24617ea
ff now training default
cloneofsimo Dec 21, 2022
283f4bd
feat : dataset
cloneofsimo Dec 21, 2022
27145c3
feat : utils to back training
cloneofsimo Dec 21, 2022
7faef9f
readme : more contents. citations, etc.
cloneofsimo Dec 21, 2022
0e799a9
fix : weight init
cloneofsimo Dec 21, 2022
1abfc58
Merge branch 'master' of https://github.com/cloneofsimo/lora into dev…
cloneofsimo Dec 21, 2022
4869fe3
Feature/monkeypatch improvements (#73)
hafriedlander Dec 24, 2022
39affb7
Turn off resizing images with --resize=False (#71)
hdon96 Dec 24, 2022
4b4e220
Revert "Turn off resizing images with --resize=False (#71)" (#77)
cloneofsimo Dec 24, 2022
d590799
Use safetensors to store Loras (#74)
hafriedlander Dec 24, 2022
97b8897
Fix typing-related syntax errors in Python < 3.10 introduced in recen…
hafriedlander Dec 25, 2022
d5138ed
Fix the --resize=False option (#81)
hdon96 Dec 25, 2022
cb69ad4
Pivotal Tuning with hackable training code for CLI (#83)
cloneofsimo Dec 25, 2022
082d653
merge master
cloneofsimo Dec 25, 2022
b8948c0
Merge branch 'master' of https://github.com/cloneofsimo/lora into dev…
cloneofsimo Dec 25, 2022
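
One of the commits above is worth a concrete illustration: d590799 (#74) moves LoRA serialization to the safetensors format. Below is a minimal sketch of the round trip using the real safetensors API; the tensor names and rank-4 shapes are made up for illustration.

```python
import torch
from safetensors.torch import save_file, load_file

# Hypothetical LoRA tensors: rank-4 down/up projections for one layer.
lora_weights = {
    "unet_block_0.lora_down.weight": torch.randn(4, 320),
    "unet_block_0.lora_up.weight": torch.randn(320, 4),
}

# Serialize and reload; safetensors stores raw tensor data with no pickle step.
save_file(lora_weights, "lora_weight.safetensors")
loaded = load_file("lora_weight.safetensors")
assert torch.equal(loaded["unet_block_0.lora_up.weight"],
                   lora_weights["unet_block_0.lora_up.weight"])
```

Unlike pickled .pt checkpoints, a .safetensors file contains only raw tensor data, so loading a LoRA shared by a stranger cannot execute arbitrary code.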
482 changes: 482 additions & 0 deletions lora_diffusion/cli_lora_pti.py

Large diffs are not rendered by default.
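
The 482 new lines in cli_lora_pti.py implement the pivotal tuning trainer added in #83: a placeholder token is injected into the text encoder and its embedding (the "pivot") is optimized jointly with the injected LoRA layers. The sketch below shows only the token-injection step, using standard transformers calls; it illustrates the technique rather than the file's actual code, and the model name and token strings are assumptions.

```python
# Sketch of the placeholder-token setup used in pivotal tuning
# (illustrative only -- not the actual contents of cli_lora_pti.py).
from transformers import CLIPTextModel, CLIPTokenizer

tokenizer = CLIPTokenizer.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="tokenizer"
)
text_encoder = CLIPTextModel.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="text_encoder"
)

# Register a new placeholder token and grow the embedding matrix to fit it.
placeholder = "<s1>"
tokenizer.add_tokens([placeholder])
text_encoder.resize_token_embeddings(len(tokenizer))

# Initialize the new embedding from an existing token ("person" is an
# arbitrary choice here) so optimization starts from a sensible point.
token_id = tokenizer.convert_tokens_to_ids(placeholder)
init_id = tokenizer.convert_tokens_to_ids("person")
embeds = text_encoder.get_input_embeddings().weight.data
embeds[token_id] = embeds[init_id].clone()
# Training then optimizes embeds[token_id] jointly with the LoRA layers
# injected into the UNet and/or text encoder.
```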

59 changes: 41 additions & 18 deletions lora_diffusion/dataset.py
@@ -185,22 +185,29 @@ def __getitem__(self, index):


class PivotalTuningDatasetCapation(Dataset):
    """
    A dataset to prepare the instance and class images with the prompts for fine-tuning the model.
    It pre-processes the images and tokenizes the prompts.
    """

    def __init__(
        self,
        instance_data_root,
        learnable_property,
        placeholder_token,
        stochastic_attribute,
        tokenizer,
        class_data_root=None,
        class_prompt=None,
        size=512,
        h_flip=True,
        center_crop=False,
        color_jitter=False,
        resize=True,
    ):
        self.size = size
        self.center_crop = center_crop
        self.tokenizer = tokenizer
        self.resize = resize

        self.instance_data_root = Path(instance_data_root)
        if not self.instance_data_root.exists():
@@ -210,7 +217,6 @@ def __init__(
        self.num_instance_images = len(self.instance_images_path)

        self.placeholder_token = placeholder_token
        self.stochastic_attribute = stochastic_attribute.split(",")

        self._length = self.num_instance_images

@@ -224,22 +230,38 @@ def __init__(
        else:
            self.class_data_root = None

        self.image_transforms = transforms.Compose(
            [
                transforms.Resize(
                    size, interpolation=transforms.InterpolationMode.BILINEAR
                ),
                transforms.CenterCrop(size)
                if center_crop
                else transforms.RandomCrop(size),
                transforms.ColorJitter(0.2, 0.1)
                if color_jitter
                else transforms.Lambda(lambda x: x),
                transforms.RandomHorizontalFlip(),
                transforms.ToTensor(),
                transforms.Normalize([0.5], [0.5]),
            ]
        )
        if resize:
            self.image_transforms = transforms.Compose(
                [
                    transforms.Resize(
                        size, interpolation=transforms.InterpolationMode.BILINEAR
                    ),
                    transforms.ColorJitter(0.2, 0.1)
                    if color_jitter
                    else transforms.Lambda(lambda x: x),
                    transforms.RandomHorizontalFlip()
                    if h_flip
                    else transforms.Lambda(lambda x: x),
                    transforms.ToTensor(),
                    transforms.Normalize([0.5], [0.5]),
                ]
            )
        else:
            self.image_transforms = transforms.Compose(
                [
                    transforms.CenterCrop(size)
                    if center_crop
                    else transforms.Lambda(lambda x: x),
                    transforms.ColorJitter(0.2, 0.1)
                    if color_jitter
                    else transforms.Lambda(lambda x: x),
                    transforms.RandomHorizontalFlip()
                    if h_flip
                    else transforms.Lambda(lambda x: x),
                    transforms.ToTensor(),
                    transforms.Normalize([0.5], [0.5]),
                ]
            )

    def __len__(self):
        return self._length
@@ -255,6 +277,7 @@ def __getitem__(self, index):

        text = self.instance_images_path[index % self.num_instance_images].stem

        # print(text)
        example["instance_prompt_ids"] = self.tokenizer(
            text,
            padding="do_not_pad",
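
For reference, a usage sketch of the updated dataset class, based only on the signature visible in this diff; the path, base model, and token strings are assumptions.

```python
# Usage sketch for PivotalTuningDatasetCapation with the new flags;
# the data path, model name, and token strings are hypothetical.
from transformers import CLIPTokenizer
from lora_diffusion.dataset import PivotalTuningDatasetCapation

tokenizer = CLIPTokenizer.from_pretrained(
    "runwayml/stable-diffusion-v1-5", subfolder="tokenizer"
)

dataset = PivotalTuningDatasetCapation(
    instance_data_root="./instance_images",  # directory of training images
    learnable_property="object",
    placeholder_token="<s1>",
    stochastic_attribute="",
    tokenizer=tokenizer,
    size=512,
    h_flip=True,    # new in this PR: toggle random horizontal flips
    resize=False,   # new in this PR: skip resizing images to `size`
)

# Each item carries the tokenized prompt built from the image filename stem.
example = dataset[0]  # includes "instance_prompt_ids", per the diff above
```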