The min_diffusion library

min_diffusion provides minimal helpers to run standalone Stable Diffusion experiments. It was put together for a series of experiments on Classifier-free Guidance.

Install

pip install min_diffusion
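You can likely also install the latest code directly from the GitHub repository (assuming the default branch is installable with pip):

pip install git+https://github.com/enzokro/min_diffusion.git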

How to use min_diffusion

The library has a single main class MinimalDiffusion.

This class takes three main arguments:

  • model_name
  • device
  • dtype

model_name is the name of the model on the HuggingFace hub.
device sets the hardware to run on.
dtype is the torch.dtype precision for the torch modules.

The loading example below also passes a revision argument, which selects the model revision (branch) to download from the hub.

# import the library
from min_diffusion.core import MinimalDiffusion
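
As a quick sketch of how the arguments fit together, here is a hypothetical pipeline built for the CPU in full precision (the checkpoint name is only an example, and the pipeline still has to be loaded before generating, as shown in the walkthrough below):

# hypothetical example: build a pipeline for the CPU in full precision
import torch

cpu_pipeline = MinimalDiffusion('CompVis/stable-diffusion-v1-4', 'cpu', torch.float32)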

Loading a sample model

Below is an example of loading the openjourney model from PromptHero.

The model will be loaded in torch.float16 precision and placed on the GPU.

import torch

# set the model to load and its options
model_name = 'prompthero/openjourney'
device     = 'cuda'
dtype      = torch.float16
revision   = "fp16"

Creating a MinimalDiffusion with these arguments:

# create the minimal diffusion pipeline
pipeline = MinimalDiffusion(model_name, device, dtype, revision)

Loading the pipeline:

# load the pipeline
pipeline.load();
Enabling default unet attention slicing.

Generating an image

Below is an example text prompt for image generation.

Note the keyword "mdjrny-v4 style" at the start of the prompt. This is how the openjourney model creates images in the style of Midjourney v4.

# text prompt for image generations
prompt = "mdjrny-v4 style a photograph of an astronaut riding a horse"

Calling the pipeline's generate method on the input text prompt:

# generate the image
img = pipeline.generate(prompt);
Using the default Classifier-free Guidance.

  0%|          | 0/50 [00:00<?, ?it/s]

Here is the generated image:

# view the output image
img
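
The returned image can also be written to disk; a small sketch, assuming generate returns a standard PIL Image:

# save the generated image to a file (assumes `img` is a PIL Image)
img.save('astronaut_openjourney.png')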

Notes:

The pipeline assumes you have logged in to the HuggingFace hub.
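
One way to log in from Python is with the official huggingface_hub helper (alternatively, run huggingface-cli login in a terminal):

# authenticate with the HuggingFace hub before loading models
from huggingface_hub import login
login()  # prompts for an access token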
