
pipe.unet.load_attn_procs not working in diffusers version 0.16.0 #3221

Closed
@sing817

Description


Describe the bug

I use unet.load_attn_procs, but it stopped working after updating the diffusers project.

What is load_lora_weights?
What is the difference between the new and the old LoRA format?
How can I tell whether a weights file is in the new or the old format? (See the sketch below for how I tried to check.)
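
As a rough way of checking which format `pytorch_lora_weights.bin` is in, I inspect the state-dict keys. The prefix heuristic below is my assumption about how the two formats differ, not something I found documented:

```python
import torch

# Path copied from the reproduction below.
state_dict = torch.load("./testlora/pytorch_lora_weights.bin", map_location="cpu")

# Print a few keys so the naming scheme is visible.
for key in list(state_dict.keys())[:5]:
    print(key)

# Assumed heuristic: the newer format (saved by save_lora_weights) prefixes
# keys with "unet." / "text_encoder.", while the older attn-processor format
# uses keys like
# "down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor.to_q_lora.down.weight".
looks_new = any(k.startswith(("unet.", "text_encoder.")) for k in state_dict)
print("looks like the new LoRA format:", looks_new)
```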

Reproduction

When I run

pipe.load_lora_weights("./testlora/pytorch_lora_weights.bin")

or

pipe.unet.load_attn_procs("./testlora/pytorch_lora_weights.bin")

I get

    KeyError: 'down_blocks.0.attentions.0.transformer_blocks.0.attn1.processor'

When I run

pipe.load_attn_procs("./testlora/pytorch_lora_weights.bin")

it runs without errors, but it does not load any LoRA weights.
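
For completeness, a minimal, self-contained version of the reproduction, assuming a standard Stable Diffusion base checkpoint (the exact model is a placeholder, and the `attn_processors` check at the end is just my own way of seeing whether any LoRA layers were attached):

```python
import torch
from diffusers import StableDiffusionPipeline

# Base model is a placeholder; substitute the checkpoint you actually use.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Either call raises the KeyError shown above with my checkpoint:
pipe.load_lora_weights("./testlora/pytorch_lora_weights.bin")
# pipe.unet.load_attn_procs("./testlora/pytorch_lora_weights.bin")

# Rough sanity check (my assumption): after loading, at least one attention
# processor on the UNet should be a LoRA variant.
has_lora = any("lora" in type(p).__name__.lower()
               for p in pipe.unet.attn_processors.values())
print("LoRA processors attached:", has_lora)
```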

Logs

No response

System Info

  • diffusers version: 0.16.0.dev0
  • Platform: Linux-5.4.0-144-generic-x86_64-with-glibc2.27
  • Python version: 3.10.11
  • PyTorch version (GPU?): 2.0.0+cu117 (True)
  • Huggingface_hub version: 0.13.4
  • Transformers version: 4.28.1
  • Accelerate version: 0.18.0
  • xFormers version: 0.0.18
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:
