Converting LoRA (adapters.safetensors) to GGUF #1507

Answered by stippi2
stippi2 asked this question in Q&A

Hi @awni
After learning more about how LoRA works and inspecting the files with tools like gguf-tools, I arrived at the following conversion script:

import json
from safetensors.torch import load_file, save_file
from pathlib import Path


loaded_state_dict = load_file("adapters/adapters.safetensors")

def rename_key(old_key):
    # Prepend the prefix used by PEFT-style adapter checkpoints
    new_key = f"base_model.model.{old_key}"
    # lora_a -> lora_A.weight
    new_key = new_key.replace('lora_a', 'lora_A.weight')
    # lora_b -> lora_B.weight
    new_key = new_key.replace('lora_b', 'lora_B.weight')
    return new_key
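
# For illustration, this turns a hypothetical MLX-style key such as
#   "model.layers.0.self_attn.q_proj.lora_a"
# into the PEFT-style key
#   "base_model.model.model.layers.0.self_attn.q_proj.lora_A.weight"
# (the actual layer names depend on the model architecture).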

def convert_value(old_value):
    # MLX stores the LoRA matrices transposed relative to the PyTorch/PEFT
    # convention, so swap the two dimensions and make the result contiguous.
    return old_value.transpose(0, 1).contiguous()

new_state_dict = {
    rename_key(k): convert_value(v)
    for k, v in loaded_state_dict.items()
}
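
A minimal sketch of the remaining steps, assuming the goal is a PEFT-style adapter directory that llama.cpp's LoRA converter can read (the directory name, the r / lora_alpha / target_modules values, and the exact command below are placeholders that have to match your own setup):

# Save the renamed/transposed tensors in a PEFT-style adapter directory.
output_dir = Path("adapters_peft")
output_dir.mkdir(exist_ok=True)
save_file(new_state_dict, str(output_dir / "adapter_model.safetensors"))

# Minimal adapter_config.json; r, lora_alpha and target_modules are
# placeholders and must match what the adapters were actually trained with.
adapter_config = {
    "peft_type": "LORA",
    "r": 8,
    "lora_alpha": 16,
    "target_modules": ["q_proj", "v_proj"],
}
with open(output_dir / "adapter_config.json", "w") as f:
    json.dump(adapter_config, f, indent=2)

# The directory can then be converted with llama.cpp's LoRA conversion
# script (convert_lora_to_gguf.py in recent versions); check --help for
# the exact arguments, but roughly:
#   python convert_lora_to_gguf.py adapters_peft --base <base_model_dir> --outfile adapters.gguf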
