Commit 08b59fc: Update lora example

fchollet committed Mar 21, 2024
1 parent 02c9bae

Showing 2 changed files with 9 additions and 1 deletion.
@@ -903,7 +903,11 @@
" A_weights = value_lora_layer.A.kernel # (768, 1) (a, b)\n",
" B_weights = value_lora_layer.B.kernel # (1, 12, 64) (b, c, d)\n",
" increment_weights = tf.einsum(\"ab,bcd->acd\", A_weights, B_weights) * (ALPHA / RANK)\n",
" value_lora_layer.original_layer.kernel.assign_add(increment_weights)"
" value_lora_layer.original_layer.kernel.assign_add(increment_weights)\n",
"\n",
" # Put back in place the original layers with updated weights\n",
" self_attention_layer._query_dense = query_lora_layer.original_layer\n",
" self_attention_layer._value_dense = value_lora_layer.original_layer"
]
},
{

@@ -810,6 +810,10 @@ for layer_idx in range(lora_model.backbone.num_layers):
     B_weights = value_lora_layer.B.kernel # (1, 12, 64) (b, c, d)
     increment_weights = tf.einsum("ab,bcd->acd", A_weights, B_weights) * (ALPHA / RANK)
     value_lora_layer.original_layer.kernel.assign_add(increment_weights)
+
+    # Put back in place the original layers with updated weights
+    self_attention_layer._query_dense = query_lora_layer.original_layer
+    self_attention_layer._value_dense = value_lora_layer.original_layer
```
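
The merge step shown in the diff has a simple shape story: `A` (shape `(768, 1)`) and `B` (shape `(1, 12, 64)`) are contracted over the shared rank axis `b`, so the increment has shape `(768, 12, 64)` and can be added directly onto the frozen attention kernel, scaled by `ALPHA / RANK`. Below is a minimal standalone sketch of that arithmetic with dummy tensors; the `RANK` and `ALPHA` values and the tensor names are illustrative assumptions, not taken from the example itself:

```python
import tensorflow as tf

RANK = 1      # illustrative LoRA rank, matching the (768, 1) shape in the comments
ALPHA = 32.0  # illustrative LoRA scaling factor

original_kernel = tf.random.normal((768, 12, 64))  # stands in for the frozen dense kernel
A = tf.random.normal((768, RANK))                  # LoRA down-projection, (a, b)
B = tf.random.normal((RANK, 12, 64))               # LoRA up-projection, (b, c, d)

# Contract the shared rank axis b: (a, b) x (b, c, d) -> (a, c, d)
increment = tf.einsum("ab,bcd->acd", A, B) * (ALPHA / RANK)

merged = original_kernel + increment
print(merged.shape)  # (768, 12, 64) -- same shape as the original kernel
```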

We are now all set to generate text with our LoRA model :).
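
As a usage sketch, assuming `lora_model` is a KerasNLP causal language model (as in the example this commit updates), generation is a single `generate()` call; the prompt and `max_length` here are illustrative:

```python
# Hedged usage sketch: assumes lora_model is a keras_nlp CausalLM whose
# LoRA updates were merged back into the original kernels above.
output = lora_model.generate(
    "I like basketball",  # illustrative prompt
    max_length=128,       # illustrative length cap
)
print(output)
```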