fix bm
lccurious committed Aug 24, 2024
1 parent 6477a30 commit 1180273
Showing 1 changed file with 1 addition and 1 deletion.
_posts/2024-02-25-Extend-LLMs-Context-Window.md (1 addition & 1 deletion)

@@ -214,7 +214,7 @@ A explanation can be:

 > Consequently, the model tends to dump unnecessary attention values to specific tokens.
-> 📌 Extensive research has been done on applying LLMs to lengthy texts, with three main areas of focus: **Length Extrapolation, Context Window Extension, **and **Improving LLMs’ Utilization of Long Text.** While seemingly related, it’s worth noting that progress in one direction doesn’t necessarily lead to progress in the other.
+> 📌 Extensive research has been done on applying LLMs to lengthy texts, with three main areas of focus: **Length Extrapolation, Context Window Extension,** and **Improving LLMs’ Utilization of Long Text.** While seemingly related, it’s worth noting that progress in one direction doesn’t necessarily lead to progress in the other.
 > This paper does not expand the attention window size of LLMs or enhance the model’s memory and usage on long texts.
 {: .block-tip }
