Hi,
As context lengths increase, it looks like different models are going about it in different ways. For example, Qwen enables sliding-window attention in its config.json, while Llama relies on RoPE scaling.
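For concreteness, here are the kinds of config.json fields I mean, shown as Python dicts. The field names follow the Hugging Face config schema, but the values below are illustrative rather than copied from any specific checkpoint:

```python
# Illustrative config.json excerpts. Field names follow the Hugging Face
# schema; the values are made up for illustration.

qwen_style_config = {
    "max_position_embeddings": 32768,
    "rope_theta": 1000000.0,      # Qwen models still use RoPE internally
    "use_sliding_window": True,   # long context handled via windowed attention
    "sliding_window": 4096,       # how many tokens each position can attend to
}

llama_style_config = {
    "max_position_embeddings": 131072,
    "rope_theta": 500000.0,
    "rope_scaling": {             # long context handled by rescaling RoPE
        "rope_type": "llama3",
        "factor": 8.0,
        "low_freq_factor": 1.0,
        "high_freq_factor": 4.0,
        "original_max_position_embeddings": 8192,
    },
}
```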
I was curious how the two approaches compare, whether they can be combined, what types of RoPE scaling exist, and how all of these parameters can be tuned separately or in conjunction with one another.
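To make the question concrete, here is my rough mental model of where the two RoPE knobs plug in, written as a minimal sketch. The NTK formula below is the common static variant; I'm not claiming this matches any particular implementation:

```python
import torch

def rope_inv_freq(head_dim: int, theta: float = 10000.0,
                  ntk_factor: float = 1.0) -> torch.Tensor:
    """Inverse frequencies for RoPE; `theta` is the base (cf. --rope-theta).

    NTK-aware scaling inflates the base so low frequencies stretch more
    than high ones (static variant: theta * s ** (d / (d - 2))).
    """
    if ntk_factor != 1.0:
        theta = theta * ntk_factor ** (head_dim / (head_dim - 2))
    exponents = torch.arange(0, head_dim, 2, dtype=torch.float32) / head_dim
    return 1.0 / (theta ** exponents)

def rope_angles(seq_len: int, inv_freq: torch.Tensor,
                linear_factor: float = 1.0) -> torch.Tensor:
    """Rotation angles per (position, frequency) pair.

    Linear scaling ("position interpolation") divides positions by the
    factor, squeezing new positions into the originally trained range.
    """
    positions = torch.arange(seq_len, dtype=torch.float32) / linear_factor
    return torch.outer(positions, inv_freq)  # shape (seq_len, head_dim // 2)

# e.g. 4x linear interpolation vs. 4x NTK base inflation:
lin = rope_angles(8192, rope_inv_freq(128), linear_factor=4.0)
ntk = rope_angles(8192, rope_inv_freq(128, ntk_factor=4.0))
```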
I am also curious how setting `--rope-scaling` and `--rope-theta` interacts with the config if those values are already set there, or when the model uses sliding windows. I can't find much information on combining all of these settings, so any help would be awesome.
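For reference, this is the kind of setup I'm asking about. I'm assuming vLLM here, since `--rope-scaling` and `--rope-theta` look like its engine arguments; whether these overrides replace the checkpoint's own `rope_scaling` / `sliding_window` settings, merge with them, or conflict is exactly what I'm unsure about:

```python
from vllm import LLM

# Hypothetical override of a checkpoint that already ships rope_scaling
# in its config.json. The model name and values are just examples, and
# the exact rope_scaling keys may vary across vLLM/HF versions.
llm = LLM(
    model="meta-llama/Llama-3.1-8B-Instruct",
    rope_theta=1000000.0,                                   # overrides config rope_theta?
    rope_scaling={"rope_type": "dynamic", "factor": 2.0},   # overrides config rope_scaling?
)
```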