[WIP] multiheadattention optional q k v out affine #5908

Closed
wants to merge 1 commit

Conversation

@nihui (Member) commented Feb 17, 2025

  • optional q k v out affine, a.k.a. SDPA (scaled dot-product attention; see the sketch after this list)
  • pnnx ncnn sdpa
  • x86 sdpa
  • arm sdpa
  • vulkan sdpa
  • test sdpa
  • test sdpa int8
  • mqa and gqa
  • pnnx ncnn mqa and gqa fusion
  • x86 mqa and gqa
  • arm mqa and gqa
  • vulkan mqa and gqa
  • test mqa and gqa
  • test position embedding
  • test latent attention
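
For reference, below is a minimal, framework-free sketch of the core that this checklist targets: scaled dot-product attention (SDPA) with MQA/GQA-style head sharing, i.e. several query heads reusing one key/value head. This is an illustrative assumption of the intended semantics, not the ncnn MultiHeadAttention implementation; all names, layouts, and shapes are chosen here for the example only.

```cpp
// Illustrative sketch only (NOT the ncnn implementation).
// Assumed layouts: q is [seq_q, num_heads * head_dim], k/v are [seq_kv, num_kv_heads * head_dim].
// num_kv_heads == num_heads -> ordinary MHA, num_kv_heads == 1 -> MQA,
// 1 < num_kv_heads < num_heads -> GQA (num_heads must be divisible by num_kv_heads).
#include <cmath>
#include <vector>

static void sdpa_gqa(const std::vector<float>& q,
                     const std::vector<float>& k,
                     const std::vector<float>& v,
                     std::vector<float>& out,
                     int seq_q, int seq_kv,
                     int num_heads, int num_kv_heads, int head_dim)
{
    const float scale = 1.f / std::sqrt((float)head_dim);
    const int group = num_heads / num_kv_heads; // query heads per shared k/v head

    out.assign((size_t)seq_q * num_heads * head_dim, 0.f);

    for (int h = 0; h < num_heads; h++)
    {
        const int kvh = h / group; // index of the k/v head shared by this query head

        for (int i = 0; i < seq_q; i++)
        {
            // scaled dot-product scores of query i against all keys
            std::vector<float> score(seq_kv);
            float maxs = -1e30f;
            for (int j = 0; j < seq_kv; j++)
            {
                float s = 0.f;
                for (int d = 0; d < head_dim; d++)
                    s += q[(size_t)i * num_heads * head_dim + h * head_dim + d]
                       * k[(size_t)j * num_kv_heads * head_dim + kvh * head_dim + d];
                score[j] = s * scale;
                if (score[j] > maxs) maxs = score[j];
            }

            // softmax over the key dimension (max-subtracted for numerical stability)
            float sum = 0.f;
            for (int j = 0; j < seq_kv; j++)
            {
                score[j] = std::exp(score[j] - maxs);
                sum += score[j];
            }

            // weighted sum of values
            for (int j = 0; j < seq_kv; j++)
            {
                const float w = score[j] / sum;
                for (int d = 0; d < head_dim; d++)
                    out[(size_t)i * num_heads * head_dim + h * head_dim + d]
                        += w * v[(size_t)j * num_kv_heads * head_dim + kvh * head_dim + d];
            }
        }
    }
}
```

With the "optional q k v out affine" idea, the q/k/v/out linear projections would be applied around this core only when their weights are present; disabling all four would reduce MultiHeadAttention to the plain SDPA above.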

@github-actions bot added the layer label Feb 17, 2025
@codecov-commenter commented Feb 17, 2025

Codecov Report

Attention: Patch coverage is 60.48387% with 49 lines in your changes missing coverage. Please review.

Project coverage is 95.36%. Comparing base (2389090) to head (94039d8).
Report is 3 commits behind head on master.

Files with missing lines           Patch %   Lines
src/layer/multiheadattention.cpp   60.48%    49 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #5908      +/-   ##
==========================================
- Coverage   95.37%   95.36%   -0.01%     
==========================================
  Files         818      818              
  Lines      268009   268431     +422     
==========================================
+ Hits       255610   255990     +380     
- Misses      12399    12441      +42     


@nihui closed this Feb 18, 2025