
Inference Latency + Memory Usage: Proper Dim Order Support #9006

Open
cbilgin opened this issue Mar 6, 2025 · 0 comments
Labels
module: xnnpack — Issues related to xnnpack delegation and the code under backends/xnnpack/
triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
Milestone
0.6.0
cbilgin commented Mar 6, 2025

cc @digantdesai @mcr229
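
The issue body is empty, but the title concerns dim order support in the XNNPACK delegate. As background, "dim order" describes how a tensor's logical dimensions map onto physical memory: a logically NCHW tensor stored channels-last (NHWC) can be consumed directly by channels-last kernels, avoiding runtime transposes that cost latency and intermediate memory. The sketch below is purely illustrative (it uses NumPy strides, not any ExecuTorch API) to show that the same logical tensor can carry two different memory layouts:

```python
import numpy as np

# Illustration only (not ExecuTorch/XNNPACK code): the same logical NCHW
# tensor with two different dim orders, distinguished by their strides.
n, c, h, w = 1, 3, 4, 4
nchw = np.arange(n * c * h * w, dtype=np.float32).reshape(n, c, h, w)

# Channels-last: physically store the data as N, H, W, C...
nhwc_storage = np.ascontiguousarray(nchw.transpose(0, 2, 3, 1))
# ...then view it back in logical NCHW order. Same values, different layout.
nchw_view = nhwc_storage.transpose(0, 3, 1, 2)

print(nchw.strides)       # (192, 64, 16, 4)  -> contiguous NCHW
print(nchw_view.strides)  # (192, 4, 48, 12)  -> channels-last in memory
assert np.array_equal(nchw, nchw_view)  # logically identical tensors
```

If a backend prefers channels-last, propagating the dim order through the graph (rather than inserting transpose ops at delegate boundaries) is what saves the extra copies.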

@cbilgin cbilgin added the module: xnnpack label Mar 6, 2025
@cbilgin cbilgin added this to the 0.6.0 milestone Mar 6, 2025
@cbilgin cbilgin moved this to Backlog in ExecuTorch - CPU Mar 6, 2025
@iseeyuan iseeyuan added the triaged label Mar 7, 2025

No branches or pull requests

3 participants