Tenstorrent Inference Server (tt-inference-server) is the repository of available model APIs for deployment on Tenstorrent hardware.
https://github.com/tenstorrent/tt-inference-server
Please follow the setup instructions in each model folder's README.md.