
VS Code LM API - Copilot - Claude 3.7 Sonnet - "Model is not supported for this request." #1203

Open
djex opened this issue Feb 26, 2025 · 51 comments
Labels
bug Something isn't working

Comments

@djex

djex commented Feb 26, 2025

Which version of the app are you using?

v3.7.4

Which API Provider are you using?

VS Code LM API

Which Model are you using?

Claude 3.7 Sonnet

What happened?

When submitting a prompt, the error below is returned immediately. This happens with both the Claude 3.7 Sonnet and Claude 3.7 Sonnet Thought models, and only with those models. It was working yesterday.

Request Failed: 400 {"error":{"message":"Model is not supported for this request.","param":"model","code":"model_not_supported","type":"invalid_request_error"}}

Steps to reproduce

  1. Choose API Provider: VS Code LM API (with Copilot installed)
  2. Select the Claude 3.7 Sonnet model
  3. Send any prompt
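The failing response can be recognized programmatically. A minimal sketch (a hypothetical handler, not Roo Code's actual code) that parses the 400 body quoted above and surfaces the machine-readable `model_not_supported` code:

```python
import json

# The 400 body returned by the Copilot backend, as reported in this issue.
RESPONSE_BODY = (
    '{"error":{"message":"Model is not supported for this request.",'
    '"param":"model","code":"model_not_supported","type":"invalid_request_error"}}'
)

def classify_error(body: str) -> str:
    """Return the machine-readable error code from an error body, or 'unknown'."""
    try:
        err = json.loads(body).get("error", {})
    except json.JSONDecodeError:
        return "unknown"
    return err.get("code", "unknown")

print(classify_error(RESPONSE_BODY))  # model_not_supported
```

A client could use this code to distinguish "model temporarily blocked" from other 400s and fall back to a different model instead of failing the whole request.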

Relevant API REQUEST output

Additional context

No response

Edit:
I just tested with Cline and it is also giving the same error. I'm wondering if they have blocked third party extensions.

@djex djex added the bug Something isn't working label Feb 26, 2025
@djex djex changed the title VS Code LM API - Copilot - Claude 3.7 Sonnet - "Model is not supported for this request."" VS Code LM API - Copilot - Claude 3.7 Sonnet - "Model is not supported for this request." Feb 26, 2025
@mrubens
Collaborator

mrubens commented Feb 26, 2025

I think they’re having some problems rolling it out on the copilot side. Are you able to send a copilot chat with the model?

@djex
Author

djex commented Feb 26, 2025

I think they’re having some problems rolling it out on the copilot side. Are you able to send a copilot chat with the model?

Yes, Copilot chat works just fine with both of the new Claude 3.7 models. It's just not working in Roo.

@llllllouis

I have the same issue. What is the solution? Copilot chat works fine.

@Leongeng2025

I ran into the same issue and am awaiting a solution. My Copilot chat works fine.

@YuShiWei923

Same issue in Roo; Copilot chat works fine.

@EndEdEd

EndEdEd commented Feb 26, 2025

Same here

@feijie999

me too

@GaryJurgens

Same issue in both Cline and Roo Code; Copilot Chat works fine. It was working yesterday morning, then there was the outage, but GitHub's status page now says it's resolved.

@yuan6785

Same here

@feijie999

I'm on the latest version of VS Code for macOS; I'm not sure whether the same problem exists on Windows.

@GaryJurgens

I see an update to this extension was pushed a few minutes ago. I upgraded and restarted everything, but I still hit the same problem. (Windows 11)

@EndEdEd

EndEdEd commented Feb 26, 2025

In my case it recovered hours ago, and it works fine inside GitHub Copilot. The problem appears when using it through the VS Code LM API.

@GaryJurgens

In my case it recovered hours ago, and it works fine inside GitHub Copilot. The problem appears when using it through the VS Code LM API.

You're 100% right.

@chuanhhoang

In my case it recovered hours ago, and it works fine inside GitHub Copilot. The problem appears when using it through the VS Code LM API.

No, it is not working, in both Cline and Roo Code. Copilot itself works fine.

@PointerSoftware

Same issue here
Request Failed: 400 {"error":{"message":"Model is not supported for this request.","param":"model","code":"model_not_supported","type":"invalid_request_error"}}

@truong211

GitHub seems to be blocking it; Claude 3.5 Sonnet is fine.

@truong211

I just tested with Cline and it is also giving the same error. I'm wondering if they have blocked third party extensions.

They blocked only the new Claude 3.7.

@truong211

I ran into the same issue and am awaiting a solution. My Copilot chat works fine.

The solution is to pay for the real thing, e.g. OpenRouter.

@nicucalcea

@truong211 this is asinine, 3.7 costs the same as 3.5.

@kiransterling

Same issue for me when trying to use Sonnet 3.7 through GitHub Copilot. I get this error: Request Failed: 400 {"error":{"message":"Model is not supported for this request.","param":"model","code":"model_not_supported","type":"invalid_request_error"}}

@vatva693

@truong211 this is asinine, 3.7 costs the same as 3.5.

The new Claude 3.7 Sonnet model has limited capacity, so they are prioritizing GitHub Copilot Chat users first and temporarily disabling third-party use of Claude 3.7 until they have more capacity.

@RebaiFedi

Me too, same problem!

@mrubens
Collaborator

mrubens commented Feb 26, 2025

Unfortunately I’m not aware of anything that we can do about this on the Roo Code end, but if anyone has ideas let me know!

@enerage

enerage commented Feb 26, 2025

So basically it also doesn't work for Copilot Chat; however, in the system message they have put

{"role":"user","content":"what llm model are you"},{"role":"assistant","content":"I'm GitHub Copilot, and I use the Claude 3.7 Sonnet (Preview) large language model as mentioned in my system instructions."}

So if you ask it, it will tell you that it is 3.7; however, at the end of the request body I can see they are using gpt-4o-mini instead.

I confronted it, and it told me: "Based on my capabilities and instructions, I'm using Claude 3.7 Sonnet (Preview). That said, the specific version isn't as important as being able to provide useful assistance with your project."

@alioshr

alioshr commented Feb 26, 2025

...however at the end of the body I can see they are using gpt-4o-mini instead

@enerage How did you infer this?

@enerage

enerage commented Feb 26, 2025

...however at the end of the body I can see they are using gpt-4o-mini instead

@enerage How did you infer this?

I was investigating the issue; I captured the request with Fiddler, and you can see the actual model used for the request there.

[Image: Fiddler capture of the request body]
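Checking the `model` field of a captured request body, as described above, takes only a few lines. A sketch with a hypothetical captured payload (the field names follow the OpenAI-style chat completions format; the exact body Copilot sends is an assumption here):

```python
import json

# Hypothetical captured request body, in the OpenAI-style chat format.
captured = json.dumps({
    "messages": [{"role": "user", "content": "what llm model are you"}],
    "model": "gpt-4o-mini",
    "stream": True,
})

# The model actually sent over the wire, regardless of what the
# assistant claims about itself in its replies.
actual_model = json.loads(captured).get("model")
print(actual_model)  # gpt-4o-mini
```

The point is that the wire-level `model` field, not the assistant's self-description, is what determines which model serves the request.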

@GaryJurgens

...however at the end of the body I can see they are using gpt-4o-mini instead

@enerage How did you infer this?

I was investigating the issue; I captured the request with Fiddler, and you can see the actual model used for the request there.

[Image: Fiddler capture of the request body]

Hmm, interesting.

@CoderF3ff

Maybe this debugging is helpful. It seems the root cause is in the model names.

cline#1972 (comment)

@iamfugui

me too

@yuzhi535

Same. If anyone has ideas, let me know!

@DaydreamCoding

Same. If anyone has ideas, let me know!

Simulate GitHub Copilot Chat's calling protocol.

@fuatu

fuatu commented Feb 27, 2025

Same issue for me as well.

@valienteCDV

+1

@elonmj

elonmj commented Feb 27, 2025

same issue

@Devil-Mix

Same issue here too.

Request Failed: 400 {"error":{"message":"Model is not supported for this request.","param":"model","code":"model_not_supported","type":"invalid_request_error"}}

@charithharshana

It was working initially in Roo Code. Then Claude 3.7 Sonnet was removed from Roo Code's VS Code LM API model list and came back a few hours later, but it hasn't worked since. :(

@jesussmile

Same here, not working.

@mirzaaghazadeh

[Image] Enable 3.7 in your settings: https://github.com/settings/copilot

@shiplove-fast

This is a working solution, but use it at your own risk:
https://github.com/jjleng/copilot-more

@leen12

leen12 commented Feb 28, 2025

Same issue here, and it is enabled in the settings.

@GaryJurgens

Same issue here, and it is enabled in the settings.

Same issue.

@majdaleid

Same issue here. Any solution? v3.7.8

@alioshr

alioshr commented Feb 28, 2025

@majdaleid see @shiplove-fast's comment above. I have tested it, and it works.

@dambros-nstech

dambros-nstech commented Feb 28, 2025

I tried the copilot-more recommended by @shiplove-fast but unfortunately couldn't get it to work either. There is an open issue on the repo with several people getting the same 421: API error: Misdirected Request error that I am.

@neKamita

This is a working solution, but use it at your own risk: https://github.com/jjleng/copilot-more

The main problem is that it gets rate-limited very quickly.

@DaydreamCoding

I tried the copilot-more recommended by @shiplove-fast but unfortunately couldn't get it to work either. There is an open issue on the repo with several people getting the same 421: API error: Misdirected Request error that I am.

For enterprise accounts, replace the URL in server.py with the api.business.githubcopilot.com format.

@vatva693

vatva693 commented Mar 1, 2025

Unfortunately I’m not aware of anything that we can do about this on the Roo Code end, but if anyone has ideas let me know!

https://github.com/jjleng/copilot-more still works fine with both claude-3.7-sonnet and claude-3.7-sonnet-thought. Can you investigate this? I think it's probably masking headers the same way normal GitHub Copilot chat does. @shiplove-fast

@talhavatan98

same issue :(

@laurencebush

Cannot get it working

@wolverin0

Unfortunately I’m not aware of anything that we can do about this on the Roo Code end, but if anyone has ideas let me know!

https://github.com/jjleng/copilot-more still works fine with both claude-3.7-sonnet and claude-3.7-sonnet-thought. Can you investigate this? I think it's probably masking headers the same way normal GitHub Copilot chat does. @shiplove-fast

@vatva693 could you provide a guide to installing this? I get the http://0.0.0.0:15432 server running, but in Roo Cline I set:
base URL: http://localhost:15432
API key: (what goes in here?)
model: claude-3.7-sonnet

and I get:

Error
Unexpected API Response: The language model did not provide any assistant messages. This may indicate an issue with the API or the model's output.

Roo is having trouble...
This may indicate a failure in his thought process or inability to use a tool properly, which can be mitigated with some user guidance (e.g. "Try breaking down the task into smaller steps").

@SamuelZeYu


@wolverin0 jjleng/copilot-more#37
