feat: Enable Copilot by default if available #1037

Merged
merged 7 commits into develop from feat/copilot-by-default on Oct 17, 2024

Conversation

dividedmind
Contributor

Changes the default to true and changes the flow a bit (this is executed on initialization and on state change):

  • If Copilot backend is enabled:
    • if Copilot is available, use it;
      • when it becomes unavailable, switch to default;
    • if LM not available (old VSCode), fall back to default backend;
    • if Copilot is not available, suggest installation;
      • when it becomes available, switch to it;
  • else:
    • show informational message if user manually disabled it.

Note: all of these messages are shown only once, unless the state change is caused by a user action.
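The flow above amounts to a small decision function. A minimal sketch (the names and shapes here are assumptions for illustration, not the extension's actual API):

```typescript
type Backend = 'copilot' | 'default';

// Sketch of the selection flow: run on initialization and on every state change.
export function selectBackend(opts: {
  copilotEnabled: boolean;   // the setting, now defaulting to true
  lmApiAvailable: boolean;   // false on old VS Code without the LM API
  copilotAvailable: boolean; // Copilot extension installed and active
}): { backend: Backend; suggestInstall: boolean } {
  // User manually disabled Copilot: use the default backend.
  if (!opts.copilotEnabled) return { backend: 'default', suggestInstall: false };
  // Old VS Code without the language model API: fall back silently.
  if (!opts.lmApiAvailable) return { backend: 'default', suggestInstall: false };
  // Copilot backend enabled but Copilot not present: suggest installing it.
  if (!opts.copilotAvailable) return { backend: 'default', suggestInstall: true };
  return { backend: 'copilot', suggestInstall: false };
}
```

When the state changes (e.g., Copilot becomes available or unavailable), re-running the function yields the switch described above.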

Other changes:

  • wording changes to reference Copilot instead of generic LM (we only really support it currently anyway),
  • make sure to restart Navie backend as necessary and without debouncing (which is only really necessary when changing the environment variables manually),
  • force OpenAI backend when using Copilot (to avoid issues with leftover environment variables when switching around).

@dustinbyrne
Contributor

A couple of things we talked about this morning:

  1. The dialog where VS Code requests the user's permission to allow AppMap to access the language model API needs to occur at an expected time.
  2. Users must not be able to enter a state where submitting a prompt results in an error because Copilot is not ready or properly configured.

One more thing I was just thinking about: what is this experience like on Windows? If we're hosting a proxy to the VSCode LM API, do Windows users need to accept access to the network? At what point does this appear? We may need to set expectations with the user.

E.g., this prompt:
image

@dividedmind
Contributor Author

I changed it so that it binds to localhost; this should avoid the Windows Firewall prompt (if I understand correctly).

I can't reproduce the GitHub authorization problem we've seen anymore:
Peek 2024-10-13 10-13

I'm not sure if Copilot changed something or if it was a fluke, but this behavior seems reasonable to me.

@dustinbyrne
Contributor

I was able to get into the following state, but I don't have a lead on how this occurred just yet:

1467526 [Stdout] spawned /home/db/.appmap/bin/appmap rpc --port 0 with options {"retryTimes":3,"retryThreshold":180000,"id":"rpc","binPath":"/home/db/.appmap/bin/appmap","log":{"name":"AppMap: Services"},"args":["rpc","--port","0"],"env":{"APPMAP_CODE_EDITOR":"vscode","APPMAP_API_URL":"https://api.getappmap.com","APPMAP_API_KEY":"***","OPENAI_API_KEY":"***","OPENAI_BASE_URL":"http://localhost:33211/vscode/copilot","APPMAP_NAVIE_MODEL":"gpt-4o","APPMAP_NAVIE_COMPLETION_BACKEND":"openai","APPMAP_NAVIE_TOKEN_LIMIT":"8000"}}
1467526 [Stderr] Using local Navie provider due to presence of environment variable OPENAI_API_KEY
1467526 [Stderr] Detected code editor: vscode
1467526 [Stderr] Setting RPC configuration: {"projectDirectories":["/home/db/dev/applandinc/appmap-server"],"appmapConfigFiles":["/home/db/dev/applandinc/appmap-server/appmap.yml"]}
1467526 [Stderr] RPC handler appmap.filter is deprecated, use appmap.data instead
1467526 [Stdout] Available JSON-RPC methods: appmap.data, appmap.filter, appmap.metadata, appmap.sequenceDiagram, appmap.stats, explain, explain.status, explain.thread.load, file.update, search, v1.appmap.stats, v1.configuration.get, v1.configuration.set, v1.navie.metadata, v1.navie.suggest, v2.appmap.stats, v2.configuration.get, v2.configuration.set
1467526 [Stderr] Consult @appland/rpc for request and response data types.
AppMap RPC process listening on port 37017
Pushing workspace configuration to RPC server:
- /home/db/dev/land-of-apps/sample_app_6th_ed/appmap.yml
1467526 [Stdout] Running JSON-RPC server on port: 37017
1467526 [Stderr] Setting RPC configuration: {"appmapConfigFiles":["/home/db/dev/land-of-apps/sample_app_6th_ed/appmap.yml"],"projectDirectories":["/home/db/dev/land-of-apps/sample_app_6th_ed"]}
1467526 [Stderr] (node:1467526) ExperimentalWarning: The Fetch API is an experimental feature. This feature could change at any time
1467526 [Stderr] (Use `appmap --trace-warnings ...` to show where the warning was created)
1467526 [Stderr] Thread 0ff3789e-4fe5-46bc-977e-a864319b65db not found
1467526 [Stderr] [remote-navie] Creating new thread 0ff3789e-4fe5-46bc-977e-a864319b65db (thread not found)
AppMap RPC process listening on port 37017
Pushing workspace configuration to RPC server:
- /home/db/dev/land-of-apps/sample_app_6th_ed/appmap.yml
1467526 [Stdout] [local-navie] Processing question 76b4e354-38f9-4050-b3ce-27ccb46db8eb in thread 0ff3789e-4fe5-46bc-977e-a864319b65db
1467526 [Stderr] Using OpenAI backend
1467526 [Stderr] Explain received context request: projectInfo
1467526 [Stderr] Setting RPC configuration: {"appmapConfigFiles":["/home/db/dev/land-of-apps/sample_app_6th_ed/appmap.yml"],"projectDirectories":["/home/db/dev/land-of-apps/sample_app_6th_ed"]}
1467526 [Stderr] APIConnectionError: Connection error.
1467526 [Stderr]     at OpenAI.makeRequest (/snapshot/appmap-js/node_modules/openai/core.js:304:19)
1467526 [Stderr]     at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
1467526 [Stderr]     at async /snapshot/appmap-js/node_modules/@langchain/openai/dist/chat_models.cjs:1306:29
1467526 [Stderr]     at async RetryOperation._fn (/snapshot/appmap-js/node_modules/p-retry/index.js:50:12) {
1467526 [Stderr]   status: undefined,
1467526 [Stderr]   headers: undefined,
1467526 [Stderr]   request_id: undefined,
1467526 [Stderr]   error: undefined,
1467526 [Stderr]   code: undefined,
1467526 [Stderr]   param: undefined,
1467526 [Stderr]   type: undefined,
1467526 [Stderr]   cause: FetchError: request to http://localhost:33211/vscode/copilot/chat/completions failed, reason: connect ECONNREFUSED ::1:33211
1467526 [Stderr]       at ClientRequest.<anonymous> (/snapshot/appmap-js/node_modules/node-fetch/lib/index.js:1491:11)
1467526 [Stderr]       at ClientRequest.emit (node:events:537:28)
1467526 [Stderr]       at Socket.socketErrorListener (node:_http_client:465:9)
1467526 [Stderr]       at Socket.emit (node:events:549:35)
1467526 [Stderr]       at emitErrorNT (node:internal/streams/destroy:151:8)
1467526 [Stderr]       at emitErrorCloseNT (node:internal/streams/destroy:116:3)
1467526 [Stderr]       at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
1467526 [Stderr]     type: 'system',
1467526 [Stderr]     errno: 'ECONNREFUSED',
1467526 [Stderr]     code: 'ECONNREFUSED'
1467526 [Stderr]   },
1467526 [Stderr]   attemptNumber: 1,
1467526 [Stderr]   retriesLeft: 6
1467526 [Stderr] }

@dividedmind
Contributor Author

dividedmind commented Oct 15, 2024

> I was able to get into the following state, but I don't have a lead on how this occurred just yet: [full log quoted above]

Is it possible that somehow the chat completion server got bound to IPv4 but the client resolves localhost to IPv6? I'm not sure how that might happen, they both use localhost so they should resolve the same, but that's the only thing I can think of.
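The `ECONNREFUSED ::1:33211` in the log is consistent with that theory: the client resolved `localhost` to the IPv6 loopback while the server was listening only on IPv4. One way to sidestep the resolution question entirely is to pin the base URL to an explicit loopback IP. A hypothetical helper (not the extension's actual code) illustrating the idea:

```typescript
// Hypothetical helper: build the proxy base URL with an explicit loopback IP
// so the client and server cannot disagree about how "localhost" resolves.
export function loopbackBaseUrl(port: number, path = '/vscode/copilot'): string {
  return `http://127.0.0.1:${port}${path}`;
}
```

The trade-off, discussed below, is that other components special-case the string `localhost`, so switching the URL scheme has knock-on effects.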

@@ -44,7 +45,7 @@ export default class ChatCompletion implements Disposable {
res.end(isNativeError(e) && e.message);
}
});
this.server.listen(portNumber);
this.server.listen(portNumber, 'localhost');
Contributor
Perhaps we hard code to bind to 127.0.0.1 here? I have a scenario where this resolves to IPv4 and the client resolves localhost to IPv6. I can't imagine we'd have any compatibility issues?

Contributor

afaik binding to 127.0.0.1 is pretty standard

Contributor Author
@dividedmind dividedmind Oct 15, 2024

I thought about this, but then we'd have to change the client code to call 127.0.0.1 across the board, and it would be broken with the mismatch. Here, calling it would be fine (because we're passing the URL to the CLI), but the components have special-casing for localhost in the backend display (so it won't say "(via copilot)" as it does now).

Contributor Author

For RPC it will be a bigger issue I imagine, since we only pass the port number to the UI to connect to the server IIRC.

@dustinbyrne out of curiosity, what's the scenario? Is it something that we expect to be common? I had another solution in mind: a shadow server that would (try to) bind to the other localhost. I expect this will be fairly simple to implement. (Obviously we'll have to do the same thing for RPC.)
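A minimal sketch of that shadow-server idea, assuming a plain Node `http` server (the actual ChatCompletion server carries more logic): serve the same handler on both loopback addresses so a client resolving `localhost` to either family can connect.

```typescript
import * as http from 'http';

// Sketch only: bind the handler on both the IPv4 and IPv6 loopback
// addresses. The IPv6 bind may fail on systems without ::1; ignore that.
export function listenOnBothLoopbacks(
  handler: http.RequestListener,
  port: number
): http.Server[] {
  const v4 = http.createServer(handler);
  v4.listen(port, '127.0.0.1');

  const v6 = http.createServer(handler);
  v6.on('error', () => {}); // IPv6 loopback unavailable: fall back to v4 only
  v6.listen(port, '::1');

  return [v4, v6];
}
```

In practice, with `--port 0` the real implementation would need to bind the IPv4 server first and reuse its assigned port for the IPv6 one, since two `listen(0, ...)` calls get different ephemeral ports.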

Contributor

This happened to both Kevin and me, and we're on different operating systems, so it seems like it could be fairly common. I don't think the UI is even a consideration here, as the Copilot proxy is only used as the OPENAI_BASE_URL.

I've added this commit to the branch which integrates the front end, and it does seem to resolve the issue.
b7a12bb

Contributor

> For RPC it will be a bigger issue I imagine, since we only pass the port number to the UI to connect to the server IIRC.

Ah, you mean the RPC also binding to localhost. Surprisingly, I haven't seen any issue here as of yet.

Contributor Author

> This happened to both Kevin and me, and we're on different operating systems, so it seems like it could be fairly common. I don't think the UI is even a consideration here, as the Copilot proxy is only used as the OPENAI_BASE_URL.

OPENAI_BASE_URL is used to display the backend: https://github.com/getappmap/appmap-js/blob/19ea34b4cf94b76e14a32b5a1587f3b88bcbde06/packages/components/src/components/chat-search/LlmConfiguration.vue#L121

But it actually does check for 127.0.0.1 as well, so it's fine. (I thought it only checked for localhost.)
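The check in question amounts to something like the following paraphrase (not the actual `LlmConfiguration.vue` code); note that the WHATWG `URL` parser keeps the brackets on an IPv6 hostname:

```typescript
// Paraphrase of the backend-display check: treat both spellings of the
// loopback address as "local" when deciding how to label the backend.
export function isLoopbackBaseUrl(baseUrl: string): boolean {
  try {
    const { hostname } = new URL(baseUrl);
    return (
      hostname === 'localhost' ||
      hostname === '127.0.0.1' ||
      hostname === '[::1]' // URL.hostname keeps brackets for IPv6 literals
    );
  } catch {
    return false; // not a parseable URL, so not a local proxy
  }
}
```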

Contributor Author

>> For RPC it will be a bigger issue I imagine, since we only pass the port number to the UI to connect to the server IIRC.
>
> Ah, you mean the RPC also binding to localhost. Surprisingly, I haven't seen any issue here as of yet.

We should keep an eye out for that.

Contributor Author

@dustinbyrne FYI, here's what I had in mind: getappmap/appmap-js#2075

It's only used locally anyway; security-wise it shouldn't matter much,
since it requires an authorization key to use. As such, this change is
mainly an effort to avoid tripping the Windows Firewall dialog.
@dustinbyrne dustinbyrne self-requested a review October 16, 2024 18:46
shouldRestart = true;
}

if (shouldRestart) this.restart();
Contributor Author
@dividedmind dividedmind Oct 16, 2024

It seems to me this should be unnecessary and is covered by the onDidChangeConfiguration callbacks (with the possible exception of changing the OpenAI API key).

@@ -169,6 +175,7 @@ export class ProcessWatcher implements vscode.Disposable {
}

async restart(): Promise<void> {
this._onRestart.fire();
Contributor Author

Shouldn't this fire after restart?

Contributor

I can rename it onBeforeRestart or something similar.


This change removes a global five-second debounce on changes to
`appMap.commandLineEnvironment`. However, that behavior is not ideal in
settings where the configuration is changed programmatically along with
other configuration that also restarts the service.

A 100 ms grace period is added so that multiple programmatic configuration
changes can be submitted before the process is restarted, without
triggering a longer debounce timeout.
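The grace period described in that commit can be sketched as follows (assumed names, not the extension's actual code; the injectable scheduler exists only so the sketch can be exercised without real timers):

```typescript
// A cancellable scheduler: returns a function that cancels the pending call.
type Scheduler = (fn: () => void, ms: number) => () => void;

const realScheduler: Scheduler = (fn, ms) => {
  const t = setTimeout(fn, ms);
  return () => clearTimeout(t);
};

// Coalesce rapid restart requests into a single restart after a short
// grace period, instead of a long global debounce.
export function makeCoalescedRestart(
  restart: () => void,
  graceMs = 100,
  schedule: Scheduler = realScheduler
): () => void {
  let cancel: (() => void) | undefined;
  return () => {
    cancel?.(); // a new request within the grace period supersedes the old one
    cancel = schedule(() => {
      cancel = undefined;
      restart();
    }, graceMs);
  };
}
```

A burst of programmatic configuration changes then triggers exactly one restart, roughly 100 ms after the last change.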
@dustinbyrne dustinbyrne merged commit 8a697e0 into develop Oct 17, 2024
4 checks passed
@dustinbyrne dustinbyrne deleted the feat/copilot-by-default branch October 17, 2024 14:32
@appland-release
Contributor

🎉 This PR is included in version 0.131.0 🎉

The release is available on:

Your semantic-release bot 📦🚀

