Listener memory leak in AI system #14902

Closed
colin-grant-work opened this issue Feb 11, 2025 · 1 comment · Fixed by #14920

@colin-grant-work (Contributor)

Bug Description:

There appears to be a memory leak of some kind in the AI system. When doing fairly ordinary chat interactions, I got the following warnings:

2025-02-11T19:19:17.819Z root WARN Possible Emitter memory leak detected. 1232 listeners added. Use event.maxListeners to increase the limit (175). MOST frequent listener (1057):
2025-02-11T19:19:17.819Z root WARN     at _event.Object.assign.maxListeners (...theia/examples/electron/lib/backend/packages_core_lib_common_index_js-node_modules_vscode-languageserver-types_lib_umd_sync_recursive.js:1456:54)
    at ...theia/examples/electron/lib/backend/main.js:6599:113
    at new Promise (<anonymous>)
    at [Symbol.asyncIterator] (...theia/examples/electron/lib/backend/main.js:6597:37)
    at asyncIterator.next (<anonymous>)
    at ...theia/examples/electron/lib/backend/main.js:5052:34
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-02-11T19:19:22.113Z root WARN Possible Emitter memory leak detected. 1320 listeners added. Use event.maxListeners to increase the limit (175). MOST frequent listener (1145):
2025-02-11T19:19:22.113Z root WARN     at _event.Object.assign.maxListeners (...theia/examples/electron/lib/backend/packages_core_lib_common_index_js-node_modules_vscode-languageserver-types_lib_umd_sync_recursive.js:1456:54)
    at ...theia/examples/electron/lib/backend/main.js:6599:113
    at new Promise (<anonymous>)
    at [Symbol.asyncIterator] (...theia/examples/electron/lib/backend/main.js:6597:37)
    at asyncIterator.next (<anonymous>)
    at ...theia/examples/electron/lib/backend/main.js:5052:34
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-02-11T19:19:25.447Z root WARN Possible Emitter memory leak detected. 1408 listeners added. Use event.maxListeners to increase the limit (175). MOST frequent listener (1233):
2025-02-11T19:19:25.447Z root WARN     at _event.Object.assign.maxListeners (...theia/examples/electron/lib/backend/packages_core_lib_common_index_js-node_modules_vscode-languageserver-types_lib_umd_sync_recursive.js:1456:54)
    at ...theia/examples/electron/lib/backend/main.js:6599:113
    at new Promise (<anonymous>)
    at [Symbol.asyncIterator] (...theia/examples/electron/lib/backend/main.js:6597:37)
    at asyncIterator.next (<anonymous>)
    at ...theia/examples/electron/lib/backend/main.js:5052:34
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-02-11T19:19:29.901Z root WARN Possible Emitter memory leak detected. 1496 listeners added. Use event.maxListeners to increase the limit (175). MOST frequent listener (1321):
2025-02-11T19:19:29.901Z root WARN     at _event.Object.assign.maxListeners (...theia/examples/electron/lib/backend/packages_core_lib_common_index_js-node_modules_vscode-languageserver-types_lib_umd_sync_recursive.js:1456:54)
    at ...theia/examples/electron/lib/backend/main.js:6599:113
    at new Promise (<anonymous>)
    at [Symbol.asyncIterator] (...theia/examples/electron/lib/backend/main.js:6597:37)
    at asyncIterator.next (<anonymous>)
    at ...theia/examples/electron/lib/backend/main.js:5052:34
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)

Those errors point to this code in the OpenAiLanguageModel:

cancellationToken?.onCancellationRequested(() => {
    rej(new Error('Canceled'));
    runnerEnd = true; // Stop the iterator
});

and to this loop in the LanguageModelFrontendDelegateImpl:

for await (const token of stream) {
    this.frontendDelegateClient.send(id, token);
}

The problem seems to be that a new onCancellationRequested listener is registered on every iteration of the stream. That registration should likely be extracted to the surrounding scope so it is added only once.
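
For illustration, a minimal self-contained sketch of the leaking pattern (hypothetical names and types, not the actual Theia sources): every call to next() on the async iterator creates a fresh Promise whose executor registers another onCancellationRequested listener on the same token, and none of them is ever disposed, so each streamed chunk adds one more listener.

// Minimal sketch of the leaking pattern; names and types are illustrative only.
interface Disposable { dispose(): void }
interface CancellationToken {
    onCancellationRequested(listener: () => void): Disposable;
}

function leakyStream(chunks: string[], cancellationToken?: CancellationToken): AsyncIterable<string> {
    let index = 0;
    return {
        [Symbol.asyncIterator]: () => ({
            next: () => new Promise<IteratorResult<string>>((res, rej) => {
                // Registered again on every chunk and never disposed: this is the leak.
                cancellationToken?.onCancellationRequested(() => rej(new Error('Canceled')));
                if (index < chunks.length) {
                    res({ value: chunks[index++], done: false });
                } else {
                    res({ value: undefined, done: true });
                }
            })
        })
    };
}

// Consuming it the way the frontend delegate does drives one listener registration per token:
// for await (const token of leakyStream(tokens, cancellationToken)) { ... }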

Steps to Reproduce:

Basically the 'To test' script from #14901

Additional Information

@JonasHelming (Contributor)

@eneufeld @sdirix

sdirix added a commit to eclipsesource/theia that referenced this issue Feb 13, 2025
Adjusts the 'OpenAiModel' to use a single cancellation listener instead
of adding a new one for every received chunk. This fixes the warning
about a possible emitter memory leak when receiving long responses from
the Open AI models.

fixes eclipse-theia#14902
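
A hedged sketch of that single-listener approach (illustrative names only, not the code from the referenced commit): the onCancellationRequested registration is hoisted out of the per-chunk Promise so it is added exactly once for the whole stream, and its disposable is cleaned up when iteration finishes or cancellation is observed.

// Sketch only; Disposable and CancellationToken are the same minimal shapes as in the sketch above.
interface Disposable { dispose(): void }
interface CancellationToken {
    onCancellationRequested(listener: () => void): Disposable;
}

function streamWithSingleListener(chunks: string[], cancellationToken?: CancellationToken): AsyncIterable<string> {
    let index = 0;
    let canceled = false;
    // Registered exactly once for the whole stream instead of once per chunk.
    const cancellationListener = cancellationToken?.onCancellationRequested(() => {
        canceled = true;
    });
    return {
        [Symbol.asyncIterator]: () => ({
            next: async (): Promise<IteratorResult<string>> => {
                if (canceled) {
                    cancellationListener?.dispose();
                    throw new Error('Canceled');
                }
                if (index < chunks.length) {
                    return { value: chunks[index++], done: false };
                }
                cancellationListener?.dispose();
                return { value: undefined, done: true };
            }
        })
    };
}

The actual change in the referenced commit may differ in detail; the point is simply that the listener count stays constant no matter how many chunks are streamed.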
@sdirix added the TheiaAI label Feb 13, 2025