
Add support for custom S3 endpoints in plugin-node #2174

Closed
dtbuchholz opened this issue Jan 12, 2025 · 1 comment · Fixed by #2176
Labels
enhancement New feature or request

Comments

@dtbuchholz
Contributor

Is your feature request related to a problem? Please describe.

The plugin-node provides an S3 upload API, but it assumes you're using AWS S3; you can't point it at other S3-compatible tooling because the request URL (endpoint) is not configurable.

Describe the solution you'd like

We should allow the plugin to take an additional (optional) env var called AWS_S3_ENDPOINT which, if set, is used as the endpoint URL for bucket operations. The plugin-node only needs a small change to handle the URL: if it's set, use it; otherwise, fall back to AWS's default endpoint, which keeps the setup backward compatible:

import { S3Client, S3ClientConfig } from "@aws-sdk/client-s3";

// AWS_REGION, AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_S3_ENDPOINT
// are read from the environment / runtime settings.
const clientConfig: S3ClientConfig = {
    region: AWS_REGION,
    credentials: {
        accessKeyId: AWS_ACCESS_KEY_ID,
        secretAccessKey: AWS_SECRET_ACCESS_KEY,
    },
};
if (AWS_S3_ENDPOINT) {
    // Point the client at a custom S3-compatible endpoint (e.g., MinIO).
    clientConfig.endpoint = AWS_S3_ENDPOINT;
    // Most S3-compatible services expect path-style URLs.
    clientConfig.forcePathStyle = true;
}

this.s3Client = new S3Client(clientConfig);
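
For what it's worth, forcePathStyle is set here because most S3-compatible services (MinIO included) serve buckets at path-style URLs rather than AWS's virtual-hosted-style bucket subdomains.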

Describe alternatives you've considered

There's no real alternative: if you're using non-AWS tooling with an S3-compatible API, you must be able to customize this URL. Without this feature, calling, e.g., uploadFile against a local S3 service fails with an error like:

The AWS Access Key Id you provided does not exist in our records.

Additional context

You can test this pattern against something like MinIO. Run a local MinIO server, set the env var AWS_S3_ENDPOINT in your Eliza setup, and then upload a file.
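
As a minimal sketch of that test (the endpoint, bucket name, and minioadmin credentials below are just MinIO's local defaults and placeholders, not part of the plugin):

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

// Assumes a local MinIO server on port 9000 with its default credentials.
const client = new S3Client({
    region: "us-east-1",
    endpoint: process.env.AWS_S3_ENDPOINT ?? "http://localhost:9000",
    forcePathStyle: true,
    credentials: {
        accessKeyId: process.env.AWS_ACCESS_KEY_ID ?? "minioadmin",
        secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY ?? "minioadmin",
    },
});

// Upload a small test object to a pre-created bucket.
await client.send(
    new PutObjectCommand({
        Bucket: "eliza-test", // hypothetical bucket name
        Key: "hello.txt",
        Body: "hello from plugin-node",
    })
);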

Outside the scope of this issue, it'd be nice if the S3 plugin also supported the following; these would be useful for additional agent memory or storage options:

  • Creating buckets
  • Listing buckets
  • Querying buckets
  • Getting / downloading objects

If these additional features are desirable, I can open a separate issue for adding new plugin-node S3 methods.
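
As a rough, hedged sketch (illustrative names, not the plugin's actual API), these could map onto existing @aws-sdk/client-s3 commands:

import {
    S3Client,
    CreateBucketCommand,
    ListBucketsCommand,
    ListObjectsV2Command,
    GetObjectCommand,
} from "@aws-sdk/client-s3";

// `client` is an S3Client configured as shown above.
async function createBucket(client: S3Client, bucket: string) {
    await client.send(new CreateBucketCommand({ Bucket: bucket }));
}

async function listBuckets(client: S3Client) {
    const { Buckets } = await client.send(new ListBucketsCommand({}));
    return Buckets ?? [];
}

async function listObjects(client: S3Client, bucket: string, prefix?: string) {
    const { Contents } = await client.send(
        new ListObjectsV2Command({ Bucket: bucket, Prefix: prefix })
    );
    return Contents ?? [];
}

async function getObjectText(client: S3Client, bucket: string, key: string) {
    const { Body } = await client.send(
        new GetObjectCommand({ Bucket: bucket, Key: key })
    );
    // transformToString is available on the response body in recent SDK v3 versions on Node.
    return Body ? await Body.transformToString() : undefined;
}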

@dtbuchholz added the enhancement label on Jan 12, 2025
Contributor

Hello @dtbuchholz! Welcome to the ai16z community. Thank you for opening your first issue; we appreciate your contribution. You are now an ai16z contributor!
