**Is your feature request related to a problem? Please describe.**
The `plugin-node` package provides an S3-compatible API. However, it assumes you're using AWS S3; you can't use it with other S3-compatible tooling because the request URL is not configurable.
**Describe the solution you'd like**
The plugin should accept an additional (optional) env var called `AWS_S3_ENDPOINT`, which, if set, is used as the endpoint URL for bucket operations. `plugin-node` only needs a small change to handle the URL: if the variable is set, use it; otherwise, default to AWS. This keeps the setup backward compatible.
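A minimal sketch of the conditional endpoint logic, assuming the plugin builds its client config from environment variables (the `buildS3Config` helper and the `forcePathStyle` choice are illustrative, not the plugin's actual code):

```typescript
// Hypothetical helper illustrating the proposed behavior; plugin-node's
// actual client setup may differ.
type S3Config = {
  region: string;
  endpoint?: string;
  forcePathStyle?: boolean;
};

function buildS3Config(env: Record<string, string | undefined>): S3Config {
  const base: S3Config = { region: env.AWS_REGION ?? "us-east-1" };
  if (env.AWS_S3_ENDPOINT) {
    // Custom endpoint set (e.g. a local MinIO server): use it.
    // forcePathStyle avoids virtual-hosted-style bucket URLs, which many
    // S3-compatible services don't support.
    return { ...base, endpoint: env.AWS_S3_ENDPOINT, forcePathStyle: true };
  }
  // Not set: fall back to AWS defaults, so existing setups keep working.
  return base;
}
```

The resulting object can be spread into the `S3Client` constructor from `@aws-sdk/client-s3`, which accepts an `endpoint` option.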
**Describe alternatives you've considered**
If you're using non-AWS tooling that exposes an S3-compatible API, you must be able to customize this URL; there's no alternative. Without this feature, calling e.g. `uploadFile` against a local S3 service fails with an error like:
The AWS Access Key Id you provided does not exist in our records.
**Additional context**
You can test this pattern against something like MinIO: run a local MinIO server, set the `AWS_S3_ENDPOINT` env var in your Eliza setup, and then upload a file.
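For example (the MinIO image, port, and credentials below are illustrative defaults, not a required setup):

```shell
# Start a local MinIO server (S3-compatible) via Docker.
docker run -p 9000:9000 quay.io/minio/minio server /data

# Then point the plugin at it in your Eliza .env (credentials must match
# whatever your MinIO instance is configured with):
# AWS_S3_ENDPOINT=http://localhost:9000
# AWS_ACCESS_KEY_ID=minioadmin
# AWS_SECRET_ACCESS_KEY=minioadmin
```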
Outside the scope of this issue, it would be nice if the S3 plugin also supported the following. These would be useful as additional agent memory or storage options:
- Creating buckets
- Listing buckets
- Querying buckets
- Getting / downloading objects
If these additional features are desirable, I can open a separate issue for adding new `plugin-node` S3 methods.