
Increase Capella RAM per node to 32 GB #431

Open
gopa-noaa opened this issue Oct 16, 2024 · 4 comments
Assignees: gopa-noaa
Labels: Capella, couchbase, task

Comments

@gopa-noaa (Contributor)

Capella is giving low resident ratio errors; we need to discuss whether we can bump up RAM per node to 32 GB. This will increase our credit burn rate, so we need to estimate the new burn rate and see how long the current credits will last at the increased rate.
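To frame the discussion, here is a back-of-the-envelope sketch of the burn-rate math; the node count, per-node credit rates, and remaining-credit balance below are placeholders for illustration, not our actual Capella numbers.

```python
# Rough burn-rate estimate for the proposed RAM bump.
# All figures are placeholders for discussion, not actual Capella pricing.

NODES = 3                             # hypothetical node count
CREDITS_PER_NODE_HOUR_CURRENT = 1.0   # assumed rate for the current node size
CREDITS_PER_NODE_HOUR_32GB = 1.6      # assumed rate after bumping RAM to 32 GB
CREDITS_REMAINING = 50_000            # hypothetical remaining credit balance

current_burn = NODES * CREDITS_PER_NODE_HOUR_CURRENT * 24  # credits per day now
new_burn = NODES * CREDITS_PER_NODE_HOUR_32GB * 24         # credits per day after the bump

print(f"current burn:       {current_burn:.0f} credits/day")
print(f"new burn:           {new_burn:.0f} credits/day")
print(f"runway at new rate: {CREDITS_REMAINING / new_burn:.0f} days")
```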

gopa-noaa added the couchbase, task, and Capella labels on Oct 16, 2024
@gopa-noaa gopa-noaa self-assigned this Oct 16, 2024
@ian-noaa (Contributor) commented Oct 16, 2024

A couple of questions for discussion:

  1. What cluster resources was our original estimate based on? If this increase keeps us within that estimate, I think we can just go ahead and apply it.
  2. Since we're learning about the system, it'd be good to know what's causing the increased need for memory.
  3. It'd be good to know why we didn't get alerts for the low resident ratio errors.

@gopa-noaa (Contributor, Author)

Our estimate and recommended configuration was 4 CPU, 32 GB RAM, and 3 TB per node. Since we have been burning credits at a lower rate than expected so far, it should be fine to bump up to 32 GB. Per today's discussion I will create another issue to address Capella TTLs, since we already have about 16 million documents in Capella.
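As a sanity check on whether 32 GB per node would clear the low resident ratio warnings, here is a rough sketch using the roughly 16 million documents mentioned above; the average document size, replica count, node count, and bucket quota fraction are assumptions, not measured values.

```python
# Rough estimate of the data-service resident ratio after the RAM bump.
# Resident ratio is roughly: active data held in memory / total active data.
# Document size, replica count, node count, and quota fraction are assumptions.

DOC_COUNT = 16_000_000                # from this issue: ~16 million documents
AVG_DOC_BYTES = 2_048                 # assumed average document size
REPLICAS = 1                          # assumed replica count
NODES = 3                             # hypothetical node count
BUCKET_QUOTA_GB_PER_NODE = 32 * 0.9   # assume ~90% of node RAM goes to the bucket quota

total_data_gb = DOC_COUNT * AVG_DOC_BYTES * (1 + REPLICAS) / 1024**3
total_quota_gb = NODES * BUCKET_QUOTA_GB_PER_NODE

resident_ratio = min(1.0, total_quota_gb / total_data_gb)
print(f"data (active + replica):  {total_data_gb:.1f} GB")
print(f"cluster RAM quota:        {total_quota_gb:.1f} GB")
print(f"estimated resident ratio: {resident_ratio:.0%}")
```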

@ian-noaa (Contributor) commented Oct 16, 2024

For time-to-live, we had some prior discussion in #131, so we could either reuse that issue or summarize its conclusions and open questions into a new implementation-focused issue.
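For reference when we pick an approach: one option is setting a per-document expiry at write time from the SDK, as in the sketch below (Couchbase Python SDK 4.x); the endpoint, credentials, and bucket/scope/collection names are placeholders, not our real Capella settings. A bucket-level maximum TTL, if we go that route instead, would be configured on the Capella side rather than in application code.

```python
# Minimal sketch: per-document TTL set at write time with the Couchbase Python SDK (4.x).
# Endpoint, credentials, and bucket/scope/collection names are placeholders.
from datetime import timedelta

from couchbase.auth import PasswordAuthenticator
from couchbase.cluster import Cluster
from couchbase.options import ClusterOptions, UpsertOptions

cluster = Cluster(
    "couchbases://cb.example.cloud.couchbase.com",  # placeholder Capella endpoint
    ClusterOptions(PasswordAuthenticator("db_user", "db_password")),
)
collection = (
    cluster.bucket("example-bucket")
    .scope("example-scope")
    .collection("example-collection")
)

# This document becomes eligible for expiry 90 days after the write.
collection.upsert(
    "doc::example-key",
    {"source": "example", "payload": "..."},
    UpsertOptions(expiry=timedelta(days=90)),
)
```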

@gopa-noaa (Contributor, Author)

Forgot about that one, thanks. Let me read up and see which would be the better approach.
