I'm trying to implement a read-only replica using the ephemeral clone feature. This comment describes basically what I do; the only difference is that I don't have a destination bucket, but otherwise the setup is the same.
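Roughly, this is what I do (a sketch only, adapted from the rocksdb-cloud durable example: the bucket name and region are placeholders, and the exact signatures of CloudEnv::NewAwsEnv and DBCloud::Open may differ between rocksdb-cloud versions):

```cpp
// Sketch of an ephemeral clone: source bucket only, no destination bucket,
// so nothing is ever written back to cloud storage.
#include <cstdlib>

#include "rocksdb/cloud/db_cloud.h"
#include "rocksdb/options.h"

using namespace ROCKSDB_NAMESPACE;

int main() {
  CloudEnvOptions cloud_env_options;

  // Credentials from environment variables (placeholder; they could also
  // come from the default AWS credential chain / instance profile).
  if (!getenv("AWS_ACCESS_KEY_ID") || !getenv("AWS_SECRET_ACCESS_KEY")) {
    return 1;
  }
  cloud_env_options.credentials.InitializeSimple(
      getenv("AWS_ACCESS_KEY_ID"), getenv("AWS_SECRET_ACCESS_KEY"));

  // Source bucket that holds the primary's files. "/data" matches the
  // object path and local dir seen in the log line below.
  cloud_env_options.src_bucket.SetBucketName("my-bucket");
  cloud_env_options.src_bucket.SetObjectPath("/data");
  cloud_env_options.src_bucket.SetRegion("eu-west-1");
  // dest_bucket is intentionally left unset -> ephemeral clone.

  CloudEnv* cenv = nullptr;
  Status s = CloudEnv::NewAwsEnv(Env::Default(), cloud_env_options,
                                 nullptr /* info_log */, &cenv);
  if (!s.ok()) {
    return 1;
  }

  Options options;
  options.env = cenv;
  options.max_open_files = 16000;  // instead of -1, to bound memory usage

  DBCloud* db = nullptr;
  s = DBCloud::Open(options, "/data", "" /* persistent cache path */,
                    0 /* persistent cache size, GB */, &db,
                    true /* read_only */);
  if (!s.ok()) {
    delete cenv;
    return 1;
  }

  // ... serve read-only traffic ...

  delete db;
  delete cenv;
  return 0;
}
```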
Looking at my log I can see:
2024/10/02-09:40:36.036061 7fcdaac78000 [cloud_env_impl] SanitizeDirectory info. No destination bucket specified and options.max_open_files != -1 so sst files from src bucket /data are not copied into local dir /data at startup
We've put a limit on this option (max_open_files = 16000), as we've seen high memory consumption with -1. However, this log line raises some questions:
- When using a value other than -1, not all files are copied when we open the DB. But some of them are, and all of them will be at some point, right?
- How does that work? Is there some background job doing it? I couldn't find the info in the code. If someone can point me to the right place, that'd be helpful 🙏
- When using -1, are all the files copied before the DB is ready to serve requests? If so, opening a DB with a large amount of data will take very long, right?
Thanks in advance for any response/remark/feedback