With a few thousand files, listing is currently pretty slow even on a fast connection. The default is 100 files/request. Would raising that to 1000 slow down requests for users with <= 100 files? Would one large request be faster than 10 small ones? Or would a dynamic increase be more sensible, i.e. request 100, then 200, then 400, then 800, and so on? The documented per-request limit is 1000.
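For illustration, here's a minimal C++ sketch of that doubling strategy. `fetch_page` is a hypothetical stand-in for the real Documents List request (which would pass the page size via `max-results` and follow the feed's next-page link); here it just simulates paging through a fixed number of files:

```cpp
#include <algorithm>
#include <iostream>
#include <string>

// Hypothetical stand-in for the real request: it would send
// ?max-results=<page_size>, append the returned entries, and return
// the next-page token (empty once the listing is complete). Here it
// just simulates a listing of `total` files.
static std::string fetch_page(int page_size, const std::string& token,
                              int& fetched, int total) {
    fetched = std::min(fetched + page_size, total);
    std::cout << "requested " << page_size << " -> have "
              << fetched << "/" << total << " files\n";
    return fetched < total ? "more" : "";
}

int main() {
    const int kMaxPageSize = 1000;  // documented per-request limit
    int page_size = 100;            // current default keeps small accounts fast
    int fetched = 0;
    std::string token;
    do {
        token = fetch_page(page_size, token, fetched, /*total=*/3500);
        // Double the page size each round: 100, 200, 400, 800, 1000, ...
        page_size = std::min(page_size * 2, kMaxPageSize);
    } while (!token.empty());
}
```

Users with <= 100 files still pay only for one small request, while large accounts quickly ramp up to the 1000-file maximum.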
One thing worth checking: now that the connection isn't destroyed during the file-list phase, libcurl may be able to reuse it. That might reduce per-request overhead enough that this value could be lowered from 1000 again.
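As a rough sketch of the reuse point: performing each page request on the same libcurl easy handle lets libcurl keep the underlying connection (and TLS session) alive, whereas tearing the handle down between pages forces a fresh handshake every time. The URLs below are illustrative, not the project's actual request code:

```cpp
#include <curl/curl.h>

int main() {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL* curl = curl_easy_init();
    if (!curl) return 1;

    // Illustrative page URLs; real code would take the next page's URL
    // from the feed's "next" link rather than hard-coding it.
    const char* pages[] = {
        "https://docs.google.com/feeds/default/private/full?max-results=1000",
        "https://docs.google.com/feeds/default/private/full?max-results=1000&start-index=1001",
    };

    for (size_t i = 0; i < sizeof(pages) / sizeof(pages[0]); ++i) {
        curl_easy_setopt(curl, CURLOPT_URL, pages[i]);
        // Same handle for every page: libcurl reuses the open connection.
        if (curl_easy_perform(curl) != CURLE_OK) break;
    }

    // Destroying the handle is what closes the connection, so doing this
    // per page (as the old code effectively did) defeats the reuse.
    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return 0;
}
```

If connection reuse removes most of the per-request cost, the win from very large pages shrinks, which is what makes revisiting the 1000 setting worthwhile.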
I should ask the Google Drive folks during their next office hours what they think is a good approach for the different use cases.
See: https://developers.google.com/google-apps/documents-list/#getting_all_pages_of_documents_and_files