When copying a large S3 object with `s3.object.copy()` / `s3.client.copy()`, aioboto3 uses multiple `uploader()` tasks to manage the transfer. However, it puts no limit on the number of tasks created, ignoring the `TransferConfig.max_request_concurrency` value.

This should be easy to fix: create an `asyncio.Semaphore()` sized from the `max_request_concurrency` value for the transfer, then have each `uploader()` task start with `async with <shared semaphore>:` to limit the number of concurrently active tasks.
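A minimal sketch of that approach, not aioboto3's actual internals: `copy_parts()` and `upload_part_once()` are hypothetical stand-ins for the library's transfer loop and per-part copy request, while `uploader()` and `max_request_concurrency` come from the issue itself.

```python
import asyncio


async def upload_part_once(part):
    """Hypothetical stand-in for the real per-part copy request."""
    await asyncio.sleep(0.1)


async def copy_parts(parts, max_request_concurrency):
    # One semaphore shared by every uploader() task in this transfer,
    # sized from TransferConfig.max_request_concurrency.
    semaphore = asyncio.Semaphore(max_request_concurrency)

    async def uploader(part):
        # Acquiring the semaphore caps how many tasks do work at once;
        # the remaining tasks wait here instead of all running in parallel.
        async with semaphore:
            await upload_part_once(part)

    # All tasks can still be created up front; the semaphore, not the
    # task count, bounds the number of in-flight requests.
    await asyncio.gather(*(uploader(p) for p in parts))


asyncio.run(copy_parts(range(100), max_request_concurrency=10))
```

Since `async with semaphore:` releases the slot even if the part upload raises, failed parts don't leak concurrency slots for the rest of the transfer.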