Lost network connection while downloading from S3 inside Promise.all()

I am working on a Node.js server that downloads 1000 PDF files (9 MB each) from S3, compresses them into a zip file, and returns the URL.

However, a problem occurs while downloading from S3: after roughly 100 downloads complete, the server loses its internet connection entirely.
It does not seem related to total size, because a test with 5 files of 1 GB each succeeded.

This then causes the MySQL connection to be lost and the transaction to fail:

[ERROR] Error: BEGIN; - Can't add new command when connection is in closed state

I suspect that heavy network operations should not all be launched at once inside Promise.all(), but I don't know the exact cause. Help!

const readablePayloads: Array<{ filename: string; buffer: Readable }> = [];

await Promise.all(
  Ids.map(async (id) => {
    // get key, bucket, filename by id

    const buffer = await S3Helper.getReadable({ bucket, key }, { accessKeyId, secretAccessKey });
    readablePayloads.push({ filename, buffer });
  })
);

static async getReadable(opts: { bucket: string; key: string }, awsOpts: AwsOptions): Promise<Readable> {
  const response = await createS3Client(awsOpts).send(
    new AWS.GetObjectCommand({ Bucket: opts.bucket, Key: opts.key })
  );
  return response.Body as Readable;
}

To be sure, this code worked when there were fewer than about 100 connections.
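Launching all 1000 GetObject requests at once can exhaust sockets or file descriptors, which matches the "works under ~100 connections" symptom. A common remedy is to cap concurrency instead of mapping everything into one Promise.all(). Below is a minimal sketch of such a limiter; the helper name mapWithConcurrency and the worker-pool approach are my own (libraries such as p-limit implement the same idea):

```typescript
// Runs `fn` over `items` with at most `limit` tasks in flight at a time.
// Results are returned in the same order as the input.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0; // index of the next item to claim

  // Each worker repeatedly claims the next unprocessed index until none remain.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }

  // Start at most `limit` workers (fewer if there are fewer items).
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, () => worker())
  );
  return results;
}

// Hypothetical usage against the question's code, e.g. with a limit of 20:
// const payloads = await mapWithConcurrency(Ids, 20, async (id) => {
//   const buffer = await S3Helper.getReadable({ bucket, key }, creds);
//   return { filename, buffer };
// });
```

With this in place, only a bounded number of S3 downloads run simultaneously, so the remaining sockets (including the MySQL connection) stay usable.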

  • PDF files don't compress well when zipped: internally they are already compressed, much like a zip file, with only a thin outer layer of plain text (whereas the outer layer of a zip is not text). So there is little benefit in building a 9 GB archive, since 2 GB is often treated as a practical limit and, at 20:1, a good working size is around 100 MB, about the sensible maximum for a PDF of, say, 1000 pages. These are all rules of thumb, and each use case has its own optimal chunking, but your method seems to be laying the groundwork for later problems with file handles.
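The comment's core claim, that already-compressed data cannot be compressed much further, can be checked with a quick sketch using Node's built-in zlib module (random bytes stand in for the already-compressed streams inside a PDF; this is an illustration, not a measurement on real PDFs):

```typescript
import { deflateSync } from "node:zlib";
import { randomBytes } from "node:crypto";

// Already-compressed content is statistically close to random,
// so deflate cannot shrink it; redundant text compresses very well.
const pdfLike = randomBytes(1_000_000);     // stands in for compressed PDF streams
const text = Buffer.alloc(1_000_000, "a");  // highly redundant text

const ratioPdfLike = deflateSync(pdfLike).length / pdfLike.length; // close to 1.0 (no gain)
const ratioText = deflateSync(text).length / text.length;          // close to 0.0 (huge gain)

console.log(ratioPdfLike.toFixed(3), ratioText.toFixed(3));
```

This is why zipping PDFs mostly just bundles them; using the zip "store" (no compression) mode avoids wasting CPU on data that will not shrink.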

