I am trying to upload a file to my AWS bucket using the AWS multipart upload. This works great for smaller files; however, when I try to upload a large file (which is split into 170 parts), I get one of the following errors:

multiErr, upload part error: { [RequestTimeTooSkewed: The difference between the request time and the current time is too large.]

OR this error:

multiErr, upload part error: { [TimeoutError: Connection timed out after 120000ms]

Any idea how this can be fixed? Here is my code:

var fs = require('fs');
var AWS = require('aws-sdk');
AWS.config.loadFromPath('./config.json');
var s3 = new AWS.S3();

// File
var fileName = 'atom.mov';
var filePath = './' + fileName;
var fileKey = fileName;
var buffer = fs.readFileSync(filePath); // filePath already includes './'
// S3 Upload options
var bucket = 'test.bucket.1234';

// Upload
var startTime = new Date();
var partNum = 0;
var partSize = 1024 * 1024 * 5; // Minimum 5MB per chunk (except the last part) http://docs.aws.amazon.com/AmazonS3/latest/API/mpUploadComplete.html
var numPartsLeft = Math.ceil(buffer.length / partSize);
var maxUploadTries = 3;
var multiPartParams = {
    Bucket: bucket,
    Key: fileKey,
    ContentType: 'application/mov'
};
var multipartMap = { 
    Parts: []
};

function completeMultipartUpload(s3, doneParams) {
  s3.completeMultipartUpload(doneParams, function(err, data) {
    if (err) {
      console.log("An error occurred while completing the multipart upload");
      console.log(err);
    } else {
      var delta = (new Date() - startTime) / 1000;
      console.log('Completed upload in', delta, 'seconds');
      console.log('Final upload data:', data);
    }
  });
}

function uploadPart(s3, multipart, partParams, tryNum) {
  tryNum = tryNum || 1;
  s3.uploadPart(partParams, function(multiErr, mData) {
    if (multiErr){
      console.log('multiErr, upload part error:', multiErr);
      if (tryNum < maxUploadTries) {
        console.log('Retrying upload of part: #', partParams.PartNumber)
        uploadPart(s3, multipart, partParams, tryNum + 1);
      } else {
        console.log('Failed uploading part: #', partParams.PartNumber)
      }
      return;
    }
    multipartMap.Parts[this.request.params.PartNumber - 1] = {
      ETag: mData.ETag,
      PartNumber: Number(this.request.params.PartNumber)
    };
    console.log("Completed part", this.request.params.PartNumber);
    console.log('mData', mData);
    if (--numPartsLeft > 0) return; // complete only when all parts uploaded

    var doneParams = {
      Bucket: bucket,
      Key: fileKey,
      MultipartUpload: multipartMap,
      UploadId: multipart.UploadId
    };

    console.log("Completing upload...");
    completeMultipartUpload(s3, doneParams);
  });
}

// Multipart
console.log("Creating multipart upload for:", fileKey);
s3.createMultipartUpload(multiPartParams, function(mpErr, multipart){
  if (mpErr) { console.log('Error!', mpErr); return; }
  console.log("Got upload ID", multipart.UploadId);

  // Grab each partSize chunk and upload it as a part
  for (var rangeStart = 0; rangeStart < buffer.length; rangeStart += partSize) {
    partNum++;
    var end = Math.min(rangeStart + partSize, buffer.length),
        partParams = {
          Body: buffer.slice(rangeStart, end),
          Bucket: bucket,
          Key: fileKey,
          PartNumber: String(partNum),
          UploadId: multipart.UploadId
        };

    // Send a single part
    console.log('Uploading part: #', partParams.PartNumber, ', Range start:', rangeStart);
    uploadPart(s3, multipart, partParams);
  }
});
  • Timeouts are a thing. They happen. But it looks as though you need to re-sign the requests before retrying them when a timeout occurs; otherwise the date in the request will be too stale, which is what RequestTimeTooSkewed means. Commented Sep 14, 2016 at 21:40
  • Hmm okay, how do I re-sign the requests before retrying them? Commented Sep 15, 2016 at 7:20
  • Actually, looking at the code more closely, I think that's already happening for you. The problem is that you are starting all 170 parts in rapid succession, leading to 170 simultaneous connections, which is potentially too many. That would lead to timeouts and indirectly to requests being delayed due to local resource (CPU, network) saturation. I solved this when I wrote an async parallel multipart uploader (not in js) by storing an array of all the needed part numbers and ranges, starting the first "n" parts (a configurable value) in a loop, and then each time the callback fired... Commented Sep 15, 2016 at 13:42
  • ...after a successful upload of any one part, I popped the next member off the array and started uploading the next part, until all had been successfully uploaded... so I always had no more than "n" parts running at the same time. On a 100 Mbit/sec connection, I found between 8 and 16 simultaneous uploads to be the optimal sweet spot. Does that make sense? It would explain why smaller files work fine but large ones fail. (A sketch of this approach follows these comments.) Commented Sep 15, 2016 at 13:46
  • Yes it makes sense. Thanks. I will try to implement this next week Commented Sep 15, 2016 at 14:28
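
For reference, here is a minimal sketch of the bounded-concurrency approach the commenter describes, adapted to the question's code. It reuses buffer, partSize, bucket, fileKey, multiPartParams, multipartMap, and completeMultipartUpload from above; MAX_CONCURRENT is an assumed tuning knob (the commenter suggests 8-16), and the per-part retry logic is omitted for brevity:

// Sketch: upload at most MAX_CONCURRENT parts at a time instead of all 170 at once.
var MAX_CONCURRENT = 8; // assumed value; 8-16 was suggested for a 100 Mbit/s link

s3.createMultipartUpload(multiPartParams, function(mpErr, multipart) {
  if (mpErr) { console.log('Error!', mpErr); return; }

  // Build the full queue of part params up front
  var queue = [];
  var partNum = 0;
  for (var rangeStart = 0; rangeStart < buffer.length; rangeStart += partSize) {
    partNum++;
    queue.push({
      Body: buffer.slice(rangeStart, Math.min(rangeStart + partSize, buffer.length)),
      Bucket: bucket,
      Key: fileKey,
      PartNumber: String(partNum),
      UploadId: multipart.UploadId
    });
  }

  var remaining = queue.length;

  function next() {
    var partParams = queue.shift();
    if (!partParams) return; // queue drained; in-flight parts will finish on their own
    s3.uploadPart(partParams, function(err, data) {
      if (err) { console.log('upload part error:', err); return; } // add retry logic here
      multipartMap.Parts[partParams.PartNumber - 1] = {
        ETag: data.ETag,
        PartNumber: Number(partParams.PartNumber)
      };
      console.log('Completed part', partParams.PartNumber);
      if (--remaining === 0) {
        completeMultipartUpload(s3, {
          Bucket: bucket,
          Key: fileKey,
          MultipartUpload: multipartMap,
          UploadId: multipart.UploadId
        });
      } else {
        next(); // a slot freed up; start the next queued part
      }
    });
  }

  // Kick off no more than MAX_CONCURRENT uploads; the rest start as slots free up
  for (var i = 0; i < Math.min(MAX_CONCURRENT, queue.length); i++) next();
});

Because each queued request is only signed when s3.uploadPart is actually called, parts that wait in the queue get a fresh signature, which also addresses the stale-signature concern from the first comment.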

2 Answers


Try setting the timeout property in your config. This worked for my 500MB+ files; 30 minutes might be overkill, but it worked ;)

var aws = require('aws-sdk');

aws.config.update({
    region: "your region",
    accessKeyId: "your access key",
    secretAccessKey: "your secret",
    httpOptions: {"timeout: 1800000"}
});
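
Since aws.config.update changes the global configuration, any S3 client created afterwards picks up the longer timeout:

var s3 = new aws.S3(); // inherits httpOptions.timeout from aws.config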


The default timeout for s3.upload is 120000 ms (two minutes). You might want to increase it when uploading large files, or if you have a poor connection.

Use the S3 constructor to set the default timeout.

const AWS = require('aws-sdk');

const s3 = new AWS.S3({
  httpOptions: {
    timeout: 300000 // five minutes
  }
  // ... other config
});

Take a look at the S3 documentation.
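
For example, a client configured this way is used with s3.upload as usual (the bucket and file names below are placeholders):

const fs = require('fs');

s3.upload({
  Bucket: 'your-bucket',
  Key: 'big-file.mov',
  Body: fs.createReadStream('./big-file.mov')
}, (err, data) => {
  if (err) console.log(err);
  else console.log('Uploaded to', data.Location);
});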
