March 17, 2014

Node.js Streaming S3 Module

The ultimate Node.js S3 data streaming module. Seriously!

After the release of the amazing Google Spreadsheet module comes another exciting release from the team at Mindblaze.

Lo and behold the Streaming S3 module, which contains tons of exciting features for budding Node.js programmers who would like to play with Amazon S3. We were not happy with what was out there, so we made our own!


Head on over to GitHub at

Benefits & Features

  • Super fast and super easy to use
  • Low memory usage
  • Nothing is written to disk during upload
  • Parallel part uploading
  • No need to know total size of the object
  • Implicit retry mechanism for failed part uploads
  • Tons of configurable options
  • Simple interface (Asynchronous and evented)
  • Downloading and uploading statistics (upload/download speed and time)
  • Proper usage of streams and graceful error handling
  • Production ready (Used and tested on production environments, uploading gigabytes of files to S3)
  • Uses official AWS SDK

Defaults and Configurables

  • concurrentParts (Default: 5) – Parts that are uploaded simultaneously.
  • waitTime (Default: 1 min (60000 ms)) – Time to wait for verification from S3 after uploading parts.
  • retries (Default: 5) – Number of times to retry uploading a part, before failing.
  • maxPartSize (Default: 5 MB) – Maximum size of each part.
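To make the defaults concrete, here is a sketch of an options object that spells out each documented default (the values mirror the list above; override any subset and the rest fall back to these):

```javascript
// Illustrative options object matching the documented defaults.
var defaults = {
  concurrentParts: 5,            // parts uploaded simultaneously
  waitTime: 60000,               // ms to wait for S3 verification after upload
  retries: 5,                    // retry attempts per part before failing
  maxPartSize: 5 * 1024 * 1024   // 5 MB per part
};

console.log('Default part size (bytes):', defaults.maxPartSize);
```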

Example 1: Uploading a local file with a callback.

var StreamingS3 = require('streaming-s3'),
    fs = require('fs');

var fStream = fs.createReadStream(__dirname + '/video.mp4');
var uploader = new StreamingS3(fStream, 'accessKey', 'secretKey',
  {
    Bucket: '',
    Key: 'video.mp4',
    ContentType: 'video/mp4'
  },
  function (err, resp, stats) {
    if (err) return console.log('Upload error: ', err);
    console.log('Upload stats: ', stats);
    console.log('Upload successful: ', resp);
  });

Example 2: Uploading a remote file without a callback, with custom options.

var StreamingS3 = require('streaming-s3'),
    request = require('request');

var rStream = request.get('');
var uploader = new StreamingS3(rStream, 'accessKey', 'secretKey',
  {
    Bucket: '',
    Key: 'google.html',
    ContentType: 'text/html'
  },
  {
    concurrentParts: 2,
    waitTime: 10000,
    retries: 1,
    maxPartSize: 10 * 1024 * 1024
  });

uploader.begin(); // important if callback not provided.

uploader.on('data', function (bytesRead) {
  console.log(bytesRead, ' bytes read.');
});

uploader.on('part', function (number) {
  console.log('Part ', number, ' uploaded.');
});

// All parts uploaded, but upload not yet acknowledged.
uploader.on('uploaded', function (stats) {
  console.log('Upload stats: ', stats);
});

uploader.on('finished', function (resp, stats) {
  console.log('Upload finished: ', resp);
});

uploader.on('error', function (e) {
  console.log('Upload error: ', e);
});