Azure Storage JavaScript Client Library Sample for Blob Operations

In this sample, we demonstrate common scenarios for Azure Blob Storage, including creating, listing, and deleting containers and blobs.

Azure Blob storage is a service for storing large amounts of unstructured object data, such as text or binary data, that can be accessed from anywhere in the world via HTTP or HTTPS. You can use Blob storage to expose data publicly to the world, or to store application data privately.

Note: You may need to set up an HTTP server to host this sample for the IE browser, because in IE, IndexedDB is only available on websites with http or https URL schemes. The Azure Storage JavaScript Client Library currently depends on IndexedDB.


Step 1: Preparing an Azure Storage account with CORS rules set

Cross-origin resource sharing, or CORS, must be configured on the Azure Storage account before it can be accessed directly from JavaScript in the browser. You can set the CORS rules for a specific Azure Storage account in the Azure Portal. For this sample, "Allowed origins" may be set to "*" to allow all origins. For more information about CORS, see Cross-Origin Resource Sharing (CORS).

Step 2: Importing Azure Storage JavaScript Client Library

Import azure-storage.common.js and azure-storage.blob.js in your HTML file for blob operations, and make sure azure-storage.common.js is loaded before azure-storage.blob.js.

<script src="azure-storage.common.js"></script>
<script src="azure-storage.blob.js"></script>

Step 3: Creating an Azure Storage Blob Service Object

The BlobService object lets you work with containers and blobs. The following code creates a BlobService object with a storage account name and SAS Token.

var blobUri = 'https://' + 'STORAGE_ACCOUNT' + '.blob.core.windows.net';
var blobService = AzureStorage.createBlobServiceWithSas(blobUri, 'SAS_TOKEN');

In the Azure Storage JavaScript Client Library, the global variable AzureStorage is the starting point from which we can create service objects for blob/table/queue/file and access the storage utilities.

How do you get full, detailed API definitions? Currently, the JavaScript Client Library shares the same API definitions with the Node.js SDK. Please check the API details in the Azure Storage Node.js API reference documentation. The JavaScript global variable AzureStorage is equivalent to the object that require('azure-storage') returns in Node.js.
Warning: Besides SAS Token authentication, the Azure Storage JavaScript Client Library also supports creating a BlobService based on a Storage Account Key. However, for security reasons, we recommend using a limited-time SAS Token generated by a backend web server using a Stored Access Policy.

Step 4: Container Operations

A container provides a grouping of blobs. All blobs must be in a container. An account can contain an unlimited number of containers, and a container can store an unlimited number of blobs. Note that the container name must be lowercase. The BlobService object provides plenty of interfaces for container operations.
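Beyond being lowercase, container names follow a few more rules (3-63 characters; letters, digits, and single hyphens; starting and ending with a letter or digit). As a hedged sketch, a client-side check could look like the following; the helper name isValidContainerName is our own, not part of the library:

```javascript
// Validates a container name against Azure's container naming rules:
// 3-63 characters, lowercase letters, digits, and non-consecutive hyphens,
// starting and ending with a letter or digit.
function isValidContainerName(name) {
    if (typeof name !== 'string' || name.length < 3 || name.length > 63) {
        return false;
    }
    return /^[a-z0-9](-?[a-z0-9])*$/.test(name);
}
```

Validating before calling the service turns an obscure 400 response into an immediate, readable error in the browser.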

List Containers

BlobService provides listContainersSegmented and listContainersSegmentedWithPrefix for retrieving the containers list under a storage account.

blobService.listContainersSegmented(null, function (error, results) {
    if (error) {
        // List container error
    } else {
        for (var i = 0, container; container = results.entries[i]; i++) {
            // Deal with container object
        }
    }
});
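
Each call returns a single segment of results; when results.continuationToken is non-null, more segments remain. A minimal sketch of a helper that drains all segments follows; listAllSegments and the listFn parameter are our own illustration, not part of the library:

```javascript
// Repeatedly calls a segmented list function (e.g. a wrapper around
// blobService.listContainersSegmented) until the continuation token is
// exhausted, then hands the accumulated entries to the callback.
function listAllSegments(listFn, callback, token, entries) {
    entries = entries || [];
    listFn(token || null, function (error, results) {
        if (error) { return callback(error); }
        entries = entries.concat(results.entries);
        if (results.continuationToken) {
            listAllSegments(listFn, callback, results.continuationToken, entries);
        } else {
            callback(null, entries);
        }
    });
}
```

With the real service you would pass something like function (token, cb) { blobService.listContainersSegmented(token, cb); } as listFn.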

Create Container

BlobService provides createContainer and createContainerIfNotExists for creating a container under a storage account.

blobService.createContainerIfNotExists('mycontainer', function(error, result) {
    if (error) {
        // Create container error
    } else {
        // Create container successfully
    }
});

Delete Container

BlobService provides deleteContainer and deleteContainerIfExists for deleting a container under a storage account.

blobService.deleteContainerIfExists('mycontainer', function(error, result) {
    if (error) {
        // Delete container error
    } else {
        // Delete container successfully
    }
});

Executable Example

The sample will try to create an Azure Storage blob service object based on SAS Token authorization. Enter your Azure Storage account name and SAS Token here; the executable examples in the following steps depend on these settings. Make sure you have set the CORS rules for the Azure Storage blob service, and that the SAS Token is within its valid period.

In the following executable example, you can try listing all the containers under your storage account, and try creating or deleting a container in your account.

Step 5: Blob Operations

Blob: A file of any type and size. Azure Storage offers three types of blobs: block blobs, page blobs, and append blobs.

Block blobs are ideal for storing text or binary files, such as documents and media files. Append blobs are similar to block blobs in that they are made up of blocks, but they are optimized for append operations, so they are useful for logging scenarios. A single block blob can contain up to 50,000 blocks of up to 100 MB each, for a total size of slightly more than 4.75 TB (100 MB X 50,000). A single append blob can contain up to 50,000 blocks of up to 4 MB each, for a total size of slightly more than 195 GB (4 MB X 50,000).

Page blobs can be up to 1 TB in size, and are more efficient for frequent read/write operations. Azure Virtual Machines use page blobs as OS and data disks.
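As a quick sanity check, the block blob and append blob limits quoted above follow directly from blocks × block size (MB here means MiB):

```javascript
// Maximum sizes implied by the block limits: 50,000 blocks per blob,
// up to 100 MB per block for block blobs and 4 MB for append blobs.
var MB = 1024 * 1024;
var blockBlobMaxBytes = 50000 * 100 * MB;
var appendBlobMaxBytes = 50000 * 4 * MB;

var blockBlobTB = blockBlobMaxBytes / (1024 * 1024 * 1024 * 1024); // ~4.77 TB
var appendBlobGB = appendBlobMaxBytes / (1024 * 1024 * 1024);      // ~195.3 GB
```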

For details about naming containers and blobs, see Naming and Referencing Containers, Blobs, and Metadata.

List Blobs

BlobService provides listBlobsSegmented and listBlobsSegmentedWithPrefix for retrieving the blobs list under a container.

blobService.listBlobsSegmented('mycontainer', null, function (error, results) {
    if (error) {
        // List blobs error
    } else {
        for (var i = 0, blob; blob = results.entries[i]; i++) {
            // Deal with blob object
        }
    }
});
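
Blob names may contain '/' to simulate a folder hierarchy inside a container. As an illustration (groupByVirtualDirectory is our own helper, not a library API), the flat names returned by a listing can be grouped by their top-level "virtual directory":

```javascript
// Groups flat blob names into top-level "virtual directories" using '/' as
// the delimiter, mirroring how delimiter-based listing presents results.
function groupByVirtualDirectory(blobNames) {
    var groups = {};
    blobNames.forEach(function (name) {
        var slash = name.indexOf('/');
        var dir = slash === -1 ? '' : name.substring(0, slash + 1);
        (groups[dir] = groups[dir] || []).push(name);
    });
    return groups;
}
```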

Upload Blob

BlobService provides uploadBlobByStream for uploading a blob from a stream. However, uploadBlobByStream currently only accepts a Node.js ReadableStream type, which pure JavaScript doesn't provide.

Note: After importing azure-storage.common.js, you can require three Node.js modules: stream, util and buffer, and then wrap a ReadableStream based on the HTML5 FileReader.
// Provides a stream for a file in a webpage, inheriting from the Node.js Readable stream.
var Buffer = require('buffer').Buffer;
var Stream = require('stream');
var util = require('util');

function FileStream(file, opt) {
    Stream.Readable.call(this, opt);

    this.fileReader = new FileReader();
    this.file = file;
    this.size = file.size;
    this.chunkSize = 1024 * 1024 * 4; // 4MB
    this.offset = 0;
    var _me = this;
    this.fileReader.onloadend = function loaded(event) {
        var data = event.target.result;
        var buf = Buffer.from(data);
        _me.push(buf);
    };
}
util.inherits(FileStream, Stream.Readable);

FileStream.prototype._read = function() {
    if (this.offset > this.size) {
        this.push(null); // Signal the end of the stream
    } else {
        var end = this.offset + this.chunkSize;
        var slice = this.file.slice(this.offset, end);
        this.fileReader.readAsArrayBuffer(slice);
        this.offset = end;
    }
};

Upload the blob from the stream. You can set the blob name as well as the size of this upload session.

// If one file has been selected in the HTML file input element
var files = document.getElementById('fileinput').files;
var file = files[0];
var fileStream = new FileStream(file);

var customBlockSize = file.size > 1024 * 1024 * 32 ? 1024 * 1024 * 4 : 1024 * 512;
blobService.singleBlobPutThresholdInBytes = customBlockSize;

var finishedOrError = false;
var speedSummary = blobService.createBlockBlobFromStream('mycontainer', file.name, fileStream, file.size, {blockSize : customBlockSize}, function(error, result, response) {
    finishedOrError = true;
    if (error) {
        // Upload blob failed
    } else {
        // Upload successfully
    }
});
You can check the upload progress with the speedSummary object.

function refreshProgress() {
    setTimeout(function() {
        if (!finishedOrError) {
            var process = speedSummary.getCompletePercent();
            // Update the progress display here, then poll again
            refreshProgress();
        }
    }, 200);
}
Warning: By default, speedSummary.getCompletePercent() only updates the progress when a block is uploaded to the server. Two default settings may influence the upload progress display:
  • blobService.singleBlobPutThresholdInBytes is the maximum size (default 32MB), in bytes, of a blob before it must be separated into blocks.
  • The option {blockSize: SizeInBytes} of blobService.createBlockBlobFromStream() is the size (default 4MB) of every block in the storage layer.
This means that, by default, blobs smaller than 32MB only get a progress update when the upload is over, and blobs larger than 32MB update the progress every 4MB. For slow connections, or for progress reporting on small blobs, you can customize both settings to smaller values such as 1MB or 512KB, so the progress updates in smaller steps. However, very small block sizes will hurt storage performance, especially for a large blob.
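Since a block blob can contain at most 50,000 blocks (see Step 5), the block size also bounds the maximum uploadable file size. A sketch, with our own helper name minBlockSize and an illustrative set of candidate sizes, of picking the smallest block size that keeps a file within that limit:

```javascript
// Smallest block size (in bytes) that fits a file of the given size within
// the 50,000-block limit of a block blob. The candidate sizes are examples.
var MAX_BLOCKS = 50000;

function minBlockSize(fileSize) {
    var candidates = [512 * 1024, 1024 * 1024, 4 * 1024 * 1024, 100 * 1024 * 1024];
    for (var i = 0; i < candidates.length; i++) {
        if (Math.ceil(fileSize / candidates[i]) <= MAX_BLOCKS) {
            return candidates[i];
        }
    }
    return null; // Larger than ~4.75 TB: cannot be stored as a single block blob
}
```

This is the trade-off the warning above describes: a smaller block size gives finer progress updates but more round trips and a lower size ceiling.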

Download Blob

BlobService provides interfaces for downloading a blob into browser memory. Because of the browser's sandbox limitation, we cannot save the downloaded data chunks to disk until we have all the data chunks of a blob in browser memory. The browser's memory size is also limited, especially for huge blobs, so it's recommended to download a blob in the browser directly with a SAS Token authorized link.

Shared access signatures (SAS) are a secure way to provide granular access to blobs and containers without providing your storage account name or keys. Shared access signatures are often used to provide limited access to your data, such as allowing a mobile app to access blobs. The following code example generates a new shared access policy that allows the shared access signatures holder to perform read operations on the myblob blob, and expires 100 minutes after the time it is created.
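As a sketch of how a backend Node.js server might build that 100-minute read-only policy, the object below is the kind of shared access policy that would be passed to blobService.generateSharedAccessSignature('mycontainer', 'myblob', policy) from the azure-storage Node.js SDK; the helper name readPolicyFor100Minutes is our own, and this runs on the server, not in the browser sample:

```javascript
// Builds a shared access policy granting read-only ('r') access,
// valid from `now` until 100 minutes later.
function readPolicyFor100Minutes(now) {
    var startDate = new Date(now);
    var expiryDate = new Date(startDate.getTime() + 100 * 60 * 1000);
    return {
        AccessPolicy: {
            Permissions: 'r', // read-only
            Start: startDate,
            Expiry: expiryDate
        }
    };
}
```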

Note: You can choose to use the SAS Token in browser side, or generate a temporary SAS Token dynamically in your server side with Azure Storage C# or Node.js SDKs etc. according to your security requirements.
var downloadLink = blobService.getUrl('mycontainer', 'myblob', 'SAS_TOKEN');
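
If you already have a blob URL and a SAS token string, the authorized link is simply the two joined with '?'. A small hedged helper (sasUrl is our own name, equivalent in spirit to what blobService.getUrl does) that also tolerates a leading '?' on the token:

```javascript
// Joins a blob URL with a SAS token, whether or not the token already
// carries a leading '?', producing a directly downloadable link.
function sasUrl(blobUrl, sasToken) {
    var token = sasToken.charAt(0) === '?' ? sasToken.substring(1) : sasToken;
    return blobUrl + '?' + token;
}
```

Assigning the result to window.location.href, or to the href of an anchor element with the download attribute, triggers the browser download without buffering the blob in memory.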

Delete Blob

BlobService provides deleteBlob and deleteBlobIfExists for deleting a blob under a container.

blobService.deleteBlobIfExists(container, blob, function(error, result) {
    if (error) {
        // Delete blob failed
    } else {
        // Delete blob successfully
    }
});

Executable Example

After clicking the "Select" button in the container list in the last step, you can operate on the blobs under the selected container.


Step 6: Creating your JavaScript Application based on Azure Storage JavaScript Client Library

You can view the source code of this sample for detailed reference.