
#Azure: Step-by-step Azure Import/Export service


In my preceding blog post, I covered the Azure Import/Export service concept and requirements. In this post, let me explain how to do it step by step.

First, let's look at the Azure import service.

  1. Look at the data that you need to migrate, and note down the capacity, the number of drives required, the data type, and the destination blob location in Microsoft Azure.
  2. Procure and prepare the drives using the WAImportExport tool and BitLocker: the WAImportExport tool copies the data and BitLocker encrypts it (see the drive-preparation sketch after this list).
  3. Create an import job through the Azure portal and upload the journal files created by the WAImportExport tool. A journal file is created for each drive and contains the drive ID and BitLocker key.
    1. Log in to the Azure portal and search for the Import/Export service.

    2. In the Import/Export jobs panel, select “create import/export job” to initiate a new job request.

    3. Fill in the basic configuration details as needed.

    4. In the job details panel, upload the journal files and select the destination storage account.

    5. The drop-off location is selected by default based on your storage account location; click OK.

    6. Fill in the return shipping information and verify the summary to create the job.
  4. Ship the drives to the shipping address as described on the summary page.

  5. Update the delivery tracking number in your import job details and submit the import job.
  6. Once the drives are received, they will be processed in the Azure datacenter.
  7. The drives will be returned to the return address provided once the import is complete.
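For step 2, drives are prepared from an elevated command prompt using the WAImportExport tool. Here is a minimal sketch using the WAImportExport v1 single-session syntax; the journal file name, session ID, storage account key, drive letter, and source/destination paths are all placeholder values for illustration.

```powershell
# Prepare drive D: for an import job (WAImportExport v1 syntax).
# /j       journal file (one per drive; uploaded when creating the job)
# /id      unique copy-session ID
# /sk      destination storage account key (placeholder)
# /t       target drive letter
# /format  format the data volume with NTFS (wipes the drive)
# /encrypt enable BitLocker encryption and record the key in the journal
# /srcdir  local data to copy; /dstdir destination virtual directory in blob storage
.\WAImportExport.exe PrepImport /j:FirstDrive.jrn /id:session#1 `
    /sk:YourStorageAccountKey /t:D /format /encrypt `
    /srcdir:C:\Data /dstdir:backup/
```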

Here is the graphical representation of the above process.

Courtesy: Microsoft

Now, let's look at the Azure export service.

  1. Look at the data that you need to export from the Azure storage account, and note down the capacity, the number of drives required, the data type, and the destination location.
  2. Procure the number of drives required to export the data from the storage account.
  3. Create an export job through Azure portal.
    1. Log in to the Azure portal and search for the Import/Export service.

    2. In the Import/Export jobs panel, select “create import/export job” to initiate a new job request.

    3. Fill in the basic configuration details as needed.

    4. In the job details panel, select the source storage account.

    5. The drop-off location is selected by default based on your storage account location; select the required export option and click OK.

    6. Fill in the return shipping information.

    7. Verify the summary and click OK to create the job.

  4. Ship the drives to the shipping address as described on the summary page.

  5. Update the delivery tracking number in your export job details and submit the export job.
  6. Once the drives are received, they will be processed in the Azure datacenter.
  7. The drives will be encrypted with BitLocker, and the keys will be provided to you via the Azure portal (see the unlock sketch after this list).
  8. The drives will be returned to the return address provided once the export is complete.
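When the exported drives arrive, you unlock each one with the BitLocker key shown in the job details in the Azure portal. A minimal sketch, assuming the drive mounts as E: and using a placeholder 48-digit recovery password:

```powershell
# Unlock the returned drive with the BitLocker recovery key from the Azure portal.
# E: and the recovery password below are placeholders.
manage-bde -unlock E: -RecoveryPassword 111111-222222-333333-444444-555555-666666-777777-888888
```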

Here is the graphical representation of the above process.

Courtesy: Microsoft

I hope this blog post helped you with the Azure Import/Export job process. Please share your feedback in the comments section.

#Azure: Azure Import/Export service


The Azure Import/Export service allows data transfer between Azure datacenters and customer locations. It is a secure service for sending or receiving medium-to-large amounts of data when bandwidth is a bottleneck or too costly. Among Microsoft Azure's data transfer options, AzCopy is the preferred tool for online data migration, while Azure Import/Export handles large physical data transfers in a secure and reliable manner. The data is copied to one or more drives to import to, or export from, Azure Blob and File storage.

The Import/Export service uses 2.5-inch SSDs, 2.5/3.5-inch SATA II and III HDDs, or a mix of these. External HDDs with built-in USB adapters and drives in external casings are not supported. Here is a quick snapshot of the supported import and export data transfers.

| Job | Storage Accounts | Supported | Not Supported |
|---|---|---|---|
| Import | Classic, Blob Storage accounts, General Purpose v1 storage accounts | Azure Blob storage: block and page blobs. Azure File storage. | |
| Export | Classic, Blob Storage accounts, General Purpose v1 storage accounts | Azure Blob storage: block, page, and append blobs. | Azure File storage |

Points to remember while sending drives for an import job:

  • A maximum of 10 drives for each job.
  • Use only a single data volume partition (see the formatting sketch after this list).
  • The data volume must be formatted with NTFS.
  • Use a supported external USB adapter to copy data to internal HDDs:
    • Anker 68UPSATAA-02BU
    • Anker 68UPSHHDS-BU
    • Startech SATADOCK22UE
    • Orico 6628SUS3-C-BK (6628 Series)
    • Thermaltake BlacX Hot-Swap SATA External Hard Drive Docking Station (USB 2.0 & eSATA)
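To illustrate the single-NTFS-volume requirement, here is a hedged PowerShell sketch that initializes a blank disk with one NTFS data volume; the disk number and drive letter are placeholders, and the WAImportExport tool's /format option can perform the same preparation for you.

```powershell
# Create a single NTFS data volume spanning disk 1 (placeholder disk number).
# WARNING: this wipes the disk.
Initialize-Disk -Number 1 -PartitionStyle GPT
New-Partition -DiskNumber 1 -UseMaximumSize -DriveLetter D |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel "AzureImport"
```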

Let me explain the use cases and the process for performing an import/export job.

You can use this service in the following scenarios:

  • Move data to the cloud as part of the data migration strategy.
  • Data backup to the cloud.
  • Data recovery from the cloud.
  • Data distribution to customer sites.

Here are the high-level process components and the locations available for an Import/Export job.

Components:

  • Import/Export service in Azure portal to create a new job
  • Hard disk drives to copy the data
  • WAImportExport tool to prepare drives and encrypt data

Locations available as of the date of writing this blog post:

| Region | Region | Region | Region |
|---|---|---|---|
| East US | North Europe | Central India | US Gov Iowa |
| West US | West Europe | South India | US DoD East |
| East US 2 | East Asia | West India | US DoD Central |
| West US 2 | Southeast Asia | Canada Central | China East |
| Central US | Australia East | Canada East | China North |
| North Central US | Australia Southeast | Brazil South | UK South |
| South Central US | Japan West | Korea Central | Germany Central |
| West Central US | Japan East | US Gov Virginia | Germany Northeast |

Courtesy: Microsoft

If your Azure storage account location is not available in the above list, you can still create a job and ship the drives to the alternate location specified in the tool while creating the import job.

The next blog post covers the step-by-step process of an Azure Import/Export job.

#Azure: Data transfer


The cloud has become a prominent option for all kinds of organizations. When a medium-to-large organization moves to the cloud, data transfer becomes one of the biggest challenges. To address this concern, Microsoft provides different data transfer options to customers. Before you get into the details of these options, answer the following questions:

  • How much data do we need to migrate?
  • What is going to be the frequency of data transfer?
  • What are the data source and destination locations, and the respective data regulations?
  • What bottlenecks may arise at the time of migration?
  • How do the possible data migration types compare in cost, time, and effort?

From Microsoft's point of view, data transfer options are divided into four major categories:

  • Physical data transfer
  • Data transfer using command line tools and APIs
  • Data transfer using graphical user interface
  • Data pipeline

Let me briefly explain each of these data transfer methodologies:

Physical data transfer: widely used when you have large data sets to migrate. It can be leveraged either for a one-time data migration or for infrequent migrations. For physical data transfer, you can choose a methodology based on the data size.

  • Azure Import/Export: The Azure Import/Export service can be used to transfer large amounts of data using internal SATA HDDs or SSDs. Using this service, you can securely transfer data from on-premises to cloud blob or file storage and vice versa. When procuring drives for this service, don't confuse SATA and SAS drives: order SATA III drives, as they are faster than older SATA versions and support speeds of 6 Gbps.
  • Azure Data Box: Azure Data Box is an option for transferring very large amounts of data. It is very similar to Azure Import/Export but avoids the hurdles of procuring, writing to, and shipping multiple data disks. With this service, Microsoft provides a secure and reliable appliance to transfer data between on-premises and cloud blob and file storage. It is much easier than the Azure Import/Export service, as Microsoft takes responsibility for the end-to-end logistics.

Data transfer using command-line tools and APIs: used when you have enough bandwidth available to migrate a limited amount of data between on-premises and cloud blob and file storage. There are multiple tools available to perform this activity (see the sketch after this list).

  • AzCopy: It is a command-line tool to transfer data to and from Azure Blob, File, and Table storage in a fast, secure, and reliable manner. You can install this tool on a Windows or Linux machine. It supports parallelism and the ability to resume a copy operation when interrupted.
  • Azure CLI: It is a command-line tool to manage Azure services and to upload data to Azure storage. The Azure CLI doesn't require any installation or configuration when used through the Cloud Shell in the Azure portal.
  • PowerShell: PowerShell is an alternative option for Windows administrators to transfer data.
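To make these options concrete, here is a minimal sketch of the same upload performed with each tool. The storage account, container, key, SAS token, and paths are placeholders, and the AzCopy v10 and Az-module syntax shown here may differ from older releases.

```powershell
# 1) AzCopy (v10 syntax): recursively upload a local folder to a blob container.
azcopy copy "C:\data" "https://mystorageaccount.blob.core.windows.net/mycontainer?YourSasToken" --recursive

# 2) Azure CLI: upload every file in a local folder to a blob container
#    (assumes you are logged in via 'az login' or supply an account key).
az storage blob upload-batch --destination mycontainer --source C:\data --account-name mystorageaccount

# 3) PowerShell (Az module): upload a single file as a block blob.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "YourAccountKey"
Set-AzStorageBlobContent -File "C:\data\report.csv" -Container "mycontainer" -Blob "report.csv" -Context $ctx
```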

Data transfer using a graphical user interface: the simplest way of transferring data between the cloud and on-premises. You have two options available for transferring data using graphical tools.

  • Azure Portal: The simplest way of exploring and uploading files to Azure Blob storage and Data Lake Store, but it has the limitation of exploring and uploading only one file at a time.
  • Azure Storage Explorer: Azure Storage Explorer is a great option for GUI lovers; it provides the capability to manage, upload, and download files through an interactive interface for blobs, files, queues, tables, and Azure Cosmos DB objects. It also allows you to move data between blob storage containers and between storage accounts.

Data pipeline: used when you need to transfer data on a regular basis.

  • Azure Data Factory: It is an option to transfer and transform data using data-driven workflows (a.k.a. data pipelines) on a regular basis by leveraging orchestration and automation processes. It is a managed service to transfer data between Azure services, on-premises systems, or a combination of the two. Workflows can be created and scheduled based on your requirements, and they can process and transform the data by leveraging compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning. A minimal deployment sketch follows.
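As a rough illustration using the Az.DataFactory PowerShell module: the resource group, factory name, and pipeline definition file below are placeholders, and the pipeline JSON itself (sources, sinks, and schedule) would be authored separately.

```powershell
# Deploy a pipeline definition (authored separately in pipeline.json)
# to a hypothetical Data Factory, then trigger a one-off run.
Set-AzDataFactoryV2Pipeline -ResourceGroupName "myrg" -DataFactoryName "mydatafactory" `
    -Name "CopyPipeline" -DefinitionFile ".\pipeline.json"
Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "myrg" -DataFactoryName "mydatafactory" `
    -PipelineName "CopyPipeline"
```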

Apart from the core data transfer options above, the following tools can be leveraged to transfer data within specific Azure services.