Learn about how to prepare for a disk-based data import job.
This topic describes the tasks associated with preparing for the Disk-Based Data Import. The Project Sponsor role typically performs these tasks. See Roles and Responsibilities.
Import Disk Requirements
You're responsible for performing the following tasks, in order:
Obtaining the required number of hard drives to migrate the data to Oracle Cloud Infrastructure. Use USB 2.0/3.0 external hard disk drives (HDD) with a single partitioned file system containing the data.
Note
Oracle doesn't certify or test disks you intend to use for disk import jobs. Calculate the disk capacity requirements and disk I/O to decide what USB 2.0/3.0 disk works best for the data transfer needs.
Copying the data to the HDDs following the procedures described in this import disk documentation.
Shipping the disks to the specified Oracle data transfer site.
After the data is copied successfully to an Oracle Cloud Infrastructure Object Storage bucket, the hard drives are sanitized to remove the data before they're shipped back to you.
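Before copying data, a quick, non-authoritative check can confirm that the disk is visible, compare its capacity against the size of the source data, and estimate its read throughput, as suggested in the capacity and I/O note above. The device name /dev/sdb and the source path below are assumptions for illustration:

# List attached block devices to confirm the USB disk and its partition are visible.
lsblk -o NAME,SIZE,TYPE,FSTYPE,MOUNTPOINT

# Total size of the data to be migrated, to compare against disk capacity.
du -sh /path/to/source/data

# Rough sequential read benchmark for the disk (requires root).
sudo hdparm -t /dev/sdb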
Installing the Data Transfer Utility
Learn about installing the Data Transfer Utility for running disk-based data import jobs.
This topic describes how to install and configure the Data Transfer Utility for use in disk-based data transfers. In addition, this topic describes the syntax for the Data Transfer Utility commands.
Important
With this release, the Data Transfer Utility supports only disk-based data transfers. For appliance-based transfers, use the Oracle Cloud Infrastructure command line interface (CLI) instead.
The Data Transfer Utility is licensed under the Universal Permissive License 1.0 and the Apache License 2.0. Third-party content is separately licensed as described in the code.
Note
The Data Transfer Utility must be run as the root user.
Prerequisites
Learn about prerequisites for installing the Data Transfer Utility.
To install and use the Data Transfer Utility, obtain the following:
An Oracle Cloud Infrastructure account.
The required Oracle Cloud Infrastructure users and groups with the required IAM policies.
Oracle Linux 6 or greater, Ubuntu 14.04 or greater, or SUSE 11 or greater. All Linux operating systems must have the ability to create an EXT file system.
Note
Windows-based machines are not supported in disk-based transfer jobs.
Java 1.8 or Java 11
hdparm 9.0 or later
Cryptsetup 1.2.0 or greater
Firewall access: If you have a restrictive firewall in the environment where you are using the Data Transfer Utility, you may need to open your firewall configuration to allow access to the Data Transfer service as well as Object Storage. See Configuring Firewall Settings for the specific IP address ranges required for your region.
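A quick way to confirm several of these prerequisites on the Data Host is to check the installed versions. This is only a sketch; the exact output format varies by distribution:

# Java runtime version (expect 1.8 or 11).
java -version

# hdparm version (expect 9.0 or later).
hdparm -V

# cryptsetup version (expect 1.2.0 or later).
cryptsetup --version

# Confirm that an EXT file system can be created (mkfs.ext4 is part of e2fsprogs).
which mkfs.ext4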
Installation
Download and install the Data Transfer Utility installer that corresponds to your Data Host's operating system.
Issue the yum install command as the root user that has write permissions to the /opt directory.
sudo yum localinstall ./dts-X.Y.Z.x86_64.rpm
X.Y.Z represents the version number of the installer you downloaded.
Confirm that the Data Transfer Utility installed successfully.
sudo dts --version
Your Data Transfer Utility version number is returned.
Configuration
Before using the Data Transfer Utility, you must create a base Oracle Cloud Infrastructure directory and two configuration files with the required credentials. One configuration file is for the data transfer administrator, the IAM user with the authorization and permissions to create and manage transfer jobs. The other configuration file is for the data transfer upload user, the temporary IAM user that Oracle uses to upload your data on your behalf.
Base Data Transfer Directory
Create a base Oracle Cloud Infrastructure directory:
mkdir /root/.oci/
Configuration File for the Data Transfer Administrator
Create a data transfer administrator configuration file /root/.oci/config with the following structure:
[DEFAULT]
user=<The OCID for the data transfer administrator>
fingerprint=<The fingerprint of the above user's public key>
key_file=<The _absolute_ path to the above user's private key file on the host machine>
tenancy=<The OCID for the tenancy that owns the data transfer job and bucket>
region=<The region where the transfer job and bucket should exist. Valid values are:
us-ashburn-1, us-phoenix-1, eu-frankfurt-1, and uk-london-1.>
For the data transfer administrator, you can create a single configuration file that contains different profile sections with the credentials for multiple users. Then use the --profile option to specify which profile to use in the command. Here is an example of a data transfer administrator configuration file with different profile sections:
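In this illustrative configuration file, the OCIDs, fingerprints, and key file paths are placeholders, and profile1 is an example profile name:

[DEFAULT]
user=ocid1.user.oc1..<unique_ID_1>
fingerprint=<fingerprint_of_DEFAULT_user_public_key>
key_file=/root/.oci/dts_admin_default.pem
tenancy=ocid1.tenancy.oc1..<unique_ID>
region=us-phoenix-1

[profile1]
user=ocid1.user.oc1..<unique_ID_2>
fingerprint=<fingerprint_of_profile1_user_public_key>
key_file=/root/.oci/dts_admin_profile1.pem
tenancy=ocid1.tenancy.oc1..<unique_ID>
region=us-phoenix-1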
You can then issue any Data Transfer Utility command with the --profile option to specify a different data transfer administrator profile. For example:
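The following illustrative command lists transfer jobs using a non-default profile; the compartment OCID is a placeholder:

dts job list --compartment-id <compartment_OCID> --profile <profile_name>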
Using the example configuration file above, the <profile_name> would be profile1.
Configuration File for the Data Transfer Upload User
Create a data transfer upload user /root/.oci/config_upload_user configuration file with the following structure:
[DEFAULT]
user=<The OCID for the data transfer upload user>
fingerprint=<The fingerprint of the above user's public key>
key_file=<The _absolute_ path to the above user's private key file on the host machine>
tenancy=<The OCID for the tenancy that owns the data transfer job and bucket>
region=<The region where the transfer job and bucket should exist. Valid values are:
us-ashburn-1, us-phoenix-1, eu-frankfurt-1, and uk-london-1.>
Creating an upload user configuration file with multiple profiles is not supported.
Configuration File Entries
The following table lists the basic entries that are required for each configuration file and where to get the information for each entry.
Note
Data Transfer Service does not support passphrases on the key files for both the data transfer administrator and the data transfer upload user.
Entry: user
Description and Where to Get the Value: OCID of the data transfer administrator or the data transfer upload user, depending on which profile you are creating. To get the value, see Required Keys and OCIDs.
Required?: Yes
In the previous examples, provide a friendly name for the transfer job using the --display-name option.
Finding Out the Installed Version of the Data Transfer Utility
You can get the installed version of the Data Transfer Utility using --version or -v. For example:
dts --version
0.6.183
Accessing Data Transfer Utility Help
All Data Transfer Utility commands have an associated help component that you can access from the command line. To view the help, enter any command followed by the --help or -h option. For example:
dts job --help
Usage: job [COMMAND]
Transfer disk or appliance job operations - {job action [options]}
Commands:
create Creates a new transfer disk or appliance job.
show Shows the transfer disk or appliance job details.
update Updates the transfer disk or appliance job details.
delete Deletes the transfer disk or appliance job.
close Closes the transfer disk or appliance job.
list Lists all transfer disk or appliance jobs.
verify-upload-user-credentials Verifies the transfer disk or appliance upload user credentials.
When you run the help option (--help or -h) for a specified command, all the subordinate commands and options for that level of the Data Transfer Utility are displayed. If you want to access the Data Transfer Utility help for a specific subordinate command, include it in the Data Transfer Utility string, for example:
dts job create --help
Usage: job create --bucket=<bucket> --compartment-id=<compartmentId>
[--defined-tags=<definedTags>] --device-type=<deviceType>
--display-name=<displayName>
[--freeform-tags=<freeformTags>] [--profile=<profile>]
Creates a new transfer disk or appliance job.
--bucket=<bucket> Upload bucket for the job.
--compartment-id=<compartmentId> Compartment OCID.
--defined-tags=<definedTags> Defined tags for the new transfer job in JSON format.
--device-type=<deviceType> Device type for the job: DISK or APPLIANCE.
--display-name=<displayName> Display name for the job.
--freeform-tags=<freeformTags> Free-form tags for the new transfer job in JSON format.
--profile=<profile> Profile.
Creating the Required IAM Users, Groups, and Policies
Each service in Oracle Cloud Infrastructure integrates with IAM for authentication and authorization.
To use Oracle Cloud Infrastructure, you must be given the required type of access in a policy written by an administrator, whether you're using the Console or the REST API with an SDK, CLI, or other tool. If you try to perform an action and get a message that you don't have permission or are unauthorized, confirm with your administrator the type of access you've been granted and which compartment you should work in.
Access to resources is provided to groups using policies and then inherited by the users that are assigned to those groups. Data transfer requires the creation of two distinct groups:
Data transfer administrators who can create and manage transfer jobs.
Data transfer upload users who can upload data to Object Storage. For your data security, the permissions for upload users allow Oracle personnel to upload standard and multi-part objects on your behalf and inspect bucket and object metadata. The permissions do not allow Oracle personnel to inspect the actual data.
The Data Administrator is responsible for generating the RSA keys required for the temporary upload users. These keys should never be shared between users.
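For illustration, the key pair and fingerprint can be generated with OpenSSL, following the standard Oracle Cloud Infrastructure API signing key procedure; the file paths below are examples only:

# Generate a 2048-bit RSA private key with no passphrase (the Data Transfer service
# does not support passphrase-protected key files).
openssl genrsa -out /root/.oci/upload_user_key.pem 2048
chmod 600 /root/.oci/upload_user_key.pem

# Derive the public key to upload for the IAM user.
openssl rsa -pubout -in /root/.oci/upload_user_key.pem -out /root/.oci/upload_user_key_public.pem

# Compute the fingerprint referenced in the configuration file.
openssl rsa -pubout -outform DER -in /root/.oci/upload_user_key.pem | openssl md5 -c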
An administrator creates these groups with the following policies:
The data transfer administrator group requires an authorization policy that includes the following:
Allow group group_name to manage data-transfer-jobs in compartment compartment_name
Allow group group_name to manage objects in compartment compartment_name
Allow group group_name to manage buckets in compartment compartment_name
Alternatively, you can consolidate the manage buckets and manage objects policies into the following:
Allow group group_name to manage object-family in compartment compartment_name
The data transfer upload user group requires an authorization policy that includes the following:
Allow group group_name to manage buckets in compartment compartment_name where all { request.permission='BUCKET_READ', target.bucket.name='<bucket_name>' }
Allow group group_name to manage objects in compartment compartment_name where all { target.bucket.name='<bucket_name>', any { request.permission='OBJECT_CREATE', request.permission='OBJECT_OVERWRITE', request.permission='OBJECT_INSPECT' }}
To enable notifications, add the following policies:
Allow group group_name to manage ons-topics in tenancy
Allow group group_name to manage ons-subscriptions in tenancy
Allow group group_name to manage cloudevents-rules in tenancy
Allow group group_name to inspect compartments in tenancy
The Oracle Cloud Infrastructure administrator then adds a user to each of the data transfer groups created. For details on creating users, see Managing Users.
Important
For security reasons, we recommend that you create a unique IAM data transfer upload user for each transfer job and then delete that user once your data is uploaded to Oracle Cloud Infrastructure.
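If you prefer to script this setup rather than use the Console, the following OCI CLI sketch creates the two groups, attaches an administrator policy, and adds a user to a group. The group names, policy name, and OCIDs are examples only, and the policy statements mirror those listed above:

# Create the data transfer administrator and upload user groups.
oci iam group create --name DataTransferAdmins --description "Data transfer administrators"
oci iam group create --name DataTransferUploadUsers --description "Data transfer upload users"

# Attach the administrator policy (statements mirror the ones listed earlier).
oci iam policy create --compartment-id <compartment_OCID> --name DataTransferAdminPolicy \
    --description "Data transfer administrator permissions" \
    --statements '["Allow group DataTransferAdmins to manage data-transfer-jobs in compartment <compartment_name>", "Allow group DataTransferAdmins to manage object-family in compartment <compartment_name>"]'

# Add an existing IAM user to a group.
oci iam group add-user --group-id <group_OCID> --user-id <user_OCID>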
Creating Object Storage Buckets
The Object Storage service is used to upload your data to Oracle Cloud Infrastructure. Object Storage stores objects in a container called a bucket within a compartment in your tenancy. For details on creating the bucket to store uploaded data, see Object Storage Buckets.
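For reference, the upload bucket can also be created with the OCI CLI; the bucket name and compartment OCID below are placeholders:

oci os bucket create --compartment-id <compartment_OCID> --name MyBucket1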
Configuring Firewall Settings
The firewall port number is 443 for all data transfer methods.
Ensure that your local environment's firewall can communicate with the Data Transfer service running on the IP address ranges for your Oracle Cloud Infrastructure region, as listed in the following table. Also ensure that open access exists to the Object Storage IP address range. You only need to configure this IP access for the region associated with your data transfer job.
Region                        Data Transfer                                    Object Storage
US East (Ashburn)             140.91.0.0/16                                    134.70.24.0/21
US West (Phoenix)             129.146.0.0/16                                   134.70.8.0/21
US Gov East (Ashburn)         splat-api.us-langley-1.oraclegovcloud.com        objectstorage.us-gov-ashburn-1.oraclegovcloud.com
US Gov West (Phoenix)         splat-api.us-luke-1.oraclegovcloud.com           objectstorage.us-luke-1.oraclegovcloud.com
US DoD East (Ashburn)         splat-api.us-gov-ashburn-1.oraclegovcloud.com    objectstorage.us-gov-ashburn-1.oraclegovcloud.com
US DoD West (Phoenix)         splat-api.us-gov-phoenix-1.oraclegovcloud.com    objectstorage.us-gov-phoenix-1.oraclegovcloud.com
Brazil East (Sao Paulo)       140.204.0.0/16                                   134.70.84.0/22
Canada Southeast (Toronto)    140.204.0.0/16                                   134.70.116.0/22
Germany Central (Frankfurt)   130.61.0.0/16                                    134.70.40.0/21
India West (Mumbai)           140.204.0.0/16                                   134.70.76.0/22
Japan Central (Osaka)         140.204.0.0/16                                   134.70.112.0/22
Japan East (Tokyo)            140.204.0.0/16                                   134.70.80.0/22
South Korea Central (Seoul)   140.204.0.0/16                                   134.70.96.0/22
UK South (London)             132.145.0.0/16                                   134.70.56.0/21
Creating Transfer Jobs
This section describes how to create a disk-based transfer job as part of the preparation for the data transfer. See Disk Import Transfer Jobs for complete details on all tasks related to transfer jobs.
Tip
You can use the Console or the Data Transfer Utility to create a transfer job.
A disk-based transfer job represents the collection of files that you want to transfer and signals the intention to upload those files to Oracle Cloud Infrastructure. A disk-based transfer job combines at least one transfer disk with a transfer package. Identify the compartment and Object Storage bucket to which Oracle is to upload your data. Create the disk-based transfer job in the same compartment as the upload bucket and supply a human-readable name for the transfer job.
Note
It is recommended that you create a compartment for each transfer job to minimize the required access to your tenancy.
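If you follow this recommendation, a dedicated compartment can be created in the Console or, as sketched here, with the OCI CLI; the compartment name, description, and parent OCID are examples:

oci iam compartment create --compartment-id <parent_compartment_OCID> --name DiskImportJob01 --description "Dedicated compartment for a disk import transfer job"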
Creating a transfer job returns a job ID that you specify in other transfer tasks. For example:
Open the navigation menu and click Migration & Disaster Recovery. Under Data Transfer, click Imports. The Transfer Jobs page appears.
Choose a Compartment you have permission to work in under List scope. All transfer jobs in that compartment are listed in tabular form.
Click Create Transfer Job. The Create Transfer Job dialog box appears.
Complete the following:
Job Name: Enter a name for the transfer job.
Bucket: Select the bucket that contains the transfer data from the list. All available buckets for the selected compartment are listed. If you want to select a bucket in a different compartment, click Change Compartment and select the compartment that contains the bucket you want.
Run dts job create --help to view the complete list of flags and variable options.
For example:
dts job create --bucket MyBucket1 --compartment-id ocid.compartment.oc1..exampleuniqueID --display-name MyDiskImportJob
Transfer Job :
ID : ocid1.datatransferjob.oc1..exampleuniqueID
CompartmentId : ocid.compartment.oc1..exampleuniqueID
UploadBucket : MyBucket1
Name : MyDiskImportJob
Label : JZM9PAVWH
CreationDate : 2019/06/04 17:07:05 EDT
Status : PREPARING
freeformTags : *** none ***
definedTags : *** none ***
Packages :
[1] :
Label : PBNZOX9RU
TransferSiteShippingAddress : Oracle Data Transfer Service; Job:JZM9PAVWH Package:PBNZOX9RU ; 21111 Ridgetop Circle; Dock B; Sterling, VA 20166; USA
DeliveryVendor : FedEx
DeliveryTrackingNumber : *** none ***
ReturnDeliveryTrackingNumber : *** none ***
Status : PREPARING
Devices : [*** none ***]
UnattachedDevices : [*** none ***]
Appliances : [*** none ***]
When you display the details of a job, tagging details are also included in the output if you specified tags.
Optionally, you can specify one or more defined or free-form tags when you create a transfer job. For more information about tagging, see Resource Tags.
dts job create --bucket MyBucket1 --compartment-id ocid.compartment.oc1..exampleuniqueID --display-name MyDiskImportJob --defined-tags '{"Operations": {"CostCenter": "01"}}'
Transfer Job :
ID : ocid1.datatransferjob.oc1..exampleuniqueID
CompartmentId : ocid.compartment.oc1..exampleuniqueID
UploadBucket : MyBucket1
Name : MyDiskImportJob
Label : JZM9PAVWH
CreationDate : 2019/06/04 17:07:05 EDT
Status : PREPARING
freeformTags : *** none ***
definedTags :
Operations :
CostCenter : 01
Packages :
[1] :
Label : PBNZOX9RU
TransferSiteShippingAddress : Oracle Data Transfer Service; Job:JZM9PAVWH Package:PBNZOX9RU ; 21111 Ridgetop Circle; Dock B; Sterling, VA 20166; USA
DeliveryVendor : FedEx
DeliveryTrackingNumber : *** none ***
ReturnDeliveryTrackingNumber : *** none ***
Status : PREPARING
Devices : [*** none ***]
UnattachedDevices : [*** none ***]
Appliances : [*** none ***]
When you display the details of a job, tagging details are also included in the output if you specified tags.
Note
Users with the required permissions create tag namespaces and tag keys. These items must exist before you can specify them when creating a job. See Tags and Tag Namespace Concepts for details.
Each transfer job you create has a unique OCID within Oracle Cloud Infrastructure. For example:
ocid1.datatransferjob.region1.phx..unique_ID
You will need to forward this transfer job OCID to the Data Administrator.
Using the Console
Open the navigation menu and click Migration & Disaster Recovery. Under Data Transfer, click Imports. The Transfer Jobs page appears.
Choose a Compartment you have permission to work in under List scope. All transfer jobs in that compartment are listed in tabular form.
Click the transfer job whose OCID you want to get. The transfer job's Details page appears.
Find the OCID field and click Show to display it or Copy to copy it to your computer.
Using the Data Transfer Utility
Use the dts job list command and required parameters to list the transfer jobs in your compartment. Here you can view the job OCID.
dts job list --compartment-id compartment_id
Run dts job list --help to view the complete list of flags and variable options.
For example:
dts job list --compartment-id ocid.compartment.oc1..exampleuniqueID
Transfer Job List :
[1] :
ID : ocid1.datatransferjob.oc1..exampleuniqueID
Name : MyDiskImportJob
Label : JVWK5YWPU
BucketName : MyBucket1
CreationDate : 2020/06/01 17:33:16 EDT
Status : INITIATED
FreeformTags : *** none ***
DefinedTags :
Financials :
key1 : nondefault
The ID for each transfer job is returned:
ID : ocid1.datatransferjob.oc1..exampleuniqueID
Tip
When you create a transfer job using the dts job create command, the transfer job ID is displayed in the command's output.
Creating Upload Configuration Files
The Project Sponsor is responsible for creating or obtaining the configuration files that allow your data to be uploaded to Oracle Cloud Infrastructure. Send these configuration files to the Data Administrator to be placed on the Data Host. The config file is for the data transfer administrator, the IAM user with the authorization and permissions to create and manage transfer jobs. The config_upload_user file is for the data transfer upload user, the temporary IAM user that Oracle uses to upload your data on your behalf.
Create a base Oracle Cloud Infrastructure directory and two configuration files with the required credentials.
Creating the Data Transfer Directory
Create an Oracle Cloud Infrastructure directory (.oci) on the same Data Host where the Data Transfer Utility is installed. For example:
mkdir /root/.oci/
The two configuration files (config and config_upload_user) are placed in this directory.
Creating the Data Transfer Administrator Configuration File
Create the data transfer administrator configuration file /root/.oci/config with the following structure:
[DEFAULT]
user=<The OCID for the data transfer administrator>
fingerprint=<The fingerprint of the above user's public key>
key_file=<The _absolute_ path to the above user's private key file on the host machine>
tenancy=<The OCID for the tenancy that owns the data transfer job and bucket>
region=<The region where the transfer job and bucket should exist. Valid values are:
supported regions>.
For the data transfer administrator, you can create a single configuration file that contains different profile sections with the credentials for multiple users. Then use the --profile option to specify which profile to use in the command.
Here is an example of a data transfer administrator configuration file with different profile sections:
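In this illustrative configuration file, the OCIDs, fingerprints, and key file paths are placeholders, and profile1 is an example profile name:

[DEFAULT]
user=ocid1.user.oc1..<unique_ID_1>
fingerprint=<fingerprint_of_DEFAULT_user_public_key>
key_file=/root/.oci/dts_admin_default.pem
tenancy=ocid1.tenancy.oc1..<unique_ID>
region=us-phoenix-1

[profile1]
user=ocid1.user.oc1..<unique_ID_2>
fingerprint=<fingerprint_of_profile1_user_public_key>
key_file=/root/.oci/dts_admin_profile1.pem
tenancy=ocid1.tenancy.oc1..<unique_ID>
region=us-phoenix-1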
Creating the Data Transfer Upload User Configuration File
The config_upload_user configuration file is for the data transfer upload user, the temporary IAM user that Oracle uses to upload your data on your behalf. Create this configuration file with the following structure:
[DEFAULT]
user=<The OCID for the data transfer upload user>
fingerprint=<The fingerprint of the above user's public key>
key_file=<The _absolute_ path to the above user's private key file on the host machine>
tenancy=<The OCID for the tenancy that owns the data transfer job and bucket>
region=<The region where the transfer job and bucket should exist. Valid values are:
supported regions>
The following table lists the basic entries that are required for each configuration file and where to get the information for each entry.
Note
Data Transfer Service does not support passphrases on the key files for both the data transfer administrator and the data transfer upload user.
Entry: user
Description and Where to Get the Value: OCID of the data transfer administrator or the data transfer upload user, depending on which profile you are creating. To get the value, see Required Keys and OCIDs.
Required?: Yes
A transfer package is the virtual representation of the physical disk package that you are shipping to Oracle for upload to Oracle Cloud Infrastructure. See Transfer Packages for complete details on all tasks related to transfer packages.
Creating a transfer package requires the job ID returned when you created the transfer job. For example:
Open the navigation menu and click Migration & Disaster Recovery. Under Data Transfer, click Imports. The Transfer Jobs page appears.
Choose a Compartment you have permission to work in under List scope. All transfer jobs in that compartment are listed in tabular form.
Click the transfer job for which you want to get a transfer package's label. The transfer job's Details page appears.
Click Transfer Packages under Resources. The Transfer Packages page appears. All transfer packages are listed in tabular form.
View the Labels column for a list of transfer package labels.
Using the Data Transfer Utility
Use the dts job show command and required parameters to display the details of a transfer job in your compartment.
dts job show --job-id job_id
Run dts job show --help to view the complete list of flags and variable options.
For example:
dts job show --job-id ocid1.datatransferjob.oc1..exampleuniqueID
Transfer Job :
ID : ocid1.datatransferjob.oc1..exampleuniqueID
CompartmentId : ocid.compartment.oc1..exampleuniqueID
UploadBucket : MyBucket1
Name : MyDiskImportJob
Label : JZM9PAVWH
CreationDate : 2019/06/04 17:07:05 EDT
Status : PREPARING
freeformTags : *** none ***
definedTags : *** none ***
Packages :
[1] :
Label : PBNZOX9RU
TransferSiteShippingAddress : Oracle Data Transfer Service; Job:JZM9PAVWH Package:PBNZOX9RU ; 21111 Ridgetop Circle; Dock B; Sterling, VA 20166; USA
DeliveryVendor : FedEx
DeliveryTrackingNumber : *** none ***
ReturnDeliveryTrackingNumber : *** none ***
Status : PREPARING
Devices : [*** none ***]
UnattachedDevices : [*** none ***]
Appliances : [*** none ***]
The transfer package label is displayed as part of the job details.
Getting Shipping Labels
You can find the shipping address in the transfer package details. Use this information to get a shipping label for the transfer package that is used to send the disk to Oracle.
After getting the shipping labels from the Console or Data Transfer Utility, go to the supported carrier you are using (UPS, FedEx, or DHL) and manually create both the SHIP TO ORACLE and RETURN TO CUSTOMER labels. See Shipping Import Disks and Monitoring the Import Disk Shipment and Data Transfer for information.
Using the Console
Open the navigation menu and click Migration & Disaster Recovery. Under Data Transfer, click Imports. The Transfer Jobs page appears.
Choose a Compartment you have permission to work in under List scope. All transfer jobs in that compartment are listed in tabular form.
Click the transfer job for which you want to see the details. The transfer job's Details page appears.
Click Transfer Packages under Resources. The Transfer Packages page appears. All transfer packages are listed in tabular form.
Click the transfer package whose details you want to get. The transfer package's Details page appears.
Get the shipping label.
Using the Data Transfer Utility
Use the dts package show command and required parameters to get the shipping label for a transfer package.
dts package show --job-id job_id --package-label package_label
Run dts package show --help to view the complete list of flags and variable options.
For example:
dts package show --job-id ocid1.datatransferjob.oc1..exampleuniqueID --package-label PWA8O67MI
Transfer Package :
Label : PWA8O67MI
TransferSiteShippingAddress : Oracle Data Transfer Service; Job:JZM9PAVWH Package:PWA8O67MI ; 21111 Ridgetop Circle; Dock B; Sterling, VA 20166; USA
DeliveryVendor : *** none ***
DeliveryTrackingNumber : *** none ***
ReturnDeliveryTrackingNumber : *** none ***
Status : PREPARING
Devices : [*** none ***]
Notifying the Data Administrator
When you have completed all the tasks in this topic, provide the Data Administrator with the following: