Planview Customer Success Center

Step 3 - Configure Cloud Storage



 


S3 and Amazon Redshift 

You can use Data Warehouse Export to extract data to S3 or Amazon Redshift.

  1. Enter a bucket name where the data is to be stored.
  2. Enter a prefix if you want to store files extracted from AdaptiveWork in a subfolder within the bucket. Enter the main folder and the subfolder in this format: main/sub
  3. Select the File Type.
  4. Select the region where the files will be uploaded to.
  5. Enter the AWS Access Key.
  6. Enter the AWS Secret Access Key.
  7. Click Test Cloud Storage Connection.
    Note: AdaptiveWork recommends performing this test to ensure successful export to your selected cloud storage system. If the connection fails, make sure you’ve entered your account information correctly and try again.

File Limitation: 1000 rows
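The prefix in step 2 simply becomes the leading part of each object's key in the bucket, so a prefix of main/sub places files in that subfolder. A minimal sketch of how the key is composed (hypothetical helper name, not part of AdaptiveWork):

```python
def build_object_key(prefix, filename):
    """Join an optional prefix ("main/sub") with the exported file name,
    stripping stray slashes so the key has no duplicate separators."""
    prefix = (prefix or "").strip("/")
    return f"{prefix}/{filename}" if prefix else filename

# With a "main/sub" prefix the file lands in that subfolder of the bucket.
print(build_object_key("main/sub", "export.csv"))  # main/sub/export.csv
print(build_object_key("", "export.csv"))          # export.csv
```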

 

Box 

You can use Data Warehouse Export to extract data to Box.

  1. Click Associate with Box Account and log into Box.
  2. Optional - Use a prefix to store AdaptiveWork extracted files in a subfolder under the selected folder.
  3. Select the File Type.

File Limitation: 5000 rows

 

Azure Blob Storage 

You can use Data Warehouse Export to extract data to Azure Blob Storage.

Use a Connection String or Shared Access Signature (SAS) to connect to Azure.

  • Connection String:
    1. Enter the Connection String you generated in Azure.

    2. In the Container, specify the target container name (case sensitive). If the container is not found, it will be automatically created.

  • Shared Access Signature (SAS):
    1. Specify the Shared Access Signature from Azure.

    2. Click Test Cloud Storage Connection.

    3. Select the File Type.


Note: AdaptiveWork recommends performing this test to ensure successful export to your selected cloud storage system. If the connection fails, make sure you’ve entered your account information correctly and try again.

File Limitation: 1000 rows
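An Azure Storage connection string is a semicolon-separated list of Key=Value pairs (for example, AccountName and AccountKey). If the connection test fails, a quick local check that the string splits into the expected parts can rule out a copy-paste error. This is a hypothetical sanity-check helper, not part of the product:

```python
def parse_connection_string(conn_str):
    """Split an Azure Storage connection string ("Key=Value;Key=Value;...")
    into a dict. Account keys are base64 and may end in '=', so split each
    pair on the first '=' only."""
    parts = {}
    for pair in conn_str.split(";"):
        if pair:
            key, _, value = pair.partition("=")
            parts[key] = value
    return parts

# Placeholder values for illustration only.
sample = ("DefaultEndpointsProtocol=https;AccountName=mystorageacct;"
          "AccountKey=abc123==;EndpointSuffix=core.windows.net")
cfg = parse_connection_string(sample)
print(cfg["AccountName"])  # mystorageacct
```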

SFTP Server 

You can use Data Warehouse Export to extract data to an SFTP Server.

  1. Access your Data Warehouse and follow the instructions for Step 1
  2. In Step 2, select SFTP Server in the Database Type field and complete the other settings:
    • Host - specify the IP Address
    • Port - set to 22 by default
    • Username
    • Password
  3. Complete fields in Step 3:
    • Directory Name
    • File Type


Important: You must submit a request for a security/firewall IP whitelisting exception. Contact your CSM or AE, or submit a case, and provide the SFTP server IP address details.

File Limitation: 1000 rows
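Before requesting the firewall exception, you can confirm basic TCP reachability of the SFTP host and port from your own network. A minimal stdlib sketch (hypothetical helper; it only checks that the port accepts connections and does not authenticate or speak SFTP):

```python
import socket

def check_sftp_reachable(host, port=22, timeout=5):
    """Attempt a plain TCP connection to the SFTP host/port.
    Returns True if the port accepts a connection, False otherwise.
    Port 22 matches the default in the Step 2 settings."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it against the Host and Port values you entered in Step 2; a False result usually points to a firewall or IP-whitelisting issue rather than bad credentials.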

 

Google Cloud Storage 

You can use Data Warehouse Export to extract data to Google Cloud Storage.

  1. Access your Data Warehouse and follow the instructions for Step 1
  2. In Step 2, select Google Cloud Storage in the Database Type field. All other fields are disabled.
  3. In Step 3, in the Google Cloud Storage Private Key field, enter the private key in JSON format that is used to access Google Cloud Storage
  4. In the Bucket Name field, enter the designated bucket for this service.
  5. To store the extracted files in a subfolder under the bucket, enter a name for the subfolder in the Prefix field.

File Limitation: 1000 rows
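The private key pasted in step 3 is a Google service-account key in JSON format. If the export fails to authenticate, a local structural check of the key can catch truncated or mis-copied JSON. This is a hypothetical helper; the field names follow the standard service-account key layout:

```python
import json

def validate_gcs_key(key_json):
    """Sanity-check the JSON pasted into the Google Cloud Storage
    Private Key field: it must parse, contain the core service-account
    fields, and declare type "service_account"."""
    data = json.loads(key_json)
    required = {"type", "project_id", "private_key", "client_email"}
    missing = required - data.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    if data["type"] != "service_account":
        raise ValueError("expected a service_account key")
    return data["client_email"]

# Placeholder key for illustration only (truncated private_key).
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...",
    "client_email": "exporter@my-project.iam.gserviceaccount.com",
})
print(validate_gcs_key(sample))  # exporter@my-project.iam.gserviceaccount.com
```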
