Uploading CSV/XLSX with Request Upload

Overview

This step-by-step guide walks you through the data upload process using Request Upload. Uploaded data can be used with Use Case Enablers.

Scenario

In this tutorial you will learn how to:

  1. Create a Data Storage
  2. Request a file upload
  3. Upload the desired data
  4. Read the uploaded data
  5. Delete an unused Data Storage

Step 1. Create a storage

attention
Skip this step if a Data Storage was created before. You can reuse it for your files multiple times. Get the id of your Data Storage and go directly to Step 2.

A storage is required before you can upload any files. To create one:

  1. Use the Create Storage endpoint to send the below request
  2. Select the Create a storage example
  3. Replace the dataStorageExample and the dataSourceExample with new names
application/json
{
  "name": "dataStorageExample",
  "dataMatchingDefinitionId": "string",
  "dataSources": [
    {
      "dataSourceName": "dataSourceExample",
      "dataMapperDefinitionId": "61a5d4dc773d1012a3dfa5aa"
    }
  ],
  "featuresOn": [
    "LOOKUP",
    "DATA_MIRROR"
  ],
  "sharedWithOrganization": false,
  "labels": [
    "string"
  ]
}
  4. Check the above example in the Try It Console

Important information:

  • Your storage is created if status: CREATED_SUCCESSFULLY is displayed in the response body.
  • The id is the unique identifier of the newly created storage (for example id: 3a10645d8be23f53a20b30bfa936e63d).
  • The storage is empty by default.
Congratulations!
The Data Storage is successfully created. Use the storage id in the next step.
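The Create Storage request body above can also be built programmatically. A minimal Python sketch of assembling the payload; the optional dataMatchingDefinitionId and labels fields are omitted here for brevity, and the field values are the example names from the request above:

```python
import json

# Build the Create Storage request body from Step 1.
# Replace the example names with your own before sending.
payload = {
    "name": "dataStorageExample",
    "dataSources": [
        {
            "dataSourceName": "dataSourceExample",
            "dataMapperDefinitionId": "61a5d4dc773d1012a3dfa5aa",
        }
    ],
    "featuresOn": ["LOOKUP", "DATA_MIRROR"],
    "sharedWithOrganization": False,
}

# Serialize to the JSON string that goes into the request body.
body = json.dumps(payload)
print(body)
```

Sending the request itself (endpoint URL and authentication) is done in the Try It Console or any HTTP client, as described above.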

Step 2. Request the file upload

Get the upload link

To request the upload of a CSV or XLSX file:

  1. Use the Request file upload endpoint to send the below request
  2. Change the name of the imported file if needed
application/json
{
  "url": "file.xlsx"
}
  3. Check the above example in the Try It Console

In the response you will find 3 important parameters:

  • The uploadLink with the link to be used in the next step
  • The uploadMethod with the HTTP method to be used during the upload
  • The url with the location of your uploaded file
{
  "uploadLink": "https://example.org/some.line",
  "uploadMethod": "PUT",
  "url": "customer-uploads/cdq/user-id/file.xlsx"
}
attention
Use the uploadLink and the uploadMethod during the upload. Use the url in the next step.
warning

The upload link is only valid for a limited time. Upload the file as soon as possible.
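The three parameters can be extracted from the response like this. A minimal sketch that parses the example response shown above:

```python
import json

# Example response body from the Request file upload endpoint
# (values copied from the snippet above).
response_body = """
{
  "uploadLink": "https://example.org/some.line",
  "uploadMethod": "PUT",
  "url": "customer-uploads/cdq/user-id/file.xlsx"
}
"""

data = json.loads(response_body)
upload_link = data["uploadLink"]      # where to send the file
upload_method = data["uploadMethod"]  # HTTP method for the upload
file_url = data["url"]                # reference used in Step 3
```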

Upload the file

Uploading the file must be performed with an external HTTP client such as Insomnia.

To upload the file:

  1. Create a new request
  2. Set the uploadMethod and paste the uploadLink
(Screenshot: Defining the request parameters)
  3. Change the body of the request to Binary File
  4. Select the file
(Screenshots: Setting the request body, Selecting the file)
  5. Send the request
info

An empty response body is expected. Go to the next step.
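The same upload can be sketched in Python instead of a GUI client. A minimal sketch, assuming the uploadLink and uploadMethod returned in the previous step; the file content and Content-Type header here are placeholders:

```python
import urllib.request

upload_link = "https://example.org/some.line"  # value of uploadLink
# In practice: file_bytes = open("file.xlsx", "rb").read()
file_bytes = b"example,content\n"  # placeholder binary body

# Build the upload request with the method from uploadMethod.
req = urllib.request.Request(
    upload_link,
    data=file_bytes,
    method="PUT",  # value of uploadMethod
    headers={"Content-Type": "application/octet-stream"},
)

# Sending is left out here because the signed link expires quickly:
# urllib.request.urlopen(req)
```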

Step 3. Upload data to the storage

Use the received url

To start the import job of the uploaded file:

  1. Use the Start Import Job endpoint to send the below request
  2. Paste the existing storage id
  3. Paste the url from Step 2
  4. Check the above example in the Try It Console

In the response you will find 3 important parameters:

  • The id with the value representing YOUR JOB ID - the unique number of every upload job.
  • The progress : 0 representing the progress of the job.
  • The status : SCHEDULED - the job is in the job queue.
attention
Note down the received YOUR JOB ID for polling the job status.

Poll for the completion status

To check the import job status:

  1. Use the Poll Import Job endpoint to send the below request
  2. Use the YOUR JOB ID from the previous step and replace the jobId
  3. Check the above example in the Try It Console
  • If the job is still running, the status with the progress will be displayed:
{
  "id" : "YOUR JOB ID",
  "progress" : "71",
  "status" : "RUNNING"
}
info

The import job runtime is dependent on the size of the imported file. Poll the status regularly to find out when the import is done.

  • When the job is done, the FINISHED status will be displayed:
{
  "id" : "YOUR JOB ID",
  "progress" : "100",
  "status" : "FINISHED"
}
success

The data is now fully imported to your storage and can be processed by a wide range of CDQ Solutions.
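The polling described above can be automated with a simple loop. A minimal sketch in which poll_import_job is a stub that simulates the Poll Import Job endpoint's SCHEDULED, RUNNING, and FINISHED responses; in real code it would be replaced by an HTTP call with your job id:

```python
import time

# Stubbed sequence of job states, as the Poll Import Job
# endpoint would return them on successive calls.
_states = iter([
    {"id": "YOUR JOB ID", "progress": "0", "status": "SCHEDULED"},
    {"id": "YOUR JOB ID", "progress": "71", "status": "RUNNING"},
    {"id": "YOUR JOB ID", "progress": "100", "status": "FINISHED"},
])

def poll_import_job(job_id):
    # Stub: returns the next simulated job state.
    # Replace with a real call to the Poll Import Job endpoint.
    return next(_states)

def wait_for_import(job_id, interval_seconds=0.0):
    # Poll regularly until the job reports FINISHED.
    while True:
        job = poll_import_job(job_id)
        if job["status"] == "FINISHED":
            return job
        time.sleep(interval_seconds)

result = wait_for_import("YOUR JOB ID")
```

A real loop should also stop on a failure status and use a polling interval of a few seconds, since the runtime depends on the file size.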

Step 4. Read data

To read uploaded data:

  1. Use the Read Business Partners endpoint to send the below request
  2. Use the existing storage id
  3. Check the above example in the Try It Console
In the response, the Business Partner data will be displayed in the values array:
{
    "values" : [
      {
        `businessPartnerData1`
      },
      {
        `businessPartnerData2`
      }
    ]
}
attention
The businessPartnerData1 and businessPartnerData2 are placeholders for the sets of parameters of a particular Business Partner, used here for simplification.
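Iterating over the values array looks like this. A minimal sketch with an invented sample field (names) standing in for the real Business Partner schema:

```python
import json

# Sample response body; the "names" field is an illustrative
# placeholder, not the actual response schema.
response_body = """
{
  "values": [
    {"names": [{"value": "Example Corp"}]},
    {"names": [{"value": "Sample GmbH"}]}
  ]
}
"""

# Each entry in values is one Business Partner record.
partners = json.loads(response_body)["values"]
for partner in partners:
    pass  # process each Business Partner record here
```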

Step 5. Delete storage

GOOD PRACTICES

Delete the storage after you no longer need it.

  1. Use the Delete Storage endpoint to send the below request
  2. Use the id of a storage no longer in use and replace the storageId in the request
  3. Check the above example in the Try It Console
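The delete call can be sketched in code as well. A minimal sketch in which the base URL and endpoint path are hypothetical, shown only to illustrate substituting the storageId:

```python
import urllib.request

storage_id = "3a10645d8be23f53a20b30bfa936e63d"  # id from Step 1

# Hypothetical endpoint path for illustration; consult the
# Delete Storage endpoint reference for the real URL.
url = f"https://api.example.com/data-storages/{storage_id}"

req = urllib.request.Request(url, method="DELETE")
# urllib.request.urlopen(req) would send it (auth header omitted)
```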

Your opinion matters!

We are constantly working on providing an outstanding user experience with our products. Please share your opinion about this tutorial!

Mail our developer-portal team: developer-portal@cdq.com