Please note that this instruction is deprecated. For the up-to-date file upload process, use the Request Upload.
Overview
This step-by-step instruction guides the user through the data upload process. The uploaded data can be used with Use Case Enablers.
Scenario
In this instruction, the user will learn how to:
- Create a Data Storage,
- Upload the desired data,
- Read the uploaded data,
- Delete an unused data storage.
Skip this step if a Data Storage has already been created. You can use it multiple times for your files. Get the id of your Data Storage and go directly to Step 2.
To upload the desired files, a storage is needed. Before uploading:
- Use the Create Storage endpoint to send the below request
- Select the Create a storage example
- Replace the `dataStorageExample` and the `dataSourceExample` with new names
- Check the above example in the Try It Console now:
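The create-storage request from the steps above can be sketched in Python. The payload field names are assumptions for illustration, not the exact CDQ API contract:

```python
# Build the JSON body for the Create Storage request (assumed shape).
# "myDataStorage" and "myDataSource" stand in for the dataStorageExample
# and dataSourceExample placeholders from the steps above.
def build_create_storage_payload(storage_name: str, data_source_name: str) -> dict:
    return {
        "name": storage_name,            # replaces dataStorageExample
        "dataSource": data_source_name,  # replaces dataSourceExample
    }

payload = build_create_storage_payload("myDataStorage", "myDataSource")
print(payload)
```

Sending it would typically be a POST request with your API key in a header, for example `requests.post(url, json=payload, headers={"X-API-KEY": key})`; the URL and header name are assumptions here.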
Important information:
- Your storage is created if `status: CREATED_SUCCESSFULLY` is displayed in the response body.
- The `id` refers to the unique identifier of a newly created storage (for example `id: 3a10645d8be23f53a20b30bfa936e63d`).
- The storage is empty by default.
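Checking the create-storage response can be sketched as below; the field names come from the description above:

```python
# Return the new storage id if creation succeeded, otherwise raise.
def extract_storage_id(response: dict) -> str:
    if response.get("status") != "CREATED_SUCCESSFULLY":
        raise RuntimeError(f"Storage creation failed: {response}")
    return response["id"]

storage_id = extract_storage_id(
    {"status": "CREATED_SUCCESSFULLY", "id": "3a10645d8be23f53a20b30bfa936e63d"}
)
print(storage_id)  # the id to reuse in the next steps
```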
The Data Storage is successfully created. Use the storage id in the next step.
To start an import job for a CSV or XLSX file:
- Use the Start Import Job endpoint to send the below request
- Paste the existing storage `id`
- Point to the file path
- Check the above example in the Try It Console now:
In the response you will find 3 important parameters:
- The id with the value representing `YOUR JOB ID` - the unique number of every upload job.
- The progress: `0`, representing the progress of the job.
- The status: `SCHEDULED` - the job is in the job queue.
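Reading these three parameters out of the response can be sketched as below (field names taken from the example above; note that the progress is returned as a string):

```python
# Extract id, progress, and status from a Start Import Job response.
def parse_job_response(response: dict) -> tuple:
    return response["id"], int(response["progress"]), response["status"]

job_id, progress, status = parse_job_response(
    {"id": "YOUR JOB ID", "progress": "0", "status": "SCHEDULED"}
)
print(job_id, progress, status)
```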
Note down the received YOUR JOB ID for polling the job status.
To check the import job status:
- Use the Poll Import Job endpoint to send the below request
- Use the `YOUR JOB ID` from the previous step and replace the `jobId`
- Check the above example in the Try It Console now:
- If the job is still running, the status with the progress will be displayed:
{
"id" : "YOUR JOB ID",
"progress" : "71",
"status" : "RUNNING"
}
The import job runtime depends on the size of the imported file. Poll the status regularly to find out when the import is done.
- When the job is done, the `FINISHED` status will be displayed:
{
"id" : "YOUR JOB ID",
"progress" : "100",
"status" : "FINISHED"
}
The data is now fully imported to your storage. The uploaded data can now be processed by a wide range of CDQ Solutions.
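The polling pattern above can be sketched as a loop. `poll_job` stands in for a call to the Poll Import Job endpoint and is passed in as a function; the names and defaults here are assumptions for illustration:

```python
import time

def wait_for_import(job_id, poll_job, interval_seconds=5, timeout_seconds=600):
    """Poll the import job until it reports FINISHED, or time out.

    poll_job(job_id) must return a dict shaped like the responses above,
    e.g. {"id": ..., "progress": "71", "status": "RUNNING"}.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        job = poll_job(job_id)
        if job["status"] == "FINISHED":
            return job
        time.sleep(interval_seconds)
    raise TimeoutError(f"Import job {job_id} did not finish in time")
```

A fixed interval is used here for simplicity; a growing back-off would reduce the number of requests for large files.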
To read uploaded data:
- Use the Read Business Partners endpoint to send the below request
- Use the existing storage `id`
- Check the above example in the Try It Console now:
In the response, the Business Partners data will be displayed in the values object:
{
"values" : [
{
`businessPartnerData1`
},
{
`businessPartnerData2`
}
]
}
The businessPartnerData1 and businessPartnerData2 placeholders represent the sets of parameters for a particular Business Partner; they are used here for simplification.
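Walking through the `values` array from the response above might look like this; the record contents are placeholders, as in the example:

```python
# Yield each Business Partner record from a Read Business Partners response.
def iter_business_partners(response: dict):
    for record in response.get("values", []):
        yield record

response = {"values": [{"partner": "businessPartnerData1"},
                       {"partner": "businessPartnerData2"}]}
for partner in iter_business_partners(response):
    print(partner)
```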
To delete a data storage that is no longer in use:
- Use the Delete Storage endpoint to send the below request
- Use the existing storage `id` that is no longer in use and replace the `storageId` in the request
- Check the above example in the Try It Console now:
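Substituting the `storageId` placeholder into the delete request can be sketched as below; the URL pattern is an assumption based on this tutorial, not the documented CDQ path:

```python
# Build the Delete Storage URL by filling in the storageId placeholder
# (the /storages/{storageId} pattern is assumed for illustration).
def build_delete_storage_url(base_url: str, storage_id: str) -> str:
    return f"{base_url}/storages/{storage_id}"

print(build_delete_storage_url("https://api.example.com",
                               "3a10645d8be23f53a20b30bfa936e63d"))
```

The request itself would then be an HTTP DELETE to that URL.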
We are constantly working on providing an outstanding user experience with our products. Please share your opinion about this tutorial!