# Tutorial: How to Qualify Tax and Business Identifiers

## Overview

**Use case**

The purpose of the qualified check of EU VAT identifiers is to prove the consistency of the maintained address and name against the address and name with which the EU VAT number is registered. This use case describes how to perform such a qualified validation of records to comply with EU tax compliance requirements (in German usually referred to as "UID-Bestätigungsverfahren").

The typical business use case is to qualify name and address data against a VAT number when doing intra-community business, i.e., delivering or selling goods across inner-European borders. For example, a German company that sells and delivers goods to a French business partner has to perform a qualified check of the French business partner's VAT number.

This use case is based on the wiki [use case](https://meta.cdq.com/Use_case/Qualified_Check_of_a_European_VAT_Identifier) but presents an integration approach using CDQ APIs.

|  |
| --- |
| ![](/assets/tax-indentifier-overview.df048f1eb156deff0023a5b7cc9385c6fd9965e5655c44e9ea85c9446353479a.9e768016.png) |
| *Use Case: Situation, Approach and Results* |

**Learning Goals**

|  |
| --- |
| ![](/assets/qcevi_process.5fe9f9caf7e329238de01227fea266ed0c5aa24cc8aceb39cd68af6254aa6fb9.9e768016.png) |
| *Common Process and involved APIs/Endpoints* |

In this tutorial, you will learn how to:

1. Create a storage and upload data.
2. Run a validation job with the Data Validation API.
3. Build a report.
4. Download the report.
5. Delete the storage.

## Prerequisites

### Authorization

Before trying the CDQ APIs, you must be authenticated:

1. Paste your API key into the `X-API-KEY` field in the console's security bar.

   ![](/assets/auth7.0d4fb05fa55bc16d828c76e9976f979930ffadb949cfc149a6617cc6518ab57d.6fde6558.png)

2. After pasting the API key, a green padlock appears.

   ![](/assets/auth8.28e02515b78db57ec32c07fd6bd7d296484e64f65ef6a3bc5296520cb1da8d65.6fde6558.png)

   Be careful: the green padlock does not mean that the API key was pasted correctly.

3. Check your API key for missing characters or extra spaces before sending a request.

#### No API Key?

1. Check how to get one on the [authentication page](/documentation/instructions/authentication).
2. Follow the steps above.

## Step 1: Data Import

### Create a storage

Skip this step if a **Data Storage** was created before. You can use it multiple times for your files. Get the `id` of your **Data Storage** and go directly to the next step.

A storage is needed before you can upload any files! Before uploading:

1. Use the **Create Storage** endpoint to send the request below.
2. Select the **Create a storage** example.
3. Replace `dataStorageExample` and `dataSourceExample` with new names. For the purposes of this tutorial, the value of `dataMapperDefinitionId` is set to `default`.
4. Check the above example in the **Try It Console** now.

Important information:

* Your storage is created if `status`: `CREATED_SUCCESSFULLY` is displayed in the response body.
* The `id` refers to the unique identifier of the newly created storage (for example `id`: `3a10645d8be23f53a20b30bfa936e63d`).
* The storage is empty by default.

Congratulations! The Data Storage is successfully created. Use the storage `id` in the next step.

### Upload data via CSV/XLSX

To start the import job for a CSV or XLSX file:

1. Use the **Start Import Job** endpoint to send the request below.
2. Paste the existing storage `id`.
3. Select the file to upload. You can use your own file or the [CDQ Sample File](https://apps.cdq.com/files/SampleData.xlsx).
4. Check the above example in the **Try It Console** now.

In the response, you will find three important parameters:

* The value of `id` represents `YOUR_UPLOAD_JOB_ID` - the unique number of every upload job.
* `progress`: `0` represents the progress of the job.
* `status`: `SCHEDULED` - the job is in the job queue.

Note down the received `YOUR_UPLOAD_JOB_ID` for polling the job status. A scripted sketch of this step follows below.
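If you prefer to script Step 1 instead of using the Try It Console, the sketch below creates a storage and starts an import job with Python's `requests` library. It is a minimal, non-authoritative sketch: the base URL, endpoint paths, multipart field name, and request body fields (`BASE_URL`, `/storages`, `/storages/{id}/importjobs`, `dataSources`, `file`) are assumptions made for illustration - check the **Create Storage** and **Start Import Job** references in the Try It Console for the exact paths and payloads. Only the `X-API-KEY` header and the `dataMapperDefinitionId: default` value are taken from this tutorial.

```python
import requests

API_KEY = "YOUR_API_KEY"                              # see the authentication page
BASE_URL = "https://api.cdq.com/data-exchange/rest"   # assumption - check the API reference
HEADERS = {"X-API-KEY": API_KEY}

# 1) Create a storage (skip if you already have a storage id).
#    Endpoint path and body fields are assumptions based on the
#    "Create a storage" example shown in the Try It Console.
create_payload = {
    "name": "dataStorageExample",
    "dataSources": [
        {
            "name": "dataSourceExample",
            "dataMapperDefinitionId": "default",      # value used in this tutorial
        }
    ],
}
resp = requests.post(f"{BASE_URL}/storages", headers=HEADERS, json=create_payload)
resp.raise_for_status()
storage = resp.json()
print(storage.get("status"))   # expect CREATED_SUCCESSFULLY
storage_id = storage["id"]     # YOUR_STORAGE_ID, e.g. 3a10645d8be23f53a20b30bfa936e63d

# 2) Start the import job by uploading a CSV/XLSX file into the storage.
#    The multipart field name "file" is an assumption.
with open("SampleData.xlsx", "rb") as f:
    resp = requests.post(
        f"{BASE_URL}/storages/{storage_id}/importjobs",
        headers=HEADERS,
        files={"file": f},
    )
resp.raise_for_status()
upload_job = resp.json()
upload_job_id = upload_job["id"]                      # YOUR_UPLOAD_JOB_ID - note it down
print(upload_job["status"], upload_job["progress"])   # SCHEDULED, 0
```

Keep `upload_job_id`; it is the value you poll in the next step.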
### Wait until upload is finished

To check the import job status:

1. Use the **Poll Import Job** endpoint to send the request below.
2. Use the `YOUR_UPLOAD_JOB_ID` from the previous step and replace the `jobId`.
3. Check the above example in the **Try It Console** now.

* If the job is still running, the status with the progress will be displayed:

```json
{
  "id": "YOUR_UPLOAD_JOB_ID",
  "progress": "71",
  "status": "RUNNING"
}
```

  The import job runtime depends on the size of the imported file. Poll the status regularly to find out when the import is done.

* When the job is done, the `FINISHED` status will be displayed:

```json
{
  "id": "YOUR_UPLOAD_JOB_ID",
  "progress": "100",
  "status": "FINISHED"
}
```

The data is now fully imported into your storage and can be processed by a wide range of CDQ Solutions.

## Step 2: Execute a qualification batch job

### Start the EU VAT Qualification

To validate the uploaded data:

1. Use the **Business Partner Validation Jobs** endpoint.
2. Modify the request and replace `YOUR_STORAGE_ID` with the storage `id` received in Step 1. The `EU_VAT_QUALIFICATION` profile is set as the validation profile; more profiles are described on the [documentation page](https://developer.cdq.com/reference-docs/data-validation/V2/tag/Batch-Validation/#tag/Batch-Validation/paths/~1v2~1businesspartnervalidationjobs/post).
3. Check the above example in the **Try It Console** now.

In the response, you will find three important parameters:

* The value of `id` represents `YOUR_VALIDATION_JOB_ID` - the unique number of every validation job.
* `progress`: `0` represents the progress of the job.
* `status`: `SCHEDULED` - the job is in the job queue.

Save the received `YOUR_VALIDATION_JOB_ID` for polling the job status and for the report building step.

### Wait for completion

To poll the validation job status:

1. Use the **Poll Validation Job** endpoint to send the request below.
2. Use the `YOUR_VALIDATION_JOB_ID` from the previous step and paste it into the job `id` field.
3. Check the above example in the **Try It Console** now.

* If the job is still running, the status with the progress will be displayed:

```json
{
  "id": "YOUR_VALIDATION_JOB_ID",
  "status": "RUNNING",
  "progress": "71"
}
```

  The validation job runtime depends on the size of the imported file. Poll the status regularly to find out when the validation is done.

* When the job is done, the `FINISHED` status will be displayed:

```json
{
  "id": "YOUR_VALIDATION_JOB_ID",
  "status": "FINISHED",
  "progress": "100"
}
```

The data is now qualified! A scripted sketch of Step 2 follows below.
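As with the import, Step 2 can also be scripted. The sketch below starts the `EU_VAT_QUALIFICATION` batch job and polls it until it reaches `FINISHED`, again using Python's `requests`. The `/v2/businesspartnervalidationjobs` path follows the Batch Validation reference linked above; the Data Validation base URL, the request body field names (`storageId`, `profile`), and the polling path are assumptions - verify them in the Try It Console before use.

```python
import time
import requests

API_KEY = "YOUR_API_KEY"
VALIDATION_BASE = "https://api.cdq.com/data-validation/rest"  # assumption - check the API reference
HEADERS = {"X-API-KEY": API_KEY}

storage_id = "YOUR_STORAGE_ID"   # storage id from Step 1

# 1) Start the qualification batch job.
#    The /v2/businesspartnervalidationjobs path is taken from the Batch Validation docs;
#    the body field names are assumptions - check the "Business Partner Validation Jobs" example.
start_payload = {
    "storageId": storage_id,
    "profile": "EU_VAT_QUALIFICATION",
}
resp = requests.post(
    f"{VALIDATION_BASE}/v2/businesspartnervalidationjobs",
    headers=HEADERS,
    json=start_payload,
)
resp.raise_for_status()
validation_job_id = resp.json()["id"]   # YOUR_VALIDATION_JOB_ID

# 2) Poll the job until it is finished (Poll Validation Job endpoint - path assumed).
while True:
    resp = requests.get(
        f"{VALIDATION_BASE}/v2/businesspartnervalidationjobs/{validation_job_id}",
        headers=HEADERS,
    )
    resp.raise_for_status()
    job = resp.json()
    print(job["status"], job.get("progress"))
    if job["status"] == "FINISHED":
        break
    time.sleep(10)   # job runtime depends on the size of the imported file
```

Keep `validation_job_id`; it is the input for the report building job in Step 3.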
## Step 3: Build the report

### Start the report building job

To start the report building job:

1. Use the **Build Validation Reports** endpoint.
2. Modify the request and replace `YOUR_VALIDATION_JOB_ID` with the `id` value received in Step 2. In the `reportRequest`, the `build` parameter is set to `true` (the default value is `false`).
3. Check the above example in the **Try It Console** now.

In the response, you will find three important parameters:

* The value of `id` represents `YOUR_REPORT_JOB_ID` - the unique number of every report building job.
* `progress`: `0` represents the progress of the job.
* `status`: `SCHEDULED` - the job is in the job queue.

Save the received `YOUR_REPORT_JOB_ID` for polling the job status.

### Wait for completion

To poll the report job status:

1. Use the **Poll Validation Reports** endpoint to send the request below.
2. Use the `YOUR_REPORT_JOB_ID` from the previous step and paste it into the job `id` field.
3. Check the above example in the **Try It Console** now.

In the response, you will find two important parameters:

* `status`: `FINISHED` - the job is done.
* `url`: `job-files/YOUR_FILE_ID/Qualification+Report+Collection.zip` - the link to the desired report.

```json
{
  "id": "YOUR_REPORT_JOB_ID",
  "status": "FINISHED",
  "attachments": [
    {
      "name": "Qualification+Report+Collection",
      "url": "YOUR_REPORT_URL"
    }
  ]
}
```

Save the received `YOUR_REPORT_URL` for the download. Copy the entire `url` value.

## Step 4: Download the result report

To download the report:

1. Use the **Request a file download** endpoint to send the request below.
2. Use the `YOUR_REPORT_URL` from the previous step and paste it into the `url` field.
3. Check the above example in the **Try It Console** now.

In the response, you will find the download details:

```json
{
  "url": "YOUR_REPORT_URL",
  "downloadLink": "YOUR_DOWNLOAD_LINK",
  "downloadMethod": "GET"
}
```

The file is waiting to be downloaded. Use the provided link and the HTTP method.

## Step 5: Delete storage

GOOD PRACTICES: Delete the storage once you no longer need it.

1. Use the **Delete Storage** endpoint to send the request below.
2. Use the `id` of the storage that is no longer in use and replace the `storageId` in the request.
3. Check the above example in the **Try It Console** now.

A combined scripted sketch of Steps 3-5 is provided at the end of this tutorial.

## Your opinion matters!

We are constantly working on providing an outstanding user experience with our products. Please share your opinion about this tutorial!
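To wrap up, here is a minimal sketch that chains Steps 3-5: building the report, requesting the download link, and deleting the storage. As in the earlier sketches, the base URLs, endpoint paths, and body fields are assumptions made for illustration - the authoritative requests are the ones shown in the Try It Console for the **Build Validation Reports**, **Poll Validation Reports**, **Request a file download**, and **Delete Storage** endpoints.

```python
import time
import requests

API_KEY = "YOUR_API_KEY"
VALIDATION_BASE = "https://api.cdq.com/data-validation/rest"   # assumption - check the API reference
EXCHANGE_BASE = "https://api.cdq.com/data-exchange/rest"       # assumption - check the API reference
HEADERS = {"X-API-KEY": API_KEY}

validation_job_id = "YOUR_VALIDATION_JOB_ID"   # from Step 2
storage_id = "YOUR_STORAGE_ID"                 # from Step 1

# Step 3a: start the report building job (Build Validation Reports - path and body assumed).
resp = requests.post(
    f"{VALIDATION_BASE}/v2/businesspartnervalidationjobs/{validation_job_id}/reports",
    headers=HEADERS,
    json={"reportRequest": {"build": True}},   # build defaults to false
)
resp.raise_for_status()
report_job_id = resp.json()["id"]              # YOUR_REPORT_JOB_ID

# Step 3b: poll the report job until it is FINISHED (Poll Validation Reports - path assumed).
while True:
    resp = requests.get(
        f"{VALIDATION_BASE}/v2/businesspartnervalidationjobs/{validation_job_id}/reports/{report_job_id}",
        headers=HEADERS,
    )
    resp.raise_for_status()
    report_job = resp.json()
    if report_job["status"] == "FINISHED":
        break
    time.sleep(10)

report_url = report_job["attachments"][0]["url"]   # YOUR_REPORT_URL

# Step 4: request a download link for the report file (Request a file download - path assumed).
resp = requests.post(f"{EXCHANGE_BASE}/downloads", headers=HEADERS, json={"url": report_url})
resp.raise_for_status()
download = resp.json()
resp = requests.request(download["downloadMethod"], download["downloadLink"])   # usually GET
resp.raise_for_status()
with open("Qualification_Report_Collection.zip", "wb") as f:
    f.write(resp.content)

# Step 5: delete the storage once it is no longer needed (Delete Storage - path assumed).
resp = requests.delete(f"{EXCHANGE_BASE}/storages/{storage_id}", headers=HEADERS)
resp.raise_for_status()
```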