Overview
Validatar's Authoring API supports exporting tests and jobs from one project and importing them into another. This enables CI/CD workflows where test content is promoted across environments (e.g., development to staging to production) as part of your deployment pipeline.
The export produces an XML file (base64-encoded) containing the full definition of selected tests, jobs, and their folder structure. The import endpoint accepts this file and creates or updates the content in the target project.
Authentication
All CI/CD API calls use a custom header:
```
x-val-api-token: {your_api_token}
```
Create an API token with the Authoring scope (and Execution scope if you plan to run tests after import). A single token can have multiple scopes. See User Tokens for setup instructions.
Base URL
```
{VALIDATAR_API_URL}/core/api/v1/{endpoint}
```
Replace {VALIDATAR_API_URL} with your Validatar instance URL (e.g., https://mycompany.cloud.validatar.com).
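As a sketch, the base URL pattern and the custom header can be combined with the standard library (the instance URL and token below are placeholders):

```python
import urllib.request

def validatar_request(base_url: str, endpoint: str, token: str) -> urllib.request.Request:
    # Combine the base URL pattern with an endpoint path and attach
    # the x-val-api-token header Validatar's CI/CD API expects.
    return urllib.request.Request(
        f"{base_url}/core/api/v1/{endpoint}",
        headers={"x-val-api-token": token},
    )

# Example (not executed here):
# req = validatar_request("https://mycompany.cloud.validatar.com", "projects", token)
# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```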
Common Pattern
- Export — POST to the export endpoint on the source project with the tests, jobs, or folders to include
- Transfer — POST the export payload (base64 XML) to the import endpoint on the target project with data source mappings
- Validate — optionally execute the imported tests or jobs to verify they work in the target environment
- Report — surface import and execution results in the pipeline output
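The first two steps above can be sketched in Python. The helper names (post_json, promote) are illustrative, and a real pipeline would add error handling and reporting:

```python
import json
import urllib.request

BASE = "https://mycompany.cloud.validatar.com/core/api/v1"  # replace with your instance

def post_json(url: str, token: str, payload: dict) -> dict:
    """POST a JSON body with the x-val-api-token header; return the parsed response."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"x-val-api-token": token, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
    return json.loads(body) if body else {}

def attach_export(export_response: dict, import_spec: dict) -> dict:
    """Forward the export payload (base64 XML) into an import request body."""
    return {**import_spec,
            "mimeType": export_response["mimeType"],
            "contentBase64": export_response["contentBase64"]}

def promote(token: str, source_project_id: int, target_project_id: int,
            export_spec: dict, import_spec: dict) -> None:
    # Export from the source project, then import into the target project.
    export = post_json(f"{BASE}/projects/{source_project_id}/export", token, export_spec)
    post_json(f"{BASE}/projects/{target_project_id}/import", token,
              attach_export(export, import_spec))
```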
Key API Endpoints
| Endpoint | Method | Scope | Description |
|---|---|---|---|
| projects/{id}/export | POST | Authoring | Export tests and jobs from a project |
| projects/{id}/import | POST | Authoring | Import tests and jobs into a project |
| projects | GET | Authoring | List all projects (to discover project IDs) |
| data-sources | GET | Authoring | List data sources (to discover data source IDs for mapping) |
| projects/{projectId}/folders/{folderId} | GET | Authoring | Get folder details |
| projects/{projectId}/jobs/{jobId}/execute | POST | Execution | Execute a job after import |
Export Request
```json
{
  "name": "my-export-2026-04-02",
  "description": "Weekly promotion from dev to staging",
  "testFolderIds": [101, 102],
  "jobFolderIds": [201],
  "testIds": [5001, 5002],
  "jobIds": [3001],
  "includeCustomFields": true,
  "includeJobSchedules": false
}
```
| Field | Type | Description |
|---|---|---|
| name | string | Filename for the export (required) |
| description | string | Description of the export (optional) |
| testFolderIds | int[] | Export entire test folders by ID |
| jobFolderIds | int[] | Export entire job folders by ID |
| testIds | int[] | Export individual tests by ID |
| jobIds | int[] | Export individual jobs by ID |
| includeCustomFields | bool | Include custom field values in the export (recommended: true) |
| includeJobSchedules | bool | Include job schedules (set to false for CI/CD — schedules are environment-specific) |
Tip: You can export by folder or by individual test/job. For CI/CD, organizing tests into promotion-ready folders makes export simpler — export the entire folder rather than tracking individual IDs.
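Assuming folder and test IDs from your own project, the request body can be assembled with a small helper. This is a sketch; the defaults follow the recommendations above (custom fields in, schedules out):

```python
def build_export_request(name: str, *, test_folder_ids=(), job_folder_ids=(),
                         test_ids=(), job_ids=(), description="") -> dict:
    # Schedules are environment-specific, so they are excluded by default;
    # custom field values are usually worth carrying across environments.
    return {
        "name": name,
        "description": description,
        "testFolderIds": list(test_folder_ids),
        "jobFolderIds": list(job_folder_ids),
        "testIds": list(test_ids),
        "jobIds": list(job_ids),
        "includeCustomFields": True,
        "includeJobSchedules": False,
    }

# Example: export two promotion-ready test folders and one job folder.
payload = build_export_request("my-export-2026-04-02",
                               test_folder_ids=[101, 102],
                               job_folder_ids=[201])
```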
Export Response
```json
{
  "fileName": "my-export-2026-04-02.xml",
  "mimeType": "application/xml",
  "contentBase64": "PFZhbGlkYXRhckV4cG9ydD4..."
}
```
The contentBase64 field contains the full export file as a base64-encoded XML string. This is the payload you pass to the import endpoint.
Tip: For troubleshooting, decode the base64 content and save it as a local .xml file. The XML format is human-readable and can help you verify what was exported.
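For example, the decode step might look like this (the file name is arbitrary):

```python
import base64

def save_export_xml(content_base64: str, path: str) -> str:
    # Decode the export payload and write it to disk for inspection.
    xml = base64.b64decode(content_base64).decode("utf-8")
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)
    return xml
```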
Import Request
```json
{
  "mimeType": "application/xml",
  "contentBase64": "PFZhbGlkYXRhckV4cG9ydD4...",
  "targetTestFolderId": 301,
  "targetJobFolderId": 401,
  "dataSourceMappings": [
    {
      "sourceFileKey": "42",
      "targetId": 87
    }
  ],
  "includeJobSchedules": false,
  "conflictResolution": "Overwrite"
}
```
| Field | Type | Description |
|---|---|---|
| mimeType | string | Always "application/xml" |
| contentBase64 | string | The base64 content from the export response |
| targetTestFolderId | int? | Destination folder for imported tests (optional — uses original structure if omitted) |
| targetJobFolderId | int? | Destination folder for imported jobs (optional) |
| dataSourceMappings | array | Maps source data source IDs to target data source IDs (see below) |
| includeJobSchedules | bool | Whether to import schedules |
| conflictResolution | string | How to handle name conflicts: "Overwrite", "Rename", or "Skip" |
Data Source Mappings
When tests reference data sources in the source project, those data sources may have different IDs in the target project. The dataSourceMappings array tells Validatar how to remap them.
The sourceFileKey is the data source ID from the source project (visible in the export file or discoverable via GET /core/api/v1/data-sources). The targetId is the corresponding data source ID in the target project.
"dataSourceMappings": [
{ "sourceFileKey": "42", "targetId": 87 },
{ "sourceFileKey": "43", "targetId": 88 }
]
Tip: Use the GET /core/api/v1/data-sources endpoint on both the source and target instances to list data sources with their IDs and names, then build the mapping by matching names.
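A name-matching mapping builder might look like the sketch below. The id and name keys are an assumption about the shape of the list response; adjust them to match what your instance actually returns:

```python
def build_mappings(source_ds: list[dict], target_ds: list[dict]) -> list[dict]:
    # source_ds / target_ds: items assumed shaped like {"id": 42, "name": "warehouse"},
    # as listed by GET /core/api/v1/data-sources on each instance.
    target_by_name = {d["name"]: d["id"] for d in target_ds}
    mappings = []
    for d in source_ds:
        if d["name"] not in target_by_name:
            # Fail fast: an unmapped data source would fail the import anyway.
            raise ValueError(f"No matching data source in target: {d['name']}")
        mappings.append({"sourceFileKey": str(d["id"]),
                         "targetId": target_by_name[d["name"]]})
    return mappings
```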
Conflict Resolution
| Option | Behavior |
|---|---|
| Overwrite | Replaces existing tests/jobs with the same name (recommended for CI/CD) |
| Rename | Imports with a modified name to avoid conflicts |
| Skip | Skips any test/job that already exists by name |
For CI/CD pipelines, Overwrite is the recommended default. Running the same import repeatedly with Overwrite is safe and idempotent.
Import Response
A successful import returns 200 OK with no body.
A failed import returns 400 Bad Request with details:
```json
{
  "validationErrors": ["Error description"],
  "invalidDataSources": [
    {
      "sourceFileKey": "42",
      "message": "Data source not found"
    }
  ],
  "invalidMacros": [],
  "commitErrors": ["Error description"]
}
```
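For pipeline reporting, the error body can be flattened into log lines. This is a sketch; it assumes invalidMacros entries are plain strings, which the empty array above does not confirm:

```python
def format_import_errors(body: dict) -> list[str]:
    # Flatten the 400 response into printable pipeline log lines.
    lines = [f"validation: {e}" for e in body.get("validationErrors", [])]
    lines += [f"data source {d['sourceFileKey']}: {d['message']}"
              for d in body.get("invalidDataSources", [])]
    lines += [f"macro: {m}" for m in body.get("invalidMacros", [])]
    lines += [f"commit: {e}" for e in body.get("commitErrors", [])]
    return lines
```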
Finding IDs
Project, test, job, and folder IDs are visible in the Validatar UI URL:
```
https://mycompany.cloud.validatar.com/projects/43/folders/1408/tests/57878
```
You can also discover IDs programmatically:
```
GET /core/api/v1/projects                    → list projects with IDs
GET /core/api/v1/data-sources                → list data sources with IDs and names
GET /core/api/v1/projects/{id}/folders/{id}  → get folder details
```
Guides in This Section
- GitHub Actions — highly requested; includes both CI/CD and orchestration examples
- Bitbucket Pipelines — full standalone pipeline definitions
- Azure DevOps Pipelines
- Generic CI/CD (Other Tools) — portable Python, PowerShell, and Bash scripts for any CI/CD platform