Overview
GitHub Actions can automate Validatar test content promotion across environments and run data quality gates as part of your workflows. This guide covers two use cases:
- CI/CD — Export/Import: Export tests and jobs from a source Validatar project and import them into a target project on merge or manual dispatch
- Orchestration — Execute and Gate: Run Validatar tests as a PR check or deployment gate, failing the workflow if quality checks don't pass
Prerequisites
- Validatar API token with Authoring and Execution scopes — see User Tokens
- GitHub repository with Actions enabled
- Source and target Validatar project IDs and data source IDs
- Python 3.x available on the runner (included on all GitHub-hosted runners)
Note: For Validatar Server behind a firewall, you'll need a self-hosted runner with network access to the Validatar API endpoint.
Step 1: Configure GitHub Secrets
Store your Validatar credentials as repository secrets under Settings > Secrets and variables > Actions > New repository secret:
| Secret Name | Value |
|---|---|
| `VALIDATAR_API_TOKEN` | Your Validatar API token (with Authoring + Execution scopes) |
| `SOURCE_VALIDATAR_API_URL` | Source instance URL (e.g., `https://dev.cloud.validatar.com`) |
| `TARGET_VALIDATAR_API_URL` | Target instance URL (e.g., `https://prod.cloud.validatar.com`) |
If your source and target are projects within the same Validatar instance, you only need one URL and one token.
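Misnamed or missing secrets tend to surface as opaque 401s partway through a run. A minimal preflight sketch (plain Python; the names come from the table above) that either workflow script could call before touching the API:

```python
# Secret names the workflows in this guide read; drop TARGET_VALIDATAR_API_URL
# if source and target are projects on the same instance.
REQUIRED_SECRETS = [
    "VALIDATAR_API_TOKEN",
    "SOURCE_VALIDATAR_API_URL",
    "TARGET_VALIDATAR_API_URL",
]

def missing_secrets(env):
    """Return the required names that are unset or blank in env."""
    return [name for name in REQUIRED_SECRETS if not env.get(name, "").strip()]
```

Calling `missing_secrets(dict(os.environ))` at the top of a step fails fast with the offending names instead of a `KeyError` mid-script.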
Use Case 1: Export/Import on Manual Dispatch
This workflow exports tests and jobs from a development project and imports them into a staging or production project. It uses workflow_dispatch for on-demand triggering with configurable parameters.
Workflow File (Python)
Create `.github/workflows/validatar-promote.yml`:
```yaml
name: Validatar — Promote Tests

on:
  workflow_dispatch:
    inputs:
      source_project_id:
        description: "Source Validatar project ID"
        required: true
        type: string
      target_project_id:
        description: "Target Validatar project ID"
        required: true
        type: string
      test_folder_ids:
        description: "Test folder IDs to export (comma-separated)"
        required: false
        type: string
        default: ""
      job_folder_ids:
        description: "Job folder IDs to export (comma-separated)"
        required: false
        type: string
        default: ""
      data_source_mappings:
        description: "Data source mappings (source_id:target_id, comma-separated)"
        required: false
        type: string
        default: ""

jobs:
  promote:
    runs-on: ubuntu-latest
    steps:
      - name: Export from source and import to target
        env:
          SOURCE_VALIDATAR_API_URL: ${{ secrets.SOURCE_VALIDATAR_API_URL }}
          SOURCE_VALIDATAR_API_TOKEN: ${{ secrets.VALIDATAR_API_TOKEN }}
          SOURCE_PROJECT_ID: ${{ github.event.inputs.source_project_id }}
          TARGET_VALIDATAR_API_URL: ${{ secrets.TARGET_VALIDATAR_API_URL }}
          TARGET_VALIDATAR_API_TOKEN: ${{ secrets.VALIDATAR_API_TOKEN }}
          TARGET_PROJECT_ID: ${{ github.event.inputs.target_project_id }}
          EXPORT_TEST_FOLDER_IDS: ${{ github.event.inputs.test_folder_ids }}
          EXPORT_JOB_FOLDER_IDS: ${{ github.event.inputs.job_folder_ids }}
          DATA_SOURCE_MAPPINGS: ${{ github.event.inputs.data_source_mappings }}
        run: |
          python3 << 'SCRIPT'
          import os
          import sys

          import requests

          # Configuration from environment
          source_url = os.environ["SOURCE_VALIDATAR_API_URL"]
          source_token = os.environ["SOURCE_VALIDATAR_API_TOKEN"]
          source_proj = int(os.environ["SOURCE_PROJECT_ID"])
          target_url = os.environ["TARGET_VALIDATAR_API_URL"]
          target_token = os.environ["TARGET_VALIDATAR_API_TOKEN"]
          target_proj = int(os.environ["TARGET_PROJECT_ID"])

          test_folder_ids = [int(x) for x in os.environ.get("EXPORT_TEST_FOLDER_IDS", "").split(",") if x.strip()]
          job_folder_ids = [int(x) for x in os.environ.get("EXPORT_JOB_FOLDER_IDS", "").split(",") if x.strip()]

          ds_mappings = []
          raw = os.environ.get("DATA_SOURCE_MAPPINGS", "")
          if raw:
              for pair in raw.split(","):
                  src, tgt = pair.strip().split(":")
                  ds_mappings.append({"sourceFileKey": src.strip(), "targetId": int(tgt.strip())})

          # --- Export ---
          print(f"Exporting from project {source_proj}...")
          print(f"  Test folders: {test_folder_ids}")
          print(f"  Job folders: {job_folder_ids}")
          export_resp = requests.post(
              f"{source_url}/core/api/v1/projects/{source_proj}/export",
              headers={"x-val-api-token": source_token, "Content-Type": "application/json"},
              json={
                  "name": "github-actions-export",
                  "description": f"Automated export via GitHub Actions run {os.environ.get('GITHUB_RUN_ID', '')}",
                  "testFolderIds": test_folder_ids,
                  "jobFolderIds": job_folder_ids,
                  "includeCustomFields": True,
                  "includeJobSchedules": False,
              },
          )
          if export_resp.status_code != 200:
              print(f"ERROR: Export failed ({export_resp.status_code}): {export_resp.text}")
              sys.exit(1)
          export_data = export_resp.json()
          content_b64 = export_data["contentBase64"]
          print(f"Export successful: {export_data['fileName']} ({len(content_b64)} chars)")

          # --- Import ---
          print(f"\nImporting into project {target_proj}...")
          print(f"  Data source mappings: {len(ds_mappings)}")
          print("  Conflict resolution: Overwrite")
          import_payload = {
              "mimeType": "application/xml",
              "contentBase64": content_b64,
              "dataSourceMappings": ds_mappings,
              "includeJobSchedules": False,
              "conflictResolution": "Overwrite",
          }
          import_resp = requests.post(
              f"{target_url}/core/api/v1/projects/{target_proj}/import",
              headers={"x-val-api-token": target_token, "Content-Type": "application/json"},
              json=import_payload,
          )
          if import_resp.status_code == 200:
              print("Import successful.")
          else:
              print(f"ERROR: Import failed ({import_resp.status_code})")
              try:
                  err = import_resp.json()
                  for e in err.get("validationErrors", []):
                      print(f"  Validation: {e}")
                  for ds in err.get("invalidDataSources", []):
                      print(f"  Invalid data source: {ds.get('sourceFileKey')}")
                  for e in err.get("commitErrors", []):
                      print(f"  Commit: {e}")
              except Exception:
                  print(import_resp.text)
              sys.exit(1)
          SCRIPT
```
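The `data_source_mappings` input string is easy to mistype, and a malformed pair only fails once the workflow is already running. This sketch lifts the same parsing the inline script performs into a standalone function you can check locally before dispatching; the explicit validation is an addition, not part of the workflow above:

```python
def parse_mappings(raw):
    """Parse "src:tgt,src:tgt" into Validatar import mapping dicts.

    sourceFileKey stays a string and targetId becomes an int, matching
    the payload shape the promote workflow sends.
    """
    mappings = []
    for pair in raw.split(","):
        pair = pair.strip()
        if not pair:
            continue
        src, sep, tgt = pair.partition(":")
        if not sep or not src.strip() or not tgt.strip():
            raise ValueError(f"Malformed mapping pair: {pair!r}")
        mappings.append({"sourceFileKey": src.strip(), "targetId": int(tgt.strip())})
    return mappings
```

For example, `parse_mappings("12:34, 56:78")` yields two mapping entries, while a bare `"12"` raises immediately instead of producing a half-built payload.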
Workflow File (PowerShell)
For teams that prefer PowerShell, create `.github/workflows/validatar-promote-ps.yml`:
```yaml
name: Validatar — Promote Tests (PowerShell)

on:
  workflow_dispatch:
    inputs:
      source_project_id:
        description: "Source Validatar project ID"
        required: true
        type: string
      target_project_id:
        description: "Target Validatar project ID"
        required: true
        type: string
      test_folder_ids:
        description: "Test folder IDs to export (comma-separated)"
        required: false
        type: string
        default: ""
      data_source_mappings:
        description: "Data source mappings (source_id:target_id, comma-separated)"
        required: false
        type: string
        default: ""

jobs:
  promote:
    runs-on: ubuntu-latest
    steps:
      - name: Export from source and import to target
        shell: pwsh
        env:
          SOURCE_VALIDATAR_API_URL: ${{ secrets.SOURCE_VALIDATAR_API_URL }}
          VALIDATAR_API_TOKEN: ${{ secrets.VALIDATAR_API_TOKEN }}
          TARGET_VALIDATAR_API_URL: ${{ secrets.TARGET_VALIDATAR_API_URL }}
          SOURCE_PROJECT_ID: ${{ github.event.inputs.source_project_id }}
          TARGET_PROJECT_ID: ${{ github.event.inputs.target_project_id }}
          TEST_FOLDER_IDS: ${{ github.event.inputs.test_folder_ids }}
          DATA_SOURCE_MAPPINGS: ${{ github.event.inputs.data_source_mappings }}
        run: |
          $SourceUrl = $env:SOURCE_VALIDATAR_API_URL
          $Token = $env:VALIDATAR_API_TOKEN
          $TargetUrl = $env:TARGET_VALIDATAR_API_URL
          $SourceProj = [int]$env:SOURCE_PROJECT_ID
          $TargetProj = [int]$env:TARGET_PROJECT_ID

          # Wrap results in @() so single-element inputs still serialize as JSON arrays
          $TestFolderIds = @()
          if ($env:TEST_FOLDER_IDS) {
            $TestFolderIds = @($env:TEST_FOLDER_IDS -split "," |
              ForEach-Object { [int]$_.Trim() })
          }
          $DsMappings = @()
          if ($env:DATA_SOURCE_MAPPINGS) {
            $DsMappings = @($env:DATA_SOURCE_MAPPINGS -split "," |
              ForEach-Object {
                $parts = $_.Trim() -split ":"
                @{ sourceFileKey = $parts[0].Trim(); targetId = [int]$parts[1].Trim() }
              })
          }

          $Headers = @{ "x-val-api-token" = $Token; "Content-Type" = "application/json" }

          # Export
          Write-Host "Exporting from project $SourceProj..."
          $ExportBody = @{
            name                = "github-actions-export"
            description         = "Automated export"
            testFolderIds       = $TestFolderIds
            jobFolderIds        = @()
            includeCustomFields = $true
            includeJobSchedules = $false
          } | ConvertTo-Json -Depth 10
          $ExportResp = Invoke-RestMethod -Uri "$SourceUrl/core/api/v1/projects/$SourceProj/export" `
            -Method Post -Headers $Headers -Body $ExportBody -ErrorAction Stop
          Write-Host "Export successful: $($ExportResp.fileName)"

          # Import
          Write-Host "Importing into project $TargetProj..."
          $ImportBody = @{
            mimeType            = "application/xml"
            contentBase64       = $ExportResp.contentBase64
            dataSourceMappings  = $DsMappings
            includeJobSchedules = $false
            conflictResolution  = "Overwrite"
          } | ConvertTo-Json -Depth 10
          try {
            Invoke-WebRequest -Uri "$TargetUrl/core/api/v1/projects/$TargetProj/import" `
              -Method Post -Headers $Headers -Body $ImportBody -ErrorAction Stop | Out-Null
            Write-Host "Import successful."
          }
          catch {
            Write-Error "Import failed: $_"
            exit 1
          }
```

Note that the inputs are passed through `env:` rather than interpolated into the script with `${{ }}` expressions; this keeps untrusted input out of the shell and matches the Python workflow above.
Use Case 2: Execute Tests as a PR Check
This workflow runs a Validatar job as part of a pull request check. If any tests fail, the PR check fails.
Create `.github/workflows/validatar-quality-gate.yml`:
```yaml
name: Validatar — Quality Gate

on:
  pull_request:
    branches: [main]
  workflow_dispatch:
    inputs:
      project_id:
        description: "Validatar project ID"
        required: true
        type: string
      job_id:
        description: "Validatar job ID to execute"
        required: true
        type: string

jobs:
  quality-gate:
    runs-on: ubuntu-latest
    steps:
      - name: Run Validatar quality checks
        env:
          VALIDATAR_API_URL: ${{ secrets.SOURCE_VALIDATAR_API_URL }}
          VALIDATAR_API_TOKEN: ${{ secrets.VALIDATAR_API_TOKEN }}
          PROJECT_ID: ${{ github.event.inputs.project_id || vars.VALIDATAR_PROJECT_ID }}
          JOB_ID: ${{ github.event.inputs.job_id || vars.VALIDATAR_JOB_ID }}
        run: |
          python3 << 'SCRIPT'
          import os
          import sys
          import time

          import requests

          api_url = os.environ["VALIDATAR_API_URL"]
          token = os.environ["VALIDATAR_API_TOKEN"]
          proj_id = int(os.environ["PROJECT_ID"])
          job_id = int(os.environ["JOB_ID"])
          timeout = 1800  # 30 minutes
          poll_sec = 5

          headers = {"x-val-api-token": token, "Content-Type": "application/json"}
          base = f"{api_url}/core/api/v1"

          # Execute the job
          print(f"Executing job {job_id} in project {proj_id}...")
          resp = requests.post(f"{base}/projects/{proj_id}/jobs/{job_id}/execute", headers=headers)
          resp.raise_for_status()
          batch_key = resp.json()["executionKey"]
          print(f"Execution started. Batch key: {batch_key}")

          # Poll for results
          results_url = f"{base}/projects/{proj_id}/jobs/{job_id}/results/{batch_key}"
          elapsed = 0
          result = None
          while elapsed < timeout:
              time.sleep(poll_sec)
              elapsed += poll_sec
              poll = requests.get(results_url, headers=headers)
              poll.raise_for_status()
              result = poll.json()
              status = result.get("status", "Unknown")
              print(f"  [{elapsed}s] Status: {status}")
              if status in ("Completed", "Aborted", "Abandoned"):
                  break
          else:
              print(f"ERROR: Timeout after {timeout}s")
              sys.exit(1)

          # Evaluate
          status = result.get("status")
          if status != "Completed":
              print(f"FAILED: Job ended with status '{status}'")
              sys.exit(1)
          tests = result.get("tests", [])
          failed = [t for t in tests if t.get("status") not in ("Passed", "Completed")]
          print(f"\nJob completed: {len(tests)} test(s) ran")
          if failed:
              print(f"{len(failed)} test(s) did not pass:")
              for t in failed:
                  print(f"  - {t.get('name')} (ID: {t.get('id')}): {t.get('status')}")
              sys.exit(1)
          print("All tests passed. Quality gate OK.")
          SCRIPT
```
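The pass/fail decision in the script reduces to a single filter over the results payload. A standalone sketch, using an illustrative payload rather than a captured API response (the `tests`/`status` field names mirror what the script above reads):

```python
def failing_tests(result):
    """Return tests whose status is neither Passed nor Completed."""
    return [
        t for t in result.get("tests", [])
        if t.get("status") not in ("Passed", "Completed")
    ]

# Illustrative payload, not a real API response
sample = {
    "status": "Completed",
    "tests": [
        {"id": 1, "name": "row_count_match", "status": "Passed"},
        {"id": 2, "name": "null_check", "status": "Failed"},
    ],
}
```

Factoring the filter out like this makes it easy to tighten the gate later, for example treating `Warning` statuses as failures, without touching the polling loop.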
Tip: For PR checks, store the default `PROJECT_ID` and `JOB_ID` as GitHub Actions variables (not secrets) under Settings > Variables so the `pull_request` trigger can use them without manual input.
Step 2: Trigger the Workflow
Manual Dispatch
- Go to Actions in your GitHub repository
- Select the workflow (e.g., "Validatar — Promote Tests")
- Click Run workflow
- Fill in the parameters (project IDs, folder IDs, data source mappings)
- Click Run workflow
Automatic on Merge
To trigger promotion automatically when code merges to main, change the `on` trigger:

```yaml
on:
  push:
    branches: [main]
```
And move the project IDs and folder IDs to repository variables or hardcode them in the workflow.
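A sketch of the repository-variables approach; the `vars.*` names here are illustrative, not names Validatar or GitHub defines:

```yaml
env:
  SOURCE_PROJECT_ID: ${{ vars.VALIDATAR_SOURCE_PROJECT_ID }}
  TARGET_PROJECT_ID: ${{ vars.VALIDATAR_TARGET_PROJECT_ID }}
  EXPORT_TEST_FOLDER_IDS: ${{ vars.VALIDATAR_TEST_FOLDER_IDS }}
```

Define the variables under Settings > Secrets and variables > Actions > Variables; unlike secrets, their values stay readable in run logs.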
Troubleshooting
Workflow Fails with "401" or "403"
- Verify the `VALIDATAR_API_TOKEN` secret is set correctly
- Confirm the token has the required scopes (Authoring for export/import, Execution for the quality gate)
- Check that the token user has access to the specified projects
Import Fails with Data Source Mapping Errors
- Use the `GET /core/api/v1/data-sources` endpoint to list data sources on both instances
- Verify every data source referenced in the export has a corresponding mapping entry
- The `sourceFileKey` is the data source ID (integer) from the source project, passed as a string
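If your data sources share names across instances, the mapping list can be generated instead of hand-written. This sketch assumes each record returned by the data-sources endpoint carries `id` and `name` fields; verify against your actual response shape before relying on it:

```python
def build_mappings(source_ds, target_ds):
    """Pair source and target data sources by name.

    Assumes each record has "id" and "name" keys. Sources with no
    same-named target are skipped, so comparing the output length to
    len(source_ds) reveals unmapped data sources.
    """
    target_by_name = {d["name"]: d["id"] for d in target_ds}
    mappings = []
    for d in source_ds:
        tgt_id = target_by_name.get(d["name"])
        if tgt_id is not None:
            mappings.append({"sourceFileKey": str(d["id"]), "targetId": tgt_id})
    return mappings
```

The output matches the `dataSourceMappings` payload shape used by the promote workflows: `sourceFileKey` as a string, `targetId` as an integer.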
Self-Hosted Runner Cannot Reach Validatar
- Ensure the runner has network access to the Validatar API endpoint
- Check firewall rules and proxy settings
- For Validatar Server instances, verify the URL includes the correct port and path
Quality Gate Times Out
- Increase the `timeout` value in the script if your jobs run longer than 30 minutes
- Check the Validatar UI to see if the job is stuck or has errored out
- Verify the job has tests assigned to it