API regression testing in GitHub Actions
API regression tests catch “it worked yesterday” bugs before they ship. Use DevTools YAML flows with JUnit + JSON outputs, PR‑visible summaries, caching, and artifacts to debug failures quickly.
Jump to: Copy‑paste workflow
Deep reference: CI/CD integrations · Template: GitHub Actions · Related: Newman alternative for CI
What “API regression testing” means (in practice)
A repeatable suite that verifies your API still behaves after code changes:
- Same endpoints, same auth, same workflow paths
- Assertions on status codes and critical fields
- Run on every PR so failures block merges
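For orientation, here is roughly what such a flow can look like. The request/assert keys below are illustrative assumptions, not the authoritative DevTools flow schema; check the flow reference for the exact syntax:

# tests/smoke-tests.yaml (sketch; field names are assumptions, not the documented schema)
name: Smoke - critical endpoints
steps:
  - name: Login
    request:
      method: POST
      url: "${API_BASE_URL}/auth/login"
      json:
        email: "${LOGIN_EMAIL}"
        password: "${LOGIN_PASSWORD}"
    assert:
      status: 200
  - name: Fetch current user
    request:
      method: GET
      url: "${API_BASE_URL}/users/me"
    assert:
      status: 200
      body.email: "${LOGIN_EMAIL}"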
Prerequisites
- A YAML flow file committed to your repo (see layout below)
- Secrets configured in GitHub (API keys, login credentials, base URL); see the gh commands below
- DevTools CLI installed in the workflow
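You can create those secrets in the repo settings UI, or from the command line with the GitHub CLI (names match the workflow below; values are placeholders):

gh secret set API_BASE_URL --body "https://staging.example.com"
gh secret set API_KEY --body "<your-api-key>"
gh secret set LOGIN_EMAIL --body "ci-bot@example.com"
gh secret set LOGIN_PASSWORD --body "<test-account-password>"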
Recommended repo layout
Split tests by intent so your pipeline stays fast:
tests/
  smoke-tests.yaml        # 1–2 minutes, critical endpoints only
  regression-tests.yaml   # 10–15 minutes, fuller coverage

Why it matters: smoke fails fast; regression runs only after smoke passes.
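Both files run with the same command, so you can check them locally before wiring up CI (this assumes your flows read the same environment variables the workflow exports):

export API_BASE_URL="https://staging.example.com"  # match your CI secret values
export API_KEY="<your-api-key>"                    # plus LOGIN_EMAIL / LOGIN_PASSWORD if the flow logs in
devtools flow run tests/smoke-tests.yaml --report junit:smoke-results.xml --report json:smoke-results.json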
Copy‑paste GitHub Actions workflow
Create .github/workflows/api-regression.yml:
name: API Regression Tests

on:
  pull_request:
    branches: [ main ]
  push:
    branches: [ main ]
  schedule:
    - cron: "0 2 * * *" # nightly at 02:00 UTC

permissions:
  contents: read
  checks: write # required by dorny/test-reporter to publish check runs

jobs:
  smoke:
    name: Smoke (fast gate)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Cache DevTools CLI
        id: cache
        uses: actions/cache@v4
        with:
          path: /usr/local/bin/devtools
          key: devtools-cli-${{ runner.os }}-v0.5.1

      - name: Install DevTools CLI
        if: steps.cache.outputs.cache-hit != 'true'
        run: curl -fsSL https://sh.dev.tools/install.sh | bash

      - name: Run smoke tests (JUnit + JSON)
        timeout-minutes: 10
        env:
          API_BASE_URL: ${{ secrets.API_BASE_URL }}
          API_KEY: ${{ secrets.API_KEY }}
          LOGIN_EMAIL: ${{ secrets.LOGIN_EMAIL }}
          LOGIN_PASSWORD: ${{ secrets.LOGIN_PASSWORD }}
        run: |
          devtools flow run tests/smoke-tests.yaml --report junit:smoke-results.xml --report json:smoke-results.json

      - name: Upload smoke artifacts
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: smoke-results
          path: |
            smoke-results.xml
            smoke-results.json

      - name: Publish smoke report to PR
        uses: dorny/test-reporter@v1
        if: always()
        with:
          name: Smoke API Test Results
          path: smoke-results.xml
          reporter: java-junit

  regression:
    name: Regression (full suite)
    runs-on: ubuntu-latest
    needs: smoke
    steps:
      - uses: actions/checkout@v4

      - name: Cache DevTools CLI
        id: cache
        uses: actions/cache@v4
        with:
          path: /usr/local/bin/devtools
          key: devtools-cli-${{ runner.os }}-v0.5.1

      - name: Install DevTools CLI
        if: steps.cache.outputs.cache-hit != 'true'
        run: curl -fsSL https://sh.dev.tools/install.sh | bash

      - name: Run regression tests (JUnit + JSON)
        timeout-minutes: 20
        env:
          API_BASE_URL: ${{ secrets.API_BASE_URL }}
          API_KEY: ${{ secrets.API_KEY }}
          LOGIN_EMAIL: ${{ secrets.LOGIN_EMAIL }}
          LOGIN_PASSWORD: ${{ secrets.LOGIN_PASSWORD }}
        run: |
          devtools flow run tests/regression-tests.yaml --report junit:regression-results.xml --report json:regression-results.json

      - name: Upload regression artifacts
        uses: actions/upload-artifact@v4
        if: always()
        with:
          name: regression-results
          path: |
            regression-results.xml
            regression-results.json

      - name: Publish regression report to PR
        uses: dorny/test-reporter@v1
        if: always()
        with:
          name: Regression API Test Results
          path: regression-results.xml
          reporter: java-junit

What you must change
- Flow paths: tests/smoke-tests.yaml, tests/regression-tests.yaml
- Secrets: match your environment variable names
- Cache key version: bump v0.5.1 when updating the CLI (sketch below for keeping it in one place)
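Since both jobs cache the CLI, one way to avoid bumping the version in two places is a workflow-level env var referenced from each cache step; a sketch:

env:
  DEVTOOLS_CLI_VERSION: v0.5.1  # single place to bump on CLI upgrades

# ...then in each "Cache DevTools CLI" step:
#   key: devtools-cli-${{ runner.os }}-${{ env.DEVTOOLS_CLI_VERSION }}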
Optional: matrix testing (staging + production)
Test multiple environments using a matrix. One caveat: the secrets context isn't available inside strategy.matrix, so carry the secret's name in the matrix and look it up in the step env instead:

strategy:
  matrix:
    environment:
      - name: Staging
        url_secret: STAGING_API_BASE_URL
      - name: Production
        url_secret: PROD_API_BASE_URL

Then pass the URL into the step env: API_BASE_URL: ${{ secrets[matrix.environment.url_secret] }}
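Put together, the regression job might look like this (illustrative; fail-fast and the name suffix are optional choices):

regression:
  name: Regression (${{ matrix.environment.name }})
  runs-on: ubuntu-latest
  needs: smoke
  strategy:
    fail-fast: false  # let the other environment finish even if one fails
    matrix:
      environment:
        - name: Staging
          url_secret: STAGING_API_BASE_URL
        - name: Production
          url_secret: PROD_API_BASE_URL
  steps:
    # ...checkout, cache, and install steps as in the workflow above...
    - name: Run regression tests (JUnit + JSON)
      timeout-minutes: 20
      env:
        API_BASE_URL: ${{ secrets[matrix.environment.url_secret] }}
        API_KEY: ${{ secrets.API_KEY }}
      run: |
        devtools flow run tests/regression-tests.yaml --report junit:regression-results.xml --report json:regression-results.json

    - name: Upload regression artifacts
      uses: actions/upload-artifact@v4
      if: always()
      with:
        name: regression-results-${{ matrix.environment.name }}  # unique per matrix leg
        path: regression-results.xml

Note the artifact name suffix: upload-artifact v4 rejects duplicate artifact names, so each matrix leg needs its own.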
Flake control (don’t ignore this)
Not everything in API testing is deterministic: your test engine can be, but network conditions and upstream state aren't. Apply these rules, and see Determinism:
- Prefer explicit selectors for critical values
- Add assertions at step boundaries
- Use retries or soft gates for non‑critical steps (workflow-level sketch after this list)
- Keep tokens/credentials in CI secrets, not in YAML
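If your flows don't support step-level retries, one workflow-level fallback is a community retry action; a sketch using nick-fields/retry (inputs per its README):

- name: Run smoke tests (retry once on infra flakes)
  uses: nick-fields/retry@v3
  with:
    timeout_minutes: 10
    max_attempts: 2
    command: devtools flow run tests/smoke-tests.yaml --report junit:smoke-results.xml

Reserve this for infrastructure flakiness; retrying genuine assertion failures just hides regressions.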
Troubleshooting
Tests pass locally but fail in CI
Usual causes: missing secrets, a different base URL, rate limits hit from shared CI runners, or timeouts that are too tight for CI latency.
CLI not found
The install step may have been skipped by the cache-hit condition; confirm the cached binary was actually restored and is on the runner's PATH.
Workflow hangs
Add job timeout-minutes; add per‑request timeouts in flows.
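The workflow above sets step-level timeout-minutes; you can also cap the whole job:

jobs:
  regression:
    runs-on: ubuntu-latest
    timeout-minutes: 30  # fail the job outright if it runs past 30 minutes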
Secrets not loading
Confirm secrets exist and job has access. Fork PRs won’t get secrets.
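If fork PRs are common, one option is to skip the secret-dependent jobs cleanly rather than let them fail (for non-PR events the expression evaluates to true, so pushes and nightly runs are unaffected):

jobs:
  smoke:
    # fork PRs don't receive repository secrets, so skip rather than fail
    if: ${{ !github.event.pull_request.head.repo.fork }}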
FAQ
Does this show results in pull requests?
Yes. JUnit + a PR reporter surfaces pass/fail in PR checks.
Can I upload test artifacts even when tests fail?
Yes. Use if: always() on upload and reporting steps.
Should I run smoke tests separately?
Yes. Use a fast gate to catch failures early.
Should I run nightly regressions too?
Yes. Nightly runs catch drift independent of PR cadence.
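The default above runs the full regression suite on PRs so failures block merges. If that's too slow for your team, one variant is to gate the regression job by event so PRs get only the smoke gate while pushes and the nightly schedule get full coverage:

regression:
  needs: smoke
  if: github.event_name == 'push' || github.event_name == 'schedule'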