
API Automation Framework in Azure DevOps: Best Practices

Best practices for building and running an API test automation framework in Azure DevOps. Covers framework design, test isolation, environment management, result publishing, and integrating API tests into CI/CD pipelines.

InnovateBits · 4 min read

An API automation framework in Azure DevOps needs to do two things well: run reliably in CI and give clear, actionable results when something breaks. This article covers the design decisions and pipeline configuration that make that possible.


Framework design principles for CI

Isolation: Each test creates its own data and cleans up after itself. No shared state between tests.

Idempotency: Running the same test twice produces the same result. No tests that depend on data created by previous tests.

Fast failure: Tests that check fundamental prerequisites (auth, connectivity) run first. If they fail, the remaining tests are skipped.

Clear output: Test names describe exactly what was tested. Failure messages show expected vs actual, not just "test failed".
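The fast-failure principle can be sketched as a small prerequisite check that runs before any suite does. This is a minimal illustration, not part of any framework: `assertRequiredEnv` is a hypothetical helper, and the simulated env var is only for demonstration. The goal is that a misconfigured pipeline fails in seconds with one clear message instead of producing dozens of confusing 401s or timeouts.

```typescript
// Hypothetical fail-fast helper: verify required configuration exists
// before any test suite runs.
function assertRequiredEnv(names: string[]): string[] {
  const missing = names.filter((name) => !process.env[name])
  if (missing.length > 0) {
    // One clear message naming every missing variable
    throw new Error(`Missing required env vars: ${missing.join(', ')}`)
  }
  return names
}

// Simulate CI-provided configuration for illustration:
process.env.BASE_URL = 'http://localhost:3000'
assertRequiredEnv(['BASE_URL']) // passes; an unset var would throw
```

Calling a check like this from Jest's globalSetup means a broken variable group stops the run before a single worker starts.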


Node.js API test framework (Jest + Supertest)

// tests/api/users.test.ts
import request from 'supertest'
import { createTestUser, deleteTestUser } from '../helpers/user-factory'
 
const BASE_URL = process.env.BASE_URL || 'http://localhost:3000'
const AUTH_TOKEN = process.env.TEST_AUTH_TOKEN
 
describe('Users API', () => {
  let testUserId: string
 
  beforeEach(async () => {
    // Create isolated test data
    const user = await createTestUser({ role: 'member' })
    testUserId = user.id
  })
 
  afterEach(async () => {
    // Clean up — always runs even if test fails
    await deleteTestUser(testUserId)
  })
 
  describe('GET /api/users/:id', () => {
    it('returns 200 with correct user data for valid ID', async () => {
      const res = await request(BASE_URL)
        .get(`/api/users/${testUserId}`)
        .set('Authorization', `Bearer ${AUTH_TOKEN}`)
        .expect(200)
 
      expect(res.body).toMatchObject({
        id: testUserId,
        role: 'member',
      })
      expect(res.body.password).toBeUndefined() // Never expose password
    })
 
    it('returns 404 for non-existent user', async () => {
      const res = await request(BASE_URL)
        .get('/api/users/non-existent-id')
        .set('Authorization', `Bearer ${AUTH_TOKEN}`)
        .expect(404)
 
      expect(res.body.error).toMatch(/not found/i)
    })
 
    it('returns 401 without auth token', async () => {
      await request(BASE_URL)
        .get(`/api/users/${testUserId}`)
        .expect(401)
    })
  })
})
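The test file above imports `createTestUser` and `deleteTestUser` from a factory module it does not show. Here is one possible sketch of those helpers, under the assumption that the API exposes a `POST /api/test-users` endpoint for disposable users; the path, payload, and `TestUser` shape are assumptions you would adapt to your own API.

```typescript
// tests/helpers/user-factory.ts (sketch; endpoint paths are assumptions)
const BASE_URL = process.env.BASE_URL || 'http://localhost:3000'
const AUTH_TOKEN = process.env.TEST_AUTH_TOKEN

interface TestUser { id: string; role: string }

export async function createTestUser(overrides: Partial<TestUser> = {}): Promise<TestUser> {
  const res = await fetch(`${BASE_URL}/api/test-users`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${AUTH_TOKEN}`,
    },
    // Unique email keeps parallel Jest workers from colliding on the same user
    body: JSON.stringify({
      email: `test-${Date.now()}-${Math.random().toString(36).slice(2)}@example.com`,
      ...overrides,
    }),
  })
  if (!res.ok) throw new Error(`createTestUser failed: ${res.status}`)
  return res.json() as Promise<TestUser>
}

export async function deleteTestUser(id: string): Promise<void> {
  await fetch(`${BASE_URL}/api/test-users/${id}`, {
    method: 'DELETE',
    headers: { Authorization: `Bearer ${AUTH_TOKEN}` },
  })
}
```

The randomized email is what makes the isolation principle hold under `maxWorkers: 4`: two workers creating users at the same moment never fight over the same record.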

Environment management

// config/environments.ts
export const config = {
  staging: {
    baseUrl: 'https://staging.app.com',
    adminEmail: process.env.STAGING_ADMIN_EMAIL!,
    adminPassword: process.env.STAGING_ADMIN_PASSWORD!,
  },
  uat: {
    baseUrl: 'https://uat.app.com',
    adminEmail: process.env.UAT_ADMIN_EMAIL!,
    adminPassword: process.env.UAT_ADMIN_PASSWORD!,
  },
}
 
export const getEnvConfig = () => {
  const env = process.env.TEST_ENV || 'staging'
  const envConfig = config[env as keyof typeof config]
  if (!envConfig) {
    // Fail loudly on a typo instead of returning undefined
    throw new Error(`Unknown TEST_ENV "${env}": expected one of ${Object.keys(config).join(', ')}`)
  }
  return envConfig
}

Jest configuration for CI

// jest.config.js
module.exports = {
  testEnvironment: 'node',
  testMatch: ['**/tests/api/**/*.test.ts'],
  transform: { '^.+\\.tsx?$': 'ts-jest' },
  testTimeout: process.env.CI ? 30000 : 10000,
  maxWorkers: process.env.CI ? 4 : '50%',
  reporters: [
    'default',
    ['jest-junit', {
      outputDirectory: './test-results',
      outputName: 'api-results.xml',
      classNameTemplate: '{classname}',
      titleTemplate: '{title}',
    }],
  ],
  globalSetup: './tests/setup/global-setup.ts',
  globalTeardown: './tests/setup/global-teardown.ts',
}
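The config references `globalSetup` and `globalTeardown` modules without showing them. A minimal sketch of the setup half might look like this; the `/health` endpoint is an assumption, and the point is simply to run the fast-failure connectivity check once, before any worker starts.

```typescript
// tests/setup/global-setup.ts (sketch; the /health endpoint is an assumption)
export default async function globalSetup(): Promise<void> {
  const baseUrl = process.env.BASE_URL || 'http://localhost:3000'

  // Abort the whole run with one clear message if the target is unreachable,
  // rather than letting every test time out individually.
  const res = await fetch(`${baseUrl}/health`).catch(() => null)
  if (!res || !res.ok) {
    throw new Error(`API at ${baseUrl} is not reachable; aborting test run`)
  }
}
```

A matching `global-teardown.ts` would export a default async function that deletes any environment-wide fixtures the setup created.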

Pipeline YAML

trigger:
  branches:
    include: [main]
 
pool:
  vmImage: ubuntu-latest
 
variables:
  - group: api-test-credentials
  - name: TEST_ENV
    value: staging
 
stages:
  - stage: APISmoke
    displayName: API Smoke Tests
    jobs:
      - job: Smoke
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
          - script: npm ci
          - script: |
              npx jest --testPathPattern="smoke" \
                --forceExit \
                --detectOpenHandles
            displayName: Run API smoke tests
            env:
              BASE_URL: $(STAGING_URL)
              TEST_AUTH_TOKEN: $(API_TEST_TOKEN)
              TEST_ENV: $(TEST_ENV)
          - task: PublishTestResults@2
            inputs:
              testResultsFormat: JUnit
              testResultsFiles: test-results/api-results.xml
              testRunTitle: API Smoke — $(Build.BuildNumber)
            condition: always()
 
  - stage: APIRegression
    displayName: API Regression
    dependsOn: APISmoke
    jobs:
      - job: Regression
        timeoutInMinutes: 20
        steps:
          - task: NodeTool@0
            inputs:
              versionSpec: '20.x'
          - script: npm ci
          - script: |
              npx jest tests/api/ \
                --forceExit \
                --detectOpenHandles \
                --verbose
            displayName: Run full API regression
            env:
              BASE_URL: $(STAGING_URL)
              TEST_AUTH_TOKEN: $(API_TEST_TOKEN)
              STAGING_ADMIN_EMAIL: $(ADMIN_EMAIL)
              STAGING_ADMIN_PASSWORD: $(ADMIN_PASSWORD)
          - task: PublishTestResults@2
            inputs:
              testResultsFormat: JUnit
              testResultsFiles: test-results/api-results.xml
              testRunTitle: API Regression — $(Build.BuildNumber)
            condition: always()

Common errors and fixes

Error: "Jest did not exit one second after the test run has completed"
Fix: Add the --forceExit flag to Jest. API tests often leave open HTTP connections. Also check for unclosed database connections or event listeners in teardown code.

Error: Tests fail with connection resets in CI but pass locally
Fix: The staging server may be rate limiting. Add a delay between requests with await new Promise(r => setTimeout(r, 100)), or reduce maxWorkers to limit concurrent requests.

Error: Test data from a failed test persists and causes the next run to fail
Fix: Use afterEach with try/catch for cleanup, and run a cleanup script at the start of the pipeline to reset known test data.
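The try/catch cleanup idea can be sketched as follows. Everything here is illustrative: `deleteTestUser` is a stand-in for a real API call, and the ID-tracking approach is one possible design, not the framework's. The key property is that one failed deletion never prevents the remaining deletions, and cleanup errors never mask the original test failure.

```typescript
// Track every resource a test creates, then clean up defensively.
const createdUserIds: string[] = []

async function deleteTestUser(id: string): Promise<void> {
  // Stand-in for the real DELETE call; replace with your API client.
  if (!id) throw new Error('missing id')
}

async function cleanupCreatedUsers(): Promise<string[]> {
  const failed: string[] = []
  // splice(0) drains the list so the next test starts with a clean slate
  for (const id of createdUserIds.splice(0)) {
    try {
      await deleteTestUser(id)
    } catch {
      // Swallow cleanup errors so they never mask the real test failure;
      // collect leftovers for the pipeline-start reset script to handle.
      failed.push(id)
    }
  }
  return failed
}
```

Calling `cleanupCreatedUsers()` from afterEach, and logging whatever it returns, gives the pipeline-start reset script a known list of orphans to target.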

Error: Cannot find module 'supertest' in the pipeline
Fix: supertest must be listed in dependencies or devDependencies in package.json. Run npm ci (not npm install) in the pipeline to install the exact locked versions.
