
API test automation: from Postman to Newman, straight into CI

Running API tests by hand is a bad habit. This post is about the discipline of taking a Postman collection all the way into a CI pipeline.

Running API tests by hand doesn’t scale. You have a Postman collection, developers click through it in the GUI, bugs land in production anyway. The way out of that cycle: automation.

This post walks through going from a Postman collection, to Newman, to a CI pipeline.

Postman collection: start with the basics

In the Postman GUI you build a collection. Each folder is a feature, each request is a test.

Collection structure:

MyApp API
  /Auth
    - Login
    - Register
    - Logout
    - Refresh Token
  /Users
    - Get User List
    - Get User by ID
    - Update User
    - Delete User
  /Products
    - List Products
    - Create Product
    - ...

Once this hierarchy is in place, your API’s test coverage becomes visible.

Test scripts: beyond the request

Postman lets you attach a JavaScript test script to every request. It validates the response.

Example test:

// Runs after the Login request
pm.test("Status code is 200", () => {
  pm.response.to.have.status(200);
});

pm.test("Response has token", () => {
  const jsonData = pm.response.json();
  pm.expect(jsonData).to.have.property('token');
  pm.expect(jsonData.token).to.not.be.empty;
});

// Save the token to an environment variable (used by later requests)
pm.environment.set("authToken", pm.response.json().token);

With this pattern every request gets:
– Status code validation
– Response schema check
– Business logic assertion
– Dynamic data extraction (tokens, IDs, etc)
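The "response schema check" bullet deserves a concrete shape. Inside Postman you would assert against `pm.response.json()` (newer Postman versions also offer `pm.response.to.have.jsonSchema`); as a standalone sketch, the core of such a check looks like this, where `hasShape` is a hypothetical helper:

```javascript
// Minimal sketch of the kind of shape assertion a Postman test performs.
// hasShape maps property name -> expected typeof string and checks each one.
function hasShape(obj, shape) {
  return Object.entries(shape).every(
    ([key, type]) => key in obj && typeof obj[key] === type
  );
}

// Example login response body (hypothetical)
const loginResponse = { token: "abc123", expiresIn: 3600 };

console.log(hasShape(loginResponse, { token: "string", expiresIn: "number" })); // true
console.log(hasShape(loginResponse, { token: "string", userId: "string" }));    // false
```

The same check in a real test script would be one `pm.expect` per property; the point is that a schema assertion fails loudly when a field disappears or changes type.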

Environments

Postman environment variables let you handle different environments:

Environment: Development

baseUrl: http://localhost:3000
apiKey: dev_abc123
userEmail: dev@example.com

Environment: Staging

baseUrl: https://staging-api.example.com
apiKey: stg_xyz456
userEmail: test@example.com

Environment: Production

baseUrl: https://api.example.com
apiKey: prod_... (smoke test only)
userEmail: test@example.com

Request URL: {{baseUrl}}/users/{{userId}}. Switch environments and every request runs against the new target.
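The substitution Postman performs on `{{baseUrl}}`-style placeholders is simple enough to sketch in a few lines of plain JavaScript (this is an illustration of the idea, not Postman's actual resolver):

```javascript
// Sketch of Postman-style {{variable}} substitution in a URL template.
// Unknown variables resolve to an empty string here; Postman leaves them as-is.
function resolve(template, vars) {
  return template.replace(/\{\{(\w+)\}\}/g, (_, name) => vars[name] ?? "");
}

const env = { baseUrl: "https://staging-api.example.com", userId: "42" };
console.log(resolve("{{baseUrl}}/users/{{userId}}", env));
// https://staging-api.example.com/users/42
```

Switching environments simply swaps the `vars` object, which is why one collection can target dev, staging, and production without edits.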

Pre-request scripts

Scripts that run before a request:

// Generate dynamic test data
const randomEmail = `test_${Date.now()}@example.com`;
pm.environment.set("testEmail", randomEmail);

// Add an authentication header
const token = pm.environment.get("authToken");
pm.request.headers.add({key: "Authorization", value: `Bearer ${token}`});

// Prepare the request body
const body = {
  email: randomEmail,
  password: "TestPass123",
  timestamp: new Date().toISOString()
};
pm.request.body.update({mode: 'raw', raw: JSON.stringify(body)});

With this pattern, each test run uses unique data, avoiding duplicate-entry issues.
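One caveat: `Date.now()` alone can collide when two requests fire in the same millisecond (parallel CI jobs, data-driven iterations). A sketch of a safer unique-suffix generator; the same idea works verbatim in a pre-request script:

```javascript
// Unique email combining a timestamp with a random component,
// so two runs in the same millisecond still get distinct addresses.
function uniqueEmail(prefix = "test") {
  const suffix = `${Date.now().toString(36)}_${Math.random().toString(36).slice(2, 8)}`;
  return `${prefix}_${suffix}@example.com`;
}

const a = uniqueEmail();
const b = uniqueEmail();
console.log(a !== b); // true
```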

Newman: Postman on the command line

Newman is the CLI runner for Postman collections:

npm install -g newman

newman run MyApp.postman_collection.json \
  -e Staging.postman_environment.json \
  --reporters cli,json \
  --reporter-json-export results.json

Output:

MyApp API

-> Login
  POST https://staging-api.example.com/auth/login [200 OK, 234ms]
  passed: Status code is 200
  passed: Response has token

-> Get User List
  GET https://staging-api.example.com/users [200 OK, 156ms]
  passed: Status code is 200
  passed: Response is array
  passed: At least 1 user exists

+-------------------------+------------------+------------------+
|                         |         executed |           failed |
+-------------------------+------------------+------------------+
|              iterations |                1 |                0 |
+-------------------------+------------------+------------------+
|                requests |               15 |                0 |
+-------------------------+------------------+------------------+
|            test-scripts |               30 |                0 |
+-------------------------+------------------+------------------+
|      prerequest-scripts |                5 |                0 |
+-------------------------+------------------+------------------+
|              assertions |               45 |                0 |
+-------------------------+------------------+------------------+

15 requests, 45 assertions, 0 failures. One command, the whole API tested.

CI pipeline integration

Newman in GitHub Actions:

# .github/workflows/api-tests.yml
name: API Tests
on:
  push:
    branches: [main]
  pull_request:

jobs:
  api-tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
      
      - name: Install Newman
        run: npm install -g newman newman-reporter-htmlextra
      
      - name: Run API Tests
        env:
          API_BASE_URL: https://staging-api.example.com
          API_KEY: ${{ secrets.STAGING_API_KEY }}
        run: |
          newman run postman/collection.json \
            -e postman/staging.json \
            --reporters cli,htmlextra \
            --reporter-htmlextra-export test-report.html \
            --env-var "baseUrl=$API_BASE_URL" \
            --env-var "apiKey=$API_KEY"
      
      - name: Upload Test Report
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: api-test-report
          path: test-report.html

API tests run on every push. Fail, and the PR is blocked.

Test coverage strategy

Which tests to write?

Happy path (60% of effort):
– Success scenario for every endpoint
– Valid input gives expected response
– Authentication flow end to end

Error cases (30% of effort):
– Invalid input: 400 Bad Request
– Missing auth: 401 Unauthorized
– Wrong permission: 403 Forbidden
– Non-existent resource: 404 Not Found
– Rate limit hit: 429 Too Many Requests

Edge cases (10% of effort):
– Boundary values (empty string, max length, negative numbers)
– Special characters (SQL injection, XSS patterns)
– Concurrent request handling
– Race conditions

Prioritize happy path coverage, then invest in error cases. For edge cases chaos testing is a better fit.
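To make the error-case bullets concrete, here is the shape of one such assertion as plain JavaScript (inside Postman the same logic runs against `pm.response.code` and `pm.response.json()`; `checkNotFound` is a hypothetical helper):

```javascript
// An error-case check: correct status code AND a usable error body.
// Asserting only the status would miss an empty or malformed error payload.
function checkNotFound(status, body) {
  return status === 404 && typeof body.error === "string" && body.error.length > 0;
}

console.log(checkNotFound(404, { error: "User not found" })); // true
console.log(checkNotFound(200, { id: 1 }));                   // false
```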

Contract testing vs integration testing

Two different concepts:

Contract testing: does the API spec (OpenAPI) match the implementation? Dredd is a common tool for this.

dredd api-spec.yaml https://api.example.com

Does every endpoint in the spec actually exist, and does its response match?

Integration testing: is the business logic correct? Postman/Newman lives at this level.

You need both. Contract for spec compliance, integration for business logic.

Mock services for tests

Sometimes the downstream service isn’t available during testing (third-party API rate limit, external service down).

Solution: mock server.

Postman has a mock server feature: turn a collection into a mock server, point tests at its URL.

Prism (an OpenAPI-based mock server) is an alternative.

Use the real service in integration tests, a production-like environment in end-to-end tests.

Test data management

Tests must be idempotent. The same test, run 5 times, should produce the same result.

Pattern 1: each test creates its own data

// Pre-request: create a new user
const randomUserId = Date.now();
pm.environment.set("testUserId", randomUserId);

// Test: use this user, then delete them
pm.test("User created", () => { ... });
// Post-test: delete user

Pattern 2: test fixtures
Pre-defined test users and baseline data. Tests read this data, don’t modify it.

Pattern 3: rollback after test
Run the test, then rollback the database. Impossible in production, doable in staging.

I prefer pattern 1. Isolated, safe, repeatable.
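Pattern 1 can be sketched end to end. In this standalone version an in-memory `Map` stands in for server-side state, and `apiCreateUser`/`apiDeleteUser` are hypothetical stand-ins for real HTTP calls:

```javascript
// Pattern 1: each test creates its own data, uses it, and cleans up after itself,
// so the run leaves no residue and is repeatable.
const db = new Map(); // stands in for server-side state

function apiCreateUser(id) { db.set(id, { id }); return db.get(id); }
function apiDeleteUser(id) { db.delete(id); }

function runUserTest() {
  const id = `u_${Date.now()}_${Math.random().toString(36).slice(2, 8)}`;
  apiCreateUser(id);        // setup: create own data
  const ok = db.has(id);    // test: the user exists
  apiDeleteUser(id);        // teardown: remove it
  return ok && !db.has(id);
}

console.log(runUserTest()); // true
console.log(runUserTest()); // true (same result on every run)
```

In Postman terms: the pre-request script does the setup, the test script asserts, and a cleanup request (or a teardown folder) does the delete.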

Performance assertions

Postman test scripts can assert on response time, and Newman enforces those assertions in CI:

pm.test("Response time is less than 500ms", () => {
  pm.expect(pm.response.responseTime).to.be.below(500);
});

If the API slows down, the test fails. Performance regression detection.

Note: this measures a single request, not load. k6, Artillery, and JMeter are better for load testing.
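A single request's response time is also noisy (network jitter, cold caches), so a threshold on one sample flakes. Over a handful of samples, a percentile check is more stable; a sketch of a p95 helper in plain JavaScript:

```javascript
// p95: the value below which 95% of samples fall.
// Sort a copy, then index into the sorted array (1-based rank, ceil'd).
function p95(times) {
  const sorted = [...times].sort((a, b) => a - b);
  return sorted[Math.ceil(sorted.length * 0.95) - 1];
}

const samples = [120, 140, 150, 155, 160, 180, 200, 210, 230, 480];
console.log(p95(samples));       // 480
console.log(p95(samples) < 500); // true
```

Collecting the samples would mean running the collection with `-n <iterations>` and aggregating response times from the JSON report.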

Smoke tests in production

After a production deploy, a smoke test:

newman run production-smoke-tests.json \
  -e production.json

Two or three critical endpoints (login, health check, core feature). Runs in 30 seconds after every deployment. On failure: automated rollback.

Postman collection for production:
– Authentication flow
– 3 critical endpoints
– No data creation or modification (side-effect free)

Common pitfalls

1. Collection lives only in one developer’s local Postman. Team isn’t in sync. Solution: commit the collection to Git.

2. Environment files contain secrets. Security risk. Solution: add them to .gitignore, use env vars in CI.

3. Tests depend on order. Change the sequence, tests fail. Solution: each test sets itself up.

4. Testing in production. You can corrupt real user data. Solution: test against staging, only smoke test against production.

5. Assertions too generic. pm.response.to.be.ok on its own is nearly meaningless. Be specific: response shape, exact values.

Test maintenance

As the API changes, tests change with it. Maintenance overhead:

  • Endpoint removed: delete the test
  • Field renamed: update the test script
  • New required field: update the test

When a PR changes the API, also expect a test update. Without that discipline, tests go stale in 6 months.

A pre-commit hook can run a Postman collection validator, catching structural problems in the tests before they reach Git.

Reporting

Newman reporters:

  • CLI: terminal output, default
  • JSON: machine-readable, for CI parsing
  • HTML (htmlextra): rendered in the browser, visual report
  • JUnit XML: Jenkins/GitLab compatibility

Save the HTML as a CI artifact. When a test fails, the team can review which test and why.
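The JSON report can also be post-processed in CI, for example to post a summary on the PR. A sketch of such a parser; the `run.stats` / `run.failures` field names below match what Newman's JSON reporter emits, but verify them against your own results.json:

```javascript
// Summarize a Newman JSON report: assertion counts plus which tests failed.
function summarize(report) {
  const { stats, failures } = report.run;
  return {
    assertions: stats.assertions.total,
    failed: stats.assertions.failed,
    failedTests: failures.map(f => f.error.test),
  };
}

// Inline sample standing in for JSON.parse(fs.readFileSync("results.json"))
const sample = {
  run: {
    stats: { assertions: { total: 45, failed: 1 } },
    failures: [{ error: { test: "Status code is 200" } }],
  },
};

console.log(summarize(sample));
// { assertions: 45, failed: 1, failedTests: [ 'Status code is 200' ] }
```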

Bottom line

Automating API tests replaces a manual process, and quality jumps. Postman collection + Newman + CI integration is a mature, reliable workflow.

Start with the happy path, add error cases, leave edge cases for later. Invest in idempotent test design, environment management, and secret hygiene.

Newman passing on every release means confidence. A failure means fix it immediately. Manual testing is for exploratory work only; regression testing should be automated.
