Testing Environment Setup Guides

This document provides detailed instructions for setting up various testing environments for the Gradiant platform. Following these guides will ensure consistent testing across development, CI/CD, and production environments.

Local Development Environment Setup

Prerequisites

  • Node.js v18.17.0 or higher
  • pnpm v8.6.0 or higher
  • Docker and Docker Compose (for integration tests)
  • Git

Basic Setup

  1. Clone the repository:
git clone https://github.com/gradiantascent/gradiant.git
cd gradiant
  2. Install dependencies:
pnpm install --frozen-lockfile
  3. Set up environment variables:
cp .env.example .env.test.local
  4. Edit .env.test.local to configure test-specific variables:
# Test Database Configuration
TEST_DATABASE_URL=postgresql://postgres:postgres@localhost:5432/gradiant_test
TEST_REDIS_URL=redis://localhost:6379/1

# Test API Keys (use test mode keys)
TEST_OPENAI_API_KEY=YOUR_API_KEY_HERE
TEST_ANTHROPIC_API_KEY=YOUR_API_KEY_HERE

# Test Authentication
TEST_AUTH_SECRET=random-test-secret
  5. Run the test suite:
pnpm test
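During development you can run a subset of the suite by invoking Vitest directly, for example pnpm vitest run src/services/ai.test.ts for a single file, or pnpm vitest --watch for watch mode (this assumes the test script wraps Vitest, which is configured in the next section).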

Setting Up Vitest

Gradiant uses Vitest for unit and integration testing. Here’s how to configure it:
  1. The Vitest configuration is in vitest.config.ts:
import { defineConfig } from 'vitest/config'
import react from '@vitejs/plugin-react'
import tsconfigPaths from 'vite-tsconfig-paths'

export default defineConfig({
  plugins: [react(), tsconfigPaths()],
  test: {
    environment: 'jsdom',
    globals: true,
    setupFiles: ['./vitest.setup.ts'],
    coverage: {
      reporter: ['text', 'json', 'html'],
      exclude: ['**/node_modules/**', '**/dist/**', '**/types/**'],
    },
    include: ['**/*.test.{ts,tsx}'],
    exclude: ['**/node_modules/**', '**/dist/**', '**/e2e/**'],
  },
})
  2. Create a vitest.setup.ts file for global test setup:
import { afterEach, vi } from 'vitest'

// Mock global fetch
global.fetch = vi.fn()

// Mock environment variables
process.env.NEXT_PUBLIC_API_URL = 'http://localhost:3000/api'

// Clean up after each test
afterEach(() => {
  vi.clearAllMocks()
})
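
With this setup, every test sees fetch as a vi.fn(). A minimal sketch of stubbing a response in a test (the /health endpoint is illustrative, and Node 18+ supplies the global Response class):

// example.test.ts
import { expect, it, vi } from 'vitest'

it('returns parsed JSON from the mocked fetch', async () => {
  // Queue one stubbed response on the mock installed in vitest.setup.ts
  vi.mocked(fetch).mockResolvedValueOnce(
    new Response(JSON.stringify({ status: 'ok' })),
  )

  const res = await fetch(`${process.env.NEXT_PUBLIC_API_URL}/health`)

  expect(await res.json()).toEqual({ status: 'ok' })
})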

CI/CD Testing Environment

GitHub Actions Setup

  1. Create a .github/workflows/test.yml file:
name: Test

on:
  push:
    branches: [main, develop]
  pull_request:
    branches: [main, develop]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres:14
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: gradiant_test
        ports:
          - 5432:5432
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
      redis:
        image: redis:7
        ports:
          - 6379:6379
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      - uses: actions/checkout@v3

      - name: Setup Node.js
        uses: actions/setup-node@v3
        with:
          node-version: '18'

      - name: Install pnpm
        uses: pnpm/action-setup@v2
        with:
          version: 8

      - name: Install dependencies
        run: pnpm install --frozen-lockfile

      - name: Run tests
        run: pnpm test -- --coverage
        env:
          TEST_DATABASE_URL: postgresql://postgres:postgres@localhost:5432/gradiant_test
          TEST_REDIS_URL: redis://localhost:6379/1
          TEST_AUTH_SECRET: ${{ secrets.TEST_AUTH_SECRET }}
          TEST_OPENAI_API_KEY: ${{ secrets.TEST_OPENAI_API_KEY }}
          TEST_ANTHROPIC_API_KEY: ${{ secrets.TEST_ANTHROPIC_API_KEY }}

      - name: Upload coverage reports
        uses: codecov/codecov-action@v3
        with:
          token: ${{ secrets.CODECOV_TOKEN }}

Docker Testing Environment

For consistent testing across environments, use Docker:
  1. Create a docker-compose.test.yml file:
version: '3.8'

services:
  test:
    build:
      context: .
      dockerfile: Dockerfile.test
    environment:
      - TEST_DATABASE_URL=postgresql://postgres:postgres@postgres:5432/gradiant_test
      - TEST_REDIS_URL=redis://redis:6379/1
      - NODE_ENV=test
    depends_on:
      - postgres
      - redis
    volumes:
      - .:/app
      - /app/node_modules
    command: pnpm test

  postgres:
    image: postgres:14
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=gradiant_test
    ports:
      - '5432:5432'
    volumes:
      - postgres-test-data:/var/lib/postgresql/data

  redis:
    image: redis:7
    ports:
      - '6379:6379'
    volumes:
      - redis-test-data:/data

volumes:
  postgres-test-data:
  redis-test-data:
  2. Create a Dockerfile.test:
FROM node:18-alpine

WORKDIR /app

# Install pnpm
RUN npm install -g pnpm@8

# Copy package files
COPY package.json pnpm-lock.yaml ./

# Install dependencies
RUN pnpm install --frozen-lockfile

# Copy the rest of the application
COPY . .

# Run tests
CMD ["pnpm", "test"]
  3. Run tests with Docker:
docker-compose -f docker-compose.test.yml up --build
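To propagate the test container's exit status (so the command can gate CI), append --exit-code-from test, which also stops the database containers when the test run finishes.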

Testing with Mocks

Setting Up Mock Services

  1. Create a mocks directory for mock services:
mkdir -p src/mocks
  2. Create a mock service for external APIs:
// src/mocks/openai.ts
import { vi } from 'vitest'

export const mockOpenAIResponse = {
  id: 'chatcmpl-123',
  object: 'chat.completion',
  created: 1677652288,
  model: 'gpt-4o',
  choices: [
    {
      index: 0,
      message: {
        role: 'assistant',
        content: 'Hello, how can I help you today?',
      },
      finish_reason: 'stop',
    },
  ],
  usage: {
    prompt_tokens: 10,
    completion_tokens: 10,
    total_tokens: 20,
  },
}

export const mockOpenAIClient = {
  chat: {
    completions: {
      create: vi.fn().mockResolvedValue(mockOpenAIResponse),
    },
  },
}
  3. Use the mock in tests:
// src/services/ai.test.ts
import { beforeEach, describe, expect, it, vi } from 'vitest'

import { mockOpenAIClient } from '../mocks/openai'
import { AIService } from './ai'

vi.mock('openai', () => ({
  OpenAI: vi.fn().mockImplementation(() => mockOpenAIClient),
}))

describe('AIService', () => {
  let aiService: AIService

  beforeEach(() => {
    aiService = new AIService()
  })

  it('should generate a response', async () => {
    const response = await aiService.generateResponse('Hello')

    expect(mockOpenAIClient.chat.completions.create).toHaveBeenCalledWith({
      model: 'gpt-4o',
      messages: [{ role: 'user', content: 'Hello' }],
    })

    expect(response).toBe('Hello, how can I help you today?')
  })
})
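
Because the mocked create method is a plain vi.fn(), individual tests can override it to exercise error paths. A sketch, added inside the same describe block and assuming AIService.generateResponse lets API errors propagate:

  it('propagates API errors', async () => {
    // Override the default resolved value for this test only
    mockOpenAIClient.chat.completions.create.mockRejectedValueOnce(
      new Error('rate limit exceeded'),
    )

    await expect(aiService.generateResponse('Hello')).rejects.toThrow(
      'rate limit exceeded',
    )
  })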

Database Testing

Setting Up Test Databases

  1. Create a separate database for testing:
createdb gradiant_test
  2. Use Prisma for database testing:
// src/lib/prisma-test.ts
import { PrismaClient } from '@prisma/client'
import { execSync } from 'child_process'
import { v4 as uuid } from 'uuid'

const prismaBinary = './node_modules/.bin/prisma'

export const setupTestDatabase = async () => {
  const testUrl = process.env.TEST_DATABASE_URL

  if (!testUrl) {
    throw new Error('TEST_DATABASE_URL is not defined')
  }

  // Generate a unique schema name for this test run
  const schema = `test_${uuid().replace(/-/g, '_')}`
  const url = `${testUrl}?schema=${schema}`

  // Create a new Prisma client with the test URL
  const prisma = new PrismaClient({
    datasources: {
      db: {
        url,
      },
    },
  })

  // Run migrations on the test database
  execSync(`${prismaBinary} migrate deploy`, {
    env: {
      ...process.env,
      DATABASE_URL: url,
    },
  })

  return {
    prisma,
    url,
    cleanup: async () => {
      // Drop the schema when done
      await prisma.$executeRawUnsafe(
        `DROP SCHEMA IF EXISTS "${schema}" CASCADE`,
      )
      await prisma.$disconnect()
    },
  }
}
  3. Use in tests:
// src/services/user.test.ts
import { describe, it, expect, beforeAll, afterAll } from 'vitest'
import { setupTestDatabase } from '../lib/prisma-test'
import { UserService } from './user'

describe('UserService', () => {
  let userService: UserService
  let cleanup: () => Promise<void>

  beforeAll(async () => {
    const { prisma, cleanup: cleanupFn } = await setupTestDatabase()
    cleanup = cleanupFn
    userService = new UserService(prisma)
  })

  afterAll(async () => {
    await cleanup()
  })

  it('should create a user', async () => {
    const user = await userService.create({
      email: '[email protected]',
      name: 'Test User',
    })

    expect(user).toHaveProperty('id')
    expect(user.email).toBe('[email protected]')
    expect(user.name).toBe('Test User')
  })
})
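
Each suite gets its own schema, so parallel suites stay isolated; if tests within a suite share tables, you can reset them between tests. A sketch, assuming the Prisma schema defines a User model and the client returned by setupTestDatabase is kept in a suite-level variable (with afterEach added to the vitest import):

  afterEach(async () => {
    // Delete rows created by the previous test so each test starts clean
    await prisma.user.deleteMany()
  })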

Performance Testing

Setting Up Performance Tests

  1. Create a performance test directory:
mkdir -p tests/performance
  2. Create a performance test file:
// tests/performance/api.bench.ts
import { afterAll, beforeAll, bench, describe } from 'vitest'
import { createServer } from '../../src/server'
import { request } from 'undici'

describe('API Performance', () => {
  const server = createServer()
  const baseUrl = 'http://localhost:3000'

  beforeAll(async () => {
    await server.listen(3000)
  })

  afterAll(async () => {
    await server.close()
  })

  bench(
    'GET /api/health',
    async () => {
      const response = await request(`${baseUrl}/api/health`)
      return response.statusCode
    },
    { iterations: 100 },
  )

  bench(
    'POST /api/chat',
    async () => {
      const response = await request(`${baseUrl}/api/chat`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          message: 'Hello, world!',
        }),
      })
      return response.statusCode
    },
    { iterations: 50 },
  )
})
  3. Run performance tests:
pnpm test:perf
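The test:perf script is assumed to wrap Vitest's benchmark runner, e.g. a package.json entry such as "test:perf": "vitest bench tests/performance".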

Troubleshooting Common Issues

Database Connection Issues

If you encounter database connection issues:
  1. Verify the database is running:
pg_isready -h localhost -p 5432
  2. Check the connection URL:
psql "postgresql://postgres:postgres@localhost:5432/gradiant_test"
  3. Ensure the database exists:
createdb gradiant_test
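If you are using the Docker setup instead, docker-compose -f docker-compose.test.yml up postgres redis starts only the database services defined above.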

Test Timeouts

If tests are timing out:
  1. Increase the timeout in Vitest config:
// vitest.config.ts
export default defineConfig({
  test: {
    // ...
    testTimeout: 10000, // 10 seconds
  },
})
  2. Check for long-running operations or infinite loops.
  3. Use proper mocking for external services.

Memory Issues

If you encounter memory issues during testing:
  1. Run tests with increased memory:
NODE_OPTIONS=--max-old-space-size=4096 pnpm test
  2. Split large test suites into smaller files.
  3. Use beforeAll and afterAll for expensive setup/teardown operations.
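  4. Depending on your Vitest version, limiting worker parallelism (for example --pool=forks --poolOptions.forks.singleFork on recent versions) can also reduce peak memory usage.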

Conclusion

Following these setup guides will ensure a consistent testing environment across all stages of development. For more detailed information on specific testing patterns, refer to the Test Patterns Documentation. For any issues not covered in this guide, please refer to the Troubleshooting Guide or open an issue on GitHub.