---
title: Testing and Validation
description: Comprehensive guide to testing infrastructure components and validating changes, covering testing strategies, validation procedures, and quality assurance practices
author: Edge AI Team
ms.date: 2025-06-06
ms.topic: how-to
estimated_reading_time: 7
keywords:
  - testing
  - validation
  - quality assurance
  - infrastructure testing
  - terraform testing
  - bicep testing
  - pester
  - terratest
  - checkov
  - security testing
---

This guide covers testing strategies, validation procedures, and quality assurance practices for the AI on Edge Flagship Accelerator. Following these practices ensures reliable, secure, and maintainable infrastructure components.

Test Policy

To maintain code quality and the OSSF Best Practices Badge, we enforce the following:

  1. New Functionality: All new major functionality requires corresponding automated tests.
  2. Bug Fixes: Bug fixes require regression tests that verify the fix.

Technology Requirements

| Technology | Framework | Minimum Requirement |
| ---------- | --------- | ------------------- |
| Terraform  | native `terraform test` | One `.tftest.hcl` per component with `command = plan` |
| Rust       | `cargo test` | `#[cfg(test)]` module covering core logic |
| .NET       | xUnit / NUnit | Test project covering business logic |
| JavaScript | vitest | Test file with 80% coverage threshold |
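
For Terraform components, the minimum bar above can be met with a plan-only test file along these lines (a sketch; the variable names and values are illustrative placeholders, not a specific component's inputs):

```hcl
# tests/main.tftest.hcl — plan-only smoke test, no resources created
run "plan_succeeds" {
  command = plan

  variables {
    # Illustrative values; substitute the component's actual inputs.
    environment = "dev"
    location    = "eastus2"
  }

  assert {
    condition     = var.environment == "dev"
    error_message = "environment variable was not applied"
  }
}
```

Because `command = plan` never applies the configuration, these tests run without creating Azure resources.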

Testing Philosophy

The project follows a comprehensive testing approach:

  • Infrastructure as Code Testing: Validate templates before deployment
  • Security-First Validation: Continuous security scanning and compliance checking
  • Multi-Environment Testing: Validate across development, staging, and production scenarios
  • Automated Quality Gates: Prevent issues through automated validation pipelines

Infrastructure Testing

Terraform Testing

Static Analysis

Validate Terraform configurations without deployment:

```shell
# Navigate to component directory
cd src/000-cloud/010-security-identity/terraform

# Initialize Terraform
terraform init

# Validate syntax and configuration
terraform validate

# Check formatting
terraform fmt -check

# Plan to verify resource configuration
terraform plan
```

Linting with TFLint

Run advanced Terraform linting:

```shell
# Run TFLint on current directory
tflint

# Run with specific configuration
tflint --config=.tflint.hcl

# Run on specific files
tflint main.tf variables.tf
```

Testing Framework

Use Terratest for integration testing:

```shell
# Navigate to test directory
cd src/000-cloud/010-security-identity/tests

# Run Go tests
go test -v -timeout 30m

# Run specific test
go test -v -run TestTerraformSecurityIdentity

# Run all tests recursively
go test -v -timeout 30m ./...
```

Bicep Testing

Template Validation

Validate Bicep templates:

```shell
# Navigate to component directory
cd src/000-cloud/010-security-identity/bicep

# Build to ARM template (surfaces syntax and semantic errors)
az bicep build --file main.bicep

# Validate against a resource group without deploying
az deployment group validate \
  --resource-group "test-rg" \
  --template-file main.bicep \
  --parameters @parameters.json

# Preview changes (what-if)
az deployment group what-if \
  --resource-group "test-rg" \
  --template-file main.bicep \
  --parameters @parameters.json
```

Linting with Bicep

Use built-in Bicep linting:

```shell
# Lint Bicep files
az bicep lint --file main.bicep
```

Rule severities (for example, promoting a warning to an error) are configured in `bicepconfig.json` rather than through a command-line flag.
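
For example, a `bicepconfig.json` placed next to the templates can raise a specific rule to an error; the rule shown is one of the built-in core analyzer rules, and which rules to tighten is a project decision:

```json
{
  "analyzers": {
    "core": {
      "enabled": true,
      "rules": {
        "no-unused-params": {
          "level": "error"
        }
      }
    }
  }
}
```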

Validation Tools

Security Scanning with Checkov

Run comprehensive security analysis:

```shell
# Scan changed folders only
npm run checkov-changes

# Scan all folders
npm run checkov-all

# Scan specific directory
checkov -d src/000-cloud/010-security-identity

# Generate detailed report
checkov -d . --output json --output-file checkov-report.json
```

Common Checks

Checkov validates:

  • Resource configuration against security best practices
  • Access control and RBAC configurations
  • Network security settings and firewall rules
  • Encryption configuration for data at rest and in transit
  • Compliance with industry standards (CIS, PCI DSS, GDPR)
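
When a finding reflects a deliberate, reviewed deviation, Checkov can suppress it inline with a `checkov:skip` comment inside the resource block. The resource and check ID below are illustrative, and every skip should carry a justification:

```hcl
resource "azurerm_storage_account" "example" {
  # checkov:skip=CKV_AZURE_33: Queue logging is not required for this test-only account
  name                     = "examplestorage"
  resource_group_name      = "test-rg"
  location                 = "eastus2"
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```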

Code Quality with MegaLinter

Run comprehensive linting across all file types:

```shell
# Run all linters in Dev Container mode
npm run lint-devcontainer

# Fix automatically fixable issues
npm run lint-fix-devcontainer

# Run specific linter category
npx mega-linter-runner --flavor terraform

# Run with specific configuration
npx mega-linter-runner --env MEGALINTER_CONFIG=.mega-linter.yml
```

Spell Checking

Maintain documentation quality:

```shell
# Check spelling in all markdown files
npm run cspell

# Check specific file
npx cspell docs/contributor/testing-validation.md

# Add words to project dictionary
echo "terratest" >> .cspell-dictionary.txt
```

Pre-Commit Validation

Run these checks before committing changes:

Quick Validation Script

```shell
#!/bin/bash
# Save as scripts/pre-commit-check.sh

echo "Running pre-commit validation..."

# 1. Format and lint code
echo "1. Running linters..."
npm run lint-fix-devcontainer

# 2. Security scanning
echo "2. Running security scans..."
npm run checkov-changes

# 3. Spell checking
echo "3. Checking spelling..."
npm run cspell

# 4. Test changed components
echo "4. Testing components..."
# Add component-specific testing here

echo "Pre-commit validation complete!"
```

Manual Validation Checklist

Before committing, verify:

  • All linting issues resolved
  • Security scans pass without new high-severity issues
  • Terraform/Bicep templates validate successfully
  • Documentation is spell-checked and formatted
  • Tests pass for modified components
  • Commit messages follow conventional commit format

Component Testing

Test Structure

Each component should include comprehensive tests:

```text
src/000-cloud/010-security-identity/
├── terraform/
│   ├── main.tf
│   ├── variables.tf
│   └── outputs.tf
├── tests/
│   ├── go.mod
│   ├── go.sum
│   ├── terraform_test.go
│   └── fixtures/
│       └── test-parameters.tfvars
└── ci/
    └── terraform/
        ├── main.tf
        └── variables.tf
```

Writing Component Tests

Create comprehensive test coverage:

```go
// Example: tests/terraform_test.go
package test

import (
    "testing"

    "github.com/gruntwork-io/terratest/modules/terraform"
    "github.com/stretchr/testify/assert"
)

func TestTerraformSecurityIdentity(t *testing.T) {
    t.Parallel()

    terraformOptions := terraform.WithDefaultRetryableErrors(t, &terraform.Options{
        TerraformDir: "../terraform",
        VarFiles:     []string{"fixtures/test-parameters.tfvars"},
    })

    defer terraform.Destroy(t, terraformOptions)

    // Apply the Terraform configuration
    terraform.InitAndApply(t, terraformOptions)

    // Validate outputs
    keyVaultName := terraform.Output(t, terraformOptions, "key_vault_name")
    assert.NotEmpty(t, keyVaultName)

    // Additional validations
    resourceGroupName := terraform.Output(t, terraformOptions, "resource_group_name")
    assert.Contains(t, resourceGroupName, "test")
}
```

Test Data Management

Use fixture files for test parameters:

```hcl
# tests/fixtures/test-parameters.tfvars
prefix            = "test"
environment       = "dev"
location          = "East US"
enable_monitoring = true
```
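
Fixture values like these can also be guarded at the module boundary with Terraform variable validation, so a bad fixture fails at plan time rather than mid-deployment. A sketch, with an illustrative allowed-values list to adapt per component:

```hcl
variable "environment" {
  type        = string
  description = "Deployment environment name."

  validation {
    condition     = contains(["dev", "staging", "prod"], var.environment)
    error_message = "environment must be one of: dev, staging, prod."
  }
}
```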

Blueprint Testing

Some blueprints include comprehensive test suites using Go and the Terratest framework. The testing infrastructure validates both IaC declarations and actual deployments.

Blueprint Test Architecture

Shared Test Utilities: src/900-tools-utilities/904-test-utilities/

Provides reusable testing functions for all blueprints including:

  • Contract validation functions for Terraform and Bicep
  • Deployment and cleanup utilities
  • Output normalization across frameworks

Reference Implementation: blueprints/full-single-node-cluster/tests/

Complete test suite demonstrating:

  • Contract tests for both Terraform and Bicep
  • End-to-end deployment validation
  • Helper scripts for test execution
  • Output contract definitions

Contract Testing

Purpose: Fast static validation ensuring output declarations match test expectations

Characteristics:

  • Runs in seconds without Azure authentication
  • Zero cost - no Azure resources created
  • Validates IaC configuration correctness
  • Catches drift before expensive deployments

Running Contract Tests:

```shell
cd blueprints/full-single-node-cluster/tests

# Test both frameworks
./run-contract-tests.sh both

# Test specific framework
./run-contract-tests.sh terraform
./run-contract-tests.sh bicep

# Direct Go execution
go test -v -run Contract
```

Deployment Testing

Purpose: Full end-to-end validation with real Azure resource deployment

Characteristics:

  • Creates billable Azure resources
  • Tests actual infrastructure deployment
  • Validates resource connectivity and functionality
  • Duration: 30-45 minutes per test

Running Deployment Tests:

```shell
cd blueprints/full-single-node-cluster/tests

# Enable automatic cleanup
export CLEANUP_RESOURCES=true

# Test specific framework
./run-deployment-tests.sh terraform
./run-deployment-tests.sh bicep

# Direct Go execution
go test -v -run TestTerraformFullSingleNodeClusterDeploy -timeout 2h
go test -v -run TestBicepFullSingleNodeClusterDeploy -timeout 2h
```

Environment Variables:

  • CLEANUP_RESOURCES - Auto-delete resources after test (default: false)
  • TEST_ENVIRONMENT - Environment name (default: dev)
  • TEST_LOCATION - Azure region (default: eastus2)
  • TEST_RESOURCE_PREFIX - Resource naming prefix (default: t6)
  • SKIP_BICEP_DEPLOYMENT - Use existing deployment (default: false)

Blueprint Test Organization

Each blueprint test suite includes:

```text
blueprints/{blueprint-name}/tests/
├── outputs.go                     # Output contract definition
├── contract_terraform_test.go     # Terraform contract validation
├── contract_bicep_test.go         # Bicep contract validation
├── deploy_terraform_test.go       # Terraform deployment test
├── deploy_bicep_test.go           # Bicep deployment test
├── validation.go                  # Shared validation functions
├── setup.go                       # Post-deployment setup
├── run-contract-tests.sh          # Contract test runner
└── run-deployment-tests.sh        # Deployment test runner
```

Blueprint Integration Testing

```shell
# Navigate to blueprint directory
cd blueprints/full-single-node-cluster/terraform

# Initialize with test backend
terraform init -backend-config="container_name=test-tfstate"

# Plan deployment
terraform plan -var-file="test.tfvars"

# Apply to test environment
terraform apply -var-file="test.tfvars" -auto-approve

# Clean up
terraform destroy -var-file="test.tfvars" -auto-approve
```

Creating Blueprint Tests

When creating a new blueprint, add comprehensive test coverage:

  1. Define output contract in tests/outputs.go with struct tags for both frameworks
  2. Create contract tests for static validation
  3. Create deployment tests for end-to-end validation
  4. Add helper scripts for simplified test execution
  5. Document test requirements in blueprint README
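
The output contract in step 1 is a plain Go struct whose tags map one logical output to each framework's output name. The sketch below illustrates the idea with hypothetical field and tag names; it is not the actual contract in `tests/outputs.go`:

```go
package main

import (
	"fmt"
	"reflect"
)

// BlueprintOutputs maps each logical output to the name declared by
// each framework, via struct tags (illustrative fields).
type BlueprintOutputs struct {
	ResourceGroupName string `terraform:"resource_group_name" bicep:"resourceGroupName"`
	KeyVaultName      string `terraform:"key_vault_name" bicep:"keyVaultName"`
}

// missingOutputs returns the expected output names (for the given
// framework's tag) that are absent from the actual outputs map.
func missingOutputs(framework string, actual map[string]string) []string {
	var missing []string
	t := reflect.TypeOf(BlueprintOutputs{})
	for i := 0; i < t.NumField(); i++ {
		name := t.Field(i).Tag.Get(framework)
		if _, ok := actual[name]; !ok {
			missing = append(missing, name)
		}
	}
	return missing
}

func main() {
	// Simulated `terraform output` result with one declaration missing.
	actual := map[string]string{"resource_group_name": "rg-test"}
	fmt.Println(missingOutputs("terraform", actual)) // [key_vault_name]
}
```

A contract test built this way compares the struct's tags against each framework's declared outputs, so drift is caught in seconds without touching Azure.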

See: Blueprint Developer Guide for detailed instructions

See: test-utilities README for complete API reference

Blueprint Validation Script

Create comprehensive validation:

```shell
#!/bin/bash
# scripts/validate-blueprint.sh

BLUEPRINT_DIR=$1
ENVIRONMENT=${2:-test}

echo "Validating blueprint: $BLUEPRINT_DIR"

# 1. Validate Terraform
cd "$BLUEPRINT_DIR/terraform"
terraform init
terraform validate
terraform plan -var-file="${ENVIRONMENT}.tfvars"

# 2. Run security scans
checkov -d .

# 3. Validate dependencies
echo "Checking component dependencies..."
# Add component dependency validation

echo "Blueprint validation complete!"
```

Continuous Integration

Pipeline Testing

The CI/CD pipeline includes comprehensive testing:

Build Stage

```yaml
# .azdo/pipelines/build.yml excerpt
- task: TerraformCLI@0
  displayName: 'Terraform Validate'
  inputs:
    command: 'validate'
    workingDirectory: '$(System.DefaultWorkingDirectory)/src/*/terraform'

- task: PowerShell@2
  displayName: 'Run Checkov Security Scan'
  inputs:
    filePath: 'scripts/Run-Checkov.ps1'
    arguments: '-IncludeChangedFolders'
```

Test Stage

```yaml
- task: GoTool@0
  displayName: 'Use Go 1.19'
  inputs:
    version: '1.19'

- task: Go@0
  displayName: 'Run Infrastructure Tests'
  inputs:
    command: 'test'
    arguments: '-v -timeout 30m ./tests/...'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
```

Quality Gates

The pipeline enforces these quality gates:

  • Linting: All files must pass MegaLinter validation
  • Security: No new high-severity security issues
  • Testing: All component tests must pass
  • Documentation: All documentation must be current and properly formatted

Test Environments

Use dedicated environments for testing:

  • Development: Individual developer testing and validation
  • CI/CD: Automated testing in isolated environments
  • Staging: Integration testing with production-like configurations
  • Production: Final validation before deployment

Troubleshooting

Common Testing Issues

Terraform State Issues

```shell
# Reset state for testing
terraform workspace new test-$(date +%s)
terraform init

# Or use isolated backend
terraform init -backend-config="container_name=test-tfstate"
```

Azure Authentication Issues

```shell
# Verify Azure CLI authentication
az account show

# Login with service principal (CI/CD)
az login --service-principal -u $CLIENT_ID -p $CLIENT_SECRET --tenant $TENANT_ID

# Set subscription
az account set --subscription "your-subscription-id"
```

Test Timeout Issues

```shell
# Increase timeout for long-running tests
go test -v -timeout 60m ./tests/...

# Run tests in parallel with limited concurrency
go test -v -parallel 2 ./tests/...
```

Debugging Test Failures

Terraform Debugging

```shell
# Enable detailed logging
export TF_LOG=DEBUG
export TF_LOG_PATH=terraform.log

# Run with verbose output
terraform apply -auto-approve -input=false
```

Bicep Debugging

```shell
# Deploy with verbose output
az deployment group create \
  --resource-group "test-rg" \
  --template-file main.bicep \
  --parameters @parameters.json \
  --verbose
```

Test Data Investigation

```shell
# Preserve test resources for investigation
export CLEANUP_RESOURCES=false
go test -v -run TestSpecificCase

# Manual cleanup after investigation
terraform destroy -auto-approve
```

Performance Testing

Infrastructure Performance

Monitor deployment times and resource utilization:

```shell
# Time Terraform operations
time terraform apply -auto-approve

# Monitor Azure resource deployment
az deployment group show \
  --resource-group "test-rg" \
  --name "deployment-name" \
  --query "properties.duration"
```

Test Execution Performance

Optimize test execution time:

```shell
# Run tests in parallel
go test -v -parallel 4 ./tests/...

# Profile test performance
go test -v -cpuprofile=cpu.prof -memprofile=mem.prof ./tests/...
```

Best Practices

Test Organization

  • Separate unit and integration tests clearly
  • Use descriptive test names that explain what is being tested
  • Include both positive and negative test cases
  • Test error conditions and edge cases
  • Implement contract tests for fast validation before deployment tests
  • Use test-utilities package for consistent testing patterns across blueprints

Best Practices for Test Data

  • Use parameterized tests for multiple scenarios
  • Clean up test resources automatically (set CLEANUP_RESOURCES=true)
  • Isolate test environments to prevent interference
  • Use realistic test data that represents production scenarios
  • Run contract tests first to catch errors before expensive deployments
  • Enable cleanup in CI/CD to prevent resource accumulation

Validation Strategy

  • Test early and often during development
  • Automate repetitive validation tasks
  • Include security testing in all validation procedures
  • Document test procedures for team consistency

Continuous Improvement

  • Regular review of test coverage and effectiveness
  • Update tests when requirements change
  • Share testing knowledge across the team
  • Contribute improvements to testing frameworks and tools

For more information about development workflows, see the Development Environment and Contributing Guidelines.

🤖 Crafted with precision by ✨Copilot following brilliant human instruction, then carefully refined by our team of discerning human reviewers.