---
title: Coding Conventions
description: Comprehensive coding standards and conventions for the AI on Edge Flagship Accelerator project, covering Terraform, Bicep, PowerShell, Python, and documentation standards
author: Edge AI Team
ms.date: 2025-06-06
ms.topic: reference
estimated_reading_time: 14
keywords:
---
This document outlines the coding conventions and standards for this repository. Following these conventions ensures consistency across the codebase and makes it easier for contributors to collaborate effectively. For information about the overall contribution process, please refer to our Contributing Guide.
This document uses terminology as defined in RFC 2119, where the keywords "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" are to be interpreted as described in that RFC.
The repository follows a structured organization to maintain clarity and separation of concerns. This section outlines where different types of files SHOULD be placed and how folders SHOULD be named.
The repository is organized into these primary directories:
| Directory | Purpose |
|---|---|
| `/src` | Individual infrastructure components as reusable modules |
| `/blueprints` | Deployable infrastructure combinations using the src components |
| `/docs` | Project documentation and architectural decision records |
| `/scripts` | Utility scripts for development, deployment, and maintenance |
| `/docs/build-cicd` | CI/CD pipeline definitions and configuration |
| `/.devcontainer` | Development container configuration for consistent environments |
The /src directory contains all individual infrastructure components:
- Components MUST follow the decimal naming convention for deployment order (e.g., `010-security-identity`, `100-cncf-cluster`)
- Component names SHOULD clearly indicate their purpose
- Each component directory MUST contain:
  - A `terraform` subdirectory for the Terraform module implementation
  - A `README.md` file documenting the module's purpose and usage
  - A `tests` directory with Terraform tests
  - A `ci` directory with CI-specific configurations when applicable
  - Tool-generated, SDK-style readmes such as `terraform-docs` output
Example structure:

```text
src/
  000-cloud/
    010-security-identity/
      README.md
      terraform/            # This is a COMPONENT MODULE
        main.tf
        variables.tf
        variables.core.tf
        variables.deps.tf
        outputs.tf
        versions.tf
        README.md
        modules/
          key-vault/        # This is an INTERNAL MODULE
            main.tf
            variables.tf
            outputs.tf
      bicep/                # This is a BICEP COMPONENT MODULE
      ci/
        terraform/          # This is a CI TERRAFORM DIRECTORY
          main.tf
          variables.tf
          versions.tf
        bicep/              # This is a CI BICEP DIRECTORY
```
Blueprints combine multiple components from /src to create complete deployable infrastructure solutions:
- Blueprint directories SHOULD use descriptive names reflecting their purpose
- The main blueprint files MUST be in either `terraform` or `bicep` subdirectories
- Each blueprint MUST include a `README.md` describing:
  - Purpose of the blueprint
  - Components included
  - Deployment instructions
  - Required parameters
Example structure:

```text
/blueprints/
  terraform/
    full-single-cluster/
      main.tf
      variables.tf
      outputs.tf
      README.md
  bicep/
    minimal-deployment/
      main.bicep
      parameters.json
      README.md
```
The /docs directory contains:
- Project-wide documentation MUST be in markdown format
- Architectural decision records (ADRs) MUST follow the format `adr-NNN-title.md`
- Technical specifications SHOULD include diagrams where appropriate
- Diagrams SHOULD use standard formats (PNG, SVG) with source files when available
The /scripts directory contains utility scripts for various purposes:
- Deployment scripts SHOULD be in a relevant subdirectory (e.g., `deployment`)
- Scripts MUST include inline documentation and usage information
- Cross-platform scripts SHOULD be provided where possible (both PowerShell and Bash)
- Scripts MUST be executable and have consistent permissions
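As a small illustration of the inline-documentation and cross-platform conventions above, a Bash utility might guard its dependencies with a helper like this (the function is hypothetical, not an existing repository script):

```shell
#!/usr/bin/env bash
# require_tool: fail fast with a clear message when a dependency is missing.
# Hypothetical helper shown for illustration only.

require_tool() {
  command -v "$1" >/dev/null 2>&1 || {
    echo "Required tool '$1' is not installed." >&2
    return 1
  }
}

# Example: verify the shell itself is available before proceeding.
require_tool sh && echo "sh found"
```

Failing fast with an explicit message is friendlier than letting a script die halfway through a deployment because a CLI was missing.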
Notable scripts include:
- `update-all-terraform-docs.sh`: Updates documentation for all Terraform modules
- `tf-docs-check.sh`: Verifies Terraform documentation is current
- `wiki-build.sh`: Compiles documentation for Azure DevOps wiki publication
- `aio-version-checker.sh`: Verifies Azure IoT Operations component versions
- `tf-var-compliance-check.py`: Ensures variable consistency across modules
The /docs/build-cicd directory contains CI/CD pipeline configurations:
- Pipeline definitions MUST use the `.yml` extension
- Template files SHOULD include `-template` in their name
- Each pipeline template MUST have an accompanying markdown document explaining its usage
- Pipeline variables and secrets SHOULD be documented but not committed
The /.github directory contains GitHub-specific configurations and workflows:
- GitHub Actions workflows MUST be stored in the `/.github/workflows` directory with the `.yml` extension
- Workflow names MUST clearly indicate their purpose (e.g., `ci.yml`, `release.yml`, `docs-validation.yml`)
- Issue and PR templates MUST be stored in `/.github/ISSUE_TEMPLATE` and `/.github/PULL_REQUEST_TEMPLATE` directories
- GitHub-specific documentation (e.g., `SECURITY.md`, `SUPPORT.md`) SHOULD be stored in the root or `/.github` directory
- GitHub environment configurations SHOULD be documented but secret values MUST NOT be committed
- Reusable workflow files SHOULD use the format `reusable-[purpose].yml`
Example structure:
```text
/.github/
  workflows/
    ci.yml                  # Main CI workflow
    dependency-review.yml   # Dependency scanning
    release.yml             # Release automation
    reusable-terraform.yml  # Reusable Terraform workflow
  ISSUE_TEMPLATE/
    bug-report.md
    feature-request.md
  PULL_REQUEST_TEMPLATE.md
  dependabot.yml            # Dependabot configuration
  CODEOWNERS                # Code ownership definitions
```
GitHub workflows SHOULD follow these principles:
- Single responsibility (each workflow should have a clear purpose)
- Use of GitHub environment variables and secrets for configuration
- Consistent job and step naming conventions
- Appropriate triggering conditions (e.g., branches, paths, events)
- Clear job dependencies and workflow structure
- Use of reusable workflows for common tasks
- Consistent error handling and notifications
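These principles can be seen together in a minimal caller workflow; the workflow name, trigger paths, and file names below are illustrative assumptions, not the repository's actual pipelines:

```yaml
# Illustrative only — names and paths are assumptions.
name: Terraform CI
on:
  pull_request:
    paths:
      - "src/**/terraform/**"   # trigger only on relevant changes
jobs:
  terraform:
    # Delegate common steps to a shared reusable workflow
    uses: ./.github/workflows/reusable-terraform.yml
    secrets: inherit
```

Keeping the caller this thin pushes all shared logic into the reusable workflow, which is what makes the single-responsibility and reuse principles practical.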
We use MegaLinter as our comprehensive linting solution to ensure code quality across all languages and file types in the repository.
For detailed information about our MegaLinter configuration, integration with our CI/CD pipeline, and how to use it in your development workflow, please refer to our MegaLinter documentation.
This includes:
- How to run MegaLinter locally
- Available linters and configuration options
- CI/CD integration details
- Pipeline optimization with caching
- Each module MUST be in its own directory under
/srcwith a meaningful name - Modules MUST follow the decimal naming convention (e.g.,
000-subscription,010-vm-host) to indicate deployment order - Each module MUST include a
README.mdwith documentation generated byterraform-docs - Each module MUST include a
testsdirectory with Terraform tests for the module
Variables MUST be defined consistently across modules:

```hcl
variable "resource_prefix" {
  description = "Prefix for all resources created by this module."
  type        = string

  validation {
    condition     = length(var.resource_prefix) <= 13
    error_message = "The resource_prefix value must be 13 characters or less."
  }
}
```

- Naming Convention:
  - Variable names MUST use `snake_case`
  - Variable names MUST be descriptive and indicate purpose
  - Environment-specific variables MUST be prefixed with `env_` (e.g., `env_name`)
  - Common concept variables MUST use consistent names across modules
  - Boolean variables SHOULD start with `should_` or `is_`
- Documentation:
  - Descriptions MUST end with a period
  - Descriptions MUST explain the purpose, expected format, and constraints
  - For complex variables, descriptions SHOULD include examples
- Type Constraints:
  - Variables MUST specify their type
  - Specific subtypes SHOULD be used where applicable (e.g., `list(string)` instead of just `list`)
  - Complex types MUST use `object()` with clear attribute definitions
- Default Values:
  - Optional variables SHOULD provide sensible defaults
  - Required variables MUST NOT have defaults
  - Required status MUST be clearly documented
  - Internal modules MUST NOT have defaults
- Validation Rules:
  - Important constraints SHOULD include validation rules
  - Validation error messages MUST clearly guide the user
  - Validation rules MUST be tested in module tests
- Variable Files:
  - Variables MUST be organized in `variables.tf`, `variables.core.tf`, `variables.deps.tf`, or `variables.<internal-module>.tf`
  - Environment-specific values SHOULD use `.tfvars` files
  - Sensitive values MUST NEVER be committed in `.tfvars` files
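A validation rule like the `resource_prefix` constraint above can be exercised with Terraform's native test framework (Terraform 1.6+). The file name and run labels below are illustrative, not existing repository files:

```hcl
# tests/resource_prefix.tftest.hcl — illustrative file name and run labels.

run "rejects_prefix_longer_than_13_characters" {
  command = plan

  variables {
    resource_prefix = "this-prefix-is-far-too-long"
  }

  # The plan should fail on the variable's validation block.
  expect_failures = [
    var.resource_prefix,
  ]
}

run "accepts_short_prefix" {
  command = plan

  variables {
    resource_prefix = "edgeai"
  }
}
```

Testing both the failing and passing cases documents the constraint and catches regressions if the validation rule is later edited.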
- All Terraform code MUST be formatted with `terraform fmt` before committing
- Code MUST follow HashiCorp's Terraform Style Conventions
- Code MUST use consistent indentation (2 spaces)
- Related resources SHOULD be grouped logically
- Resource names and IDs MUST be meaningful and descriptive
- Bicep modules MUST be in a dedicated `/modules` directory within the component directory, and the module file name SHOULD be descriptive of the module's purpose
- Each module MUST include a `README.md` with clear documentation
- Parameters and outputs MUST be well-documented within the module
Bicep parameters MUST follow these conventions:

```bicep
@description('The name of the resource prefix to use for all resources.')
@maxLength(13)
param resourcePrefix string

@description('The Azure region to deploy resources to.')
param location string = resourceGroup().location
```

- Naming Convention:
  - Parameter names MUST use `camelCase`
  - Parameter names MUST be descriptive and indicate purpose
  - Type names MUST use `PascalCase`
  - Resource names MUST use `kebab-case`
  - Common concept parameters MUST use consistent names across modules
- Documentation:
  - All parameters MUST use the `@description()` decorator
  - Descriptions MUST explain the purpose, expected format, and constraints
- Type Constraints:
  - All parameters MUST specify their type
  - Parameters SHOULD use decorators like `@minLength()`, `@maxLength()`, `@allowed()` for validation
  - Constraint violations MUST produce clear error messages
- Default Values:
  - Optional parameters SHOULD provide sensible defaults
  - Required parameters MUST NOT have defaults
  - Context-aware defaults (e.g., `resourceGroup().location`) SHOULD be used when appropriate
- Parameter Files:
  - Environment-specific values SHOULD use parameter files (`.parameters.json`)
  - Sensitive values MUST NEVER be committed in parameter files
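Validation decorators combine naturally with a sensible default; the parameter name and allowed values in this sketch are assumptions for illustration:

```bicep
// Illustrative parameter — the name and allowed values are assumptions.
@description('The pricing tier to use for the service. Must be one of the allowed values.')
@allowed([
  'Basic'
  'Standard'
  'Premium'
])
param skuTier string = 'Standard'
```

With `@allowed()`, an invalid value is rejected at compile/deployment time with a message listing the permitted options, rather than failing later inside the resource provider.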
We follow the Conventional Commits specification for commit messages. This creates a readable commit history and enables automated versioning and changelog generation.
All commits MUST adhere to this structure:
```text
<type>[optional scope]: <description>

[optional body]

[optional footer(s)]
```
Commit types MUST be one of the following:
- `feat`: A new feature
- `fix`: A bug fix
- `docs`: Documentation-only changes
- `style`: Changes that do not affect the meaning of the code (white-space, formatting, etc.)
- `refactor`: A code change that neither fixes a bug nor adds a feature
- `perf`: A code change that improves performance
- `test`: Adding missing tests or correcting existing tests
- `build`: Changes that affect the build system or external dependencies
- `ci`: Changes to our CI configuration files and scripts
- `chore`: Other changes that don't modify src or test files
- `revert`: Reverts a previous commit
```text
feat(vm-host): add support for premium SSD disks

This change allows users to specify premium SSD disks for the VM host,
which provides better performance for I/O-intensive workloads.

Fixes #123
```
```text
fix(iot-ops): correct connection string format in messaging module

The connection string was incorrectly formatted, causing connection failures.
This fix ensures the proper format is used.

BREAKING CHANGE: Connection string format has changed and requires reconfiguration.
```
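A lightweight local guard against malformed messages could look like the following; the helper is hypothetical and not part of the repository's tooling (teams often wire this into a `commit-msg` Git hook or use commitlint instead):

```shell
# check_commit_msg: return success only for messages matching the
# Conventional Commits shape used in this project. Hypothetical helper.
check_commit_msg() {
  echo "$1" | grep -Eq \
    '^(feat|fix|docs|style|refactor|perf|test|build|ci|chore|revert)(\([a-z0-9-]+\))?!?: .+'
}

check_commit_msg "feat(vm-host): add support for premium SSD disks" && echo "valid"
check_commit_msg "added some stuff" || echo "invalid"
```

The pattern accepts an optional `(scope)` and an optional `!` for breaking changes, mirroring the structure shown above.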
Pull request titles MUST follow the Conventional Commits format:
```text
feat(k8s): add support for external secrets
```
Each pull request MUST:
- Address a single concern
- Include comprehensive tests
- Update documentation as needed
- Pass all CI checks
Each pull request SHOULD:
- Be reviewed by at least one core team member
- Include a clear description of the changes and the motivation
All pull requests MUST have an associated work item in the project backlog:
- Contributors MUST link their PR to the relevant work item using the Azure DevOps PR creation interface
- Contributors SHOULD reference the work item ID in the PR description with the format `AB#123`
- Work items MUST be in an appropriate state (e.g., "In Progress")
When code changes relate to a specific customer implementation or request:
- Contributors SHOULD tag the pull request with the customer's name in the PR's tags section
- Contributors MUST NOT add NDA or customer specific data to PR titles or descriptions, including:
- NO customer names
- NO specific requirements or constraints that would identify a customer
- NO links to customer-specific documentation
- NO product/project names or descriptions of customer business units
Our repository uses automated reviewer assignment based on the areas of the codebase being modified:
- Reviewer groups will be automatically assigned based on the directories and components modified
- The following specialist teams are configured:
- IaC Team (Terraform): Changes to core Terraform infrastructure modules
- IaC Team (Bicep): Changes to core Bicep infrastructure modules
- Security Reviewers: Changes to security related folders
- TPM Reviewers: Significant documentation changes including ADRs & tech papers
- Contributors MUST NOT manually remove automatically assigned reviewers
- Contributors MAY add additional reviewers if needed for specific expertise or perspective
- Documentation MUST be kept up-to-date with code changes
- Documentation MUST use markdown for all documentation files
- Documentation SHOULD be placed as close as possible to the code it documents
- Contributors MUST run `./scripts/update-all-terraform-docs.sh` to update Terraform module documentation
- Complex features (e.g., a new blueprint) SHOULD include examples
- Breaking changes MUST be prominently documented
The CI pipeline includes checks to ensure documentation is up-to-date:
- `DocsCheckTerraform` verifies that Terraform documentation is current
- `DocsCheckBicep` verifies that Bicep documentation is current
- Documentation changes MUST be included in the same PR as the related code changes
The repository includes an automated system that collects all project documentation and publishes it to the Azure DevOps wiki:
- Documentation from markdown files in `/docs` and component READMEs is automatically gathered
- Upon successful builds of the main branch, documentation is synchronized to the Azure DevOps wiki
- This ensures that the latest documentation is always available in a user-friendly format
- Code and documentation changes are kept in sync through this automated process
The wiki update process:
- Checks out both the main code repo and the wiki repo
- Runs the `wiki-build.sh` script to process and structure documentation
- Pushes the updated content to the wiki repository
For detailed information about the wiki auto-publishing system, configuration, and how it works, see the Wiki Update documentation.
All infrastructure code MUST follow security best practices:
- Secrets Management:
  - Never commit secrets, API keys, or sensitive data to the repository
  - Use Azure Key Vault for secret storage
  - Reference secrets using secure methods (Key Vault references, Managed Identity)
- Resource Security:
  - Enable encryption at rest and in transit by default
  - Use least-privilege access principles
  - Enable audit logging for all resources
  - Follow Azure Security Benchmark recommendations
- Network Security:
  - Use private endpoints where possible
  - Implement network segmentation
  - Apply appropriate firewall rules and NSG configurations
- Code Scanning:
  - All code MUST pass security scanning (Checkov, Gitleaks)
  - False positives MUST be documented with skip annotations
  - Update scanning tools and rules regularly
- Documentation:
  - Security configurations MUST be documented
  - Compliance mappings SHOULD be included for regulated environments
  - Risk assessments SHOULD be documented for architectural decisions
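For example, a reviewed false positive can be suppressed inline with a Checkov skip comment of the form `checkov:skip=<check ID>:<reason>`. The resource, names, and check ID in this sketch are assumptions for illustration:

```hcl
# Illustrative resource — names and the check ID are assumptions.
resource "azurerm_storage_account" "example" {
  # checkov:skip=CKV_AZURE_59: Public access reviewed and accepted for this sample scenario
  name                     = "examplestorage"
  resource_group_name      = "example-rg"
  location                 = "westeurope"
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```

Including a reason after the check ID is what turns a suppression into documentation: reviewers can see at a glance why the finding was accepted.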
Each component MUST include comprehensive testing:
- Unit Tests:
  - Terraform syntax validation
  - Variable type and constraint validation
  - Output verification
- Integration Tests:
  - End-to-end deployment testing
  - Resource dependency validation
  - Cross-component interaction testing
- Security Tests:
  - Infrastructure security scanning
  - Compliance validation
  - Access control verification
Blueprints MUST undergo additional testing:
- Deployment Validation:
  - Full deployment testing in isolated environments
  - Rollback testing
  - Performance validation
- Scenario Testing:
  - Use case-specific validation
  - Load testing where applicable
  - Disaster recovery testing
When using AI assistance tools:
- Code Quality:
  - All AI-generated code MUST be reviewed for security
  - Follow project conventions consistently
  - Validate against coding standards
- Documentation:
  - Use AI to maintain consistent documentation style
  - Verify AI-generated documentation for accuracy
  - Include realistic examples and use cases
- Testing:
  - Generate comprehensive test cases with AI assistance
  - Validate AI-suggested test scenarios
  - Ensure test coverage meets project requirements
- Context Awareness:
  - Provide clear context about project structure
  - Reference existing patterns and conventions
  - Specify framework preferences (Terraform vs Bicep)
- Validation:
  - Review all AI suggestions carefully
  - Test generated code thoroughly
  - Verify compliance with project standards
- Cost Management:
  - Use appropriate SKUs for workload requirements
  - Implement auto-scaling where beneficial
  - Include cost optimization recommendations
- Performance:
  - Choose appropriate Azure regions
  - Optimize for workload characteristics
  - Monitor and tune resource configurations
- Code Reusability:
  - Maximize component reuse across blueprints
  - Avoid duplicating functionality
  - Design for extensibility
- Automation:
  - Automate repetitive tasks
  - Use infrastructure automation tools effectively
  - Implement CI/CD best practices
All contributions MUST undergo thorough code review:
- Technical Review:
  - Code quality and maintainability
  - Security and compliance validation
  - Performance considerations
- Documentation Review:
  - Accuracy and completeness
  - Consistency with project standards
  - User experience considerations
- Testing Review:
  - Test coverage and quality
  - Scenario validation
  - Integration testing completeness
- Feedback Loops:
  - Regular review of coding standards
  - Community feedback integration
  - Performance metrics analysis
- Standards Evolution:
  - Update standards based on project learnings
  - Incorporate industry best practices
  - Maintain alignment with Azure recommendations
- Development Environment Setup - Dev Container configuration and tooling
- AI-Assisted Engineering - GitHub Copilot integration and best practices
- Testing and Validation - Comprehensive testing strategies
- Contributing Guidelines - Contribution process and requirements
- Troubleshooting Guide - Common issues and solutions
For questions about coding conventions, see our troubleshooting guide or reach out through repository discussions.
🤖 Crafted with precision by ✨Copilot following brilliant human instruction, then carefully refined by our team of discerning human reviewers.