23 changes: 2 additions & 21 deletions includes/provisioning_platforms/index.asciidoc
@@ -82,27 +82,8 @@ A *programming language* is either "declarative" or "imperative". Declarative pr

Imperative programming languages state the how: the internal delta calculation must be programmed explicitly. Where possible, declarative programming languages are recommended because of their automatic delta calculation. Infrastructure is the typical case.
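As a minimal sketch of the declarative style (the resource names and the Azure provider setup are illustrative assumptions, not part of the original text): only the desired end state is described, and the tool derives the delta itself:

```hcl
# Declarative: only the desired end state is declared.
# Terraform compares it against the state file and derives the
# create/update/delete operations (the "delta") automatically.
resource "azurerm_resource_group" "example" {
  name     = "rg-demo-dev"
  location = "westeurope"
}

resource "azurerm_storage_account" "example" {
  name                     = "stdemodev"
  resource_group_name      = azurerm_resource_group.example.name
  location                 = azurerm_resource_group.example.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
}
```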

Typical declarative options are shown in detail in the table below. The overall recommendation is to go for Terraform. Major reasons for voting down Bicep/ARM:

* ARM: difficult readability for humans
* Bicep: lack of support for testing based on the plan and lack of a testing ecosystem, since Bicep was only added recently

Table with declarative programming language options:
[options="header"]
|=======================
|Criteria |Bicep |ARM |Terraform
|Same syntax across clouds |- (Azure only) |- (Azure only) |+ (multi-cloud)
|What if |o (no complete property list; only display of the plan; unexpected deletes) |- (not available) |+ (plan command)
|Detection of current state |o (real analysis, but takes time) |+ (real analysis) |o (state file)
|Testing/static analysis |o (only via ARM) |+ (available) |+ (available)
|Human readability |+ |- |+
|Reverse engineering |- (extra ARM step + adjustment) |o (adjustment) |+ (direct via Terraformer)
|Latest features |o (no embedded fallback) |+ (native) |o (time lag, but embedded fallback)
|=======================

The major options for imperative programming languages are the Azure CLI, PowerShell (Windows), or Linux-based scripting. The Azure CLI is recommended as the preferred choice since it works on Linux- and Windows-based VMs.
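A sketch of the imperative counterpart with the Azure CLI (resource names are illustrative assumptions): the existence check, i.e. the delta logic, has to be hand-written instead of being derived automatically:

```shell
#!/usr/bin/env bash
# Imperative: the delta check (does the resource already exist?)
# must be coded explicitly.
set -euo pipefail

rg="rg-demo-dev"
location="westeurope"

if [ "$(az group exists --name "$rg")" = "false" ]; then
  az group create --name "$rg" --location "$location"
else
  echo "Resource group $rg already exists, skipping creation"
fi
```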

The created resources should follow a *uniform naming schema*. This requires the naming to be factored out into a centralized module. Terraform supports factoring out common code into modules. However, the backend must already exist and should also follow the naming convention. The recommendation is therefore to expose the common Terraform module via an additional path that does not require a backend, in order to determine the names of the Azure resources representing the backend.
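A minimal sketch of such a naming module (module layout, variable and output names are assumptions for illustration): because the module only computes names and creates no resources, it can be consumed via a second path without any backend, including for naming the backend resources themselves:

```hcl
# modules/naming/main.tf (illustrative)
variable "project" { type = string }
variable "stage"   { type = string }

# Pure name calculation - no resources, no backend required.
output "resource_group_name" {
  value = "rg-${var.project}-${var.stage}"
}

output "storage_account_name" {
  value = "st${var.project}${var.stage}tf"
}
```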
[.internal]
provisioning_platforms_azure_dec_opt


Internal snippets that are used in solutions currently must be prefixed with solution_, for technical reasons in a synchronization action.


==== Provisioning
===== Organizational Mapping
2 changes: 1 addition & 1 deletion solutions/microservices_azure_aks/index.asciidoc
@@ -54,7 +54,7 @@ The picture below summarizes some of the services mentioned above:
image::aks_overview.png[AKS Overview, width=794, height=568]

[.internal]
solution_microservices_azure_aks_infra_detailed_native_setup
microservices_azure_aks_infra_detailed_native_setup


Internal snippets that are used in solutions currently must be prefixed with solution_, for technical reasons in a synchronization action.


=== Application
==== Overview
80 changes: 5 additions & 75 deletions solutions/provisioning_azure_azuredevops/index.asciidoc
@@ -101,13 +101,8 @@ Adding teams instead of projects is recommended over projects due to https://doc
* Tracking and auditing: It's easier to link work items and other objects for tracking and auditing purposes
* Maintainability: You minimize the maintenance of security groups and process updates.

The table below lists typical configurations along with their characteristics:
[options="header"]
|=======================
|Criteria |1 project, N teams |1 org, N projects/teams |N orgs
|General guidance |Smaller or larger organizations with highly aligned teams |Good when different efforts require different processes |Legacy migration
|Process |Aligned processes across teams; team flexibility to customize boards, dashboards, and so on |Different processes per project, e.g. different work item types or custom fields |Same as with many projects
|=======================
[.internal]
provisioning_azure_devops_struct


Internal snippets that are used in solutions currently must be prefixed with solution_, for technical reasons in a synchronization action.


==== Remaining goals (Automation Code)

@@ -159,75 +154,10 @@ resources:
trigger: true # Run app-ci pipeline when any run of security-lib-ci completes
```

Implicit chaining for *orchestration* is possible by using trigger conditions. Calling pipelines explicitly is so far only possible with scripting. The code snippet below shows an example:
```Powershell
#
# Make call to schedule a pipeline run
#
# $headers/$headersJson are prepared by the caller and contain the
# authentication header for the Azure DevOps REST API.

# Request body
$body = @{
    stagesToSkip = @()
    resources = @{
        self = @{
            refName = $branch_name
        }
    }
    templateParameters = $params
    variables = @{}
}
# -Depth is needed because the default serialization depth (2) would
# flatten the nested resources/self hashtables
$bodyJson = $body | ConvertTo-Json -Depth 10
# Uri extracted from the Azure DevOps UI
# $org_uri and $prj_id contain the names of the current organization/project
# $pl_id denotes the internal id of the pipeline to be started
$uri = "${org_uri}${prj_id}/_apis/pipelines/${pl_id}/runs?api-version=5.1-preview.1"

# Output parameters
Write-Host("-------- Call ${pl_name} --------")
Write-Host("Headers: ${headersJson}")
Write-Host("Json body: ${bodyJson}")
Write-Host("Uri: ${uri}")

try
{
    # Trigger pipeline
    $result = Invoke-RestMethod -Method POST -Headers $headers -Uri $uri -Body $bodyJson -ContentType "application/json"
    Write-Host("Result: ${result}")

    # Wait until the run has completed
    $buildid = $result.id
    $start_time = (Get-Date).ToString('T')
    Write-Host("------------ Loop until ${pl_name} completed --------")
    Write-Host("Started run ${buildid} at ${start_time}")

    # Uri for polling the run state
    $uri = "${org_uri}${prj_id}/_apis/pipelines/${pl_id}/runs/${buildid}?api-version=5.1-preview.1"

    Do {
        Start-Sleep -Seconds 60
        $current_time = (Get-Date).ToString('T')

        # Retrieve current state
        $result = Invoke-RestMethod -Method GET -Headers $headers -Uri $uri
        $status = $result.state
        Write-Host("Received state ${status} at ${current_time}...")
    } Until ($status -eq "completed")

    # Return the result of the completed run
    $pl_run_result = $result.result
    Write-Host("Result: ${pl_run_result}")
    return $pl_run_result
}
catch {
    $excMsg = $_.Exception.Message
    Write-Host("Exception text: ${excMsg}")
    return "Failed"
}
```
Orchestration must take dependencies into account. They might result from the deployed code or from the scope of the pipeline (the scope is e.g. a single microservice, and the code includes the libraries it needs).
Orchestrated pipelines must pass data between them. The recommended method is to use a key vault.
Implicit chaining for *orchestration* is possible by using trigger conditions. Calling pipelines explicitly is so far only possible with scripting.
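One possible way to realize the key vault hand-over (vault, variable group, and secret names are assumptions for illustration) is a Key Vault backed variable group: the producing pipeline writes a secret, and the consuming pipeline reads it through the linked group:

```yaml
# Producer pipeline writes the value, e.g. via:
#   az keyvault secret set --vault-name kv-demo --name deployOutput --value "$OUTPUT"

# Consumer pipeline (azure-pipelines.yml) reads it back:
variables:
- group: kv-demo-secrets        # variable group linked to the key vault

steps:
- script: echo "Consuming handed-over value"
  env:
    DEPLOY_OUTPUT: $(deployOutput)   # secrets must be mapped into env explicitly
```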

*Recreation of resources in short intervals* might cause pipelines to fail. Even if resources are deleted, they might still exist in the background (even though soft delete is not applicable). Programming languages can therefore get confused if pipelines recreate things in short intervals. Creating a new resource group can solve the problem, since the group is part of the technical resource id.
[.internal]
provisioning_azure_devops_orch


Internal snippets that are used in solutions currently must be prefixed with solution_, for technical reasons in a synchronization action.


As part of the *configuration*, Azure DevOps provides various settings used for development, such as enforcing pull requests instead of direct pushes to the repository.
The major configuration mechanisms in YAML are variables, parameters, and variable groups. Variable groups bundle multiple settings as key-value pairs. Parameters are not possible within a variables section (dynamic inclusion of variable groups is possible via file switching). If parameters are declared on top level, they have to be passed when the pipeline is called programmatically or manually by the user.
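A short sketch illustrating the three mechanisms side by side (pipeline and group names are assumptions); note that the parameter lives outside the variables section and must be supplied when the pipeline is started manually or programmatically:

```yaml
# azure-pipelines.yml (illustrative)
parameters:                   # top-level parameter: passed on manual/programmatic start
- name: stage
  type: string
  default: dev

variables:
- group: common-settings      # variable group: bundled key/value pairs
- name: location              # plain variable
  value: westeurope

steps:
- script: echo "Deploying stage ${{ parameters.stage }} to $(location)"
```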
4 changes: 4 additions & 0 deletions solutions/streamproc_azure_kafka/index.asciidoc
@@ -8,6 +8,10 @@ toc::[]
:idprefix:
:idseparator: -

include::../../includes/streamproc_problem/index.asciidoc[]


Please do not use includes as the first content in solutions, because the solution is then named after the header in the include. That can lead to multiple solutions with the same name when new solutions use the include in the same way.

Either change the structure of the document and the include, or add a first-level header above, e.g.:

= Apache Kafka on Microsoft Azure

include::../../includes/streamproc_problem/index.asciidoc[]
...


include::../../includes/streamproc_platforms/index.asciidoc[]

== Apache Kafka on Microsoft Azure

=== Options for running Apache Kafka on Microsoft Azure