* Changes to the code are made in a separate Git branch, and when the changes are ready, a pull request is opened.
* Upon opening of the pull request, the build pipeline is triggered, and the following operations are performed (the remote backend setup is sketched after this list):
  * Initializes Terraform using a remote backend to store the [Terraform state](https://www.terraform.io/language/state).
  * Checks the Terraform code for formatting consistency.
  * Checks the Terraform code using [terraform validate](https://www.terraform.io/cli/commands/validate).
  * Executes `terraform plan` to get the list of changes that will be made during deployment.
* If the build pipeline executes without errors, the results of `terraform plan` and the code can be reviewed by a reviewer and merged into the `main` branch.
* When the code is merged into the `main` branch, the release pipeline is triggered, and after a manual approval, the changes are applied to the deployment using the `terraform apply` command.
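For the remote backend step to work, the environment's Terraform code declares an empty `azurerm` backend block, and the pipeline supplies the actual settings at `terraform init` time. A minimal sketch of this partial-configuration pattern:

```hcl
terraform {
  # Partial configuration: resource group, storage account, container,
  # and state key are passed by the pipeline at `terraform init` time
  # via -backend-config options instead of being hard-coded here.
  backend "azurerm" {}
}
```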
The code in the repository is organized into the following folders:

* `modules` - implementation of specific Terraform modules:
  * [databricks-department-clusters](modules/databricks-department-clusters/) - Terraform module that creates Databricks resources for a team (a usage sketch follows this list).
* `environments` - specific instances that use the Terraform modules, providing CI/CD capabilities for deployment. Refer to the `README.md` files inside the specific folders:
  * [databricks-department-clusters-pat](environments/databricks-department-clusters-pat) - implementation of the `databricks-department-clusters` module using authentication with a Databricks personal access token (PAT).
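An environment consumes a module by pointing `source` at its folder. A minimal sketch follows; the input names below are hypothetical, and the module's real interface is described in its own `README.md`:

```hcl
# Sketch of how an environment could instantiate the module.
# `department_name` and `clusters` are hypothetical inputs -
# consult the module's variables.tf / README.md for the real ones.
module "department_clusters" {
  source = "../../modules/databricks-department-clusters"

  department_name = "data-engineering" # hypothetical
  clusters = {                         # hypothetical
    node_type_id = "Standard_DS3_v2"
    num_workers  = 2
  }
}
```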
As a result of the pipeline execution, the following resources will be created:

You can customize this project by modifying the `terraform.tfvars` file.
As described above, we need two pipelines:
* The build pipeline is responsible for validation of changes in pull requests.
* The release pipeline is responsible for deploying the changes.
Both pipelines use the [Azure Pipelines Terraform Tasks](https://marketplace.visualstudio.com/items?itemName=charleszipp.azure-pipelines-tasks-terraform) extension that is available on the Visual Studio Marketplace. Just add this extension to your Azure DevOps project.
We also need to define auxiliary objects in the Azure DevOps project that will be used by both pipelines:

* An Azure Data Lake Storage (ADLS) account and container that will be used to store the Terraform state (a provisioning sketch follows this list).
* A service connection for GitHub that will be used to detect changes in the repository (not necessary if you use Azure DevOps Repos).
* A [service connection for Azure Resource Manager](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#azure-resource-manager-service-connection) that will be used to access the Terraform state in the Azure Data Lake Storage (ADLS) container via the [azurerm remote backend](https://www.terraform.io/language/settings/backends/azurerm). The configured identity needs to have write access to the configured ADLS container.
* An [Azure DevOps variable group](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups) that will store all variables used by both pipelines.
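If you prefer to provision the state storage itself with Terraform, a minimal sketch could look like this (all names and the location are placeholders, and the `azurerm` provider must already be configured):

```hcl
# Sketch: storage for the Terraform state. Names and location are placeholders.
resource "azurerm_resource_group" "tfstate" {
  name     = "rg-terraform-state" # placeholder
  location = "westeurope"         # placeholder
}

resource "azurerm_storage_account" "tfstate" {
  name                     = "sttfstate12345" # placeholder, must be globally unique
  resource_group_name      = azurerm_resource_group.tfstate.name
  location                 = azurerm_resource_group.tfstate.location
  account_tier             = "Standard"
  account_replication_type = "LRS"
  is_hns_enabled           = true # hierarchical namespace = ADLS Gen2
}

resource "azurerm_storage_container" "tfstate" {
  name                  = "tfstate" # placeholder
  storage_account_name  = azurerm_storage_account.tfstate.name
  container_access_type = "private"
}
```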
### Configuring the variable group
We need to configure a variable group with the name `TerraformProdDeploy`. It should contain the following variables (the `BACKEND_*` values map onto the arguments of the `azurerm` backend, as shown in the sketch after this list):
* `BACKEND_RG_NAME` - name of the resource group containing the storage account.
* `BACKEND_SA_NAME` - name of the storage account.
* `BACKEND_CONTAINER_NAME` - name of the container inside the storage account.
* `BACKEND_KEY` - name of the blob (file) object that will be used to store the Terraform state of our deployment.
* `SERVICE_CONNECTION_NAME` - name of the Azure DevOps service connection for Azure Resource Manager that was defined earlier.
* `DATABRICKS_HOST` - URL of the Databricks workspace where resources will be deployed.
* `DATABRICKS_TOKEN` - personal access token for the Databricks workspace (follow the [documentation](https://docs.databricks.com/dev-tools/api/latest/authentication.html) for instructions on how to obtain it). Please mark this variable as **secret** to avoid exposing its value.
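For reference, this is how the `BACKEND_*` variables line up with the [azurerm backend](https://www.terraform.io/language/settings/backends/azurerm) arguments (a sketch; in this project the values are supplied by the pipeline at `terraform init` time rather than hard-coded):

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "..." # BACKEND_RG_NAME
    storage_account_name = "..." # BACKEND_SA_NAME
    container_name       = "..." # BACKEND_CONTAINER_NAME
    key                  = "..." # BACKEND_KEY
  }
}
```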
At the end, your release pipeline should look as follows (don't forget to press the "Save" button).

The release artifact is configured as follows:
* Click on "Add an artifact" button, then select your Git implementation (GitHub, Azure DevOps, ...), and select repository with the code.
100
+
* Click on the "Add an artifact" button, then select your Git implementation (GitHub, Azure DevOps, ...), and select a repository with the code.
101
101
* Select the default branch - set it to `main`
102
102
* Set the "Default version" field to value "Latest from the default branch"
103
103
* Set the "Source alias" field to something easy to remember - we'll use that value in the stages. For example, `terraform-databricks-pipeline`
We also need to define two environment variables that will link the script with our variable group: the first is `DATABRICKS_TOKEN` with the value `$(DATABRICKS_TOKEN)`, and the second is `DATABRICKS_HOST` with the value `$(DATABRICKS_HOST)`. The Databricks Terraform provider can read these environment variables directly, as illustrated in the sketch after this list.
2. A task that will install Terraform - search for a task with the name "Terraform installer" and, after adding it, define the version of Terraform to use.
3. A task to perform the initialization of Terraform using the state in the remote backend. Search for the "Terraform CLI" task, select the `init` command, and configure it as follows:
   * Put `$(System.DefaultWorkingDirectory)/terraform-databricks-pipeline/environments/databricks-department-clusters-pat` into the "Configuration Directory" field (`terraform-databricks-pipeline` is the value of the "Source alias" that we defined in the artifact).
   * Put `-input=false -no-color` into the "Command Options" field.
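Because `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are exported as environment variables, the Databricks provider block in the environment's code does not need explicit credentials. A minimal sketch:

```hcl
# The Databricks provider reads DATABRICKS_HOST and DATABRICKS_TOKEN
# from the environment when `host` and `token` are not set explicitly.
provider "databricks" {}
```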