
Commit 8c99ef6

Separate environments in two: one for gh actions, one for ado
2 parents b8d69fb + 5ba2a76 commit 8c99ef6

File tree

3 files changed: +31 −14 lines

README.md (+17 −1)

@@ -2,11 +2,27 @@
 
 This repository contains multiple examples of implementing CI/CD pipelines to deploy Databricks resources using Terraform.
 
+## General workflow
+
+The general workflow for the examples looks as follows:
+
+![Workflow](images/terraform-databricks-pipeline.png)
+
+* Changes to code are made in a separate Git branch & when changes are ready, a pull request is opened.
+* Upon opening of the pull request, the build pipeline is triggered, and the following operations are performed:
+  * Initializes Terraform using a remote backend to store a [Terraform state](https://www.terraform.io/language/state).
+  * Performs a check of the Terraform code for formatting consistency.
+  * Performs a check of the Terraform code using [terraform validate](https://www.terraform.io/cli/commands/validate).
+  * Executes `terraform plan` to get the list of changes that will be made during deployment.
+* If the build pipeline executes without errors, the results of `terraform plan` and the code can be reviewed by a reviewer and merged into the `main` branch.
+* When code is merged into the `main` branch, the release pipeline is triggered, and after a manual approval, changes are applied to the deployment using the `terraform apply` command.
+
+
 ## Repository organization & implemented solutions
 
 Code in the repository is organized into following folders:
 
 * `modules` - implementation of specific Terraform modules:
-  * [databricks-department-clusters](modules/databricks-department-clusters/) - Terraform module that creates Databricks resources for a some team.
+  * [databricks-department-clusters](modules/databricks-department-clusters/) - Terraform module that creates Databricks resources for a team.
 * `environments` - specific instances that use Terraform modules, providing CI/CD capabilities for deployment. Refer to `README.md` files inside specific folder:
   * [databricks-department-clusters-pat](environments/databricks-department-clusters-pat) - implementation of `databricks-department-clusters` module using authentication with Databricks personal access token (PAT).
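The build-pipeline steps described in the README addition above could be sketched as a GitHub Actions job. This is an illustrative sketch only; the workflow name, trigger paths, and working directory are assumptions, not the repository's actual `github-actions.yml`:

```yaml
# Hypothetical sketch of the build pipeline from the README above.
# File name, paths, and action versions are assumptions.
name: terraform-build

on:
  pull_request:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: environments/databricks-department-clusters-pat
    steps:
      - uses: actions/checkout@v3
      - uses: hashicorp/setup-terraform@v2
      # Initialize Terraform with the remote backend holding the state
      - run: terraform init -input=false
      # Check formatting consistency, then validate the configuration
      - run: terraform fmt -check -recursive
      - run: terraform validate
      # Show the changes that would be made during deployment
      - run: terraform plan -input=false -no-color
```

The `terraform apply` step would live in a separate release workflow gated by a manual approval, mirroring the split between build and release pipelines described above.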

environments/manual-approve-with-azure-devops/README.md (+13 −13)

@@ -8,13 +8,13 @@ The general workflow looks as following:
 
 ![Workflow](../../images/terraform-databricks-pipeline-azure-devops.png)
 
-* Changes to code in this directory or in the module are made in a separate Git branch & when changes are ready, a pull request is opened
+* Changes to the code in this directory or in the module are made in a separate Git branch & when changes are ready, a pull request is opened.
 * Upon opening of the pull request, the build pipeline is triggered, and following operations are performed:
   * Initializes Terraform using a remote backend to store a [Terraform state](https://www.terraform.io/language/state).
-  * Perform check of the Terraform code for formatting consistence.
+  * Performs a check of the Terraform code for formatting consistency.
   * Performs check of the Terraform code using [terraform validate](https://www.terraform.io/cli/commands/validate).
-  * Executes `terraform plan` to get the list changes that will be made.
-* If build pipeline executed without errors, results of `terraform plan` and code could be reviewed by reviewer, and merged into the `main` branch.
+  * Executes `terraform plan` to get the list of changes that will be made during deployment.
+* If the build pipeline executes without errors, the results of `terraform plan` and the code can be reviewed by a reviewer and merged into the `main` branch.
 * When code is merged into the `main` branch, the release pipeline is triggered, and after a manual approval, changes are applied to the deployment using the `terraform apply` command.
 
 As result of the pipeline execution, following resources will be created:

@@ -39,26 +39,26 @@ You can customize this project by modifying the `terraform.tfvars` file that def
 
 As described above, we need two pipelines:
 
-* build pipeline is responsible for validation of changes in pull request.
-* release pipeline is responsible for deploying the changes.
+* The build pipeline is responsible for validation of changes in pull requests.
+* The release pipeline is responsible for deploying the changes.
 
 Both pipelines are using the [Azure Pipelines Terraform Tasks](https://marketplace.visualstudio.com/items?itemName=charleszipp.azure-pipelines-tasks-terraform) that is available on the Visual Studio marketplace. Just add this extension to your Azure DevOps project
 
 We also need to define auxiliary objects in the Azure DevOps project that will be used by the both pipelines:
 
 * Azure Data Lake Storage (ADLS) account and container that will be used to store Terraform state.
 * Service connection for Github that will be used to detect the changes in the repository (not necessary if you use Azure DevOps Repos).
-* [Service connection for Azure Resource Manager](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#azure-resource-manager-service-connection) that will be used to access data of Terraform state in Azure Data Lake Storage (ADLS) container via [azure remote backend](https://www.terraform.io/language/settings/backends/azurerm). The configured identity need to have write access to the configured ADLS container.
+* [Service connection for Azure Resource Manager](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#azure-resource-manager-service-connection) that will be used to access the Terraform state in the Azure Data Lake Storage (ADLS) container via the [azurerm remote backend](https://www.terraform.io/language/settings/backends/azurerm). The configured identity needs to have write access to the configured ADLS container.
 * [Azure DevOps variable group](https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups) will store all variables used by the both pipelines.
 
 ### Configuring the variable group
 
-We need to configure a variable group with name `TerraformProdDeploy`. It should contain following variables:
+We need to configure a variable group with the name `TerraformProdDeploy`. It should contain the following variables:
 
 * `BACKEND_RG_NAME` - name of resource group containing storage account.
 * `BACKEND_SA_NAME` - name of the storage account.
 * `BACKEND_CONTAINER_NAME` - name of the container inside the storage account.
-* `BACKEND_KEY` - name of the blob (file) object that will be used to store Terraform state of our deployment.
+* `BACKEND_KEY` - name of the blob (file) object that will be used to store the Terraform state of our deployment.
 * `SERVICE_CONNECTION_NAME` - name of the Azure DevOps service connection for Azure Resource Manager that was defined earlier.
 * `DATABRICKS_HOST` - URL of the Databricks workspace where resources will be deployed.
 * `DATABRICKS_TOKEN` - personal access token for the Databricks workspace (follow [documentation](https://docs.databricks.com/dev-tools/api/latest/authentication.html) for instructions on how to obtain it). Please mark this variable as **secret** to avoid exposing its value.
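In the YAML build pipeline, a variable group like this is made available by referencing it by name. This fragment is a sketch of the relevant section only, not the repository's full `azure-pipelines.yml`:

```yaml
# Sketch: referencing the TerraformProdDeploy variable group
# from an Azure Pipelines YAML definition.
variables:
  - group: TerraformProdDeploy

steps:
  # Variables from the group are expanded with $(NAME) macro syntax;
  # secret variables such as DATABRICKS_TOKEN must be mapped into the
  # environment explicitly, as they are not exposed to scripts by default.
  - script: echo "Using storage account $(BACKEND_SA_NAME)"
    env:
      DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
```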
@@ -97,7 +97,7 @@ At the end your release pipeline should look as following (don't forget to press
 
 Release artifact is configured as following:
 
-* Click on "Add an artifact" button, then select your Git implementation (GitHub, Azure DevOps, ...), and select repository with the code.
+* Click on the "Add an artifact" button, then select your Git implementation (GitHub, Azure DevOps, ...), and select the repository with the code.
 * Select the default branch - set it to `main`
 * Set the "Default version" field to value "Latest from the default branch"
 * Set the "Source alias" field to something easy to remember - we'll use that value in the stages. For example, `terraform-databricks-pipeline`

@@ -121,11 +121,11 @@ echo "host = $DATABRICKS_HOST" >> ~/.databrickscfg
 echo "token = $DATABRICKS_TOKEN" >> ~/.databrickscfg
 ```
 
-We also need to define two environment variables that will link script with our variable group. First one is the `DATABRICKS_TOKEN` with value `$(DATABRICKS_TOKEN)`, and second - `DATABRICKS_HOST` with value `$(DATABRICKS_HOST)`
+We also need to define two environment variables that will link the script with our variable group. The first one is `DATABRICKS_TOKEN` with the value `$(DATABRICKS_TOKEN)`, and the second is `DATABRICKS_HOST` with the value `$(DATABRICKS_HOST)`.
 
-2. Task that will install Terraform - search for task with name "Terraform installer" and after adding it, define version of Terraform to use.
+2. Task that will install Terraform - search for a task with the name "Terraform installer" and, after adding it, define the version of Terraform to use.
 
-3. Task to perform initialization of Terraform using the state in remote backend. Search for "Terraform CLI" task, select the `init` command, and configure it as following:
+3. Task to perform initialization of Terraform using the state in the remote backend. Search for the "Terraform CLI" task, select the `init` command, and configure it as following:
 
 * Put `$(System.DefaultWorkingDirectory)/terraform-databricks-pipeline/environments/databricks-department-clusters-pat` into the "Configuration Directory" field (`terraform-databricks-pipeline` is the value of the "Source alias" that we defined in the Artifact).
 * Put `-input=false -no-color` into the `Command Options` field
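What the "Terraform CLI" init task does can be approximated by a plain script step; this is a sketch, not the extension's actual task definition. The `-backend-config` keys are standard settings of Terraform's azurerm backend, and the variable names come from the `TerraformProdDeploy` variable group described above:

```yaml
# Sketch: terraform init against the azurerm remote backend,
# expressed as a plain Azure Pipelines script step.
steps:
  - script: |
      terraform init -input=false -no-color \
        -backend-config="resource_group_name=$(BACKEND_RG_NAME)" \
        -backend-config="storage_account_name=$(BACKEND_SA_NAME)" \
        -backend-config="container_name=$(BACKEND_CONTAINER_NAME)" \
        -backend-config="key=$(BACKEND_KEY)"
    workingDirectory: $(System.DefaultWorkingDirectory)/terraform-databricks-pipeline/environments/databricks-department-clusters-pat
    displayName: terraform init (azurerm backend)
```

The marketplace task wraps the same command, supplying the backend settings through its own input fields instead of command-line flags.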

environments/manual-approve-with-azure-devops/azure-pipelines.yml (+1)

@@ -19,6 +19,7 @@ pr:
     - modules/databricks-department-clusters/
   exclude:
     - environments/databricks-department-clusters-pat/README.md
+    - environments/databricks-department-clusters-pat/github-actions.yml
     - modules/databricks-department-clusters/README.md
 
 pool:
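After this change, the `pr` trigger of `azure-pipelines.yml` should look roughly as follows. Only the `exclude` list and the one `include` path are visible in the diff context; the `branches` section and the first `include` path are reconstructed assumptions:

```yaml
# Reconstructed sketch of the pr trigger after this commit: the ADO
# pipeline now ignores the GitHub Actions workflow file, so edits to
# it no longer trigger the Azure DevOps build.
pr:
  branches:
    include:
      - main          # assumption: not shown in the diff
  paths:
    include:
      - environments/databricks-department-clusters-pat/   # assumption
      - modules/databricks-department-clusters/
    exclude:
      - environments/databricks-department-clusters-pat/README.md
      - environments/databricks-department-clusters-pat/github-actions.yml
      - modules/databricks-department-clusters/README.md
```

This separation is the point of the commit: documentation edits and the GitHub Actions workflow no longer trigger the Azure DevOps pipeline, keeping the two CI systems independent.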
