Did you know that it is possible to pass an environment-specific Azure Key Vault URL and Storage Account base URL dynamically using a single global parameter in Azure Data Factory?
Allow me to explain. In ADF pipelines, a static Key Vault URL and a static Storage Account URL are not a great idea: to make ADF pipeline deployments environment independent, there has to be a way to reference these URLs dynamically rather than hardcoding them.
In my blog today, I am going to take you through the process of configuring a Key Vault URL and a Storage Account base URL dynamically using a single global parameter in Azure Data Factory.
How to pass an environment-specific Azure Key Vault URL and Storage Account base URL dynamically using a single global parameter in Azure Data Factory:
The steps you need to take are as follows:
1. Create a new global parameter by clicking on +New from Data Factory -> Manage -> Global parameters.
2. Add the name, data type, and value for the newly added global parameter and click on the Save button. In our case, this is a String parameter named ‘env_name’ that holds the environment name (such as ‘dev’).
3. Then create a new Azure Key Vault linked service by clicking +New, or edit an existing linked service.
4. Search for ‘Key Vault’ and click on the Continue button.
5. Create the linked service with a suitable name and set the Azure Key Vault selection method to ‘Enter manually’.
6. As you have to pass a dynamic URL into the Base URL field from Step 5, scroll down and click on +New to create a new parameter (in our case, ‘LS_KV’).
7. Then click on the Base URL field, which will enable the ‘Add dynamic content’ option. Click on it to link the parameter (created in Step 6) with the Base URL.
8. Select the LS_KV parameter in the parameters list to add it to the dynamic content space and click on the OK button (see the first sketch after this list).
9. Then click on the Create button to create the linked service.
10. Now create a dataset with its own linked service by clicking the +New option.
11. Name the linked service and select the existing Azure Key Vault linked service ‘AzureKVLS’ (as created in Steps 1 to 9) from the AKV Linked Service dropdown.
12. Then scroll down and create a new parameter named ‘LS_KV_URL’ on the linked service LS_Test.
13. Now map the newly created parameter, i.e., LS_KV_URL, to LS_KV by clicking Add dynamic content under the ‘Value’ option of the LS_KV parameter. Then click on the Create button to create the linked service.
14. Now go back to the dataset configuration properties and click on the Parameters tab. After that, add a parameter on the dataset with an appropriate name, i.e., DS_VaultURL.
15. Now click on the Connection tab of the dataset configuration. Map the parameter ‘DS_VaultURL’ of the dataset to the ‘LS_KV_URL’ parameter of the linked service by clicking Add dynamic content under the ‘Value’ option of the linked service properties.
16. Now, in the pipeline, when you use the same dataset (DS_Test), you need to map the Vault URL to the dataset parameter (DS_VaultURL).
17. To do this, click in the box under the ‘Value’ option of the dataset properties and click on Add dynamic content to build the dynamic Azure Key Vault URL using the global parameter value (see the second sketch after this list). Click on the OK button to save the content.
18. Now your pipeline is ready to run. You can set ‘env_name’ to ‘dev’, ‘test’ or ‘prod’ as per the Data Factory environment to use the Azure Key Vault URL dynamically in the linked services.
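To make the wiring concrete, here is a minimal sketch of the parameterized Key Vault linked service as JSON. The names (AzureKVLS, LS_KV) come from the steps above; the rest is standard ADF linked service JSON, though the definition your Data Factory generates may differ slightly:

```json
{
    "name": "AzureKVLS",
    "properties": {
        "type": "AzureKeyVault",
        "parameters": {
            "LS_KV": {
                "type": "String"
            }
        },
        "typeProperties": {
            "baseUrl": "@{linkedService().LS_KV}"
        }
    }
}
```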
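And here is a sketch of how the value flows down from the pipeline. Inside the pipeline activity, the dataset reference sets DS_VaultURL from the global parameter; the vault naming pattern (‘kv-myproject-’ plus the environment name) is purely hypothetical, so substitute your own vault names:

```json
"dataset": {
    "referenceName": "DS_Test",
    "type": "DatasetReference",
    "parameters": {
        "DS_VaultURL": "@concat('https://kv-myproject-', pipeline().globalParameters.env_name, '.vault.azure.net/')"
    }
}
```

With ‘env_name’ set to ‘dev’, this resolves to https://kv-myproject-dev.vault.azure.net/. The dataset passes the value to LS_KV_URL on LS_Test, which in turn feeds LS_KV on AzureKVLS, so a single global parameter drives the whole chain.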
Similarly, we can also make a Storage Account URL dynamic for the datasets of a data lake. The steps are as follows:
1. Edit the existing data lake dataset (in our case, it is DS_Client) and edit the linked service (LS_Client) used in it.
2. Now create a new parameter named ‘StorageAccountURL’ by clicking the +New button.
3. Now, to make the URL of the linked service (LS_Client) dynamic, map the newly created parameter StorageAccountURL into the URL textbox under the Account selection method property by clicking on the Add dynamic content option, and then click the Save button (see the first sketch after this list).
4. Now go back to the dataset configuration properties and click on the Parameters tab to create the parameter ‘DS_StorageAcct_Param’, which will be linked to the linked service parameter ‘StorageAccountURL’ (as created in Step 2).
5. Now click on the Connection tab of the dataset and map the parameter ‘DS_StorageAcct_Param’ to the parameter StorageAccountURL by clicking on the Add dynamic content option under the ‘Value’ option of the linked service properties.
6. Now go back to the pipeline and select the activity where we have used the dataset ‘DS_Client’. Then click on the Add dynamic content option for the parameter ‘DS_StorageAcct_Param’; we will specify its value in the next step.
7. Generate the Storage Account URL string dynamically using the global parameter value (see the second sketch after this list) and click on the OK button to save the changes.
8. Now the pipeline is ready to run with a dynamic Storage Account URL to connect the pipeline with the data lake dataset.
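As before, here is a minimal sketch of the parameterized data lake linked service. I am assuming an ADLS Gen2 linked service (type AzureBlobFS); if your data lake linked service is of a different type, the parameterization pattern stays the same but the type and property names will differ:

```json
{
    "name": "LS_Client",
    "properties": {
        "type": "AzureBlobFS",
        "parameters": {
            "StorageAccountURL": {
                "type": "String"
            }
        },
        "typeProperties": {
            "url": "@{linkedService().StorageAccountURL}"
        }
    }
}
```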
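The dynamic content for ‘DS_StorageAcct_Param’ then follows the same pattern as the Key Vault URL. The storage account naming convention (‘mydatalake’ plus the environment name) is hypothetical, and the .dfs.core.windows.net suffix applies to ADLS Gen2 endpoints:

```
@concat('https://mydatalake', pipeline().globalParameters.env_name, '.dfs.core.windows.net')
```

With ‘env_name’ set to ‘test’, this resolves to https://mydatalaketest.dfs.core.windows.net, so the same pipeline definition can be promoted across environments unchanged.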
Now that you are familiar with the process, I hope you will find it easy to configure a Key Vault URL and a Storage Account URL dynamically using a single global parameter in Azure Data Factory. Reach out to us at Nitor Infotech to learn more about how big data can help you extract high-value insights and heighten customer satisfaction for your business.