Big Data | 03 Dec 2021 | 10 min

Configure Azure Key Vault & Storage Account URLs in ADF

Did you know that it is possible to pass an environment-specific Azure Key Vault URL value and Storage Account base URL dynamically using a single global parameter in Azure Data Factory?

Allow me to explain. The problem was that hardcoding a static Key Vault URL and a Storage Account URL in ADF pipelines ties every deployment to a single environment. To make the ADF pipeline deployment environment independent, there had to be a way to reference these URLs dynamically rather than in a hardcoded, static fashion.

In my blog today, I am going to take you through the process of configuring a Key Vault URL and a Storage Account Base URL dynamically using a single global parameter in Azure Data Factory.
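
The core idea is that a single global parameter (env_name in this blog) carries the environment name, and every environment-specific URL is assembled from it at runtime with a dynamic-content expression along these lines. The '<env>-kv' vault naming convention below is purely an assumption for illustration; yours may differ:

    @concat('https://', pipeline().globalParameters.env_name, '-kv.vault.azure.net/')

With env_name set to 'dev', this resolves to https://dev-kv.vault.azure.net/; point it at 'test' or 'prod' and the same pipeline targets the matching vault.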

How to pass an environment-specific Azure Key Vault URL value and Storage Account base URL dynamically using a single global parameter in Azure Data Factory:

The steps are as follows:

  1. Create a new global parameter by clicking on +New from Data Factory -> Manage -> Global parameters.
  2. Add the name, data type and value for the newly added global parameter and click on the Save button.
  3. Then create a new Azure Key Vault linked service by clicking +New, or edit an existing linked service.
  4. Search for ‘Key Vault’ and click on the Continue button.
  5. Create the linked service with a suitable name and select the Azure Key Vault selection method as ‘Enter manually’.
  6. As you have to pass a dynamic URL in Step 5, scroll down and click on +New to create a new parameter (LS_KV, in our case).
  7. Then click on the Base URL field, which will enable the ‘Add dynamic content’ option. Click on it to link the parameter (created in Step 6) with the Base URL.
  8. Select the LS_KV parameter in the parameters list to add it to the dynamic content space and click on the OK button.
  9. Then click on the Create button to create the linked service.
  10. Now create a dataset with its own linked service by clicking the +New option as shown below.
  11. Now name the linked service and select the existing Azure Key Vault service ‘AzureKVLS’ (as created in Steps 1 to 9) from the AKV Linked Service dropdown as shown below.
  12. Then scroll down and create a new parameter with the name ‘LS_KV_URL’ on the linked service LS_Test.
  13. Now map the newly created parameter, i.e., LS_KV_URL, to LS_KV by clicking Add dynamic content under the ‘Value’ option for LS_KV. Then click on the Create button to create the linked service.
  14. Now go back to the dataset configuration properties and click on the Parameters tab. After that, add a parameter on the dataset with an appropriate name, i.e., DS_VaultURL, as shown below.
  15. Now click on the Connection tab of the dataset configuration. Map the parameter ‘DS_VaultURL’ of the dataset with the ‘LS_KV_URL’ parameter of the linked service by clicking Add dynamic content under the ‘Value’ option of the Linked service properties as shown below.
  16. Now in the pipeline, when you use the same dataset (DS_Test), you need to map the Vault URL to the dataset parameter (DS_VaultURL).
  17. To do this, click in the box under the ‘Value’ option of the Dataset properties as shown above and click on Add dynamic content to create the dynamic Azure Key Vault URL using the global parameter value. Click on the OK button to save the content.

  18. Now your pipeline is ready to run. You can set ‘env_name’ to ‘dev’, ‘test’ or ‘prod’ as per the data factory environment to use the Azure Key Vault URL dynamically in the linked services. A rough sketch of how these pieces fit together follows this list.
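
To make the wiring above more concrete, here is a simplified sketch of what the parameterized Azure Key Vault linked service from Steps 3 to 9 boils down to in JSON. It is illustrative only; the exact JSON your factory generates will contain additional properties:

    {
        "name": "AzureKVLS",
        "properties": {
            "type": "AzureKeyVault",
            "parameters": {
                "LS_KV": { "type": "String" }
            },
            "typeProperties": {
                "baseUrl": "@{linkedService().LS_KV}"
            }
        }
    }

At pipeline level (Steps 16 to 18), the dataset parameter DS_VaultURL is then given a dynamic value built from the global parameter, for example (again assuming the illustrative '<env>-kv' vault naming convention):

    @concat('https://', pipeline().globalParameters.env_name, '-kv.vault.azure.net/')

That value flows from DS_VaultURL to LS_KV_URL to LS_KV, which finally lands in the baseUrl of the Key Vault linked service.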

Similarly, we can also make the Storage Account URL dynamic for the datasets of a data lake. The steps are as follows:

  1. Edit the existing data lake dataset (in our case it is DS_Client) and edit the linked service (LS_Client) used in it.
  2. Now create a new parameter named ‘StorageAccountURL’ by clicking the +New button.
  3. Now, to make the URL of the linked service (LS_Client) dynamic, map the newly created parameter StorageAccountURL in the URL textbox under the Account selection method property by clicking on the Add dynamic content option, and then click the Save button.
  4. Now go back to the dataset configuration properties and click on the Parameters tab to create the parameter ‘DS_StorageAcct_Param’, which will be linked with the linked service parameter ‘StorageAccountURL’ (as created in Step 2).
  5. Now click on the Connection tab of the dataset and map the parameter ‘DS_StorageAcct_Param’ with the parameter StorageAccountURL by clicking on the Add dynamic content option under the Value option of the Linked service properties.
  6. Now go back to the pipeline and select the activity where we have used the dataset ‘DS_Client’. Then click on the Add dynamic content option for the parameter ‘DS_StorageAcct_Param’, where we will specify the value in the next step.
  7. Generate the Storage Account URL string dynamically using the global parameter value and click on the OK button to save the changes.
  8. Now, the pipeline is ready to run with a dynamic Storage Account URL to connect the pipeline with the data lake dataset. A sketch of the equivalent JSON and expression follows this list.
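
As with the Key Vault example, here is a simplified sketch of the parameterized data lake linked service and the pipeline-level expression, assuming an ADLS Gen2 account whose name follows a 'mydatalake<env>' convention. Both the account naming convention and the authentication details (omitted here) are assumptions for illustration:

    {
        "name": "LS_Client",
        "properties": {
            "type": "AzureBlobFS",
            "parameters": {
                "StorageAccountURL": { "type": "String" }
            },
            "typeProperties": {
                "url": "@{linkedService().StorageAccountURL}"
            }
        }
    }

And the dynamic value supplied to DS_StorageAcct_Param in the pipeline:

    @concat('https://mydatalake', pipeline().globalParameters.env_name, '.dfs.core.windows.net/')

With env_name set to ‘dev’ this resolves to https://mydatalakedev.dfs.core.windows.net/, so the same pipeline definition points at the dev, test or prod storage account depending on the environment's global parameter value.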

Now that you are familiar with the process, I hope you will find it easy to configure a Key Vault URL and a Storage Account URL dynamically using a single global parameter in Azure Data Factory. Reach out to us at Nitor Infotech to learn more about how big data can help you extract high-value insights and heighten customer satisfaction for your business.


Madhavi Pawar

Senior Software Engineer

Madhavi Pawar is an MSBI Developer responsible for the implementation, configuration, and maintenance of SSIS and SSRS, and for performance tuning of SQL Server data manipulations. She is Microsoft-certified and has also completed the AZ-900, DP-900 and PL-900 certifications. She is experienced in MSBI and Azure Data Factory and is familiar with Power BI. She loves cooking and reading stories in her free time.

   
