Big Data | 29 Dec 2021 | 8 min

Deploy an existing Azure Data Factory branch on a new ADF environment via Git Configuration

Imagine a scenario wherein you are making drastic changes to your existing data factory, say by changing naming conventions or handing over your development resources to a new customer. There are several instances where you may need to create a new data factory to develop in, or at least revise the source of your existing data factory and bring in a new repo.

All of this, and then some, can be done with the help of Azure Data Factory’s Git configuration. With it, you can readily manage the contents of your existing data factory with no hassle at all.

In this blog, I will show you how you can deploy an existing data factory repository branch on a new environment with the help of this Git configuration.

But before I begin, I want to shed light on a prerequisite that will help you with this migration: you must create environment-specific resources (such as global parameters, private endpoints, and integration runtimes) on the new Data Factory environment, so that when we point it to our branch, the resources that were created manually are published to the branch automatically.

An added benefit of this is that it resolves the error “Pipeline is not found, please publish first”.
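If you prefer to script this prerequisite instead of creating the resources in the portal, here is a minimal sketch using the azure-mgmt-datafactory Python SDK to pre-create one such resource, an Azure integration runtime, on the new data factory. The subscription, resource group, factory, and runtime names are placeholders, so adjust them to your environment.

    # Minimal sketch; assumes azure-identity and azure-mgmt-datafactory are installed
    # and the caller has contributor rights on the target data factory.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        IntegrationRuntimeResource,
        ManagedIntegrationRuntime,
    )

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Pre-create a managed (Azure) integration runtime on the new data factory,
    # so it already exists before the factory is pointed at the "release" branch.
    client.integration_runtimes.create_or_update(
        resource_group_name="rg-adf-prod",            # placeholder
        factory_name="adf-prod-new",                  # placeholder
        integration_runtime_name="ir-azure-managed",  # placeholder
        integration_runtime=IntegrationRuntimeResource(
            properties=ManagedIntegrationRuntime(
                description="Created manually on the new environment"
            )
        ),
    )

Private endpoints and global parameters can be pre-created in the same spirit, either through the portal or through your infrastructure-as-code templates.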

Now, without further ado, here’s how you do it.

  1. Consider the branch "release" that is to be deployed on Azure Data Factory through Git configuration. This branch will not have the "factory" and "managedVirtualNetwork" folders, as the corresponding resources are created manually on the new environment and the folders will be added to this branch automatically once it is pointed to the new Data Factory environment.

  2. Go to the Data Factory environment and open Git configuration (Manage -> Git configuration).

  3. Click on Configure and follow the steps as shown below.

  4. Set the "release" branch as the collaboration branch and tick the "Import existing resources to repository" option as given below. Then click on Apply.

  5. After clicking on Apply, the existing resources (such as global parameters, private endpoints, and integration runtimes) that we had created manually on the Data Factory before connecting it to the "release" branch will be added/published to the "release" branch.

This happens because we checked the "Import existing resources to repository" option during Git configuration.

So, we will get two new folders ("factory" and "managedVirtualNetwork") in the release branch, as shown below:
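Incidentally, the Git configuration in steps 2 to 4 can also be applied programmatically. Below is a minimal sketch using the azure-mgmt-datafactory Python SDK; note that the "Import existing resources to repository" behaviour belongs to Data Factory Studio, so this call only points the factory at the collaboration branch. The organization, project, repository, resource group, and factory names are placeholders.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import FactoryRepoUpdate, FactoryVSTSConfiguration

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    repo_update = FactoryRepoUpdate(
        factory_resource_id=(
            "/subscriptions/<subscription-id>/resourceGroups/rg-adf-prod"
            "/providers/Microsoft.DataFactory/factories/adf-prod-new"  # placeholders
        ),
        repo_configuration=FactoryVSTSConfiguration(
            account_name="my-devops-org",    # Azure DevOps organization (placeholder)
            project_name="my-project",       # placeholder
            repository_name="adf-repo",      # placeholder
            collaboration_branch="release",  # the branch we want to deploy
            root_folder="/",
        ),
    )

    # The first argument is the data factory's Azure region, e.g. "eastus".
    client.factories.configure_factory_repo("eastus", repo_update)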

  6. Now go to Azure DevOps and create a new branch from the "release" branch with any name. This new branch is for temporary use.

In our case, we named it "ProdRepo".
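For reference, the same temporary branch can also be created from a local clone of the repository with plain git; here is a small sketch (Python's subprocess is used only to keep the examples in one language, and the local clone path is a placeholder).

    import subprocess

    repo_dir = "adf-repo"  # local clone of the Azure DevOps repository (placeholder)

    for cmd in (
        ["git", "fetch", "origin"],                               # refresh remote branches
        ["git", "checkout", "-b", "ProdRepo", "origin/release"],  # branch off "release"
        ["git", "push", "-u", "origin", "ProdRepo"],              # publish the temporary branch
    ):
        subprocess.run(cmd, cwd=repo_dir, check=True)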

  7. The "ProdRepo" branch will be a replica of the "release" branch as shown below:

  8. Now go to Data Factory and disconnect the existing "release" branch from the Data Factory.

  9. Follow the same steps (2 to 4) as mentioned above, except that this time you need to leave the "Import existing resources to repository" option unchecked and set the collaboration branch to "ProdRepo".

  10. After clicking on Apply, your Data Factory will point to the new temporary branch, i.e., "ProdRepo".

Then click on Publish to deploy the branch on the Data Factory.

a) As we know, the "adf_publish" branch is the default publish branch in ADF, and here we have the "ProdRepo" branch as the collaboration branch.

b) Publishing deploys everything from the collaboration branch "ProdRepo" to the adf_publish branch, so that the Data Factory can run independently even without pointing to any collaboration branch.

c) After publishing, even if we disconnect the collaboration branch (through the Git configuration of the Data Factory), we can still see all the pipelines, datasets, linked services, and other resources in ADF.

d) This is because everything has now been published to the adf_publish branch and no longer needs the collaboration branch, i.e., ProdRepo.

e) Publishing everything also helps you resolve the error "Pipeline is not found, please publish first".
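If you want to double-check point (c) programmatically, listing the resources through the azure-mgmt-datafactory SDK should still return everything that was published, even after the collaboration branch has been disconnected. A minimal sketch, assuming the same placeholder resource group and factory names as above:

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
    rg, factory = "rg-adf-prod", "adf-prod-new"  # placeholders

    # These listings read from the published (live) factory, not from Git.
    print([p.name for p in client.pipelines.list_by_factory(rg, factory)])
    print([d.name for d in client.datasets.list_by_factory(rg, factory)])
    print([ls.name for ls in client.linked_services.list_by_factory(rg, factory)])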

  11. After publishing successfully, we can disconnect the "ProdRepo" collaboration branch (as shown in step 8) and point the Data Factory back to the "release" collaboration branch (as shown in steps 2, 3, and 4), this time leaving the "Import existing resources to repository" option unchecked.

And there you have it! You are now equipped to move your existing data factory repositories to new environments using Git configuration.

Write to us at Nitor Infotech if you want to learn more about how you can manage copious amounts of data with our data engineering services.


Robin Pandita

Senior Software Engineer

Robin is an experienced module lead and BI developer with more than 5 years of experience in the information technology and services industry. He is skilled in ETL tools like Azure Data Factory and Pentaho DI and business reporting tools like Power BI, and is familiar with the basics of Tableau as well as database tools like SQL Server and Oracle SQL. Additionally, he has a good understanding of data warehouse and data modelling concepts. In his spare time, he loves to sing and play guitar. He abides by his self-made thought: "When your ability weighs more than probability, there lies no chance of impossibility."
