Azure Data Factory - Data Processing Services - Education - Nairaland


Azure Data Factory - Data Processing Services by Ttacy341(f): 7:17am On Sep 04, 2018
This post gives you an in-depth look at Azure Data Factory and how to use it efficiently and effectively.

Microsoft Azure is Microsoft's cloud computing offering: a growing collection of cloud services that developers and IT professionals can use to build, deploy and manage applications across a global network of data centers. The platform gives you the freedom to build and deploy applications from wherever you want, using the tools available in Microsoft Azure.

Azure Data Factory
Let us understand what Azure Data Factory is and how it helps organizations and individuals accomplish their day-to-day operational tasks.

Say a gaming company stores a large amount of log information so that it can later make collective decisions based on certain parameters in those logs.

Typically, some of this information is stored in on-premises data storage and the rest is stored in the cloud.

So, to analyze the data, we need an intermediary job that consolidates all the information in one place and then analyzes it, using Hadoop in the cloud (Azure HDInsight) or SQL Server on the on-premises data store. Say this process runs once a week.
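The consolidation step described above can be sketched in plain Python. This is an illustrative sketch only: the lists stand in for the on-premises and cloud stores, and the record fields are invented for the example.

```python
# Illustrative sketch: merging log records from two stores (one
# standing in for on-premises storage, one for cloud storage) into a
# single dataset for downstream analysis. Field names are invented.

def consolidate(on_prem_logs, cloud_logs):
    """Merge log records from both stores into one dataset,
    sorted by timestamp so analysis sees a single stream."""
    combined = on_prem_logs + cloud_logs
    return sorted(combined, key=lambda rec: rec["ts"])

on_prem = [{"ts": 2, "event": "login"}, {"ts": 5, "event": "purchase"}]
cloud   = [{"ts": 1, "event": "start"}, {"ts": 4, "event": "level_up"}]

merged = consolidate(on_prem, cloud)
print([rec["ts"] for rec in merged])  # → [1, 2, 4, 5]
```

In a real deployment this merge would be performed by the intermediary job rather than in-process Python, but the shape of the work is the same: pull from both stores, combine, order, analyze.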

Azure Data Factory is a platform on which organizations can create a workflow that ingests data from on-premises data stores as well as cloud stores.

Combining the data from both kinds of store, the job can transform or process the data using Hadoop, after which it can be consumed by BI applications.

A platform like this is much needed in most organizations, and Azure Data Factory is one of the biggest players in this space.

What Azure Data Factory Does

1. First of all, it is a cloud-based solution that can integrate with different types of data stores to gather data.

2. It helps you create data-driven workflows to move and process that data.

3. These data-driven workflows are called "pipelines".

4. Once the data is gathered, processing tools such as Azure HDInsight (Hadoop), Spark and Azure Data Lake Analytics can transform it and pass it on to BI professionals, who can then analyze the data.

In a sense, it is an Extract and Load (EL) then Transform and Load (TL) platform, rather than a traditional Extract, Transform and Load (ETL) tool.

As of now, in Azure Data Factory, data is consumed and produced by the defined workflows on a time-based schedule (i.e. it can be defined to run hourly, daily, weekly, etc.).

Based on how this parameter is set, the workflow executes and does its job: hourly, daily, or on whatever schedule is configured.
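The time-based scheduling described above can be sketched as follows. This is a minimal illustration of how a trigger might compute the next run window; the frequency names and the helper function are assumptions for the example, not the Data Factory API.

```python
# Illustrative sketch: picking the next run time for a time-based
# workflow. The FREQUENCIES table and next_run helper are invented
# for this example.
from datetime import datetime, timedelta

FREQUENCIES = {
    "hourly": timedelta(hours=1),
    "daily": timedelta(days=1),
    "weekly": timedelta(weeks=1),
}

def next_run(last_run: datetime, frequency: str) -> datetime:
    """Return when the workflow should execute next,
    based on the configured frequency setting."""
    return last_run + FREQUENCIES[frequency]

last = datetime(2018, 9, 3, 7, 0)
print(next_run(last, "weekly"))  # → 2018-09-10 07:00:00
```

Changing the setting from "weekly" to "hourly" or "daily" changes only the interval added; the workflow body itself stays the same.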


Workflow In Depth:

As we have discussed, a pipeline is nothing but a data-driven workflow. In Azure Data Factory, a pipeline executes in three simple steps:

1. Connect and Collect

2. Transform and Enrich

3. Publish

Connect and Collect:

When it comes to data storage, especially in enterprises, a variety of data stores are used to hold the data. The first and foremost step in building an information production system is to connect all the required data sources, such as SaaS services, file shares, FTP and web services, so that the data can be pushed to a centralized location for processing.

Without a proper data factory, organizations have to build custom data movement components to integrate these data sources. That is an expensive affair compared to using Data Factory.


Even if these data movement components are custom built, they tend to fall short of industry standards: the monitoring and alerting mechanisms are not as effective as industry-standard ones.

Data Factory makes this comfortable for enterprises because the pipelines take care of the data consolidation point. For example, if you want to collect the data at a single point, you can do that in Azure Data Lake Store.

Further, if you want to transform or analyze the data, the consolidated cloud data can serve as the source, and the analysis can be done using Azure Data Lake Analytics, etc.


Transform and Enrich:

After completing the connect-and-collect phase, the next phase is to transform the data and massage it to a level where the reporting layer can harvest it and generate the respective analytical reports.

Tools like Azure Data Lake Analytics and Machine Learning can be applied at this stage.

This process is considered reliable because the transformed data it produces is well maintained and controlled.

Publish:

Once the above two stages are complete, the data reaches a state where the BI team can actually consume it and begin their analysis. The transformed data can also be pushed from the cloud back to on-premises stores like SQL Server.
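The three stages above can be sketched end to end. All function bodies here are stand-ins invented for the example; a real pipeline would call the respective Azure services at each stage rather than these plain Python functions.

```python
# Illustrative end-to-end sketch of the three pipeline stages named
# above: Connect and Collect, Transform and Enrich, Publish.

def connect_and_collect(sources):
    """Pull raw records from every configured source into one list."""
    return [rec for src in sources for rec in src]

def transform_and_enrich(records):
    """Stand-in transformation: keep valid records and tag them."""
    return [dict(rec, enriched=True) for rec in records if "event" in rec]

def publish(records, sink):
    """Push the transformed records to the reporting store
    (standing in for, e.g., an on-premises SQL Server)."""
    sink.extend(records)
    return len(records)

sql_sink = []
raw = connect_and_collect([[{"event": "a"}], [{"event": "b"}, {"bad": 1}]])
count = publish(transform_and_enrich(raw), sql_sink)
print(count)  # → 2
```

Note how the malformed record is dropped during the transform stage, so only clean, enriched records reach the sink that the BI team reads from.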

Key Components:

An Azure subscription can have more than one Azure Data Factory instance; it is not limited to one data factory per subscription. A data factory is made up of four key components that work hand in hand to provide the platform on which the workflows execute effectively.

Pipeline:

A data factory can have one or many pipelines associated with it; it is not mandatory to have only one pipeline per data factory. A pipeline can be defined as a group of activities.

Activity:

As defined above, a group of activities is collectively called a pipeline. An activity defines a specific operation to perform on the data. For example, a copy activity will only copy data from one data store to another data store.


Data Factory Supports Two Types Of Activities:

1. Data movement activities

2. Data transformation activities
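A pipeline grouping both kinds of activity can be sketched as a simple data structure. The class and field names here are invented for illustration; Data Factory defines its pipelines and activities declaratively rather than with Python classes like these.

```python
# Illustrative sketch: a pipeline as a named group of activities,
# covering both activity types listed above (data movement and data
# transformation). All names are invented for the example.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Activity:
    name: str
    kind: str  # "movement" or "transformation"
    run: Callable[[list], list]

@dataclass
class Pipeline:
    name: str
    activities: List[Activity] = field(default_factory=list)

    def execute(self, data: list) -> list:
        """Run each activity in order, feeding its output onward."""
        for act in self.activities:
            data = act.run(data)
        return data

copy_logs = Activity("copy-logs", "movement", lambda d: list(d))
drop_empty = Activity("drop-empty", "transformation",
                      lambda d: [x for x in d if x])

pipe = Pipeline("log-pipeline", [copy_logs, drop_empty])
print(pipe.execute([1, 0, 2, None, 3]))  # → [1, 2, 3]
```

The key idea this models is that a pipeline is just an ordered group of activities, with movement activities getting data into place and transformation activities reshaping it.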

Hope you have enjoyed reading about Azure Data Factory and the steps involved in consolidating and transforming data. If you have any valuable suggestions, please do share them in the comments section below.
