Week 1: What is Cloud?


What type of Azure storage redundancy options have you used?
Redundancy keeps copies of the data in different datacenters so it survives hardware or regional failures.
Locally Redundant Storage (LRS)
Zone-Redundant Storage (ZRS)
Geo-Redundant Storage (GRS)
Read-Access Geo-Redundant Storage (RA-GRS)
We have used Geo-Zone-Redundant Storage (GZRS). It copies the data synchronously across three Azure availability zones in the primary region (as ZRS does), then copies it asynchronously to a single physical location in the secondary region. The primary difference between GRS and GZRS is how data is replicated in the primary region.
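As a rough illustration, here is a minimal Python sketch of requesting a GZRS account with the azure-mgmt-storage SDK; the subscription ID, resource group ("my-rg"), account name ("mygzrsaccount"), and region are placeholders, not values from the project, and the exact method names may vary by SDK version.

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountCreateParameters, Sku

credential = DefaultAzureCredential()
storage_client = StorageManagementClient(credential, "<subscription-id>")

# Ask for the Standard_GZRS SKU: zone-redundant copies in the primary region
# plus asynchronous replication to the paired secondary region.
poller = storage_client.storage_accounts.begin_create(
    "my-rg",                      # hypothetical resource group
    "mygzrsaccount",              # hypothetical, globally unique account name
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_GZRS"),
        kind="StorageV2",
        location="eastus",
    ),
)
account = poller.result()
print(account.sku.name)           # -> Standard_GZRS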


Types of blobs and which one have you used?
Three types of blobs are supported: block, append, and page blobs. Most applications use block blobs, which handle text and other binary data, and that is the type I have used.
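For example, a minimal sketch of writing a block blob with the azure-storage-blob Python SDK; the connection-string environment variable, container name ("raw"), and blob path are hypothetical.

import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # assumed to be set
)
blob = service.get_blob_client(container="raw", blob="sales/2022-09/data.csv")

# upload_blob creates a block blob by default, which suits text and other binary files.
with open("data.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)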
What is the use of Azure Data Factory?
To build and maintain ingestion pipelines and to orchestrate data movement and transformation.
Azure Data Factory is a cloud-based data integration service that lets you create data-driven workflows in the cloud for orchestrating and automating data movement and data transformation. Azure Data Factory does not store any data itself. You can construct ETL and ELT processes code-free in an intuitive visual environment, or write your own code.
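A minimal sketch of that orchestration with the azure-mgmt-datafactory Python SDK, assuming a factory "my-data-factory" in resource group "my-rg" and two blob datasets ("RawBlobDataset", "StagingBlobDataset") already defined; all names are hypothetical and the model classes follow the public quickstart, so details may differ between SDK versions.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSink,
    BlobSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One copy activity that moves data between two existing blob datasets.
copy_step = CopyActivity(
    name="CopyRawToStaging",
    inputs=[DatasetReference(type="DatasetReference", reference_name="RawBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="StagingBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

pipeline = PipelineResource(activities=[copy_step])
adf_client.pipelines.create_or_update("my-rg", "my-data-factory", "CopyPipeline", pipeline)

# Kick off a one-off run; in practice a trigger would schedule this instead.
run = adf_client.pipelines.create_run("my-rg", "my-data-factory", "CopyPipeline", parameters={})
print(run.run_id)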


What are the main concepts in ADF? (Very important)
Pipelines
Activities
Datasets
Linked services
Triggers
Integration Runtimes
Data Flows


What is a linked service? (Very important)
Linked services are much like connection strings: they define the connection information Data Factory needs to connect to external resources.
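A sketch of registering an Azure Storage linked service with the same azure-mgmt-datafactory SDK; the client setup, connection string, and names ("my-rg", "my-data-factory", "AzureStorageLinkedService1") are placeholders, and in practice the key would come from Key Vault rather than being inlined.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureStorageLinkedService,
    LinkedServiceResource,
    SecureString,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The linked service carries the connection information, much like a connection string.
conn = SecureString(value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>")
linked_service = LinkedServiceResource(
    properties=AzureStorageLinkedService(connection_string=conn)
)
adf_client.linked_services.create_or_update(
    "my-rg", "my-data-factory", "AzureStorageLinkedService1", linked_service
)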


What is a dataset?
It specifies where exactly you want to pull the data from (or write it to).
Theory: Datasets represent data structures within the data stores; a dataset simply points to the data you want to use in your activities as inputs or outputs. An activity takes zero or more datasets as inputs and produces one or more datasets as outputs. For example, an Azure Blob dataset specifies the blob container and folder in Azure Blob Storage from which the pipeline should read data.
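A sketch of defining such a blob dataset via the azure-mgmt-datafactory SDK, reusing the hypothetical linked service above; the container/folder path, file name, and dataset name are illustrative only.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobDataset,
    DatasetResource,
    LinkedServiceReference,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The dataset points at a specific container/folder/file through an existing linked service.
blob_dataset = DatasetResource(
    properties=AzureBlobDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference", reference_name="AzureStorageLinkedService1"
        ),
        folder_path="raw/sales",       # hypothetical container and folder
        file_name="data.csv",
    )
)
adf_client.datasets.create_or_update("my-rg", "my-data-factory", "RawBlobDataset", blob_dataset)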
