How to use the Get Metadata activity in Azure Data Factory (ADF)

In this video, you will learn how to use the Get Metadata activity in Azure Data Factory (ADF).

Here are some recommendations for using the Get Metadata activity:
You can use the Get Metadata activity to retrieve the metadata of any data in Azure Data Factory or a Synapse pipeline. You can use the output from the Get Metadata activity in conditional expressions to perform validation, or consume the metadata in subsequent activities.
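For example, a Get Metadata activity's output can drive an If Condition expression in the pipeline JSON. The sketch below assumes a preceding Get Metadata activity named GetFileMetadata whose fieldList includes "exists"; the activity names are placeholders:

```json
{
  "name": "CheckFileExists",
  "type": "IfCondition",
  "dependsOn": [
    { "activity": "GetFileMetadata", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "expression": {
      "value": "@activity('GetFileMetadata').output.exists",
      "type": "Expression"
    }
  }
}
```

If the expression evaluates to true, the activities under ifTrueActivities run; otherwise the ifFalseActivities branch runs (either branch may be left empty).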

Author's LinkedIn profile:
Create a Get Metadata activity with UI
To use a Get Metadata activity in a pipeline, complete the following steps:
1. Search for Get Metadata in the pipeline Activities pane, and drag a Get Metadata activity to the pipeline canvas.
2. Select the new Get Metadata activity on the canvas if it is not already selected, then select its Settings tab to edit its details.
3. Choose a dataset, or create a new one with the New button. Then you can specify filter options and add columns from the available metadata for the dataset.

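Put together, the steps above produce an activity whose JSON definition looks roughly like the following. This is a sketch; the activity name GetFileMetadata, the dataset name MyBlobDataset, and the chosen fieldList entries are placeholder assumptions:

```json
{
  "name": "GetFileMetadata",
  "type": "GetMetadata",
  "typeProperties": {
    "dataset": {
      "referenceName": "MyBlobDataset",
      "type": "DatasetReference"
    },
    "fieldList": [ "exists", "lastModified", "size" ]
  }
}
```

The dataset referenced in Settings can come from any of the supported connectors: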
• Azure Blob storage
• Azure Data Lake Storage Gen1
• Azure Data Lake Storage Gen2
• Azure Files
• File System
• FTP
• SFTP
• Microsoft Fabric Lakehouse
• Amazon S3
• Amazon S3 Compatible Storage
• Google Cloud Storage
• Oracle Cloud Storage
• HDFS

Brief description of the project
This video will help you understand the following components using a simple project.
Azure Data Lake Storage
Azure Data Factory
Azure SQL
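A typical pattern tying these components together is to use Get Metadata to list the files in a Data Lake Storage folder, then iterate over them with a ForEach activity whose inner Copy activity loads each file into Azure SQL. The sketch below assumes a preceding Get Metadata activity named GetFolderMetadata with "childItems" in its fieldList; the activity names and the inner Copy configuration are placeholders:

```json
{
  "name": "ForEachFile",
  "type": "ForEach",
  "dependsOn": [
    { "activity": "GetFolderMetadata", "dependencyConditions": [ "Succeeded" ] }
  ],
  "typeProperties": {
    "items": {
      "value": "@activity('GetFolderMetadata').output.childItems",
      "type": "Expression"
    },
    "activities": [
      { "name": "CopyFileToAzureSql", "type": "Copy" }
    ]
  }
}
```

Inside the loop, each file's name is available as @item().name, which can parameterize the Copy activity's source dataset.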

