Develop event-based solutions | Azure event grid | Azure event hub | Azure notification hub
SUPPORT ME ON PATREON
Develop event-based solutions for AZ-203 or AZ-204 | Azure event grid | Azure event hub | Azure notification hub
In this module, you will cover:
Microsoft Azure Event Grid.
Azure Event Hubs.
Azure Notification Hubs.
In an event-driven architecture, events are delivered in near real time, so consumers can respond immediately to events as they occur.
An event-driven architecture can use a pub/sub model or an event-stream model.
Pub/sub: The messaging infrastructure keeps track of subscriptions. When an event is published, it sends the event to each subscriber. After an event is received, it cannot be replayed, and new subscribers do not observe the event.
Event streaming: Events are written to a log. Events are strictly ordered (within a partition) and durable. Clients don't subscribe to the stream. Instead, a client can read from any part of the stream. The client is responsible for advancing its position in the stream. That means that a client can join at any time, and can replay events.
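To make the replay behavior concrete, here is a minimal consumer sketch, assuming the azure-eventhub Python package and placeholder connection details, that reads the stream from the beginning of each partition rather than only from newly arriving events:

```python
# Minimal sketch: replaying an event stream from the start, assuming the
# azure-eventhub package. Connection string and hub name are placeholders.
from azure.eventhub import EventHubConsumerClient

consumer = EventHubConsumerClient.from_connection_string(
    conn_str="<namespace-connection-string>",   # placeholder
    consumer_group="$Default",
    eventhub_name="<event-hub-name>",           # placeholder
)

def on_event(partition_context, event):
    # The client, not the service, tracks its own position in the stream.
    print(partition_context.partition_id, event.body_as_str())

with consumer:
    # starting_position="-1" reads from the start of the stream, which is
    # what lets a late-joining client replay earlier events.
    consumer.receive(on_event=on_event, starting_position="-1")
```

Because the client controls its own position, a consumer added long after the stream started can still process every retained event.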
On the consumer side, there are some common variations:
Simple event processing. An event immediately triggers an action in the consumer. For example, you could use Azure Functions with a Service Bus trigger, so that a function executes whenever a message is published to a Service Bus topic (see the sketch after this list).
Complex event processing. A consumer processes a series of events, observing patterns in the event data by using a technology such as Azure Stream Analytics or Apache Storm. For example, you could aggregate readings from an embedded device over a time window, and generate a notification if the moving average crosses a certain threshold.
Event stream processing. Use a data-streaming platform, such as Azure IoT Hub or Apache Kafka, as a pipeline to ingest events and feed them to stream processors. The stream processors process or transform the stream. There may be multiple stream processors for different subsystems of the application. This approach is a good fit for Internet of Things (IoT) workloads.
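As a sketch of the simple event processing variation above, the following Azure Function (Python v2 programming model) runs whenever a message is published to a Service Bus topic; the topic name, subscription name, and connection setting are hypothetical:

```python
# Sketch of simple event processing: an Azure Function (Python v2 model)
# triggered by a Service Bus topic. Names below are hypothetical.
import azure.functions as func

app = func.FunctionApp()

@app.service_bus_topic_trigger(
    arg_name="msg",
    topic_name="orders",                    # hypothetical topic
    subscription_name="order-processor",    # hypothetical subscription
    connection="ServiceBusConnection",      # app setting with the connection string
)
def handle_order(msg: func.ServiceBusMessage) -> None:
    # The event immediately triggers this action in the consumer.
    body = msg.get_body().decode("utf-8")
    print(f"Processing message: {body}")
```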
The source of the events might be external to the system, such as physical devices in an IoT solution. In that case, the system must be able to ingest the data at the volume and throughput that is required by the data source.
Azure Event Grid allows you to easily build applications with event-based architectures. First, select the Azure resource that you would like to subscribe to, and then provide the event handler or webhook endpoint to send the event to. Event Grid has built-in support for events coming from Azure services, like storage blobs and resource groups. Event Grid also has support for your own events, using custom topics.
You can use filters to route specific events to different endpoints, multicast to multiple endpoints, and make sure your events are reliably delivered.
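As an illustrative sketch, publishing a custom event to an Event Grid custom topic with the azure-eventgrid Python package might look like the following; the endpoint, access key, and event type are placeholders rather than values from this module:

```python
# Sketch: publishing a custom event to an Event Grid custom topic, assuming
# the azure-eventgrid package. Endpoint and key are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridPublisherClient, EventGridEvent

client = EventGridPublisherClient(
    "https://<custom-topic>.<region>-1.eventgrid.azure.net/api/events",  # placeholder
    AzureKeyCredential("<topic-access-key>"),                            # placeholder
)

event = EventGridEvent(
    subject="orders/new/1234",
    event_type="Contoso.Orders.Created",    # hypothetical event type
    data={"orderId": "1234", "total": 99.95},
    data_version="1.0",
)

# Subscribers (webhooks, Functions, queues, ...) receive the event
# according to the filters on their event subscriptions.
client.send(event)
```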
Azure Event Hubs is a scalable event-processing service that ingests and processes large volumes of events and data, with low latency and high reliability.
An Event Hubs namespace provides a unique scoping container, referenced by its fully qualified domain name, in which you create one or more event hubs or Kafka topics.
Event publishers
Any entity that sends data to an event hub is an event producer or event publisher. Event publishers can publish events by using HTTPS, AMQP 1.0, or Kafka 1.0 and later. Event publishers use a SAS token to identify themselves to an event hub, and can have a unique identity or use a common SAS token.
Publishing an event
You can publish an event via AMQP 1.0, Kafka 1.0 (and later), or HTTPS. Event Hubs provides client libraries and classes for publishing events to an event hub from .NET clients. For other runtimes and platforms, you can use any AMQP 1.0 client, such as Apache Qpid. You can publish events individually or batched. A single publication (event data instance) has a limit of 1 megabyte (MB), regardless of whether it is a single event or a batch. Publishing events larger than this threshold results in an error. It is a best practice for publishers to be unaware of partitions within the event hub and to only specify a partition key (introduced in the next section), or their identity via their SAS token.
The choice between AMQP and HTTPS depends on the usage scenario. AMQP requires establishing a persistent bidirectional socket in addition to Transport Layer Security (TLS), so it has a higher network cost when the session is initialized; HTTPS, by contrast, incurs TLS overhead on every request. For publishers that send frequently, AMQP delivers higher performance.
Event Hubs ensures that all events sharing a partition key value are delivered in order and to the same partition. If partition keys are used with publisher policies, then the identity of the publisher and the value of the partition key must match. Otherwise, an error occurs.
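Tying the publishing guidance together, a producer sketch, assuming the azure-eventhub Python package and placeholder connection details, that batches events and supplies only a partition key rather than targeting a specific partition:

```python
# Sketch: batched publishing to an event hub with a partition key, assuming
# the azure-eventhub package. Connection details are placeholders.
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<namespace-connection-string>",   # placeholder
    eventhub_name="<event-hub-name>",           # placeholder
)

with producer:
    # All events sharing this partition key land on the same partition, in order.
    batch = producer.create_batch(partition_key="device-42")

    for reading in ({"temp": 21.5}, {"temp": 21.7}):
        # add() raises ValueError if the batch would exceed the size limit,
        # so an oversized publication fails on the client rather than the service.
        batch.add(EventData(str(reading)))

    producer.send_batch(batch)
```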
#AzureEventGrid #AzureEventHub #AzureNotificationHub #AZ203 #AZ204 #AzureEventBasedSolutions