Application Information

Architecture

Rubin alerts are distributed by the Alert Stream service. An overview of the implementation can be found in `DMTN-210 <https://dmtn-210.lsst.io/>`_, and the Alert Distribution System's Operator's Manual can be found in DMTN-214. The service is composed of a Schema Registry, a Kafka messaging stream, and a producer located in ap_association, where alerts are packaged and sent to the Kafka stream.
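For orientation, the sketch below shows the general shape of such a producer: an alert packet serialized in the Confluent wire format (a zero magic byte and a four-byte schema ID ahead of the Avro body) and published to a Kafka topic with confluent-kafka. The broker address, topic name, and two-field schema are illustrative placeholders, not the production configuration or the actual ap_association code.

.. code-block:: python

   # Minimal sketch of the producer pattern; broker address, topic name, and
   # schema are placeholders for illustration only.
   import io
   import struct

   import fastavro
   from confluent_kafka import Producer

   # Hypothetical, heavily simplified alert schema for demonstration only.
   ALERT_SCHEMA = fastavro.parse_schema({
       "type": "record",
       "name": "Alert",
       "fields": [
           {"name": "alertId", "type": "long"},
           {"name": "diaSourceId", "type": "long"},
       ],
   })


   def serialize_alert(alert: dict, schema_id: int) -> bytes:
       """Serialize an alert in the Confluent wire format:
       a zero magic byte, a 4-byte schema ID, then the Avro-encoded payload.
       """
       buf = io.BytesIO()
       buf.write(struct.pack(">bI", 0, schema_id))
       fastavro.schemaless_writer(buf, ALERT_SCHEMA, alert)
       return buf.getvalue()


   producer = Producer({"bootstrap.servers": "alert-broker.example.org:9092"})
   payload = serialize_alert({"alertId": 1, "diaSourceId": 42}, schema_id=1)
   producer.produce("alerts-simulated", value=payload)
   producer.flush()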

Architecture Diagram

Associated Systems

The Alert Stream is used by the Alert Archive Ingester and the Alert Archive Server.

Configuration Location

==================  ==============================================================
Config Area         Location
==================  ==============================================================
Configuration       lsst-sqre/phalanx
Vault Secrets Dev   secret/rubin/usdf-alert-stream-broker-dev/alert-stream-broker/
Vault Secrets Prod
==================  ==============================================================
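Operators normally consume these secrets through Phalanx or the Vault CLI. As a rough sketch only, the snippet below reads the dev secret path with the hvac Python client; the Vault address, token authentication, and a KV v1 engine mounted at ``secret/`` are assumptions for illustration.

.. code-block:: python

   # Sketch of reading the dev broker secrets from Vault with hvac; the Vault
   # address, authentication method, and KV engine version are assumptions.
   import os

   import hvac

   client = hvac.Client(
       url="https://vault.example.org",     # placeholder Vault address
       token=os.environ["VAULT_TOKEN"],     # assumes token auth is available
   )
   secret = client.secrets.kv.v1.read_secret(
       path="rubin/usdf-alert-stream-broker-dev/alert-stream-broker",
       mount_point="secret",
   )
   print(sorted(secret["data"].keys()))     # list the secret keys, not values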

Data Flow

Alerts are created and processed by the Prompt Processing pipeline. Once alerts have been generated within packageAlerts, they are serialized, compressed, and sent to the Kafka Alert Stream by a producer. The alerts are read into a Kafka topic based on the current schema used by the pipelines. These topics are made available to our downstream Community Alert Brokers. Alerts are then retained for a period of time defined in the Alert Stream Broker Helm chart's values.yaml before expiring.
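As a hedged illustration of the consumer side, the sketch below shows how a downstream client (a community broker or the archive ingester, for example) might poll one of these topics with confluent-kafka; the broker address, group id, and topic name are placeholders.

.. code-block:: python

   # Minimal sketch of a downstream consumer reading alert packets from a
   # stream topic; broker address, group id, and topic name are placeholders.
   from confluent_kafka import Consumer

   consumer = Consumer({
       "bootstrap.servers": "alert-broker.example.org:9092",
       "group.id": "example-downstream-consumer",
       "auto.offset.reset": "earliest",
   })
   consumer.subscribe(["alerts-simulated"])

   try:
       while True:
           msg = consumer.poll(timeout=1.0)
           if msg is None:
               continue
           if msg.error():
               # Surface broker-side errors rather than silently dropping them.
               print(f"consumer error: {msg.error()}")
               continue
           # msg.value() holds the serialized (and possibly compressed) alert
           # packet; schema-based deserialization would happen here.
           print(f"received {len(msg.value())} bytes from {msg.topic()}")
   finally:
       consumer.close()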

The Alert Archive Ingester reads active Alert Stream topics and sends the alerts to the Alert Archive.

The Alert Stream Schema Registry is used to serialize and deserialize the alerts.
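A minimal sketch of that deserialization path, assuming messages in the Confluent wire format (a zero magic byte and a four-byte schema ID ahead of the Avro body) and a placeholder registry URL, is shown below.

.. code-block:: python

   # Sketch of deserializing an alert payload against a schema fetched from
   # the Schema Registry; the registry URL is a placeholder.
   import io
   import json
   import struct

   import fastavro
   from confluent_kafka.schema_registry import SchemaRegistryClient


   def deserialize_alert(payload: bytes, registry_url: str) -> dict:
       """Decode a wire-format alert payload using its registered schema."""
       magic, schema_id = struct.unpack(">bI", payload[:5])
       if magic != 0:
           raise ValueError("not a Confluent wire-format message")
       client = SchemaRegistryClient({"url": registry_url})
       schema = client.get_schema(schema_id)
       parsed = fastavro.parse_schema(json.loads(schema.schema_str))
       return fastavro.schemaless_reader(io.BytesIO(payload[5:]), parsed)


   # Example usage, given a message from the consumer sketch above:
   #   alert = deserialize_alert(msg.value(), "https://schema-registry.example.org")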

Dependencies - S3DF

The Alert Stream relies on the following systems hosted at the USDF.

  • ArgoCD

  • Phalanx

  • Confluent Schema Registry

  • Prompt Processing

Dependencies - External

Disaster Recovery

If any part of the Alert Stream or Schema Registry has failed, follow the recovery steps in DMTN-214.