Airflow requires a metadata database for storing e.g. DAG, task and job data. The operator constructs the actual connection string, so the user does not need to remember its exact structure. The same database can be accessed using different drivers (e.g. for job metadata vs. queued job metadata).
spec:
  clusterConfig:
    metadataDatabase:
      postgresql: # (1)
        host: airflow-postgresql
        database: airflow
        credentialsSecret: postgresql-credentials # (2)

1. A reference to one of the supported database backends (e.g. postgresql).
2. A reference to a Secret which must contain the two fields username and password.
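For illustration, a Secret with the required shape might look like the following sketch. The name postgresql-credentials matches the example above; the username and password values are placeholders, not values prescribed by the operator:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: postgresql-credentials
stringData:
  # The operator reads exactly these two fields to build the connection string.
  username: airflow   # placeholder value
  password: airflow   # placeholder value
```

Using stringData (rather than data) lets the values be written in plain text; Kubernetes base64-encodes them on admission.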
The queue/broker metadata and URL are only needed when running the Celery executor.
The celeryResultBackend definition uses the same structure as metadataDatabase shown above.
The celeryBrokerUrl definition is similar, but does not require a database name.
---
spec:
  celeryExecutors:
    celeryResultBackend:
      postgresql: # (1)
        host: airflow-postgresql
        database: airflow
        credentialsSecret: postgresql-credentials # (2)
    celeryBrokerUrl:
      redis: # (3)
        host: airflow-redis-master
        credentialsSecret: redis-credentials # (2)

1. A reference to one of the supported database backends (e.g. postgresql).
2. A reference to a Secret which must contain the two fields username and password.
3. A reference to one of the supported queue brokers (e.g. redis).
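A matching broker credentials Secret could be sketched as follows. The name redis-credentials matches the example above; the values are placeholders (Redis commonly authenticates with a password and an empty username, but the actual values depend on how the Redis instance is configured):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: redis-credentials
stringData:
  # Both fields must be present even if the broker only checks the password.
  username: ""       # placeholder: empty user, as is common for Redis
  password: redis    # placeholder value
```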
Alternatively, these connections can be defined in full in a referenced Secret:
---
spec:
  clusterConfig:
    metadataDatabase:
      generic:
        uriSecret: postgresql-metadata # (1)
---
spec:
  celeryExecutors:
    celeryResultBackend:
      generic:
        uriSecret: postgresql-celery # (2)
    celeryBrokerUrl:
      generic:
        uriSecret: redis-celery # (3)

1. A reference to a Secret which must contain the single field uri, e.g. uri: postgresql+psycopg2://airflow:airflow@airflow-postgresql/airflow
2. A reference to a Secret which must contain the single field uri, e.g. uri: db+postgresql://airflow:airflow@airflow-postgresql/airflow
3. A reference to a Secret which must contain the single field uri, e.g. uri: redis://:redis@airflow-redis-master:6379/0
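As an illustration of the generic form, a Secret carrying a full connection string might look like the sketch below. The Secret name postgresql-metadata matches the example above, and the uri value is the metadata-database example from callout (1); credentials embedded in the URI are placeholders:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: postgresql-metadata
stringData:
  # The operator reads this single field and uses it verbatim as the connection string.
  uri: postgresql+psycopg2://airflow:airflow@airflow-postgresql/airflow
```

The same pattern applies to the celery Secrets, with the db+postgresql:// and redis:// URIs shown in callouts (2) and (3).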