- python - How to control the parallelism or concurrency of an Airflow . . .
Options that are specified across an entire Airflow setup: parallelism (in [core]): maximum number of tasks running across the entire Airflow installation; dag_concurrency (in [core]): max number of tasks that can be running per DAG (across multiple DAG runs); non_pooled_task_slot_count (in [core]): number of task slots allocated to tasks not running in a pool.
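All three live in the [core] section of airflow.cfg. A sketch of the settings as named in Airflow 1.x (the values below are examples; newer Airflow versions renamed dag_concurrency to max_active_tasks_per_dag and replaced non_pooled_task_slot_count with a default pool):

```ini
[core]
# Max task instances running across the whole installation
parallelism = 32
# Max task instances running per DAG, across all of its runs
dag_concurrency = 16
# Slots for tasks not assigned to any pool (Airflow 1.x only)
non_pooled_task_slot_count = 128
```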
- First time login to Apache Airflow asks for username and password, what . . .
The default credentials are user: airflow, password: airflow. If you want to create another account with Docker, use this (you have to be in the same folder as the docker-compose.yaml file):

```shell
docker-compose run airflow-worker airflow users create \
  --role Admin --username admin --email admin \
  --firstname admin --lastname admin --password admin
```
- How to use Dynamic Task Mapping with TaskGroups - airflow
In my actual DAG, I need to first get a list of IDs and then, for each ID, run a set of tasks. I have used Dynamic Task Mapping to pass a list to a single task or operator and have it process the list.
- How to create a conditional task in Airflow - Stack Overflow
Airflow 1.x: Airflow has a BranchPythonOperator that can be used to express the branching dependency more directly. The docs describe its use: the BranchPythonOperator is much like the PythonOperator except that it expects a python_callable that returns a task_id. The task_id returned is followed, and all of the other paths are skipped.
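The python_callable itself is just a function that returns the task_id to follow, so its logic can be sketched and checked without Airflow at all. The downstream task names below are hypothetical:

```python
from datetime import datetime


def choose_branch(logical_date, **_):
    # Branch on whether the run falls on a weekend; the returned
    # string must match the task_id of a downstream task.
    if logical_date.weekday() >= 5:
        return "weekend_task"
    return "weekday_task"


# Standalone check of the branching logic (2024-05-04 is a Saturday):
print(choose_branch(logical_date=datetime(2024, 5, 4)))  # weekend_task
```

In the DAG you would wire it in with something like BranchPythonOperator(task_id="branch", python_callable=choose_branch) and set both weekend_task and weekday_task downstream of it.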
- Proper way to create dynamic workflows in Airflow
Problem: Is there any way in Airflow to create a workflow such that the number of tasks B.* is unknown until completion of task A? I have looked at SubDAGs, but it looks like it can only work with a …
- openssl - How to enable SSL on Apache Airflow? - Stack Overflow
I am using Airflow 1.7.0 with a LocalExecutor, and the documentation suggests that to enable SSL we need to pass the cert and key paths and change the port to 443, as below:

```ini
[webserver]
web_server_ssl_cert = <path to cert>
web_server_ssl_key = <path to key>
# Optionally, set the server to listen on the standard SSL port
web_server_port = 443
base_url
```
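To try this locally you need a certificate and key to point those settings at; a self-signed pair can be generated with openssl (the filenames are just examples, and a self-signed cert is for testing only):

```shell
# Generate a self-signed certificate/key pair for local testing
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout airflow.key -out airflow.crt \
  -days 365 -subj "/CN=localhost"
```

Then set web_server_ssl_cert and web_server_ssl_key to the paths of airflow.crt and airflow.key.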
- How exactly does the subDAG work in Airflow? What does it mean for a . . .
The 3 operators in this code get the number of lines of the file "airflow.cfg", find the value of "airflow_home" in that file, and return both of those values to be printed. This code works on its own, so I don't think it's the problem.
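Stripped of Airflow, the work those operators do amounts to counting lines and pulling one key's value out of the config text. A plain-Python sketch, with made-up file contents standing in for airflow.cfg:

```python
# Hypothetical stand-in for the contents of airflow.cfg
cfg_text = """[core]
airflow_home = /home/user/airflow
parallelism = 32
"""

lines = cfg_text.splitlines()
line_count = len(lines)

# Find the value of airflow_home, tolerating spaces around '='
airflow_home = next(
    line.split("=", 1)[1].strip()
    for line in lines
    if line.split("=", 1)[0].strip() == "airflow_home"
)

print(line_count, airflow_home)
```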
- airflow - Run DAG at specific time each day - Stack Overflow
In Airflow, the schedule is calculated as start_date + schedule interval. Airflow executes the job at the END of the interval. This is consistent with how data pipelines usually work.
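Concretely, with a daily schedule the run stamped with a given logical date is only triggered once that whole day's interval has elapsed. A plain-datetime sketch of the rule (no Airflow needed):

```python
from datetime import datetime, timedelta

start_date = datetime(2024, 1, 1)
schedule_interval = timedelta(days=1)

# The first run covers [start_date, start_date + interval) and is
# actually triggered at the END of that interval:
logical_date = start_date
trigger_time = logical_date + schedule_interval

print(trigger_time)  # 2024-01-02 00:00:00
```

So a DAG dated 2024-01-01 with a daily schedule first runs just after midnight on 2024-01-02, once the full day of data it covers exists.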