- python - How to control the parallelism or concurrency of an Airflow ...
Options that are specified across an entire Airflow setup:
  - core.parallelism: maximum number of task instances running across the entire Airflow installation
  - core.dag_concurrency: maximum number of tasks that can run per DAG (across multiple DAG runs)
  - core.non_pooled_task_slot_count: number of task slots allocated to tasks not running in a pool
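These options live in airflow.cfg under the [core] section; a minimal sketch, where the values shown are the historical pre-2.0 defaults rather than recommendations:

```
[core]
# Maximum number of task instances that may run at once across the whole installation
parallelism = 32
# Maximum number of task instances allowed to run per DAG, across all of its DAG runs
dag_concurrency = 16
# Task slots available to tasks that are not assigned to a pool
non_pooled_task_slot_count = 128
```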
- How to create a conditional task in Airflow - Stack Overflow
Airflow 1.x: Airflow has a BranchPythonOperator that can be used to express the branching dependency more directly. The docs describe its use: the BranchPythonOperator is much like the PythonOperator except that it expects a python_callable that returns a task_id. The task_id returned is followed, and all of the other paths are skipped.
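A minimal sketch of that pattern using the Airflow 1.x import paths; the branch condition and the task names here are illustrative:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator

def choose_path(**context):
    # Return the task_id of the branch to follow; every other path is skipped.
    return 'task_a' if datetime.now().hour < 12 else 'task_b'

with DAG('branch_dag', start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
    branch = BranchPythonOperator(
        task_id='branch',
        python_callable=choose_path,
        provide_context=True,  # Airflow 1.x: pass the task context into the callable
    )
    task_a = DummyOperator(task_id='task_a')
    task_b = DummyOperator(task_id='task_b')
    branch >> [task_a, task_b]
```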
- How to use Dynamic Task Mapping with TaskGroups - airflow
In my actual DAG, I need to first get a list of IDs and then, for each ID, run a set of tasks. I have used Dynamic Task Mapping to pass a list to a single task or operator and have it process the list.
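Since Airflow 2.5, a whole TaskGroup defined with the @task_group decorator can itself be expanded over a list, which runs the full set of tasks once per ID. A sketch with made-up task names:

```python
from datetime import datetime

from airflow.decorators import dag, task, task_group

@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def mapped_group_dag():
    @task
    def get_ids():
        # In a real DAG this would query an API or a database.
        return [1, 2, 3]

    @task_group
    def per_id(record_id: int):
        @task
        def extract(i):
            return i * 10

        @task
        def load(value):
            print(f"loading {value}")

        load(extract(record_id))

    # One instance of the whole group is created per element of the list.
    per_id.expand(record_id=get_ids())

mapped_group_dag()
```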
- openssl - How to enable SSL on Apache Airflow? - Stack Overflow
I am using Airflow 1.7.0 with a LocalExecutor, and the documentation suggests that to enable SSL we need to pass the cert and key paths and change the port to 443, as below:

```
[webserver]
web_server_ssl_cert = <path to cert>
web_server_ssl_key = <path to key>
# Optionally, set the server to listen on the standard SSL port
web_server_port = 443
base_url = ...
```
- Proper way to create dynamic workflows in Airflow
Problem: is there any way in Airflow to create a workflow such that the number of tasks B.* is unknown until completion of task A? I have looked at SubDAGs, but it looks like they can only work with a ...
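On Airflow 2.3+, Dynamic Task Mapping addresses exactly this case: the fan-out width is decided at runtime from task A's return value. A minimal sketch with illustrative names:

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule_interval=None, catchup=False)
def dynamic_fanout():
    @task
    def task_a():
        # The number of downstream tasks is only known once this has run.
        return ['b1', 'b2', 'b3']

    @task
    def task_b(name):
        print(f'running {name}')

    # One mapped instance of task_b is created per element returned by task_a.
    task_b.expand(name=task_a())

dynamic_fanout()
```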
- Refreshing DAGs without web server restart - apache airflow
In your airflow.cfg, these two configurations control this behavior:

```
# After how much time a new DAG should be picked up from the filesystem
min_file_process_interval = 0
dag_dir_list_interval = 60
```
- Airflow: how to delete a DAG? - Stack Overflow
In Airflow versions < 1.10 it is a two-step process: 1. Remove the DAG from the Airflow dags folder. This removes the DAG from the airflow list_dags command, but it will still be visible on the GUI, with a message that it is shown there because its state is still active.
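On later versions the two-step dance is unnecessary: Airflow 1.10 added a dedicated delete command to the CLI, which was renamed in the 2.x CLI. Here my_dag_id is a placeholder for the DAG to remove:

```
# Airflow 1.10.x
airflow delete_dag my_dag_id

# Airflow 2.x
airflow dags delete my_dag_id
```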
- Airflow Python operator passing parameters - Stack Overflow
This is how you can pass arguments to a Python operator in Airflow. The excerpt cuts off mid-snippet, so the schedule value, start_date, and task wiring below are a plausible completion rather than the answer's exact code:

```python
from datetime import datetime
from time import sleep

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import PythonOperator

def my_func(*op_args):
    # Positional arguments supplied via op_args arrive here as a tuple.
    print(op_args)
    return op_args[0]

with DAG('python_dag', description='Python DAG',
         schedule_interval='*/5 * * * *',  # illustrative: the excerpt is truncated mid-value
         start_date=datetime(2018, 11, 1), catchup=False) as dag:
    dummy_task = DummyOperator(task_id='dummy_task')
    python_task = PythonOperator(task_id='python_task',
                                 python_callable=my_func,
                                 op_args=['one', 'two', 'three'])
    dummy_task >> python_task
```
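The operator also accepts keyword arguments through its op_kwargs parameter; a short variant to place inside the same DAG block, with illustrative names and values:

```python
python_task_kw = PythonOperator(
    task_id='python_task_kw',
    python_callable=lambda name, count: print(name * count),
    op_kwargs={'name': 'x', 'count': 3},  # passed to the callable as keyword arguments
)
```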