In my DAG, I have some tasks that should only be run on Saturdays. Therefore I used a BranchPythonOperator to branch between the Saturday-only tasks and a DummyOperator. After that, I join both branches and want to run other tasks.
The workflow looks like this:

Here I set the trigger rule for dummy3 to 'one_success' and everything works fine.
The problem I encountered is when something upstream of the BranchPythonOperator fails:

The BranchPythonOperator and the branches correctly get the state 'upstream_failed', but the task joining the branches becomes 'skipped', so the whole workflow shows 'success'.
I tried using 'all_success' as the trigger rule: then the workflow correctly fails if something upstream fails, but if nothing fails, dummy3 gets skipped.
I also tried 'all_done': then the workflow works correctly if nothing fails, but if something upstream fails, dummy3 still gets executed.
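(What I really want is a rule meaning "no upstream task failed". Newer Airflow versions reportedly ship a 'none_failed' trigger rule with exactly that semantics; if it is available in your version, the join would simply become the following. This is an assumption about the rule's availability, not something I have tested here:)

# 'none_failed' fires when no upstream task has failed or is
# 'upstream_failed', i.e. every upstream either succeeded or was skipped.
dummy3 = DummyOperator(
    task_id='dummy3',
    dag=dag,
    trigger_rule='none_failed'
)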
My test code looks like this:
from datetime import datetime, date

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import BranchPythonOperator, PythonOperator

dag = DAG('test_branches',
          description='Test branches',
          catchup=False,
          schedule_interval='0 0 * * *',
          start_date=datetime(2018, 8, 1))


def python1():
    raise Exception('Test failure')
    # print('Test success')


dummy1 = PythonOperator(
    task_id='python1',
    python_callable=python1,
    dag=dag
)

dummy2 = DummyOperator(
    task_id='dummy2',
    dag=dag
)

dummy3 = DummyOperator(
    task_id='dummy3',
    dag=dag,
    trigger_rule='one_success'
)


def is_saturday():
    # date.weekday() numbers Monday as 0, so Saturday is 5 (6 would be Sunday)
    if date.today().weekday() == 5:
        return 'dummy2'
    else:
        return 'today_is_not_saturday'


branch_on_saturday = BranchPythonOperator(
    task_id='branch_on_saturday',
    python_callable=is_saturday,
    dag=dag
)

not_saturday = DummyOperator(
    task_id='today_is_not_saturday',
    dag=dag
)

dummy1 >> branch_on_saturday >> dummy2 >> dummy3
branch_on_saturday >> not_saturday >> dummy3
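One detail worth double-checking in the branch callable: Python's date.weekday() numbers Monday as 0, so Saturday is 5 and Sunday is 6. A quick standalone check of the branching logic (the helper name is just for illustration):

```python
from datetime import date

# date.weekday(): Monday == 0, ..., Saturday == 5, Sunday == 6.
# 2018-08-04 was a Saturday, 2018-08-05 a Sunday.
assert date(2018, 8, 4).weekday() == 5
assert date(2018, 8, 5).weekday() == 6


def saturday_branch(d):
    """Return the task_id to follow for a given date (Saturday == weekday 5)."""
    return 'dummy2' if d.weekday() == 5 else 'today_is_not_saturday'


print(saturday_branch(date(2018, 8, 4)))  # dummy2
print(saturday_branch(date(2018, 8, 5)))  # today_is_not_saturday
```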
EDIT
I just figured out an ugly workaround:

dummy4 represents a task that I actually need to run; dummy5 is just a dummy.
dummy3 still has the trigger rule 'one_success'.
Now dummy3 and dummy4 run if there is no upstream failure, and dummy5 'runs' if the day is not Saturday (and gets skipped if it is), which means the DAG is marked as success in both cases.
If there is an upstream failure, dummy3 and dummy4 get skipped and dummy5 gets marked as 'upstream_failed', so the DAG is marked as failed.
This workaround makes my DAG run as I want it to, but I'd still prefer a solution without some hacky workaround.
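(For reference, this is how the workaround's wiring reads; the exact dependencies are my reconstruction of the description above, since the screenshot is not shown. The dummy4/dummy5 names follow the text:)

# Sketch of the workaround wiring described above, added to the test DAG.
dummy4 = DummyOperator(task_id='dummy4', dag=dag)  # a task I actually need
dummy5 = DummyOperator(task_id='dummy5', dag=dag)  # pure dummy, default 'all_success'

dummy1 >> branch_on_saturday >> dummy2 >> dummy3 >> dummy4
branch_on_saturday >> not_saturday >> dummy3
not_saturday >> dummy5
# If something upstream fails, not_saturday becomes 'upstream_failed', so
# dummy5 becomes 'upstream_failed' too and the DAG run is marked failed.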

