Airflow BashOperator prints ‘could not find xxx to execute’

When I use BashOperator to run pg_dump, Airflow prints could not find a "pg_dump" in the log. However, Airflow still runs pg_dump successfully and finishes the job after that message is printed.
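The task is defined roughly like this (the task_id and env values below are placeholders, not my real configuration):

from airflow.operators.bash import BashOperator

dump_schema = BashOperator(
    task_id="pg_dump_schema",  # placeholder name
    bash_command=(
        "pg_dump -Fc -v -n $SCHEMA "
        "--exclude-table measurement_orig -f $OUT_FILE"
    ),
    # placeholder values; the real task sets these from my environment
    env={"SCHEMA": "my_schema", "OUT_FILE": "/tmp/my_schema.dump"},
)

Here is the relevant part of the task log: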

[2023-12-19, 14:24:13 EST] {subprocess.py:75} INFO - Running command: ['/usr/bin/bash', '-c', 'pg_dump -Fc -v -n $SCHEMA --exclude-table measurement_orig -f $OUT_FILE']
[2023-12-19, 14:24:13 EST] {subprocess.py:86} INFO - Output:
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - could not find a "pg_dump" to execute
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: last built-in OID is 16383
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: reading extensions
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: identifying extension members
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: reading schemas
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: reading user-defined tables
[2023-12-19, 14:24:13 EST] {subprocess.py:93} INFO - pg_dump: reading user-defined functions
...

I’m curious where this message comes from and why Airflow prints it, given that pg_dump clearly runs anyway.

Deployment Information:

airflow version: v2.6.0
executor type: Celery
server: Red Hat 7.9

Running which pg_dump on the server returns /usr/bin/pg_dump
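In case it matters, I can also check what the Celery worker itself resolves with a throwaway diagnostic task like this (hypothetical, not part of my real DAG):

from airflow.operators.bash import BashOperator

check_pg_dump = BashOperator(
    task_id="check_pg_dump",  # hypothetical diagnostic task
    bash_command="command -v pg_dump && pg_dump --version",
)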
