hi team, I'm facing an issue where a DAG is stuck in the scheduled state forever and none of its tasks are getting processed
As a workaround I'm scaling my Airflow and Postgres down and up
The airflow webserver is not able to pick up DAGs; here's the message:

/home/ubuntu/.pyenv/versions/3.8.10/envs/airflowenv/lib/python3.8/site-packages/pandas/compat/__init__.py:109: UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError.

[Airflow ASCII-art banner]
[2021-07-08 10:00:01,260] {dagbag.py:496} INFO - Filling up the DagBag from /dev/null
Running the Gunicorn Server with:
Workers: 4 sync
Timeout: 120
Logfiles: - -

airflow version 2.1.1

I set the DAG folder path in the airflow.cfg file like this:

[core]
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository. This path must be absolute.
dags_folder = /home/ubuntu/airflow/dags

Thuc Nguyen Canh
hello, I have a very urgent production issue with airflow <v0.19: all my tasks are stuck in queued. I used LocalExecutor; can anyone help?
If any expert can help, I can get on a call; very stressed today with this issue
Thuc Nguyen Canh
[2021-08-17 07:07:15,579] {jobs.py:1109} INFO - Tasks up for execution:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,583] {jobs.py:1144} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
[2021-08-17 07:07:15,586] {jobs.py:1180} INFO - DAG Dashboard_C2C has 0/2 running and queued tasks
[2021-08-17 07:07:15,586] {jobs.py:1218} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,597] {jobs.py:1301} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [queued]>
[2021-08-17 07:07:15,597] {jobs.py:1343} INFO - Sending ('Dashboard_C2C', 'Dashboard_C2C', datetime.datetime(2021, 8, 17, 6, 40, 14, 195012, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 1 and queue default
[2021-08-17 07:07:15,598] {base_executor.py:56} INFO - Adding to queue: airflow run Dashboard_C2C Dashboard_C2C 2021-08-17T06:40:14.195012+00:00 --local -sd /root/airflow-dags/dags/Dashboard_C2C.py
Process QueuedLocalWorker-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/local_executor.py", line 113, in run
    key, command = self.task_queue.get()
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 113, in get
    return _ForkingPickler.loads(res)
TypeError: __init__() missing 5 required positional arguments: 'tz', 'utc_offset', 'is_dst', 'dst', and 'abbrev'
currently I'm getting the above issue with LocalExecutor
Thuc Nguyen Canh
I can make it work with 'airflow test' but failed with 'airflow run'
Thuc Nguyen Canh
hi, I fixed it by reinstalling airflow, so maybe there was a library conflict.
Rohith Madamshetty
can anyone explain how the KubernetesPodOperator pod gets created with all its specs when we use the pod operator in Airflow? In the operator we specify arguments, Kubernetes secrets, resources, and DAG parameters, but how do they get created as a pod in the cluster? I see some extra env variables added to my pod, and I can see them in the pod spec file in the cluster.
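A rough way to picture what the operator does (an illustrative sketch, not the actual Airflow source): your arguments become a pod spec, and Airflow then merges in its own tracking metadata, which is why the pod in the cluster carries extra labels and env vars you never passed in:

```python
# Illustrative sketch only -- not the Airflow implementation.
# KubernetesPodOperator turns its arguments into a pod spec, then Airflow
# merges in its own metadata so the scheduler can find and monitor the pod.
user_args = {
    "image": "python:3.8",
    "env": {"MY_VAR": "value"},              # env/secrets you pass to the operator
    "resources": {"limits": {"cpu": "1"}},   # resource requests/limits you pass in
}
airflow_metadata = {
    # label names below are illustrative; Airflow injects labels like these
    "labels": {"dag_id": "my_dag", "task_id": "my_task", "try_number": "1"},
}
pod_spec = {**user_args, "metadata": airflow_metadata}
print(sorted(pod_spec))  # ['env', 'image', 'metadata', 'resources']
```

The exact fields Airflow adds depend on the version and executor, but the merge-on-top behaviour is why the rendered pod spec is larger than your operator arguments.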
Avinash Pallerlamudi
Hello Everyone,
How do I call a BigQuery stored procedure in an Airflow task? Is there an operator for this?
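One common approach (assuming the Google provider package is installed) is BigQueryInsertJobOperator with a CALL statement; the job configuration is a plain dict, sketched here with placeholder project, dataset, and procedure names:

```python
# Hypothetical job configuration for calling a BigQuery stored procedure.
# In a real DAG this dict would be passed to BigQueryInsertJobOperator
# (airflow.providers.google.cloud.operators.bigquery); the project, dataset,
# and procedure names below are placeholders.
query = "CALL `my-project.my_dataset.my_stored_proc`('2021-08-17')"
configuration = {
    "query": {
        "query": query,
        "useLegacySql": False,  # CALL requires standard SQL
    }
}
# BigQueryInsertJobOperator(task_id="call_proc", configuration=configuration)
```

The operator instantiation is commented out so the snippet stands alone; in a DAG file you would uncomment it and supply your GCP connection.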
Fran Sánchez
Do you know when 2.2.1 is expected?
hello, I am getting this error on airflow:
import pwd
ModuleNotFoundError: No module named 'pwd'
Hi, I am looking for a precise answer: can we create a DAG dynamically using the Airflow API?

Hello All,
Looking for some help regarding an issue I'm facing after migrating from Airflow 1.10.11 to 2.2.3. I'm unable to execute DAGs from the UI, as the task goes into the queued state. I created the pod_template_file as suggested in the migration steps but am still getting the error below:
"airflow.exceptions.AirflowException: Dag 'xxxxxxx' could not be found; either it does not exist or it failed to parse"

But I can see the DAGs inside the webserver and scheduler in the exact location the task is trying to read from, which is "/opt/airflow/dags/repo/". Surprisingly, I'm able to trigger the DAG from the webserver's command line and it completes successfully. Any help, please?
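One thing worth checking (an assumption based on the error, not a confirmed fix): the worker pods spawned by the executor must parse DAGs from the same dags_folder as the webserver and scheduler. Airflow reads that setting from an environment variable override, so pinning it in the pod_template_file keeps all pods consistent:

```python
# Airflow resolves [core] dags_folder from this env var when it is set.
# Assumption: DAGs are synced to /opt/airflow/dags/repo in every pod; putting
# this variable in the pod_template_file (or deployment env) makes spawned
# worker pods look in the same path as the webserver/scheduler.
import os

os.environ["AIRFLOW__CORE__DAGS_FOLDER"] = "/opt/airflow/dags/repo"
print(os.environ["AIRFLOW__CORE__DAGS_FOLDER"])
```

Setting it in Python here is only to show the exact variable name; in practice it belongs in the pod template or chart values.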

Hello everyone, I have my access control organized and would like to disable the login page. According to the documentation, AUTH_ROLE_PUBLIC = 'Admin' should help, but the problem is not solved... Has anyone encountered something similar?
Vigneshwar Subramanian

Hi All, looking for some help troubleshooting my Airflow setup.
I have set up Airflow in Azure Cloud (Azure Container Apps) and attached an Azure File Share as an external mount/volume.

  1. I ran the airflow init service; it created the airflow.cfg and 'webserver_config.py' files in AIRFLOW_HOME (/opt/airflow), which is actually an Azure-mounted file system.
  2. I ran the airflow webserver service; it created the airflow-webserver.pid file in AIRFLOW_HOME (/opt/airflow).
  3. The problem is that all the files above are created with root user and group, not as the airflow user (50000), even though I set the env variable AIRFLOW_UID to 50000 when creating the container app. Because of this my webservers are not starting, throwing the error below:
     PermissionError: [Errno 1] Operation not permitted: '/opt/airflow/airflow-webserver.pid'

Attached screenshot for reference

Your help is much appreciated!
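A quick way to confirm the diagnosis (a sketch on a scratch directory, since the real target is the mounted AIRFLOW_HOME at /opt/airflow) is to stat the pid file and compare its owner to the expected uid. One caveat worth hedging: on an SMB-mounted Azure File Share, chown is typically a no-op, so ownership usually has to come from uid/gid mount options (e.g. uid=50000) rather than from chown inside the container:

```python
# Demonstrate the ownership check on a scratch directory; in the real setup
# the path would be /opt/airflow/airflow-webserver.pid on the mounted share.
import os
import tempfile

scratch = tempfile.mkdtemp()
pid_file = os.path.join(scratch, "airflow-webserver.pid")
open(pid_file, "w").close()

st = os.stat(pid_file)
print(f"{pid_file} owned by uid={st.st_uid} gid={st.st_gid}")
# If this prints uid=0 (root) on the share while the webserver runs as
# uid 50000, the webserver cannot manage the pid file, matching the
# PermissionError above.
```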
Josué Pradas
Hello everyone, does anyone know why this error "cannot pickle 'LockType' object" usually happens when clearing a task on Airflow?
deepan ramachandran
I have an issue with ModuleNotFoundError: No module named 'pwd'
I tried to install without Docker and WSL2. Any help?
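For context (a general Python fact, not Airflow-specific): pwd is a Unix-only standard-library module, so code that imports it, like Airflow, cannot run on native Windows; it needs WSL2, Docker, or another Unix-like environment. A quick way to check what your interpreter supports:

```python
# The 'pwd' module exists only in Unix builds of the Python standard
# library; importing it on native Windows raises ModuleNotFoundError,
# which is exactly the error Airflow surfaces there.
import sys

def has_pwd_module() -> bool:
    try:
        import pwd  # noqa: F401  (Unix-only stdlib module)
        return True
    except ModuleNotFoundError:
        return False

print(sys.platform, has_pwd_module())
```

If this prints False, the practical fix is to run Airflow inside WSL2 or a Docker container rather than directly on Windows.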
Rishi Kaushal
I'm getting the following error when the airflow webserver goes down: ERROR - No response from gunicorn master within 120 seconds
The error above appears in the webserver.log file
Can anyone tell me what could cause this error and how to fix it?
Hello, how can I display in the web UI the connections that we pull from Vault?
Prakash Chandrasekaran
We wrote a simple Python program that connects to a Postgres DB in AWS RDS and executes a SELECT query. However, it fails with a connection timed-out error. We are able to connect to the DB from the Docker container in which Airflow runs, though. Any help is appreciated.
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection timed out
Is the server running on host “hostname.domain.com” (99.999.999.99) and accepting
TCP/IP connections on port 5432?
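A timeout at this stage is usually a network-path problem (security group, routing, firewall) rather than psycopg2 itself, so a bare TCP probe from the same environment is a useful first test. A minimal sketch, with the hostname and port as placeholders for your RDS endpoint:

```python
# Bare TCP connectivity probe; if this times out or fails, the problem is
# network reachability (security groups, routing), independent of psycopg2.
import socket

def can_reach(host: str, port: int, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers timeouts, refused connections, DNS failures
        return False

print(can_reach("localhost", 5432))  # replace with your RDS host and port
```

Running this from both the Docker container and the host that fails would show whether the two environments really have the same network access to RDS.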
Prakash Chandrasekaran
@EthanBeauvais were you able to resolve this error?
(psycopg2.OperationalError) could not connect to server: Connection timed out Is the server running on host “dbinfo” (IP address) and accepting TCP/IP connections on port 5432?
Hello everyone, I have a question about Apache Airflow 1.10.10. I found errors like "airflow.exceptions.AirflowException: Celery command failed". Can anyone help me?
This is very strange: the upstream task has succeeded, but the downstream task is not running and is marked upstream_failed.