haoguoxuan
@haoguoxuan
airflow version 2.1.1
I set the DAGs folder path in my airflow.cfg file like this:

[core]
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository. This path must be absolute.
dags_folder = /home/ubuntu/airflow/dags
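
For what it's worth, a quick way to confirm which dags_folder a running Airflow actually picked up is to read the loaded config from Python (a minimal sketch, assuming Airflow is importable in the same environment):

    # Prints the effective [core] dags_folder as Airflow resolved it
    from airflow.configuration import conf
    print(conf.get("core", "dags_folder"))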

Thuc Nguyen Canh
@thucnc
hello, I have a very urgent production issue with airflow <v0.19: all my tasks are stuck in queued. I use LocalExecutor. Can anyone help?
If any expert can help, I will set up a call; very stressed today with this issue
Thuc Nguyen Canh
@thucnc
[2021-08-17 07:07:15,579] {jobs.py:1109} INFO - Tasks up for execution:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,583] {jobs.py:1144} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
[2021-08-17 07:07:15,586] {jobs.py:1180} INFO - DAG Dashboard_C2C has 0/2 running and queued tasks
[2021-08-17 07:07:15,586] {jobs.py:1218} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,597] {jobs.py:1301} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [queued]>
[2021-08-17 07:07:15,597] {jobs.py:1343} INFO - Sending ('Dashboard_C2C', 'Dashboard_C2C', datetime.datetime(2021, 8, 17, 6, 40, 14, 195012, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 1 and queue default
[2021-08-17 07:07:15,598] {base_executor.py:56} INFO - Adding to queue: airflow run Dashboard_C2C Dashboard_C2C 2021-08-17T06:40:14.195012+00:00 --local -sd /root/airflow-dags/dags/Dashboard_C2C.py
Process QueuedLocalWorker-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/local_executor.py", line 113, in run
    key, command = self.task_queue.get()
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 113, in get
    return _ForkingPickler.loads(res)
TypeError: __init__() missing 5 required positional arguments: 'tz', 'utc_offset', 'is_dst', 'dst', and 'abbrev'
currently I get the above issue with LocalExecutor
Thuc Nguyen Canh
@thucnc
I can make it work with 'airflow test' but it fails with 'airflow run'
Thuc Nguyen Canh
@thucnc
hi, I fixed it by reinstalling airflow, so maybe there was a library conflict.
Rohith Madamshetty
@rohithbittu33
Can anyone explain how the Kubernetes pod operator creates the pod with all its specs when we use the pod operator in Airflow? In the operator we specify arguments, Kubernetes secrets, resources, and DAG parameters, but how do they get created as a pod in the Airflow cluster? I also see some extra env variables added to my pod, visible in the pod spec file in the cluster.
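For context, a minimal sketch of how the operator's arguments map onto the generated pod spec (task name, image, and secret names below are hypothetical; exact import paths vary across provider versions):

    from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator
    from airflow.providers.cncf.kubernetes.secret import Secret

    run_in_pod = KubernetesPodOperator(
        task_id="example_pod",
        name="example-pod",            # becomes the pod's metadata.name (a random suffix is appended)
        namespace="default",
        image="python:3.9-slim",
        cmds=["python", "-c"],         # container command
        arguments=["print('hello')"],  # container args
        env_vars={"MY_ENV": "value"},  # merged into the container env alongside anything Airflow injects
        secrets=[Secret("env", "DB_PASSWORD", "my-k8s-secret", "password")],
    )

The operator assembles a V1Pod object from these arguments and submits it through the Kubernetes API, so any labels or environment variables Airflow adds on its own end up in the same pod spec you see in the cluster.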
Avinash Pallerlamudi
@stpavinash
Hello Everyone,
How do I call a BigQuery stored procedure from an Airflow task? Is there an operator?
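As far as I know there is no dedicated stored-procedure operator, but a CALL statement can be submitted as an ordinary query job; a minimal sketch with the Google provider's BigQueryInsertJobOperator (project, dataset, and procedure names are hypothetical):

    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    call_proc = BigQueryInsertJobOperator(
        task_id="call_stored_proc",
        configuration={
            "query": {
                "query": "CALL `my_project.my_dataset.my_procedure`()",  # hypothetical proc
                "useLegacySql": False,
            }
        },
    )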
Fran Sánchez
@fj-sanchez
Do you know when 2.2.1 is expected?
Jitendra-ICLR
@Jitendra-ICLR
hello, I am getting this error on airflow:
import pwd
ModuleNotFoundError: No module named 'pwd'
mashirali
@mashirali
Hi, I am looking for a precise answer. Can we create a DAG dynamically using the Airflow API?
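For reference, the stable REST API does not create DAGs; the scheduler only discovers DAGs by parsing Python files in dags_folder. A common workaround is generating DAGs dynamically from one file; a minimal sketch (the loop values stand in for whatever config source you use):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    for name in ("alpha", "beta"):  # hypothetical config source
        dag_id = f"generated_{name}"
        with DAG(dag_id, start_date=datetime(2022, 1, 1), schedule_interval=None) as dag:
            DummyOperator(task_id="placeholder")
        globals()[dag_id] = dag  # expose the DAG object so the parser picks it up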
krishnachiranjeevi
@krishnachiranjeevi

Hello All,
Looking for some help regarding the issue I'm facing after migrating from Airflow 1.10.11 to 2.2.3. I'm unable to execute DAGs from the UI: the tasks go into the queued state. I created the pod_template_file as suggested in the migration steps but still get the error below:
"airflow.exceptions.AirflowException: Dag 'xxxxxxx' could not be found; either it does not exist or it failed to parse"

But I can see the DAGs inside the webserver and scheduler in the exact location the task is trying to read from, which is "/opt/airflow/dags/repo/". Surprisingly, I'm able to trigger the DAG from the webserver's command line and it completes successfully. Any help please?

Mikhail
@Armadik
Hello everyone, I have my access control organized and I would like to disable the login page. According to the documentation, AUTH_ROLE_PUBLIC = 'Admin' should help, but the problem is not solved... Has anyone encountered something similar?
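For comparison, that setting belongs in webserver_config.py rather than airflow.cfg; a minimal sketch, assuming the default DB-backed auth (the behavior of anonymous access has varied across Airflow versions):

    # webserver_config.py (minimal sketch)
    from flask_appbuilder.security.manager import AUTH_DB

    AUTH_TYPE = AUTH_DB
    # Role granted to anonymous (not-logged-in) users; with 'Admin'
    # the login page is effectively bypassed.
    AUTH_ROLE_PUBLIC = 'Admin'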
Vigneshwar Subramanian
@wiki9990:matrix.org
[m]

Hi All, Looking for some help in troubleshooting my airflow setup
I have set up Airflow in Azure Cloud (Azure Container Apps) and attached an Azure File Share as an external mount/volume

  1. I ran the airflow init service; it created the airflow.cfg and webserver_config.py files in AIRFLOW_HOME (/opt/airflow), which is actually an Azure-mounted file system
  2. I ran the airflow webserver service; it created the airflow-webserver.pid file in the same AIRFLOW_HOME
  3. The problem is that all the files above are created with root user and group, not as the airflow user (50000), even though I set the env variable AIRFLOW_UID to 50000 when creating the container app. Because of this my webservers are not starting, throwing the error below:
    PermissionError: [Errno 1] Operation not permitted: '/opt/airflow/airflow-webserver.pid'

Attached screenshot for reference

Your help is much appreciated!
Josué Pradas
@jpradass
Hello everyone, does anyone know why the error "cannot pickle 'LockType' object" usually happens when clearing a task in Airflow?
deepan ramachandran
@yahdeepan:matrix.org
[m]
I have an issue with ModuleNotFoundError: No module named 'pwd'
I tried to install it without Docker or WSL2. Any help?
Rishi Kaushal
@rishikaushal:matrix.org
[m]
getting the following error when the airflow webserver goes down: ERROR - No response from gunicorn master within 120 seconds
the above error appears in the webserver.log file
can anyone tell me what could be the reason for this error and how to fix it?
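The 120-second window corresponds to a webserver setting in airflow.cfg; a minimal sketch of raising it (the right value depends on the deployment, and the error usually means the gunicorn master is overloaded or wedged rather than misconfigured):

    [webserver]
    # Seconds the webserver waits for the gunicorn master to
    # respond before giving up (default is 120)
    web_server_master_timeout = 300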
Mikhail
@Armadik
Hello people, how can I display in the web UI the connections that we pull from the vault?
Prakash Chandrasekaran
@bheesma
We wrote a simple Python program that connects to a Postgres DB in AWS RDS and executes a SELECT query. However, it fails with a connection timeout error. We are able to connect to the DB from the Docker container in which Airflow runs, though. Any help is appreciated.
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) could not connect to server: Connection timed out
Is the server running on host “hostname.domain.com” (99.999.999.99) and accepting
TCP/IP connections on port 5432?
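For reference, a minimal sketch of that kind of connectivity test (host, credentials, and database name are placeholders), with a short connect timeout so failures surface quickly:

    import sqlalchemy

    engine = sqlalchemy.create_engine(
        "postgresql+psycopg2://user:password@hostname.domain.com:5432/mydb",
        connect_args={"connect_timeout": 5},  # seconds, passed through to libpq
    )
    with engine.connect() as conn:
        print(conn.execute(sqlalchemy.text("SELECT 1")).scalar())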
Prakash Chandrasekaran
@bheesma
@EthanBeauvais were you able to resolve this error?
(psycopg2.OperationalError) could not connect to server: Connection timed out Is the server running on host “dbinfo” (IP address) and accepting TCP/IP connections on port 5432?
lizeyang
@zeyangli_gitlab
Hello everyone, I have a question about Apache Airflow 1.10.10. I found errors like "airflow.exceptions.AirflowException: Celery command failed". Can anyone help me?
This is very strange: the upstream task has succeeded, but the downstream task is not running; it is marked upstream_failed.