Ashwini Padhy
@ashwini.padhy89_gitlab
Hi guys,
I have recently migrated from Airflow 1.10 to 2 and am seeing the error below:
"airflow.exceptions.AirflowTaskTimeout: DagBag import timeout for /opt/airflow/dags/orders_dag.py after 30.0s, PID: 11866"
How do I resolve this?
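[Editor's note] The 30.0s in that message matches the default DAG-file parse timeout, so one hedged first step (an assumption about the cause, not a confirmed fix) is to raise it in airflow.cfg while trimming slow top-level code out of orders_dag.py:

```ini
# Sketch, not a confirmed fix: raise the per-file DAG parse timeout in
# airflow.cfg (or set the AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT env var).
[core]
dagbag_import_timeout = 120
```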
ganeshrajan-dev
@ganeshrajan-dev

I'm trying to run Airflow 2.0 as a service (systemd) on Ubuntu 18.04.

Service file path:
sudo vi /etc/systemd/system/airflow-webserver.service

This is my unit file:

[Unit]
Description=Airflow webserver daemon
After=network.target
#Wants=postgresql-10.service
#After=network.target mysql.service
#Wants=mysql.service

[Service]
#EnvironmentFile=/usr/local/bin/airflow
#EnvironmentFile=/etc/environment
#Environment="PATH=/home/ubuntu/anaconda3/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
#User=airflow
#Environment="PATH=/home/ubuntu/python/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
#Group=airflow
#RuntimeDirectory=/home/ubuntu/airflow
#RuntimeDirectoryMode=0775
Environment=PATH=/home/airflow/.local/bin:$PATH
PIDFile=/home/ubuntu/airflow/airflow.pid
Type=simple
#ExecStart=/home/ubuntu/python/envs/airflow/bin/airflow webserver -p 8080 --pid /home/ubuntu/airflow/airflow-webserver.pid
ExecStart=/usr/local/bin/airflow webserver --pid /home/ubuntu/airflow/airflow.pid
Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target

This is the error I'm getting:

● airflow-webserver.service - Airflow webserver daemon
   Loaded: loaded (/etc/systemd/system/airflow-webserver.service; enabled; vendor preset: enabled)
   Active: active (running) since Tue 2021-02-02 12:22:49 UTC; 4s ago
 Main PID: 15693 (airflow)
    Tasks: 2 (limit: 4402)
   CGroup: /system.slice/airflow-webserver.service
           └─15693 /usr/bin/python3 /usr/local/bin/airflow webserver --pid /home/ubuntu/airflow/airflow.pid

Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [parameters: ('2021-02-02 12:22:51.772511', None, None, 'cli_webserver', None, 'root', '{"host_name": "ip-150-31-11-187", "full_c
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: (Background on this error at: http://sqlalche.me/e/13/e3q8)
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:   ____________       _____________
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:  ____    |__( )_________  __/__  /________      __
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [2021-02-02 12:22:51,815] {dagbag.py:440} INFO - Filling up the DagBag from /dev/null
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [2021-02-02 12:22:51,867] {manager.py:727} WARNING - No user yet created, use flask fab command to do it.

@groodt @srikrishnavamsi_twitter
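[Editor's note] Two things stand out in the unit above, offered as guesses rather than a confirmed diagnosis: systemd does not expand $PATH inside Environment=, and the sqlalche.me/e/13/e3q8 link is SQLAlchemy's OperationalError, which usually means the metadata database was unreachable from the service's environment. A minimal [Service] sketch, with paths assumed from the message:

```ini
[Service]
# systemd does not expand $PATH in Environment=, so list directories explicitly
Environment="PATH=/home/airflow/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
# ensure the service sees the same AIRFLOW_HOME (and DB settings) as your shell
Environment="AIRFLOW_HOME=/home/ubuntu/airflow"
ExecStart=/usr/local/bin/airflow webserver --pid /home/ubuntu/airflow/airflow.pid
```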

SP
@YellaSharmila_twitter
Hi guys! How do I enable localhost for Airflow? The page displays an error.
I did verify the Postgres config and the Airflow configs.
Gelinger Media
@gelinger777
airflow-init_1 | Traceback (most recent call last):
airflow-init_1 |   File "/home/airflow/.local/bin/airflow", line 5, in <module>
airflow-init_1 |     from airflow.__main__ import main
airflow-init_1 | ModuleNotFoundError: No module named 'airflow'
(the same traceback repeats five times)
intruder777
@intruder777:matrix.org
[m]
Hi. Can I use airflow to generate tasks on the fly based on records in database?
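[Editor's note] Yes: DAG files are plain Python, so a loop over some sequence can mint one task per record. A minimal sketch of the pattern, with plain callables standing in for Airflow operators so it runs anywhere; fetching records from a database at parse time is an assumption you would want to cache, since DAG files are re-parsed constantly:

```python
# Hypothetical records; in practice these might come from a (cached) DB query.
records = [
    {"id": 1, "table": "orders"},
    {"id": 2, "table": "customers"},
]

def make_task(record):
    """Build one task callable per record (the closure captures the record)."""
    def run():
        return f"processed {record['table']}"
    return run

# In a real DAG file this loop would create one operator per record
# inside a DAG context instead of plain callables.
tasks = {f"process_{r['table']}": make_task(r) for r in records}
```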
Sameer1808
@Sameer1808

Hello guys, I'm new to Airflow. I'm trying to integrate Airflow with RBAC + LDAP but am facing the error below.
Airflow version: 2.0.1
MySQL version: 5.7.25

DEBUG - LDAP indirect bind with: CN=abcd,OU=SERVICE ACCT,DC=,DC=,DC=*
[2021-03-29 19:03:29,860] {manager.py:834} DEBUG - LDAP BIND indirect OK
[2021-03-29 19:03:29,861] {manager.py:853} DEBUG - LDAP bind failure: user not found
[2021-03-29 19:03:29,932] {manager.py:226} INFO - Updated user Sameer Sharma
[2021-03-29 19:03:29,932] {manager.py:928} WARNING - Login Failed for user: Sameer3.Sharma

Can anyone help with this one?

phemmmie
@phemmmie
Hi everyone, my Airflow DAG is showing a time inconsistent with what the log shows: the Airflow UI is off by 4 hours. How can I fix it?
Rahul1chandra
@Rahul1chandra
Hi team, I'm facing an issue where a DAG is stuck in the scheduled state forever and none of its tasks get processed.
As a workaround I'm scaling my Airflow and Postgres down and up.
haoguoxuan
@haoguoxuan
The Airflow webserver is not able to pick up DAGs; here's the message:

/home/ubuntu/.pyenv/versions/3.8.10/envs/airflowenv/lib/python3.8/site-packages/pandas/compat/__init__.py:109 UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError.


[garbled Airflow ASCII-art banner]
[2021-07-08 10:00:01,260] {dagbag.py:496} INFO - Filling up the DagBag from /dev/null
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -

airflow version 2.1.1

I set the DAG folder path in the airflow.cfg file like this:

[core]
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository. This path must be absolute.
dags_folder = /home/ubuntu/airflow/dags

Thuc Nguyen Canh
@thucnc
Hello, I have a very urgent production issue with airflow <v0.19: all my tasks are stuck in queued. I use the LocalExecutor. Can anyone help?
If any expert can help, I will set up a call; very stressed today with this issue.
Thuc Nguyen Canh
@thucnc
[2021-08-17 07:07:15,579] {jobs.py:1109} INFO - Tasks up for execution:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,583] {jobs.py:1144} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
[2021-08-17 07:07:15,586] {jobs.py:1180} INFO - DAG Dashboard_C2C has 0/2 running and queued tasks
[2021-08-17 07:07:15,586] {jobs.py:1218} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,597] {jobs.py:1301} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [queued]>
[2021-08-17 07:07:15,597] {jobs.py:1343} INFO - Sending ('Dashboard_C2C', 'Dashboard_C2C', datetime.datetime(2021, 8, 17, 6, 40, 14, 195012, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 1 and queue default
[2021-08-17 07:07:15,598] {base_executor.py:56} INFO - Adding to queue: airflow run Dashboard_C2C Dashboard_C2C 2021-08-17T06:40:14.195012+00:00 --local -sd /root/airflow-dags/dags/Dashboard_C2C.py
Process QueuedLocalWorker-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/local_executor.py", line 113, in run
    key, command = self.task_queue.get()
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 113, in get
    return _ForkingPickler.loads(res)
TypeError: __init__() missing 5 required positional arguments: 'tz', 'utc_offset', 'is_dst', 'dst', and 'abbrev'
Currently I get the above issue with the LocalExecutor.
Thuc Nguyen Canh
@thucnc
I can make it work with 'airflow test' but it fails with 'airflow run'.
Thuc Nguyen Canh
@thucnc
Hi, I fixed it by reinstalling Airflow, so maybe there was a library conflict.
Rohith Madamshetty
@rohithbittu33
Can anyone explain how the Kubernetes pod gets created, with all its specs, when we use the KubernetesPodOperator in Airflow? In the operator we specify arguments, Kubernetes secrets, resources, and DAG parameters, but how do these end up as a pod in the cluster? I see some extra env variables added to my pod, visible in its spec file in the cluster.
Avinash Pallerlamudi
@stpavinash
Hello Everyone,
How do I call a BigQuery stored procedure from an Airflow task? Is there an operator for it?
Fran Sánchez
@fj-sanchez
Do you know when 2.2.1 is expected?
Jitendra-ICLR
@Jitendra-ICLR
Hello, I am getting this error on Airflow:
import pwd
ModuleNotFoundError: No module named 'pwd'
mashirali
@mashirali
Hi, I'm looking for a precise answer: can we create a DAG dynamically using the Airflow API?
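[Editor's note] For what it's worth, Airflow's REST API does not create DAGs; DAGs exist only as parsed Python files. The usual workaround is one generator file that mints several DAG objects at module level. A sketch with a stub class standing in for airflow.models.DAG (an assumption so the snippet runs without Airflow installed):

```python
class StubDAG:
    """Hypothetical stand-in for airflow.models.DAG, to show the pattern only."""
    def __init__(self, dag_id):
        self.dag_id = dag_id

# Hypothetical config source; could be a YAML file or an Airflow Variable.
configs = ["sales", "inventory"]

dags = {}
for name in configs:
    dag_id = f"etl_{name}"
    dags[dag_id] = StubDAG(dag_id)

# The scheduler only discovers DAG objects at module level, hence globals().
globals().update(dags)
```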
krishnachiranjeevi
@krishnachiranjeevi

Hello All,
Looking for some help with an issue I'm facing after migrating from Airflow 1.10.11 to 2.2.3. I'm unable to execute DAGs from the UI, as the task goes into the queued state. I created the pod_template_file as suggested in the migration steps but still get the error below:
"airflow.exceptions.AirflowException: Dag 'xxxxxxx' could not be found; either it does not exist or it failed to parse"

But I can see the DAGs inside the webserver and scheduler in the exact location the task is trying to read from, which is "/opt/airflow/dags/repo/". Surprisingly, I am able to trigger the DAG from the webserver's command line, and the DAG completes successfully. Any help, please?

Mikhail
@Armadik
Hello everyone, I have my access control organized and would like to disable the login page. According to the documentation, AUTH_ROLE_PUBLIC = 'Admin' should help, but the problem is not solved... Has anyone encountered something similar?
Vigneshwar Subramanian
@wiki9990:matrix.org
[m]

Hi All, Looking for some help in troubleshooting my airflow setup
I have set up Airflow in Azure Cloud (Azure Container Apps) and attached an Azure File Share as an external mount/volume.

  1. I ran the airflow init service; it created the airflow.cfg and webserver_config.py files in AIRFLOW_HOME (/opt/airflow), which is actually an Azure-mounted file system.
  2. I ran the airflow webserver service; it created the airflow-webserver.pid file in the same AIRFLOW_HOME.
  3. Now the problem is that all the files created above are owned by the root user and group, not the airflow user (50000), even though I set the env variable AIRFLOW_UID to 50000 during creation of the container app. Because of this my webservers are not starting, throwing the error below:
     PermissionError: [Errno 1] Operation not permitted: '/opt/airflow/airflow-webserver.pid'

Attached screenshot for reference

Your help is much appreciated!
Josué Pradas
@jpradass
Hello everyone, does anyone know why the error "cannot pickle 'LockType' object" usually happens when clearing a task in Airflow?
deepan ramachandran
@yahdeepan:matrix.org
[m]
I have an issue with ModuleNotFoundError: No module named 'pwd'.
I tried to install without Docker and WSL2. Any help?
Rishi Kaushal
@rishikaushal:matrix.org
[m]
I'm getting the following error when the Airflow webserver goes down: ERROR - No response from gunicorn master within 120 seconds.
The error appears in the webserver.log file.
Can anyone tell me what could cause this error and how to fix it?
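[Editor's note] The 120 seconds in that message matches the webserver's default master timeout, so one hedged guess (not a confirmed diagnosis) is a host too loaded to respond in time; the knob lives in airflow.cfg:

```ini
# Assumption: the webserver monitor restarts gunicorn when the master does not
# respond within web_server_master_timeout seconds (default 120).
[webserver]
web_server_master_timeout = 300
```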
Mikhail
@Armadik
Hello people, how can I display the connections that we pull from Vault in the web UI?