Daniel Papp
@jakab922
It seems to be working fine
I just don't know how to use the airflow cli tool to upload a dag
Can someone help me with that?
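For reference, DAGs are not uploaded through the CLI; the scheduler and webserver pick up any .py file placed in the configured dags_folder. A minimal sketch, assuming an Airflow 2.x CLI and the default AIRFLOW_HOME, with the file and DAG names as placeholders:

cp my_dag.py ~/airflow/dags/
airflow dags list              # confirm the DAG file was parsed
airflow dags trigger my_dag    # optionally start a run

On 1.10.x the equivalent commands are airflow list_dags and airflow trigger_dag my_dag.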
Akshay Krishnan R
@akshaykrishnanr_twitter
Hi, what is bind_password in Apache Airflow? And why is the official documentation asking to set it to insecure?
https://airflow.apache.org/docs/stable/security.html#ldap
Ganapathi Chidambaram
@ganapathichidambaram
Can anybody please guide me on getting filenames by pattern through the FileSensor?
I need to listen for files in a particular local folder, get the list of matching filenames, and pass it to another task that processes those files.
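A minimal sketch of one way to do this on Airflow 1.10.x: a FileSensor waits on the watched folder, then a PythonOperator globs for the pattern and returns the matching filenames so downstream tasks can pull them from XCom. The folder, pattern, and connection id are placeholders.

from datetime import datetime
import glob

from airflow import DAG
from airflow.contrib.sensors.file_sensor import FileSensor
from airflow.operators.python_operator import PythonOperator

WATCH_DIR = "/data/incoming"      # placeholder: local folder to watch
PATTERN = WATCH_DIR + "/*.csv"    # placeholder: filename pattern

def list_matching_files(**context):
    # the returned list is pushed to XCom for the processing task
    return glob.glob(PATTERN)

with DAG("watch_local_folder",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@hourly",
         catchup=False) as dag:

    wait_for_files = FileSensor(
        task_id="wait_for_files",
        fs_conn_id="fs_default",   # placeholder filesystem connection
        filepath=WATCH_DIR,
        poke_interval=60,
    )

    collect_files = PythonOperator(
        task_id="collect_files",
        python_callable=list_matching_files,
        provide_context=True,
    )

    wait_for_files >> collect_files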
Ganapathi Chidambaram
@ganapathichidambaram
Please help me write a plugin for Airflow version 1.10.6.
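A minimal 1.10.x plugin sketch, assuming the file is saved under $AIRFLOW_HOME/plugins/; the operator and plugin names are placeholders:

from airflow.models import BaseOperator
from airflow.plugins_manager import AirflowPlugin
from airflow.utils.decorators import apply_defaults


class HelloOperator(BaseOperator):
    """Toy operator that just logs a greeting."""

    @apply_defaults
    def __init__(self, name, *args, **kwargs):
        super(HelloOperator, self).__init__(*args, **kwargs)
        self.name = name

    def execute(self, context):
        self.log.info("Hello %s", self.name)


class HelloPlugin(AirflowPlugin):
    name = "hello_plugin"
    operators = [HelloOperator]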
ebernhardson
@ebernhardson
What will Airflow do if a task completes but the SQLAlchemy connection is unavailable? Our DB is going down for maintenance for a couple of hours tomorrow. I'm planning to stop the scheduler so nothing is running then, but I'm wondering what would happen if we just let it go.
Emma Grotto
@GrottoEmma_twitter
Hey Everyone!
I downgraded airflow from 1.10.12 to 1.10.10 and I am getting:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.DuplicateColumn) column "operator" of relation "task_instance" already exists
running resetdb is not an option for me. Does anyone know how to fix this?
Neo-vijayk
@Neo-vijayk
Hello team,
Can anyone help me with this issue? We are facing a problem where the tasks are not getting started; they just stay queued due to a Celery task timeout. Our Airflow scheduler encounters this AirflowTaskTimeout error:
"name":"airflow.executors.celery_executor.CeleryExecutor", "level":"ERROR", "message":"Error sending Celery task:Timeout, PID: 3844
Celery Task ID: ('tpf_daily_price', 'start', datetime.datetime(2020, 11, 3, 2, 30, tzinfo=<Timezone [UTC]>), 1)
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/kombu/utils/functional.py", line 42, in call
    return self.value
AttributeError: 'ChannelPromise' object has no attribute 'value'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 921, in create_channel
    return self._avail_channels.pop()
IndexError: pop from empty list

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 118, in send_task_to_executor
    result = task.apply_async(args=[command], queue=queue)
  File "/usr/local/lib/python3.7/site-packages/celery/app/task.py", line 565, in apply_async
    options
  File "/usr/local/lib/python3.7/site-packages/celery/app/base.py", line 718, in send_task
    amqp.send_task_message(P, name, message, options)
  File "/usr/local/lib/python3.7/site-packages/celery/app/amqp.py", line 547, in send_task_message
    properties
  File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 181, in publish
    exchange_name, declare,
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 518, in _ensured
    return fun(*args, kwargs)
  File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 187, in _publish
    channel = self.channel
  File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 209, in _get_channel
    channel = self._channel = channel()
  File "/usr/local/lib/python3.7/site-packages/kombu/utils/functional.py", line 44, in call
    value = self.value = self.contract()
  File "/usr/local/lib/python3.7/site-packages/kombu/messaging.py", line 224, in <lambda>
    channel = ChannelPromise(lambda: connection.default_channel)
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 866, in default_channel
    self.ensure_connection(conn_opts)
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 430, in ensure_connection
    callback, timeout=timeout)
  File "/usr/local/lib/python3.7/site-packages/kombu/utils/functional.py", line 343, in retry_over_time
    return fun(*args, kwargs)
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 283, in connect
    return self.connection
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 837, in connection
    self._connection = self._establish_connection()
  File "/usr/local/lib/python3.7/site-packages/kombu/connection.py", line 792, in _establish_connection
    conn = self.transport.establish_connection()
  File "/usr/local/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 941, in establish_connection
    self._avail_channels.append(self.create_channel(self))
  File "/usr/local/lib/python3.7/site-packages/kombu/transport/virtual/base.py", line 923, in create_channel
    channel = self.Channel(connection)
  File "/usr/local/lib/python3.7/site-packages/kombu/transport/redis.py", line 521, in init
    self.client.ping()
  File "/usr/local/lib/python3.7/site-packages/redis/client.py", line 1351, in ping
    return self.execute_command('PING')
  File "/usr/lo
Appreciate any advice or a workaround on this issue, please.
Neo-vijayk
@Neo-vijayk
Ours is Airflow 1.10.3 with Redis and MySQL, and the nodes are on EC2 instances.
Neo-vijayk
@Neo-vijayk
Hello all, can anyone help me with this issue? We are facing a problem where tasks are not getting started; they just stay queued because of a Celery task timeout. The Celery commands fail/time out with the same CeleryExecutor "Error sending Celery task: Timeout" traceback posted above.
aakashroy-ds
@aakashroy-ds

Hi Experts,

I have Apache Airflow running on an EC2 instance (Ubuntu). Everything is running fine. The DB is SQLite and the executor is the Sequential Executor (the defaults). But now I would like to run some DAGs that need to run at the same time, every hour and every 2 minutes. My question is: how can I upgrade my current setup to the Celery executor and a Postgres DB to get the advantage of parallel execution?

Will it work if I install and set up Postgres, RabbitMQ, and Celery, and make the necessary changes in the airflow.cfg configuration file?

Or do I need to re-install everything from scratch (including Airflow)?

Please guide me on this.
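A minimal sketch of the kind of changes involved, assuming Postgres and RabbitMQ run on the same host and the celery/postgres extras are installed; hostnames and credentials are placeholders. Airflow itself normally only needs to be reconfigured and pointed at the new metadata DB, not reinstalled:

pip install 'apache-airflow[celery,postgres]'

# airflow.cfg
[core]
executor = CeleryExecutor
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@localhost:5432/airflow

[celery]
broker_url = amqp://guest:guest@localhost:5672//
result_backend = db+postgresql://airflow:airflow@localhost:5432/airflow

Then initialize the new metadata database (airflow initdb on 1.10.x, airflow db init on 2.x) and run the scheduler plus one or more workers (airflow worker on 1.10.x, airflow celery worker on 2.x); existing history stays in the old SQLite file unless it is migrated separately.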

Kirill
@KIRILLxBREAK
Hi all.
I'm trying to deploy Airflow using Docker (building the image with the Dockerfile from the official repo https://github.com/apache/airflow). Can I change the user in the container to root? How can I do that?
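A minimal sketch of the usual Docker ways to do that with the official image (the tag is a placeholder); whether running Airflow itself as root is a good idea is a separate question:

# one-off shell as root
docker run -it --user root apache/airflow:2.0.1 bash

# or bake it into your own image, switching back afterwards
FROM apache/airflow:2.0.1
USER root
RUN apt-get update && apt-get install -y --no-install-recommends vim
USER airflow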
Ashish
@ashish05_gitlab

Hi all -
I'm trying to deploy airflow on kubernetes, and I'm running into this error:

Traceback (most recent call last):
  File "/app/py3ven/bin/airflow", line 21, in <module>
    from airflow import configuration
  File "/app/py3ven/lib/python3.6/site-packages/airflow/__init__.py", line 31, in <module>
    from airflow.utils.log.logging_mixin import LoggingMixin
  File "/app/py3ven/lib/python3.6/site-packages/airflow/utils/__init__.py", line 24, in <module>
    from .decorators import apply_defaults as _apply_defaults
  File "/app/py3ven/lib/python3.6/site-packages/airflow/utils/decorators.py", line 34, in <module>
    from airflow import settings
  File "/app/py3ven/lib/python3.6/site-packages/airflow/settings.py", line 83, in <module>
    prefix=conf.get('scheduler', 'statsd_prefix'))
  File "/app/py3ven/lib/python3.6/site-packages/statsd/client/udp.py", line 35, in __init__
    host, port, fam, socket.SOCK_DGRAM)[0]
  File "/usr/local/lib/python3.6/socket.py", line 745, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -5] No address associated with hostname

My config is as follows:
base_url = http://localhost:8080
web_server_host = 0.0.0.0
web_server_port = 8080

Does anybody know what could be wrong?
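The socket.gaierror is raised while the statsd client tries to resolve the configured metrics host at import time (settings.py builds the statsd client on startup); a minimal airflow.cfg sketch of the settings involved, with the hostname as a placeholder:

[scheduler]
# either disable metrics...
statsd_on = False
# ...or point them at a host the pod can actually resolve
# statsd_on = True
# statsd_host = statsd.example.com
# statsd_port = 8125
# statsd_prefix = airflow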

Ramonzin
@RamonBarros19_twitter

To run the jar in this way, you need to:

Either change Spark Master Address in template projects or simply delete it. Currently, they are hard coded to local[4] which means run locally with 4 cores.

Change the dependency packaging scope of Apache Spark from "compile" to "provided". This is a common packaging strategy in Maven and SBT which means do not package Spark into your fat jar. Otherwise, this may lead to a huge jar and version conflicts!

Make sure the dependency versions in build.sbt and POM.xml are consistent with your Spark version.

My difficulty is with this submission part: I don't know where to make the changes from the steps you sent. If you can explain the locations, I would be grateful.
ramonbarrosk
@ramonbarrosk

How do I create a PolygonRDD from a shapefile input?
input_location = ShapefileReader.readToGeometryRDD(sc, "Estados")
in Python.

The data in the shapefile is of the multi-polygon type; does the PolygonRDD function in Python support this data type as well?

Anurag-Shetty
@Anurag-Shetty
I am getting this error while running the example DAGs:
from airflow.operators.bash import BashOperator
ModuleNotFoundError: No module named 'airflow.operators.bash'
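For what it's worth, airflow.operators.bash only exists in Airflow 2.x; on 1.10.x the module is airflow.operators.bash_operator, so the import has to match the installed version:

# Airflow 2.x
from airflow.operators.bash import BashOperator

# Airflow 1.10.x
from airflow.operators.bash_operator import BashOperator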
pravinborate04
@pravinborate04
Airflow 2.0.0: not able to log in to the webserver at http://0.0.0.0:8081/login/?next=http%3A%2F%2F0.0.0.0%3A8081%2Fhome
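In Airflow 2.0 the webserver only accepts logins for users created through the CLI (or another auth backend); a minimal sketch of creating an admin account, with all values as placeholders:

airflow users create \
    --username admin \
    --password admin \
    --firstname Ada \
    --lastname Lovelace \
    --role Admin \
    --email admin@example.com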
Cristián Andrés
@slotbite
Hello everyone, a question: has anyone had this connection error with Postgres?
psycopg2.OperationalError: could not translate host name "postgresql" to address: Temporary failure in name resolution
I am running Airflow with Docker as a set of services (redis, postgres, worker, webserver, scheduler). My task requires high CPU consumption (around 90%) but not much network, so is it possible that this affects connectivity to Postgres? The service in docker-compose:
postgresql:
  image: 'docker.io/bitnami/postgresql:10-debian-10'
  # Uncomment these lines to persist data on the local filesystem.
  #volumes:
  #  - 'postgresql_data:/bitnami/postgresql'
  deploy:
    restart_policy:
      condition: on-failure
      delay: 8s
      max_attempts: 3
  environment:
    - POSTGRESQL_DATABASE=airflow
    - POSTGRESQL_USERNAME=airflow
    - POSTGRESQL_PASSWORD=airflow
    - ALLOW_EMPTY_PASSWORD=yes
    - POSTGRES_HOST_AUTH_METHOD=trust
  networks:
    - default
Ric Dong
@ricdong
Hi, we are running Airflow in Celery mode, and we found a job stuck in the up_for_retry state; each time the Airflow worker sets a new retry time but never actually runs the task.
Below is the log:
[2021-01-08 03:35:24,430] {cli.py:374} INFO - Running on host emr-header-1.cluster-191613
[2021-01-08 03:35:24,458] {models.py:1215} DEBUG - <TaskInstance: data_etl.analysis_overlord_study_metrics_daily 2021-01-07 00:00:00 [up_for_retry]> dependency 'Previous Dagrun State' PASSED: True, This task instance was the first task instance for its task.
[2021-01-08 03:35:24,458] {models.py:1215} DEBUG - <TaskInstance: data_etl.analysis_overlord_study_metrics_daily 2021-01-07 00:00:00 [up_for_retry]> dependency 'Task Instance State' PASSED: True, Task state up_for_retry was valid.
[2021-01-08 03:35:24,464] {models.py:1215} DEBUG - <TaskInstance: data_etl.analysis_overlord_study_metrics_daily 2021-01-07 00:00:00 [up_for_retry]> dependency 'Not In Retry Period' PASSED: False, Task is not ready for retry yet but will be retried automatically. Current date is 2021-01-08T03:35:24.464363 and task will be retried at 2021-01-08T03:35:50.245051.
[2021-01-08 03:35:24,464] {models.py:1190} INFO - Dependencies not met for <TaskInstance: data_etl.analysis_overlord_study_metrics_daily 2021-01-07 00:00:00 [up_for_retry]>, dependency 'Not In Retry Period' FAILED: Task is not ready for retry yet but will be retried automatically. Current date is 2021-01-08T03:35:24.464363 and task will be retried at 2021-01-08T03:35:50.245051.
Xingze Zhang
@diggzhang
Hi there, Is there any way to get the exception of the task in on_failure_callback?
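The failure callback receives the task context dict, and the raised exception is included in it; a minimal sketch inside a DAG file, assuming Airflow 2.x (task and callable names are placeholders):

from airflow.operators.python import PythonOperator

def do_work():
    raise ValueError("boom")                 # placeholder failing task

def notify_failure(context):
    exc = context.get("exception")           # the exception that failed the task
    ti = context["task_instance"]
    print(f"{ti.dag_id}.{ti.task_id} failed with: {exc!r}")

run_step = PythonOperator(
    task_id="run_step",
    python_callable=do_work,
    on_failure_callback=notify_failure,
)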
Ashwini Padhy
@ashwini.padhy89_gitlab
Hi Guys,
I have recently migrated to Airflow 2 from 1.10 and see the error below:
airflow.exceptions.AirflowTaskTimeout: DagBag import timeout for /opt/airflow/dags/orders_dag.py after 30.0s, PID: 11866
How do I resolve this?
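The 30.0s comes from the DAG parsing timeout; raising it is a quick workaround, though the usual root cause is heavy top-level work (network or DB calls, large imports) in the DAG file itself. The relevant airflow.cfg setting:

[core]
# default is 30 seconds
dagbag_import_timeout = 120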
ganeshrajan-dev
@ganeshrajan-dev

I'm trying to run Airflow 2.0 as a service (systemd) on Ubuntu 18.04.

Service file path:
sudo vi /etc/systemd/system/airflow-webserver.service

This is my unit file:

[Unit]
Description=Airflow webserver daemon
After=network.target
#Wants=postgresql-10.service
#After=network.target mysql.service
#Wants=mysql.service

[Service]
#EnvironmentFile=/usr/local/bin/airflow
#EnvironmentFile=/etc/environment
#Environment="PATH=/home/ubuntu/anaconda3/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
#User=airflow
#Environment="PATH=/home/ubuntu/python/envs/airflow/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
#Group=airflow
#RuntimeDirectory=/home/ubuntu/airflow
#RuntimeDirectoryMode=0775
Environment=PATH=/home/airflow/.local/bin:$PATH
PIDFile=/home/ubuntu/airflow/airflow.pid
Type=simple
#ExecStart=/home/ubuntu/python/envs/airflow/bin/airflow webserver -p 8080 --pid /home/ubuntu/airflow/airflow-webserver.pid
ExecStart=/usr/local/bin/airflow webserver --pid /home/ubuntu/airflow/airflow.pid
Restart=on-failure
RestartSec=5s
PrivateTmp=true

[Install]
WantedBy=multi-user.target

This is the error I'm getting:

● airflow-webserver.service - Airflow webserver daemon
   Loaded: loaded (/etc/systemd/system/airflow-webserver.service; enabled; vendor preset: enabled)
   Active: active (running) since Tue 2021-02-02 12:22:49 UTC; 4s ago
 Main PID: 15693 (airflow)
    Tasks: 2 (limit: 4402)
   CGroup: /system.slice/airflow-webserver.service
           └─15693 /usr/bin/python3 /usr/local/bin/airflow webserver --pid /home/ubuntu/airflow/airflow.pid

Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (?, ?, ?, ?, ?, ?, ?)]
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [parameters: ('2021-02-02 12:22:51.772511', None, None, 'cli_webserver', None, 'root', '{"host_name": "ip-150-31-11-187", "full_c
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: (Background on this error at: http://sqlalche.me/e/13/e3q8)
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:   ____________       _____________
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:  ____    |__( )_________  __/__  /________      __
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]:  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [2021-02-02 12:22:51,815] {dagbag.py:440} INFO - Filling up the DagBag from /dev/null
Feb 02 12:22:51 ip-150-31-11-187 airflow[15693]: [2021-02-02 12:22:51,867] {manager.py:727} WARNING - No user yet created, use flask fab command to do it.

@groodt @srikrishnavamsi_twitter

SP
@YellaSharmila_twitter
Hi guys! How do I enable localhost for Airflow? The page displays an error.
[screenshot: image.png]
I did verify the Postgres config and the Airflow configs.
Gelinger Media
@gelinger777
File "/home/airflow/.local/bin/airflow", line 5, in
flow-init_1 | Traceback (most recent call last): airflow-init_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module> airflow-init_1 | from airflow.__main__ import main airflow-init_1 | ModuleNotFoundError: No module named 'airflow' airflow-init_1 | Traceback (most recent call last): airflow-init_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module> airflow-init_1 | from airflow.__main__ import main airflow-init_1 | ModuleNotFoundError: No module named 'airflow' airflow-init_1 | Traceback (most recent call last): airflow-init_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module> airflow-init_1 | from airflow.__main__ import main airflow-init_1 | ModuleNotFoundError: No module named 'airflow' airflow-init_1 | Traceback (most recent call last): airflow-init_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module> airflow-init_1 | from airflow.__main__ import main airflow-init_1 | ModuleNotFoundError: No module named 'airflow' airflow-init_1 | Traceback (most recent call last): airflow-init_1 | File "/home/airflow/.local/bin/airflow", line 5, in <module> airflow-init_1 | from airflow.__main__ import main airflow-init_1 | ModuleNotFoundError: No module named 'airflow'
intruder777
@intruder777:matrix.org [m]
Hi. Can I use airflow to generate tasks on the fly based on records in database?
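Yes, as long as the tasks can be generated when the DAG file is parsed; a minimal sketch assuming Airflow 2.x with the Postgres provider, a connection id my_db, and a jobs table (all placeholders). Note the query runs on every scheduler parse of the file:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def process(record_id, **_):
    print(f"processing record {record_id}")


with DAG("dynamic_from_db",
         start_date=datetime(2021, 1, 1),
         schedule_interval="@daily",
         catchup=False) as dag:

    # one task per enabled row; new rows become new tasks on the next parse
    rows = PostgresHook(postgres_conn_id="my_db").get_records(
        "SELECT id FROM jobs WHERE enabled"
    )
    for (record_id,) in rows:
        PythonOperator(
            task_id=f"process_{record_id}",
            python_callable=process,
            op_kwargs={"record_id": record_id},
        )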
Sameer1808
@Sameer1808

Hello guys, new to Airflow; trying to integrate Airflow with RBAC + LDAP but facing the error below:
airflow version : 2.0.1
mysql version : 5.7.25

DEBUG - LDAP indirect bind with: CN=abcd,OU=SERVICE ACCT,DC=,DC=,DC=*
[2021-03-29 19:03:29,860] {manager.py:834} DEBUG - LDAP BIND indirect OK
[2021-03-29 19:03:29,861] {manager.py:853} DEBUG - LDAP bind failure: user not found
[2021-03-29 19:03:29,932] {manager.py:226} INFO - Updated user Sameer Sharma
[2021-03-29 19:03:29,932] {manager.py:928} WARNING - Login Failed for user: Sameer3.Sharma

can anyone help on this one?
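"LDAP bind failure: user not found" after a successful indirect bind often means the search base or UID attribute doesn't match the account being looked up; a minimal webserver_config.py sketch of the Flask-AppBuilder LDAP settings Airflow 2.x RBAC uses (all DNs, hostnames, and credentials are placeholders):

from flask_appbuilder.security.manager import AUTH_LDAP

AUTH_TYPE = AUTH_LDAP
AUTH_LDAP_SERVER = "ldap://ldap.example.com"
AUTH_LDAP_SEARCH = "OU=Users,DC=example,DC=com"   # base DN the login is searched under
AUTH_LDAP_UID_FIELD = "sAMAccountName"            # attribute matched against the typed username
AUTH_LDAP_BIND_USER = "CN=svc-airflow,OU=SERVICE ACCT,DC=example,DC=com"
AUTH_LDAP_BIND_PASSWORD = "change-me"
AUTH_USER_REGISTRATION = True                     # auto-create users on first login
AUTH_USER_REGISTRATION_ROLE = "Viewer"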

phemmmie
@phemmmie
Hi everyone, I am having an issue where my Airflow DAG shows times inconsistent with what the log shows. The Airflow UI shows a time difference of 4 hours. How can I fix it?
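A fixed 4-hour offset is often just the UI rendering timestamps in UTC while the logs use local time; a sketch of the airflow.cfg settings that control this in Airflow 2.x (the timezone name is a placeholder):

[core]
default_timezone = America/New_York      # timezone applied to naive schedule dates

[webserver]
default_ui_timezone = America/New_York   # timezone the UI uses to display times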
Rahul1chandra
@Rahul1chandra
Hi team, I'm facing an issue where the DAG is stuck in the scheduled state forever and none of the tasks get processed.
As a workaround I'm scaling my Airflow and Postgres down and up.
haoguoxuan
@haoguoxuan
The Airflow webserver is not able to pick up DAGs; here's the message:

/home/ubuntu/.pyenv/versions/3.8.10/envs/airflowenv/lib/python3.8/site-packages/pandas/compat/__init__.py:109 UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError.


[Airflow ASCII-art banner]
[2021-07-08 10:00:01,260] {dagbag.py:496} INFO - Filling up the DagBag from /dev/null
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -

airflow version 2.1.1

I set the DAG folder path in the airflow.cfg file as:

[core]
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repository. This path must be absolute.
dags_folder = /home/ubuntu/airflow/dags

Thuc Nguyen Canh
@thucnc
Hello, I have a production issue with Airflow (< v0.19), very urgent: all my tasks are stuck in queued. I'm using the LocalExecutor. Can anyone help?
If any expert can help, I will set up a call; very stressed today with this issue.
Thuc Nguyen Canh
@thucnc
[2021-08-17 07:07:15,579] {jobs.py:1109} INFO - Tasks up for execution:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,583] {jobs.py:1144} INFO - Figuring out tasks to run in Pool(name=None) with 128 open slots and 1 task instances in queue
[2021-08-17 07:07:15,586] {jobs.py:1180} INFO - DAG Dashboard_C2C has 0/2 running and queued tasks
[2021-08-17 07:07:15,586] {jobs.py:1218} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [scheduled]>
[2021-08-17 07:07:15,597] {jobs.py:1301} INFO - Setting the follow tasks to queued state:
        <TaskInstance: Dashboard_C2C.Dashboard_C2C 2021-08-17 06:40:14.195012+00:00 [queued]>
[2021-08-17 07:07:15,597] {jobs.py:1343} INFO - Sending ('Dashboard_C2C', 'Dashboard_C2C', datetime.datetime(2021, 8, 17, 6, 40, 14, 195012, tzinfo=<TimezoneInfo [UTC, GMT, +00:00:00, STD]>), 1) to executor with priority 1 and queue default
[2021-08-17 07:07:15,598] {base_executor.py:56} INFO - Adding to queue: airflow run Dashboard_C2C Dashboard_C2C 2021-08-17T06:40:14.195012+00:00 --local -sd /root/airflow-dags/dags/Dashboard_C2C.py
Process QueuedLocalWorker-2:
Traceback (most recent call last):
  File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/local/lib/python3.6/dist-packages/airflow/executors/local_executor.py", line 113, in run
    key, command = self.task_queue.get()
  File "/usr/lib/python3.6/multiprocessing/queues.py", line 113, in get
    return _ForkingPickler.loads(res)
TypeError: __init__() missing 5 required positional arguments: 'tz', 'utc_offset', 'is_dst', 'dst', and 'abbrev'
Currently I get the above issue with the LocalExecutor.
Thuc Nguyen Canh
@thucnc
I can make it work with 'airflow test' but it fails with 'airflow run'.
Thuc Nguyen Canh
@thucnc
Hi, I fixed it by reinstalling Airflow, so maybe there was a library conflict.
Rohith Madamshetty
@rohithbittu33
Can anyone explain how the Kubernetes pod gets created with all its specs when we use the KubernetesPodOperator in Airflow? In the operator we specify arguments, Kubernetes secrets, resources, and DAG parameters, but how do they get turned into a pod in the Airflow cluster? I see some extra env variables added to my pod, and I can see them in the pod spec file in the cluster.
Avinash Pallerlamudi
@stpavinash
Hello Everyone,
How do I call a BigQuery stored procedure from an Airflow task? Is there an operator for that?
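One way that should work is BigQueryInsertJobOperator from the Google provider with a CALL statement as the query; a minimal sketch, with the project, dataset, procedure, and connection id as placeholders:

from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

call_proc = BigQueryInsertJobOperator(
    task_id="call_stored_proc",
    gcp_conn_id="google_cloud_default",
    configuration={
        "query": {
            "query": "CALL `my-project.my_dataset.my_procedure`()",
            "useLegacySql": False,
        }
    },
)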