dvirgiln
@dvirgiln

Hi, I am receiving an error in the web server UI:

DAG "as400_csv_daily" seems to be missing.

It happens every time I click on a DAG to see its details.

The dags folder is different from the default one:

dags_folder = /git/dags/dags

I entered the container and checked that the DAGs are placed in that folder.

The only related entry I can see in the logs is:

Filling up the DagBag from /dev/null
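A quick way to check whether the DAGs actually parse from that folder (a minimal sketch, using the path quoted in the message above) is to build a DagBag by hand and look at its import errors:

    from airflow.models import DagBag

    # path taken from the dags_folder setting quoted above
    dag_bag = DagBag(dag_folder="/git/dags/dags")
    print(list(dag_bag.dags))        # DAG ids that parsed successfully
    print(dag_bag.import_errors)     # file -> traceback for anything that failed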
Joao Da Silva
@jsilva
Hi, I am looking at the docs at https://airflow.apache.org/docs/stable/executor/celery.html, specifically the section "Workers –> DAG files - Reveal the DAG structure and execute the tasks", and I would like to know if workers can handle serialised DAGs as well? Thanks
Marwane Bel-lahcen
@bellahcenmarwane
Hello,
I get this error when sending mail:
File "/usr/lib/python3.6/socket.py", line 745, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -2] Name or service not known
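That error means the configured SMTP host name cannot be resolved. A minimal check, assuming the host comes from [smtp] smtp_host in airflow.cfg (smtp.example.com below is a placeholder), is to try resolving it from the same container:

    import socket

    # placeholder host; use the value of [smtp] smtp_host from your airflow.cfg
    print(socket.getaddrinfo("smtp.example.com", 587))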
Ash Berlin-Taylor
@ashb
@jsilva No, workers don't currently operate on serialized DAGs; they still need the real DAG files to work against.
We can't serialize custom operators, nor can we (reliably) serialize callbacks or code for PythonOperator, so we took the decision not to change how the workers operate right now.
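To illustrate the point about callbacks, here is a minimal sketch (not Airflow's actual serialization code): a PythonOperator's python_callable is arbitrary Python code, which a JSON-style serializer simply cannot represent:

    import json

    def my_callable(**context):          # hypothetical PythonOperator callback
        return context["ds"]

    try:
        json.dumps({"python_callable": my_callable})
    except TypeError as err:
        print(err)   # functions are not JSON serializable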
Joao Da Silva
@jsilva
@ashb thank you
dvirgiln
@dvirgiln

Hi,

If I deploy Airflow with Authentication:

  AIRFLOW__WEBSERVER__RBAC: True
  AIRFLOW__WEBSERVER__AUTHENTICATE: True

I receive an error:

DAG "as400_csv_daily" seems to be missing.

This error disappears if I set this variable:

AIRFLOW__CORE__STORE_SERIALIZED_DAGS: True

But then the problem appears in the scheduler, when the Kubernetes worker has to execute a DAG:

ModuleNotFoundError: No module named 'dns_operators'

This dns_operators is a Python package folder inside the project.

Any idea?

dvirgiln
@dvirgiln

@ashb could it be because of the serialized DAGs? I am using a custom operator with the Kubernetes Executor.

I imagine it would be better not to serialise the DAGs, but then the authentication fails.

Ash Berlin-Taylor
@ashb
Serialisation only affects webserver - executors and scheduler still use dag files
Problem sounds like the missing module
dvirgiln
@dvirgiln

Thanks @ashb, you have helped me a lot just by ruling out scenarios. In this case the problem comes from custom modules that are defined inside the dags folder.

The DAG execution works fine when everything is deployed standalone or from Docker, but with the Kubernetes executor this happens:

  1. It works fine for a simple DAG that uses BashOperators.
  2. It does not work with custom modules that are embedded in the dags folder.

When the DAG is sent to the worker, is only the Python file that contains the DAG sent, or is the whole dags folder sent?

This is the Kubernetes configuration I set for the workers:

      worker_container_repository = apache/airflow
      worker_container_tag = 1.10.10-python3.7
      worker_container_image_pull_policy = IfNotPresent
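One workaround sometimes used in this situation (a hedged sketch, not an official recommendation; dns_operators is the package mentioned above, while the module and class names are hypothetical) is to make the dags folder itself importable at the top of the DAG file, so a sibling package resolves on the worker as long as the dags folder is synced there:

    import os
    import sys

    # make the folder containing this DAG file importable on the worker,
    # so that a sibling package such as "dns_operators" can be found
    sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

    from dns_operators.dns_operator import DNSOperator   # hypothetical module/class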
dvirgiln
@dvirgiln

What I noticed, and I think it is part of the problem, is this log line:

[2020-06-29 17:48:47,124] {dagbag.py:396} INFO - Filling up the DagBag from /git/dags/dags/isa_store_zoning_daily.py

Why is the DagBag being filled from the specific DAG file, and not from /git/dags/dags/?

In the scheduler config:
dags_folder = /git/dags/dags
Sri-nidhi
@Sri-nidhi
I have a plugin registered with Airflow. Even after removing the plugin script from the plugins folder, it's still available. How do I deregister a plugin or remove its dependencies? Even after removing all contents, it still picks up the data from the __pycache__ folder (in spite of removing it).
Joao Da Silva
@jsilva
Hello, I'm using the remote logging option with remote_base_log_folder storing the log files on S3. This adds the logs to S3 but does not remove them locally. I was wondering how other people deal with large amounts of log files filling up their disks in production. Any tips, please?
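For what it's worth, one common approach (a hedged sketch, not a built-in Airflow feature; the base folder path is an assumption) is a periodic cleanup job that deletes local log files older than some retention window once they have been shipped to S3:

    import os
    import time

    BASE_LOG_FOLDER = "/usr/local/airflow/logs"   # assumed; use your [core] base_log_folder
    MAX_AGE_SECONDS = 7 * 24 * 3600               # keep roughly one week locally

    now = time.time()
    for root, _dirs, files in os.walk(BASE_LOG_FOLDER):
        for name in files:
            path = os.path.join(root, name)
            if now - os.path.getmtime(path) > MAX_AGE_SECONDS:
                os.remove(path)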
simisoz
@simisoz
How to fix this error from the production Docker image: ModuleNotFoundError: No module named 'airflow.providers'
@alltej
@llntjn_twitter
I have the Admin role in Airflow. I recently noticed that I cannot pause/unpause a DAG in the Airflow UI. I am using v1.10.10.
I can pause or unpause it using the airflow cli.
naveeng68
@naveeng68
Hi, how do we pass parameters from Jenkins?
chenzuoli
@chenzuoli
Hi all, my Airflow version is 1.10.8 and there are 3000 DAGs. The gap between one task and the next is about 3 minutes, which is a bit too long for us, so we want to optimize the platform. Any ideas?
airflow.cfg:
parallelism=32
dag_concurrency=16
max_active_runs_per_dag=32
run_duration = -1
num_runs = -1
processor_poll_interval = 1
min_file_process_interval = 0
dag_dir_list_interval = 300
max_thread=15
store_serialized_dags = True
min_serialized_dag_update_interval = 600
chenzuoli
@chenzuoli
I found a bug in versions newer than 1.10.8: if, in the UI, you run a no_status task whose upstream tasks are already successful, you get a "dependencies not met" error. Has anyone else hit this situation?
I updated the Airflow source code:
vim airflow/serialization/serialized_objects.py +582
replacing the line: dag.task_dict[task_id]._upstream_task_ids.add(task_id)  # pylint: disable=protected-access
ManiBharataraju
@ManiBharataraju
Hi all, I have a question regarding the ExternalTaskSensor. The sensor requires the parent DAG's run date to exactly match the child DAG's run date (including the timestamp), otherwise it keeps poking the parent task. Is that correct? I was thinking it would consider the most recent run for the same date if execution_delta/execution_date_fn is not provided.
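For reference, a minimal sketch of how that offset can be expressed (the DAG and task ids below are hypothetical; the arguments are from the Airflow 1.10 ExternalTaskSensor API): by default the sensor looks for exactly the same execution_date, and execution_delta or execution_date_fn is what shifts it:

    from datetime import datetime, timedelta
    from airflow import DAG
    from airflow.sensors.external_task_sensor import ExternalTaskSensor

    with DAG("child_dag", start_date=datetime(2020, 1, 1), schedule_interval="@daily") as dag:
        wait_for_parent = ExternalTaskSensor(
            task_id="wait_for_parent",
            external_dag_id="parent_dag",          # hypothetical parent DAG id
            external_task_id="final_task",         # hypothetical task id in the parent
            execution_delta=timedelta(hours=1),    # parent runs one hour before the child
        )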
Mohit Bhandari
@mohitbhandari37
Hi everyone, good evening. I am new to apache-airflow; I have set it up on my local env, but when I run a task from the UI it gets stuck at "Adding to queue":
[2020-08-14 19:30:09,730] {base_executor.py:58} INFO - Adding to queue: ['airflow', 'run', 'hello_world', 'hello_task', '2020-08-14T14:00:07.217726+00:00', '--local', '--pool', 'default_pool', '-sd', '/Users/user/airflow/dags/hello_world.py']
The same task runs fine from the command line.
Any help?
lopesdiego122
@lopesdiego122
Hi all, is anybody deploying their dags folder from AWS S3 with the Celery executor? Which strategy are you using: rsync, cp, or mounting the S3 folder on the Airflow server?
Kenney He
@kenney2_gitlab
Can anyone help me with the Bitnami Airflow OAuth setup? I used webserver_config.py following the Airflow webserver_config instructions and the environment variables, but it still fails.
Ankur Nayyar
@anayyar82
Hello... in Airflow, how do I capture the LDAP user name?
neil90
@neil90
Can somebody help me understand what the point is of the official Airflow Docker Image?
From my understanding I still need to do my own orchestration
of which daemons I want to start, etc.
Ashish Mishra
@ashismi
I just want to make sure of one thing. Suppose a dag.py file contains, let's say, two tasks A and B. Is dag.py parsed/run for each task? I see in the log of each task "Filling up the DagBag from /usr/local/airflow/dags/dag.py". Is this correct, or have we configured something wrong?
If the dag.py file has sleep(3000), will it sleep for both task runs, or does parsing happen only once?
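A small sketch of the behaviour in question (assuming a standard setup: each task instance re-parses the DAG file when it starts, which is what the "Filling up the DagBag from .../dag.py" line in every task log reflects), so module-level code such as a sleep runs on every parse:

    import time
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    time.sleep(5)   # module level: executed every time the file is parsed,
                    # i.e. once per task run, not once per DAG run

    with DAG("example_dag", start_date=datetime(2020, 1, 1), schedule_interval=None) as dag:
        task_a = BashOperator(task_id="A", bash_command="echo A")
        task_b = BashOperator(task_id="B", bash_command="echo B")
        task_a >> task_b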
Thiago Salgado
@ScrimForever
Hello all, there is a new Docker image on the hub. Does anyone know how I can deploy this image correctly?
Elhay Efrat
@shdowofdeath
Has anyone seen this issue when running long tasks? [2020-09-14 16:45:26,189] {{taskinstance.py:1150}} ERROR - (psycopg2.OperationalError) could not translate host name "airflow-dev-postgresql" to address: Temporary failure in name resolution
Nikolaas Steenbergen
@nikste

Hi all, I'm trying to run a complete dag from within a unit test.
For a simple sample graph it works; however, for the actual test graph I get:

[2020-09-24 16:55:57,343] {backfill_job.py:461} ERROR - Task instance <TaskInstance: dag_name.print_the_context 2020-05-22 12:00:00+00:00 [failed]> with failed state

Is there a default log directory with more information on what exactly goes wrong ?
I cannot seem to find anything..

Nikolaas Steenbergen
@nikste
OK, it seems it's very important to call .clear() on the DAG before execution; then this won't fail.
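A minimal sketch of that pattern (the DAG id and date are taken from the error message above; everything else is an assumption):

    from datetime import datetime
    from airflow.models import DagBag

    execution_date = datetime(2020, 5, 22, 12, 0, 0)

    dag = DagBag().get_dag("dag_name")                               # id as in the log above
    dag.clear(start_date=execution_date, end_date=execution_date)   # reset old task instances first
    dag.run(start_date=execution_date, end_date=execution_date)     # backfills just that date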
Daniel Papp
@jakab922
Hello
I deployed Airflow via the Helm chart from the stable repository.
It seems to be working fine.
I just don't know how to use the Airflow CLI tool to upload a DAG.
Can someone help me with that?