    idreamussr
    @idreamussr
    Hi everyone
    Who can help me with configuring Celery + Django? http://stackoverflow.com/questions/41639334/broadcast-messages-in-celery-4-x
    sulphide
    @sulphide
    Hi, I'm trying to use Celery with Airbnb's Airflow job scheduler, and when Airflow sends a message I get a garbled header on the consumer side
    It looks like the keys are escaped somehow:
    {'\xe5\xca.\xdb\x00\x00\x00\x00\x00': None, 'P&5\x07\x00': None, 'T\nKB\x00\x00\x00': 'd3b7edf2-ed7a-465b-82e8-441819ad5933', '\xcfb\xddR': 'py', '9*\xa8': None, '\xb7/b\x84\x00\x00\x00': 0, '\xe0\x0b\xfa\x89\x00\x00\x00': None, '\xdfR\xc4x\x00\x00\x00\x00\x00': [None, None], 'T3\x1d ': 'airflow.executors.celery_executor.execute_command', '\xae\xbf': 'd3b7edf2-ed7a-465b-82e8-441819ad5933', '\x11s\x1f\xd8\x00\x00\x00\x00': "['airflow run depends_on_past_dag task1 2017-01-13T00:00:00 --local -sd DAGS_FOLDER/depends_on_past.py ']", 'UL\xa1\xfc\x00\x00\x00\x00\x00\x00': '{}'}}
    any ideas?
    Tibo Beijen
    @TBeijen
    Does anyone know if using separate master and read Redis locations as the broker URL is supported? (multi-node AWS ElastiCache cluster)
    Ahmed Kotb
    @AEzzatA
    Hello guys, I have some tasks in a project, one of which does some NLP processing. When I create 4 workers, I assume the model takes the same amount of memory 4 times, which kills my server
    Is there a way to init the object outside of Celery and call into it as a daemon so that wouldn't happen?
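    One hedged approach for the prefork pool, sketched below assuming an existing Celery app (load_big_nlp_model and analyze are placeholder names): cache the model at module level so each worker process loads exactly one copy on first use, then lower -c if that many copies are still too heavy. For a single truly shared copy you would need a separate daemon or service that the tasks call over IPC or HTTP.
    _model = None  # one copy per worker *process*, created on first use

    def get_model():
        global _model
        if _model is None:
            _model = load_big_nlp_model()  # placeholder for the expensive load
        return _model

    @app.task
    def analyze(text):
        return get_model().process(text)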
    Kamal Adusumilli
    @kamalkanth
    Hi, I am using Celery in one of our projects.
    We are using Redis to send our tasks
    After a task is executed, we want to send the result back as a new task on the same Redis queue, for another Celery worker to execute
    Is this possible?
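    A hedged sketch of one way to do that (task and queue names are placeholders): chain the two tasks and route the second signature to a different queue; Celery passes the first task's return value to the second automatically.
    from celery import chain

    chain(
        process.s(payload),                     # consumed by the first worker
        handle_result.s().set(queue='stage2'),  # consumed by whatever worker watches 'stage2'
    ).apply_async()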
    Brian A.
    @brjadams
    Anyone have a solution to the Refusing to deserialize untrusted content of type pickle (application/x-python-serialize) error?
    I'm using Celery 4.0.2. All the suggestions I've seen online haven't helped.
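    For what it's worth, that error usually means a pickle-serialized message reached a worker that only accepts json (the 4.x default). A sketch of the two usual fixes on an existing app; only allow pickle if everything on the broker is trusted:
    app.conf.update(
        task_serializer='json',    # make producers send json instead...
        result_serializer='json',
        accept_content=['json'],
        # ...or, if pickle is really needed and the broker is trusted:
        # accept_content=['json', 'pickle'],
    )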
    Suresh Veeragoni
    @veeragoni
    anyone know the solution to "[WARNING/MainProcess] Received and deleted unknown message. Wrong destination?!?"
    when processing a message from the Celery task queue?
    Steve Peak
    @stevepeak
    Hey friends! How do you rename tasks safely so that queued jobs are executed by the new task name? e.g. app.tasks.old_version.delay(...) => app.tasks.new_version.delay(...) Thanks!
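    One hedged approach: register the same implementation under both names for the transition, so messages already queued under the old name still execute, then drop the alias once the old entries drain. Names below mirror the example.
    def _impl(*args, **kwargs):
        ...  # the actual work

    new_version = app.task(name='app.tasks.new_version')(_impl)
    old_version = app.task(name='app.tasks.old_version')(_impl)  # temporary alias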
    samrose
    @samrose
    Using Celery 3.1, what is a good practice for consuming results, then running new tasks?
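    A minimal sketch that already works on 3.1 (task names are placeholders): pass the follow-up task as a callback instead of polling for the result.
    # followup receives process's return value as its first argument
    process.apply_async((data,), link=followup.s())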
    russellriotly
    @russellriotly
    Is it reasonable to run concurrency 40 on an AWS t2.small?
    Aswin Kalarickal
    @aswinkalarickal
    I want to run tasks in two queues simultaneously, with each queue running serially. A task in a queue should run only after the previous task in that queue has completed.
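    One hedged way to get exactly that: a dedicated worker per queue with concurrency 1, so each queue is strictly serial while the two queues still run in parallel (app and queue names are placeholders).
    celery -A proj worker -Q queue_a -c 1 -n serial_a@%h
    celery -A proj worker -Q queue_b -c 1 -n serial_b@%h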
    Italo Maia
    @imaia
    hey folks
    celery chain returns an AsyncResult
    can I keep track of the whole chain with its id?
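    In case it helps: the AsyncResult a chain returns belongs to the last task, and the earlier links hang off .parent, so every id in the pipeline is reachable. A sketch with placeholder tasks:
    res = (step1.s(1) | step2.s() | step3.s()).apply_async()

    node, chain_ids = res, []
    while node is not None:      # walk last -> first via .parent
        chain_ids.append(node.id)
        node = node.parent
    chain_ids.reverse()          # now ordered first -> last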
    Italo Maia
    @imaia
    hello
    Navid Nabavi
    @navidnabavi
    Hi, is there any way to find out how much time it takes from when a task is queued until it is consumed or done?
    Navid Nabavi
    @navidnabavi
    @navidnabavi I added headers for the creation time. As my server clocks are synced via NTP, I can trust these durations. I used inheritance to override apply_async and __call__ to hide the time calculation. I can share the code if anyone needs it.
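    A minimal sketch of that approach, assuming an existing app instance, JSON-serializable headers, and NTP-synced clocks; where a custom header surfaces on self.request varies across Celery versions, hence the two-way lookup.
    import time
    from celery import Task

    class TimedTask(Task):
        """Stamp the enqueue time into a header; report queue latency on execution."""

        def apply_async(self, args=None, kwargs=None, **options):
            headers = options.get('headers') or {}
            headers['enqueued_at'] = time.time()
            options['headers'] = headers
            return super(TimedTask, self).apply_async(args, kwargs, **options)

        def __call__(self, *args, **kwargs):
            req = self.request
            enqueued_at = ((req.headers or {}).get('enqueued_at')
                           if getattr(req, 'headers', None)
                           else getattr(req, 'enqueued_at', None))
            if enqueued_at is not None:
                print('queue latency: %.3fs' % (time.time() - enqueued_at))
            return super(TimedTask, self).__call__(*args, **kwargs)

    @app.task(base=TimedTask)
    def add(x, y):
        return x + y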
    Stephen Estrada
    @stcrestrada
    I just began using Celery. I am inheriting from Task. Whenever I run my class's apply_async, I get TypeError: Object of type type is not JSON serializable.
    Anyone have any ideas for a workaround?
    class CeleryBase(app.Task):
        abstract = True
    
        def run(self, *args, **kwargs):
            pass
    
        def __call__(self, *args, **kwargs):
            """In celery task this function call the run method, here you can
            set some environment variable before the run of the task"""
            logger.info("Starting Run")
            return super(CeleryBase, self).__call__(*args, **kwargs)
    
        def after_return(self, status, retval, task_id, args, kwargs, einfo):
            # exit point of the task whatever is the state
            logger.info("Ending run")
            pass
    
    
    class ScheduleHarvest(CeleryBase):
        ignore_result = False
        name = None
    
        def run(self, x, y):
            try:
                if x and y:
                    result = add(x, y)
                    logger.info('result = %d' % result)
                    return result
                else:
                    logger.error('No x or y in arguments.')
            except KeyError:
                return False
    
    
    def add(x, y):
        return x + y
    
    ScheduleHarvest.name = ScheduleHarvest
    app.register_task(ScheduleHarvest)
    ScheduleHarvest().apply_async([1, 2])
    Stephen Estrada
    @stcrestrada
    Any thoughts on this, engineers? ^
    Posted on GitHub too:
    celery/celery#4708
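    Not definitive, but the traceback is consistent with this line: ScheduleHarvest.name = ScheduleHarvest assigns the class object itself as the task name, and the name travels in the message headers, which JSON can't encode. A sketch of the likely fix:
    ScheduleHarvest.name = 'ScheduleHarvest'  # the task name must be a string
    app.register_task(ScheduleHarvest())      # 4.x register_task expects a task *instance*
    ScheduleHarvest().apply_async([1, 2])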
    Sam
    @samuelmullin

    Is it possible to set max-tasks-per-child per worker using the multi syntax? It's throwing an error for me when I do it like:

    CELERYD_OPTS="-c:webhooks 2 -Q:webhooks webhooks --max-tasks-per-child:webhooks 100

    I have a half dozen workers, some running high-volume tasks and some running longer high-memory tasks, so I am hoping to avoid setting a single value across the board
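    A guess, not verified: celery multi seems to take per-node long options in --opt:node=value form (with an equals sign rather than a space), and the quoted line above is also missing its closing quote. Worth trying:
    CELERYD_OPTS="-c:webhooks 2 -Q:webhooks webhooks --max-tasks-per-child:webhooks=100"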
    Kd Johar
    @kdjohar91_twitter
    Using AWS ElastiCache as a broker. When the worker is down and a task is triggered, it is moved to the FAILURE state, and the Celery signal task_failure is also not triggered. Anyone have any idea?
    {"status": "FAILURE", "result": {"exc_type": "NotRegistered", "exc_message": "\'tasks.common_tasks.get_pending_task_count_in_queue\'"}, "traceback": null, "children": [], "task_id": "8b637704-a3e0-46b1-b09b-b3fadd032e6a"}'
    Naggappan
    @naggappan
    Hi team, is there any way in django-celery-beat to schedule periodic tasks with recurrence, like every 2 weeks or 3 weeks? I would like to run on, say, Monday and Tuesday, but only once every 2 or 3 weeks, i.e. a recurrence feature
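    A hedged workaround, since a stock crontab entry can't express "every Nth week": schedule the entry weekly on the days you want, and have the task skip off-weeks by ISO week number. Task names are placeholders and an existing app is assumed; the same trick works whether the schedule lives in django-celery-beat or in beat_schedule as below.
    from datetime import date
    from celery.schedules import crontab

    @app.task
    def maybe_run():
        if date.today().isocalendar()[1] % 3 == 0:  # only every 3rd ISO week
            real_work.delay()                       # placeholder for the actual task

    app.conf.beat_schedule = {
        'mon-tue-every-3rd-week': {
            'task': 'tasks.maybe_run',
            'schedule': crontab(minute=0, hour=8, day_of_week='mon,tue'),
        },
    }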
    Artem
    @Plazmius
    Hello everyone. I'm trying to create a pipeline for my application. I have a problem: the main job of the application is to read a video, take every N-th frame, and put it into the pipeline. Inside the pipeline there are 5 different tasks, for example:
    1. Crop the image
    2. Store the image in an array; if the array length equals IMAGES_NEEDED_FOR_TASK3, launch task 3
    3. Apply some transforms to the images and make one big image out of the IMAGES_NEEDED_FOR_TASK3 batch
    4. Stack the transformed images in an array; if the array length equals IMAGES_NEEDED_FOR_TASK5, launch task 5
    5. Write info about the incoming images from task 4 to the database
      I struggle with the implementations of tasks 2 and 4, because they have conditions. If they didn't have conditions, I would just use the chain method. I thought about calling task 3 from task 2 (with a different queue for every task; see the sketch after this message), but I read that this is considered bad practice. Thank you in advance
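    A hedged sketch of step 2 (step 4 is analogous), assuming an existing app: batch the keys in Redis and fan out when the batch fills. The usual warning is against *waiting* on a subtask's result inside a task; merely launching one is fine. Key names and transform_batch are placeholders, and a real version needs a lock or a Redis pipeline around the pop to avoid races.
    import redis

    r = redis.Redis()
    IMAGES_NEEDED_FOR_TASK3 = 10  # from the description above

    @app.task
    def collect_for_task3(image_key):  # step 2
        r.rpush('task3:batch', image_key)
        if r.llen('task3:batch') >= IMAGES_NEEDED_FOR_TASK3:
            batch = [r.lpop('task3:batch').decode() for _ in range(IMAGES_NEEDED_FOR_TASK3)]
            transform_batch.delay(batch)  # launches step 3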
    kshilov
    @kshilov

    Hello everyone!
    I'm using Celery 4.3.0 with Flask and have an issue: if the broker is down, the task.delay() call blocks forever.

    I tried to solve it as described here: celery/celery#4296

    But it didn't help.

    Does anybody know how to solve this issue?
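    Two settings worth trying together, hedged, since which one bites depends on the transport and version: bound the connection retries and the publish retry policy so .delay() fails fast instead of blocking.
    app.conf.broker_transport_options = {
        'max_retries': 3, 'interval_start': 0,
        'interval_step': 0.5, 'interval_max': 2,
    }
    app.conf.task_publish_retry_policy = {'max_retries': 2}
    # or per call, skip publish retries entirely:
    # some_task.apply_async(args, retry=False)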

    James Campbell
    @jcampbell05
    Has anyone ever had a chain or a linked task not execute, i.e. Task A completes successfully but Task B is never called?
    Jacob
    @goraj
    Hi, is there a way to check whether the broker connection is still alive that is not blocking?
    celery_app.connection().ensure_connection(max_retries=1) would require reconnecting, right?
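    A hedged sketch: ensure_connection does block while it tries, but you can push the check off the calling thread so callers only ever poll a future. And yes, each check opens a fresh connection.
    from concurrent.futures import ThreadPoolExecutor

    def broker_alive(app):
        try:
            conn = app.connection()
            try:
                conn.ensure_connection(max_retries=1)  # bounded retry, as in the line above
                return True
            finally:
                conn.release()
        except Exception:
            return False

    executor = ThreadPoolExecutor(max_workers=1)
    future = executor.submit(broker_alive, celery_app)
    # later: if future.done(): alive = future.result()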
    Mohit Chawla
    @mohit-chawla
    Hello all, I can't find many resources on Celery signals and the task lifecycle (https://docs.celeryproject.org/en/latest/userguide/signals.html#basics)
    Does anyone here know whether task_postrun is triggered if the task fails?
    Thanks in advance.
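    As far as I can tell from the linked docs, task_postrun is dispatched after the task executes regardless of outcome, and the handler receives the final state, so failures are observable there. A quick sketch:
    from celery.signals import task_postrun

    @task_postrun.connect
    def on_postrun(task_id=None, state=None, **kwargs):
        print(task_id, state)  # state is 'FAILURE' when the task raised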
    Aj1907
    @ArijitDesai
    Hello everyone,
    Has anyone used Celery instrumentation with OpenTelemetry?
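    In case it's useful: the opentelemetry-instrumentation-celery package documents instrumenting from the worker_process_init signal, roughly as below (hedged; check the current package docs).
    from celery.signals import worker_process_init
    from opentelemetry.instrumentation.celery import CeleryInstrumentor

    @worker_process_init.connect(weak=False)
    def init_celery_tracing(*args, **kwargs):
        CeleryInstrumentor().instrument()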
    amilcar-capsus
    @amilcar-capsus

    Hey all. I'm trying to get Celery working, but I get the following error:
    No module named 'celery.five'

    Any suggestions on what I can do? Thanks in advance.
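    A guess: celery.five was removed in Celery 5.x, so this usually means some dependency (or your own import) still targets the 4.x layout. Either upgrade the package doing the import, or pin Celery:
    pip install "celery<5"  # or upgrade whatever still imports celery.five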

    Harsha Vardhan
    @hackwithharsha
    Hi everyone, is there any way to get the number of tasks that were pending before the current task? I have looked at Celery's inspect module, which can give me a count of registered tasks, but that count covers what is registered in the workers irrespective of any particular task_id. In simple terms, I would like to pass a task_id to a function that returns the count of pending tasks queued ahead of that task_id. Is there any way to achieve this in Celery?
    It would be like getting a count of pending tasks before that task_id; the inspect module provides overall statistics, which is not what I want in this scenario.
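    Nothing built in that I know of, but with the Redis broker the queue is a plain list you can scan yourself. A hedged sketch (queue name and URL are assumptions; kombu LPUSHes new messages and BRPOPs old ones, so higher indexes are consumed first; verify against your setup):
    import json
    import redis

    def pending_before(task_id, queue='celery', url='redis://localhost:6379/0'):
        r = redis.Redis.from_url(url)
        for position, raw in enumerate(r.lrange(queue, 0, -1)):
            if json.loads(raw).get('headers', {}).get('id') == task_id:
                return r.llen(queue) - position - 1  # messages that drain before this one
        return None  # already consumed, or sitting in a worker's prefetch buffer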