Alexander Schepanovski
@Suor
Hi, glad you liked it
Manjit Kumar
@manjitkumar
I was able to configure it with my project, but automatic queryset caching isn't working, whereas .cache() does cache the results. Can anyone tell me what might be wrong?
CACHEOPS_REDIS = "redis://localhost:6379/1"

CACHEOPS_DEFAULTS = {
    'timeout': 60*60
}


CACHEOPS = {
    '*.*': {},
}

CACHEOPS_DEGRADE_ON_FAILURE = True
Alexander Schepanovski
@Suor
Did you supply a list of operations to cache?
You haven't
Manjit Kumar
@manjitkumar
Oh
CACHEOPS = {
    'api.Topic': {'ops': 'all', 'cache_on_save': True},       # cache all operations for the Topic model
    'api.Subscriber': {'ops': 'all', 'cache_on_save': True},  # cache all operations for the Subscriber model
    'api.SubscriberContract': {'ops': 'all', 'cache_on_save': True},  # cache all operations for the SubscriberContract model
}
It should be like this, with ops set to 'all'
Alexander Schepanovski
@Suor
Yes
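For reference, a narrower config is also possible: cacheops accepts either 'all' or a tuple of specific operation names for 'ops' (the common ones are 'get', 'fetch' and 'count'). A sketch, with the app and model labels as placeholders:

CACHEOPS = {
    # cache .get() lookups and plain queryset fetches for Topic
    'api.Topic': {'ops': ('get', 'fetch'), 'timeout': 60 * 60},
    # cache only .count() calls for Subscriber
    'api.Subscriber': {'ops': ('count',), 'timeout': 60 * 60},
}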
Manjit Kumar
@manjitkumar
Oh, I get it. Apologies, I misread the config in README.md. Thanks for your help :)
Alexander Schepanovski
@Suor
You are welcome
Chetan Dhembre
@chetandhembre
hi.. how can I invalidate a nested function?
Alexander Schepanovski
@Suor
hi
There is no easy way
Chetan Dhembre
@chetandhembre
ok
can you tell me?
Alexander Schepanovski
@Suor
And if your function is nested, it probably means its value depends not only on its arguments but also on its closure, so you could end up caching different things under the same key
If this is not the case, then you can just unnest it
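A hedged illustration of that closure point, with make_getter() and expensive() as made-up names: the cache key is built from the decorated function and its call arguments, so nested functions closing over different values can end up sharing a key.

from cacheops import cached

def make_getter(x):
    @cached(timeout=60)
    def _get():              # takes no arguments, so x never reaches the key
        return expensive(x)  # closures over different x values may collide
    return _get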
Chetan Dhembre
@chetandhembre
I guess I will unnest it
I want to create my own key for the function.. that's why I decided to go with a nested function
Alexander Schepanovski
@Suor
Hmm, not sure I still follow
Why do you need a custom key?
Maybe you can post the code?
Chetan Dhembre
@chetandhembre
def get_a(start, end):
    start_str = start.date().isoformat()
    end_str = end.date().isoformat()
    key = '%s-%s' % (start_str, end_str)

    @cached_as(extra=key, timeout=30 * 60)
    def _get_a():
        a = some_function_call()
        # do not want to cache the result if a is empty
        if len(a.keys()) == 0:
            e = UncachedResult(a)
            raise e

        return a

    try:
        return _get_a()
    except UncachedResult as e:
        return e.result
Alexander Schepanovski
@Suor
The first argument to @cached_as() should be a queryset, a model, or an object
This is why it's called "cached as"
And I don't see invalidation here
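For comparison, a minimal @cached_as() sketch along the lines of the cacheops README, where Article and compute_stats() are hypothetical: the queryset passed as the first argument determines which model events invalidate the cached value.

from cacheops import cached_as

# invalidated automatically whenever a matching Article row changes
@cached_as(Article.objects.filter(published=True), timeout=30 * 60)
def article_stats():
    return compute_stats()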
Chetan Dhembre
@chetandhembre
can I use @cached()?
Alexander Schepanovski
@Suor
Anyway, if you want to use invalidation you need to use the simpler @cached(), and unnest _get_a(), moving extra into an argument
Chetan Dhembre
@chetandhembre
The problem with that is my current nested function will get (key, start, end) as arguments
start/end are datetimes with second-level granularity
how can I use just a single argument for caching purposes?
Alexander Schepanovski
@Suor
Just use start and end, and don't bother with key
@cached(timeout=30 * 60)
def _get_a(start, end):
    a = some_function_call()
    # Do not cache empty results
    if not a:
        raise UncachedResult(a)

    return a

def get_a(start, end):
    start_str = start.date().isoformat()
    end_str = end.date().isoformat()

    try:
        return _get_a(start, end)
    except UncachedResult as e:
        return e.result
Chetan Dhembre
@chetandhembre
Actually I missed one thing in my code here.. I am using start and end inside some_function_call(), and they mostly come from datetime.utcnow()
so your solution above won't work
I need second-level granularity for start and end
Alexander Schepanovski
@Suor
It will, just pass them
What do you mean by second-level granularity?
So it's:
@cached(timeout=30 * 60)
def _get_a(start, end):
    a = some_function_call(start, end)
    # Do not cache empty results
    if not a:
        raise UncachedResult(a)

    return a

def get_a(start, end):
    try:
        return _get_a(start, end)
    except UncachedResult as e:
        return e.result
Chetan Dhembre
@chetandhembre
end = datetime.now()
start = end - timedelta(days=12)
what will the cache key for the result of _get_a() be?
Alexander Schepanovski
@Suor
To get a key:
_get_a.key(start, end)
To invalidate:
_get_a.invalidate(start, end)
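Put together, a short usage sketch assuming the @cached()-wrapped _get_a(start, end) from above:

from datetime import datetime, timedelta

end = datetime.now()
start = end - timedelta(days=12)

_get_a(start, end)               # computes and caches the result
print(_get_a.key(start, end))    # the cache key used for this argument pair
_get_a.invalidate(start, end)    # drops the cached value for these arguments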
Chetan Dhembre
@chetandhembre
I want to cache the result for 30 minutes..
so the result for datetime.datetime(2016, 10, 14, 12, 17, 0, 0) will be the same as for datetime.datetime(2016, 10, 14, 12, 3, 0, 0)
Alexander Schepanovski
@Suor
Then do not pass timestamps to _get_a() and some_func(), pass dates.
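A minimal sketch of that suggestion, assuming date-level granularity is acceptable for the cache key (this wrapper is only an illustration, not cacheops API):

def get_a(start, end):
    # pass dates, so every timestamp within the same day maps to one cache key
    try:
        return _get_a(start.date(), end.date())
    except UncachedResult as e:
        return e.result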
Chetan Dhembre
@chetandhembre
I can not
Alexander Schepanovski
@Suor
why?
Your function doesn't depend on time, right?