Alexander Schepanovski
@Suor
Then you should cache a smaller function which can't return an empty response
Chetan Dhembre
@chetandhembre
btw we are using cacheops in production since Feb.. and we are loving it
Alexander Schepanovski
@Suor
glad you like it
Manjit Kumar
@manjitkumar
Hi, I was exploring caching options for my project and encountered django-cacheops on GitHub. It seems pretty good, with a granular level of invalidation that satisfies my use cases.
Alexander Schepanovski
@Suor
Hi, glad you liked it
Manjit Kumar
@manjitkumar
I was able to configure it with my project, but automatic queryset caching isn't working, whereas .cache() does cache the results. Can anyone tell me what might be wrong?
CACHEOPS_REDIS = "redis://localhost:6379/1"

CACHEOPS_DEFAULTS = {
    'timeout': 60*60
}


CACHEOPS = {
    '*.*': {},
}

CACHEOPS_DEGRADE_ON_FAILURE = True
Alexander Schepanovski
@Suor
Did you supply a list of operations to cache?
You haven't
Manjit Kumar
@manjitkumar
Oh
CACHEOPS = {
    'api.Topic': {'ops': 'all', 'cache_on_save': True},       # cache all operations for the Topic model
    'api.Subscriber': {'ops': 'all', 'cache_on_save': True},  # cache all operations for the Subscriber model
    'api.SubscriberContract': {'ops': 'all', 'cache_on_save': True},  # cache all operations for the SubscriberContract model
}
It should be like this, with ops set to 'all'
Alexander Schepanovski
@Suor
Yes
Manjit Kumar
@manjitkumar
oh i get it, apologies i misread the config in README.md. Thanks for your help :)
Alexander Schepanovski
@Suor
You are welcome
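
(For context: with ops configured for a model as above, cacheops caches matching querysets transparently and invalidates them on writes. A minimal sketch, assuming the api.Topic model from the config; the import path and the active field are made up.)

from api.models import Topic  # made-up import path; 'active' field is also assumed

# With 'api.Topic': {'ops': 'all'} in CACHEOPS, evaluating this queryset
# caches its result automatically -- no explicit .cache() call needed.
topics = list(Topic.objects.filter(active=True))

# A later save() or delete() on a matching Topic row invalidates the
# cached queryset, so the next evaluation hits the database again.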
Chetan Dhembre
@chetandhembre
hi .. how can i invalidate a nested function?
Alexander Schepanovski
@Suor
hi
There is no easy way
Chetan Dhembre
@chetandhembre
ok
can you tell me?
Alexander Schepanovski
@Suor
And if your function is nested, it probably means its value depends not only on its arguments but also on the closure, so you can end up caching different things under the same key
If this is not the case then you can just unnest it
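
(A hypothetical sketch of that pitfall: the inner function takes no arguments, so nothing that varies per call shows up in its cache key, and different outer calls can silently share one cached value.)

from cacheops import cached

def totals_for(user_id):
    @cached(timeout=60)
    def _totals():
        # The value depends on the closed-over user_id, but user_id is not
        # an argument, so every caller ends up with the same cache key.
        return {'user': user_id, 'total': 0}  # stand-in for a real computation

    return _totals()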
Chetan Dhembre
@chetandhembre
i guess i will unnest it
i want to create my own key for the function .. that's why i decided to go with a nested function
Alexander Schepanovski
@Suor
Hmm, not sure I still follow
Why do you need custom key?
Maybe you can post the code?
Chetan Dhembre
@chetandhembre
def get_a(start, end):
    start_str = start.date().isoformat()
    end_str = end.date().isoformat()
    key = '%s-%s' % (start_str, end_str)

    @cached_as(extra=key, timeout=30 * 60)
    def _get_a():
        a = some_function_call()
        # do not want to cache the result if a is empty
        if len(a.keys()) == 0:
            e = UncachedResult(a)
            raise e

        return a

    try:
        return _get_a()
    except UncachedResult as e:
        return e.result
Alexander Schepanovski
@Suor
The first argument to @cached_as() should be a queryset, a model or an object
This is why it's called "cached as"
And I don't see invalidation here
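
(A minimal sketch of the intended usage, assuming a made-up Article model: the queryset passed as the first argument is only a sample used for invalidation, and the cached value is dropped whenever matching rows change.)

from cacheops import cached_as
from myapp.models import Article  # made-up model used as the invalidation sample

@cached_as(Article.objects.filter(published=True), timeout=30 * 60)
def published_titles():
    # Recomputed (and re-cached) after any save/delete touching Articles
    # that match the sample queryset above.
    return list(Article.objects.filter(published=True).values_list('title', flat=True))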
Chetan Dhembre
@chetandhembre
can i use cached?
Alexander Schepanovski
@Suor
Anyway, if you want to use invalidation you need to use the simple @cached(), and unnest _get_a(), moving extra into an argument
Chetan Dhembre
@chetandhembre
the problem with that is my current nested function will get (key, start, end) as arguments
start/end are datetimes with second-level granularity
how can i use only a single argument for caching purposes?
Alexander Schepanovski
@Suor
Just use start and end, and don't bother with key
@cached(timeout=30 * 60)
def _get_a(start, end):
    a = some_function_call()
    # Don't cache empty results
    if not a:
        raise UncachedResult(a)

    return a

def get_a(start, end):
    start_str = start.date().isoformat()
    end_str = end.date().isoformat()

    try:
        return _get_a(start, end)
    except UncachedResult as e:
        return e.result
Chetan Dhembre
@chetandhembre
actually i missed one thing in my code here .. i am using start and end inside my some_function_call(), and they mostly come from datetime.utcnow()
so your solution above won't work
i need second-level granularity for start and end
Alexander Schepanovski
@Suor
It will, just pass them
What do you mean by second level granularity?
So it's
@cached(timeout=30 * 60)
def _get_a(start, end):
    a = some_function_call(start, end)
    # Don't cache empty results
    if not a:
        raise UncachedResult(a)

    return a

def get_a(start, end):
    try:
        return _get_a(start, end)
    except UncachedResult as e:
        return e.result
Chetan Dhembre
@chetandhembre
end = datetime.now()
start = end - timedelta(days=12)
what will the cache key be for the result of _get_a()?
Alexander Schepanovski
@Suor
To get a key:
_get_a.key(start, end)
To invalidate:
_get_a.invalidate(start, end)
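
(Putting those two calls together with the snippet above; the key itself is an internal cacheops value, useful mainly for debugging.)

from datetime import datetime, timedelta

end = datetime.utcnow()
start = end - timedelta(days=12)

_get_a(start, end)                # populate the cache for these args
print(_get_a.key(start, end))     # inspect the key used for exactly these args
_get_a.invalidate(start, end)     # drop the cached value for these args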
Chetan Dhembre
@chetandhembre
i want to cache the result for 30 minutes ..
so the result for datetime.datetime(2016, 10, 14, 12, 17, 0, 0) will be the same as for datetime.datetime(2016, 10, 14, 12, 3, 0, 0)
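
(That only works if both calls produce the same cache key, so the datetimes need to be rounded before they reach the cached function. Not from the chat, just a sketch of one way to do it, reusing _get_a from Suor's snippet above.)

from datetime import datetime, timedelta

def floor_to_bucket(dt, minutes=30):
    # 12:03 and 12:17 both become 12:00, so they share one cache key.
    return dt.replace(minute=(dt.minute // minutes) * minutes, second=0, microsecond=0)

def get_a(start, end):
    # _get_a is the @cached function from the snippet above.
    return _get_a(floor_to_bucket(start), floor_to_bucket(end))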