    Austin Keller
    @WhosAustinK_twitter
    Do you prefer one over the other?
    Sven Anderson
    @svenanderson
    to me go-redis was more intuitive
    Austin Keller
    @WhosAustinK_twitter

    Alright, thanks! I found Lambda Store through this Medium post: https://medium.com/lambda-store/serverless-redis-is-here-34c2fa335f24

    Do you know of any other places I can read a bit more about the service?

    Sven Anderson
    @svenanderson
    https://medium.com/lambda-store
    Austin Keller
    @WhosAustinK_twitter
    Sweet. Thanks :) I'll probably be popping in here every now and then as I get further along
    Sven Anderson
    @svenanderson
    sure
    Karan Shahani
    @kshahani
    Hey! I'm trying to set a key/value in lambda store using IORedis in Node.js, and I have a large payload (like a 500kB JSON)... I'm attempting to stringify it and set it using client.set with IORedis, but I'm seeing the error "ERR max request size is exceeded"
    Is this some configuration I can work around, or do I have to cache the data in a different way?
    Sven Anderson
    @svenanderson
    Hello. Lambda Store has a 400KB max request size
    Karan Shahani
    @kshahani
    What's the recommended solution to work around this?
    Sven Anderson
    @svenanderson
    you may refactor your code to do partial updates
    what is your use case?
    Karan Shahani
    @kshahani
    we're requesting a big set of data from an API (it can take a few seconds to request), and caching it for 15 seconds for subsequent requests on our API so we don't need to query the downstream endpoints too frequently
    no worries, we can probably refactor the caching to chunk it down into groups of 100
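The chunking approach described above could look something like this sketch, assuming an ioredis client is passed in as `client`; the key names, chunk size, and TTL are illustrative, not part of any Lambda Store API:

```javascript
// Split a large payload into groups of 100 items and cache each group
// under its own key, so every individual SET stays well under the
// ~400KB request limit. A companion ":count" key records how many
// chunks a reader needs to fetch.
function chunk(items, size) {
  const groups = [];
  for (let i = 0; i < items.length; i += size) {
    groups.push(items.slice(i, i + size));
  }
  return groups;
}

async function cacheChunked(client, baseKey, items, ttlSeconds) {
  const groups = chunk(items, 100);
  await client.set(`${baseKey}:count`, groups.length, "EX", ttlSeconds);
  for (let i = 0; i < groups.length; i++) {
    await client.set(`${baseKey}:${i}`, JSON.stringify(groups[i]), "EX", ttlSeconds);
  }
}
```

Readers would then fetch `${baseKey}:count` first and GET (or MGET) the numbered chunk keys.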
    Sven Anderson
    @svenanderson
    yeah, smaller requests would be best. But if that's not possible, email support@lambda.store
    Inas Luthfi
    @nascode_twitter
    Hi there, since your pricing is based on command count, is mget with many keys considered single command?
    What about if we batch command using pipeline?
    Mattia Bianchi
    @mattiabi
    Hey, mget with many keys is considered a single command. But batched commands in a single pipeline are not. We literally count the actual Redis commands.
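The billing difference described above can be sketched with ioredis (assuming a connected client named `client`; the keys are placeholders):

```javascript
// MGET with many keys is one command; a pipeline is one round trip
// but is counted per inner command.
async function fetchThree(client) {
  // Counts as 1 command, however many keys are listed:
  const viaMget = await client.mget("k1", "k2", "k3");

  // Counts as 3 commands, even though it's a single round trip:
  const viaPipeline = await client
    .pipeline()
    .get("k1")
    .get("k2")
    .get("k3")
    .exec();

  return { viaMget, viaPipeline };
}
```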
    Inas Luthfi
    @nascode_twitter
    Thanks for the answer @mattiabi
    Evan Schwartz
    @emschwartz
    Hi all, my HMSET commands are failing or timing out and I'm having trouble figuring out what might be going on. The payload is about 6000 characters at the moment. Most times, the request just hangs until the client (ioredis) times out. However, one of the recent requests errored out with the message: ERR Consistent write timeout. Latest write request may not be persisted to disk or may not be replicated to sufficient replicas. Do you have any suggestions for what to do about this?
    Mattia Bianchi
    @mattiabi
    Hi Evan, it seems consistent writes are failing to complete in time. Do you know what's the timeout setting for your ioredis client? Also can you please send your database id (which you can see from the console.lambda.store URL), so we can investigate the issue in detail?
    Evan Schwartz
    @emschwartz
    @mattiabi Thanks for the quick response! I DM'ed you the URL. I'm looking for the timeout setting in the client but it may not be in the client itself. I just tried it again and it's an Error: socket hang up rather than a timeout
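For reference, ioredis exposes its timeout behavior as constructor options rather than per-command settings; a minimal sketch (the values here are illustrative defaults, not recommendations):

```javascript
// Timeout-related ioredis constructor options. Passing these when
// creating the client controls how long it waits before surfacing
// errors like the timeout above.
const redisOptions = {
  connectTimeout: 10000,    // ms to wait while establishing the TCP connection
  commandTimeout: 5000,     // ms to wait for each command's reply
  maxRetriesPerRequest: 3,  // retries before a command fails with an error
};
// Usage: const client = new Redis("rediss://...", redisOptions);
```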
    Mattia Bianchi
    @mattiabi
    :+1: I forwarded the URL to the team to investigate the issue. I'll be back to you as soon as we get a clue.
    Evan Schwartz
    @emschwartz
    It looks like write requests of any kind are failing, and not just for that DB. I just created another DB (both have strong consistency turned on) and a simple "set a 1" command fails with the same error :(
    Mattia Bianchi
    @mattiabi
    Hey Evan. We figured out the cause. It's a configuration issue. We are planning to do a hot fix very soon. Sorry for the inconvenience. I'll inform you when the fix goes live.
    Evan Schwartz
    @emschwartz
    Thanks!
    Mattia Bianchi
    @mattiabi
    Evan, fix is live now. Can you try again to verify?
    Evan Schwartz
    @emschwartz
    Looks good. Thanks!
    Evan Schwartz
    @emschwartz
    I'm seeing Max Concurrent Connections errors with 20 connections even though I upgraded to the standard db type. Can someone help with this?
    Sven Anderson
    @svenanderson
    Sure. Just a minute
    Sven Anderson
    @svenanderson
    We found the bug and fixed it.
    rheyer
    @rheyer
    Hey guys, how do i go about switching my database to premium?
    Sven Anderson
    @svenanderson
    Hello! Premium is not publicly available yet. ETA is February
    rheyer
    @rheyer
    Ah okay, thanks. I think I'm hitting the connection cap. This weekend I was trying it out with ~1000 concurrent AWS Lambda executions, each subscribing to the same channel, and I think I was hitting the concurrent connection limit; at one point the graph was showing 1400 concurrent connections and I was getting disconnects
    I guess I could change it to setting a key/value, and the other Lambdas could check in a loop for the value to exist
    but that will spam a bunch of commands and it won't be as responsive when the key exists
    because I'll need to sleep in between my checks
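The polling fallback described above could be sketched like this, assuming an ioredis instance is passed in as `client`; the key name, interval, and timeout are illustrative:

```javascript
// Instead of subscribing, each worker checks for the key on an
// interval until it appears or a deadline passes. Each check costs one
// GET command, so a longer interval trades responsiveness for fewer
// commands.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function waitForKey(client, key, { intervalMs = 500, timeoutMs = 15000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const value = await client.get(key); // one command per check
    if (value !== null) return value;    // key exists: done
    await sleep(intervalMs);
  }
  return null; // timed out without seeing the key
}
```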
    Sven Anderson
    @svenanderson
    do you use AWS Lambda?
    rheyer
    @rheyer
    yep
    Sven Anderson
    @svenanderson
    do you connect and close your redis client inside the function?
    we recommend connecting and closing your redis client inside the function; otherwise connections pile up, since AWS Lambda does not close them properly
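The recommendation above might look like this in a Lambda handler. Here `createClient` is a hypothetical factory (in a real function it would be something like `() => new Redis(process.env.REDIS_URL)` with ioredis), and the key and TTL are placeholders:

```javascript
// Create the Redis client inside the handler and quit() before
// returning, so idle connections don't accumulate across frozen
// Lambda execution environments.
async function handler(event, createClient) {
  const client = createClient();
  try {
    await client.set("last-event", JSON.stringify(event), "EX", 60);
    return { statusCode: 200 };
  } finally {
    await client.quit(); // always release the connection, even on error
  }
}
```

The `finally` block matters: without it, a thrown error would leave the connection open and the count would keep climbing under load.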
    rheyer
    @rheyer
    ok, I'll give that a try. I was also going to test more than 1000 concurrent executions, but if the database doesn't support that I'll have to wait until February
    Sven Anderson
    @svenanderson
    I think closing the connection (quit()) inside the function will make a big impact. I have seen a similar situation with another user.
    rheyer
    @rheyer
    Thanks for the feedback, I'll let you know how it goes :)
    Sven Anderson
    @svenanderson
    sounds good!
    rheyer
    @rheyer
    Hmm, yeah, I moved the init of the Redis client and the subscribing/unsubscribing into my functions, and when stress testing my service I'm still hitting the max number of connections, which causes the Redis client to raise a "max concurrent connections exceeded" exception. I think I just need to wait until the premium database comes out in February. I had 1119 connections at the same time
    which is expected, given how many concurrent Lambda executions there were. Maybe I'll check back in February