    Meir Shpilraien (Spielrein)
    @MeirShpilraien

    Hey @leonh, regarding your first question: if you are using v0.4.0, you can see registrations with the 'RG.DUMPREGISTRATIONS' command (https://oss.redislabs.com/redisgears/commands.html#rgdumpregistrations) and then remove a specific registration with 'RG.UNREGISTER' (https://oss.redislabs.com/redisgears/commands.html#rgunregister). Notice that each time a registration is triggered it creates an execution; those executions can be managed with 'RG.DUMPEXECUTIONS', 'RG.GETEXECUTION' and 'RG.DROPEXECUTION'.
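    The reply of RG.DUMPREGISTRATIONS arrives as arrays of flat field/value pairs, so a small helper makes it easier to pick out the id that RG.UNREGISTER expects. The sketch below is plain Python; the sample reply is invented for illustration, not a real server capture:

```python
def parse_registration(flat):
    """Turn a flat [field, value, field, value, ...] reply into a dict."""
    return {flat[i]: flat[i + 1] for i in range(0, len(flat), 2)}

# Illustrative shape of one RG.DUMPREGISTRATIONS entry (values made up)
sample = ["id", "0000000000000000000000000000000000000000-1",
          "reader", "StreamReader",
          "desc", None]

reg = parse_registration(sample)
registration_id = reg["id"]  # this is the value RG.UNREGISTER expects
```

    From redis-py, the same helper could be applied to each item returned by r.execute_command('RG.DUMPREGISTRATIONS').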

    Regarding examples, there are some small and simple ones here: https://oss.redislabs.com/redisgears/examples.html.
    For more complex examples, take a look at those links:
    https://github.com/RedisGears/AnimalRecognitionDemo
    https://github.com/RedisGears/EdgeRealtimeVideoAnalytics
    https://github.com/RedisGears/MultiModelExample

    ludwig404
    @leonh
    thanks! @MeirShpilraien
    Mohit Kumar
    @mhtkmr74
    Hi Team,
    I am learning RedisGears. I am stuck on one issue: I am unable to switch the DB index in Redis.
    For example, in redis-cli, if I execute RG.PYEXECUTE "GearsBuilder('KeysReader').run()"
    I get results from index 0, but if I switch the DB index from 0 to, say, 3 with the command SELECT 3, I still get the same response I was getting on DB index 0. My DB index 3 is empty.
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Hey @mhtkmr74
    This is a known limitation. Gears runs in the background and might not even be tied to a specific client (for registrations, for example), so it runs on db 0 by default. We might allow changing this in the future via an argument to the run/register functions, but currently it's not possible.
    Can you explain a little about the use case? Maybe we can find a way to work around it.
    Mohit Kumar
    @mhtkmr74
    Hi @MeirShpilraien
    Thanks for your quick response.
    I was storing two different types of data in two different indexes, which is how I came across this issue. I have figured out a workaround now: I am storing both kinds of data in the same index and moving ahead.
    xxGL1TCHxx
    @xxGL1TCHxx

    I've been a Redis fan for a while, and this provides some great new abilities. I'm able to make everything work except the StreamReader. I'm using the Basic Redis Stream Processing example found here: https://oss.redislabs.com/redisgears/examples.html#basic-redis-stream-processing

    Do I have to do anything more than publishing to 'mystream'?

    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    @xxGL1TCHxx You do not have to do anything else. Can you post the reply of RG.DUMPREGISTRATIONS?
    @xxGL1TCHxx, when you say publish, you mean the XADD command, right? The PUBLISH command is for Pub/Sub, not for streams.
    xxGL1TCHxx
    @xxGL1TCHxx
    Thanks for the response. I believe I had my stream and pubsub topics crossed. Great catch and thanks for pointing that out. I've been testing everything at once and must have just missed it. Sorry for the troubles!
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    No problem, let us know if you have any other questions/issues.
    Aman Singh
    @amansheaven
    Hey guys, I was working on the Beyond the Cache hackathon, and I am kind of stuck on adding multiple modules to the server. The .so files link the libraries to the server if they are added standalone; I mean, if I compile a module on one machine and import it on another, would it work?
    On GitHub, the suggested way is to copy the Linux libraries and the .so file. In my use case I would need three modules (streams, graph, and AI), so should I just compile them, take the .so files, and grab bin/linux-x64-release/python3_<version> from my local machine?
    Or is there a better way of doing all this? By the way, great work on RedisEdge; I was about to build something similar, but you guys did it already, and in a much better way :)
    Guy Korland
    @gkorland
    You can use the redismod Docker image, which comes with all the modules preinstalled: docker pull redislabs/redismod:latest
    Aman Singh
    @amansheaven
    one more query, can gears process pub/sub messages?
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    @amansheaven currently there is no good way to achieve it. Can you share the use case? Maybe we can come up with another idea?
    Aman Singh
    @amansheaven
    Yeah, I was thinking of making a random chat website that connects people based on sentiment analysis of what they have been talking about. To achieve this communication I am using WebSockets; my library gives me the option to use Redis as the broker, which basically creates a channel and executes it accordingly. I wanted to query this Pub/Sub and apply the AI module with the help of Gears.
    I don't think I will be able to finish before the hackathon ends, but it would be helpful if you have any interesting additions to it. I'd be grateful to try them out!
    Thanks @gkorland & @MeirShpilraien for your help!
    Guy Korland
    @gkorland
    @amansheaven thanks for the input and feedback
    Meha
    @mehamasum_twitter
    Hi folks, I was wondering why the write operation is first written to a hash key and then to a stream in the Write Behind recipe in this blog post: https://redislabs.com/blog/redisgears-serverless-engine-for-redis/
    I mean, why not write directly to the stream? Are there any extra benefits to this architecture?
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Mainly the use case was that you want to replicate Redis hashes to some target database. If you want, you can skip that and write directly to streams, but notice that you need to know the stream names and distribute the writes between them. You can get the stream names with a simple RedisGears function:
    GB('KeysOnlyReader').run('<stream prefix>') # returns all the keys that start with '<stream prefix>'
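    The prefix argument above is a glob-style pattern, like the ones KEYS and SCAN accept. As a rough plain-Python illustration, fnmatchcase behaves close to (though not exactly like) Redis glob matching; the key names below are made up:

```python
from fnmatch import fnmatchcase

# Keys a KeysOnlyReader run might scan; only those matching the glob
# pattern 'stream:*' would come back as stream names.
keys = ["stream:emea", "stream:apac", "user:42", "streams"]
stream_keys = [k for k in keys if fnmatchcase(k, "stream:*")]
```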
    Valery Lavrentiev
    @lavsurgut
    Hey guys, I have a question about a possible integration of RedisGraph and RedisGears. Do you know how one can react to RedisGraph changes and perform some computation in RedisGears? It seems that one has to use the redisgraph-py library to connect separately to Redis and extract records when a trigger is called (for example). Is there any other way to get RedisGears started and pass in input data from RedisGraph?
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    hey @lavsurgut,
    so currently there is no keyspace-notification trigger for graph changes. There are two workarounds I can think of: one is to write to another key whenever you update the graph and register on changes to that key; the other is to update the graph directly from a Gears function. If you choose the second approach, notice that RedisGraph moves the execution to another thread, so you will not get a reply when running the 'execute' function. To work around that, you need to wrap the graph command inside a Lua script; that way RedisGraph will not move the execution to a background thread and you will get the reply inside the Gears function. I know those are a lot of workarounds. We do plan a direct C-level API between RedisGraph and RedisGears (just like we have with RedisAI), and I am sure graph change events will be part of that API (you are not the first to ask for it)...
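    A sketch of that Lua-wrapping workaround: EVAL runs its script on the calling thread, so the graph reply comes back synchronously. The graph key and query below are placeholders, and execute() is the helper RedisGears exposes inside a Gears function, so the commented call only runs inside Redis:

```python
# Wrapping GRAPH.QUERY in a Lua script keeps it on the calling thread,
# so the caller receives the reply instead of losing it to a background
# thread.
GRAPH_WRAPPER = "return redis.call('GRAPH.QUERY', KEYS[1], ARGV[1])"

# Inside a Gears function one would then call something like
# (key name and query are hypothetical):
#   reply = execute('EVAL', GRAPH_WRAPPER, 1, 'mygraph', 'MATCH (n) RETURN n')
```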
    Guy Korland
    @gkorland
    We have a new Discord channel https://discord.gg/6yaVTtp
    Sam
    @jsam
    Hey guys, quick question about the StreamReader... the docs say the output is always a dict, but if I provide the batch argument, I receive a list where the first value is a message id and the second is the payload of the message. In this "batch" mode there is no stream_key_name. Is this a bug or expected behavior? If it's expected, how can I set up a trigger every 200 messages that can figure out which stream they are coming from?
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Hey @jsam, which RedisGears version are you using? You should get the same record; the only difference is the number of records you get.
    If you want a trigger every 200 messages, you should set batch to 200.
    Sam
    @jsam
    I'm using the 1.0.0 one, let me try to pull the latest one and check
    Sam
    @jsam
    Ok, I can confirm that the above-mentioned problem is not present in the latest 1.0.2 image. Thanks for the help @MeirShpilraien
    I have another question though: in this batch mode I would expect .foreach(lambda x: ...) to give x as a batch of the size specified in .register; however, I'm getting each record replayed one by one... how can I process the entire batch at once?
    I'm using the following example:
    gb = GearsBuilder("StreamReader")
    gb.foreach(lambda x: process_batch(x))
    gb.register("livestream:*", trimStream=False, batch=STREAM_BATCH_SIZE)
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    @jsam you will need to aggregate all the records into a list:
    .aggregate([], lambda a, r: a + [r], lambda a, r: a + r) IIRC
    Sam
    @jsam
    perfect, it works! Thanks a lot!
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Sure
    Istemi Ekin Akkus
    @iakkus
    Hello, quick question: I am trying to get the rgsync examples running in cluster mode with 3 masters. The modules and scripts are loaded at each node's start, and then I create the cluster. However, when populating the database, only updates to the first node (i.e., whichever was started first) are propagated to the backend, even though the data is also populated via client connections to the other nodes. Am I missing something? Is there a separate procedure to run it this way?
    Istemi Ekin Akkus
    @iakkus
    (By the way, https://oss.redislabs.com/redisgears/#contact-us doesn't properly link to this chatroom, so that you know)
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Thanks @iakkus for notifying us about the link; we have Discord now (https://discord.com/invite/6yaVTtp), and I will update it.
    Regarding your question, can you please share the output of RG.DUMPREGISTRATIONS from each node? Also, please notice that on an OSS cluster you need to send RG.REFRESHCLUSTER to each node after setting up the cluster (https://oss.redislabs.com/redisgears/commands.html#rgrefreshcluster); have you done that?
    Istemi Ekin Akkus
    @iakkus
    @MeirShpilraien Thanks! I had noticed some "operation not permitted" errors when executing the trigger (i.e., when the entry was added to the stream). With RG.REFRESHCLUSTER, I see everything working now. I'll play around some more.
    nilaykindan
    @nilaykindan
    Hello, I have a question about RedisGears integration with Redis and RedisGraph. RedisGraph currently does not support the map type in properties.
    I need a way to relate map types to RedisGraph properties; I thought that I could join a Redis table with RedisGraph to get the properties
    and make sophisticated queries with RedisGears using both Redis and RedisGraph.
    Is it possible? If it is on the roadmap, could you please give an approximate time, for example the second quarter of 2021?
    Thanks
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Hey @nilaykindan, can you be more specific about what you want to achieve and what you mean by integration? If you need to call graph commands from Gears, there is a small issue with getting back the replies, but you can work around it by wrapping them in a small Lua script.
    Harshdeep
    @Harshdeep1996
    Hi everyone, I have recently started working with Redis Gears
    and I wanted to know if it is possible to listen to a stream?
    gear_builder.foreach(process_stream).register('test:db')
    def process_stream(x): print(x['key'])
    but apparently it does not work; it just closes the pipe too soon
    Meir Shpilraien (Spielrein)
    @MeirShpilraien
    Hey @Harshdeep1996, please notice that we moved to Discord: https://discord.gg/6yaVTtp
    Regarding your question, yes, it is totally possible to listen to a stream. If you can share your entire Gears function, we can try to figure out what's wrong. We can continue the discussion on Discord...
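    For reference, a minimal reshaping of the snippet from the question, with the callback defined before it is chained and an explicit StreamReader (the reader meant for streams). GearsBuilder is only injected inside Redis, so a tiny stub stands in for it here purely to show the call shape; it is not the real implementation:

```python
class GearsBuilder:  # stub: the real class is injected by RedisGears
    def __init__(self, reader):
        self.reader = reader
    def foreach(self, fn):
        self.fn = fn
        return self
    def register(self, pattern, **kwargs):
        self.pattern = pattern
        return self

def process_stream(x):
    # each stream entry arrives as a dict carrying the stream key
    print(x["key"])

# 'test:db' is the key pattern from the question
gb = GearsBuilder("StreamReader").foreach(process_stream)
gb.register("test:db")
```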