Maciej Mensfeld
@mensfeld
thanks - doing our best
anhtuanluu36
@anhtuanluu36
Hi all
I have an issue when connecting to Kafka via SSL
Here is my config:
config.kafka.ssl_client_cert = Rails.root.join(ENV['KARAFKA_CLIENT_CERT']).read if ENV['KARAFKA_CLIENT_CERT'].present?
config.kafka.ssl_client_cert_key = Rails.root.join(ENV['KARAFKA_CLIENT_KEY']).read if ENV['KARAFKA_CLIENT_KEY'].present?
config.kafka.ssl_ca_cert = Rails.root.join(ENV['KARAFKA_CA_CERT']).read if ENV['KARAFKA_CA_CERT'].present?
It works on 1.2.8 but it doesn't work on 1.2.13
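For reference, those three lines would sit inside the app's setup block; a minimal sketch assuming Karafka 1.2-style configuration, with the class name and broker address below being illustrative:
class KarafkaApp < Karafka::App
  setup do |config|
    # SSL brokers use the kafka+ssl:// scheme (broker address is illustrative)
    config.kafka.seed_brokers = %w[kafka+ssl://broker.example.com:9093]
    # Read the PEM contents from the files pointed at by the env vars, as above
    config.kafka.ssl_ca_cert = Rails.root.join(ENV['KARAFKA_CA_CERT']).read if ENV['KARAFKA_CA_CERT'].present?
    config.kafka.ssl_client_cert = Rails.root.join(ENV['KARAFKA_CLIENT_CERT']).read if ENV['KARAFKA_CLIENT_CERT'].present?
    config.kafka.ssl_client_cert_key = Rails.root.join(ENV['KARAFKA_CLIENT_KEY']).read if ENV['KARAFKA_CLIENT_KEY'].present?
  end
end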
Maciej Mensfeld
@mensfeld
@anhtuanluu36 can you open an issue on GitHub, please?
KylliS
@kyllisi_twitter
Hello! I have a problem using multiple topics on a responder in a Rails application. I followed the wiki https://github.com/karafka/wiki/blob/1.2/Responders.md#registering-topics but at runtime it gives me the error ":required_usage=>["must be greater than or equal to 1"]". With one topic it works without any problems. Is there something that I should add to the configuration, or ...?
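For context, a responder registering more than one topic under Karafka 1.2 conventions looks roughly like the sketch below; the class and topic names are placeholders, and marking the extra topic as not required is one way to keep the required_usage validation from firing when it is skipped:
class ExampleResponder < ApplicationResponder
  topic :main_topic
  topic :side_topic, required: false  # optional: no validation error if respond skips it

  def respond(event)
    respond_to :main_topic, event
    respond_to :side_topic, event
  end
end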
Maciej Mensfeld
@mensfeld
Hey @kyllisi_twitter could you create an issue OR provide an example?
KylliS
@kyllisi_twitter
Sure. Are the responder and the karafka config enough for the example?
Maciej Mensfeld
@mensfeld
@kyllisi_twitter yes. Just provide version info as well. Thanks!
Chad Wilken
@chadwilken
Is there a comparison anywhere between Karafka and Racecar/DeliveryBoy? I already have most of the code written with the latter, but if Karafka handles a bunch of the implementation details then it may be worth making the switch
Maciej Mensfeld
@mensfeld
@chadwilken there isn't one that I know of. I do recall one in terms of performance, but it's old.
Karafka is built with different principles in mind - to provide users with a seamless solution they can use out of the box (pretty much like Rails)
without having to understand too many things
Maciej Mensfeld
@mensfeld
@/all - WaterDrop 2.0 code and API: karafka/waterdrop#106
Ilia Donskikh
@IlyaDonskikh
Hello, guys. I ran two similar Karafka microservices which consume from the same topic. The problem is that when I produce a message to the topic, both consumers receive it and perform the same actions twice. How can I avoid behavior like that? Both consumers are marked with the same consumer group.
Ilia Donskikh
@IlyaDonskikh
The problem was: the microservices should have the same client_id, not only the same consumer_group.
Maciej Mensfeld
@mensfeld
@IlyaDonskikh but this is supposed to work like that ;)
if you have different client ids, and thus different consumer groups, don't expect a different behaviour ;)
Ilia Donskikh
@IlyaDonskikh
I realised that when I saw the consumer group name (which looks like #{client_id}_#{consumer_group}) in the Kafka panel. It was a bit unpredictable for me because the documentation mentioned only the consumer group option.
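For reference, a minimal sketch under Karafka 1.2 conventions (all names illustrative): with the same client_id and the same consumer group drawn in both services, the effective Kafka group id is identical (here images_service_images), so each message is handled by only one of them.
class KarafkaApp < Karafka::App
  setup do |config|
    # Must match across services that should share work, since the effective
    # group id is "#{client_id}_#{consumer_group}"
    config.client_id = 'images_service'
  end
end

KarafkaApp.consumer_groups.draw do
  consumer_group :images do
    topic :images do
      consumer ImagesQueuingConsumer
    end
  end
end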
Ilia Donskikh
@IlyaDonskikh
Anyway, now I'm facing the following problem. I handle all consumed messages with Sidekiq background jobs (not via the karafka-sidekiq-backend gem), so when I reach 10 requests in the queue I would like to stop the consumer so that Kafka redirects work to another microservice, or just wait for a while (and I know how to track the "exception"). I found out that ruby-kafka has a stop method for the consumer and that sounds good for my issue. But I don't know the best way to get the consumer instance from the Karafka::BaseConsumer class, and especially from a specific consumer class instance (in my case ImagesQueuingConsumer < ApplicationConsumer). Or maybe I missed something and skipped the easiest way to solve the issue?
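For illustration only, the setup described above (a consumer handing each message off to a plain Sidekiq job rather than karafka-sidekiq-backend) could look roughly like this; the job class is hypothetical and the exact params API depends on the Karafka version:
class ImagesQueuingConsumer < ApplicationConsumer
  def consume
    params_batch.each do |params|
      # Enqueue a regular Sidekiq worker per message (ImageProcessingJob is hypothetical)
      ImageProcessingJob.perform_async(params.to_h)
    end
  end
end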
Maciej Mensfeld
@mensfeld
Ilia Donskikh
@IlyaDonskikh
1.2. But thanks, I see that it works for 1.2 as well. Awesome framework!
Maciej Mensfeld
@mensfeld
@IlyaDonskikh thanks. Ping me in case of any problems! :)
and don't forget to :star: it if you like it ;)
Jan Wiemers
@janwiemers
hey there
I got started with Karafka this morning and I’m a little confused
I created a setup which is working in Docker
I can send messages to it, but it seems it only logs output when the process gets a SIGTERM
Is there anything I’m doing wrong here?
Maciej Mensfeld
@mensfeld
Hmm
@janwiemers can you show any repro?
what logger do you use?
what Rails version (if any)?
Jan Wiemers
@janwiemers
hey, no, I don’t have a repro as of now
no Rails, standalone
the logger is the default one
and it does log everything out, but only when I terminate the Karafka server
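As a general aside (not something suggested in this thread): log lines that only show up at shutdown in Docker are often just STDOUT buffering; a hedged sketch of the usual workaround, assuming the default logger writes to STDOUT:
require 'logger'

$stdout.sync = true  # flush STDOUT per write instead of on buffer-full / process exit

class KarafkaApp < Karafka::App
  setup do |config|
    config.logger = Logger.new($stdout)  # explicit STDOUT logger (assumption; the default also targets STDOUT)
  end
end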
Jan Wiemers
@janwiemers
@mensfeld
Maciej Mensfeld
@mensfeld
Please open an issue and provide as much info as you can, or a repro script. That should not happen and it's definitely something weird
@janwiemers
but I need to be able to reproduce it
Jan Wiemers
@janwiemers
@mensfeld so yeah, it seems that Karafka only logs out every 5 messages. Could that be?
Maciej Mensfeld
@mensfeld
@janwiemers no
:)
that is impossible
Maciej Mensfeld
@mensfeld
@janwiemers please create an issue. With a repro I should be able to tackle it in an hour max
David Patterson
@davidpatters0n
Can you use Karafka with Avro Turf?
Maciej Mensfeld
@mensfeld
@davidpatters0n no problem at all. You can use custom serializers and deserializers
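For illustration, one way to plug AvroTurf in under Karafka 1.2-style parsers is a custom parser class wired through the routing; the registry URL env var, schema name, and class names below are assumptions:
require 'avro_turf/messaging'

class AvroParser
  AVRO = AvroTurf::Messaging.new(registry_url: ENV['SCHEMA_REGISTRY_URL'])

  # Deserializes messages fetched from Kafka
  def self.parse(message)
    AVRO.decode(message)
  end

  # Serializes payloads delivered through responders
  def self.generate(object)
    AVRO.encode(object, schema_name: 'example_event')
  end
end

# Wired in via the routing, e.g.:
# topic :example_events do
#   consumer ExampleEventsConsumer
#   parser AvroParser
# end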