    Magnus Edenhill
    @edenhill
    @oronsh -o latest
    Oron Sharabi
    @oronsh
    Thanks @edenhill
    Haruhiko Nishi
    @hanishi
    Hi, I am new to this room and I joined as I am facing an issue: kafkacat accessing a topic through a NodePort from outside the Kubernetes cluster (actually it's minikube) does not seem to consume any records. Is setting up a NodePort for Kafka not sufficient?
    Magnus Edenhill
    @edenhill
    Haruhiko Nishi
    @hanishi
    @edenhill Thank you for the pointer. Kafkacat has become an indispensable tool when working with Kafka topics, btw. Thank you!
    Magnus Edenhill
    @edenhill
    Glad to hear it!
    sharonsyra
    @Sharonsyra
    Hi all,
    I am trying to create a command that will enable me to see messages from offset m to n. I am able to get the first x messages.
    First - kafkacat -C -b -t topic -o earliest -c X
    Last - kafkacat -C -b -t topic -p 0 -o X
    Next - kafkacat -C -b -t topic -p 0 -o offset -c X
    Any ideas? I would appreciate the help.
    Magnus Edenhill
    @edenhill
    how about -o START_OFFSET -c END_OFFSET-START_OFFSET
    sharonsyra
    @Sharonsyra
    I will try that, thank you. Yeah, I was thinking something along those lines: ensuring m is always present for option n (m -> n), then getting the difference between the two and using that as the count value. My bad. Thank you for your time. :slight_smile:
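    The offset-range approach discussed above can be sketched as a small shell snippet. The broker, topic, and offsets are placeholders, and the kafkacat invocation is only echoed here since it needs a live broker:

    ```shell
    # Consume messages from offset m (inclusive) up to n (exclusive)
    # by computing the count from the two offsets.
    BROKER=localhost:9092   # hypothetical broker address
    TOPIC=my-topic          # hypothetical topic
    START=100               # m: first offset to read
    END=150                 # n: stop before this offset
    COUNT=$((END - START))
    # The actual command (echoed rather than run):
    echo "kafkacat -C -b $BROKER -t $TOPIC -p 0 -o $START -c $COUNT"
    ```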
    Yannick Koechlin
    @yannick
    there is no support for the topic admin API (deleting topics) yet, right?
    Magnus Edenhill
    @edenhill
    @yannick Unfortunately no
    matrixbot
    @matrixbot
    @julius:mtx.liftm.de edenhill: I'm curious what you think of https://github.com/jcaesar/kafkacat-static — should I try and get that merged into the main repo?
    Magnus Edenhill
    @edenhill
    I'm not too fond of introducing and maintaining a new build system (meson). There's also some recent work in mklove and librdkafka (not yet merged to master) that will make static building easier
    matrixbot
    @matrixbot
    @julius:mtx.liftm.de kk. Not terribly difficult to maintain it outside until that makes it, so I'll just keep doing that.
    sharonsyra
    @Sharonsyra

    Hi folks,
    I want to use the high-level consumer group feature in kafkacat.
    This is my command: kafkacat -b ${KAFKA_HOST}:${KAFKA_PORT} ${KAFKA_CAT_OPTS} -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN -X sasl.username=${KAFKA_API_KEY} -X sasl.password=${KAFKA_API_SECRET} -X api.version.request=true -b $Instance -t $TOPIC

    I, however, keep getting this error: % ERROR: Failed to subscribe to 0 topics: Local: Invalid argument or configuration
    At first I thought the cause was that the topic I was testing with had one partition, and it was failing because I was trying to add another consumer to the consumer group. I then tested with a topic with more partitions. The same thing happened. What configuration am I getting wrong? Where am I going wrong?
    Thank you.

    sharonsyra
    @Sharonsyra
    I see where I was going wrong :smile:. It expects a list of topics. No need for the -t argument.
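    The fix described above (balanced consumer mode takes a group name and a topic list instead of -t) can be sketched like this; all names are placeholders, and the command is echoed rather than executed since it needs a live broker:

    ```shell
    # kafkacat's high-level/balanced consumer mode: -G <group> <topic...>
    # Note: no -t flag -- the topics are listed after the group name.
    BROKER=localhost:9092   # hypothetical broker
    GROUP=mygroup           # hypothetical consumer group
    TOPICS="topic1 topic2"  # hypothetical topic list
    CMD="kafkacat -b $BROKER -G $GROUP $TOPICS"
    echo "$CMD"
    ```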
    ajayakumar-jayaraj
    @ajayakumar-jayaraj
    I have Confluent Kafka with SSL enabled using JKS. How do I configure the Python client for it? Can someone please point me to documentation for the SSL config properties?
    Magnus Edenhill
    @edenhill
    @ajayakumar-jayaraj Search for something like: convert jks to openssl
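    The conversion hinted at above (JKS to PEM files usable by librdkafka-based clients) typically goes via PKCS#12. The file names here are placeholders, and the commands only run if the keystore is present:

    ```shell
    # Convert a Java keystore (JKS) into PEM certificate/key files.
    # File names are hypothetical; adjust to your keystore.
    JKS=client.jks
    if [ -f "$JKS" ]; then
      # 1. JKS -> PKCS#12 (keytool ships with the JDK):
      keytool -importkeystore -srckeystore "$JKS" \
              -destkeystore client.p12 -deststoretype PKCS12
      # 2. Extract the certificate chain and the private key as PEM:
      openssl pkcs12 -in client.p12 -nokeys -out client.crt.pem
      openssl pkcs12 -in client.p12 -nodes -nocerts -out client.key.pem
      STATUS=converted
    else
      STATUS=skipped
      echo "no $JKS found; the commands above show the conversion steps"
    fi
    ```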
    ajayakumar-jayaraj
    @ajayakumar-jayaraj
    @edenhill I tried this: http://maximilianchrist.com/python/databases/2016/08/13/connect-to-apache-kafka-from-python-using-ssl.html. Is there a different doc I should follow for Confluent Kafka?
    armisz
    @armisz
    Building kafkacat 1.5.0 using bootstrap.sh on Ubuntu 18.04 fails:
    Build of avroc FAILED!
    CMake Error at CMakeLists.txt:178 (message):
    libjansson >=2.3 not found
    arnon
    @arnonrodman
    Hi All, my kafkacat is working with one Kafka cluster and not with the other (both the same version). I can read the Kafka messages through the Kafka CLI console consumer. Are there any logs or error messages I can view? One more question: does the latest version support lz4?
    Magnus Edenhill
    @edenhill
    @arnonrodman -d cgrp,fetch,broker should give you more information. Yes, all compression codecs are supported.
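    The debug flags suggested above plug straight into a normal consume command. Broker and topic names are placeholders, and the command is echoed rather than executed:

    ```shell
    # Turn on librdkafka debug contexts to diagnose a consumer that connects
    # but returns nothing: consumer-group, fetch, and broker-level logging.
    DEBUG=cgrp,fetch,broker
    CMD="kafkacat -C -b localhost:9092 -t my-topic -d $DEBUG"
    echo "$CMD"
    ```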
    Ratna
    @nani2ratna_twitter
    Hi, I haven't checked the previous messages, but can anyone help me with installing kafkacat on Ubuntu 19.04?
    I can only install 1.3.1; I even tried sudo apt-get install kafkacat=1.5.0, but no luck.
    It can't find any version later than 1.3.1.
    I need something later than 1.4.0 because I need to use the -F option.
    Mohammad Roohitavaf
    @roohitavaf
    Hi,
    Is it possible to get the committed metadata using kafkacat?
    I mean the one that we can get from rd_kafka_committed
    Magnus Edenhill
    @edenhill
    @roohitavaf Currently no, and that data is typically binary so not sure how we would display it meaningfully.
    Mohammad Roohitavaf
    @roohitavaf
    Thanks @edenhill
    Can we have JWT authentication and authorization?
    Mohammad Roohitavaf
    @roohitavaf
    Can I describe topics with kafkacat and see, for example, what the current retention time is for a topic?
    Mohammad Roohitavaf
    @roohitavaf
    @edenhill Please help.
    When I set "enable.auto.commit=false" I get this
    Configuration property auto.commit.enable is deprecated: [**LEGACY PROPERTY:** This property is used by the simple legacy consumer only. When using the high-level KafkaConsumer, the global `enable.auto.commit` property must be used instead]. If true, periodically commit offset of the last message handed to the application. This committed offset will be used when the process restarts to pick up where it left off. If false, the application will have to call `rd_kafka_offset_store()` to store an offset (optional). **NOTE:** There is currently no zookeeper integration, offsets will be written to broker or local file according to offset.store.method. % Group mygroup2 rebalanced (memberid rdkafka-e0855e1f-26cf-406a-8f9f-ad4df088fba8): assigned: my-topic [0]
    It says to set enable.auto.commit. I am setting that one!
    Why is it complaining??
    Magnus Edenhill
    @edenhill
    @roohitavaf For jwt/oauthbearer auth: create an issue detailing exactly how tokens and stuff will be passed to kafkacat
    @roohitavaf You can ignore that warning, kafkacat sets auto.commit.enable=false by itself, but that should be removed. Can you create an issue please?
    Magnus Edenhill
    @edenhill
    @roohitavaf kafkacat currently doesn't expose any admin APIs (for describing topic config, for example).
    Dwijadas Dey
    @DwijadasDey
    Hi, how can I make sure the ssl.endpoint.identification.algorithm option in the kafkacat command is not https? I have set it to -X ssl.endpoint.identification.algorithm= but it produces: Configuration property "ssl.endpoint.identification.algorithm" cannot be set to empty value.
    Magnus Edenhill
    @edenhill
    @DwijadasDey set it to "none"
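    The fix above in context: the property must be set to the string "none", not left empty. Broker and topic are placeholders, and the command is echoed rather than executed since it needs a live TLS broker:

    ```shell
    # Disable HTTPS hostname verification by setting the property to "none".
    CMD="kafkacat -C -b broker:9093 -t my-topic \
      -X security.protocol=SSL \
      -X ssl.endpoint.identification.algorithm=none"
    echo "$CMD"
    ```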
    Taulant
    @taulanti
    Hello everyone, I'm trying to set up SSL authentication to Kafka following this course (https://www.udemy.com/course/apache-kafka-security), however I get this error: Handshake failed, caused by javax.net.ssl.SSLProtocolException: Unexpected handshake message: server_hello. On Google I'm not able to find an answer specific to this. (Sorry if this is not the right chat topic.)
    Vishal Shivare
    @vishalshivare
    Hi Everyone, in our project we are using Kafka Streams and ksqlDB to process our events. We are able to run the ksqlDB server and to create streams and watch their logs. If we create any stream or table in ksqlDB, it creates a new topic on Kafka, so the streamed data ends up stored across multiple Kafka topics. Now I want to process the data that lives in those topics.
    What ways are available to consume that data? I already use kafkacat and Kafka tools to tail those topic logs, but I want to do some processing on the data so that in the future I can connect a BI tool and generate dashboards from it.
    Magnus Edenhill
    @edenhill
    @vishalshivare Use any of the Kafka clients for the language you want to work in. I.e., confluent-kafka-python if your BI is in Python.
    @taulanti What error are you seeing in kafkacat? Try with -d security,broker for more info
    Gordon Rennie
    @gordon-rennie
    I am using kafkacat to consume data from an avro-encoded topic, outputting in JSON format. Is it possible to make it decode avro Decimal values into a number (or a string of the number)? The behaviour I am seeing instead is that my decimals are just being dumped out as bytes, interpreted as a string with weird glyphs (e.g. Æ).
    Yordan Strahinov
    @Lagoran

    Hi All,
    I was digging into the documentation to find a way to dump stats for per-topic CPU and RAM utilization, but I am only finding the "Per-Topic metric" described here:

    https://docs.confluent.io/current/kafka/monitoring.html

    Any suggestions on how to extract this information, or an idea of how to calculate it, if you know the formula behind the listed metrics?

    Regards ;)

    Gerry Fletcher
    @gerryfletch
    Afternoon all :) I have a service that publishes to multiple queues, which I'm trying to write integration tests for. Due to the startup cost of consumers, I am trying to run one consumer but reset the offset to the latest message before every test, such that each test should only consume new messages published during its lifetime. Originally this was done by using a logical seek to OFFSET_END, but I saw a recommendation online to use the end offset from get_watermark_offsets and commit the partitions synchronously. That's working fine, but I'm finding tests that are unfortunately still consuming messages from previous tests. If anyone has any guidance on writing these kinds of consumer-driven tests that avoid repeated messages and other race conditions, please do let me know :(
    Harmeet Singh(Taara)
    @harmeetsingh0013
    Hey guys, is it possible to check the size of the partitions using kafkacat?
    Harmeet Singh(Taara)
    @harmeetsingh0013
    Is there any way to list the partitions and offsets of a Kafka topic using kafkacat?
    ˈt͡sɛːzaɐ̯
    @julius:mtx.liftm.de
    [m]
    @harmeetsingh0013: Yes. You can get a list of topics/partitions with -L and query the latest offsets with -Q -t sometopic:42:-1 (where 42 is the partition). An explanation is in the command's help output.
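    The two commands mentioned above, side by side: -L prints cluster metadata (topics, partitions, replicas) and -Q queries an offset, with -1 as the timestamp meaning the latest offset. The broker and topic names are placeholders, and the commands are echoed rather than executed:

    ```shell
    # List metadata, then query the latest offset of partition 42.
    BROKER=localhost:9092
    LIST_CMD="kafkacat -L -b $BROKER"
    QUERY_CMD="kafkacat -Q -b $BROKER -t sometopic:42:-1"
    echo "$LIST_CMD"    # topics, partitions, replicas
    echo "$QUERY_CMD"   # latest offset of sometopic partition 42
    ```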
    charles lescot
    @clescot
    hi, is there any way to use kafkacat through a proxy (socks ?) ?