Updating sink.endpoint.operation to createAppendBlob just creates the file, but no content is written to it.
Omar Al-Safi
@omarsmak
which version of camel-kafka-connector do you use?
Vidyaranya
@vidyaranya92
camel-kafka-connector-0.4.0-20200726.181658-48
camel core using 3.4.2
Andrea Cosentino
@oscerd
I think you have something wrong in your configuration
you said it was working
what has changed?
Omar Al-Safi
@omarsmak
@vidyaranya92 if you want to create a file automatically upon the commitAppendBlob operation, this will only be possible in the next version, Camel 3.5, not currently
Vidyaranya
@vidyaranya92
@oscerd earlier, to make it work, I created the file manually before starting the connector to write data. With dynamic file creation it does not work. That's why I'm looking for a way to combine the createAppendBlob and commitAppendBlob operations.
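For context, a minimal sketch of the kind of sink configuration being discussed. The connector name and class are hypothetical (check the generated docs of the connector you are using); only the `camel.sink.endpoint.operation` property comes from the thread itself:

```properties
# hypothetical connector name and class
name=azure-blob-append-sink
connector.class=org.apache.camel.kafkaconnector.azurestorageblob.CamelAzurestorageblobSinkConnector
topics=my-topic

# createAppendBlob only creates an empty append blob;
# commitAppendBlob is the operation that appends the record content,
# but on Camel 3.4.x the blob must already exist (automatic creation
# is expected in Camel 3.5, per the discussion above).
camel.sink.endpoint.operation=commitAppendBlob
```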
Vidyaranya
@vidyaranya92
May I know when it is planned to bump the camel-core version from 3.4.2 to 3.5?
Andrea Cosentino
@oscerd
When it is released upstream; there is no ETA for 3.5.0 at the moment
Rogério Ferreira
@rogeriob2br
hi @oscerd, I commented on issue #389. I will start testing camel-kafka-connector for a large customer, and I will try to contribute as I find any bugs or inconsistencies in the docs. If possible I will help to correct ...
Andrea Cosentino
@oscerd
Thanks @rogeriob2br
Rogério Ferreira
@rogeriob2br
@oscerd What is the right branch to base my pull request on?
Otavio Rodolfo Piske
@orpiske
you can use the master branch @rogeriob2br
Rogério Ferreira
@rogeriob2br
@orpiske Thanks
Yashodhan Ghadge
@codexetreme
Hello
I am facing a difficulty with the SQS connector. More specifically, when we run the connector as a source connector, the Kafka records are created without a key. I want to know how to tweak the config or extend the connector to process the SQS record so that I can add a key and then let the connector do its thing and insert it into the specified Kafka topic
the docs are a bit incomplete from my observation, and finding a solution for adding to or extending the connector is proving a bit difficult; I would appreciate some insight on this
Andrea Cosentino
@oscerd
I think you need to look at the Camel documentation first
And understand how to set a key through Camel options
Also, is this for aws-sqs or aws2-sqs?
Yashodhan Ghadge
@codexetreme
aws2 sqs
can you perhaps point to the right documentation ?
I have obviously missed a few important points in setting up camel
the aws2-sqs docs don't seem to have anything that allows us to hook into the data. If this is a missing feature, I am happy to contribute and add transformation functionality to the Camel connectors
because in my opinion this is quite a common use case, where light edits are made to the data before it is written into Kafka. On the sink side, this functionality already exists in the form of the aggregator
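Worth noting here: stock Kafka Connect already offers Single Message Transforms (SMTs), which can set a record key on any source connector without modifying the connector itself. A sketch using the built-in ValueToKey and ExtractField transforms, assuming the record value is structured and `messageId` is a hypothetical field in it:

```properties
# Chain two built-in Kafka Connect transforms on the source connector:
# 1. ValueToKey copies the "messageId" field from the value into the key (as a struct).
# 2. ExtractField$Key unwraps that struct so the key is the bare messageId value.
transforms=createKey,extractKey
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.createKey.fields=messageId
transforms.extractKey.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extractKey.field=messageId
```

These built-in transforms only work on structured record values; if the connector emits a plain string body, a custom transform would still be needed.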
Andrea Cosentino
@oscerd
Add your transformer to the archetype and use it. We probably need to create a transformer in this case, yes, or maybe a converter, not sure. Can you please open an improvement request on the camel-kafka-connector issue tracker? Thanks
There are a lot of connectors; we are doing what we can to integration-test them
Yashodhan Ghadge
@codexetreme
archetypes are the way forward for our use case, I will explore deeper there. I will open the request on the issue tracker as you mention. Thanks for your time and effort :)
VISHAL GAURAV
@vgaur
Hi everyone, I am trying to replay messages in case of some failure. But the offset reset only supports earliest or latest. Is there any way I can seek to an offset based on a timestamp and replay the messages?
Otavio Rodolfo Piske
@orpiske
@vgaur I am not entirely sure, but I think not.
VISHAL GAURAV
@vgaur
Thanks @orpiske
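For reference, outside the connector configuration itself, plain Kafka tooling can rewind a consumer group to a timestamp with the standard kafka-consumer-groups.sh script. A sketch; the broker address, group, and topic are placeholders, and the connector should be stopped while resetting (Kafka refuses to reset an active group):

```shell
# Use --dry-run first to preview the offsets, then --execute to apply.
# Sink connectors use a consumer group named connect-<connector-name>.
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --group connect-my-sink-connector \
  --topic my-topic \
  --reset-offsets --to-datetime 2020-08-01T00:00:00.000 \
  --execute
```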
Andrea Cosentino
@oscerd
Notice for the whole chat: the channel for discussing camel-kafka-connector has moved to https://camel.zulipchat.com/