    Omar Al-Safi
    @omarsmak
    @vidyaranya92 if you want to create a file automatically upon the commitAppendBlob operation, this will only be possible in the next Camel version, 3.5, not currently
    Vidyaranya
    @vidyaranya92
    @oscerd earlier, to make it work, I created a file manually before starting the connector to write data. For dynamic file creation, it does not work. That's why I'm looking for a way to combine the createAppendBlob and commitAppendBlob operations.
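As a hedged illustration of the combination discussed above, a sink-connector properties sketch setting both endpoint options might look like the following. The connector class and property names follow the usual camel-kafka-connector naming pattern and are assumptions, not values verified against the generated docs; per the reply above, automatic blob creation on commitAppendBlob only works from Camel 3.5 onward.

```properties
# Hypothetical sketch: azure-storage-blob sink with append-blob auto-creation.
# Property names assumed from the standard camel.sink.path/endpoint pattern.
connector.class=org.apache.camel.kafkaconnector.azurestorageblob.CamelAzurestorageblobSinkConnector
topics=my-topic
camel.sink.path.accountName=mystorageaccount
camel.sink.path.containerName=mycontainer
camel.sink.endpoint.blobName=data.txt
camel.sink.endpoint.operation=commitAppendBlob
camel.sink.endpoint.createAppendBlob=true
camel.sink.endpoint.accessKey=<key>
```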
    Vidyaranya
    @vidyaranya92
    May I know when it is planned to bump the camel-core version from 3.4.2 to 3.5?
    Andrea Cosentino
    @oscerd
    When it is released upstream; there is no ETA for 3.5.0 at the moment
    Rogério Ferreira
    @rogeriob2br
    Hi @oscerd, I commented on issue #389. I will start testing camel-kafka-connector for a large customer, and I will try to contribute as I find any bugs or inconsistencies in the docs. If possible I'll help to correct ...
    Andrea Cosentino
    @oscerd
    Thanks @rogeriob2br
    Rogério Ferreira
    @rogeriob2br
    @oscerd What is the right branch to base my pull request on?
    Otavio Rodolfo Piske
    @orpiske
    you can use the master branch @rogeriob2br
    Rogério Ferreira
    @rogeriob2br
    @orpiske Thanks
    Yashodhan Ghadge
    @codexetreme
    Hello
    I am facing a difficulty with the SQS connector. More specifically, when we run the connector as a source connector, the Kafka records are created without a key. I wanted to know how to tweak the config or extend the connector to process the SQS record so that I can add a key and then let the connector do its thing and insert it into the specified Kafka topic
    the docs are a bit incomplete from my observation, and finding a solution for extending the connector is proving a bit difficult. I would appreciate some insight on this
    Andrea Cosentino
    @oscerd
    I think you need to look at the Camel documentation first
    And understand how to set a key through camel
    Options
    Also, is this for aws-sqs or aws2-sqs?
    Yashodhan Ghadge
    @codexetreme
    aws2 sqs
    can you perhaps point to the right documentation ?
    I have obviously missed a few important points in setting up camel
    Andrea Cosentino
    @oscerd
    Also, why do you have to manipulate a key if you are consuming? Anyway, the messageId is a header coming through the exchange. I'll have a look anyway.
    Yashodhan Ghadge
    @codexetreme
    we are producing to kafka from sqs (source connector config)
    Andrea Cosentino
    @oscerd
    No, if you are producing from Kafka to SQS you need to use a sink connector
    Yashodhan Ghadge
    @codexetreme
    we are doing the opposite
    SQS-> CONNECTOR -> KAFKA this is our flow
    Andrea Cosentino
    @oscerd
    Then the key is in the header reported in the docs
    Yashodhan Ghadge
    @codexetreme
    you are right, I need to tweak the key to have a certain format
    the key right now is the id from the SQS record; we need to set a custom key based on the data inside the SQS record
    essentially, apply a transformer according to the Kafka Connect API documentation
    something along these lines
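For structured record values, Kafka Connect's built-in Single Message Transforms (SMTs) can already rewrite the key in the connector configuration, without extending the connector. A minimal sketch, assuming a hypothetical value field named orderId and a Struct or Map value:

```properties
# Copy the value field "orderId" into the key, then flatten the key Struct
# down to the raw field value.
transforms=makeKey,extractKey
transforms.makeKey.type=org.apache.kafka.connect.transforms.ValueToKey
transforms.makeKey.fields=orderId
transforms.extractKey.type=org.apache.kafka.connect.transforms.ExtractField$Key
transforms.extractKey.field=orderId
```

If the SQS body arrives as a plain string rather than a structured value, these built-ins will not apply and a custom SMT would be needed instead.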
    Yashodhan Ghadge
    @codexetreme
    the aws2-sqs docs don't seem to have anything that allows us to hook into the data. If this is a missing feature, I am happy to contribute and add transformation functionality to the Camel connectors
    because in my opinion this is quite a common use case, where light edits are made to the data before it is written into Kafka. On the sink side, this functionality already exists in the form of the aggregator
    Andrea Cosentino
    @oscerd
    yes, but the aggregator is a Camel concept. In this case you have the connector, and you can use an archetype https://camel.apache.org/camel-kafka-connector/latest/archetypes.html
    add your transformer to the archetype and use it. We probably need to create a transformer in this case, yes, or a converter, not sure. Can you please open an improvement request on the camel-kafka-connector issue tracker? Thanks
    There are too many connectors; we are doing what we can to integration-test them
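The archetype route suggested above can be sketched as a Maven invocation. The coordinates follow the pattern on the linked archetypes page and the version is purely illustrative; both should be checked against that page before use.

```shell
# Generate a skeleton project that wraps an existing connector so a custom
# transformer class can be added to it (coordinates/version illustrative).
mvn archetype:generate \
  -DarchetypeGroupId=org.apache.camel.kafkaconnector \
  -DarchetypeArtifactId=camel-kafka-connector-extensible-archetype \
  -DarchetypeVersion=0.5.0
```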
    Yashodhan Ghadge
    @codexetreme
    archetypes are the way forward for our use case, I will explore deeper there. I will open the request on the issue tracker as you mentioned, thanks for your time and effort :)
    VISHAL GAURAV
    @vgaur
    Hi everyone,
    I am trying to replay messages in case of some failure, but the offset reset only supports earliest or latest. Is there any way I can seek to an offset based on a timestamp and replay the messages?
    Otavio Rodolfo Piske
    @orpiske
    @vgaur I am not entirely sure, but I think not.
    VISHAL GAURAV
    @vgaur
    Thanks @orpiske
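For the record, outside of Kafka Connect itself the stock Kafka CLI can reset a consumer group's offsets to a timestamp; for a sink connector the consumer group is normally named connect-&lt;connector-name&gt;. A hedged sketch (group, topic, and datetime are illustrative, and the connector should be stopped before the reset and restarted afterwards):

```shell
# Rewind the sink connector's consumer group to the offsets closest to the
# given timestamp; --execute applies the change (omit it for a dry run).
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --group connect-my-sink-connector --topic my-topic \
  --reset-offsets --to-datetime 2021-09-01T00:00:00.000 --execute
```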
    Andrea Cosentino
    @oscerd
    Notice for the whole chat: the channel for discussing camel-kafka-connector has been moved to https://camel.zulipchat.com/
    sayak-ghosh1990
    @sayak-ghosh1990
    Hi everyone, we have implemented the camel-kafka-azure-blob-sink connector (camel-kafka-connector-0.9.x) with a static blob name. But is it possible to create a dynamic blob with the configuration below?
    azure-storage-blob://<storage-account>/<container>?blobName=name-${date:now:yyyyMMdd}.txt&accessKey=<key>&operation=commitAppendBlob&createAppendBlob=true. Please help!
    Andrea Tarocchi
    @valdar
    @sayak-ghosh1990 please use https://camel.zulipchat.com, this chat is no longer active
    sayak-ghosh1990
    @sayak-ghosh1990
    @valdar yes thanks!
    Sarwar Bhuiyan
    @sarwarbhuiyan
    Hello
    I'm trying to use "camel.source.marshal": "avro-jackson" but I keep getting an error:
    com.fasterxml.jackson.databind.JsonMappingException: Can not write Avro output without specifying Schema
        at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._wrapAsIOE(DefaultSerializerProvider.java:509)
        at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:482)
        at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:319)
        at com.fasterxml.jackson.databind.ObjectWriter$Prefetch.serialize(ObjectWriter.java:1514)
        at com.fasterxml.jackson.databind.ObjectWriter._writeValueAndClose(ObjectWriter.java:1215)
        at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:1043)
        at org.apache.camel.component.jackson.AbstractJacksonDataFormat.marshal(AbstractJacksonDataFormat.java:152)
        at org.apache.camel.support.processor.MarshalProcessor.process(MarshalProcessor.java:64)
        at org.apache.camel.processor.errorhandler.RedeliveryErrorHandler$SimpleTask.run(RedeliveryErrorHandler.java:463)
        at org.apache.camel.impl.engine.DefaultReactiveExecutor$Worker.schedule(DefaultReactiveExecutor.java:179)
        at org.apache.camel.impl.engine.DefaultReactiveExecutor.scheduleMain(DefaultReactiveExecutor.java:64)
        at org.apache.camel.processor.Pipeline.process(Pipeline.java:184)
        at org.apache.camel.impl.engine.CamelInternalProcessor.process(CamelInternalProcessor.java:398)
        at org.apache.camel.impl.engine.DefaultAsyncProcessorAwaitManager.process(DefaultAsyncProcessorAwaitManager.java:83)
        at org.apache.camel.support.AsyncProcessorSupport.process(AsyncProcessorSupport.java:41)
        at org.apache.camel.component.sql.SqlConsumer.processBatch(SqlConsumer.java:269)
        at org.apache.camel.component.sql.SqlConsumer$1.doInPreparedStatement(SqlConsumer.java:187)
        at org.apache.camel.componen… (frames truncated in the original paste)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
        at java.base/java.lang.Thread.run(Thread.java:829)
    Caused by: java.lang.IllegalStateException: Can not write Avro output without specifying Schema
        at com.fasterxml.jackson.dataformat.avro.ser.AvroWriteContext$NullContext._reportError(AvroWriteContext.java:577)
        at com.fasterxml.jackson.dataformat.avro.ser.AvroWriteContext$NullContext.createChildObjectContext(AvroWriteContext.java:552)
        at com.fasterxml.jackson.dataformat.avro.AvroGenerator.writeStartObject(AvroGenerator.java:394)
        at com.fasterxml.jackson.databind.ser.std.MapSerializer.serialize(MapSerializer.java:719)
        at com.fasterxml.jackson.databind.ser.std.MapSerializer.serialize(MapSerializer.java:35)
        at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider._serialize(DefaultSerializerProvider.java:480)
    any ideas how I can hook in a schema?
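Outside Camel, the underlying jackson-dataformat-avro requirement behind this trace can be shown directly: the Avro backend refuses to write until the writer is given an AvroSchema. A minimal sketch (POJO and field names are illustrative; it requires the jackson-dataformat-avro dependency, so it is not runnable with the JDK alone):

```java
import com.fasterxml.jackson.dataformat.avro.AvroMapper;
import com.fasterxml.jackson.dataformat.avro.AvroSchema;

public class AvroSchemaSketch {
    // Illustrative payload type; any serializable POJO works.
    public static class Order {
        public String id;
        public double amount;
    }

    public static void main(String[] args) throws Exception {
        AvroMapper mapper = new AvroMapper();
        // Derive an Avro schema from the POJO and attach it to the writer;
        // skipping this step is exactly what produces "Can not write Avro
        // output without specifying Schema" in the trace above.
        AvroSchema schema = mapper.schemaFor(Order.class);
        Order order = new Order();
        order.id = "o-1";
        order.amount = 9.99;
        byte[] bytes = mapper.writer(schema).writeValueAsBytes(order);
        System.out.println("wrote " + bytes.length + " bytes");
    }
}
```

How to wire such a schema into camel.source.marshal=avro-jackson depends on the dataformat's schema-resolver configuration; the Camel avro-jackson dataformat documentation is the place to check.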
    Andrea Tarocchi
    @valdar
    @sarwarbhuiyan please use https://camel.zulipchat.com, this chat is no longer active
    kafkamike
    @kafkamike:matrix.org
    Hi, I'm trying to use camel-kafka-connector
    I get [2021-09-14 14:57:30,347] INFO Kafka Connect started (org.apache.kafka.connect.runtime.Connect:57)
    [2021-09-14 14:57:30,359] ERROR Failed to create job for ../../camel.properties (org.apache.kafka.connect.cli.ConnectStandalone:107)
    [2021-09-14 14:57:30,359] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:117)
    java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/apache/camel/kafkaconnector/CamelSinkConnector
    at org.apache.kafka.connect.util.ConvertingFutureCallback.result(ConvertingFutureCallback.java:115)
    at org.apache.kafka.connect.util.ConvertingFutureCallback.get(ConvertingFutureCallback.java:99)
    at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:114)
    Caused by: java.lang.NoClassDefFoundError: org/apache/camel/kafkaconnector/CamelSinkConnector
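A NoClassDefFoundError for the connector class at startup usually means the Kafka Connect worker cannot find the connector archive at all. A hedged worker-config sketch (the directory is illustrative; the unpacked camel-kafka-connector package must sit under it):

```properties
# connect-standalone.properties (the worker config, not the connector
# properties): point plugin.path at the directory containing the unpacked
# connector zip so the plugin loader can resolve
# org.apache.camel.kafkaconnector.CamelSinkConnector.
plugin.path=/opt/kafka/connectors
```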