kommuri44
@kommuri44
Hi All
I'm a newbie to development. I'm trying to set up MySQL with Kafka. May I know the procedure to use for the setup?
kommuri44
@kommuri44
Hi
Is anyone there? Can you please help me out?
Arpan Chaudhury
@arpanchaudhury
Hi
Sorry for the late reply
@kommuri44 are you trying to poll the change logs from your MSSQL database to Kafka?
Arpan Chaudhury
@arpanchaudhury
Currently we have support for MSSQL.
Check out the sample repository https://github.com/agoda-com/kafka-jdbc-connector-samples. I am opening an enhancement for MySQL support as well.
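The general pattern being discussed here (poll for rows with a timestamp newer than the last recorded offset, then hand them to a producer) can be sketched roughly as follows. This is a minimal, hypothetical Python illustration using sqlite3 and a stand-in `produce` function; the `events` table, its columns, and both helper functions are invented for illustration and are not the connector's actual API.

```python
# Sketch of timestamp-mode incremental polling, the pattern a JDBC source
# connector follows: query only rows newer than the last seen offset,
# emit them, and advance the offset. Names here are hypothetical.
import sqlite3

def poll_new_rows(conn, last_seen):
    """Fetch rows whose created_at is strictly newer than last_seen."""
    cur = conn.execute(
        "SELECT id, payload, created_at FROM events "
        "WHERE created_at > ? ORDER BY created_at",
        (last_seen,),
    )
    return cur.fetchall()

def run_poll_cycle(conn, produce, last_seen):
    """One poll cycle: emit each new row, return the advanced offset."""
    for _id, payload, created_at in poll_new_rows(conn, last_seen):
        produce(payload)  # stand-in for a Kafka producer send
        last_seen = max(last_seen, created_at)
    return last_seen
```

Each cycle only reads rows inserted since the previous cycle, which is why the timestamp column (here `created_at`, as in the config later in this discussion) must be monotonically increasing for new inserts.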
kommuri44
@kommuri44
@arpanchaudhury, yes. I'm trying to pull the changes from a MySQL database to Kafka.
Arpan Chaudhury
@arpanchaudhury
Please wait for agoda-com/kafka-jdbc-connector#2 to be released. In the meantime you can check out the https://github.com/agoda-com/kafka-jdbc-connector-samples project, where we have shown how to use Kafka JDBC Connector with MSSQL.
Arpan Chaudhury
@arpanchaudhury
@kommuri44 agoda-com/kafka-jdbc-connector#2 has been completed. Please check the sample project for usage.
Arpan Chaudhury
@arpanchaudhury
@atiqsayyed Have you already completed agoda-com/kafka-jdbc-connector#4?
Atiq Sayyed
@atiqsayyed
@arpanchaudhury I'll soon complete that
Arpan Chaudhury
@arpanchaudhury
Thanks @atiqsayyed
Arpan Chaudhury
@arpanchaudhury
@/all The CLA is ready and we are ready to accept contributions from outside Agoda.
Atiq Sayyed
@atiqsayyed
@arpanchaudhury I've created a PR, agoda-com/kafka-jdbc-connector#23, for PostgreSQL.
Arpan Chaudhury
@arpanchaudhury
@atiqsayyed Thanks for the PR, I will review it soon.
sushmagokula13
@sushmagokula13
Hi all, I have set up Kafka Connect, but the Kafka Avro consumer is unable to listen. Can somebody please help me out with this?
Juan José Morales
@juanux
Hi all. Right now I'm facing an exception that is obviously killing the pod: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "kafka-producer-network-thread | connector-producer-". I already assigned a lot of memory (4 GB) to the JVM and it is still happening. I'm not loading the whole table; I'm loading only the latest inserts after the connector runs.
This is the config I'm using:
{
  "name": "myschema.mytable",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://....",
    "connection.password": "...",
    "connection.user": "..",
    "mode": "timestamp",
    "timestamp.column.name": "created_at",
    "timestamp.initial": -1,
    "query": "select issuer_id, isin, cast(request_uuid as varchar(50)), client, status, batch_id, s3_bucket_url, created_at from mytable",
    "topic.prefix": "wmimport_status_updated",
    "validate.non.null": false,
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter.schema.registry.url": "http://schema-registry:8081",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "transforms": "ValueToKey,TimestampConverter",
    "transforms.ValueToKey.type": "org.apache.kafka.connect.transforms.ValueToKey",
    "transforms.ValueToKey.fields": "issuer_id,isin",
    "transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.TimestampConverter.format": "yyyy-MM-dd HH:mm:ss",
    "transforms.TimestampConverter.target.type": "string",
    "transforms.TimestampConverter.field": "created_at"
  }
}
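One thing worth checking with an OutOfMemoryError in a JDBC source connector is how many rows are buffered in memory per poll. As a hedged suggestion only, the Confluent JdbcSourceConnector exposes `batch.max.rows` (a cap on rows buffered per batch) and `poll.interval.ms` (spacing between polls); whether these resolve this particular OOM is an assumption, and the values below are illustrative, not recommendations:

```json
{
  "batch.max.rows": 100,
  "poll.interval.ms": 5000
}
```

These keys would be added alongside the other entries under `"config"` in the connector definition above.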
Juan José Morales
@juanux