    Martin Eigenmann
    same error
    Konstantinos 'Dean' Linaras
    Are you running it on your computer or in some cloud environment?
    Martin Eigenmann
    local computer
    Konstantinos 'Dean' Linaras
    Architecture?
    Martin Eigenmann
    Konstantinos 'Dean' Linaras
    Using any kind of virtualization under the docker daemon?
    Martin Eigenmann
    hmm, how can I check?
    I mean docker is installed directly on ubuntu
    and ubuntu is the only system running on this laptop
    Konstantinos 'Dean' Linaras
    @eigenmannmartin let me do some digging around but that's a really weird error you are getting
    Martin Eigenmann
    yes :-) - I will try out another system and let you know how it goes.
    I will be away for the next 2-3h and will check in again afterwards
    Konstantinos 'Dean' Linaras
    We are also a bit more active in our Slack channel @ lensesio.slack.com
    Laura Ulmer
    Howdy... I'm just getting familiar and am a bit ham-fisted... Can someone help me understand why I am getting:
    ```
    Status: Downloaded newer image for landoop/kafka-lenses-dev:latest
    docker: Error response from daemon: driver failed programming external connectivity on endpoint amazing_lalande (8f31ce4995c7356777b2b16ecbb50d067ffdea947769f5f9af36f63ae44ae943): Bind for failed: port is already allocated.
    ERRO[0118] error waiting for container: context canceled
    ```
    Spiros Economakis
    @missulmer hi. It seems that port 9092 is already in use on your host machine. How do you run kafka-lenses-dev? Could you share the command?
    Laura Ulmer

    ```
    Lauras-MacBook-Pro:~ lauraulmer$ docker run -e ADV_HOST= \
      -e EULA="https://dl.lenses.stream/d/?id=REGISTER_FOR_KEY" \
      --rm -p 3030:3030 -p 9092:9092 landoop/kafka-lenses-dev
    Unable to find image 'landoop/kafka-lenses-dev:latest' locally
    latest: Pulling from landoop/kafka-lenses-dev
    ```
    (man that came out ugly)
    Laura Ulmer
    I'm running a port scan just now.
    Spiros Economakis
    just try `-p 9093:9092` instead for the published port and you will be fine
    Laura Ulmer
    cool thank you
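    Not from the thread, but the clash Spiros diagnosed (another process already bound to 9092 on the host) can be checked up front before running the container. A minimal Python sketch; the port numbers are just the ones `kafka-lenses-dev` publishes:

    ```python
    import socket

    def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
        """Return True if something is already listening on host:port."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.5)
            # connect_ex returns 0 when the connection succeeds, i.e. a listener exists
            return s.connect_ex((host, port)) == 0

    # the host ports the docker run command above tries to publish
    for port in (3030, 9092):
        print(port, "in use" if port_in_use(port) else "free")
    ```

    If 9092 turns out to be taken, remapping the host side of the publish flag (e.g. `-p 9093:9092`) avoids the "port is already allocated" bind error.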
    Spiros Economakis
    Fabio Guelfi
    Unable to download license. Maybe the link was wrong or the license has expired?
    Please check and try again. If the problem persists, please contact us.
    Harsh Gupta

    @/all I am not able to get rid of dependency conflicts for kafka-testing.
    Sometimes, for Kafka versions below 2.4.0, I get:

    ```
    [error] Caused by: sbt.ForkMain$ForkError: java.lang.ClassNotFoundException: org.apache.kafka.test.TestUtils
    ```

    and sometimes, for Kafka version 2.4.0:

    ```
    [error] Caused by: sbt.ForkMain$ForkError: java.lang.ClassNotFoundException: kafka.utils.ZkUtils
    ```

    I have added the following, which is recommended:

    ```scala
    "org.apache.kafka" %% "kafka" % "2.3.1",
    "org.apache.kafka" % "kafka-clients" % "2.3.1",
    "org.apache.kafka" %% "kafka" % "2.3.1" % Test classifier "test",
    "org.apache.kafka" % "kafka-clients" % "2.3.1" % Test classifier "test",
    "com.landoop" %% "kafka-testing" % "2.1" % Test,
    ```

    kindly help
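    No resolution appears in the thread; one common fix for mixed transitive Kafka versions in sbt is to pin every Kafka artifact to a single version with `dependencyOverrides`. A sketch, under the assumption that `com.landoop:kafka-testing:2.1` pulls a different Kafka version transitively:

    ```scala
    // build.sbt sketch: keep all Kafka artifacts on one version
    val kafkaVersion = "2.3.1"

    libraryDependencies ++= Seq(
      "org.apache.kafka" %% "kafka"         % kafkaVersion,
      "org.apache.kafka" %  "kafka-clients" % kafkaVersion,
      "org.apache.kafka" %% "kafka"         % kafkaVersion % Test classifier "test",
      "org.apache.kafka" %  "kafka-clients" % kafkaVersion % Test classifier "test",
      "com.landoop"      %% "kafka-testing" % "2.1"        % Test
    )

    // force any transitive Kafka pulls (e.g. from kafka-testing) onto the same version
    dependencyOverrides ++= Seq(
      "org.apache.kafka" %% "kafka"         % kafkaVersion,
      "org.apache.kafka" %  "kafka-clients" % kafkaVersion
    )
    ```

    `sbt evicted` (or the `dependencyTree` view) would show whether two Kafka versions are actually on the classpath.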


    Hi all,
    I am trying to launch the Lenses Box docker sandbox but it is failing with the error below:

    ```
    2020-03-18 12:15:23,059 INFO exited: lenses (exit status 100; not expected)
    2020-03-18 12:15:23,101 INFO spawned: 'lenses' with pid 7817
    2020-03-18 12:15:24,162 INFO success: lenses entered RUNNING state, process has stayed up for > than 1 seconds (startsecs)
    ```

    In the logs I could see the error message below:

    ```
    2020-03-18 12:08:32,077 ERROR [c.l.k.l.Main$:758] Could not validate license.

    Your license is not valid.

    Please contact us at info@lenses.io to obtain a valid license.
    Your customer id is:'6667f8ad-d54a-45f3-96b3-xxxxxxxxxx', and the reference code is:'l5!'
    ```

    Kindly help!

    Issue resolved after a system restart!!!
    Ritesh Nadhani
    Hello, I am just trying to figure out if https://github.com/lensesio/kafka-connect-ui is still maintained and can be used in production?

    I am trying to run it inside our k8s cluster (no Helm charts). I was able to run it as a Deployment (passing CONNECT_URL as an env variable) with a Service/Ingress set up, but when I go to the UI it shows:

    ```
    KAFKA CONNECT
    SELECT CLUSTER : {{connectEndPoint.NAME}}
    Missing Cluster Configuration

    In order to configure kafka-connect-ui you need to add an env.js file in the root directory of the app.
    Example env.js structure:

    var clusters = [
      {
        NAME: "prod", //Required
        KAFKA_CONNECT: "http://kafka-connect.url", //Required
        KAFKA_TOPICS_UI: "http://kafka-topics-ui.url", //Optional
        KAFKA_TOPICS_UI_ENABLED: true, //Optional
        COLOR: "#141414" // Optional
      },
      {
        KAFKA_CONNECT: "https://kafka-connec.dev.url"
      },
      {
        KAFKA_CONNECT: "http://localhost:8083"
      }
    ]
    ```

    When I log inside the pod, I see an updated env.js with something like:

    ```
    var clusters = [ { NAME: "kafka-connect-1", KAFKA_CONNECT: "/api/kafka-connect-1" } ]
    ```
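    The question goes unanswered in the log; one pattern that fits the described setup (an assumption, not from the thread; names and the mount path are illustrative) is to supply a custom env.js via a ConfigMap mounted over the image's copy:

    ```yaml
    # Sketch: ConfigMap carrying env.js for kafka-connect-ui
    apiVersion: v1
    kind: ConfigMap
    metadata:
      name: kafka-connect-ui-env
    data:
      env.js: |
        var clusters = [
          {
            NAME: "prod",                               // Required
            KAFKA_CONNECT: "http://kafka-connect:8083"  // Required
          }
        ];
    ---
    # In the Deployment's pod spec (fragment):
    # volumes:
    #   - name: env-js
    #     configMap:
    #       name: kafka-connect-ui-env
    # containers[0].volumeMounts:
    #   - name: env-js
    #     mountPath: /caddy/env.js   # assumption: the app root where env.js is read
    #     subPath: env.js
    ```

    Mounting with `subPath` replaces just that one file, so the rest of the app directory stays intact.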
    Tim Chan
    has anyone seen this error before?
    ```
    2020-04-15 02:23:49,696 ERROR There was an error writing the records Boxed Error (com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter) [task-thread-logger-to-cassandra-connector-0]
    java.util.concurrent.ExecutionException: Boxed Error
        at scala.concurrent.impl.Promise$.resolver(Promise.scala:59)
        at scala.concurrent.impl.Promise$.scala$concurrent$impl$Promise$$resolveTry(Promise.scala:51)
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
        at scala.concurrent.Promise$class.complete(Promise.scala:55)
        at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:157)
        at scala.concurrent.Promise$class.failure(Promise.scala:104)
        at scala.concurrent.impl.Promise$DefaultPromise.failure(Promise.scala:157)
        at com.datamountaineer.streamreactor.connect.concurrent.ExecutorExtension$RunnableWrapper$$anon$1.run(ExecutorExtension.scala:29)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
    Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.landoop.json.sql.JacksonJson$
        at com.datamountaineer.streamreactor.connect.converters.Transform$.apply(Transform.scala:93)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$com$datamountaineer$streamreactor$connect$cassandra$sink$CassandraJsonWriter$$insert$1.apply(CassandraJsonWriter.scala:182)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$com$datamountaineer$streamreactor$connect$cassandra$sink$CassandraJsonWriter$$insert$1.apply(CassandraJsonWriter.scala:181)
        at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter.com$datamountaineer$streamreactor$connect$cassandra$sink$CassandraJsonWriter$$insert(CassandraJsonWriter.scala:181)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$4$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(CassandraJsonWriter.scala:161)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$4$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(CassandraJsonWriter.scala:159)
        at scala.collection.Iterator$class.foreach(Iterator.scala:891)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
        at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
        at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$4$$anonfun$apply$1.apply$mcV$sp(CassandraJsonWriter.scala:159)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$4$$anonfun$apply$1.apply(CassandraJsonWriter.scala:159)
        at com.datamountaineer.streamreactor.connect.cassandra.sink.CassandraJsonWriter$$anonfun$4$$anonfun$apply$1.apply(CassandraJsonWriter.scala:159)
        at com.datamountaineer.streamreactor.connect.concurrent.ExecutorExtension$RunnableWrapper$$anon$1.run(ExecutorExtension.scala:30)
        ... 3 more
    ```
    I want to use connect-cli to run the MQTT source connector. I am using Confluent Platform Community Edition on a VM. Can anyone please provide the steps on how to proceed? I found this URL, from which I can download the required files, but I still don't have any clue how to get connect-cli working so that I can set up the source connector. Where should I place the downloaded JAR?
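    The thread leaves this unanswered. As a general sketch (assuming the standard Kafka Connect `plugin.path` mechanism; all paths here are illustrative, not from the thread), downloaded connector JARs go into a directory that the Connect worker's `plugin.path` points at:

    ```shell
    # Illustrative plugin directory; substitute your installation's locations
    PLUGIN_DIR=/tmp/connect-plugins/kafka-connect-mqtt
    mkdir -p "$PLUGIN_DIR"
    # copy the downloaded connector JAR(s) into it, e.g.:
    #   cp kafka-connect-mqtt-*.jar "$PLUGIN_DIR"/
    # then add the parent directory to the worker config (connect-distributed.properties):
    #   plugin.path=/usr/share/java,/tmp/connect-plugins
    # restart the Connect worker so it rescans plugins, and create the connector
    # with connect-cli or the Connect REST API
    echo "prepared $PLUGIN_DIR"
    ```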
    Fatai Jimoh
    I ran into this problem when I ran docker-compose up kafka-cluster:

    ```
    kafka-cluster_1 | 2020-05-13 20:10:06,564 INFO exited: smoke-tests (exit status 0; expected)
    kafka-cluster_1 | 2020-05-13 20:12:06,689 INFO exited: logs-to-kafka (exit status 0; expected)
    kafka-cluster_1 | 2020-05-13 20:14:06,683 INFO exited: sample-data (exit status 0; expected)
    ```

    Please help
    Is it possible to use the event time from the data payload in a SQL processor, just like in ksql?

    ```
    ksql> CREATE STREAM event_data_by_loadDate (loadDate LONG, code VARCHAR, user VARCHAR)
          WITH (KAFKA_TOPIC='event_data',
    ```
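    For the ksqlDB side of the comparison, event time is taken from a payload field via the `TIMESTAMP` property in the `WITH` clause. A sketch completing a statement like the one above (the `VALUE_FORMAT` is an assumption; whether Lenses SQL processors offer an equivalent is the open question here):

    ```sql
    CREATE STREAM event_data_by_loadDate (loadDate BIGINT, code VARCHAR, user VARCHAR)
      WITH (KAFKA_TOPIC='event_data',
            VALUE_FORMAT='JSON',      -- assumption: JSON payloads
            TIMESTAMP='loadDate');    -- use the payload field as the event time
    ```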

    I'm trying to install lenses.io in an AKS cluster but it does not seem to be working. The same code/deployment file works when run locally using docker-compose up.

    I am getting the error below in the AKS cluster after deploying lenses.io:

    ```
    2020-10-09 11:53:26,377 WARN [i.l.c.a.BrokersStatusActor:32] [default-akka.actor.default-dispatcher-4] There was an error listing broker nodes.
    org.apache.kafka.common.errors.TimeoutException: Timed out waiting for a node assignment.
    ```

    I also have a message like this:

    ```
    2020-10-09 11:53:50,398 INFO [a.a.ActorSystemImpl:95] [default-akka.actor.default-dispatcher-5] Request: GET-> returned 200 OK in 7ms
    ```

    but when I try to connect to the link, it is not working either.

    While running the docker-compose command with the config below, I receive an error saying '2020-12-12 05:10:58,330 INFO exited: caddy (exit status 2; not expected)'.
    ```yaml
    version: '3.6'
    services:
      kafka-cluster:
        container_name: kafka_cluster
        image: landoop/fast-data-dev:latest
        environment:
          - ADV_HOST=
          - EULA=http://dl.lenses.io/d/?id=dca6a57f-2abc-48b8-897b-fef05c28184d
        ports:
          - "2181:2181"
          - "9092:9092"
          - "8081:8081"
          - "8082:8082"
          - "3030:3030"
          - "9581-9585:9581-9585"
    ```
    Kelvin Mungai
    Hello, while working with a custom serde, is there any way of getting the headers of the current message?
    Guilherme Garcia Alves 🏴🅰️➕
    Hi, I'm studying Kafka and discovered the kafka-lenses-dev image. I don't know if this question is a bit silly, but I don't know how to use kSQL in it. If someone can show me some tutorial or example, I'd appreciate it. tks
    Hello, I am using a JDBC source connector in Landoop's Kafka-Connect-UI. The configuration is as follows:

    ```
    name=JdbcSourceConnector
    table.whitelist: Table1
    ```

    However I get the error:

    ```
    Invalid value java.sql.SQLException: No suitable driver found for jdbc:sqlserver://kafka.database.windows.net:1433/kafka-database-azure?user=.....
    ```

    I have downloaded the JDBC driver for Windows and defined the classpath, and also added this to the docker-compose file:

    ```yaml
    volumes:
      - C:\Users\med\Downloads\sqljdbc_9.2.1.0_enu\sqljdbc_9.2\enu\mssql-jdbc-9.2.1.jre8.jar:/opt/confluent-3.3.0/share/java/kafka-connect-jdbc/lib/mssql-jdbc-9.2.1.jre8.jar
    ```

    but without any help, the same error. Does anybody have an idea?
    See if this helps.. I had faced a similar issue initially and placed the jar files in multiple locations: the Java classpath and the JDBC connector lib folder. The debug option shown in the blog above should help you understand whether your plugin/plugin-location is being loaded at startup.
    Thank you very much Muthu, I will look at them and try the solutions
    Hi, I have a problem using the FileStreamSinkConnector from Kafka Connect UI. It cannot write to (or find) the file to sink data into. The connector config and the error message follow. The source part, a FileStreamSourceConnector, works very well and successfully fills the topic demo-2-distributed. I appreciate any help and suggestions very much.
    I have also created the sink file (demo-file-sink.txt) in advance, to make sure that the error has nothing to do with the creation of the file.