Exception java.util.concurrent.ExecutionException: org.apache.flink.shaded.netty4.io.netty.handler.codec.TooLongFrameException: Response entity too large: DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
Job has been submitted with JobID e2411ae0efe9e8a4f71519aeab2aa5d3
24/04/2020 01:09:33
24/04/2020 01:09:33
24/04/2020 01:09:33 Exception java.util.concurrent.ExecutionException: org.apache.flink.shaded.netty4.io.netty.handler.codec.TooLongFrameException: Response entity too large: DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
24/04/2020 01:09:33 HTTP/1.1 200 OK
24/04/2020 01:09:33 Content-Type: application/json; charset=UTF-8
24/04/2020 01:09:33 Access-Control-Allow-Origin: *
24/04/2020 01:09:33 content-length: 242880344 Try to raise [rest.client.max-content-length]
This occurred while reading the file from s3a://loadtesting-sophi-events-2x/tgam/enriched/original/derived_tstamp=2019_12_19/part-00003-f70da1f8-6267-412c-bcb0-c74d0f90a42f-c000.avro
24/04/2020 01:09:33
24/04/2020 01:09:33 java.util.concurrent.ExecutionException: org.apache.flink.shaded.netty4.io.netty.handler.codec.TooLongFrameException: Response entity too large: DefaultHttpResponse(decodeResult: success, version: HTTP/1.1)
24/04/2020 01:09:33 HTTP/1.1 200 OK
24/04/2020 01:09:33 Content-Type: application/json; charset=UTF-8
24/04/2020 01:09:33 Access-Control-Allow-Origin: *
24/04/2020 01:09:33 content-length: 242880344 Try to raise [rest.client.max-content-length]
24/04/2020 01:09:33 at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
24/04/2020 01:09:33 at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
24/04/2020 01:09:33 at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:73)
24/04/2020 01:09:33 at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:844)
24/04/2020 01:09:33 at org.apache.flink.api.java.DataSet.collect(DataSet.java:413)
24/04/2020 01:09:33 at record.GenericRecordReader$.readFromS3(GenericRecordReader.scala:41)
24/04/2020 01:09:33 at record.RecordProcessor.$anonfun$sendInBatches$1(RecordProcessor.scala:36)
24/04/2020 01:09:33 at scala.util.Try$.apply(Try.scala:209)
24/04/2020 01:09:33 at record.RecordProcessor.sendInBatches(RecordProcessor.scala:36)
24/04/2020 01:09:33 at simulator.LoadSimulator.$anonfun$run$4(LoadSimulator.scala:85)
24/04/2020 01:09:33 at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:122)
24/04/2020 01:09:33 at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:118)
24/04/2020 01:09:33 at scala.collection.immutable.List.foldLeft(List.scala:85)
24/04/2020 01:09:33 at simulator.LoadSimulator.$anonfun$run$2(LoadSimulator.scala:80)
24/04/2020 01:09:33 at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:122)
24/04/2020 01:09:33 at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:118)
24/04/2020 01:09:33 at scala.collection.immutable.List.foldLeft(List.scala:85)
24/04/2020 01:09:33 at simulator.LoadSimulator.run(LoadSimulator.scala:72)
24/04/2020 01:09:33 at simulator.LoadSimulator$.delayedEndpoint$simulator$LoadSimulator$1(LoadSimulator.scala:132)
24/04/2020 01:09:33 at simulator.LoadSimulator$delayedInit$body.apply(LoadSimulator.scala:130)
24/04/2020 01:09:33 at scala.Function0.apply$mcV$sp(Function0.scala:34)
24/04/2020 01:09:33 at scala.Function0.apply$mcV$sp$(Function0.scala:34)
24/04/2020 01:09:33 at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
24/04/2020 01:09:33 at scala.App.$anonfun$main$1$adapted(App.scala:76)
24/04/2020 01:09:33 at scala.collection.immutable.List.foreach(List.scala:388)
24/04/2020 01:09:33 at scala.App.main(App.scala:76)
24/04/2020 01:09:33 at scala.App.main$(App.scala:74)
24/04/2020 01:09:33 at simulator.LoadSimulator$.main(LoadSimulator.scala:130)
24/04/2020 01:09:33 at simulator.LoadSimulator.main(LoadSimulator.scala)
24/04/2020 01:09:33 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
24/04/2020 01:09:33 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
24/04/2020 01:09:33 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
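The stack trace points at the cause: DataSet.collect() (visible at DataSet.java:413) ships the entire job result back to the client through the JobManager's REST API, and the ~242 MB response exceeds the REST client's default 100 MiB payload limit. A minimal fix, assuming you can edit flink-conf.yaml on both the client and the cluster:

```yaml
# flink-conf.yaml -- raise the REST payload limits above the 242880344-byte
# response shown in the log. The default for both options is 104857600
# (100 MiB). The client-side limit is the one tripped here, but the server
# enforces its own limit as well, so raise both.
rest.client.max-content-length: 262144000
rest.server.max-content-length: 262144000
```

That said, collect() is only intended for small results; for hundreds of megabytes it is usually sturdier to write the DataSet to a sink (a file, Kafka, etc.) and read it from there rather than pulling it through the REST API.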
Hi there!
I'm new to Flink, and I want to create a REST endpoint that returns Kafka data via Flink.
I have posted my query on Stack Overflow at the link below.
Any suggestions or inputs are appreciated. Thank you!
Hello Team,
I am looking for an option to pass an additional configuration file from the cluster to the Flink runner. I come from a Spark background, and I am looking for something like Spark's --files option for the flink run command. Basically, I want to pass the /etc/hbase/conf/hbase-site.xml file to my Flink job. Does anyone know how to achieve this? When I put this file in the src/main/resources directory, the job seems to load the correct HBase configuration, but without it, it doesn't find the correct zookeeper.quorum. I also posted the same question on Stack Overflow but haven't heard back:
https://stackoverflow.com/questions/71887621/how-to-pass-extra-files-to-flink-runner
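For what it's worth, on YARN deployments the closest analogue to Spark's --files is the Flink CLI's -yt/--yarnship option, which ships a local file or directory to every container and adds it to the job's classpath, where HBaseConfiguration.create() can then pick up hbase-site.xml. A sketch, with a hypothetical jar and entry point:

```shell
# -yt/--yarnship copies /etc/hbase/conf (including hbase-site.xml) into the
# YARN containers and puts it on the classpath. The main class and jar name
# below are placeholders.
flink run -m yarn-cluster \
  -yt /etc/hbase/conf \
  -c com.example.MyHBaseJob \
  my-hbase-job.jar
```

Depending on the Flink version and deployment mode, exporting HBASE_CONF_DIR=/etc/hbase/conf on the machines running the TaskManagers before starting the cluster may also get the file onto the classpath without shipping it per-job.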