Hi folks,
I am building an application with rust-rdkafka, version pinned to 0.22, and cross-compiling it to a Windows binary from a Linux box. I have installed the full Windows toolchain and am following this example. I am getting the following errors during the build:
error: linking with `/usr/bin/x86_64-w64-mingw32-gcc` failed: exit code: 1
= note: /home/vagrant/code/demo-example/target/x86_64-pc-windows-gnu/release/deps/demo_example-feb82719e9d890fc.demo_example.h10zmb5r-cgu.12.rcgu.o:demo_example.h:(.text+0x2bf): undefined reference to `rd_kafka_opaque'
/home/vagrant/code/demo-example/target/x86_64-pc-windows-gnu/release/deps/demo_example-feb82719e9d890fc.demo_example.h10zmb5r-cgu.12.rcgu.o:demo_example.h:(.text+0x3d8): undefined reference to `rd_kafka_queue_get_consumer'
/home/vagrant/code/demo-example/target/x86_64-pc-windows-gnu/release/deps/demo_example-feb82719e9d890fc.demo_example.h10zmb5r-cgu.12.rcgu.o:demo_example.h:(.text+0x43b): undefined reference to `rd_kafka_conf_set_opaque'
/home/vagrant/code/demo-example/target/x86_64-pc-windows-gnu/release/deps/demo_example-feb82719e9d890fc.demo_example.h10zmb5r-cgu.12.rcgu.o:demo_example.h:(.text+0x454): undefined reference to `rd_kafka_conf_set_log_cb'
/home/vagrant/code/demo-example/target/x86_64-pc-windows-gnu/release/deps/demo_example-feb82719e9d890fc.demo_example.h10zmb5r-cgu.12.rcgu.o:demo_example.h:(.text+0x46d): undefined reference to `rd_kafka_conf_set_stats_cb'
I found nothing on the web; any help is appreciated :)
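In case it helps later readers: undefined `rd_kafka_*` references at link time usually mean the librdkafka C library itself was never linked into the binary. One hedged sketch of a fix (the feature name may differ between rdkafka releases, so check the readme for your pinned version) is to let rdkafka-sys build librdkafka from source and link it statically, which avoids needing a MinGW build of librdkafka on the host:

```toml
# Cargo.toml -- assumption: this rdkafka release exposes a `cmake_build`
# feature that compiles the bundled librdkafka via CMake and links it
# statically into the target binary.
[dependencies]
rdkafka = { version = "0.22", features = ["cmake_build"] }
```

With a static build, the cross toolchain only needs CMake and the MinGW C compiler, not a prebuilt Windows librdkafka.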
metrics.into_par_iter().for_each(|metric: Metric| {
    match serde_json::to_string(&metric) {
        Err(e) => {
            log::error!("{}", e);
            // TODO: IMPORTANT: FIX the deserialization issue where everything comes back as null
        }
        Ok(j) => {
            log::info!("{}", j);
            producer.send(FutureRecord::to("METRICS").payload(&j).key(""), 0);
            pb.inc(1);
        }
    }
});
KafkaError (Message production error: MessageTimedOut (Local: Message timed out)), OwnedMessage {
    payload: Some([214, 29, 23]),
    key: Some([]),
    topic: "auditlog.json",
    timestamp: NotAvailable,
    partition: -1,
    offset: -1001,
    headers: Some(OwnedHeaders { ptr: 0x7f84c4000b90 })
}
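For later readers: MessageTimedOut is a local, client-side error. It means the message sat in the producer's queue past its delivery timeout without being acknowledged by a broker, commonly because the broker was unreachable, or because the producer was never polled/flushed so delivery could not complete before the process exited. A hedged sketch of the relevant librdkafka producer property (the value is illustrative):

```
# librdkafka producer configuration (value illustrative)
message.timeout.ms=300000   # allow up to 5 minutes for delivery before failing locally
```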
max.poll.interval.ms (300000) is effectively limited by session.timeout.ms (6000) with this broker version
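That warning comes from librdkafka when the broker predates KIP-62 (Kafka 0.10.1): on such brokers there is no separate poll-interval mechanism, so the effective max.poll.interval.ms is capped at session.timeout.ms. A hedged workaround, assuming the broker's group.max.session.timeout.ms permits it, is to raise the session timeout to match:

```
# librdkafka consumer configuration (values illustrative; the broker's
# group.max.session.timeout.ms must allow the session timeout chosen here)
session.timeout.ms=300000
max.poll.interval.ms=300000
```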
Hi. I am trying to use the dynamic_linking feature. The readme states that the current librdkafka dependency for it is 1.4.2, but the build appears to look for version 1.5.0:
--- stderr
librdkafka will be linked dynamically
librdkafka 1.5.0 cannot be found on the system
Dynamic linking failed. Exiting.
Does anyone know why this is the case?
Hi, I've opened an issue about sending a FutureRecord without a key, because it's not working (fede1024/rust-rdkafka#370). According to the documentation of the other Kafka clients (Java, Python), omitting the key is possible, since the key only exists for fine-grained partition control. In my case, I'd prefer to let Kafka manage partitioning.
Anyone with the same issue or a workaround? Adding a key is not a solution for me.
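For context on why the key matters, here is a plain-Rust illustration of the two partitioning strategies involved: keyed records are hashed to a partition, while keyless records fall back to round-robin (or sticky) assignment. This is only a sketch — librdkafka's actual default partitioner is consistent_random (crc32-based), not Rust's SipHash, so the concrete mapping differs:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};
use std::sync::atomic::{AtomicUsize, Ordering};

/// Keyed send: hash the key, modulo the partition count.
/// (Illustrative hash only; Kafka clients use murmur2 or crc32.)
fn partition_for_key(key: &[u8], num_partitions: usize) -> usize {
    let mut h = DefaultHasher::new();
    key.hash(&mut h);
    (h.finish() as usize) % num_partitions
}

/// Keyless send: spread records round-robin across partitions.
fn partition_round_robin(counter: &AtomicUsize, num_partitions: usize) -> usize {
    counter.fetch_add(1, Ordering::Relaxed) % num_partitions
}

fn main() {
    let parts = 6;
    // The same key always lands on the same partition...
    assert_eq!(
        partition_for_key(b"user-42", parts),
        partition_for_key(b"user-42", parts)
    );
    // ...while keyless sends cycle through all partitions.
    let ctr = AtomicUsize::new(0);
    let seq: Vec<usize> = (0..6).map(|_| partition_round_robin(&ctr, parts)).collect();
    assert_eq!(seq, vec![0, 1, 2, 3, 4, 5]);
    println!("key partition = {}", partition_for_key(b"user-42", parts));
}
```

This is also why a keyless send is a reasonable thing to want: ordering per key is lost, but load balancing is free.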
Undefined symbols for architecture x86_64:
"_gssrpc_auth_debug_gssapi", referenced from:
_main in client.o
"_gssrpc_auth_gssapi_create_default", referenced from:
_main in client.o
"_gssrpc_clnt_pcreateerror", referenced from:
_main in client.o
"_gssrpc_clnt_perror", referenced from:
_main in client.o
"_gssrpc_clnt_sperror", referenced from:
_main in client.o
"_gssrpc_clnttcp_create", referenced from:
_main in client.o
"_gssrpc_clntudp_create", referenced from:
_main in client.o
"_gssrpc_misc_debug_gssapi", referenced from:
_main in client.o
"_gssrpc_svc_debug_gssapi", referenced from:
_main in client.o
"_gssrpc_xdr_free", referenced from:
_main in client.o
"_gssrpc_xdr_wrapstring", referenced from:
_main in client.o
_rpc_test_echo_1 in rpc_test_clnt.o
"_krb5_gss_dbg_client_expcreds", referenced from:
_main in client.o
ld: symbol(s) not found for architecture x86_64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [client] Error 1
make[2]: *** Waiting for unfinished jobs....
make[2]: *** [server] Error 1
make[1]: *** [all-recurse] Error 1
make: *** [all-recurse] Error 1
I have a question about StreamConsumer and a particular use case I have. I would like to process messages in small batches, rather than one at a time, preferably with manual commit control. I can imagine several ways to go about it, but they all seem a bit complex. Is there an idiomatic way this can be achieved?
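For anyone with the same question, the usual shape of the answer is: accumulate messages into a Vec until either a size cap or a timeout is hit, process the batch, then commit once per batch (with rdkafka, via commit of the last offset in sync mode). A minimal stdlib sketch of just that batching loop, independent of any rdkafka API (the channel stands in for the message stream):

```rust
use std::sync::mpsc::{channel, Receiver, RecvTimeoutError};
use std::thread;
use std::time::Duration;

/// Drain up to `max` items from the receiver, waiting at most `timeout`
/// for each one, and return them as a single batch. An empty batch
/// means nothing arrived within the timeout.
fn next_batch<T>(rx: &Receiver<T>, max: usize, timeout: Duration) -> Vec<T> {
    let mut batch = Vec::with_capacity(max);
    while batch.len() < max {
        match rx.recv_timeout(timeout) {
            Ok(item) => batch.push(item),
            Err(RecvTimeoutError::Timeout) | Err(RecvTimeoutError::Disconnected) => break,
        }
    }
    batch
}

fn main() {
    let (tx, rx) = channel();
    thread::spawn(move || {
        for i in 0..10 {
            tx.send(i).unwrap();
        }
    });
    thread::sleep(Duration::from_millis(50)); // let the sender fill the queue
    let batch = next_batch(&rx, 4, Duration::from_millis(10));
    assert_eq!(batch, vec![0, 1, 2, 3]);
    // process `batch` here, then commit the batch's last offset once
}
```

With a StreamConsumer, the same loop would pull from the message stream instead of a channel; the size/timeout pair bounds both latency and commit frequency.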
I'm using a StreamConsumer with a custom ConsumerContext, where I need to create some future (invoke some async function) and wait for it to finish before returning from the post_rebalance callback. If I understand correctly, that callback is invoked while a poll is being executed from the Runtime context, which means the thread handling it has a Runtime associated with it, but ConsumerContext::post_rebalance is not an async function. So I'm a bit confused about how to deal with this. Any ideas on where to look?
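One common pattern for this situation (bridging a synchronous callback into async work) is to hand the work to a separate thread or runtime and block the callback on a completion channel, since calling block_on on the runtime's own worker thread would deadlock. A hedged stdlib-only sketch of that shape — in a real tokio app the worker thread below would be a second Runtime driving your future:

```rust
use std::sync::mpsc;
use std::thread;

/// Simulates a synchronous callback (like ConsumerContext::post_rebalance)
/// that must wait for work completed elsewhere before returning. The
/// "async" work runs on a dedicated worker thread; the callback blocks on
/// a one-shot completion channel.
fn post_rebalance_blocking(
    partitions: Vec<i32>,
    worker: &mpsc::Sender<(Vec<i32>, mpsc::Sender<usize>)>,
) -> usize {
    let (done_tx, done_rx) = mpsc::channel();
    worker.send((partitions, done_tx)).unwrap();
    done_rx.recv().unwrap() // block until the "future" completes
}

fn main() {
    let (work_tx, work_rx) = mpsc::channel::<(Vec<i32>, mpsc::Sender<usize>)>();
    // Dedicated worker thread standing in for a second tokio Runtime.
    thread::spawn(move || {
        for (parts, done) in work_rx {
            // Pretend to reload per-partition state asynchronously.
            done.send(parts.len()).unwrap();
        }
    });
    let n = post_rebalance_blocking(vec![0, 1, 2], &work_tx);
    assert_eq!(n, 3);
}
```

The trade-off: the consumer's poll is stalled while the callback waits, which is exactly the intended rebalance semantics (don't resume consuming until state is ready), but it does count against the poll interval.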
We use a BaseProducer to send things to Kafka, but we never poll or flush it. Still, messages arrive at Kafka normally. I checked whether BaseProducer implements Drop and calls flush there, but as far as I understand it doesn't, so I'm confused as to why this works.
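The likely explanation: librdkafka runs its own internal threads that perform the actual network sends, so messages reach the broker whether or not you ever poll. What poll does is dispatch delivery-report callbacks back to your thread; skip it and you simply never observe delivery results, and the report queue can grow unbounded. A toy stdlib sketch of that split between "background delivery" and "poll drains reports" (illustrative only, not rdkafka's API):

```rust
use std::sync::mpsc::{channel, Receiver, Sender, TryRecvError};
use std::thread;
use std::time::Duration;

/// Toy producer: `send` enqueues, a background thread "delivers" (as
/// librdkafka's internal threads do), and delivery reports pile up in a
/// queue that only `poll` drains.
struct ToyProducer {
    to_wire: Sender<String>,
    reports: Receiver<String>,
}

impl ToyProducer {
    fn new() -> Self {
        let (to_wire, wire_rx) = channel::<String>();
        let (report_tx, reports) = channel();
        thread::spawn(move || {
            for msg in wire_rx {
                // The network send happens here, with or without poll().
                report_tx.send(format!("delivered: {}", msg)).ok();
            }
        });
        ToyProducer { to_wire, reports }
    }

    fn send(&self, msg: &str) {
        self.to_wire.send(msg.to_string()).unwrap();
    }

    /// Drain pending delivery reports; this is all poll() contributes.
    fn poll(&self) -> Vec<String> {
        let mut out = Vec::new();
        loop {
            match self.reports.try_recv() {
                Ok(r) => out.push(r),
                Err(TryRecvError::Empty) | Err(TryRecvError::Disconnected) => break,
            }
        }
        out
    }
}

fn main() {
    let p = ToyProducer::new();
    p.send("hello"); // delivered by the background thread even if we never poll
    thread::sleep(Duration::from_millis(50));
    let reports = p.poll();
    assert_eq!(reports, vec!["delivered: hello".to_string()]);
}
```

So "it works" is expected, but without polling you'll never learn about failed deliveries, and without a final flush an exiting process can drop whatever is still queued.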