Hi - I’m implementing a logs receiver, and wish to call
consumer.ConsumeLogs(ctx context.Context, ld pdata.Logs) error.
I have a struct representing log records, and wish to convert this to
pdata.Logs. However, I’m not finding a way to do this that doesn’t require direct interaction with internal packages.
Am I right to think this should be possible? Is there an expected approach to this that someone could point me to?
Extending the kafkaexporter to output logs should not be very difficult. The primary question will be in what format we write to Kafka.
The kafkaexporter currently supports a couple of serialization formats for traces (OTLP, Jaeger). We will need to decide what to support for logs.
I looked into a Fluent Bit -> OpenTelemetry Collector -> any output of choice pipeline and found that it's actually plausible with the recent additions to the collector and a (rather small) modification to the Fluent Bit Helm chart. I have prepared a doc that summarises it and gives the rationale behind some newly added issues (mostly related to the K8s Processor extensions): https://docs.google.com/document/d/1QlFbXz0eQUaKXK1WrnEs3VqRkcPD2RAdXJdJMjnOSd8/edit?usp=sharing Would love to hear whether those ideas sound reasonable, and if so, I'll be happy to work on it. I could also demo it during the upcoming SIG meeting and discuss it there.
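For reference, the collector side of such a pipeline could be wired up roughly like this (a sketch, assuming the fluentforward receiver and the logging exporter; component names and defaults may differ by version):

```yaml
receivers:
  fluentforward:
    # Fluent Bit's forward output would point at this endpoint.
    endpoint: 0.0.0.0:8006

exporters:
  logging:

service:
  pipelines:
    logs:
      receivers: [fluentforward]
      exporters: [logging]
```

On the Fluent Bit side, the Helm chart change would amount to configuring a forward output targeting the collector's endpoint.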
a @context field, and being able to mix and match schemas, e.g.:
Elastic Common Schema (https://www.elastic.co/guide/en/ecs),
Splunk Schema (https://docs.splunk.com/Documentation/CIM/4.17.0/User/Overview),
Graylog Schema (https://schema.graylog.org),
Security logging standard: still an open question (https://www.scip.ch/en/?labs.20180315)
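To illustrate the @context idea, a hypothetical JSON-LD-style record could declare which schema its field names come from (purely illustrative; the field names below follow ECS, and nothing like this is specified yet):

```json
{
  "@context": "https://www.elastic.co/guide/en/ecs",
  "log.level": "error",
  "message": "connection refused",
  "host.name": "web-01"
}
```

A consumer that recognises the @context value could then map the record's fields onto its own model, and records from different schemas could coexist in one stream.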
we find that logging practice is adopted highly inconsistently
among different developers, both across projects and even within one project
Conclusion: Both .. have forced us to question whether analyzing the
logs produced by current logging practice is a reliable means of
understanding the runtime behavior of software systems.