
opentelemetry sink and source not working properly together #22607

Open
itstaby opened this issue Mar 7, 2025 · 3 comments
Labels
sink: opentelemetry (anything `opentelemetry` sink related)
source: opentelemetry (anything `opentelemetry` source related)
type: bug (a code related bug)

Comments


itstaby commented Mar 7, 2025


Problem

I'm trying to use the opentelemetry sink in one Vector instance to send data to another Vector instance's opentelemetry source. The receiving source returns a 400 Bad Request with no additional information about the problem.

I am creating the data object in a VRL remap. For the purpose of the experiment, I used a generic object (copied in the Example Data section below).

I've seen from the source code that the opentelemetry source only accepts content-type: application/x-protobuf, so I'm using protobuf encoding on the sink.

For the protobuf descriptor file, I have attempted to generate it from the logs.proto file in the original opentelemetry repo: https://github.com/open-telemetry/opentelemetry-proto/blob/main/opentelemetry/proto/logs/v1/logs.proto
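
Roughly the generation step I used, from a local checkout of the opentelemetry-proto repo (the output filename is just what my config points at; --include_imports pulls the imported common/resource protos into the descriptor set):

    cd opentelemetry-proto
    protoc --include_imports \
        --descriptor_set_out=otlp_descriptor.desc \
        opentelemetry/proto/logs/v1/logs.proto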

Used the following settings for the encoding:

encoding:
  codec: protobuf
  protobuf:
    desc_file: "/etc/vector-desc/otlp_descriptor.desc"
    message_type: "opentelemetry.proto.logs.v1.LogsData"

After looking at the source code for the Vector source, I saw that it uses logs_service.proto (ExportLogsServiceRequest) for decoding, so I've also attempted to generate the descriptor with that as the root (generation sketched below the snippet) and use it in the encoding settings, with the same result:

encoding:
  codec: protobuf
  protobuf:
    desc_file: "/etc/vector-desc/otlp_descriptor.desc"
    message_type: "opentelemetry.proto.logs.v1.ExportLogsServiceRequest"

There are no helpful logs on the receiving source explaining what the issue is.

Configuration

Sink on the sending vector instance:

    sinks:
      vector_collector:
        type: opentelemetry
        inputs:
        - otlp_format_transform
        protocol:
          type: http
          uri: "http://collector.logging.svc.cluster.local:4318/v1/logs"
          encoding:
            codec: protobuf
            protobuf:
              desc_file: "/etc/vector-desc/otlp_descriptor.desc"
              message_type: "opentelemetry.proto.collector.logs.v1.ExportLogsServiceRequest"
          headers:
            content-type: application/x-protobuf



Source config on collector:

    sources:
      opentelemetry:
        grpc:
          address: 0.0.0.0:4317
        http:
          address: 0.0.0.0:4318
        type: opentelemetry

Version

0.43.1

Debug Output


Example Data

Here's the input to the sink:

          # Create the OTLP structure
          . = {
            "resource_logs": [
              {
                "resource": {
                  "attributes": []
                },
                "scope_logs": [{
                  "scope": {
                    "name": "vector",
                    "version": "0.1.0"
                  },
                  "log_records": [{
                    "time_unix_nano": 0,
                    "severity_number": 9,
                    "severity_text": info,
                    "body": {
                          "stringValue": "foo-bar"
                    },
                    "attributes": []
                  }]
                }]
              }
            ]
          }

Additional Context

Both Vector instances are running in Kubernetes in the same cluster. Logs confirm that the requests from the sink are reaching the source.

References

No response


fbs commented Mar 9, 2025

They do work, but it's a bit finicky to get working, as you basically have to write protobuf as JSON, which is a big pain.

                    "body": {
                          "stringValue": "foo-bar"
                    },

This must be string_value: https://github.com/open-telemetry/opentelemetry-proto/blob/d7770822d70c7bd47a6891fc9faacc66fc4af3d3/opentelemetry/proto/common/v1/common.proto#L32
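
Per that common.proto, all the AnyValue variants use the snake_case proto field names, e.g. (illustrative values):

    "body": { "string_value": "foo-bar" }
    "body": { "int_value": 42 }
    "body": { "bool_value": true }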

You also have to make sure both source and sink use the same framing (e.g. bytes).
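
On the sink side that's the framing block under protocol, same as in the full config further down:

    protocol:
      type: http
      framing:
        method: bytes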

I want to improve the sink a bit so the JSON mangling is not required anymore.


itstaby commented Mar 9, 2025

@fbs I had noticed this after posting and changed it to string_value as well, but still had no luck.
I intercepted the request going from the sink to the source and decoded it using the .proto files in the sink source code, and was able to decode it successfully, but the source still fails to decode it.
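
The decode step was along these lines, run from a checkout of opentelemetry-proto (captured_body.bin is a stand-in for the intercepted payload):

    protoc --decode=opentelemetry.proto.collector.logs.v1.ExportLogsServiceRequest \
        opentelemetry/proto/collector/logs/v1/logs_service.proto < captured_body.bin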

I have not added any framing params, so it's whatever the default is on both sides.


fbs commented Mar 9, 2025

Can you give this a go?

sources:
  generate_syslog:
    type: "demo_logs"
    format: "syslog"
    count: 5
    interval: 1

  otel:
    type: opentelemetry
    http:
      address: 127.0.0.1:4318
    grpc:
      address: 127.0.0.1:4317

transforms:
  mangle:
    inputs: ["generate_syslog"]
    type: "remap"
    source: |
      . = {}
      .resource_logs = [{
        "resource": {
          "attributes": [{
            "key": "bla",
            "value": { "string_value": "blalaallala" }
          }]
        },
        "scope_logs": {
          "log_records": [{
            "time_unix_nano": to_unix_timestamp(timestamp(t'2025-01-01T00:11:22Z'), unit: "nanoseconds"),
            "observed_time_unix_nano": to_unix_timestamp(timestamp(t'2025-02-02T22:11:22Z'), unit: "nanoseconds"),
            "body": { "string_value": "this is a test" },
            "severity_text": "ERROR",
            "attributes": [{
              "key": "xxx",
              "value": { "string_value": "blalaallala" }
            }, {
              "key": "int",
              "value": { "int_value": 10 },
            }
            ]

          }]
        }
      }]

sinks:
  emit_otlp:
    inputs: ["mangle"]
    type: opentelemetry
    protocol:
      type: http
      uri: http://localhost:4318/v1/logs
      method: post
      framing:
        method: bytes
      encoding:
        codec: protobuf
        protobuf:
          desc_file: otel.desc
          message_type: opentelemetry.proto.logs.v1.LogsData
      headers:
        content-type: application/x-protobuf

  emit_console:
    type: console
    encoding:
      codec: json
      json:
        pretty: true
    inputs:
    - otel.logs
