Routing and Monitoring Telemetry Data

Overview

Telemetry is vital but hard to manage. Expanding cloud environments, ephemeral infrastructure, and tool sprawl flood teams with logs, metrics, and traces that are expensive to store and difficult to use. OpenTelemetry, now the industry standard for telemetry collection, is at the core of Onum.

You can use the OpenTelemetry Listener in Onum to centralize all your telemetry data and then send it to various monitoring tools using the OpenTelemetry Data sink.

Use case

In this use case, we are working with an e-commerce shop. Each of the product services in the shop generates specific telemetry data sets that we want to send to different monitoring services:

  • Metrics - we are sending these to Prometheus.

  • Traces - we are sending these to Jaeger.

  • Logs - we are sending these to Elastic.

We want to centralize and analyze all this data, and Onum enables us to do so using the OpenTelemetry Listener and Data sink. We will define a Pipeline that forwards all our data to the required service.

Configure OpenTelemetry

First of all, you need to configure your OpenTelemetry Collector or agent to start sending data to Onum.

To do it, add the following to the YAML configuration file of your OpenTelemetry Collector. In the exporters section, define whether you want to use gRPC or HTTP and add the distributor endpoint. Then, in the service section, reference the matching exporter (gRPC, HTTP, or both) in the exporters parameter of each pipeline. Note that the pipelines also reference an otlp receiver, which must be defined in the receivers section.

receivers:
  # OTLP receiver referenced by the service pipelines below
  otlp:
    protocols:
      grpc:
      http:

exporters:
  # OTLP over HTTP to the Onum distributor
  otlphttp/onum:
    endpoint: http://192.168.1.75:8080
    tls:
      insecure: true
  # OTLP over gRPC to the Onum distributor
  otlp/onum:
    endpoint: 192.168.1.75:8080
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      exporters: [otlphttp/onum, otlp/onum]
    metrics:
      receivers: [otlp]
      exporters: [otlphttp/onum, otlp/onum]
    logs:
      receivers: [otlp]
      exporters: [otlphttp/onum, otlp/onum]
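
You can then restart the collector with this file (for example, otelcol --config otel-config.yaml; the binary name and file path depend on your distribution) so it begins forwarding all three signals to Onum over both protocols.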

Once the collector is configured, you need to set up the OpenTelemetry Listener in Onum using this data.

Create your OpenTelemetry Listener

First, you must configure the OpenTelemetry Listener you'll use in your Pipeline. To do it:

1. Go to Listeners > New listener. Select OpenTelemetry and click Configuration.

2. Give it a name and enable both HTTP and gRPC configurations. Then, enter the required sending ports in the gRPC port and HTTP port parameters.

3. Optionally, enter any additional information. If you don't need any, set everything to false and click Create labels > Create listener (define any required labels if needed).
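
As a reference, the resulting listener configuration could look like the following sketch. The name is a hypothetical example, and the ports must match the endpoints you defined in the collector's exporters section (8080 in the example above):

name: otel-listener   # assumption: any descriptive name works
grpcPort: 8080        # must match the otlp/onum exporter endpoint
httpPort: 8080        # must match the otlphttp/onum exporter endpoint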

Create the required Data sinks

Now, you must create a Data sink for each type of data you want to analyze:

For your metrics, you need an OpenTelemetry Data sink that sends them to your Prometheus instance.

To do it, go to Data sinks > New data sink. Select OpenTelemetry and click Configuration. Give it a name (for example PrometheusSink) and configure the following:

  • Set Protocol to HTTP.

  • Enter the required endpoint/port in the Endpoint and Port parameters.

  • Set Send Metrics to true.

  • Enter the required path in the HTTP Metrics Path parameter.

  • Optionally, enter any additional information. If you don't need any, set everything to false.

Click Finish when you're done.
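
As an illustration, the metrics sink settings could look as follows. The values are assumptions: the host is hypothetical, 9090 is the Prometheus default port, and the path shown is the OTLP ingestion endpoint Prometheus exposes when its OTLP receiver is enabled (for example, via the otlp-write-receiver feature flag):

protocol: HTTP
endpoint: 192.168.1.80                     # assumption: host running Prometheus
port: 9090                                 # Prometheus default port
sendMetrics: true                          # forward metrics through this sink
httpMetricsPath: /api/v1/otlp/v1/metrics   # Prometheus OTLP receiver path

The traces sink (JaegerSink in this use case) follows the same pattern with Send Traces set to true: Jaeger accepts OTLP natively, typically on port 4317 for gRPC or 4318 for HTTP, with /v1/traces as the standard OTLP/HTTP traces path.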

Define the Pipeline in Onum

Now we can start creating our Pipeline in Onum:

1. First, go to Pipelines and click New Pipeline. Drag the OpenTelemetry Listener you created before into the Pipeline canvas.

2. Add a Conditional Action to extract each of the three data sets we want to get from our source data. To do it, add three different ports (see the payload sketch after this step):

  • A metrics port to filter data that includes the resourceMetrics literal.

  • A traces port to filter data that includes the resourceSpans literal.

  • A logs port to filter data that includes the resourceLogs literal.

Click Save and link your OpenTelemetry Listener to the Conditional Action.
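
These literals are reliable discriminators because each OTLP/JSON payload carries exactly one of them as its top-level key. For instance, a minimal metrics payload looks like this (illustrative only):

{
  "resourceMetrics": [
    {
      "resource": { "attributes": [] },
      "scopeMetrics": []
    }
  ]
}

Trace and log payloads carry resourceSpans and resourceLogs in the same position.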

3. Now, drag the Data sinks you created previously into the canvas to send each data set to its specific monitoring service, and link them to the Conditional Action as follows:

  • Link the Metrics port of the Conditional Action to the PrometheusSink.

  • Link the Traces port of the Conditional Action to the JaegerSink.

4. For the logs, we need a couple of additional steps. First, link the Logs port of the Conditional Action to a Message Builder Action and configure it to extract only the raw field. Click Save.

5. Now, link the Message Builder Action to the ElasticSink. Double-click the ElasticSink and enter the following:

  • Choose POST as the HTTP method.

  • Enter the required URL to send the data.

  • In Message, choose the field generated by the Message Builder.

  • Choose application/json as Content type.

Enter any other data if required and click Save.
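
As a hypothetical example, assuming Elasticsearch runs on 192.168.1.90 and logs are indexed into an otel-logs index through the standard _doc document API, the ElasticSink settings could look like this:

httpMethod: POST
url: http://192.168.1.90:9200/otel-logs/_doc   # assumption: host and index name
message: raw                                   # the field extracted by the Message Builder
contentType: application/json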

6. Finally, click Publish, and your data sets will be sent to each specific service so you can start analyzing them.
