Send data to Prometheus
Overview
Onum supports integration with Prometheus. Use the Open Telemetry Sink to send metrics to your Prometheus endpoint.
Log in to your Onum tenant and click Data Sinks > New Sink.
Double-click the Open Telemetry Sink.
Enter a Name for the new Sink. Optionally, add a Description and some Tags to identify the Sink.
Choose the HTTP Protocol. The Sink uses it to send metrics to your Prometheus endpoint over structured, high-performance OTLP/HTTP requests.
Endpoint - enter your Prometheus URL; the default is usually http://<your-prometheus-host>:9090
To see which endpoints Prometheus is currently scraping, go to Status > Targets in the Prometheus UI:
http://<host>:9090/targets
You'll see a list of the endpoints Prometheus currently scrapes, such as:
http://node-exporter:9100/metrics
http://my-app:8080/metrics
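You can also check the scrape targets programmatically through the Prometheus HTTP API at /api/v1/targets. The snippet below is a minimal sketch in Python; the host name is a placeholder you should replace with your own instance.
import json
import urllib.request

# Placeholder host; replace with your own Prometheus instance.
TARGETS_URL = "http://your-prometheus-host:9090/api/v1/targets"

with urllib.request.urlopen(TARGETS_URL) as resp:
    data = json.load(resp)

# Print the scrape URL of every target Prometheus is actively scraping.
for target in data["data"]["activeTargets"]:
    print(target["scrapeUrl"])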
Port - enter the port included in your endpoint.
Send Metrics - set to true. Your metrics must be structured in the OpenTelemetry format. Check the example below, which covers each metric data type:
Metrics
{
"resourceMetrics": [
{
"resource": {
"attributes": [
{
"key": "service.name",
"value": {
"stringValue": "my.service"
}
}
]
},
"scopeMetrics": [
{
"scope": {
"name": "my.library",
"version": "1.0.0",
"attributes": [
{
"key": "my.scope.attribute",
"value": {
"stringValue": "some scope attribute"
}
}
]
},
"metrics": [
{
"name": "my.counter",
"unit": "1",
"description": "I am a Counter",
"sum": {
"aggregationTemporality": 1,
"isMonotonic": true,
"dataPoints": [
{
"asDouble": 5,
"startTimeUnixNano": "1544712660300000000",
"timeUnixNano": "1544712660300000000",
"attributes": [
{
"key": "my.counter.attr",
"value": {
"stringValue": "some value"
}
}
]
}
]
}
},
{
"name": "my.gauge",
"unit": "1",
"description": "I am a Gauge",
"gauge": {
"dataPoints": [
{
"asDouble": 10,
"timeUnixNano": "1544712660300000000",
"attributes": [
{
"key": "my.gauge.attr",
"value": {
"stringValue": "some value"
}
}
]
}
]
}
},
{
"name": "my.histogram",
"unit": "1",
"description": "I am a Histogram",
"histogram": {
"aggregationTemporality": 1,
"dataPoints": [
{
"startTimeUnixNano": "1544712660300000000",
"timeUnixNano": "1544712660300000000",
"count": 2,
"sum": 2,
"bucketCounts": [1,1],
"explicitBounds": [1],
"min": 0,
"max": 2,
"attributes": [
{
"key": "my.histogram.attr",
"value": {
"stringValue": "some value"
}
}
]
}
]
}
},
{
"name": "my.exponential.histogram",
"unit": "1",
"description": "I am an Exponential Histogram",
"exponentialHistogram": {
"aggregationTemporality": 1,
"dataPoints": [
{
"startTimeUnixNano": "1544712660300000000",
"timeUnixNano": "1544712660300000000",
"count": 3,
"sum": 10,
"scale": 0,
"zeroCount": 1,
"positive": {
"offset": 1,
"bucketCounts": [0,2]
},
"min": 0,
"max": 5,
"zeroThreshold": 0,
"attributes": [
{
"key": "my.exponential.histogram.attr",
"value": {
"stringValue": "some value"
}
}
]
}
]
}
}
]
}
]
}
  ]
}
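For reference, the sketch below shows one way to deliver a payload like the one above to a Prometheus instance that accepts OTLP metrics over HTTP. It assumes OTLP ingestion is enabled on your Prometheus instance (an optional feature in recent releases) and that the endpoint path shown matches your version; the host name is a placeholder. It sends a single gauge data point to keep the example short.
import json
import time
import urllib.request

# Placeholder endpoint; Prometheus only accepts OTLP metrics over HTTP
# when OTLP ingestion is enabled on your instance.
OTLP_URL = "http://your-prometheus-host:9090/api/v1/otlp/v1/metrics"

# OTLP timestamps are nanoseconds since the Unix epoch, sent as strings.
now = str(time.time_ns())

payload = {
    "resourceMetrics": [{
        "resource": {
            "attributes": [
                {"key": "service.name", "value": {"stringValue": "my.service"}}
            ]
        },
        "scopeMetrics": [{
            "scope": {"name": "my.library", "version": "1.0.0"},
            "metrics": [{
                "name": "my.gauge",
                "unit": "1",
                "description": "I am a Gauge",
                "gauge": {"dataPoints": [{"asDouble": 10, "timeUnixNano": now}]},
            }],
        }],
    }]
}

req = urllib.request.Request(
    OTLP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # a 2xx status means Prometheus accepted the metrics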
Metrics Path - enter the URL path that metrics are sent to on your endpoint.
Optionally, enter any additional settings. If you don't need them, set everything to false.
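Once the Sink is sending data, you can confirm that metrics reached Prometheus with its query API. Below is a minimal sketch, assuming the gauge from the example above was ingested; note that Prometheus normalizes OpenTelemetry metric names (dots become underscores, and a unit suffix may be appended), so adjust the query to the name you actually see in your instance.
import json
import urllib.parse
import urllib.request

# Placeholder host and metric name; the normalized name may differ on your setup.
QUERY_URL = "http://your-prometheus-host:9090/api/v1/query?" + urllib.parse.urlencode(
    {"query": "my_gauge"}
)

with urllib.request.urlopen(QUERY_URL) as resp:
    result = json.load(resp)

# Each result entry carries the metric labels and its latest sample value.
for series in result["data"]["result"]:
    print(series["metric"], series["value"])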
Pipeline configuration
To use this Data sink in a Pipeline, you must configure the following output parameters. To do so, click the Data sink on the canvas and select Configuration.
Output configuration
Event field to be sent*
Choose the field that contains the data to be sent.
Click Save to save your configuration.