Commit 566a495

brett0000FF and claude committed
[DOCS-11944] Add Node.js examples and non-JSON log correlation for OTel traces
Add a Node.js tab to the trace context injection section showing Winston auto-instrumentation, and add a new sub-section for correlating non-JSON logs with a manual Winston formatter example and regex_parser Collector config.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent 3dc6753 commit 566a495

1 file changed: content/en/tracing/other_telemetry/connect_logs_and_traces/opentelemetry.md (123 additions & 13 deletions)
@@ -39,7 +39,7 @@ To correlate OpenTelemetry traces and logs in Datadog, you must:
 ### 1. Inject trace context into your logs
 
-The following examples for Go and Java use logging bridges. These bridges intercept logs from common logging libraries (such as `zap` and `Logback`), convert them into the OpenTelemetry log data model, and forward them to the OpenTelemetry SDK. This process automatically enriches the logs with the active trace context.
+The following examples use logging bridges or auto-instrumentation. These tools intercept logs from common logging libraries (such as `zap`, `Logback`, and `Winston`), convert them into the OpenTelemetry log data model, and forward them to the OpenTelemetry SDK. This process automatically enriches the logs with the active trace context.
 
 For complete, working applications, see the [Datadog OpenTelemetry Examples repository][2].
@@ -88,6 +88,51 @@ For complete, working example configuration, see the [full Java example in the e
 [200]: https://github.com/DataDog/opentelemetry-examples/blob/main/apps/rest-services/java/calendar/src/main/resources/logback.xml
+{{% /tab %}}
+
+{{% tab "Node.js" %}}
+
+For Node.js, use the `@opentelemetry/instrumentation-winston` package with [Winston][300] to automatically inject trace context into your logs. Install the required packages:
+
+```shell
+npm install winston @opentelemetry/instrumentation-winston
+```
+
+Then, register the Winston instrumentation as part of your OpenTelemetry SDK setup:
+
+```javascript
+const { WinstonInstrumentation } = require('@opentelemetry/instrumentation-winston');
+const { NodeSDK } = require('@opentelemetry/sdk-node');
+
+const sdk = new NodeSDK({
+  instrumentations: [
+    new WinstonInstrumentation(),
+    // ... other instrumentations
+  ],
+});
+sdk.start();
+```
+
+After the instrumentation is registered, any Winston logger automatically includes `trace_id`, `span_id`, and `trace_flags` fields when a log is emitted within an active trace:
+
+```javascript
+const winston = require('winston');
+
+const logger = winston.createLogger({
+  level: 'info',
+  format: winston.format.json(),
+  transports: [new winston.transports.Console()],
+});
+
+// Logs emitted within a traced context automatically include trace_id and span_id
+logger.info('Processing user request');
+```
+
+The same approach works with [Pino][301] using the `@opentelemetry/instrumentation-pino` package.
+
+[300]: https://github.com/winstonjs/winston
+[301]: https://github.com/pinojs/pino
+
 {{% /tab %}}
 {{< /tabs >}}
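The Node.js tab above notes that the same approach works for Pino. A minimal bootstrap sketch of that variant, assuming `@opentelemetry/instrumentation-pino` and `pino` are installed (this is SDK wiring, not verified output):

```javascript
// Sketch: register the Pino instrumentation so Pino loggers created after
// sdk.start() include trace context when logging inside an active span.
// Assumes @opentelemetry/instrumentation-pino and pino are installed.
const { PinoInstrumentation } = require('@opentelemetry/instrumentation-pino');
const { NodeSDK } = require('@opentelemetry/sdk-node');

const sdk = new NodeSDK({
  instrumentations: [new PinoInstrumentation()],
});
sdk.start();

const pino = require('pino');
const logger = pino();
logger.info('Processing user request');
```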

@@ -141,25 +186,27 @@ The OpenTelemetry Collector and the Datadog Agent can both receive OTLP logs.
       exporters: [datadog]
 ```
 
 #### Scrape logs from files
 
 This approach is useful if you have a requirement to keep local log files for compliance or other tooling.
 
-For Datadog to correlate your logs and traces, your JSON log files must contain specific fields formatted correctly:
+For Datadog to correlate your logs and traces, your log files must contain specific fields formatted correctly:
 - `trace_id`: The ID of the trace. It must be a 32-character lowercase hexadecimal string.
 - `span_id`: The ID of the span. It must be a 16-character lowercase hexadecimal string.
 
 The OpenTelemetry SDK typically provides these in a raw format (such as an integer or byte array), which must be formatted into hexadecimal strings without any <code>0x</code> prefix.
 
+##### JSON logs
+
 1. **Configure your Application to Output JSON Logs**: Use a standard logging library to write logs as JSON to a file or `stdout`. The following Python example uses the standard `logging` library.
-2. **Manually Inject Trace Context**: In your application code, retrieve the current span context and add the `trace_id` and `span_id` to your log records. The following Python example shows how to create a custom logging.Filter to do this automatically:
+2. **Manually Inject Trace Context**: In your application code, retrieve the current span context and add the `trace_id` and `span_id` to your log records. The following Python example shows how to create a custom `logging.Filter` to do this automatically:
 
    ```python
    import logging
    import sys
    from opentelemetry import trace
    from pythonjsonlogger import jsonlogger
 
    # 1. Create a filter to inject trace context
    class TraceContextFilter(logging.Filter):
        def filter(self, record):
@@ -169,25 +216,25 @@ The OpenTelemetry SDK typically provides these in a raw format (such as an integ
            record.trace_id = f'{span_context.trace_id:032x}'
            record.span_id = f'{span_context.span_id:016x}'
            return True
 
    # 2. Configure a JSON logger
    logger = logging.getLogger("my-json-logger")
    logger.setLevel(logging.DEBUG)
 
    # 3. Add the filter to the logger
    logger.addFilter(TraceContextFilter())
 
    handler = logging.StreamHandler(sys.stdout)
    formatter = jsonlogger.JsonFormatter(
        '%(asctime)s %(name)s %(levelname)s %(message)s %(trace_id)s %(span_id)s'
    )
    handler.setFormatter(formatter)
    logger.addHandler(handler)
 
    # Logs will now contain the trace_id and span_id
    logger.info("Processing user request with trace context.")
    ```
 
3. **Configure the Collector to Scrape Log Files**: In your Collector's `config.yaml`, enable the `filelog` receiver. Configure it to find your log files and parse them as JSON.
   ```yaml
   receivers:
@@ -197,14 +244,77 @@ The OpenTelemetry SDK typically provides these in a raw format (such as an integ
         - type: json_parser
           # The timestamp and severity fields should match your JSON output
           timestamp:
             parse_from: attributes.asctime
             layout: '%Y-%m-%d %H:%M:%S,%f'
           severity:
             parse_from: attributes.levelname
   # ... your logs pipeline ...
   ```
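The hex-formatting requirement above (32- and 16-character lowercase strings with no `0x` prefix) can be sketched in plain Node.js; `toHex` and the raw ID values below are illustrative, not part of any SDK API:

```javascript
// Sketch: format raw integer trace/span IDs as the zero-padded, lowercase
// hexadecimal strings (no "0x" prefix) that Datadog expects.
// BigInt is used because a 128-bit trace ID exceeds Number's safe range.
function toHex(id, width) {
  return id.toString(16).padStart(width, '0');
}

// Hypothetical raw IDs, borrowed from the W3C trace-context example values
const rawTraceId = 0x4bf92f3577b34da6a3ce929d0e0e4736n;
const rawSpanId = 0x00f067aa0ba902b7n;

console.log(toHex(rawTraceId, 32)); // 4bf92f3577b34da6a3ce929d0e0e4736 (32 chars)
console.log(toHex(rawSpanId, 16));  // 00f067aa0ba902b7 (16 chars)
```

The Python example achieves the same thing with the `:032x` and `:016x` format specifiers.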
 
-This manual approach gives you full control over the log format, ensuring it is clean and easily parsable by the Collector or Datadog Agent.
+##### Non-JSON logs
+
+If your application outputs logs in a plain text or key-value format instead of JSON, you can still correlate them with traces. You need to manually inject the trace context into the log string and configure the Collector to extract it using a `regex_parser`.
+
+The following Node.js example uses Winston with a custom formatter to inject `trace_id` and `span_id` into a plain text log line:
+
+```javascript
+const { trace, context } = require('@opentelemetry/api');
+const winston = require('winston');
+
+// Custom format that injects trace context into plain text logs
+const traceFormat = winston.format((info) => {
+  const span = trace.getSpan(context.active());
+  if (span) {
+    const spanContext = span.spanContext();
+    info.trace_id = spanContext.traceId;
+    info.span_id = spanContext.spanId;
+  }
+  return info;
+});
+
+const logger = winston.createLogger({
+  level: 'info',
+  format: winston.format.combine(
+    traceFormat(),
+    winston.format.timestamp(),
+    winston.format.printf(({ timestamp, level, message, trace_id, span_id }) => {
+      return `${timestamp} ${level} trace_id=${trace_id || '0'} span_id=${span_id || '0'} ${message}`;
+    })
+  ),
+  transports: [
+    new winston.transports.File({ filename: '/var/log/my-app/app.log' })
+  ],
+});
+
+// Output: 2025-01-15T12:00:00.000Z info trace_id=4bf92f3577b34da6a3ce929d0e0e4736 span_id=00f067aa0ba902b7 Processing request
+logger.info('Processing request');
+```
+
+Then, configure the Collector's `filelog` receiver with a `regex_parser` to extract the trace context from the plain text log lines:
+
+```yaml
+receivers:
+  filelog:
+    include: [ /var/log/my-app/*.log ]
+    operators:
+      - type: regex_parser
+        regex: '^(?P<timestamp>\S+) (?P<severity>\S+) trace_id=(?P<trace_id>[a-f0-9]+) span_id=(?P<span_id>[a-f0-9]+) (?P<body>.*)$'
+        timestamp:
+          parse_from: attributes.timestamp
+          layout: '%Y-%m-%dT%H:%M:%S.%fZ'
+        severity:
+          parse_from: attributes.severity
+        trace:
+          trace_id:
+            parse_from: attributes.trace_id
+          span_id:
+            parse_from: attributes.span_id
+# ... your logs pipeline ...
+```
+
+This approach works with any logging library or language. The key requirements are:
+- Include `trace_id` and `span_id` as parseable fields in your log output.
+- Configure the Collector's `regex_parser` to match your log format and extract the trace context fields.
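Before deploying the Collector config, the `regex_parser` pattern can be sanity-checked locally against a sample line. A plain Node.js sketch (note that JavaScript named groups use `(?<name>...)` where the Collector's Go-flavored pattern uses `(?P<name>...)`; the sample line mirrors the example Winston output above):

```javascript
// Sketch: verify that the regex captures the trace context fields from a
// sample plain-text log line before wiring it into the filelog receiver.
const pattern = /^(?<timestamp>\S+) (?<severity>\S+) trace_id=(?<trace_id>[a-f0-9]+) span_id=(?<span_id>[a-f0-9]+) (?<body>.*)$/;

const line =
  '2025-01-15T12:00:00.000Z info trace_id=4bf92f3577b34da6a3ce929d0e0e4736 span_id=00f067aa0ba902b7 Processing request';

const groups = line.match(pattern).groups;
console.log(groups.trace_id); // 4bf92f3577b34da6a3ce929d0e0e4736
console.log(groups.span_id);  // 00f067aa0ba902b7
console.log(groups.body);     // Processing request
```

If `match` returns `null` for a real log line (for example, because the severity casing or timestamp shape differs), the Collector's operator would likewise fail to parse it.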

 #### Collect logs using the Datadog Agent
 