Commit bfe90b2: feat(dsm): Add Live Messages (v2) documentation for Data Streams Monitoring

---
title: Live Messages (v2)
---

Live Messages allows you to view messages from specific offsets within Kafka. Inspecting these live messages can be crucial when troubleshooting issues related to specific services or data streams.

(Insert Screenshot here)

## Setup

To configure Live Messages, identify an appropriate Agent to connect to your Kafka cluster and enable the Kafka consumer integration.

> **Important:** Before proceeding, ensure you have the `Data Streams Monitoring Capture Messages` permission enabled. See [Required Permission](#required-permission-data-streams-monitoring-capture-messages) for details.

### Selecting an Agent

If you self-host Kafka, set up Live Messages on your Kafka broker's Agent. Otherwise, choose any Agent that your producer or consumer services communicate with and that has access to your Kafka cluster.

### Step-by-Step Guide

Perform all steps below on the same Agent:

#### 1. Verify Agent Version

Ensure your Agent is running version `7.69.0` or newer.
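
A quick way to sanity-check this from the command line is to version-sort the installed version against the minimum. This is a minimal sketch; in practice you would obtain `current` from the Agent itself (for example, the output of `datadog-agent version`), and the parsing of that output is left as an exercise:

```shell
# Minimal sketch: compare an Agent version string against the 7.69.0 minimum.
required="7.69.0"
current="7.69.0"   # placeholder; substitute your Agent's reported version
# `sort -V` sorts version strings; if `required` sorts first, `current` is new enough.
if [ "$(printf '%s\n%s\n' "$required" "$current" | sort -V | head -n1)" = "$required" ]; then
  echo "Agent version OK"
else
  echo "Agent too old: upgrade to $required or newer"
fi
```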

#### 2. Enable [Remote Configuration][2]

Confirm Remote Configuration is enabled for your Agent; it is typically active by default. You can verify this on the [Fleet Automation][3] page.
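
If you need to enable it manually, Remote Configuration is controlled in the Agent's main `datadog.yaml` file. A minimal sketch (check the Remote Configuration documentation for the authoritative key name and location):

```yaml
# datadog.yaml (Agent main configuration)
remote_configuration:
  enabled: true
```

Restart the Agent after changing this setting so it takes effect.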

#### 3. Configure Kafka Consumer Integration

Create or update the Kafka consumer configuration file at `[agent config directory]/kafka_consumer.d/conf.yaml` with the following example:

```yaml
init_config:

instances:
  - kafka_connect_str: kafka:29092  # replace with your own Kafka connect string
    monitor_unlisted_consumer_groups: true
    tags:
      - env:staging
```

Ensure you correctly tag your clusters to facilitate filtering and identification.
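
For example, you might add a cluster-identifying tag alongside the environment tag in each instance. The `kafka_cluster` tag key below is illustrative, not a required name:

```yaml
    tags:
      - env:staging
      - kafka_cluster:payments   # illustrative cluster-identifying tag
```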

For more comprehensive instructions and advanced configuration options, see the [Kafka consumer integration documentation][4].

#### 4. Verify Setup

Check your Agent logs for entries containing `kafka_consumer` to confirm the integration is active.

Also verify that Datadog is receiving data by examining the `kafka.broker_offset` metric in the Metrics Explorer, filtered by your configured topics.

### Required Permission: "Data Streams Monitoring Capture Messages"

You must have the `Data Streams Monitoring Capture Messages` permission enabled. You can verify or enable this permission by modifying an existing role or creating a new one on the [Roles][1] page.

### Usage

(TBD)

### Additional details

#### Message storage and access

(TBD)

#### Sensitive information redaction

(TBD)

#### SSL encryption on Kafka

(TBD)

[1]: https://app.datadoghq.com/organization-settings/roles
[2]: /agent/remote_config
[3]: https://app.datadoghq.com/fleet
[4]: /integrations/kafka-consumer/?tab=host#setup
