---
title: Live Messages (v2)
---

Live Messages allows you to view messages from specific offsets within Kafka. Inspecting these live messages can be crucial when troubleshooting issues related to specific services or data streams.

(Insert Screenshot here)

## Setup

To configure Live Messages, identify an appropriate agent that can connect to your Kafka cluster, then enable the Kafka consumer integration on it.

### Selecting an Agent

If you self-host Kafka, set up Live Messages on your Kafka broker's agent. Otherwise, choose any agent that your producer or consumer services communicate with and that has access to your Kafka cluster.

### Step-by-Step Guide

Perform all steps below on the same agent:

#### 1. Verify Agent Version

Ensure your agent is running version `7.69.0` or newer.

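If you run the agent in containers, one way to guarantee a compatible version is to pin the image tag in your deployment. The snippet below is a minimal sketch using Docker Compose; the image name and environment variable follow the public Datadog Agent image, and the service layout is only an example:

```yaml
# Minimal sketch: pin an agent version that supports Live Messages.
# Adjust the image tag, secrets handling, and volumes to your own deployment.
services:
  datadog-agent:
    image: gcr.io/datadoghq/agent:7.69.0   # 7.69.0 or newer
    environment:
      - DD_API_KEY=${DD_API_KEY}           # supplied via your environment or secrets manager
```
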
#### 2. Enable [Remote Configuration][2]

Confirm Remote Configuration is enabled for your agent; it is typically active by default. You can verify this on the [Fleet Automation][3] page.

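If Remote Configuration was disabled manually, you can usually re-enable it in the agent's main configuration file (`datadog.yaml`) and then restart the agent. A minimal sketch, assuming a host-based install:

```yaml
# datadog.yaml (main agent configuration)
# Re-enable Remote Configuration if it was previously turned off.
remote_configuration:
  enabled: true
```
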
#### 3. Configure Kafka Consumer Integration

Create or update the Kafka consumer configuration file at `[agent config directory]/kafka_consumer.d/conf.yaml` with the following example:

```yaml
init_config:

instances:
  - kafka_connect_str: kafka:29092  # Replace with your own Kafka connection string
    monitor_unlisted_consumer_groups: true
    tags:
      - env:staging
```

Ensure you correctly tag your clusters to facilitate filtering and identification.
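
For example, you could extend the instance above with a tag that names the cluster. The `kafka_cluster` tag key below is only illustrative; use whatever key your team already filters on:

```yaml
instances:
  - kafka_connect_str: kafka:29092  # Replace with your own Kafka connection string
    monitor_unlisted_consumer_groups: true
    tags:
      - env:staging
      - kafka_cluster:orders-staging  # illustrative cluster tag for filtering in Datadog
```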

For more comprehensive instructions and advanced configuration options, see the [Kafka consumer integration documentation][4].

#### 4. Verify Setup

Check your agent logs for entries containing `kafka_consumer` to confirm the integration is active.

Also verify that Datadog is receiving data by examining the `kafka.broker_offset` metric in the Metrics Explorer, filtering by your configured topics.

### Required Permission: "Data Streams Monitoring Capture Messages"

You must have the `Data Streams Monitoring Capture Messages` permission enabled. You can verify or enable this permission by modifying an existing role or creating a new one on the [Roles][1] page.

If you don't have permission to modify roles, ask an administrator to grant it.

### Usage

(TBD)

### Additional details

#### Message storage and access

(TBD)

#### Sensitive information redaction

(TBD)

#### SSL encryption on Kafka

(TBD)

[1]: https://app.datadoghq.com/organization-settings/roles
[2]: /agent/remote_config
[3]: https://app.datadoghq.com/fleet
[4]: /integrations/kafka-consumer/?tab=host#setup