docs/app-management.md
131 additions & 3 deletions overall; this file: 4 additions & 0 deletions
@@ -132,6 +132,10 @@ from quixstreams import App
App.run()
```

## Dispose events

When a topic is disposed, it is possible to run additional code by linking the `on_disposed` event to an appropriate handler. The `on_disposed` event occurs at the end of the topic disposal process.
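As an illustration, a handler could look like the following sketch. The handler itself is plain Python; the commented attachment line assumes the legacy Quix Streams API, so treat it as an assumption rather than confirmed wiring:

``` python
# Hypothetical sketch of an on_disposed handler. It runs at the end of the
# topic disposal process, e.g. to flush logs or release application resources.

def on_disposed_handler(topic_consumer):
    # React to the consumer being disposed; the consumer should not be
    # used for further subscriptions at this point.
    print("Topic consumer disposed")

# Attachment (assumed API):
# topic_consumer.on_disposed = on_disposed_handler
```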
## Keep alive
Unless you add an infinite loop or similar code, a Python code file will run each code statement sequentially until the end of the file, and then exit.
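Conceptually, a keep-alive helper does nothing more than block the main thread until a termination signal arrives. The following stdlib-only sketch illustrates the idea; it is not the library's actual implementation:

``` python
import signal
import threading

# Illustrative sketch of a "keep alive" loop: block the main thread until
# SIGINT/SIGTERM requests shutdown, instead of falling off the end of the file.

shutdown = threading.Event()

def _request_shutdown(signum, frame):
    shutdown.set()

def run_until_terminated(install_signals=True):
    if install_signals:
        signal.signal(signal.SIGINT, _request_shutdown)
        signal.signal(signal.SIGTERM, _request_shutdown)
    shutdown.wait()  # blocks here until a termination signal is received
```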
docs/subscribe.md
131 additions & 3 deletions
@@ -244,6 +244,38 @@ topic_consumer.subscribe()
The conversions from [TimeseriesData](#timeseriesdata-format) to pandas DataFrame have an intrinsic cost overhead. For high-performance models using pandas DataFrame, you should use the `on_dataframe_received` callback provided by the library, which is optimized to do as few conversions as possible.
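For illustration, such a handler receives the pandas DataFrame directly, so no conversion happens in your own code. The callback name comes from the text above; the attachment path is an assumption, and the stand-in logic works on any DataFrame-like column mapping:

``` python
# Sketch of an on_dataframe_received handler (attachment path assumed).
# The library passes a pandas DataFrame; here we derive a new column from it.

def on_dataframe_received_handler(stream_consumer, df):
    # Column access on a pandas DataFrame works as shown.
    df["doubled"] = [v * 2 for v in df["value"]]
    return df

# Attachment (assumed API):
# stream_consumer.timeseries.on_dataframe_received = on_dataframe_received_handler
```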
### Raw data format
In addition to the `TimeseriesData` and pandas `DataFrame` formats (Python only), there is also the raw data format. You can use the `on_raw_received` callback (Python), or `OnRawReceived` event (C#), to handle this data format, as demonstrated in the following code:
=== "Python"
``` python
from quixstreams import TopicConsumer, StreamConsumer, TimeseriesDataRaw

def on_raw_received_handler(stream_consumer: StreamConsumer, data: TimeseriesDataRaw):
    # Handle the raw, unconverted data here.
    print(data)
```
Quix Streams provides you with an optional programmable buffer which you can tailor to your needs. Using buffers to consume data allows you to process data in batches according to your needs. The buffer also helps you to develop models with high-performance throughput.
@@ -304,6 +336,8 @@ Consuming data from that buffer is as simple as using its callback (Python) or e
};
```
Other callbacks are available in addition to `on_data_released` (for `TimeseriesData`), including `on_dataframe_released` (for pandas `DataFrame`) and `on_raw_released` (for `TimeseriesDataRaw`). Use the callback appropriate to your stream data format.
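For illustration, picking the released callback that matches your format could look like the following sketch. The handler is plain Python; the commented `create_buffer` attachment lines assume the legacy Quix Streams API and should be treated as assumptions:

``` python
# Hypothetical wiring: attach the released-callback matching the format
# you want the buffer to hand back to you.

def on_data_released_handler(stream_consumer, data):
    # Called with a released batch in TimeseriesData form.
    print("TimeseriesData batch released")

# Attachment (assumed API):
# buffer = stream_consumer.timeseries.create_buffer()
# buffer.on_data_released = on_data_released_handler     # TimeseriesData
# buffer.on_dataframe_released = ...                     # pandas DataFrame
# buffer.on_raw_released = ...                           # TimeseriesDataRaw
```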
You can configure multiple conditions to determine when the buffer has to release data. If any of these conditions becomes true, the buffer releases a new packet of data, and that data is cleared from the buffer:
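As an illustration of such release conditions (a packet-size limit and a time-span limit), here is a pure-Python sketch of the release logic. It shows the concept only and is not the library's implementation:

``` python
# Sketch: release a batch when EITHER condition holds -- the configured
# packet size is reached, or the buffered items span the configured time.

class SketchBuffer:
    def __init__(self, packet_size=None, time_span_ms=None, on_released=None):
        self.packet_size = packet_size      # release after N items
        self.time_span_ms = time_span_ms    # release after this many ms of data
        self.on_released = on_released      # callback receiving the batch
        self._items = []

    def add(self, timestamp_ms, value):
        self._items.append((timestamp_ms, value))
        if self.packet_size is not None and len(self._items) >= self.packet_size:
            self._release()
        elif (self.time_span_ms is not None and len(self._items) > 1
              and self._items[-1][0] - self._items[0][0] >= self.time_span_ms):
            self._release()

    def _release(self):
        batch, self._items = self._items, []  # released data is cleared
        if self.on_released:
            self.on_released(batch)
```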
Console.WriteLine($"Properties changed for stream: {streamConsumer.StreamId}");
}
};
topicConsumer.Subscribe();
```
You can keep a copy of the properties if you need to find out which properties have changed.
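One way to do that is to cache the last-seen properties per stream and diff them when the change event fires. The names in this sketch are illustrative, not library API:

``` python
# Sketch: cache the previous properties so a change event can report
# exactly which keys differ.

_last_properties = {}

def changed_keys(stream_id, new_properties):
    previous = _last_properties.get(stream_id, {})
    changed = {k for k in set(previous) | set(new_properties)
               if previous.get(k) != new_properties.get(k)}
    _last_properties[stream_id] = dict(new_properties)
    return changed
```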
## Responding to changes in parameter definitions
It is possible to handle changes in [parameter definitions](./publish.md#parameter-definitions). Parameter definitions are metadata attached to data in a stream. The `on_definitions_changed` event is linked to an appropriate event handler, as shown in the following example code:
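As a sketch, such a handler might look like the following. The event name comes from the text above; the attachment path is an assumption:

``` python
# Hypothetical on_definitions_changed handler. Parameter definitions are
# metadata (such as names, ranges, and units) attached to data in a stream.

def on_definitions_changed_handler(stream_consumer):
    # Re-read any parameter definitions your model depends on here.
    print("Parameter definitions changed")

# Attachment (assumed API):
# stream_consumer.timeseries.on_definitions_changed = on_definitions_changed_handler
```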
It is important to be aware of the commit concept when working with a broker. Committing allows you to mark how far data has been processed, also known as creating a [checkpoint](kafka.md#checkpointing). In the event of a restart or rebalance, the client only processes messages from the last committed position. Commits are done for each consumer group, so if you have several consumer groups in use, they do not affect each other when committing.
The piece of code above commits everything (parameters, events, or metadata) consumed and served to you from the topic you subscribed to, up to this point.
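The checkpoint behaviour described above can be sketched in plain Python; this stand-in consumer is an illustration of the concept, not library code:

``` python
# Sketch: after a restart or rebalance, consumption resumes from the last
# committed position, so uncommitted messages are processed again.

class SketchConsumer:
    def __init__(self, messages):
        self.messages = messages
        self.position = 0   # next message to read
        self.committed = 0  # checkpoint: how far processing is marked done

    def poll(self):
        msg = self.messages[self.position]
        self.position += 1
        return msg

    def commit(self):
        self.committed = self.position

    def restart(self):
        # Restart/rebalance: resume from the last committed position.
        self.position = self.committed
```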
### Committed and committing events
=== "Python"
Whenever a commit completes, a callback is raised that can be connected to a handler. This callback is invoked for both manual and automatic commits. You can set the callback using the following code:
``` python
from quixstreams import TopicConsumer

def on_committed_handler(topic_consumer: TopicConsumer):
    # Runs whenever a manual or automatic commit completes.
    print("Data committed")

topic_consumer.on_committed = on_committed_handler
```
=== "C\#"
Whenever a commit completes, an event is raised that can be connected to a handler. This event is raised for both manual and automatic commits. You can subscribe to this event using the following code:
``` cs
topicConsumer.OnCommitted += (sender, args) =>
{
    Console.WriteLine("Data committed");
};
```
While the `on_committed` event is triggered once the data has been committed, there is also the `on_committing` event, which is triggered at the beginning of the commit cycle, should you need to carry out other tasks before the data is committed.
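The ordering of the two events can be sketched as follows. The event names come from the text above; the dispatch loop is a stand-in, not library code:

``` python
# Sketch of the two-phase commit notification order: on_committing fires at
# the start of the commit cycle, on_committed after the commit completes.

events = []

def on_committing_handler(topic_consumer):
    events.append("committing")  # e.g. flush work that must precede the commit

def on_committed_handler(topic_consumer):
    events.append("committed")   # e.g. log or act on the completed checkpoint

def simulate_commit_cycle():
    on_committing_handler(None)
    # ... the broker commit itself happens here ...
    on_committed_handler(None)
```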
### Auto offset reset
You can control the offset that data is received from by optionally specifying `AutoOffsetReset` when you open the topic.
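The decision the setting controls can be sketched in plain Python. The `Earliest`/`Latest` value names and the resolution rule here are assumptions for illustration, not confirmed library behaviour:

``` python
# Sketch: with a committed offset, consumption resumes from it; otherwise
# the auto-offset-reset policy decides between the earliest and latest offset.

def resolve_start_offset(committed, auto_offset_reset, earliest, latest):
    if committed is not None:
        return committed
    return earliest if auto_offset_reset == "Earliest" else latest
```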