Setting direct memory too low decreases the performance of ingestion.

NOTE: Be sure that heap and direct memory combined do not exceed the total memory available on the server, to avoid an OutOfDirectMemoryError.

[id="plugins-{type}s-{plugin}-memory-sizing"]
===== How to size the direct memory used

To correctly size the direct memory needed to sustain the flow of incoming Beats connections, you need to know the average size of the transmitted log lines and the batch size used by Beats (2048 by default). Of all the open connections, only a subset is actively processed in parallel by Netty, corresponding to the number of workers, which equals the number of available CPU cores. For each channel being processed, a batch of events is read, and because of the way decompression and decoding work, two copies of the batch are kept in memory.
The expression used to estimate the maximum direct memory usage is:
["source","text"]
-----
event size * batch size * 2 * netty workers
-----

Supposing a 1 KB event size plus a small overhead of about 500 bytes of transferred metadata, and a 12-core CPU, the memory consumption can be estimated as:
["source","text"]
-----
1.5 KB * 2048 * 2 * 12
-----
This totals about 72 MB. If you know the average size of the events you process, you can size the direct memory accordingly without risking an OutOfDirectMemoryError in a production environment.
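As an illustration only, the estimate above can be reproduced with a short script. The helper function below is hypothetical (not a Logstash or Netty API); the batch size and worker count defaults are taken from the worked example:

```python
# Sketch of the direct-memory estimate described above.
# estimate_direct_memory_kb is a hypothetical helper, not a Logstash API.

def estimate_direct_memory_kb(event_size_kb, batch_size=2048, workers=12):
    """Maximum direct memory in KB: two copies of each in-flight batch
    are held per Netty worker during decompression and decoding."""
    return event_size_kb * batch_size * 2 * workers

# 1 KB log line + ~0.5 KB metadata overhead, on a 12-core CPU.
kb = estimate_direct_memory_kb(1.5)
print(f"~{kb / 1024:.0f} MB")  # prints ~72 MB
```

Plugging in your own measured average event size (and the actual batch size and core count of your deployment) gives a safe lower bound for the direct memory to configure.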

//Content for Beats
ifeval::["{plugin}"=="beats"]