I have a few general questions about Persistent Queues (PQ).
I see PQ settings for both Source & Destination, and my questions are specific to size & compression.
If I enable PQ with the default max file size of 1 MB and compression turned on, and I get a burst of 100 MB of events that Cribl Stream is unable to handle, does it create 100 compressed files of 1 MB each? I understand my events will be delayed in getting indexed in Splunk, but does this guarantee that I will eventually get these events (assuming I have PQ enabled on the destination side too)?
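To make the question concrete, here is the back-of-envelope math I am assuming. Note the key assumption (not confirmed Cribl behavior): that the 1 MB per-file limit applies to the *uncompressed* event data, so compression would shrink what actually lands on disk. The compression ratio below is purely hypothetical.

```python
# Back-of-envelope sketch; assumes the 1 MB file-size limit is applied
# to uncompressed data (an assumption, not documented Cribl behavior).
burst_mb = 100        # size of the event burst the destination can't absorb
max_file_mb = 1       # default PQ max file size
gzip_ratio = 0.1      # hypothetical compression ratio for text-like events

files_rolled = burst_mb // max_file_mb   # number of queue files created
on_disk_mb = burst_mb * gzip_ratio       # approximate disk footprint

print(files_rolled)   # 100
print(on_disk_mb)     # 10.0
```

So under that assumption the burst would roll roughly 100 queue files, but they would occupy closer to ~10 MB on disk rather than 100 MB.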
Is compression offered as an option simply to mitigate disk utilization?
When I turn on PQ, I do see a few ndjson.*.gz files in the queues directory, but these are from 20-60 minutes ago, when I first enabled PQ. When I do a cursory check of the data, I see all the events in Splunk. Does this mean Cribl is busy and will eventually get through processing these .gz files, maybe in a few hours when things calm down?
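For context, this is roughly how I am checking the queue directory. The path and file naming are just what I observed; the snippet below builds a throwaway directory with one gzipped NDJSON file so the commands are self-contained, and uses only standard tools (du, ls, zcat).

```shell
#!/bin/sh
# Illustrative PQ-directory checks; the real path on my workers looks
# different, this just demos the commands against a sample directory.
QDIR=$(mktemp -d)
printf '{"event":"a"}\n{"event":"b"}\n' | gzip > "$QDIR/1.ndjson.1.gz"

# How much disk is the queue currently holding?
du -sh "$QDIR"

# How many queue files are still waiting to be drained?
ls "$QDIR"/*.ndjson.*.gz | wc -l

# Peek at the queued events without decompressing to disk
zcat "$QDIR/1.ndjson.1.gz" | head -2
```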